Adventures in SLI


#1

After much consideration, I have decided to put my PS4 dreams on hold and (once again) renew my membership to the PCMASTERRACE. My current system includes:

  • ASUS Z87-A mobo

  • Intel i7-4790 CPU @ 3.60GHz

  • 16GB RAM

  • GeForce 970 GPU

  • ASUS 27" 144Hz primary monitor

  • Windows 10 Home

Note that I don’t overclock anything. I’m also using air cooling. That’s just how I am.

I would like a little more power out of the build. The only realistic option was a new GPU; otherwise I'd be replacing the motherboard and starting over with the processor, etc. I feel like there's still some life left in this system (the mobo, RAM, case and monitor are almost 2 years old), so I opted for the GPU. Rather than spend $625+ on a new 980Ti, I decided to purchase another 970 for half the price. I once said I would never again use SLI in a single-monitor setup, but I thought I would give it another try. This has allowed me to collect some comparison values.

All tests were conducted at max settings for each game unless otherwise noted. Frames-per-second (fps) were measured using Nvidia ShadowPlay because I'm cheap and lazy. These tests were performed quickly, over a short time span, and mainly for the lulz. I also downloaded a (likely spyware-infested) GPU monitor for this.
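
For anyone who'd rather skip the sketchy GPU monitor: here's a rough sketch of polling the same numbers yourself. It assumes Python and that nvidia-smi is on your PATH (it ships with the Nvidia drivers; on Windows it usually lives under C:\Program Files\NVIDIA Corporation\NVSMI).

```python
# Rough sketch: poll GPU temp/usage via nvidia-smi instead of a
# third-party monitor. Ctrl+C to stop.
import subprocess
import time

QUERY = ["nvidia-smi",
         "--query-gpu=index,name,temperature.gpu,utilization.gpu,memory.used",
         "--format=csv,noheader"]

while True:
    # One line per GPU, e.g. "0, GeForce GTX 970, 63, 98 %, 3012 MiB"
    print(subprocess.check_output(QUERY, text=True).strip())
    time.sleep(5)  # sample every 5 seconds while the game runs
```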

DIRTY BOMB
single 970 75-80 fps
970 SLI ~80 fps

Dirty Bomb is listed as an SLI game on the Nvidia website. Reddit tells me that Dirty Bomb is actually terrible with SLI. Can confirm. On the plus side, the GPU monitor showed temps around 63C while playing in SLI.

BORDERLANDS: TPS
single 970 100-110 fps
970 SLI 100-170 fps

Borderlands was a mess. Frames varied on a whole new level once I added the second card. I hope that was due to the age and simplicity of their game engine. Pretty sure Battleborn uses this same engine (recommended GPU is a GeForce 660). Temps ranged from 69-73C at 25% fan speed.

EVOLVE
single 970 ~70 fps
970 SLI ~100 fps

Temps stayed around 72C with a 30% fan speed. Not much else to report.

THE DIVISION
single 970 55-60 fps (high settings)
970 SLI 90+ fps (high settings)

I was given a free copy of The Division for purchasing the GPU. The game recommended high video settings (as opposed to ultra), so I used those for the test. The main GPU sat consistently at 75-78C with 40% fan speed. But the game did look pretty. I might try raising it to ultra settings and see what happens.

OVERWATCH
single 970 ~90 fps
970 SLI 125+ fps

Overwatch did rather well once I properly set the SLI profile (I did not use the “Nvidia recommended” one). I don’t think I ever dropped below 125 fps. Temps generally stayed below 75C but would occasionally rise above that for brief periods when the action slowed down. I assume the fans spin down during the lulls, which causes the momentary rise in heat.

So what does all this mean? Is SLI a viable option?

My initial goal was to find a cost-effective way to maximize my current system and achieve framerates near 144 fps (the refresh rate of my monitor). I believe I have found it by adding a second GeForce 970 to my build. For half the price of a 980Ti I have increased fps in some of these games by 40% or more. The big win is maintaining framerates above 60, which I think is easily attainable.
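
Some quick back-of-the-envelope math on my own numbers above, using the low end of each range as the conservative case:

```python
# fps gains from the tests above: (single 970, 970 SLI), low end of range.
results = {
    "Evolve":       (70, 100),
    "The Division": (55, 90),
    "Overwatch":    (90, 125),
}
for game, (one, two) in results.items():
    print(f"{game}: {one} -> {two} fps (+{(two - one) / one:.0%})")
# Evolve: 70 -> 100 fps (+43%)
# The Division: 55 -> 90 fps (+64%)
# Overwatch: 90 -> 125 fps (+39%)
```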

The main point here is that I already owned one GeForce 970 card. If I had owned a lesser card I definitely would have spent the money on a 980Ti.

But I do not feel that SLI is the nightmare many people make it out to be. When I first installed the second card, my Nvidia control panel disappeared. I had to reinstall the drivers with only one monitor hooked up (little bit of voodoo) to make it work again. So far I have had no other problems.

Maybe I will revisit this thread in a couple weeks and update on how these SLI cards are doing. Dark Souls 3 comes out on Tuesday and I assume it will support SLI. That might be a great test for the setup.


#2

Shoutout for mentioning voodoo :smiley: The days of 3DFX are not gone!


#3

The problem with SLI is that, due to the way it is engineered, you do not benefit from having the extra video RAM. Each card renders complete frames, so each needs its own full copy of the textures and buffers; the memory is mirrored, not pooled. For example, if you have two cards that each have 4 GB of RAM, your system won’t function like it is using 8 GB but instead just 4 GB. So effectively you are tossing 4 GB of VRAM by the wayside…
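
You can actually watch the mirroring happen. A rough sketch using the pynvml bindings (assuming you've pip installed pynvml): while a game runs, both cards report roughly the same VRAM in use, rather than the load being split between them.

```python
# Rough sketch: per-GPU memory use via NVML (pip install pynvml).
# Under SLI, both cards show similar usage because each holds its own
# full copy of the frame data -- the pools don't combine.
import pynvml

pynvml.nvmlInit()
for i in range(pynvml.nvmlDeviceGetCount()):
    handle = pynvml.nvmlDeviceGetHandleByIndex(i)
    name = pynvml.nvmlDeviceGetName(handle)
    if isinstance(name, bytes):  # older pynvml versions return bytes
        name = name.decode()
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
    print(f"GPU {i} ({name}): {mem.used // 2**20} / {mem.total // 2**20} MiB used")
pynvml.nvmlShutdown()
```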

I prefer my consoles (console gamer since I was a kid) but I did build a pretty great rig for streaming games back in 2013 that I am considering upgrading. Here are the specs:

  • INTEL I7-3930K 3.2GHZ 12MB LGA 2011 RETAIL

  • 2 WD VELOCIRAPTOR SATA III 500GB 10K RPM 64MB HD (configured for RAID 0)

  • 4 GSKILL 4GB DDR3 1600 MHZ RIPJAWS-X MEMORY (recently found out that the mobo can hold up to 32GB and I plan on taking advantage of that soon)

  • ASUS P9X79 LE INTEL X79 QUAD 3 WAY SLI SATA 3 USB 3.0

  • 2 MSI NVIDIA GTX680 2GB PCIE VIDEO CARD (running SLI)

  • This rig has a dedicated AVerMedia capture card and the CPU is overclocked and liquid cooled. The HDs also have their own cooling enclosure.

Despite the age of this rig and its components, I have yet to find anything that I can’t run on high settings. I have decided to upgrade the components for extra power, and because the advances in tech over the last three years have produced parts that generate way less heat (this rig is a furnace in a small room). I was looking at the 970 and the 980Ti. Honestly, going to a 970 should still give me a noticeable performance boost (along with allowing me to utilize VR) in addition to a decrease in heat generation. It just won’t be the gigantic boost that going to a 980Ti would be. I’m a big fan of overkill for no other reason than future proofing.

Honestly, SLI has been kind of a hassle in the long run. For example, every time I upgrade my Nvidia drivers it turns SLI off by default. For the first 6 months that I had the rig I didn’t even realize that my SLI was turned off. Personally, I won’t go the SLI route in the future. It’s great when you’re on a budget and want more power without a full upgrade.

Some people argue that in some games SLI actually lowers frame rates. I’ve read some benchmarks one guy did showing this was true in games that support PhysX. He was running two Titans in SLI and tried a dedicated 970 as the PhysX card: his framerate went from 130 fps without PhysX to 100 fps with it. He then disabled SLI and the 970 and used the second Titan as the dedicated PhysX card, which resulted in an average of 150 fps. I’ll see if I can find the link to share for reference. Basically, the conclusion the community has come to is that in games that support it, a dedicated PhysX card provides better performance than SLI, provided the PhysX card is fast enough to keep up with your main GPU.


#4

I thought DirectX 12 was supposed to allow for memory stacking? Provided the game developers and drivers build for it, that is. The option is there, but each individual game has to make use of it, and I’m sure that will take some dedicated coding.

But is it really a big deal to have 8GB of video memory over 4GB (or 7GB over 3.5GB in the case of the 970)? I look at it like the people throwing 32GB of system RAM into their rigs these days: everything runs just fine on 16GB, they just want that 32 to build up the peen. Is Windows really utilizing that extra 16GB for everyday gaming?

My goal was to reach framerates near 144 to feel I was getting the best use out of my monitor. I believe I achieved that.

I was aware of the SLI turning off when updating drivers from my first time using SLI a couple years ago. It’s not a big deal for me to open the control panel and hit that radio button.

It’s fine that SLI isn’t for you. But your last sentence in the quote confirms the point I was trying to make in this thread: If you already own a single 970 (or similar mid-range card) then it is likely in your best interest to utilize SLI rather than spend (possibly double) the money on the high-end model of that series.


#5

Well, that really comes down to whether you are only using your rig to run the game. The whole point of more RAM is to aid in multitasking. A lot of people are lucky to have one killer rig, much less multiple, and as such are forced to multitask on the same PC while playing games. Imagine someone running Battlefield Hardline on ultra settings while streaming to Twitch, running OBS/XSplit, a webcam and music. That is honestly a pretty common situation in this day and age. I know some streamers have separate rigs set up just to handle the stream, but not all do. The more RAM you have in that situation the better, and honestly RAM is very cheap considering the performance upgrade it provides.


#6

Meant to address this in my original reply. Good point that DirectX 12 is supposed to address the memory stacking issue, but the problem is not just the need for dedicated coding but for developers to actually give a damn. Right now there are only about 11 mainstream games with DirectX 12 support, and only 4 of them are AAA titles. Plus those 4 titles are also console ports (Rise of the Tomb Raider, Gears of War Ultimate Edition, Quantum Break and Hitman (2016)). The Quantum Break PC version is a UWA (Universal Windows App) and a very shoddy port. Thankfully Remedy made the console version native, but they probably didn’t have a choice, as UWA is not up and running yet on the Xbone. As a Microsoft developer, it has not gotten me jazzed about UWA :frowning:.

The gaming and dev community seems to think that DirectX 12 will not be fully adopted yet (and I tend to agree) due to the current business practice of developing for console and then porting to PC. While the Xbox One can support the DirectX 12 API, Phil Spencer (the boss of the Xbox division) has been quick to point out that while the Xbox One will benefit, it will also be hampered by its current hardware (CPU, RAM, etc.). I also want to add that the console market is dominated by AMD graphics, which is why we get console ports that are not completely optimized for Nvidia cards except where the dev puts in the extra work.

So in a nutshell, DirectX 12 could fix the SLI memory problem if the industry stopped focusing on consoles as their dev target…