NVIDIA's 3-way SLI: Can we finally play Crysis?
by Anand Lal Shimpi on December 17, 2007 3:00 PM EST, posted in GPUs
Quad SLI Redux?
With the hardware requirements met, it's time to look at the software requirements. Currently, 3-way SLI is only supported under Windows Vista, not XP. Other than the OS stipulation, 3-way SLI isn't really any different from conventional 2-card SLI. Many of you will remember the ill-fated Quad SLI product NVIDIA brought to market just under two years ago.
Quad SLI had three major problems that kept it from being a worthwhile product:
1) It relied on the 7950 GX2, a single-card, dual-GPU solution. The problem is that each GPU on a 7950 GX2 was slower than a 7900 GTX, so a single 7950 GX2 was slower than a pair of 7900 GTXs. Quad SLI used two of these 7950 GX2s, so even at its best the performance improvement over a pair of 7900 GTXs wasn't all that great.
2) The best performing games with Quad SLI used AFR (Alternate Frame Rendering) to divide up the rendering workload, where each GPU is responsible for rendering its own frame: GPU 1 would render frame 1, while GPU 2 would work on the next frame, GPU 3 on the third, and GPU 4 on the fourth. Unfortunately, DirectX 9 only allowed for a 3-frame render ahead, meaning that 4-way AFR could never keep all four GPUs fed with frames (see the sketch after this list). With the vast majority of games being DX titles, this posed a significant problem for Quad SLI performance.
3) The final issue with Quad SLI was that by the end of the year, G80 was out, and G80 was much faster. A pair of 8800 GTXs demolished a Quad SLI setup, and in some cases even a single card was faster.
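Returning to that second point: here's a minimal Python sketch of how AFR hands out frames and why the render-ahead cap bounds scaling. The one-frame-fully-occupies-one-GPU model is our idealization for illustration, not actual Direct3D or driver behavior.

```python
# Toy model of Alternate Frame Rendering (AFR): frame N goes to
# GPU (N mod num_gpus), but the API allows at most max_render_ahead
# frames in flight, so no more than that many GPUs can work at once.
# Idealized assumption: each in-flight frame fully occupies one GPU.

def ideal_afr_scaling(num_gpus: int, max_render_ahead: int) -> int:
    """Best-case speedup over a single GPU under a render-ahead cap."""
    return min(num_gpus, max_render_ahead)

# Round-robin frame assignment on a 4-GPU (Quad SLI) rig:
for frame in range(8):
    print(f"frame {frame} -> GPU {frame % 4}")

for gpus in (2, 3, 4):
    dx9 = ideal_afr_scaling(gpus, max_render_ahead=3)      # DX9's 3-frame cap
    dx10 = ideal_afr_scaling(gpus, max_render_ahead=gpus)  # DX10: cap lifted
    print(f"{gpus} GPUs -> ideal scaling: {dx9}x (DX9), {dx10}x (DX10)")
```

Under this idealized count, the fourth GPU in a Quad SLI rig simply starves under DX9's cap and scaling tops out at 3x, while three GPUs just fit; real-world scaling is worse still, since the queue also has to absorb CPU-side variation.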
Thankfully, 3-way SLI doesn't have these problems. The three-card SLI setup relies on regular 8800 GTX/Ultra cards, which are still among the fastest GPUs that NVIDIA offers today. The 3-frame render ahead limitations of DX9 aren't present in DX10, so we can get good scaling with AFR in DX10 titles.
The problem of planned obsolescence is a concern though, and it's almost inevitable that 3-way SLI based on G80 will be replaced very soon. There's no doubt that NVIDIA will eventually replace the 8800 GTX and Ultra with G92-based variants, which will reduce power consumption and improve performance. The fact that G80 came out over a year ago should preclude any thoughts of purchasing a brand new 3-way SLI setup, but for users who already have two 8800 GTX or Ultra cards, adding a third is a mostly reasonable proposition.
The Test
Special thanks to both EVGA and ASUS for providing us with hardware for this review. Both companies sent us 8800 Ultras and 780i-based motherboards for this comparison, although it is worth mentioning that you could use a 680i motherboard and any brand (or mixture of brands) of 8800 GTX/Ultra cards - provided, of course, that your 680i motherboard has the necessary x16 PCIe slots.
Test Setup
CPU | Intel Core 2 Extreme QX9650 @ 3.33GHz
Motherboard | EVGA nForce 780i SLI
Video Cards | 3 x NVIDIA GeForce 8800 Ultra
Video Drivers | NVIDIA 169.18
Hard Drive | Seagate 7200.9 300GB (7,200 RPM, 8MB cache)
RAM | 4 x 1GB Corsair XMS2 DDR2-800 (4-4-4-12)
Operating System | Windows Vista Ultimate 32-bit
48 Comments
chizow - Tuesday, December 18, 2007
Derek Wilson in 8800GT Review:
Completely valid point about using 32-bit vs. 64-bit, and somewhat of a hot topic over in the video forums. Honestly, you have $5000+ worth of hardware in front of you, yet getting a 64-bit version of Vista running benchmarks at resolutions/settings where 64-bit and 2GB+ would help the most is too difficult? C'mon guys, seriously, this is the 2nd sub-par review in a row (the 512 GTS review was poor too).
Also, could you clarify the bit about 680i boards being able to accomplish the same thing? Exactly what spurred this change in Tri-SLI support? Driver support? Seems Anand used 169.08, but from the patch notes I thought 169.25 was the first to officially support Tri-SLI. Or has it always been supported, and is the 780i just hyping up a selling point that has been around for months? Also, the 780i article hinted there would be OC'ing tests with the chipset and I don't see any here. Going to come in a different article? Thanks.
blppt - Tuesday, December 18, 2007
Yeah, seriously. Especially since the 64-bit Crysis executable does away with the texture streaming engine entirely... how can you make a serious "super high end ultimate system" benchmark without utilizing the most optimized, publicly available version of the game? Is it that the 64-bit Vista drivers don't support 3-way SLI yet? Otherwise, putting together a monster rig with three $500 video cards and then testing it with 32-bit Vista seems rather silly...
Ryan Smith - Tuesday, December 18, 2007
Address space consumption isn't 1:1 with video memory; it's only correlated, and even less so in SLI configurations where some data is replicated between the cards. I'm not sure what exact value Anand had, but I'm confident he had more than 2GB of free address space.
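A back-of-the-envelope sketch of that distinction follows; the mapped fraction below is a made-up illustrative number, not a measured value.

```python
# Illustrates why 3 x 768MB of VRAM doesn't consume ~2.3GB of a 32-bit
# process's address space. Only resources the CPU actually maps count,
# and data replicated across SLI cards needs only one mapping.
# mapped_fraction is a hypothetical value for illustration.

user_address_space_mb = 2048   # default per-process limit, 32-bit Windows
vram_per_card_mb = 768         # GeForce 8800 Ultra
cards = 3

naive_estimate = vram_per_card_mb * cards   # the (wrong) 1:1 assumption
mapped_fraction = 0.25                      # hypothetical
# Replicated SLI data is mapped once, not once per card:
correlated_estimate = vram_per_card_mb * mapped_fraction

print(f"naive 1:1 estimate: {naive_estimate} MB "
      f"(would exceed the {user_address_space_mb} MB user space)")
print(f"correlated estimate: {correlated_estimate:.0f} MB mapped")
```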
JarredWalton - Tuesday, December 18, 2007
Testing at high resolutions with ultra-insane graphics settings serves one purpose: it makes hardware like Quad-SLI and Tri-SLI appear to be much better than it really is. NVIDIA recommended 8xAA for quad-SLI back in the day just to make sure the difference was large. It did make QSLI look a lot better, but when you stopped to examine the sometimes sub-20 FPS results it was far less compelling.
Run at 4xAA on a 30" LCD at native resolution, and it's more than just a little difficult to see the image quality difference with 8xAA, which sometimes runs at half the frame rate of 4xAA. A far better solution than maxing out every setting possible is to increase quality where it's useful. 4xAA is even debatable at 2560x1600 - certainly not required - and it's the first thing I turn off when my system is too slow for a game. Before bothering with 8xAA, try transparent supersampling AA. It usually addresses the same issue with much less impact on performance.
At the end of the day, it comes down to performance. If you can't enable 8xAA while keeping frame rates above ~40 FPS (and minimums above 30 FPS), I wouldn't touch it. I play many games with 0xAA and rarely notice aliasing on a 30" LCD. Individual pixels are smaller than on 24", 20", 19", etc. LCDs, so aliasing doesn't matter as much, and the high resolution compensates in other areas. Crysis at 2560x1600 with Very High settings? The game is already a slide show, so why bother?
0roo0roo - Tuesday, December 18, 2007
faster is faster; the best is expensive and sometimes frivolous. at that price point you aren't thinking like a budget buyer anymore. like exotic cars, you can't be that rational about it. it's simply power... power NOW.
crimson117 - Tuesday, December 18, 2007
If it's true that it's all about power, then just find the most expensive cards you can buy, install them, and don't bother playing anything. Also, tip your salesman a few hundred bucks to make the purchase that much more expensive.
0roo0roo - Tuesday, December 18, 2007
look, it's not like you don't get any advantage from it. it's not across the board at this point, but it's still a nice boost for any 30" gamer. seriously, there are handbags that cost more than this sli stuff.
JarredWalton - Tuesday, December 18, 2007
Next up from AnandTech: Overclocked Handbags! Stay tuned - we're still working on the details...