NVIDIA's 3-way SLI: Can we finally play Crysis?
by Anand Lal Shimpi on December 17, 2007 3:00 PM EST - Posted in GPUs
Final Words
NVIDIA always does this. We got Quad SLI with the 7950 GX2, only for it to be replaced shortly thereafter by G80, and now we're getting 3-way SLI with the 8800 GTX/Ultra, which we all know is on its way to being replaced by G92. Investing in a 3-way SLI setup today would be a terrible idea: you're buying into old technology, and you're buying it after it has already been made obsolete by a new GPU. It's only a matter of time before G92 makes its way up the food chain, and three of those bad boys with even more shader power should give us a much cooler-running, and faster, 3-way SLI setup than what we've tested here today.
The setup works, we didn't run into any software issues, and we can't deny that there are definite performance improvements in certain games. The problem is that 3-way SLI just doesn't scale well enough, in nearly enough titles, to justify the price.
We'd love to say that 3-way SLI is exactly what you need to play Crysis, because at least that way there'd be a configuration in existence that would run that game well, but we just can't. The game currently doesn't scale well at all from two to three cards.
And that's the fundamental problem with 3-way SLI: it's a better effort than Quad SLI was, but it's doomed from the start because it's built on old technology. We'd much rather have a couple of faster G92-based GPUs than SLI-ing three 1.5-year-old GPUs together.
Then there's the bigger issue with SLI and CrossFire in general: scaling is a little too dependent on software. Adding a third card increases the execution resources of a standard 2-card SLI setup by 50%, but the performance impact is nowhere near that. If, instead, you added 50% more SPs to those two 8800 Ultras, you'd see a much more tangible outcome. It's an extreme version of the way Intel makes quad-core CPUs, but instead of sticking two die on a single package, you have two die spread over two cards - that's hardly efficient. GPU architectures have changed dramatically over the past few years, yet we're still left with the same old multi-GPU technology. It's time for a change.
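To put a number on what we mean by scaling, here's a minimal sketch of that efficiency calculation, written in Python. The function name and the frame rates in the example are hypothetical illustrations, not figures from our benchmarks.

```python
# A rough way to express multi-GPU scaling efficiency: how much of the
# added hardware actually shows up as extra performance.
# The frame rates below are hypothetical placeholders, not benchmark results.

def scaling_efficiency(base_fps: float, scaled_fps: float,
                       base_gpus: int, scaled_gpus: int) -> float:
    """Fraction of the added execution resources that turns into performance."""
    fps_gain = scaled_fps / base_fps - 1.0          # observed speedup
    resource_gain = scaled_gpus / base_gpus - 1.0   # added hardware (0.5 for 2 -> 3 cards)
    return fps_gain / resource_gain

# Example: going from two to three cards adds 50% more GPU resources.
# If a hypothetical game goes from 40 fps to 44 fps, that's only a 10%
# speedup, so just 20% of the added hardware is actually being used.
print(scaling_efficiency(40.0, 44.0, 2, 3))   # 0.2
```

By this measure, perfect scaling from two to three cards would be 1.0; what we saw in most titles falls well short of that.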
48 Comments
kilkennycat - Tuesday, December 18, 2007 - link
...it's far more likely to be used by a (nV) video card functioning as a GPGPU for either gaming --- or in the short-term --- professional desktop applications. nV is making great strides in the professional scientific number-crunching and signal-processing communities with their CUDA toolset running on their current GPU offerings. They currently own ~86% of the "workstation graphics" market, but in a rapidly-increasing number of cases, graphics is not the sole function of the current nV workstation hardware. Wait for nVidia's next generation silicon and driver software which will be far more focused on seamlessly merging GPU and GPGPU functionality. Also, wait for their true next-gen motherboard chipset, not the cobbled-together "780i", which will implement symmetrical PCIe 2.0 on all 3 PCIe x16 slots. Arriving about the same time as their next gen GPU family. Mid-2008 would be my guess.
aguilpa1 - Tuesday, December 18, 2007 - link
Funny how your review doesn't address this blatant issue. Yes, it will run tri-SLI, but don't expect it to do so with the same Yorkfield they used on the test board. Engineering samples of the QX9650 ran fine on the 680i SLI boards, but that changed with the retail versions. Whether it was Intel's pissy way of getting back at Nvidia for not licensing SLI to them, or Nvidia's way of making a buck off of selling an almost-already-obsolete board (Nehalem's coming next year)... at this stage, who cares.
ilovemaja - Tuesday, December 18, 2007 - link
That quote ("His response? 'JESUS'. 'No', I said, 'not even Jesus needs this much power'") is one of the funniest things I've heard in my life.
Thanks for another good article, you are the best.
acejj26 - Tuesday, December 18, 2007 - link
In Crysis, you say that the third card offers a 7% performance boost over the two-card configuration; however, it is only offering 1 fps more, which is just about 2%. Those numbers should be changed.
Sunrise089 - Tuesday, December 18, 2007 - link
Not complaining, but I've noticed the last several GPU articles have been written by Anand, which isn't his normal gig. On top of that, we get a reference to another GPU editor from back in the day. What's up?
compy386 - Tuesday, December 18, 2007 - link
It'd be interesting to do a comparison between SLI and CrossFire once AMD gets some drivers out that actually support quad SLI. I saw a board on Newegg that looks like it'd fit 3 3870s as well.
AcydRaine - Tuesday, December 18, 2007 - link
AMD doesn't support "Quad-SLI" at all. There are a few boards on Newegg that will fit 4x 3870s, not just 3.
compy386 - Tuesday, December 18, 2007 - link
The 3870s take up 2 slots, so I only see boards that fit 3. Most of the boards will take 4 3850s, though. Again, I'd like to see the performance number comparisons for scaling purposes.
SoBizarre - Tuesday, December 18, 2007 - link
Well, I'm glad to see this evaluation of 3-way SLI. It just gave me an idea about overcoming performance issues in games like Crysis. There is no need to build ridiculously expensive machines that draw insane amounts of power. I have a better solution (although it won't work for all of you): I'm just not going to buy a game which I can't play in its full glory on a decent system, at a mainstream resolution (1680x1050).
I don't expect the latest and greatest, "show off" kind of game to be playable at 2560x1600 with the highest settings, full AA and AF. Not on a system with a Q6600 and a single 8800 GT. But if you can't do it on a system like the one used by Anand here? Well, then it's becoming ridiculous.
I'm trying to imagine a proud owner of a machine with a QX9650 @ 3.33GHz, 3 (that's THREE) 8800 Ultras, and a shiny 30-inch monitor, not being able to play a game he just bought. What would his thoughts be about the developer of that game? I guess nothing pretty…