NVIDIA's 3-way SLI: Can we finally play Crysis?
by Anand Lal Shimpi on December 17, 2007 3:00 PM EST - Posted in GPUs
What resolutions do you need to be running?
If you're willing to spend over $1,500 on graphics cards alone, we're going to assume that you've already got a 30" display at your disposal. But on the off chance that you don't, will you see any benefit from having this much GPU power? We took a closer look at three of our benchmarks to find out.
Bioshock, the best 3-way SLI scaler we've seen today, paints a very clear picture. The 3-way 8800 Ultra setup is CPU bound until we hit 2560 x 1600, while the normal 2 card setup doesn't even come close to being CPU limited, even at 1680 x 1050.
What this tells us is that as long as the game is stressful enough, you'll see a benefit to a 3-way SLI setup even at low resolutions, just not as much as you would at higher resolutions. Pretty simple, right?
Unreal Tournament 3 shows absolutely no benefit to adding a third card, and even shows a slight performance decrease at 1680 x 1050. It isn't until 2560 x 1600 that we see any performance difference at all between the two and three card setups.
With Crysis we didn't adjust resolution; instead we varied the image quality settings: medium, high and very high. Just as with varying resolution, raising the image quality settings increases the impact of 3-way SLI. Unfortunately, where 3-way makes its biggest impact (very high quality), we're at a setting that is unplayable for much of the game.
What sort of a CPU do you need for this thing?
We've already established that at higher resolutions 3-way SLI can truly shine, but how ridiculous of a CPU do you need to run at those high detail settings?
The theory is that the better a game scales from 2 to 3 GPUs, the more GPU bound and less CPU bound it is. The worse a game scales, the greater the chance that it's CPU bound (although there are many other reasons for poor scaling from 2 to 3 GPUs).
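To make that heuristic concrete, here's a minimal sketch of the math in Python, with made-up frame rates (not our benchmark numbers) and arbitrary thresholds chosen purely for illustration:

```python
# Hypothetical frame rates for a 2-card and a 3-card setup (illustrative only).
fps_2way = 80.0
fps_3way = 110.0

# Ideal scaling from 2 to 3 GPUs is 1.5x; measure how much of that gain we got.
speedup = fps_3way / fps_2way                  # 1.375x in this example
scaling_efficiency = (speedup - 1.0) / 0.5     # 1.0 = perfect scaling, 0.0 = no gain

if scaling_efficiency > 0.75:
    verdict = "mostly GPU bound - the third card is doing useful work"
elif scaling_efficiency > 0.25:
    verdict = "partially limited elsewhere (CPU, drivers, or the game engine)"
else:
    verdict = "likely CPU bound, or the game simply doesn't scale to 3 GPUs"

print(f"Speedup: {speedup:.2f}x, efficiency: {scaling_efficiency:.0%} -> {verdict}")
```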
CPU clock speed | Bioshock (fps) | Oblivion (fps) | Crysis (fps) |
3.33GHz | 103.8 | 49.0 | 43.2 |
2.66GHz | 101.7 | 48.3 | 37.3 |
2.00GHz | 90.9 | 47.3 | 30.9 |
In Bioshock, the difference in performance between 2.66GHz and 3.33GHz is negligible, but once we drop the clock speed to 2.0GHz performance starts to fall off. What this tells us is that at mid-2GHz clock speeds, even a 3-way 8800 Ultra setup is GPU bound in Bioshock. And even at 2.0GHz, the 3-way setup is far from fully CPU bound, as its performance is still better than that of the two-card system with a 3.33GHz CPU.
Similarly, Oblivion isn't CPU bound at all. Even at 2.0GHz, we don't see a significant drop in performance.
Crysis does actually benefit from faster CPUs at our 1920 x 1200 high quality settings. Surprisingly enough, there's even a difference between our 3.33GHz and 2.66GHz setups. We suspect that the difference would disappear at higher resolutions/quality settings, but so would the ability to maintain a smooth frame rate. It looks like the hardware needed to run Crysis smoothly under all conditions has yet to be released.
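Putting rough numbers on the table above: going from 3.33GHz to 2.00GHz costs about 12% in Bioshock, 3.5% in Oblivion, and 28% in Crysis. A quick sketch of that arithmetic, using the frame rates from the table:

```python
# Frame rates from the table above: 3-way SLI at 3.33GHz vs. 2.00GHz.
results = {
    "Bioshock": (103.8, 90.9),
    "Oblivion": (49.0, 47.3),
    "Crysis":   (43.2, 30.9),
}

for game, (fps_fast, fps_slow) in results.items():
    drop = (fps_fast - fps_slow) / fps_fast
    print(f"{game}: {drop:.1%} slower at 2.00GHz")   # ~12%, ~3.5%, ~28%
```

The bigger the drop, the more CPU limited the game is at these settings, which is exactly the pattern the three paragraphs above describe.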
We feel kind of silly even entertaining this question, but no: even if you want to build a system with three 8800 Ultras, you don't need to spend $1,000 on a CPU. You can get by with a 2.66GHz chip just fine.
48 Comments
IKeelU - Tuesday, December 18, 2007 - link
When will this nonsense stop? It is perfectly reasonable for a game company to "permit" users to increase the detail if they so choose. On "high" the game looks and runs great on a sub-$400 video card. In fact, on "high" it looks better than anything out there, on any platform. At least with a "very high" setting available, the game will continue to look good a year from now when other games have caught up.
andrew007 - Tuesday, December 18, 2007 - link
Uuuh... no, Crysis is not playable at "high" at any decent resolution on my 8800GT and 3.4GHz overclocked quad core Q6600. Decent being 1280 x whatever. And when you drop to medium, the game looks nothing special. Sure, there are a few areas that look great (forest level for example) but overall I was certainly not blown away. Unlike replaying Bioshock in 1920x1200 which this setup is capable of running very smoothly and which looks amazing in DX10. Quite simply, Crysis is one of the worst optimized games ever. At least it doesn't crash, that's something I guess. Looking forward to replaying it in 2 years. Come to think of it, it was the same with Far Cry, it took 2 years to be able to play that game with decent frame rates.
JarredWalton - Tuesday, December 18, 2007 - link
There's nothing that says Crytek can't make a game where maximum detail settings exceed the capacity of every PC currently available. We've seen this in the past (Oblivion for one), and then a year later suddenly the game is more than playable at max settings on less expensive hardware. It doesn't appear that Tri-SLI is fully implemented for many titles, and considering the age of Crysis I'd expect more performance over time. Just like many games don't fully support SLI (or support it at all in some cases) at launch, only to end up greatly benefiting once drivers are optimized.
FWIW, I'm playing Crysis on a single 8800 GTX at High detail settings and 1920x1200 with a Core 2 Duo 3.0GHz 2MB (OC'ed E4400). It might be too sluggish for online play where ping and frame rates matter more, but for single player I'm having no issues with that res/settings. It's a matter of what you feel is necessary. I'm willing to shut off AA to get the performance I want.
tshen83 - Tuesday, December 18, 2007 - link
First of all, people who will fork over 1500 dollars worth of GPUs will want to play all games at the highest settings. That means the highest AA and AF settings. I don't think you used AA and AF in your testing. It is almost pointless to play without AA on such a nice setup at 120fps (Bioshock), where you are becoming CPU bound rather than GPU bound.
Secondly, your Crysis test used 1920x1200. Why not 2560x1600? Why not 2560x1600 at 4xAA and 16xAF? Crysis at 1920x1200 without AA and AF is severely CPU bound in your case, as you have witnessed that a faster CPU gave you linear scaling.
Third, there is actually no point in testing triple SLI at any resolution other than 2560x1600. The target audience for triple SLI is those with 30 inch Cinema Displays.
I think to be fair, you should rerun the benchmarks in a non-CPU-bound situation with AA+AF on; you will see the proper scaling then.
Thanks,
eternalkp - Tuesday, December 25, 2007 - link
very good point Tshen
I have a 30 inch monitor.
the 7900gtx was killing my frame rate.
i was getting average 25fps @ 2560x1600, medium, 2X AA, 16X aniso...in FEAR Perseus Mandate.
Just bought MSI OC 8800GTS G92 and very happy with it.
Now i can crank up maximum graphic setting, 4X AA, 16X aniso @ average 40fps...very nice. :D
Crysis is a hot engine, i only get 30fps @ medium, AA off.
YES. what is the point of 3 GPUs if you have your AA/Aniso off?
game will look like crap.
Crysis recommends 4gb of ram.
kmmatney - Tuesday, December 18, 2007 - link
"Third, there is actually no point of testing triple SLI at any other resolution other than 2560x1600"The point was testing at settings that are "playable". Who cares if the framerate goes from 8 to 12 @ 2560 x 1600. Its unplayable.
I don't see how even an "enthusiast" wouldn't see triple SLI as a waste of money, though.
cmdrdredd - Tuesday, December 18, 2007 - link
The point is that running 1600x1200 is really not anything you shouldn't be able to do with one card. Even 1920x1080 in many games is perfect. Showing off 10000000fps means jack; turn the res and AA/AF up and show us what it can push out.
defter - Tuesday, December 18, 2007 - link
The author missed one advantage of 3-way SLI:
Of course it doesn't make any sense to spend >$1500 on three 8800GTX/Ultras today, but what about those folks who already have a pair of 8800GTX/Ultras in SLI?
For them, adding a third card could be a reasonable upgrade option compared to replacing both cards with new G92-based cards.
3-way SLI isn't for everyone, but it has its advantages.
praeses - Tuesday, December 18, 2007 - link
I was under the impression that Bioshock did not support AA in DX10. If that is indeed the case, that's hardly the fault of the benchmarker/reviewer.
Also, I see much merit in benchmarking at 1920x1200; it's a much more common and desktop-friendly resolution given the physical footprint of monitors. Let's be honest, many gamers aren't sitting 4ft from their displays. At 2-3ft, a 24" display, which most likely runs 1920x1200, is much more comfortable for longer action-based viewing. Ideally though they would have a lower dot pitch or simply a higher resolution on the smaller screen.
tshen83 - Tuesday, December 18, 2007 - link
One more thing: you are using Vista Ultimate 32bit with 4GB of memory. Since in 32bit you have three 768MB Ultras (roughly 2.25GB of address space reserved just for the video cards), the system will only see about 1.5GB of memory. That is not sufficient system memory for high resolution benchmarks, especially Crysis.
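For what it's worth, here's a quick sketch of the arithmetic behind that claim, under the commenter's assumption that each card's full 768MB frame buffer is carved out of the 32-bit (4GB) address space; in practice the amount reserved also depends on chipset and PCIe MMIO apertures, which is where the remaining gap down to ~1.5GB would come from, so treat the numbers as illustrative only.

```python
# Back-of-the-envelope math for the 32-bit address space argument.
# Assumption (from the comment above): each card's full 768MB frame buffer is
# mapped into the 4GB address space. Real systems also reserve space for
# chipset/PCIe MMIO, so the actual figure varies.

ADDRESS_SPACE_MB = 4 * 1024   # 32-bit Vista can address 4GB in total
FRAME_BUFFER_MB = 768         # per 8800 Ultra
NUM_CARDS = 3

reserved_mb = FRAME_BUFFER_MB * NUM_CARDS       # 2304MB, ~2.25GB
usable_ram_mb = ADDRESS_SPACE_MB - reserved_mb  # 1792MB, ~1.75GB before other MMIO

print(f"Reserved for GPUs: {reserved_mb} MB (~{reserved_mb / 1024:.2f} GB)")
print(f"RAM left visible:  {usable_ram_mb} MB (~{usable_ram_mb / 1024:.2f} GB)")
```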