GeForce 9800 GTX and 3-way SLI: May the nForce Be With You
by Derek Wilson on April 1, 2008 9:00 AM EST - Posted in GPUs
The Test
Once again we used the Skulltrail system for most of our comparisons, and we've added the 790i board for the 3-way SLI performance scaling tests.
I didn't think I would be saying this so soon, but our experience with 790i and SLI has been much, much worse than on Skulltrail. We were plagued by power failure after power failure. With three 9800 GTX cards plugged in, the system never drew more than 400W while booting into Windows, but after a few minutes the power would simply flicker and cut out.
PSU capacity didn't make sense as the culprit, because the power supply wasn't even being heavily loaded. We did try augmenting it with a second PSU to run one of the cards, but that didn't work out either. The story is long and arduous and for some reason involved the Power of the Dark Side, but our solution (after much effort) was to use one power supply for the system and graphics cards and another for the drives and fans. Each PSU needed to be plugged into its own surge protector, and each surge protector needed to be on a different breaker.
The working theory is that power here isn’t very clean, and the 790i board is more sensitive to fluctuations in the quality of the power supplied (which is certainly affected by the AC source). Isolating breakers and using surge protectors was the best we could do, and we are very thankful it worked out. It seems likely that a good quality 1000-1500 VA UPS would have been enough to provide cleaner power and solve the issue, but we didn’t have one to test with.
Once we handled this, we were mostly able to benchmark. We could get a good 15 minutes of uptime out of the system, but after repeated benchmarking, instability crept back in and we'd need to wait a while before trying again. The majority of these problems occurred with 3-way and Quad SLI, but we did have a hiccup with a two-card SLI configuration as well. We didn't have any trouble at all with single-card solutions (even single 9800 GX2 solutions).
Before anyone says heat, we were testing in an open-air environment in a room with an ambient temperature of about 15 degrees C, with one 120mm fan blowing straight into the back of the GPUs and another blowing across the memory (taking care not to interfere with the CPU HSF airflow). The graphics cards did get warm, but if heat were the issue here, I'd better get a bath of LN2 ready to run this thing submerged in.
It is very important that we note one more time that this is the C0 engineering sample stepping and that NVIDIA explicitly told us that stability might be an issue in some situations. The retail C1 stepping should not have these issues.
Here’s our test setup:
Test Setup
CPU | 2x Intel Core 2 Extreme QX9775 @ 3.20GHz
Motherboard | Intel D5400XS (Skulltrail)
Video Cards | ATI Radeon HD 3870 X2, NVIDIA GeForce 8800 Ultra, NVIDIA GeForce 9800 GTX, NVIDIA GeForce 9800 GX2
Video Drivers | ATI Catalyst 8.3, NVIDIA ForceWare 174.74
Hard Drive | Seagate 7200.9 120GB 8MB 7200RPM
RAM | 2x Micron 2GB FB-DIMM DDR2-800
Operating System | Windows Vista Ultimate 64-bit SP1
49 Comments
nubie - Tuesday, April 1, 2008 - link
It is all well and good to bash nVidia for lack of SLI support on other systems, but why can't the AMD cards run on an nVidia motherboard? Kill two birds with one stone there: lose your FB-DIMMs and test on a single platform.
Apples to apples, you can't say nVidia is at fault when the blame isn't entirely theirs (besides, isn't the capability to run Crossfire and SLI on one system a little outside the needs of most users?).
Is it that AMD allows Crossfire on all except nVidia motherboards? (Do VIA or SiS make a multi-PCIe board?) If so, then we are talking Crossfire availability on Intel and AMD chipsets but not nVidia, whereas nVidia allows only their own. That sounds like a 30/70 split of the blame between AMD and nVidia.
PeteRoy - Tuesday, April 1, 2008 - link
It is too hard to understand these graphs; use the ones you had in the past. I can't understand how to compare the different systems in these graphs.
Use the graphs from the past, with the best on top and the worst on the bottom.
araczynski - Tuesday, April 1, 2008 - link
I'm using a 7900 GTX right now, and the 9800 GTX isn't impressing me enough to warrant $300. I might just pick up an 8800 GTS 512 in a month when they're all well below $200; overclocking one would be more than "close enough" for me.
araczynski - Tuesday, April 1, 2008 - link
... in any case, I'd much prefer to see benchmarks comparing a broader range of cards than seeing this SLI/tri/quad crap. Your articles are assuming that everyone upgrades their cards every time nvidia/ati shit something out on a monthly basis.
Denithor - Tuesday, April 1, 2008 - link
...because they set out to accurately compare nVidia's latest high-end card to other high-end options available.
I'm sure in a few days there will be a follow-up article showing a broader spectrum of cards at more usable resolutions, so we (the common masses) can see whether or not this $300 card really brings any benefit with its high price tag.
Ndel - Tuesday, April 1, 2008 - link
Your benchmarks don't even make any sense =/
Why use a system 99.9 percent of the people don't have?
This is not even relevant to what other enthusiasts currently have; how are we supposed to believe these benchmarks at all?
grain of salt...
SpaceRanger - Tuesday, April 1, 2008 - link
I believe he used the fastest processor out there to eliminate it as the bottleneck for the benchmark.
Rocket321 - Tuesday, April 1, 2008 - link
Derek - I enjoyed this article for a few reasons that made it different.
First, the honesty and discussion of problems experienced. This helps to convey the many issues still experienced with multi-GPU solutions.
Second, the YouTube video. This is a neat use of available technology. Could this be useful in other ways? Maybe in the next low/mid GPU roundup it could be used to show a short clip of each card playing a game at the same point.
This could visually show where one card gets choppy and a better card doesn't.
Finally, using a poll in the forums - a really great idea to gather relevant info and then add it to an article.
Thanks for the good article!
chizow - Tuesday, April 1, 2008 - link
Thanks for making AT reviews worth reading again, Derek. You addressed many of the problems I've had with the ho-hum reviews of late, like emphasizing major problems encountered during testing and dropping some incredibly insightful discoveries backed by convincing evidence (the Vsync issue). Breakthroughs such as this are part of what make PC hardware fun and exciting.
A few things you touched on but didn't really clarify were performance on Skulltrail vs. NV chipsets and memory bandwidth/amount on the 9800 vs. the Ultra. I'd like to see a comparison of Skulltrail vs. 780/790i and then just future disclaimers like (Skulltrail is ~20% slower than the fastest NV solutions).
With the 9800 vs. the Ultra, I'm a bit disappointed you didn't really dig into overclocking at all or investigate further how much some of the issues you talked about impacted or benefited performance, like memory bandwidth. I think it's safe to say the 9800 GTX, as a refined G92 8800 GTS, has significant overclocking headroom while the Ultra does not (it's basically an overclocked GTX). It would have been nice to see how much memory overclocks alone would've benefited overall performance, then max overclocks on both the core/shader and memory.
But again, great review, I'll be reading over it again to pick up on some of the finer details.
lopri - Tuesday, April 1, 2008 - link
How many revisions has the 790i been through already? Major ones at that. Usually minor revisions go like A0->A1->A2, I thought. As a matter of fact, I don't even remember any nVidia chip that reached a 'C' revision, except maybe MCP55 (570 SLI).