GeForce 9800 GTX and 3-way SLI: May the nForce Be With You
by Derek Wilson on April 1, 2008 9:00 AM EST - Posted in GPUs
Yes, NVIDIA leads the way in performance. They own the fastest single-GPU card, the fastest multi-GPU single card, and the fastest multi-card configurations. People who want the best of the best do pay a premium for the privilege, but that isn't something everyone is comfortable with. Most of us would much rather see a high end card that doesn't totally depart from sanity in terms of actual value gained through the purchase. Is the 9800 GTX that solution? That's what we are here to find out.
We've gotten a lot of feedback lately about our test system. Yes, at the very high end we haven't seen what we would have expected if all things were equal between all platforms. But the fact is that making a single platform work for apples-to-apples comparisons between CrossFire and SLI is worth it. With this review, we aren't quite there, as we just uncovered a HUGE issue that has been holding us back from higher performance with our high end hardware. We do have some numbers showing what's going on, but we just didn't have time to rerun all of our hardware after we discovered the solution to the issue. We'll get to that shortly.
The major questions we will want to answer with this review are mostly about value. This card isn't a new architecture and it isn't really faster than other single card single GPU solutions. But the price point does make a difference here. At about $400, AMD's Radeon 3870X2 will be a key comparison point to this new $300 part. With the 8800 Ultra and GTX officially leaving the scene, the 9800 GX2 and 9800 GTX are the new top two in terms of high end hardware at NVIDIA. The price gap between these two is very large (the 9800 GX2 costs about twice as much as a stock clocked 9800 GTX) and the 3870X2 falls right in between them. Does this favor AMD or NVIDIA in terms of value? Does either company need to adjust their price point?
Things are rarely straightforward in the graphics world, and with the crazy price points and multi-GPU solutions that recently burst on to the scene, we’ve got a lot of stuff to try and make sense out of. Let us take you through the looking glass...
49 Comments
nubie - Tuesday, April 1, 2008 - link
It is all well and good to bash nVidia for lack of SLI support on other systems, but why can't the AMD cards run on an nVidia motherboard? Kill two birds with one stone there: lose your FB-DIMMs and test on a single platform. Apples to apples, you can't say nVidia is at fault when the blame isn't entirely theirs (besides, isn't the capability to run Crossfire and SLI on one system a little out of the needs of most users?).
Is it that AMD allows Crossfire on all except nVidia motherboards? (Do VIA or SiS make a multi-PCIe board?) If so, then we are talking Crossfire availability on Intel and AMD chipsets but not nVidia's, whereas nVidia allows only their own. That sounds like 30/70% blame, AMD vs. nVidia.
PeteRoy - Tuesday, April 1, 2008 - link
It is too hard to understand these graphs; use the ones you had in the past. I can't understand how to compare the different systems in these graphs. Use the graphs from the past, with the best on top and the worst on bottom.
araczynski - Tuesday, April 1, 2008 - link
i'm using a 7900gtx right now, and the 9800gtx isn't impressing me enough to warrant $300. i might just pick up an 8800gts512 in a month when they're all well below $200. overclocking one would be more than "close enough" for me.
araczynski - Tuesday, April 1, 2008 - link
... in any case, i'd much prefer to see benchmarks comparing a broader range of cards than seeing this sli/tri/quad crap. your articles assume that everyone upgrades their cards every time nvidia/ati shit something out on a monthly basis.
Denithor - Tuesday, April 1, 2008 - link
...because they set out to accurately compare nVidia's latest high end card to other high end options available.
I'm sure in a few days there will be a followup article showing a broader spectrum of cards at more usable resolutions, so we (the common masses) can see whether or not this $300 card really brings any benefit with its high price tag.
Ndel - Tuesday, April 1, 2008 - link
your benchmarks don't even make any sense =/
why use a system 99.9 percent of people don't have?
this is not even relevant to what other enthusiasts currently have; how are we supposed to believe these benchmarks at all?
grain of salt...
SpaceRanger - Tuesday, April 1, 2008 - link
I believe he used the fastest processor out there to eliminate it as the bottleneck for the benchmark.
Rocket321 - Tuesday, April 1, 2008 - link
Derek - I enjoyed this article for a few reasons that made it different.
First, the honesty and discussion of problems experienced. This helps to convey the many issues still encountered with multi-GPU solutions.
Second the youtube video. This is a neat use of available technology. Could this be useful in other ways? Maybe in the next low/mid GPU roundup it could be used to show a short clip of each card playing a game at the same point.
This could visually show where one card gets choppy and a better card doesn't.
Finally - using a poll in the forums - really great idea to do this for relevant info and then add it to an article.
Thanks for the good article!
chizow - Tuesday, April 1, 2008 - link
Thanks for making AT reviews worth reading again, Derek. You addressed many of the problems I've had with the ho-hum reviews of late, like emphasizing major problems encountered during testing and dropping some incredibly insightful discoveries backed by convincing evidence (the Vsync issue). Breakthroughs such as this are part of what makes PC hardware fun and exciting.
A few things you touched on but didn't really clarify were performance on Skulltrail vs. NV chipsets and memory bandwidth/amount on the 9800 vs. the Ultra. I'd like to see a comparison of Skulltrail vs. 780/790i and then just future disclaimers like "Skulltrail is ~20% slower than the fastest NV solutions."
With the 9800 vs. the Ultra, I'm a bit disappointed you didn't really dig into overclocking at all, or investigate further how much some of the issues you talked about hurt or helped performance, like memory bandwidth. I think it's safe to say the 9800 GTX, as a refined G92 8800 GTS, has significant overclocking headroom while the Ultra does not (it's basically an overclocked GTX). It would have been nice to see how much memory overclocks alone would have benefited overall performance, then max overclocks on both the core/shader and memory.
But again, great review, I'll be reading over it again to pick up on some of the finer details.
lopri - Tuesday, April 1, 2008 - link
How many revisions has the 790i been through already? Major ones at that. Usually minor revisions go A0->A1->A2, I thought. As a matter of fact, I don't even remember any nVidia chip that reached a 'C' revision, except maybe MCP55 (570 SLI).