GeForce 9800 GTX and 3-way SLI: May the nForce Be With You
by Derek Wilson on April 1, 2008 9:00 AM EST, posted in GPUs
The 9800 GTX and EVGA’s Cards
The 9800 GTX is a 128-shader, G92-based card (yes, another one) that comes in at a 675MHz core clock, 1.69GHz shader clock, and 2.2GHz (effective) memory clock. This puts the raw power of the card above the 8800 Ultra, but there is one major drawback to this high-end part: it only has a 256-bit memory bus hooked up to 512MB of RAM.
The Ultra's extra memory might not come into play often, but the fact that the 8800 Ultra has essentially 50% more effective memory bandwidth does put it at an advantage in memory-bandwidth-limited situations. This means there is potential for performance loss at high resolutions, at high levels of AA, or in games with memory-intensive effects. While we understand that $300 US puts this card in a different class than the 8800 Ultra, and thus NVIDIA is targeting a different type of user, we would have liked to see a card with more bandwidth and more memory (especially when we look at the drop-off in Crysis performance between 19x12 and 25x16).
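That "essentially 50%" figure falls straight out of bus width times effective memory clock. A quick back-of-the-envelope sketch (the 8800 Ultra's 384-bit bus and 2.16GHz effective memory clock are assumed from its published spec, not stated in this article):

```python
def bandwidth_gbps(bus_width_bits, effective_clock_mhz):
    """Peak memory bandwidth in GB/s: (bus width in bytes) x (effective clock)."""
    return bus_width_bits / 8 * effective_clock_mhz * 1e6 / 1e9

gtx_9800 = bandwidth_gbps(256, 2200)    # 256-bit bus, 2.2GHz effective
ultra_8800 = bandwidth_gbps(384, 2160)  # 384-bit bus, 2.16GHz effective (assumed spec)

print(f"9800 GTX:   {gtx_9800:.1f} GB/s")
print(f"8800 Ultra: {ultra_8800:.1f} GB/s")
print(f"Ultra advantage: {ultra_8800 / gtx_9800 - 1:.0%}")
```

This works out to roughly 70 GB/s for the 9800 GTX versus roughly 104 GB/s for the Ultra, an advantage of just under 50%.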
9800 GTX cards are capable of 3-way SLI with the two SLI connectors on the top. Of course, NVIDIA requires that we use an NVIDIA motherboard for this purpose. We are not fans of artificial technical limitations based on marketing needs and would much prefer to see SLI run on any platform that enables multiple PCIe x16 slots. With normal SLI, we do have the Skulltrail option, but NVIDIA has chosen not to enable 3-way capability on this board either.
We wanted to be able to include 3-way SLI numbers in our launch review (which has been one incredible headache, but more on that later), and EVGA was kind enough to help us out by providing the hardware. We certainly appreciate them enabling us to bring you numbers for this configuration today.
We were also able to get our hands on a C0 engineering sample 790i board for testing. Let's just say that the experience was ... character building. Running a QX9770 and 1333MHz DDR3 at 9-9-9-24, we had what could best be described as a very rough time getting 3-way SLI, and even quad SLI with two 9800 GX2 boards, to work in this system. We were lucky to get the numbers we did get. Let's take a look at what we tested with.
Comments
nubie - Tuesday, April 1, 2008 - link
It is all well and good to bash nVidia for its lack of SLI support on other systems, but why can't the AMD cards run on an nVidia motherboard? Kill two birds with one stone there: lose your FB-DIMMs and test on a single platform. Apples to apples, you can't say nVidia is at fault when the blame isn't entirely theirs (besides, isn't the capability to run Crossfire and SLI on one system a little beyond the needs of most users?).
Is it that AMD allows Crossfire on all except nVidia motherboards? (Do VIA or SiS make a multi-PCIe board?) If so, then we are talking Crossfire availability on Intel and AMD chipsets but not nVidia's, whereas nVidia allows only its own. That sounds like a 30/70 split of blame between AMD and nVidia.
PeteRoy - Tuesday, April 1, 2008 - link
These graphs are too hard to understand; use the ones you had in the past. I can't figure out how to compare the different systems in these graphs. Bring back the old format, with the best on top and the worst on the bottom.
araczynski - Tuesday, April 1, 2008 - link
I'm using a 7900GTX right now, and the 9800GTX isn't impressing me enough to warrant $300. I might just pick up an 8800GTS 512 in a month when they're all well below $200; overclocking one would be more than "close enough" for me.
araczynski - Tuesday, April 1, 2008 - link
... in any case, I'd much prefer to see benchmarks comparing a broader range of cards than this SLI/tri/quad stuff. Your articles assume that everyone upgrades their cards every time nVidia/ATI churns something out on a monthly basis.
Denithor - Tuesday, April 1, 2008 - link
...because they set out to accurately compare nVidia's latest high-end card to the other high-end options available.
I'm sure in a few days there will be a follow-up article showing a broader spectrum of cards at more common resolutions, so we (the common masses) can see whether or not this $300 card really brings any benefit for its high price tag.
Ndel - Tuesday, April 1, 2008 - link
Your benchmarks don't even make any sense =/
Why use a system 99.9 percent of people don't have?
This isn't even relevant to what other enthusiasts currently have, so how are we supposed to believe these benchmarks at all?
Grain of salt...
SpaceRanger - Tuesday, April 1, 2008 - link
I believe he used the fastest processor out there to eliminate it as the bottleneck for the benchmark.
Rocket321 - Tuesday, April 1, 2008 - link
Derek - I enjoyed this article for a few reasons that made it different.
First, the honesty and discussion of the problems experienced. This helps convey the many issues still present with multi-GPU solutions.
Second, the YouTube video. This is a neat use of available technology. Could it be useful in other ways? Maybe in the next low/mid GPU roundup it could be used to show a short clip of each card playing a game at the same point.
This could visually show where one card gets choppy and a better card doesn't.
Finally - using a poll in the forums. A really great idea to gather relevant info and then work it into an article.
Thanks for the good article!
chizow - Tuesday, April 1, 2008 - link
Thanks for making AT reviews worth reading again, Derek. You addressed many of the problems I've had with the ho-hum reviews of late, like emphasizing major problems encountered during testing and dropping some incredibly insightful discoveries backed by convincing evidence (the Vsync issue). Breakthroughs such as this are part of what makes PC hardware fun and exciting. A few things you touched on but didn't really clarify were performance on Skulltrail vs. NV chipsets, and memory bandwidth/amount on the 9800 vs. the Ultra. I'd like to see a comparison of Skulltrail vs. 780/790i, and then just future disclaimers like "Skulltrail is ~20% slower than the fastest NV solutions."
With the 9800 vs. the Ultra, I'm a bit disappointed you didn't really dig into overclocking at all, or investigate further how much some of the issues you talked about hurt or helped performance, like memory bandwidth. I think it's safe to say the 9800 GTX, as a refined G92 8800 GTS, has significant overclocking headroom, while the Ultra does not (it's basically an overclocked GTX). It would have been nice to see how much memory overclocks alone would have benefited overall performance, then max overclocks on both the core/shader and the memory.
But again, great review, I'll be reading over it again to pick up on some of the finer details.
lopri - Tuesday, April 1, 2008 - link
How many revisions has the 790i been through already? Major ones at that. Usually minor revisions go A0->A1->A2, I thought. As a matter of fact, I don't even remember any nVidia chip that reached a 'C' revision, except maybe the MCP55 (570 SLI).