ATI's New High End and Mid Range: Radeon X1950 XTX & X1900 XT 256MB
by Derek Wilson on August 23, 2006 9:52 AM EST - Posted in GPUs
The Elder Scrolls IV: Oblivion Performance
While it is disappointing that Oblivion doesn't include a built-in benchmark, our FRAPS tests have proven to be fairly repeatable and very intensive on every part of a system. These numbers will reflect real-world playability of the game, but please remember that our test system uses the fastest processor we could get our hands on. If a purchasing decision is to be made on Oblivion performance alone, please check out our two articles on the CPU and GPU performance of Oblivion. We have used the most graphically intensive benchmark in our suite, but the rest of the platform will make a difference. We can still easily demonstrate which graphics card is best for Oblivion, even if our numbers don't translate directly to what our readers will see on their own systems.
Running through the forest towards an Oblivion gate while fireballs fly past our head is a very graphically taxing benchmark. To run it, we load a saved game and run through the course with FRAPS. To start the benchmark, we hit "q", which simply auto-runs forward, and we start and stop FRAPS at predetermined points in the run. While no two runs are 100% identical, our benchmark scores are usually fairly close. We run the benchmark a couple of times just to be sure there wasn't a one-time hiccup.
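The consistency check itself is nothing fancy. Below is a minimal sketch (in Python) of how per-run FRAPS logs could be averaged and compared; the file names and column layout are placeholders rather than FRAPS' exact output format.

```python
# Minimal sketch: average the per-second FPS samples from each FRAPS run and
# compare the runs for repeatability. File names and CSV layout are
# placeholders, not FRAPS' exact output format.
import csv
import statistics

RUN_LOGS = ["oblivion_run1_fps.csv", "oblivion_run2_fps.csv", "oblivion_run3_fps.csv"]
PLAYABLE_FPS = 20.0  # our rough threshold for a good Oblivion experience

def average_fps(path):
    """Return the mean of the FPS samples in one run's log (last CSV column)."""
    samples = []
    with open(path, newline="") as f:
        for row in csv.reader(f):
            try:
                samples.append(float(row[-1]))
            except (ValueError, IndexError):
                continue  # skip headers or malformed rows
    return statistics.mean(samples)

averages = [average_fps(path) for path in RUN_LOGS]
spread = max(averages) - min(averages)

for path, avg in zip(RUN_LOGS, averages):
    flag = "" if avg >= PLAYABLE_FPS else "  <-- below our playability threshold"
    print(f"{path}: {avg:.1f} fps average{flag}")
print(f"Spread between runs: {spread:.1f} fps (rerun the test if this gets large)")
```

If the spread between runs grows beyond a frame or two per second, we treat it as a one-time hiccup and rerun the test rather than averaging it in.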
As for settings, we tested a few different configurations and decided on this group of options (a short sketch for recording this profile follows the table):
Oblivion Performance Settings
Setting | Value
Texture Size | Large
Tree Fade | 100%
Actor Fade | 100%
Item Fade | 66%
Object Fade | 90%
Grass Distance | 50%
View Distance | 100%
Distant Land | On
Distant Buildings | On
Distant Trees | On
Interior Shadows | 95%
Exterior Shadows | 85%
Self Shadows | On
Shadows on Grass | On
Tree Canopy Shadows | On
Shadow Filtering | High
Specular Distance | 100%
HDR Lighting | On
Bloom Lighting | Off
Water Detail | High
Water Reflections | On
Water Ripples | On
Window Reflections | On
Blood Decals | High
Anti-aliasing | Off
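For reference, here is a minimal sketch of how a settings profile like the one above can be recorded alongside benchmark results so runs stay comparable later. The key names are just our own shorthand, not Oblivion's actual .ini entries.

```python
# Record the benchmark settings profile next to the results so runs stay
# comparable. Key names are our own shorthand, not Oblivion's .ini entries.
import json

OBLIVION_BENCH_PROFILE = {
    "texture_size": "Large",
    "tree_fade": "100%",
    "actor_fade": "100%",
    "item_fade": "66%",
    "object_fade": "90%",
    "grass_distance": "50%",
    "view_distance": "100%",
    "distant_land": True,
    "distant_buildings": True,
    "distant_trees": True,
    "interior_shadows": "95%",
    "exterior_shadows": "85%",
    "self_shadows": True,
    "shadows_on_grass": True,
    "tree_canopy_shadows": True,
    "shadow_filtering": "High",
    "specular_distance": "100%",
    "hdr_lighting": True,
    "bloom_lighting": False,
    "water_detail": "High",
    "water_reflections": True,
    "water_ripples": True,
    "window_reflections": True,
    "blood_decals": "High",
    "anti_aliasing": False,
}

with open("oblivion_bench_profile.json", "w") as f:
    json.dump(OBLIVION_BENCH_PROFILE, f, indent=2)
```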
Our goal was to get acceptable performance levels at 1600x1200 with the current generation of cards, which was fairly easy with the range of cards we tested here. These settings look great and make the game very enjoyable. While more is always better in this game, no current computer will give you everything at high resolutions. Only the best multi-GPU solution and a great CPU are going to give you settings like ours at high resolutions, but who cares about grass distance, right?
While Oblivion is very graphically intensive and played from a first person perspective, it isn't a twitch shooter. Our experience leads us to conclude that 20 fps gives a good experience. It's playable a little lower than that, but watch out for some jerkiness that may pop up. Getting down to 16 fps and below is a little too low to be acceptable. The main point to take home is that you really want as much eye candy as possible. While Oblivion is an immersive and awesome game from a gameplay standpoint, the graphics certainly help draw the gamer in.
Oblivion is the first game in our suite where ATI's latest and greatest actually ends up on top. The margin of victory for the X1950 CrossFire isn't tremendous, measuring in at 6.5% over the 7900 GTX SLI.
As a single card, the 7950 GX2 does better than anything else, but as a multi-GPU setup it's not so great. Its 12% performance advantage at 2048x1536 only amounts to a few more fps, although as you'll see in the graphs below, at lower resolutions the GX2 actually manages a much better lead. A single X1950 XTX is on the borderline of where Oblivion performance starts feeling slow, but we're talking about some very aggressive settings at 2048x1536 - something that was simply unimaginable for a single card when this game came out. Thanks to updated drivers and a long-awaited patch, Oblivion performance is no longer as big an issue if you've got any of these cards. We may just have to dust off the game ourselves and continue our quest to steal as much produce from as many unsuspecting characters in the Imperial City as possible.
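To put percentage margins like those in perspective, a quick back-of-the-envelope conversion shows how little they amount to in absolute terms at Oblivion-style frame rates. The baseline numbers below are illustrative only, not our measured results.

```python
# Back-of-the-envelope: what a percentage lead means in absolute frames per
# second. Baseline frame rates here are illustrative, not measured results.
for baseline_fps, lead_pct in [(30.0, 12.0), (25.0, 6.5)]:
    gap_fps = baseline_fps * lead_pct / 100.0
    print(f"A {lead_pct:.1f}% lead over a {baseline_fps:.0f} fps baseline is only {gap_fps:.1f} fps")
```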
Oblivion does like having a 512MB frame buffer, and it punishes the X1900 XT 256MB pretty severely for skimping on the memory. If you do enjoy playing Oblivion, you may want to try and pick up one of the 512MB X1900 XTs before they eventually disappear (or start selling for way too much).
In contrast to Battlefield 2, it seems that NVIDIA's 7900 GTX SLI solution is less CPU limited at low resolutions than ATI's CrossFire. Of course, it's the higher resolutions we are really interested in, and each of the multi-card options we tested performs essentially the same at 1600x1200 and above. The 7950 GX2 does seem to drop off faster than the X1950 XTX, allowing ATI to close the gap between the two. While the margin does narrow, the X1950 XTX can't quite catch the NVIDIA multi-GPU single-card solution. More interestingly, Oblivion doesn't seem to care much about the differences between the X1950 XTX, X1900 XTX, and X1900 XT. While the game does like 512MB of onboard memory, large differences in memory speed and small differences in core clock don't seem to impact performance significantly.
Comments
JarredWalton - Wednesday, August 23, 2006 - link
We used factory overclocked 7900 GT cards that are widely available. These are basically guaranteed overclocks for about $20 more. There are no factory overclocked ATI cards around, but realistically don't expect overclocking to get more than 5% more performance out of ATI hardware. The X1900 XTX is clocked at 650 MHz, which is only about 4% higher than the 625 MHz of the XT cards. Given that ATI just released a lower power card but kept the clock speed at 650 MHz, it's pretty clear that their GPUs are close to topped out. The RAM might have a bit more headroom, but memory bandwidth already appears to be less of a concern, as the X1950 isn't tremendously faster than the X1900.
yyrkoon - Wednesday, August 23, 2006 - link
I think it's obvious why ATI is selling their cards for less now, and that reason is that a lot of 'tech savvy' users are waiting for Direct3D 10 to be released and want to buy a capable card. This is probably to try and entice some people into buying technology that will be 'obsolete' when Direct3D 10 is released. Supposedly Vista will ship with DirectX 9L and DirectX 10 (Direct3D 10), but I've also read to the contrary, that Direct3D 10 won't be released until sometime after Vista ships. Personally, I couldn't think of a better time to buy hardware, but a lot of people think that waiting, and just paying through the nose for a video card later, is going to save them money. *shrug*
Broken - Wednesday, August 23, 2006 - link
In this review, the test bed was an Intel D975XBX (LGA-775). I thought this was an ATI Crossfire-only board that could not run two Nvidia cards in SLI. Are there hacked drivers that allow this, and if so, is there any penalty? Also, I see that this board is dual x8 PCIe and not dual x16... at high resolutions, could this be a limiting factor, or is that not for another year?
DerekWilson - Wednesday, August 23, 2006 - link
Sorry about the confusion there. We actually used an nForce4 Intel X16 board for the NVIDIA SLI tests. Unfortunately, it is still not possible to run SLI on an Intel motherboard. Our test section has been updated with the appropriate information. Thanks for pointing this out.
Derek Wilson
ElFenix - Wednesday, August 23, 2006 - link
As we all should know by now, Nvidia's default driver quality setting is lower than ATi's, and raising it to match through the driver makes a significant difference in framerate. Your "The Test" page does not indicate that you changed the driver quality settings to match.
DerekWilson - Wednesday, August 23, 2006 - link
Drivers were run with default quality settings. Default driver settings between ATI and NVIDIA are generally comparable from an image quality standpoint unless shimmering or banding is noticed due to trilinear/anisotropic optimizations. None of the games we tested displayed any such issues during our testing.
At the same time, during our Quad SLI follow-up we would like to include a series of tests run at the highest possible quality settings for both ATI and NVIDIA, which would put ATI ahead of NVIDIA in terms of anisotropic filtering and in Chuck patch cases, and NVIDIA ahead of ATI in terms of adaptive/transparency AA (which is actually degraded by their gamma correction).
If you have any suggestions on different settings to compare, we are more than willing to run some tests and see what happens.
Thanks,
Derek Wilson
ElFenix - Wednesday, August 23, 2006 - link
Could you run each card with the quality slider turned all the way up, please? I believe that's the default setting for ATi, and the 'High Quality' setting for Nvidia. Someone correct me if I'm wrong. Thanks!
michael
yyrkoon - Wednesday, August 23, 2006 - link
I think as long as all settings from both offerings are as close as possible per benchmark, there is no real gripe. Although some people seem to think it necessary to run AA at high resolutions (1600x1200+), I'm not one of them. It's very hard for me to notice jaggies even at 1440x900, especially when concentrating on the game instead of standing still and looking for jaggies with a magnifying glass . . .
mostlyprudent - Wednesday, August 23, 2006 - link
When are we going to see a good number of Core 2 Duo motherboards that support Crossfire? The fact that AT is using an Intel-made board rather than a "true enthusiast" board says something about the current state of Core 2 Duo motherboards.
DerekWilson - Wednesday, August 23, 2006 - link
Intel's boards are actually very good. The only reason we haven't been using them in our tests (aside from a lack of SLI support) is that we have not been recommending Intel processors for the past couple years. Core 2 Duo makes Intel CPUs worth having, and you definitely won't go wrong with a good Intel motherboard.