ATI's New High End and Mid Range: Radeon X1950 XTX & X1900 XT 256MB
by Derek Wilson on August 23, 2006 9:52 AM EST - Posted in GPUs
Splinter Cell: Chaos Theory Performance
We make use of the Lighthouse demo for Splinter Cell: Chaos Theory. We have been using this benchmark for quite some time, and we automate it with the scripts published at Beyond3D. This benchmark tracks in-game performance fairly closely on our system, but midrange users with slower processors may see somewhat lower real-world performance.
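The Beyond3D scripts themselves are not reproduced here; purely as an illustration of the general approach, the following is a minimal sketch of how a timedemo pass might be launched and its reported frame rate collected. The executable path, command-line flags, and log format are hypothetical placeholders, not the actual tools used for this article.

```python
import re
import subprocess

# Hypothetical paths and flags for illustration only; the real Beyond3D
# scripts drive Splinter Cell: Chaos Theory differently.
GAME_EXE = r"C:\Games\SCCT\SplinterCell3.exe"
DEMO = "Lighthouse"

def run_timedemo(resolution: str) -> float:
    """Launch one benchmark pass and return the average FPS it reports."""
    result = subprocess.run(
        [GAME_EXE, f"-benchmark={DEMO}", f"-resolution={resolution}"],
        capture_output=True, text=True
    )
    # Assume the benchmark writes a line like "Average FPS: 42.7" to stdout.
    match = re.search(r"Average FPS:\s*([\d.]+)", result.stdout)
    if not match:
        raise RuntimeError("benchmark did not report an FPS value")
    return float(match.group(1))

if __name__ == "__main__":
    for res in ("1280x1024", "1600x1200", "2048x1536"):
        print(res, run_timedemo(res))
```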
We used the highest quality settings possible, including the extra SM3.0 features. As the advanced shaders and antialiasing are mutually exclusive under SC:CT, we left AA disabled and focused on the former. We set anisotropic filtering to 8x for all cards.
For this third-person stealth game, ultra-high frame rates are not necessary. We have a good playing experience at 25 fps or higher. Framerate junkies may prefer something a little faster, but our recommendation is based on consistency and the ability to play the game without any degradation.
NVIDIA's 7900 GTX SLI does almost as well as X1900 CrossFire, but the 14% advantage X1950 CF has over X1900 CF puts it way out in front. The 7950 GX2 once again splits the difference between the X1950 XTX and the 7900 GTX SLI.
While the X1950 XTX leads all the single-GPU, single-card solutions, there really isn't that much difference between the playability of the X1900 XTX, 7900 GTX, and X1900 XT. The extra 256MB of RAM the original X1900 XT has does give it a 7.5% advantage over its baby brother at this resolution.
ATI leads again in Splinter Cell: Chaos Theory, both in dual-GPU and single-GPU configurations. Here the GX2 occupies a nice middle ground, and all of the tested cards manage to remain playable up through 2048x1536. Using the "Chuck Patch" it is also possible to enable AA+HDR on ATI hardware, though time constraints and the fact that there is no NVIDIA equivalent caused us to skip this test for now.
74 Comments
Vigile - Wednesday, August 23, 2006 - link
My thought exactly on this one Anand...

Anand Lal Shimpi - Wednesday, August 23, 2006 - link
You can run dual monitors with a CrossFire card as well, the CrossFire dongle that comes with the card has your 2nd DVI output on it :)

Take care,
Anand
kneecap - Wednesday, August 23, 2006 - link
What about VIVO? The Crossfire Edition does not support that.

JarredWalton - Wednesday, August 23, 2006 - link
For high-end video out, the DVI port is generally more useful anyway. It's also required if you want to hook up to a display using HDCP - I think that will work with a DVI-to-HDMI adapter, but maybe not? S-VIDEO and Composite out are basically becoming seldom used items in my experience, though the loss of component out is a bit more of a concern.

JNo - Thursday, August 24, 2006 - link
So if I use DVI out and attach a DVI to HDMI adaptor before attaching to a projector or HDTV, will I get a properly encrypted signal to fully display future blu-ray/hd-dvd encrypted content?

The loss of component is a bit of a concern as many HDTVs and projectors still produce amazing images with component and, in fact, I gather that some very high resolutions+refresh rates are possible on component but not DVI due to certain bandwidth limitations with DVI. But please correct me if I am wrong. I take Anandtech's point on the crossfire card offering more but with a couple of admittedly small question marks, I see no reason not to get the standard card and crossfire for the second later if you decided to go that route...
JarredWalton - Thursday, August 24, 2006 - link
I suppose theoretically component could run higher resolutions than DVI, with dual-link being required for 2048x1536 and higher. Not sure what displays support such resolutions with component inputs, though. Even 1080p can run off of single-link DVI.

I think the idea with CF cards over standard is that they will have a higher resale value if you want to get rid of them in the future, and they are also more versatile -- TV out capability being the one exception. There are going to be a lot of people that get systems with a standard X1950 card, so if they want to upgrade to CrossFire in the future they will need to buy the CrossFire edition. We all know that at some point ATI is no longer going to make any of the R5xx cards, so if people wait to upgrade to CrossFire they might be forced to look for used cards in a year or two.
Obviously, this whole scenario falls apart if street prices on CrossFire edition cards end up being higher than the regular cards. Given the supply/demand economics involved, that wouldn't be too surprising, but of course we won't know for another three or four weeks.
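[Editor's note: as a quick sanity check on the single-link vs. dual-link point in the comment above, the arithmetic below shows why 1080p fits within single-link DVI while 2048x1536 does not. The blanking totals are standard published timings (CEA-861 for 1080p, an approximate GTF-style total for 2048x1536) and are included only as an illustrative sketch, not as part of the original discussion.]

```python
# Pixel-clock arithmetic behind the single-link vs. dual-link DVI comparison.
SINGLE_LINK_MAX_MHZ = 165.0  # TMDS pixel clock limit for single-link DVI

def pixel_clock_mhz(h_total: int, v_total: int, refresh_hz: int) -> float:
    """Pixel clock a mode requires, in MHz, using total timings incl. blanking."""
    return h_total * v_total * refresh_hz / 1e6

# Illustrative timing totals: CEA-861 for 1080p60, approximate GTF for 2048x1536@60.
modes = {
    "1920x1080@60": (2200, 1125, 60),    # ~148.5 MHz -> fits single-link
    "2048x1536@60": (2800, 1589, 60),    # ~267 MHz  -> needs dual-link
}

for name, timing in modes.items():
    clk = pixel_clock_mhz(*timing)
    verdict = "single-link OK" if clk <= SINGLE_LINK_MAX_MHZ else "needs dual-link"
    print(f"{name}: {clk:.1f} MHz -> {verdict}")
```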
UNESC0 - Wednesday, August 23, 2006 - link
thanks for clearing that up Anand, news to me!

TigerFlash - Wednesday, August 23, 2006 - link
I was wondering if anyone thinks it's wise to get an Intel Core 2 Duo motherboard with CrossFire support now that AMD is buying out ATI. Do you think ATI would stop supporting Intel motherboards?

johnsonx - Wednesday, August 23, 2006 - link
Of course not. AMD/ATI isn't stupid. Even if their cross-licensing agreement with Intel didn't prevent them from blocking Crossfire on Intel boards (which it almost surely does), cutting out that part of the market would be foolish.
dderidex - Wednesday, August 23, 2006 - link
What's with the $99 -> $249 gap?

Weren't we supposed to see an X1650XT, too? Based on RV570? ...or RV560? Something?