ATI's New High End and Mid Range: Radeon X1950 XTX & X1900 XT 256MB
by Derek Wilson on August 23, 2006 9:52 AM EST - Posted in
- GPUs
Power to the People
The major power hog of this generation is the X1900 XTX, as we have made clear in past articles. Almost disturbingly, a single X1900 XTX draws more power than a 7950 GX2, and X1900 XTX CrossFire is more power hungry than 7950 Quad SLI. While ATI already had the slightly lower-clocked X1900 XT available for those who wanted something that acted a little less like a space heater, they needed something that performed better and fit into the same (or better) power envelope to round out this generation of GPUs. What they latched onto has given graphics cards sporting the R580+ a much needed drop in power: GDDR4.
As we explained in the GDDR4 section, the optimizations made to this generation of graphics memory technology have been designed with both power savings and potential speed in mind. We've already seen how the higher speed memory pulls through in our performance tests, but how does it hold up on the power front?
For this test, we used our Kill-A-Watt to measure system power at the wall. Our load numbers are recorded as the maximum power draw during a run of 3DMark06's fill rate and pixel shader feature tests.
Apparently, JEDEC and ATI did their jobs well when deciding on the features of GDDR4 and making the decision to adopt it so quickly. Not only has ATI been able to improve performance with the X1950 XTX, but they've been able to do so using significantly less power. While the X1950 XTX is still nowhere near the envelope of the 7900 GTX, drawing the same amount of power as the X1900 XT and 7950 GX2 is a great start.
It will certainly be interesting to see what graphics makers can do with this RAM when focusing on low power implementations like silent or budget products.
74 Comments
DerekWilson - Saturday, August 26, 2006 - link
yeah ... i didn't test power with crossfire -- which is a whole lot higher. also, i have a minimal set of components to make it work -- one hdd, one cdrom drive, and no add-in cards other than graphics. we'll do multi-gpu power when we look at quad SLI
ElFenix - Thursday, August 24, 2006 - link
the review states that power consumption was measured at the wall with a kill-a-watt, during a 3Dmark run. in addition to the water cooling, it could be he's running a more efficient PSU. in a powerful system, drawing 220 watts from the power supply would draw 275 watts from the wall with an 80% efficient PSU (like a good seasonic) and 314 watts with a 70% efficient PSU. that's a pretty decent difference right there.
... still waiting for nvidia's HQ driver run...
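The wall-power figures in the comment above come from dividing the DC load by the power supply's efficiency. Here is a minimal sketch of that arithmetic in Python, using the commenter's assumed 220 W load and 80%/70% efficiency figures (illustrative numbers, not measurements from the review):

```python
# Wall (AC) draw for a given DC load at a given PSU efficiency.
def wall_draw(dc_load_watts, efficiency):
    """AC power pulled from the outlet = DC load / PSU efficiency."""
    return dc_load_watts / efficiency

dc_load = 220.0  # watts delivered by the PSU (the commenter's example figure)

for eff in (0.80, 0.70):
    print(f"{eff:.0%} efficient PSU: {wall_draw(dc_load, eff):.0f} W at the wall")
# 80% -> 275 W, 70% -> 314 W: roughly a 40 W swing at the wall for the same load
```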
poohbear - Thursday, August 24, 2006 - link
thanks

Rock Hydra - Wednesday, August 23, 2006 - link
With those competitively priced parts, hopefully nVIDIA will respond with lower prices.

CreepieDeCrapper - Wednesday, August 23, 2006 - link
I'm not familiar with 1920x1440, did you mean 1920x1200? What resolution were these tests performed at? Thank you!

JarredWalton - Wednesday, August 23, 2006 - link
1920x1440 is a standard 4:3 aspect ratio resolution used on many CRTs. It is often included because performance is fairly close to 1920x1200 performance.

CreepieDeCrapper - Wednesday, August 23, 2006 - link
Thanks, I've been using my LCD for so long I forgot about the vintage CRT res's out there ;) Plus I never ran that particular res on my CRT when I had one, so I just wasn't familiar.cgaspar - Wednesday, August 23, 2006 - link
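To put some numbers behind that exchange, here is a quick Python sketch of the pixel-count comparison (the resolutions are the ones discussed above):

```python
# Pixel counts for the two resolutions discussed above; fill-rate load scales
# roughly with pixel count, which is why they land in the same performance ballpark.
resolutions = {
    "1920x1440 (4:3, common on CRTs)": 1920 * 1440,
    "1920x1200 (16:10, common on LCDs)": 1920 * 1200,
}

for name, pixels in resolutions.items():
    print(f"{name}: {pixels / 1e6:.2f} million pixels per frame")

extra = (1920 * 1440) / (1920 * 1200) - 1
print(f"1920x1440 pushes {extra:.0%} more pixels than 1920x1200")  # 20% more
```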
While average frame rates are interesting, I _really_ care about minimum frame rates - 300fps average is useless if at a critical moment in a twitch game the frame rate drops to 10fps for 3 seconds - this is especially true in Oblivion. Of course it's possible that the minimums would be the same for all cards (if the game is CPU bound in some portion), but they might not be.

JarredWalton - Wednesday, August 23, 2006 - link
A lot of games have instantaneous minimums that are very low due to HDD accesses and such. Oblivion is a good example. Benchmarking also tends to emphasize minimum frame rates, as in regular play they occur less frequently. Basically, you run around an area for a longer period of time in actual gaming, as opposed to a 30-90 second benchmark. If there are a couple of seconds at the start of a level where frame rates are low due to the engine caching textures, that doesn't mean as much as continuous low frame rates.

More information is useful, of course, but it's important to keep things in perspective. :)
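As a small illustration of that point, here is a sketch in Python with made-up frame times: a single disk-access hitch drags the raw minimum way down, while the average and a percentile-based "low" figure barely move.

```python
# Illustrative only: ~60 seconds of steady 60 fps play with one 0.3 s hitch
# (e.g. a texture load from disk) dropped into the middle.
frame_times = [1 / 60] * 3600   # hypothetical frame times, in seconds
frame_times[1800] = 0.3         # the one-off stall

fps_sorted = sorted(1 / t for t in frame_times)

average = len(frame_times) / sum(frame_times)
minimum = fps_sorted[0]
low_1pct = fps_sorted[len(fps_sorted) // 100]  # 1st-percentile frame rate

print(f"average: {average:.1f} fps")   # barely moves (~59.7 fps)
print(f"minimum: {minimum:.1f} fps")   # 3.3 fps, dominated by the single stall
print(f"1% low:  {low_1pct:.1f} fps")  # still 60 fps
```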
kmmatney - Wednesday, August 23, 2006 - link
The charts show that the 7900GT gets a huge boost from being factory overclocked. It would be nice to see if the X1900XT 256MB can also be overclocked at all, or if there is any headroom.