BFG PhysX and the AGEIA Driver
Let us begin with the BFG PhysX card itself. Its specs are identical to those of the ASUS card we previewed. Specifically, we have:
130nm PhysX PPU with 125 million transistors
128MB GDDR3 @ 733MHz Data Rate
32-bit PCI interface
4-pin Molex power connector
The BFG card has a bonus: a blue LED behind the fan. Our BFG card came in a retail box, pictured here:
Inside the box, we find CDs, power cables, and the card itself:
As we can see here, BFG opted to go with Samsung's K4J55323QF-GC20 GDDR3 chips. There are 4 chips on the board, each of which is 4 banks of 2Mb x 32 RAM (32MB). The chips are rated at 2ns, giving a maximum clock speed of 500MHz (1GHz data rate), but current PhysX hardware runs the memory at only 366MHz (733MHz data rate). Running below the rated clock could save power and keep the card inside a lower thermal envelope, and it might also allow board makers to be more aggressive with chip timings if latency is a larger concern than bandwidth for the PhysX hardware. This is just speculation at this point, but such an approach is certainly not beyond the realm of possibility.
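As a back-of-the-envelope check on those figures, peak memory bandwidth follows directly from the data rate and bus width. The sketch below assumes the four 32-bit chips form a 128-bit aggregate bus; AGEIA hasn't published the actual bus configuration, so treat this purely as an illustration:

```python
# Rough peak-bandwidth math for the PhysX card's GDDR3, assuming the
# four 32-bit Samsung chips are ganged into a 128-bit bus (unconfirmed).

BUS_WIDTH_BITS = 32 * 4        # four chips, 32 bits each
SHIPPING_DATA_RATE_MHZ = 733   # 366MHz clock, DDR
RATED_DATA_RATE_MHZ = 1000     # what the 2ns chips could do (500MHz clock)

def bandwidth_gbps(data_rate_mhz: float, bus_bits: int) -> float:
    """Peak bandwidth in GB/s: data rate (transfers/s) times bus width in bytes."""
    return data_rate_mhz * 1e6 * (bus_bits / 8) / 1e9

print(f"As shipped: {bandwidth_gbps(SHIPPING_DATA_RATE_MHZ, BUS_WIDTH_BITS):.1f} GB/s")
print(f"At rated:   {bandwidth_gbps(RATED_DATA_RATE_MHZ, BUS_WIDTH_BITS):.1f} GB/s")
```

Under that assumption, the card leaves roughly a quarter of the memory's rated bandwidth on the table, which is what makes the power/latency trade-off plausible.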
The BFG card sells for about $300 at major online retailers, but can be found for as low as $280. The ASUS PhysX P1 Ghost Recon Edition is bundled with GRAW for about $340, while the BFG part does not come with any PhysX accelerated games. A demo of CellFactor is available for download now, which does add some value to the product, but until we see more (and much better) software support, we will have to recommend that interested buyers take a wait-and-see attitude toward this part.
As for software support, AGEIA is constantly working on their driver and pumping out newer versions. The driver interface is shown here:
There isn't much to the user side of the PhysX driver. We see an informational window, a test application, a diagnostic tool to check or reset hardware, and a help page. There are no real "options" to speak of in the traditional sense. The card itself really is designed to be plugged in and forgotten about. This does make it much easier on the end user under normal conditions.
We also tested the power draw and noise of the BFG PhysX card. Here are our results:
Noise (in dB)
Ambient (PC off): 43.4
No BFG PhysX: 50.5
BFG PhysX: 54.0
The BFG PhysX Accelerator does audibly add to the noise. Of course, the noise increase is nowhere near as bad as listening to an ATI X1900 XTX fan spin up to full speed.
Idle Power (in Watts)
No Hardware: 170
BFG PhysX: 190
Load Power without Physics Load (in Watts)
No Hardware: 324
BFG PhysX: 352
Load Power with Physics Load (in Watts)
No Hardware: 335
BFG PhysX: 300
At first glance these results can be a bit tricky to understand. The load tests were performed with our low quality Ghost Recon Advanced Warfighter physics benchmark. Our test "without Physics Load" is taken before we throw the grenade and blow up everything, while the "with Physics Load" reading is made during the explosion.
Yes, system power draw (measured at the wall with a Kill-A-Watt) decreases under load when the PhysX card is being used. This is made odder by the fact that the power draw of the system without a physics card increases during the explosion. Our explanation is quite simple: The GPU is the leading power hog when running GRAW, and it becomes starved for input while the PPU generates its data. This explanation fits in well with our observations on framerate under the games we tested: namely, triggering events which use PhysX hardware in current games results in a very brief (yet sharp) drop in framerate. With the system sending the GPU less work to do per second, less power is required to run the game as well. While we don't know the exact power draw of the PhysX card itself, it is clear from our data that it doesn't pull nearly the power that current graphics cards require.
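To make the pattern explicit, here is a quick sketch (plain Python, using only the Kill-A-Watt numbers from the table above) of the card's apparent contribution to system power in each state:

```python
# Wall-socket readings (Watts) from the table above, measured with a Kill-A-Watt.
readings = {
    "idle":                   {"no_card": 170, "with_physx": 190},
    "load, no physics event": {"no_card": 324, "with_physx": 352},
    "load, physics event":    {"no_card": 335, "with_physx": 300},
}

# Delta = (system with PhysX card) - (system without). The negative delta
# during the explosion reflects the GPU being starved for input while the
# PPU generates data, not any power saving by the card itself.
for state, r in readings.items():
    delta = r["with_physx"] - r["no_card"]
    print(f"{state:24s} delta: {delta:+d} W")
```

The +20W idle and +28W load deltas bound the card's own draw plus PSU overhead; the -35W swing during the explosion is entirely an artifact of the GPU doing less work per second.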
segagenesis - Wednesday, May 17, 2006 - link
I feel so tempted to bring up the old cliche "The message is clear..." when you word it like that :)

Really, why is there not more "WTF" here? A better analogy to what you describe is the old "Hardware Decelerators" that, say, the S3 Virge was. And for $300? Damn, next thing we know they will be sub-licensing Patty-On-Patty technology from Burger King with a dual core physics processor for only $600! *groan*
They have the right idea here but this is some of the poorest execution possible in convincing people you need this product.
Magnadoodle - Wednesday, May 17, 2006 - link
Calling this a physics decelerator seems just perfect. I wish anandtech would use some biting humour now and then. But that would mean degraded relations with Asus and BFG.

Oh well, let's just get nostalgic about the days of unconstrained journalism and reread those old 6% PC Gamer reviews.
abhaxus - Friday, May 19, 2006 - link
When I got my original Voodoo 1 card, the first thing I did was plug it in and run a few timedemos in GLQuake... surprise surprise, it was actually a few FPS slower than I was running in software mode. Of course, I was running software mode at 320x240 and GL at 640x480, and the game looked incredible.

I haven't seen a PhysX card in person but the trailers for CellFactor look very impressive. With PhysX being taken advantage of throughout the design and coding process I can't wait to see what the final results are for new games... of course, new drivers and a PCIe version will help too.
That said... I really think that this card will eventually turn out to be only for people that don't have a dual core CPU. Seems like most everything could be done by properly multithreading the physics calculations.
Nighteye2 - Wednesday, May 17, 2006 - link
It's perfectly possible to be critical while remaining polite. Biting humour is unnecessarily degrading and does not add any value. Even 6% ratings can be given in perfectly polite wording.

DerekWilson - Wednesday, May 17, 2006 - link
We certainly aren't pulling punches, and we wouldn't do anything to preserve a relationship with any company. If we make someone angry, we've still got plenty of ways to get a hold of their product.

I hope we were able to make it clear that CoV giving similar results to GRAW gave us pause about the value of PhysX when applied to games that just stick in some effects here and there. We also (I hope clearly) stressed that there isn't enough value in the product for consumers to justify a purchase at this time.
But we weren't as hard on AGEIA as we could have been, for a couple of reasons. First, CellFactor and HangarofDoom are pretty interesting demos. Their performance, and the possibilities presented by games like them, indicate that PhysX could be more useful in the future (especially with its integration into UE3 and other game engines). Second, without more tools or games we just can't determine the actual potential of this hardware. Sure, right now developers aren't making practical use of the technology, and it isn't worth its price tag. But it is very premature for us to stamp a "decelerator" label on it and close the case.
Maybe we will end up calling this thing a lemon, but we just need more hard data before we will do so.
Magnadoodle - Wednesday, May 17, 2006 - link
Yes, I understand your point of view, and I don't think you're pulling any punches or being biased. In fact, a biting review would be more biased than anything. I was just remarking that this would have made a perfect occasion to have a bit of fun with AGEIA and drag them through the dredges. I nostalgically recalled the quite biting and humorous style PC Gamer put into their 6% reviews. PC Gamer never was a pantheon of game reviewing, but they didn't have to be nice to nobody (actually to "nobodies", because they had to be nice to big corporations). My point was more about the lack of wit and style in web publications these days than about anandtech being biased. Not that anandtech has bad writers, just that it's more scientific than sarcastic.

Anyway, good review Mr. Wilson and keep up the good work.
Seer - Wednesday, May 17, 2006 - link
I'm also wondering about this claim that the driver update increased framerates. In all but two of the tests, the average FPS was either the same or lower. The largest increase was 1 FPS, well within the margin of error. (I'm talking about the GRAW tests.) So, um, yeah, no increase there.