1.00 vs. 1.02 - Does it Matter?
If you read the previous pages, you can probably already guess the answer to this question. The 1.02 patch fixes a few minor errors, and it also removes DirectX 10.1 support. ATI HD 3000 series hardware is the only current graphics solution that supports DX10.1, so barring other changes there shouldn't be a performance difference on NVIDIA hardware. We tested at three settings for this particular scenario: Medium Quality, High Quality, and High Quality with 4xAA.
NVIDIA performance is more or less identical between the two game versions, but we see quite a few changes on the ATI side of things. It's interesting that overall performance appears to improve slightly on ATI hardware with the updated version of the game, outside of anti-aliasing performance.
This is where the waters get a little murky. Why exactly would Ubisoft remove DirectX 10.1 support? There seems to be an implication that it didn't work properly on certain hardware -- presumably lower-end ATI hardware -- but that looks like a pretty weak reason to remove the feature entirely. After all, as far as we can tell it only affects anti-aliasing performance, and it's extremely doubtful that anyone with lower-end hardware would be enabling anti-aliasing in the first place. We did notice a few rendering anomalies with version 1.00, but nothing that should have warranted the complete removal of DirectX 10.1 support. Look at the following image gallery to see a few of the "problems" that cropped up.
In one case, there's an edge that doesn't get anti-aliased on any hardware except ATI HD 3000 with version 1.00 of the game. There may be other edges that also fall into this category, but if so we didn't spot them. The other issue is that periodically ATI hardware experiences a glitch where the "bloom/glare" effect goes crazy. This is clearly a rendering error, but it's not something you encounter regularly in the game. In fact, this error only seems to occur after you first load the game and before you lock onto any targets or use Altaïr's special "eagle vision" -- or one of any number of other graphical effects. In our experience, once any of these other effects have occurred you will no longer see this "glaring" error. In fact, I finished playing AC and never noticed this error; I only discovered it during my benchmarking sessions.
So why did Ubisoft remove DirectX 10.1 support? The official statement reads as follows: "The performance gains seen by players who are currently playing AC with a DX10.1 graphics card are in large part due to the fact that our implementation removes a render pass during post-effect which is costly." An additional render pass is certainly costly; what the above statement doesn't clearly say is that DirectX 10.1 allows the game to skip one rendering pass when anti-aliasing is enabled, and that is a good thing. We contacted AMD/ATI, NVIDIA, and Ubisoft to see if we could get some more clarification on what's going on. Not surprisingly, ATI was the only company willing to talk with us, and even they wouldn't come right out and say exactly what occurred.
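For those wondering how a render pass simply disappears: the commonly cited explanation is that Direct3D 10.1 lets a renderer read a multisampled depth buffer directly as a shader resource during post-processing, whereas a Direct3D 10.0 path has to regenerate (or resolve) depth separately before the post-effects run. We don't have access to Ubisoft's engine, so the following is only a rough sketch of that capability using the public Direct3D 10 API; the helper function name, resource formats, and 4x sample count are our own assumptions for illustration, not Ubisoft's code.

```cpp
// Sketch: create one MSAA depth buffer that the post-process pass can read
// directly. Formats, sizes, and flags here are illustrative assumptions.
#include <d3d10_1.h>

HRESULT CreateReadableMsaaDepth(ID3D10Device1* device,
                                UINT width, UINT height,
                                ID3D10Texture2D** tex,
                                ID3D10DepthStencilView** dsv,
                                ID3D10ShaderResourceView** srv)
{
    // Typeless surface so it can be viewed both as a depth target and as a
    // shader resource; 4x MSAA to match the game's 4xAA setting.
    D3D10_TEXTURE2D_DESC desc = {};
    desc.Width            = width;
    desc.Height           = height;
    desc.MipLevels        = 1;
    desc.ArraySize        = 1;
    desc.Format           = DXGI_FORMAT_R24G8_TYPELESS;
    desc.SampleDesc.Count = 4;
    desc.Usage            = D3D10_USAGE_DEFAULT;
    desc.BindFlags        = D3D10_BIND_DEPTH_STENCIL | D3D10_BIND_SHADER_RESOURCE;
    HRESULT hr = device->CreateTexture2D(&desc, NULL, tex);
    if (FAILED(hr)) return hr;

    // Depth-stencil view used while rendering the scene.
    D3D10_DEPTH_STENCIL_VIEW_DESC dsvDesc = {};
    dsvDesc.Format        = DXGI_FORMAT_D24_UNORM_S8_UINT;
    dsvDesc.ViewDimension = D3D10_DSV_DIMENSION_TEXTURE2DMS;
    hr = device->CreateDepthStencilView(*tex, &dsvDesc, dsv);
    if (FAILED(hr)) return hr;

    // Shader resource view used by the post-process shader. Reading a
    // multisampled depth surface this way is what requires 10.1-class
    // support; a 10.0 path typically has to produce depth again in an
    // extra render pass before post-processing.
    D3D10_SHADER_RESOURCE_VIEW_DESC srvDesc = {};
    srvDesc.Format        = DXGI_FORMAT_R24_UNORM_X8_TYPELESS;
    srvDesc.ViewDimension = D3D10_SRV_DIMENSION_TEXTURE2DMS;
    return device->CreateShaderResourceView(*tex, &srvDesc, srv);
}
```

The post-process shader would then read individual depth samples from the bound multisampled resource instead of sampling a depth texture produced by an extra pass, which lines up with the "removes a render pass during post-effect" wording in Ubisoft's statement.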
Reading between the lines, it seems clear that NVIDIA and Ubisoft reached some sort of agreement where DirectX 10.1 support was pulled with the patch. ATI obviously can't come out and rip on Ubisoft for this decision, because they need to maintain their business relationship. We on the other hand have no such qualms. Money might not have changed hands directly, but as part of NVIDIA's "The Way It's Meant to Be Played" program, it's a safe bet that NVIDIA wasn't happy about seeing DirectX 10.1 support in the game -- particularly when that support caused ATI's hardware to significantly outperform NVIDIA's hardware in certain situations.
Last October at NVIDIA's Editors Day, we had the "opportunity" to hear from several gaming industry professionals about how unimportant DirectX 10.1 was, and how most companies weren't even considering supporting it. Amazingly, even Microsoft was willing to go on stage and state that DirectX 10.1 was only a minor update and not something to worry about. NVIDIA clearly has reasons for supporting that stance, as their current hardware -- and supposedly even their upcoming hardware -- will continue to support only the DirectX 10.0 feature set.
NVIDIA is within their rights to make such a decision, and software developers are likewise entitled to decide whether or not they want to support DirectX 10.1. What we don't like is when other factors stand in the way of using technology, and that seems to be the case here. Ubisoft needs to show that they are not being pressured by NVIDIA into removing DX10.1 support, and frankly the only way they can do that is to put the support back in a future patch. It was there once, and it worked well as far as we could determine; bring it back (and let us anti-alias at higher resolutions).
32 Comments
bill3 - Monday, June 2, 2008 - link
Actually it's terrible, I can't read the graphs AT ALL. Seriously, my eyes just glazed over those terrible charts... completely unreadable. I still have no idea what I'm looking at. Is ATI supposed to be faster in this game? Why did they test with version 1.00 on ATI and 1.02 on Nvidia? I don't know, because the graphs are totally useless.
Nihility - Monday, June 2, 2008 - link
I second that. The graphs are terrible. Maybe bar graphs would have been better? Sometimes when you're the one making the graph it's hard to imagine what other people are seeing when they look at them. I suggest having another pair of eyes check the graphs out for readability.
Besides that, I loved the review. Especially the performance part and the 10.1 controversy.
JarredWalton - Tuesday, June 3, 2008 - link
Charts are colored with similar colors used either for ATI vs. NVIDIA, 1.00 vs. 1.02, or dual-GPU vs. single-GPU. I could have generated four times as many graphs to show the same data, but I figure most people are capable of reading the labels on a chart and figuring out what they mean. Here's a hint: when you can't see the difference between two lines because they overlap, it's a tie. If you want to give specific examples and recommendations on what would look better and still convey the same amount of information, I'm all ears. However, simply stating that "the graphs are terrible" does little to help. Tell me what graph specifically is terrible, and tell me why it's terrible.
As an example of why I used these graphs, page 9 has two charts showing 40 total data points. You can get a clear idea of how performance scales with single or dual GPUs at the various detail settings looking at a single chart. Green is NVIDIA, Red is ATI. That makes a lot of sense to me. Creating ten different bar charts with four lines in each to show the same data makes it more difficult to compare how Medium graphics compares to High graphics performance, and it takes up five times as much space to tell the same "story".
Page 6 is the same thing, but with green used for dual-GPUs (light and dark for 1.00 and 1.02) and red for single GPUs. 24 data points in two charts instead of using six charts. Having established that 1.00 doesn't perform any different than 1.02 on NVIDIA hardware, I skipped the 1.00 NVIDIA numbers to make those charts easier to read on page 7. Then I put in the four standard test system (0xAA and 4xAA, ATI and NVIDIA) on 1.02, with 1.00 4xAA ATI in blue as a reference.
Lastly, on page 8 I have two clock speeds on NVIDIA, three on ATI, with different base colors for single and dual GPUs. ATI and NVIDIA are in separate charts, and brighter colors are for a higher overclock.
There's method to my graphing madness. Are the charts immediately clear to a casual glance? No, but then that's really difficult to do while still conveying all of the information. I spent a lot of time trying to make comprehensible charts, and settled on these as the best option I could come up with. Again, if they're so bad, it must be easy to generate something clearly better - have at it, and I'll be happy to use any sensible suggestions. However, if the only complaint is that you actually have to look at the charts and think for a minute before you understand, I'm not likely to be very sympathetic. I think our readers are smart enough to digest these graphs.
mpjesse - Monday, June 2, 2008 - link
While I appreciate the detailed review, isn't it a little irrelevant now? I mean, the game's been out for nearly 2 months now and it's been reviewed everywhere. The only thing new about this review are the performance benchmarks, in which case I would have made the review solely about performance instead of gameplay. Just my 2 cents.
ImmortalZ - Monday, June 2, 2008 - link
It's sad that the companies with money always manage to suppress innovation. I hope this article by AT will raise some ruckus in the collective Interwebs and cause something. But I doubt it.
ViRGE - Monday, June 2, 2008 - link
For what it's worth, another forum I read had some screenshots comparing DX10 and DX10.1. The problems the poster had managed to find involved trees; there was some kind of post-processing rendering going on with trees that wasn't occurring with DX10.1, which made them look weird. Not fixing 10.1 may be an NVIDIA thing, but there was definitely a problem with it as-is.
tuteja1986 - Monday, June 2, 2008 - link
Well, where the hell is Nvidia's DX10.1 support if DX10.1 actually brings some kind of performance improvement in AA? Why doesn't the GT200 series have DX10.1?
I thought PC gaming was all about being on the cutting edge on all technology fronts...
Anyways, this is not the 1st time Ubisoft or Nvidia have done this.
wyemarn - Monday, June 2, 2008 - link
Maybe because Nvidia GPUs can't support AA through shaders, so there's no use supporting DX10.1. ATI GPUs have 320 stream processors they can utilize for shaders and such. Nvidia cards have fewer SPs but more ROPs and TMUs, which translates to more brute power if games don't use shaders or SPs much. Technology-wise, I think ATI is ahead, but NVIDIA GPUs have game developer support and more raw horsepower, so performance-wise NVIDIA is ahead, and I think this trend will continue with the GTX200 series. I chose G92 over RV670 because the raw performance is much better even though on paper the HD 3800 series looks great.
SteelSix - Monday, June 2, 2008 - link
Worthy of a thread in Video. I just started one.
Gannon - Monday, June 2, 2008 - link
The original Halo had performance issues but they weren't alarming; Halo was actually not too bad a port compared to many other console-to-PC disasters, and Halo 1 got better as hardware advanced. Halo 2 on the other hand is just all around atrocious. Halo 2 was just not a very well made game, period, despite the addition of cutscenes, etc. Halo 1 had a much better feel and better vehicle design IMHO; I hated how the Warthog looked in Halo 2, it annoyed me to no end.