Coming Soon to HD DVD: Silicon Optix HD HQV
by Derek Wilson on February 8, 2007 1:25 AM EST - Posted in
- GPUs
HD HQV Performance
For these tests, we will be using a beta version of PowerDVD HD 6.5 with hardware acceleration enabled to play back the HD HQV HD DVD. This works much the same as it did with the standard definition HQV test, which is a collection of clips played back from a DVD. With hardware acceleration enabled, the software DVD, HD DVD, or Blu-ray player uses the graphics hardware to offload the decode process from the CPU. The byproduct is that image quality is influenced more by the hardware being used than by the software player.
For AMD we are using Catalyst 7.1, and for NVIDIA we are using ForceWare 93.71 for the 7 series and 97.92 for the 8 series. Our results on NVIDIA's 7 and 8 series cards are identical, so we won't distinguish between the two when discussing NVIDIA results. This comparison is as much between AVIVO and PowerDVD HD as it is between the hardware at this point. Both AMD and NVIDIA should have some headroom to improve performance through future drivers.
The maximum score for HD HQV is 105, with 80 of those points having to do with proper deinterlacing of interlaced HD sources. All broadcast HD sources in the US today are interlaced, and there are many HD DVD movies provided in 1080i as well. Fewer Blu-ray titles are 1080i, but they aren't impossible to find. While our HD DVD version of HQV obviously tests HD DVD players, these features will be important for the playback of any interlaced HD source. Because of this, we really expected NVIDIA and AMD to perform well.
Unfortunately, reality did not live up to our expectations. We'll break it down by test. While we would love to provide screenshots, our version of PowerDVD doesn't support screenshots, and taking pictures of the TV just doesn't provide the detail we need. Descriptions of what's going on will have to do for now.
Noise Reduction
Both AMD and NVIDIA score a flat zero on this test. None of the AMD or NVIDIA cards we tested performed any noise reduction on either the flowers or the boat scene. There weren't any artifacts present, but it is very clear that neither camp performs any noise reduction on HD video at this point.
Video Resolution Loss
AMD averages fields and thus loses detail. The result is a gray color filling the corner blocks rather than alternating fine lines. NVIDIA doubles the scanlines in one field and discards the other half of the data, so the corner blocks appear as solid colors. Both solutions fail, just in different ways. PowerDVD's software path performs similarly to AMD's hardware, which means that currently available computer hardware and software will not faithfully reproduce interlaced video.
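To make the two failure modes concrete, here is a minimal Python/NumPy sketch (our own illustration; neither vendor has disclosed its actual algorithm) of field blending versus bob deinterlacing applied to a pattern like the HQV corner blocks:

```python
import numpy as np

# The HQV corner blocks alternate one-pixel black/white lines, so the top
# field is all white scanlines and the bottom field is all black scanlines.
top_field = np.full((4, 8), 255, dtype=np.uint8)
bottom_field = np.zeros((4, 8), dtype=np.uint8)

def blend_fields(top, bottom):
    """Field averaging (consistent with the output we see from AMD):
    fine alternating lines cancel into flat mid-gray, destroying detail."""
    return ((top.astype(np.float32) + bottom) / 2).astype(np.uint8)

def bob_field(field):
    """Line-doubling a single field (consistent with the output we see
    from NVIDIA): half the vertical resolution is simply thrown away."""
    return np.repeat(field, 2, axis=0)

print(blend_fields(top_field, bottom_field)[0, 0])  # 127 -> uniform gray
print(bob_field(top_field)[:2, 0])                  # [255 255] -> solid color
```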
Jaggies
Once again, AMD and NVIDIA both fail to eliminate diagonal aliasing. This is another example of the poor deinterlacing provided by computer hardware and current drivers. Eliminating jaggies is a major way to improve the visual experience of watching interlaced video on a progressive display like a 720p or 1080p HDTV or a computer monitor.
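Jaggies appear when missing lines are interpolated straight down with no regard for edge direction. Edge-adaptive deinterlacers interpolate along the edge instead; the sketch below shows edge-based line averaging (ELA), a textbook technique used purely for illustration, not anything AMD or NVIDIA is known to ship:

```python
import numpy as np

def ela_line(above, below):
    """Edge-based line averaging (ELA): for each missing pixel, compare
    the left-diagonal, vertical, and right-diagonal neighbor pairs from
    the lines above and below the gap, then interpolate along whichever
    direction matches best. Following edges this way is what removes
    stair-stepping on diagonals."""
    w = len(above)
    out = np.empty(w, dtype=np.float32)
    for x in range(w):
        xl, xr = max(x - 1, 0), min(x + 1, w - 1)
        candidates = [
            (abs(int(above[xl]) - int(below[xr])), (int(above[xl]) + int(below[xr])) / 2),
            (abs(int(above[x]) - int(below[x])), (int(above[x]) + int(below[x])) / 2),
            (abs(int(above[xr]) - int(below[xl])), (int(above[xr]) + int(below[xl])) / 2),
        ]
        out[x] = min(candidates)[1]  # value along the best-matching direction
    return out
```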
Film Resolution Loss
As in the video resolution loss test, both NVIDIA and AMD fail here, and for the same reasons: rather than performing inverse telecine, both treat 1080i created from a film source the same way they would treat video. For AMD this means averaging fields, and for NVIDIA this means eliminating half the fields.
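Inverse telecine itself is conceptually simple once the cadence is recognized: 3:2 pulldown spreads 24 fps film frames across 60 Hz fields by alternately emitting two and three fields per frame, and undoing it just means re-pairing fields from the same frame. A hypothetical sketch of the round trip, with fields tagged by their source frame:

```python
def telecine_32(frames):
    """3:2 pulldown: emit 2, then 3, fields per 24 fps film frame to reach
    60 fields per second. Fields are (source_frame, parity) tags here so
    the example stays readable."""
    fields = []
    for i, frame in enumerate(frames):
        top, bottom = (frame, 'top'), (frame, 'bottom')
        fields += [top, bottom] if i % 2 == 0 else [top, bottom, top]
    return fields

def inverse_telecine(fields):
    """Re-pair fields from the same film frame and drop the repeats,
    recovering progressive frames with no resolution loss. Blending or
    bobbing this material instead is exactly the failure described above."""
    recovered, seen = [], set()
    for frame, _ in fields:
        if frame not in seen:
            seen.add(frame)
            recovered.append(frame)
    return recovered

print(telecine_32(['A', 'B']))  # 5 fields from 2 frames: A, A, B, B, B
print(inverse_telecine(telecine_32(['A', 'B', 'C', 'D'])))  # ['A', 'B', 'C', 'D']
```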
Film Resolution Loss - Stadium Test
When playing on AMD hardware, flickering is apparent in the stadium. While NVIDIA hardware doesn't flicker, a moiré pattern is apparent in the stands. Both of these fail to pass the test and demonstrate different issues that can appear when film is poorly deinterlaced.
The overall result?
AMD: 0
NVIDIA: 0
Comments
ShizNet - Thursday, February 8, 2007
I agree with the last dude - if we are talking about PC hardware/software mixed with consumer electronics [40"+ LCD I guess], why not add this guy (http://usa.denon.com/ProductDetails/623.asp) or similar to the mix? And see: should people put more money into the video card/CPU [for best 1080p] or save for a receiver/DVD player in their HTPC? Otherwise - great that you guys are getting down and dirty to address some issues and breaking the ice for the rest of us, before we spend all that $$$ and get middle-of-the-road performance.
Visual - Thursday, February 8, 2007
I don't even understand exactly what you guys just tested... was this just some test disc played with a software player? Why didn't you start the article with more information about the test? What was the system's configuration?
What codec is used for the content, and does it have the proper flags and information needed for correct deinterlacing?
Which player app and decoders did you use, etc.?
If there were flaws in the playback, isn't it the software's fault, not the hardware's? If there were differences on ATI/NVIDIA hardware, isn't it because the software used their built-in capabilities improperly and in different ways? Surely there can be player software that handles deinterlacing perfectly without even using any hardware acceleration...
With a digital source like an HD DVD/Blu-ray disc, I don't think these kinds of tests can even apply. Noise reduction, wtf? We're talking about digital storage, not audio tapes, after all. Noise can't just appear with age. If there is "noise" on the source, it was probably put there on purpose - not real "noise" but something that was meant to be there. Why should the playback system remove it?
Resolution loss and jaggies are deinterlacing issues, and that just pisses me off. Why oh why should anyone be bothered with deinterlacing in this day and age?
You say "Interlaced media is available on both HD DVD and Blu-ray", but from what I've heard, the majority (if not all) of HD DVD and Blu-ray content is currently stored as 1080p on the discs. Who would be dumb enough to produce interlaced HD content, and why?
DerekWilson - Thursday, February 8, 2007
I've updated page 3 of the article with information on the HD DVD player used and the drivers used for AMD and NVIDIA cards. The software player enables hardware acceleration, which lets AMD and NVIDIA hardware handle much of the decode and deinterlacing of the HD content. This is a test of the hardware and drivers provided by AMD and NVIDIA.
Codec doesn't matter and proper flags don't matter -- a good deinterlacing algorithm should detect the type of content being played. In fact, AMD and NVIDIA both do this for standard definition content.
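To illustrate: under a 3:2 cadence, one field in every five is a near-exact repeat of the field two positions earlier, so a player can spot film content just by watching same-parity field differences, no flags required. A hypothetical sketch, assuming fields arrive as NumPy luma arrays:

```python
import numpy as np

def looks_like_film(fields, repeat_thresh=1.0):
    """Flag-free cadence detection sketch: diff each field against the
    previous same-parity field. Under 3:2 pulldown, near-identical pairs
    recur every 5 fields; pure video shows no such periodic repeats."""
    diffs = [float(np.mean(np.abs(fields[i].astype(np.float32) - fields[i - 2])))
             for i in range(2, len(fields))]
    repeats = [i for i, d in enumerate(diffs) if d < repeat_thresh]
    # Film content if the repeats line up on a regular 5-field cycle.
    return len(repeats) >= 2 and all((b - a) % 5 == 0 for a, b in zip(repeats, repeats[1:]))
```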
It might be possible for software HD DVD and Blu-ray players to handle proper deinterlacing, but most software DVD players don't even do it as effectively as possible. There are no HD DVD or Blu-ray players that we know of that support the type of adaptive deinterlacing necessary to pass the HD HQV test.
I do apologize if I didn't explain noise well enough.
The problem comes in the transfer of a movie from film to digital media. CCDs used to pick up light shining through film will absolutely introduce noise, especially in large blocks of similar color like sky. Even digital HD cameras don't have an infinite color space and will have problems with noise in similar situations due to small fluctuations in the exact digital color at each pixel for each frame.
This type of noise can be reduced by post processing, but studios usually do not do this. All you need to do is watch X-Men 3 on Blu-ray to see that noise is a huge problem.
In addition, encoding and compression introduce noise. This noise can't be removed except in the decode process.
Noise is a major issue in HD content, and while much of it could be fixed with post processing, it looks horrible at high resolution.
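The kind of noise reduction the test rewards is mostly temporal: grain fluctuates from frame to frame while the underlying image does not, so blending static pixels across frames averages it away. A minimal recursive sketch (our illustration, assuming 8-bit luma frames):

```python
import numpy as np

def temporal_nr(prev_out, cur, strength=0.5, motion_thresh=12):
    """Recursive temporal noise reduction sketch: blend each pixel with the
    previously filtered frame only where the two frames agree (no motion),
    so grain in flat areas like sky averages out without ghosting on
    moving objects."""
    cur_f = cur.astype(np.float32)
    prev_f = prev_out.astype(np.float32)
    static = np.abs(cur_f - prev_f) < motion_thresh         # per-pixel motion mask
    blended = (1.0 - strength) * cur_f + strength * prev_f  # average away grain
    return np.where(static, blended, cur_f).astype(cur.dtype)
```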
As for interlacing, most movies will definitely be progressive. But there are some that are 1080i and will need good deinterlacing support.
The big issue, as has been pointed out elsewhere in the comments, is TV. 1080i is the standard here.
In fact, when stations start distributing series on HD DVD and Blu-ray, it is very likely we will see them in interlaced format. Most of my DVD collection consists of TV series, so I consider deinterlacing an important step in HD video playback.
As much as I dislike interlaced content in general, it is unfortunately here to stay.
RamarC - Friday, February 9, 2007
Just because a TV program is broadcast in 1080i in no way means that's the format it was captured/mastered in. "24p" is the current standard for mastering most network programming, and it can result in 720p, 1080i, or 1080p content.
http://www.digital-digest.com/highdefdvd/faq.html#...
An interview with Microsoft in Audioholics magazine in January 2006 indicated that HD DVD movies will be stored in 1080p format like BD, even if initial players can only output 1080i.
Interlaced HD DVD/Blu-ray content will be a rarity, and the performance of playback software with that content is a trivial issue.
ianken - Friday, February 9, 2007
" There are no HD DVD or Blu-ray players that we know of that support the type of adaptive deinterlacing necessary to pass the HD HQV test. "Becuase they don't need it as the content is 1080p.
Silicon Optix is in the business to sell video processing chips. Their benchmark is designed to get people to look for players with their hardware.
For properly authored discs, NR and adaptive deinterlacing are wasted.
The thing I like about the HQV discs is that sites like this use them, which motivates ATI and NVIDIA to pass them, and that gets folks a better 1080i broadcast experience. It's in the realm of poorly encoded broadcast HDTV that this stuff is important.
IMHO.
autoboy - Thursday, February 8, 2007
Sorry about being a huge pain in the ass. I really do like reading your articles about video processing, and they are always quite good. For me, though, there is always something that seems to be missing. I just found this quote from the head of the multimedia division at NVIDIA:
FiringSquad: PureVideo seems to do more than regular bob deinterlacing when tested with the HQV Benchmark DVD. Can you give us any more details on what's being done?
Scott Vouri: Yes, we do much more than regular ‘bob’ deinterlacing, but unfortunately we can’t disclose the algorithms behind our de-interlacing technology. I do want to point out that HQV doesn’t even test one of the best things about our spatial-temporal de-interlacing – the fact that we do it on 1080i HD content, which is quite computationally intensive.
So it appears that they at least do adaptive deinterlacing, which means they do what they say, which means they should do inverse telecine and 3:2 pulldown correctly as well. I just can't help but think there is something missing from your setup. They should score better than a 0. Is the HQV benchmark copy protected? Can it be played on regular MPEG-2 decoders? Is the PowerDVD hardware acceleration broken?
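For context, "spatial-temporal de-interlacing" generally means weaving fields together where nothing moves (keeping full vertical detail) and falling back to spatial interpolation where motion would cause combing. A hypothetical sketch of the idea - not NVIDIA's actual algorithm, which they decline to disclose:

```python
import numpy as np

def motion_adaptive_deinterlace(cur_field, other_field, prev_other_field, thresh=10):
    """Spatial-temporal (motion-adaptive) deinterlacing sketch.
    cur_field: the lines we have for this output frame.
    other_field: the adjacent opposite-parity field that fills the gaps.
    prev_other_field: the previous field of that same parity, used to
    detect motion at the missing line positions."""
    motion = np.abs(other_field.astype(np.float32) - prev_other_field) > thresh
    # Spatial fallback: average the lines above and below each gap.
    spatial = (cur_field.astype(np.float32) + np.roll(cur_field, -1, axis=0)) / 2
    # Weave (temporal) where static, interpolate (spatial) where moving.
    missing = np.where(motion, spatial, other_field).astype(cur_field.dtype)
    h, w = cur_field.shape
    frame = np.empty((2 * h, w), dtype=cur_field.dtype)
    frame[0::2] = cur_field   # known scanlines
    frame[1::2] = missing     # reconstructed scanlines
    return frame
```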
autoboy - Thursday, February 8, 2007
So the codec doesn't matter for deinterlacing? The decoder decodes the video into a sort of raw format, and then the video card takes over the deinterlacing? Hmm. I didn't know that. I was under the impression that the codec was the most important part of the equation. Why is interlaced video such a mystery to most of us? I have been trying to fully understand it for 6 months, and I find out that I still don't know anything. I just want proper deinterlacing. Is that too much to ask? Is it really that hard to get good video playback on a PC with interlaced material? Come on...