GeForce 3D Vision: Stereoscopic 3D From NVIDIA
by Derek Wilson on January 8, 2009 2:30 PM EST - Posted in GPUs
As we've seen over the past few years, NVIDIA isn't content with simply doing well what has already been done. Certainly their graphics cards are good at what they do, and competition in the market is strong today, delivering amazing value to consumers. But NVIDIA has forged ahead with initiatives like SLI for multi-GPU rendering and CUDA for general purpose programming on the GPU. Now they're taking it a step further and getting into stereoscopic 3D.
To be fair, NVIDIA has supported stereoscopic 3D for a long time, but this is more of a push to get pervasive stereoscopic graphics into the consumer space. Not only will NVIDIA graphics cards support stereoscopic rendering; the driver will also be enhanced to extract depth information and create left- and right-eye images for applications that do not natively produce or support stereo rendering. And did we mention they'll also be selling active wireless shutter glasses?
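As a rough illustration of what a driver-level stereo layer has to do, here is a minimal sketch of one common approach: shift each vertex's clip-space x coordinate in opposite directions for the two eyes, scaled by how far the vertex sits from a chosen convergence depth. The parameter names and values below (separation, convergence) are our own illustrative assumptions, not NVIDIA's actual implementation.

# Minimal sketch of a driver-style stereo shift in clip space.
# Names and numbers are illustrative assumptions, not NVIDIA's code.

def stereo_offset(x_clip, w_clip, separation, convergence, eye):
    """Shift a clip-space x coordinate for one eye.

    eye: -1 for the left eye, +1 for the right eye.
    Vertices at the convergence depth get no offset (they appear at
    screen depth); nearer objects pop out, farther objects recede.
    """
    return x_clip + eye * separation * (w_clip - convergence)

def render_stereo_frame(scene_vertices, separation=0.03, convergence=1.0):
    """Produce a (left, right) pair of vertex lists from mono clip-space data."""
    left, right = [], []
    for (x, y, z, w) in scene_vertices:
        left.append((stereo_offset(x, w, separation, convergence, -1), y, z, w))
        right.append((stereo_offset(x, w, separation, convergence, +1), y, z, w))
    return left, right

The appeal of doing this in the driver is that a game only ever submits one (mono) scene; the stereo pair is generated behind its back, which is also why per-game profiles are needed when a title's rendering tricks don't survive the shift.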
Packaged as GeForce 3D Vision, NVIDIA's shutter glasses and transmitter pair will run consumers a respectable $200. This is more expensive than some glasses and cheaper than others. We don't actually have any other glasses in house to compare them to, but the quality, freedom of movement, and battery life are quite good. If it becomes necessary, we will do a comparison with other products, but the real advantage isn't in the hardware; it's in the driver. The package also comes with a soft bag and cleaning cloth for the glasses, alternate nose pieces, cables and converters, and a couple of discs with drivers, a stereoscopic photo viewer, and a video player.
Stereoscopic 3D shutter glasses have been around since the late 90s, but the push away from CRTs to LCDs with a fixed 60Hz refresh rate meant that high quality stereoscopic viewing on the desktop had to be put on hold (along with hopes for smaller pixel sizes, but that's a whole other rant). With Hollywood getting seriously interested in 3D movies and some display manufacturers getting on board with 120Hz monitors, TVs, and projectors, it makes sense that someone would try to push this back to the forefront.
Before we get into just how NVIDIA wants to make stereoscopic 3D on the desktop a reality, let's take a look at exactly what we're talking about.
54 Comments
jkostans - Friday, January 9, 2009 - link
So how is this different from my ELSA 3D shutter glasses from 1999? The glasses I paid $50 for back then are just as good as this $200 setup in 2009? Great job re-inventing the wheel and charging more for it, nVIDIA. There is a reason shutter glasses didn't catch on: ghosting being the worst problem, along with compatibility, loss of brightness/color accuracy, performance hits, the need for a high refresh rate, etc etc etc.
If you are thinking of buying these, don't. You will use them for a few weeks, then just toss them in a drawer due to lack of game support and super annoying ghosting.
nubie - Friday, January 9, 2009 - link
It is different because these are likely ~$400 - $500 quality glasses. Check out my setup with high resolution, no ghosting, high compatibility, minimal performance hit:
http://picasaweb.google.com/nubie07/StereoMonitorS...
http://picasaweb.google.com/nubie07/3DMonitor
Running on iZ3D of course, no need for nVidia at all, buy any card you like, and keep running XP until Microsoft releases another OS worth spending money for.
jkostans - Friday, January 9, 2009 - link
No ghosting? http://picasaweb.google.com/nubie07/3DMonitor#5060...
I can see it there, and that's not even a high contrast situation.
Shutter glasses are shutter glasses, they all suck regardless of price.
nubie - Saturday, January 10, 2009 - link
OK, have a closed mind; technology never advances. PS: that picture was taken through a linearly polarized lens, and I am holding the camera and the glasses, so they may not have been lined up.
Also, the contrast is automatically set by the camera; in person there isn't any ghosting.
Shadowdancer2009 - Friday, January 9, 2009 - link
Can they PLEASE kill this tech soon? It was 100% crap the first time, and it won't get better no matter how awesome the drivers are.
The glasses eat 50% of the brightness when "open" and don't kill 100% when "closed".
They never did, and your review says the same thing.
This was crap ten years ago, and it's crap now.
Give us dual screen highres VR goggles instead.
nubie - Friday, January 9, 2009 - link
Maybe you don't understand the technology; these are ~$400 - $500 glasses, wireless, with about a week of li-ion battery power. Don't compare them to the $10 ones you can get anywhere; at least try them for yourself.
There are much better reasons to bash nVidia, like dropping support for 90% of the displays they used to support, and making support Vista only.
gehav - Friday, January 9, 2009 - link
I'm perfectly satisfied with the current refresh rate of LCD panels (60Hz). However, what you forgot is the following: if the 3D glasses open and shut 60 times per second per eye (with a 120Hz panel), the old flicker of CRTs is effectively back. Raising the refresh rate of the monitor to 240Hz would therefore reduce the per-eye flicker to an acceptable 120Hz. The monitor itself isn't the culprit here; the 3D glasses reintroduce flickering like in the old days of CRTs (and they are directly dependent on the refresh rate of the monitor).
Georg
gehav - Friday, January 9, 2009 - link
btw: 200Hz displays are already on the way, it seems: http://www.engadget.com/2008/09/02/sony-samsung-bo...
gehav - Friday, January 9, 2009 - link
Just a thought I had while reading the article: wouldn't a ray-traced image work far better for stereoscopic viewing? From what I understand, the rasterizing technique used by today's graphics cards uses all kinds of tricks and effects to create the perception of a "real 3D world". That's why the drivers have to be customized for every game.
Ray tracing uses a far simpler algorithm to get good results. Every light ray is calculated separately, so every game that uses ray tracing should, in principle, be easy to adapt for stereoscopic viewing.
I'm thinking of the announced Intel Larrabee, which may offer ray tracing acceleration for games and could therefore be much better suited for stereoscopic viewing.
Not sure if I'm right with these thoughts but it would be interesting to see if games that are already available in a ray tracing version (like Quake 4) could be easily adapted to support stereoscopic viewing and what the result would look like.
Apart from that, I also think we would need faster LCD panels (240Hz) to get non-flickering pictures for each eye.
Georg
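For illustration, a minimal sketch of the idea raised above: in a ray tracer, the second eye view is simply a second ray origin, offset by the interocular distance, tracing the same scene. The function names and the fixed 65mm eye separation below are assumptions for the example, not taken from any shipping renderer.

# Sketch: stereo in a ray tracer is just two primary-ray origins.
# Names and the default eye separation are illustrative assumptions.
import numpy as np

def primary_ray(eye_pos, pixel_pos):
    """Return (origin, normalized direction) for a primary ray."""
    direction = pixel_pos - eye_pos
    return eye_pos, direction / np.linalg.norm(direction)

def stereo_rays(pixel_pos, center_eye, eye_separation=0.065):
    """Generate the left/right primary rays for one pixel on the image plane."""
    offset = np.array([eye_separation / 2.0, 0.0, 0.0])
    left = primary_ray(center_eye - offset, pixel_pos)
    right = primary_ray(center_eye + offset, pixel_pos)
    return left, right

# Example: rays toward a pixel straight ahead of a camera at the origin.
left_ray, right_ray = stereo_rays(np.array([0.0, 0.0, -1.0]),
                                  np.array([0.0, 0.0, 0.0]))

Because the renderer already traces arbitrary rays, no per-game depth extraction tricks are needed; the cost is simply rendering the scene twice, once per eye.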
nubie - Friday, January 9, 2009 - link
Check out some of the other initiatives, notably iZ3D, who have offered a free driver for all AMD products and XP support (double check the nVidia support for XP, non-existent much?). nVidia's idea is too little, too expensive, too late. I have built my own dual-polarized passive rig that works great with $3 glasses; unfortunately nVidia has dropped all support (the last supported card is from the 7 series, so "gaming" isn't really an option.)
Thankfully iZ3D has stepped up to provide drivers, but thanks to nVidia's lack of support I have lost so much money on unsupported 8 series hardware that I haven't looked at a game in a couple years.
nVidia has killed my will to game. Dropping support for 3D is not the way to sell 3D (do some research: nVidia has dropped XP, supports only Vista, and doesn't even support the cool displays you can cobble together yourself for less than the $200 this stupid package costs).
My proof of concept, before nvidia pulled the plug:
http://picasaweb.google.com/nubie07/3DMonitor#
My gaming rig, before nvidia dropped support for ~3 years:
http://picasaweb.google.com/nubie07/StereoMonitorS...
nVidia needs to do better than this, and they should know better.