GeForce 3D Vision: Stereoscopic 3D From NVIDIA
by Derek Wilson on January 8, 2009 2:30 PM EST - Posted in GPUs
More 3D than 3D: Stereoscopic Defined
Let's start with reality: we live in a world where things occupy a finite volume of space at any given moment in time... Alright, maybe that's not a good way to explain this. Let me try again. Stuff we see in real life has some width, some height and some depth. Living in a 3D world with two eyes gives us the ability to quickly and easily judge the position and dimensions of objects. 3D video games try to approximate this by drawing a 2D image that has many of the same "depth cues" we use to judge position and shape in reality.
A 2D picture of something can help us perceive some of the depth we would have seen had we stood at the same location as the camera: stuff that's further away appears relatively smaller than the foreground. Shadows and lighting, as they fall on objects, help give us a feel for dimensions. If we were talking about video, we would also see parallax in effect, making objects closer to the viewer appear to move faster than objects further away. Experience tells us to expect certain constants in our reality, and we pick up on those to judge things that look similar to reality. Video games exploit all of these cues to help tell our brains that there is depth in that monitor. Or maybe we're looking at a video of something that was reality. Either way, something major (aside from actual depth) is missing.
Though we can judge 3 dimensions to a certain extent based on depth cues, having two eyes see objects from two slightly different positions is what really tells our brain that something has depth. The combination of these two slightly different images in our brain delivers tons of information on depth. Trying to play catch with one eye is tough. Just ask your neighborhood pirate.
Seeing two different images with your two different eyes, or rather presenting two different images of the same thing from slightly different positions, is what stereoscopic 3D is. It's right there in the word ... ya know ... stereo ... and scopic. Alright, moving on.
If you've ever tried looking at those "magic eye" pictures, you know what impact just stereoscopic info can have. For those who don't know, a magic eye image is a seemingly random looking pattern that when viewed with your eyes looking "through" the image reveals a hidden 3D picture. Though there is absolutely no other depth information in the picture, no lighting or shadows, no perspective projection, nothing but basic shapes that each eye picks up when you focus through the image, the 3D effect is pronounced and looks "deeper" than any 3D game out there.
This is not a sailboat.
Combining stereoscopic information with all the other depth cues makes for a dramatic effect when done properly. Correct rendering and presentation of left- and right-eye images, with proper 3D projection, lighting, and all the rest, simply looks real enough to touch. Viewing a game properly rendered for stereoscopic effects can range from feeling like looking at a shoebox diorama or a pop-up book to looking through a window into the next room.
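The "two cameras" idea behind that rendering can be sketched numerically. Here's a minimal Python sketch of one common approach, an off-axis (asymmetric frustum) stereo projection; the function name and every number below are illustrative assumptions, not NVIDIA's actual driver math:

```python
# Sketch of the "two cameras" idea behind stereoscopic rendering:
# each eye gets a slightly shifted camera plus an asymmetric
# ("off-axis") view frustum, so both frusta converge on the same
# screen-plane rectangle. All names and values are illustrative.

def stereo_frustum(eye, separation, convergence, half_width, near):
    """Return (left, right) near-plane bounds for one eye's frustum.

    eye: -1 for the left eye, +1 for the right eye
    separation: distance between the two eye cameras (world units)
    convergence: distance to the plane where both images line up
    half_width: half-width of the view volume at the convergence plane
    near: near clip plane distance
    """
    # Skew the frustum opposite to the eye offset so both eyes
    # converge on the same rectangle at the convergence distance.
    shift = (separation / 2.0) * (near / convergence)
    left = (-half_width * near / convergence) - eye * shift
    right = (half_width * near / convergence) - eye * shift
    return left, right

# The two eyes' frusta come out as mirror images of each other:
l = stereo_frustum(-1, 0.065, 2.0, 1.0, 0.1)  # ~6.5cm eye separation
r = stereo_frustum(+1, 0.065, 2.0, 1.0, 0.1)
```

Note that each eye's frustum is skewed toward the center rather than simply rotated ("toe-in"), which is what keeps vertical parallax out of the image pair.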
Hollywood tried stereoscopic 3D with anaglyphs (those red and blue images you need the red and blue glasses for), but it didn't really take off except as a sort of lame gimmick. Back in the late 90s and early this century, we saw the computer industry test the waters with active shutter glasses, which worked quite a bit better. Rather than displaying a single image with both eye views superimposed and requiring filtering, shutter glasses cover one eye while the entire screen displays an image rendered for the other eye. That eye is then covered while the first is uncovered to see its own full-resolution, full-color image. When done right this produces amazing effects.
There are a couple catches though. This process needs to happen super fast and super accurately. Anyone who spent (or spends) hours staring at sub-60Hz CRTs knows that slow flicker can cause problems from eye strain to migraines. So we need at least 60Hz for each eye for a passable experience. We also need to make absolutely certain that one eye doesn't see any of the image intended for the other eye. Thus, when building active shutter glasses, a lot of work needs to go into making both lenses able to turn on and off very fast and very accurately, and we need a display that can deliver 120 frames per second in order to achieve 60 for each eye.
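The refresh-rate arithmetic above is simple but worth making explicit. A minimal Python illustration of the 120Hz-display, 60Hz-per-eye budget; the function names are made up for illustration:

```python
# Back-of-the-envelope timing for active shutter glasses, assuming
# the display's refresh rate is split evenly between the two eyes.

def per_eye_rate(display_hz):
    # Each refresh shows one eye's image while the other lens is
    # shut, so each eye sees half the display's refresh rate.
    return display_hz / 2

def frame_budget_ms(display_hz):
    # The GPU must finish rendering one eye's image within a single
    # refresh interval, i.e. roughly 8.3 ms at 120Hz.
    return 1000.0 / display_hz

assert per_eye_rate(120) == 60  # the flicker-free minimum per eye
```

This is why a 120Hz-capable display is the floor for this approach: at a 100Hz refresh, each eye drops to 50Hz, back into flicker territory.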
Early shutter glasses and applications could run too slowly, delivering the effect with a side of eye strain, and getting really good results required a CRT that could handle 120Hz and glasses that could match pace. It also required an application built for stereoscopic viewing, or a sort of wrapper driver that could make the application render two alternating images every frame. Rendering an extra image per "frame" required the realtime 3D software to be very fast as well. These and other technical limitations helped keep stereoscopic 3D on the desktop from taking off.
There is still a market today for active shutter glasses and stereoscopic viewing, though there has been something of a lull between the end of CRT production and the availability of 120Hz LCD panels. And while LCDs that can accept and display a 120Hz signal are just starting to hit the market, it's still a little early for a resurgence of the technology. But for those early adopters out there, NVIDIA hopes to be the option of choice. So what's the big deal about NVIDIA's solution? Let's check it out.
54 Comments
marc1000 - Thursday, January 8, 2009 - link
I had one of these... and I had it with the 3D glasses!!! It was an 8-bit console with bad-looking games; the 3D glasses were connected to the console via a cable, and the pace of switching eyes was so slow you could see it if you paid enough attention. But it worked, and it worked with any simple TV out there. However it was only FUN, no good images in reality... it's nice to see this technology come back to life!

JonnyDough - Thursday, January 8, 2009 - link
60Hz should be the MINIMUM, not the STANDARD. Even @ 60Hz you tend to get a good bit of eye strain. I don't know how the monitor/TV industries get away with a mere 60Hz. I for one STILL get headaches. Doesn't anyone else?

ViRGE - Thursday, January 8, 2009 - link
On an LCD? No. Which is why all this talk of strain is silly; the strain was a product of the flickering in a CRT, and there's no reason anyone should be straining on an LCD.

PrinceGaz - Thursday, January 8, 2009 - link
"120Hz LCD panel" is probably enough to say where your testing went wrong and where your problems with ghosting and other issues began. You must use a display with near-instant native response, and no LCD panel to date can provide that (regardless of how much overdrive is given to nasty poor-quality but fast-response TN panels). You should have gone old-school and used a high-quality CRT at a 120Hz refresh rate, like many pro gamers still use, or if available an OLED display, as they would also cope properly with a 120Hz refresh. Hell, I've got an old 15" CRT sitting on my desk capable of 640x480 @ 120Hz which would almost certainly have done a better job of testing your 3D goggles than whichever LCD panel you used.
Ghosting would almost certainly have been a non-issue with a CRT running at 120Hz, and not having part of the other eye's image still visible in each eye's image (because of LCD response time) would almost certainly have made it look a lot better.
DerekWilson - Friday, January 9, 2009 - link
Not that kind of ghosting... it didn't have to do with the panel; everything looked fine on that end. I'm using the Samsung 5ms 120Hz 1680x1050 monitor, and the image looked smooth. After talking with NVIDIA, it seems the ghosting issues were likely from convergence being off (where the look-at points for each eye's camera are set). Convergence can be adjusted with hotkeys as well, but I didn't play with this.
Eye strain didn't appear to be from flicker either; it's more about the exercise of focusing on things that aren't there... tweaking the depth (separation) and your distance from the monitor can make a big difference here. A CRT would not have made a difference. I do have a Sony GDM-F520, but it's just not at the lab...
ssiu - Thursday, January 8, 2009 - link
Yes, you can use the NVIDIA glasses with analog CRT monitors with a 100Hz-120Hz refresh rate.

ssiu - Thursday, January 8, 2009 - link
Anyone interested in this should also check out and compare it with the competitor solution from iZ3D: http://www.iz3d.com/ The two solutions each have their pros and cons, but iZ3D is significantly cheaper (MSRP $400 versus $600 ($200 glasses + $400 120Hz monitor)). iZ3D works with both ATI and NVIDIA video cards, and ATI users get an extra $50 rebate.

simtex - Thursday, January 8, 2009 - link
This looks very promising; if NVIDIA really wants to push this rather old technology forward again, I'm sure they can do so.

OpenGL has had built-in support for the buffers you need to create stereoscopic images for years, in fact since version 1.1 if I'm not mistaken, so that is really no excuse for developers not using it.
And as for the suggestion that NVIDIA should just make a 3D monitor, what technology are you referring to here? Because as far as I know there is no technology capable of creating 3D images on a traditional flat 2D monitor.
crimson117 - Thursday, January 8, 2009 - link
I can only find one, and it's bundled with these glasses :)
http://www.tigerdirect.com/applications/searchtool...
ssiu - Thursday, January 8, 2009 - link
The other announced 120Hz monitor is the Viewsonic VX2265wm.
http://www.viewsonic.com/company/news/vs_press_rel...
http://www.viewsonic.com/products/desktop-monitors...