How do I output 3D images to my 3D TV?
I have a 3D TV and feel that I would be shirking my responsibilities (as a geek) if I didn't at least try to make it display pretty 3D images of my own creation!

I've done a very basic amount of OpenGL programming before, so I understand the concepts involved. Assume that I can render a simple tetrahedron or cube and make it spin around a bit; how can I get my 3D TV to display this image in, well, 3D?

Note that I understand the basics of how 3D works (render the same image twice from 2 different angles, one for each eye); my question is about the logistics of actually doing this (do I need an SDK? etc.)

  • The TV I have uses polarized (passive) 3D, although my intention is that this question also be relevant to other 3D technologies (if possible)
  • My laptop has an HDMI output, which is what I intend to use to connect to my TV (does this make any difference compared to using a VGA / component video cable?)
  • In the past I have experimented with GLUT / OpenGL, however if it's easier / only really possible to do this using some alternative technology, then that's fine
Booster asked 26/7, 2011 at 9:9 Comment(2)
Possible duplicate of How can I output a HDMI 1.4a-compatible stereoscopic signal from an OpenGL application to a 3DTV? – Hetaera
I'm unaware of any TV that uses polarized or passive 3D. There is no technical way to control light at common resolutions found in TVs. The entire industry of direct LEDs is laboring to create this today. Please share your TV make and model. – Reformism
The main problem is getting your GPU to output a stereoscopic format. In the case of an HDMI connection this will not work without the help of a driver. If you have a professional-grade GPU (Quadro, FireGL), then it likely supports OpenGL quad buffers, i.e. you get framebuffers for the left and right eye, both back and front:

glDrawBuffer(GL_BACK_LEFT);
render_left_eye();

glDrawBuffer(GL_BACK_RIGHT);
render_right_eye();

glDrawBuffer(GL_BACK); // renders to both eyes simultaneously
render_screen_level_and_nonstereoscopic();

SwapBuffers();

Unfortunately, OpenGL quad-buffer stereo is considered professional-grade stuff.
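
For illustration, here is a minimal sketch of how one might ask for such a quad-buffered visual with GLUT and check whether the driver granted it (assuming a GLUT implementation such as freeglut; on consumer-grade GPUs the GLUT_STEREO request is usually refused):

/* Request a quad-buffered (stereo) visual and check whether the driver
   actually granted it. Rendering code is omitted. */
#include <stdio.h>
#include <stdlib.h>
#include <GL/glut.h>

int main(int argc, char **argv)
{
    GLboolean stereo = GL_FALSE;

    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH | GLUT_STEREO);

    if (!glutGet(GLUT_DISPLAY_MODE_POSSIBLE)) {
        fprintf(stderr, "No quad-buffered stereo visual available.\n");
        return EXIT_FAILURE;
    }

    glutCreateWindow("quad-buffer stereo test");

    glGetBooleanv(GL_STEREO, &stereo);
    printf("GL_STEREO: %s\n", stereo ? "yes" : "no");

    return EXIT_SUCCESS;
}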

Instead, NVidia (at least) provides a stereoscopy library in its consumer-grade drivers, plus some extensions to control it. The main reasoning is that shared fragments are to be rendered only once and then sent to both eyes with the appropriate parallax applied. However, from my semi-professional experience with stereoscopy¹, these kinds of semi-/automatic stereoscopifications just don't cut it. Stereoscopy requires tight control of the whole "production" pipeline; otherwise you're screwed. With Elephants Dream I went as far as modifying the renderer's core code.

I sent the people at the 3D division at NVidia some scenarios where you need exact control over the stereoscopy process, and I hope they will see the light and also give access to quad-buffer stereo on consumer-grade hardware.

Note that I understand the basics of how 3D works (render the same image twice from 2 different angles, one for each eye)

Actually, you don't render from two different angles but from two laterally shifted viewpoints with a lens shift. Otherwise you get trapezoidal/keystone distortion in the horizontal, which is very, very unpleasant to watch. (In fact, I now think that in the stereoscopic rendering process one should slightly diverge the optical axes – i.e. do the complete contrary of what one would naively do – and "over"compensate with lens shift; I'm currently preparing a small study about this, but still need to gather my test and control groups.)
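
To make the "lens shift" concrete, here is a rough sketch of the standard off-axis (asymmetric) frustum approach for one eye, in the same legacy fixed-function style as above; the parameter names (eye_sep, convergence, fov_y_deg) are only illustrative, and the values need tuning for the actual screen size and viewing distance:

/* Off-axis ("lens shift") projection for one eye: both eyes keep parallel
   optical axes and share the projection plane at the convergence distance;
   only the frustum and the camera are shifted sideways. */
#include <math.h>
#include <GL/gl.h>

void apply_eye_frustum(float fov_y_deg, float aspect,
                       float znear, float zfar,
                       float eye_sep, float convergence, int right_eye)
{
    float top    = znear * tanf(fov_y_deg * 3.14159265f / 360.0f);
    float half_w = top * aspect;
    float shift  = 0.5f * eye_sep * znear / convergence; /* frustum shift at the near plane */
    float sign   = right_eye ? -1.0f : 1.0f;

    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glFrustum(-half_w + sign * shift, half_w + sign * shift,
              -top, top, znear, zfar);

    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    /* Translate the camera sideways; do not rotate ("toe in") it. */
    glTranslatef(right_eye ? -0.5f * eye_sep : 0.5f * eye_sep, 0.0f, 0.0f);
}

Call this before each eye's render pass, e.g. apply_eye_frustum(45.0f, 16.0f / 9.0f, 0.1f, 100.0f, 0.065f, 2.5f, 1) followed by render_right_eye().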


1: heck, I'm the guy who single-handedly stereographed Elephants Dream, rendered it and got it an award at a 3D movie festival.

Engaged answered 26/7, 2011 at 10:23 Comment(3)
Thank you for your comprehensive answer! As it happens I was actually thinking of buying another laptop that happens to have a Quadro graphics card (without realising that it might be useful), so I may yet get a chance to experiment with both methods. – Booster
Actually, from their documentation, the nVidia 3D crap works by deciding based on some heuristics such as rendertarget aspect ratio whether or not to duplicate render targets as well as all draw calls, and modifying the vertex shader to do the parallax calculations. The ingenious thing about it is that some guy at nVidia is so much smarter than you and knows so much more about any application than you might possibly write, which is why it works so great. You gotta wonder just what's so hard in properly supporting 3D, i.e. quad buffers. It's not like it asks anything special from the HW. – Interdictory
@Damon: The reason is simple: real quad-buffer stereo is a selling point for NVIDIA's high-end professional grade GPUs. If they simply flipped the driver switch that would allow this to work on consumer GPUs, they could lose a lot of profits from their price-inflated professional GPUs. With any luck, Direct3D 12 will require QBS, and thus NVIDIA (and AMD) will expose it in OpenGL. – Metameric
Because you have a passive 3D TV, it's likely that the left- and right-eye views are displayed on alternate scan lines (or perhaps on alternate pixels in a checkerboard pattern).

Thus your mission is to render the left-eye view to the even-numbered scan lines and the right-eye view to the odd-numbered scan lines (or vice versa). This can be accomplished either via OpenGL stencil operations or, more modernly, using custom fragment shaders.
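
As a rough illustration of the stencil route (assuming a GLUT window created with GLUT_STENCIL and application-supplied render_left_eye() / render_right_eye() functions; which line parity maps to which eye depends on the TV):

/* Tag every other scan line in the stencil buffer once (e.g. after a resize),
   then use the mask to route each eye to its own lines every frame. */
#include <GL/glut.h>

extern void render_left_eye(void);
extern void render_right_eye(void);

void setup_interlace_stencil(int width, int height)
{
    int y;

    glEnable(GL_STENCIL_TEST);
    glClearStencil(0);
    glClear(GL_STENCIL_BUFFER_BIT);

    /* Write only to the stencil buffer. */
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
    glDepthMask(GL_FALSE);
    glStencilFunc(GL_ALWAYS, 1, 1);
    glStencilOp(GL_REPLACE, GL_REPLACE, GL_REPLACE);

    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0, width, 0, height, -1, 1);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();

    glBegin(GL_LINES);
    for (y = 1; y < height; y += 2) {         /* mark the odd scan lines */
        glVertex2f(0.0f,         y + 0.5f);
        glVertex2f((float)width, y + 0.5f);
    }
    glEnd();

    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
    glDepthMask(GL_TRUE);
}

void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);   /* stencil is read-only from here on */

    glStencilFunc(GL_NOTEQUAL, 1, 1);         /* even lines */
    render_left_eye();

    glStencilFunc(GL_EQUAL, 1, 1);            /* odd lines */
    render_right_eye();

    glutSwapBuffers();
}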

This way, you can avoid the whole quad-buffered video card/GL_BACK_LEFT/GL_BACK_RIGHT approach described by datenwolf. And you want to avoid that approach, as I have never encountered a video driver that directs quad-buffered stereo 3D to an actual 3D TV.

I agree with datenwolf's advice that you should use asymmetric frustum shift rather than scene rotation to generate the right and left eye viewpoints.

Analog answered 30/5, 2014 at 19:56 Comment(0)
