
Four-camera system to handle live transmissions for autostereoscopic 3D TV

Technology News | By eeNews Europe



Research scientists at HHI are working with twelve partners in the MUSCADE project on technologies that will make it possible to watch 3D TV without glasses. This requires autostereoscopic displays, which are coated with special optical foils that create two different images, one for the left eye and one for the right, the basic principle of three-dimensional vision. To allow different viewing positions, for instance when the viewer moves his or her head, these displays use five to ten different views of an image; in the future this number will be considerably higher. Because conventional stereo productions provide only two views, the captured images have to be converted before transmission, which means extracting depth information from them. To determine that depth information reliably, it is advisable to use more than the usual two cameras. The MUSCADE project partners use four, but this makes the already complex stereo production extremely intricate and expensive. "It can take days to calibrate four cameras to each other," explains Zilly.
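The depth extraction mentioned above ultimately rests on the classic stereo relation between disparity (how far a feature shifts between two camera views) and depth. The sketch below illustrates only that textbook relation; the focal length and baseline figures are hypothetical, not values from the MUSCADE rig:

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Classic pinhole-stereo relation: Z = f * B / d.

    disparity_px -- horizontal shift of a feature between views, in pixels
    focal_px     -- camera focal length, in pixels
    baseline_m   -- distance between the two camera centres, in metres
    """
    return focal_px * baseline_m / disparity_px

# Hypothetical rig: 1500 px focal length, 6.5 cm baseline.
# A feature with 10 px disparity lies 9.75 m from the cameras.
print(depth_from_disparity(10.0, 1500.0, 0.065))  # → 9.75
```

With more than two cameras, each extra view yields additional disparity measurements for the same scene point, which is why the four-camera setup makes the depth estimate more reliable.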

Together with his colleagues, the research scientist is therefore working on a four-camera assistance system that will reduce this timeframe to roughly 30 to 60 minutes. "The development is based on our STAN assistance system, which has already proved its value in conventional stereo productions. But with four cameras, calibration is much more complicated," explains Zilly. The cameras' positions and angles must all be set identically, so that the optical axes are parallel, all lenses have the same focal length, and all focal points lie on a common stereo baseline. To achieve this, the scientists have developed a feature detector that recognizes the same objects in the images from all cameras. Using the objects' positions, the assistance system then calibrates the individual cameras to one another. Even after calibration, however, small inaccuracies remain.
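The article does not detail how HHI's assistance system uses the matched feature positions, but the core idea can be sketched as a least-squares fit: given the positions of the same features in two cameras, estimate how much one camera's view is vertically offset or scaled (i.e. has a slightly different focal length) relative to the other. Everything below, including the sample coordinates, is an illustrative assumption:

```python
import numpy as np

def estimate_misalignment(pts_a, pts_b):
    """Estimate scale (focal-length) mismatch and vertical offset
    between two cameras from matched feature positions.

    pts_a, pts_b -- (N, 2) arrays of (x, y) pixel coordinates of the
    same scene features as seen by camera A and camera B.
    Fits y_b ≈ s * y_a + dy: 's' captures a focal-length mismatch,
    'dy' a vertical offset; ideally s ≈ 1 and dy ≈ 0 after calibration.
    """
    pts_a = np.asarray(pts_a, dtype=float)
    pts_b = np.asarray(pts_b, dtype=float)
    A = np.column_stack([pts_a[:, 1], np.ones(len(pts_a))])
    (s, dy), *_ = np.linalg.lstsq(A, pts_b[:, 1], rcond=None)
    return float(s), float(dy)

# Toy example: camera B sees everything 2 % larger, shifted 3 px down.
pts_a = np.array([[100, 200], [150, 400], [300, 120], [250, 310]], float)
pts_b = pts_a * 1.02 + np.array([0.0, 3.0])
s, dy = estimate_misalignment(pts_a, pts_b)
print(round(s, 3), round(dy, 1))  # → 1.02 3.0
```

With four cameras, the same fit would be repeated for each camera pair against a chosen reference camera, and the recovered parameters drive the correction stage described next.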

These occur when lenses with fixed focal lengths are used, as such lenses are in most cases subject to small manufacturing variations. Residual errors of this kind can only be corrected electronically, for example using a digital zoom. This last correction stage is carried out by the new assistance system in real time, making even live transmissions possible. The HHI research scientists are currently also working on an efficient video coding system to compress the huge volume of data that arises when four cameras are used, so that the content can be transmitted over the existing broadcasting infrastructure.
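A digital zoom of this kind is essentially a resampling of the frame about its centre by a small scale factor, cancelling a residual focal-length mismatch that cannot be fixed mechanically. The minimal nearest-neighbour sketch below is an assumption about the technique in general, not HHI's real-time implementation:

```python
import numpy as np

def digital_zoom(img, scale):
    """Resample an image about its centre by 'scale' using
    nearest-neighbour lookup (an electronic correction for a
    small residual focal-length mismatch between cameras)."""
    h, w = img.shape[:2]
    ys = ((np.arange(h) - h / 2) / scale + h / 2).round().astype(int)
    xs = ((np.arange(w) - w / 2) / scale + w / 2).round().astype(int)
    ys = ys.clip(0, h - 1)
    xs = xs.clip(0, w - 1)
    return img[np.ix_(ys, xs)]

# Toy 8x8 frame; enlarge by 2 % to cancel a hypothetical lens deviation.
frame = np.arange(64, dtype=np.uint8).reshape(8, 8)
corrected = digital_zoom(frame, 1.02)
print(corrected.shape)  # → (8, 8)
```

In a broadcast chain this warp must run per frame at video rate, which is why the correction is folded into the real-time assistance system rather than done as a post-process.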

