Real-time stitching IP makes 360-degree videos
Multi-camera arrays are used to produce 360-degree video for a number of applications, one of which is the creation of navigable landscapes for virtual reality.
Argon has origins that extend back to graphics IP company Alphamosaic Ltd. Alphamosaic was founded by Robert Swann and Steve Barlow in 2000 and sold to Broadcom for $123 million in 2004. Barlow went on to co-found Argon Design in 2009 and serves as the company's chief technology officer. Argon's expertise has previously been applied to simultaneous localization and mapping (SLAM) and stereo depth perception techniques, and the company has applied its ability to correct for parallax artefacts to the challenges of stitching together video from multiple image sensors.
The Argon360 technology, running on an FPGA development platform, has been applied to a rig of six GoPro cameras, and this set-up will be exhibited at the NAB show in Las Vegas, which opens Saturday April 16. The technology will be offered as FPGA code and as a logic and verification IP block for inclusion in multimedia and video ASICs and SoCs.
“Our ultimate goal is to create a semiconductor IP block for use in next-generation cameras and smartphones featuring an array camera or 360-degree camera, performing video-rate, state-of-the-art, photographic quality, depth-compensated stitching of multiple camera inputs to create a composite output. But there are opportunities in the interim for ground-breaking hardware-based live streaming systems for a range of applications,” the company states on its website.
Besides immersive material, one of the opportunities is to create footage that can be edited after the event. This means that multiple versions of skilfully framed 4K video could be made from ad hoc material shot using a 360-degree camera system.
There is no hard limit to the performance the Argon360 block is capable of, the company claims, as the hardware is scalable to cope with higher-resolution sources and greater numbers of sources, Barlow told eeNews Europe. Similarly, the system can cope with non-synchronized video sources and with image sensors that lack global shuttering. "It's true rolling-shutter cameras don't cope well with high-speed motion and can introduce artefacts, but it's not made worse by stitching," said Barlow.
Lent bumps rowing race on the Cam captured on the Argon360 demo rig. Note the icon top left that allows navigation throughout the 360-degree scene. Source: Argon
"The main problem is parallax, the different views seen because of the different image sensor positions; stitching without taking that into account can produce ghost images and jumps in the scene. Google is being supportive of 360-degree vision, with YouTube supporting playback of 360-degree video and Google Cardboard, where you can insert a smartphone into a cardboard holder to create a virtual reality visor," Barlow added.
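The parallax problem Barlow describes can be illustrated with the standard two-camera disparity relation. This is a hypothetical sketch, not Argon's algorithm: it assumes a pinhole-camera model where the pixel offset between two adjacent lenses is roughly d = f·B/Z, with f the focal length in pixels, B the baseline between lenses, and Z the scene depth. The focal length and baseline values below are illustrative assumptions, not figures from the article.

```python
# Hypothetical illustration of parallax between two adjacent cameras in a
# 360-degree rig (not Argon's algorithm). For a point at depth Z metres,
# the apparent pixel offset (disparity) between the two views is
# approximately d = f * B / Z.
def parallax_disparity_px(focal_px: float, baseline_m: float, depth_m: float) -> float:
    """Approximate pixel disparity for a point at depth_m metres."""
    return focal_px * baseline_m / depth_m

# Assumed rig geometry: 1000 px focal length, 10 cm between lens centres.
for depth in (0.5, 2.0, 10.0, 100.0):
    d = parallax_disparity_px(1000.0, 0.10, depth)
    print(f"depth {depth:6.1f} m -> disparity {d:6.1f} px")
```

The disparity falls off as 1/Z, which is why naive stitching looks fine for distant scenery but produces the ghost images and jumps Barlow mentions for nearby subjects; depth-compensated stitching estimates Z to cancel the offset.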
Off-chip memory interfaces are likely to be the bottleneck on performance, Barlow acknowledged. "External memory bandwidth is an important factor in how this will fit into an SoC," he said. Using six HD cameras to generate a 4K image in 360-degree view takes a bandwidth of about 8.6 Gbytes/s. In its demonstration Argon has made use of DDR4 channels, as these were the interfaces that were readily available, but for FPGA applications hybrid memory cube (HMC) interfaces may be preferred, and for SoCs other interfaces such as Wide I/O may be most appropriate, Barlow said.
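To give a feel for where a figure of that order comes from, the sketch below estimates raw uncompressed DRAM traffic for six HD inputs and one 4K output. The frame rate and bytes-per-pixel values are assumptions for illustration; the article does not state them, and the raw in-plus-out traffic alone is well below 8.6 GB/s, suggesting the quoted figure also covers intermediate passes over external memory (warp maps, blend buffers, and the like).

```python
# Rough, hypothetical memory-traffic estimate for a stitching pipeline.
# Assumptions (not from the article): 30 fps, 3 bytes per pixel.
def stream_bw_gbytes(width: int, height: int, fps: int, bytes_per_px: int = 3) -> float:
    """Uncompressed bandwidth of one video stream in Gbytes/s (10^9 bytes)."""
    return width * height * fps * bytes_per_px / 1e9

inputs = 6 * stream_bw_gbytes(1920, 1080, 30)   # six HD cameras read in
output = stream_bw_gbytes(3840, 2160, 30)       # one 4K panorama written out
print(f"inputs {inputs:.2f} GB/s + output {output:.2f} GB/s "
      f"= {inputs + output:.2f} GB/s raw traffic")
```

Under these assumptions the raw read-plus-write traffic is roughly 1.9 GB/s, so the ~8.6 GB/s cited by Barlow would correspond to each pixel crossing the external memory interface several times during stitching, which is why the choice of DDR4, HMC, or Wide I/O matters so much.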
This is part of the reason that Argon is showing a prototype at NAB. The technology is not yet fully productized, and a partnership-based development approach is still needed to optimize it, Barlow said.
Related links and articles:
News articles:
Renesas licenses Argon’s VP9 decoder test IP
V-Nova preps UHD/4K silicon IP for licensing
Google-led group preempts HEVC