Real-time stitching IP makes 360-degree videos

April 13, 2016 // By Peter Clarke
Embedded systems design consultancy Argon Design Ltd. (Cambridge, England) has created prototype hardware that stitches together video streams from multiple image sensors in real time to create high-resolution 360-degree video.

Such multi-camera arrays are used to produce 360-degree video for a number of applications, the creation of navigable landscapes for virtual reality being one.

Argon has origins that extend back to graphics IP company Alphamosaic Ltd., which was founded by Robert Swann and Steve Barlow in 2000 and sold to Broadcom for $123 million in 2004. Barlow went on to co-found Argon Design in 2009 and serves as the company's chief technology officer. Argon's expertise has previously been applied to simultaneous localization and mapping (SLAM) and stereo depth perception, and the company has used its ability to correct for parallax artefacts to overcome the challenges of stitching video together from multiple image sensors.
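The core stitching step is combining the overlapping regions of adjacent camera frames into one seamless panorama. A minimal sketch of feathered (linear cross-fade) blending across a seam is shown below; the function name, frame sizes, and overlap width are illustrative and are not Argon's actual design, which additionally compensates for parallax using depth.

```python
# Hypothetical sketch of feathered blending between two adjacent camera
# frames -- the basic building block of panoramic stitching.
import numpy as np

def feather_blend(left, right, overlap):
    """Blend two horizontally adjacent frames across `overlap` columns."""
    h, w = left.shape[:2]
    # Linear weights ramp from 1 -> 0 for the left image across the seam.
    alpha = np.linspace(1.0, 0.0, overlap)[None, :]
    seam = left[:, w - overlap:] * alpha + right[:, :overlap] * (1.0 - alpha)
    return np.concatenate([left[:, :w - overlap], seam,
                           right[:, overlap:]], axis=1)

a = np.full((4, 8), 100.0)   # left camera frame (grayscale, constant)
b = np.full((4, 8), 200.0)   # right camera frame
pano = feather_blend(a, b, overlap=4)
print(pano.shape)  # (4, 12): two 8-wide frames sharing a 4-column overlap
```

In the overlap the output ramps smoothly from the left image's values to the right image's, hiding the seam; a hardware implementation would do the equivalent arithmetic per pixel at video rate.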

GoPro rig
Close-up of the camera cube with the captured image in the background. Source: Argon

The Argon360 technology, running on an FPGA development platform, has been applied to a rig of six GoPro cameras, and this setup will be exhibited at the NAB show in Las Vegas, which opens Saturday, April 16. The technology will be offered as FPGA code and as a verified logic IP block for inclusion in multimedia and video ASICs and SoCs.

Block diagram shows how image sensors connect to image signal processing (ISP) blocks that clean up raw frames and feed Argon360. Source: Argon.

"Our ultimate goal is to create a semiconductor IP block for use in next-generation cameras and smartphones featuring an array camera or 360-degree camera, performing video-rate, state-of-the-art, photographic quality, depth-compensated stitching of multiple camera inputs to create a composite output. But there are opportunities in the interim for ground-breaking hardware-based live streaming systems for a range of applications," the company states on its website.

Besides immersive material, one of the opportunities is to create material that can be edited after the event. This means that multiple versions of skilfully framed 4K video could be made from ad hoc footage shot with a 360-degree camera system.