
Real-time 360° video stitching: 3D next
More recently, the company has signed up its first licensee for this hardware-based intellectual property and is in discussion with a number of other chip integrators.
Discussing the merits of hardware-accelerated 360° video stitching with eeNews Europe, Argon Design’s CTO and co-founder Steve Barlow put Argon360 in perspective.
“There are four different types of solutions for video stitching. The first one is in the cloud, off-line. You have to take the individual video streams and upload them separately. This is not done in real time and you can’t see what you are creating, but it gives you a lot of processing power at low cost.
“The second option is to use a GPU: you get near-real-time performance, with a delay of a few seconds, but you don’t get such good stitching. A third solution is to use an FPGA to perform simple colour blending. Basically, you computationally warp the images, overlay them and blend them.
“We use multiband blending, which is algorithmically more complicated, but we can smooth out the slight differences between the cameras due to the different camera sensors, the discontinuity between the optics and the parallax. The technique we use smooths out these differences, so we’ve got a good real-time stitching solution with a latency of just one frame and a power consumption about the same as a video encoder”.
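In its generic software form, multiband blending decomposes the overlapping images into frequency bands, typically via Laplacian pyramids, and blends the low frequencies over wide transitions and the high frequencies over narrow ones, hiding exposure and colour mismatches without ghosting fine detail. The Python sketch below illustrates that general technique only; Argon has not disclosed its fixed-function hardware implementation.

```python
import cv2
import numpy as np

def multiband_blend(img_a, img_b, mask, levels=5):
    """Laplacian-pyramid (multiband) blend of two aligned images.

    img_a, img_b: float32 arrays of identical shape (H, W, 3).
    mask: float32 weight map in [0, 1], 1.0 where img_a should dominate.
    Illustrates the generic technique only, not Argon's hardware pipeline.
    """
    # Gaussian pyramid of the mask: low-frequency bands get wide
    # transitions, high-frequency bands get narrow ones.
    gm = [mask]
    for _ in range(levels):
        gm.append(cv2.pyrDown(gm[-1]))

    def laplacian_pyramid(img):
        gp = [img]
        for _ in range(levels):
            gp.append(cv2.pyrDown(gp[-1]))
        lp = [gp[i] - cv2.pyrUp(gp[i + 1],
                                dstsize=(gp[i].shape[1], gp[i].shape[0]))
              for i in range(levels)]
        lp.append(gp[-1])  # coarsest level keeps the residual image
        return lp

    la, lb = laplacian_pyramid(img_a), laplacian_pyramid(img_b)

    # Blend each frequency band with the matching mask resolution.
    bands = [m[..., None] * a + (1.0 - m[..., None]) * b
             for a, b, m in zip(la, lb, gm)]

    # Collapse the blended pyramid back into a full-resolution image.
    out = bands[-1]
    for band in reversed(bands[:-1]):
        out = cv2.pyrUp(out, dstsize=(band.shape[1], band.shape[0])) + band
    return np.clip(out, 0.0, 255.0)
```

In a real stitcher the two inputs would already be warped into a common projection and the mask derived from the seam between them; the appeal of a hardware core is doing these filter banks in fixed-function logic at one-frame latency.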
“Our target is below 0.5W on a modern process, 28nm or below”, Barlow added.
Argon360 can handle any combination of up to six sensors with an output resolution of up to 8192×4320 at 30 frames per second. The company is targeting a number of markets, including security, action cameras, automotive electronic surround-view applications and professional 360° immersive content creation for VR headsets. Drones represent another promising segment.
“With our very low latency, action cameras can directly produce 360° content and livestream it, users can preview the result straight away, and all this is feasible on a small battery-operated unit. There is no way you could rely on a GPU for this; it would be too power hungry and too slow”, the CTO concluded.
The company gets about 50% of its revenues from its consultancy services, helping other companies with board-level, RTL, FPGA and embedded software design. The other 50% comes from IP licensing, of which Argon360 is the third offering.
“Next, we are going to do more in the Argon360 area”, Barlow told us. “We’ll try to solve the parallax problem, and we have some pretty good software models of how to do it. We want to create stereo 360° video stitching. For now, depth extraction is typically done in the cloud. To acquire stereo 360° video, each point in space is viewed by two different cameras, usually a sphere of cameras on a large rig. We could create the correct stereo left and right disparities for each angle of view, something that would work no matter where the end-user is looking”.
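Barlow did not detail the planned approach, but one published way to get correct per-angle stereo from a sphere of cameras is the omni-directional stereo (ODS) projection, in which each column of the left- and right-eye panoramas is rendered from a point on a small viewing circle rather than from a single centre. A minimal sketch, assuming a y-up coordinate frame and a nominal 64mm interpupillary distance (both assumptions, not Argon figures):

```python
import numpy as np

def ods_ray(lon, eye, ipd=0.064):
    """Viewing ray for omni-directional stereo (ODS) at longitude lon.

    eye: -1.0 for the left eye, +1.0 for the right eye.
    ipd: interpupillary distance in metres (0.064 is a nominal adult
    average; an assumption, not an Argon figure).
    Each panorama column is rendered from a point on a circle of
    diameter ipd, tangent to the view direction, so the left/right
    disparity stays correct for every viewing angle.
    """
    direction = np.array([np.sin(lon), 0.0, np.cos(lon)])  # y-up frame
    tangent = np.array([np.cos(lon), 0.0, -np.sin(lon)])   # 90° to it
    origin = eye * (ipd / 2.0) * tangent
    return origin, direction
```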

As an example of video stitching, the company provided two pictures: the input before stitching (a six-camera video stream from a rig of cameras mounted on the faces of a cube, each camera with a 120°×90° field of view) and the stitched result.
The views in the first picture are all at an angle because the cubic rig was mounted by a corner. The Argon360 produced a 360° equirectangular projection output compatible with video sharing sites such as YouTube. In this projection, longitude and latitude angles are mapped directly to X and Y respectively. The “North Pole” is spread across the whole of the top edge of the image and the “South Pole” across the whole of the bottom edge.
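Concretely, a stitcher works backwards from that projection: each output pixel is converted to a view direction, which determines which source cameras see it. A minimal sketch of the inverse mapping, using one common axis convention (the choice of axes is an assumption, not something Argon has specified):

```python
import numpy as np

def equirect_pixel_to_direction(x, y, width, height):
    """Map an equirectangular output pixel to a unit view direction.

    Longitude spans [-pi, pi] across the image width, latitude spans
    [+pi/2, -pi/2] from top to bottom, so the "North Pole" stretches
    along the entire top edge and the "South Pole" along the bottom,
    as in the projection described above.
    """
    lon = (x + 0.5) / width * 2.0 * np.pi - np.pi
    lat = np.pi / 2.0 - (y + 0.5) / height * np.pi
    return np.array([np.cos(lat) * np.sin(lon),
                     np.sin(lat),
                     np.cos(lat) * np.cos(lon)])
```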

Argon Design – www.argondesign.com
