The unit can perform not only 3D image acquisition but also real-time 3D mapping, alongside any other video processing or robotic control functions, hence doubling as the main application’s processing centre.
The Orbbec Persee, as the development team call it, relies on the projection of a structured-light dot pattern by an infrared laser through a proprietary diffraction grating, and on detection through an SXGA infrared CMOS sensor (1280 × 1024 pixels). The startup also developed a proprietary chip to perform the triangulation and depth extraction, and optionally to map the 3D data over the colour image frames recorded by an RGB camera sensor.
Depth extraction is performed at 30 frames per second at SXGA resolution for high accuracy (0.5 centimetres at a distance of 2 metres, with a range up to 8 metres); the result is then converted to VGA (640 × 480 pixels) depth image frames for faster data transfers over USB.
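To make the triangulation step concrete, here is a minimal sketch of how a structured-light system converts the measured displacement (disparity) of a projected dot into depth. The focal length and projector-to-camera baseline below are illustrative placeholders, not Orbbec's actual optical parameters.

```python
# Hypothetical illustration of structured-light depth recovery.
# focal_px and baseline_m are illustrative values, not Orbbec's.
import numpy as np

def disparity_to_depth(disparity_px, focal_px=1200.0, baseline_m=0.075):
    """Classic triangulation: depth = focal_length * baseline / disparity."""
    disparity_px = np.asarray(disparity_px, dtype=float)
    with np.errstate(divide="ignore"):
        depth_m = np.where(disparity_px > 0,
                           focal_px * baseline_m / disparity_px,
                           0.0)  # 0 marks invalid pixels (no dot match found)
    return depth_m

# With these example optics, a dot shifted 45 px sits at 2 m
print(disparity_to_depth(45.0))  # -> 2.0
```

Because depth varies inversely with disparity, accuracy degrades with distance, which is consistent with the camera quoting its 0.5 cm accuracy at a specific 2 m range.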
Any other video processing functions or application control after that would be carried out by the built-in computer featuring a 600 MHz Mali-T7 GPU, a quad-core Cortex A17 CPU running at up to 1.8 GHz, and 2 GB of DDR3 RAM, plus an integrated SD card reader for storage.
A conventional approach in robotics and other domains would be to carry out video processing (such as face recognition or eye tracking) on the RGB frames, separately from the grey-scale depth stream used for navigation or gesture interfacing.
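That split can be sketched as two independent per-frame pipelines. The frames below are synthetic and the "detectors" are trivial stand-ins; this is not the Astra SDK API, just an illustration of running separate logic on the RGB and depth streams.

```python
# Sketch of the conventional split: vision tasks on the RGB stream,
# navigation/gesture logic on the depth stream, processed independently.
# Frames, thresholds, and function names are all illustrative.
import numpy as np

def process_rgb(frame):
    # Placeholder for a real detector (face recognition, eye tracking):
    # here just a mean-brightness check standing in for detection.
    return frame.mean() > 100

def process_depth(depth_mm, near_mm=600, far_mm=8000):
    # Grey-scale depth used for navigation: fraction of pixels that
    # fall inside the obstacle-detection range.
    in_range = (depth_mm > near_mm) & (depth_mm < far_mm)
    return in_range.mean()

rgb = np.full((480, 640, 3), 128, dtype=np.uint8)    # synthetic VGA colour frame
depth = np.full((480, 640), 2000, dtype=np.uint16)   # synthetic depth: all at 2 m
print(process_rgb(rgb), round(float(process_depth(depth)), 2))  # -> True 1.0
```

On the Persee, both pipelines would run on the on-board Cortex-A17 rather than a host PC.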
The startup’s product line-up includes the Persee, the 3D camera with an on-board computer that can double as the main processing unit of any application relying on 3D vision, and two other offerings, the Astra and Astra Pro. These are two flavours of standalone 3D cameras: the former delivers a VGA colour output, while the latter offers a 720p HD colour camera and streams its data via USB 2.0.
Astra was launched some months ago with the funding of private ‘angel’ investors, but the Indiegogo campaign for the Persee and Astra Pro is more wide-ranging, appealing to business partners and consumers alike.
At the core of all three devices lies the company’s chip, something Orbbec is not keen to share too much about. For EE Times Europe, Joshua Blake, Orbbec’s co-founder and VP of Engineering, puts things in perspective.
"If one wants to process 1280×1024 (SXGA) structured light infrared image into depth, it takes several hours per frame to do this on a high-end desktop-class CPU and C++ implementation.
"On a high-end desktop-class CPU and a highly-optimised OpenMP & SSE3 implementation, it takes 3 to 5 seconds per frame.
"With the 3D ASIC chip the Orbbec team developed, it takes only a few nanoseconds per frame.
"In the complete Astra 3D camera, we easily calculate SXGA depth at 30 frames per second (based upon the IR CMOS capability), then downscale the depth image to VGA for transmission across the USB 2.0 connection".
Joshua Blake, Orbbec’s co-founder and VP of Engineering, holding the Persee.
This down-conversion from SXGA to VGA has another purpose, Blake explained: it allows the chip to detect and filter out low-quality pixels, so the resulting VGA frames are like a ‘condensate’ of the best pixels, keeping the 3D camera operational even in the most challenging lighting conditions.
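One plausible reading of "a condensate of the best pixels" is a reducing downscale that keeps the most reliable measurement in each neighbourhood rather than averaging. The sketch below is an assumption of how such a step could work, not Orbbec's actual pipeline: the confidence scores are random placeholders, and the crop to 1280 × 960 (so that a clean 2× reduction yields 640 × 480) is invented for the example.

```python
# Hedged sketch of "keep the best pixel" downscaling: within each 2x2
# block of an SXGA depth map, keep the pixel with the highest confidence.
# The confidence metric and the crop to 1280x960 are assumptions.
import numpy as np

def downscale_best(depth, conf):
    depth = depth[:960, :1280]  # crop SXGA so a 2x reduction gives exactly VGA
    conf = conf[:960, :1280]
    # Regroup each 2x2 block into a trailing axis of 4 candidate pixels
    d = depth.reshape(480, 2, 640, 2).transpose(0, 2, 1, 3).reshape(480, 640, 4)
    c = conf.reshape(480, 2, 640, 2).transpose(0, 2, 1, 3).reshape(480, 640, 4)
    best = c.argmax(axis=2)  # index of the most reliable pixel per block
    return np.take_along_axis(d, best[..., None], axis=2)[..., 0]

rng = np.random.default_rng(0)
depth = rng.integers(500, 8000, (1024, 1280), dtype=np.uint16)
conf = rng.random((1024, 1280))  # placeholder per-pixel quality scores
vga = downscale_best(depth, conf)
print(vga.shape)  # -> (480, 640)
```

Each output pixel is a genuine measurement from the sensor rather than an interpolated value, which matches the article's point that the VGA frame distils the usable pixels out of the full-resolution capture.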
For depth sensing, Orbbec chose structured-light IR projection over Time-of-Flight sensors because the technology scales up more easily.
"On our chip, we could run the depth conversion algorithms at 300 fps, this leaves us a lot of room to extend our product range in the future, as sensors get faster and of higher resolution", Blake said, adding "In general, we’re not limited by our technology, we could increase both resolution and frame rate, but the only limiting factor is streaming the data outside the camera unit".
Over a Skype video conference, Blake demonstrated the Persee and how responsive the 3D depth sensing was, appearing more or less instantaneous. He also briefly switched from a grey-scale depth map to a fused 3D colour image, an application he is still working on. Real-time skeleton tracking is also on the agenda.
With its 184 x 35 x 46 mm casing, the whole 3D camera-computer unit is about half the size of the Kinect 2, which only provides the sensors and no on-board processing power. The company also plans to offer a modular unit for business partners who would want to integrate it as a 3D vision platform within industrial applications.
Having already reached over 250% of its initial $40,000 funding goal, with the campaign running until October 31st, the company has just open-sourced the complete Orbbec Astra software development kit (SDK) on GitHub.
Available at https://orbbec3d.com/develop/, the beta SDK supports Windows and OS X as well as the creative coding framework Processing. Support for Linux and Android, as well as openFrameworks, Unity 3D, and Cinder, will be available in future versions.