First event-driven AI camera ships
A Japanese company has shipped the first industrial AI camera to use the event-driven neuromorphic vision technology developed by Prophesee in France.
CenturyArks is shipping the SilkyEvCam, an ultra-compact USB camera designed for a wide range of industrial machine vision applications. It uses the Prophesee Metavision sensor together with the recently announced Metavision Intelligence Suite, giving machine vision developers a faster time to market through an event-driven approach that triggers only on changes in the scene.
The 2.5W SilkyEvCam is available for production orders and is coupled with software modules for applications such as high-speed counting, vibration monitoring, ultra-slow motion and object tracking. An intuitive set of development tools guides users on how to optimally implement event-based vision in their machine vision systems.
“The performance and flexibility of the Prophesee Event-Based Vision solution, including the new SDK, provides an extended platform to meet the many machine vision challenges faced by our partners and customers. The new ‘SilkyEvCam’ USB camera equipped with a Prophesee sensor allows us to deliver a solution that is extremely efficient and cost effective. It offers a dramatic increase in the performance of sensing capabilities, AI processing and deep learning systems required in key areas such as predictive maintenance, manufacturing, factory automation, robotics and security,” said Mr. Saito, CEO of CenturyArks.
“CenturyArks’ use of our complete Metavision offering demonstrates the range of potential it delivers for industrial automation use cases. By combining the efficiency of the underlying event-based sensor approach with our flexible development environment, CenturyArks and its partners can benefit from powerful off-the-shelf machine vision systems and also implement customized solutions to meet specific and demanding applications,” said Luca Verre, CEO and co-founder of Prophesee.
The camera takes advantage of the efficiency of the Metavision event-based vision sensor, which reduces the amount of data collected from a scene by up to 1000x compared to traditional frame-based techniques, significantly improving performance. This event-driven approach enables high-speed vision with a time resolution equivalent to more than 10,000fps at lower power, depending on the amount of change in the scene, and supports a dynamic range of over 120dB, making it suitable for operation in demanding lighting conditions.
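The data-reduction principle can be sketched independently of Prophesee's hardware: each pixel of an event sensor reports an (timestamp, x, y, polarity) event only when its intensity changes, so a mostly static scene produces far less data than full frames. A minimal simulation of that idea (the threshold, scene and event format here are illustrative assumptions, not Prophesee's implementation, and the real sensor does this in analog, per pixel):

```python
import numpy as np

def frames_to_events(frames, threshold=15):
    """Convert a stack of grayscale frames into (t, x, y, polarity) events.

    Only pixels whose intensity changes by more than `threshold` between
    consecutive frames produce an event -- the core idea behind
    event-based sensing.
    """
    events = []
    prev = frames[0].astype(np.int16)
    for t, frame in enumerate(frames[1:], start=1):
        cur = frame.astype(np.int16)
        diff = cur - prev
        ys, xs = np.nonzero(np.abs(diff) > threshold)
        for x, y in zip(xs, ys):
            events.append((t, x, y, 1 if diff[y, x] > 0 else 0))
        prev = cur
    return events

# A mostly static 64x64 scene with one moving bright dot.
frames = np.full((100, 64, 64), 50, dtype=np.uint8)
for t in range(100):
    frames[t, 32, t % 64] = 255  # the dot moves one column per frame

events = frames_to_events(frames)
frame_values = frames.size      # values a frame camera would transmit
event_values = len(events) * 4  # 4 values per event
print(f"frame data: {frame_values} values, event data: {event_values} values")
```

With only two pixels changing per frame transition, the event stream carries a few hundred values against some 400,000 for the full frames, which is the kind of reduction the event-based approach exploits.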
The platform is the first to offer full support of the Metavision Intelligence Suite, says Verre. The software suite consists of 62 algorithms, 54 code samples and 11 ready-to-use applications. It provides users with both C++ and Python APIs as well as extensive documentation and a wide range of samples organized by increasing difficulty to incrementally introduce the fundamental concepts of event-based machine vision.
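To give a flavour of what an event-based application looks like, the sketch below counts objects crossing a virtual line from a stream of (timestamp, x, y, polarity) tuples, grouping events separated by less than a quiet-time gap into one object. This is illustrative logic only, using hypothetical names and parameters; it is not the Metavision high-speed counting algorithm or its API:

```python
def count_line_crossings(events, line_x, gap_us=1000):
    """Count distinct bursts of activity on a vertical line at x == line_x.

    `events` are (timestamp_us, x, y, polarity) tuples sorted by time.
    Events hitting the line within `gap_us` microseconds of each other are
    treated as one object passing; a longer quiet gap separates two objects.
    """
    hits = [t for (t, x, y, p) in events if x == line_x]
    if not hits:
        return 0
    count, last = 1, hits[0]
    for t in hits[1:]:
        if t - last > gap_us:
            count += 1  # quiet gap ended: a new object is crossing
        last = t
    return count

# Two objects cross the line: bursts around t=0 and t=50,000 microseconds.
stream = [(0, 10, 5, 1), (200, 10, 6, 1), (400, 10, 7, 0),
          (50_000, 10, 5, 1), (50_300, 10, 6, 1)]
print(count_line_crossings(stream, line_x=10))
```

Because events already encode where and when something changed, such applications reduce to simple stream processing rather than per-frame image analysis.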
- STACKED EVENT-BASED VISION SENSOR BOASTS HIGHEST HDR
- 2D MATERIALS PROMISE ULTRA-EFFICIENT NEUROMORPHIC COMPUTING
- SONY ACQUIRES SWISS VISION SENSOR FIRM
- SWISS EVENT-BASED VISION STARTUP LAUNCHES NEXT-GEN CHIP