French AI developer Prophesee has released a set of key open-source software modules and tools for event-driven machine learning, such as optical flow and object detection.
As part of the Metavision Intelligence Suite, the Paris-based company is offering the industry's largest HD event-based dataset, called OpenEB, to developers as a free download. This helps developers adopt an event-driven approach to machine learning, triggered by changes in the scene rather than by processing full frames through conventional neural network frameworks.
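To make the change-driven idea concrete: an event camera reports sparse per-pixel brightness changes rather than full image frames. The sketch below is a generic illustration of that data model (it is not the Metavision SDK API), showing an event stream as (x, y, t, polarity) tuples that can be windowed in time and accumulated into a count image for visualization.

```python
from collections import namedtuple

# Illustrative model of an event stream: each event reports a per-pixel
# brightness change, not a full frame. (Generic sketch, not Prophesee's API.)
Event = namedtuple("Event", ["x", "y", "t", "polarity"])  # t in microseconds

def events_in_window(events, t_start, t_end):
    """Select events whose timestamps fall in [t_start, t_end)."""
    return [e for e in events if t_start <= e.t < t_end]

def accumulate(events, width, height):
    """Accumulate events into a 2D count image, a common visualization."""
    img = [[0] * width for _ in range(height)]
    for e in events:
        img[e.y][e.x] += 1
    return img

stream = [Event(1, 0, 100, 1), Event(1, 0, 250, -1), Event(2, 1, 900, 1)]
window = events_in_window(stream, 0, 500)   # keeps the first two events
frame = accumulate(window, 3, 2)            # frame[0][1] == 2
```

Because only changing pixels produce events, the data rate scales with scene activity instead of resolution times frame rate, which is what enables the low-compute processing described below.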
The latest release adds an expanded set of development tools and software for designing industrial vision systems with event-driven machine learning. The suite now includes close to 100 algorithms, 67 code samples and 11 use-case-specific application modules that accelerate the development process. The open-source modules of OpenEB are available through GitHub and allow designers to build custom plugins while ensuring compatibility with the Metavision Intelligence Suite for developing event-based systems. It provides a platform for developers to share software components across the machine vision ecosystem.
“We want to set an open technology standard in the machine vision ecosystem that enables new levels of accessibility and interoperability,” said Luca Verre, CEO and co-founder of Prophesee. “Our approach provides the growing ecosystem around event-based technology with a rich open foundation and a strong development framework. This includes extensive and reliable data that we have collected over several years, as well as application modules that leverage our expertise in a variety of specific uses to accelerate the development of customer-specific systems.”
The OpenEB database offers a standard event-based data format for camera makers and their customers, and the open-source model for the Metavision Intelligence Suite enables compatibility across that ecosystem. Releasing a number of key modules under an open-source license accelerates the creation of custom plugins while ensuring compatibility with the underlying hardware from camera manufacturers.
The development environment provides a complete platform for rapid development of machine-learning applications, starting with the real-sequence dataset Prophesee has created over the past four years. Developers then use a variety of tools to guide the development of neural network models and run inference on event-based data, with supervised training for tasks such as object detection and self-supervised training for optical flow, all optimized for event-based vision. In addition, developers can easily create their own models, or leverage their existing frame-based datasets and models using the provided event-based simulator, and improve them with event-based vision.
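A simulator that turns frame-based video into synthetic events commonly works by thresholding per-pixel log-intensity changes, the approach popularized by academic event-camera simulators such as ESIM. The following is a minimal sketch of that general technique under those assumptions, not Prophesee's actual simulator code.

```python
import math

def simulate_events(frames, timestamps, threshold=0.2, eps=1e-6):
    """Generate (x, y, t, polarity) events from a grayscale frame sequence
    by thresholding per-pixel log-intensity changes. Simplified version of
    the technique used by event-camera simulators; not Prophesee's code."""
    events = []
    height, width = len(frames[0]), len(frames[0][0])
    # Reference log-intensity per pixel, updated each time an event fires.
    ref = [[math.log(frames[0][y][x] + eps) for x in range(width)]
           for y in range(height)]
    for frame, t in zip(frames[1:], timestamps[1:]):
        for y in range(height):
            for x in range(width):
                delta = math.log(frame[y][x] + eps) - ref[y][x]
                # Emit one event per threshold crossing, with polarity
                # indicating whether the pixel got brighter or darker.
                while abs(delta) >= threshold:
                    polarity = 1 if delta > 0 else -1
                    events.append((x, y, t, polarity))
                    ref[y][x] += polarity * threshold
                    delta -= polarity * threshold
    return events
```

The log-intensity threshold mimics the contrast sensitivity of a real event sensor, which is why models trained on such synthetic events can transfer to real event-camera data.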
The Metavision Intelligence Suite adds new ready-to-use applications for key processes that can be enhanced with Event-Based Vision. These include:
- Particle Size Monitoring: Count and measure objects passing through a field of view at very high speeds (up to 500,000 pixels/second) with up to 99.9% counting precision on a production line, ensuring better control of the process.
- Jet Monitoring: Monitor the speed and quality of liquid dispensing in real time. Detect and count high-speed jets, with support for dispensing rates up to 500 Hz, and automatically generate alarms when dispenser errors occur.
- Edgelet Tracking: Achieve ultra-robust real-time 3D object tracking with low compute power by leveraging the low data rate and sparse information provided by event-based sensors.
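As a toy illustration of event-based counting of the kind the particle-size application performs (this is not Prophesee's algorithm), objects crossing a virtual detection line can be counted by grouping the resulting event timestamps into bursts separated by a quiet gap:

```python
def count_bursts(timestamps, gap_us=1000):
    """Count objects from line-crossing event timestamps: a new object is
    counted whenever the gap since the previous event exceeds gap_us.
    (Toy illustration of event-based counting, not Prophesee's
    particle-size monitoring algorithm.)"""
    count = 0
    last_t = None
    for t in sorted(timestamps):
        if last_t is None or t - last_t > gap_us:
            count += 1
        last_t = t
    return count

# Two dense bursts of events, ~5 ms apart, read as two objects.
n = count_bursts([0, 10, 20, 5000, 5010])
```

Because each passing object produces a dense burst of events only while it occupies the line, this style of processing needs no full-frame capture, which is what makes very high object rates tractable.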
The latest version of the Metavision Intelligence Suite is available now at www.prophesee.ai/metavision-intelligence/
Other articles on eeNews Europe