The new Tulipp (Towards Ubiquitous Low-power Image Processing Platforms) Use Cases are intended to be paired with the Tulipp embedded computing reference platform for a comprehensive embedded vision solution.

The new Use Cases will be demonstrated at Vision 2018 in Stuttgart, Germany from 6-8 November 2018. Tulipp will also hold a practical workshop on the project at the HiPEAC 2019 Conference in Valencia, Spain on 22 January 2019 and deliver an in-depth tutorial on 23 January 2019. Participants in the HiPEAC workshop and tutorial will receive a free Tulipp development kit.

The Medical X-Ray Imaging Use Case consists of an embedded computing board with a medical X-ray imaging sensor that eliminates image noise at lower radiation levels. The ADAS Use Case runs pedestrian detection algorithms in real time on a small, low-power embedded platform. The UAVs Use Case provides real-time obstacle detection and avoidance based on a lightweight, low-cost stereo camera setup.

The Medical X-Ray Imaging Use Case demonstrates advanced image enhancement algorithms for X-ray images running at high frame rates. It focuses on improving the performance of mobile C-arm X-ray imaging systems, which provide an internal view of a patient's body in real time during an operation, increasing surgeon efficiency and accuracy, allowing minimal incision sizes, aiding faster patient recovery and lowering the risk of nosocomial disease. Tulipp's embedded hardware reference platform is the size of a smartphone. The Use Case demonstrates how radiation doses, typically 30 times ambient radiation levels, can be reduced by 75% while maintaining the clarity of the real-time X-ray images.
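
As a rough illustration of the kind of processing such a pipeline performs, the sketch below applies OpenCV's non-local means filter to a low-dose X-ray frame. The file names and filter parameters are assumptions made for the example; Tulipp's actual enhancement algorithms are not published here.

import cv2

# Hypothetical input: a single 8-bit greyscale X-ray frame.
frame = cv2.imread("xray_frame.png", cv2.IMREAD_GRAYSCALE)

# Non-local means denoising: filter strength 10, 7x7 template window,
# 21x21 search window. Values are illustrative, not Tulipp's settings.
denoised = cv2.fastNlMeansDenoising(frame, None, 10, 7, 21)

cv2.imwrite("xray_frame_denoised.png", denoised)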

The Tulipp ADAS Use Case demonstrates pedestrian recognition in real time based on the Viola & Jones algorithm. It achieves a processing time of 66 ms per frame; since a 30 Hz camera delivers a new image roughly every 33 ms, this meets the target of processing every second image.
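
For readers unfamiliar with the approach, the sketch below shows Viola & Jones-style detection using OpenCV's stock full-body Haar cascade. The model file, input image and detection parameters are illustrative assumptions, not Tulipp's tuned embedded implementation.

import cv2

# OpenCV ships a full-body Haar cascade trained with the Viola-Jones
# framework; cv2.data.haarcascades points at its install location.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_fullbody.xml")

frame = cv2.imread("road_scene.png")              # hypothetical input image
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Detection parameters are illustrative only.
pedestrians = cascade.detectMultiScale(gray, scaleFactor=1.1,
                                       minNeighbors=3, minSize=(48, 96))

# Draw a box around each detection and save the annotated frame.
for (x, y, w, h) in pedestrians:
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
cv2.imwrite("road_scene_detections.png", frame)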

Tulipp’s UAV Use Case uses disparity maps, which are computed from the camera images, to locate obstacles in the flight path and to automatically steer the UAV around them.
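
The sketch below shows how a disparity map can be computed from a rectified stereo pair with OpenCV's block matcher. The file names and matcher settings are assumptions for illustration, not the optimised implementation flown on the Tulipp platform.

import cv2

# Hypothetical rectified stereo pair from the UAV's stereo camera.
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Block-matching stereo correspondence: numDisparities must be a
# multiple of 16 and blockSize must be odd. Values are illustrative.
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right)   # 16-bit fixed-point disparities

# Nearby obstacles produce large disparities, so thresholding the map
# already flags regions the flight controller should steer around.
disparity_8bit = cv2.normalize(disparity, None, 0, 255,
                               cv2.NORM_MINMAX).astype("uint8")
cv2.imwrite("disparity.png", disparity_8bit)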

More information

Thales – www.thalesgroup.com

Efficient Innovation SAS – www.efficient-innovation.fr

Fraunhofer IOSB – www.iosb.fraunhofer.de

Hipperos – www.hipperos.com

Norges Teknisk-Naturvitenskapelige Universitet – www.ntnu.no

Technische Universität Dresden – www.tu-dresden.de

Sundance Multiprocessor Technology – www.sundance.com

Synective Labs – www.synective.se

Vision 2018

Hall 1, Stand 1A74
