
October 30, 2018 //By Julien Happich
EU’s Tulipp project delivers its results for embedded vision
Launched in 2016, the EU initiative Tulipp (Towards Ubiquitous Low-power Image Processing Platforms) targets the development of high-performance, energy-efficient embedded systems for a growing range of increasingly complex image-processing applications.

Tulipp’s medical X-ray imaging use case demonstrates advanced image-enhancement algorithms for X-ray images running at high frame rates. It focuses on improving the performance of mobile C-arm X-ray systems, which give surgeons a real-time internal view of the patient’s body during an operation, improving efficiency and accuracy, allowing smaller incisions, and thereby aiding faster patient recovery and lowering the risk of nosocomial infections. Using Tulipp’s embedded hardware reference platform, which is the size of a smartphone, the use case shows that the radiation dose to which patients and staff are exposed, typically some 30 times ambient radiation levels, can be cut by 75% while preserving the clarity of the real-time X-ray images, which would otherwise be rendered useless by the increased image noise that a reduced dose causes.
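The dose/noise trade-off can be illustrated with a toy simulation. This is not Tulipp’s actual algorithm (which is not detailed here), just a recursive temporal filter of the kind classically used to suppress quantum noise in fluoroscopy streams; all parameter values are illustrative:

```python
import numpy as np

def temporal_filter(frames, alpha=0.25):
    """Recursive temporal average: out[t] = alpha*frames[t] + (1-alpha)*out[t-1].
    A classic way to suppress quantum noise in an X-ray image stream."""
    out = np.asarray(frames[0], dtype=float)
    for f in frames[1:]:
        out = alpha * np.asarray(f, dtype=float) + (1 - alpha) * out
    return out

# Simulate a static scene at a reduced dose: fewer photons per pixel
# means proportionally more Poisson (quantum) noise.
rng = np.random.default_rng(0)
scene = np.full((64, 64), 50.0)                   # mean photon count per pixel
frames = [rng.poisson(scene) for _ in range(30)]  # 30 noisy frames of the scene

raw_noise = np.std(frames[-1] - scene)                    # one raw frame
filtered_noise = np.std(temporal_filter(frames) - scene)  # filtered stream
```

With `alpha=0.25` the steady-state noise variance drops by a factor of roughly `alpha/(2-alpha)`, so the filtered stream is markedly cleaner than any single low-dose frame; real systems combine such temporal filtering with motion compensation and spatial enhancement.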

ADAS adoption depends on vision systems, or on combinations of vision and radar, whose algorithms can be integrated into a small, energy-efficient Electronic Control Unit (ECU). An ADAS algorithm should be able to process a 640x480 video stream at the full 30Hz camera rate, or at least at half that rate. The Tulipp ADAS use case demonstrates real-time pedestrian recognition based on the Viola-Jones algorithm. On the Tulipp reference platform, the use case achieves a processing time of 66ms per frame, meeting the target of processing every second image from a camera running at 30Hz.
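The frame-rate target follows from simple arithmetic: at 30Hz the camera delivers a frame every 33ms, so a 66ms processing time lets the pipeline keep up with every second image. A minimal sketch of that budget check (the function name is ours, not from the project):

```python
import math

def frames_per_result(camera_hz, processing_ms):
    """How many camera frames elapse while one frame is being processed.
    A value of 2 means the pipeline keeps up with every second image."""
    frame_period_ms = 1000.0 / camera_hz
    return max(1, math.ceil(processing_ms / frame_period_ms))

# Tulipp ADAS figures: 30 Hz camera, 66 ms per-frame processing time.
stride = frames_per_result(30, 66)   # -> 2: every second frame is processed
```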

Tulipp’s UAV use case demonstrates a real-time obstacle avoidance system for UAVs based on a stereo camera pair oriented in the direction of flight. Although drones are often described as autonomous, most current systems are still remotely piloted by humans. The use case computes disparity maps from the camera images, uses them to locate obstacles in the flight path, and automatically steers the UAV around them, a key step towards fully autonomous drones.
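The stereo geometry behind the disparity maps can be sketched as follows; the focal length, baseline, and clearance values here are illustrative, not the actual parameters of the Tulipp rig:

```python
import numpy as np

def disparity_to_depth(d_px, focal_px, baseline_m):
    """Standard stereo relation: depth = f * B / d.
    Larger disparity means the point is closer to the cameras."""
    return focal_px * baseline_m / d_px

def obstacle_ahead(disparity, focal_px, baseline_m, clearance_m):
    """True if the nearest matched pixel (largest disparity) falls
    inside the required clearance along the flight path."""
    valid = disparity[disparity > 0]
    if valid.size == 0:
        return False          # no stereo matches: nothing detected
    nearest_m = disparity_to_depth(valid.max(), focal_px, baseline_m)
    return bool(nearest_m < clearance_m)

# Illustrative rig: 400 px focal length, 10 cm baseline.
disp = np.zeros((4, 4))
disp[2, 2] = 20.0   # nearest match: depth = 400 * 0.1 / 20 = 2.0 m
```

With a 3m required clearance this map would trigger an avoidance manoeuvre; with a 1.5m clearance it would not. A real system would also filter spurious matches before trusting the maximum disparity.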

Tulipp is funded by the European Union’s Horizon 2020 programme.

Tulipp consortium members include:

Thales

Efficient Innovation SAS

Fraunhofer IOSB

Hipperos

Norges Teknisk-Naturvitenskapelige Universitet

Technische Universität Dresden

Sundance Multiprocessor Technology

Synective Labs
