Feeling virtual objects at your fingertips

Technology News
By Julien Happich

Vincent Weistroffer, a research engineer at CEA-LIST’s Interactive Simulation Lab (LSI), exhibited a first demonstrator at Innorobo, assembled less than two weeks earlier as the end result of the MANDARIN project (MANipulation Dextre hAptique pour opéRations INdustrielles en RV, roughly: dexterous haptic manipulation for industrial operations in virtual reality).

Backed by the French National Research Agency (ANR), with industrial partners including haptic interface provider Haption and car maker Renault, the MANDARIN project, which also involves Inria Rennes and the Université de Technologie de Compiègne, aims to deliver a virtual-object manipulation interface for immersive environments, which manufacturers could use to intuitively explore complex structures or to train technicians in assembly and disassembly procedures.

With force-feedback implemented on four fingers, the exoskeleton glove gives the wearer the sensation of manipulating real physical objects, exactly as they are displayed on screen or in a virtual environment. The glove could also be used to remotely control a slave robotic hand or manipulator.

“Next, we’ll implement hand position tracking using an IR camera and reflective markers placed on the exoskeleton,” Weistroffer told eeNews Europe, showing round recesses in the 3D-printed plastic casing. “In the future, the hand could be mounted on a full haptic arm exoskeleton to implement stronger force feedback, so you would not be able to force your way through a virtual object,” the researcher added.
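The point about not being able to force one's way through a virtual object can be illustrated with the classic approach to haptic rendering: the feedback force grows with penetration into a virtual surface, but is clamped at the actuator's peak output. The sketch below is not CEA-LIST code; the stiffness and force limit are made-up illustrative values.

```python
# Minimal sketch (not CEA-LIST code) of penetration-based haptic
# rendering against a 1-D virtual wall. A lightweight glove can only
# push back up to its peak force, so a determined user can still push
# through; a grounded arm exoskeleton raises that clamp.

def render_contact_force(finger_pos, wall_y, stiffness=800.0, f_max=10.0):
    """Wall occupies y < wall_y; returns the force (N) fed back to the finger."""
    penetration = wall_y - finger_pos   # > 0 when the finger is inside the wall
    if penetration <= 0.0:
        return 0.0                      # free space: no feedback
    force = stiffness * penetration     # virtual spring pushes the finger out
    return min(force, f_max)            # clamped by the actuator's peak force
```

With `f_max` around 10 N (roughly the 1 kg peak force quoted later in the article), any push harder than that limit would carry the finger through the virtual surface.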

At the fingertips, a small rotating cylinder provides haptic feedback, simulating the shear forces exerted on the finger pulp as one drags a finger across a surface. At this stage, the bulky prototype is only a demonstrator, but Haption may integrate it further, optimizing its construction for industrialization.

CEA-LIST’s freshly assembled MANDARIN demonstrator.

“Renault is interested in this project to train its operators in disassembling electric batteries,” explained Weistroffer. “Using virtual environments, they could train several operators at once, and they wouldn’t have to re-assemble physical models before the next disassembly session, which consumes a lot of time.”

We reached out to Florian Gosselin, project leader at CEA-LIST’s Interactive Robotics Lab (LRI), for more specific details about the haptics implementation.

“The hand exoskeleton combines a peak force feedback of over 1 kg at the fingertips upon grabbing an object with additional haptic feedback on individual fingers, thanks to a small rotating drum in contact with the pulp of each fingertip,” he explained.

“The rotating drums emulate the sensation that the wearer would perceive when sliding their fingers across a surface,” Gosselin added, “but we have also designed another haptic combination we call Haptips, where a small tip presses against the pulp of the fingertips.” Using micro-motors and miniature cable pulleys, the tip is displaced in a plane frontal to the fingertip so as to simulate shear force feedback in 2D.
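The control idea described above can be sketched as a mapping from a desired 2D shear force at the fingertip to a tip displacement in the plane of the finger pulp, bounded by the mechanism's small workspace. This is a hypothetical illustration of the principle only; the gain and workspace radius are invented, not CEA-LIST values.

```python
import math

# Hypothetical sketch of the Haptip principle: a desired 2-D shear
# force on the finger pulp is rendered as a proportional tip
# displacement, limited to the cable-pulley mechanism's workspace.
# The compliance gain and radius below are illustrative values.

def shear_to_displacement(fx, fy, compliance=2.0, r_max=3.0):
    """fx, fy: desired shear force (N); returns the tip offset (mm) in the pulp plane."""
    dx, dy = compliance * fx, compliance * fy   # displacement proportional to force
    r = math.hypot(dx, dy)
    if r > r_max:                               # stay inside the mechanical workspace
        dx, dy = dx * r_max / r, dy * r_max / r
    return dx, dy
```

Saturating at the workspace boundary (rather than failing) is the natural choice here, since the device can only ever approximate large shear forces with a limited tip travel.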

The fingertip-mounted Haptip devices.

“The Haptip fingertip-mounted feedback actuators do not require a full hand exoskeleton to operate; they are lighter, less obtrusive and offer a one-size-fits-all solution. In that case, we can keep track of the fingers’ positions relative to the palm using a commercial Leap Motion IR sensor. The Haptips could also be mounted at the tip of the hand exoskeleton in place of the rotating drums.”

Exploded view of the Haptip device
(top right: the assembled device within its housing).

“With these two haptic effects, we combine full-hand force feedback for the grabbing action with the fingertip-level precision feedback necessary to allow the wearer to turn a knob with precision, for example,” commented Gosselin.

“For a fully integrated exoskeleton solution, we have also designed innovative optical sensors on the joints, combined with optical encoders on the motors, to detect the individual fingers’ positions. The whole-hand position is acquired using standard motion capture.”
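Once the joint sensors report an angle for each finger joint, a simple forward-kinematics chain is enough to recover the fingertip position relative to the palm. The planar sketch below illustrates that step under stated assumptions: a three-link finger with made-up link lengths, not the actual exoskeleton geometry.

```python
import math

# Illustrative sketch: recover a fingertip position from per-joint
# flexion angles via planar forward kinematics. The three link
# lengths (proximal, middle, distal phalanges, in mm) are invented.

def fingertip_position(joint_angles, link_lengths=(40.0, 25.0, 20.0)):
    """joint_angles: flexion per joint (radians); returns (x, y) in mm, palm frame."""
    x = y = 0.0
    theta = 0.0
    for angle, length in zip(joint_angles, link_lengths):
        theta += angle                  # flexion angles accumulate along the chain
        x += length * math.cos(theta)
        y += length * math.sin(theta)
    return x, y
```

With all joints at zero the finger is fully extended, so the fingertip sits at the sum of the link lengths along the palm axis.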

The technology could find other use cases such as health rehabilitation and technical education, and in the long term it may even trickle down into consumer applications such as video games, for improved interactivity.
