Continental brings gesture control to the steering wheel

Technology News
By Christoph Hammerschmidt

The central element of the steering-wheel-based gesture control is a time-of-flight sensor built into the instrument cluster. Integrating the sensor at this unusual position enables a solution that minimizes driver distraction and paves the way toward a holistic HMI, Continental argues. The sensor detects the motion of the hand and converts it into actions: the driver navigates through the menus by swiping up and down, and confirms a selection with a brief tapping motion.

Touch-free operation extends to other functions as well. For example, by moving the fingers up and down in a uniform movement while keeping both hands on the steering wheel, the driver can accept or reject calls. A gesture is typically a movement linked to a specific property. Thanks to the time-of-flight sensor integrated in the instrument cluster, the development achieves a high rate of gesture recognition: a 3D camera system with an integrated 3D image sensor converts the reflected infrared signal into a 3D image, so the driver's hand positions and gestures are detected with millimeter precision and converted into actions.
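To illustrate the principle, the following minimal sketch classifies a tracked hand trajectory into a swipe or a tap. It is a hypothetical illustration only: the function name, thresholds, and the assumption that the time-of-flight pipeline delivers a per-frame vertical hand position (in millimeters, image y axis pointing down) are all assumptions of this example, not Continental's implementation.

```python
def classify_gesture(y_positions, swipe_threshold=30.0, tap_threshold=5.0):
    """Classify a hand trajectory (vertical position per frame, in mm).

    The image y axis is assumed to point downward, so an upward hand
    motion decreases y. Returns "swipe_up", "swipe_down", "tap", or "none".
    """
    if len(y_positions) < 2:
        return "none"

    # Net displacement over the whole trajectory decides a swipe.
    displacement = y_positions[-1] - y_positions[0]
    if displacement <= -swipe_threshold:
        return "swipe_up"
    if displacement >= swipe_threshold:
        return "swipe_down"

    # A tap is a quick out-and-back motion: little net displacement,
    # but noticeable total travel along the way.
    travel = sum(abs(b - a) for a, b in zip(y_positions, y_positions[1:]))
    if abs(displacement) < tap_threshold and travel > 2 * tap_threshold:
        return "tap"
    return "none"
```

In a real system the trajectory would come from the 3D hand position estimated in each depth frame, and the classifier would be considerably more robust (filtering, timing constraints, confidence scores); this sketch only shows the trajectory-to-action step.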


While existing solutions frequently forced drivers to take their hands off the wheel or their eyes off the road ahead, the new solution confines the action radius far more tightly. “With gestures in a clearly defined area on the steering wheel, we can minimize distraction and increase safety. This narrowing down also prevents the driver from unintentionally starting gesture-based control by means of their usual everyday gestures, and thus making unwanted selections,” says Ralf Lenninger, head of Strategy, System Development, and Innovation in Continental’s Interior division.


The system currently detects four different gestures: setting the navigation, browsing through apps and starting music, answering calls, and controlling the on-board computer. Initial reactions from test users confirm the choice of these gestures; in particular, they welcomed the proximity to the steering wheel, operation with the thumb, and the intuitive learnability of the gestures. “The development of a holistic human-machine interface is crucial for further strengthening the driver’s confidence in their vehicle. Building up this confidence, combined with an intuitive dialog between driver and vehicle, is yet another important step on the road to automated driving, one that we are supporting with gesture-based control on the steering wheel,” Ralf Lenninger summarizes.
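The final step, routing a recognized gesture to one of the supported functions, can be pictured as a simple dispatch table. The gesture and action names below are hypothetical placeholders chosen for this sketch; they are not Continental's identifiers.

```python
# Illustrative mapping from recognized gestures to HMI actions.
# All names are invented for this example.
ACTIONS = {
    "swipe_up": "previous_menu_item",
    "swipe_down": "next_menu_item",
    "tap": "confirm_selection",
    "finger_wave": "accept_or_reject_call",
}

def dispatch(gesture):
    """Return the HMI action for a gesture, ignoring anything unrecognized."""
    return ACTIONS.get(gesture, "ignore")
```

Keeping unknown gestures on an explicit "ignore" path mirrors the article's point: everyday hand movements outside the defined set must never trigger a selection.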

The new operating concept integrates seamlessly into the holistic human-machine interface and can replace other elements such as buttons or even touch-sensitive surfaces on the steering wheel. Instead, it uses two transparent plastic panels – without any electronic components – behind the steering wheel, which the driver operates with the thumbs, almost like a touchpad. As a result, the driver benefits from intuitive operation, while vehicle manufacturers benefit from optimized system costs. The clear design of the panels is compatible with almost any control geometry, and new gestures can be added at any time. In addition, the variable complexity ensures that the system can be integrated into many different vehicle classes, not just the luxury segment.

Related articles:

Proximity Gesture Applications In Automotive HMI

Proximity and gesture recognition sensors spread to automotive dashboards

