Augmented reality gets physical with haptics
From the lab to startup companies, the race is on to add physically perceptible volumes and textures to whatever is displayed on screen, ranging from a simple keyboard with a “click” feel to the complex rendering of 3D shapes and textures, either in volume or on a seemingly flat surface.
The EuroHaptics 2014 conference, which took place in Versailles (France) from the 24th to the 26th of June, was buzzing with actuators and haptic devices of all sorts. Well over a hundred papers and posters, plus dozens of demos, were presented, covering experimental research on human touch perception at one end of the spectrum and various tangible haptic interfaces at the other, with plenty of force and feedback encoding schemes in between.
Before any sensory information can be put to good use in a haptic interface, one should understand how we humans perceive touch, and how our perception and our experience of the world affect our individual capacity to discriminate features and objects. A lot of fundamental research goes into understanding the limitations of touch-only haptic devices versus multi-modal haptics, where touch is combined with vision and/or sound to provide a better perceptual illusion.
Often, the experiments show that a multisensory interface, closer to how most of us naturally experience real-world objects, provides a much better illusion and makes it easier for the end-user to manipulate virtual objects. Sometimes, they simply highlight which dual combination would be the most effective (sight and touch, or sound and touch).
Haptic device designers can then develop tricks that tune into our perceptual illusions and create haptic feedback effects that feel stronger or different from what the interface material alone should provide (for example, feeling a textured shape on a truly flat glass surface).
One of the posters, presented by Anke Brock from the CNRS & University of Toulouse, explored the combinations of flat displays and haptics that would best suit visually impaired people for gestural interaction (touch displays often offer only visual cues).
For the purpose of her investigation, Brock designed an interactive map prototype including a raised-line map overlay for gestural interaction, with contoured buttons for accessing different types of information such as opening hours or distances. The drawings were made in the Scalable Vector Graphics (SVG) format, with a configuration file written in XML so it could be interpreted by the interface application. The physical overlay was painstakingly custom made and static, but ideally this is an area where dynamically reconfigurable haptic displays could play a bigger role.
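To give a concrete idea of the approach, here is a minimal sketch of how such an XML configuration could tie map buttons to the information the application reads back. The element names, attributes and values are assumptions for illustration; the paper does not publish its schema.

```python
# Hypothetical sketch: parsing an XML config that maps raised contour
# buttons (points of interest) to the information records the interface
# application would announce. Schema and values are illustrative only.
import xml.etree.ElementTree as ET

CONFIG = """
<map>
  <poi id="townhall" x="120" y="80">
    <info type="opening_hours">Mon-Fri 9:00-17:00</info>
    <info type="distance">350 m</info>
  </poi>
</map>
"""

def build_lookup(xml_text):
    """Index each point of interest's records by (poi id, info type)."""
    root = ET.fromstring(xml_text)
    lookup = {}
    for poi in root.iter("poi"):
        for info in poi.iter("info"):
            lookup[(poi.get("id"), info.get("type"))] = info.text
    return lookup

lookup = build_lookup(CONFIG)
print(lookup[("townhall", "opening_hours")])  # Mon-Fri 9:00-17:00
```

When the user touches a contoured button on the overlay, the application would look up the corresponding record and speak or display it.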
Nowadays, you’ll find subtle haptic effects to replace the clicking feel of buttons and scrolling wheels in high-end consumer applications, such as in the cockpit of Mercedes Benz’s latest 2014 C-Class model BR205, where the overall HMI can be controlled through a central pad with no moving parts.
The smooth curved touchpad developed by Continental combines a capacitive touchpad overlay and proximity light sensors for finger detection, and built-in coils that vibrate the assembly.
The haptic feedback characteristics can be tuned in a wide range by varying the position and force level for the “press” state and the length of the haptic pulse profile.
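The tuning described above can be sketched as a small parameterized model: a press fires once the measured force crosses a configurable threshold, and the resulting pulse has a configurable length and shape. The function names, thresholds and the decaying profile below are assumptions, not Continental's implementation.

```python
# Illustrative sketch (all names and values assumed): a "press" triggers
# once finger force crosses a tunable threshold; the haptic pulse profile
# then has a tunable duration and envelope.
def haptic_pulse(force_n, threshold_n=2.0, pulse_ms=15, sample_rate_khz=8):
    """Return a pulse waveform (drive levels) if the press threshold fires."""
    if force_n < threshold_n:
        return []                        # below the tuned press force: silent
    n = int(pulse_ms * sample_rate_khz)  # samples in the pulse profile
    # simple linearly decaying envelope; real tuning would shape this curve
    return [1.0 - i / n for i in range(n)]

print(len(haptic_pulse(3.0)))  # 120 samples: a 15 ms pulse at 8 kHz
```

Varying `threshold_n` and `pulse_ms` corresponds to the tunable press force level and pulse length mentioned above.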
Targeting high-end household appliances and medical equipment, Aito offers its so-called Software Enhanced Piezo technology, combining piezoelectric sensing and a feedback loop processed through the dedicated AitoChip companion chip to drive thin piezo-actuators stacked underneath the user interface.
Because it is a very low cost and highly reliable solution (no moving parts), Aito’s CEO Rene de Vries hopes his solution will become an industry standard, enabling the comforting “click” feel of mechanical switches even through the toughest steel, glass or ceramic casings (not excluding plastic or wood). The company has even set up a web portal, https://sep-touch.org/ to foster a community of software developers and technology partners around its Aito chip.
“Now we are too small a company to approach the automotive market, but as we get more visibility, I am sure that automakers will see the benefit of our technology,” de Vries told us, claiming that his piezoelectric solution is much more cost effective and simpler to implement than coil-based solutions.
Visibly flat but feeling rough
Researchers from the University of Electro-Communications (Japan) jointly with company EyePlusPlus Inc. demonstrated what they call a tactile vision substitution system, dubbed HamsaTouch.
The electro-tactile display features tiny electrodes on the upper side (where you would put the palm of your hand) tied to as many optical sensors on the other side of the tablet.
When fitted to the LCD screen of a smartphone, which acts as the camera and image processor extracting object and landscape contour information, the optical sensing side translates the contrast information into tactile stimulation on the palm-facing side. Spaced about 3mm apart, the tiny electrodes deliver 300V pulses at 5mA, modulated in frequency to create the tactile feel on the skin.
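The core of such a tactile vision substitution scheme is a per-electrode mapping from image contrast to pulse frequency. The sketch below shows one plausible linear mapping; the actual transfer function used in HamsaTouch is not published here, so the frequency range and formula are assumptions.

```python
# Sketch under stated assumptions: map each contrast-extracted pixel
# (0-255) to a frequency-modulated pulse rate on the matching electrode.
# The linear mapping and the 10-100 Hz range are illustrative choices.
def electrode_frequencies(gray_image, f_min=10.0, f_max=100.0):
    """Map 0-255 pixel values to per-electrode pulse frequencies in Hz."""
    return [[f_min + (px / 255.0) * (f_max - f_min) for px in row]
            for row in gray_image]

frame = [[0, 128, 255]]  # one row of a contrast-extracted camera frame
print(electrode_frequencies(frame))
```

Dark regions then pulse slowly and bright contours pulse fast, which is what lets the palm pick out lines and moving edges.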
One could easily identify lines, moving contours, or swipe their fingers to “read” the contrasted image in a braille-like fashion.
A sponsor of the event, Finnish company Senseg offers a transparent coating they call the Senseg Tixel, together with a driver chip to manage electrical signals sent to the Tixel surface.
By modulating the signal, the company’s Tixel delivers a sophisticated sensation of touch and texture using an electrostatic field (the Coulomb force creating an attraction between bodies of different electrical charges).
When a finger is dragged across the surface, the so-called electrovibration can create varying degrees of friction, making the finger slip more or less as if it encountered ridges or asperities, which can complement the visual cues (sandpaper backgrounds, rough stone, buttons, etc.).
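The physics behind this friction modulation can be sketched simply: the electrostatic attraction adds to the finger's normal force, and attraction grows roughly with the square of the drive voltage, so sliding friction tracks the modulation signal. The constants below are placeholder values, not Senseg's parameters.

```python
# Illustrative sketch: in electrovibration, Coulomb attraction adds to the
# normal force, so modulating drive voltage modulates sliding friction.
# mu, k and the finger's normal force are placeholder values.
def friction_force(v_drive, n_finger=0.5, mu=0.8, k=1e-4):
    """Sliding friction with an added electrostatic term ~ k * V^2 (newtons)."""
    f_electrostatic = k * v_drive ** 2   # attraction grows with V squared
    return mu * (n_finger + f_electrostatic)

# Sweeping the voltage with a texture signal varies perceived roughness:
for v in (0.0, 50.0, 100.0):
    print(round(friction_force(v), 3))   # 0.4, 0.6, 1.2
```

Driving the voltage with a texture waveform (e.g. a noisy signal for sandpaper) then makes the flat glass feel rough or ridged.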
Senseg has recently secured $6 million in a Series B round of funding led by NXP Semiconductors NV, but it has yet to see its technology in a commercial product.
Another interesting paper presented by researchers from the Korea Science Academy of KAIST illustrated a surface display enabling realistic 3D haptic rendering with both kinesthetic feedback (position, force, orientation) and tactile feedback (contact pressure, slip, vibration).
In a fully transparent layered approach, the researchers combine electrovibration (the use of a frequency-modulated electrostatic force through a capacitive gap) with mechanical vibration distributed uniformly on the screen’s surface.
While 3D geometric features can be represented by adjusting the lateral friction forces using electrovibration, tactile patterns are generated through mechanical vibration to convey 2D texture information at the surface of the geometric features. Here, both the mechanical vibration and the electrovibration can be driven independently to simulate all touch-aspects of real objects, on a flat screen.
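The two independent channels described above can be sketched as follows: one signal shapes lateral friction as a function of finger position (geometry), while the other vibrates at a fixed texture frequency. The bump shape, 250 Hz texture tone and amplitudes are assumptions for illustration, not the KAIST authors' signals.

```python
# Sketch, assuming independent channels as in the KAIST surface display:
# electrovibration encodes 3D geometry via position-dependent lateral
# friction, while mechanical vibration encodes fine surface texture.
import math

def render_sample(x_mm, t_s):
    """Return (electrovibration amplitude, mechanical vibration amplitude)."""
    # geometry channel: a bump centered at x = 10 mm raises lateral friction
    geometry = math.exp(-((x_mm - 10.0) ** 2) / 8.0)
    # texture channel: a 250 Hz vibration conveys fine surface grain
    texture = 0.3 * math.sin(2 * math.pi * 250 * t_s)
    return geometry, texture   # the two are driven independently

print(render_sample(10.0, 0.0))   # peak of the bump, texture at zero crossing
```

Because the two drive signals never mix, the geometry of a feature and the texture on its surface can be varied independently on the same flat screen.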
For some, altering flat screen surfaces is not enough. In a paper titled “Diminished Haptics: Towards Digital Transformation of Real World Textures”, researchers from the University of Tokyo developed a method for altering real-world textures, for example, from paper-like to metal-like, from wood-like to paper-like.
The researchers use a 28kHz ultrasonic transducer coupled to the object whose texture should be altered. By controlling the amplitude of the input signal, they are able to set the levitation height of the finger relative to the material’s surface, based on the squeeze-film effect.
To approach the real thing, they use high-resolution real-world textures (collected using a three-axis accelerometer) instead of synthesized data. The idea is that various textures could be mapped onto an existing manufactured prototype to reflect CAD changes during product development. By tracking the finger position and projecting images of different textures across the surface, the researchers also envisage simulating patches of different textures on the same surface.
As a first step, the researchers attenuate (“diminish”) the object’s original real-world texture. In a second pass, they rewrite the texture by applying ultrasonic vibrations at the appropriate frequencies.
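The two-pass idea can be sketched as a per-sample mix: the squeeze-film levitation attenuates what the finger feels of the native surface, and the recorded target texture is overlaid on top. Treating both passes as signal attenuation and addition is a simplification of the actual ultrasonic mechanism, and all values are assumed.

```python
# Two-pass sketch of the "diminished haptics" idea (signal model assumed):
# pass 1 attenuates the native texture via squeeze-film levitation;
# pass 2 overlays a recorded accelerometer trace of the target material.
def rendered_texture(native, recorded, suppression=0.9):
    """Per-sample mix: attenuate the native texture, add the recorded one."""
    return [n * (1.0 - suppression) + r for n, r in zip(native, recorded)]

native_paper = [0.2, -0.1, 0.3]       # trace of the real (paper-like) surface
recorded_metal = [0.05, 0.0, -0.05]   # pre-recorded target (metal-like) texture
print(rendered_texture(native_paper, recorded_metal))
```

With `suppression` near 1.0 the native feel all but disappears and the recorded texture dominates, which is the paper-to-metal transformation the authors describe.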