By Russ Banham
Of the five senses, touch is arguably the most underappreciated. If limited to just four, many people would likely choose sight, hearing, smell, and taste, not realizing the vital role touch plays in conveying information.
Our skin is packed with receptors whose signals the brain interprets as an object’s shape, weight, texture, and vibration. In tandem with information transmitted by the other senses, a fuller picture emerges of the object’s significance. Take away touch and this significance is diminished. That’s where haptic technology, or haptics, comes in.
Haptics is any form of interaction involving touch. In the context of technology, haptic tools provide extraordinary opportunities to impart information through touch. Take a virtual flight simulator for airline pilot training: Physical flight simulators with actual flight instruments cost millions of dollars, while virtual flight simulators cost a comparative pittance. The problem with virtual reality (VR) pilot training, however, is the lack of physical sensation: the feel of pressing a button or sliding a lever.
“There’s no muscle memory associated with operating the flight instruments,” says Jake Rubin, founder and CEO of HaptX, a Seattle-based leader in haptic technology. “By removing touch, the brain discounts the authenticity of the experience.”
HaptX has developed a haptic glove that provides 130 points of feedback to the wearer’s hand, conveying the shape, texture, and motion of objects. In a VR flight simulator, when the trainee sees a virtual button and goes to press it, the glove creates the feeling of pressure on the finger at exactly the moment the button is pushed.
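HaptX’s control software isn’t public, but the trigger logic described above (feedback arriving at the instant of contact) can be sketched in a few lines. Everything here, the `Button` type, the `fingertip_pressure` function, and the linear pressure ramp, is an illustrative assumption, not HaptX’s API:

```python
from dataclasses import dataclass

@dataclass
class Button:
    """Center and radius of a virtual cockpit button, in meters."""
    x: float
    y: float
    z: float
    radius: float

def fingertip_pressure(finger_pos, button, max_pressure=1.0):
    """Return actuator pressure (0..max) from fingertip proximity.

    Pressure ramps up as the tracked fingertip penetrates the button's
    radius, so the wearer feels contact at the instant of the press.
    """
    dx = finger_pos[0] - button.x
    dy = finger_pos[1] - button.y
    dz = finger_pos[2] - button.z
    dist = (dx * dx + dy * dy + dz * dz) ** 0.5
    if dist >= button.radius:
        return 0.0  # no contact: actuator stays off
    # Linear ramp: deeper penetration -> firmer feedback
    return max_pressure * (1.0 - dist / button.radius)
```

A real glove would evaluate something like this for each of its feedback points at a high refresh rate; the sketch shows only the proximity-to-pressure mapping.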
Aside from its work with virtual flight simulators, HaptX also is engaged in a partnership with Nissan to produce a less expensive, immersive 3-D automobile design process.
“Typically, it takes months and costs hundreds of thousands of dollars to design a new vehicle using traditional processes,” says Rubin. “A simple thing like wanting a button in a slightly different spot on the dashboard can eat up weeks of time.”
Although automakers use CAD (computer-aided design) software in the early phases of design, they typically ship physical clay and wood mock-ups across the world for other designers to evaluate, he explains. HaptX and Nissan are working on next-generation haptic gloves that will allow designers to experience a new car model in virtual reality.
Inside the virtual model, a designer could operate different instruments like the steering wheel and navigation system, with the respective physical sensations transmitted by the haptic gloves (similar to this video by Varjo).
“Today, a designer pretends to press a button, but has no idea if it’s in an ergonomically correct position,” says Rubin. “Wearing the gloves, they can feel the buttons they press in collaboration with other designers across the world, sharing their experiences in real time and making needed changes using the CAD software.”
Another market is robotics. HaptX has created a tactile telerobot that lets a glove-wearing user feel, through force feedback, what the robot touches, thereby guiding the machine’s movements. The robot can be anywhere in the world, allowing for remote operations in manufacturing, medical, and industrial hygiene contexts.
“We’re looking at applications like a surgeon wearing the gloves running through a procedure on a patient that is performed by a robot, or a mechanic operating the controls of a bot that does maintenance on a machine,” Rubin says.
These are just one developer’s innovative haptic concepts. Here’s a look at three other haptic technology pioneers bringing the next generation of touch to applications in the medical, automotive, manufacturing, gaming, and other sectors.
Lofelt: Putting Touch Into Touchscreens
Smartphones are among the many products that already incorporate haptics. While the vibrations produced when a phone call is coming in aren’t especially jaw-dropping, Lofelt is aiming to change the status quo. The 5-year-old Berlin-based haptics provider has developed a vibrotactile component that simulates textures on a glass surface, giving smartphone and tablet users a more multi-sensory experience.
“Our goal was to develop vibrations that captured the physical textures associated with traditional sounds,” says Lofelt co-founder and CEO Daniel Büttner. “Since we have a strong audio background, we realized that haptic information is hidden in the audio—a foot crunching through snow, for example. Each sound produces a different vibration in the physical world. We can recreate that physical experience when touching the screen.”
The tactile sensations felt by the user’s fingertips while touching the screen—imagine feeling a repetitive crunching vibration to the beat of each foot walking through the snow—are aligned in the brain with the visual and audio components, creating an “immersive experience,” Büttner says. “It’s all an illusion, but it seems incredibly real.”
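Lofelt’s audio-to-haptics pipeline is proprietary, but the core idea Büttner describes, deriving a vibration pattern from a sound’s amplitude contour, can be sketched roughly as follows. The function name and frame size are assumptions for illustration:

```python
def haptic_envelope(samples, frame_size=4):
    """Reduce an audio waveform to a coarse amplitude envelope.

    Each frame's peak amplitude becomes one vibration-strength value,
    so a repetitive sound (footsteps crunching through snow) yields a
    matching repetitive pulse pattern for the actuator.
    """
    envelope = []
    for start in range(0, len(samples), frame_size):
        frame = samples[start:start + frame_size]
        envelope.append(max(abs(s) for s in frame))
    return envelope
```

Played back on an actuator, each envelope value drives one pulse of vibration strength, which is how a rhythmic sound can become a rhythmic tactile sensation under the fingertips.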
Applications include the gaming, automotive, and medical sectors. “We can make steering wheels more vibrotactile to alert the driver of different dangers, such as when the car is veering out of the lane or another vehicle up ahead just hit the brakes,” says Büttner. “Each danger would involve a different tactile feeling, creating a muscle memory in the brain.”
In the medical arena, Lofelt is engaged in a project to apply haptic technology to helping treat Parkinson’s disease. “With Parkinson’s, the brain sends out a plethora of different signals, which contributes to the tremors people experience,” Büttner explains. “If you keep the brain busy with vibrotactile information, it seems to cut down on the number of signals the brain is processing, which may reduce related tremors.”
Ultraleap: Ultrasound Tactile Sensations
Ultraleap is taking haptic technology into a new realm altogether. Rather than producing touch sensations through a glove or a screen, Ultraleap’s mid-air haptic technology harnesses inaudible ultrasound waves to let users feel, shape, and manipulate objects visualized in a 3D hologram, literally creating the sensation of touch in midair.
“When the user’s hand reaches the surface of the hologram, we can deliver a sense of touch by using software to track the hand’s movements and then sending ultrasound waves to the hand the moment the virtual object is ‘touched,’” says Anders Hakfelt, senior vice president of product and marketing at Ultraleap.
Ultraleap is a combination of two previously separate companies with complementary technologies, UltraHaptics and Leap Motion, which merged last month. UltraHaptics is a pioneer in midair haptic technology, and Leap Motion is a leader in hand-tracking technology, which follows the hand’s movement in space to determine its precise location. Ultraleap’s virtual tactile interfaces have applications in several sectors, including automotive, advertising, immersive entertainment, and public transportation.
Hakfelt provided an example of how the technology could assist a commuter with limited mobility traveling in a bus. “A simple thing like pressing the stop button can be tricky in a crowd,” he says. “Using the hand-tracking technology, the tool can see when someone raises her hand. It is then able to position the ultrasound to simulate the feeling of a button in her hand via vibration. The person taps her hand once, which sends information to the driver to stop the bus.”
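As a rough illustration of the bus scenario, here is a toy state machine combining the two pieces Hakfelt describes: hand tracking to detect a raised hand, and a tap while the mid-air “button” is active. The class name, threshold height, and per-frame interface are all hypothetical, not Ultraleap’s API:

```python
class StopRequestSensor:
    """Toy state machine for a mid-air bus stop button.

    When the tracked hand rises above a threshold height, an
    ultrasound focal point is treated as projected onto the palm;
    a tap while the focal point is active signals the driver.
    """
    def __init__(self, threshold=1.4):  # assumed hand height in meters
        self.threshold = threshold
        self.focal_active = False

    def update(self, hand_height, tapped):
        """Feed one tracking frame; return True when the stop fires."""
        self.focal_active = hand_height >= self.threshold
        # A tap only counts while the mid-air button is being projected
        return self.focal_active and tapped
```

In practice the hand tracker supplies the height and tap events, and a phased array of ultrasound transducers steers the focal point; the sketch covers only the decision logic between them.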
Hakfelt predicts the eventual use of haptic technologies by surgeons to perform remote operations, in which a surgeon in one part of the world remotely controls a scalpel in another part of the world. “While it will require multiple technology interfaces, it is certainly within the realm of the possible,” he says.
NewHaptics: A Better Braille
NewHaptics’ singular purpose is to improve the lives of blind people by enabling more immersive digital interactions. While refreshable braille displays that raise dots to form words are available for the sight-impaired, they permit only three or four words of text to be read at a time.
“A profoundly limited view of the digital world is available to the blind, with no means of accessing spatial content like diagrams and graphs, much less full-page refreshable information,” says Alex Russomanno, NewHaptics cofounder and CEO.
The company’s novel technology borrows from the Pinscreen toy of the 1980s, a box packed with a dense array of sliding pins into which people pressed their hands or faces to create an exact 3D sculptural relief. NewHaptics has traded the pins for tactile pixels, called “taxels,” that rise to the surface of a tablet to produce images like a diagram, bar chart, or line chart from a spreadsheet, as well as full-page displays of refreshable braille text.
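As a minimal sketch of the taxel idea, the snippet below rasterizes bar-chart values onto a small binary grid: raised taxels are 1, lowered taxels are 0. The function name and grid layout are assumptions for illustration; the default 6 x 10 grid simply gives 60 taxels, matching the scale of NewHaptics’ prototype:

```python
def bars_to_taxels(values, rows=6, cols=10):
    """Rasterize bar-chart values onto a small grid of taxels.

    Each grid column is one bar; a taxel is raised (1) if the bar's
    normalized height reaches that row, and lowered (0) otherwise.
    """
    peak = max(values) or 1  # avoid dividing by zero on empty charts
    grid = [[0] * cols for _ in range(rows)]
    for col, v in enumerate(values[:cols]):
        height = round(rows * v / peak)
        for row in range(height):
            grid[rows - 1 - row][col] = 1  # fill each bar bottom-up
    return grid
```

A full-page display would refresh such a grid continuously, and braille text would be rendered the same way, with each character occupying its own small block of taxels.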
Funded by the National Science Foundation and the National Institutes of Health, NewHaptics’ small-scale prototype is composed of 60 taxels in a small grid. In 2020, a larger prototype will be available and marketed to the sight-impaired. “What is exciting for us is to help people who are blind to process information in ways they haven’t before,” says Russomanno.
In doing so, touch is taking on new meaning and value.
Russ Banham is a Pulitzer-nominated journalist and best-selling author.