Image credit: Ultraleap
When our fingers touch an object, nerve endings register the resulting vibrations and send signals to the brain, which interprets them to identify the object's texture. The fingertips are chock-full of nerve receptors, and researchers are finding ways to mimic them to bring sensation to robots and prosthetics.
A team at Texas A&M University is studying how to enable haptic feedback on a touchscreen. Modulating the friction between a finger and a screen, through minuscule adjustments in moisture or electrical charge, could let a person feel textures through a touchscreen. In a medical setting, the technology could allow doctors to feel swollen glands using special sensing devices. A pharmacy, for example, could equip a robot, a glove, or another tool with sensors that send haptic data to a physician in a remote location for interpretation. The same technology could allow online shoppers to feel the texture of a fabric—is it scratchy or soft? Rigid or supple?—through their smartphone or tablet.
A similar technology has emerged from Northwestern University’s McCormick School of Engineering. Researchers there have developed a pliable skin patch with an integrated, flexible circuit board and tiny actuators that generate vibrations when they sense a current. The team created a pressure-sensitive touchscreen interface that allows a person to send haptic sensations, such as tapping or stroking, to the patch wearer. The patch technology is being tested on prosthetics, enabling a person with an artificial limb to feel haptic sensations.
Sensing Something in the Air
Ultraleap has developed a plug-and-play haptic solution called the Stratos Inspire for movement-triggered device control. The system uses an array of 256 tiny ultrasound speakers that are pre-programmed to emit high-frequency (and thus inaudible) sound waves in specific patterns. The sound waves reach a focal point simultaneously, creating just enough pressure in the air to dent the skin.
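The focusing trick described above is a classic phased-array technique: emitters farther from the target fire slightly earlier so every wavefront arrives at the focal point at the same instant. The sketch below illustrates the idea with a hypothetical 16×16 grid of emitters (the array geometry, pitch, and function names are assumptions for illustration, not Ultraleap's actual firmware or API).

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C

def focus_delays(emitters, focal_point):
    """Per-emitter trigger delays (in seconds) so that all wavefronts
    arrive at the focal point simultaneously.

    Emitters farther from the focal point must fire first, so each
    delay is the farthest emitter's travel time minus this emitter's
    own travel time. The farthest emitter gets a delay of zero.
    """
    travel_times = [math.dist(e, focal_point) / SPEED_OF_SOUND
                    for e in emitters]
    latest = max(travel_times)
    return [latest - t for t in travel_times]

# Hypothetical 16x16 array (256 emitters, 10 mm pitch) in the z=0 plane,
# focusing on a point 20 cm above the array's center.
emitters = [(x * 0.01, y * 0.01, 0.0)
            for x in range(16) for y in range(16)]
delays = focus_delays(emitters, focal_point=(0.075, 0.075, 0.20))
```

Sweeping the focal point through a sequence of such positions is what lets the system trace shapes, such as virtual buttons or sliders, in mid-air.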
Image credit: Ultraleap
The Stratos system uses that air pressure to create vibrations that the nerve receptors in the hand can detect, so the user feels the sensation of touch. The signal patterns form virtual 3D controls the user can feel, which can be used to build gesture-enabled kiosks, touch-free car controls, and tactile feedback in gaming apps.
The focal point is a programmable, adjustable position in the airspace, usually where the hand would be while using the system. For example, if the technology is used near a computer, the focal point might be above the keyboard; to control the volume on a car radio, it likely would be to the right of the steering wheel. Ultraleap developed an optical motion controller that tracks the user's hand movements in real time and positions the focal point on the hand.
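Keeping the focal point glued to a moving hand means re-targeting the array every tracking frame. A minimal sketch of that update loop, assuming a simple exponential smoothing filter to suppress tracking jitter (the filter, its parameter, and the function name are illustrative assumptions, not the actual tracker's behavior):

```python
def update_focal_point(prev_focal, tracked_hand, alpha=0.4):
    """Move the focal point toward the newly tracked hand position.

    alpha in (0, 1] controls responsiveness: 1.0 snaps directly to
    the tracked position, smaller values smooth out sensor noise.
    Positions are (x, y, z) tuples in meters.
    """
    return tuple(p + alpha * (h - p)
                 for p, h in zip(prev_focal, tracked_hand))

# Each frame: read the hand position from the tracker, then re-aim.
focal = (0.0, 0.0, 0.20)
focal = update_focal_point(focal, tracked_hand=(0.05, 0.0, 0.20))
```

In a real pipeline this update would feed directly into the delay computation for the emitter array, so the pressure point follows the palm frame by frame.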
In addition to the plug-and-play version, Ultraleap offers Stratos Explore, SDKs, and APIs for developers to create customized client solutions. The company recently teamed with SimplyNUC, a Texas-based systems integrator, to offer the TouchFree Bundle, which enables gesture-based interaction with kiosks, a bonus for pandemic life. The TouchFree application operates on existing interactive kiosks, like those used in wayfinding or restaurants, with minimal retrofitting and no additional code required.
The advances in haptics will not only improve sanitation efforts but also breathe life into an increasingly digitized world.
- Find out more about Ultraleap.
- Learn more about SimplyNUC.
- Learn more about the innovations at Northwestern University’s McCormick School of Engineering.
- Learn about the projects at Texas A&M University.