Stretchable sensor gives robots and VR a human touch
By David Nutt
It’s not a stretch to say that stretchable sensors could change the way soft robots function and feel. In fact, they will be able to feel quite a lot.
Cornell researchers have created a fiber-optic sensor that combines low-cost LEDs and dyes, resulting in a stretchable “skin” that detects deformations such as pressure, bending and strain. This sensor could give soft robotic systems – and anyone using augmented reality technology – the ability to feel the same rich, tactile sensations that mammals depend on to navigate the natural world.
The researchers, led by Rob Shepherd, associate professor of mechanical and aerospace engineering in the College of Engineering, are working to commercialize the technology for physical therapy and sports medicine.
Their paper, “Stretchable Distributed Fiber-Optic Sensors,” was published Nov. 13 in Science. The paper’s co-lead authors are doctoral student Hedan Bai ’16 and Shuo Li, Ph.D. ’20.
The project builds upon an earlier stretchable sensor, created in Shepherd’s Organic Robotics Lab in 2016, in which light was sent through an optical waveguide, and a photodiode detected changes in the beam’s intensity to determine when the material was deformed. The lab has since developed a variety of similar sensory materials, such as optical lace and foams.
For the new project, Bai drew inspiration from silica-based distributed fiber-optic sensors, which detect minor wavelength shifts as a way to identify multiple properties, such as changes in humidity, temperature and strain. However, silica fibers aren’t compatible with soft and stretchable electronics. Intelligent soft systems also present their own structural challenges.
“We know that soft matters can be deformed in a very complicated, combinational way, and there are a lot of deformations happening at the same time,” Bai said. “We wanted a sensor that could decouple these.”
Bai’s solution was to make a stretchable lightguide for multimodal sensing (SLIMS). This long tube contains a pair of polyurethane elastomeric cores. One core is transparent; the other is filled with absorbing dyes at multiple locations and connects to an LED. Each core is coupled with a red-green-blue sensor chip to register geometric changes in the optical path of light.
The dual-core design increases the number of outputs by which the sensor can detect a range of deformations – pressure, bending or elongation – by lighting up the dyes, which act as spatial encoders. Bai paired that technology with a mathematical model that can decouple, or separate, the different deformations and pinpoint their exact locations and magnitudes.
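To make the decoupling idea concrete, here is a minimal sketch of how such a model could work in principle. It is not the authors’ published model: it assumes a purely linear relationship in which each deformation mode contributes a fixed, pre-calibrated loss to each color channel, and it recovers the mode magnitudes with a least-squares fit. All numbers and names are illustrative.

# Toy illustration of decoupling combined deformations from color-channel
# readouts. Calibration values are hypothetical, not from the paper.
import numpy as np

# Rows: measured outputs (R, G, B changes on the dyed core, plus the overall
# intensity change on the transparent core). Columns: deformation modes
# (pressure, bend, stretch). Values are made up for illustration.
CALIBRATION = np.array([
    [0.80, 0.10, 0.05],   # red-channel response
    [0.15, 0.70, 0.10],   # green-channel response
    [0.05, 0.20, 0.60],   # blue-channel response
    [0.30, 0.30, 0.40],   # transparent-core intensity response
])

def decouple(readings):
    """Estimate per-mode deformation magnitudes from channel readings,
    assuming the linear model: readings ≈ CALIBRATION @ modes."""
    modes, *_ = np.linalg.lstsq(CALIBRATION, readings, rcond=None)
    return dict(zip(["pressure", "bend", "stretch"], modes))

if __name__ == "__main__":
    # Simulate a reading produced by pressure = 0.5 and bend = 0.2.
    true_modes = np.array([0.5, 0.2, 0.0])
    simulated_reading = CALIBRATION @ true_modes
    print(decouple(simulated_reading))

The point of the toy example is simply that having more independent outputs than deformation modes makes the inverse problem solvable; the dyed locations are what supply those extra outputs.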
Whereas distributed fiber-optic sensors require high-resolution detection equipment, SLIMS sensors can operate with small optoelectronics that have lower resolution. That makes them less expensive, simpler to manufacture and more easily integrated into small systems. For example, a SLIMS sensor could be incorporated into a robot’s hand to detect slippage.
The technology is also wearable. The researchers designed a 3D-printed glove with a SLIMS sensor running along each finger. The glove is powered by a lithium battery and equipped with Bluetooth so it can transmit data to basic software, which Bai designed, that reconstructs the glove’s movements and deformations in real time.
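As a rough illustration of what that reconstruction software might do, the sketch below assumes a made-up streaming format in which the glove sends one comma-separated line per frame (a timestamp followed by one intensity value per finger) and converts each value into an approximate bend angle with a hypothetical linear calibration. None of these details come from the paper.

# Hypothetical parser for a glove data stream. The frame format, calibration
# constants, and field names are assumptions for illustration only.
FINGERS = ["thumb", "index", "middle", "ring", "pinky"]
REST_INTENSITY = 1.0        # assumed reading with the finger straight
DEG_PER_INTENSITY = 120.0   # assumed degrees of bend per unit intensity drop

def parse_frame(line):
    """Parse 'timestamp,i_thumb,i_index,i_middle,i_ring,i_pinky' into a
    timestamp and a dict of estimated bend angles in degrees."""
    fields = line.strip().split(",")
    timestamp = float(fields[0])
    angles = {
        finger: (REST_INTENSITY - float(value)) * DEG_PER_INTENSITY
        for finger, value in zip(FINGERS, fields[1:])
    }
    return timestamp, angles

if __name__ == "__main__":
    # One simulated frame: index and middle fingers partly bent.
    ts, pose = parse_frame("12.50,1.00,0.75,0.80,0.98,1.00")
    print(ts, pose)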
“Right now, sensing is done mostly by vision,” Shepherd said. “We hardly ever measure touch in real life. This skin is a way to allow ourselves and machines to measure tactile interactions in a way that we now currently use the cameras in our phones. It’s using vision to measure touch. This is the most convenient and practical way to do it in a scalable way.”
Bai explored SLIMS’ commercial potential through the National Science Foundation Innovation Corps (I-Corps) program. She and Shepherd are working with Cornell’s Center for Technology Licensing to patent the technology, with an eye toward applications in physical therapy and sports medicine. Both fields have leveraged motion-tracking technology but until now have lacked the ability to capture force interactions.
The researchers are also looking into the ways SLIMS sensors can boost virtual and augmented reality experiences.
“VR and AR immersion is based on motion capture. Touch is barely there at all,” Shepherd said. “Let’s say you want to have an augmented reality simulation that teaches you how to fix your car or change a tire. If you had a glove or something that could measure pressure, as well as motion, that augmented reality visualization could say, ‘Turn and then stop, so you don’t overtighten your lug nuts.’ There’s nothing out there that does that right now, but this is an avenue to do it.”
Co-authors include Clifford Pollock, the Ilda and Charles Lee Professor of Engineering; doctoral student Jose Barreiros, M.S. ’20, M.Eng. ’17; and Yaqi Tu, M.S. ’18.
The research was supported by the National Science Foundation (NSF); the Air Force Office of Scientific Research; Cornell Technology Acceleration and Maturation; the U.S. Department of Agriculture’s National Institute of Food and Agriculture; and the Office of Naval Research.
The researchers made use of the Cornell NanoScale Science and Technology Facility and Cornell Center for Materials Research, both of which are supported by the NSF.