Soft robots could soon be everywhere: the squishy, malleable buggers might lead search and rescue missions, administer medication to specific organs, and maybe even crawl up your butt.
And now, soft robots will know how and when they’ve been bent out of shape — or shot full of holes by Arnold Schwarzenegger.
The trick is to simulate an animal’s peripheral nervous system with a network of fiber optic cables, according to research published Wednesday in the journal Science Robotics.
The Cornell University scientists behind the project hope that the tech could be used to build robots with a sense of whether they’ve been damaged.
As the fiber optic cables, encased in a block of smart foam, bend and twist, the pattern and density of the light traveling through them changes in specific ways.
But the differences in light among various movements and manipulations are too minute for a human to spot, so the researchers trained a machine learning algorithm to analyze the shifts.
The AI system was trained to track how the light traveling through the fiber optic cables changed based on how researchers bent the foam.
Once it picked up on the patterns, according to the research, the machine learning algorithm could predict the type of bend with 100 percent accuracy — it always knew whether the foam was bent up, down, left, or right, or which way it had been twisted.
From there, the system could estimate how far the foam had been bent or twisted, to within a margin of 0.06 degrees.
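To get a feel for how a model could classify bends from light readings, here is a minimal, purely illustrative sketch: it invents fake fiber-optic intensity vectors (each bend dims a different subset of fibers) and classifies them with a simple nearest-centroid rule. The sensor layout, noise levels, and classifier are all assumptions for illustration — the Cornell team's actual data and machine learning model are different and far more sophisticated.

```python
# Hypothetical sketch, NOT the paper's method: classifying bend direction
# from simulated fiber-optic light-intensity readings.
import random
from statistics import mean

random.seed(0)

BENDS = ["up", "down", "left", "right", "twist"]

def simulate_reading(bend, n_fibers=8):
    """Fake light-intensity vector: each bend attenuates a different subset of fibers."""
    base = [1.0] * n_fibers
    idx = BENDS.index(bend)
    for i in range(n_fibers):
        if i % len(BENDS) == idx:
            base[i] -= 0.5  # this bend dims 'its' fibers
    # small Gaussian noise stands in for measurement jitter
    return [v + random.gauss(0, 0.05) for v in base]

def centroid(vectors):
    """Component-wise mean of a list of equal-length vectors."""
    return [mean(col) for col in zip(*vectors)]

# "Training": average 50 noisy readings per bend class.
train = {b: centroid([simulate_reading(b) for _ in range(50)]) for b in BENDS}

def classify(reading):
    """Pick the bend whose centroid is closest (squared Euclidean distance)."""
    return min(train, key=lambda b: sum((r - c) ** 2 for r, c in zip(reading, train[b])))
```

Because each simulated bend leaves a distinct intensity signature, even this toy classifier separates the five cases cleanly; the real system faces the much harder problem of subtle, overlapping light shifts, which is why a learned model is needed at all.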
Someday, technology like this fiber optic network might give rise to robots that could teach themselves to walk, the researchers said.
With this new form of high-tech proprioception, the sense that lets us determine where our limbs are in space without looking, futuristic robots may be able to keep track of their own shape, detect when they’ve been damaged, and better understand their surroundings.