Kirk Klasson

Somatosensoricity: Mastering Touch

 

What’s in a name? An ancestor, a zeitgeist, some whimsy and dreams? Entrepreneurs often think long and hard about names before the marketing types get involved. Because once names are set, they get sticky, and sometimes the goo won’t come off your shoe.

Take Smart Dust. For years it sat on the front stoop of Gartner’s Hype Cycle staring up at a slope that would have caused Edmund Hillary to choke, unable to move and stymied by little more than an unfortunate name. It first made it to the bottom of the Slope of Hype in 2003 and stayed there for a while before falling off. Then it reappeared in 2013 and managed to hang on until 2020 when once again it disappeared. Not unlike artificial intelligence, which for decades conjured up expectations that are nowhere near the machine-based recognition of associable artifacts that has finally gained traction (see AI’s Inconvenient Truth – July 2018), Smart Dust, or MEMS (microelectromechanical systems), consists of tangible, proven technologies. Only they were cast in an application metaphor whose technical challenges were impossible to surmount. Legend had it that once thrown to the wind, dropped in an ocean or injected into your bloodstream these dust-sized sensors would detect sophisticated phenomena and transmit their discoveries through a self-powered, self-sustaining network, like an intelligent bloom of bioluminescent plankton blinking encoded data to each other and to you, a feat that to this day remains beyond human engineering.

But lucky for us, the dust didn’t die.

Over the last couple of years, employing a broad range of techniques and technologies, including MEMS-based sensors, scientists have come tantalizingly close to perfecting an artificial sense of touch: somatosensoricity.

The physiology of tactile sensation has been well understood for years. The discrete mechanoreceptors that capture the sensations of temperature, pain and physical encounters are well known, as is their connection to the nervous system and its connection to the brain. Even their relative distribution and density across various parts of the human anatomy has been established. But only recently has science been able to articulate and codify the rules that govern the propagation of sensation, a “universal law of touch”.

Researchers at the University of Birmingham, exploring the mechanoreceptors known as Pacinian corpuscles, determined that they “sense” a particular type of vibration known as a Rayleigh wave, a seismic phenomenon common to earthquakes. With this determination they were able to propose and confirm a precise mathematical model for the propagation of sensation through the elastic substrates that make up human skin. On its own, this is an outstanding accomplishment, but for the field of somatosensory engineering it could be a defining moment, causing a rapid coalescence of polymer, sensor, communications and AI technologies.
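To make the physics a little more concrete, here is a minimal sketch, in Python, of how far such a wave spreads through a skin-like elastic substrate. It uses the standard Viktorov approximation for Rayleigh wave speed; the material values and the 250 Hz stimulus (near the Pacinian corpuscle’s peak sensitivity) are illustrative order-of-magnitude assumptions, not figures from the Birmingham model.

```python
import math

def rayleigh_wave_speed(shear_modulus_pa: float, density_kg_m3: float,
                        poisson_ratio: float) -> float:
    """Approximate Rayleigh wave speed in an elastic solid.

    Uses the standard Viktorov approximation
    c_R ~= c_s * (0.862 + 1.14*nu) / (1 + nu),
    where c_s = sqrt(G / rho) is the shear wave speed.
    """
    c_s = math.sqrt(shear_modulus_pa / density_kg_m3)
    return c_s * (0.862 + 1.14 * poisson_ratio) / (1 + poisson_ratio)

# Illustrative, order-of-magnitude values for soft tissue:
G   = 20e3    # shear modulus ~20 kPa (skin is soft)
rho = 1000.0  # density close to water, in kg/m^3
nu  = 0.48    # nearly incompressible

c_r = rayleigh_wave_speed(G, rho, nu)
f   = 250.0               # Hz, near the Pacinian corpuscle's peak sensitivity
wavelength = c_r / f      # how far one vibration cycle spreads through skin

print(f"Rayleigh speed ~{c_r:.1f} m/s, wavelength ~{wavelength * 100:.1f} cm")
```

Back of the envelope, a 250 Hz vibration in soft tissue travels a few meters per second and spreads over a wavelength of a centimeter or two, which is roughly the scale over which receptors buried in the skin could share a single sensation.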

Today’s artificial skin is a layered matrix of disparate microelectromechanical sensor technologies, smart dust, embedded in substrates of flexible polymers, orchestrated by various communications methods and protocols and interpreted by special purpose artificial intelligence schemes. In short, a biomimetic contraption of nearly unfathomable complexity and one that’s ripe for simplification under a unifying principle. And it could be that “the law of touch” provides one, linking the transmissive properties of the polymer to the oscillating sensitivity of the mechanoreceptors to the synchronicity of the communications protocol to the discrimination of sensation made by artificial intelligence.
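As a caricature of that stack, here is a minimal sketch in Python with invented names: a MEMS layer that delivers a frame of “taxel” (tactile pixel) pressures, a conditioning step that stands in for the communications plumbing, and a crude intensity rule that stands in for the AI layer. It does not reflect any particular product’s API; it is only meant to show how the layers compose.

```python
import numpy as np

def read_taxel_frame(rows: int = 16, cols: int = 16) -> np.ndarray:
    """Stand-in for the MEMS sensor layer: one pressure frame in kPa."""
    return np.abs(np.random.default_rng().normal(0.0, 5.0, (rows, cols)))

def denoise(frame: np.ndarray, threshold_kpa: float = 2.0) -> np.ndarray:
    """Stand-in for the communications/conditioning layer: drop weak readings."""
    return np.where(frame > threshold_kpa, frame, 0.0)

def classify(frame: np.ndarray) -> str:
    """Stand-in for the AI layer: crude intensity-based discrimination."""
    peak = frame.max()
    if peak == 0.0:
        return "no contact"
    return "light touch" if peak < 10.0 else "firm press"

frame = denoise(read_taxel_frame())
print(classify(frame),
      f"(peak {frame.max():.1f} kPa over {np.count_nonzero(frame)} taxels)")
```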

Source: “A Survey of Tactile-Sensing Systems and Their Applications in Biomedical Engineering”, Shenzhen Institute of Advanced Technologies, Shenzhen College of Advanced Technologies.

The sense of touch is a wondrous albeit, when it comes to technology, somewhat ignored phenomenon. Touch is our most primal sense. Before there were eyes and ears, life relied on touch to acquire its knowledge of the surrounding environment, kind of like a Roomba trapped beneath a couch with lots of legs. The first sense that develops in the womb is touch. And some scientists believe that it is our last sense to leave when we expire. Recently, it’s been shown that touch triggers the production of oxytocin, and oxytocin promotes a sense of well-being as well as the efficacy of other therapies in humans. If you doubt the primacy of touch, simply hold a newborn or spend a few minutes in an animal shelter, where the occupants can explain the longing for and fear of physical sensation. And yet when you consider the share of innovation dollars invested in somatosensory technologies up against those devoted to its visual and aural siblings, it appears modest indeed. Smaller still when you leave out the haptics portion of the portfolio. Which is odd when you consider the fact that nearly every human affordance, the buttons, knobs and dials that we use to control our lives, incorporates the sense of touch (see The More I Buy Gadgets the Better I Like Furniture – October 2011).

Like many biomimetic technologies, progress in touch has rested mostly with universities, Pacific Rim universities in particular, and not major corporations (see AI: Waking the Neuromorphic – February 2018). But signs point to this changing, especially as the application portfolio for these technologies increases and the populations of developed countries continue to age. Initially, artificial skin technologies were focused on improving prosthetics, a valuable but not necessarily large market. Haptic applications, at least the force feedback segment, have been well established for some time, but the degree of sensitivity provided by current versions of artificial skin technology exceeds most industrial requirements. This would suggest that current artificial touch technologies are destined to intercept a future generation of highly sophisticated human-focused robotics.

But getting there will likely require a number of intermediary steps that meld robotics and human affordances in explicitly practical ways. One obvious candidate would be the recently introduced shelf-stocking robots that debuted in Japanese convenience stores. These machines are being “taught” how to handle and place products by remote human mentors and, in the process, how to discriminate how much “grip” and “slip” each package can afford. Potato chips and jalapeno dip. One of these things is not like the other. Another candidate would be telemedicine and robotic surgery, where fine-grained sensory discrimination is required. Mastering these would help pave the way to more promising applications such as robotic geriatric care managers. But either way, it appears that somatosensory technology is ripe for serious commercial investment. Perhaps firms that are in the business of handling human artifacts will become as attractive to VCs as those that recognize associable human artifacts or, as the marketing types like to say, AI. Now might be an opportune time to bring these two together.
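The grip-and-slip discrimination those shelf stockers are learning can be sketched simply: incipient slip shows up as high-frequency micro-vibration at the tactile sensor, so a controller tightens its grip just until the vibration subsides, staying below whatever force the package can tolerate. A hedged Python illustration, with all names and thresholds invented for the purpose:

```python
def slip_energy(vibration_samples: list[float]) -> float:
    """Mean squared amplitude of the tactile vibration signal."""
    return sum(s * s for s in vibration_samples) / len(vibration_samples)

def adjust_grip(force_n: float, vibration_samples: list[float],
                slip_threshold: float = 0.5, step_n: float = 0.2,
                max_force_n: float = 15.0) -> float:
    """Raise grip force while slip is detected, capped by the package limit."""
    if slip_energy(vibration_samples) > slip_threshold:
        return min(force_n + step_n, max_force_n)
    return force_n  # stable grasp: the bag of chips gets no extra squeeze

# A slipping jar calls for more force; a quiet signal leaves the grip alone.
print(adjust_grip(5.0, [1.2, -0.9, 1.1, -1.3]))    # slipping -> 5.2
print(adjust_grip(5.0, [0.1, -0.05, 0.08, -0.1]))  # stable   -> 5.0
```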

Who knows. In a couple of years we can expect to see artificial skin show up in really high-value applications like terminators and sex dolls.

And with any luck, they’ll be able to get the lid off the pickle jar.

 

Cover graphic courtesy of “A Skin-Inspired Tactile Sensor for Smart Prosthetics” by the Chinese Academy of Sciences, via Science Robotics, September 2018 issue; all other images, statistics, illustrations and citations, etc. derived and included under fair use/royalty free provisions.

