Synesthesia Suits


I just returned from SXSW, where I spent several hours with the folks from Sony CSL and had the opportunity to test out a bunch of new whole-body virtual experiences. Rather than wiring in and losing myself in the immersive story, I took my physical body along for the ride. I mean that literally––one experimental project involves a stationary bike and a video dome. At the moment, it’s a game meant to let you have fun without letting your body atrophy into the couch, but it made me think through future scenarios for human-machine perception and how computers will someday make us feel.

For example, one prototype––a synesthesia suit––is designed to let us feel music rather than simply listen to it. A traditional head-mounted VR display and sound-blocking earphones communicate with a series of 26 actuators, which vibrate in different sequences to simulate stringed instruments, woodwinds and percussion. Additional actuators could be used to artificially speed our heart rates, or make us feel as though we’re sweating. Here are some plausible future applications I’m mapping (a rough sketch of how music might be turned into vibration follows the list):

  • Haptic news stories, allowing you to feel what you see. That’s interesting for news organizations: imagine being able to feel, even in a very small way, what it’s like to cross a border as a refugee? Or to be beaten up? Or shot––by a police officer, a gang member, or a toddler playing with a gun?
  • Haptic marketing. For brands, imagine giving your customers the ability to feel the texture of a fabric or the pressure points inside a new shoe? Or a way to stimulate pleasure receptors while watching a commercial for pharmaceuticals?
  • Haptic rehabilitation therapies. For stroke victims, this early experimentation could signal a new kind of prosthesis that simulates touch and temperature.
  • Full-body haptic training for sports and fitness. For certain sports––golf especially––a synesthesia suit could offer a more advanced way to improve your game.
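Sony CSL hasn’t published how a song actually gets translated into those vibration sequences, so what follows is only a rough sketch of the general idea under the simplest assumption I can make: split each frame of audio into frequency bands and drive each of the 26 actuators with one band’s energy. The band edges, sample rate, and send_to_actuators() call are all hypothetical, not Sony’s design.

```python
# Rough sketch: map audio frequency bands to drive levels for 26 actuators.
# Everything here (band edges, sample rate, send_to_actuators) is assumed.
import numpy as np

NUM_ACTUATORS = 26
SAMPLE_RATE = 44_100  # assumed audio sample rate

def band_energies(samples: np.ndarray) -> np.ndarray:
    """Split one audio frame into NUM_ACTUATORS log-spaced bands and
    return each band's energy, normalized to 0..1 drive levels."""
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / SAMPLE_RATE)
    # Log-spaced band edges from 20 Hz to 16 kHz -- an assumption, not Sony's layout.
    edges = np.logspace(np.log10(20), np.log10(16_000), NUM_ACTUATORS + 1)
    levels = np.array([
        spectrum[(freqs >= lo) & (freqs < hi)].sum()
        for lo, hi in zip(edges[:-1], edges[1:])
    ])
    peak = levels.max()
    return levels / peak if peak > 0 else levels

# Usage: for each short frame of audio (say, 1024 samples), compute the levels
# and push them to the hardware; send_to_actuators() stands in for whatever
# bus the real suit uses.
# send_to_actuators(band_energies(next_audio_frame()))
```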

More impressive were the collaborative experiences that used VR goggles but showed real-time video rather than a rendered virtual world. I tried a human perception experiment along with three other players. Our head-mounted displays didn’t transport us into some fictional place; instead, each screen split the real world into four squares, one for each player’s perspective. To start the game, we had to cooperate, looking only through our displays, to form a square with our hands, each of us building a corner with our fingers and thumbs. Then we played tag, hiding inside a maze as we watched each other move around.
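The four-square view itself is simple to picture in code. Here is a minimal sketch, assuming each player’s headset camera produces equally sized frames and that the composite is just a 2x2 tile every headset displays; Kasahara’s actual pipeline is surely more involved, and player_cameras below is hypothetical.

```python
# Minimal sketch of the shared four-quadrant view: tile one frame from each
# player's camera into a single image that every headset displays.
import numpy as np

def quad_view(frames: list[np.ndarray]) -> np.ndarray:
    """Tile four equally sized HxWx3 camera frames into one 2Hx2W image."""
    assert len(frames) == 4, "one frame per player"
    top = np.hstack(frames[:2])      # players 1 and 2, side by side
    bottom = np.hstack(frames[2:])   # players 3 and 4, side by side
    return np.vstack([top, bottom])

# Usage (player_cameras is a stand-in for the real capture devices): every
# headset renders the same composite, so each player sees all four
# first-person perspectives at once.
# composite = quad_view([cam.read() for cam in player_cameras])
```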

This collaborative perception technology has an obvious use case for warfare. But think of all the other times it’d be useful to see multiple perspectives at once: live events (games, concerts)… doing basic electrical work inside of a home or building… remote-teaching students how to write in a foreign language… surgeons, doctors and nurses practicing a complicated surgery before performing it on a real patient… There are numerous opportunities here for brands, orchestras, universities, stadiums.

The point of the tag experiment, created by Sony CSL’s Shunichi Kasahara, is to push the limits of human perception, and for a few moments I felt physically tethered to a different kind of future, one in which more of my senses are activated at once. Here’s how Kasahara describes the technology: “For realtime observation, the reflection upon behaviors, and post analysis, we embedded a small IR camera and IR-LED into the head mounted smartphone goggles to capture the single eye image. Then we implemented an eye tracking system with standard computer vision pupil detection. This allows researchers and audience of the workshops to observe and interpret the relationship between behavior of participants and eye movement. Therefore, we can carefully observe what they do consciously and what is happening unconsciously.”
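The “standard computer vision pupil detection” Kasahara mentions is a well-worn technique. To make it concrete, here is a minimal sketch of one common approach (dark-pupil tracking on an IR eye image), not Sony CSL’s actual implementation; the threshold, size cutoff, and circularity check are assumptions that would need tuning for any real rig.

```python
# Sketch of basic dark-pupil detection on a single-eye IR camera frame,
# using OpenCV (4.x signatures). Threshold and area values are assumptions.
import cv2
import numpy as np

def detect_pupil(frame_bgr):
    """Return the (x, y) pupil center in pixels, or None if not found."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (7, 7), 0)            # suppress sensor noise
    # Under IR illumination the pupil shows up as the darkest blob in the eye image.
    _, mask = cv2.threshold(gray, 40, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    best, best_area = None, 0.0
    for c in contours:
        area = cv2.contourArea(c)
        perimeter = cv2.arcLength(c, True)
        if area < 50 or perimeter == 0:                  # ignore specks
            continue
        circularity = 4 * np.pi * area / perimeter ** 2  # 1.0 = perfect circle
        if circularity > 0.6 and area > best_area:       # keep the biggest round blob
            best, best_area = c, area
    if best is None:
        return None
    m = cv2.moments(best)
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])    # blob centroid

# Tracking this center frame-to-frame gives the gaze signal researchers can
# line up against what participants were doing in the game.
```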

I couldn’t help but think about the far future of body ownership. In these experiments, only I had the ability to move my arms and legs, but given the human-robotic interfaces I know are being researched elsewhere, it was hard not to fast-forward to scenarios in which my physical body was being controlled by a team of people, much like teams of people control robots today. Our bodies are machines, after all, directed by the squishy computers inside our skulls.

-Amy Webb
