Gesture Recognition

Gesture recognition technologies are now capable of interpreting motion to identify us and make decisions on our behalf.

Emerging gesture recognition systems are a form of natural user interface (NUI), and they will be an important component of many future technologies. Imagine picking up a digital object with your hand, or controlling a remote robotic arm without being tethered to a bundle of wires.

Gesture recognition unlocks the interplay between our physical and digital realms. Google’s Pixel 4 phone can be controlled without touching the screen. Instead, the phone’s Motion Sense feature uses radar to detect micro-gestures.

The technology comes out of Project Soli, a radar-based hand-tracking system developed by Google’s Advanced Technology and Projects group, which also developed the Project Jacquard connected clothing system found in Levi’s Commuter Trucker jacket. (In early 2019, the Federal Communications Commission granted Google a waiver to operate Soli’s radar sensors at higher power levels and to use them aboard aircraft.)
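
To make micro-gesture detection more concrete, below is a minimal sketch of how a short stream of radar-style readings might be turned into a gesture label. It assumes per-frame average Doppler velocities as input and uses a hypothetical classify_window() helper with an arbitrary threshold; it illustrates the general idea, not Google’s Motion Sense or Soli pipeline.

from statistics import mean

def classify_window(doppler_velocities, velocity_threshold=0.2):
    """Label a short window of per-frame average Doppler velocities (m/s).

    A sustained positive velocity reads as motion toward the sensor, a
    sustained negative velocity as motion away; anything weaker is idle.
    The threshold is illustrative, not tuned to any real sensor.
    """
    avg_velocity = mean(doppler_velocities)
    if avg_velocity > velocity_threshold:
        return "swipe_toward"
    if avg_velocity < -velocity_threshold:
        return "swipe_away"
    return "idle"

# Example: a brief burst of motion toward the sensor, then near-stillness.
print(classify_window([0.4, 0.5, 0.45, 0.3]))     # swipe_toward
print(classify_window([0.01, -0.02, 0.0, 0.03]))  # idle

Real systems replace the single threshold with a trained classifier over many radar features, but the flow is the same: sense motion, extract features from a short window, and map them to a gesture.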

NUIs will soon allow us to control many devices with our body movement alone. We’ll also start to see workplace applications that record our body movement to predict when we’ll be most productive. The same technology could also help security systems and teams learn when someone might pose a threat to others.