Google’s Advanced Technology and Projects division recently announced a new round of research that aims to refine the radar technology behind Soli, a sensor that responds to human behavior and movement.
Proxemics is the study of human use of space and how changes in population density can affect behavior, communication, and social interaction. More specifically, proxemics informs branches of design that deal with ergonomics and space mediation. On one hand, it guides the configuration of floor plans to harmonize instinctive human behavior with spatial experience; on the other, it helps technology respond to our behavior and needs in more human-like ways. Google’s Advanced Technology and Projects division (ATAP) recently turned to proxemics to refine Soli, a sensor with embedded radar technology that uses electromagnetic waves to pick up on even subtle human body language and movement.
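To make the spatial categories concrete, here is a minimal sketch of how a device might map a measured distance to Edward T. Hall’s classic proxemic zones. The Python function and its thresholds use the commonly cited textbook values; they are illustrative assumptions, not parameters drawn from Soli.

def proxemic_zone(distance_m: float) -> str:
    """Classify a person's distance from a device into a proxemic zone.

    Thresholds follow Edward T. Hall's widely cited values (in meters);
    they are illustrative, not Soli's actual tuning.
    """
    if distance_m < 0.45:
        return "intimate"  # within arm's reach; direct interaction likely
    if distance_m < 1.2:
        return "personal"  # conversational distance
    if distance_m < 3.6:
        return "social"    # shared-room distance
    return "public"        # passing by; interaction unlikely

print(proxemic_zone(0.8))  # -> personal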
Designer: Google’s Advanced Technology and Projects (ATAP)
Used in modern devices like the Nest Hub smart display and the Google Pixel 4, Soli’s radar has powered sleep tracking and contactless, remote-control features. In this new round of research, spearheaded by Google’s ATAP team, the sensor data Soli gathers is used to help computers recognize and respond to our everyday movements. Leonardo Giusti, head of design for ATAP, says, “We believe as technology becomes more present in our life, it’s fair to start asking technology itself to take a few more cues from us.”
In response, the team at Google hoped to develop Soli to capture the same energy as your mom turning off the television and covering you with a throw after you doze off on the couch. The integrated radar software is designed to detect a user’s proximity to computers and personal smart devices, turning the screen off as we walk away from it and back on once we’re in front of it again. In addition to proximity, the radar technology recognizes changes in body orientation, which signal to the device whether a user is about to interact with it.
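As a rough illustration of the behavior described above, the sketch below toggles a display based on proximity and body orientation. The RadarReading fields, class names, and distance thresholds are hypothetical stand-ins for whatever the radar pipeline actually reports, not Google’s implementation.

from dataclasses import dataclass

@dataclass
class RadarReading:
    distance_m: float    # estimated distance to the nearest person (hypothetical field)
    facing_screen: bool  # derived from a body-orientation estimate (hypothetical field)

class DisplayController:
    WAKE_DISTANCE_M = 1.5   # turn on when a facing user comes this close (illustrative)
    SLEEP_DISTANCE_M = 2.5  # turn off only once they move this far away (illustrative)

    def __init__(self) -> None:
        self.screen_on = False

    def update(self, reading: RadarReading) -> None:
        # Wake when someone is near and oriented toward the screen.
        if (not self.screen_on
                and reading.distance_m < self.WAKE_DISTANCE_M
                and reading.facing_screen):
            self.screen_on = True
        # Sleep once they have clearly walked away.
        elif self.screen_on and reading.distance_m > self.SLEEP_DISTANCE_M:
            self.screen_on = False

controller = DisplayController()
controller.update(RadarReading(distance_m=1.0, facing_screen=True))
print(controller.screen_on)  # True: user approached and faced the screen

The gap between the two thresholds acts as hysteresis, so the screen does not flicker on and off when someone lingers right at the boundary.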
While we may not wake up swaddled in a warm blanket, this new round of research finds computers and smart devices acknowledging and responding to when we are in front of the screen and when we walk away or doze off for a bit. Describing the process, Lauren Bedal, senior interaction designer at ATAP, explains, “We were able to move in different ways, we performed different variations of that movement, and then—given this was a real-time system that we were working with—we were able to improvise and kind of build off of our findings in real-time.”