Google’s ATAP (Advanced Technology and Projects) division is known for the futuristic creations that shape how Google foresees the future, and if you haven’t realized it by now, what Google thinks of the next technology wave matters a lot. The division has brought us modular phones under Project Ara, 3D mapping on handhelds under “Tango”, touch-sensitive fabrics and a whole lot more. At this year’s Google I/O, however, it unveiled something that could revolutionize how we interact with our devices every day: Project Soli.


Google has managed to design astonishingly tiny radar chips that can be embedded directly into devices like a smartwatch or a Bluetooth speaker. This means users will be able to control their gadgets with precise hand gestures, without touching the display or pressing any button. The smartwatch demo they showcased involved a couple of hand flicks that moved the digital watch face, and pinching gestures performed in the air manipulated a Bluetooth speaker’s controls. Which raises the obvious question: why do you need a radar inside a wearable?
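To make the interaction model concrete, here is a minimal sketch of how recognized gestures might be dispatched to device actions. Everything here is hypothetical: the gesture names and the `SmartwatchController` class are invented for illustration and are not part of any real Soli SDK.

```python
# Hypothetical sketch: routing recognized Soli-style gestures to device
# actions. Gesture names and the controller interface are illustrative.

from typing import Callable, Dict

class SmartwatchController:
    """Toy stand-in for a watch that reacts to gesture events."""
    def __init__(self) -> None:
        self.face_index = 0

    def next_face(self) -> None:
        self.face_index += 1

    def previous_face(self) -> None:
        self.face_index = max(0, self.face_index - 1)

def build_dispatcher(watch: SmartwatchController) -> Dict[str, Callable[[], None]]:
    # A flick in either direction moves the watch face, as in the demo.
    return {
        "flick_left": watch.next_face,
        "flick_right": watch.previous_face,
    }

watch = SmartwatchController()
dispatch = build_dispatcher(watch)
dispatch["flick_left"]()   # one flick advances the face
dispatch["flick_left"]()
dispatch["flick_right"]()
print(watch.face_index)    # 1
```

The key design point is that the radar pipeline only has to emit abstract gesture events; each device decides for itself what those events mean.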


To explain the motive behind the prototype, Ivan Poupyrev, Technical Project Lead at Project Soli, said that “if you can put something in a smartwatch, you can put it anywhere”. To that end, ATAP reworked the chip to make it smaller, equally robust and less power hungry. The chip itself is a minute silver chip with four antennas that provide full-duplex communication for sending and receiving radar signals. The first iteration needed 1.2 W of power, while the final product requires merely 0.054 W, roughly 22x less!


Analyzing hand signals runs into a ton of issues since there is no physical interface involved; every object would end up with its own coordinates. Poupyrev states, however, that one of Soli’s major aims is a common interface for all. Nick Gillian, lead machine learning engineer for Soli, demoed the two zones in which these radars work: near and far. As the person moves in closer, the radar is able to identify various fine movements that can be programmed to execute unique actions.
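The near/far idea above can be sketched as a simple range gate: the hand’s estimated distance selects which gesture set is active. The threshold value and gesture names below are assumptions made up for illustration, not figures from the Soli demo.

```python
# Illustrative sketch of the near/far zone idea: the estimated hand
# distance (from the radar's range measurement) selects the active
# gesture vocabulary. The threshold and gesture names are invented.

NEAR_THRESHOLD_M = 0.10  # within ~10 cm: assume fine finger gestures resolve

def active_gestures(distance_m: float) -> list:
    if distance_m <= NEAR_THRESHOLD_M:
        # Close range: the radar can resolve fine movements like pinches.
        return ["pinch", "rub", "micro_slide"]
    # Far range: only coarse gestures such as whole-hand flicks.
    return ["flick", "wave"]

print(active_gestures(0.05))  # ['pinch', 'rub', 'micro_slide']
print(active_gestures(0.30))  # ['flick', 'wave']
```

Gating the gesture set by distance keeps the recognizer from confusing a coarse far-field wave with a fine near-field pinch.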

ATAP has already partnered with a handful of manufacturers, including LG, Qualcomm and JBL, to release a consumer-ready product as soon as it is ready. An exact timeline, however, has not been finalized. Project Soli definitely gives us a glimpse of how our future interactions might look; maybe Google is trying to usher in an era without touchscreens and buttons. For now, we can only wait.
