Google’s Project Soli gesture control is amazing

There’s something Google announced at this year’s Google I/O conference that is going to change how we interact with our devices, bridging the gap between digital and physical interaction even further: Project Soli, from Google ATAP (Advanced Technology and Projects), Google’s division for building the cool new things we may all be using in the future.

Project Soli uses a miniature radar, YES, radar, to sense motion with sub-millimeter accuracy at high speed, enabling gestures that could not be tracked before. Touch screens have already removed the need for most physical buttons; Project Soli has the potential to make them extinct! Our smartphones and wearables still carry one to five physical buttons, and with Project Soli integrated, that number can drop to none. Imagine opening the camera app by tapping the tip of your index finger on your thumb, raising or lowering the volume by rubbing your index finger against your thumb, crossing your fingers to lock the screen, and many more such gestures that you can customize and set up according to your own needs!
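To picture how such a gesture-to-action mapping could look in software, here is a minimal Python sketch. Google has not published a public Soli API, so the gesture names, the Device stub, and the dispatch table below are purely hypothetical illustrations of the idea.

```python
# Hypothetical sketch only: Google has not released a public Soli API,
# so the gesture names, Device stub, and dispatch table are invented.

class Device:
    """Stand-in for a Soli-equipped phone or watch."""
    def open_app(self, name):      print(f"Opening {name} app")
    def adjust_volume(self, step): print(f"Volume {step:+d}")
    def lock_screen(self):         print("Screen locked")

# User-configurable mapping from recognized gestures to actions.
GESTURE_ACTIONS = {
    "thumb_tap":    lambda d: d.open_app("camera"),  # tap index tip on thumb
    "finger_rub":   lambda d: d.adjust_volume(+1),   # rub index along thumb
    "finger_cross": lambda d: d.lock_screen(),       # cross two fingers
}

def on_gesture(device, gesture):
    """Dispatch a gesture reported by the radar pipeline to its action."""
    action = GESTURE_ACTIONS.get(gesture)
    if action:
        action(device)

on_gesture(Device(), "finger_rub")   # -> Volume +1
```

The point of the table is exactly the customization the article describes: remapping a gesture to a different action is a one-line change, no new hardware required.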

[Image: Controlling the volume using finger gestures]


Why is Project Soli a game-changer?

[Image: The Soli chip recognizing finger movements]

Project Soli uses a radar-emitting microchip to recognize finger movements, so it can be embedded into almost anything: smartwatches, phones, cars, computers, toys, and so on, whereas touchscreens and camera-based motion-sensing devices cannot be integrated everywhere. Our wearable devices are becoming slimmer and more compact, so providing an interface for interacting with them easily is becoming a challenge, one that Google hopes to solve with this project.

Camera-based gesture-control systems require extra hardware and can recognize only a limited set of relatively slow gestures, whereas Project Soli operates in the 60 GHz radar spectrum at up to 10,000 frames per second, enabling it to capture even the slightest twitch of our fingers from a meter away!
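For intuition on why those numbers matter, here is an illustrative Python/NumPy sketch of a generic Doppler-velocity estimate at 60 GHz. This is not Soli's actual (unpublished) signal processing; it only shows how frame-to-frame phase changes in the radar return reveal very slow, small finger motions at that frame rate.

```python
# Illustrative only: a generic Doppler-velocity estimate at 60 GHz,
# not Soli's actual (unpublished) signal-processing pipeline.
import numpy as np

C = 3e8                    # speed of light, m/s
WAVELENGTH = C / 60e9      # 60 GHz carrier -> 5 mm wavelength
FRAME_RATE = 10_000        # radar frames per second, per the article

def doppler_velocity(iq_frames):
    """Estimate a target's radial velocity (m/s) from complex I/Q
    samples taken at one range bin across successive frames."""
    spectrum = np.fft.fftshift(np.fft.fft(iq_frames))
    freqs = np.fft.fftshift(np.fft.fftfreq(len(iq_frames), d=1 / FRAME_RATE))
    f_doppler = freqs[np.argmax(np.abs(spectrum))]  # strongest Doppler line
    return f_doppler * WAVELENGTH / 2               # v = f_d * lambda / 2

# Simulate a finger drifting toward the sensor at 5 cm/s: the echo's
# phase rotates at f_d = 2v / lambda = 20 Hz, trivially resolved when
# sampling 10,000 frames per second.
t = np.arange(4096) / FRAME_RATE
echo = np.exp(2j * np.pi * (2 * 0.05 / WAVELENGTH) * t)
print(doppler_velocity(echo))   # ~0.049, i.e. about 5 cm/s
```

Because the wavelength is only about 5 mm, even sub-millimeter displacements produce large, measurable phase shifts, which is what makes the fine finger-twitch tracking possible.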

Have a look at the video below to get a feel for what Project Soli is.

Dr. Ivan Poupyrev, the lead researcher who heads a team of designers and developers at the Google ATAP labs in San Francisco, says:

Until now we lacked the fidelity to capture hand movements in sufficient detail. But now using radar, for the first time in history you can build Minority Report type interfaces.

[Image: Adjusting a watch using gestures]

Project Soli will be of immense help to visually impaired people, enabling them to interact with the digital world without needing to find the right button or touch a screen. It could also make a very intuitive game controller, especially in virtual-reality systems, where the user no longer has to perform in-game tasks with a gamepad, making the experience even more immersive and natural. While demonstrating the project in San Francisco, Dr. Ivan Poupyrev fine-tuned a radio using mid-air gestures, without touching or pressing a single button.
