Google Project Soli: User Guide

Written by Hassan Abbas

Google Project Soli: On Saturday, some interesting Google Pixel 4 images started circulating on Twitter, showing an oval-shaped cutout on the right of the detached bezels for the Pixel 4 and Pixel 4 XL.

Many theories have come to the fore, but a majority of them hint at Project Soli integration. So, without further ado, let's get to it.

What is Project Soli?

Google assigned some of its best engineers to create a piece of hardware that allows the human hand to act as a universal input device. In wearables and smartphones, we currently use the touch panel to interact with the system. Soli removes the middleman by letting you interact with your device using simple hand gestures, without making contact.

How does it work?

It emits electromagnetic waves in a broad beam. Objects in the beam's path scatter this energy, reflecting some of it back to the radar antenna. The radar processes several properties of the returned signal, such as energy, time delay, and frequency shift. These properties enable Soli to identify the object's size, shape, orientation, material, distance, and velocity.
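To make those properties concrete, here is a minimal sketch of how the three signal properties mentioned above could be derived from a reflected pulse. This is purely illustrative and is not Google's actual Soli pipeline; the function name and the assumption that the transmitted and received signals share the same length and sample rate are ours.

```python
import numpy as np

def extract_signal_properties(transmitted, received, sample_rate_hz):
    """Illustrative only: derive energy, time delay, and frequency shift
    from a reflected signal. Assumes both arrays have the same length."""
    # Energy of the echo: relates to the object's size and material.
    energy = float(np.sum(received ** 2))

    # Time delay: cross-correlate echo against the transmitted pulse;
    # the peak lag tells us the round-trip delay (hence distance).
    corr = np.correlate(received, transmitted, mode="full")
    lag = int(np.argmax(corr)) - (len(transmitted) - 1)
    time_delay_s = lag / sample_rate_hz

    # Frequency shift: compare the dominant frequency bins of the two
    # spectra (a Doppler shift relates to the object's velocity).
    freqs = np.fft.rfftfreq(len(transmitted), 1.0 / sample_rate_hz)
    shift_hz = (freqs[np.argmax(np.abs(np.fft.rfft(received)))]
                - freqs[np.argmax(np.abs(np.fft.rfft(transmitted)))])

    return {"energy": energy,
            "time_delay_s": time_delay_s,
            "freq_shift_hz": shift_hz}
```

A real radar front end works with modulated millimeter-wave chirps and far more sophisticated processing, but the idea is the same: each property of the echo encodes something physical about the object that reflected it.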

Soli has been fine-tuned to pick up even the finest finger gestures, and it does so without needing large bandwidth or high spatial resolution. Instead, the radar tracks fine variations in the received signal over time to decode finger movements and changing hand shapes.

How to use Soli?

Soli uses Virtual Tools to identify finger gestures and carry out the tasks associated with them. According to Google, Virtual Gestures are hand gestures that mimic the interactions we have with physical tools.

Imagine holding a key between your thumb and index finger. Now, just rotate the key as if you were opening a lock. That's it. Soli, in theory, recognizes the gesture and performs the task associated with it.

So far, Soli supports three primary Virtual Gestures.


Imagine an invisible button between your thumb and index finger. Press the button by tapping the fingers together. Its primary use is expected to be selecting an application or performing in-app actions.


Now imagine a dial that you turn up or down by rubbing your thumb against your index finger. Its primary use is expected to be volume control.


Finally, think of a Virtual Slider: just brush your thumb across your index finger to move it. Its primary use is controlling horizontal sliders, such as brightness.
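The three gestures above could feed into a simple dispatcher that maps each recognized gesture to a device action. This is a hypothetical sketch of our own: the gesture names (`virtual_button`, `virtual_dial`, `virtual_slider`) and the handlers are illustrative assumptions, not Soli's actual API.

```python
def make_gesture_dispatcher():
    """Hypothetical mapping from recognized Soli gestures to device actions."""
    state = {"selected": None, "volume": 50, "brightness": 50}

    def on_button_press(target):      # thumb taps index finger
        state["selected"] = target

    def on_dial_turn(steps):          # thumb rubs against index finger
        state["volume"] = max(0, min(100, state["volume"] + steps))

    def on_slider_move(delta):        # thumb brushes along index finger
        state["brightness"] = max(0, min(100, state["brightness"] + delta))

    handlers = {
        "virtual_button": on_button_press,
        "virtual_dial": on_dial_turn,
        "virtual_slider": on_slider_move,
    }

    def dispatch(gesture, value):
        handlers[gesture](value)
        return dict(state)   # return a snapshot of the device state

    return dispatch

# Example usage:
dispatch = make_gesture_dispatcher()
dispatch("virtual_button", "camera")   # select the camera app
dispatch("virtual_dial", 10)           # turn the volume up
dispatch("virtual_slider", -20)        # dim the brightness
```

The point of the sketch is only the mapping: each Virtual Gesture behaves like the physical control it imitates, so the software side reduces to routing a gesture event to the matching handler.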

Soli generates feedback through the natural haptic sensation of your fingers touching one another.

What are the applications?

As no mainstream device has implemented Soli so far, it's difficult to guess how it'd perform in the wild. But if everything goes according to plan, the radar could become fundamental to wearables, smartphones, IoT components, and even cars in the future.

The radar is super compact (8mm x 10mm), uses little energy, has no moving parts, and packs virtually endless potential. Taking all of this into consideration, a contact-less smartphone experience doesn't look that far-fetched.


That's all about Google Project Soli. For further queries and questions, let us know in the comment section below!

Also Read: New Google Chat feature in Google Photos

About the author

Hassan Abbas

Tech enthusiast with too many items on his wish-list and not nearly enough money! Specializing in all things tech, with a slight Apple bent, he has been writing for various blogs for the best part of (too many) years.
