Project Soli: Hands Become the Interface

Google ATAP builds what people actually use

Google ATAP is tasked with creating cool new things that we’ll all actually use. At the recently concluded Google I/O event, the team showcased Project Soli, a new kind of wearable technology that aims to make your hands and fingers the only user interface you’ll ever need.

This is not touchless interaction as a gimmick. It is a rethink of the interface itself. Your gestures become input. Your hands become the control surface.

The breakthrough is radar, not cameras

To make this possible, Project Soli uses a radar that is small enough to fit into a wearable like a smartwatch.

The small radar picks up movements in real time and interprets how gestures alter its signal. This enables precise motion sensing without relying on cameras or fixed environmental conditions.
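
The article doesn’t go into how Soli actually processes the radar signal, so purely as an illustration of the idea, here is a minimal Python sketch that guesses a coarse gesture from a short window of radar readings based on their Doppler shift. The frame format, function names, and thresholds are all invented for this example; this is not Soli’s real pipeline.

```python
# Illustrative sketch only, not Soli's actual algorithm.
# We assume hypothetical radar frames of (doppler_shift_hz, amplitude) pairs.

from collections import deque
from typing import Iterator, Tuple


def classify_gesture(frames: Iterator[Tuple[float, float]], window: int = 8) -> str:
    """Label a short burst of radar frames with a coarse gesture guess."""
    recent = deque(maxlen=window)
    for doppler_hz, amplitude in frames:
        recent.append((doppler_hz, amplitude))

    if not recent:
        return "none"

    avg_doppler = sum(d for d, _ in recent) / len(recent)
    avg_amplitude = sum(a for _, a in recent) / len(recent)

    # Weak returns: nothing in front of the sensor.
    if avg_amplitude < 0.1:
        return "none"
    # Small net Doppler shift: a hand is present but roughly still.
    if abs(avg_doppler) < 5.0:
        return "hold"
    # Positive shift means motion toward the sensor, negative means away.
    return "swipe_toward" if avg_doppler > 0 else "swipe_away"


# Example usage with synthetic frames standing in for real sensor output.
synthetic_frames = [(12.0, 0.8), (15.0, 0.7), (11.0, 0.9), (14.0, 0.8)]
print(classify_gesture(iter(synthetic_frames)))  # prints "swipe_toward"
```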

The implication is straightforward. Interaction moves from screens to motion. User interfaces become something you do, not something you tap.

In wearable computing and ambient interfaces, the real unlock is interaction that works in motion, without relying on tiny screens.

Why this matters for wearable tech

Wearables struggle when they copy the smartphone model onto tiny screens. Project Soli pushes in the opposite direction.

Instead of shrinking interfaces, it removes them. The wearable becomes a sensor-driven layer that listens to intent through movement.

If this approach scales, it changes what wearable interaction can be. Less screen dependency. More natural control. Faster micro-interactions.



A few fast answers before you act

Is Project Soli just gesture control?

It is gesture control powered by a radar sensor small enough for wearables, designed to make hands and fingers the primary interface.

Why use radar instead of cameras?

Radar can sense fine motion without relying on lighting, framing, or line-of-sight in the same way camera-based systems do.

What is the real promise here?

Interfaces that disappear. Interaction becomes physical, immediate, and wearable-friendly.

Moda, the world’s first digital makeup artist

Never got the hang of applying makeup with your own hands? Not to worry. Moda from Foreo is set to become the world’s first digital makeup artist. Using facial scanning technology and a 3D printer, it can adapt and apply the latest makeup trends directly to the user’s face in about 30 seconds.

To begin, users download the accompanying smartphone app and select the style they want to emulate. This could be a look from Moda’s image library, a photo of a celebrity from the internet, or a picture of a fashionable friend. Once the selection has been made, the app customizes the colors and shapes to suit the wearer’s skin tone and face shape. Then, when the face is placed into the device, Moda paints it on using FDA-approved makeup ink.
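
The article doesn’t describe how the color customization works under the hood, so purely as an illustration of that step, here is a small Python sketch that blends a reference makeup color toward a measured skin tone. The helper name, the blending weight, and the sample RGB values are all invented for this example.

```python
# Purely illustrative: this is not Foreo's actual color-adaptation method.

def blend_toward_skin_tone(reference_rgb, skin_rgb, weight=0.25):
    """Pull a reference makeup color part of the way toward the wearer's skin tone."""
    return tuple(
        round((1 - weight) * ref + weight * skin)
        for ref, skin in zip(reference_rgb, skin_rgb)
    )


# Example: a bold red lip color adapted for a warmer skin tone.
celebrity_lip = (200, 30, 60)
measured_skin = (225, 180, 150)
print(blend_toward_skin_tone(celebrity_lip, measured_skin))  # prints (206, 68, 82)
```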

View video on YouTube: https://youtu.be/tR9mbXs3wA8

With the vast number of online videos showing users how to copy celebrity makeup styles, there certainly seems to be a potential audience for Moda. For more info on its availability, visit www.foreo.com/moda.

Mother and the Motion Cookies

Sensors are becoming more and more prevalent in our daily lives. Jawbone Up, Fitbit and many other wearable devices already collect all sorts of data for us to evaluate. Now a US-based startup called Sense has created Mother and the Motion Cookies, a family of smart sensors that track whatever you choose to monitor, while letting you change what they track as often as you need.

All you have to do is select what you want to monitor, place a Motion Cookie on the appropriate object, and get alerts when something important happens. Watch the demo video for more…
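
The article doesn’t describe how these alerts are configured under the hood, so purely as an illustration of the “tag an object, get an alert when it matters” idea, here is a small Python sketch. The MotionCookie class, the alert rule, and the pill-box example are invented for this example and are not Sen.se’s actual API.

```python
# Illustrative only: a toy stand-in for a motion-tagged object and one alert rule.

import datetime


class MotionCookie:
    """A pretend motion sensor attached to an everyday object."""

    def __init__(self, attached_to):
        self.attached_to = attached_to
        self.last_moved = None

    def record_movement(self, when):
        self.last_moved = when


def check_alert(cookie, deadline):
    """Alert if the tagged object has not moved by the chosen deadline."""
    if cookie.last_moved is None or cookie.last_moved > deadline:
        return f"Alert: no movement of the {cookie.attached_to} by {deadline:%H:%M}"
    return f"All good: {cookie.attached_to} moved at {cookie.last_moved:%H:%M}"


# Example: watch a pill box and expect it to move before 9 a.m.
pillbox = MotionCookie("pill box")
pillbox.record_movement(datetime.datetime(2015, 6, 1, 8, 15))
print(check_alert(pillbox, datetime.datetime(2015, 6, 1, 9, 0)))
```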

More info at: www.sen.se/store/mother.