{"id":10191,"date":"2015-06-01T17:22:32","date_gmt":"2015-06-01T12:22:32","guid":{"rendered":"https:\/\/www.ramble.sunmatrix.com\/?p=10191"},"modified":"2026-02-27T11:06:15","modified_gmt":"2026-02-27T10:06:15","slug":"project-soli-hands-become-the-interface","status":"publish","type":"post","link":"https:\/\/www.sunmatrix.com\/ramble\/project-soli-hands-become-the-interface\/","title":{"rendered":"Project Soli: Hands Become the Interface"},"content":{"rendered":"<h2>Google ATAP builds what people actually use<\/h2>\n<p>Google ATAP is tasked with creating cool new things that we\u2019ll all actually use. At the recently concluded Google I\/O event, they showcased <strong>Project Soli<\/strong>, a new kind of wearable technology that aims to make your hands and fingers the only user interface you\u2019ll ever need.<\/p>\n<p>This is not touchless interaction as a gimmick. It is a rethink of the interface itself. Your gestures become input. Your hands become the control surface.<\/p>\n<h2>The breakthrough is radar, not cameras<\/h2>\n<p>To make this possible, Project Soli uses a radar chip small enough to fit into a wearable like a smartwatch.<\/p>\n<p>The small radar picks up movements in real time and interprets how gestures alter its signal. This enables precise motion sensing without relying on cameras or fixed environmental conditions.<\/p>\n<p>In wearable computing and ambient interfaces, the real unlock is interaction that works in motion. The open question is whether wearables can move beyond miniaturized apps and shed the screen-first mindset.<\/p>\n<p>The implication is straightforward. Interaction moves from screens to motion. User interfaces become something you do, not something you tap.<\/p>\n<h2>Why this matters for wearable tech<\/h2>\n<p>Wearables struggle when they copy the smartphone model onto tiny screens. 
Wearable UX should treat the screen as optional, not primary.<\/p>\n<p><strong>The takeaway:<\/strong> When the screen becomes the bottleneck, shift the interface to sensing and interpretation, then keep the gesture vocabulary small enough to learn fast.<\/p>\n<p>Instead of shrinking interfaces, Soli removes them. The wearable becomes a sensor-driven layer that listens to intent through movement.<\/p>\n<p>If this approach scales, it changes what wearable interaction can be. Less screen dependency. More natural control. Faster micro-interactions.<\/p>\n<p><iframe width=\"560\" height=\"315\" loading=\"lazy\" src=\"https:\/\/www.youtube.com\/embed\/0QNiZfSsPc0?fs=1&#038;playsinline=1\" title=\"YouTube video player\" frameborder=\"0\" allow=\"fullscreen; encrypted-media; picture-in-picture; clipboard-write; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe><br \/>\n<\/p>\n<h2>What Soli teaches about hands-first UX<\/h2>\n<ul>\n<li><strong>Start with intent, not UI.<\/strong> Define the handful of moments where a gesture is faster than hunting for a screen.<\/li>\n<li><strong>Design for motion.<\/strong> Favor interactions that work while walking, commuting, or doing something else with your attention.<\/li>\n<li><strong>Keep the gesture set teachable.<\/strong> A small, consistent vocabulary beats a large library that nobody remembers.<\/li>\n<\/ul>\n<hr \/>\n<h2>A few fast answers before you act<\/h2>\n<h3>Is Project Soli just gesture control?<\/h3>\n<p>It is gesture control powered by a radar sensor small enough for wearables, designed to make hands and fingers the primary interface.<\/p>\n<h3>Why use radar instead of cameras?<\/h3>\n<p>Radar can sense fine motion without relying on lighting, framing, or line-of-sight in the same way camera-based systems do.<\/p>\n<h3>What is the real promise here?<\/h3>\n<p>Interfaces that disappear. 
Interaction becomes physical, immediate, and wearable-friendly.<\/p>\n<h3>What should a product team prototype first?<\/h3>\n<p>Pick one high-frequency moment where a quick gesture could replace a screen tap, and test whether the sensing feels reliable in motion.<\/p>\n<h3>What is the biggest adoption risk?<\/h3>\n<p>If gestures feel inconsistent or hard to learn, people will default back to the screen. The bar is effortless, not novel.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Google ATAP builds what people actually use Google ATAP is tasked with creating cool new things that we\u2019ll all actually use. At the recently concluded Google I\/O event, they showcased Project Soli, a new kind of wearable technology that aims to make your hands and fingers the only user interface you\u2019ll ever need. This is &hellip; <a href=\"https:\/\/www.sunmatrix.com\/ramble\/project-soli-hands-become-the-interface\/\" class=\"more-link\">Continue reading <span class=\"screen-reader-text\">Project Soli: Hands Become the Interface<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":12270,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_seopress_robots_primary_cat":"none","_seopress_titles_title":"","_seopress_titles_desc":"Google\u2019s Project Soli uses radar to turn hands and fingers into a natural interface, redefining how wearables and devices are 
controlled.","_seopress_robots_index":"","iawp_total_views":1,"footnotes":""},"categories":[32,90,13,111],"tags":[6811,6793,36,6116,6117,6118,6119,6812,6120,6121,6122,6694,6809,6810,6123,5037,5801,5597],"class_list":["post-10191","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-emerging-technology","category-emerging-trends","category-mobile","category-technology-news","tag-future-interfaces","tag-gesture-interfaces","tag-google","tag-google-atap","tag-google-atap-projects","tag-google-atap-technology","tag-google-future-of-ui","tag-google-i-o","tag-google-soli","tag-google-soli-chip","tag-google-soli-project","tag-human-computer-interaction","tag-project-soli","tag-radar-sensors","tag-soli-chip","tag-wearable-tech","tag-wearable-technology","tag-wearables"],"jetpack_featured_media_url":"https:\/\/www.sunmatrix.com\/ramble\/wp-content\/uploads\/google_atap.jpg","jetpack_shortlink":"https:\/\/wp.me\/pgYpE1-2En","_links":{"self":[{"href":"https:\/\/www.sunmatrix.com\/ramble\/wp-json\/wp\/v2\/posts\/10191","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.sunmatrix.com\/ramble\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.sunmatrix.com\/ramble\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.sunmatrix.com\/ramble\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.sunmatrix.com\/ramble\/wp-json\/wp\/v2\/comments?post=10191"}],"version-history":[{"count":10,"href":"https:\/\/www.sunmatrix.com\/ramble\/wp-json\/wp\/v2\/posts\/10191\/revisions"}],"predecessor-version":[{"id":16231,"href":"https:\/\/www.sunmatrix.com\/ramble\/wp-json\/wp\/v2\/posts\/10191\/revisions\/16231"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.sunmatrix.com\/ramble\/wp-json\/wp\/v2\/media\/12270"}],"wp:attachment":[{"href":"https:\/\/www.sunmatrix.com\/ramble\/wp-json\/wp\/v2\/media?parent=10191"}],"wp:term":[{"taxonomy":"category","embedd
able":true,"href":"https:\/\/www.sunmatrix.com\/ramble\/wp-json\/wp\/v2\/categories?post=10191"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.sunmatrix.com\/ramble\/wp-json\/wp\/v2\/tags?post=10191"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}