
Control Your Next Smartwatch With A Wave Of Your Hand


We don't want big screens on our wrist, but small screens are tough to interact with unless we use a precision tool like a stylus. If smartwatches are going to be a thing, they'll need to offer users something other than a touchscreen.

At Google's Advanced Technology and Projects group, or ATAP, Project Soli is the first of two projects led by legendary interaction designer Ivan Poupyrev that are trying to decouple the way we interact with our devices from touchscreens. It's a tiny radar, small enough to fit into a 1.5-inch smartwatch, that can understand the gestures your fingers make even though they never touch a screen.

Google isn't the first company to attempt to tackle this problem. Apple released its watch with the "digital crown," a dial that's comparable to the iPod wheel. Spin it to zoom in and out of content or to work your way through various buttons on menus. But even Apple's digital crown only understands one of the many complicated finger movements that we make every single day. We don't just use our fingers to turn a dial. We rub them together, we snap them, we point, and so on.

That's what makes the radar more fluid, functional, and hands-off than Apple's digital crown could ever hope to be.

"Our hand is an amazing instruments," Poupyrev says. "They are very fast and precise, especially when using tools, but we're still not able to capture that in our user interfaces." ATAP's goal was to create a way to capture the "broad vocabulary of gestures" that we all make with our hands every day, and transmit them to a smartwatch or wearables—no touch required.

When we think of radar, we usually think of echolocation: getting a picture of an object by bouncing a wave off of it, waiting for it to come back, then extrapolating that into a 3-D model. But when you're tracking something that moves as fast as a human hand, with so many possible points of articulation, that approach can be expensive, Google found; it soaks up processing power, and therefore battery life. So to make Project Soli more efficient, Google doesn't even try to track your fingers as a 3-D model. Instead, it compares the combined radar waveform of your whole hand against a library of possible radar-read hand motions. Google watches the ripples and extrapolates the gestures that way.
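Google hasn't published how Soli's recognizer works beyond that description, but the core idea, matching a summarized waveform against a library of labeled signatures instead of reconstructing the hand, can be sketched in a few lines. Everything below is illustrative: the gesture names, the feature scheme, and the nearest-neighbor matching are assumptions standing in for Soli's actual pipeline.

```python
import numpy as np

# Hypothetical library of labeled radar "signatures": each entry maps a
# gesture name to an averaged feature vector built from training examples.
# These names and values are made up for illustration.
GESTURE_LIBRARY = {
    "finger_rub": np.array([0.9, 0.1, 0.4, 0.2]),
    "dial_turn":  np.array([0.2, 0.8, 0.6, 0.1]),
    "swipe":      np.array([0.1, 0.3, 0.9, 0.7]),
}

def features(waveform: np.ndarray, n_bins: int = 4) -> np.ndarray:
    """Collapse a raw waveform into a coarse energy profile.

    Rather than fitting a 3-D hand model, summarize the whole reflected
    signal: split it into time bins, measure the energy in each, and
    normalize so only the shape of the profile matters, not its amplitude.
    """
    bins = np.array_split(waveform, n_bins)
    energy = np.array([np.mean(b ** 2) for b in bins])
    norm = np.linalg.norm(energy)
    return energy / norm if norm > 0 else energy

def classify(waveform: np.ndarray) -> str:
    """Return the library gesture whose signature is nearest to the input."""
    f = features(waveform)
    return min(GESTURE_LIBRARY,
               key=lambda name: np.linalg.norm(f - GESTURE_LIBRARY[name]))

# Usage: a synthetic array stands in for one frame of radar returns.
rng = np.random.default_rng(0)
print(classify(rng.normal(size=400)))  # prints whichever signature is closest
```

Comparing one compact feature vector per frame is far cheaper than tracking articulated fingers in 3-D, which is presumably where the processing and battery savings come from.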

It's surprisingly precise. On stage at Google I/O 2015, Poupyrev held his hand a foot away from an Android Wear smartwatch and showed how he could quickly scroll through lists by rubbing his fingers together, change the time by turning an imaginary dial in mid-air, impatiently swipe away a Google Now card with a hand wave, and even play a game of Pong on his watch face by flicking his fingers. "We don't need to be able to detect everything you can do with your hands," Poupyrev tells me. "It's enough to detect just three or four really accurately, at least at the start."

Project Soli has been in development for only 11 months. Even so, Poupyrev says Google will try to get dev kits out later this year.

And don't think Soli is only for smartwatches: Poupyrev says the technology can be applied to pretty much anything. Put a Soli sensor under a table, and the table's surface can tell what you're doing above it. Put it on a light switch, and you can adjust the lights just by waving at it. Soli points to a day when controlling your devices by touch will look practically quaint.

