Sunday 14 June 2015

Project Soli

Project Soli is developing a new interaction sensor using radar technology. The sensor can track sub-millimeter motions at high speed and accuracy. It fits onto a chip, can be produced at scale and built into small devices and everyday objects.

Just because Google's keynote is over doesn't mean the Google I/O developer conference has stopped serving up new and interesting glimpses of what the Mountain View-based company envisions for the future.
Take Project Soli, for instance.
It debuted during one of the conference's sessions, and everyone is talking about it. Google ATAP - the skunkworks division at the company building things like 3D-sensing, spatially-aware tablets - led the session and introduced the project as a new technology that could change wearables forever.

What is Project Soli?

Project Soli is a sensor that can easily be used in even the smallest wearables. It is capable of accurately detecting your hand movements in real-time, meaning it's a lot like Leap Motion and other gesture-tracking controllers. But instead of using cameras, Project Soli uses radar technology that fits within a tiny chip.
Google ATAP has basically realised that our hands are the best way to interact with devices. We have such fine control with our fingers; just think about how fast and seamlessly yours can transition from, let's say, typing on a keyboard to untangling a bunch of wires. Project Soli wants to apply that capability to gesture control.
Project Soli's founder, Ivan Poupyrev, demoed the sensor on Friday. Because Project Soli can recognise fine gestures, rather than the broad movements most other motion-based controllers require, it could replace conventional touch interfaces and let you control wearables without ever touching a display.

How will Project Soli change the way we use wearables?

Well, Project Soli has the potential to change the way we use all devices - not just wearables. Wearables are probably the most obvious and natural place to apply the technology first, because those devices usually have such small displays (and there's an obvious need for richer, more functional input options on them).

The Apple Watch, for instance, has the physical Digital Crown that gives users an additional way to navigate Apple's Watch OS. But Project Soli makes that approach seem archaic: a smartwatch with Project Soli wouldn't need a Digital Crown, because you could just wave your fingers in the air to get things done.
You could mimic turning a volume dial to turn down the volume. You could mimic pressing a button to turn something on or off. You could mimic turning a page to flip through an eBook. The possibilities are endless.
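To make that idea a little more concrete, here is a minimal sketch of how recognised micro-gestures might be mapped to device actions. Everything in it - the GestureEvent structure, the gesture names and the SmartwatchDemo class - is a hypothetical illustration written in Python; Google has not published Soli's actual API, so treat this purely as a thought experiment, not as the real interface.

```python
# Hypothetical sketch: mapping Soli-style micro-gestures to device actions.
# Gesture names, event fields and device behaviour are illustrative assumptions,
# not Google's actual Project Soli API.

from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class GestureEvent:
    """A single recognised micro-gesture and its signed magnitude."""
    name: str     # e.g. "dial_turn", "button_press", "page_flip"
    value: float  # dial rotation amount, press strength, or flip direction


class SmartwatchDemo:
    """Toy smartwatch state driven entirely by gesture events."""

    def __init__(self) -> None:
        self.volume = 50   # percent
        self.powered = True
        self.page = 1

    def adjust_volume(self, event: GestureEvent) -> None:
        # Rubbing thumb against finger like a dial nudges the volume up or down.
        self.volume = max(0, min(100, self.volume + int(event.value)))
        print(f"volume -> {self.volume}%")

    def toggle_power(self, event: GestureEvent) -> None:
        # A virtual button press flips the on/off state.
        self.powered = not self.powered
        print(f"powered -> {self.powered}")

    def flip_page(self, event: GestureEvent) -> None:
        # A page-turn motion moves forward or back through an eBook.
        self.page = max(1, self.page + (1 if event.value > 0 else -1))
        print(f"page -> {self.page}")


def dispatch(device: SmartwatchDemo, event: GestureEvent) -> None:
    """Route each recognised gesture to the matching device action."""
    handlers: Dict[str, Callable[[GestureEvent], None]] = {
        "dial_turn": device.adjust_volume,
        "button_press": device.toggle_power,
        "page_flip": device.flip_page,
    }
    handler = handlers.get(event.name)
    if handler:
        handler(event)


if __name__ == "__main__":
    watch = SmartwatchDemo()
    # Simulated stream of gestures, standing in for the sensor's output.
    for ev in [GestureEvent("dial_turn", -10),
               GestureEvent("page_flip", +1),
               GestureEvent("button_press", 1.0)]:
        dispatch(watch, ev)
```

In a real system the hard part would of course be the recognition itself - turning raw radar reflections into those discrete gesture events - which is exactly what the Soli chip and its software are designed to do.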

Want to see Project Soli in action?

The video below not only shows how Project Soli works when applied to a variety of devices and different scenarios, but it also goes into greater detail about why Google first dreamed up the technology.