There’s a new project from the Google stable called Soli that has the potential to bring gesture controls to all sorts of hardware, including wearables. The example GIFs below make it look like some kind of telekinetic voodoo, but in reality the gestures are made possible by a new “interaction sensor” that uses radar technology to detect and recognise movements made near it.

Google quietly unveiled Soli at its recent I/O conference and, while the initiative is still in its infancy, expect to see an API for developers in the coming months.

Here’s a video showing just some of the potential use cases for Soli; we can imagine a whole lot more.

https://www.youtube.com/watch?v=0QNiZfSsPc0