Google Project Tango and Project Soli

There are a few projects going on over at Google Labs that I would love to have a reason to do development with.

Project Tango

Project Tango is working on device environment awareness, using the camera and high-accuracy positional tracking to let a device understand the 3D space around it. I’ve long been interested in 3D environment detection and the possibilities of AR within the detected space. One example I’ve always been interested in is enemies popping up from behind walls and desks based on how frequently the device enters those spaces. For example, if you haven’t been to the kitchen in 3 hours, it turns into an enemy spawn point that you need to “clean”.

A game like this would create an impetus to visit places more often and combat couch-potato syndrome.
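To make the idea concrete, here is a minimal sketch of that spawn rule, assuming the game already receives room-level “entered” events from the device’s area tracking. Everything here (RoomTracker, the three-hour threshold wiring, the room names) is hypothetical and not part of any Tango SDK.

```kotlin
import java.time.Duration
import java.time.Instant

// Hypothetical tracker: records when the device last entered each room
// and decides which rooms have gone "stale" enough to spawn enemies.
class RoomTracker(private val spawnThreshold: Duration = Duration.ofHours(3)) {
    private val lastVisit = mutableMapOf<String, Instant>()

    // Call this whenever area tracking reports the device entered a room.
    fun onRoomEntered(room: String, now: Instant = Instant.now()) {
        lastVisit[room] = now
    }

    // Rooms not visited within the threshold become enemy spawn points
    // that the player needs to go "clean".
    fun spawnPoints(now: Instant = Instant.now()): List<String> =
        lastVisit.filter { (_, visited) ->
            Duration.between(visited, now) >= spawnThreshold
        }.keys.toList()
}

fun main() {
    val tracker = RoomTracker()
    val start = Instant.parse("2015-06-01T09:00:00Z")
    tracker.onRoomEntered("kitchen", start)
    tracker.onRoomEntered("office", start.plus(Duration.ofHours(2)))

    // Four hours after the kitchen visit: the kitchen is stale, the office is not.
    val now = start.plus(Duration.ofHours(4))
    println(tracker.spawnPoints(now)) // [kitchen]
}
```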

Project Soli

Project Soli, on the other hand, is working on using radar to do high-precision detection of finger movements in a small space near the sensor. This has the potential to be used in phones or wearables to detect intent without requiring a touch screen. The biggest problem with devices like Android Wear or the Apple Watch is the requirement of a touch screen: when you touch the screen, you occlude a significant portion of it. A non-touch interaction method lets a person keep reading while they scroll or change settings.
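As a sketch of why that matters, here is a minimal example of how a stream of radar gesture events could drive scrolling while the finger never covers the display. Everything in it (GestureEvent, ScrollView, the pixels-per-gesture constants) is hypothetical and not based on the actual Soli API.

```kotlin
// Hypothetical radar gesture events; not the actual Soli API.
sealed class GestureEvent {
    data class Swipe(val deltaY: Float) : GestureEvent()  // finger flick near the sensor
    data class DialTurn(val clicks: Int) : GestureEvent() // thumb-rubbing "virtual dial"
}

// Stand-in for whatever view is showing the content being read.
class ScrollView {
    var scrollY = 0f
        private set

    fun scrollBy(deltaY: Float) {
        scrollY = (scrollY + deltaY).coerceAtLeast(0f)
        println("scrolled to $scrollY px")
    }
}

// Map off-screen gestures onto scrolling, so the content stays visible
// and the user's finger never touches (or occludes) the screen.
fun handleGesture(view: ScrollView, event: GestureEvent) {
    when (event) {
        is GestureEvent.Swipe -> view.scrollBy(event.deltaY * 40f)    // 40 px per unit of swipe
        is GestureEvent.DialTurn -> view.scrollBy(event.clicks * 16f) // 16 px per dial click
    }
}

fun main() {
    val view = ScrollView()
    handleGesture(view, GestureEvent.Swipe(deltaY = 2.5f))
    handleGesture(view, GestureEvent.DialTurn(clicks = 3))
}
```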