As devices like Google Glass edge toward the mainstream, we’re going to see a lot more wearable computers around. What’s less clear is how we’ll control them. One idea is gesture control, which would let users communicate with a wearable computer without reaching for a separate smartphone or other device to do so.
Y Combinator-backed startup Thalmic Labs believes it has a better way of determining user intent in gesture control, and to do so it’s developed a new kind of input device.