What would happen if you took a Galaxy Nexus, projected its display on a large screen with an HDMI-out cable plugged into a projector, connected it to a PC running Simple-Kinect-Touch 2.0, and then hooked an Xbox Kinect camera into the whole thing? Why, a motion-enabled Android system, of course. All of this tech exists now. There is no reason we won't be standing in front of a computer screen in the near future, touching and waving our fingers and arms to control the movement of app icons and screens.
That's all true, but just because it's possible doesn't mean it's practical. It sounds like a lot of work. Mice and keypads are awfully efficient.
I'm waiting to be able to think to screen.
Posted by: attorneydavid | January 25, 2012 at 08:39 AM