Apple working on a motion- and gesture-controlled 3D GUI for iOS
Apple has submitted a solution to this problem: let the user treat the device as an imaginary viewfinder. The device would use orientation data from its onboard sensors to determine its position and let users look around their digital world. In function, at least, the idea seems reminiscent of the Monocle feature in Yelp's mobile apps.
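The viewfinder idea boils down to mapping the device's orientation onto a virtual camera. A minimal sketch of that mapping, assuming yaw and pitch angles are already available from the motion sensors (the function name and the spherical-coordinate convention here are illustrative, not from the patent):

```python
import math

def look_direction(yaw, pitch):
    """Convert device yaw/pitch (radians) into a unit look-at vector,
    so physically rotating the device pans the virtual camera with it."""
    x = math.cos(pitch) * math.sin(yaw)
    y = math.sin(pitch)
    z = math.cos(pitch) * math.cos(yaw)
    return (x, y, z)

# Holding the device level and facing "forward" looks straight down +z.
forward = look_direction(0.0, 0.0)  # → (0.0, 0.0, 1.0)
```

On iOS the yaw and pitch inputs would come from something like Core Motion's device-attitude readings; the rendering side would feed this vector to the 3D scene's camera each frame.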
The Cupertino giant also stated, “In some implementations, the display environment could be changed based on gestures made a distance above a touch sensitive display that incorporates proximity sensor arrays.” This suggests the possibility of a portable Kinect-meets-Minority Report, which, admittedly, would be pretty cool.
Finally, Apple introduced the idea of a “Snap To” feature that lets the 3D environment lock into place when an action or gesture occurs, so the user can interact with the items within it.
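A "Snap To" behavior like this is typically implemented by rounding a free rotation to the nearest fixed increment once the triggering gesture ends. A minimal sketch, assuming a 90-degree snap increment (an illustrative choice; the patent does not specify one):

```python
import math

def snap_angle(angle, increment=math.pi / 2):
    """Snap a free rotation angle (radians) to the nearest increment,
    e.g. locking the 3D environment to a face-on view after a gesture."""
    return round(angle / increment) * increment

snapped = snap_angle(1.4)  # locks to pi/2 ≈ 1.5708
```

The same rounding trick works for snapping camera position or zoom level, not just rotation.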
Source: Patently Apple via AppleInsider