Google Patent Suggests New Direction For Project Glass Augmented Reality Interface
Head-mounted wearable computers present a bit of an interface dilemma. Voice-based head-mounted systems give the impression that a person is muttering to him- or herself, and accelerometer-based systems that rely on head motion make users look like they have a nervous tic.
One remedy to the head-mounted-computer user interface conundrum involves hand gestures. Enter a new Google patent that appears to be the search giant's answer to controlling its Project Glass augmented reality system. Titled "Wearable Marker for Passive Interaction," the patented system, which just went public Tuesday, would use a reflective infrared identifier placed on a user's hand to track and interpret the user's gestures.
The IR identifier would be invisible to the human eye and could be placed on a ring or glove, or even affixed to a fingernail. (Whether the fingernail identifier would be bejeweled isn't specified in the patent's language.) An IR camera integrated into an HMD (head-mounted display) would be used to track the marker.
Using recognized gesture patterns, a user's hand movements would control the HMD. For example, a particular gesture could launch an application or open a document.
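The patent doesn't specify an implementation, but the basic idea of mapping recognized gesture patterns to HMD actions could be sketched roughly like this (all gesture names and actions here are invented for illustration):

```python
# Hypothetical sketch of gesture-to-action dispatch. The patent describes
# recognizing gesture patterns from a tracked IR marker; everything below
# (names, gestures, actions) is an assumption, not from the patent text.

GESTURE_ACTIONS = {
    "swipe_right": "launch_application",
    "circle": "open_document",
    "tap": "select_item",
}

def handle_gesture(pattern: str) -> str:
    """Map a recognized gesture pattern to an HMD action."""
    action = GESTURE_ACTIONS.get(pattern)
    if action is None:
        return "ignored"  # unrecognized gestures are simply dropped
    return action

print(handle_gesture("circle"))  # open_document
```

A real system would sit downstream of the IR camera's tracking pipeline, turning sequences of marker positions into these discrete gesture labels.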
In addition to interacting with a wearable system that looks suspiciously like Project Glass, the IR identifier could also be used to identify individual users. For example, the system could load predetermined, custom eyewear settings for each user: You put on your Google glasses, look at the IR identifier on your finger, and the system would activate your user presets.
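That identification step amounts to a lookup from a marker's identity to a stored settings profile. A minimal sketch, assuming hypothetical marker IDs and preset fields not described in the patent:

```python
# Hypothetical sketch of marker-based user identification. Marker IDs and
# preset fields are invented; the patent only describes the general idea
# of recognizing which user's identifier the camera sees.

USER_PRESETS = {
    "marker_a1": {"brightness": 0.8, "text_size": "large"},
    "marker_b2": {"brightness": 0.5, "text_size": "small"},
}

DEFAULT_PRESETS = {"brightness": 0.7, "text_size": "medium"}

def activate_presets(marker_id: str) -> dict:
    """Return the display presets for the IR marker the camera detected."""
    return USER_PRESETS.get(marker_id, DEFAULT_PRESETS)
```

So glancing at your ring would be enough to switch the eyewear into your own configuration.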
Of all the input systems that could be used to control Project Glass, hand gestures would seem to make the most sense. That is, if you're comfortable looking like you're conducting an orchestra while walking down the street.