Facebook's blog post has us excited about smart glasses again
Back in August 2017, we told you that Facebook had filed a patent application (more precisely, its Oculus division filed the application) for technology that sends a wave of light through a lens, placing "computer generated elements" over a real-world background. If that sounds like an AR-powered pair of smart glasses, you'd be right. Recently, there has been talk about Facebook developing AR glasses for consumers.
Facebook published a blog post on Tuesday in which it said that for a decade it has been working on a "contextually-aware, AI-powered interface for augmented reality (AR) glasses that can use the information you choose to share, to infer what you want to do, when you want to do it." Part of the challenge with smart glasses is finding a way for the user to navigate the device. In the blog post, Facebook writes, "This AR interface will need to be proactive rather than reactive. It will be an interface that turns intention into action seamlessly, giving us more agency in our own lives and allowing us to stay present with those around us. Importantly, it will need to be socially acceptable in every respect — secure, private, unobtrusive, easy to learn, easy to use, comfortable/all-day wearable, effortless, and reliable." Facebook Reality Labs (FRL) Chief Scientist Michael Abrash adds that "In order for AR to become truly ubiquitous, you need low-friction, always-available technology that’s so intuitive to use that it becomes an extension of your body."
Facebook reveals some information about its smart glasses
What Facebook is working on here is no surprise since it is spelled out right in the blog post. "But all-day wearable AR glasses require a new paradigm because they will be able to function in every situation you encounter in the course of a day. They need to be able to do what you want them to do and tell you what you want to know when you want to know it, in much the same way that your own mind works — seamlessly sharing information and taking action when you want it, and not getting in your way otherwise."
Facebook gives a long example of how the glasses would work and mentions that a soft wristband will be worn by the user. A digital assistant will be available, and when you walk into an oft-visited eatery, it will ask whether you want to order your usual meal. Don a pair of soft, lightweight haptic gloves, and a virtual screen and QWERTY keyboard appear right in front of you. Using the virtual keyboard is as intuitive as using a real physical keyboard, Facebook says.
To help you focus on your work without loud background noise making things harder, the Assistant "uses special in-ear monitors (IEMs) and active noise cancellation to soften the background noise." However, if you're using these AR specs in a noisy restaurant, the glasses will know to automatically let the server's voice through when she asks if you want more coffee.
Some of the features that Facebook plans to use for its smart glasses, such as hand-tracking cameras, a microphone array, and eye-tracking technology, have already been mentioned for Apple Glass, which is now expected to arrive in 2025. Facebook is looking at the use of electromyography (EMG), which reads the electrical signals that travel from the spinal cord to the wrist, to detect the user's navigational intent. EMG can detect a finger motion of just a millimeter. Input could be as simple as pressing a virtual button. Facebook says, "AR glasses interaction will ultimately benefit from a novel integration of multiple new and/or improved technologies, including neural input, hand tracking and gesture recognition, voice recognition, computer vision, and several new input technologies like IMU finger-click and self-touch detection."
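To get a feel for what "detecting intent from a wrist signal" means in practice, here is a deliberately simplified sketch. It treats the EMG stream as a plain list of amplitude samples and flags a "finger click" whenever a smoothed envelope of the signal crosses a threshold. The function names, the sample data, and the threshold approach are all illustrative assumptions; Facebook's actual system relies on machine-learned decoding of neural signals, not a fixed cutoff like this.

```python
# Toy illustration of threshold-based "finger click" detection from a
# one-dimensional EMG amplitude stream. This is NOT how Facebook Reality
# Labs decodes wrist signals; it only shows the basic idea of turning a
# burst of muscle activity into a discrete input event.

def moving_average(signal, window=3):
    """Smooth the rectified signal with a simple moving average."""
    out = []
    for i in range(len(signal)):
        lo = max(0, i - window + 1)
        chunk = [abs(x) for x in signal[lo:i + 1]]
        out.append(sum(chunk) / len(chunk))
    return out

def detect_clicks(signal, threshold=0.5):
    """Return the sample indices where the smoothed envelope crosses
    the threshold from below -- one event per rising edge."""
    envelope = moving_average(signal)
    clicks = []
    above = False
    for i, value in enumerate(envelope):
        if value >= threshold and not above:
            clicks.append(i)   # rising edge: register one click
            above = True
        elif value < threshold:
            above = False      # re-arm once activity subsides
    return clicks

# A quiet baseline with two brief bursts of muscle activity:
emg = [0.02, 0.05, 0.03, 0.9, 1.1, 0.8, 0.04, 0.03, 1.0, 1.2, 0.05]
print(detect_clicks(emg))  # -> [4, 9]: two distinct click events
```

The rising-edge check is the important part: without it, every sample inside a burst would register as a separate click, rather than one event per gesture.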
It sounds like Facebook is working on something exciting. For the first time since Google introduced Project Glass with an exciting day-in-the-life video, it feels like smart glasses could be the next big thing.