Google AI glasses release date expectations, price estimates, and upgrades
The upcoming Google smart glasses will run Android XR. | Image by Google
Google AI glasses: what to expect
Rumored features:
- Google demoed its Android XR-powered smart glasses during Google I/O 2025
- Also, the smart glasses were teased during MWC 2026
- Codenamed 'Google Martha'
- The smart glasses have a companion app that handles notifications, settings access, and video recording
- Will work in tandem with your phone
- Android XR operating system with Gemini AI
- The device may use the Nano Banana model to edit an image on the go
- Screen may be only in right lens by design
- Google is yet to give us a specific release timeline for the device.
- Potentially, the smart glasses may come out in 2026.
Expected price:
- No leaks on pricing just yet.
Google AI glasses release date
Google has not yet given official details on when it plans to release the Google AI glasses. For now, we only know the launch may happen sometime in 2026, but when exactly remains a mystery.
That said, the glasses do exist, and Google has demoed them twice already: a prototype was briefly shown during Google I/O 2025, and the glasses were demoed again during MWC 2026.
Maybe we'll get to hear more about them during Google I/O 2026, which is scheduled for May 19-20.
Google AI glasses price
Leaks have not yet revealed the Google AI glasses' pricing. According to rumors, the price may put them in competition with Meta's smart glasses and, potentially, Apple's unannounced smart glasses.
| Smart glasses model | Starting price |
|---|---|
| Google AI glasses | Unknown |
| Ray-Ban Meta glasses Gen 2 | $459 |
| Apple smart glasses | $400-$800* |
*Anticipated prices
Interestingly, a report from The Elec expects 2026 to be the year of AI glasses. Sales of AI glasses are reportedly set to grow significantly this year, which could be related to Google releasing a model, and potentially Apple as well.
Google AI glasses camera
Leaks and rumors have yet to give us details on the cameras the Google AI glasses may sport. From the prototypes handed out for testing, we know the glasses will have a camera that records the outside world from the wearer's point of view, but we don't know that camera's exact specs.
Google AI glasses design
The prototypes that were demoed may not reflect the final product, so we can't base our expectations for the device's design on them. However, the glasses are unlikely to deviate much from what you picture when you think of smart glasses.
Think of Meta's Ray-Ban glasses: a classic, usually black frame that is somewhat thick compared to typical prescription glasses. Ray-Bans also come in a sunglasses-like variant, but we don't know whether Google will offer that option.
It's important to note that smart glasses don't look like an AR/VR headset. They should be comfortable to wear throughout the day and should look very much like normal glasses. Google's smart glasses are said to have a screen on only one of the lenses, instead of on both like competing devices.

Ray-Ban Meta AI smart glasses Gen 2 for illustrative purposes. | Image by Ray-Ban
Google is partnering with popular eyewear brands such as Gentle Monster and Warby Parker to design more stylish smart glasses. We'll see what Google and these companies come up with; personally, I'd like a few frame options to choose from, but we'll see.
Google AI glasses features and software
The prototype that Google showed had a companion app for using connected features with your phone. The app lets you view notifications, access settings, and start video recording.

Companion app for the Google AI prototype glasses. | Image by Sayed Ali Alkamel on X
The device will be powered by a separate operating system, Android XR, which Google has already announced officially. The software leans heavily on Google's Gemini AI model, which would let the smart glasses identify objects in the real world and search for them, translate live, and potentially pull off other tricks.
Navigation in the real world is also one of the features that may come to the glasses, as well as visualizing and responding to text messages, and capturing photos via voice commands. According to Google, the smart glasses will work in tandem with your phone.
During MWC 2026, Google hinted that the glasses may be able to use the generative AI model Nano Banana to edit a photo on the go. The MWC demo also hinted at features like live translation with automatic switching between languages. Gemini will also be able to identify an album cover and start playing music.
Moreover, you may be able to participate in Google Meet calls with the glasses.
But that's not all that was teased. The glasses may be able to recognize a poster for a location the user is looking at and fetch directions to it. Apparently, the map is shown when you look down, so you can confirm you're walking in the right direction.
Meanwhile, rumor has it that the device may rely heavily on cloud streaming. The Mountain View tech giant is reportedly looking to use cloud streaming to process AR content, which would let it sidestep hardware constraints such as limited on-device processing power.
Of course, this approach may introduce latency, undermining the point of AI glasses. The issue could be addressed with a strong chip that handles some tasks, including AI workloads, on-device.
Google AI glasses hardware and specs
Now, the question that's possibly the most important one when it comes to smart glasses: where would the power for all the AI tasks and integration with the real world come from?
Most reports indicate that Google's smart glasses may sport a proprietary Tensor chip. Google's Tensor processors have powered Pixel phones for years. The latest is the Tensor G5, designed by Google and manufactured by TSMC on a 3nm process, and it is generally optimized for on-device AI.
The next-generation Tensor chip is likely to be the G6, which we may see in the Pixel 11 phones later this year.
Rumors don't say whether Google will use the G5 or G6 for the glasses, design a dedicated chip for them, or ditch Tensor altogether. We will probably know more on that topic soon, so stay tuned.
Hopefully, whichever chip Google ends up picking for the device will be powerful enough to process some things on-device. We don't want latency on smart glasses, that's for sure.
Should I wait for the Google AI glasses?
- You should wait for Google's smart glasses if you want to use smart glasses but you don't like Meta's Ray-Bans. Google's pair may offer features that would be a nice addition to your setup, even though most of those features are still unknown.
- You should not wait for Google's smart glasses if you're not interested in smart glasses or wearable tech, or if you find that Meta's AI glasses are good enough for your needs. Also, Apple is reportedly working on smart glasses too, so if you're an Apple fan, you might want to wait for Cupertino's take.