Meta keeps pushing its smart glasses in a more practical direction

New software tweaks hint at where this category is heading.

Meta is pushing out a fresh update for its AI-powered smart glasses, and it adds a handful of small but meaningful upgrades that make the whole experience feel more polished.

A steady push toward more useful smart glasses

AR smart glasses have been slowly moving into the mainstream, and a big reason for that is Meta’s partnerships with Ray-Ban and Oakley. Things really picked up with the second-generation models that launched earlier this year, and now Meta is building on that momentum with a new software release.

The tech giant behind Facebook, Instagram, WhatsApp, and Threads has started rolling out the v21 software update. It is going out gradually, beginning with users enrolled in the Early Access Program in the US and Canada, followed by a wider rollout shortly after.

One of the standout additions is a new feature designed to help you hear people better in loud environments. The idea is pretty straightforward but useful: when you are talking to someone in a noisy setting, the glasses can amplify their voice through the open-ear speakers.

Whether you are in a crowded restaurant, at a party, riding public transit, or walking along a busy street, this “conversation focus” feature boosts the voice of the person you are speaking with so it stands out more clearly from the background noise. You can fine-tune how strong the amplification is by swiping along the right temple of the glasses or adjusting it in the settings.

I don’t expect this to replace proper noise-canceling earbuds, but it should make real-world conversations noticeably easier than they were before the update.

Once the software update reaches your glasses, you will be able to hear people talking more easily in a noisy environment. | Image credit – Meta

The update also introduces the first multimodal AI music experience for Ray-Ban Meta and Oakley Meta glasses, created in partnership with Spotify. This is where things get a bit more futuristic.

You can now ask Meta AI to play music that matches what you are looking at. Picture yourself hiking and taking in a mountain view – you can say something like, “Hey Meta, play a song that fits this view,” and the system combines visual recognition with Spotify’s personalization to build a playlist tailored to both the moment and your taste.


It is a clever blend of computer vision and music discovery, and while it is not essential, it definitely adds a fun layer to the whole experience.

Meta is still ahead of the pack

By rolling out these kinds of small but thoughtful updates, Meta is making its smart glasses more appealing to people curious about AR. The market itself is clearly heating up, with Google and Apple both expected to bring their own products into the mix down the line.

That said, Meta still has a solid head start. Plus, AR glasses have finally started getting the display upgrades they’ve needed for years, which should only make them more useful over time.

The post-smartphone bet continues

Meta made a bold decision to largely skip the smartphone race and focus on what comes after it, and this strategy could end up paying off. I’m still a bit skeptical that smart glasses will ever reach smartphone-level adoption, mainly because they don’t yet deliver the same kind of visual engagement we are used to.

Still, if this is the direction things are heading, it is encouraging to see the technology improving at this pace. If smart glasses are part of our everyday lives in the not-too-distant future, updates like this are exactly how we get there.