Smart glasses will help you hear, not see. But will you trust your AI eyewear with your ears?
“I don’t know what this thing in front of me is, but I’ll research!”
Next time you’re in front of the Walt Disney Concert Hall in Los Angeles, you may say what most (if not all) people say upon laying eyes on that building:
"That’s really weird."
Or, you could say something else. Something like:
“Designed by Frank Gehry, this building epitomizes the Deconstructivist movement in architecture, but its style goes beyond simple classification due to its fluid forms, innovative use of materials, and the way it challenges traditional architectural norms. It incorporates elements of sculpture and it’s highly expressive, making it a stand-out example of contemporary architecture that defies easy categorization into a single architectural style. Gehry’s approach emphasizes the emotional and experiential aspects of buildings, blending form and function in unexpected ways.”
Nice one, huh? And you don’t have to take courses in architecture.
You just need your (smart) glasses on.
By the way, here’s the aforementioned Walt Disney Concert Hall:
[Image: the Walt Disney Concert Hall, Los Angeles]
Why glasses?
I have no idea how closely you follow the tech world, but you may have missed the recent mutation of “IoT” into “AIoT”.
The Internet of Things (IoT) refers to the network of physical objects – “things” – embedded with sensors, software, and other technologies that let them connect and exchange data with other devices and systems over the Internet. These devices range from ordinary household items to sophisticated industrial tools. Good examples of IoT are smart thermostats, security cameras, and lighting systems that can be automated or controlled remotely.
AIoT is what you get when AI is thrown into the mix, potentially elevating IoT to previously unthinkable levels of productivity and diversity of application.
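To make the distinction concrete, here’s a tiny, purely illustrative sketch in Python. Everything in it is hypothetical – a plain IoT device just reports a reading, while an AIoT device runs some inference on-device before acting:

```python
import random

def read_sensor() -> float:
    """Hypothetical sensor read - stands in for real device I/O."""
    return random.uniform(15.0, 30.0)

def upload(reading: float) -> None:
    """Plain IoT: ship the raw reading to the cloud and stop there."""
    print(f"uploaded reading: {reading:.1f} C")

def classify(reading: float) -> str:
    """AIoT: a stand-in for on-device model inference.

    A real device would run a trained model here; a simple
    threshold keeps the sketch self-contained and runnable.
    """
    return "too_hot" if reading > 26.0 else "comfortable"

def aiot_step() -> None:
    # The AIoT loop: sense -> infer locally -> act, no round trip needed.
    reading = read_sensor()
    if classify(reading) == "too_hot":
        print(f"{reading:.1f} C - turning the AC on")
    else:
        print(f"{reading:.1f} C - doing nothing")

if __name__ == "__main__":
    upload(read_sensor())  # IoT: report and wait for someone else to decide
    aiot_step()            # AIoT: decide and act on the spot
```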
As to why we’re dealing with glasses – well, that’s because Apple launched the $3,499 Vision Pro headset. Yup, those recent TikTok and YouTube videos you’ve been watching – people walking (driving, dining, lounging) around with high-tech scuba-diving masks on their faces – all feature the Vision Pro. And yup, smart glasses were a thing years before the Vision Pro was even announced, but in this case, I believe the chicken came before the egg precisely because we all knew the egg was coming – and that it would smell like apples.
Good old Apple. Nothing sparks hype like a new Apple product. And this time, Apple started a whole new product category. Before you proceed with the lynching, let me assure you that I’m aware the Vision Pro headset can’t be defined as “smart glasses” – it’s a spatial computer.
But there’s a proverb: in a fire, both green and dry wood burn.
As $3,499 might be too much for too many, it’s only natural to check the state of the $300-$400 smart tech that we can put extremely close to our eyeballs. Enter the smart glasses – lightweight, affordable, and full of compromises!
And no, we’re not taking VR headsets into account. Those Mandalorian-wannabe helmets are tethered to your console or computer, making them more of a houseplant compared to smart glasses – let’s talk about something you can take out into the world. That is, if you do go out on a regular basis. If you don’t – don’t worry, there are plenty of smart glasses applications for in-house use.
What do they do?
We can’t talk about smart glasses and not start with the Ray-Ban/Meta collaboration. The Ray-Ban Meta Smart Glasses, formerly known as Ray-Ban Stories, got a major upgrade in the fall of 2023, packing a Qualcomm Snapdragon AR1 Gen 1 processor, better 12 MP cameras, better audio, options for live streaming to Facebook and Instagram, and Meta AI.
Recently, there was a Version 2 update that brought image quality improvements, global volume control and security enhancements.
Better image quality and greater stability are always welcome, but what’s really interesting about the Ray-Ban Meta Smart Glasses is the Meta AI, of course.
Saying “Hey Meta, look at this” to your glasses summons Zuck’s AI powers – the cameras become the AI’s eyes. After the AI model analyzes the picture of whatever is in front of you – a shovel, if you’re out in the country – the smart device on your face could say something like: “This is a shovel – a tool used for digging, lifting, and moving bulk materials such as soil, coal, gravel, snow, sand, or ore.” If you can trick your AI into thinking it’s a Scorsese character, you could get a somewhat different definition of a shovel and what it could be used for.
Or the smart glasses could help you outshine your architecture friends upon visiting a landmark – Gaudí’s Sagrada Família in Barcelona, say, or the Walt Disney Concert Hall, as suggested at the start.
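The flow that description implies is a simple pipeline: wake phrase, frame capture, a multimodal model call, and spoken output. Here’s a minimal sketch of that loop in Python – every function in it (heard_wake_phrase, capture_frame, describe_image, speak) is a hypothetical stand-in, not Meta’s actual API:

```python
from dataclasses import dataclass

@dataclass
class Frame:
    """A captured camera frame; just raw bytes for this sketch."""
    data: bytes

def heard_wake_phrase(audio_chunk: bytes) -> bool:
    # Hypothetical wake-word detector ("Hey Meta, look at this").
    return b"hey meta" in audio_chunk.lower()

def capture_frame() -> Frame:
    # Stand-in for the glasses' camera capture.
    return Frame(data=b"\x89JPEG...")

def describe_image(frame: Frame) -> str:
    # Stand-in for a vision-language model call; a real assistant
    # would send the frame to a hosted multimodal model here.
    return ("This is a shovel - a tool used for digging, lifting, "
            "and moving bulk materials.")

def speak(text: str) -> None:
    # Stand-in for text-to-speech through the glasses' speakers.
    print(f"[glasses] {text}")

def assistant_loop(audio_chunks) -> None:
    """Wake phrase -> capture -> model -> spoken answer."""
    for chunk in audio_chunks:
        if heard_wake_phrase(chunk):
            speak(describe_image(capture_frame()))

if __name__ == "__main__":
    assistant_loop([b"...", b"Hey Meta, look at this"])
```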
The thing is, Meta has not yet rolled out the “Look at this” feature to everybody out there – a limited number of users are getting it, testing it, and reporting what’s going on. The Ray-Ban Meta Smart Glasses are what the discontinued Google Glass dreamed of becoming.
Speaking of Google Glass, let’s remind ourselves why that legendary device was discontinued: it was hit with a wave of criticism it could not withstand. Privacy advocates were concerned that people with Google Glass might be able to identify strangers in public using facial recognition, or secretly record and broadcast private conversations. On top of that, cyber forensics experts at the University of Massachusetts said they found a way to steal smartphone and tablet passwords with a software program that uses Google Glass to track finger shadows as someone types in their password.
Back to the smart glasses, though: there are many more examples – the RayNeo X2 Lite and the RayNeo X2, the upcoming Frame by Brilliant Labs, the Solos AirGo 3, and so on.
The smart glasses can be thought of as an AI assistant, only faster in certain real-life situations. Why bother reaching into your pocket, taking out your smartphone (don’t drop it on the concrete), unlocking it, navigating to Google Lens, taking a picture of whatever you’re interested in, waiting for the phone to do its magic, and then using your eyes to read through the search results?
Instead, the smart glasses will do all of that for you faster, more easily, and potentially better. For now, they are far from perfect and need much more refinement, but that refinement is coming fast.
Of course, there’s great potential for smart glasses to help people with impaired vision, but that’s not the point of this article.
The search for Search (this is how we got Circle to Search)
Let’s go back to the smartphone realm for a change. You can’t possibly have missed the Galaxy AI unveiling alongside the Galaxy S24 announcement. At the Galaxy Unpacked event, Samsung put heavy emphasis on Galaxy AI and on the Circle to Search feature. Long story short, it lets you run a Google search on things you see on your display. If you’re watching a YouTube video about poisonous plants and you come across one you’re not familiar with, you can circle it and get instantaneous results about the object you’ve chosen.
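Mechanically, you can picture the feature like this: reduce the circling gesture to a bounding box, crop the on-screen pixels to it, and hand the crop to a visual search backend. Here’s a rough, hypothetical Python sketch – none of these functions are Samsung’s or Google’s actual interfaces:

```python
from typing import List, Tuple

Point = Tuple[int, int]          # (x, y) screen coordinates
Box = Tuple[int, int, int, int]  # left, top, right, bottom

def bounding_box(gesture: List[Point]) -> Box:
    """Reduce the user's circling gesture to a crop rectangle."""
    xs = [x for x, _ in gesture]
    ys = [y for _, y in gesture]
    return min(xs), min(ys), max(xs), max(ys)

def crop_screen(box: Box) -> bytes:
    # Stand-in for grabbing the on-screen pixels inside `box`.
    left, top, right, bottom = box
    return f"pixels({left},{top},{right},{bottom})".encode()

def image_search(crop: bytes) -> List[str]:
    # Stand-in for the visual search backend (Google Lens-style).
    return ["Poison hemlock (Conium maculatum)"]

def circle_to_search(gesture: List[Point]) -> List[str]:
    """The whole flow: gesture -> box -> crop -> visual search."""
    return image_search(crop_screen(bounding_box(gesture)))

if __name__ == "__main__":
    print(circle_to_search([(120, 300), (180, 260), (240, 310), (175, 350)]))
```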
We live in times of unprecedented access to knowledge, wisdom, information, data, and content (TikTok videos sum all of that up nicely). We want more, much more. We want to know it all, right now.
It’s easy to see why companies are obsessed with developing and selling tools that quench our thirst for information, content, and data. Or, if you’re into a different kind of explanation, just ask yourself who gets to decide what information your AI assistant provides you with. “Gríma Wormtongue” – that’s the answer my smart assistant gives me when I ask it “What was the name of that crook from Lord of the Rings who was advising King Théoden?”. Oh, the irony…
Anyway, the hunger for information is real. Hence, the glasses are getting smart and they’ll talk to us. Are we going to listen?
For finals
It’s a bit ironic, but the more we use our eyes to see (with smart glasses), the more we’d have to use our ears. Next thing you know, we’ll have to use our brains even more. Wait, wasn’t the whole idea of smart wearables to make us think less?
Jokes aside, I’m having a hard time imagining the smart glasses craze that’s about to unfold over the next few years not manifesting itself in all sorts of ugly scenarios. Not to mention the privacy nightmare once everyone starts wearing smart glasses that pack POV cameras.
Yes, it would be hypocritical to dismiss this technology entirely – after all, for thousands of years we’ve turned to external sources, be it books, films, or people, for extra knowledge and facts.
I’m just not sure these AI assistants will be objective – if that’s even possible. Only time will tell.
But I’m looking forward to those POV videos of people saying/doing stupid things. I can even hear their excuses: “My AI smart glasses told me so!”
I can’t wait.