
2024 will be the year of AI glasses.

Hardly anyone is talking about AI glasses yet, but by the end of the year a large number of users will be wearing them daily.

Apple is expected to release its pricey augmented-reality platform, Vision Pro, in the first quarter of 2024. But by year’s end, I think AI glasses will be the biggest thing around. Cue the drum roll, please!


Hold on, what?

Yes, that’s right. The sleeper hit of the year will be glasses that let you communicate with artificial intelligence (AI) from the comfort of your own face. In fact, people are already talking about the market leader, Meta.

The arrival of the Ray-Ban Meta glasses, which were announced in September and shipped in October, was initially met with a collective shrug. They were assumed to be virtual assistant glasses like Amazon’s Echo Frames, camera glasses like Snap Spectacles, or, at best, a minor improvement over their Meta predecessor, Ray-Ban Stories. And Spectacles, Echo Frames, and Ray-Ban Stories had all failed to excite the tech-savvy public.

It took everyone a while to realize that the $299 Ray-Ban Meta glasses are far superior in processing power, build quality, AI assistant, camera, and audio.

Despite that dull launch, three things happened in December that made the Ray-Ban Meta glasses extremely popular online.

First, Meta revealed that its “multimodal” feature would be available in a closed beta. This feature lets the glasses send a picture to the Meta Assistant, summoned with the “Hey, Meta” command, for processing and analysis. The glasses can, for example, scan a table of ingredients and suggest recipes, entirely hands-free. Spoken and visual communication with Meta’s powerful AI is astounding and plainly revolutionary.


Second, a consensus among tech journalists began to emerge regarding the transformative power of Ray-Ban Meta glasses. Even though I was recommending them back in October, it wasn’t until last month that a significant portion of my colleagues began to show genuine enthusiasm for them.

Kimberly Gedeon of Mashable stated, “The Ray-Ban Meta Smart Glasses shocked me, in a good way.”

“Ray-Ban Meta glasses convinced me to believe in smart glasses,” stated Filipe Espósito of 9to5Mac.

Regarding the “multimodal” feature, Scott Stein of CNET remarked, “the demo wowed me because I had never seen anything like it.”

Although sharing from the glasses is limited to Facebook and Instagram, users spread Ray-Ban Meta videos across TikTok and other social media platforms.


Yes, a market exists already.
There are already a number of AI glasses on the market. Lucyd’s Lyte smart eyewear with ChatGPT, for instance, was released in August. Reviewers say that although the glasses are cheap, their quality is disappointingly low, particularly the sound.

The Solos AirGo 3, another product offering ChatGPT access through glasses, has received more favourable reviews than the Lucyd Lyte.

And, of course, Amazon just released the third-generation Echo Frames, which lack a camera but are only $30 less expensive than the Ray-Ban Meta glasses. Furthermore, they give you access to Amazon’s Alexa assistant, which isn’t actually LLM-based generative AI.

More AI wearables are available, and more are coming. Talking to AI through earbuds is already fairly simple. The problem with that platform is that no one wears earbuds all day, every day, unlike glasses, which many people do wear every waking hour.

Smart watches are another way to access AI: through an app, you can ask the AI questions and it will respond. The weak point here is audio, which is typically too loud for those around the wearer and too quiet for the wearer.

The 55-gram magnetic AI Pin from Humane is designed to be worn on clothing. Its inputs are a camera and microphone; its outputs are a speaker and a laser projector. It is energising to see a completely new wearable platform take such a risk. But asking the general public to pin something to their clothing is asking too much, as this isn’t always feasible or advisable. The Humane AI Pin has no chance of success in the market.


Why the glasses form factor matters for AI

The main reason other wearables won’t take off as AI interface devices is the superior form factor of glasses. The temples, or arms, of glasses are ideally positioned to deliver clear audio to the wearer’s ears while remaining virtually silent to those nearby. They also have enough room inside to accommodate antennae, batteries, and other electronic parts.

Glasses can look fashionable. And as the Ray-Ban Meta glasses demonstrate, they can still hold a high-quality camera, plus a light that shows others when you’re taking photos or videos.

Of course, a lot of people already wear glasses every day. You can order Ray-Bans with prescription lenses, transition lenses (dark in sunlight, clear indoors), or both. I know people who started wearing clear glasses indoors because of the Ray-Ban Meta glasses. And for those who already wear spectacles, choosing AI glasses is a no-brainer.

Another reason Ray-Ban Meta glasses dominate the AI glasses market is that their quality is significantly higher than their competitors’, presumably because Meta subsidises them. In fact, Ray-Ban Meta glasses cost less than dozens of Ray-Ban models with no electronics at all.

However, just as rivals appeared quickly after Amazon released the Echo, we can anticipate competitors such as Google, Microsoft, OpenAI, and others, possibly even Apple, entering the AI glasses market in the next year or two.

This is the market that excites me. Meta demonstrated that AI glasses are very alluring if the entire experience is flawless and of the highest calibre.

Google will ship AI glasses (likely under the Pixel brand), because AI glasses threaten to replace search for many users.

Amazon will likely release AI glasses under its Fire brand. They will surpass Amazon’s own unsuccessful Echo Frames and align with the company’s data-gathering goals.

And eventually Apple will have to introduce AI glasses of its own, most likely under the iGlasses brand. Similar to smart watches, glasses combine technology and style and require a high level of usability. For Apple, this will be an enticing market.

My wild-card prediction is that OpenAI will launch its first hardware product: AI glasses. Either way, 2024 is going to surprise everyone, as AI glasses become the most significant tech device category.
