I wore Meta’s Ray-Ban Display smart glasses for 20 minutes at Meta Connect 2025, and here are my first impressions.
Sleek, lightweight, and they look like normal glasses
I have been wearing prescription glasses since third grade. Over the years, I have tried glasses of all shapes and sizes, even though many people have told me to just get surgery and be done with them. But somehow, I see them as more a part of my identity than anything else. Honestly, I probably know more about glasses than anyone else. When choosing a pair of glasses, the number one thing I look for is comfort and how well they suit my face. The Meta Ray-Ban Display glasses don’t look any different from regular glasses. They are sleek, lightweight at only 69 grams, and don’t make you feel like you are wearing something strange that draws stares in public.
These glasses have a display in one of the lenses in addition to extra compute circuitry and a larger battery than Ray-Ban Meta AI glasses. (Image credit: Anuj Bhatia/Indian Express)
While the Meta Orion glasses were a bit on the chunkier side, the Ray-Ban Display glasses seem impressively functional as smart glasses, though they are slightly thicker, especially around the temples. To be clear, this is only noticeable when compared to the regular Ray-Ban Meta glasses.
The form factor of these smart glasses isn’t any different from standard Ray-Ban glasses. They are actually built on the same design language and form as the popular Ray-Ban AI glasses that are all the rage these days and all over social media. This means the new model retains the Ray-Ban branding and styling, features two 12-megapixel cameras on the front, and, of course, a display.
A display adds a new dimension along with AI features
If you have ever used the Meta Ray-Ban glasses, you know they don’t have a built-in display, as everything is controlled through voice interaction. That’s not the case with the Meta Ray-Ban Display smart glasses, which include a display that changes how you interact with your surroundings. There aren’t two displays – instead, there’s a single monocular display embedded in the center of the right lens. A binocular display would have been ideal, but I am guessing Meta opted for a single monocular display due to cost considerations.
The display’s custom-built light engine provides sharp, bright visuals at 42 pixels per degree, a density not found in any other similar consumer device. According to Meta, the lenses are photochromic, meaning they darken or lighten automatically based on the ambient light. The display can show the menu, weather, and notifications; provide turn-by-turn navigation; display captions and translations of real-world speech; and present Meta AI responses as text as well as audio. I could see the display clearly even outdoors, under direct sunlight. The user interface is simple: a six-app grid that users navigate using the wristband.
An sEMG bracelet allows you to more naturally interact with virtual interfaces by just using your hands and gestures. (Image credit: Anuj Bhatia/Indian Express)
The good thing about the display on these glasses is that there’s virtually no adjustment period. Unlike Google Glass, launched many years ago, which required you to strain your eyes to see the display, this projected image is softer on the eyes and appears at a comfortable distance.
To wake up the display, you simply double-tap with your middle finger. You can then use gestures like a thumbs-up or swipe left and right, similar to using a D-pad. Pressing with your middle finger again takes you back to the main menu. While various gestures are supported, the glasses use a gesture band (or bracelet), which delivers accurate hand-tracking input even when the glasses’ cameras can’t see your hand (more on that later).
It takes a while to get used to the gestures, and throughout the demo, I kept forgetting which one to use. But that’s okay. I had a similar experience when I tried Apple’s Vision Pro for the first time. Once you get used to these features, there’s no looking back.
During the demo, I got to try out various features of the Meta Ray-Ban Display glasses. Meta had curated three to four specific experiences for attendees to explore and test what these glasses can do. One of them, for example, focused on the camera. By saying “Hey Meta, open the camera,” or using a double-finger tap, you could activate the camera to take a photo or record a video. Photos and videos sync to Meta’s phone app over a local Wi-Fi connection with the glasses.
The original Ray-Ban glasses can also do this, but the new display lets you see your subject as you shoot, eliminating the need to reach for your phone. Having both a camera and a display opens up entirely new ways to use the glasses. In one experience, for instance, I was asked to visualise and restyle a wall in the room, perhaps with a new coat of paint or a different art style. Think of it like how Pinterest works, but overlaid on the real world.
Remember I briefly mentioned the band I wore during the demo with the glasses? It’s worn like a smartwatch, but it’s not used to tell time or track your vitals. Instead, it improves hand tracking accuracy by sensing muscle movements and translating them into actions. The band detects the electrical signals in your arm muscles to better understand how you want to interact with virtual objects, though it can’t read your mind. Think of the sEMG band as more than just an input method; it’s essentially the control interface for Meta’s smart glasses. While the glasses can technically function without the band, the experience isn’t nearly as good.
In one of the experiences, I was able to scroll through music tracks by rubbing my thumb forward or backward against my index finger. It felt almost like using a fidget roller to control a Walkman, except there was no physical object in my hand.
The sEMG band is extremely important for next-generation virtual interfaces because it enables full input without a touchscreen or cameras that can see your hands. The band I wore was comfortable, made from stretchy, breathable fabric, and fit snugly around my wrist. It’s completely non-invasive and feels no different from wearing a smartband. The Meta Neural Band has an 18-hour battery life and an IPX7 water-resistance rating.
A great AI wearable companion
I remember when AI devices like the Humane AI Pin and the Rabbit R1 launched a few years ago, they were pitched as AI companions. As it turned out, those attempts felt broken and unpolished. In contrast, Meta’s implementation, with AI placed at the center of the experience, feels just right for everyday wear. The Ray-Ban AI glasses already proved how well Meta AI could be integrated, and the new Meta Ray-Ban Display glasses build on that, offering features like live translation and the ability for the AI to identify objects. This is where the glasses truly start to feel like a futuristic wearable AI assistant, a companion that can talk to me and see what I see.
I would definitely try making a video call over WhatsApp if I get to spend more time with the glasses in the future.
You can see how the new Meta glasses look on my face. (Image: Anuj Bhatia/The Indian Express)
I think AI offers more practical use cases on smart glasses than on smartphones. The fact that you can get deeper, more personalised responses, and that the glasses seem to listen and think before replying, makes interactions feel much more natural, especially with follow-up questions.
These glasses also double as Bluetooth headphones, pairing seamlessly with both iOS and Android devices. The built-in speakers deliver surprisingly solid sound with minimal audio bleed. And with multiple microphones, the glasses are especially good for voice-based tasks like listening to podcasts or taking phone calls.
Questions about battery life still exist
Meta includes a case that doubles as a charger. The glasses snap into the case, and small metal contact pins on the bridge charge the onboard battery, just like with the Ray-Ban AI glasses. The glasses last six hours per charge, with the external case providing an additional 30 hours, or roughly four full charging cycles. I am not sure whether the battery life of these glasses can truly compare to that of smartphones just yet, but that’s how it stands for now.
The future looks bright for smart glasses
Meta’s new glasses have given me hope that smart glasses could one day replace smartphones, and that day may not be too far off. If you ask me now whether these glasses are a smartphone replacement, I don’t have a clear answer. They can’t make calls on their own, nor do they have the full capability to replace my iPhone. But I will say this: Meta’s Ray-Ban Display smart glasses feel like the everyday glasses I have been looking for.
The original Ray-Ban AI glasses were primarily camera- and audio-focused, and they are great at that; these new glasses are visually more advanced, and the integration of AI takes the experience to a whole new level. They are still far from perfect, but Meta seems to be making real progress toward true AR glasses. At this moment, I would place Meta’s Ray-Ban Display glasses somewhere between the original Ray-Ban AI glasses and Project Orion. Perhaps the fate of these high-tech glasses will hinge on how developers respond to them. Meta needs them to develop third-party apps (and I am told Meta’s family of apps, including Instagram, WhatsApp, and Facebook Messenger, will come pre-loaded on the glasses).
But is that enough to gauge consumer interest in these glasses that retail for $799? I think we will have to wait a while to see whether consumers are interested in smart glasses that now feature a real display.
The writer is attending the Meta Connect 2025 in Menlo Park, California at the invitation of Meta.