Meta Ray-Ban Display glasses can now type messages without touching a phone

Meta Platforms has announced a major software update for its Meta Ray-Ban Display smart glasses, introducing neural handwriting, live captions, display recording, expanded navigation support, and new developer tools. The update pushes Meta’s AI wearable closer to becoming a practical everyday computing device that can handle messaging, navigation, accessibility, and content creation without relying heavily on a smartphone.

Published By: Deepti Ratnam | Published: May 15, 2026, 09:42 AM (IST)

Meta is expanding the capabilities of its next-generation smart glasses. With these features, the device feels closer to futuristic computing than to a traditional wearable. The company has rolled out neural handwriting support for the Meta Ray-Ban Display glasses to all users, along with several new AI-powered features. It is one of the biggest updates yet for Meta's AI-powered glasses, designed to make them more interactive in everyday life.

The update matters because it pushes the glasses beyond being just a camera and audio wearable. Meta wants the device to behave more like a lightweight computer that sits quietly on your face throughout the day without demanding constant attention the way a smartphone does.

Neural handwriting turns hand gestures into messages

One of the standout additions to the Meta Ray-Ban Display glasses is neural handwriting, which lets users write messages using hand gestures. The feature works through the included neural wristband: rather than typing on a screen, you move your finger naturally in the air to compose messages. It works across several social media apps, including Instagram, WhatsApp, and Messenger, as well as the native Android and iOS messaging apps.

When Meta first showcased this technology, it looked more like a research demo than a consumer feature. By rolling it out publicly, the company is signaling confidence that gesture-based input can become a practical alternative to tapping on a phone screen.

Neural handwriting offers a quieter, more private way to communicate. Unlike voice typing, which can feel awkward in public spaces, it makes messaging easy while walking, multitasking, or commuting.

What makes the feature even more interesting is the direction it signals for wearable computing. Most smart glasses companies are focused on AI assistants, notifications, or cameras. Meta, by contrast, is building a full interaction system that lets users see information, interact with apps, hear responses, and now even write messages, all without pulling out a smartphone.

Display recording could become Meta's next creator-focused feature

The company has also introduced a feature called 'Display Recording,' which lets users capture videos that combine what appears inside the glasses' display with the real-world view and surrounding audio in a single shareable clip.

In practical terms, this could be one of the biggest creator-focused features on the device. Travel creators, for example, could use the glasses to record navigation overlays while walking through a city.

Gamers could showcase in-lens app interactions, while a cooking creator could display recipe steps while recording the preparation process.

The footage should feel more immersive than traditional phone recordings, because viewers see both the digital layer and the real environment together.

Navigation and live captions make the glasses more practical

Meta is also expanding walking directions to users across the United States and to several international cities, including Rome, Paris, and London. On-glasses navigation removes the need to constantly check your phone while walking, something that often interrupts real-world awareness.

Live captions support

Another notable addition is live captions support for voice messages across platforms such as WhatsApp, Facebook Messenger, and Instagram Direct. The glasses will also transcribe speech directed at the user during conversations and calls.

Live captioning itself is not new, but integrating it into smart glasses changes the experience significantly. Instead of looking down at a screen, users see captions in a more natural line of sight.

The feature will benefit users in noisy environments such as public transport, airports, or crowded streets, making it easier to follow conversations without repeatedly asking people to repeat themselves.

Meta wants developers to build the future of smart glasses

Meta's platform will also be made available to developers via a new developer preview. Developers can now develop apps for the glasses using web technologies: HTML, CSS and JavaScript. According to the company, apps can feature games, cooking instructions, transportation applications, grocery lists, and more.

This change could prove more significant than any hardware upgrade. Smart devices rarely succeed in the long term without third-party developers building experiences people use daily. Apps made smartphones popular, and smartwatches became useful once developers optimized their services for small screens. Meta appears to be building a smart glasses ecosystem of its own before rival companies catch up.

The company will also release tools that let developers bring existing mobile apps to the glasses' display, which should eventually result in more useful real-world applications rather than experimental demos of little practical value.

Meta's real goal goes beyond smart glasses

The biggest takeaway from Meta's announcement is the company's emphasis on "glanceable computing." Rather than replacing the phone outright, the glasses aim to reduce how often users need to look at one. That could improve Meta's chances of mainstream success.

Numerous smart glasses products have failed because they tried to replace the smartphone too early. Meta's approach seems different: the company describes the glasses as an "ambient companion device" that "helps users on the go, all day long."

The other interesting tidbit is the upcoming launch of "Muse Spark," which Meta says is a first step toward personal superintelligence for its products. For now, this suggests that future AI interactions on the glasses will be far more contextually and personally aware than existing voice assistants.
