Meta opens Ray-Ban Display 'neural handwriting' to all developers

Meta has opened the "neural handwriting" feature on its Ray-Ban Display smart glasses to the wider developer community, in a move that broadens the platform's third-party app ecosystem. The announcement came from the Reality Labs team at the company's Sunnyvale campus.
The feature lets users type on virtual typing surfaces shown on the glasses' display using a surface electromyography (sEMG) wristband. The wristband reads electrical activity in the wrist and finger muscles and converts those signals to text input in real time. Meta claims input speeds of up to 50 words per minute.
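The signal-to-text flow Meta describes — rectified muscle activity in, decoded characters out — can be sketched as a small pipeline. This is a purely illustrative toy, assuming a windowed multi-channel signal; none of the names below (`EmgFrame`, `decode_character`, and so on) come from Meta's SDK, and the real decoder is a trained neural model rather than a threshold rule.

```python
# Hypothetical sketch of an sEMG-to-text pipeline. All names are invented
# for illustration; they do not reflect Meta's actual SDK.

from dataclasses import dataclass
from typing import List


@dataclass
class EmgFrame:
    """One time window of rectified sEMG samples, one list per electrode."""
    channels: List[List[float]]


def mean_activation(frame: EmgFrame) -> List[float]:
    """Collapse each channel to its mean rectified amplitude."""
    return [sum(abs(s) for s in ch) / len(ch) for ch in frame.channels]


def decode_character(features: List[float]) -> str:
    """Toy stand-in for the handwriting model: picks a letter from the
    most active channel. A real decoder would be a neural network."""
    alphabet = "abcdefgh"
    strongest = max(range(len(features)), key=lambda i: features[i])
    return alphabet[strongest % len(alphabet)]


def decode_stream(frames: List[EmgFrame]) -> str:
    """Run the per-window decoder over a stream of frames."""
    return "".join(decode_character(mean_activation(f)) for f in frames)
```

The point of the sketch is the shape of the problem — continuous multi-channel signals are windowed, featurized, and mapped to discrete characters — not the decision rule, which in the shipping feature is the trained writing model.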
Meta has published the developer documentation on its Reality Labs site, including SDK samples, performance benchmarks and details of the writing model's accuracy boundaries. According to the company, the writing model has been trained on English for two years and currently has an error rate of 4.8 per cent at 50 words per minute.
The feature was first shown publicly at Meta Connect in 2025. At the time, access was limited to a small group of Meta Research developers; with the official SDK release, it is now open to all developers. The SDK download has been added to Meta's "Wear Apps" application store.
A notable addition for developers is a low-level API that allows third-party apps to read sEMG sensor data directly. This API lets developers add their own machine-learning layers on top of the standard writing model. According to the documentation, access to this lower-level API is limited to members of Meta's annual Wear Developer programme.
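The raw-data API described above suggests a subscription pattern: an app registers a listener on the sensor stream and runs its own classifier on each sample. The sketch below assumes that pattern; `RawEmgStream` and `PinchDetector` are invented names, and the threshold check merely stands in for a custom machine-learning layer of the kind the documentation describes.

```python
# Hypothetical sketch of reading raw sEMG data and layering a custom
# classifier on top. Class and method names are invented for illustration;
# Meta's actual low-level API is gated behind the Wear Developer programme.

from typing import Callable, List


class RawEmgStream:
    """Stand-in for an SDK object that delivers raw multi-channel samples."""

    def __init__(self) -> None:
        self._listeners: List[Callable[[List[float]], None]] = []

    def subscribe(self, listener: Callable[[List[float]], None]) -> None:
        """Register a callback invoked once per incoming sample."""
        self._listeners.append(listener)

    def push(self, sample: List[float]) -> None:
        """In a real SDK the wristband would drive this; here the caller does."""
        for listener in self._listeners:
            listener(sample)


class PinchDetector:
    """Custom layer on top of the raw stream: a simple amplitude threshold
    standing in for a trained gesture classifier."""

    def __init__(self, threshold: float = 0.5) -> None:
        self.threshold = threshold
        self.events: List[str] = []

    def on_sample(self, sample: List[float]) -> None:
        if max(sample) > self.threshold:
            self.events.append("pinch")
```

A developer's own model plugs in exactly where `PinchDetector.on_sample` sits — downstream of the raw stream, independent of the standard writing model.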
Meta chief technology officer Andrew Bosworth wrote on X that "neural handwriting" was "the most accurate input method we have designed for sustained use on smart glasses." Bosworth added that the team is working on a gesture-classification model.
Developer-community reactions have been mixed. Mobile-app developer Liam Pham wrote on social media that "the accuracy is surprisingly high — it creates a typing-like experience that feels natural." Privacy researcher Soheil Niknam said the handling of sEMG data raised a new set of security questions.
Meta said sEMG data is processed locally between the glasses and the paired wristband, and that the primary model does not send data to Meta cloud services. The company added that details of its user-data retention policy would be published next month in a "privacy white paper."
Meta has not disclosed total Ray-Ban Display sales. However, CFO Susan Li said on the February earnings call that "Ray-Ban Display has launched at four times the pace of Ray-Ban Stories," Meta's first smart-glasses model. Market researcher IDC estimates that more than half of all smart-glasses shipments globally in Q1 2026 were made by Meta.
The developer launch came with previews of three upcoming Ray-Ban Display apps: a glasses version of the Notion note-taking app, the cross-platform messaging app Beeper and the German language-learning app Babbel. The three apps are expected to be available through the application store from late May.