Everything We Know About Apple Glasses
Apple Glasses Are Coming
Apple is building smart glasses. Not a concept. Not a patent filing. A product in hardware engineering right now, with production targeting December 2026 and a public launch in 2027.
Here's everything we know, what it means for AI, and why we've been building for this moment since Demi's first line of code.
The timeline is real
On February 17, 2026, Bloomberg's Mark Gurman reported that Apple is ramping up work on smart glasses, an AI pendant, and camera-equipped AirPods. The glasses, codenamed N50, are the furthest along. Production is targeted for December 2026. Public launch in 2027.
This lines up with analyst Ming-Chi Kuo's roadmap from June 2025, which placed mass production of Ray-Ban-style Apple smart glasses at Q2 2027 with projected shipments of 3 to 5 million units in the first year. Kuo reaffirmed this timeline as recently as June 2026.
The signal isn't just analyst predictions. It's organizational. Apple dissolved its Vision Products Group and halted Vision Pro production after sales collapsed to roughly 80,000 units in 2025. The lighter Vision Air headset (N100) was paused entirely to redirect engineering resources to N50. Apple isn't hedging. They're all in on glasses.
What we know about the product
The first-generation Apple glasses won't have a display. No AR overlays, no floating windows. They're camera-and-audio glasses, similar in concept to Meta's Ray-Bans but with Apple's silicon and ecosystem depth.
Here's what multiple sources (MacRumors, 9to5Mac, Bloomberg) report:
A dual-camera system. One high-resolution camera for photos and video. A second for environmental sensing and computer vision. Visual Intelligence for identifying objects, reading text, and understanding what you're looking at. Voice-controlled Siri as the primary interaction. Live translation. Turn-by-turn navigation. Phone calls, music, spatial video. Multiple frame styles, prescription lens support through Zeiss, and an LED indicator that lights when the camera is active.
The custom chip inside is based on the Apple Watch S-series architecture. Apple's silicon group stripped down the Watch chip for power efficiency and added multi-camera control. Same ultra-efficient foundation that gives Apple Watch all-day battery life with health sensors, GPS, and cellular.
They'll require an iPhone and will be marketed as an iPhone accessory, in the same tier as Apple Watch and AirPods.
Price estimates cluster between $499 and $799. Premium, but within reach.
The market already proved itself
Apple isn't placing a bet. They're entering a market Meta has already validated.
Meta sold 7 million Ray-Ban smart glasses in 2025, tripling year-over-year. EssilorLuxottica's stock surged 14%. The Ray-Ban Meta glasses became the top-selling product in 60% of EMEA Ray-Ban stores. Production targets for 2026 sit at 10 million units, potentially scaling to 30 million.
Google confirmed AI glasses built on Android XR launching in 2026, with Samsung, Gentle Monster, and Warby Parker as partners. Snap spun off a separate company for its consumer AR Specs, shipping this year. Global smart glasses shipments rose 110% year-over-year in H1 2025.
This is the year the category goes mainstream. Apple's entry in 2027 is timed to ride a wave that's already building.
Apple's three-device AI strategy
The glasses aren't a standalone product. Per Bloomberg and TechCrunch, Apple is building three AI wearables at once.
Smart glasses with dual cameras and voice control. An AI pendant that pins to your shirt or hangs as a necklace, with a low-resolution camera for context and a microphone for Siri. Camera-equipped AirPods with infrared sensing for gestures and spatial awareness.
All three connect to your iPhone. All three are designed around the same principle: AI that sees what you see, hears what you hear, and acts on it.
The Apple Watch is already part of this ecosystem. Health data from your wrist. Environmental awareness from your glasses. Audio interaction from your ears. A single AI layer connecting all of it.
Where the display version fits
The first generation is audio and camera only. But Apple is already working on a display-equipped version for 2028, built on micro-OLED displays and waveguide optics. Apple has secured most of the global micro-OLED production capacity for 2026-2027, locking competitors out of the supply chain.
Kuo's roadmap outlines seven head-mounted products in development: three Vision headsets and four smart glasses variants. The display glasses will add AR overlays on top of everything the first generation already does.
The no-display first generation isn't a compromise. It's deliberate sequencing. Ship the AI, the cameras, and the ecosystem integration first. Add the visual layer once the product and user behavior are established.
Why this is exactly what we built for
We didn't build Demi for a chat window. We built it for a world where AI runs on your body, not in your hand.
Demi's primary surface is the Apple Watch. Two inches. No keyboard. Interactions measured in seconds. Every design decision we made was forced by those constraints. The AI has to understand you from a few words, act immediately, and confirm without demanding attention.
That's the exact interaction model Apple's glasses will require. Voice in, action out, confirmation at a glance. No typing. No scrolling. No staring at a screen.
When someone tells Demi to reschedule a meeting, it finds the right time, moves it, and notifies everyone. When someone says "order my usual from the Thai place," Demi handles the browsing, the order, the payment. When someone asks "what's on my calendar tomorrow," Demi reads it back in seconds.
These interactions don't change when the input device moves from your wrist to your face. They get better. The same voice command that works on a Watch works identically through glasses. The same autonomous agent that handles your email, calendar, smart home, and food orders from your wrist will handle them from your glasses without a single architectural change.
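To make that concrete, here's a minimal sketch in Swift. Every type and name here is hypothetical, not Demi's actual code: it just shows what "no architectural change" means in practice. The agent resolves and executes intents without knowing which wearable captured the audio; only the confirmation step branches per device.

```swift
import Foundation

// Hypothetical sketch, not Demi's real architecture: the agent
// layer is identical for every wearable; only confirmation differs.

enum InputDevice { case watch, glasses, airpods }

struct VoiceCommand {
    let transcript: String   // e.g. "order my usual from the Thai place"
    let device: InputDevice  // used only to style the confirmation
}

enum AgentAction {
    case rescheduleMeeting(id: String, to: Date)
    case placeOrder(merchant: String)
    case readCalendar(day: Date)
}

protocol Agent {
    // Parse a few words of speech into a concrete action.
    func resolve(_ command: VoiceCommand) async throws -> AgentAction
    // Carry the action out: calendars, email, orders.
    func execute(_ action: AgentAction) async throws
}

// The only device-specific branch: how the result is confirmed.
func confirm(_ action: AgentAction, on device: InputDevice) {
    switch device {
    case .watch:   print("haptic tap + glanceable card")
    case .glasses: print("spoken confirmation + LED cue")
    case .airpods: print("spoken confirmation")
    }
}

func handle(_ command: VoiceCommand, with agent: Agent) async throws {
    let action = try await agent.resolve(command)
    try await agent.execute(action)
    confirm(action, on: command.device)
}
```

Nothing in `handle` mentions a screen. Swap `.watch` for `.glasses` and the pipeline is untouched.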
We're not pivoting to glasses. The product we've been building is already the product glasses need.
The AI gap nobody's talking about
The hardware conversation dominates the headlines. Camera specs, frame materials, chip architecture, display technology. But the real gap isn't hardware. It's the AI layer.
Apple's glasses will ship with Siri. Improved, yes. But Siri today can't book a restaurant, draft and send an email, compare three products and give you a recommendation, order food, or run autonomous tasks on a schedule.
Meta's glasses run Meta AI. Good at conversation, limited at action. Google's run Gemini. Strong at search, early at execution.
None of them have an autonomous agent that handles multi-step tasks across your calendar, email, smart home, browser, and third-party services. None of them have a permission system designed for an AI that acts on your life. None of them have been training on wearable interaction patterns for over a year.
Demi does all of that today. On a watch. The device that shares a chip architecture with the glasses Apple is building.
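What does a permission system for an AI that acts on your life look like? Purely as illustration (hypothetical names again, not Demi's implementation), the core idea is to gate each action class by risk: some actions run autonomously and notify after, some ask first, and some never run unattended.

```swift
// Illustration only: one way to gate an acting agent's
// autonomy per action class. All names are hypothetical.

enum PermissionTier {
    case autonomous    // act, then notify (low risk: read calendar)
    case confirmFirst  // ask before acting (medium: send an email)
    case manualOnly    // never automate (high: payments over a cap)
}

struct ActionPolicy {
    private var tiers: [String: PermissionTier] = [
        "calendar.read":  .autonomous,
        "calendar.move":  .autonomous,
        "email.send":     .confirmFirst,
        "order.place":    .confirmFirst,
        "payment.charge": .manualOnly,
    ]

    // Unknown actions default to asking, never to acting.
    func tier(for action: String) -> PermissionTier {
        tiers[action, default: .confirmFirst]
    }
}
```

The safe default matters more than the table: on a voice-first device there's no room for a consent dialog at every step, so anything unrecognized falls back to a spoken confirmation.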
What comes next
Apple will likely announce the glasses at WWDC 2026 this June, based on Gurman's reporting and AppleInsider's analysis. Production starts at the end of this year. Public availability in 2027.
Between now and then, every AI assistant will scramble to figure out how to work without a screen. Most of them were designed around chat interfaces that assume your full attention. Adapting that to a voice-first, glanceable, ambient form factor isn't a feature update. It's a rebuild.
We don't have to rebuild. We started here.
Sources: Bloomberg, Ming-Chi Kuo, MacRumors, 9to5Mac, CNBC, TechCrunch, Engadget, Wareable, Patently Apple