Apple is finally making a serious play for your face — and it’s not wearing a Vision Pro.
According to reports from Bloomberg’s Mark Gurman (via TechCrunch), Apple is actively testing multiple smart glasses prototypes. The general industry consensus points toward Apple prioritizing a lightweight, camera-and-AI-first approach over a full AR display — at least for the first consumer generation.
On paper, a display-less design might sound underwhelming compared to the dream of AR overlays everywhere. But it’s actually a smart, classically Apple move. Here’s why.
No Display? No Problem (For Now)
The lack of a display in v1 would disappoint enthusiasts, but it’s likely the right call. Battery technology hasn’t caught up to what a full AR experience demands in a glasses form factor. The Meta Ray-Bans proved that smart glasses don’t need a screen to be useful — they just need good cameras, solid AI, and a capable assistant. Apple seems to agree.
Instead of cramming in half-baked AR that drains the battery in 45 minutes, Apple is reportedly prioritizing what actually matters: comfort, all-day wearability, and a best-in-class AI assistant. That’s the formula Meta landed on with the Ray-Bans, and Apple would be wise to follow it — while bringing its own design polish.
Gesture Control: The Vision Pro’s Legacy
Apple is reportedly doubling down on hand tracking across its product line — the Vision Pro uses it, and supply chain rumors suggest AirPods Pro cameras and future smart glasses could support it too. A gesture-based input system for glasses would mean reliable interaction without needing to touch your phone or speak aloud in public.
This could create a consistent interaction model across Apple’s wearable ecosystem. Tap your fingers together to answer a call, swipe the air to dismiss a notification, point at something and ask Siri what it is. It’s invisible computing, which is exactly the point.
Design That Actually Looks Like Eyewear
Multiple reports suggest Apple is testing premium frame materials — moving beyond the standard plastics used in many first-generation smart glasses. With four style variants reportedly in testing, it’s clear Apple wants these to feel like glasses you’d choose to wear, not a gadget strapped to your face.
That distinction matters. The smart glasses market lives or dies on whether people wear them outside. If they look like tech, they stay in the drawer. If they look like premium eyewear, they become everyday accessories.
Siri’s Big Moment
None of this works without a great voice assistant, and that’s where the pressure is really on Apple. iOS 27 is widely expected to bring a significant Siri overhaul, and smart glasses would be its first real hardware showcase. If Siri still can’t reliably answer “what am I looking at?” without hallucinating, the glasses fail regardless of how good the hardware is.
Apple knows this. That’s probably why the glasses keep slipping — they’re waiting for the AI software to catch up to the hardware vision.
The Bottom Line
Apple’s smart glasses strategy — if reports are accurate — appears conservative in the best way. Skip the AR display for now, nail the basics, make them beautiful, and let the AI carry the experience. If the software delivers, these could be the first smart glasses that normal people actually want to buy — not just tech reviewers.
Reports point to a preview later this year, with a launch likely in 2027. That leaves plenty of time for Apple to get the software right.
Based on reporting from TechCrunch (citing Bloomberg’s Mark Gurman) and Glass Almanac.