Apple’s long-rumored smart glasses are coming into focus. A fresh set of leaks has detailed key design decisions for Apple’s first entry into the AI eyewear category, and the picture is notably restrained. According to reports from AppleInsider, the device will rely on gesture-based input, include two built-in cameras, integrate Siri as the primary AI interface, and — critically — will not feature any kind of display.
This positions Apple’s smart glasses as a direct competitor to Meta’s Ray-Ban AI glasses rather than the full spatial computing experience of the Apple Vision Pro.
Gesture-Based Input Instead of Touch or Voice Only
The biggest design detail in the new leaks is the emphasis on gesture-based input. Apple is reportedly building the glasses to respond to physical hand or head movements, reducing reliance on voice commands alone. This approach makes sense for a device worn in public where activating Siri by voice repeatedly would be intrusive.
The exact gestures being tested have not been detailed, but the approach is consistent with how Apple has handled interaction models on the Apple Watch and the AirPods lineup — subtle physical inputs that feel natural without drawing attention.
Two Built-In Cameras for Visual Intelligence
Two cameras are reportedly part of the current design. They are expected to power Siri’s Visual Intelligence capabilities, allowing the glasses to identify objects, read text, translate signs, and answer questions about what the wearer is looking at in real time.
This functionality would bring Apple Intelligence features — already available on iPhone via the Camera Control button — directly to the face. Apple is also bringing a new Siri Camera Mode to the iOS 27 Camera app, so the underlying AI infrastructure is clearly being built out to work across multiple form factors.
No Display: Apple Prioritizing Battery Life and Wearability
Unlike Google’s XR glasses or earlier concepts of AR eyewear, Apple’s smart glasses in their current form will reportedly have no in-lens display. This is a significant design choice that prioritizes battery life, weight, and everyday wearability over immersive visual output.
The tradeoff is that all AI responses and information will be delivered through audio — likely via open-ear speakers similar to the technology Apple has developed in AirPods. This keeps the glasses looking like normal eyewear, which is arguably the most important design goal for a product meant to be worn all day.
The Apple Vision Pro represents the high end of Apple’s spatial computing vision. The smart glasses appear to be a separate, more accessible product aimed at everyday AI assistance rather than immersive computing.
Siri as the Core AI Interface
Siri will be central to the experience. Given that Apple is overhauling Siri for iOS 27 with a chatbot-like interface — including a new app, conversational history, and support for third-party AI agents — the glasses could debut alongside or shortly after the new Siri launches this fall.
The glasses would effectively give Siri a persistent presence on the user’s face, making it the most always-on version of Apple’s assistant to date.
When Will Apple Smart Glasses Launch?
No official launch timeline has been confirmed. Reports suggest the glasses could arrive in late 2026 or early 2027, potentially alongside or after the expected iPhone Fold debut. Apple’s approach — starting with a camera-and-audio-focused device before adding a display — mirrors how Meta built out its Ray-Ban smart glasses before moving toward more complex AR hardware.
Frequently Asked Questions
Will Apple smart glasses have a display?
According to current leaks, no. Apple’s first smart glasses are expected to have no in-lens or AR display. All output will be delivered via audio.
What cameras will Apple smart glasses have?
The current design reportedly includes two built-in cameras used to power Siri’s Visual Intelligence features for real-time object recognition, text reading, and translation.
How will you interact with Apple smart glasses?
Leaked reports indicate the glasses will use gesture-based input rather than relying solely on voice commands, making them more discreet to use in public.
When could Apple smart glasses launch?
No confirmed date is available. Estimates point to a possible launch window in late 2026 or early 2027, potentially after the new Siri overhaul in iOS 27 is established.
Conclusion
Apple’s smart glasses are shaping up to be a thoughtful, practical first step into AI eyewear. Gesture controls, two cameras, and Siri integration — all without a display — suggest Apple is prioritizing wearability and real-world utility over spectacle. Whether this cautious approach pays off depends heavily on how compelling the new Siri turns out to be when iOS 27 arrives.