Apple CEO Tim Cook has been dropping hints for months about the company’s next major product category, and the picture is now clear enough to map out. In interviews around Apple’s 50th anniversary celebrations this month, Cook reiterated that the iPhone remains central to Apple’s vision while also signaling where the company is heading next: AI-powered wearables built around Visual Intelligence.
The category includes three products currently in development — Apple Glass, a wearable AI pendant, and a new version of AirPods with built-in cameras. Here is a rundown of what Apple is building, when each product is expected, and why Cook believes Visual Intelligence ties it all together.
Tim Cook’s Signal — How He Foreshadows Products
Cook’s approach to hinting at new product categories follows a recognizable pattern. Before the Apple Watch launched, he repeatedly emphasized the importance of health data. Before Apple Vision Pro, he spent years emphasizing the transformative potential of augmented reality. Now, for months, he has been centering his public comments around Visual Intelligence — Apple’s camera-based AI feature that analyzes your surroundings and acts on what it sees.
Speaking at events around Apple’s 50th anniversary celebrations this week, Cook said: “There’s so much left that we can do with the iPhone. I think it’s going to continue to be the center of people’s digital lives.” He then described a future where Apple Intelligence “integrates into other personal devices” — explicitly referencing augmented reality glasses and AI pendants as the next frontier.
Bloomberg’s Mark Gurman, who has closely tracked Cook’s language shifts, concluded: “Apple CEO Tim Cook is signaling that Visual Intelligence will be the defining feature of the company’s push into wearable AI devices.”
What Is Visual Intelligence?
Visual Intelligence launched on the iPhone 16 Pro in 2024 and is now available on the full iPhone 17 lineup. It uses your camera to analyze whatever you point it at — a restaurant menu, a plant, a landmark, a product label — and surfaces useful information about it powered by Apple Intelligence, Google, and ChatGPT.
The problem with Visual Intelligence on the iPhone is the same problem every camera-based AI feature faces: you have to hold up your phone and consciously point it at something. It requires deliberate action, which limits how naturally it integrates into daily life.
Apple’s solution is to move the camera off the phone and onto something you are already wearing.
The Three AI Wearables Apple Is Building
Apple Glass — Smart Glasses for 2026 or 2027
Apple Glass is the most developed of the three wearable projects and is expected to compete directly with Meta’s Ray-Ban smart glasses. The device will feature built-in speakers, microphones, and cameras embedded into a standard glasses frame — no display, no heads-up projection, no AR overlay. The first generation is purely a camera-and-audio device, similar in concept to the Ray-Bans.
The cameras will feed Visual Intelligence continuously, allowing Siri to see what you see and respond based on your environment. The navigation use case Cook has hinted at: instead of hearing “turn in 50 feet,” you would hear “turn left at the signpost” — because Apple Glass can see the signpost. Other early use cases include identifying ingredients on a plate, scanning items for product information, reading foreign-language text aloud, and triggering reminders based on objects you walk past.
Apple Glass is expected in late 2026, with production potentially beginning in December 2026. A second generation with a true augmented reality overlay — floating information visible in your field of view — is expected in 2028 or later.
Camera-Equipped AirPods Ultra — Coming This Year
AirPods with cameras are the closest of the three products to shipping. Multiple reports, including from Bloomberg's Gurman, indicate camera-equipped AirPods are planned for as early as 2026. These are likely the AirPods Ultra announced on March 17, with camera integration arriving either in a later hardware revision or in a second-generation AirPods Ultra in 2027.
The cameras in AirPods would not be for photography. They would give Apple Intelligence a continuous view of the world around you through the earbuds, enabling the same Visual Intelligence functionality as Apple Glass without requiring glasses.
The AI Pendant — An Always-On AI Worn Around Your Neck
The most unusual of the three projects is an AI pin or pendant — a wearable device equipped with a camera sensor that can be attached to clothing or worn on a lanyard. It is designed to provide continuous environmental awareness by feeding Visual Intelligence with a live view of your surroundings at all times.
The AI pin is the least developed of the three projects and could still be canceled. If it does ship, the earliest realistic timeline is 2027. Apple is watching the performance of Humane’s AI Pin — which launched in 2024 to poor reviews and was ultimately shut down — and Meta’s Ray-Ban glasses as reference points for what this category of product needs to do to find an audience.
Why Apple Thinks It Can Win This Category
Apple’s competitive advantage in AI wearables is the same one it holds in every hardware category: vertical integration. The A-series and S-series chips, the M-series for heavier processing, Private Cloud Compute for privacy-preserving cloud inference, and 2.5 billion active Apple devices already running Apple Intelligence — these are advantages no competitor can match in the near term.
The dependency on iPhone is also intentional. Apple Glass, camera AirPods, and the AI pendant all connect to and depend on iPhone to handle processing. That keeps them lightweight, affordable, and battery-efficient while anchoring them to Apple’s largest installed base. It also means anyone not in the Apple ecosystem cannot use these devices — a deliberate lock-in that mirrors the strategy Apple used with AirPods and Apple Watch.
The question is not whether Apple can build these products. It clearly can. The question is whether Visual Intelligence — and the new Siri that underpins it — is good enough, by the time Apple Glass ships, to make these devices feel magical rather than gimmicky. The answer to that question depends almost entirely on how well Siri’s overhaul lands in iOS 27 this September.