Apple is moving closer to what could become its next major hardware category, with plans to launch its first smart glasses in 2027 as the company sharpens its focus on AI-powered wearables following the lukewarm reception of the Vision Pro.
According to Bloomberg’s Mark Gurman, Apple is actively testing multiple frame designs and could unveil the product as early as the end of this year, ahead of a commercial release in 2027.
The product marks a pivot for the iPhone maker after years of pursuing an ambitious mixed-reality roadmap built around headsets and eventual augmented reality eyewear. The Cupertino giant now appears to be prioritizing a lighter, more practical wearable that can be worn all day, a direct response to growing consumer interest in AI-first devices and the early traction seen by Meta’s Ray-Ban smart glasses.
The latest design testing reportedly includes four frame styles: a large rectangular frame, a slimmer rectangular version similar to the glasses worn by CEO Tim Cook, a larger oval or circular frame, and a smaller oval or circular design. Apple is also evaluating multiple finishes and colorways, including black, ocean blue, and light brown, suggesting the company is placing a strong emphasis on aesthetics and everyday wearability.
That is an important departure from the Vision Pro strategy. The Vision Pro was technologically ambitious but struggled to achieve mainstream appeal due to its high price point, bulkier form factor, and limited real-world use cases. By contrast, these glasses appear designed to fit into Apple’s more familiar playbook: enter an existing category late, refine the user experience, and make the device desirable as both technology and fashion.
In practical terms, the glasses are expected to resemble Meta Platforms’ Ray-Ban Meta smart glasses far more than the Vision Pro. Unlike a true augmented-reality device, the first-generation Apple glasses are not expected to feature displays embedded in the lenses. Instead, the product is likely to focus on camera, audio, and AI-driven contextual assistance.
According to reports, users will be able to take photos and videos, answer phone calls, play music, and interact with a long-awaited upgraded Siri. That positions the device less as a computing platform and more as an AI companion built around ambient intelligence.
However, the bigger story here is Apple’s evolving AI hardware strategy. The glasses are expected to rely heavily on a next-generation Siri that can understand what the wearer is seeing through onboard cameras and microphones. This means the product could support functions such as object recognition, landmark identification, contextual reminders, live translation, and navigation prompts delivered through audio.
In effect, Apple is trying to give Siri “eyes and ears.” This matters because it extends Apple Intelligence beyond the iPhone and Mac into always-on wearable computing. Rather than forcing users to open an app or take out a phone, the glasses would allow AI interactions to happen in real time and in context.
That is precisely the direction in which the broader industry is moving. Meta has already established an early lead in this segment. Its Ray-Ban Meta glasses have emerged as one of the few AI hardware products to find genuine consumer traction.
By comparison, Apple’s entry is likely to be more tightly integrated with the iPhone ecosystem, which could become its biggest competitive advantage. The glasses are expected to work closely with the iPhone for processing, connectivity, and user identity, allowing Apple to preserve battery life and keep the hardware slim.
That ecosystem integration may also help Apple avoid the pitfalls that hurt standalone AI hardware products such as the Humane AI Pin, which struggled because it attempted to replace the smartphone rather than complement it.
Reports over the past year suggest Apple has scaled back parts of its headset roadmap and redirected engineering resources toward smart glasses. That indicates the company sees AI wearables, not premium headsets, as the more immediate commercial opportunity.
The first version may be display-less, but it is likely to serve as a stepping stone toward Apple’s longer-term ambition of full augmented-reality glasses. The broader implication is that Apple is shifting from a vision of immersive computing to practical, AI-enhanced everyday wearables.
The product is expected to become one of the company’s most important launches since the Apple Watch, not because it replaces the iPhone, but because it deepens how users interact with Apple’s ecosystem throughout the day. In that sense, these glasses may be less about hardware innovation alone and more about Apple’s attempt to define the next interface for AI.



