From Augmented Reality on smartphones to AI smart glasses: how the market is shifting towards Mixed Reality and why glasses like Meta Orion and Spectacles are closer than we think.
In the last ten years, when people talked about Augmented Reality (AR), they mostly thought of Instagram filters, face effects, games like Pokémon GO or small smartphone-based experiences. Today the landscape is radically different: AR is increasingly becoming a device category rather than just a feature.
At Metagate, where we develop Mixed Reality experiences on Meta Quest 3, we see this paradigm shift every day: attention is moving away from phone screens and towards smart glasses and Mixed Reality headsets, with a direct impact on both the product and the commercial funnel.
From the boom of AR filters to the shutdown of social platforms
Meta Spark and the end of AR filters on Instagram
For years, the dominant narrative was: “AR is within everyone’s reach because everyone has a smartphone.” Meta embodied this phase with Meta Spark, the platform for creating AR filters on Instagram and Facebook.
But in August 2024 Meta announces the shutdown of Meta Spark, with support for third-party filters ending in January 2025. This is a very strong signal: the company is shifting resources away from the “decorative AR” of social media and towards Mixed Reality platforms, devices and smart glasses.
8th Wall and the decline of general-purpose WebAR
On the WebAR front, the path of 8th Wall is similar. Acquired by Niantic in 2022 to strengthen the “real-world metaverse”, the platform has for years been the standard for browser-based AR experiences.
In 2025 Niantic announces that 8th Wall will enter a sunset phase: no new logins or new experiences from 2026, and a complete shutdown of hosting in 2027. Niantic continues to work on AR (and now also Mixed Reality) – and continues to collaborate with Meta – but the message is clear: AR is no longer just a layer on top of the web or social media; it is moving onto dedicated hardware.
Meta, Luxottica and the race towards smart glasses
Meta & EssilorLuxottica: from partnership to equity stake
Meta has chosen a very precise path: if the future of AR runs through glasses, you need to team up with those who know how to produce glasses at scale. Hence the partnership with EssilorLuxottica, a group that controls brands like Ray-Ban and Oakley.
In July 2025 Meta goes one step further: it acquires a stake of around 3% in EssilorLuxottica, valued at an estimated 3–3.5 billion euros, with the option of rising to 5% over time. This move strengthens the smart glasses partnership, turning it into a genuine strategic alliance on the future of smart eyewear.
In practice, Meta not only co-designs smart glasses, but also becomes a shareholder of the world’s main player in the eyewear sector.
From Ray-Ban Stories to Ray-Ban Meta and Meta Ray-Ban Display
The journey of Meta’s smart glasses starts with Ray-Ban Stories in 2021 and evolves with Ray-Ban Meta, presented on 27 September 2023:
- new 12MP cameras,
- improved audio,
- live streaming to Facebook and Instagram,
- integration with Meta AI.
The first two generations of Ray-Ban Meta do not have a display in the lenses: they are AI-first glasses, with camera + audio + assistant, but without visual overlay.
In 2025, however, Meta introduces Meta Ray-Ban Display, its first AI smart glasses with an integrated display in the lenses:
- a micro AR display for notifications, directions and contextual information,
- deep integration with Meta AI,
- control via a neural wristband for advanced gestures and input.
With Ray-Ban Meta and Meta Ray-Ban Display, Meta thus covers two steps of the evolutionary curve: first AI glasses without a display, then smart glasses with an AR display, still close in look and feel to a traditional pair of Ray-Bans.
Quest 3 today, no Quest 4 (for now)
Meanwhile, on the Mixed Reality side, Meta continues to bet heavily on Meta Quest 3 and its variants (such as Quest 3S), while Meta Quest 4 has not yet been launched and plans for new headsets have been pushed further into the future.
This is a key point: Meta remains the leader in headset-based Mixed Reality, but in its latest announcements the public focus has clearly shifted towards AI smart glasses.
Meta Orion: the “full” AR glasses of the future
If Meta Ray-Ban Display is the bridge between present and future, the Meta Orion project represents Meta’s long-term vision for full AR glasses.
Presented as a prototype at Meta Connect 2024, Orion is not yet a consumer product, but a system consisting of:
- glasses with AR waveguides to project content into the field of view,
- an external pocket computer providing computing power,
- an sEMG wristband to control the interface with micro hand movements.
Here the difference is clear: Orion does not just augment what we can do with the phone, it aims to become a new layer of personal computing.
Apple, Google, Samsung: the path from headset to glasses
Apple Vision Pro: extreme power, but still for the living room
Apple chose to enter mixed reality at the very top end of the range with Apple Vision Pro, a powerful but expensive and relatively bulky “spatial computer”.
In 2025 Apple updates Vision Pro mainly in terms of chip and performance, without revolutionising its design. It is an important step forward in performance, but it does not change the nature of the product: Vision Pro remains a device for home or professional use, rather than something you wear all day long.
In the meantime, supply-chain reports and rumours strongly point to lighter AR glasses, probably tethered to the iPhone, with a time horizon around 2027. So Apple, too, is clearly aiming at glasses.
Google + Samsung: Android XR and headsets for developers
On the Android side, the picture is taking shape as follows:
- Google announces Android XR, a platform designed for XR headsets and glasses, developed in collaboration with Samsung and Qualcomm.
- Samsung introduces a first headset (later evolved into Galaxy XR), clearly targeted at developers and early adopters rather than the mainstream public.
- Google, meanwhile, is working on its own AI smart glasses, expected to arrive around 2026, in versions both with and without a display.
Again, the pattern is clear: first you build the XR ecosystem with a headset, then you shift the focus to lighter, more wearable AR glasses.
Over the Reality and the transition from mobile builder to mapping platform
In the world of smartphone-based AR, one of the few builders left with a global vision is Over the Reality (OVER).
OVER was born as a geolocated AR metaverse, with a world map divided into “OVRLands” and tools (Web Builder, Unity SDK) for creating experiences positioned in physical space.
In recent years, however, the strategy has increasingly shifted towards:
- VPS (Visual Positioning System),
- 3D mapping,
- spatial data for robotics and autonomous navigation.
Here too we see the same trend: AR is no longer just an “effect” for the end user, it is becoming mapping and data infrastructure that enables robots, AI and, in the future, AR glasses.
Snapchat Spectacles: the AR glasses laboratory
Snap was one of the first companies to truly believe in AR glasses:
- in 2021 it introduced the first Spectacles with see-through AR displays, designed for a limited number of creators;
- in 2024 came the fifth-generation Spectacles, with wider FOV, hand tracking and a dedicated OS (Snap OS);
- in 2026 Snap aims to launch the first consumer Specs, lighter and more affordable AR glasses, with the goal of entering the mass market.
Today, Spectacles are a laboratory: Snap tests form factors, interactions and AR languages that could become large-scale standards tomorrow.
Ray-Ban Meta, Spectacles, Orion: key differences
To understand where the Mixed Reality and smart glasses market is really heading, it is useful to compare three archetypes:
Ray-Ban Meta & Meta Ray-Ban Display (Meta + EssilorLuxottica)
- Ray-Ban Meta (first generations): no display in the lenses, but camera, open-ear audio and Meta AI.
- Meta Ray-Ban Display: they add an integrated AR display to show notifications, directions, text and contextual information.
- Interaction: voice (“Hey Meta”), touch on the temples, camera, AI and – in the most advanced models – neural wristband.
- Target: mainstream consumers, creators, everyday use.
Snap Spectacles (Snap)
- See-through AR display with waveguides, a real holographic overlay.
- Interaction: hand tracking, voice control, touch.
- Current target: advanced developers and creators, transitioning towards consumers with the Specs.
Meta Orion (Meta)
- AR glasses with waveguides, a wide field of view and complex overlays.
- Architecture with pocket computer + sEMG wristband for neural/muscular input.
- Vision: to replace a significant part of what we do on smartphones today, turning AR into the new operating system of the physical world.
The Metagate case: mobile AR vs headset Mixed Reality in the commercial funnel
Why smartphone AR has not (really) worked in the funnel
Our real-world experience at Metagate, built on actual client projects, has shown us an uncomfortable truth that goes beyond the narrative:
“Everyone has a smartphone so everyone can use AR” is true in theory, but it doesn’t work in the real funnel.
The patterns we have observed are consistent:
- Very low real usage rate: even if technically everyone could, almost no one actually opens the AR experience from their phone.
- Too much friction: scan the QR code, open the link, grant permissions, load the scene… every step is a potential drop-off point.
- Limited wow effect: once opened, the experience is perceived as “one more piece of content” inside the phone, not as something transformative.
- “Toy” perception on the business side: especially in enterprise and retail contexts, mobile AR is seen as a gimmick rather than a strategic investment.
The result is a funnel that stalls early: high theoretical potential, but low real conversion.
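The arithmetic behind this stall is simple: every friction step multiplies away part of the audience. A minimal sketch, using invented per-step survival rates (not measured Metagate data), shows how quickly even a large reachable audience collapses:

```python
# Hypothetical illustration of funnel drop-off in mobile AR.
# The per-step rates below are invented for the sketch, not real metrics.

from functools import reduce

mobile_ar_steps = {
    "sees the QR code": 1.00,          # share of reached users
    "scans the QR code": 0.20,         # each value = fraction surviving this step
    "opens the link": 0.80,
    "grants camera permission": 0.60,
    "waits for the scene to load": 0.70,
}

def funnel_conversion(steps: dict[str, float]) -> float:
    """End-to-end conversion is the product of per-step survival rates."""
    return reduce(lambda acc, rate: acc * rate, steps.values(), 1.0)

print(f"End-to-end conversion: {funnel_conversion(mobile_ar_steps):.1%}")
# → End-to-end conversion: 6.7%
```

Even with generous assumed rates at each step, fewer than one user in ten who could open the experience actually reaches it, which is why "everyone has a smartphone" does not translate into real usage.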
How the funnel flips with headset Mixed Reality
With Mixed Reality on a headset (e.g. Meta Quest 3), the parameters are reversed:
- Fewer potential users (not everyone owns an MR headset), but those who try the experience encounter an extremely strong wow effect.
- The experience is no longer “a piece of content on the screen”, but a three-dimensional environment that enters the user’s physical space, with objects, interfaces and content that “are really there”.
- In client meetings, the MR demo becomes a narrative accelerator: in a few seconds the decision-maker understands what a process, product or space could become when it is “augmented” by digital content.
In funnel terms:
- Mobile AR: accessible to everyone, but rarely memorable → the funnel tends to break.
- Headset Mixed Reality: fewer users, but an impact strong enough to justify an investment → the funnel continues, and conversations turn into projects.
This is one of the reasons why, at Metagate, we are betting so heavily on Mixed Reality experiences on Meta Quest: they are not only spectacular, they have a concrete impact on engagement, decisions and conversions.
Are we close to AR glasses “for everyone”?
Putting together all the market signals, the picture is clear:
- “surface-level” AR platforms (filters, general-purpose WebAR) are shutting down or repositioning,
- the big players (Meta, Apple, Google, Samsung, Snap) are converging on smart glasses as the end goal,
- the infrastructure (3D maps, VPS, spatial data, multimodal AI) is being prepared to support wearable, always-on computing.
Projects like Meta Orion and Snapchat’s Spectacles show what tomorrow’s AR will look like; products like Ray-Ban Meta and Meta Ray-Ban Display are the bridge we are crossing today.
For those who develop Mixed Reality experiences like Metagate, the message is simple:
We are moving from AR as an “effect on the feed” to AR as the “operating system of the physical world”.
Today’s MR headsets – like Meta Quest 3 – are the lab where we learn the spatial language. Tomorrow’s AR glasses – Orion, Spectacles, the glasses from Apple and Google – will be the place where this language becomes everyday life.