Meta Connect 2025: Full Breakdown of Meta’s Vision for AI, XR, and the Future of Wearables

Meta Connect 2025 took place on September 17–18 at Meta’s Menlo Park campus, with a mix of in-person sessions and immersive virtual streams. The event welcomed developers, tech media, and select partners, but skipped broader creator invites this year. For those tuning in remotely, the event was livestreamed in 2D via Meta’s developer site and broadcast in 3D within Horizon Worlds, an immersive viewing option exclusive to Meta’s Quest platform.

The hybrid setup wasn’t just about reach; it offered a real-world test of Meta’s evolving infrastructure. The shift to a more tech-focused in-person audience also hinted at where Meta sees traction: developers, researchers, and long-term platform contributors.

Featured Image: Meta Connect

Keynotes and Agenda Highlights

CEO Mark Zuckerberg opened the conference with a keynote that framed the company’s direction: not just toward XR, but toward wearable AI. From the outset, Meta positioned its smart glasses lineup as the new computing frontier, a space where voice assistants, real-time feedback, and connected services blend with daily wear.

The developer keynote the following day expanded on these themes. Sessions throughout Connect explored tools like Horizon Studio, updates to the Spatial SDK, and deep integrations between XR content and Meta’s AI stack. A fireside chat featuring Chief Scientist Michael Abrash and VP Richard Newcombe underscored Meta’s vision for AI-powered interfaces and contextual computing.

Wearables Take the Spotlight

Ray-Ban Meta Gen 2

Meta’s second-generation Ray-Ban smart glasses bring meaningful upgrades. Battery life is now nearly double that of the original, offering around 8 hours of mixed use. The camera captures 3K video and high-resolution stills with noticeable improvements in sharpness and colour fidelity.

Featured Image: Ray-Ban Meta Gen 2 / Image credits: Meta

Unlike many smart glasses still stuck in prototype territory, the Ray-Ban Gen 2 aims for daily wear. With new frame styles and colourways, Meta is clearly targeting the intersection of utility and style. Livestreaming and music playback run through discreet open-ear speakers, while voice capture relies on an upgraded microphone array.

Oakley Meta Vanguard

Developed in partnership with Oakley, the Vanguard model targets athletes and outdoor users. With a wraparound build, water resistance, and a high-FOV camera, it's built for motion.

Featured Image: Oakley Meta Vanguard / Image credits: Meta

What sets the Vanguard apart is its integration with Garmin and Strava. Real-time stats, audio coaching, and auto-recording triggers let the glasses function as a fitness dashboard. Whether sprinting or cycling, wearers can stay hands-free while still collecting and reviewing key performance metrics.
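Neither Meta nor Garmin has published a developer API for this pairing, but the flow is easy to picture. The TypeScript sketch below is purely illustrative, with every name (`FitnessSession`, `onMetrics`, the thresholds) invented: live metrics arrive from the paired device, and simple rules drive audio cues and auto-recording.

```typescript
// Hypothetical sketch only: no such API has been published, and every
// name here is invented for illustration. It models the described flow:
// live metrics in, coaching cues and auto-recording triggers out.

interface Metrics {
  heartRateBpm: number;   // from the paired watch or sensor
  paceMinPerKm: number;   // current running pace
  elapsedSec: number;     // time since the activity started
}

class FitnessSession {
  private recording = false;

  constructor(
    private speak: (msg: string) => void,  // open-ear speaker output
    private startClip: () => void,         // camera auto-record trigger
  ) {}

  // Called for each metrics sample streamed from the paired device.
  onMetrics(m: Metrics): void {
    // A simple threshold rule standing in for "audio coaching".
    if (m.heartRateBpm > 175) {
      this.speak("Heart rate high, ease off the pace.");
    }
    // Auto-record a clip when a sprint effort begins.
    if (m.paceMinPerKm < 3.5 && !this.recording) {
      this.recording = true;
      this.startClip();
    }
  }
}

// Example wiring with console stand-ins for the real hardware hooks.
const session = new FitnessSession(
  (msg) => console.log(`[audio] ${msg}`),
  () => console.log("[camera] recording started"),
);
session.onMetrics({ heartRateBpm: 182, paceMinPerKm: 3.2, elapsedSec: 640 });
```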

Ray-Ban Meta Display + Neural Band

If there was one showstopper, it was the Meta Display glasses bundled with the Neural Band. The right lens hosts a microdisplay that stays invisible until needed and can show notifications, text, or short videos. This isn't full AR in the traditional sense, but it's a leap toward accessible heads-up information.

Featured Image: Ray-Ban Meta Display + Neural Band / Image credits: Meta

Paired with the Neural Band, a wrist-based controller that uses electromyography (EMG) to interpret the muscle signals behind hand movements, the setup lets subtle gestures serve as controls. This silent, low-latency interface points toward how Meta envisions frictionless interaction in public or on the move.
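Meta hasn't released a developer API for the Neural Band, so the following TypeScript sketch is only a mental model, with every type and function name invented: an EMG decoder emits discrete gesture events, and the application maps them onto display actions.

```typescript
// Hypothetical sketch: the Neural Band has no public API, and all names
// here are invented. The idea: the EMG decoder emits discrete gesture
// events, and the app maps each one to a heads-up display action.

type Gesture = "pinch" | "double_pinch" | "swipe_left" | "swipe_right";

// A stand-in for whatever event source the real device would expose.
interface GestureSource {
  onGesture(callback: (g: Gesture) => void): void;
}

function bindGestures(
  source: GestureSource,
  actions: Record<Gesture, () => void>,
): void {
  source.onGesture((g) => actions[g]()); // dispatch each decoded gesture
}

// Example wiring for the notification UI described above.
const fakeBand: GestureSource = {
  onGesture(cb) {
    cb("double_pinch"); // simulate one decoded gesture for the demo
  },
};

bindGestures(fakeBand, {
  pinch: () => console.log("select item"),
  double_pinch: () => console.log("dismiss notification"),
  swipe_left: () => console.log("previous card"),
  swipe_right: () => console.log("next card"),
});
```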

Priced at $799, this model sets the tone for high-end wearable computing: an interface that stays invisible until called upon.

Beyond Hardware: Meta’s Strategic Focus on AI

Connect 2025 wasn’t just about form factors. The software running inside these devices received equal attention. Across all glasses models, Meta AI plays a central role, acting as a context-aware assistant for capturing, organizing, and responding.

Zuckerberg referenced this as “personal superintelligence,” a system that can remember events, summarize key moments, and eventually provide in-the-moment coaching or reminders. While today’s glasses offer between one and two hours of continuous AI activity, Meta’s aim is all-day support, pending improvements in battery and thermal performance.

Instead of siloed assistants on phones or speakers, Meta is pushing for ambient, wearable AI that learns and adapts in real time.

Developer Tools and Ecosystem Updates

Horizon Engine and Horizon Studio

To support the next wave of content creators, Meta introduced Horizon Engine, a rebuilt platform for rendering, interaction, and scale. Worlds created in Horizon can now host over 100 users in a single instance, five times the previous capacity. Performance also improves, with faster load times and upgrades to lighting and physics.

Horizon Studio, the new creation suite, uses generative AI to accelerate content workflows. Creators can prompt the system for terrain, buildings, and even non-player characters, all generated through natural language. These features aim to lower the barrier to entry for immersive content creation, especially for those without 3D modelling backgrounds.
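Horizon's scripting is already TypeScript-based, though these generative features live in the Studio UI rather than a documented API. As a hypothetical illustration of the prompt-to-asset workflow, here is a sketch in which `generateAsset` and its option names are invented:

```typescript
// Hypothetical sketch: Horizon Studio exposes its generative features
// through the creation UI, not a documented public API, so `generateAsset`
// and its option names are invented for illustration.

interface GenerateOptions {
  kind: "terrain" | "building" | "npc";
  prompt: string;  // natural-language description of the asset
  seed?: number;   // optional, for reproducible variations
}

async function generateAsset(opts: GenerateOptions): Promise<string> {
  // Placeholder: a real implementation would call the generation backend
  // and return a reference to the created asset.
  return `asset:${opts.kind}:${opts.prompt.toLowerCase().replace(/\s+/g, "-")}`;
}

// Example: blocking out a small scene from three natural-language prompts.
async function blockOutScene(): Promise<void> {
  const terrain = await generateAsset({ kind: "terrain", prompt: "rocky coastline at dusk" });
  const tower = await generateAsset({ kind: "building", prompt: "weathered stone lighthouse" });
  const keeper = await generateAsset({ kind: "npc", prompt: "friendly lighthouse keeper" });
  console.log(terrain, tower, keeper);
}

blockOutScene();
```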

Hyperscape Capture

Meta also debuted Hyperscape Capture, a scanning tool that converts real-world spaces into photorealistic VR environments. Early access participants can use their Quest headsets to scan physical spaces and upload them as interactive backdrops.

This move opens the door for digital twins in education, real estate, and simulation training, making VR more grounded in familiar spaces.

Meta Developer Toolkit and Spatial SDKs

Developers received a new toolkit that consolidates Meta's various SDKs into a single suite. From VR locomotion to AR overlays, the tools now offer more control and tighter integration with Horizon OS.

Sessions at Connect showed how Android developers can port apps into the Quest environment, turning mobile experiences into persistent spatial windows. This approach widens the potential use cases for Meta’s hardware beyond games and media.

The Entertainment and Content Angle

Horizon TV and Streaming Deals

Meta’s push into entertainment received a boost with the announcement of Horizon TV, a new VR-native media hub that will support streaming from Disney+, Hulu, and ESPN.

Featured Image: Horizon TV / Image credits: Meta

In partnership with Disney and Universal, Meta is also bringing immersive versions of films into Horizon, including stereoscopic previews and spatial audio adaptations. One highlight was a 3D scene from Avatar: Fire and Ash, previewed by James Cameron in person.

These deals elevate the Quest platform from a gaming device to an immersive media console, setting the stage for longer daily engagement.

Industry and Media Reactions

Reactions to Connect 2025 were largely positive. The smart glasses lineup was seen as a meaningful leap toward wearable computing, moving beyond concept demos and into polished consumer products.

Analysts pointed to Meta’s positioning as unique: no other company currently offers such a cohesive blend of AI assistant, camera capture, and audio in a wearable form. Developer sentiment around Horizon Engine and Horizon Studio was also strong, particularly for the ease of AI-assisted creation.

Preparing for the Future

Meta didn’t spell out its entire roadmap, but the Connect agenda made one thing clear: smart glasses are now a core part of its strategy. The integration of display technology, AI models, and neural input indicates a shift toward interfaces that don’t require a dedicated device such as a phone or desktop.

This approach mirrors how mobile computing displaced the desktop: not through speed alone, but through accessibility and context. Meta is betting that AI-powered eyewear, supported by new development tools and content deals, will unlock a similar shift.

Whether this direction resonates beyond early adopters will depend on continued improvements in comfort, battery life, and privacy. But the technical foundation is now in place.

The groundwork has been laid for a new class of experiences. From creators using AI to generate worlds in minutes, to everyday users getting reminders whispered through their glasses, the lines between digital and physical are beginning to blur.