VR/AR

Google Announces Android XR Operating System Alongside Samsung MR Headset


Google today announced Android XR, a new core branch of Android designed as a spatial operating system for XR headsets and glasses. The company is pitching it as a comprehensive spatial computing platform and hopes to stake out its own territory in the XR landscape against incumbents Meta and Apple.

Android XR is essentially what the name implies: a full-blown version of Android adapted to run on XR headsets. It supports the entire existing library of flat Android apps and opens the door to “spatialized” versions of those apps, as well as fully immersive VR content.

Samsung’s newly announced headset, codenamed Project Moohan, will be the first MR headset to launch with Android XR next year. Check out our early hands-on with the headset.

Samsung Project Moohan | Image courtesy Google

Google tells us that Android apps currently on the Play Store will be available on immersive Android XR headsets by default, with developers able to opt out if they choose. That means a huge library of existing flat apps will be available on day one, giving the headset a baseline level of productivity.

That includes all of Google’s major first-party apps like Chrome, Gmail, Calendar, Drive, and more. Some of Google’s apps have also been updated to take unique advantage of Android XR (or, as Google puts it, they have been “spatialized”).

Google TV, for instance, can be watched on a large, curved screen, with info panels popping out of the main window for better use of screen real estate.

Google Photos has been redesigned with a layout that’s unique to Android XR, and the app can automatically convert photos and videos to 3D (with pretty impressive results).

YouTube not only supports a large curved screen for viewing, but also the platform’s existing library of 360, 180, and 3D content.

Chrome supports multiple browser windows for multitasking while browsing, which pairs nicely with Android XR’s built-in support for Bluetooth mice and keyboards.

And Google Maps has a fully immersive view that’s very similar to Google Earth VR, including the ability to view Street View photography and newly added volumetric captures of business interiors and other places (based on Gaussian splats).

Functionally, this is all pretty similar to what Apple is doing with visionOS, but Android-flavored.

Where Android XR significantly differentiates itself is in its AI integration. Gemini is built right into Android XR, and it goes far beyond a chat agent: it’s a conversational agent that lets you have free-form voice conversations about what you see in both the real world and the virtual world. That means you can ask it for help with an app floating in front of you, or ask about things you see around you via passthrough.

Apple has Siri on visionOS, but it can’t see anything in or out of the headset. Meta has an experimental AI on Horizon OS that can see things in the real world around you, but not things in the virtual world. Gemini’s ability to consider both real and virtual content makes it feel more seamlessly integrated into the system, and more useful.

Android XR is designed to power not only immersive MR headsets, but smartglasses too. In the near term, Google envisions Android XR smartglasses as HUD-like companions to your smartphone, rather than full AR devices.

Prototype Android XR smartglasses | Image courtesy Google

And it’s Gemini that forms the core of Google’s plans for Android XR on smartglasses. The near-term devices for this use case are compact glasses that can actually pass for regular-looking glasses, offering small displays floating in your field of view for HUD-like information, as well as audio feedback for conversations with Gemini. Google is showing uses like displaying texts, directions, and translations. As with Android XR on an MR headset, these smartglasses are almost certain to be equipped with cameras, giving Gemini the ability to see and respond to the things around you.

It’s a lot like what Google Glass was doing a decade ago, but sleeker and much smarter.

While no specific smartglasses products have been announced for Android XR yet, Google and Samsung have been collaborating on an MR headset called “Project Moohan,” which Samsung will launch to consumers next year.

When it comes to development, Google is supporting a wide range of pathways. For developers building with Android Studio, a new Jetpack XR SDK extends that workflow to help create spatial versions of existing flat apps, and a new Android XR Emulator allows testing Android XR apps without a headset. Unity is also supported through a new Android XR Extension, alongside WebXR and OpenXR.
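To give a sense of what the “spatialize your existing flat app” path looks like, here’s a minimal sketch based on the Jetpack XR SDK’s developer-preview Compose for XR API. The composable and modifier names used here (Subspace, SpatialPanel, SubspaceModifier) come from that preview and may change before the SDK ships, so treat this as illustrative rather than definitive.

```kotlin
// Minimal sketch (Jetpack XR developer preview): lifting an existing flat
// Compose UI onto a free-floating spatial panel. API names reflect the
// preview release and may change before the SDK ships.
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.unit.dp
import androidx.xr.compose.spatial.Subspace
import androidx.xr.compose.subspace.SpatialPanel
import androidx.xr.compose.subspace.layout.SubspaceModifier
import androidx.xr.compose.subspace.layout.height
import androidx.xr.compose.subspace.layout.movable
import androidx.xr.compose.subspace.layout.resizable
import androidx.xr.compose.subspace.layout.width

@Composable
fun SpatializedHome() {
    // Subspace opens a 3D region; SpatialPanel places the app's existing
    // 2D Compose content on a panel the user can move and resize.
    Subspace {
        SpatialPanel(
            modifier = SubspaceModifier
                .width(1280.dp)
                .height(800.dp)
                .movable()
                .resizable()
        ) {
            ExistingFlatAppContent()
        }
    }
}

@Composable
fun ExistingFlatAppContent() {
    // Stand-in for the app's current flat UI, which runs unchanged.
    Text("Hello from a spatial panel")
}
```

The appeal of this approach is that the same composable tree can keep rendering as a conventional 2D layout on phones and tablets, while on an Android XR device the content gets lifted into the user’s space; the new Android XR Emulator is meant to let developers iterate on layouts like this without a physical headset.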

Google also says it’s bringing new capabilities to OpenXR through vendor extensions, including the following:

  • AI-powered hand mesh, designed to adapt to the shape and size of hands to better represent the diversity of your users
  • Detailed depth textures that allow real-world objects to occlude virtual content
  • Sophisticated light estimation, for lighting your digital content to match real-world lighting conditions
  • New trackables that let you bring real-world objects like laptops, phones, keyboards, and mice into a virtual environment

On the design side, Google has updated its ‘Material Design’ to include new components and layouts that automatically adapt for spatial apps.

Developers interested in building for Android XR can reach out via this form to express interest in an Android XR Developer Bootcamp coming in 2025.


