08 Dec 2025

Android Developers Blog

#WeArePlay: How Miksapix Interactive is bringing ancient Sámi Mythology to gamers worldwide

Posted by Robbie McLachlan, Developer Marketing




In our latest #WeArePlay film, which celebrates the people behind apps and games on Google Play, we meet Mikkel - the founder and CEO of Miksapix Interactive. Mikkel is on a mission to share the rich stories and culture of the Sámi people through gaming. Discover how he is building a powerful platform for cultural preservation using a superheroine.


You went from a career in broadcasting to becoming a founder in the games industry. What inspired that leap?

I've had an interest in games for a long time and always found the medium interesting. While I was working for a broadcast corporation in Karasjok, I was thinking, "Why aren't there any Sámi games or games with Sámi content?". Sámi culture is quite rich in lore and mythology. I wanted to bring that to a global stage. That's how Miksapix Interactive was born.



Your game, Raanaa - The Shaman Girl, is deeply rooted in Sámi culture. What is the significance of telling these specific stories?

Because these are our stories to tell! Our mission is to tell them to a global audience to create awareness about Sámi identity and culture. Most people in the world don't know about the Sámi and the Sámi cultures and lore. With our languages at risk, I hope to use storytelling as a way to inspire Sámi children to value their language, celebrate their identity, and take pride in their cultural heritage. Sámi mythology is rich with powerful matriarchs and goddesses, which inspired us to create a superheroine. Through her journey of self-discovery and empowerment, Raanaa finds her true strength - a story we hope will inspire hope and resilience in pre-teens and teens around the world. Through games like Raanaa - The Shaman Girl, we get to convey our stories in new formats.

How did growing up with rich storytelling affect your games?

I was raised in a reindeer herding family, which means we spent a lot of time in nature and out in the fields with the reindeer. Storytelling was a big part of the family. We would eat supper in the Lavvu tent, sitting around a bonfire with relatives and parents telling stories. With Miksapix Interactive, I am taking my love for storytelling and bringing it to the world, using my first-hand experience of Sámi culture.


How has Google Play helped you achieve global reach from your base in the Arctic?

For us, Google Play was a no-brainer. Releasing on Google Play was the easiest part, no hassle. We have more downloads from Google Play than anywhere else, and it has definitely helped us reach markets like Brazil, India, the US and beyond. The positive Play Store reviews motivated and inspired us during the development of Raanaa. We use Google products like Google Sheets for collaboration when we do a localization or translation.

What is next for Miksapix Interactive?

Now, our sights are set on growth. We are very focused on the Raanaa IP. For the mobile game, we are looking into localizing it to different Sámi languages. In Norway, we have six Sámi languages, so we are now going to translate it to Lule Sámi and Southern Sámi. We're planning to have these new Sámi languages available this year.

Discover other inspiring app and game founders featured in #WeArePlay.

08 Dec 2025 10:00pm GMT

Start building for glasses, new devices for Android XR and more in The Android Show | XR Edition

Posted by Matthew McCullough - VP of Product Management, Android Developer



Today, during The Android Show | XR Edition, we shared a look at the expanding Android XR platform, which is fundamentally evolving to bring a unified developer experience to the entire XR ecosystem. The latest announcements, from Developer Preview 3 to exciting new form factors, are designed to give you the tools and platform you need to create the next generation of XR experiences. Let's dive into the details!

A spectrum of new devices ready for your apps

The Android XR platform is quickly expanding, providing more users and more opportunities for your apps. This growth is anchored by several new form factors that expand the possibilities for XR experiences.


A major focus is on lightweight, all-day wearables. At I/O, we announced we are working with Samsung and our partners Gentle Monster and Warby Parker to design stylish, lightweight AI glasses and Display AI glasses that you can wear comfortably all day. The integration of Gemini on glasses is set to unlock helpful, intelligent experiences like live translation and searching what you see.

And partners like Uber are already exploring how AI Glasses can streamline the rider experience by providing simple, contextual directions and trip status right in the user's view.


The ecosystem is simultaneously broadening its scope to include wired XR glasses, exemplified by Project Aura from XREAL. This device blends the immersive experiences typically found in headsets with portability and real-world presence. Project Aura is scheduled for launch next year.

New tools unlock development for all form factors

If you are developing for Android, you are already developing for Android XR. The release of Android XR SDK Developer Preview 3 brings increased stability for headset APIs and, most significantly, opens up development for AI Glasses.


You can now build augmented experiences for AI glasses using new libraries like Jetpack Compose Glimmer, a UI toolkit for transparent displays, and Jetpack Projected, which lets you extend your Android mobile app directly to glasses. Furthermore, the SDK now includes powerful ARCore for Jetpack XR updates, such as Geospatial capabilities for wayfinding.



For immersive experiences on headsets and wired XR glasses like Project Aura from XREAL, this release also provides new APIs for detecting a device's field-of-view, helping your adaptive apps adjust their UI.
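As an illustration of how an adaptive app might react to that field-of-view signal, here is a minimal Kotlin sketch. The tier names, thresholds, and the `layoutTierFor` function are assumptions made for this example, not part of the Android XR SDK:

```kotlin
// Illustrative sketch: choosing a UI density tier from a device's
// horizontal field of view. The thresholds below are made up for
// demonstration; real devices report their own FoV values.

enum class LayoutTier { COMPACT, MEDIUM, EXPANDED }

// Wired XR glasses typically report a narrower FoV than headsets,
// so adaptive apps can fall back to denser, more centered layouts.
fun layoutTierFor(horizontalFovDegrees: Float): LayoutTier = when {
    horizontalFovDegrees < 50f -> LayoutTier.COMPACT   // e.g. wired XR glasses
    horizontalFovDegrees < 90f -> LayoutTier.MEDIUM
    else -> LayoutTier.EXPANDED                        // e.g. headsets
}
```

An app could then branch its layout on the returned tier instead of hard-coding for a single device class.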

Check out our post on the Android XR Developer Preview 3 to learn more about all the latest updates.

Expanding your reach with new engine ecosystems

The Android XR platform is built on the OpenXR standard, enabling integration with the tools you already use so you can build with your preferred engine.

Developers can use Unreal Engine's native Android and OpenXR capabilities today to build for Android XR, leveraging the existing VR Template for immersive experiences. To provide additional, optimized extensions for the Android XR platform, a Google vendor plugin, including support for hand tracking, hand mesh, and more, will be released early next year.

Godot now includes Android XR support, leveraging its focus on OpenXR to enable development for devices like Samsung Galaxy XR. The new Godot OpenXR vendor plugin v4.2.2 stable allows developers to port their existing projects to the platform.



Watch The Android Show | XR Edition

Thank you for tuning into The Android Show | XR Edition. Start building differentiated experiences today using the Developer Preview 3 SDK and test your apps with the XR Emulator in Android Studio. Your feedback is crucial as we continue to build this platform together. Head over to developer.android.com/xr to learn more and share your feedback.


08 Dec 2025 6:00pm GMT

Build for AI Glasses with the Android XR SDK Developer Preview 3 and unlock new features for immersive experiences

Posted by Matthew McCullough - VP of Product Management, Android Developer

In October, Samsung launched Galaxy XR - the first device powered by Android XR. And it's been amazing seeing what some of you have been building! Here's what some of our developers have been saying about their journey into Android XR.

Android XR gave us a whole new world to build our app within. Teams should ask themselves: What is the biggest, boldest version of your experience that you could possibly build? This is your opportunity to finally put into action what you've always wanted to do, because now, you have the platform that can make it real.

You've also seen us share a first look at other upcoming devices that work with Android XR like Project Aura from XREAL and stylish glasses from Gentle Monster and Warby Parker.

To support the expanding selection of XR devices, we are announcing Android XR SDK Developer Preview 3!




With Android XR SDK Developer Preview 3, on top of building immersive experiences for devices such as Galaxy XR, you can also now build augmented experiences for upcoming AI Glasses with Android XR.

New tools and libraries for augmented experiences

With Developer Preview 3, we are unlocking the tools and libraries you need to build intelligent and hands-free augmented experiences for AI Glasses. AI Glasses are lightweight and portable for all-day wear. You can extend your existing mobile app to take advantage of the built-in speakers, camera, and microphone to provide new, thoughtful and helpful user interactions. With the addition of a small display on display AI Glasses, you can privately present information to users. AI Glasses are perfect for experiences that help enhance a user's focus and presence in the real world.


To power augmented experiences on AI Glasses, we are introducing two new, purpose-built libraries to the Jetpack XR SDK:


Jetpack Compose Glimmer is a UI toolkit that embodies design best practices for beautiful, optical see-through augmented experiences. With UI components optimized for the input modality and styling requirements of display AI Glasses, Jetpack Compose Glimmer is designed for clarity, legibility, and minimal distraction.

To help visualize and test your Jetpack Compose Glimmer UI, we are introducing the AI Glasses emulator in Android Studio. The new AI Glasses emulator can simulate glasses-specific interactions such as touchpad and voice input.



Beyond the new Jetpack Projected library, which lets you extend your Android mobile app directly to glasses, and Jetpack Compose Glimmer, we are also expanding ARCore for Jetpack XR to support AI Glasses. We are starting off with motion tracking and geospatial capabilities for augmented experiences - the exact features that enable you to create helpful navigation experiences perfect for all-day-wear devices like AI Glasses.
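The navigation scenario above ultimately rests on well-known geospatial math. Here is a hedged Kotlin sketch of the kind of logic an app could layer on top of those geospatial capabilities; `computeBearing` is the standard great-circle initial-bearing formula and `turnCue` is an illustrative helper, neither is an Android XR API:

```kotlin
import kotlin.math.abs
import kotlin.math.atan2
import kotlin.math.cos
import kotlin.math.sin

// Standard initial great-circle bearing from point 1 to point 2,
// in degrees clockwise from true north.
fun computeBearing(lat1: Double, lon1: Double, lat2: Double, lon2: Double): Double {
    val phi1 = Math.toRadians(lat1)
    val phi2 = Math.toRadians(lat2)
    val dLon = Math.toRadians(lon2 - lon1)
    val y = sin(dLon) * cos(phi2)
    val x = cos(phi1) * sin(phi2) - sin(phi1) * cos(phi2) * cos(dLon)
    return (Math.toDegrees(atan2(y, x)) + 360.0) % 360.0
}

// Turn the difference between the user's heading and the target
// bearing into a simple, glanceable cue suitable for a small display.
fun turnCue(userHeadingDegrees: Double, bearingDegrees: Double): String {
    val delta = ((bearingDegrees - userHeadingDegrees + 540.0) % 360.0) - 180.0
    return when {
        abs(delta) < 15.0 -> "straight ahead"
        delta > 0 -> "turn right"
        else -> "turn left"
    }
}
```

In practice the heading and position would come from the device's tracking stack; the cue string stands in for whatever minimal UI the glasses present.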


Expanding support for immersive experiences

We continue to invest in the libraries and tooling that power immersive experiences for XR headsets like Samsung Galaxy XR and wired XR glasses like the upcoming Project Aura from XREAL. We've been listening to your feedback and have added several highly requested features to the Jetpack XR SDK since Developer Preview 2.


Jetpack SceneCore now features dynamic glTF model loading via URIs and improved materials support for creating new PBR materials at runtime. Additionally, the SurfaceEntity component has been enhanced with full Widevine Digital Rights Management (DRM) support and new shapes, allowing it to render 360-degree and 180-degree videos in spheres and hemispheres.

In Jetpack Compose for XR, you'll find new features like the UserSubspace component for follow behavior, ensuring content remains in the user's view regardless of where they look. Additionally, you can now use spatial animations for smooth transitions like sliding or fading. And to support an expanding ecosystem of immersive devices with diverse display capabilities, you can now specify layout sizes as fractions of the user's comfortable field of view.
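To make the "fraction of the field of view" idea concrete, the geometry works out as follows. This is a plain-Kotlin illustration with an assumed function name, not the Jetpack Compose for XR API:

```kotlin
import kotlin.math.tan

// Physical width of a panel that spans a given fraction of the user's
// comfortable horizontal field of view, when placed at a given distance.
// width = 2 * d * tan(fov * fraction / 2)
fun panelWidthMeters(fovDegrees: Double, fraction: Double, distanceMeters: Double): Double {
    val halfAngle = Math.toRadians(fovDegrees * fraction / 2.0)
    return 2.0 * distanceMeters * tan(halfAngle)
}
```

For example, with a 60-degree comfortable FoV, a full-fraction panel at one meter works out to roughly 1.15 meters wide, which is why fraction-based sizing adapts naturally across devices with different displays.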

In Material Design for XR, new components automatically adapt spatially via overrides. These include dialogs that elevate spatially, and navigation bars, which pop out into an Orbiter. Additionally, there is a new SpaceToggleButton component for easily transitioning to and from full space.

And in ARCore for Jetpack XR, new perception capabilities have been added, including face tracking with 68 blendshape values unlocking a world of facial gestures. You can also use eye tracking to power virtual avatars, and depth maps to enable more-realistic interactions with a user's environment.
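As a sketch of what consuming those blendshape values could look like, here is a plain Kotlin example. The indices, threshold, and `isSmiling` helper are hypothetical, since the actual 68-value blendshape layout is defined by ARCore for Jetpack XR:

```kotlin
// Hypothetical indices into the 68-value blendshape array; the real
// layout is defined by the face-tracking API, not by this example.
const val SMILE_LEFT = 44
const val SMILE_RIGHT = 45

// Each blendshape value is an activation weight; a simple gesture
// detector can threshold the relevant weights.
fun isSmiling(blendshapes: FloatArray, threshold: Float = 0.6f): Boolean {
    require(blendshapes.size == 68) { "expected 68 blendshape values" }
    return blendshapes[SMILE_LEFT] > threshold && blendshapes[SMILE_RIGHT] > threshold
}
```

The same thresholding pattern extends to any facial gesture an app wants to react to, such as driving an avatar's expression.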

For devices like Project Aura from XREAL, we are introducing the XR Glasses emulator in Android Studio. This essential tool is designed to give you accurate content visualization, while matching real device specifications for Field of View (FoV), Resolution, and DPI to accelerate your development.


If you build immersive experiences with Unity, we're also expanding your perception capabilities in the Android XR SDK for Unity. In addition to lots of bug fixes and other improvements, we are expanding tracking capabilities to include: QR and ArUco codes, planar images, and body tracking (experimental). We are also introducing a much-requested feature: scene meshing. It enables you to have much deeper interactions with your user's environment - your digital content can now bounce off of walls and climb up couches!
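The "bounce off walls" behavior rests on classic vector reflection against the surface normal that scene meshing provides. Here is a minimal, engine-agnostic Kotlin sketch of that math; the `Vec3` type and `reflect` function are written for this example, not taken from the SDK:

```kotlin
// Minimal 3D vector with just the operations reflection needs.
data class Vec3(val x: Double, val y: Double, val z: Double) {
    operator fun minus(o: Vec3) = Vec3(x - o.x, y - o.y, z - o.z)
    operator fun times(s: Double) = Vec3(x * s, y * s, z * s)
    infix fun dot(o: Vec3) = x * o.x + y * o.y + z * o.z
}

// Reflect an incoming velocity v off a surface with unit normal n:
// r = v - 2 (v . n) n
fun reflect(v: Vec3, n: Vec3): Vec3 = v - n * (2.0 * (v dot n))
```

With a scene mesh supplying the hit point and normal, applying `reflect` to the object's velocity is what makes digital content appear to ricochet off real walls.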

And that's just the tip of the iceberg! Be sure to check out our immersive experiences page for more information.

Get Started Today!

The Android XR SDK Developer Preview 3 is available today! Download the latest Android Studio Canary (Otter 3, Canary 4 or later), upgrade to the latest emulator version (36.4.3 Canary or later), and then visit developer.android.com/xr to get started with the latest libraries and samples you need to build for the growing selection of Android XR devices. We're building Android XR together with you! Don't forget to share your feedback, suggestions, and ideas with our team as you progress on your journey in Android XR.


08 Dec 2025 6:00pm GMT