10 Dec 2025
TalkAndroid
Crunchyroll Pulls The Plug On Free Anime This December
The plan wasn't all that great anyway. Hopefully, the ads don't move over to the Fan tier.
10 Dec 2025 11:50am GMT
ELEHEAR Beyond Pro Hearing Aids – Full Review
If you're like me and have some issues hearing, the multiple award-winning ELEHEAR Beyond Pro is the perfect…
10 Dec 2025 9:33am GMT
Prime Video is bringing back a cult TV series and fans are losing it
Get ready to step through the Stargate once again. Prime Video has officially confirmed the return of the…
10 Dec 2025 7:30am GMT
09 Dec 2025
TalkAndroid
This new Netflix show has everything to become the next #1 hit
Netflix has a long history of turning Spanish dramas into global sensations - from Money Heist to Elite,…
09 Dec 2025 4:30pm GMT
Anker and eufy drop big end-of-year discounts on these stocking fillers
Great Christmas stocking fillers from Anker and Eufy
09 Dec 2025 3:32pm GMT
OnePlus amps up Pad Go 2 and Watch Lite with bigger power, brighter displays and a new Stylo
OnePlus previews major upgrades for the Pad Go 2 and Watch Lite
09 Dec 2025 9:39am GMT
Is your phone battery draining fast? This Android setting could be the reason
If your Android battery seems to vanish faster than your morning coffee, you're not alone. Many users assume…
09 Dec 2025 7:30am GMT
Boba Story Lid Recipes – 2025
Look no further for all the latest Boba Story Lid Recipes. They are all right here!
09 Dec 2025 3:10am GMT
Dice Dreams Free Rolls – Updated Daily
Get the latest Dice Dreams free rolls links, updated daily! Complete with a guide on how to redeem the links.
09 Dec 2025 3:09am GMT
08 Dec 2025
Android Developers Blog
#WeArePlay: How Miksapix Interactive is bringing ancient Sámi Mythology to gamers worldwide
Posted by Robbie McLachlan, Developer Marketing
In our latest #WeArePlay film, which celebrates the people behind apps and games on Google Play, we meet Mikkel - the founder and CEO of Miksapix Interactive. Mikkel is on a mission to share the rich stories and culture of the Sámi people through gaming. Discover how he is building a powerful platform for cultural preservation using a superheroine.
You went from a career in broadcasting to becoming a founder in the games industry. What inspired that leap?
I've had an interest in games for a long time and always found the medium interesting. While I was working for a broadcast corporation in Karasjok, I was thinking, "Why aren't there any Sámi games or games with Sámi content?". Sámi culture is quite rich in lore and mythology. I wanted to bring that to a global stage. That's how Miksapix Interactive was born.
Your game, Raanaa - The Shaman Girl, is deeply rooted in Sámi culture. What is the significance of telling these specific stories?
Because these are our stories to tell! Our mission is to tell them to a global audience to create awareness about Sámi identity and culture. Most people in the world don't know about the Sámi and the Sámi cultures and lore. With our languages at risk, I hope to use storytelling as a way to inspire Sámi children to value their language, celebrate their identity, and take pride in their cultural heritage. Sámi mythology is rich with powerful matriarchs and goddesses, which inspired us to create a superheroine. Through her journey of self-discovery and empowerment, Raanaa finds her true strength - a story we hope will inspire hope and resilience in pre-teens and teens around the world. Through games like Raanaa - The Shaman Girl, we get to convey our stories in new formats.
How did growing up with rich storytelling affect your games?
I was raised in a reindeer-herding family, which means we spent a lot of time in nature and in the fields with the reindeer. Storytelling was a big part of family life. We would eat supper in the lavvu tent, sitting around a fire with relatives and parents telling stories. With Miksapix Interactive, I am taking my love for storytelling and bringing it to the world, drawing on my first-hand experience of Sámi culture.
How has Google Play helped you achieve global reach from your base in the Arctic?
For us, Google Play was a no-brainer. Releasing on Google Play was the easiest part, no hassle. We have more downloads from Google Play than anywhere else, and it has definitely helped us reach markets like Brazil, India, the US, and beyond. The positive Play Store reviews motivated and inspired us during the development of Raanaa. We use Google products like Google Sheets for collaboration when we do localization or translation.
What is next for Miksapix Interactive?
Now, our sights are set on growth. We are very focused on the Raanaa IP. For the mobile game, we are looking into localizing it to different Sámi languages. In Norway, we have six Sámi languages, so we are now going to translate it to Lule Sámi and Southern Sámi. We're planning to have these new Sámi languages available this year.
Discover other inspiring app and game founders featured in #WeArePlay.
08 Dec 2025 10:00pm GMT
TalkAndroid
Samsung Turns the S25 Into a Powerhouse With One UI 8.5 Beta
One UI 8.5 Beta is now rolling out to your Galaxy S25 phone.
08 Dec 2025 8:25pm GMT
Honor’s durable Magic 8 Lite packs a massive 7,500mAh battery
Honor's long-lasting Magic 8 Lite goes on sale in early January
08 Dec 2025 6:59pm GMT
Android Developers Blog
Start building for glasses, new devices for Android XR and more in The Android Show | XR Edition
Posted by Matthew McCullough - VP of Product Management, Android Developer
Today, during The Android Show | XR Edition, we shared a look at the expanding Android XR platform, which is fundamentally evolving to bring a unified developer experience to the entire XR ecosystem. The latest announcements, from Developer Preview 3 to exciting new form factors, are designed to give you the tools and platform you need to create the next generation of XR experiences. Let's dive into the details!
A spectrum of new devices ready for your apps
The Android XR platform is quickly expanding, providing more users and more opportunities for your apps. This growth is anchored by several new form factors that expand the possibilities for XR experiences.
A major focus is on lightweight, all-day wearables. At I/O, we announced we are working with Samsung and our partners Gentle Monster and Warby Parker to design stylish, lightweight AI glasses and Display AI glasses that you can wear comfortably all day. The integration of Gemini on glasses is set to unlock helpful, intelligent experiences like live translation and searching what you see.

And partners like Uber are already exploring how AI Glasses can streamline the rider experience by providing simple, contextual directions and trip status right in the user's view.

The ecosystem is simultaneously broadening its scope to include wired XR glasses, exemplified by Project Aura from XREAL. This device blends the immersive experiences typically found in headsets with portability and real-world presence. Project Aura is scheduled for launch next year.
New tools unlock development for all form factors
If you are developing for Android, you are already developing for Android XR. The release of Android XR SDK Developer Preview 3 brings increased stability for headset APIs and, most significantly, opens up development for AI Glasses.

You can now build augmented experiences for AI glasses using new libraries like Jetpack Compose Glimmer, a UI toolkit for transparent displays, and Jetpack Projected, which lets you extend your Android mobile app directly to glasses. Furthermore, the SDK now includes powerful ARCore for Jetpack XR updates, such as Geospatial capabilities for wayfinding.

For immersive experiences on headsets and wired XR glasses like Project Aura from XREAL, this release also provides new APIs for detecting a device's field-of-view, helping your adaptive apps adjust their UI.
Check out our post on the Android XR Developer Preview 3 to learn more about all the latest updates.
Expanding your reach with new engine ecosystems
The Android XR platform is built on the OpenXR standard, enabling integration with the tools you already use so you can build with your preferred engine.
Developers can use Unreal Engine's native Android and OpenXR capabilities today to build for Android XR, leveraging the existing VR Template for immersive experiences. To provide additional, optimized extensions for the Android XR platform, a Google vendor plugin, including support for hand tracking, hand mesh, and more, will be released early next year.
Godot now includes Android XR support, leveraging its focus on OpenXR to enable development for devices like Samsung Galaxy XR. The new Godot OpenXR vendor plugin v4.2.2 stable allows developers to port their existing projects to the platform.
Watch The Android Show | XR Edition
Thank you for tuning into The Android Show | XR Edition. Start building differentiated experiences today using the Developer Preview 3 SDK and test your apps with the XR Emulator in Android Studio. Your feedback is crucial as we continue to build this platform together. Head over to developer.android.com/xr to learn more and share your feedback.
08 Dec 2025 6:00pm GMT
Build for AI Glasses with the Android XR SDK Developer Preview 3 and unlock new features for immersive experiences
Posted by Matthew McCullough - VP of Product Management, Android Developer
In October, Samsung launched Galaxy XR - the first device powered by Android XR. And it's been amazing seeing what some of you have been building! Here's what some of our developers have been saying about their journey into Android XR.
Android XR gave us a whole new world to build our app within. Teams should ask themselves: What is the biggest, boldest version of your experience that you could possibly build? This is your opportunity to finally put into action what you've always wanted to do, because now, you have the platform that can make it real.
You've also seen us share a first look at other upcoming devices that work with Android XR like Project Aura from XREAL and stylish glasses from Gentle Monster and Warby Parker.
To support the expanding selection of XR devices, we are announcing Android XR SDK Developer Preview 3!
With Android XR SDK Developer Preview 3, on top of building immersive experiences for devices such as Galaxy XR, you can also now build augmented experiences for upcoming AI Glasses with Android XR.
New tools and libraries for augmented experiences
With Developer Preview 3, we are unlocking the tools and libraries you need to build intelligent, hands-free augmented experiences for AI Glasses. AI Glasses are lightweight and portable for all-day wear. You can extend your existing mobile app to take advantage of the built-in speakers, camera, and microphone to provide new, thoughtful, and helpful user interactions. With the addition of a small display on display AI Glasses, you can privately present information to users. AI Glasses are perfect for experiences that help enhance a user's focus and presence in the real world.
To power augmented experiences on AI Glasses, we are introducing two new, purpose-built libraries to the Jetpack XR SDK:
- Jetpack Projected - built to bridge mobile devices and AI Glasses, with features that allow you to access sensors, speakers, and displays on glasses
- Jetpack Compose Glimmer - a new design language and UI components for crafting and styling your augmented experiences on display AI Glasses
Jetpack Compose Glimmer is a demonstration of design best practices for beautiful, optical see-through augmented experiences. With UI components optimized for the input modality and styling requirements of display AI Glasses, Jetpack Compose Glimmer is designed for clarity, legibility, and minimal distraction.
To help visualize and test your Jetpack Compose Glimmer UI we are introducing the AI Glasses emulator in Android Studio. The new AI Glasses emulator can simulate glasses-specific interactions such as touchpad and voice input.

Beyond the new Jetpack Projected and Jetpack Compose Glimmer libraries, we are also expanding ARCore for Jetpack XR to support AI Glasses. We are starting off with motion tracking and geospatial capabilities for augmented experiences - the exact features that enable you to create helpful navigation experiences perfect for all-day-wear devices like AI Glasses.

Expanding support for immersive experiences
We continue to invest in the libraries and tooling that power immersive experiences for XR Headsets like Samsung Galaxy XR and wired XR Glasses like the upcoming Project Aura from XREAL. We've been listening to your feedback and have added several highly-requested features to the Jetpack XR SDK since developer preview 2.
Jetpack SceneCore now features dynamic glTF model loading via URIs and improved materials support for creating new PBR materials at runtime. Additionally, the SurfaceEntity component has been enhanced with full Widevine Digital Rights Management (DRM) support and new shapes, allowing it to render 360-degree and 180-degree videos in spheres and hemispheres.
In Jetpack Compose for XR, you'll find new features like the UserSubspace component for follow behavior, ensuring content remains in the user's view regardless of where they look. Additionally, you can now use spatial animations for smooth transitions like sliding or fading. And to support an expanding ecosystem of immersive devices with diverse display capabilities, you can now specify layout sizes as fractions of the user's comfortable field of view.
In Material Design for XR, new components automatically adapt spatially via overrides. These include dialogs that elevate spatially, and navigation bars, which pop out into an Orbiter. Additionally, there is a new SpaceToggleButton component for easily transitioning to and from full space.
And in ARCore for Jetpack XR, new perception capabilities have been added, including face tracking with 68 blendshape values unlocking a world of facial gestures. You can also use eye tracking to power virtual avatars, and depth maps to enable more-realistic interactions with a user's environment.
For devices like Project Aura from XREAL, we are introducing the XR Glasses emulator in Android Studio. This essential tool is designed to give you accurate content visualization, while matching real device specifications for Field of View (FoV), Resolution, and DPI to accelerate your development.
If you build immersive experiences with Unity, we're also expanding your perception capabilities in the Android XR SDK for Unity. In addition to lots of bug fixes and other improvements, we are expanding tracking capabilities to include: QR and ArUco codes, planar images, and body tracking (experimental). We are also introducing a much-requested feature: scene meshing. It enables you to have much deeper interactions with your user's environment - your digital content can now bounce off of walls and climb up couches!
And that's just the tip of the iceberg! Be sure to check out our immersive experiences page for more information.
Get Started Today!
The Android XR SDK Developer Preview 3 is available today! Download the latest Android Studio Canary (Otter 3, Canary 4 or later) and upgrade to the latest emulator version (36.4.3 Canary or later) and then visit developer.android.com/xr to get started with the latest libraries and samples you need to build for the growing selection of Android XR devices. We're building Android XR together with you! Don't forget to share your feedback, suggestions, and ideas with our team as you progress on your journey in Android XR.
08 Dec 2025 6:00pm GMT
TalkAndroid
Jolla Phone smashes pre-order targets as Europe’s only independent smartphone returns
Jolla's return to the smartphone market is off to a fast start
08 Dec 2025 4:53pm GMT
Scored 4.5 out of 5, this intense Netflix show has viewers totally hooked
If you've been looking for your next edge-of-your-seat binge, The Beast in Me might just be it. Since…
08 Dec 2025 4:35pm GMT
Loved Absentia? Here are 4 addictive Netflix series you need to watch next
If you've just raced through all three seasons of Absentia on Netflix, chances are you're still catching your…
08 Dec 2025 7:30am GMT
07 Dec 2025
TalkAndroid
This Netflix series is perfect for your next weekend binge
If you're looking for something warm, clever, and quietly uplifting to get you through a cold weekend, Old…
07 Dec 2025 4:30pm GMT
Idris Elba returns as Luther on Netflix with a new movie and a big surprise for fans
He's back. Idris Elba is stepping once again into the dark, rumpled coat of Detective John Luther for…
07 Dec 2025 7:30am GMT
04 Dec 2025
Android Developers Blog
Android Studio Otter 2 Feature Drop is stable!
Posted by Sandhya Mohan - Product Manager, Trevor Johns - Developer Relations Engineer

The Android Studio Otter 2 Feature Drop is here to supercharge your productivity.
This final stable release for '25 powers up Agent Mode, equipping it with the new Android Knowledge Base for improved accuracy, and giving you the option to try out the new Gemini 3 model. You'll also be able to take advantage of new settings such as the ability to keep your personalized IDE environment consistent across all of your machines. We've also incorporated all of the latest stability and performance improvements from the IntelliJ IDEA 2025.2 platform, including Kotlin compiler and terminal improvements, making this a significant enhancement for your development workflow.
Updates to Agent Mode
We recently introduced the ability to use our latest model, Gemini 3 Pro Preview, within Android Studio. This is our best model for coding and agentic capabilities. It'll give you superior performance in Agent Mode and advanced problem-solving capabilities so you can focus on what you do best: creating high quality apps for your users.
We are beginning to roll out limited Gemini 3 access (with a 1 million token context window) to developers who are using the no-cost default model. For higher usage rate limits and longer sessions with Agent Mode, you can add a paid Gemini API key or use a Gemini Code Assist Enterprise plan. Learn more about how to get started with Gemini 3.
While the training of large language models provides deep knowledge that is excellent for common tasks, like creating Compose UIs, training concludes on a fixed date, resulting in gaps for new libraries and updated best practices. Models are also less effective with niche APIs because the necessary training data is scarce. To fix this, Android Studio's Agent Mode is now equipped with the Android Knowledge Base, a new feature designed to significantly improve accuracy and reduce hallucinations by grounding responses in authoritative documentation. This means that instead of relying only on its training data, the agent can actively consult fresh documentation from official sources like the Android developer docs, Firebase, Google Developers, and Kotlin docs before it answers you.
The information in the Android Knowledge Base is stored in Android Studio and its content is automatically updated in the background on a periodic basis, so this feature is available regardless of which LLM you're using for AI assistance.
Gemini searching documentation before it answers you
This feature will be invoked automatically when Agent Mode detects a need for additional context, and you'll see additional explanatory text. However, if you'd like Agent Mode to reference documentation more frequently, you can include a line such as "Refer to Android documentation for guidance" in your Rules configuration.
Requested settings updates
Backup and Sync
Backup and Sync is a new way to keep your personalized Android Studio environment consistent across all your installations. You can now back up your settings, including your preferred keymaps, Code Editor settings, system settings, and more, to cloud storage using your Google Account, giving you a seamless experience wherever you code. We also support Backup and Sync via JetBrains accounts for developers using both IntelliJ and Android Studio installs simultaneously.
Backup and Sync
Getting started is simple. Just sign into your Google Account by clicking the avatar in the top-right corner of the IDE, or navigate to Settings > Backup and Sync. Once you authorize Android Studio to access your account's storage, you have full control over which categories of app data you want to sync. If you're syncing for the first time on a new machine, Android Studio will give you the option to either download your existing remote settings or upload your current local settings to the cloud. Of course, if you change your mind, you can easily disable Backup and Sync at any time from the settings menu. This feature has been available since the first Android Studio Otter release.
You can now opt in to receive communications directly from the Android Studio team. This enables you to get emails and notifications about important product updates, new features, and new libraries as soon as they're available.
You'll see this option when you sign in, and you can change your preference at any time by going to Settings > Tools > Google Accounts > Communications.
Your option to receive emails and notifications
- Kotlin K2 Mode: Following its rapid adoption after being enabled by default, the K2 Kotlin mode is now more stable and performant. This version improves Kotlin code analysis stability, adds new inspections, and enhances the reliability of Kotlin script execution.
- Terminal Performance: The integrated terminal is significantly faster, with major improvements in rendering. For Bash and Zsh, this update also introduces minor visual refinements without compromising or altering core shell behavior.
04 Dec 2025 6:37pm GMT
03 Dec 2025
Android Developers Blog
What's new in the Jetpack Compose December '25 release
Posted by Nick Butcher, Jetpack Compose Product Manager
Today, the Jetpack Compose December '25 release is stable. This contains version 1.10 of the core Compose modules and version 1.4 of Material 3 (see the full BOM mapping), adding new features and major performance improvements.
To use today's release, upgrade your Compose BOM version to 2025.12.00:
implementation(platform("androidx.compose:compose-bom:2025.12.00"))
Performance improvements
We know that the runtime performance of your app is hugely important to you and your users, so performance has been a major priority for the Compose team. This release brings a number of improvements, and you get them all just by upgrading to the latest version. Our internal scroll benchmarks show that Compose now matches the performance you would see when using Views:
Scroll performance benchmark comparing Views and Jetpack Compose across different versions of Compose
Pausable composition in lazy prefetch
Pausable composition in lazy prefetch is now enabled by default. This is a fundamental change to how the Compose runtime schedules work, designed to significantly reduce jank during heavy UI workloads.
Previously, once a composition started, it had to run to completion. If a composition was complex, this could block the main thread for longer than a single frame, causing the UI to freeze. With pausable composition, the runtime can now "pause" its work if it's running out of time and resume the work in the next frame. This is particularly effective when used with lazy layout prefetch to prepare frames ahead of time. The Lazy layout CacheWindow APIs introduced in Compose 1.9 are a great way to prefetch more content and benefit from pausable composition to produce much smoother UI performance.
Pausable composition combined with Lazy prefetch help reduce jank
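The pause-and-resume idea behind pausable composition can be illustrated in plain Kotlin. This is a conceptual sketch only: the names below are invented for illustration, and the real Compose runtime scheduling is far more involved.

```kotlin
// Conceptual sketch of pausable work: process queued items until the frame
// budget is exhausted, then pause and resume on the next "frame".
// Illustrative only; not the actual Compose runtime implementation.
class PausableWork<T>(items: List<T>, private val process: (T) -> Unit) {
    private val remaining = ArrayDeque(items)

    val isDone: Boolean get() = remaining.isEmpty()

    /** Runs until [budgetNanos] is spent or all work is done. */
    fun runSlice(budgetNanos: Long, now: () -> Long = System::nanoTime) {
        val deadline = now() + budgetNanos
        while (remaining.isNotEmpty() && now() < deadline) {
            process(remaining.removeFirst())
        }
    }
}

fun main() {
    val processed = mutableListOf<Int>()
    val work = PausableWork((1..1000).toList()) { processed += it }
    var frames = 0
    while (!work.isDone) {                      // one slice per simulated frame
        work.runSlice(budgetNanos = 1_000_000)  // ~1 ms frame budget
        frames++
    }
    println("Processed ${processed.size} items across $frames frame(s)")
}
```

In this analogy, lazy prefetch plays the role of the loop: it requests slices of composition work ahead of time, and the runtime pauses mid-composition when the current frame's budget is spent.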
We've also optimized performance elsewhere, with improvements to Modifier.onPlaced, Modifier.onVisibilityChanged, and other modifier implementations. We'll continue to invest in improving the performance of Compose.
New features
Retain
Compose offers a number of APIs to hold and manage state across different lifecycles; for example, remember persists state across compositions, and rememberSaveable/rememberSerializable persist state across activity or process recreation. retain is a new API that sits between these, enabling you to persist values across configuration changes, but not across process death, and without serialization. Because retain does not serialize your state, you can persist objects like lambda expressions, flows, and large objects like bitmaps, which cannot be easily serialized. For example, you may use retain to manage a media player (such as ExoPlayer) to ensure that media playback doesn't get interrupted by a configuration change.
@Composable
fun MediaPlayer() {
val applicationContext = LocalContext.current.applicationContext
val exoPlayer = retain { ExoPlayer.Builder(applicationContext).apply { ... }.build() }
...
}
We want to extend our thanks to the AndroidDev community (especially the Circuit team), who have influenced and contributed to the design of this feature.
Material 1.4
Version 1.4.0 of the material3 library adds a number of new components and enhancements:
- TextField now offers an experimental TextFieldState-based version, which provides a more robust way to manage text state. In addition, new SecureTextField and OutlinedSecureTextField variants are now offered. The Material Text composable now supports autoSize behaviour.
- The carousel component now offers a new HorizontalCenteredHeroCarousel variant.
- TimePicker now supports switching between the picker and input modes.
- A vertical drag handle helps users change an adaptive pane's size and/or position.
Horizontal centered hero carousel
Note that Material 3 Expressive APIs continue to be developed in the alpha releases of the material3 library. To learn more, see this recent talk:
New animation features
We continue to expand on our animation APIs, including updates for customizing shared element animations.
Dynamic shared elements
By default, sharedElement() and sharedBounds() animations attempt to animate layout changes whenever a matching key is found in the target state. However, you may want to disable this animation dynamically based on certain conditions, such as the direction of navigation or the current UI state.
To control whether the shared element transition occurs, you can now customize the SharedContentConfig passed to rememberSharedContentState(). The isEnabled property determines if the shared element is active.
SharedTransitionLayout {
    val transition = updateTransition(currentState)
    transition.AnimatedContent { targetState ->
        // Create the configuration that depends on the changing state.
        fun animationConfig(): SharedTransitionScope.SharedContentConfig {
            return object : SharedTransitionScope.SharedContentConfig {
                override val SharedTransitionScope.SharedContentState.isEnabled: Boolean
                    get() = TODO("Determine whether to perform a shared element transition")
            }
        }
    }
}
See the documentation for more.
Modifier.skipToLookaheadPosition()
A new modifier, Modifier.skipToLookaheadPosition(), has been added in this release; it keeps a composable at its final position when performing shared element animations. This enables "reveal"-style transitions, as seen in the Androidify sample with the progressive reveal of the camera. See the video tip here for more information:
Initial velocity in shared element transitions
This release adds a new shared element transition API, prepareTransitionWithInitialVelocity, which lets you pass an initial velocity (e.g. from a gesture) to a shared element transition:
Modifier.fillMaxSize()
    .draggable2D(
        rememberDraggable2DState { offset += it },
        onDragStopped = { velocity ->
            // Set up the initial velocity for the upcoming
            // shared element transition.
            sharedContentStateForDraggableCat
                ?.prepareTransitionWithInitialVelocity(velocity)
            showDetails = false
        },
    )
A shared element transition that starts with an initial velocity from a gesture
Veiled transitions
EnterTransition and ExitTransition define how an AnimatedVisibility/AnimatedContent composable appears or disappears. A new experimental veil option allows you to specify a color to veil or scrim content; e.g., fading in/out a semi-opaque black layer over content:
Veiled animated content - note the semi-opaque veil (or scrim) over the grid content during the animation
AnimatedContent(
    targetState = page,
    modifier = Modifier.fillMaxSize().weight(1f),
    transitionSpec = {
        if (targetState > initialState) {
            slideInHorizontally { it } togetherWith
                slideOutHorizontally { -it / 2 } + veilOut(targetColor = veilColor)
        } else {
            slideInHorizontally { -it / 2 } + unveilIn(initialColor = veilColor) togetherWith
                slideOutHorizontally { it }
        }
    },
) { targetPage ->
    ...
}
Upcoming changes
Deprecation of Modifier.onFirstVisible
Compose 1.9 introduced Modifier.onVisibilityChanged and Modifier.onFirstVisible. After reviewing your feedback, it became apparent that the contract of Modifier.onFirstVisible could not be honored deterministically; specifically, there is no reliable definition of when an item first becomes visible. For example, a Lazy layout may dispose of items that scroll out of the viewport, and then compose them again if they scroll back into view. In this circumstance, the onFirstVisible callback would fire again, as it is a newly composed item. Similar behavior would also occur when navigating back to a previously visited screen containing onFirstVisible. As such, we have decided to deprecate this modifier in the next Compose release (1.11) and recommend migrating to onVisibilityChanged. See the documentation for more information.
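After migrating, fire-once behavior can still be recovered by hoisting the "seen" flag out of the item's composition, so a recycled item being recomposed does not re-trigger it. Here is a minimal plain-Kotlin sketch; the latch class and its names are hypothetical, and you would hold it somewhere that outlives the item's composition (such as a ViewModel) and call it from your onVisibilityChanged callback.

```kotlin
// Hypothetical fire-once latch: forwards only the first "became visible"
// event, no matter how many times the item is disposed and recomposed.
class FirstVisibleLatch(private val onFirstVisible: () -> Unit) {
    private var fired = false

    /** Call from a visibility callback such as Modifier.onVisibilityChanged. */
    fun onVisibilityChanged(visible: Boolean) {
        if (visible && !fired) {
            fired = true
            onFirstVisible()
        }
    }
}
```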
Coroutine dispatch in tests
We plan to change coroutine dispatch in tests to reduce test flakiness and catch more issues. Currently, tests use the UnconfinedTestDispatcher, which differs from production behavior; e.g., effects may run immediately rather than being enqueued. In a future release, we plan to introduce a new API that uses StandardTestDispatcher by default to match production behaviours. You can try the new behavior now in 1.10:
@get:Rule // also createAndroidComposeRule, createEmptyComposeRule
val rule = createComposeRule(effectContext = StandardTestDispatcher())
Using the StandardTestDispatcher will queue tasks, so you must use synchronization mechanisms like composeTestRule.waitForIdle() or composeTestRule.runOnIdle(). If your test uses runTest, you must ensure that runTest and your Compose rule share the same StandardTestDispatcher instance for synchronization.
// 1. Create a SINGLE dispatcher instance
val testDispatcher = StandardTestDispatcher()
// 2. Pass it to your Compose rule
@get:Rule
val composeRule = createComposeRule(effectContext = testDispatcher)
@Test
// 3. Pass the *SAME INSTANCE* to runTest
fun myTest() = runTest(testDispatcher) {
composeRule.setContent { /* ... */ }
}
Tools
Great APIs deserve great tools, and Android Studio has a number of recent additions for Compose developers:
- Transform UI: Iterate on your designs by right-clicking on the @Preview, selecting Transform UI, and then describing the change in natural language.
- Generate @Preview: Right-click on a composable and select Gemini > Generate [Composable name] Preview.
- Customize Material Symbols with new support for icon variations in the Vector Asset wizard.
- Generate code from a screenshot, or ask Gemini to match your existing UI to a target image. This can be combined with remote MCP support, e.g. to connect to a Figma file and generate Compose UI from designs.
- Fix UI quality issues audits your UI for common problems, such as accessibility issues, and then proposes fixes.
To see these tools in action, watch this recent demonstration:
Happy Composing
We continue to invest in Jetpack Compose to provide you with the APIs and tools you need to create beautiful, rich UIs. We value your input, so please share your feedback on these changes or what you'd like to see next in our issue tracker.
03 Dec 2025 8:34pm GMT
02 Dec 2025
Android Developers Blog
Android 16 QPR2 is Released
Posted by Matthew McCullough, VP of Product Management, Android Developer
Faster Innovation with Android's first Minor SDK Release
Today we're releasing Android 16 QPR2, bringing a host of enhancements to user experience, developer productivity, and media capabilities. It marks a significant milestone in the evolution of the Android platform as the first release to utilize a minor SDK version.
A Milestone for Platform Evolution: The Minor SDK Release
To support this, we have introduced new fields to the Build class as of Android 16, allowing your app to check for these new APIs using SDK_INT_FULL and VERSION_CODES_FULL.
if ((Build.VERSION.SDK_INT >= Build.VERSION_CODES.BAKLAVA) &&
    (Build.VERSION.SDK_INT_FULL >= Build.VERSION_CODES_FULL.BAKLAVA_1)) {
    // Call new APIs from the Android 16 QPR2 release
}
Enhanced User Experience and Customization
QPR2 improves Android's personalization and accessibility, giving users more control over how their devices look and feel.
Expanded Dark Theme
When the expanded dark theme setting is enabled by a user, the system uses your app's isLightTheme theme attribute to determine whether to apply inversion. If your app inherits from one of the standard DayNight themes, this is done automatically for you. If it does not, make sure to declare isLightTheme="false" in your dark theme to ensure your app is not inadvertently inverted. Standard Android Views, Composables, and WebViews will be inverted, while custom rendering engines like Flutter will not.
This is largely intended as an accessibility feature. We strongly recommend implementing a native dark theme, which gives you full control over your app's appearance; you can protect your brand's identity, ensure text is readable, and prevent visual glitches from happening when your UI is automatically inverted, guaranteeing a polished, reliable experience for your users.
Custom Icon Shapes & Auto-Theming
In QPR2, users can select specific shapes for their app icons, which apply to all icons and folder previews. Additionally, if your app does not provide a dedicated themed icon, the system can now automatically generate one by applying a color filtering algorithm to your existing launcher icon.
Custom Icon Shapes
Test Icon Shape & Color in Android Studio
Automatic system icon color filtering
Interactive Chooser Sessions
The sharing experience is now more dynamic. Apps can keep the UI interactive even when the system sharesheet is open, allowing for real-time content updates within the Chooser.
Boosting Your Productivity and App Performance
We are introducing tools and updates designed to streamline your workflow and improve app performance.
Linux Development Environment with GUI Applications
The Linux development environment feature has been expanded to support running Linux GUI applications directly within the terminal environment.
Generational Garbage Collection
The Android Runtime (ART) now includes a Generational Concurrent Mark-Compact (CMC) Garbage Collector. This focuses collection on newly allocated objects, resulting in reduced CPU usage and improved battery efficiency.
Widget Engagement Metrics
You can now query user interaction events, such as clicks, scrolls, and impressions, to better understand how users engage with your widgets.
16KB Page Size Readiness
To help prepare for future architecture requirements, we have added early warning dialogs for debuggable apps that are not 16KB page-aligned.
Media, Connectivity, and Health
QPR2 brings robust updates to media standards and device connectivity.
IAMF and Audio Sharing
We have added software decoding support for Immersive Audio Model and Formats (IAMF), an open-source spatial audio format. Additionally, Personal Audio Sharing for Bluetooth LE Audio is now integrated directly into the system Output Switcher.
Health Connect Updates
Health Connect now automatically tracks steps using the device's sensors. If your app has the READ_STEPS permission, this data will be available from the "android" package. Not only does this simplify the code needed for step tracking, it's also more power efficient. Health Connect can now also track weight, set index, and Rate of Perceived Exertion (RPE) in exercise segments.
Smoother Migrations
A new third-party Data Transfer API enables more reliable data migration between Android and iOS devices.
Strengthening Privacy and Security
Security remains a top priority with new features designed to protect user data and device integrity.
Developer Verification
We introduced APIs to support developer verification during app installation, along with new ADB commands to simulate verification outcomes. As a developer, you are free to install apps without verification by using ADB, so you can continue to test apps that are not intended, or not yet ready, to be distributed to the wider consumer population.
SMS OTP Protection
The delivery of messages containing an SMS retriever hash will be delayed for most apps by three hours to help prevent OTP hijacking. The RECEIVE_SMS broadcast will be withheld and SMS provider database queries will be filtered. The SMS will be available to these apps after the three-hour delay.
Secure Lock Device
A new system-level security state, Secure Lock Device, is being introduced. When enabled (e.g., remotely via "Find My Device"), the device locks immediately and requires the primary PIN, pattern, or password to unlock. While active, notifications and quick affordances on the lock screen are hidden, and biometric unlock may be temporarily disabled.
Get Started
If you're not in the Beta or Canary programs, your Pixel device should get the Android 16 QPR2 release shortly. If you don't have a Pixel device, you can use the 64-bit system images with the Android Emulator in Android Studio. If you are currently on the Android 16 QPR2 Beta and have not yet installed the Android 16 QPR3 Beta, you can opt out of the program and you will then be offered the release version of Android 16 QPR2 over the air.
For the best development experience with Android 16 QPR2, we recommend that you use the latest Canary build of Android Studio Otter.
Thank you again to everyone who participated in our Android beta program. We're looking forward to seeing how your apps take advantage of the updates in Android 16 QPR2.
02 Dec 2025 7:00pm GMT
Explore AI on Android with Our Sample Catalog App
Posted by Thomas Ezan and Ivy Knight
As the AI landscape continues to expand, we often hear that developers aren't always sure where to start and which API or SDK is best for their use case.
So we wanted to provide you with examples of AI-enabled features using both on-device and Cloud models and inspire you to create delightful experiences for your users.
We are thrilled to announce the launch of the redesigned Android AI Sample Catalog, a dedicated application designed to inspire and educate Android developers to build the next generation of AI-powered Android apps.
Discover what's possible with Google AI
The Android AI Sample Catalog is designed as a one-stop destination to explore the capabilities of Google AI APIs and SDKs. Inside, you'll find a collection of samples demonstrating a wide range of AI use cases that you can test yourself. We really designed this catalog to give you a hands-on feel for what you can build and help you find the right solution and capability for your needs.
Here are some of the samples you can find in the catalog:
- Image generation with Imagen: Uses Imagen to generate images of landscapes, objects, and people in various artistic styles.
- On-device summarization with Gemini Nano: Lets you summarize text on-device using Gemini Nano via the GenAI Summarization API.
- Chat with Nano Banana: A chatbot app using the Gemini 3 Pro Image model (a.k.a. "Nano Banana Pro") that lets you edit images via a conversation with the model.
- On-device image description with Gemini Nano: Lets you generate image descriptions using Gemini Nano via the GenAI Image Description API.
Other samples include: image editing via Imagen mask-editing capabilities, a to-do list app controlled via the voice using the Gemini Live API, on-device rewrite assistance powered by Gemini Nano, and more!
The samples using cloud inference are built using the Firebase AI Logic SDK, and the ML Kit GenAI API is used for the samples running on-device inference. We plan to continue creating new samples and updating the existing ones as new capabilities are added to the models and SDKs.
Fully open source and ready to copy
We believe the best way to learn is by doing. That's why the AI Sample Catalog is not only fully open source, it's also been architected so that the code relevant to the AI features is self-contained and easy to copy and paste, letting you quickly experiment with these code samples in your own project.
When you're exploring a sample in the app and want to see how it's built, you can simply click the <> SOURCE button to jump directly to the code on GitHub.
To help you get started quickly, each sample includes a README file that highlights the APIs used, along with key code snippets.

Note: To run the samples using the Firebase AI Logic SDK, you'll need to set up a Firebase AI project. Also, the samples using ML Kit Gen AI APIs powered by Gemini Nano are only supported on certain devices.
We also put extra thought into the app's user interface to make your learning experience more engaging and intuitive. We've refreshed the app with a bold new brand that infuses the Android look with an expressive AI design language. Most notably, the app now features a vibrant, textured backdrop for the new Material 3 expressive components, giving you a modern and enjoyable environment to explore the samples and dive into the code. The systematic illustrations, inspired by generated image composition, further enhance this polished, expressive experience.
Check out the Android AI Sample Catalog today, test the features, and dive into the code on GitHub to start bringing your own AI-powered ideas to life!
02 Dec 2025 5:00pm GMT
01 Dec 2025
Android Developers Blog
Learn about our newest Jetpack Navigation library with the Nav3 Spotlight Week
Posted by Don Turner - Developer Relations Engineer
Jetpack Navigation 3 is now stable, and using it can help you reduce tech debt, provide better separation of concerns, speed up feature development time, and support new form factors. We're dedicating a whole week to providing content to help you learn about Nav3, and start integrating it into your app.
You'll learn about the library in detail, how to modularize your navigation code, and lots of code recipes for common use cases. At the end of the week, tune into the "Ask Me Anything" session so you can have the experts answer anything you like about Nav3. Here's the full schedule:
Monday: API Overview
Dec 1st, 2025
Learn the most important Nav3 APIs including NavDisplay, NavEntry, and entryProvider with a coding walkthrough video.
Tuesday: Animations
Dec 2nd, 2025
Make your screen transitions look beautiful! Learn how to set custom animations for all screens in your app, and how to override transitions for individual screens that need different behavior.
See the Animate between destinations documentation and the Animation recipes to learn how to override the default animations at the NavDisplay level, and at the individual destination level.
Wednesday: Deep links
Dec 3rd, 2025
Deep links support has been one of the most requested features from developers. You'll learn how to create deep links with a variety of different code recipes.
Check out our guide to deep linking in Navigation 3. We have a basic recipe that shows you how to parse intents into navigation keys, and a more advanced recipe that demonstrates how to create a synthetic back stack.
Bonus content! Our main architecture sample, Now in Android, has been migrated to Navigation 3. Full details here.
Thursday: Modularization
Dec 4th, 2025
Learn how to modularize your navigation code. Avoid circular dependencies by separating navigation keys into their own modules, and learn how to use dependency injection and extension functions to move content into feature modules.
Friday: Ask Me Anything
Dec 5th, 2025
Do you have burning questions? We have a panel of experts waiting to provide answers live at 9am PST on Friday. Ask your questions using the #AskAndroid tag on BlueSky, LinkedIn and X.
01 Dec 2025 5:00pm GMT
25 Nov 2025
Android Developers Blog
#WeArePlay: Solving the dinner dilemma - how DELISH KITCHEN empowers 13 million home cooks

Posted by Robbie McLachlan - Developer Marketing
In our latest #WeArePlay film, which celebrates the people behind apps and games on Google Play, we meet Chiharu - a co-founder of DELISH KITCHEN. She is on a mission to solve one of the most common daily frustrations: "What should I make for dinner?" Discover how this avid baker since her teens combined her passions for food and tech, building an app that delivers 55,000 professional recipes to over 13 million users across Japan.
You went from making websites in middle school to creating one of Japan's leading recipe apps. What inspired that leap and sparked the idea for DELISH KITCHEN?
I've loved the internet since I was a kid and always dreamed of creating my own online service. After working on mobile games, I discovered the huge potential for video content on mobile. That's when the spark for DELISH KITCHEN happened. I was working on new services related to food and saw that people were really struggling with deciding what to make every day. I felt this myself when I used to bake as it was difficult to understand recipes from only text and pictures. I wanted to solve that daily problem for people, and I knew video was the way to do it and make their lives happier.

What is the core mission behind DELISH KITCHEN and how does it make life easier for your users?
The biggest struggle for people who cook every day is deciding what to make. My mission is to solve that problem and make people happier every day. For beginners, we show them that cooking is easier than they thought. For experienced cooks, we make it easy to find what to make today. We have 55,000 recipes, and we also suggest the entire meal, or "KONDATE," so they don't have to think about pairing a main dish with a side.

How has Google Play helped you reach and grow with such a large audience?
We wanted our app to be accessible to as many people as possible, and distributing it on Google Play was very important for reaching the large number of Android users in Japan. To have over 5 million downloads, mostly just in Japan, shows how vital that is. Google Play has been a great partner in our growth because it provides the integrations and tools that help developers grow more efficiently. For us, that means checking all the user reviews to decide on new features, and it also includes feature campaigns that help us reach even more people.
What is next for DELISH KITCHEN?
We just launched an AI-powered cooking assistant for our premium users. It's designed to answer questions like, "What can I make from leftovers?" or "Can you give me a high-protein recipe under 500 calories?". We are also working on a new app, which will help people record their meals and manage their health and wellness. Beyond apps, we are reaching into offline services by building new partnerships with supermarkets to help people make better food choices in their everyday shopping.
Discover other inspiring app and game founders featured in #WeArePlay.
25 Nov 2025 5:00pm GMT
21 Nov 2025
Android Developers Blog
Fully Optimized: Wrapping up Performance Spotlight Week
Posted by Ben Weiss, Senior Developer Relations Engineer and Sara Hamilton, Product Manager
We spent the past week diving deep into best practices and guidance that help make Android apps faster, smaller, and more stable. From the foundational power of the R8 optimizer and Profile Guided Optimization, to performance improvements in Jetpack Compose, to a new guide on leveling up your app's performance, we've covered the low-effort, high-impact tools you need to build a performant app.
This post serves as your index and roadmap to revisit these resources whenever you need to optimize. Here are the five key takeaways from our journey together.
Use the R8 optimizer to speed up your app
The single most impactful, low-effort change you can make is fully enabling the R8 optimizer. It doesn't just reduce app size; it performs deep, whole-program optimizations to fundamentally rewrite your code for efficiency. Revisit your Keep Rules and get R8 back into your engineering tasks.
Our newly updated and expanded documentation on the R8 optimizer is here to help.
Reddit observed a 40% faster cold startup and 30% fewer ANR errors after enabling R8 full mode.
You can read the full case study on our blog.
Engineers at Disney+ invest in app performance and are optimizing the app's user experience. Sometimes even seemingly small changes can make a huge impact. While inspecting their R8 configuration, the team found that the -dontoptimize flag was being used. After enabling optimizations by removing this flag, the Disney+ team saw significant improvements in their app's performance.
So next time someone asks you what you could do to improve app performance, just link them to this post.
Read more in our Day 1 blog: Use R8 to shrink, optimize, and fast-track your app
Guiding you to better performance
Baseline Profiles effectively remove the need for Just-in-Time compilation, improving startup speed, scrolling, animation, and overall rendering performance. Startup Profiles make app startup even more lightweight by bringing an intelligent order to your app's classes.dex files.
And to learn more about just how important Baseline Profiles are for app performance, read Meta's engineering blog where they shared how Baseline Profiles improved various critical performance metrics by up to 40% across their apps.
We continue to make Jetpack Compose more performant for you in Jetpack Compose 1.10. Features like pausable composition and the customizable cache window are crucial for maintaining zero scroll jank when dealing with complex list items.
Take a look at the latest episode of #TheAndroidShow where we explain this in more detail.
Read more in Wednesday's blog: Deeper Performance Considerations
Measuring performance can be as easy as 1, 2, 3
You can't manage what you don't measure. Our Performance Leveling Guide breaks down your measurement journey into five steps, starting with easily available data and building up to advanced local tooling.
Starting at level 1, we'll teach you how to use readily available data from Android Vitals, which provides you with field data on ANRs, crashes, and excessive battery usage.
We'll also teach you how to level up. For example, we'll demonstrate how to reach level 3 with local performance testing using Jetpack Macrobenchmark and the new UiAutomator 2.4 API to accurately measure and verify any change in your app's performance.
Read more in Thursday's blog
Debugging performance just got an upgrade
Advanced optimization shouldn't mean unreadable crash reports. New features are designed to help you confidently debug R8 and background work:
Automatic Logcat Retrace
Starting in Android Studio Narwhal, stack traces can automatically be de-obfuscated in the Logcat window. This way you can immediately see and debug any crashes in a production-ready build.
Narrow Keep Rules
On Tuesday we demystified the Keep Rules needed to fix runtime crashes, emphasizing writing specific, member-level rules over overly-broad wildcards. And because it's an important topic, we made you a video as well.
And with the new lint check for wide Keep Rules, the Android Studio Otter 3 Feature Drop has you covered here as well.
We also released new guidance on testing and troubleshooting your R8 configuration to help you get the configuration right with confidence.
Read more in Tuesday's blog: Configure and troubleshoot R8 Keep Rules
Background Work
We shared guidance on debugging common scenarios you may encounter when scheduling tasks with WorkManager.
Background Task Inspector gives you a visual representation and graph view of WorkManager tasks, helping debug why scheduled work is delayed or failed. And our refreshed Background Work documentation landing page highlights task-specific APIs that are optimized for particular use cases, helping you achieve more reliable execution.
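To make the scheduling side concrete, here is a minimal sketch of enqueuing constrained work with WorkManager. The `SyncWorker` class and the network constraint are illustrative assumptions, not part of the guidance above:

```kotlin
import android.content.Context
import androidx.work.Constraints
import androidx.work.CoroutineWorker
import androidx.work.NetworkType
import androidx.work.OneTimeWorkRequestBuilder
import androidx.work.WorkManager
import androidx.work.WorkerParameters

// Hypothetical worker, used for illustration only.
class SyncWorker(context: Context, params: WorkerParameters) :
    CoroutineWorker(context, params) {
    override suspend fun doWork(): Result = Result.success()
}

fun scheduleSync(context: Context) {
    // Constraints let WorkManager defer the task until conditions are met;
    // unmet constraints are a common reason work looks "delayed" when you
    // inspect it with the Background Task Inspector.
    val request = OneTimeWorkRequestBuilder<SyncWorker>()
        .setConstraints(
            Constraints.Builder()
                .setRequiredNetworkType(NetworkType.CONNECTED)
                .build()
        )
        .build()
    WorkManager.getInstance(context).enqueue(request)
}
```

If a request like this sits in the ENQUEUED state longer than expected, the inspector's details view can help you see which conditions it is still waiting on.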
Read more in Wednesday's blog: Background work performance considerations
Performance optimization is an ongoing journey
If you successfully took our challenge to enable R8 full mode this week, your next step is to integrate performance into your product roadmap using the App Performance Score. This standardized framework helps you find the highest leverage action items for continuous improvement.
We capped off the week with the #AskAndroid Live Q&A session, where engineers answered your toughest questions on R8, Profile Guided Optimizations, and more. If you missed it, look for the replay!
21 Nov 2025 5:00pm GMT
20 Nov 2025
Android Developers Blog
Leveling Guide for your Performance Journey

Posted by Alice Yuan - Senior Developer Relations Engineer

Welcome to day 4 of Performance Spotlight Week. Now that you've learned about some of the awesome tools and best practices we've introduced recently such as the R8 Optimizer, and Profile Guided Optimization with Baseline Profiles and Startup Profiles, you might be wondering where to start your performance improvement journey.
We've come up with a step-by-step performance leveling guide to meet your mobile development team where you are, whether you're an app with a single developer looking to get started with performance or you have an entire team dedicated to improving Android performance.
The performance leveling guide features 5 levels. We'll start with level 1, which introduces performance tooling that requires minimal adoption effort, and go up to level 5, ideal for apps that have the resources to maintain a bespoke performance framework.
Feel free to jump to the level that resonates most with you:
Level 1: Use Play Console provided field monitoring
We recommend first leveraging Android vitals within the Play Console for viewing automatically collected field monitoring data, giving you insights about your application with minimal effort.
Android vitals is Google's initiative to automatically collect and surface this field data for you.
Here's an explanation of how we deliver this data:
- Collect Data: When a user opts in, their Android device automatically logs key performance and stability events from all apps, including yours.
- Aggregate Data: Google Play collects and anonymizes this data from your app's users.
- Surface Insights: The data is presented to you in the Android vitals dashboard within your Google Play Console.
The Android vitals dashboard tracks many metrics, but a few are designated as Core Vitals. These are the most important because they can affect your app's visibility and ranking on the Google Play Store.
The Core Vitals
GOOGLE PLAY'S CORE TECHNICAL QUALITY METRICS
To maximize visibility on Google Play, keep your app below the bad behavior thresholds for these metrics.

| Metric | Description |
|---|---|
| User-perceived crash rate | The percentage of daily active users who experienced at least one crash that is likely to have been noticeable |
| User-perceived ANR rate | The percentage of daily active users who experienced at least one ANR that is likely to have been noticeable |
| Excessive battery usage | The percentage of watch face sessions where battery usage exceeds 4.44% per hour |
| New: Excessive partial wake locks | The percentage of user sessions where cumulative, non-exempt wake lock usage exceeds 2 hours |
The core vitals include user-perceived crash rate, ANR rate, excessive battery usage and the newly introduced metric on excessive partial wake locks.
User-Perceived ANR Rate
You can use the Android vitals ANR dashboard to see stack traces of issues that occur in the field, and get insights and recommendations on how to fix them.

You can drill down into a specific ANR that occurred, to see the stack trace as well as insights on what might be causing the issue.
Also, check out our ANR guidance to help you diagnose and fix the common scenarios where ANRs might occur.
User-Perceived Crash Rate
Use the Android vitals crash dashboard to further debug crashes and view a sample of stack traces that occur within your app.
Our documentation also has guidance around troubleshooting specific crashes. For example, the Troubleshoot foreground services guide discusses ways to identify and fix common scenarios where crashes occur.
Excessive Battery Usage
To decrease watch face sessions with excessive battery usage on Wear OS, check out the Wear guide on how to improve and conserve battery.
New: Excessive Partial Wake Locks
We recently announced that apps that exceed the excessive partial wake locks threshold may see additional treatment starting on March 1st 2026.
For mobile devices, the Android vitals metric applies to non-exempted wake locks acquired while the screen is off and the app is in the background or running a foreground service. Android vitals considers partial wake lock usage excessive if wake locks are held for at least two hours within a 24-hour period and it affects more than 5% of your app's sessions, averaged over 28 days.
To debug and fix excessive wake lock issues, check out our technical blog post.
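As a rough, illustrative model of the thresholds described above, the check looks something like the following. This is purely a sketch of the stated criteria; the real metric is computed by Android vitals from aggregated field data over 28 days, not inside your app:

```kotlin
// Illustrative only: models the stated thresholds, not the actual
// Android vitals pipeline, which aggregates field data server-side.
data class SessionWakeLockStats(val wakeLockHeldHours: Double)

// A session counts as excessive if non-exempt wake locks were held
// for at least two hours within its 24-hour window.
fun isExcessiveSession(s: SessionWakeLockStats): Boolean =
    s.wakeLockHeldHours >= 2.0

// The metric is considered bad behavior when more than 5% of sessions
// are excessive (averaged over 28 days in the real metric).
fun exceedsThreshold(sessions: List<SessionWakeLockStats>): Boolean {
    if (sessions.isEmpty()) return false
    val excessive = sessions.count(::isExcessiveSession)
    return excessive.toDouble() / sessions.size > 0.05
}
```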
Consult our Android vitals documentation and continue your journey to better leverage Android vitals.
Level 2: Follow the App Performance Score action items
Next, move on to using the App Performance Score to find the high-leverage action items to level up your app performance.
The Android App Performance Score is a standardized framework to measure your app's technical performance. It gives you a score between 0 and 100, where a lower number indicates more room for improvement.
To get easy wins, start with the Static Performance Score. These are often configuration changes or tooling updates that provide significant performance boosts.
Step 1: Perform the Static Assessment
The static assessment evaluates your project's configuration and tooling adoption. These are often the quickest ways to improve performance.
Navigate to the Static Score section of the scoreboard page and do the following:
- Assess your Android Gradle Plugin (AGP) version.
- Adopt R8 minification incrementally or, ideally, use R8 in full mode to minify and optimize the app code.
- Adopt Baseline Profiles, which improve code execution speed from the first launch, providing performance enhancements for every new app install and every app update.
- Adopt Startup Profiles to improve dex layout. Startup Profiles are used by the build system to further optimize the classes and methods they contain by improving the layout of code in your APK's DEX files.
- Upgrade to the newest version of Jetpack Compose.
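As an illustration of the R8-related static items, a release build type with minification and resource shrinking enabled might look like this in a module's build.gradle.kts. This is a sketch; the rules file name is the conventional default and should match your project:

```kotlin
// Module-level build.gradle.kts (sketch; adjust to your project setup).
android {
    buildTypes {
        release {
            // Enable R8 minification and optimization. With recent AGP
            // versions, R8 runs in full mode by default unless disabled.
            isMinifyEnabled = true
            // Remove unused resources after code shrinking.
            isShrinkResources = true
            proguardFiles(
                getDefaultProguardFile("proguard-android-optimize.txt"),
                "proguard-rules.pro"
            )
        }
    }
}
```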
Step 2: Perform the Dynamic Assessment
Once you have applied the static easy wins, use the dynamic assessment to validate the improvements on a real device. You can first do this manually with a physical device and a stopwatch.
Navigate to the Dynamic Score section of the scoreboard page and do the following:
- Set up your test environment with a physical device. Consider using a lower-end device to exaggerate performance issues, making them easier to spot.
- Measure startup time from the launcher. Cold start your app from the launcher icon and measure the time until it is interactive.
- Measure app startup time from a notification, with the goal of reducing notification startup time to below a couple of seconds.
- Measure rendering performance by scrolling through your core screens and animations.
Once you've completed these steps, you will receive a score between 0 and 100 for both the static and dynamic assessments, giving you an understanding of your app's performance and where to focus.
Level 3: Leverage local performance test frameworks
Once you've started to assess dynamic performance, you may find it too tedious to measure performance manually. Consider automating your performance testing using performance test frameworks such as Macrobenchmarks and UiAutomator.
Macrobenchmark 💚 UiAutomator
Think of Macrobenchmark and UiAutomator as two tools that work together:
- Macrobenchmark is the measurement tool. It's like a stopwatch and a frame-rate counter that runs outside your app. It is responsible for starting your app, recording metrics (like startup time or dropped frames), and stopping the app.
- UiAutomator is the robot user. The library lets you write code to interact with the device's screen. It can find an icon, tap a button, scroll a list, and more.
How to write a test
When you write a test, you wrap your UiAutomator code inside a Macrobenchmark block.
- Define the Test: Use the MacrobenchmarkRule.
- Start Measuring: Call benchmarkRule.measureRepeated.
- Drive the UI: Inside that block, use UiAutomator code to launch your app, find UI elements, and interact with them.
Here's an example code snippet of what it looks like to test a compose list for scrolling jank.
benchmarkRule.measureRepeated(
    // ...
    metrics = listOf(
        FrameTimingMetric(),
    ),
    startupMode = StartupMode.COLD,
    iterations = 10,
) {
    // 1. Launch the app's main activity
    startApp()
    // 2. Find the list using its resource ID and scroll down
    onElement { viewIdResourceName == "$packageName.my_list" }
        .fling(Direction.DOWN)
}
- Review the results: Each test run provides you with precisely measured information to give you the best data on your app's performance.
timeToInitialDisplayMs min 1894.4, median 2847.4, max 3355.6
frameOverrunMs P50 -3.2, P90 6.2, P95 10.4, P99 119.5
Common use cases
Macrobenchmark provides several core metrics out of the box. StartupTimingMetric allows you to accurately measure app startup. The FrameTimingMetric enables you to understand an app's rendering performance during the test.
We have a detailed and complete guide to using Macrobenchmarks and UiAutomator alongside code samples available for you to continue learning.
Level 4: Use trace analysis tools like Perfetto
Trace analysis tools like Perfetto are used when you need to see beyond your own application code. Unlike standard debuggers or profilers that only see your process, Perfetto captures the entire device state, including kernel scheduling, CPU frequency, other processes, and system services, giving you complete context for performance issues.
Check out our Performance Debugging YouTube playlist for video instructions on performance debugging using system traces, Android Studio Profiler, and Perfetto.
How to use Perfetto to debug performance
The general workflow for debugging performance using trace analysis tools is to record, load and analyze the trace.
Step 1: Record a trace
You can record a system trace using several methods:
- Recording a trace manually on the device, directly from the developer options.
- Using the Android Studio CPU Profiler.
- Using the Perfetto UI.
Step 2: Load the trace
Once you have the trace file, you need to load it into the analysis tool.
- Open Chrome and navigate to ui.perfetto.dev.
- Drag and drop your .perfetto-trace (or .pftrace) file directly into the browser window.
- The UI will process the file and display the timeline.
Step 3: Analyze the trace
You can use Perfetto UI or Android Studio Profiler to investigate performance issues. Check out this episode of the MAD Skills series on Performance, where our performance engineer Carmen Jackson discusses the Perfetto traceviewer.
Scenarios for inspecting system traces using Perfetto
Perfetto is an expert tool and can provide information about everything that happened on the Android device while a trace was captured. This is particularly helpful when you cannot identify the root cause of a slowdown using standard logs or basic profilers.
Debugging Jank (Dropped Frames)
If your app stutters while scrolling, Perfetto can show you exactly why a specific frame missed its deadline.
If it's due to the app, you might see your main thread running for a long duration doing heavy parsing; this indicates scenarios where you should move the work into asynchronous processing.
If it's due to the system, you might see your main thread ready to run, but the CPU kernel scheduler gave priority to a different system service, leaving your app waiting (CPU contention). This indicates scenarios where you may need to optimize usage of platform APIs.
Analyzing Slow App Startup
Startup is complex, involving system init, process forking, and resource loading. Perfetto visualizes this timeline precisely.
You can see if you are waiting on Binder calls (inter-process communication). If your onCreate waits a long time for a response from the system PackageManager, Perfetto will show that blocked state clearly.
You can also see if your app is doing more work than necessary during the app startup. For example, if you are creating and laying out more views than the app needs to show, you can see these operations in the trace.
Investigating Battery Drain & CPU Usage
Because Perfetto sees the whole system, it's perfect for finding invisible power drains.
You can identify which processes are holding wake locks, preventing the device from sleeping under the "Device State" tracks. Learn more in our wake locks blog post. Also, use Perfetto to see if your background jobs are running too frequently or waking up the CPU unnecessarily.
Level 5: Build your own performance tracking framework
The final level is for apps whose teams have the resources to maintain a performance tracking framework.
Building a custom performance tracking framework on Android involves leveraging several system APIs to capture data throughout the application lifecycle, from startup to exit, and during specific high-load scenarios.
By using ApplicationStartInfo, ProfilingManager, and ApplicationExitInfo, you can create a robust telemetry system that reports on how your app started, detailed info on what it did while running, and why it died.
ApplicationStartInfo: Tracking how the app started
Available from Android 15 (API 35), ApplicationStartInfo provides detailed metrics about app startup in the field. The data includes whether it was a cold, warm, or hot start, and the duration of different startup phases.
This helps you establish a baseline startup metric from production data and optimize startup scenarios that might be hard to reproduce locally. You can use these metrics to run A/B tests on your startup flow.
The goal is to accurately record launch metrics without manually instrumenting every initialization phase.
You can query this data lazily some time after application launch.
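As a hedged sketch of such a lazy query (the method and constant names follow the API 35 platform documentation, but treat the details as an approximation; the sample size of 5 is arbitrary):

```kotlin
import android.app.ActivityManager
import android.app.ApplicationStartInfo
import android.content.Context
import android.util.Log

// Call this some time after launch, e.g. from a background task,
// so it doesn't compete with startup work itself.
fun logRecentStarts(context: Context) {
    val am = context.getSystemService(Context.ACTIVITY_SERVICE) as ActivityManager
    // Most recent starts first; 5 is an arbitrary sample size.
    val starts: List<ApplicationStartInfo> = am.getHistoricalProcessStartReasons(5)
    for (start in starts) {
        val type = when (start.startType) {
            ApplicationStartInfo.START_TYPE_COLD -> "cold"
            ApplicationStartInfo.START_TYPE_WARM -> "warm"
            ApplicationStartInfo.START_TYPE_HOT -> "hot"
            else -> "unknown"
        }
        // startupTimestamps maps phase constants to timestamps, letting you
        // compute per-phase durations for your telemetry.
        Log.d("StartInfo", "start=$type timestamps=${start.startupTimestamps}")
    }
}
```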
ProfilingManager: Capturing why it was slow
ProfilingManager (API 35) allows your app to programmatically trigger system traces on user devices. This is powerful for catching transient performance issues in the wild that you can't reproduce locally.
The goal is to automatically record a trace when a specific highly critical user journey is detected as running slowly or experiencing performance issues.
You can register a listener that triggers when specific conditions are met or trigger it manually when you detect a performance issue such as jank, excessive memory, or battery drain.
Check our documentation on how to capture a profile, retrieve and analyze profiling data and use debug commands.
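A heavily hedged sketch of triggering a capture (the exact requestProfiling parameters are simplified here, and the tag string is hypothetical; consult the platform documentation linked above for the real options):

```kotlin
import android.content.Context
import android.os.Bundle
import android.os.ProfilingManager
import android.util.Log
import java.util.concurrent.Executors

// Invoke when you detect a slow critical user journey in the field.
fun captureTraceForSlowJourney(context: Context) {
    val profilingManager = context.getSystemService(ProfilingManager::class.java)
    profilingManager.requestProfiling(
        ProfilingManager.PROFILING_TYPE_SYSTEM_TRACE,
        Bundle(),                 // default capture parameters
        "slow_checkout_journey",  // hypothetical tag to identify this capture
        null,                     // no CancellationSignal
        Executors.newSingleThreadExecutor()
    ) { result ->
        // On success, resultFilePath points at the captured trace for upload.
        Log.d("Profiling", "trace=${result.resultFilePath} error=${result.errorCode}")
    }
}
```

Because captures are rate-limited by the system, gate this behind your own detection logic rather than calling it on every journey.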
ApplicationExitInfo: Tracking why the app died
ApplicationExitInfo (API 30) tells you why your previous process died. This is crucial for finding native crashes, ANRs, or system kills due to excessive memory usage (OOM). You can also get a detailed tombstone trace using the getTraceInputStream API.
The goal of the API is to understand stability issues that don't trigger standard Java crash reporters (like Low Memory Kills).
You should trigger this API on the next app launch.
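A sketch of reading the previous process's exit reasons on the next launch (the record count of 3 is an arbitrary choice; the API calls are from the documented ActivityManager surface):

```kotlin
import android.app.ActivityManager
import android.app.ApplicationExitInfo
import android.content.Context
import android.util.Log

fun logLastExitReasons(context: Context) {
    val am = context.getSystemService(Context.ACTIVITY_SERVICE) as ActivityManager
    // (packageName, pid = 0 for "no filter", maxNum = 3 most recent records)
    val exits = am.getHistoricalProcessExitReasons(context.packageName, 0, 3)
    for (exit in exits) {
        when (exit.reason) {
            ApplicationExitInfo.REASON_ANR -> {
                // Tombstone-style trace of the ANR, when available.
                val trace = exit.traceInputStream?.bufferedReader()?.use { it.readText() }
                Log.w("ExitInfo", "ANR, trace length=${trace?.length ?: 0}")
            }
            ApplicationExitInfo.REASON_LOW_MEMORY ->
                Log.w("ExitInfo", "Killed for low memory")
            else ->
                Log.d("ExitInfo", "reason=${exit.reason} desc=${exit.description}")
        }
    }
}
```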
Next Steps
Improving Android performance is a step-by-step journey. We're so excited to see how you level up your performance using these tools!
Tune in tomorrow for Ask Android
You have shrunk your app with R8, optimized your runtime with Profile Guided Optimization, and measured your app's performance.
Join us tomorrow for the live Ask Android session. Ask your questions now using #AskAndroid and get them answered by the experts.
20 Nov 2025 5:00pm GMT
19 Nov 2025
Android Developers Blog
Jetpack Navigation 3 is stable
Posted by Don Turner - Developer Relations Engineer
Jetpack Navigation 3 version 1.0 is stable 🎉. Go ahead and use it in your production apps today. JetBrains are already using it in their KotlinConf app.
Navigation 3 is a new navigation library built from the ground up to embrace Jetpack Compose state. It gives you full control over your back stack, helps you retain navigation state, and allows you to easily create adaptive layouts (like list-detail). There's even a cross-platform version from JetBrains.
Why a new library?
The original Jetpack Navigation library (now Nav2) was designed 7 years ago and, while it serves its original goals well and has been improved iteratively, the way apps are now built has fundamentally changed.
Reactive programming with a declarative UI is now the norm. Nav3 embraces this approach. For example, NavDisplay (the Nav3 UI component that displays your screens) simply observes a list of keys (each one representing a screen) backed by Compose state and updates its UI when that list changes.
Figure 1. NavDisplay observes changes to a list backed by Compose state.
Nav2 can also make it difficult to have a single source of truth for your navigation state because it has its own internal state. With Nav3, you supply your own state, which gives you complete control.
Lastly, you asked for more flexibility and customizability. Rather than having a single, monolithic API, Nav3 provides smaller, decoupled APIs (or "building blocks") that can be combined together to create complex functionality. Nav3 itself uses these building blocks to provide sensible defaults for well-defined navigation use cases.
This approach allows you to:
- Customize screen animations at both a global and individual level
- Display multiple panes at the same time, and create flexible layouts using the Scenes API
- Easily replace Nav3 components with your own implementations if you want custom behavior
Read more about its design and features in the launch blog.
Migrating from Navigation 2
If you're already using Nav2, specifically Navigation Compose, you should consider migrating to Nav3. To assist you with this, there is a migration guide. The key steps are:
- Add the Navigation 3 dependencies.
- Update your navigation routes to implement NavKey. Your routes don't have to implement this interface to use Nav3, but if they do, you can take advantage of Nav3's rememberNavBackStack function to create a persistent back stack.
- Create classes to hold and modify your navigation state - this is where your back stacks are held.
- Replace NavController with these classes.
- Move your destinations from NavHost's NavGraph into an entryProvider.
- Replace NavHost with NavDisplay.
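The migrated setup can be sketched as follows. This is an illustrative example, not code from the guide: the ProductList/ProductDetail routes and the two screen composables are hypothetical, and exact Nav3 parameter shapes may differ slightly between releases.

```kotlin
import androidx.compose.runtime.Composable
import androidx.navigation3.runtime.NavKey
import androidx.navigation3.runtime.entry
import androidx.navigation3.runtime.entryProvider
import androidx.navigation3.runtime.rememberNavBackStack
import androidx.navigation3.ui.NavDisplay
import kotlinx.serialization.Serializable

// Routes implement NavKey so rememberNavBackStack can persist them.
@Serializable data object ProductList : NavKey
@Serializable data class ProductDetail(val id: String) : NavKey

@Composable
fun App() {
    // You own the back stack: a snapshot-state-backed list of keys.
    val backStack = rememberNavBackStack(ProductList)
    NavDisplay(
        backStack = backStack,
        onBack = { backStack.removeLastOrNull() },
        entryProvider = entryProvider {
            entry<ProductList> {
                // ProductListScreen is a hypothetical composable.
                ProductListScreen(onClick = { id -> backStack.add(ProductDetail(id)) })
            }
            entry<ProductDetail> { key -> ProductDetailScreen(key.id) }
        }
    )
}
```

Note how navigation is just list mutation (backStack.add, removeLastOrNull), which is what gives you the single source of truth described above.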
Experimenting with AI agent migration
You may want to experiment with using an AI agent to read the migration guide and perform the steps on your project. To try this with Gemini in Android Studio's Agent Mode:
- Save this markdown version of the guide into your project.
- Paste this prompt to the agent (but don't hit enter): "Migrate this project to Navigation 3 using ".
- Type @migration-guide.md - this will supply the guide as context to the agent.
As always, make sure you carefully review the changes made by the AI agent - it can make mistakes!
We'd love to hear how you or your agent performed, please send your feedback here.
Tasty navigation recipes for common scenarios
For common but nuanced use cases, we have a recipes repository. This shows how to combine the Nav3 APIs in a particular way, allowing you to choose or modify the recipe to your particular needs. If a recipe turns out to be popular, we'll consider "graduating" the non-nuanced parts of it into the core Nav3 library or add-on libraries.
Figure 2. Useful code recipes can graduate into a library.
There are currently 19 recipes, including for:
- Passing navigation arguments to ViewModels (including using Koin)
- Returning results from screens, both via events and via shared state
We're currently working on a deeplinks recipe, plus a Koin integration, and have plenty of others planned. An engineer from JetBrains has also published a Compose Multiplatform version of the recipes.
If you have a common use case that you'd like to see a recipe for, please file a recipe request.
Summary
To get started with Nav3, check out the docs and the recipes. Plus, keep an eye out for a whole week of technical content including:
- A deep dive video on the API covering modularization, animations and adaptive layouts.
- A live Ask Me Anything (AMA) with the engineers who built Nav3.
Nav3 Spotlight Week starts Dec 1st 2025.
As always, if you find any issues, please file them here.
19 Nov 2025 8:02pm GMT
Stronger threat detection, simpler integration: Protect your growth with the Play Integrity API
Posted by Dom Elliott - Group Product Manager, Google Play and Eric Lynch - Senior Product Manager, Android Security
In the mobile ecosystem, abuse can threaten your revenue, growth, and user trust. To help developers thrive, Google Play offers a resilient threat detection service: the Play Integrity API. It helps you verify that interactions and server requests are genuine, coming from your unmodified app on a certified Android device, installed by Google Play.
The impact is significant: apps using Play integrity features see 80% lower unauthorized usage on average compared to other apps. Today, leaders across diverse categories, including Uber, TikTok, Stripe, Kabam, Wooga, Radar.com, Zimperium, Paytm, and Remini, use it to help safeguard their businesses.
We're continuing to improve the Play Integrity API, making it easier to integrate, more resilient against sophisticated attacks, and better at recovering users who don't meet integrity standards or encounter errors with new Play in-app remediation prompts.
Detect threats to your business
The Play Integrity API offers verdicts designed to detect specific threats that impact your bottom line during critical interactions.
- Unauthorized access: The accountDetails verdict helps you determine whether the user installed or paid for your app or game on Google Play.
- Code tampering: The appIntegrity verdict helps you determine whether you're interacting with your unmodified binary that Google Play recognizes.
- Risky devices and emulated environments: The deviceIntegrity verdict helps you determine whether your app is running on a genuine Play Protect certified Android device or a genuine instance of Google Play Games for PC.
- Unpatched devices: For devices running Android 13 and higher, the MEETS_STRONG_INTEGRITY response in the deviceIntegrity verdict helps you determine if a device has applied recent security updates. You can also opt in to deviceAttributes to include the attested Android SDK version in the response.
- Risky access by other apps: The appAccessRiskVerdict helps you determine whether apps are running that could be used to capture the screen, display overlays, or control the device (for example, by misusing the accessibility permission). This verdict automatically excludes apps that serve genuine accessibility purposes.
- Known malware: The playProtectVerdict helps you determine whether Google Play Protect is turned on and whether it has found risky or dangerous apps installed on the device.
- Hyperactivity: The recentDeviceActivity level helps you determine whether a device has made an anomalously high volume of integrity token requests recently, which could indicate automated traffic and could be a sign of attack.
- Repeat abuse and reused devices: deviceRecall (beta) helps you determine whether you're interacting with a device that you've previously flagged, even if your app was reinstalled or the device was reset. With device recall, you can customize the repeat actions you want to track.
The API can be used across Android form factors including phones, tablets, foldables, Android Auto, Android TV, Android XR, ChromeOS, Wear OS, and on Google Play Games for PC.
Make the most of Play Integrity API
Apps and games have found success with the Play Integrity API by following the security considerations and taking a phased approach to their anti-abuse strategy.
Step 1: Decide what you want to protect: Decide what actions and server requests in your apps and games are important to verify and protect. For example, you could perform integrity checks when a user is launching the app, signing in, joining a multiplayer game, generating AI content, or transferring money.
Step 2: Collect integrity verdict responses: Perform integrity checks at important moments to start collecting verdict data, without enforcement initially. That way you can analyze the responses for your install base and see how they correlate with your existing abuse signals and historical abuse data.
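A hedged sketch of requesting a classic integrity verdict at one of those important moments (the nonce must come from your server; yourServerNonce and sendTokenToServer are placeholders, and the builder calls follow the published Play Integrity client library):

```kotlin
import android.content.Context
import com.google.android.play.core.integrity.IntegrityManagerFactory
import com.google.android.play.core.integrity.IntegrityTokenRequest

fun checkIntegrity(context: Context, yourServerNonce: String) {
    val integrityManager = IntegrityManagerFactory.create(context)
    integrityManager
        .requestIntegrityToken(
            IntegrityTokenRequest.builder()
                .setNonce(yourServerNonce) // server-generated, single-use
                .build()
        )
        .addOnSuccessListener { response ->
            // Send the token to your server; decrypt and verify the verdict
            // there, never on the device.
            sendTokenToServer(response.token()) // hypothetical helper
        }
        .addOnFailureListener { e ->
            // Log the error code; many are remediable (see the prompts below).
        }
}
```

During the collection phase described in Step 2, your server can simply record the decrypted verdicts alongside existing abuse signals rather than enforcing on them.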
NEW: Let Play recover users with issues automatically
Deciding how to respond to different integrity signals can be complex: you need to handle various integrity responses and API error codes (like network issues or outdated Play services). We're simplifying this with new Play in-app remediation prompts. You can show a Google Play prompt to your users to automatically fix a wide range of issues directly within your app. This reduces integration complexity, ensures a consistent user interface, and helps get more users back to a good state.
GET_INTEGRITY automatically detects the issue (in this example, a network error) and resolves it.
You can trigger the GET_INTEGRITY dialog, available in Play Integrity API library version 1.5.0+, after a range of issues to automatically guide the user through the necessary fixes including:
- Unauthorized access: GET_INTEGRITY guides the user back to a Play licensed response in accountDetails.
- Code tampering: GET_INTEGRITY guides the user back to a Play recognized response in appIntegrity.
- Device integrity issues: GET_INTEGRITY guides the user on how to get back to the MEETS_DEVICE_INTEGRITY state in deviceIntegrity.
- Remediable error codes: GET_INTEGRITY resolves remediable API errors, such as prompting the user to fix network connectivity or update Google Play Services.
We also offer specialized dialogs including GET_STRONG_INTEGRITY (which works like GET_INTEGRITY while also getting the user back to the MEETS_STRONG_INTEGRITY state with no known malware issues in the playProtectVerdict), GET_LICENSED (which gets the user back to a Play licensed and Play recognized state), and CLOSE_UNKNOWN_ACCESS_RISK and CLOSE_ALL_ACCESS_RISK (which prompt the user to close potentially risky apps).
Choose modern integrity solutions
In addition to Play Integrity API, Google offers several other features to consider as part of your overall anti-abuse strategy. Both Play Integrity API and Play's automatic protection offer user experience and developer benefits for safeguarding app distribution. We encourage existing apps to migrate to these modern integrity solutions instead of using the legacy Play licensing library.
Automatic protection: Prevent unauthorized access with Google Play's automatic protection and ensure users continue getting your official app updates. Turn it on and Google Play will automatically add an installer check to your app's code, with no developer integration work required. If your protected app is redistributed or shared through another channel, then the user will be prompted to get your app from Google Play. Eligible Play developers also have access to Play's advanced anti-tamper protection, which uses obfuscation and runtime checks to make it harder and costlier for attackers to modify and redistribute protected apps.
Safeguard your business today
With a strong foundation in hardware-backed security and new automated remediation dialogs simplifying integration, the Play Integrity API is an essential tool for protecting your growth.
Get started with the Play Integrity API documentation.
19 Nov 2025 6:11pm GMT
Deeper Performance Considerations

Posted by Ben Weiss - Senior Developer Relations Engineer, Breana Tate - Developer Relations Engineer, Jossi Wolf - Software Engineer on Compose

Compose yourselves and let us guide you through more background on performance.
Welcome to day 3 of Performance Spotlight Week. Today we're continuing to share details and guidance on important areas of app performance. We're covering Profile Guided Optimization, Jetpack Compose performance improvements and considerations on working behind the scenes. Let's dive right in.
Profile Guided Optimization
Baseline Profiles and Startup Profiles are foundational to improve an Android app's startup and runtime performance. They are part of a group of performance optimizations called Profile Guided Optimization.
When an app is packaged, the d8 dexer takes classes and methods and populates your app's classes.dex files. When a user opens the app, these dex files are loaded, one after the other until the app can start. By providing a Startup Profile you let d8 know which classes and methods to pack in the first classes.dex files. This structure allows the app to load fewer files, which in turn improves startup speed.
Baseline Profiles effectively move the Just in Time (JIT) compilation steps away from user devices and onto developer machines. The generated Ahead Of Time (AOT) compiled code has proven to reduce startup time and rendering issues alike.
Trello and Baseline Profiles
We asked engineers on the Trello app how Baseline Profiles affected their app's performance. After applying Baseline Profiles to their main user journey, Trello saw a significant 25% reduction in app startup time.
Trello was able to improve their app's startup time by 25% by using Baseline Profiles.
Baseline Profiles at Meta
Also, engineers at Meta recently published an article on how they are accelerating their Android apps with Baseline Profiles.
Across Meta's apps the teams have seen various critical metrics improve by up to 40% after applying Baseline Profiles.
Technical improvements like these help you improve user satisfaction and business success as well. Sharing this with your product owners, CTOs and decision makers can also help speed up your app's performance.
Get started with Baseline Profiles
To generate either a Baseline or Startup Profile, you write a macrobenchmark test that exercises the app. During the test, profile data is collected, which is then used during app compilation. The tests are written using the new UiAutomator API, which we'll cover tomorrow.
Writing a benchmark like this is straightforward and you can see the full sample on GitHub.
@Test
fun profileGenerator() {
    rule.collect(
        packageName = TARGET_PACKAGE,
        maxIterations = 15,
        stableIterations = 3,
        includeInStartupProfile = true
    ) {
        uiAutomator {
            startApp(TARGET_PACKAGE)
        }
    }
}
Considerations
Start by writing a macrobenchmark test that generates a Baseline Profile and a Startup Profile for the path most traveled by your users. This means the main entry point your users take into your app, which is usually the screen they land on after logging in. Then continue writing more test cases to capture a more complete picture, but only for Baseline Profiles. You do not need to cover everything with a Baseline Profile. Stick to the most used paths and measure performance in the field. More on that in tomorrow's post.
Get started with Profile Guided Optimization
To learn how Baseline Profiles work under the hood, watch this video from the Android Developers Summit:
And check out the Android Build Time episode on Profile Guided Optimization for another in-depth look:
We also have extensive guidance on Baseline Profiles and Startup Profiles available for further reading.
Jetpack Compose performance improvements
The UI framework for Android has seen the engineering team's performance investment pay off. Since version 1.9 of Jetpack Compose, scroll jank has dropped to 0.2% in an internal long-scrolling benchmark test.
These improvements were made possible because of several features packed into the most recent releases.
Customizable cache window
By default, lazy layouts only compose one item ahead of time in the direction of scrolling, and discard items after they scroll off screen. You can now customize how many items are retained, either as a fraction of the viewport or as a dp size. This lets your app perform more work upfront and, with pausable composition enabled, spread that work across the gaps between frames, using the available time more efficiently.
To start using customizable cache windows, instantiate a LazyLayoutCacheWindow and pass it to your lazy list or lazy grid. Measure your app's performance using different cache window sizes, for example 50% of the viewport. The optimal value will depend on your content's structure and item size.
val dpCacheWindow = LazyLayoutCacheWindow(ahead = 150.dp, behind = 100.dp)
val state = rememberLazyListState(cacheWindow = dpCacheWindow)
LazyColumn(state = state) {
    // column contents
}
Pausable composition
This feature allows compositions to be paused, and their work split up over several frames. The APIs landed in 1.9 and it is now used by default in 1.10 in lazy layout prefetch. You should see the most benefit with complex items with longer composition times.
More Compose performance optimizations
In the versions 1.9 and 1.10 of Compose the team also made several optimizations that are a bit less obvious.
Several APIs that use coroutines under the hood have been improved. For example, when using Draggable and Clickable, developers should see faster reaction times and improved allocation counts.
Optimizations in layout rectangle tracking have improved performance of Modifiers like onVisibilityChanged() and onLayoutRectChanged(). This speeds up the layout phase, even when not explicitly using these APIs.
Another performance improvement is using cached values when observing positions via onPlaced().
Prefetch text in the background
Starting with version 1.9, Compose adds the ability to prefetch text on a background thread. This lets you pre-warm caches for faster text layout, which is relevant to app rendering performance. During layout, text is passed into the Android framework, where a word cache is populated. By default this runs on the UI thread. Offloading prefetching and populating the word cache onto a background thread can speed up layout, especially for longer texts. To prefetch on a background thread, pass a custom executor to any composable that uses BasicText under the hood by supplying a LocalBackgroundTextMeasurementExecutor through a CompositionLocalProvider, like so.
val defaultTextMeasurementExecutor = Executors.newSingleThreadExecutor()
CompositionLocalProvider(
    LocalBackgroundTextMeasurementExecutor provides defaultTextMeasurementExecutor
) {
    BasicText("Some text that should be measured on a background thread!")
}
Depending on the text, this can provide a performance boost to your text rendering. To make sure that it improves your app's rendering performance, benchmark and compare the results.
Background work performance considerations
Background Work is an essential part of many apps. You may be using libraries like WorkManager or JobScheduler to perform tasks like:
- Periodically uploading analytics events
- Syncing data between a backend service and a database
- Processing media (e.g., resizing or compressing images)
A key challenge while executing these tasks is balancing performance and power efficiency. WorkManager allows you to achieve this balance. It's designed to be power-efficient, and allow work to be deferred to an optimal execution window influenced by a number of factors, including constraints you specify or constraints imposed by the system.
WorkManager is not a one-size-fits-all solution, though. Android also has a number of power-optimized APIs that are designed specifically with certain common Core User Journeys (CUJs) in mind.
Reference the Background Work landing page for a list of just a few of these, including updating a widget and getting location in the background.
Local Debugging tools for Background Work: Common Scenarios
To debug Background Work and understand why a task may have been delayed or failed, you need visibility into how the system has scheduled your tasks.
To help with this, WorkManager has several related tools to help you debug locally and optimize performance (some of these work for JobScheduler as well)! Here are some common scenarios you might encounter when using WorkManager, and an explanation of tools you can use to debug them.
Debugging why scheduled work is not executing
Scheduled work being delayed or not executing at all can be due to a number of factors, including specified constraints not being met or constraints having been imposed by the system.
The first step in investigating why scheduled work is not running is to confirm the work was successfully scheduled. After confirming the scheduling status, determine whether there are any unmet constraints or preconditions preventing the work from executing.
There are several tools for debugging this scenario.
Background Task Inspector
The Background Task Inspector is a powerful tool integrated directly into Android Studio. It provides a visual representation of all WorkManager tasks and their associated states (Running, Enqueued, Failed, Succeeded).
To debug why scheduled work is not executing with the Background Task Inspector, consult the listed Work status(es). An 'Enqueued' status indicates your Work was scheduled, but is still waiting to run.
Benefits: Aside from providing an easy way to view all tasks, this tool is especially useful if you have chained work. The Background Task inspector offers a graph view that can visualize if a previous task failing may have impacted the execution of the following task.
Background Task Inspector list view
Background Task Inspector graph view
adb shell dumpsys jobscheduler
This command returns a list of all active JobScheduler jobs (which includes WorkManager Workers) along with specified constraints, and system-imposed constraints. It also returns job history.
Use this if you want a different way to view your scheduled work and associated constraints. For WorkManager versions earlier than 2.10.0, adb shell dumpsys jobscheduler will return a list of Workers with this name:
[package name]/androidx.work.impl.background.systemjob.SystemJobService
If your app has multiple workers, updating to WorkManager 2.10.0 will allow you to see Worker names and easily distinguish between workers:
#WorkerName#@[package name]/androidx.work.impl.background.systemjob.SystemJobService
Benefits: This command is useful for understanding if there were any system-imposed constraints, which you cannot determine with the Background Task Inspector. For example, this will return your app's standby bucket, which can affect the window in which scheduled work completes.
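Because the full dump covers every app on the device, it helps to filter the output to your own package. A minimal sketch (com.example.app is a placeholder package name, and the -A context size is arbitrary):

```shell
# Dump the scheduler state and keep only the sections mentioning your app.
adb shell dumpsys jobscheduler | grep -A 40 "com.example.app"
```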
Enable Debug logging
You can enable custom logging to see verbose WorkManager logs, which will have WM- attached.
Benefits: This gives you visibility into when work is scheduled, when constraints are fulfilled, and into lifecycle events, and you can consult these logs while developing your app.
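Enabling this logging can be sketched as a custom WorkManager Configuration (this assumes on-demand initialization, i.e. the default initializer is disabled in your manifest; MyApplication is a placeholder class name):

```kotlin
import android.app.Application
import android.util.Log
import androidx.work.Configuration

class MyApplication : Application(), Configuration.Provider {
    override val workManagerConfiguration: Configuration
        get() = Configuration.Builder()
            // Emits the verbose, WM-prefixed logs described above.
            .setMinimumLoggingLevel(Log.DEBUG)
            .build()
}
```

Remember to lower the logging level again (or gate it on a debug build flag) before shipping, since verbose logging has its own overhead.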
WorkInfo.StopReason
If you notice unpredictable performance with a specific worker, you can programmatically observe the reason your worker was stopped on the previous run attempt with WorkInfo.getStopReason.
It's a good practice to configure your app to observe WorkInfo using getWorkInfoByIdFlow, to identify whether your work is affected by background restrictions, constraints, frequent timeouts, or was stopped by the user.
Benefits: You can use WorkInfo.StopReason to collect field data about your workers' performance.
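A sketch of such an observer (workRequestId is assumed to be the ID of an already-enqueued WorkRequest; stopReason requires WorkManager 2.9.0 or later):

```kotlin
import android.util.Log
import androidx.work.WorkInfo
import androidx.work.WorkManager
import java.util.UUID

suspend fun observeStopReason(workManager: WorkManager, workRequestId: UUID) {
    workManager.getWorkInfoByIdFlow(workRequestId).collect { info ->
        val reason = info?.stopReason ?: return@collect
        if (reason != WorkInfo.STOP_REASON_NOT_STOPPED) {
            // e.g. a timeout or an unmet connectivity constraint; record this
            // in your field telemetry to spot systemic patterns.
            Log.d("WorkDebug", "Worker stopped, reason=$reason")
        }
    }
}
```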
Debugging WorkManager-attributed high wake lock duration flagged by Android vitals
Android vitals features an excessive partial wake locks metric, which highlights wake locks contributing to battery drain. You may be surprised to learn that WorkManager acquires wake locks to execute tasks, and if wake lock duration exceeds the threshold set by Google Play, this can affect your app's visibility. How can you debug why so much wake lock duration is attributed to your work? You can use the following tools.
Android vitals dashboard
First confirm in the Android vitals excessive wake lock dashboard that the high wake lock duration is from WorkManager and not an alarm or other wake lock. You can use the Identify wake locks created by other APIs documentation to understand which wake locks are held due to WorkManager.
Perfetto
Perfetto is a tool for analyzing system traces. When using it for debugging WorkManager specifically, you can view the "Device State" section to see when your work started, how long it ran, and how it contributes to power consumption.
Under "Device State: Jobs" track, you can see any workers that have been executed and their associated wake locks.
Device State section in Perfetto, showing CleanupWorker and BlurWorker execution.
Resources
Consult the Debug WorkManager page for an overview of the available debugging methods for other scenarios you might encounter.
And to try some of these methods hands on and learn more about debugging WorkManager, check out the Advanced WorkManager and Testing codelab.
Next steps
Today we moved beyond code shrinking and explored how the Android Runtime and Jetpack Compose actually render your app. Whether it's pre-compiling critical paths with Baseline Profiles or smoothing out scroll states with the new Compose 1.9 and 1.10 features, these tools focus on the feel of your app. And we dove deep into best practices on debugging background work.
Ask Android
On Friday we're hosting a live AMA on performance. Ask your questions now using #AskAndroid and get them answered by the experts.
The challenge
We challenged you on Monday to enable R8. Today, we are asking you to generate one Baseline Profile for your app.
With Android Studio Otter, the Baseline Profile Generator module wizard makes this easier than ever. Pick your most critical user journey, even if it's just your app startup and login, and generate a profile.
Once you have it, run a Macrobenchmark to compare CompilationMode.None vs. CompilationMode.Partial.
Share your startup time improvements on social media using #optimizationEnabled.
Tune in tomorrow
You have shrunk your app with R8 and optimized your runtime with Profile Guided Optimization. But how do you prove these wins to your stakeholders? And how do you catch regressions before they hit production?
Join us tomorrow for Day 4: The Performance Leveling Guide, where we will map out exactly how to measure your success, from field data in Play Vitals to deep local tracing with Perfetto.
19 Nov 2025 5:00pm GMT
18 Nov 2025
Android Developers Blog
How Uber is reducing manual logins by 4 million per year with the Restore Credentials API
Posted by Niharika Arora - Senior Developer Relations Engineer at Google, Thomás Oliveira Horta - Android Engineer at Uber
Uber is the world's largest ridesharing company, getting millions of people from here to there while also supporting food delivery, healthcare transportation, and freight logistics. Simplicity of access is crucial to its success; when users switch to a new device, they expect a seamless transition without needing to log back into the Uber app or go through SMS-based one-time password authentication. This frequent device turnover presents a challenge, as well as an opportunity for strong user retention.
To maintain user continuity, Uber's engineers turned to the Restore Credentials feature, an essential tool for a time when 40% of people in the United States replace their smartphone every year. Following an assessment of user demand and code prototyping, they introduced Restore Credentials support in the Uber rider app. To validate that restoring credentials helps remove friction for re-logins, the Uber team ran a successful A/B experiment for a five-week period. The integration led to a reduction in manual logins that, when projected across Uber's massive user base, is estimated to eliminate 4 million manual logins annually.
Eliminating login friction with Restore Credentials

The Restore Credentials API eliminates the multi-step manual sign in process on new devices.
There were past attempts at account restoration on new devices using solutions like regular data backup and BlockStore, but both required transferring authentication tokens directly from the source device to the destination device. Because token information is highly sensitive, these solutions were used only in a limited way: to pre-fill login fields on the destination device and reduce some friction during the sign-in flow. Passkeys also provide a secure and fast login method, but their user-initiated nature limits their impact on seamless device transitions.
"Some users don't use the Uber app on a daily basis, but they expect it will just work when they need it," said Thomás Oliveira Horta, an Android engineer at Uber. "Finding out you're logged out just as you open the app to request a ride on your new Android phone can be an unpleasant, off-putting experience."
With Restore Credentials, the engineers were able to bridge this gap. The API generates a unique token on the old device, which is seamlessly and silently moved to the new device when the user restores their app data during the standard onboarding process. This process leverages Android OS's native backup and restore mechanism, ensuring the safe transfer of the restore key along with the app's data. The streamlined approach guarantees a simple and safe account transfer, meeting Uber's security requirements without any additional user input or development overhead.
Note: Restore keys and passkeys use the same underlying server implementation. However, when you save them in your database, you must differentiate between them. This distinction is crucial because user-created passkeys can be managed directly by the user, while restore keys are system-managed and hidden from the user interface.
"With the adoption of Restore Credentials on Uber's rider app, we started seeing consistent usage," Thomás said. "An average of 10,000 unique daily users have signed in with Restore Credentials in the current rollout stage, and they've enjoyed a seamless experience when opening the app for the first time on a new device. We expect that number to double once we expand the rollout to our whole userbase."
Implementation Considerations
"Integration was pretty easy with minor adjustments on the Android side by following the sample code and documentation," Thomás said. "Our app already used Credential Manager for passkeys, and the backend required just a couple of small tweaks. Therefore, we simply needed to update the Credential Manager dependency to its latest version to get access to the new Restore Credentials API. We created a restore key via the same passkey creation flow and when our app is launched on a new device, the app proactively checks for this key by attempting a silent passkey retrieval. If the restore key is found, it is immediately utilized to automatically sign the user in, bypassing any manual login."
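The flow Thomás describes can be sketched roughly as follows. This is not Uber's code: it assumes the Restore Credentials API from androidx.credentials 1.5+, and the requestJson parameters are placeholders for the WebAuthn challenges your backend would issue:

```kotlin
import android.content.Context
import androidx.credentials.CreateRestoreCredentialRequest
import androidx.credentials.CredentialManager
import androidx.credentials.GetCredentialRequest
import androidx.credentials.GetRestoreCredentialOption
import androidx.credentials.RestoreCredential

// After a successful regular sign-in, register a restore key.
// requestJson is a WebAuthn creation challenge from your backend.
suspend fun registerRestoreKey(context: Context, requestJson: String) {
    val manager = CredentialManager.create(context)
    manager.createCredential(context, CreateRestoreCredentialRequest(requestJson))
}

// On first launch on a new device, silently probe for a restored key.
// Returns the WebAuthn assertion to send to your backend, or null if
// no restore key came over with the app's backup data.
suspend fun trySilentSignIn(context: Context, requestJson: String): String? {
    val manager = CredentialManager.create(context)
    return runCatching {
        val result = manager.getCredential(
            context,
            GetCredentialRequest(listOf(GetRestoreCredentialOption(requestJson))),
        )
        (result.credential as? RestoreCredential)?.authenticationResponseJson
    }.getOrNull()
}
```

If trySilentSignIn returns a response, verify it server-side like a passkey assertion and establish the session without ever showing a login screen.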
Throughout the development process, Uber's engineers navigated a few implementation challenges, from choosing the right entry point to managing the credential lifecycle on the backend.
Choosing the Restore Credentials entry point
The engineers carefully weighed the tradeoffs between a perfectly seamless user experience and implementation simplicity when selecting which Restore Credentials entry point to use for recovery. Ultimately, they prioritized a solution that offered an ideal balance.
"This can take place during App Launch or in the background during device restoration and setup, using BackupAgent," Thomás said. "The background login entry point is more seamless for the user, but it presented challenges with background operations and required usage of the BackupAgent API, which would have led to increased complexity in a codebase as large as Uber's." They decided to implement the feature during the first app launch, which was significantly faster than the manual login.
Addressing server-side challenges
A few server-side challenges arose during integration with the backend WebAuthn APIs, as their design assumed user verification would always be required, and that all credentials would be listed in a user's account settings; neither of these assumptions worked for the non-user-managed Restore Credential keys.
The Uber team resolved this by making minor changes to the WebAuthn services, creating new credential types to distinguish passkeys from Restore Credentials and process them appropriately.
Managing the Restore Credentials lifecycle
Uber's engineers faced several challenges in managing the credential keys on the backend, with specialized support from backend engineer Ryan O'Laughlin:
- Preventing orphaned keys: A significant challenge was defining a strategy for deleting registered public keys to prevent them from becoming "orphaned." For example, uninstalling the app deletes the local credential, but because this action doesn't signal the backend, it leaves an unused key on the server.
- Balancing key lifespan: Keys needed a "time to live" that was long enough to handle edge cases. For example, if a user goes through a backup and restore, then manually logs out from the old device, the key is deleted from that old device. However, the key must remain valid on the server so the new device can still use it.
- Supporting multiple devices: Since a user might have multiple devices (and could initiate a backup and restore from any of them), the backend needed to support multiple Restore Credentials per user, one for each device.
Uber's engineers addressed these challenges by establishing rules for server-side key deletion based on new credential registration and credential usage.
The feature went from design to delivery in a rapid two-month development and testing process. Afterward, a five-week A/B experiment to validate the feature with users went smoothly and yielded undeniable results.
Preventing user drop-off with Restore Credentials
By eliminating manual logins on new devices, Uber retained users who might have otherwise abandoned the sign-in flow on a new device. This boost in customer ease was reflected in a wide array of improvements, and though they may seem slight at a glance, the impact is massive at the scale of Uber's user base:
- 3.4% decrease in manual logins (SMS OTP, passwords, social login).
- 1.2% reduction in expenses for logins requiring SMS OTP.
- 0.575% increase in Uber's access rate (% of devices that successfully reached the app home screen).
- 0.614% rise in devices with completed trips.
Today, Restore Credentials is well on its way to becoming a standard part of Uber's rider app, with over 95% of users in the trial group registered.
[UI flow]
During new device setup, users can restore app data and credentials from a backup. After selecting Uber for restoration and the background process finishes, the app will automatically sign the user in on the new device's first launch.
The invisible yet massive impact of Restore Credentials
In the coming months, Uber plans to expand the integration of Restore Credentials. Projecting from the trial's results, they estimate the change will eliminate 4 million manual logins annually. By simplifying app access and removing a key pain point, they are actively building a more satisfied and loyal customer base, one ride at a time.
"Integrating Google's RestoreCredentials allowed us to deliver the seamless 'it just works' experience our users expect on a new device," said Matt Mueller, Lead Product Manager (Core Identity) at Uber. "This directly translated to a measurable increase in revenue, proving that reducing login friction is key to user engagement and retention."
Ready to enhance your app's login experience?
Learn how to facilitate a seamless login experience when switching devices with Restore Credentials and read more in the blog post. In the latest Android Studio Otter canary, you can validate your integration: new features help mock the backup and restore mechanisms.
If you are new to Credential Manager, you can refer to our official documentation, codelab and samples for help with integration.
18 Nov 2025 10:00pm GMT
Configure and troubleshoot R8 Keep Rules

Posted by Ajesh R Pai - Developer Relations Engineer & Ben Weiss - Senior Developer Relations Engineer

In modern Android development, shipping a small, fast, and secure application is a fundamental user expectation. The Android build system's primary tool for achieving this is the R8 optimizer, the compiler that handles dead code and resource removal for shrinking, code renaming or minification, and app optimization.
Enabling R8 is a critical step in preparing an app for release, but it requires developers to provide guidance in the form of "Keep Rules."
After reading this article, check out the Performance Spotlight Week video on enabling, debugging and troubleshooting the R8 optimizer on YouTube.
Why Keep Rules are needed
The need to write Keep Rules stems from a core conflict: R8 is a static analysis tool, but Android apps often rely on dynamic execution patterns like reflection or calls in and out of native code using the JNI (Java Native Interface).
R8 builds a graph of used code by analyzing direct calls. When code is accessed dynamically, R8's static analysis cannot see the reference, so it identifies that code as unused and removes it, leading to runtime crashes.
A keep rule is an explicit instruction to the R8 compiler, stating: "This specific class, method, or field is an entry point that will be accessed dynamically at runtime. You must keep it, even if you cannot find a direct reference to it."
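For example, a reflective lookup leaves no direct reference for R8 to follow (the class name here is illustrative):

```kotlin
// R8 only sees a string literal here, not a reference to the class,
// so without a keep rule com.example.MyModel would be stripped or renamed
// and this call would crash with ClassNotFoundException at runtime.
val clazz = Class.forName("com.example.MyModel")
val instance = clazz.getDeclaredConstructor().newInstance()
```

A matching rule such as -keep class com.example.MyModel { <init>(); } preserves exactly the entry point the reflection needs, and nothing more.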
See the official guide for more details on Keep Rules.
Where to write Keep Rules
Custom Keep Rules for an application are written in a text file. By convention, this file is named proguard-rules.pro and is located in the root of the app or library module. The file is then referenced in the release build type of your module's build.gradle.kts file.
release {
    isShrinkResources = true
    isMinifyEnabled = true
    proguardFiles(
        getDefaultProguardFile("proguard-android-optimize.txt"),
        "proguard-rules.pro",
    )
}
Use the correct default file
The getDefaultProguardFile method imports a default set of rules provided by the Android SDK. If you use the wrong file, your app might not be optimized. Make sure to use proguard-android-optimize.txt: it provides the default Keep Rules for standard Android components and enables R8's code optimizations. The outdated proguard-android.txt provides only the Keep Rules and does not enable R8's optimizations.
Since this is a serious performance problem, we are starting to warn developers about the wrong file in Android Studio Narwhal 3 Feature Drop. And starting with Android Gradle Plugin 9.0, we no longer support the outdated proguard-android.txt file, so make sure you upgrade to the optimized version.
How to write Keep Rules
A keep rule consists of three main parts:
- An option like -keep or -keepclassmembers
- Optional modifiers like allowshrinking
- A class specification that defines the code to match
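Put together, the three parts line up like this (class name illustrative):

```
#  option, modifier          class specification
-keep,allowobfuscation class com.example.MyClass {
    public <methods>;
}
```

This rule keeps the public methods of com.example.MyClass from being removed, while the allowobfuscation modifier still lets R8 rename them.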
For the complete syntax and examples, refer to the guidance to add Keep Rules.
Keep Rule anti-patterns
It's important to know about best practices, but also about anti-patterns. These anti-patterns often arise from misunderstandings or troubleshooting shortcuts and can be catastrophic for a production build's performance.
Global options
These flags are global toggles that should never be used in a release build. They are only for temporary debugging to isolate a problem.
Using -dontoptimize effectively disables R8's performance optimizations, leading to a slower app.
Using -dontobfuscate disables all renaming, and -dontshrink turns off dead code removal; both of these global rules increase app size.
Avoid these global flags in production builds wherever possible for a more performant app user experience.
Overly broad keep rules
The easiest way to nullify R8's benefits is to write overly broad Keep Rules. A rule like the one below instructs the R8 optimizer to not shrink, not obfuscate, and not optimize any class in the package or any of its sub-packages, completely removing R8's benefits for that entire package. Write narrow, specific Keep Rules instead.
-keep class com.example.package.** { *; } # WIDE KEEP RULES CAUSE PROBLEMS
The inversion operator (!)
The inversion operator (!) seems like a powerful way to exclude a package from a rule. But it's not that simple. Take this example:
-keep class !com.example.my_package.** { *; } # USE WITH CAUTION
You might think that this rule means "do not keep classes in com.example.my_package." But it actually means "keep every class, method, and property in the entire application that is not in com.example.my_package." If that comes as a surprise, check for any negations in your R8 configuration.
Redundant rules for Android components
Another common mistake is to manually add Keep Rules for your app's Activities, Services, or BroadcastReceivers. This is unnecessary. The default proguard-android-optimize.txt file already includes the relevant rules for these standard Android components to work out of the box.
Many libraries also bundle their own Keep Rules, so you shouldn't have to write rules for them yourself. If the Keep Rules from a library you're using cause a problem, it is best to reach out to the library author.
Keep Rule best practices
Now that you know what not to do, let's talk about best practices.
Write narrow Keep Rules
Good Keep Rules should be as narrow and specific as possible. They should preserve only what is necessary, allowing R8 to optimize everything else.
| Rule | Quality |
|---|---|
| -keep class com.example.** { *; } | Low: Keeps an entire package and its subpackages |
| -keep class com.example.MyClass { *; } | Low: Keeps an entire class, which is likely still too wide |
| -keepclassmembers class com.example.MyClass { private java.lang.String secretMessage; public void onNativeEvent(java.lang.String); } | High: Only the relevant methods and fields of a specific class are kept |
Use common ancestors
Instead of writing separate Keep Rules for multiple different data models, write one rule that targets a common base class or interface. The rule below tells R8 to keep the fields of any class that implements the interface, and it scales as new models are added.
# Keep all fields of any class that implements SerializableModel
-keepclassmembers class * implements com.example.models.SerializableModel {
<fields>;
}
Use Annotations to target multiple classes
Create a custom annotation (e.g., @Serialize) and use it to "tag" classes that need their fields preserved. This is another clean, declarative, and highly scalable pattern. You can create Keep Rules for already existing annotations from frameworks you're using as well.
# Keep all fields of any class annotated with @Serialize
-keepclassmembers class * {
@com.example.annotations.Serialize <fields>;
}
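The annotation itself can be a simple marker. This hypothetical Kotlin definition would match the rule above; remember to keep the annotation class itself so R8 can still see it after shrinking:

```kotlin
// Hypothetical marker annotation; BINARY retention keeps it visible
// in the DEX without the cost of runtime reflection.
@Retention(AnnotationRetention.BINARY)
@Target(AnnotationTarget.CLASS)
annotation class Serialize
```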
Choose the right Keep Option
The Keep Option is the most critical part of the rule. Choosing the wrong one can needlessly disable optimization.
| Keep Option | What It Does |
|---|---|
| -keep | Prevents the specified class and its members from being removed or renamed. |
| -keepclassmembers | Prevents the specified members from being removed or renamed, but allows the class itself to be removed or renamed if it is not otherwise kept. |
| -keepclasseswithmembers | A combination: keeps the class and its members, but only if all the specified members are present. |
You can find more about the keep option in our documentation for Keep Options.
Allow optimization with Modifiers
Modifiers like allowshrinking and allowobfuscation relax a broad -keep rule, giving optimization power back to R8. For example, if a legacy library forces you to use -keep on an entire class, you might be able to reclaim some optimization by allowing shrinking and obfuscation:
# Keep this class, but allow R8 to remove it if it's unused and allow R8 to rename it.
-keep,allowshrinking,allowobfuscation class com.example.LegacyClass
Add global options for additional optimization
Beyond Keep Rules, you can add global flags to your R8 configuration file to encourage even more optimization.
-repackageclasses is a powerful option that instructs R8 to move all obfuscated classes into a single package. This saves significant space in the DEX file by removing redundant package name strings.
-allowaccessmodification allows R8 to widen access (e.g., private to public) to enable more aggressive inlining. This is now enabled by default when using proguard-android-optimize.txt.
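Both flags go in the same proguard-rules.pro file as your Keep Rules:

```
# Move all obfuscated classes into the root package, removing
# redundant package-name strings from the DEX file
-repackageclasses ''

# Allow R8 to widen access (e.g. private to public) for more
# aggressive inlining; already the default with
# proguard-android-optimize.txt
-allowaccessmodification
```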
Warning: Library authors must never add these global optimization flags to their consumer rules, as they would be forcibly applied to the entire app.
And to make it even more clear, in version 9.0 of the Android Gradle Plugin we're going to start ignoring global optimization flags from libraries altogether.
Best practices for libraries
Every Android app relies on libraries one way or another. So let's talk about best practices for libraries.
For library developers
If your library uses reflection or JNI, you have the responsibility to provide the necessary Keep Rules to its consumers. These rules are placed in a consumer-rules.pro file, which is then automatically bundled inside the library's AAR file.
android {
defaultConfig {
consumerProguardFiles("consumer-rules.pro")
}
...
}
For library consumers
Filter out problematic Keep Rules
If you must use a library that includes problematic Keep Rules, you can filter them out in your build.gradle.kts file, starting with AGP 9.0. This tells R8 to ignore the rules coming from a specific dependency.
release {
    optimization.keepRules {
        // Ignore all consumer rules from this specific library
        it.ignoreFrom("com.somelibrary:somelibrary")
    }
}
The best Keep Rule is no Keep Rule
The ultimate R8 configuration strategy is to remove the need to write Keep Rules altogether. For many apps, this can be achieved by choosing modern libraries that favor code generation over reflection. With code generation, the optimizer can more easily determine what code is actually used at runtime and what can be removed. Avoiding dynamic reflection also means there are no "hidden" entry points, and therefore no Keep Rules are needed. When choosing a new library, always prefer a solution that uses code generation over reflection.
For more information about how to choose libraries, see the guidance on choosing your libraries wisely.
Debugging and troubleshooting your R8 configuration
When R8 removes code it should have kept, or your APK is larger than expected, use these tools to diagnose the problem.
Find duplicate and global Keep Rules
Because R8 merges rules from dozens of sources, it can be hard to know what the "final" ruleset is. Adding this flag to your proguard-rules.pro file generates a complete report:
# Outputs the final, merged set of rules to the specified file
-printconfiguration build/outputs/logs/configuration.txt
You can search this file to find redundant rules or trace a problematic rule (like -dontoptimize) back to the specific library that included it.
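A minimal sketch of that search is below. The report content is faked here for illustration; in a real build, grep the file written by -printconfiguration:

```shell
# Fake a tiny merged report; a real one is generated by -printconfiguration.
cat > configuration.txt <<'EOF'
# Referenced from: somelibrary.aar/proguard.txt
-dontoptimize
EOF

# Trace the problematic flag back to the source that contributed it.
grep -n -B 1 -- "-dontoptimize" configuration.txt
```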
Ask R8: Why are you keeping this?
If a class you expected to be removed is still in your app, R8 can tell you why. Just add this rule:
# Asks R8 to explain why it's keeping a specific class
-whyareyoukeeping class com.example.MyUnusedClass
During the build, R8 will print the exact chain of references that caused it to keep that class, allowing you to trace the reference and adjust your rules.
For a full guide, check out the troubleshoot R8 section.
Next steps
R8 is a powerful tool for enhancing Android app performance, but its effectiveness depends on a correct understanding of its operation as a static analysis engine.
By writing specific, member-level rules, leveraging ancestors and annotations, and carefully choosing the right keep options, you can preserve exactly what is necessary. The most advanced practice is to eliminate the need for rules entirely by choosing modern, codegen-based libraries over their reflection-based predecessors.
As you're following along Performance Spotlight Week, make sure to check out today's Spotlight Week video on YouTube and continue with our R8 challenge. Use #optimizationEnabled for any questions on enabling or troubleshooting R8. We're here to help.
It's time to see the benefits for yourself.
We challenge you to enable R8 full mode for your app today.
- Follow our developer guides to get started: Enable app optimization.
- Check whether you still use proguard-android.txt and replace it with proguard-android-optimize.txt.
- Then, measure the impact. Don't just feel the difference, verify it: measure your startup times before and after by adapting the code from our Macrobenchmark sample app on GitHub.
We're confident you'll see a meaningful improvement in your app's performance.
While you're at it, use the social tag #AskAndroid to bring your questions. Throughout the week our experts are monitoring and answering your questions.
Stay tuned for tomorrow, where we'll cover Profile Guided Optimization with Baseline and Startup Profiles, how Compose rendering performance has improved over recent releases, and performance considerations for background work.
18 Nov 2025 5:00pm GMT


