03 Dec 2025

Android Developers Blog

What's new in the Jetpack Compose December '25 release

Posted by Nick Butcher, Jetpack Compose Product Manager





Today, the Jetpack Compose December '25 release is stable. This contains version 1.10 of the core Compose modules and version 1.4 of Material 3 (see the full BOM mapping), adding new features and major performance improvements.


To use today's release, upgrade your Compose BOM version to 2025.12.00:


implementation(platform("androidx.compose:compose-bom:2025.12.00"))



Performance improvements

We know that the runtime performance of your app is hugely important to you and your users, so performance has been a major priority for the Compose team. This release brings a number of improvements, and you get them all just by upgrading to the latest version. Our internal scroll benchmarks show that Compose now matches the performance you would see if using Views:


Scroll performance benchmark comparing Views and Jetpack Compose across different versions of Compose


Pausable composition in lazy prefetch

Pausable composition in lazy prefetch is now enabled by default. This is a fundamental change to how the Compose runtime schedules work, designed to significantly reduce jank during heavy UI workloads.


Previously, once a composition started, it had to run to completion. If a composition was complex, this could block the main thread for longer than a single frame, causing the UI to freeze. With pausable composition, the runtime can now "pause" its work if it's running out of time and resume the work in the next frame. This is particularly effective when used with lazy layout prefetch to prepare frames ahead of time. The Lazy layout CacheWindow APIs introduced in Compose 1.9 are a great way to prefetch more content and benefit from pausable composition to produce much smoother UI performance.


Pausable composition combined with Lazy prefetch helps reduce jank
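As a sketch of how these pieces fit together, the snippet below uses the experimental CacheWindow API from Compose 1.9 to ask a lazy list to keep extra content prepared around the viewport; that prefetched content is composed pausably, so the work can be spread across frames. Exact parameter names and defaults may differ, so verify against the current foundation release:

```kotlin
// Sketch, assuming the experimental LazyLayoutCacheWindow API surface from
// Compose 1.9; check the current foundation docs for exact names and defaults.
@OptIn(ExperimentalFoundationApi::class)
@Composable
fun PrefetchingList(items: List<String>) {
    // Keep ~200.dp of content ahead of the scroll direction (and ~100.dp
    // behind) composed and measured, ready for upcoming frames.
    val state = rememberLazyListState(
        cacheWindow = LazyLayoutCacheWindow(ahead = 200.dp, behind = 100.dp)
    )
    LazyColumn(state = state) {
        items(items) { item -> Text(item) }
    }
}
```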


We've also optimized performance elsewhere, with improvements to Modifier.onPlaced, Modifier.onVisibilityChanged, and other modifier implementations. We'll continue to invest in improving the performance of Compose.


New features

Retain

Compose offers a number of APIs to hold and manage state across different lifecycles: remember persists state across compositions, while rememberSaveable/rememberSerializable persist state across activity or process recreation. retain is a new API that sits between these, enabling you to persist values across configuration changes, but not across process death, and without serialization. Because retain does not serialize your state, you can persist objects like lambda expressions, flows, and large objects like bitmaps, which cannot be easily serialized. For example, you may use retain to manage a media player (such as ExoPlayer) to ensure that media playback is not interrupted by a configuration change.


@Composable
fun MediaPlayer() {
    val applicationContext = LocalContext.current.applicationContext
    val exoPlayer = retain { ExoPlayer.Builder(applicationContext).apply { ... }.build() }
    ...
}


We want to extend our thanks to the AndroidDev community (especially the Circuit team), who have influenced and contributed to the design of this feature.


Material 1.4

Version 1.4.0 of the material3 library adds a number of new components and enhancements:




Horizontal centered hero carousel


Note that Material 3 Expressive APIs continue to be developed in the alpha releases of the material3 library. To learn more, see this recent talk:



New animation features

We continue to expand on our animation APIs, including updates for customizing shared element animations.


Dynamic shared elements

By default, sharedElement() and sharedBounds() animations attempt to animate layout changes whenever a matching key is found in the target state. However, you may want to disable this animation dynamically based on certain conditions, such as the direction of navigation or the current UI state.


To control whether the shared element transition occurs, you can now customize the SharedContentConfig passed to rememberSharedContentState(). The isEnabled property determines if the shared element is active.


SharedTransitionLayout {
    val transition = updateTransition(currentState)
    transition.AnimatedContent { targetState ->
        // Create the configuration that depends on state changing.
        fun animationConfig(): SharedTransitionScope.SharedContentConfig {
            return object : SharedTransitionScope.SharedContentConfig {
                override val SharedTransitionScope.SharedContentState.isEnabled: Boolean
                    get() =
                        // determine whether to perform a shared element transition
            }
        }
    }
}


See the documentation for more.


Modifier.skipToLookaheadPosition()

A new modifier, Modifier.skipToLookaheadPosition(), has been added in this release. It keeps a composable at its final position while performing shared element animations, enabling "reveal"-style transitions, as seen in the Androidify sample with the progressive reveal of the camera. See the video tip here for more information:
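As a rough, illustrative sketch (the key name and surrounding scaffolding are hypothetical, not from the release notes), the modifier is applied alongside a shared element so the content holds its final, lookahead position while the shared bounds animate around it:

```kotlin
// Illustrative sketch inside a SharedTransitionScope + AnimatedContent:
// "detailImage" is a hypothetical key. skipToLookaheadPosition() keeps this
// composable at its final position during the transition, so it can be
// revealed in place rather than translated across the screen.
Image(
    painter = painter,
    contentDescription = null,
    modifier = Modifier
        .sharedBounds(
            rememberSharedContentState(key = "detailImage"),
            animatedVisibilityScope = this@AnimatedContent,
        )
        .skipToLookaheadPosition()
)
```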




Initial velocity in shared element transitions

This release adds a new shared element transition API, prepareTransitionWithInitialVelocity, which lets you pass an initial velocity (e.g. from a gesture) to a shared element transition:


Modifier.fillMaxSize()
    .draggable2D(
        rememberDraggable2DState { offset += it },
        onDragStopped = { velocity ->
            // Set up the initial velocity for the upcoming shared element
            // transition.
            sharedContentStateForDraggableCat
                ?.prepareTransitionWithInitialVelocity(velocity)
            showDetails = false
        },
    )




A shared element transition that starts with an initial velocity from a gesture



Veiled transitions

EnterTransition and ExitTransition define how an AnimatedVisibility/AnimatedContent composable appears or disappears. A new experimental veil option allows you to specify a color to veil or scrim content; e.g., fading in/out a semi-opaque black layer over content:


Veiled animated content - note the semi-opaque veil (or scrim) over the grid content during the animation


AnimatedContent(
    targetState = page,
    modifier = Modifier.fillMaxSize().weight(1f),
    transitionSpec = {
        if (targetState > initialState) {
            (slideInHorizontally { it } togetherWith
                slideOutHorizontally { -it / 2 } + veilOut(targetColor = veilColor))
        } else {
            slideInHorizontally { -it / 2 } +
                unveilIn(initialColor = veilColor) togetherWith slideOutHorizontally { it }
        }
    },
) { targetPage ->
    ...
}



Upcoming changes


Deprecation of Modifier.onFirstVisible

Compose 1.9 introduced Modifier.onVisibilityChanged and Modifier.onFirstVisible. After reviewing your feedback, it became apparent that the contract of Modifier.onFirstVisible could not be honored deterministically; specifically, it is not always possible to determine when an item first becomes visible. For example, a Lazy layout may dispose of items that scroll out of the viewport, and then compose them again if they scroll back into view. In this circumstance, the onFirstVisible callback would fire again, as it is a newly composed item. Similar behavior would also occur when navigating back to a previously visited screen containing onFirstVisible. As such, we have decided to deprecate this modifier in the next Compose release (1.11) and recommend migrating to onVisibilityChanged. See the documentation for more information.
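If you rely on first-visible semantics (for example, impression logging), one migration path is to keep a "has fired" flag in state you control, so that you decide what "first" means across recomposition or navigation. The helper below is a hypothetical illustration, not a Compose API; in a composable you might hold it in rememberSaveable and drive it from Modifier.onVisibilityChanged:

```kotlin
// Hypothetical helper, not part of Compose: fires at most once, on the first
// transition to visible, no matter how often the visibility callback re-fires
// (e.g. when a Lazy item is disposed and recomposed).
class FirstVisibleTracker {
    var hasFired = false
        private set

    // Returns true exactly once: the first time visibility flips to true.
    fun onVisibilityChanged(visible: Boolean): Boolean {
        if (visible && !hasFired) {
            hasFired = true
            return true
        }
        return false
    }
}
```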


Coroutine dispatch in tests

We plan to change coroutine dispatch in tests to reduce test flakiness and catch more issues. Currently, tests use the UnconfinedTestDispatcher, which differs from production behavior; e.g., effects may run immediately rather than being enqueued. In a future release, we plan to introduce a new API that uses StandardTestDispatcher by default to match production behavior. You can try the new behavior now in 1.10:


@get:Rule // also createAndroidComposeRule, createEmptyComposeRule

val rule = createComposeRule(effectContext = StandardTestDispatcher())


Using the StandardTestDispatcher will queue tasks, so you must use synchronization mechanisms like composeTestRule.waitForIdle() or composeTestRule.runOnIdle(). If your test uses runTest, you must ensure that runTest and your Compose rule share the same StandardTestDispatcher instance for synchronization.


// 1. Create a SINGLE dispatcher instance
val testDispatcher = StandardTestDispatcher()

// 2. Pass it to your Compose rule
@get:Rule
val composeRule = createComposeRule(effectContext = testDispatcher)

// 3. Pass the *SAME INSTANCE* to runTest
@Test
fun myTest() = runTest(testDispatcher) {
    composeRule.setContent { /* ... */ }
}


Tools

Great APIs deserve great tools, and Android Studio has a number of recent additions for Compose developers:


  • Transform UI: Iterate on your designs by right clicking on the @Preview, selecting Transform UI, and then describing the change in natural language.

  • Generate @Preview: Right-click on a composable and select Gemini > Generate [Composable name] Preview.




To see these tools in action, watch this recent demonstration:




Happy Composing

We continue to invest in Jetpack Compose to provide you with the APIs and tools you need to create beautiful, rich UIs. We value your input, so please share your feedback on these changes or what you'd like to see next in our issue tracker.


03 Dec 2025 8:34pm GMT

02 Dec 2025

Android Developers Blog

Android 16 QPR2 is Released

Posted by Matthew McCullough, VP of Product Management, Android Developer







Faster Innovation with Android's first Minor SDK Release

Today we're releasing Android 16 QPR2, bringing a host of enhancements to user experience, developer productivity, and media capabilities. It marks a significant milestone in the evolution of the Android platform as the first release to utilize a minor SDK version.


A Milestone for Platform Evolution: The Minor SDK Release


Minor SDK releases allow us to deliver APIs and features more rapidly, outside of the major yearly platform release cadence, so the platform and your apps can innovate faster with new functionality. Unlike major releases, which may include behavior changes impacting app compatibility, the changes in QPR2 are largely additive, minimizing the need for regression testing. The behavior changes that QPR2 does include are focused on security or accessibility, such as SMS OTP protection and support for the expanded dark theme.

To support this, we have introduced new fields to the Build class as of Android 16, allowing your app to check for these new APIs using SDK_INT_FULL and VERSION_CODES_FULL.

if ((Build.VERSION.SDK_INT >= Build.VERSION_CODES.BAKLAVA) && (Build.VERSION.SDK_INT_FULL >= Build.VERSION_CODES_FULL.BAKLAVA_1)) {
    // Call new APIs from the Android 16 QPR2 release
}

Enhanced User Experience and Customization

QPR2 improves Android's personalization and accessibility, giving users more control over how their devices look and feel.

Expanded Dark Theme

To create a more consistent experience for users who have low vision or photosensitivity, or who simply prefer a dark system-wide appearance, QPR2 introduces an expanded option under dark theme.

The old Fitbit app showing the impact of expanded dark theme; the new Fitbit app directly supports a dark theme

When the expanded dark theme setting is enabled by a user, the system uses your app's isLightTheme theme attribute to determine whether to apply inversion. If your app inherits from one of the standard DayNight themes, this is done automatically for you. If it does not, make sure to declare isLightTheme="false" in your dark theme to ensure your app is not inadvertently inverted. Standard Android Views, Composables, and WebViews will be inverted, while custom rendering engines like Flutter will not.

This is largely intended as an accessibility feature. We strongly recommend implementing a native dark theme, which gives you full control over your app's appearance; you can protect your brand's identity, ensure text is readable, and prevent visual glitches from happening when your UI is automatically inverted, guaranteeing a polished, reliable experience for your users.

Custom Icon Shapes & Auto-Theming

In QPR2, users can select specific shapes for their app icons, which apply to all icons and folder previews. Additionally, if your app does not provide a dedicated themed icon, the system can now automatically generate one by applying a color filtering algorithm to your existing launcher icon.

Custom Icon Shapes

Test Icon Shape & Color in Android Studio

Automatic system icon color filtering

Interactive Chooser Sessions

The sharing experience is now more dynamic. Apps can keep the UI interactive even when the system sharesheet is open, allowing for real-time content updates within the Chooser.

Boosting Your Productivity and App Performance

We are introducing tools and updates designed to streamline your workflow and improve app performance.

Linux Development Environment with GUI Applications

The Linux development environment feature has been expanded to support running Linux GUI applications directly within the terminal environment.

Wilber, the GIMP mascot, designed by Aryeom Han, is licensed under CC BY-SA 4.0. The screenshot of the GIMP interface is used with courtesy.

Generational Garbage Collection

The Android Runtime (ART) now includes a Generational Concurrent Mark-Compact (CMC) Garbage Collector. This focuses collection on newly allocated objects, resulting in reduced CPU usage and improved battery efficiency.

Widget Engagement Metrics

You can now query user interaction events, such as clicks, scrolls, and impressions, to better understand how users engage with your widgets.

16KB Page Size Readiness

To help prepare for future architecture requirements, we have added early warning dialogs for debuggable apps that are not 16KB page-aligned.


Media, Connectivity, and Health

QPR2 brings robust updates to media standards and device connectivity.

IAMF and Audio Sharing

We have added software decoding support for Immersive Audio Model and Formats (IAMF), an open-source spatial audio format. Additionally, Personal Audio Sharing for Bluetooth LE Audio is now integrated directly into the system Output Switcher.


Health Connect Updates

Health Connect now automatically tracks steps using the device's sensors. If your app has the READ_STEPS permission, this data will be available from the "android" package. Not only does this simplify the code needed for step tracking, it is also more power efficient. Health Connect can now also track weight, set index, and Rate of Perceived Exertion (RPE) in exercise segments.
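As a hedged sketch of reading the system-tracked steps via the androidx.health.connect client library (the "android" data origin comes from the description above; verify filter behavior against the current Health Connect documentation):

```kotlin
// Sketch: read step records written by the platform ("android" package).
// Assumes the READ_STEPS permission has already been granted.
suspend fun readSystemSteps(context: Context, start: Instant, end: Instant): Long {
    val client = HealthConnectClient.getOrCreate(context)
    val response = client.readRecords(
        ReadRecordsRequest(
            recordType = StepsRecord::class,
            timeRangeFilter = TimeRangeFilter.between(start, end),
            // Restrict results to the platform-recorded data origin.
            dataOriginFilter = setOf(DataOrigin("android")),
        )
    )
    // Sum the step counts across all returned records.
    return response.records.sumOf { it.count }
}
```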

Smoother Migrations

A new third-party Data Transfer API enables more reliable data migration between Android and iOS devices.

Strengthening Privacy and Security

Security remains a top priority with new features designed to protect user data and device integrity.

Developer Verification

We introduced APIs to support developer verification during app installation along with new ADB commands to simulate verification outcomes. As a developer, you are free to install apps without verification by using ADB, so you can continue to test apps that are not intended or not yet ready to distribute to the wider consumer population.

SMS OTP Protection

The delivery of messages containing an SMS retriever hash will be delayed by three hours for most apps to help prevent OTP hijacking. The RECEIVE_SMS broadcast will be withheld and SMS provider database queries will be filtered. The SMS will be available to these apps after the three-hour delay.

Secure Lock Device

A new system-level security state, Secure Lock Device, is being introduced. When enabled (e.g., remotely via "Find My Device"), the device locks immediately and requires the primary PIN, pattern, or password to unlock, heightening security. When active, notifications and quick affordances on the lock screen will be hidden, and biometric unlock may be temporarily disabled.

Get Started

If you're not in the Beta or Canary programs, your Pixel device should get the Android 16 QPR2 release shortly. If you don't have a Pixel device, you can use the 64-bit system images with the Android Emulator in Android Studio. If you are currently on the Android 16 QPR2 Beta and have not yet installed the Android 16 QPR3 beta, you can opt out of the program and you will then be offered the release version of Android 16 QPR2 over the air.
For the best development experience with Android 16 QPR2, we recommend that you use the latest Canary build of Android Studio Otter.
Thank you again to everyone who participated in our Android beta program. We're looking forward to seeing how your apps take advantage of the updates in Android 16 QPR2.
For complete information on Android 16 QPR2 please visit the Android 16 developer site.

02 Dec 2025 7:00pm GMT

Explore AI on Android with Our Sample Catalog App

Posted by Thomas Ezan and Ivy Knight

As the AI landscape continues to expand, we often hear that developers aren't always sure where to start and which API or SDK is best for their use case.

So we wanted to provide you with examples of AI-enabled features using both on-device and Cloud models and inspire you to create delightful experiences for your users.

We are thrilled to announce the launch of the redesigned Android AI Sample Catalog, a dedicated application designed to inspire and educate Android developers to build the next generation of AI-powered Android apps.

Discover what's possible with Google AI

The Android AI Sample Catalog is designed as a one-stop destination to explore the capabilities of Google AI APIs and SDKs. Inside, you'll find a collection of samples demonstrating a wide range of AI use cases that you can test yourself. We really designed this catalog to give you a hands-on feel for what you can build and help you find the right solution and capability for your needs.

Here are some of the samples you can find in the catalog:

Image generation with Imagen

Uses Imagen to generate images of landscapes, objects and people in various artistic styles.

On-device summarization with Gemini Nano

Lets you summarize text on-device using Gemini Nano via the GenAI Summarization API.

Chat with Nano Banana

A chatbot app using the Gemini 3 Pro Image model (a.k.a. "Nano Banana Pro") letting you edit images via a conversation with the model.

On-device image description with Gemini Nano

Lets you generate image descriptions using Gemini Nano via the GenAI Image Description API.

Other samples include: image editing via Imagen mask-editing capabilities, a to-do list app controlled by voice using the Gemini Live API, on-device rewrite assistance powered by Gemini Nano, and more!

The samples using cloud inference are built using the Firebase AI Logic SDK, and the ML Kit GenAI API is used for the samples running on-device inference. We plan to continue creating new samples and updating the existing ones as new capabilities are added to the models and SDKs.

Fully open source and ready to copy

We believe the best way to learn is by doing. That's why the AI Sample Catalog is not only fully open source, but also architected so that the code relevant to the AI features is self-contained and easy to copy and paste, letting you quickly experiment with these code samples in your own project.

When you're exploring a sample in the app and want to see how it's built, you can simply click the <> SOURCE button to jump directly to the code on GitHub.



To help you get started quickly, each sample includes a README file that highlights the APIs used, along with key code snippets.



Note: To run the samples using the Firebase AI Logic SDK, you'll need to set up a Firebase AI project. Also, the samples using ML Kit Gen AI APIs powered by Gemini Nano are only supported on certain devices.

We also put extra thought into the app's user interface to make your learning experience more engaging and intuitive. We've refreshed the app with a bold new brand that infuses the Android look with an expressive AI design language. Most notably, the app now features a vibrant, textured backdrop for the new Material 3 expressive components, giving you a modern and enjoyable environment to explore the samples and dive into the code. The systematic illustrations, inspired by generated image composition, further enhance this polished, expressive experience.



Check out the Android AI Sample Catalog today, test the features, and dive into the code on GitHub to start bringing your own AI-powered ideas to life!

02 Dec 2025 5:00pm GMT