25 Dec 2024
TalkAndroid
Cookies Inc: Ultimate Guide & Secret Codes
Find all the latest Cookies Inc secret codes right here in this Ultimate Game Guide!
25 Dec 2024 6:51am GMT
Marvel Strike Force Codes – December 2024
Find the latest Marvel Strike Force codes here! Keep on reading for more!
25 Dec 2024 6:49am GMT
AU Reborn Private Server Codes – December 2024
Find all the latest Private Server codes here! Keep reading for more.
25 Dec 2024 6:46am GMT
Marvel Snap Codes December 2024
Find all the latest Marvel Snap Codes right here! Read on for more!
25 Dec 2024 6:43am GMT
Tiny Quest – Idle RPG – Game Guide, Codes & Tier List
Find the latest Tiny Quest Idle RPG game guide, Tier list and codes here!
25 Dec 2024 6:40am GMT
League of Angels: Pact Codes – December 2024
Find the latest League of Angels: Pact codes here! Keep on reading for more!
25 Dec 2024 6:38am GMT
Postknight 2 – Codes & How To Redeem Them
Find all the latest Postknight 2 Codes here! Read on for more!
25 Dec 2024 6:35am GMT
24 Dec 2024
TalkAndroid
Google TV’s Free Channel Catalog Gets Bigger For The Holidays
You definitely won't have a dearth of Christmas content to watch if you've got Google TV.
24 Dec 2024 4:00pm GMT
Board Kings Free Rolls – Updated Every Day!
Run out of rolls for Board Kings? Find links for free rolls right here, updated daily!
24 Dec 2024 3:09pm GMT
Coin Tales Free Spins – Updated Every Day!
Tired of running out of Coin Tales Free Spins? We update our links daily, so you won't have that problem again!
24 Dec 2024 3:08pm GMT
Avatar World Codes – December 2024 – Updated Daily
Find all the latest Avatar World Codes right here in this article! Read on for more!
24 Dec 2024 3:06pm GMT
Coin Master Free Spins & Coins Links
Find all the latest Coin Master free spins right here! We update this list daily, so be sure to check back often!
24 Dec 2024 3:06pm GMT
Monopoly Go Events Schedule Today – Updated Daily
Current active events are the House of Sweets Event, Build & Bake Event, and Partner Event - Gingerbread Partners.
24 Dec 2024 3:02pm GMT
Monopoly Go – Free Dice Links Today (Updated Daily)
If you keep on running out of dice, we have just the solution! Find all the latest Monopoly Go free dice links right here!
24 Dec 2024 2:59pm GMT
Family Island Free Energy Links (Updated Daily)
Tired of running out of energy on Family Island? We have all the latest Family Island Free Energy links right here, and we update these daily!
24 Dec 2024 2:57pm GMT
Crazy Fox Free Spins & Coins (Updated Daily)
If you need free coins and spins in Crazy Fox, look no further! We update our links daily to bring you the newest working links!
24 Dec 2024 2:55pm GMT
19 Dec 2024
Android Developers Blog
Celebrating Another Year of #WeArePlay
Posted by Robbie McLachlan - Developer Marketing
This year #WeArePlay took us on a journey across the globe, spotlighting 300 people behind apps and games on Google Play. From a founder whose app uses AI to assist visually impaired people to a game where nimble-fingered players slice flying fruits and use special combos to beat their own high score, we met founders transforming ideas into thriving businesses.
Let's start by taking a look back at the people featured in our global film series. From a mother and son duo preserving African languages, to a founder whose app helps kids become published authors - check out the full playlist.
We also continued our global tour around the world with:
- 153 new stories from the United States like Ashley's Get Mom Strong, which gives access to rehabilitation and fitness plans to help moms heal and get strong after childbirth
- 49 new stories from Japan like Toshiya's Mirairo ID, an app that empowers the disabled community by digitizing disability certificates
- 50 new stories from Australia, including apps like Tristan's Bushfire.io, which supports communities during natural disasters
And we released global collections of 36 stories, each with a theme reflecting the diversity of the app and game community on Google Play, including:
- LGBTQ+ founders creating safe spaces and fostering representation
- Women founders breaking barriers and building impactful businesses
- Creators turning personal passions, such as fitness, mental health, or creativity, into inspiring apps
- Founders building sports apps and games that bring players, fans, and communities together
To the global community of app and game founders, thank you for sharing your inspiring journey. As we enter 2025, we look forward to discovering even more stories of the people behind games and apps businesses on Google Play.
19 Dec 2024 7:00pm GMT
18 Dec 2024
Android Developers Blog
The Second Developer Preview of Android 16
Posted by Matthew McCullough - VP of Product Management, Android Developer
The second developer preview of Android 16 is now available to test with your apps. This build includes changes designed to enhance the app experience, improve battery life, and boost performance while minimizing incompatibilities, and your feedback is critical in helping us understand the full impact of this work.
System triggered profiling
ProfilingManager was added in Android 15, giving apps the ability to request profiling data collection using Perfetto on public devices in the field. To help capture challenging trace scenarios such as startups or ANRs, ProfilingManager now includes System Triggered Profiling. Apps can use ProfilingManager#addProfilingTriggers() to register interest in receiving information about these flows. Flows covered in this release include onFullyDrawn for activity based cold starts, and ANRs.
val anrTrigger = ProfilingTrigger.Builder(
        ProfilingTrigger.TRIGGER_TYPE_ANR
    )
    .setRateLimitingPeriodHours(1)
    .build()

val startupTrigger: ProfilingTrigger = //...

mProfilingManager.addProfilingTriggers(listOf(anrTrigger, startupTrigger))
Start component in ApplicationStartInfo
ApplicationStartInfo was added in Android 15, allowing an app to see reasons for process start, start type, start times, throttling, and other useful diagnostic data. Android 16 adds getStartComponent() to distinguish what component type triggered the start, which can be helpful for optimizing the startup flow of your app.
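As an illustrative sketch, the start component could be read from the ApplicationStartInfo listener that shipped in Android 15; note that the START_COMPONENT_* constant names below are assumptions to be checked against the final Android 16 API reference.

```kotlin
// Hypothetical sketch: branching on the component type that triggered a
// process start. addApplicationStartInfoCompletionListener() exists in
// Android 15; the START_COMPONENT_* constants are illustrative placeholders.
val activityManager = context.getSystemService(ActivityManager::class.java)
activityManager.addApplicationStartInfoCompletionListener(context.mainExecutor) { startInfo ->
    when (startInfo.startComponent) {
        ApplicationStartInfo.START_COMPONENT_ACTIVITY ->
            Log.d(TAG, "Process started for an Activity; optimize the UI startup path")
        ApplicationStartInfo.START_COMPONENT_BROADCAST ->
            Log.d(TAG, "Process started for a broadcast; skip heavy UI initialization")
        else ->
            Log.d(TAG, "Process started by another component type")
    }
}
```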
Richer Haptics
Android has exposed limited control over the haptic actuator since its inception.
Android 11 added support for more complex haptic effects that more advanced actuators can support through VibrationEffect.Compositions of device-defined semantic primitives.
Android 16 adds haptic APIs that let apps define the amplitude and frequency curves of a haptic effect while abstracting away differences between device capabilities.
Better job introspection
Android 16 introduces JobScheduler#getPendingJobReasons(int jobId) which can return multiple reasons why a job is pending, due to both explicit constraints set by the developer and implicit constraints set by the system.
We're also introducing JobScheduler#getPendingJobReasonsHistory(int jobId), which returns a list of the most recent constraint changes.
The API can help you debug why your jobs may not be executing, especially if you're seeing reduced success rates for certain tasks or latency issues with job completion. It can also help you understand whether certain jobs are not completing due to system-defined constraints rather than constraints you set explicitly.
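A minimal sketch of that kind of introspection, assuming a previously scheduled job identified by a placeholder MY_JOB_ID (the PENDING_JOB_REASON_* values shown follow the existing JobScheduler naming scheme):

```kotlin
// Sketch: log every reason a scheduled job is still pending.
val jobScheduler = context.getSystemService(JobScheduler::class.java)
for (reason in jobScheduler.getPendingJobReasons(MY_JOB_ID)) {
    when (reason) {
        JobScheduler.PENDING_JOB_REASON_CONSTRAINT_CONNECTIVITY ->
            Log.d(TAG, "Waiting on a network constraint (explicitly set)")
        JobScheduler.PENDING_JOB_REASON_DEVICE_STATE ->
            Log.d(TAG, "Held back by device state (implicit, system-defined)")
        else ->
            Log.d(TAG, "Pending for reason code $reason")
    }
}
```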
Adaptive refresh rate
Adaptive refresh rate (ARR), introduced in Android 15, enables the display refresh rate on supported hardware to adapt to the content frame rate using discrete VSync steps. This reduces power consumption while eliminating the need for potentially jank-inducing mode-switching.
Android 16 DP2 introduces hasArrSupport() and getSuggestedFrameRate(int) while restoring getSupportedRefreshRates() to make it easier for your apps to take advantage of ARR.
RecyclerView 1.4 internally supports ARR when it is settling from a fling or smooth scroll, and we're continuing our work to add ARR support into more Jetpack libraries. This frame rate article covers many of the APIs you can use to set the frame rate so that your app can directly leverage ARR.
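If your app renders content at a known rate, one way to take advantage of ARR is to pair the new hasArrSupport() check with the existing View frame-rate API; a minimal sketch, assuming a view animating 30fps content:

```kotlin
// Sketch: vote for a 30Hz refresh rate for 30fps content when the display
// supports ARR. hasArrSupport() is the Android 16 DP2 API described above;
// setRequestedFrameRate() has been available on View since API level 35.
val display = animatedView.display
if (display != null && display.hasArrSupport()) {
    animatedView.requestedFrameRate = 30f
}
```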
Job execution optimizations
Starting in Android 16, we're adjusting regular and expedited job execution runtime quota based on the following factors:
- Which app standby bucket the application is in; active standby buckets will be given a generous runtime quota.
- Jobs started while the app is visible to the user that continue after the app becomes invisible will adhere to the job runtime quota.
- Jobs that are executing concurrently with a foreground service will adhere to the job runtime quota. If you need to perform a data transfer that may take a long time consider using a user initiated data transfer.
Note: To understand how to further debug and test the behavior change, read more about JobScheduler quota optimizations.
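For the long-running transfer case mentioned above, a user-initiated data transfer job can be scheduled with the existing setUserInitiated() builder flag (available since API 34); MyTransferService and MY_JOB_ID below are placeholders:

```kotlin
// Sketch: a user-initiated data transfer job, which is exempt from the
// regular job runtime quota. Requires the RUN_USER_INITIATED_JOBS permission
// and must be scheduled while the app is visible or otherwise allowed.
val jobInfo = JobInfo.Builder(MY_JOB_ID, ComponentName(context, MyTransferService::class.java))
    .setUserInitiated(true)
    .setRequiredNetworkType(JobInfo.NETWORK_TYPE_ANY)
    .build()
context.getSystemService(JobScheduler::class.java).schedule(jobInfo)
```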
Fully deprecating JobInfo#setImportantWhileForeground
The JobInfo.Builder#setImportantWhileForeground(boolean) method indicates the importance of a job while the scheduling app is in the foreground or when temporarily exempted from background restrictions.
This method has been deprecated since Android 12 (API level 31). Starting in Android 16, it no longer functions, and calls to it are ignored.
This removal of functionality also applies to JobInfo#isImportantWhileForeground(). Starting in Android 16, the method always returns false.
Deprecated Disruptive Accessibility Announcements
Android 16 DP2 deprecates accessibility announcements, characterized by the use of announceForAccessibility or the dispatch of TYPE_ANNOUNCEMENT AccessibilityEvents. They can create inconsistent user experiences for users of TalkBack, Android's screen reader, and the suggested alternatives better serve a broader range of user needs across a variety of Android's assistive technologies.
Examples of alternatives:
- For significant UI changes like window changes, use Activity.setTitle(CharSequence) and setAccessibilityPaneTitle(java.lang.CharSequence). In Compose use Modifier.semantics { paneTitle = "paneTitle" }
- To inform the user of changes to critical UI, use setAccessibilityLiveRegion(int). In Compose use Modifier.semantics { liveRegion = LiveRegionMode.[Polite|Assertive] }. These should be used sparingly as they may generate announcements every time a View or composable is updated.
- To notify users about errors, send an AccessibilityEvent of type AccessibilityEvent#CONTENT_CHANGE_TYPE_ERROR and set AccessibilityNodeInfo#setError(CharSequence), or use TextView#setError(CharSequence).
The deprecated announceForAccessibility API includes more detail on suggested alternatives.
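For instance, the live-region alternative from the list above can replace an announceForAccessibility call with two lines on the View being updated (statusView and the string resource are placeholders):

```kotlin
// Sketch: a polite live region. TalkBack announces subsequent text changes
// automatically, with no explicit announcement event needed.
statusView.accessibilityLiveRegion = View.ACCESSIBILITY_LIVE_REGION_POLITE
statusView.text = getString(R.string.download_complete)
```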
Cloud search in photo picker
The photo picker provides a safe, built-in way for users to grant your app access to selected images and videos from both local and cloud storage, instead of their entire media library. Using a combination of Modular System Components through Google System Updates and Google Play services, it's supported back to Android 4.4 (API level 19). Integration requires just a few lines of code with the associated Android Jetpack library.
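As a brief sketch of that integration, using the Jetpack Activity library's existing PickVisualMedia contract (the pickMedia name is illustrative):

```kotlin
// Sketch: launching the photo picker from an Activity or Fragment.
// The user picks a single image; your app receives only that URI.
val pickMedia = registerForActivityResult(ActivityResultContracts.PickVisualMedia()) { uri ->
    if (uri != null) {
        // Use the selected image, e.g. load it into an ImageView.
    }
}
pickMedia.launch(
    PickVisualMediaRequest(ActivityResultContracts.PickVisualMedia.ImageOnly)
)
```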
The developer preview includes new APIs to enable searching from the cloud media provider for the Android photo picker. Search functionality in the photo picker is coming soon.
Ranging with enhanced security
Android 16 adds support for robust security features in WiFi location on supported devices with WiFi 6's 802.11az, allowing apps to combine the higher accuracy, greater scalability, and dynamic scheduling of the protocol with security enhancements including AES-256-based encryption and protection against MITM attacks. This allows it to be used more safely in proximity use cases, such as unlocking a laptop or a vehicle door. 802.11az is integrated with the Wi-Fi 6 standard, leveraging its infrastructure and capabilities for wider adoption and easier deployment.
Health Connect updates
Health Connect in the developer preview adds ACTIVITY_INTENSITY, a new datatype defined according to World Health Organization guidelines around moderate and vigorous activity. Each record requires the start time, the end time and whether the activity intensity is moderate or vigorous.
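A minimal sketch of writing such a record with the Health Connect client; the ActivityIntensityRecord type and its field names are assumptions to verify against the SDK that accompanies the preview:

```kotlin
// Hypothetical sketch: recording 30 minutes of moderate-intensity activity.
// The record class and constant names are illustrative placeholders.
val record = ActivityIntensityRecord(
    startTime = sessionStart,
    startZoneOffset = null,
    endTime = sessionStart.plus(30, ChronoUnit.MINUTES),
    endZoneOffset = null,
    activityIntensityType = ActivityIntensityRecord.ACTIVITY_INTENSITY_TYPE_MODERATE
)
healthConnectClient.insertRecords(listOf(record))
```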
Health Connect also contains updated APIs supporting health records. This allows apps to read and write medical records in FHIR format with explicit user consent. This API is currently in an early access program. Sign up if you'd like to be part of our early access program.
Predictive back additions
Android 16 adds new APIs to help you enable predictive back system animations in gesture navigation such as the back-to-home animation. Registering the onBackInvokedCallback with the new PRIORITY_SYSTEM_NAVIGATION_OBSERVER allows your app to receive the regular onBackInvoked call whenever the system handles a back navigation without impacting the normal back navigation flow.
Android 16 additionally adds the finishAndRemoveTaskCallback() and moveTaskToBackCallback(). By registering these callbacks with the OnBackInvokedDispatcher, the system can trigger specific behaviors and play corresponding ahead-of-time animations when the back gesture is invoked.
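A sketch of the observer registration described above, assuming an Activity context (the callback body is illustrative):

```kotlin
// Sketch: observe system-handled back navigation without intercepting it.
val observer = OnBackInvokedCallback {
    // Invoked whenever the system handles back (e.g. back-to-home).
    // Update analytics or app state here; the normal back flow is unaffected.
}
onBackInvokedDispatcher.registerOnBackInvokedCallback(
    OnBackInvokedDispatcher.PRIORITY_SYSTEM_NAVIGATION_OBSERVER,
    observer
)
```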
Two Android API releases in 2025
This preview is for the next major release of Android with a planned launch in Q2 of 2025 and we plan to have another release with new developer APIs in Q4. The Q2 major release will be the only release in 2025 to include planned behavior changes that could affect apps. The Q4 minor release will pick up feature updates, optimizations, and bug fixes; it will not include any app-impacting behavior changes.
We'll continue to have quarterly Android releases. The Q1 and Q3 updates in-between the API releases will provide incremental updates to help ensure continuous quality. We're actively working with our device partners to bring the Q2 release to as many devices as possible.
There's no change to the target API level requirements and the associated dates for apps in Google Play; our plans are for one annual requirement each year, and that will be tied to the major API level.
How to get ready
In addition to performing compatibility testing on the next major release, make sure that you're compiling your apps against the new SDK, and use the compatibility framework to enable targetSdkVersion-gated behavior changes as they become available for early testing.
App compatibility
The Android 16 Preview program runs from November 2024 until the final public release next year. At key development milestones, we'll deliver updates for your development and testing environments. Each update includes SDK tools, system images, emulators, API reference, and API diffs. We'll highlight critical APIs as they are ready to test in the preview program in blogs and on the Android 16 developer website.
We're targeting Late Q1 of 2025 for our Platform Stability milestone. At this milestone, we'll deliver final SDK/NDK APIs and also final internal APIs and app-facing system behaviors. We're expecting to reach Platform Stability in March 2025, and from that time you'll have several months before the official release to do your final testing. Learn more in the release timeline details.
Get started with Android 16
You can get started today with Developer Preview 2 by flashing a system image and updating the tools. If you are currently on Developer Preview 1, you will automatically get an over-the-air update to Developer Preview 2. We're looking for your feedback so please report issues and submit feature requests on the feedback page. The earlier we get your feedback, the more we can include in the final release.
For the best development experience with Android 16, we recommend that you use the latest preview of the Android Studio Ladybug feature drop. Once you're set up, here are some of the things you should do:
- Compile against the new SDK, test in CI environments, and report any issues in our tracker on the feedback page.
- Test your current app for compatibility, learn whether your app is affected by changes in Android 16, and install your app onto a device or emulator running Android 16 and extensively test it.
We'll update the preview system images and SDK regularly throughout the Android 16 release cycle. This preview release is for developers only and not intended for daily consumer use. We're making it available by manual download. Once you've manually installed a preview build, you'll automatically get future updates over-the-air for all later previews and Betas.
If you've already installed Android 15 QPR Beta 2 and would like to flash Android 16 Developer Preview 2, you can do so without first having to wipe your device.
As we reach our Beta releases, we'll be inviting consumers to try Android 16 as well, and we'll open up enrollment for Android 16 in the Android Beta program at that time.
For complete information, visit the Android 16 developer site.
18 Dec 2024 7:00pm GMT
17 Dec 2024
Android Developers Blog
How Instagram enabled users to take stunning Low Light Photos
Posted by Donovan McMurray - Developer Relations Engineer
Instagram, the popular photo and video sharing social networking service, is constantly delighting users with a best-in-class camera experience. Recently, Instagram launched another improvement on Android with their Night Mode implementation.
As devices and their cameras become more and more capable, users expect better quality images in a wider variety of settings. Whether it's a night out with friends or the calmness right after you get your baby to fall asleep, the special moments users want to capture often don't have ideal lighting conditions.
Now, when Instagram users on Android take a photo in low light environments, they'll see a moon icon that allows them to activate Night Mode for better image quality. This feature is currently available to users with any Pixel device from the 6 series and up, a Samsung Galaxy S24 Ultra, or a Samsung Flip6 or Fold6, with more devices to follow.
Leveraging Device-specific Camera Technologies
Android enables apps to take advantage of device-specific camera features through the Camera Extensions API. The Extensions framework currently provides functionality like Night Mode for low-light image captures, Bokeh for applying portrait-style background blur, and Face Retouch for beauty filters. All of these features are implemented by the Original Equipment Manufacturers (OEMs) in order to maximize the quality of each feature on the hardware it's running on.
Furthermore, exposing this OEM-specific functionality through the Extensions API allows developers to use a consistent implementation across all of these devices, getting the best of both worlds: implementations that are tuned to a wide range of devices with a unified API surface. According to Nilesh Patel, a Software Engineer at Instagram, "for Meta's billions of users, having to write custom code for each new device is simply not scalable. It would also add unnecessary app size when Meta users download the app. Hence our guideline is 'write once to scale to billions', favoring platform APIs."
More and more OEMs are supporting Extensions, too! There are already over 120 different devices that support the Camera Extensions, representing over 75 million monthly active users. There's never been a better time to integrate Extensions into your Android app to give your users the best possible camera experience.
Impact on Instagram
The results of adding Night Mode to Instagram have been very positive for Instagram users. Jin Cui, a Partner Engineer on Instagram, said "Night Mode has increased the number of photos captured and shared with the Instagram camera, since the quality of the photos are now visibly better in low-light scenes."
Compare the following photos to see just how big of a difference Night Mode makes. The first photo is taken in Instagram with Night Mode off, the second photo is taken in Instagram with Night Mode on, and the third photo is taken with the native camera app with the device's own low-light processing enabled.
Ensuring Quality through Image Test Suite (ITS)
The Android Camera Image Test Suite (ITS) is a framework for testing images from Android cameras. ITS tests configure the camera and capture shots to verify expected image data. These tests are functional and ensure advertised camera features work as expected. A tablet mounted on one side of the ITS box displays the test chart. The device under test is mounted on the opposite side of the ITS box.
Devices must pass the ITS tests for any feature that the device claims to support for apps to use, including the tests we have for the Night Mode Camera Extension.
The Android Camera team faced the challenge of ensuring the Night Mode Camera Extension feature functioned consistently across all devices in a scalable way. This required creating a testing environment with very low light and a wide dynamic range. This configuration was necessary to simulate real-world lighting scenarios, such as a city at night with varying levels of brightness and shadow, or the atmospheric lighting of a restaurant.
The first step to designing the test was to define the specific lighting conditions to simulate. Field testing with a light meter in various locations and lighting conditions was conducted to determine the target lux level. The goal was to ensure the camera could capture clear images in low-light conditions, which led to the establishment of 3 lux as the target lux level. The figure below shows various lighting conditions and their respective lux value.
The next step was to develop a test chart to accurately measure a wide dynamic range in a low light environment. The team developed and iterated on several test charts and arrived at the chart shown below. It arranges a grid of squares in varying shades of grey, with a red outline defining the test area for cropping so that darker external regions can be excluded. The grid follows a Hilbert curve pattern to minimize abrupt light or dark transitions. The design allows for both quantitative measurements and simulation of a broad range of light conditions.
An image of the test chart is captured using the Night Mode Camera Extension in low light conditions. The image is used to evaluate the improvement in the shadows and midtones while ensuring the highlights aren't saturated. This evaluation involves two criteria: the average luma value of the six darkest boxes must be at least 85, and the average luma contrast between these boxes must be at least 17. The figure below shows the test capture and chart results.
By leveraging the existing ITS infrastructure, the Android Camera team was able to provide consistent, high quality Night Mode Camera Extension captures. This gives application developers the confidence to integrate and enable Night Mode captures for their users. It also allows OEMs to validate their implementations and ensure users get the best quality capture.
How to Implement Night Mode with Camera Extensions
Camera Extensions are available to apps built with Camera2 or CameraX. In this section, we'll walk through each of the features Instagram implemented. The code examples will use CameraX, but you'll find links to the Camera2 documentation at each step.
Enabling Night Mode Extension
Night Mode involves combining multiple exposures into a single still photo for better quality shots in low-light environments. So first, you'll need to check for Night Mode availability, and tell the camera system to start a Camera Extension session. With CameraX, this is done with an ExtensionsManager instead of the standard CameraManager.
private suspend fun setUpCamera() {
    // Obtain an instance of a process camera provider. The camera provider
    // provides access to the set of cameras associated with the device.
    // The camera obtained from the provider will be bound to the activity lifecycle.
    val cameraProvider = ProcessCameraProvider.getInstance(application).await()

    // Obtain an instance of the extensions manager. The extensions manager
    // enables a camera to use extension capabilities available on the device.
    val extensionsManager = ExtensionsManager.getInstanceAsync(
        application, cameraProvider).await()

    // Select the camera.
    val cameraSelector = CameraSelector.DEFAULT_BACK_CAMERA

    // Query if extension is available. Not all devices will support
    // extensions or might only support a subset of extensions.
    if (extensionsManager.isExtensionAvailable(cameraSelector, ExtensionMode.NIGHT)) {
        // Unbind all use cases before enabling different extension modes.
        try {
            cameraProvider.unbindAll()

            // Retrieve a night extension enabled camera selector
            val nightCameraSelector = extensionsManager.getExtensionEnabledCameraSelector(
                cameraSelector,
                ExtensionMode.NIGHT
            )

            // Bind image capture and preview use cases with the extension enabled camera
            // selector.
            val imageCapture = ImageCapture.Builder().build()
            val preview = Preview.Builder().build()

            // Connect the preview to receive the surface the camera outputs the frames
            // to. This will allow displaying the camera frames in either a TextureView
            // or SurfaceView. The SurfaceProvider can be obtained from the PreviewView.
            preview.setSurfaceProvider(surfaceProvider)

            // Returns an instance of the camera bound to the lifecycle.
            // Use this camera object to control various operations with the camera.
            // Example: flash, zoom, focus metering etc.
            val camera = cameraProvider.bindToLifecycle(
                lifecycleOwner, nightCameraSelector, imageCapture, preview
            )
        } catch (e: Exception) {
            Log.e(TAG, "Use case binding failed", e)
        }
    } else {
        // In the case where the extension isn't available, you should set up
        // CameraX normally with a non-extension-enabled CameraSelector.
    }
}
To do this in Camera2, see the Create a CameraExtensionSession with the Camera2 Extensions API guide.
Implementing the Progress Bar and PostView Image
For an even more elevated user experience, you can provide feedback while the Night Mode capture is processing. In Android 14, we added callbacks for the progress and for post view, which is a temporary image capture before the Night Mode processing is complete. The below code shows how to use these callbacks in the takePicture() method. The actual implementation to update the UI is very app-dependent, so we'll leave the actual UI updating code to you.
// When setting up the ImageCapture.Builder, set postviewEnabled and
// postviewResolutionSelector in order to get a PostView bitmap in the
// onPostviewBitmapAvailable callback when takePicture() is called.
val cameraInfo = cameraProvider.getCameraInfo(cameraSelector)
val isPostviewSupported =
    ImageCapture.getImageCaptureCapabilities(cameraInfo).isPostviewSupported

val postviewResolutionSelector = ResolutionSelector.Builder()
    .setAspectRatioStrategy(AspectRatioStrategy(
        AspectRatioStrategy.RATIO_16_9_FALLBACK_AUTO_STRATEGY,
        AspectRatioStrategy.FALLBACK_RULE_AUTO))
    .setResolutionStrategy(ResolutionStrategy(
        previewSize,
        ResolutionStrategy.FALLBACK_RULE_CLOSEST_LOWER_THEN_HIGHER
    ))
    .build()

imageCapture = ImageCapture.Builder()
    .setTargetAspectRatio(AspectRatio.RATIO_16_9)
    .setPostviewEnabled(isPostviewSupported)
    .setPostviewResolutionSelector(postviewResolutionSelector)
    .build()

// When the Night Mode photo is being taken, define these additional callbacks
// to implement PostView and a progress indicator in your app.
imageCapture.takePicture(
    outputFileOptions,
    Dispatchers.Default.asExecutor(),
    object : ImageCapture.OnImageSavedCallback {
        override fun onPostviewBitmapAvailable(bitmap: Bitmap) {
            // Add the Bitmap to your UI as a placeholder while the final result is processed
        }

        override fun onCaptureProcessProgressed(progress: Int) {
            // Use the progress value to update your UI; values go from 0 to 100.
        }
    }
)
To accomplish this in Camera2, see the CameraFragment.kt file in the Camera2Extensions sample app.
Implementing the Moon Icon Indicator
Another user-focused design touch is showing the moon icon to let the user know that a Night Mode capture will happen. It's also a good idea to let the user tap the moon icon to disable Night Mode capture. There's an upcoming API in Android 16 next year to let you know when the device is in a low-light environment.
Here are the possible values for the Night Mode Indicator API:
UNKNOWN
- The camera is unable to reliably detect the lighting conditions of the current scene to determine if a photo will benefit from a Night Mode Camera Extension capture.
OFF
- The camera has detected lighting conditions that are sufficiently bright. Night Mode Camera Extension is available but may not be able to optimize the camera settings to take a higher quality photo.
ON
- The camera has detected low-light conditions. It is recommended to use Night Mode Camera Extension to optimize the camera settings to take a high-quality photo in the dark.
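As a hedged sketch of how an app might react to these values (the NIGHT_MODE_* constants and the updateMoonIcon/showMoonIcon names below are placeholders until the API is finalized):

```kotlin
// Hypothetical sketch: toggling the moon icon from the indicator value.
fun updateMoonIcon(indicator: Int) {
    when (indicator) {
        NIGHT_MODE_ON -> showMoonIcon(enabled = true)    // low light detected
        NIGHT_MODE_OFF -> showMoonIcon(enabled = false)  // scene is bright enough
        else -> showMoonIcon(enabled = false)            // UNKNOWN: don't prompt
    }
}
```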
Next Steps
Read more about Android's camera APIs in the Camera2 guides and the CameraX guides. Once you've got the basics down, check out the Android Camera and Media Dev Center to take your camera app development to the next level. For more details on upcoming Android features, like the Night Mode Indicator API, get started with the Android 16 Preview program.
17 Dec 2024 8:15pm GMT
What's new in CameraX 1.4.0 and a sneak peek of Jetpack Compose support
Posted by Scott Nien - Software Engineer (scottnien@)
Get ready to level up your Android camera apps! CameraX 1.4.0 just dropped with a load of awesome new features and improvements. We're talking expanded HDR capabilities, preview stabilization and the versatile effect framework, and a whole lot of cool stuff to explore. We will also explore how to seamlessly integrate CameraX with Jetpack Compose! Let's dive in and see how these enhancements can take your camera app to the next level.
HDR preview and Ultra HDR
High Dynamic Range (HDR) is a game-changer for photography, capturing a wider range of light and detail to create stunningly realistic images. With CameraX 1.3.0, we brought you HDR video recording capabilities, and now in 1.4.0, we're taking it even further! Get ready for HDR Preview and Ultra HDR. These exciting additions empower you to deliver an even richer visual experience to your users.
HDR Preview
This new feature allows you to enable HDR on Preview without needing to bind a VideoCapture use case. This is especially useful for apps that use a single preview stream for both showing preview on display and video recording with an OpenGL pipeline.
To fully enable the HDR, you need to ensure your OpenGL pipeline is capable of processing the specific dynamic range format and then check the camera capability.
See the following code snippet for an example of enabling HLG10, the baseline HDR standard that device makers must support on cameras with 10-bit output.
// Declare your OpenGL pipeline supported dynamic range format.
val openGLPipelineSupportedDynamicRange = setOf(
    DynamicRange.SDR,
    DynamicRange.HLG_10_BIT
)

// Check camera dynamic range capabilities.
val isHlg10Supported = cameraProvider.getCameraInfo(cameraSelector)
    .querySupportedDynamicRanges(openGLPipelineSupportedDynamicRange)
    .contains(DynamicRange.HLG_10_BIT)

val preview = Preview.Builder().apply {
    if (isHlg10Supported) {
        setDynamicRange(DynamicRange.HLG_10_BIT)
    }
}.build()
Ultra HDR
Introducing Ultra HDR, a new format in Android 14 that lets users capture stunningly realistic photos with incredible dynamic range. And the best part? CameraX 1.4.0 makes it incredibly easy to add Ultra HDR capture to your app with just a few lines of code:
val cameraSelector = CameraSelector.DEFAULT_BACK_CAMERA
val cameraInfo = cameraProvider.getCameraInfo(cameraSelector)

val isUltraHdrSupported = ImageCapture.getImageCaptureCapabilities(cameraInfo)
    .supportedOutputFormats
    .contains(ImageCapture.OUTPUT_FORMAT_JPEG_ULTRA_HDR)

val imageCapture = ImageCapture.Builder().apply {
    if (isUltraHdrSupported) {
        setOutputFormat(ImageCapture.OUTPUT_FORMAT_JPEG_ULTRA_HDR)
    }
}.build()
Jetpack Compose support
While this post focuses on 1.4.0, we're excited to announce the Jetpack Compose support in CameraX 1.5.0 alpha. We're adding support for a Composable Viewfinder built on top of AndroidExternalSurface and AndroidEmbeddedExternalSurface. The CameraXViewfinder Composable hooks up a display surface to a CameraX Preview use case, handling the complexities of rotation, scaling and Surface lifecycle so you don't need to.
// in build.gradle
implementation("androidx.camera:camera-compose:1.5.0-alpha03")

class PreviewViewModel : ViewModel() {
    private val _surfaceRequests = MutableStateFlow<SurfaceRequest?>(null)

    val surfaceRequests: StateFlow<SurfaceRequest?>
        get() = _surfaceRequests.asStateFlow()

    private fun produceSurfaceRequests(previewUseCase: Preview) {
        // Always publish new SurfaceRequests from Preview
        previewUseCase.setSurfaceProvider { newSurfaceRequest ->
            _surfaceRequests.value = newSurfaceRequest
        }
    }

    // ...
}

@Composable
fun MyCameraViewfinder(
    viewModel: PreviewViewModel,
    modifier: Modifier = Modifier
) {
    val currentSurfaceRequest: SurfaceRequest? by viewModel.surfaceRequests.collectAsState()

    currentSurfaceRequest?.let { surfaceRequest ->
        CameraXViewfinder(
            surfaceRequest = surfaceRequest,
            implementationMode = ImplementationMode.EXTERNAL, // Or EMBEDDED
            modifier = modifier
        )
    }
}
Kotlin-friendly APIs
CameraX is getting even more Kotlin-friendly! In 1.4.0, we've introduced two new suspend functions to streamline camera initialization and image capture.
// CameraX initialization
val cameraProvider = ProcessCameraProvider.awaitInstance()

// Image capture
val imageProxy = imageCapture.takePicture()
// Processing imageProxy
imageProxy.close()
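In practice these calls live inside a coroutine. The following is a minimal sketch only: names like MyActivity, cameraSelector, and imageCapture are placeholders, and it assumes the Context-taking awaitInstance overload and an Activity with a lifecycleScope.

```kotlin
// Sketch, not part of the release notes -- placeholder names throughout.
lifecycleScope.launch {
    // Suspends until the provider is ready, replacing the ListenableFuture dance.
    val cameraProvider = ProcessCameraProvider.awaitInstance(this@MyActivity)
    cameraProvider.bindToLifecycle(this@MyActivity, cameraSelector, imageCapture)

    // Suspends until capture completes and hands back the ImageProxy directly.
    val imageProxy = imageCapture.takePicture()
    try {
        // ... process imageProxy ...
    } finally {
        imageProxy.close()
    }
}
```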
Preview Stabilization and Mirror mode
Preview Stabilization
Preview stabilization mode was added in Android 13 to enable the stabilization on all non-RAW streams, including previews and MediaCodec input surfaces. Compared to the previous video stabilization mode, which may have inconsistent FoV (Field of View) between the preview and recorded video, this new preview stabilization mode ensures consistency and thus provides a better user experience. For apps that record the preview directly for video recording, this mode is also the only way to enable stabilization.
Follow the code below to enable preview stabilization. Note that once preview stabilization is turned on, it applies not only to the Preview but also to the VideoCapture use case if one is bound.
val isPreviewStabilizationSupported =
    Preview.getPreviewCapabilities(cameraProvider.getCameraInfo(cameraSelector))
        .isStabilizationSupported

val preview = Preview.Builder().apply {
    if (isPreviewStabilizationSupported) {
        setPreviewStabilizationEnabled(true)
    }
}.build()
MirrorMode
While CameraX 1.3.0 introduced mirror mode for VideoCapture, we've now brought this handy feature to Preview in 1.4.0. This is especially useful for devices with outer displays, allowing you to create a more natural selfie experience when using the rear camera.
To enable mirror mode, simply call the Preview.Builder.setMirrorMode API. This feature is supported on Android 13 and above.
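A minimal sketch of the call (MIRROR_MODE_ON is assumed to come from androidx.camera.core.MirrorMode; verify the exact constant against the 1.4.0 reference):

```kotlin
// Sketch: mirror the preview stream, e.g. for a rear-camera selfie shown
// on an outer display. Requires Android 13+ as noted above.
val preview = Preview.Builder()
    .setMirrorMode(MirrorMode.MIRROR_MODE_ON)
    .build()
```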
Real-time Effect
CameraX 1.3.0 introduced the CameraEffect framework, giving you the power to customize your camera output with OpenGL. Now, in 1.4.0, we're taking it a step further. In addition to applying your own custom effects, you can now leverage a set of pre-built effects provided by CameraX and Media3, making it easier than ever to enhance your app's camera features.
Overlay Effect
The new camera-effects artifact aims to provide ready-to-use effect implementations, starting with the OverlayEffect. This effect lets you draw overlays on top of camera frames using the familiar Canvas API.
The following sample code shows how to detect a QR code and draw its shape once it is detected.
By default, drawing is performed in surface frame coordinates. But what if you need to use camera sensor coordinates? No problem! OverlayEffect provides the Frame#getSensorToBufferTransform function, allowing you to apply the necessary transformation matrix to your overlayCanvas.
In this example, we use CameraX's MLKit Vision APIs (MlKitAnalyzer) and specify COORDINATE_SYSTEM_SENSOR to obtain QR code corner points in sensor coordinates. This ensures accurate overlay placement regardless of device orientation or screen aspect ratio.
// in build.gradle
implementation("androidx.camera:camera-effects:1.4.1")
implementation("androidx.camera:camera-mlkit-vision:1.4.1")

var qrcodePoints: Array<Point>? = null
var qrcodeTimestamp = 0L
val qrcodeBoxEffect = OverlayEffect(
    PREVIEW, // applied on the preview only
    5, // hold multiple frames in the queue so we can match the analysis result with the preview frame
    Handler(Looper.getMainLooper()),
    {}
)

fun initCamera() {
    qrcodeBoxEffect.setOnDrawListener { frame ->
        if (frame.timestamp != qrcodeTimestamp) {
            // Do not change the drawing if the frame doesn't match the analysis result.
            return@setOnDrawListener true
        }
        frame.overlayCanvas.drawColor(Color.TRANSPARENT, PorterDuff.Mode.CLEAR)
        qrcodePoints?.let {
            // Using sensor coordinates to draw.
            frame.overlayCanvas.setMatrix(frame.sensorToBufferTransform)
            val path = android.graphics.Path().apply {
                it.forEachIndexed { index, point ->
                    if (index == 0) {
                        moveTo(point.x.toFloat(), point.y.toFloat())
                    } else {
                        lineTo(point.x.toFloat(), point.y.toFloat())
                    }
                }
                lineTo(it[0].x.toFloat(), it[0].y.toFloat())
            }
            frame.overlayCanvas.drawPath(path, paint)
        }
        true
    }

    val imageAnalysis = ImageAnalysis.Builder()
        .build()
        .apply {
            setAnalyzer(
                executor,
                MlKitAnalyzer(
                    listOf(barcodeScanner!!),
                    COORDINATE_SYSTEM_SENSOR,
                    executor
                ) { result ->
                    val barcodes = result.getValue(barcodeScanner!!)
                    qrcodePoints = barcodes?.takeIf { it.size > 0 }?.get(0)?.cornerPoints
                    // Track the timestamp of the analysis result and release the
                    // preview frame.
                    qrcodeTimestamp = result.timestamp
                    qrcodeBoxEffect.drawFrameAsync(qrcodeTimestamp)
                }
            )
        }

    val useCaseGroup = UseCaseGroup.Builder()
        .addUseCase(preview)
        .addUseCase(imageAnalysis)
        .addEffect(qrcodeBoxEffect)
        .build()

    cameraProvider.bindToLifecycle(lifecycleOwner, cameraSelector, useCaseGroup)
}
Here is what the effect looks like:
Screen Flash
Taking selfies in low light just got easier with CameraX 1.4.0! This release introduces a powerful new feature: screen flash. Instead of relying on a traditional LED flash which most selfie cameras don't have, screen flash cleverly utilizes your phone's display. By momentarily turning the screen bright white, it provides a burst of illumination that helps capture clear and vibrant selfies even in challenging lighting conditions.
Integrating screen flash into your CameraX app is flexible and straightforward. You have two main options:
1. Implement the ScreenFlash interface: This gives you full control over the screen flash behavior. You can customize the color, intensity, duration, and any other aspect of the flash. This is ideal if you need a highly tailored solution.
2. Use the built-in implementation: For a quick and easy solution, leverage the pre-built screen flash functionality in ScreenFlashView or PreviewView. This implementation handles all the heavy lifting for you.
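For option 1, the interface is small. The following is a rough sketch of a custom implementation; the overlay view and the brightness handling are illustrative assumptions, not part of the CameraX API.

```kotlin
// Sketch of a custom ImageCapture.ScreenFlash: show a full-screen white
// overlay before capture and restore the UI afterwards. overlayView is a
// placeholder; a real implementation would also raise window brightness.
class MyScreenFlash(private val overlayView: View) : ImageCapture.ScreenFlash {

    override fun apply(
        expirationTimeMillis: Long,
        listener: ImageCapture.ScreenFlashListener
    ) {
        overlayView.setBackgroundColor(Color.WHITE)
        overlayView.visibility = View.VISIBLE
        // Notify CameraX once the screen is fully lit so capture can proceed.
        overlayView.post { listener.onCompleted() }
    }

    override fun clear() {
        // Capture finished (or timed out): restore the UI.
        overlayView.visibility = View.GONE
    }
}
```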
If you're already using PreviewView in your app, enabling screen flash is incredibly simple. Just enable it directly on the PreviewView instance. If you need more control or aren't using PreviewView, you can use ScreenFlashView directly.
Here's a code example demonstrating how to enable screen flash:
// Case 1: PreviewView + CameraX core API
previewView.setScreenFlashWindow(activity.window)
imageCapture.screenFlash = previewView.screenFlash
imageCapture.setFlashMode(ImageCapture.FLASH_MODE_SCREEN)

// Case 2: PreviewView + CameraController
previewView.setScreenFlashWindow(activity.window)
cameraController.setImageCaptureFlashMode(ImageCapture.FLASH_MODE_SCREEN)

// Case 3: ScreenFlashView
screenFlashView.setScreenFlashWindow(activity.window)
imageCapture.setScreenFlash(screenFlashView.getScreenFlash())
imageCapture.setFlashMode(ImageCapture.FLASH_MODE_SCREEN)
Camera Extensions new features
Camera Extensions APIs aim to give apps access to the cutting-edge capabilities previously available only in built-in camera apps. And the ecosystem is growing rapidly! In 2024, we've seen major players like Pixel, Samsung, Xiaomi, Oppo, OnePlus, Vivo, and Honor all embrace Camera Extensions, particularly for Night Mode and Bokeh Mode. CameraX 1.4.0 takes this even further by adding support for brand-new Android 15 Camera Extensions features, including:
- Postview: Provides a preview of the captured image almost instantly before the long-exposure shots are completed
- Capture Process Progress: Displays a progress indicator so users know how long capturing and processing will take, improving the experience for features like Night Mode
- Extensions Strength: Allows users to fine-tune the intensity of the applied effect
Below is an example of the improved UX that uses postview and capture process progress features on Samsung S24 Ultra.
Interested to know how this can be implemented? See the sample code below:
val extensionsCameraSelector = extensionsManager
    .getExtensionEnabledCameraSelector(DEFAULT_BACK_CAMERA, extensionMode)

val isPostviewSupported = ImageCapture.getImageCaptureCapabilities(
    cameraProvider.getCameraInfo(extensionsCameraSelector)
).isPostviewSupported

val imageCapture = ImageCapture.Builder().apply {
    setPostviewEnabled(isPostviewSupported)
}.build()

imageCapture.takePicture(outputFileOptions, executor, object : OnImageSavedCallback {
    override fun onImageSaved(outputFileResults: OutputFileResults) {
        // Final image saved.
    }

    override fun onPostviewBitmapAvailable(bitmap: Bitmap) {
        // Postview bitmap is available.
    }

    override fun onCaptureProcessProgressed(progress: Int) {
        // Capture process progress update.
    }
})
Important: If your app ran into the CameraX Extensions issue on Pixel 9 series devices, please use CameraX 1.4.1 instead. This release fixes a critical issue that prevented Night Mode from working correctly with takePicture.
What's Next
We hope you enjoy this new release. Our mission is to make camera development a joy, removing the friction and pain points so you can focus on innovation. With CameraX, you can easily harness the power of Android's camera capabilities and build truly amazing app experiences.
Have questions or want to connect with the CameraX team? Join the CameraX developer discussion group or file a bug report.
We can't wait to see what you create!
17 Dec 2024 8:00pm GMT
Get your apps ready for 16 KB page size devices
Posted by Yacine Rezgui - Developer Relations Engineer, Steven Moreland - Staff Software Engineer
Android is evolving to deliver even faster, more performant experiences. One key improvement is the adoption of a 16 KB memory page size. This change enables the operating system to manage memory more efficiently, leading to noticeable performance gains (5-10%) in both apps and games. We provided an in-depth technical explanation and highlighted the performance improvements in Adding 16 KB Page Size to Android.
To help you test your app on 16 KB devices, this functionality is available as a developer option on Google Pixel 8 and 9 devices; Samsung devices will soon offer similar support, as will Xiaomi, vivo, and other Android OEMs.
To ensure compatibility with 16 KB devices, apps that utilize native code, either directly or through libraries or SDKs, might require rebuilding. However, the transition is significantly easier than the previous shift from 32-bit to 64-bit architecture. This article will guide you through the necessary steps to prepare your apps for the upcoming devices. The next generation of devices is on its way, with the first models supporting 16 KB page sizes expected to arrive in a couple of years.
Getting ready for 16 KB: SDK developers
If you develop your own SDKs and libraries, we encourage you to update them to be 16 KB page size compatible and test them on 16 KB devices as soon as possible. This will give app developers ample time to incorporate the necessary changes. Registering with Play SDK Console is a great way to ensure you receive advanced notices like these in the future and in a timely manner.
Getting ready for 16 KB: app developers with no native code
Apps written entirely in Kotlin or the Java programming language, with dependencies that are too, will work as-is!
Getting ready for 16 KB: app developers with native code
To check if your app has native code, you can utilize tools like APK Analyzer in Android Studio. However, the only way to ensure app compatibility is to test.
Rebuild your app
To ensure your app works on devices with a 16 KB page size, follow these steps:
1. Upgrade your tools: Start by upgrading to Android Gradle Plugin (AGP) 8.5.1 or higher. These updated tools incorporate the necessary 16 KB page size configuration for your App Bundle and the APKs generated from it using bundletool.
2. Align your native code: If your app includes native code, use NDK version r28 or higher, or rebuild it with 16 KB page size alignment. You should also ensure that your native code does not rely on or hardcode the value of PAGE_SIZE.
3. Update SDKs and libraries: Confirm that all SDKs and libraries used in your app are compatible with 16 KB page size. If necessary, contact the SDK or library developers for updated versions.
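The PAGE_SIZE warning in step 2 boils down to: query the page size at runtime rather than assuming 4096. A Kotlin-side sketch using android.system.Os:

```kotlin
import android.system.Os
import android.system.OsConstants

// Query the page size at runtime: 4096 on today's devices, 16384 on
// 16 KB devices. Size and alignment math based on this value stays correct.
val pageSize: Long = Os.sysconf(OsConstants._SC_PAGESIZE)

// Round a requested size up to a whole number of pages.
fun alignToPage(size: Long, page: Long): Long = (size + page - 1) / page * page
// e.g. alignToPage(10_000, 16_384) == 16_384
```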
Test your app in 16 KB mode
To make sure your application does not assume the page size to be 4 KB anywhere, test it with a 16 KB page size emulator or virtual device in addition to how you have been testing (with a 4 KB page size). This helps identify and resolve any compatibility issues from the move to 16 KB page sizes. You can also test on physical devices with the developer option available on Pixel 8, 8a, and 8 Pro starting with the Android 15 QPR1 and Pixel 9, 9 Pro, 9 Pro XL in the Android 15 QPR2 Beta 2, with more devices on the way.
The Future is Faster and More Efficient
The move to 16 KB page size benefits the Android ecosystem. It unlocks performance improvements, paves the way for future innovations, and provides users with smoother and richer app experiences.
We'll continue to provide updates and resources to help you through this transition. Start preparing your apps today to ensure you're ready for the future of Android!
17 Dec 2024 9:00am GMT
13 Dec 2024
Android Developers Blog
#WeArePlay | Meet the people building sport apps and games
Posted by Robbie McLachlan - Developer Marketing
In a year filled with iconic sports moments-from the Olympic and Paralympic Games in Paris to the UEFA Euro Cup in Germany-our celebration of app and game businesses continues with nine new #WeArePlay stories. These founders are building sports apps and games that unite players, fans, and communities-from immersive sports simulations to apps that motivate runners with rewards like vouchers and free gifts.
Let's take a look at some of my favourites.
Immerse yourself into your favourite sport with Hao, Yukun, and Mingming's simulator games
Hao always dreamed of creating video games. After studying computer science, he joined a gaming company where he met Yukun and Mingming. Their shared passion for game design and long conversations about graphics, movie scenes, and nostalgic childhood games inspired them to start Feamber Games. Specializing in realistic 3D sports simulations like pool and archery, they've added competitive elements to enhance the experience. Recently, they've expanded into immersive games that let players build business empires and manage hotels. Now, the trio is focused on growing their global audience.
Anna's boxing fitness app is a knockout, with tailored training and on-demand classes
Anna discovered her love for boxing at 11, staying dedicated to non-contact training throughout adulthood. After a career in accounting and becoming a mother, she struggled to attend classes, inspiring her to create Boxx - an app that brings boxing training to any location. Collaborating with fitness instructors, she developed personalized sessions, hybrid workouts, expert-led on-demand classes, and progress tracking. With hands-free guided audio and community features coming soon, Anna is regularly reviewing feedback to find innovative approaches to improve boxers' experiences.
Get active and track your progress with Yi Hern, Dana, and Pearl's running app
After creating a successful augmented reality game, childhood friends Yi Hern, Dana, and Pearl decided to inspire people to stay active. Combining Yi Hern's engineering skills, Dana's visual arts expertise, and Pearl's scientific background, they developed JomRun - Let's Run. The app allows runners to track their progress, earn rewards like vouchers and free gifts, and easily join marathons. With teams in Malaysia and Singapore, and plans to introduce new features, the trio is gearing up to expand across Southeast Asia.
Ohjun and Jaeho's volleyball game gets high scores from players worldwide
Ohjun and Jaeho, childhood friends from an online game development community, combined their love for game building and volleyball to create The Spike - Volleyball Story. After a successful test release on Google Play, the game gained popularity in South Korea, inspiring them to improve it and reach a global audience. They added new features like story and tournament modes, plus a complete UX overhaul, all to recreate the excitement of real-life volleyball. Now, they're focused on creating even more thrilling sports games.
13 Dec 2024 2:00pm GMT
12 Dec 2024
Reddit improved app startup speed by over 50% using Baseline Profiles and R8
Posted by Ben Weiss - Developer Relations Engineer, and Lauren Darcey - Senior Engineering Manager, Reddit
Reddit is one of the world's largest internet forums, bringing together countless communities looking for entertainment, answers to everyday questions, and so much more.
Recently, the team optimized its Android app to reduce startup times and improve rendering performance using Baseline Profiles. But the team didn't stop there. Reddit app developers also enabled Android's R8 compiler in full mode to maximize bytecode optimization and used Jetpack Compose to rewrite legacy UI, improving both the user and developer experience.
Maximizing optimization using Baseline Profiles and R8 full mode
The Reddit Android app has undergone countless performance upgrades over the years. Reddit developers have long since cleared the list of quick and easy tasks for optimization, but the team still wants to improve the app, bringing its performance to the next level and ensuring it runs well on every Android device.
"Reddit is looking for any strategic improvement to its app performance so we can make the app experience better for new and existing users," said Rob McWhinnie, a staff engineer at Reddit. "Baseline Profiles fit this use case well since they are based on critical user journeys."
Reddit's platform engineering team used screen-specific performance metrics and observability to help its feature teams improve key metrics like time to interactive and scroll performance. Baseline Profiles were a natural fit to help improve these metrics and the user experience behind them, so the team integrated them to make tracking and optimizing easier, using insights from geodata and device classes.
The team built Baseline Profiles for five critical user journeys so far, like scrolling the home feed, logging in, launching the full-screen video player, navigating between subreddits and scrolling their feeds, and using the chat feature.
Simplifying Baseline Profile management in their continuous integration processes enabled Reddit to remove manual maintenance and streamline optimization. Now, Baseline Profiles are automatically regenerated for each release.
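Automatic regeneration like this can be wired up with the Baseline Profile Gradle plugin. The following is a sketch of the app module configuration, not Reddit's actual setup; the ":baselineprofile" module name assumes the test module generated by the Android Studio template.

```kotlin
// build.gradle.kts (app module) -- illustrative sketch.
plugins {
    id("androidx.baselineprofile")
}

dependencies {
    // The benchmark/generator module produced by the Baseline Profile template.
    baselineProfile(project(":baselineprofile"))
}

baselineProfile {
    // Regenerate the profile during release builds so CI picks it up
    // without manual maintenance.
    automaticGenerationDuringBuild = true
}
```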
Enabling Android's R8 optimization compiler in full mode was another area Reddit engineers worked on. The team had already used R8 in compatibility mode, but some of Reddit's legacy code would've made implementing R8's more aggressive features difficult. The team worked through the app's existing technical debt first, making it easier to integrate R8's full mode capabilities and maximize Android app optimization.
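For reference, turning on R8 full mode is a small configuration change; this sketch reflects the standard AGP setup (full mode is the default from AGP 8.0, and the gradle.properties flag applies to older versions):

```kotlin
// gradle.properties (AGP < 8.0 only; full mode is the default in AGP 8.0+):
// android.enableR8.fullMode=true

// build.gradle.kts -- R8 runs when minification is enabled for the build type:
android {
    buildTypes {
        release {
            isMinifyEnabled = true
            proguardFiles(
                getDefaultProguardFile("proguard-android-optimize.txt"),
                "proguard-rules.pro"
            )
        }
    }
}
```

Full mode applies more aggressive assumptions than compatibility mode, which is why keep rules for reflection-heavy legacy code need auditing first, as the Reddit team found.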
Improvements with Baseline Profiles and R8 full mode
Reddit's Baseline Profiles and R8 full mode optimization led to multiple performance improvements across the app, with early benchmarks of the first Baseline Profile for feeds showing a 51% median startup time improvement. While responses from Redditors initially confirmed large startup improvements, Baseline Profile optimizations for less frequent journeys, like logging in, saw fewer user reports.
Baseline Profiles for the home feed had a 36% reduction in frozen frames' 95th percentile. Baseline Profiles for the community feed also delivered strong screen load and scroll performance improvements. At the 90th percentile, screen Time To Interactive improved by 12% and time to first draw decreased by 22%. Reddit's scrolling performance also saw a 12% reduction in P90 slow frames.
The upgrade to R8 full mode led to an increase in Google Play average ratings. The proportion of global positive ratings (fours and fives) increased by four percent, with a notable decrease in negative reports. R8 full mode also reduced total application-not-responding errors by almost 30%.
Overall, the app saw cold start improvements of 20%, scroll performance improvements of 15%, and widespread enhancements on lower-end devices and in emerging markets. Google Play vitals saw improvements in slow cold starts, a 10% reduction in excessive frozen frames, and a 30% reduction in excessive slow frames. Nearly 75% of screens refactored using Jetpack Compose experienced performance gains.
Further optimizations using Jetpack Compose
Reddit adopted Jetpack Compose years ago and has since rebuilt much of its UI with the toolkit, benefitting both the app and its design system. According to the Reddit team, Google's ongoing support for Compose's stability and performance made it a strong fit as Reddit scaled its app, allowing for more efficient feature development and better performance.
One major example is Reddit's feed rewrite using Compose, which resulted in more maintainable code and an improved developer experience. Compose enabled teams to focus on future work instead of being bogged down by legacy code, allowing them to fix bugs quickly and improve overall app stability.
"The R8 and Compose upgrades were important to deploy in relative isolation and stabilize," said Drew Heavner, a staff engineer at Reddit. "We feel like we got great outcomes from this work for all teams adopting our modern tech stack and Compose."
After upgrading to the September 2024 release of Compose, the latest iteration, Reddit saw significant performance gains across the board. Cold start times improved by 13%, excessive slow frames decreased by 25%, and frozen frames dropped by 10%. Low- and mid-tier devices saw even greater improvements where app start times improved by up to 40%, especially in markets with lower-performing devices.
Screens using Reddit's modern Compose-powered design stack showed substantial improvements in both slow and frozen frame rates. For example, the home feed saw a 23% reduction in frozen frames, and scrolling performance visibly improved according to internal reviews. These updates were well received among users and reflected a 17% increase in the app's Google Play average rating.
Up-leveling UX through optimization
Adding value to an app isn't just about introducing new features-it's about refining and optimizing the ones users already love. Investing in performance improvements made Reddit's key features faster and more reliable, enhancing the overall user experience. These optimizations not only improved app startup and runtime performance but also simplified development workflows, increasing both developer satisfaction and app stability.
The focus on high-traffic features, such as feeds, has demonstrated the power of performance tuning, with substantial gains in user engagement and satisfaction. As the app has become more efficient, both users and developers have benefitted from a cleaner codebase and faster performance.
Looking ahead, Reddit plans to extend the usage of Baseline Profiles to other critical user journeys, including Reddit's post and comment experiences, ensuring even more users benefit from these ongoing performance improvements.
Reddit's platform engineers also want to continue collaborating with feature teams to integrate performance improvements across the app. These efforts will ensure that as the app evolves, it remains a smooth, fast, and engaging experience for all Redditors.
"Adding new features isn't the only way to add value to an experience for users," said Lauren Darcey, a senior engineering manager at Reddit. "When you find a feature that users love and engage with, taking the time to refine and optimize it can be the difference between a good and a great experience for your users."
Get started
Improve your app performance using Baseline Profiles, R8 full mode, and Jetpack Compose.
12 Dec 2024 10:00pm GMT
Introducing Android XR SDK Developer Preview
Posted by Matthew McCullough - VP of Product Management, Android Developer
Today, we're launching the developer preview of the Android XR SDK - a comprehensive development kit for Android XR. It's the newest platform in the Android family built for extended reality (XR) headsets (and glasses in the future!). You'll have endless opportunities to create and develop experiences that blend digital and physical worlds, using familiar Android APIs, tools and open standards created for XR. All of this means: if you build for Android, you're already building for XR! Read on to get started with development for headsets.
With the Android XR SDK you can:
- Break free of traditional screens by spatializing your app with rich 3D elements, spatial panels, and spatial audio that bring a natural sense of depth, scale, and tangible realism
- Transport your users to a fantastical virtual space, or engage with them in their own homes or workplaces
- Take advantage of natural, multimodal interaction capabilities such as hands and eyes
"We believe Android XR is a game-changer for storytelling. It allows us to merge narrative depth with advanced interactive features, creating an immersive world where audiences can engage with characters and stories like never before."
- Jed Weintrob, Partner at 30 Ninjas
Your apps on Android XR
The Android XR SDK is built on the existing foundations of Android app development. We're also bringing the Play Store to Android XR, where most Android apps will automatically be made available without any additional development effort. Users will be able to discover and use your existing apps in a whole new dimension. To differentiate your existing Compose app, you may opt in to automatically spatialize Material Design (M3) components and Compose adaptive layouts in XR.
The Android XR SDK has something for every developer:
- Building with Kotlin and Android Studio? You'll feel right at home with the Jetpack XR SDK, a suite of familiar libraries and tools to simplify development and accelerate productivity.
- Using Unity's real-time 3D engine? The Android XR Extensions for Unity provides the packages you need to build or port powerful, immersive experiences.
- Developing on the web? Use WebXR to add immersive experiences supported on Chrome.
- Working with native languages like C/C++? Android XR supports the OpenXR 1.1 standard.
Creating with Jetpack XR SDK
The Jetpack XR SDK includes new Jetpack libraries purpose-built for XR. The highlights include:
- Jetpack Compose for XR - enables you to declaratively create spatial UI layouts and spatialize your existing 2D UI built with Compose or Views
- Material Design for XR - includes components and layouts that automatically adapt for XR
- Jetpack SceneCore - provides the foundation for building custom 3D experiences
- ARCore for Jetpack XR - brings powerful perception capabilities for your app to understand the real world
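As a taste of the declarative style, a minimal spatial panel with Jetpack Compose for XR might look like the sketch below. It is based on the androidx.xr.compose developer preview (Subspace, SpatialPanel, SubspaceModifier), so the exact API surface may change.

```kotlin
// Sketch using Android XR developer preview APIs (subject to change).
@Composable
fun MySpatialContent() {
    Subspace {
        SpatialPanel(
            modifier = SubspaceModifier
                .width(1280.dp)
                .height(800.dp)
        ) {
            // Regular 2D Compose UI, now hosted on a panel in 3D space.
            Text("Hello, Android XR!")
        }
    }
}
```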
"With Android XR, we can bring Calm directly into your world, capturing the senses and allowing you to experience it in a deeper and more transformative way. By collaborating closely with the Android XR team on this cutting-edge technology, we've reimagined how to create a sense of depth and space, resulting in a level of immersion that instantly helps you feel more present, focused, and relaxed."
- Dan Szeto, Vice President at Calm Studios
Kickstart your Jetpack XR SDK journey with the Hello XR Sample, a straightforward introduction to the essential features of Jetpack Compose for XR.
Learn more about developing with the Jetpack XR SDK.
We're also introducing new tools and capabilities to the latest preview of Android Studio Meerkat to boost productivity and simplify your creation process for Android XR.
- Use the new Android XR Emulator to create a virtualized XR device for deploying and testing apps built with the Jetpack XR SDK. The emulator includes XR-specific controls for using a keyboard and mouse to navigate an emulated virtual space.
- Use the Android XR template to get a jump-start on creating an app with Jetpack Compose for XR.
- Use the updated Layout Inspector to inspect and debug spatialized UI components created with Jetpack Compose for XR.
Learn more about the XR enabled tools in Android Studio and the Android XR Emulator.
Creating with Unity
We've partnered with Unity to natively integrate their real-time 3D engine with Android XR starting with Unity 6. Unity is introducing the Unity OpenXR: Android XR package for bringing your multi-platform XR experiences to Android XR.
Unity is adding Android XR support to these popular XR packages:
We're also rolling out the Android XR Extensions for Unity with samples and innovative features such as mouse interaction profile, environment blend mode, personalized hand mesh, object tracking, and more.
"Having already brought Demeo to most commercially available platforms, it's safe to say we were impressed with the process of adapting the game to run on Android XR."
- Johan Gastrin, CTO at Resolution Games
Check out our getting started guide for Unity and Unity's blog post to learn more.
Creating for the Web
Chrome on Android XR supports the WebXR standard. If you're building for the web, you can enhance existing sites with 3D content or build new immersive experiences. You can also use full-featured frameworks like three.js, A-Frame, or PlayCanvas to create virtual worlds, or you can use a simpler API like model-viewer so your users can visualize products in an e-commerce site. And because WebXR is an open standard, the same experiences you build for mobile AR devices or dedicated VR hardware seamlessly work on Android XR.
Learn more about developing with WebXR.
Built on Open Standards
We're continuing the Android tradition of building with open standards. At the heart of the Android perception stack is OpenXR - a high-performance, cross-platform API focused on portability. Android XR is compliant with OpenXR 1.1, and we're also extending the OpenXR standard with leading-edge vendor extensions to introduce powerful world-sensing capabilities such as:
- AI-powered hand mesh, designed to adapt to the shape and size of hands to better represent the diversity of your users
- Detailed depth textures that allow real world objects to occlude virtual content
- Sophisticated light estimation, for lighting your digital content to match real-world lighting conditions
- New trackables that let you bring real world objects like laptops, phones, keyboards, and mice into a virtual environment
The Android XR SDK also supports open standard formats such as glTF 2.0 for 3D models and OpenEXR for high-dynamic-range environments.
Building the future together
We couldn't be more proud or excited to be announcing the Developer Preview of the Android XR SDK. We're releasing this developer preview because we want to build the future of XR together with you. We welcome your feedback and can't wait to work with you and build your ideas and suggestions into the platform. Your passion, expertise, and bold ideas are absolutely essential as we continue to build Android XR.
We look forward to interacting with your apps, reimagined to take advantage of the unique spatial capabilities of Android XR, using familiar tools like Android Studio and Jetpack Compose. We're eager to visit the amazing 3D worlds you build using powerful tools and open standards like Unity and OpenXR. Most of all, we can't wait to go on this journey with all of you that make up the amazing community of Android and Unity developers.
To get started creating and developing for Android XR, check out developer.android.com/develop/xr where you will find all of the tools, libraries and resources you need to create with the Android XR SDK! If you are interested in getting access to prerelease hardware and collaborating with the Android XR team, express your interest to participate in an Android XR Developer Bootcamp in 2025 by filling out this form.
12 Dec 2024 4:00pm GMT
09 Dec 2024
Android Developers Blog
Notes from Google Play: The next phase of Play
Posted by Sam Bright - VP & GM, Google Play + Developer Ecosystem
Hello everyone,
Thank you for making this year another incredible one! Your innovative experiences continue to inspire us and bring joy to billions. We recently celebrated some of your amazing work in our Best of 2024 awards, showcasing moments of delight across phones, large-screen devices, watches, and PCs.
This year, we shared our vision for the next phase of Play, in which Play becomes more than a store: a dynamic platform that connects people with your content, when and where they need it most. To help people discover all you have to offer, engage more deeply with your experiences, and keep coming back for more, we're making Play:
- A destination for discovery: Helping people find their new favorite apps and games and the content within
- The best place for gaming: So people can play more of the games they love across more surfaces, with exclusive rewards available only through Play Points, and
- A platform beyond the store: Where people can get relevant content from installed apps directly on their home screen through our new Collections experience
Check out the video above, or keep reading for some of the key features we've launched this year to help you succeed at every stage of your app's lifecycle.
New tools and features built in 2024
Launch with confidence
Launching a new app or update is a critical moment and we want to make this process as smooth and successful as possible.
- Pre-review checks help you catch policy and compatibility issues before launch.
- The new quality panel gives you a centralized view of your app's quality so you can proactively find and address issues like crashes and ANRs, and see recommendations related to user experience.
- And with SDK Console, we're connecting you with SDK owners who can alert you in Android Studio and Play Console when new versions may address quality issues or help your app or game comply with Play policies.
Accelerate your growth and deepen your engagement with users
We've made Google Play even more content-forward with a visually engaging design that helps people discover the best of what you have to offer, wherever they are.
- We integrated Gemini models to make it easier for everyone to find what they're looking for with AI-generated app review summaries, FAQs, and app highlights, providing key information at a glance.
- Seamless app discovery helps users enjoy amazing experiences across their devices. Now, when people search for apps on their phone, they'll easily discover and install relevant apps for their TV, watch, and more.
- Enhanced custom store listings give you even more ways to tailor your content. And now, with the ability to segment by search keyword, you can connect with users who are actively searching for the specific benefits your app offers. Play Console will even give you keyword suggestions.
- Deep links help you create seamless web-to-app journeys to take users directly to the content they want, right inside your app. And now, we've made it even easier for you to manage and experiment with these deep links in Play Console, where you can make quick changes without waiting to publish a new app release.
Optimize revenue with Google Play Commerce
We're continuing to make it easier and more convenient for over 2.5 billion users in over 190 markets to have seamless and secure purchase experiences.
- This year, we've helped over half a billion people be ready to make purchases by proactively encouraging them to set up payment and authentication methods in advance. With new secure biometric authentication options like fingerprint and facial recognition, checkout is now faster and more secure.
- Our extensive payment method library, which includes over 300 local forms of payment in more than 65 markets, continues to grow. This year, we added CashApp (US), Blik Banking (Poland), Pix (Brazil), and MoMo (Vietnam).
- Expanded payment options give more ways for users to pay for content. Parents with Google Family setup can now approve their child's in-app purchases from any OS, not just on Android devices.
- And new subscription platform improvements, like flexible payment plans for long-term subscriptions, give users more options throughout the purchase experience, which helps drive higher conversions and new subscribers.
Reinforcing trust, safety, and security
We continue to invest in more ways to protect users, your business, and the ecosystem. This includes actively combating bad actors who try to deceive users or spread malware, and giving you tools to combat abuse.
- Google Play Protect scans 200 billion apps daily. When it finds a potentially harmful app, we let people know and may even disable particularly dangerous apps.
- Easier automatic app updates help ensure users have the latest features and improved security. Users with limited Wi-Fi access have the option to get their app updates over mobile data, and within their data budgets. We also launched a new tool that empowers you to prompt users for timely updates.
- Play Integrity API helps you detect suspicious activity so you can decide how to respond to abuse, like fraud, cheating, or data theft. Now, Play integrity verdicts are faster, more resilient, and more privacy-friendly.
These are just the highlights. To see how we're continuously improving the experience, check out our quarterly roundup of programs and launches on The Latest.
Investing in our app and game community
We're continuing to help app and game businesses of all sizes reach their full potential.
- This year, we've doubled the size of our global Indie Games Accelerator program and selected 60 game studios from around the world to participate in a 10-week program of masterclasses, workshops, and access to industry experts.
- Ten studios from across Latin America were selected to receive a share of $2 million in equity-free funding and hands-on guidance from the Google Play team as part of our Indie Games Fund.
- 500 aspiring developers in Indonesia participated in our Google Play x Unity Game Developer Training Program to build top-notch skills in game design, development, and monetization to kick-start their game development careers.
- And the ChangGoo initiative in Korea has nurtured a thriving startup ecosystem, supporting over 500 startups and attracting over KRW 147.6 billion in investments.
And with another year of #WeArePlay, we shared and celebrated the stories of 300 app and game businesses from all over the world. Take a look back at just a few of the inspiring founders we've featured.
Looking ahead
I'm excited about the future of Google Play as a dynamic platform that connects users with your amazing content, wherever they are.
Next year, we're going to continue helping you maximize your investments on Play by:
- Leaning into content-rich and interactive experiences for apps both within and beyond the Play store,
- Building on our gaming destination to make it even more personalized, engaging, and part of daily routines, and,
- Simplifying the payment and checkout experience for your apps and content.
Thanks again for your continued partnership and the innovation you've put into your apps and games. From our team to yours, happy holidays and best wishes for an amazing 2025!
09 Dec 2024 8:00pm GMT
06 Dec 2024
Android Developers Blog
User-Agent Reduction on Android WebView
Posted by Mike Taylor (Privacy Sandbox), and Mihai Cîrlănaru (Web on Android)
The User-Agent string has been reduced in Chrome on Desktop and Chrome on Android platforms since Chrome 107. Beginning in Android 16, the default User-Agent string in Android WebView will be similarly reduced.
Updated User-Agent string
The default, reduced WebView User-Agent string is as follows:
Mozilla/5.0 (Linux; Android 10; K; wv) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/125.0.0.0 Mobile Safari/537.36
As seen above, the OS, CPU, and Build information will be reduced to the static "Linux; Android 10; K" string. Minor/build/patch version information will also be reduced to "0.0.0". The rest of the default User-Agent remains unchanged (and is unchanging).
How can I detect WebView via the User-Agent string?
Sites can continue to look for the wv token in the User-Agent string, unless an application has decided to override the User-Agent string.
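For example, a site's server-side analytics could check for that token with a small helper. This is a hedged sketch (the class and method names are ours, not part of any official API), and keep in mind that apps overriding the User-Agent may drop the token entirely:

```java
public class WebViewDetector {
    // Returns true if the User-Agent contains the standalone "wv"
    // token that Android WebView includes after the build tag,
    // e.g. "(Linux; Android 10; K; wv)". We match the token with
    // delimiters around it rather than a bare substring, so words
    // that merely contain "wv" don't trigger a false positive.
    public static boolean isAndroidWebView(String userAgent) {
        if (userAgent == null) {
            return false;
        }
        return userAgent.contains("Android")
                && userAgent.matches(".*[;)\\s]\\s*wv[;)].*");
    }
}
```

With the reduced default string shown above, the check still works, since the "wv" token survives the reduction.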
Does WebView support User-Agent Client Hints?
Android WebView has supported User-Agent Client Hints since version 116, but only for applications that send the default User-Agent string.
Will a custom WebView User-Agent string be affected?
The ability to set a custom User-Agent via setUserAgentString() won't be affected - and applications that choose to do so won't send the reduced User-Agent string.
06 Dec 2024 5:00pm GMT
04 Dec 2024
Android Developers Blog
Four Tips to Help You Build High-Quality, Engaging, and Age-Appropriate Apps
Posted by Mindy Brooks - Senior Director, Android Platform
App developers play a vital role in shaping how people of all ages interact with technology. Whether your app content is specifically designed for kids or simply attracts their attention, there is an added responsibility to ensure a safe and trusted experience. Google is here to support you in that work. Today, we're sharing some important reminders and updates on how we empower developers to build high-quality, engaging, and age-appropriate apps across the Android ecosystem.
Help Determine Android User Age with Digital IDs
Understanding a user's age range can be critical for providing minors with safer and more appropriate app experiences, as well as complying with local age-related regulations. Android's new Credential Manager API, now in Beta for Digital IDs, addresses this challenge by helping developers verify a user's age with a digital ID saved to any digital wallet application. Importantly, Android's Credential Manager was built with both safety and privacy at its core - it minimizes data exposure by only sharing information necessary with developers and asks the user for explicit permission to share an age signal. We encourage you to try out the Beta API for yourself and look forward to hearing your feedback.
While digital IDs are still in their early days, we're continuing to work with governments on further adoption to strengthen this solution. Android is also exploring how the API can support a range of age assurance methods, helping developers to safely confirm the age of their users, especially for users that can't or don't want to use a digital ID. Please keep in mind that ID-based solutions are just one tool that developers can use to determine age and the best approach will depend on your app.
Shield Young Users from Inappropriate Content on Google Play
As part of our continued commitment to creating a safe and positive environment for children across the Play Store, we recently launched the Restrict Declared Minors (RDM) setting within the Google Play Console that allows developers to designate their app as inappropriate for minors. When enabled, Google Play users with declared ages below 18 will not be able to download or purchase the app nor will they be able to continue subscriptions or make new purchases if the app is already installed.
Beyond Play's broader kids safety policies, this new setting gives developers an additional tool to proactively prevent minors from accessing content that may be unsuitable for them. It also empowers developers to take a more proactive role in ensuring their apps reach the appropriate audience. As a reminder, this feature is simply one tool of many to keep your apps safe and we are continuing to improve it based on early feedback. Developers remain solely responsible for compliance with relevant laws and regulations. You can learn more about opting in to RDM here.
Develop Teacher Approved Apps and Games on Google Play
Great content for kids can take many forms, whether that's sparking curiosity, helping kids learn, or just plain fun. Google Play's Teacher Approved program highlights high-quality apps that are reviewed and rated by teachers and child development specialists. Our team of teachers and experts across the world review and rate apps on factors like age-appropriateness, quality of experience, enrichment, and delight. For added transparency, we include information in the app listing about why the app was rated highly to help parents determine if the app is right for their child. Apps in the program also must meet strict privacy and security requirements.
Building a teacher-approved app not only helps raise app quality for kids - it can also increase your reach and engagement. All apps in this program are eligible to appear and be featured on Google Play's Kids tab where families go to easily discover quality apps and games. Please visit Google Play Academy for more information about how to design high-quality apps for kids.
Stay Updated With Google Play's Families Policies
Google Play policies provide additional protections for children and families. Our Families policies require that apps and games targeted to children have appropriate content, show ads suitable for children, and meet other requirements including ones related to personally identifiable information. We frequently update and strengthen these policies to ensure that Google Play remains a place where families can find safe and high-quality content for their children. This includes our new Child Safety Standards Policy for social and dating apps that goes into effect in January.
Developers can showcase compliance with Play's Families policies with a special badge on the Google Play Data safety section. This is another great way that you can better help families find apps that meet their needs, while supporting Play's commitment to provide users more transparency and control over their data. To display the badge, please visit the "Security practices" section of your Data Safety form in your Google Play Developer Console.
Additional Resources
Protecting kids online is a responsibility we all share and we hope these reminders are helpful as you prepare for 2025. We're grateful for your partnership in making Android and Google Play fantastic platforms for delightful, high-quality content for kids and families. For more resources:
- Learn more about Android's Credential Manager API.
- Watch our interactive Play Academy courses on complying with Play's Families policies, including SDK requirements, selecting your target age and content settings, and more.
- Review the updated Child Safety Standards Policy ahead of the January deadline.
04 Dec 2024 8:00pm GMT
#WeArePlay | Tentang Anak connects parents to experts across Indonesia
Posted by Robbie McLachlan, Developer Marketing
In our latest film for #WeArePlay, which celebrates the people behind apps and games, we meet Mesty and Garri - the husband and wife duo who created Tentang Anak. Their app helps parents across Indonesia navigate their parenting journey with confidence: with a focus on child health, growth tracking, and providing accessible expert consultations.
What inspired you to create Tentang Anak?
Mesty: I saw so much misinformation about child health and parenting, especially in Indonesia where there's a huge gap between the number of pediatricians and children. I wanted to provide parents with reliable, accessible information that could help them raise healthy, well-rounded children, allowing them to feel confident and calm in their parenting journey.
Garri: For me, it was about seeing the need for a one-stop solution for parents. Everything was scattered - pregnancy, growth tracking, expert advice - and I realized we could create something that brings it all together in one place. I wanted to build a platform that supported parents, especially in remote areas, with everything they need to raise their kids with confidence.
How does Tentang Anak ensure that the expert advice is both accurate and accessible to parents in remote areas of Indonesia?
Mesty: We make sure to partner with a team of highly qualified pediatricians, psychologists, and child development experts to ensure our advice is accurate and up-to-date.
Garri: Exactly, staying current with the latest research and best practices is crucial. Misinformation can have a huge impact, especially when it comes to child health. Parents often turn to social media or unverified sources for answers, which can lead to confusion or even harm. By partnering with qualified experts and constantly updating our content, we make sure that parents get accurate, reliable, and timely advice. This is especially important in remote areas where access to healthcare professionals can be limited.
How has Google Play supported Tentang Anak?
Garri: Google Play has provided us with the tools and support to optimize our app's performance and engagement. From using Google's analytics and A/B testing to improve the user experience, to the seamless distribution through the Play Store, Google has been a key partner in scaling Tentang Anak and making sure parents across Indonesia can access the app.
What is next for Tentang Anak?
Mesty: We're focused on expanding our reach across Indonesia, ensuring that more parents, especially in remote areas, have access to the support and resources they need. We're also enhancing our app with more interactive features to keep parents engaged and informed.
Garri: At the same time, we're expanding our offerings with products for children, including children's books, vitamins, and skincare. Our goal is to make Tentang Anak the go-to platform and brand for all things parenting in Indonesia, and we're excited to see how we can grow and help even more families.
Discover more global #WeArePlay stories and share your favorites.
04 Dec 2024 2:00pm GMT
03 Dec 2024
Android Developers Blog
Making the Play Integrity API faster, more resilient, and more private
Posted by Dom Elliott - Group Product Manager, Google Play
At Google Play, we're committed to providing a safe and secure environment for your business to thrive. That's why we continually invest in reinforcing user trust, protecting your business, and safeguarding the ecosystem. This includes actively combating bad actors who try to deceive users or spread malware, and giving you tools to combat abuse.
Tools like the Play Integrity API help protect your business from revenue loss and enhance user safety. You can use the Play Integrity API to detect suspicious activity and decide how to respond to abuse, such as fraud, bots, cheating, or data theft. In fact, apps that use Play Integrity features have seen 80% less unauthorized usage on average compared to other apps. Today, we're sharing how we're enhancing the Play Integrity API for everyone.
Play integrity verdicts are becoming faster, less spoofable, and more privacy-friendly
Starting today, we're changing the technology that powers the Play Integrity API on all devices running Android 13 (API level 33) and above to make it faster, more reliable, and more private for users. Developers already using the Play Integrity API can opt in to start using the new verdicts today; all API integrations will automatically transition to the new verdicts in May 2025. The improved verdicts will require, and make greater use of, hardware-backed security signals using Android Platform Key Attestation, making it significantly harder and more costly for attackers to bypass. We'll also be adjusting verdicts when we detect security threats across Android SDK versions, such as when there is evidence of excessive activity or key compromise, without requiring any developer work. And now, the Play Integrity API will have the same level of reliability and support across all Android form factors.
The transition to the new verdicts will reduce the device signals that need to be collected and evaluated on Google servers by ~90% and our testing indicates verdict latency can improve by up to ~80%.
You can now check whether a device has a recent security update
Play Integrity API offers enhanced security signals, like the optional "meets-strong-integrity" and "meets-basic-integrity" responses in the device recognition verdict, to help you decide how much you trust the environment your app is running in. Now, we're updating the "meets-strong-integrity" response to require a security update within the last year on devices running Android 13 and above. This update gives apps with higher security needs, like banking and finance apps, governments, and enterprise apps, more ways to tailor their level of protection for sensitive features, like transferring money. When the strong label isn't available for the user, we recommend that you have a fallback option. Learn more about our recommended API practices.
We're also making it easier for you to adjust your app's behavior based on the user's Android SDK version with a new device attributes field. For example, your app could respond differently to the legacy "meets-strong-integrity" definition on devices running Android 12 and lower than to the enhanced definition on devices running Android 13 and higher. The FAQ includes some example code for using the new device attributes field.
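On your server, after decoding an integrity token, the device recognition verdict arrives as a list of labels. A minimal sketch of the fallback logic described above might look like this (the label strings come from the Play Integrity documentation; the trust tiers and class name are illustrative, not a prescribed policy):

```java
import java.util.Set;

public class IntegrityPolicy {
    public enum Trust { HIGH, MEDIUM, NONE }

    // Maps deviceRecognitionVerdict labels from a decoded Play
    // Integrity token to an app-defined trust tier. Apps with higher
    // security needs can gate sensitive features (like transferring
    // money) on HIGH, and fall back gracefully when the strong label
    // isn't available on the user's device.
    public static Trust trustLevel(Set<String> verdictLabels) {
        if (verdictLabels.contains("MEETS_STRONG_INTEGRITY")) {
            return Trust.HIGH;
        }
        if (verdictLabels.contains("MEETS_DEVICE_INTEGRITY")) {
            return Trust.MEDIUM;
        }
        return Trust.NONE;
    }
}
```

A device that meets strong integrity also reports the weaker labels, so checking from strongest to weakest keeps the logic simple.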
We're standardizing all optional verdict signals so they're consistent to use
We're simplifying and standardizing all verdict content across apps, games, SDKs, and more, so that what you see will be more consistent and predictable. For apps installed by Google Play, you can get enhanced verdicts with optional signals such as the improved "meets-strong-integrity" device verdict and the recently launched app access risk verdict (which helps you detect and respond to apps that can capture the screen or control the device, so you can protect your users from scams or malicious activity). For apps installed outside of Google Play, and for all other API requests, you'll receive a verdict with information about the device, account license, and app, but without the extra security signals.
Developers can start using the improved verdicts today and they'll go live for all integrations in May 2025
Starting today, all new integrations will automatically receive the improved verdicts. Developers who already use the Play Integrity API can opt in to the new verdicts now, or wait until their integrations are automatically updated in May 2025. For more information, see the Play Integrity API documentation. With these ongoing enhancements, the Play Integrity API is becoming an even more essential tool for safeguarding your apps and users.
03 Dec 2024 5:00pm GMT
21 Nov 2024
Android Developers Blog
Gaze Link Wins Best Android App in Gemini API Developer Competition
Posted by Thomas Ezan - Sr Developer Relation Engineer (@lethargicpanda)
We're excited to announce Gaze Link as the winner of the Best Android App for our Gemini API Developer Competition!
This innovative app demonstrates the potential of the Gemini API to provide a communication system for individuals with amyotrophic lateral sclerosis (ALS) who develop severe motor and verbal disabilities, enabling them to type sentences with only their eyes.
About Gaze Link
Gaze Link uses Google's Gemini 1.5 Flash model to predict the user's intended sentence based on a few key words and the context of the conversation.
For example, if the context is "Is the room temperature ok?" and the user replies "hot AC two", the app will leverage Gemini to generate the full sentence "I am hot, can you turn the AC down by two degrees?".
The Gaze Link team took advantage of Gemini 1.5 Flash's multilingual capabilities to let the app generate sentences in English, Spanish, and Chinese, the three languages currently supported by the app.
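The exact prompt Gaze Link sends to the model isn't public, but the general shape of such a request can be sketched. The template and names below are purely illustrative assumptions, not the app's actual code:

```java
public class SentencePrompt {
    // Builds a prompt asking the model to expand a few gaze-typed
    // keywords into a full sentence, given the conversational context
    // and the target language. The resulting string would be sent to
    // a Gemini 1.5 Flash endpoint; the wording here is our guess at
    // the technique, not the Gaze Link team's template.
    public static String build(String context, String keywords, String language) {
        return "You are helping a non-verbal user communicate.\n"
                + "Conversation context: \"" + context + "\"\n"
                + "The user typed these key words with their eyes: \"" + keywords + "\"\n"
                + "Reply with one natural first-person sentence in " + language + ".";
    }
}
```

Because only the keywords and context change per request, the same template serves all three supported languages by swapping the final instruction.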
We were truly impressed by the Gaze Link app. The team used the Gemini API combined with ML Kit Face Detection to empower individuals with ALS, providing them with a powerful communication system that is both accessible and affordable.
With Gemini 1.5 Flash currently supporting 38 languages, it is possible for Gaze Link to add support for more languages in the future. In addition, the model's multimodal abilities could enable the team to enhance the user experience by integrating image, audio and video to augment the context of the conversation.
Build with the Gemini API
The result of the integration of the Gemini API in Gaze Link is inspiring. If you are working on an Android app today, we encourage you to learn about the Gemini API capabilities to see how you can successfully add generative AI to your app and delight your users.
To get started, go to the Android AI documentation!
21 Nov 2024 5:30pm GMT
X improved login success rate by 2x after adopting passkeys
Posted by Niharika Arora - Developer Relations Engineer
From breaking news and entertainment to sports and politics, X is a social media app that aims to help nearly 500 million users worldwide get the full story with all the live commentary. Recently, X developers revamped the Android app's login process so users never miss out on the conversations they're interested in. Using the Credential Manager API, the team implemented new passkey authentication for quicker, easier, and safer access to the app.
Simplifying login with passkeys
Today, traditional password-based authentication systems are increasingly insecure and prone to cyber attacks. Many users choose easy-to-guess passwords, which bad actors can crack using brute-force attacks. They also reuse the same passwords across multiple accounts, meaning that if one password is compromised, every account that shares it is at risk.
Passkeys address the growing concern of account security from weak passwords and phishing attacks by eliminating the need for passwords entirely. The feature provides a safer, more seamless sign-in experience, freeing users from having to remember their usernames or passwords.
"Passkeys are a simpler, more secure way to log in, replacing passwords with pins or biometric data like fingerprints or facial recognition," said Kylie McRoberts, head of safety at X. "We explored using passkeys to make signing in easier and safer for users, helping protect their accounts without the hassle of remembering passwords."
Since implementing passkeys, the X team has seen a substantial reduction in login times and measurable improvements to the login flow. With passkeys, the app's successful login rate has doubled compared to when it relied on passwords alone. The team has also seen a decline in password reset requests from users who have enabled passkeys.
According to X developers, adopting passkeys even came with benefits beyond enhanced security and a simplified login experience, like lower costs and improved UX.
"Passkeys allowed us to cut down on expenses related to SMS-based two-factor authentication because they offer strong, inherent authentication," said Kylie. "And with the ease of login, users are more likely to engage with our platform since there's less friction to remember or reset passwords."
Passkeys rely on public-key cryptography to authenticate users and provide them with private keys. That means websites and apps can see and store the public key, but never the private key, which is encrypted and stored by the user's credential provider. As keys are unique and tied to the website or app, they cannot be phished, further enhancing their security.
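The core principle can be illustrated in a few lines of JDK crypto. This is a simplified sketch of the sign-a-challenge idea, not the actual WebAuthn/FIDO2 protocol (which adds attestation, origin binding, and more):

```java
import java.security.*;

public class PasskeyPrinciple {
    // Demonstrates the public-key idea behind passkeys: the private
    // key signs a server-issued challenge, and the server verifies
    // the signature using only the stored public key.
    public static boolean signAndVerify() {
        try {
            // The credential provider generates the key pair; only
            // the public key is ever shared with the website or app.
            KeyPair pair = KeyPairGenerator.getInstance("EC").generateKeyPair();

            // At login, the server sends a random challenge and the
            // provider signs it with the locally held private key.
            byte[] challenge = new byte[32];
            new SecureRandom().nextBytes(challenge);
            Signature signer = Signature.getInstance("SHA256withECDSA");
            signer.initSign(pair.getPrivate());
            signer.update(challenge);
            byte[] assertion = signer.sign();

            // The server checks the assertion against the public key;
            // knowing the public key alone can't forge a valid login.
            Signature verifier = Signature.getInstance("SHA256withECDSA");
            verifier.initVerify(pair.getPublic());
            verifier.update(challenge);
            return verifier.verify(assertion);
        } catch (GeneralSecurityException e) {
            return false;
        }
    }
}
```

Since the challenge is fresh per login and the signature is bound to the site's key pair, a captured assertion can't be replayed elsewhere, which is why passkeys resist phishing.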
Seamless integration using the Credential Manager API
To integrate passkeys, X developers used Android's Credential Manager API, which made the process "extremely smooth," according to Kylie. The API unifies Smart Lock, One Tap, and Google Sign-In into a single, streamlined workflow. This also allowed developers to remove hundreds of lines of code, speeding up implementation and reducing maintenance overhead.
In the end, the migration to Credential Manager took X developers only two weeks to complete, followed by an additional two weeks to fully support passkeys. This was a "very fast migration" and the team "didn't expect it to be that simple and straightforward," said Saurabh Arora, a staff engineer at X. Thanks to Credential Manager's simple, coroutine-powered API, the complexities of handling multiple authentication options were essentially removed, reducing code, the likelihood of bugs, and overall developer efforts.
X developers saw a significant improvement in developer velocity by integrating the Credential Manager API. With their shift to passkey adoption through the Credential Manager API, they achieved:
- 80% code reduction in the authentication module
- 90% resolution of legacy edge case bugs
- 85% decrease in GIS, One Tap, and Smart Lock handling code
Using the Credential Manager API's top-level methods, like createCredential and getCredential, simplified integration by removing custom logic complexities surrounding individual protocols. This uniform approach also meant X developers could use a single, consistent interface to handle various authentication types, such as passkeys, passwords, and federated sign-ins like Sign in with Google.
"With Credential Manager's simple API methods, we could retrieve passkeys, passwords, and federated tokens with a single call, cutting down on branching logic and making response handling cleaner," said Saurabh. "Using different API methods, like createCredential() and getCredential(), also simplified credential storage, letting us handle passwords and passkeys in one place."
X developers didn't face many challenges when adopting Sign in with Google using the Credential Manager API. Replacing X's previous Google Sign-In, One Tap, and Smart Lock code with a simpler Credential Manager implementation meant developers no longer had to handle connection or disconnection statuses and activity results, reducing the margin of error.
A future with passkeys
X's integration of passkeys shows that a more secure and user-friendly authentication experience is achievable. By leveraging the Credential Manager API, X developers simplified the integration process, reduced potential bugs, and improved both security and developer velocity, all while sharpening the user experience.
"Our advice for developers considering passkey integration would be to take advantage of the Credential Manager API," said Saurabh. "It really simplifies the process and reduces code you need to write and maintain, making implementation better for developers."
Looking ahead, X plans to further enhance the user experience by allowing sign-ups with passkeys alone and providing a dedicated passkey management screen.
Get started
Learn how to improve your app's login UX using passkeys and the Credential Manager API.
21 Nov 2024 5:00pm GMT
20 Nov 2024
Android Developers Blog
Introducing Restore Credentials: Effortless account restoration for Android apps
Posted by Neelansh Sahai - Developer Relations Engineer
Did you know that, on average, 40% of the people in the US reset or replace their smartphones every year? This frequent device turnover presents a challenge - and an opportunity - for maintaining strong user relationships. When users get a new phone, the friction of re-entering login credentials can lead to frustration, app abandonment, and churn.
To address this issue, we're introducing Restore Credentials, a new feature of Android's Credential Manager API. With Restore Credentials, apps can seamlessly onboard users to their accounts on a new device after they restore their apps and data from their previous device. This makes the transition to a new device effortless and fosters loyalty and long-term relationships.
On top of all this, no developer effort is required to transfer a restore key from one device to another, because the process is tied to the Android system's backup and restore mechanism. However, if you want to sign your users in silently as soon as the restore completes, you might want to implement a BackupAgent and add your logic in the onRestore callback. The experience is delightful: users stay signed in as they were on their previous device, and they can receive notifications for easy access to their content without even needing to open the app on the new device.
Some of the benefits of the Restore Credentials feature include:
- Seamless user experience: Users can easily transition to a new Android device.
- Immediate engagement: Engage users with notifications or other prompts as soon as they start using their new device.
- Silent login with backup agent integration: If you're using a backup agent, users can be automatically logged back in after data restoration is complete.
- Restore key checks without backup agent integration: If a backup agent isn't being used, the app can check for a restore key upon first launch and then log the user in automatically.
- Easy implementation: Leverages the same server-side implementation used for passkeys.
How does Restore Credentials work?
The Restore Credentials feature enables seamless user account restoration on a new device. This process occurs automatically in the background during device setup when a user restores apps and data from a previous device. By restoring app credentials, the feature allows the app to sign the user back in without requiring any additional interaction.
The credential type that's supported for this feature is called restore key, which is a public key compatible with passkey / FIDO2 backends.
User flow
On the old device:
- If the current signed-in user is trusted, you can generate a restore key at any point after they've authenticated in your app. For instance, this could be immediately after login or during a routine check for an existing restore key.
- The restore key is stored locally and backed up to the cloud. Apps can opt-out of backing it up to the cloud.
On the new device:
- When setting up a new device, the user can restore data in one of two ways: from a cloud backup, or via a local device-to-device transfer. With a local transfer, the restore key moves directly from the old device to the new one. With a cloud restore, the restore key is downloaded to the new device along with the app data.
- Once this restore key is available on the new device, the app can use it to log in the user on the new device silently in the background.
Note: You should delete the restore key as soon as the user signs out. You don't want your user to get stuck in a cycle of signing out intentionally and then automatically getting logged back in.
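The lifecycle above (create a restore key on sign-in, carry it over during restore, delete it on sign-out) can be sketched with a toy in-memory model. Note this is purely illustrative: `DeviceStore`, `createRestoreKey`, and the other names below are stand-ins invented for this sketch, not part of the Credential Manager API, which handles storage and transfer for you.

```kotlin
import java.util.UUID

// Toy stand-in for per-device credential storage (illustrative only).
class DeviceStore(var restoreKey: String? = null)

// On the old device: generate a restore key after the user authenticates.
fun createRestoreKey(store: DeviceStore): String {
    val key = UUID.randomUUID().toString()
    store.restoreKey = key
    return key
}

// Device-to-device or cloud restore copies app data, including the restore key.
fun restoreTo(newDevice: DeviceStore, oldDevice: DeviceStore) {
    newDevice.restoreKey = oldDevice.restoreKey
}

// On the new device: silent sign-in succeeds only if a restore key came over.
fun silentSignIn(store: DeviceStore): Boolean = store.restoreKey != null

// On sign-out: delete the key so the user is not silently signed back in.
fun signOut(store: DeviceStore) {
    store.restoreKey = null
}

fun main() {
    val oldDevice = DeviceStore()
    createRestoreKey(oldDevice)      // user signs in on the old device
    val newDevice = DeviceStore()
    restoreTo(newDevice, oldDevice)  // user restores to the new device
    println(silentSignIn(newDevice)) // true: signed in silently
    signOut(newDevice)
    println(silentSignIn(newDevice)) // false: key cleared on sign-out
}
```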
How to implement Restore Credentials
The Jetpack Credential Manager lets you create, get, and clear the relevant Restore Credentials:
- Create a Restore Credential: When the user signs in to your app, create a Restore Credential associated with their account. This credential is stored locally and synced to the cloud if the user has enabled Google Backup and end-to-end encryption is available. Apps can opt out of syncing to the cloud.
- Get the Restore Credential: When the user sets up a new device, your app requests the Restore Credential from Credential Manager. This allows your user to sign in automatically.
- Clear the Restore Credential: When the user signs out of your app, delete the associated Restore Credential.
Restore Credentials is available through the Credential Manager Jetpack library. The minimum version of the Jetpack Library is 1.5.0-beta01, and the minimum GMS version is 242200000. For more on this, refer to the Restore Credentials DAC page. To get started, follow these steps:
1. Add the Credential Manager dependency to your project.
```kotlin
// build.gradle.kts
implementation("androidx.credentials:credentials:1.5.0-beta01")
```
2. Create a CreateRestoreCredentialRequest object.
```kotlin
// Fetch the registration JSON from your server.
// This is the same registrationJson created at the time of creating a passkey.
// See the documentation for more info.
val registrationJson = ...

// Create the CreateRestoreCredentialRequest object,
// passing in the registrationJson.
val createRequest = CreateRestoreCredentialRequest(
    registrationJson,
    /* isCloudBackupEnabled = */ true
)
```
NOTE: Set the isCloudBackupEnabled flag to false if you want the restore key to be stored locally and not in the cloud. It is set to true by default.
3. Call the createCredential() method on the CredentialManager object.
```kotlin
val credentialManager = CredentialManager.create(context)

// On successful authentication, create a restore key by passing in
// the context and the CreateRestoreCredentialRequest object.
val response = credentialManager.createCredential(context, createRequest)
```
4. When the user sets up a new device, call the getCredential() method on the CredentialManager object.
```kotlin
// Fetch the authentication JSON from your server.
val authenticationJson = ...

// Create the GetCredentialRequest object.
val options = GetRestoreCredentialOption(authenticationJson)
val getRequest = GetCredentialRequest(listOf(options))

// The restore key can be fetched in two scenarios:
// 1. On the first launch of the app on the device.
// 2. In the onRestore callback (if the app implements a backup agent).
val response = credentialManager.getCredential(context, getRequest)
```
If you're using a backup agent, perform the getCredential() call within the onRestore callback. This ensures that the app's credentials are restored immediately after the app data is restored.
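As a rough sketch of what that could look like, the agent below runs the getCredential() call from onRestore. The helpers `fetchAuthenticationJson()` and `completeSilentSignIn()` are hypothetical placeholders for logic your app and server would provide; they are not part of any Android or Jetpack API.

```kotlin
import android.app.backup.BackupAgent
import android.app.backup.BackupDataInput
import android.app.backup.BackupDataOutput
import android.os.ParcelFileDescriptor
import androidx.credentials.CredentialManager
import androidx.credentials.GetCredentialRequest
import androidx.credentials.GetCredentialResponse
import androidx.credentials.GetRestoreCredentialOption
import kotlinx.coroutines.runBlocking

class RestoreCredentialBackupAgent : BackupAgent() {

    override fun onBackup(
        oldState: ParcelFileDescriptor?,
        data: BackupDataOutput?,
        newState: ParcelFileDescriptor?
    ) {
        // Key-value backup logic, if any. The restore key itself is
        // backed up by the system; no extra work is needed here.
    }

    override fun onRestore(
        data: BackupDataInput?,
        appVersionCode: Int,
        newState: ParcelFileDescriptor?
    ) {
        // App data has just been restored on the new device; try to
        // fetch the restore key and sign the user in silently.
        runBlocking {
            val authenticationJson = fetchAuthenticationJson() // hypothetical server call
            val options = GetRestoreCredentialOption(authenticationJson)
            val getRequest = GetCredentialRequest(listOf(options))
            try {
                val response = CredentialManager.create(applicationContext)
                    .getCredential(applicationContext, getRequest)
                completeSilentSignIn(response) // hypothetical: verify on your server
            } catch (e: Exception) {
                // No restore key available; fall back to the normal
                // sign-in flow on first launch.
            }
        }
    }

    // Hypothetical stubs standing in for your app's own logic.
    private fun fetchAuthenticationJson(): String = TODO("fetch from your server")
    private fun completeSilentSignIn(response: GetCredentialResponse) {
        TODO("send the credential to your server for verification")
    }
}
```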
5. When the user signs out of your app, call the clearCredentialState() method on the CredentialManager object.
```kotlin
// Create a ClearCredentialStateRequest object.
val clearRequest = ClearCredentialStateRequest(/* requestType = */ 1)

// On user log-out, clear the restore key.
val response = credentialManager.clearCredentialState(clearRequest)
```
Conclusion
The Restore Credentials feature provides significant benefits, ensuring users experience a smooth transition between devices, and allowing them to log in quickly and easily through backup agents or restore key checks. For developers, the feature is straightforward to integrate and leverages existing passkey server-side infrastructure. Overall, Restore Credentials is a valuable tool that delivers a practical and user-friendly authentication solution.
This blog post is a part of our series: Spotlight Week: Passkeys. We're providing you with a wealth of resources through the week. Think informative blog posts, engaging videos, practical sample code, and more, all carefully designed to help you leverage the latest advancements in seamless sign-up and sign-in experiences.
With these cutting-edge solutions, you can enhance security, reduce friction for your users, and stay ahead of the curve in the rapidly evolving landscape of digital identity. To get a complete overview of what Spotlight Week has to offer and how it can benefit you, be sure to read our overview blog post.
20 Nov 2024 5:00pm GMT