01 May 2024

feedTalkAndroid

Amazfit’s Helio Smart Ring Is Coming To The US On May 15th

If you're looking for a smart ring that you won't need to wait until July for, Amazfit's Helio is expected in about two weeks.

01 May 2024 8:00pm GMT

How to switch from iOS to Android

Ready to switch to a new Android phone? We'll show you how to transfer your data from iOS to Android in just a few steps.

01 May 2024 7:47pm GMT

The Best Camera Phones In 2024

If you like taking professional pictures or just like snapping your loved ones, these are the top smartphones to use.

01 May 2024 4:04pm GMT

Rabbit R1 Might Be Hardware That Should Have Been Software

The Rabbit R1 is meant to usher in a new era of standalone AI devices, but it likely would have worked better as an app on your phone.

01 May 2024 3:00pm GMT

Monopoly Go Events Schedule Today – Updated Daily

Full Bloom Event as well as Green Thumb Contest Event.

01 May 2024 2:22pm GMT

Monopoly Go – Free Dice Links Today (Updated Daily)

If you keep on running out of dice, we have just the solution! Find all the latest Monopoly Go free dice links right here!

01 May 2024 2:17pm GMT

Family Island Free Energy Links (Updated Daily)

Tired of running out of energy on Family Island? We have all the latest Family Island Free Energy links right here, and we update these daily!

01 May 2024 2:17pm GMT

Crazy Fox Free Spins & Coins (Updated Daily)

If you need free coins and spins in Crazy Fox, look no further! We update our links daily to bring you the newest working links!

01 May 2024 2:15pm GMT

Match Masters Free Gifts, Coins, And Boosters (Updated Daily)

Tired of running out of boosters for Match Masters? Find new Match Masters free gifts, coins, and booster links right here! Updated Daily!

01 May 2024 2:13pm GMT

Solitaire Grand Harvest – Free Coins (Updated Daily)

Get Solitaire Grand Harvest free coins now, new links added daily. Only tested and working links, complete with a guide on how to redeem the links.

01 May 2024 2:12pm GMT

Dice Dreams Free Rolls – Updated Daily

Get the latest Dice Dreams free rolls links, updated daily! Complete with a guide on how to redeem the links.

01 May 2024 2:10pm GMT

Google Phone Gets Audio Emoji; It’s As Strange As You Think

Ever felt that all Google Phone was missing was sound effects? Probably not, but if you ever did, your wish has been granted.

01 May 2024 1:22pm GMT

iFixit Now Offers Fix Kits For Kobo’s Colored E-readers

iFixit has made it possible to repair your Kobo e-readers yourself.

01 May 2024 11:24am GMT

These Fitbit Devices Will Switch To Google Wallet

Fitbit Pay is being phased out, and users must transition to Google Wallet by the tail end of July to continue enjoying contactless payments on their devices.

01 May 2024 9:36am GMT

Trial of Abyss Guide – AFK Journey (AFK2)

This guide will show you how to unlock the Trial of Abyss in AFK Journey, the rewards you can get from it, as well as some tips to help you master this event.

01 May 2024 8:28am GMT

30 Apr 2024

feedTalkAndroid

Less Than A Month Later, There’s A New Cheapest Foldable

It's been less than a month since the Nubia Flip claimed the title of cheapest foldable phone, and the Blackview Hero 10 has already come to snatch it away.

30 Apr 2024 8:00pm GMT

17 Apr 2024

feedAndroid Developers Blog

How to effectively A/B test power consumption for your Android app’s features

Posted by Mayank Jain - Product Manager, and Yasser Dbeis - Software Engineer; Android Studio


Android developers have been telling us they're looking for tools to help optimize power consumption for different devices on Android.

The new Power Profiler in Android Studio helps Android developers by showing power consumption happening on devices as the app is being used. Understanding power consumption across Android devices can help Android developers identify and fix power consumption issues in their apps. They can run A/B tests to compare the power consumption of different algorithms, features or even different versions of their app.

The new Power Profiler in Android Studio

Apps that are optimized for lower power consumption lead to improved battery and thermal performance of the device, which means an improved user experience on Android.

This power consumption data is made available through the On Device Power Monitor (ODPM) on Pixel 6+ devices, segmented by each sub-system called "Power Rails". See Profileable power rails for a list of supported sub-systems.

The Power Profiler can help app developers detect problems in several areas:

  • Detecting unoptimized code that is using more power than necessary.
  • Finding background tasks that are causing unnecessary CPU usage.
  • Identifying wakelocks that are keeping the device awake when they are not needed.

Once a power consumption issue has been identified, the Power Profiler can be used when testing different hypotheses to understand why the app could be consuming excessive power. For example, if the issue is caused by background tasks, the developer can try to stop the tasks from running unnecessarily or for longer periods. And if the issue is caused by wakelocks, the developer can try to release the wakelocks when the resource is not in use or use them more judiciously. Then compare the power consumption before/after the change using the Power Profiler.
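
For example, here is a hedged sketch of the wakelock hygiene described above; it assumes the WAKE_LOCK permission, and the tag and doUploadWork() helper are placeholders:

import android.content.Context
import android.os.PowerManager

// Acquire a partial wakelock with a safety timeout and release it as soon as the work is done.
fun runUploadWithWakeLock(context: Context) {
    val powerManager = context.getSystemService(PowerManager::class.java)
    val wakeLock = powerManager.newWakeLock(PowerManager.PARTIAL_WAKE_LOCK, "app:uploadSync")
    try {
        wakeLock.acquire(10 * 60 * 1000L) // timeout so the device cannot stay awake indefinitely
        doUploadWork()
    } finally {
        if (wakeLock.isHeld) wakeLock.release() // release as soon as the resource is no longer in use
    }
}

private fun doUploadWork() { /* hypothetical background work */ }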

In this blog post, we showcase a technique which uses A/B testing to understand how your app's power consumption characteristics might change with different versions of the same feature - and how you can effectively measure them.

A real-life example of how the Power Profiler can be used to improve the battery life of an app.

Let's assume you have an app through which users can purchase their favorite movies.

Sample app to demonstrate A/B testing for measuring power consumption
Video (c) copyright Blender Foundation | www.bigbuckbunny.org


As your app becomes popular and is used by more users, you realize that a high-quality 4K video takes a very long time to load every time the app is started. Because of its large size, you want to understand its impact on the device's power consumption.

Originally, this video was included in 4K quality with the best of intentions: to showcase the best possible movie highlights to your customers.

This makes you think…

  • Do you really need a 4K video banner on the home screen?
  • Does it make sense to load a 4K video over the network every time your app is run?
  • How will the power consumption characteristics of your app change if you replace the 4K video with something of lower quality (while still preserving the vivid look & feel of the video)?

This is a perfect scenario to perform an A/B test for power consumption.

With an A/B test, you can test two slightly different variations of the video banner feature and choose the one with the better power consumption characteristics.

Scenario A : Run the app with 4K video banner on screen & measure power consumption

Scenario B : Run the app with lower resolution video banner on screen & measure power consumption

A/B Test setup

Let's take a moment and set up our Android Studio profiler to run this A/B test. We need to start the app and attach the CPU profiler to it and trigger a system trace (where the Power Profiler will be shown).

Step 1

Create a custom "Run configuration" by clicking the 3 dot menu > Edit

Custom run configuration

Step 2

Then select the "Profiling" tab and ensure that "Start this recording on startup" and CPU Activity > System Trace is selected. Then click "Apply".

Edit configuration settings


Now simply run the "Profile app startup profiling with low overhead" configuration whenever you want to launch the app from a cold start with the CPU profiler attached.

Note on precision

The following example scenarios use the entire app startup to estimate power consumption for the purposes of this blog. However, you can use more advanced techniques to get even higher precision in the power readings. Some techniques to try are:

  • Isolate and measure power consumption for video playback only after a tap event on the video player
  • Use the trace markers API to mark the start and stop of the power measurement window, and then measure power consumption only within that marked window (see the sketch below)
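
One way to do the second technique is with the platform android.os.Trace API; a minimal sketch, where the section name and startVideoPlayback() helper are placeholders:

import android.os.Trace

// Mark a custom measurement window in the system trace so you can select exactly
// this span in the Power Profiler timeline.
fun measuredVideoPlayback() {
    Trace.beginSection("BannerVideoPlayback")
    try {
        startVideoPlayback()
    } finally {
        Trace.endSection()
    }
}

private fun startVideoPlayback() { /* hypothetical playback start */ }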

Scenario A

In this scenario, we run the app with the 4K video playing and measure power consumption for the first 30 seconds. We can optionally run scenario A multiple times and average the readings. Once the system trace is shown in Android Studio, select the 0-30 second time range from the timeline selection panel and save a screenshot for comparison against scenario B.

Power consumption in scenario A - playing a 4K video


As you can see, the average power consumed by WLAN, CPU cores, and memory combined is about 1,352 mW (milliwatts).

Now let's compare and contrast how this power consumption changes in Scenario B.

Scenario B

In this scenario, we run the app with the lower-quality video playing and measure power consumption for the first 30 seconds. As before, we can optionally run scenario B multiple times and average the readings. Again, once the system trace is shown in Android Studio, select the 0-30 second time range from the timeline selection panel.

Power consumption in scenario B - playing a lower quality video

The total power consumed by WLAN, CPU Little, CPU Big, CPU Mid, and memory is about 741 mW (milliwatts).

Conclusion

All else being equal, Scenario B (with the lower-quality video) consumed about 741 mW, compared to about 1,352 mW for Scenario A (with the 4K video).

Scenario B used roughly 45% less power than Scenario A, while the lower-quality video makes little to no visible difference in the perceived quality of the app's screen.

As a result of this A/B test for power consumption, you conclude that replacing the 4K video with a lower-quality video on the app's home screen not only reduces power consumption by about 45%, but also reduces the required network bandwidth and can potentially improve the thermal performance of the device.

If your app's business logic still requires the 4K video to be shown on the app's screen, you can explore strategies like:

  • Caching the 4K video across subsequent runs of the app.
  • Loading video on a user tap.
  • Loading an image initially and only load the video after the screen has fully rendered (delayed loading).

The overall power consumption numbers in the above A/B test scenario might seem small, but the example shows the techniques that app developers can use to effectively A/B test power consumption for their app's features using the Power Profiler in Android Studio.

Next Steps

The new Power Profiler is available in Android Studio Hedgehog onwards. To know more, please head over to the official documentation.

17 Apr 2024 4:00pm GMT

11 Apr 2024

feedAndroid Developers Blog

The First Beta of Android 15

Posted by Dave Burke, VP of Engineering


Android 15 logo


Today we're releasing the first beta of Android 15. With the progress we've made refining the features and stability of Android 15, it's time to open the experience up to both developers and early adopters, so you can now enroll any supported Pixel device here to get this and future Android 15 Beta and feature drop Beta updates over-the-air.

Android 15 continues our work to build a platform that helps improve your productivity, give users a premium app experience, protect user privacy and security, and make your app accessible to as many people as possible - all in a vibrant and diverse ecosystem of devices, silicon partners, and carriers.

Android delivers enhancements and new features year-round, and your feedback on the Android beta program plays a key role in helping Android continuously improve. The Android 15 developer site has lots more information about the beta, including downloads for Pixel and the release timeline. We're looking forward to hearing what you think, and thank you in advance for your continued help in making Android a platform that works for everyone.

We'll have lots more to share as we move through the release cycle, and be sure to tune into Google I/O where you can dive deeper into topics that interest you with over 100 sessions, workshops, codelabs, and demos.

Edge-to-edge

Apps targeting Android 15 are displayed edge-to-edge by default on Android 15 devices. This means that apps no longer need to explicitly call Window.setDecorFitsSystemWindows(false) or enableEdgeToEdge() to show their content behind the system bars, although we recommend continuing to call enableEdgeToEdge() to get the edge-to-edge experience on earlier Android releases.
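
For earlier releases, the opt-in is a single call; a minimal sketch using the androidx.activity enableEdgeToEdge() extension (the activity name and content are placeholders):

import android.os.Bundle
import androidx.activity.ComponentActivity
import androidx.activity.compose.setContent
import androidx.activity.enableEdgeToEdge

class MainActivity : ComponentActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        // Opts in to edge-to-edge on pre-Android 15 releases; apps targeting
        // Android 15 get this behavior by default on Android 15 devices.
        enableEdgeToEdge()
        super.onCreate(savedInstanceState)
        setContent { /* app content, e.g. a Material 3 Scaffold that consumes insets */ }
    }
}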

To assist your app with going edge-to-edge, many of the Material 3 composables handle insets for you, based on how the composables are placed in your app according to the Material specifications.

a side-by-side comparison of App targets SDK 34 (left) and App targets SDK 35 (right) demonstrating edge-to-edge on an Android 15 device
On the left: App targets SDK 34 (Android 14) and is not edge-to-edge on an Android 15 device. On the right: App targets SDK 35 (Android 15) and is edge-to-edge on an Android 15 device. Note the Material 3 TopAppBar is automatically protecting the status bar, which would otherwise be transparent by default.


The system bars are transparent or translucent and content will draw behind by default. Refer to "Handle overlaps using insets" (Views) or Window insets in Compose to see how to prevent important touch targets from being hidden by the system bars.

Smoother NFC experiences - part 2

Android 15 is working to make the tap to pay experience more seamless and reliable while continuing to support Android's robust NFC app ecosystem. In addition to the observe mode changes from Android 15 developer preview 2, apps can now register a fingerprint on supported devices so they can be notified of polling loop activity, which allows for smooth operation with multiple NFC-aware applications.

Inter-character justification

Starting with Android 15, text can be justified using letter spacing with JUSTIFICATION_MODE_INTER_CHARACTER. Inter-word justification was first introduced in Android O, but inter-character justification solves for languages that do not use white space for segmentation, e.g. Chinese, Japanese, etc.
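
A minimal sketch of opting in on a TextView (API level 35 assumed):

import android.graphics.text.LineBreaker
import android.widget.TextView

// Enable inter-character justification on a TextView (API 35+).
fun enableInterCharacterJustification(textView: TextView) {
    textView.justificationMode = LineBreaker.JUSTIFICATION_MODE_INTER_CHARACTER
}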

Images show how Japanese kanji (top) and English alphabet characters (bottom) appear with JUSTIFICATION_MODE_NONE, JUSTIFICATION_MODE_INTER_WORD, and JUSTIFICATION_MODE_INTER_CHARACTER.

App archiving

Android and Google Play announced support for app archiving last year, allowing users to free up space by partially removing from the device infrequently used apps that were published using Android App Bundle on Google Play. Android 15 now includes OS-level support for app archiving and unarchiving, making it easier for all app stores to implement it.

Apps with the REQUEST_DELETE_PACKAGES permission can call the PackageInstaller requestArchive method to request archiving a currently installed app package, which removes the APK and any cached files, but persists user data. Archived apps are returned as displayable apps through the LauncherApps APIs; users will see a UI treatment to highlight that those apps are archived. If a user taps on an archived app, the responsible installer will get a request to unarchive it, and the restoration process can be monitored by the ACTION_PACKAGE_ADDED broadcast.
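
A hedged sketch of how an installer might issue an archive request; the exact requestArchive() overload, the PendingIntent mutability requirements, and the status action are assumptions:

import android.app.PendingIntent
import android.content.Context
import android.content.Intent

// Assumes the caller holds REQUEST_DELETE_PACKAGES and is the responsible installer;
// the requestArchive() overload used here may differ from the final API surface.
fun archivePackage(context: Context, packageName: String) {
    val installer = context.packageManager.packageInstaller
    val statusReceiver = PendingIntent.getBroadcast(
        context,
        /* requestCode = */ 0,
        Intent("com.example.ARCHIVE_STATUS"), // hypothetical status broadcast action
        PendingIntent.FLAG_MUTABLE or PendingIntent.FLAG_UPDATE_CURRENT
    )
    installer.requestArchive(packageName, statusReceiver.intentSender)
}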

App-managed profiling

Android 15 includes the all new ProfilingManager class, which allows you to collect profiling information from within your app. We're planning to wrap this with an Android Jetpack API that will simplify construction of profiling requests, but the core API will allow the collection of heap dumps, heap profiles, stack sampling, and more. It provides a callback to your app with a supplied tag to identify the output file, which is delivered to your app's files directory. The API does rate limiting to minimize the performance impact.

Better Braille

In Android 15, we've made it possible for TalkBack to support Braille displays that are using the HID standard over both USB and secure Bluetooth.

This standard, much like the one used by mice and keyboards, will help Android support a wider range of Braille displays over time.

Key management for end-to-end encryption

We are introducing the E2eeContactKeysManager in Android 15, which facilitates end-to-end encryption (E2EE) in your Android apps by providing an OS-level API for the storage of cryptographic public keys.

The E2eeContactKeysManager is designed to integrate with the platform contacts app to give users a centralized way to manage and verify their contacts' public keys.

Secured background activity launches

Android 15 brings additional changes to prevent malicious background apps from bringing other apps to the foreground, elevating their privileges, and abusing user interaction, aiming to protect users from malicious apps and give them more control over their devices. Background activity launches have been restricted since Android 10.

App compatibility

With Android 15 now in beta, we're opening up access to early-adopter users as well as developers, so if you haven't yet tested your app for compatibility with Android 15, now is the time to do it. In the weeks ahead, you can expect more users to try your app on Android 15 and raise issues they find.

To test for compatibility, install your published app on a device or emulator running Android 15 beta and work through all of your app's flows. Review the behavior changes to focus your testing. After you've resolved any issues, publish an update as soon as possible.

To give you more time to plan for app compatibility work, we're letting you know our Platform Stability milestone well in advance.

Android 15 release timeline

At this milestone, we'll deliver final SDK/NDK APIs and also final internal APIs and app-facing system behaviors. We're expecting to reach Platform Stability in June 2024, and from that time you'll have several months before the official release to do your final testing. The release timeline details are here.

Get started with Android 15

Today's beta release has everything you need to try the Android 15 features, test your apps, and give us feedback. Now that we've entered the beta phase, you can enroll any supported Pixel device here to get this and future Android Beta updates over-the-air. If you don't have a Pixel device, you can use the 64-bit system images with the Android Emulator in Android Studio. If you're already in the Android 14 QPR beta program on a supported device, or have installed the developer preview, you'll automatically get updated to Android 15 Beta 1.

For the best development experience with Android 15, we recommend that you use the latest version of Android Studio Jellyfish (or more recent Jellyfish+ versions). Once you're set up, here are some of the things you should do:

  • Try the new features and APIs - your feedback is critical during the early part of the developer preview and beta program. Report issues in our tracker on the feedback page.
  • Test your current app for compatibility - learn whether your app is affected by changes in Android 15; install your app onto a device or emulator running Android 15 and extensively test it.

We'll update the beta system images and SDK regularly throughout the Android 15 release cycle. Read more here.

For complete information, visit the Android 15 developer site.


Java and OpenJDK are trademarks or registered trademarks of Oracle and/or its affiliates.

11 Apr 2024 8:00pm GMT

09 Apr 2024

feedAndroid Developers Blog

Google Drive cut code and development time in half with Jetpack Compose and new architecture

Posted by Nick Butcher - Product Manager for Jetpack Compose, and Florina Muntenescu - Developer Relations Engineer


As one of the world's most popular cloud-based storage services, Google Drive lets people do more than just store their files online. With Drive, users can synchronize, share, search, edit, and even pin specified files and content for safe and secure offline use.

Recently, Drive's developers revamped the application's home screen to provide a more seamless experience across devices, matching updates made to Google Drive's web version. However, the app's previous architecture and codebase would've prevented the team from completing the updates in a timely manner.

Instead of struggling with the app's previous tech stack to implement the update, the Drive team rebuilt the home page from the ground up using Android's recommended architecture and Jetpack Compose, Android's modern declarative toolkit for creating native UI.

“Compose, combined with architecture improvements, cut our development time nearly in half.” — Dale Hawkins, Senior software engineer and tech lead at Google Drive

Experimenting with Kotlin and Compose

The Drive team experimented with Kotlin - which the Compose toolkit is built with - for several months before planning the app's home screen rebuild. Drive's developers liked Kotlin's improved syntax and null enforcement, making it easier to produce code.

"We had been using RxJava, but started looking into replacing that with coroutines," said Dale Hawkins, the features team lead for Google Drive. "This led to a more natural alignment between coroutines and Jetpack Compose. After a deep dive into Compose, we came away with a clear understanding of how Compose has numerous benefits over the Views-based approach."

Following the Kotlin exploration, Dale experimented with Jetpack Compose. "I was pleased with how easy it was to build the UI using Compose. So I continued the experiment after that week," said Dale. "I eventually rewrote the feature using Compose."

Using Compose

Shortly after experimenting with Jetpack Compose, the Drive team decided to use it to completely rebuild the app's home screen UI.

"We wanted to make some major changes to match the ones being done for the web version, but that project had a several-month head start. We wanted to release the Android version shortly after the web changes went live to ensure our users have a seamless Google Drive experience across devices," said Dale.

The Drive team's experimentation and testing with Jetpack Compose showed that the new toolkit was powerful and reliable and that it would enable them to move faster. With this in mind, the Drive team decided to step away from their old codebase and embrace Jetpack Compose for the app's home screen update. Not only would it be quicker and easier, but it would also better prepare the team to easily make future UI changes.

Using Android's guidance for architecture

Before going all-in with Jetpack Compose, Drive developers wanted to restructure the application by implementing a completely new app architecture. Drive developers followed Android's official architecture guidance to apply structural changes, paving the way for the new Kotlin codebase.

"The recommended architecture reinforces good separation between layers," said Quintin Knudsen, an Android engineer for Google Drive. "We work in a highly dynamic environment and need to be able to adjust to any app changes. Using well-defined and independent layers helps isolate any changes or UI requirements. The recommendations from Android offered sound ways to structure the layers." With a clear separation between the app's data and UI layers, developers could work in parallel to significantly speed up testing and development.

Drive developers also relied on Mappers and UseCases when creating the new architecture. These patterns allowed them to create flexible code that is easier to manage. They also exposed flows from their ViewModels to make the UI respond immediately to any data changes, making it much simpler to implement and understand UI updates.
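
The pattern described above can be sketched generically; this is not Drive's actual code, and the repository, model, and state names are hypothetical:

import androidx.lifecycle.ViewModel
import androidx.lifecycle.viewModelScope
import kotlinx.coroutines.flow.Flow
import kotlinx.coroutines.flow.SharingStarted
import kotlinx.coroutines.flow.StateFlow
import kotlinx.coroutines.flow.map
import kotlinx.coroutines.flow.stateIn

// Hypothetical data and UI models, for illustration only.
data class FileEntry(val name: String)
data class HomeUiState(val items: List<String> = emptyList())

// Hypothetical data-layer interface exposing a flow of home-screen items.
interface HomeRepository {
    fun observeHomeItems(): Flow<List<FileEntry>>
}

class HomeViewModel(repository: HomeRepository) : ViewModel() {
    // Mapper/UseCase step: convert data-layer models into UI state and expose it as a
    // StateFlow so the UI reacts immediately to any data change.
    val uiState: StateFlow<HomeUiState> = repository.observeHomeItems()
        .map { files -> HomeUiState(files.map { it.name }) }
        .stateIn(viewModelScope, SharingStarted.WhileSubscribed(5_000), HomeUiState())
}

A Compose screen can then collect uiState (for example with collectAsStateWithLifecycle()) and recompose automatically whenever the data layer emits.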

Less code, faster development

With the app's newly improved architecture and Jetpack Compose, the Drive team was able to develop the app's new home screen in less than half the time that they expected. They also implemented the new code and finished quality assurance testing nearly seven weeks ahead of schedule.

"Thanks to Compose, we had the groundwork done within a couple of weeks. We delivered a great implementation over a month ahead of schedule, and it's been praised by product, UX, and even other engineering teams," said Dale.

Despite having fewer features, the original home screen required over 12,000 lines of code. The new Compose-based home screen has many new features and only required 5,100 lines of code, a 57% reduction. Having less code makes it much easier for developers to maintain the app and implement any updates.

Testing the new UI in Jetpack Compose also required significantly less code. Before Compose, Drive developers used roughly 9,000 lines of code to test about 62% of the UI. With Compose, it took only 2,200 lines to test over 80% of the new UI.

“The original home screen required over 12,000 lines of code. The Compose-based home screen only required 5,100 lines of code. That’s a 57% reduction.” — Dale Hawkins, Senior software engineer and tech lead at Google Drive

Looking forward

A new and improved app architecture paired with Jetpack Compose allowed Drive developers to rebuild the app's home screen UI faster and easier than they could've imagined. The Drive team plans to expand its use of Compose within the application for things like supporting large dynamic displays and text resizing.

"As we work on new projects, we're taking the opportunity to update older UI code to make use of our new architecture and Compose. The new code will be objectively better and features will be easier to write, test, and maintain," said Dale.

Get started

Improve app architecture using Android's official architecture guidance and optimize your UI development with Jetpack Compose.

09 Apr 2024 7:00pm GMT

08 Apr 2024

feedAndroid Developers Blog

Android Studio uses Gemini Pro to make Android development faster and easier

Posted by Sandhya Mohan - Product Manager, Android Studio

As part of the next chapter of our Gemini era, we announced we were bringing Gemini to more products. Today we're excited to announce that Android Studio is using the Gemini 1.0 Pro model to make Android development faster and easier, and we've seen significant improvements in response quality over the last several months through our internal testing. In addition, we are making this transition more apparent by announcing that Studio Bot is now called Gemini in Android Studio.

Gemini in Android Studio is an AI-powered coding assistant which can be accessed directly in the IDE. It can accelerate your ability to develop high-quality Android apps faster by helping generate code for your app, providing complex code completions, answering your questions, finding relevant resources, adding code comments and more - all without ever having to leave Android Studio. It is available in 180+ countries and territories in Android Studio Jellyfish.

If you were already using Studio Bot in the canary channel, you'll continue experiencing the same helpful and powerful features, but you'll notice improved quality in responses compared to earlier versions.

Ask Gemini your Android development questions

Gemini in Android Studio can understand natural language, so you can ask development questions in your own words. You can enter your questions in the chat window ranging from very simple and open-ended ones to specific problems that you need help with.

Here are some examples of the types of queries it can answer:

  • How do I add camera support to my app?
  • Using Compose, I need a login screen with the following: a username field, a password field, a 'Sign In' button, a 'Forgot Password?' link. I want the password field to obscure the input.
  • What's the best way to get location on Android?
  • I have an 'orders' table with columns like 'order_id', 'customer_id', 'product_id', 'price', and 'order_date'. Can you help me write a query that calculates the average order value per customer over the last month?
Moving image demonstrating a conversation in Android Studio


Gemini in Android Studio remembers the context of the conversation, so you can also ask follow-up questions, such as "Can you give me the code for this in Kotlin?" or "Can you show me how to do it in Compose?"

Code faster with AI powered Code Completions

Gemini in Android Studio can help you be more productive by providing you with powerful AI code completions. You can receive suggestions of multi-line code completions, suggestions for how to do comments for your code, or how to add documentation to your code.

Moving image demonstrating code completion in Android Studio


Designed with privacy in mind

Gemini in Android Studio was designed with privacy in mind. Gemini is only available after you log in and enable it. You don't need to send your code context to take advantage of most features. By default, Gemini in Android Studio's chat responses are based purely on conversation history, and you control whether you want to share additional context for customized responses. You can update this anytime in Android Studio > Settings at a granular, per-project level. You can also exclude specific files and folders through an .aiexclude file. Much like our work on other AI projects, we stick to a set of AI Principles that hold us accountable. Learn more here.

image of share settings in Android Studio


Build a Generative AI app using the Gemini API starter template

Not only does Android Studio use Gemini to help you be more productive, it can also help you take advantage of Gemini models to create AI-powered features in your applications. Get started in minutes using the Gemini API starter template, available in the Android Studio canary release channel under File > New Project > Gemini API Starter. You can also use the code sample available at File > Import Sample > Google Generative AI sample.

The Gemini API is multimodal, meaning it can support image and text inputs. For example, it can support conversational chat, summarization, translation, caption generation etc. using both text and image inputs.
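
As a rough illustration of the kind of call the starter template wires up, here is a hedged Kotlin sketch using the Google AI client SDK; the dependency setup, model name, and API key handling are assumptions, and the generated template may differ:

import com.google.ai.client.generativeai.GenerativeModel

// Summarize a piece of text with the Gemini API; the model name and key handling are placeholders.
suspend fun summarize(apiKey: String, article: String): String? {
    val model = GenerativeModel(
        modelName = "gemini-pro", // placeholder model name
        apiKey = apiKey
    )
    val response = model.generateContent("Summarize the following text:\n$article")
    return response.text
}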

image of starter templates in Android Studio


Try Gemini in Android Studio

Gemini in Android Studio is still in preview, but we have added many feature improvements - and now a major model update - since we released the experience in May 2023. It is currently no-cost for developers to try out. Now is a great time to test it and let us know what you think, before we release this experience to stable.


Stay updated on the latest by following us on LinkedIn, Medium, YouTube, or X. Let's build the future of Android apps together!

08 Apr 2024 5:00pm GMT

29 Mar 2024

feedAndroid Developers Blog

Battling Impersonation Scams: Monzo’s Innovative Approach

Posted by Todd Burner - Developer Relations Engineer

Cybercriminals continue to invest in advanced financial fraud scams, costing consumers more than $1 trillion in losses. According to the 2023 Global State of Scams Report by the Global Anti-Scam Alliance, 78 percent of mobile users surveyed experienced at least one scam in the last year, and 45 percent said they had experienced more scams in the preceding 12 months.



The Global Scam Report also found that phone calls are the top method to initiate a scam. Scammers frequently employ social engineering tactics to deceive mobile users.

The key place these scammers want individuals to take action are in the tools that give access to their money. This means financial services are frequently targeted. As cybercriminals push forward with more scams, and their reach extends globally, it's important to innovate in the response.

One such innovator is Monzo, who have been able to tackle scam calls through a unique impersonation detection feature in their app.

Monzo's Innovative Approach

Founded in 2015, Monzo is the largest digital bank in the UK with presence in the US as well. Their mission is to make money work for everyone with an ambition to become the one app customers turn to to manage their entire financial lives.

Monzo logo


Impersonation fraud is an issue that the entire industry is grappling with and Monzo decided to take action and introduce an industry-first tool. An impersonation scam is a very common social engineering tactic when a criminal pretends to be someone else so they can encourage you to send them money. These scams often involve using urgent pretenses that involve a risk to a user's finances or an opportunity for quick wealth. With this pressure, fraudsters convince users to disable security safeguards and ignore proactive warnings for potential malware, scams, and phishing.

Call Status Feature

Android offers multiple layers of spam and phishing protection for users including call ID and spam protection in the Phone by Google app. Monzo's team wanted to enhance that protection by leveraging their in-house telephone systems. By integrating with their mobile application infrastructure they could help their customers confirm in real time when they're actually talking to a member of Monzo's customer support team in a privacy preserving way.

If someone calls a Monzo customer stating they are from the bank, their users can go into the app to verify this. In the Monzo app's Privacy & Security section, users can see the 'Monzo Call Status', letting them know if there is an active call ongoing with an actual Monzo team member.

"We've built this industry-first feature using our world-class tech to provide an additional layer of comfort and security. Our hope is that this could stop instances of impersonation scams for Monzo customers from happening in the first place and impacting customers."

- Priyesh Patel, Senior Staff Engineer, Monzo's Security team

Keeping Customers Informed

If a user is not talking to a member of Monzo's customer support team, they will see that, along with some helpful information. If the 'Monzo Call Status' shows that they are not speaking to Monzo, the feature tells them to hang up right away and report it to Monzo's team. Customers can start a scam report directly from the call status feature in the app.

screen grab of Monzo call status alerting the customer that the call the customer is receiving is not coming from Monzo. The customer is being advised to end the call


If a genuine call is ongoing the customer will see the information.

screen grab of Monzo call status confirming to the customer that the call the customer is receiving is coming from Monzo.


How does it work?

Monzo has integrated a few systems together to help inform their customers. A cross functional team was put together to build a solution.

Monzo's in-house technology stack meant that the systems that power their app and customer service phone calls can easily communicate with one another. This allowed them to link the two and share details of customer service calls with their app, accurately and in real-time.

The team then worked to identify edge cases, like when the user is offline. In this situation, Monzo recommends that customers don't speak to anyone claiming to be from Monzo until they're connected to the internet again and can check the call status within the app.

screen grab of Monzo call status displaying warning while the customer is offline letting the customer know the app is unable to verify whether or not the call is coming from Monzo, so it is safer not to answer.


Results and Next Steps

The feature has proven highly effective in safeguarding customers, and received universal praise from industry experts and consumer champions.

"Since we launched Call Status, we receive an average of around 700 reports of suspected fraud from our customers through the feature per month. Now that it's live and helping protect customers, we're always looking for ways to improve Call Status - like making it more visible and easier to find if you're on a call and you want to quickly check that who you're speaking to is who they say they are."

- Priyesh Patel, Senior Staff Engineer, Monzo's Security team

Final Advice

Monzo continues to invest and innovate in fraud prevention. The call status feature brings together both technological innovation and customer education to achieve its success, and gives their customers a way to catch scammers in action.

A layered security approach is a great way to protect users. Android and Google Play provide layers like app sandboxing, Google Play Protect, and privacy preserving permissions, and Monzo has built an additional one in a privacy-preserving way.

To learn more about Android and Google Play's protections, and to further protect your app, check out the security resources available to Android and Google Play developers.

29 Mar 2024 4:00pm GMT

25 Mar 2024

feedAndroid Developers Blog

#WeArePlay | Meet the founders changing women's lives: Women’s History Month Stories

Posted by Leticia Lago - Developer Marketing

In celebration of Women's History month, we're celebrating the founders behind groundbreaking apps and games from around the world - made by women or for women. Let's discover four of my favorites in this latest batch of nine #WeArePlay stories.


Múkami Kinoti Kimotho

Royelles Revolution / Royelles Revolution: Gaming For Girls (USA)

Múkami Kinoti Kimotho – Royelles Revolution / Royelles- Gaming For Girls | USA

Múkami's journey began when she noticed the lack of representation for girls in the gaming industry. Determined to change this narrative, she created Royelles, a game designed to inspire girls and non-binary people to pursue careers in STEAM (science, technology, engineering, art, math) fields. The game is anchored in fierce female avatars like real-life NASA scientist Mara, who voices a character. Royelles is revolutionizing the gaming landscape and empowering the next generation of innovators. Múkami's excited to release more gamified stories and learning modules, and a range of extended reality and AI-powered avatars based on the game's characters.

"If we're going to effectively educate Gen Z and Gen Alpha, we have to meet them in the metaverse and leverage gamified play as a means of driving education, awareness, inspiration and empowerment."

- Múkami

Leonika Sari Njoto Boedioetomo

Reblood: Blood Services App (Indonesia)

Leonika Sari Njoto Boedioetomo – Reblood / Blood Services App | Indonesia

When her university friend needed an urgent blood transfusion but discovered there was none available in the blood bank, Leonika became aware of the blood donation shortage in Indonesia. Her mission to address this led her to create Reblood, an app connecting blood donors with those in need. With over 140,000 blood donations facilitated to date, Reblood is not only saving lives but also promoting healthier lifestyles with a recently added feature that allows people to find the most affordable medical checkups.

"Our goal is to save more lives by raising awareness of blood donation in Indonesia and promoting healthier lifestyles for blood donors."

- Leonika

Luciane Antunes dos Santos and Renato Hélio Rauber

CARSUL / Car Sul: Urban Mobility App (Brazil)

Luciane Antunes dos Santos and Renato Hélio Rauber – Car Sul: Urban Mobility App | Brazil

Luciane was devastated when she lost her son in a car accident. Her and her husband Renato's loss led them to develop Carsul, an urban mobility app prioritizing safety and security. By providing safe transportation options and partnering with government health programs to chauffeur patients long distances to larger hospitals, Carsul is not only preventing accidents but also saving lives. Luciane and Renato's dedication to protecting others from the pain they've experienced is ongoing and they plan to expand to more cities in Brazil.

"Carsul was born from this story of loss, inspiring me to protect other lives. Redefining myself in this way is very rewarding."

- Luciane

Diariata (Diata) N'Diaye

Resonantes / App-Elles: Safety App for Women (France)

Diariata (Diata) N'Diaye – Resonantes /App-Elles: Safety App for Women | France

After hearing the stories of young people who had experienced abuse similar to her own, spoken word artist Diata developed App-Elles, an app that allows women to send alerts when they're in danger. By connecting users with support networks and professional services, App-Elles is empowering women to reclaim their safety and seek help when needed. Diata also runs writing and recording workshops to help victims overcome their experiences with violence and has plans to expand her app with the introduction of a discreet wearable that sends out alerts.

"I realized from my work on the ground that there were victims of violence who needed help and support systems. This was my inspiration to create App-Elles."

- Diata


Discover more #WeArePlay stories and share your favorites.




25 Mar 2024 4:00pm GMT

21 Mar 2024

feedAndroid Developers Blog

The Second Developer Preview of Android 15

Posted by Dave Burke, VP of Engineering


Android 15 logo


Today marks the second chapter of the Android 15 story with the release of Android 15 Developer Preview 2!

Android 15 continues our work to build a platform that helps improve your productivity while giving you new capabilities to produce superior media and AI experiences, take advantage of device form factors, minimize battery impact, maximize smooth app performance, and protect user privacy and security, all on the most diverse lineup of devices out there.

Android continues to add features enabling your apps to take advantage of premium device hardware, including the latest telecommunications features, high-end media capabilities, dazzling displays, foldable/flippable form factors, and AI processing.

Your feedback on the Android 15 Developer Preview and Beta program plays a key role in helping Android continuously improve. The Android 15 developer site has more information about the preview, including downloads for Pixel and detailed documentation about changes. This preview is just the beginning, and we'll have lots more to share as we move through the release cycle. Thank you in advance for your help in making Android a platform that works for everyone.

Updating Android communications

Android 15 updates the platform to give your app access to the latest advances in communication.

Satellite support

Android 15 continues to extend platform support for satellite connectivity and includes some UI elements to ensure a consistent user experience across the satellite connectivity landscape.

Screenshot of a mobile Android device showing a notification when the device connects to a satellite
Notification when device connects to satellite

Apps can use ServiceState.isUsingNonTerrestrialNetwork() to detect when a device is connected to a satellite, giving them more awareness of why full network services may be unavailable. Additionally, Android 15 provides support for SMS/ MMS applications as well as preloaded RCS applications to use satellite connectivity for sending and receiving messages.
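
A minimal sketch of that check, assuming the READ_PHONE_STATE permission has already been granted:

import android.content.Context
import android.telephony.TelephonyManager

// Returns true when the current service state is backed by a non-terrestrial (satellite) network.
fun isUsingSatellite(context: Context): Boolean {
    val telephonyManager = context.getSystemService(TelephonyManager::class.java)
    return telephonyManager?.serviceState?.isUsingNonTerrestrialNetwork == true
}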

Smoother NFC experiences

Android 15 is working to make the tap to pay experience more seamless and reliable while continuing to support Android's robust NFC app ecosystem. On supported devices, apps can request that the NfcAdapter enter observe mode, where the device will listen but not respond to NFC readers, sending the app's NFC service PollingFrame objects to process. The PollingFrame objects can be used to authenticate ahead of the first communication to the NFC reader, allowing for a one-tap transaction in many cases.

Developer productivity

While most of our work to improve your productivity centers around tools like Android Studio, Jetpack Compose, and the Android Jetpack libraries, we always look for ways in the platform to help you more easily realize your vision.

PDF Improvements

Screenshot of a mobile Android device showing search enabled for PDF files
Enable searching embedded PDF files with updates to PdfRenderer


Android 15 Developer Preview 2 includes an early preview of substantial improvements to the PdfRenderer APIs, giving apps capabilities to incorporate advanced features such as rendering password-protected files, annotations, form editing, searching, and selection with copy. Linearized PDF optimizations are supported to speed local PDF viewing and reduce resource use.
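
For context, the core PdfRenderer flow that these improvements build on looks roughly like this; the file and page index are placeholders, and the new capabilities such as search, annotations, and form editing use additional APIs not shown here:

import android.graphics.Bitmap
import android.graphics.pdf.PdfRenderer
import android.os.ParcelFileDescriptor
import java.io.File

// Render the first page of a local PDF into a Bitmap with the existing PdfRenderer API.
fun renderFirstPage(pdfFile: File): Bitmap {
    val fd = ParcelFileDescriptor.open(pdfFile, ParcelFileDescriptor.MODE_READ_ONLY)
    PdfRenderer(fd).use { renderer ->
        val page = renderer.openPage(0)
        val bitmap = Bitmap.createBitmap(page.width, page.height, Bitmap.Config.ARGB_8888)
        page.render(bitmap, null, null, PdfRenderer.Page.RENDER_MODE_FOR_DISPLAY)
        page.close()
        return bitmap
    }
}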

The PdfRenderer has been moved to a module that can be updated using Google Play system updates independent of the platform release, and we're supporting these changes back to Android R by creating a compatible pre-Android 15 version of the API surface, called PdfRendererPreV.

We value your feedback on the enhancements we've made to the PdfRenderer API surface, and we plan to make it much easier to incorporate these APIs into your app with an upcoming Android Jetpack library. Stay tuned.

Automatic language switching refinements

Android 14 added on-device multi-language audio recognition with automatic switching between languages, but this can cause words to get dropped, especially when languages switch with less of a pause between the two utterances. Android 15 has added additional controls to allow apps to help tune this switching for their use case. EXTRA_LANGUAGE_SWITCH_INITIAL_ACTIVE_DURATION_TIME_MILLIS confines the automatic switching to the beginning of the audio session, while EXTRA_LANGUAGE_SWITCH_MATCH_SWITCHES deactivates the language switching after a defined number of switches. This can be a useful refinement, particularly if the expectation is that there will be a single language spoken during the session that should be autodetected.

Granular line break controls

Starting in Android 15, the TextView and the underlying line breaker can preserve the given portion of text in the same line to improve readability. You can take advantage of this line break customization by using the <nobreak> tag in string resources or createNoBreakSpan. Similarly, you can preserve words from hyphenation by using the <nohyphen> tag or createNoHyphenationSpan.

Examples and screenshots:

<resources>
    <string name="pixel8pro">The power and brains behind Pixel 8 Pro.</string>
</resources>
text reads: The power and brains behind Pixel 8 Pro.
<resources>
    <string name="pixel8pro">The power and brains behind <nobreak>Pixel 8 Pro.</nobreak></string>
</resources>
text reads: The power and brains behind Pixel 8 Pro.

Expanded IntentFilter Functionality

Android 15 builds-in support for more precise Intent resolution through UriRelativeFilterGroup, which contain a set of UriRelativeFilter objects that form a set of Intent matching rules that must each be satisfied, including URL query parameters, URL fragments, and blocking/exclusion rules. This helps applications better keep up with the dynamic demands of web-hosted deep links.

These rules can be defined in the AndroidManifest with the new <uri-relative-filter-group> tag, which can optionally include an android:allow attribute. These groups can contain <data> tags that use existing data tag attributes as well as the new android:query and android:fragment attributes.

An example of the AndroidManifest syntax that will be supported:

<intent-filter>
  <action android:name="android.intent.action.VIEW" />
  <category android:name="android.intent.category.BROWSABLE" />
  <data android:scheme="http" />
  <data android:scheme="https" />
  <data android:domain="astore.com" />
  <uri-relative-filter-group>
    <data android:pathPrefix="/auth" />
    <data android:query="region=na" />
  </uri-relative-filter-group>
  <uri-relative-filter-group android:allow="false">
    <data android:pathPrefix="/auth" />
    <data android:query="mobileoptout=true" />
  </uri-relative-filter-group>
  <uri-relative-filter-group android:allow="false">
    <data android:pathPrefix="/auth" />
    <data android:fragmentPrefix="faq" />
  </uri-relative-filter-group>
</intent-filter>

More OpenJDK API support

Android 15 continues to add OpenJDK APIs. Developer Preview 2 includes support for additional math/strictmath methods, lots of util updates including sequenced collection/map/set, ByteBuffer support in Deflater, and security key updates. These APIs are updated on over a billion devices running Android 12+ through Android 15 through Google Play System updates so you can target the latest programming features.

Giving your app more flexibility on more screens

Android 15 gives your apps the support to get the most out of Android's form factors, including large screens, flippables, and foldables.

Cover screen support

Your app can declare a property that Android 15 uses to allow your Application or Activity to be presented on the small cover screens of supported flippable devices. These screens are too small to be considered as compatible targets for Android apps to run on, but your app can opt-in to supporting them, making your app available in more places.

A more private, secure Android

We're always looking to give users more transparency and control over their data while enhancing the core security features of the platform.

Screen record detection

Android 15 adds support for apps to detect that they are being recorded. A callback is invoked whenever the app transitions between being visible or invisible within a screen recording. (An app is considered visible if activities owned by the registering process's UID are being recorded.) This way, if your app is performing a sensitive operation, you can inform the user that they're being recorded.

val mCallback = Consumer<Int> { state ->
  if (state == SCREEN_RECORDING_STATE_VISIBLE) {
    // we're being recorded
  } else {
    // we're not being recorded
  }
}

override fun onStart() {
   super.onStart()
   val initialState =
      windowManager.addScreenRecordingCallback(mainExecutor, mCallback)
   mCallback.accept(initialState)
}

override fun onStop() {
    super.onStop()
    windowManager.removeScreenRecordingCallback(mCallback)
}

Making Android more efficient

We are introducing new APIs that can help you gather insights about your apps, continuing to optimize the way background applications work, and providing APIs to help make tasks in your app more efficient to execute.

ApplicationStartInfo API

App startup on Android has always been a bit of a mystery. There was no easy way to know within your app whether it started from a cold, warm, or hot state. It was difficult to know how long your app spent during the various launch phases: forking the process, calling onCreate, drawing the first frame, and more. When your application class was instantiated, you had no way of knowing whether the app started from a broadcast, a content provider, a job, a backup, boot complete, an alarm, or an Activity.

The ApplicationStartInfo API on Android 15 gives you all of this and more. You can even choose to add your own timestamps into the flow to make it easy to collect timing data in one place. In addition to collecting metrics, you can use ApplicationStartInfo to help directly optimize app startup; for example, you can eliminate the costly instantiation of UI-related libraries within your Application class when your app is starting up due to a broadcast.

Changes to package stopped state

Android 15 includes several improvements to the PackageManager's Stopped State. Apps that are in a Stopped State should only be leaving this state through direct user action. Furthermore, apps entering the Stopped State will have their PendingIntents removed. To help developers re-register their pending intents, apps will now receive the BOOT_COMPLETED broadcast once they are removed from the Stopped State. Lastly, the new ApplicationStartInfo will also include the ApplicationStartInfo.wasForceStopped() to let developers know that their app was put into the Stopped State.
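
A generic sketch of handling that broadcast; the receiver still needs its usual manifest registration and the RECEIVE_BOOT_COMPLETED permission, and rescheduleWork() is a placeholder:

import android.content.BroadcastReceiver
import android.content.Context
import android.content.Intent

// Re-create alarms and other PendingIntents when BOOT_COMPLETED arrives, which on
// Android 15 is also delivered when the app is removed from the Stopped State.
class BootCompletedReceiver : BroadcastReceiver() {
    override fun onReceive(context: Context, intent: Intent) {
        if (intent.action == Intent.ACTION_BOOT_COMPLETED) {
            rescheduleWork(context)
        }
    }

    private fun rescheduleWork(context: Context) {
        // hypothetical helper: re-register alarms, callbacks, and other PendingIntents here
    }
}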

Detailed app size information

Android has offered an API, StorageStats.getAppBytes(), that summarizes the installed size of an app as a single number of bytes, which is a sum of the APK size, the size of files extracted from the APK, and files that were generated on the device such as ahead-of-time (AOT) compiled code. This number is not very insightful in terms of how your app is using storage.

Android 15 adds the StorageStats.getAppBytesByDataType([type]) API, which allows you to get insight into how your app is using up all that space, including apk file splits, AOT and speedup related code, dex metadata, libraries, and guided profiles.

Changes to foreground services

Android 14 began requiring Foreground Service Types. The documentation mentions that the dataSync Foreground Service type will be deprecated in a future version of Android.

To support migrating away from the dataSync Foreground Service type, Android 15 includes the mediaProcessing Foreground Service type, which is used to perform time-consuming operations on media assets, like converting media to different formats. In a future Beta release, this service will have a runtime limit of 6 hours.

SQLite database

Android 15 introduces new SQLite APIs that expose advanced features from the underlying SQLite engine that target specific performance issues that can manifest in apps.

Developers should consult best practices for SQLite performance to get the most out of their SQLite database, especially when working with large databases or when running latency-sensitive queries.

  • Row counts and IDs: new APIs were added to retrieve the count of changed rows or the last inserted row ID without issuing an additional query. getLastChangedRowCount() will return the number of rows that were inserted, updated, or deleted by the most recent SQL statement within the current transaction, while getTotalChangedRowCount() will return the count on the current connection. getLastInsertRowId() will return the "rowid" of the last row to be inserted on the current connection.
  • Raw statements: issue a raw SQlite statement, bypassing convenience wrappers and any additional processing overhead that they may incur.

Media refinements

Each release of Android focuses on improving the media experience.

HDR Headroom Control

side by side images of SDR content
The image on the left shows a view with SDR content. The image on the right simulates perceived headroom issues with SDR and HDR mixed content, which we can avoid by setting the desired HDR headroom.

Android 15 chooses HDR headroom that is appropriate for the underlying device capabilities and bit-depth of the panel; for pages that have lots of SDR content such as a messaging app displaying a single HDR thumbnail, this can end up adversely influencing the perceived brightness of the SDR content. Android 15 allows you to control the HDR headroom with setDesiredHdrHeadroom to strike a balance between SDR and HDR content.
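
A hedged sketch of that call, assuming it is exposed on the activity's Window; the headroom value is an arbitrary example:

import android.app.Activity

// Cap the HDR headroom on a screen that is mostly SDR content so a single HDR
// thumbnail does not make the surrounding SDR content look dim.
fun limitHdrHeadroom(activity: Activity) {
    activity.window.setDesiredHdrHeadroom(1.5f)
}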

Loudness Control

moving image of Droid wearing headphones and bopping his head rhythmically

Android 15 introduces support for the CTA-2075 loudness standard to help you avoid audio loudness inconsistencies and ensure users don't have to constantly adjust volume when switching between content. The system leverages known characteristics of the output devices (headphones, speaker) along with loudness metadata available in AAC audio content to intelligently adjust the audio loudness and dynamic range compression levels.

To enable this feature, you need to ensure loudness metadata is available in your AAC content and enable the platform feature in your app. For this, you instantiate a LoudnessCodecController object by calling its create factory method with the audio session ID from the associated AudioTrack; this automatically starts applying audio updates. You can pass an OnLoudnessCodecUpdateListener to modify/filter loudness parameters before they are applied on the MediaCodec.

// media contains metadata of type MPEG_4 OR MPEG_D
val mediaCodec = ...
val audioTrack = AudioTrack.Builder()
                                .setSessionId(sessionId)
                                .build()
...
// create a new loudness controller that applies the parameters to the MediaCodec
try {
   val lcController = LoudnessCodecController.create(sessionId)
   // starts applying audio updates for each added MediaCodec
   lcController.addMediaCodec(mediaCodec)
} catch (e: IllegalArgumentException) {
   // the audio session ID or codec was rejected; handle or log as appropriate
}

AndroidX media3 ExoPlayer will soon be updated to leverage LoudnessCodecController APIs for a seamless app integration.

Use Spatializer instead of Virtualizer

Android 12 included the Spatializer class, which enables querying the capabilities and behavior of sound spatialization on the device. In Android 15, we're deprecating the Virtualizer class; instead use AudioAttributes.Builder.setSpatializationBehavior to characterize how you want your content to be played when spatialization is supported.
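For example, content can be tagged so the platform spatializes it when supported; setSpatializationBehavior and its constants have been available on AudioAttributes since Android 12L, so a sketch like the following should apply broadly.

import android.media.AudioAttributes

// Let the platform decide when to spatialize this stream; use
// SPATIALIZATION_BEHAVIOR_NEVER to opt a stream out instead.
val attributes = AudioAttributes.Builder()
    .setUsage(AudioAttributes.USAGE_MEDIA)
    .setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
    .setSpatializationBehavior(AudioAttributes.SPATIALIZATION_BEHAVIOR_AUTO)
    .build()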

AndroidX media3 ExoPlayer 1.0 enables spatial audio by default for multichannel audio when the device supports it. See the blog post and documentation for more information, including APIs to control the feature.

User Experience

AutomaticZenRules allow apps to customize Attention Management (Do Not Disturb) rules and decide when to activate/deactivate them. Android 15 greatly enhances these rules with the goal of improving the user experience. It does this by:

  • Adding types to AutomaticZenRule, allowing the system to apply special treatment to some rules
  • Adding an icon to AutomaticZenRule, helping to make the modes more recognizable
  • Adding a triggerDescription string to AutomaticZenRule that describes the conditions under which the rule should become active for the user
  • Adding ZenDeviceEffects to AutomaticZenRule, allowing rules to trigger things like grayscale display, night mode, or dimming the wallpaper

Behavior changes

Because backward compatibility is so important to us, we try to limit impactful behavior changes, but some are inevitable.

Elegant fonts everywhere

Once your app targets Android 15, the elegantTextHeight TextView attribute becomes true by default, replacing the compact font used by default for some scripts that have large vertical metrics with one that is much more readable. The compact font was introduced to prevent breaking layouts; Android 13 already prevents many of these breakages by allowing the text layout to stretch the vertical height using the fallbackLineSpacing attribute. In Android 15, the compact font still remains in the system, so your app can set elegantTextHeight to false to get the same behavior as before, but it is unlikely to be supported in upcoming releases. So, if your application supports any of the following scripts: Arabic, Lao, Myanmar, Tamil, Gujarati, Kannada, Malayalam, Odia, Telugu, or Thai, please test your application by setting elegantTextHeight to true.
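To preview the new default before retargeting, you can toggle the attribute on a TextView while testing; this sketch uses the long-standing setElegantTextHeight setter and is equivalent to setting android:elegantTextHeight in XML. The helper function is a hypothetical testing aid.

import android.widget.TextView

// Force the Android 15 default on (or off) for layout testing with scripts
// such as Arabic, Lao, Myanmar, Tamil, Gujarati, Kannada, Malayalam, Odia,
// Telugu, or Thai.
fun previewElegantTextHeight(textView: TextView, enabled: Boolean = true) {
    textView.setElegantTextHeight(enabled)
}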

Examples and screenshots

Default behavior as of Android 14

Default behavior for applications that target Android 15

App compatibility

Android 15 release timeline

To give you more time to plan for app compatibility work, we're letting you know our Platform Stability milestone well in advance.

At this milestone, we'll deliver final SDK/NDK APIs and also final internal APIs and app-facing system behaviors. We're expecting to reach Platform Stability in June 2024, and from that time you'll have several months before the official release to do your final testing. The release timeline details are here.

Get started with Android 15

The Developer Preview has everything you need to try the Android 15 features, test your apps, and give us feedback. You can get started today by flashing a system image onto a Pixel 6, 7, or 8 series device, along with the Pixel Fold and Pixel Tablet. We are not offering sideload images for Developer Preview 2. If you don't have a Pixel device, you can use the 64-bit system images with the Android Emulator in Android Studio. If you've already installed Android 15 Developer Preview 1, you should get an over-the-air update to Android 15 Developer Preview 2.

For the best development experience with Android 15, we recommend that you use the latest preview of Android Studio Jellyfish (or more recent Jellyfish+ versions). Once you're set up, here are some of the things you should do:

  • Try the new features and APIs - your feedback is critical during the early part of the developer preview. Report issues in our tracker on the feedback page.
  • Test your current app for compatibility - learn whether your app is affected by changes in Android 15; install your app onto a device or emulator running Android 15 and extensively test it.

We'll update the preview system images and SDK regularly throughout the Android 15 release cycle. This preview release is for developers only and not intended for daily or consumer use, so we're making it available by manual download only. Once you've manually installed a preview build, you'll automatically get future updates over-the-air for all later previews and Betas. Read more here.

If you intend to move from the Android 14 QPR Beta program to the Android 15 Developer Preview program and don't want to have to wipe your device, we recommend that you move to Developer Preview 2 now. Otherwise, you may run into time periods where the Android 14 Beta has a more recent build date, which will prevent you from going directly to the Android 15 Developer Preview without doing a data wipe.

As we reach our Beta releases, we'll be inviting consumers to try Android 15 as well, and we'll open up enrollment for the Android Beta program at that time. For now, please note that the Android Beta program is not yet available for Android 15.

For complete information, visit the Android 15 developer site.

Java and OpenJDK are trademarks or registered trademarks of Oracle and/or its affiliates.

21 Mar 2024 6:00pm GMT

14 Mar 2024

feedAndroid Developers Blog

Tune in for Google I/O on May 14

Posted by Jeanine Banks - VP & General Manager, Developer X, and Head of Developer Relations

Google I/O is arriving this year on May 14th and you're invited to join us online! I/O offers something for everyone, whether you are developing a new application, modernizing an existing one, or transforming it into a business.

The Gemini era unlocks new possibilities for developers to build creative and productive AI-enabled applications. I/O is where you'll hear how you can get from idea to production AI applications faster. We're excited to share what's new for mobile, web, and multiplatform development, and how to scale your applications in the cloud. You will be able to dive deeper into topics that interest you with over 100 sessions, workshops, codelabs, and demos.

Visit the Google I/O site and register to stay informed about I/O and other related events coming soon. The livestreamed keynotes start May 14 at 10am PT, so mark your calendar.

If you haven't already, go try out our newest Google I/O puzzle and head to @googlefordevs on Instagram if you need a hint.

14 Mar 2024 8:00pm GMT

12 Mar 2024

feedAndroid Developers Blog

Key product updates from the 2024 Google for Games Developer Summit

Posted by Aurash Mahbod - General Manager, Games on Google Play

We're on a mission to make Google Play the best destination for gaming, and your amazing games are what make it possible. That's why we're investing in tools that empower you to achieve more by increasing user engagement, accelerating your business growth, and maximizing your reach.

Today, we shared more details about those investments at the Google for Games Developer Summit. Check out the Keynote session on demand, or keep reading for key product updates from the summit.

More ways to increase user engagement

We know that identity, progress, and achievements are important to gamers. With Play Games Services (PGS), gamers can switch between devices and jump back into gameplay instantly, on any device connected with PGS1. Based on your feedback, we've made it easier for you to integrate PGS and provide more relevant experiences for your users:

  • No database changes are required to integrate PGS, and you no longer need to store the association between in-game accounts and PGS profiles.
  • You can start automatically syncing your users' sign in information via PGS, including those without a profile, ensuring their progress is available to be synced when they change to a new device2.
  • Throughout this year, we're rolling out more ways for you to create engaging in-game content by achievement level or game progress. For example, you will be able to create Quests within your game to challenge players while unlocking rewards using Play Points.

To start creating these engaging experiences within your game, integrate PGS today - and stay tuned for more capabilities coming soon.

We've also been working to tailor our store experience to better engage users who've already installed your game. Starting today, we're rolling out enhancements to store listings to more prominently display your game updates, new content, and promotions directly within Play3. Your store listing page will source relevant content to keep your audience engaged with your game, including your latest YouTube videos, AI-generated FAQs, and more.

These improvements are currently limited to English language users and to select titles that are part of the early access program. If you're interested in testing the enhanced store listing for your game and helping shape this feature with your feedback, you can express interest here.

moving image of Games Hub (left) and Games Home (right)

Expanded programs to accelerate growth

We're also expanding our most popular programs to help you accelerate your business growth.

  • With over 220M members, Google Play Points is one of the largest loyalty programs in the world. Available in over 35 markets and covering the vast majority of purchase-ready, active spenders, Play Points helps you retain users and reach new ones. This year, we're excited to announce that we'll be launching Google Play Points in Brazil.
  • Play Points now offers the ability to set limited quantities for exclusive in-game offers, building excitement and boosting participation around limited deals. These offers have been highly effective at reaching paying users and driving repeat purchases.
  • Google Play Pass, which has grown over 120% in subscriptions over the past year, is expanding to include in-game offers from popular games. We're launching in-game offers in 21 markets including Japan, and launching Play Pass in Korea later this year.


New ways to maximize reach

Finally, to help you reach more users with your games, we're making it easier to scale and grow your multi-platform games across mobile, tablets, Chromebooks, and Windows PCs5.

  • We're also making user acquisition campaigns easier and more helpful for PC games. You can now use the Play Install Referrer API for Google Play Games to achieve closed-loop marketing on PC, passing campaign or click information into your game so you can optimize marketing campaigns across ad networks, social, and other channels.

For more news from the Google for Games Developer Summit, check out g.co/gamedevsummit. You can also connect with us at GDC in San Francisco. We'll be hosting a full day of sessions on March 19 and a week of hands-on product experience at the Moscone Center, West Hall, Level 2 Lobby from March 18 to March 22.

As always, thank you for partnering with us to bring amazing games to players everywhere.


___________________

1Subject to game availability and PC compatibility.
2This feature requires PGS SDK V2 integration. Developers can save sign-in information to PGS for users, but only retrieve this once the user has set up a profile.
3Feature availability limited to English language users and select titles that are part of the early access program.
4Google internal data. New buyer refers to users who made a purchase during this pilot experiment after making no purchase for the past 50 days.
5Windows is a trademark of the Microsoft group of companies.
6Google internal data.

12 Mar 2024 4:00pm GMT

Introducing Play Install Referrer for Google Play Games on PC

Posted by Arjun Dayal, Director - Google Play Games

Making informed marketing decisions relies on identifying your most valuable user acquisition channels for your games. By tracking referral data, you can understand which traffic sources send the most users to download your app from the Google Play store. These insights can help you make the most of your advertising spend and maximize ROI. That's why in 2017, we launched the Play Install Referrer API, which gives you an easy, reliable way to track your apps' referral information directly from the Play store.

Until now, this feature was only available for your games in the mobile Play store. Today, we're pleased to announce support for Google Play Games on PC, allowing you to attribute conversions from your marketing activities on the Web1. If you use the Google Play Install Referrer API to track your referral sources, you can now attribute conversions to specific campaigns directly from Google Play by manually retrieving referral information, or using third-party analytics tools from Google's App Attribution Partners.

Getting started is easy. First, generate a Google Play URL for your game's Google Play store listing page and add a referrer query parameter for your Web campaign. Then, when a PC user clicks the link, they will be redirected to your game's listing page on the Google Play Web store, which will give them the option to "Install on Windows." Once the user launches your game, you'll be able to track the referral using the Google Play Install Referrer library.
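As a rough sketch, retrieving that referral data with the Play Install Referrer library (com.android.installreferrer) might look like the following; the campaign URL, package name, and log tag are hypothetical examples.

import android.content.Context
import android.util.Log
import com.android.installreferrer.api.InstallReferrerClient
import com.android.installreferrer.api.InstallReferrerStateListener

// Hypothetical campaign link:
// https://play.google.com/store/apps/details?id=com.example.game&referrer=utm_source%3Dspring_campaign

fun fetchInstallReferrer(context: Context) {
    val client = InstallReferrerClient.newBuilder(context).build()
    client.startConnection(object : InstallReferrerStateListener {
        override fun onInstallReferrerSetupFinished(responseCode: Int) {
            if (responseCode == InstallReferrerClient.InstallReferrerResponse.OK) {
                // e.g. "utm_source=spring_campaign"
                val referrer = client.installReferrer.installReferrer
                Log.d("Referrer", "Install referrer: $referrer")
            }
            client.endConnection()
        }

        override fun onInstallReferrerServiceDisconnected() {
            // Connection to Google Play lost; retry if needed.
        }
    })
}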

"With integration support from Adjust, developers can quickly and efficiently measure marketing campaigns for their games on Google Play Games on PC. We're excited about the opportunity this brings for developers to broaden their games' reach and strengthen cross-platform measurement."

- Gijsbert Pols, Director of Connected TV & New Channels, Adjust

Learn more about third-party referral codes for Google Play Games on PC and start optimizing your marketing performance today.


______________

1Subject to device compatibility with Google Play Games on PC.

12 Mar 2024 3:45pm GMT

Meet the class of 2024 for Google Play’s Indie Games Accelerator

Posted by Leticia Lago - Developer Marketing

Today, we're excited to reveal Google Play's Indie Games Accelerator class of 2024.

Selected game studios from around the world will take part in the 10-week accelerator designed to take their businesses to the next level on Google Play. The program includes:

  • A series of online masterclasses, talks and gaming workshops hosted by industry leaders
  • Mentorship sessions covering a broad variety of topics including technical development, gameplay and team leadership
  • Access to gaming experts from Google and leading studios

Due to the number of impressive applications, we've doubled this year's class size from 30 to 60 studios. Without further ado, meet the class of 2024 and join us in congratulating them!



Americas

Logisk Studio
Attu
Sprocket Games
Blu Studios
Highpoint Games
D20 Studios
Supernova Games
Cafundó Creative Studio
Hyperthought Games Inc.
North Star Digital Studios
Theia Studios
Aurecas Games
Mana Burn
67 Bits
Retora Games
Ocarina Studios
WonderLegend Games
EiP Game Studio
Asia Pacific

CLOVER-FI Games
Crimzen Red Studios
QueseraGames Co., Ltd.
Gonggamore Contents Inc.
ONDOT INC
LiberalDust
Studio Boxcat
Whoyaho Corp.
Blackhammer
Algorocks
Own Games
Kudos Games
Appspace Solutions
Lentera Nusantara
Dunali Games
Hexpion
Dreams Studio
Panthera Studio
Lunarite Studio
Npckc
ONDI Games
Playdew
Niku Games Studio
Avian Hearts Studios Pvt. Ltd
WASD Interactive
Europe, Middle East & Africa

First Pick Studios
Pank0
Big Loop Studios
BaldrickSoft
RPG games
Airapport
Post Physical
WALKME MOBILE SOLUTIONS
Iteration One
Veryo Studios
Monster League
TERAHYPE
3Hills
Gravity Code
Torpor Games
Nordic Stone Studio
TruePlayers
Pineapple on Pizza Studios

Congratulations again to all the founders selected; we can't wait to see your games grow on our platform.

We're committed to helping app and game businesses of all sizes reach their full potential. Discover more about Google Play's programs, resources and tools for indie games developers.

12 Mar 2024 10:00am GMT

11 Mar 2024

feedAndroid Developers Blog

Enhanced screen sharing capabilities in Android 14 (and Google Meet) improve meeting productivity

Posted by Francesco Romano - Developer Relations Engineer on Android

App screen sharing improves privacy and productivity

Android 14 QPR2 brings exciting advancements in user privacy and streamlined multitasking with app screen sharing. No longer do users have to broadcast their entire screen while screen sharing or casting, ensuring they share exactly what they want to share.

Leverage the new MediaProjection APIs to customize the screen sharing experience and deliver even greater utility to your users.

What is app screen sharing?

Prior to Android 14, users could only share or record their entire screen on Android devices, which could expose private information in other apps or notifications.

App screen sharing is a new platform feature that lets users restrict sharing and recording to a single app window, mitigating the risk of oversharing private messages or notifications. With app screen sharing, the status bar, navigation bar, notifications, and other system UI elements are excluded from the shared display. Only the content of the selected app is shared.

This not only enhances security for screen sharing, but also enables new use cases on large screens. Users can improve multitasking productivity - such as screen sharing while attending a meeting - by taking advantage of extra screen space on these larger devices.

How does it work?

There are three different entry points for users to start app screen sharing:

  1. Start casting from Quick Settings
  2. Start screen recording from Quick Settings
  3. Launch from an app with screen sharing or recording capabilities via the MediaProjection API

Let's consider an example where a host user wants to share a single app to the participants of a video call.

The host user starts screen sharing as usual, but now in Android 14 they are presented with an updated dialog that allows them to choose whether to share a single app instead of their entire screen.

The host user decides to share a single app, and they select the app from the App Selector.

During screen sharing, the video call participants can see only the content from the selected app.

The host user can end the screen capture in a few ways: from the app where sharing started, in the notification shade, by closing the app being shared, or by ending the video call.

visual journey of host sharing a single app to the participants in a video call across four panels

How to support app screen sharing?

Apps that use the MediaProjection APIs are capable of starting app screen sharing without any code changes. However, it's important to test your app to ensure that the screen sharing experience works as intended, since the user flow changes with this new behavior. Previously, the user would stay in the host app after the permission dialog. With app screen sharing, the user is not returned to the host app; instead, the target app to be shared is launched. If the target app was already running in the foreground (for example, in multi-window mode), it simply becomes the top focused app.

Android 14 also introduces two callback methods to empower you to customize the sharing experience:

MediaProjection.Callback#onCapturedContentResize(width, height) is invoked immediately after capture begins or when the size of the captured region changes. The method arguments provide the accurate sizing for the streamed capture.

Note: The given width and height correspond to the same width and height that would be returned from android.view.WindowMetrics#getBounds() of the captured region.

If the recorded content has a different aspect ratio from either the VirtualDisplay or output Surface, the captured stream has black bars around the recorded content. The application can avoid the black bars around the recorded content by updating the size of both the VirtualDisplay and output Surface:

override fun onCapturedContentResize(width: Int, height: Int) {
    // VirtualDisplay instance from MediaProjection#createVirtualDisplay();
    // dpi is the density used when the VirtualDisplay was created.
    virtualDisplay.resize(width, height, dpi)

    // Create a new Surface with the updated size.
    val textureName = 0 // replace with your OpenGL texture object name
    val surfaceTexture = SurfaceTexture(textureName)
    surfaceTexture.setDefaultBufferSize(width, height)
    val surface = Surface(surfaceTexture)

    // Ensure the VirtualDisplay sends the capture to the updated Surface.
    virtualDisplay.setSurface(surface)
}

The other API is MediaProjection.Callback#onCapturedContentVisibilityChanged(isVisible), which is invoked after capture begins or when the visibility of the captured region changes. The method argument indicates the current visibility of the captured region.

The callback is triggered when:

  • The captured region becomes invisible (isVisible == false). This may happen when the projected app is no longer topmost, for example when another app entirely covers it, or when the user navigates away from the captured app.
  • The captured region becomes visible again (isVisible == true). This may happen if the user moves the covering app to reveal at least some portion of the captured app (for example, the user has multiple apps visible in multi-window mode).

Applications can take advantage of this callback by showing or hiding the captured content from the output Surface based on whether the captured region is currently visible to the user. You should pause or resume the sharing accordingly in order to conserve resources.
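A minimal sketch of that callback is below; pauseSharing() and resumeSharing() are hypothetical stand-ins for your app's encoder logic, and the callback object would be registered with your MediaProjection instance as usual.

import android.media.projection.MediaProjection

// Hypothetical helpers standing in for your app's capture pipeline.
fun pauseSharing() { /* stop rendering frames to the output Surface */ }
fun resumeSharing() { /* resume rendering frames to the output Surface */ }

val captureCallback = object : MediaProjection.Callback() {
    override fun onCapturedContentVisibilityChanged(isVisible: Boolean) {
        if (isVisible) {
            // The captured app is on screen again: resume feeding frames.
            resumeSharing()
        } else {
            // The captured app is covered or backgrounded: pause to conserve resources.
            pauseSharing()
        }
    }
}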

How Google Meet is improving meeting productivity

"App screen sharing enables users to share specific information in a Meet call without oversharing private information on the screen like messages and notifications. Users can choose specific apps to share, or they can share the whole screen as before. Additionally, users can leverage split-screen mode on large screen devices to share content while still seeing the faces of friends, families, coworkers, and other meeting participants." - Product Manager at Google Meet

Let's see app screen sharing in action during a video call, in this coming-soon version of Google Meet!

moving image of app screen sharing in action during a video call on Google Meet

Window on the world

App screen sharing opens doors (and windows) for more focused and secure app experiences within the Android ecosystem.

This new feature enhances several use cases:

  • Collaboration apps can facilitate focused discussion on specific design elements, documents, or spreadsheets without including distracting background details.
  • Tech support agents can remotely view the user's problem app without seeing potentially sensitive content in other areas.
  • Video conferencing tools can share a presentation window selectively rather than the entire screen.
  • Educational apps can demonstrate functionality without compromising student privacy, and students can share projects without fear of showing sensitive information.

By thoughtfully implementing app screen sharing, you can establish your app as a champion of user privacy and convenience.

11 Mar 2024 7:00pm GMT

08 Mar 2024

feedAndroid Developers Blog

Better, faster, stronger time zone updates on Android

Posted by Almaz Mingaleev - Software Engineer and Masha Khokhlova - Technical Program Manager

It's that time of year again when many of us move our clocks! Oh wait, your Android devices did it automatically, didn't they? For Android users living in many countries, this may not be surprising. For example, the US, EU and UK governments haven't changed their time legislation in a while*, so users wake up every morning to see the correct time.

But, what happens when time laws change? If you look globally, governments can and do change their time laws, sometimes every year, and Android devices have to keep up to support our global user base.

To implement a region's time legislation, Android devices have to follow a set of encoded rules. What are these rules? Let's start with why rules are needed in the first place. Clearly, 7am in Los Angeles and 7am in London are not the same time. Moreover, if you are in London and want to know the time in Los Angeles, you have to know how many hours to subtract, and this is not fixed throughout the year**. So to tell local time (the time your watches should show) it is convenient to have a reference clock that everybody on the planet agrees on. This clock is named UTC, Coordinated Universal Time. Local time in London during winter matches UTC; during summer it is calculated by adding one hour to UTC, usually referred to as UTC+1. For Los Angeles, local time during summer is UTC-7 (7 hours behind, a UTC offset of -7 hours) and during winter it is UTC-8 correspondingly. When a region changes from one offset to another, we call that a "transition". The combination of these offsets and the rules for when a transition happens (such as "last Sunday of March" or "first Sunday on or after 8th March") defines a time zone. For some countries, the time zone rules can be very simple and primarily determined by their chosen UTC offset: "no transitions, we don't move our clocks forwards and backwards".
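For a concrete illustration, the snippet below renders the same UTC instant as local time in London and Los Angeles using java.time, which consumes the same TZDB rules described here; the offsets printed depend on the date (for example UTC or UTC+1 for London, UTC-8 or UTC-7 for Los Angeles).

import java.time.ZoneId
import java.time.ZonedDateTime

fun main() {
    // One reference instant, expressed in UTC.
    val nowUtc = ZonedDateTime.now(ZoneId.of("UTC"))
    // The same instant as local ("watch") time in two regions; the offsets
    // change across the daylight saving transitions defined by the TZDB.
    println(nowUtc.withZoneSameInstant(ZoneId.of("Europe/London")))
    println(nowUtc.withZoneSameInstant(ZoneId.of("America/Los_Angeles")))
}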

Governments can decide to change the UTC offset for regions, introduce new time zone regions, or alter the day that daylight saving transitions occur. When governments do this, the time zone rules on every Android device need to be updated, otherwise devices will continue to follow the old rules, which can lead to an incorrect local time being shown to users in the affected areas.

Android is not alone in needing to keep track of this information. Fortunately, there is a database known as the TZDB (Time Zone Database), supported by IANA (the Internet Assigned Numbers Authority) and maintained by a small group of volunteers, which is used as a basis for local timekeeping on most modern operating systems. The TZDB contains most of the information that Android needs.

There is no schedule, but typically the TZDB releases a new update 4-5 times a year. The Android team wants to release updates that affect its devices as soon as possible.

How do these changes reach your devices?

1. Government signs a law / decree.

2. Someone lets IANA know about these changes.

3. Depending on how much lead time was given and on changes announced by other countries, IANA publishes a new TZDB release.

4. The Android team incorporates the TZDB release (along with a small amount of additional information we obtain from related projects and derive ourselves) into our codebase.

5. We roll-out these updates to your devices. How the roll-out happens depends on the type and age of the Android device.

a. Many mobile Android devices are covered by Google's Project Mainline, which means that Google sends updates to devices directly.

b. Some devices are handled by the device's manufacturer who takes the Android team's source code updates and releases them to devices themselves according to their own update schedule.

As you can see, there are quite a few steps. Applying, testing and releasing an update can take weeks. And it is not just Android and other computer operating systems like it that need to take action. There are usually telecoms, banks, airlines and software companies that have to make adjustments to their own systems and timetables. Citizens of a country need to be made aware of changes so they know what to expect, especially if they are using older devices that might not receive necessary updates. And it all takes time and can cause problems for countless people if it isn't handled well. The amount of disruption caused by a change is usually determined by the clarity of the legislation and the notice period that governments provide. The TZDB volunteers are good at spotting changes, but it helps if governments notify IANA directly, especially when it's not clear which regions or existing laws are affected. Unfortunately, many of the recent time zone changes were given with about a month or less of notice. Android has a set of recommendations for how much notice to provide. Other operating systems have similar recommendations.

Android is constantly evolving. One such improvement, Project Mainline, introduced in Android 10, has made a big difference in how we update important parts of the Android operating system. It allows us to deliver select AOSP components directly through Google Play, making updates faster than a full OTA update and reducing duplication of effort by each OEM.

From the beginning, time zone rules were a component in Mainline, called Time Zone Data, or the tzdata module. This integration allowed us to react more quickly to government-mandated time zone changes than before. However, until 2023, tzdata updates were still bundled with other Mainline changes, sometimes leading to testing complexities and slower deployment.

In 2023, we made further investments in Mainline's infrastructure and decoupled the tzdata module from the other components. With this isolation, we gained the ability to respond rapidly to time zone legislation changes - often releasing updates to Android users outside of the established release cadence. Additionally, this change means time zone updates can reach a far greater number of Android devices, ensuring you as Android users always see the correct time.

So while your Android phone may not be able to restore that lost hour of sleep, you can rest assured that it will show the accurate time, thanks to volunteers and the Android team.

Curious about the ever-changing world of time zones? Explore the IANA Time Zone Database and learn more about how time and time zones are managed on Android.


*In 2018-2019 there were changes in Alaska. This is a blogpost, not a technical documentation!

**Because the US and UK apply their daylight saving changes at different local times and on different days of the year.

08 Mar 2024 9:00am GMT

07 Mar 2024

feedAndroid Developers Blog

Introducing the Fused Orientation Provider API: Consistent device orientation for all

Posted by Geoffrey Boullanger - Senior Software Engineer, Shandor Dektor - Sensors Algorithms Engineer, Martin Frassl and Benjamin Joseph - Technical Leads and Managers

Device orientation, or attitude, is used as an input signal for many use cases: virtual or augmented reality, gesture detection, or compass and navigation - any time the app needs the orientation of a device in relation to its surroundings. We've heard from developers that orientation is challenging to get right, with frequent user complaints when orientation is incorrect. A maps app should show the correct direction to walk towards when a user is navigating to an exciting restaurant in a foreign city!

The Fused Orientation Provider (FOP) is a new API in Google Play services that provides quality and consistent device orientation by fusing signals from accelerometer, gyroscope and magnetometer.

Although the Android Rotation Vector already provides device orientation (and will continue to do so), the new FOP provides more consistent behavior and higher performance across devices. We designed the FOP API to be similar to the Rotation Vector to make the transition as easy as possible for developers.

In particular, the Fused Orientation Provider

  • Provides a unified implementation across devices: an API in Google Play services means that there is no implementation variance across different manufacturers. Algorithm updates can be rolled out quickly and independent of Android platform updates;
  • Directly incorporates local magnetic declination, if available;
  • Compensates for lower quality sensors and OEM implementations (e.g., gyro bias, sensor timing).

In certain cases, the FOP returns values piped through from the AOSP Rotation Vector, adapted to incorporate magnetic declination.

How to use the FOP API

Device orientation updates can be requested by creating and sending a DeviceOrientationRequest object, which defines some specifics of the request like the update period.

The FOP then outputs a stream of the device's orientation estimates as quaternions. The orientation is referenced to geographic north. In cases where the local magnetic declination is not known (e.g., location is not available), the orientation will be relative to magnetic north.

In addition, the FOP provides the device's heading and accuracy, which are derived from the orientation estimate. This is the same heading that is shown in Google Maps, which uses the FOP as well. We recently added changes to better cope with magnetic disturbances, to improve the reliability of the cone for Google Maps and FOP clients.

The update rate can be set by requesting a specific update period. The FOP does not guarantee a minimum or maximum update rate. For example, the update rate can be faster than requested if another app has a faster parallel request, or slower than requested if the device doesn't support the requested rate.

For the full specification of the API, please consult the API documentation.

Example usage (Kotlin)

package ...

import android.content.Context
import com.google.android.gms.location.DeviceOrientation
import com.google.android.gms.location.DeviceOrientationListener
import com.google.android.gms.location.DeviceOrientationRequest
import com.google.android.gms.location.FusedOrientationProviderClient
import com.google.android.gms.location.LocationServices
import com.google.common.flogger.FluentLogger
import java.util.concurrent.Executors

class Example(context: Context) {
  private val logger: FluentLogger = FluentLogger.forEnclosingClass()

  // Get the FOP API client
  private val fusedOrientationProviderClient: FusedOrientationProviderClient =
    LocationServices.getFusedOrientationProviderClient(context)

  // Create an FOP listener
  private val listener: DeviceOrientationListener =
    DeviceOrientationListener { orientation: DeviceOrientation ->
      // Use the orientation object returned by the FOP, e.g.
      logger.atFinest().log("Device Orientation: %s deg", orientation.headingDegrees)
    }

  fun start() {
    // Create an FOP request
    val request =
      DeviceOrientationRequest.Builder(DeviceOrientationRequest.OUTPUT_PERIOD_DEFAULT).build()

    // Create (or re-use) an Executor or Looper, e.g.
    val executor = Executors.newSingleThreadExecutor()

    // Register the request and listener
    fusedOrientationProviderClient
      .requestOrientationUpdates(request, executor, listener)
      .addOnSuccessListener { logger.atInfo().log("FOP: Registration Success") }
      .addOnFailureListener { e: Exception? ->
        logger.atSevere().withCause(e).log("FOP: Registration Failure")
      }
  }

  fun stop() {
    // Unregister the listener
    fusedOrientationProviderClient.removeOrientationUpdates(listener)
  }
}

Technical background

The Android ecosystem has a wide variety of system implementations for sensors. Devices should meet the criteria in the Android compatibility definition document (CDD) and must have an accelerometer, gyroscope, and magnetometer available to use the fused orientation provider. It is preferable that the device vendor implements the high fidelity sensor portion of the CDD.

Even though Android devices adhere to the Android CDD, recommended sensor specifications are not tight enough to fully prevent orientation inaccuracies. Examples of this include magnetometer interference from internal sources, and delayed, inaccurate or nonuniform sensor sampling. Furthermore, the environment around the device usually includes materials that distort the geomagnetic field, and user behavior can vary widely. To deal with this, the FOP performs a number of tasks in order to provide a robust and accurate orientation.

We have validated our algorithms on comprehensive test data to provide a high quality result on a wide variety of devices.

Availability and limitations

The Fused Orientation Provider is available on all devices running Google Play services on Android 5 (Lollipop) and above. Developers need to add the dependency play-services-location:21.2.0 (or above) to access the new API.

Permissions

No permissions are required to use the FOP API. The output rate is limited to 200Hz on devices running API level 31 (Android S) or higher, unless the android.permission.HIGH_SAMPLING_RATE_SENSORS permission was added to your AndroidManifest.xml.

Power consideration

Always request the longest update period (lowest frequency) that is sufficient for your use case. While more frequent FOP updates can be required for high-precision tasks (for example, augmented reality), they come with a power cost. If you do not know which update period to use, we recommend starting with DeviceOrientationRequest::OUTPUT_PERIOD_DEFAULT as it fits most client needs.

Foreground behavior

FOP updates are only available to apps running in the foreground.


Copyright 2023 Google LLC.
SPDX-License-Identifier: Apache-2.0

07 Mar 2024 10:00pm GMT

#TheAndroidShow: the latest from MWC, Gemini Nano, Android 15 and more!

Posted by Anirudh Dewani, Director of Android Developer Relations


Last week, Android device makers released a slew of new devices, and today we're unpacking what that means for developers, as well as the latest in Gemini Nano, Android 15, Jetpack Compose and more, in another episode of our quarterly show, #TheAndroidShow:

The latest wearables and foldables - get building!

Android device makers unveiled their latest wearables and foldables last week at Mobile World Congress, and we were on the ground in Barcelona taking a look at those new devices and how you can get started building on top of them. A few of our favorites:

  • Xiaomi Watch 2, the latest smartwatch from the Xiaomi team. This device is powered by Wear OS by Google and provides upgraded camera, fitness, and sleep experiences to allow users to get the most from their device.
  • PORSCHE DESIGN HONOR Magic V2 RSR, the world's thinnest inward foldable smartphone. This is the latest foldable for Android and was designed with the user experience at the forefront, including human-centric eye comfort technology.

Compose is an amazing way to build apps for your users across form factors. Compose for Wear OS and the upcoming adaptive layouts for large screens help devs bring their apps to life with less code, powerful tools, and intuitive APIs. Check out the Wear OS and Large Screen galleries, where you can find UX inspiration and design guidance tailored to your type of app.




Behind the scenes, with Gemini Nano and AICore

With all of the excitement around generative AI, it could be daunting to know where to start. So in today's show, we're taking you behind the scenes with Gemini Nano, Google's most efficient model built for on-device tasks, and AICore, Android's system service for on-device foundation models. And we're spotlighting how the team that builds the Recorder app used Gemini Nano to help summarize users' voice memos on-device and with privacy in mind. And here's the best part: the team built the feature in a short time with only a small number of engineers.




Now in Android

We celebrated the 100th episode of Now in Android, covering the latest developer news.




And that's a wrap on another episode of our quarterly show, #TheAndroidShow. But the conversation continues on YouTube, X and LinkedIn: tell us your favorite part, or what you'd like us to dive into next time on our quarterly episode. And before we sign off, you can watch the full playlist, with the latest in Android developer news, here.

07 Mar 2024 6:00pm GMT

06 Mar 2024

feedAndroid Developers Blog

Designing your account deletion experience with users in mind

Posted by Tatiana van Maaren - Global T&S Partnerships Lead, Privacy & Security, May Smith - Product Manager, and Anita Issagholyan - Policy Lead

With millions of developers relying on our platform, Google Play is committed to keeping our ecosystem safe for everyone. That's why, in addition to our ongoing investments in app privacy and security, we also continuously update our policies to respond to new challenges and user expectations.

For example, we recently introduced a new account deletion policy with required disclosures within the Data Safety section on the Play Store. Deleting an account should be as easy as creating one, so the new policy requires developers to provide information and web resources that help users to manage their data and understand an app's deletion practices.

To help you build trust and design a user-friendly experience that helps meet our policy requirements, consider these 5 best practices when implementing your account deletion solution.

1. Make it seamless

Users prefer a simple and straightforward account deletion flow. Although users know that more steps may follow (such as authentication), navigating multiple screens before the deletion page can be a significant barrier and create negative feelings for the user. Consider providing your account deletion option on an account settings page, or place a prominent button on the home screen. Design the flow with discoverability in mind by taking the user directly to the deletion process.

2. Allow automatic deletion

Users feel that if they can create an account without talking to a customer service agent, they should be able to delete their account online, too. If automation is not on your roadmap just yet, consider a step-by-step deletion request form or a dedicated page to connect users with customer support.

3. Offer guidance and explain potential implications

Users delete their accounts for various reasons, some of which may be better resolved another way. Early in your deletion flow, point your users toward a Help Center article that explains how your deletion process works in simple terms, including any potential consequences. For example, make it clear if your users will need to pause their payment method before deleting the account, or download any account data they want to keep. Helping your users understand the process in advance can prevent accidental deletions. For those who do change their minds, consider offering a way to recover their accounts within a reasonable timeframe.

Here's an example of how Play Store Developer, Canva, has designed the in-app deletion flow to explain potential consequences of account deletion:

User journey on the Canva app

"User data privacy has always been important for us. We've always been intentional about our approach in optimizing the Canva app so our users can have more transparency and control over their data. We're welcoming these new requirements from the Play store as we know this new flow will elevate users' trust in our app and benefit our business in the long term." - Will Currie, Software Engineer, Canva

4. Confirm account deletion

Sometimes users misunderstand whether the account itself or just data collected by the app was deleted in the deletion process. Users often think that the data your app stored in the cloud will automatically be deleted at the same time as account deletion. Since it may take time to remove account data from internal company systems or comply with data retention requirements in different regions, transparency about the process can help you maintain trust in your brand and make it more likely for users to return in the future.

Here's how SYBO Games has designed their web-based account deletion flow:

User journey on the SYBO Games web resource

"We are always striving to ensure that our games provide a fun user experience, built on a solid data protection foundation. When we learned about the new account deletion update on Google Play, we thought this was a great step forward to ensure that the entire developer ecosystem optimizes for user safety. We encourage developers to carve out time to embrace these improvements and prioritize regular privacy check-ins." - Elizabeth Ann Schweitzer, Games Compliance Manager, SYBO Games

5. Don't forget user engagement

This is a great opportunity to connect with your users at a critical moment. Make sure users who have uninstalled your app can easily remove their accounts through a web resource without needing to reinstall the app. You can also invite them to complete a survey or provide feedback on their decision.

Protecting users' data is essential for building trust and loyalty. By updating the Data Safety section on Google Play and continuing to optimize user experience for account deletion, you can strengthen trust in your company while striving for the highest level of user data protection.


Thank you for your continued collaboration and feedback in developing this data transparency feature and in helping make Google Play safe for all.

06 Mar 2024 5:30pm GMT