16 Feb 2024

Android Developers Blog

The First Developer Preview of Android 15

Posted by Dave Burke, VP of Engineering


We're releasing the first Developer Preview of Android 15 today so you, our developers, can collaborate with us to build a better Android.

Android 15 continues our work to build a platform that helps improve your productivity while giving you new capabilities to produce superior media experiences, minimize battery impact, maximize smooth app performance, and protect user privacy and security, all on the most diverse lineup of devices out there.

Android enables your apps to take advantage of premium device hardware, including high-end camera capabilities, powerful GPUs, dazzling displays, and AI processing. The demand for large-screen devices, including tablets, foldables and flippables, continues to grow, offering an opportunity to reach high-value users. Also, Android is committed to providing tooling and libraries to help your apps take advantage of the latest advances in AI.

Your feedback on the Android 15 Developer Preview and QPR beta program plays a key role in helping Android continuously improve. The Android 15 developer site has more information about the preview, including downloads for Pixel and detailed documentation about changes. This preview is just the beginning, and we'll have lots more to share as we move through the release cycle. Thank you in advance for your help in making Android a platform that works for everyone.

Protecting user privacy and security

Android is constantly working to create solutions that maximize user privacy and security.

Privacy Sandbox on Android

Android 15 brings Android AD Services up to extension level 10, incorporating the latest version of the Privacy Sandbox on Android, part of our work to develop new technologies that improve user privacy and enable effective, personalized advertising experiences for mobile apps. Our website has more about the Privacy Sandbox on Android developer preview and beta programs to help you get started.

Health Connect

Android 15 integrates with Android 14 extensions 10 around Health Connect by Android, a secure and centralized platform to manage and share app-collected health and fitness data. This update adds support for new data types across fitness, nutrition, and more.
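
For orientation, here is a minimal sketch of how an app reads data through the Health Connect Jetpack client (androidx.health.connect:connect-client). It uses the long-established StepsRecord rather than any of the new data types, and permission handling is omitted.

import android.content.Context
import androidx.health.connect.client.HealthConnectClient
import androidx.health.connect.client.records.StepsRecord
import androidx.health.connect.client.request.ReadRecordsRequest
import androidx.health.connect.client.time.TimeRangeFilter
import java.time.Instant

// Sketch only: assumes the Health Connect read-steps permission has already
// been requested and granted; StepsRecord is just a familiar example.
suspend fun readSteps(context: Context, start: Instant, end: Instant): Long {
    val client = HealthConnectClient.getOrCreate(context)
    val response = client.readRecords(
        ReadRecordsRequest(
            recordType = StepsRecord::class,
            timeRangeFilter = TimeRangeFilter.between(start, end)
        )
    )
    // Sum the step counts of all records returned for the time window.
    return response.records.sumOf { it.count }
}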

File integrity

Android 15's FileIntegrityManager includes new APIs that tap into the power of the fs-verity feature in the Linux kernel. With fs-verity, files can be protected by custom cryptographic signatures, helping you ensure they haven't been tampered with or corrupted. This leads to enhanced security, protecting against potential malware or unauthorized file modifications that could compromise your app's functionality or data.
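
A minimal Kotlin sketch of how this could look; the setupFsVerity() and getFsVerityDigest() calls are taken from the preview API surface and should be treated as assumptions until the final Android 15 reference confirms them.

import android.content.Context
import android.security.FileIntegrityManager
import java.io.File

// Hedged sketch: setupFsVerity()/getFsVerityDigest() are new preview-level
// FileIntegrityManager APIs and may change before the stable release.
fun protectWithFsVerity(context: Context, file: File): ByteArray? {
    val fim = context.getSystemService(FileIntegrityManager::class.java) ?: return null

    // Enable fs-verity on a file this app has finished writing; subsequent
    // reads fail if the content is modified on disk.
    fim.setupFsVerity(file)

    // Fetch the fs-verity digest so it can be checked against a known-good
    // (e.g. server-signed) value before trusting the file.
    return fim.getFsVerityDigest(file)
}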

Partial screen sharing

Android 15 supports partial screen sharing so users can share or record just an app window rather than the entire device screen. This feature, enabled first in Android 14 QPR2, includes MediaProjection callbacks that allow your app to customize the partial screen sharing experience. Note that user consent is now required for each MediaProjection capture session.
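
As a sketch of the callback surface involved (the MediaProjection.Callback methods below were added in Android 14/API 34; the mediaProjection parameter is assumed to come from an already granted capture intent):

import android.media.projection.MediaProjection
import android.os.Handler
import android.os.Looper

// Sketch only: `mediaProjection` is assumed to be obtained from the result of
// MediaProjectionManager.createScreenCaptureIntent() after user consent.
fun observeCapturedWindow(mediaProjection: MediaProjection) {
    mediaProjection.registerCallback(object : MediaProjection.Callback() {
        override fun onCapturedContentResize(width: Int, height: Int) {
            // The shared app window changed size; resize the virtual display
            // or capture surface to match.
        }

        override fun onCapturedContentVisibilityChanged(isVisible: Boolean) {
            // The shared window was hidden or shown; pause the stream or show
            // a placeholder while it is not visible.
        }

        override fun onStop() {
            // The user or system ended the capture session; release resources.
        }
    }, Handler(Looper.getMainLooper()))
}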

Supporting creators

Android continues its work to give you access to tools and hardware that support creators in bringing their vision to life on Android.

In-app Camera Controls

Android 15 adds new extensions for more control over the camera hardware and its algorithms on supported devices:

  • Advanced flash strength adjustments enabling precise control of flash intensity in both SINGLE and TORCH modes while capturing images.
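
For the flash strength adjustment above, here is a hedged sketch against the Camera2 API; the key names (FLASH_TORCH_STRENGTH_MAX_LEVEL, FLASH_STRENGTH_LEVEL) are assumptions based on the preview API surface and may change before release.

import android.hardware.camera2.CameraCharacteristics
import android.hardware.camera2.CameraMetadata
import android.hardware.camera2.CaptureRequest

// Hedged sketch: the strength-level keys below are preview-level names and
// should be verified against the Android 15 DP1 reference.
fun applyTorchStrength(
    characteristics: CameraCharacteristics,
    builder: CaptureRequest.Builder,
    desiredLevel: Int
) {
    val maxLevel = characteristics.get(CameraCharacteristics.FLASH_TORCH_STRENGTH_MAX_LEVEL) ?: 1
    // A maximum level of 1 means the device does not support strength control.
    if (maxLevel > 1) {
        builder.set(CaptureRequest.FLASH_MODE, CameraMetadata.FLASH_MODE_TORCH)
        builder.set(CaptureRequest.FLASH_STRENGTH_LEVEL, desiredLevel.coerceIn(1, maxLevel))
    }
}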

Virtual MIDI 2.0 Devices

Android 13 added support for connecting to MIDI 2.0 devices via USB, which communicate using Universal MIDI Packets (UMP). Android 15 extends UMP support to virtual MIDI apps, enabling composition apps to control synthesizer apps as a virtual MIDI 2.0 device just like they would with a USB MIDI 2.0 device.
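
A small sketch of how an app might enumerate UMP-capable devices via MidiManager (the transport-based query has existed since Android 13/API 33; on Android 15 the result can also include virtual apps):

import android.content.Context
import android.media.midi.MidiDeviceInfo
import android.media.midi.MidiManager

// Sketch only: lists MIDI devices that speak Universal MIDI Packets (UMP),
// which with Android 15 can include virtual apps as well as USB hardware.
fun listUmpDevices(context: Context): List<MidiDeviceInfo> {
    val midiManager = context.getSystemService(MidiManager::class.java) ?: return emptyList()
    return midiManager
        .getDevicesForTransport(MidiManager.TRANSPORT_UNIVERSAL_MIDI_PACKETS)
        .filter { it.type == MidiDeviceInfo.TYPE_VIRTUAL || it.type == MidiDeviceInfo.TYPE_USB }
}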

Performance and quality

Android continues its focus on helping you improve the quality of your apps. Much of this focus is around tooling and libraries, including Jetpack Compose, Android Studio, and more.

Dynamic Performance

Android 15 continues our investment in the Android Dynamic Performance Framework (ADPF), a set of APIs that allow games and performance-intensive apps to interact more directly with the power and thermal systems of Android devices. On supported devices, Android 15 will add new ADPF capabilities:

  • A power-efficiency mode for hint sessions to indicate that their associated threads should prefer power saving over performance, great for long-running background workloads.
  • GPU and CPU work durations can both be reported in hint sessions, allowing the system to adjust CPU and GPU frequencies together to best meet workload demands.

To learn more about how to use ADPF in your apps and games, head over to the documentation.
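
As a reference point, here is a sketch of the established hint-session flow (available since API 31); the Android 15 additions listed above are extra calls on the same session object, so check the DP1 reference for their exact signatures.

import android.content.Context
import android.os.PerformanceHintManager
import android.os.Process

// Sketch of the base ADPF hint-session flow. The new Android 15 capabilities
// (power-efficiency preference, combined CPU/GPU durations) layer onto the
// same PerformanceHintManager.Session; their method names are not shown here
// to avoid guessing at preview APIs.
fun runWithHintSession(context: Context, targetWorkNanos: Long, work: () -> Unit) {
    val phm = context.getSystemService(PerformanceHintManager::class.java) ?: return
    val session = phm.createHintSession(intArrayOf(Process.myTid()), targetWorkNanos) ?: return

    val start = System.nanoTime()
    work() // the performance-critical workload running on this thread

    // Report how long the work actually took so the platform can steer CPU
    // (and, with Android 15, GPU) frequencies toward the target duration.
    session.reportActualWorkDuration(System.nanoTime() - start)
}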

Developer Productivity

Android 15 continues to add OpenJDK APIs, including quality-of-life improvements around NIO buffers, streams, security, and more. These APIs are updated on over a billion devices running Android 12+ through Google Play System updates, so you can target the latest programming features.

App compatibility

Image of Android 15 Development timeline, indicating we are on time with Developer Previews in February


To give you more time to plan for app compatibility work, we're letting you know our Platform Stability milestone well in advance.

At this milestone, we'll deliver final SDK/NDK APIs and also final internal APIs and app-facing system behaviors. We're expecting to reach Platform Stability in June 2024, and from that time you'll have several months before the official release to do your final testing. The release timeline details are here.

Get started with Android 15

The Developer Preview has everything you need to try the Android 15 features, test your apps, and give us feedback. You can get started today by flashing a system image onto a Pixel 6, 7, or 8 series device, as well as the Pixel Fold and Pixel Tablet. If you don't have a Pixel device, you can use the 64-bit system images with the Android Emulator in Android Studio.

For the best development experience with Android 15, we recommend that you use the latest preview of Android Studio Jellyfish (or more recent Jellyfish+ versions). Once you're set up, here are some of the things you should do:

  • Try the new features and APIs - your feedback is critical during the early part of the developer preview. Report issues in our tracker on the feedback page.
  • Test your current app for compatibility - learn whether your app is affected by changes in Android 15; install your app onto a device or emulator running Android 15 and extensively test it.

We'll update the preview system images and SDK regularly throughout the Android 15 release cycle. This initial preview release is for developers only and not intended for daily or consumer use, so we're making it available by manual download only. Once you've manually installed a preview build, you'll automatically get future updates over-the-air for all later previews and Betas. Read more here.

If you intend to move from the Android 14 QPR Beta program to the Android 15 Developer Preview program and don't want to have to wipe your device, we recommend that you move to Developer Preview 1 now. Otherwise you may run into periods where the Android 14 Beta has a more recent build date, which would prevent you from going directly to the Android 15 Developer Preview without a data wipe.

As we reach our Beta releases, we'll be inviting consumers to try Android 15 as well, and we'll open up enrollment for the Android Beta program at that time. For now, please note that the Android Beta program is not yet available for Android 15.

For complete information, visit the Android 15 developer site.


Java and OpenJDK are trademarks or registered trademarks of Oracle and/or its affiliates.

16 Feb 2024 7:00pm GMT

14 Feb 2024

Android Developers Blog

#WeArePlay | How two sea turtle enthusiasts are revolutionizing marine conservation

Posted by Leticia Lago - Developer Marketing

When environmental science student Caitlin returned home from a trip monitoring sea turtles in Western Australia, she was inspired to create a conservation tool that could improve tracking of the species. She connected with a French developer and fellow marine life enthusiast, Nicolas, to design their app We Spot Turtles!, which allows anyone to support tracking efforts by uploading pictures of turtles they spot in the wild.

Caitlin and Nicolas shared their journey in our latest film for #WeArePlay, which showcases the amazing stories behind apps and games on Google Play. We caught up with the pair to find out more about their passion and how they are making strides towards advancing sea turtle conservation.

Tell us about how you both got interested in sea turtle conservation?

Caitlin: A few years ago, I did a sea turtle monitoring program for the Department of Biodiversity, Conservation and Attractions in Western Australia. It was probably one of the most magical experiences of my life. After that, I decided I only really wanted to work with sea turtles.

Nicolas: In 2010, in French Polynesia, I volunteered with a sea turtle protection project. I was moved by the experience, and when I came back to France, I knew I wanted to use my tech background to create something inspired by the trip.

How did these experiences lead you to create We Spot Turtles!?

Caitlin: There are seven species of sea turtle, and all are critically endangered. Or rather there's not enough data on them to inform an accurate endangerment status. This means the needs of the species are going unmet and sea turtles are silently going extinct. Our inspiration is essentially to better track sea turtles so that conservation can be improved.

Nicolas: When I returned to France after monitoring sea turtles, I knew I wanted to make an app inspired by my experience. However, I had put the project on hold for a while. Then, when a friend sent me Caitlin's social media post looking for a developer for a sea turtle conservation app, it re-ignited my inspiration, and we teamed up to make it together.

close up image of a turtle resting in a reef underwater

What does We Spot Turtles! do?

Caitlin: Essentially, members of the public upload images of sea turtles they spot - and even get to name them. Then, the app automatically geolocates, giving us a date and timestamp of when and where the sea turtle was located. This allows us to track turtles and improve our conservation efforts.

How do you use artificial intelligence in the app?

Caitlin: The advancements in AI in recent years have given us the opportunity to make a bigger impact than we would have been able to otherwise. The machine learning model that Nicolas created uses the facial scale and pigmentations of the turtles to not only identify its species, but also to give that sea turtle a unique code for tracking purposes. Then, if it is photographed by someone else in the future, we can see on the app where it's been spotted before.

How has Google Play supported your journey?

Caitlin: Launching our app on Google Play has allowed us to reach a global audience. We now have communities in Exmouth in Western Australia, Manly Beach in Sydney, and have 6 countries in total using our app already. Without Google Play, we wouldn't have the ability to connect on such a global scale.

Nicolas: I'm a mobile application developer and I use Google's Flutter framework. I knew Google Play was a good place to release our title as it easily allows us to work on the platform. As a result, we've been able to make the app great.

Photo of Caitlin and Nicolas on the beach in Australia at sunset. Both are kneeling in the sand. Caitlin is using her phone to identify something in the distance, and gesturing to Nicolas, who is looking in the same direction

What do you hope to achieve with We Spot Turtles!?

Caitlin: We Spot Turtles! puts data collection in the hands of the people. It's giving everyone the opportunity to make an impact in sea turtle conservation. Because of this, we believe that we can massively alter and redefine conservation efforts and enhance people's engagement with the natural world.

What are your plans for the future?

Caitlin: Nicolas and I have some big plans. We want to branch out into other species. We'd love to do whale sharks, birds, and red pandas. Ultimately, we want to achieve our goal of improving the conservation of various species and animals around the world.


Discover other inspiring app and game founders featured in #WeArePlay.




14 Feb 2024 5:00pm GMT

13 Feb 2024

Android Developers Blog

Cloud photos now available in the Android photo picker

Posted by Roxanna Aliabadi Walker - Product Manager

Available now with Google Photos

Our photo picker has always been the gateway to your local media library, providing a secure, date-sorted interface for users to grant apps access to selected images and videos. But now, we're taking it a step further by integrating cloud photos from your chosen cloud media app directly into the photo picker experience.

Moving image of the photo picker access

Unifying your media library

Backed-up photos, also known as "cloud photos," will now be merged with your local ones in the photo picker, eliminating the need to switch between apps. Additionally, any albums you've created in your cloud storage app will be readily accessible within the photo picker's albums tab. If your cloud media provider has a concept of "favorites," those will be showcased prominently within the albums tab of the photo picker for easy access. This feature is currently rolling out with the February Google System Update to devices running Android 12 and above.

Available now with Google Photos, but open to all

Google Photos is already supporting this new feature, and our APIs are open to any cloud media app that qualifies for our pilot program. Our goal is to make accessing your lifetime of memories effortless, regardless of the app you prefer.

The Android photo picker will attempt to auto-select a cloud media app for you, but you can change or remove your selected cloud media app at any time from photo picker settings.

Image of Cloud media settings in photo picker settings

Migrate today for an enhanced, frictionless experience

The Android photo picker substantially reduces friction by not requiring any runtime permissions. If you switch from using a custom photo picker to the Android photo picker, you can offer this enhanced experience with cloud photos to your users, as well as reduce or entirely eliminate the overhead involved with acquiring and managing access to photos on the device. (Note that apps that do not need persistent and/or broad-scale access to photos - for example, to set a profile picture - must adopt the Android photo picker in lieu of any sensitive file permissions to adhere to Google Play policy.)

The photo picker has been backported to Android 4.4 to make it easy to migrate without needing to worry about device compatibility. Access to cloud content will only be available for users running Android 12 and higher, but developers do not need to consider this when implementing the photo picker into their apps. To use the photo picker in your app, update the androidx.activity dependency to version 1.7.x or above and add the following code snippet:

import android.util.Log
import androidx.activity.result.PickVisualMediaRequest
import androidx.activity.result.contract.ActivityResultContracts.PickVisualMedia

// Inside your ComponentActivity or Fragment:
// Registers a photo picker activity launcher in single-select mode.
val pickMedia = registerForActivityResult(PickVisualMedia()) { uri ->
    // Callback is invoked after the user selects a media item or closes the
    // photo picker.
    if (uri != null) {
        Log.d("PhotoPicker", "Selected URI: $uri")
    } else {
        Log.d("PhotoPicker", "No media selected")
    }
}


// Launch the photo picker and let the user choose images and videos.
pickMedia.launch(PickVisualMediaRequest(PickVisualMedia.ImageAndVideo))

// Launch the photo picker and let the user choose only images.
pickMedia.launch(PickVisualMediaRequest(PickVisualMedia.ImageOnly))

// Launch the photo picker and let the user choose only videos.
pickMedia.launch(PickVisualMediaRequest(PickVisualMedia.VideoOnly))
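
If you need more than one item, here is a minimal sketch of the multi-select variant, assuming ActivityResultContracts.PickMultipleVisualMedia is imported alongside the contracts above (the limit of 5 is just an illustrative value):

// Registers a photo picker launcher in multi-select mode (here capped at 5 items).
val pickMultipleMedia = registerForActivityResult(PickMultipleVisualMedia(5)) { uris ->
    if (uris.isNotEmpty()) {
        Log.d("PhotoPicker", "Number of items selected: ${uris.size}")
    } else {
        Log.d("PhotoPicker", "No media selected")
    }
}

// Launch the photo picker and let the user choose multiple images and videos.
pickMultipleMedia.launch(PickVisualMediaRequest(PickVisualMedia.ImageAndVideo))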

More customization options are listed in our developer documentation.

13 Feb 2024 5:00pm GMT

02 Feb 2024

Planet Maemo

libSDL2 and VVVVVV for the Wii

Just a quick update on something that I've been working on in my free time.

I recently refurbished my old Nintendo Wii, and for some reason I cannot yet explain (not even to myself) I got into homebrew programming and decided to port libSDL (the 2.x version -- a 1.x port already existed) to it. The result of this work is already available via the devkitPro distribution, and although I'm sure there are still many bugs, it's overall quite usable.

In order to prove it, I ported the game VVVVVV to the Wii:

During the process I had to fix quite a few bugs in my libSDL port and in a couple of other libraries used by VVVVVV, which will hopefully make it easier to port more games. There's still an issue that bothers me, where the screen resolution seems not to be fully supported by my TV (or is it the HDMI adaptor's fault?), resulting in a few pixels being cut off at the top and at the bottom of the screen. But unless you are a perfectionist, it's a minor issue.

In case you have a Wii to spare, or wouldn't mind playing on the Dolphin emulator, here's the link to the VVVVVV release. Have fun! :-)


02 Feb 2024 5:50pm GMT

30 Nov 2023

Planet Maemo

Maemo Community e.V. - Invitation to the General Assembly 2023


Dear Member,

The meeting will be held on Friday, December 29th 2023 at 12:00 CET on irc.libera.chat channel #maemo-meeting.

Unless any further issues are raised, the agenda includes the following topics:
1. Welcome by the Chairman of the Board
2. Determination of the proper convocation and the quorum of the General Assembly
3. Acceptance of the annual report for the fiscal year and actions of the Executive
6. Any other business

Requests for additions to the agenda must be submitted to the Board in writing one week prior to the meeting (§ 9.2 of the Statutes).

On Behalf of the Maemo Council, Jussi Ohenoja


30 Nov 2023 8:52am GMT

15 Nov 2023

Planet Maemo

stb_image_resize2.h – performance

Recently there was a large rework of the STB single-file image_resize library (STBIR), bumping it to 2.0. While v1 was really slow and only really usable if you needed to quickly get some code running, the 2.0 rewrite claims to be more considerate of performance by using SIMD. So let's put it to the test.

As references, I chose the moderately optimized C only implementation of Ogre3D and the highly optimized SIMD implementation in OpenCV.

Below you find the times to scale a 1024x1024px byte image to 512x512px. All libraries were set to linear interpolation. The time is the accumulated time for 200 runs.

               RGB      RGBA
Ogre3D 14.1.2  660 ms   668 ms
STBIR 2.01     632 ms   690 ms
OpenCV 4.8     245 ms   254 ms

For the RGBA test, STBIR was set to the STBIR_4CHANNEL pixel layout. All libraries were compiled with -O2 -msse. Additionally, OpenCV could dispatch AVX2 code. Enabling AVX2 with STBIR actually decreased performance.

Note that while STBIR has no performance advantage over a C only implementation for the simple resizing case, it offers some neat features if you want to handle SRGB data or non-premultiplied alpha.


15 Nov 2023 1:50pm GMT

18 Sep 2022

Planet Openmoko

Harald "LaF0rge" Welte: Deployment of future community TDMoIP hub

I've mentioned some of my various retronetworking projects in some past blog posts. One of those projects is Osmocom Community TDM over IP (OCTOI). During the past 5 or so months, we have been using a number of GPS-synchronized open source icE1usb devices interconnected by a new, efficient but still transparent TDMoIP protocol in order to run a distributed TDM/PDH network. This network is currently only used to provide ISDN services to retronetworking enthusiasts, but other uses like frame relay have also been validated.

So far, the central hub of this OCTOI network has been operating in the basement of my home, behind a consumer-grade DOCSIS cable modem connection. Given that TDMoIP is relatively sensitive to packet loss, this has been sub-optimal.

Luckily some of my old friends at noris.net have agreed to host a new OCTOI hub free of charge in one of their ultra-reliable co-location data centres. I've been hosting some other machines there for 20+ years, and noris.net is a good fit given that they were - in their early days as an ISP - the driving force in the early 90s behind one of the Linux kernel ISDN stacks called u-isdn. So after many decades, ISDN returns to them in a very different way.

Side note: In case you're curious, a reconstructed partial release history of the u-isdn code can be found on gitea.osmocom.org

But I digress. So today we installed this new OCTOI hub setup. It had been prepared for several weeks in advance, and the hub contains two circuit boards designed solely for this use case. The most difficult challenge was the fact that this data centre has no existing GPS RF distribution, and the rack is ~100m of CAT5 cable (no fiber!) away from the roof. So we faced the challenge of passing the 1PPS (1 pulse per second) signal reliably through several stages of lightning/over-voltage protection into the icE1usb, whose internal GPS-DO serves as the grandmaster clock for the TDM network.

The equipment deployed in this installation currently contains:

For more details, see this wiki page and this ticket.

Now that the physical deployment has been made, the next steps will be to migrate all the TDMoIP links from the existing user base over to the new hub. We hope the reliability and performance will be much better than behind DOCSIS.

In any case, this new setup for sure has a lot of capacity to connect many more users to this network. At this point we can still only offer E1 PRI interfaces. I expect that at some point during the coming winter the project for remote TDMoIP BRI (S/T, S0-Bus) connectivity will become available.

Acknowledgements

I'd like to thank anyone helping this effort, specifically:

  • Sylvain "tnt" Munaut for his work on the RS422 interface board (+ gateware/firmware)
  • noris.net for sponsoring the co-location
  • sysmocom for sponsoring the EPYC server hardware

18 Sep 2022 10:00pm GMT

08 Sep 2022

Planet Openmoko

Harald "LaF0rge" Welte: Progress on the ITU-T V5 access network front

Almost one year after my post regarding first steps towards a V5 implementation, some friends and I were finally able to visit Wobcom, a small German city carrier, and pick up a lot of decommissioned POTS/ISDN/PDH/SDH equipment, primarily V5 access networks.

This means that a number of retronetworking enthusiasts now have a chance to play with Siemens Fastlink, Nokia EKSOS and DeTeWe ALIAN access networks/multiplexers.

My primary interest is in Nokia EKSOS, which looks like a rather easy, low-complexity target. As one of the first steps, I took PCB photographs of the various modules/cards in the shelf, took note of the main chip designations, and started to search for the related data sheets.

The results can be found in the Osmocom retronetworking wiki, with https://osmocom.org/projects/retronetworking/wiki/Nokia_EKSOS being the main entry page, and sub-pages about

In short: Unsurprisingly, a lot of Infineon analog and digital ICs for the POTS and ISDN ports, as well as a number of Motorola M68k based QUICC32 microprocessors and several unknown ASICs.

So with V5 hardware at my disposal, I've slowly re-started my efforts to implement the LE (local exchange) side of the V5 protocol stack, with the goal of eventually being able to interface those V5 AN with the Osmocom Community TDM over IP network. Once that is in place, we should also be able to offer real ISDN Uk0 (BRI) and POTS lines at retrocomputing events or hacker camps in the coming years.

08 Sep 2022 10:00pm GMT

Harald "LaF0rge" Welte: Clock sync trouble with Digium cards and timing cables

If you have ever worked with Digium (now part of Sangoma) digital telephony interface cards such as the TE110/410/420/820 (single to octal E1/T1/J1 PRI cards), you will probably have seen that they always have a timing connector, where the timing information can be passed from one card to another.

In PDH/ISDN (or even SDH) networks, it is very important to have a synchronized clock across the network. If the clocks are drifting, there will be underruns or overruns, with associated phase jumps that are particularly dangerous when analog modem calls are transported.

In traditional ISDN use cases, the clock is always provided by the network operator, and any customer/user side equipment is expected to synchronize to that clock.

So this Digium timing cable is needed in applications where you have more PRI lines than possible with one card, but only a subset of your lines (spans) are connected to the public operator. The timing cable should make sure that the clock received on one port from the public operator is used as the transmit bit-clock on all of the other ports, no matter on which card.

Unfortunately this decades-old Digium timing cable approach seems to suffer from some problems.

bursty bit clock changes until link is up

The first problem is that the downstream ports' transmit bit clock was jumping around in bursts every two or so seconds. You can see an oscillogram of the E1 master signal (yellow) received by one TE820 card and the transmit of the slave ports on the other card at https://people.osmocom.org/laforge/photos/te820_timingcable_problem.mp4

As you can see, for some seconds the two clocks seem to be in perfect lock/sync, but in between there are periods of immense clock drift.

What I'd have expected is the behavior that can be seen at https://people.osmocom.org/laforge/photos/te820_notimingcable_loopback.mp4 - which shows a similar setup but without the use of a timing cable: Both the master clock input and the clock output were connected on the same TE820 card.

As I found out much later, this problem only occurs until any of the downstream/slave ports is fully OK/GREEN.

This is surprising, as any other E1 equipment I've seen always transmits at a constant bit clock irrespective of whether there's any signal in the opposite direction, and irrespective of whether any other ports are up/aligned or not.

But ok, once you adjust your expectations to this Digium peculiarity, you can actually proceed.

clock drift between master and slave cards

Once any of the spans of a slave card on the timing bus are fully aligned, the transmit bit clocks of all of its ports appear to be in sync/lock - yay - but unfortunately only at the very first glance.

When looking at it for more than a few seconds, one can see a slow, continuous drift of the slave bit clocks compared to the master :(

Some initial measurements show that the clock of the slave card of the timing cable is drifting at about 12.5 ppb (parts per billion) when compared against the master clock reference.

This is rather disappointing, given that the whole point of a timing cable is to ensure you have one reference clock with all signals locked to it.

The work-around

If you are willing to sacrifice one port (span) of each card, you can work around that slow-clock-drift issue by connecting an external loopback cable. So the master card is configured to use the clock provided by the upstream provider. Its other ports (spans) will transmit at the exact recovered clock rate with no drift. You can use any of those ports to provide the clock reference to a port on the slave card using an external loopback cable.

In this setup, your slave card[s] will have perfect bit clock sync/lock.

It's just rather sad that you need to sacrifice ports just to achieve proper clock sync - something that the timing connectors and cables claim to do, but in reality don't achieve, at least not in my setup with the most modern and high-end octal-port PCIe cards (TE820).

08 Sep 2022 10:00pm GMT