02 Feb 2026

TalkAndroid

You won’t believe these 12 gripping Harlan Coben series are all on Netflix

If you thought Harlan Coben's world of twisty mysteries was reserved for just a few special nights a…

02 Feb 2026 7:30am GMT

01 Feb 2026

TalkAndroid

NASA reveals: These sci-fi movies are shockingly close to real science

Space travel, alien dialogues, fiddling with human DNA: science fiction loves to shatter the limits of the impossible. Yet,…

01 Feb 2026 4:30pm GMT

All-electric city car under €20,000: what this shock move means for drivers

Think the idea of driving off in a sparkling new all-electric city car for less than €20,000 sounds…

01 Feb 2026 7:30am GMT

29 Jan 2026

Android Developers Blog

Accelerating your insights with faster, smarter monetization data and recommendations

Posted by Phalene Gowling, Product Manager, Google Play

To build a thriving business on Google Play, you need more than just data - you need a clear path to action. Today, we're announcing a suite of upgrades to the Google Play Console and beyond, giving you greater visibility into your financial performance and specific, data-backed steps to improve it.

From new, actionable recommendations to more granular sales reporting, here's how we're helping you maximize your ROI.

New: Monetization insights and recommendations
Launch Status: Rolling out today

The Monetize with Play overview page is designed to be your ultimate command center. Today, we are upgrading it with a new dynamic insights section that gives you a clearer view of your revenue drivers.


This new insights carousel highlights the visible and invisible value Google Play delivers to your bottom line - including recovered revenue. You can now track these critical signals alongside your core performance metrics.

But we aren't just showing you the data - we're helping you act on it. Starting today, Play Console will surface customized, actionable recommendations. If there are relevant opportunities - for example, a high churn rate - we will suggest specific, high-impact steps to help you reach your next monetization goal. Recommendations include effort levels and estimated ROI (where available), helping you prioritize your roadmap based on actual business value. Learn more.



Granular visibility: Sales Channel reporting
Launch Status: Recently launched

We recently rolled out new Sales Channel data in your financial reporting. This allows you to attribute revenue to specific surfaces - including your app, the Play Store, and platforms like Google Play Games on PC.

For native-PC game developers and media & entertainment subscription businesses alike, this granularity allows you to calculate the precise ROI of your cross-platform investments and understand exactly which channels are driving your growth. Learn more.



Operational efficiency: The Orders API
Launch Status: Available now

The Orders API provides programmatic access to one-time and recurring order transaction details. If you haven't integrated it yet, this API allows you to ingest real-time data directly into your internal dashboards for faster reconciliation and improved customer support.
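For teams starting that integration, the sketch below shows the general shape of pulling one order into an internal system from Kotlin. It is a hedged illustration: the endpoint path follows Android Publisher API v3 conventions but should be verified against the Orders API reference, and token handling is reduced to a plain string.

import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

// Sketch: fetch one order's transaction details for a reconciliation dashboard.
// The endpoint path below is an assumption based on Android Publisher API v3
// conventions -- verify it against the official Orders API reference.
fun fetchOrder(packageName: String, orderId: String, accessToken: String): String {
    val uri = URI.create(
        "https://androidpublisher.googleapis.com/androidpublisher/v3" +
            "/applications/$packageName/orders/$orderId"
    )
    val request = HttpRequest.newBuilder(uri)
        .header("Authorization", "Bearer $accessToken")
        .GET()
        .build()
    val response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString())
    return response.body() // JSON describing the one-time or recurring transaction
}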

Feedback so far has been overwhelmingly positive:

Level Infinite (Tencent) says the API "works so well that we want every app to use it."

Continuous improvements towards objective-led reporting

You've told us that the biggest challenge isn't just accessing data, but connecting the dots across different metrics to see the full picture. We're enhancing reporting that goes beyond data dumps to provide straightforward, actionable insights that help you reach business objectives faster.

Our goal is to create a more cohesive product experience centered around your objectives. By shifting from static reporting to dynamic, goal-oriented tools, we're making it easier to track and optimize for revenue, conversion rates, and churn. These updates are just the beginning of a transformation designed to help you turn data into measurable growth.

29 Jan 2026 5:00pm GMT

28 Jan 2026

Android Developers Blog

How Automated Prompt Optimization Unlocks Quality Gains for ML Kit’s GenAI Prompt API

Posted by Chetan Tekur, PM at AI Innovation and Research, Chao Zhao, SWE at AI Innovation and Research, Paul Zhou, Prompt Quality Lead at GCP Cloud AI and Industry Solutions, and Caren Chang, Developer Relations Engineer at Android


To further help bring your ML Kit Prompt API use cases to production, we are excited to announce Automated Prompt Optimization (APO) targeting On-Device models on Vertex AI. Automated Prompt Optimization is a tool that helps you automatically find the optimal prompt for your use cases.

The era of On-Device AI is no longer a promise; it is a production reality. With the release of Gemini Nano v3, we are placing unprecedented language understanding and multimodal capabilities directly into the palms of users. Through the Gemini Nano family of models, we have wide coverage of supported devices across the Android ecosystem. But for developers building the next generation of intelligent apps, access to a powerful model is only step one. The real challenge lies in customization: how do you tailor a foundation model to expert-level performance for your specific use case without breaking the constraints of mobile hardware?

In the server-side world, the larger LLMs tend to be highly capable and require less domain adaptation. Even when adaptation is needed, more advanced techniques such as LoRA (Low-Rank Adaptation) fine-tuning are feasible. However, the unique architecture of Android AICore prioritizes a shared, memory-efficient system model. This means that deploying custom LoRA adapters for every individual app comes with challenges on these shared system services.

But there is an alternate path that can be equally impactful. By leveraging Automated Prompt Optimization (APO) on Vertex AI, developers can achieve quality approaching that of fine-tuning, all while working seamlessly within the native Android execution environment. By focusing on superior system instructions, APO enables developers to tailor model behavior with greater robustness and scalability than traditional fine-tuning solutions.

Note: Gemini Nano V3 is a quality-optimized version of the highly acclaimed Gemma 3N model. Any prompt optimizations that are made on the open source Gemma 3N model will apply to Gemini Nano V3 as well. On supported devices, ML Kit GenAI APIs leverage the nano-v3 model to maximize quality for Android developers.
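In practice, this means you can run APO offline against the open Gemma 3N checkpoint and ship only the winning system instruction as a string in your app. A minimal sketch of that hand-off follows; note that OnDeviceModel is a hypothetical stand-in for the ML Kit Prompt API client, whose real surface you should take from the official documentation.

// Hypothetical stand-in for the ML Kit Prompt API client (illustration only;
// use the real client from the official Prompt API documentation).
interface OnDeviceModel {
    suspend fun generate(systemInstruction: String, userInput: String): String
}

// The APO-optimized instruction, produced offline against Gemma 3N, ships as a
// plain string -- no per-app model weights or adapters need to be deployed.
const val OPTIMIZED_INSTRUCTION =
    "You are a news-topic classifier. Answer with exactly one of: " +
        "finance, sports, politics, technology, entertainment."

suspend fun classifyArticle(model: OnDeviceModel, article: String): String =
    model.generate(OPTIMIZED_INSTRUCTION, article).trim()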


Automated Prompt Optimization (APO)


APO treats the prompt not as static text, but as a programmable surface that can be optimized. It leverages server-side models (like Gemini Pro and Flash) to propose prompts, evaluate variations, and find the optimal one for your specific task. This process employs three specific technical mechanisms to maximize performance:

  1. Automated Error Analysis: APO analyzes error patterns from training data to automatically identify specific weaknesses in the initial prompt.

  2. Semantic Instruction Distillation: It analyzes large volumes of training examples to distill the "true intent" of a task, creating instructions that more accurately reflect the real data distribution.

  3. Parallel Candidate Testing: Instead of testing one idea at a time, APO generates and tests numerous prompt candidates in parallel to identify the global maximum for quality.
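
Conceptually, these three mechanisms combine into a propose-evaluate-select loop. The sketch below is purely illustrative and is not the Vertex AI API surface: proposeCandidates and evaluate are stubs standing in for the server-side proposer and evaluator models.

// Purely illustrative sketch of the APO loop -- not the Vertex AI API surface.
data class Example(val input: String, val expected: String)

// Stubs standing in for the server-side proposer and evaluator models:
fun proposeCandidates(prompt: String, errors: List<Example>, n: Int): List<String> = TODO()
fun evaluate(prompt: String, examples: List<Example>): Double = TODO()

fun optimizePrompt(initialPrompt: String, trainingSet: List<Example>, rounds: Int = 5): String {
    var best = initialPrompt
    var bestScore = evaluate(best, trainingSet)
    repeat(rounds) {
        // 1. Automated error analysis: collect the examples the current prompt fails on
        val errors = trainingSet.filter { evaluate(best, listOf(it)) < 1.0 }
        // 2. Semantic instruction distillation: propose rewrites targeting those errors
        val candidates = proposeCandidates(best, errors, n = 8)
        // 3. Parallel candidate testing: score every candidate, keep the best scorer
        val (winner, score) = candidates
            .map { it to evaluate(it, trainingSet) }
            .maxByOrNull { it.second } ?: return@repeat
        if (score > bestScore) {
            best = winner
            bestScore = score
        }
    }
    return best
}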


Why APO Can Approach Fine Tuning Quality

It is a common misconception that fine-tuning always yields better quality than prompting. For modern foundation models like Gemini Nano v3, prompt engineering can be impactful by itself:

  • Preserving general capabilities: Fine-tuning (PEFT/LoRA) forces a model's weights to over-index on a specific distribution of data. This often leads to "catastrophic forgetting," where the model gets better at your specific syntax but worse at general logic and safety. APO leaves the weights untouched, preserving the capabilities of the base model.

  • Instruction Following & Strategy Discovery: Gemini Nano v3 has been rigorously trained to follow complex system instructions. APO exploits this by finding the exact instruction structure that unlocks the model's latent capabilities, often discovering strategies that might be hard for human engineers to find.

To validate this approach, we evaluated APO across diverse production workloads. Our validation has shown consistent 5-8% accuracy gains across various use cases. Across multiple deployed on-device features, APO provided significant quality lifts:



Use Case              | Task Type           | Task Description                                                   | Metric   | APO Improvement
Topic classification  | Text classification | Classify a news article into topics such as finance, sports, etc. | Accuracy | +5%
Intent classification | Text classification | Classify a customer service query into intents                     | Accuracy | +8.0%
Webpage translation   | Text translation    | Translate a webpage from English to a local language               | BLEU     | +8.57%

Conclusion

The release of Automated Prompt Optimization (APO) marks a turning point for on-device generative AI. By bridging the gap between foundation models and expert-level performance, we are giving developers the tools to build more robust mobile applications. Whether you are just starting with Zero-Shot Optimization or scaling to production with Data-Driven refinement, the path to high-quality on-device intelligence is now clearer. Launch your on-device use cases to production today with ML Kit's Prompt API and Vertex AI's Automated Prompt Optimization.

28 Jan 2026 5:00pm GMT

27 Jan 2026

Android Developers Blog

The Embedded Photo Picker

Posted by Roxanna Aliabadi Walker, Product Manager and Yacine Rezgui, Developer Relations Engineer

The Embedded Photo Picker: A more seamless way to privately request photos and videos in your app



Get ready to enhance your app's user experience with an exciting new way to use the Android photo picker! The new embedded photo picker offers a seamless and privacy-focused way for users to select photos and videos, right within your app's interface. Now your app can get all the same benefits available with the photo picker, including access to cloud content, integrated directly into your app's experience.

Why embedded?

We understand that many apps want to provide a highly integrated and seamless experience for users when selecting photos or videos. The embedded photo picker is designed to do just that, allowing users to quickly access their recent photos without ever leaving your app. They can also explore their full library in their preferred cloud media provider (e.g., Google Photos), including favorites, albums and search functionality. This eliminates the need for users to switch between apps or worry about whether the photo they want is stored locally or in the cloud.

Seamless integration, enhanced privacy

With the embedded photo picker, your app doesn't need access to the user's photos or videos until they actually select something. This means greater privacy for your users and a more streamlined experience. Plus, the embedded photo picker provides users with access to their entire cloud-based media library, whereas the standard photo permission is restricted to local files only.
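
Once the user makes a selection, your app receives content URIs it has been granted access to, and reads them through ContentResolver as usual. A minimal sketch:

import android.content.Context
import android.graphics.Bitmap
import android.graphics.BitmapFactory
import android.net.Uri

// Minimal sketch: decode a photo the user selected in the embedded picker.
// The Uri comes from the picker's selection callback; no storage permission is
// needed because access was granted for this specific item only.
fun loadSelectedPhoto(context: Context, uri: Uri): Bitmap? =
    context.contentResolver.openInputStream(uri)?.use { stream ->
        BitmapFactory.decodeStream(stream)
    }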

The embedded photo picker in Google Messages

Google Messages showcases the power of the embedded photo picker. Here's how they've integrated it:
  • Intuitive placement: The photo picker sits right below the camera button, giving users a clear choice between capturing a new photo or selecting an existing one.
  • Dynamic preview: Immediately after a user taps a photo, they see a large preview, making it easy to confirm their selection. If they deselect the photo, the preview disappears, keeping the experience clean and uncluttered.
  • Expand for more content: The initial view is simplified, offering easy access to recent photos. However, users can easily expand the photo picker to browse and choose from all photos and videos in their library, including cloud content from Google Photos.
  • Respecting user choices: The embedded photo picker only grants access to the specific photos or videos the user selects, meaning they can stop requesting the photo and video permissions altogether. This also saves Messages from needing to handle situations where users only grant limited access to photos and videos.


Implementation

Integrating the embedded photo picker is made easy with the Photo Picker Jetpack library.

Jetpack Compose

First, include the Jetpack Photo Picker library as a dependency.

implementation("androidx.photopicker:photopicker-compose:1.0.0-alpha01")

The EmbeddedPhotoPicker composable function provides a mechanism to include the embedded photo picker UI directly within your Compose screen. This composable creates a SurfaceView which hosts the embedded photo picker UI. It manages the connection to the EmbeddedPhotoPicker service, handles user interactions, and communicates selected media URIs to the calling application.

@Composable
fun EmbeddedPhotoPickerDemo() {
    // We keep track of the list of selected attachments
    var attachments by remember { mutableStateOf(emptyList<Uri>()) }

    val coroutineScope = rememberCoroutineScope()
    // We hide the bottom sheet by default but we show it when the user clicks on the button
    val scaffoldState = rememberBottomSheetScaffoldState(
        bottomSheetState = rememberStandardBottomSheetState(
            initialValue = SheetValue.Hidden,
            skipHiddenState = false
        )
    )

    // Customize the embedded photo picker
    val photoPickerInfo = EmbeddedPhotoPickerFeatureInfo
        .Builder()
        // Limit the selection to 5 items
        .setMaxSelectionLimit(5)
        // Order the items selection (each item will have an index visible in the photo picker)
        .setOrderedSelection(true)
        // Set the accent color (red in this case, otherwise it follows the device's accent color)
        .setAccentColor(0xFF0000)
        .build()

    // The embedded photo picker state will be stored in this variable
    val photoPickerState = rememberEmbeddedPhotoPickerState(
        onSelectionComplete = {
            coroutineScope.launch {
                // Hide the bottom sheet once the user has clicked on the done button inside the picker
                scaffoldState.bottomSheetState.hide()
            }
        },
        onUriPermissionGranted = {
            // We update our list of attachments with the new Uris granted
            attachments += it
        },
        onUriPermissionRevoked = {
            // We update our list of attachments with the Uris revoked
            attachments -= it
        }
    )

    SideEffect {
        val isExpanded = scaffoldState.bottomSheetState.targetValue == SheetValue.Expanded

        // We show/hide the embedded photo picker to match the bottom sheet state
        photoPickerState.setCurrentExpanded(isExpanded)
    }

    BottomSheetScaffold(
        topBar = {
            TopAppBar(title = { Text("Embedded Photo Picker demo") })
        },
        scaffoldState = scaffoldState,
        sheetPeekHeight = if (scaffoldState.bottomSheetState.isVisible) 400.dp else 0.dp,
        sheetContent = {
            Column(Modifier.fillMaxWidth()) {
                // We render the embedded photo picker inside the bottom sheet
                EmbeddedPhotoPicker(
                    state = photoPickerState,
                    embeddedPhotoPickerFeatureInfo = photoPickerInfo
                )
            }
        }
    ) { innerPadding ->
        Column(Modifier.padding(innerPadding).fillMaxSize().padding(horizontal = 16.dp)) {
            Button(onClick = {
                coroutineScope.launch {
                    // We expand the bottom sheet, which will trigger the embedded picker to be shown
                    scaffoldState.bottomSheetState.partialExpand()
                }
            }) {
                Text("Open photo picker")
            }
            LazyVerticalGrid(columns = GridCells.Adaptive(minSize = 64.dp)) {
                // We render the image using the Coil library
                itemsIndexed(attachments) { index, uri ->
                    AsyncImage(
                        model = uri,
                        contentDescription = "Image ${index + 1}",
                        contentScale = ContentScale.Crop,
                        modifier = Modifier.clickable {
                            coroutineScope.launch {
                                // When the user clicks on the media from the app's UI, we deselect it
                                // from the embedded photo picker by calling the method deselectUri
                                photoPickerState.deselectUri(uri)
                            }
                        }
                    )
                }
            }
        }
    }
}

Views

First, include the Jetpack Photo Picker library as a dependency.

implementation("androidx.photopicker:photopicker:1.0.0-alpha01")

To add the embedded photo picker, you need to add an entry to your layout file.

<view class="androidx.photopicker.EmbeddedPhotoPickerView"
    android:id="@+id/photopicker"
    android:layout_width="match_parent"
    android:layout_height="match_parent" />

And initialize it in your activity/fragment.

// We keep track of the list of selected attachments
private val _attachments = MutableStateFlow(emptyList<Uri>())
val attachments = _attachments.asStateFlow()

private lateinit var picker: EmbeddedPhotoPickerView
private var openSession: EmbeddedPhotoPickerSession? = null

val pickerListener = object : EmbeddedPhotoPickerStateChangeListener {
    override fun onSessionOpened(newSession: EmbeddedPhotoPickerSession) {
        openSession = newSession
    }

    override fun onSessionError(throwable: Throwable) {}

    override fun onUriPermissionGranted(uris: List<Uri>) {
        // Update our list of attachments with the newly granted Uris
        _attachments.value += uris
    }

    override fun onUriPermissionRevoked(uris: List<Uri>) {
        // Remove the revoked Uris from our list of attachments
        _attachments.value -= uris
    }

    override fun onSelectionComplete() {
        // Hide the embedded photo picker as the user is done with the photo/video selection
    }
}

override fun onCreate(savedInstanceState: Bundle?) {
    super.onCreate(savedInstanceState)
    setContentView(R.layout.main_view)
    
    //
    // Add the embedded photo picker to a bottom sheet to allow the dragging to display the full photo library
    //

    picker = findViewById(R.id.photopicker)
    picker.addEmbeddedPhotoPickerStateChangeListener(pickerListener)
    picker.setEmbeddedPhotoPickerFeatureInfo(
        // Set a custom accent color
        EmbeddedPhotoPickerFeatureInfo.Builder().setAccentColor(0xFF0000).build()
    )
}

You can call different methods of EmbeddedPhotoPickerSession to interact with the embedded picker.

// Notify the embedded picker of a configuration change
// (openSession is nullable until onSessionOpened fires, hence the safe calls)
openSession?.notifyConfigurationChanged(newConfig)

// Update the embedded picker to expand following a user interaction
openSession?.notifyPhotoPickerExpanded(/* expanded: */ true)

// Resize the embedded picker
openSession?.notifyResized(/* width: */ 512, /* height: */ 256)

// Show/hide the embedded picker (e.g. after a form has been submitted)
openSession?.notifyVisibilityChanged(/* visible: */ false)

// Remove unselected media from the embedded picker after they have been
// unselected from the host app's UI
openSession?.requestRevokeUriPermission(removedUris)

It's important to note that the embedded photo picker experience is available for users running Android 14 (API level 34) or higher with SDK Extensions 15+. Read more about photo picker device availability.
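
A runtime gate for that requirement might look like the following sketch; which extension constant to query is an assumption here, so verify it against the photo picker availability documentation.

import android.os.Build
import android.os.ext.SdkExtensions

// Sketch: gate the embedded picker on Android 14 (API 34) with SDK Extensions 15+.
// Querying the UPSIDE_DOWN_CAKE extension is an assumption -- confirm the exact
// check in the photo picker availability docs.
fun isEmbeddedPhotoPickerAvailable(): Boolean =
    Build.VERSION.SDK_INT >= Build.VERSION_CODES.UPSIDE_DOWN_CAKE &&
        SdkExtensions.getExtensionVersion(Build.VERSION_CODES.UPSIDE_DOWN_CAKE) >= 15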

For enhanced user privacy and security, the system renders the embedded photo picker in a way that prevents any drawing or overlaying. This intentional design choice means that your UX should consider the photo picker's display area as a distinct and dedicated element, much like you would plan for an advertising banner.

If you have any feedback or suggestions, submit tickets to our issue tracker.



27 Jan 2026 5:00pm GMT

18 Sep 2022

Planet Openmoko

Harald "LaF0rge" Welte: Deployment of future community TDMoIP hub

I've mentioned some of my various retronetworking projects in some past blog posts. One of those projects is Osmocom Community TDM over IP (OCTOI). During the past 5 or so months, we have been using a number of GPS-synchronized open source icE1usb devices interconnected by a new, efficient but still transparent TDMoIP protocol in order to run a distributed TDM/PDH network. This network is currently only used to provide ISDN services to retronetworking enthusiasts, but other uses like frame relay have also been validated.

So far, the central hub of this OCTOI network has been operating in the basement of my home, behind a consumer-grade DOCSIS cable modem connection. Given that TDMoIP is relatively sensitive to packet loss, this has been sub-optimal.

Luckily some of my old friends at noris.net have agreed to host a new OCTOI hub free of charge in one of their ultra-reliable co-location data centres. I've already been hosting some other machines there for 20+ years, and noris.net is a good fit given that they were - in their early days as an ISP - the driving force in the early 90s behind one of the Linux kernel ISDN stacks called u-isdn. So after many decades, ISDN returns to them in a very different way.

Side note: In case you're curious, a reconstructed partial release history of the u-isdn code can be found on gitea.osmocom.org

But I digress. So today, there was the installation of this new OCTOI hub setup. It has been prepared for several weeks in advance, and the hub contains two circuit boards designed specifically for this use case. The most difficult challenge was the fact that this data centre has no existing GPS RF distribution, and the rack is ~100m of CAT5 cable (no fiber!) away from the roof. So we faced the challenge of passing the 1PPS (1 pulse per second) signal reliably through several steps of lightning/over-voltage protection into the icE1usb, whose internal GPS-DO serves as a grandmaster clock for the TDM network.

The equipment deployed in this installation currently contains:

For more details, see this wiki page and this ticket.

Now that the physical deployment has been made, the next steps will be to migrate all the TDMoIP links from the existing user base over to the new hub. We hope the reliability and performance will be much better than behind DOCSIS.

In any case, this new setup for sure has a lot of capacity to connect many more users to this network. At this point we can still only offer E1 PRI interfaces. I expect that at some point during the coming winter the project for remote TDMoIP BRI (S/T, S0-Bus) connectivity will become available.

Acknowledgements

I'd like to thank everyone helping this effort, specifically:
  • Sylvain "tnt" Munaut for his work on the RS422 interface board (+ gateware/firmware)
  • noris.net for sponsoring the co-location
  • sysmocom for sponsoring the EPYC server hardware

18 Sep 2022 10:00pm GMT

08 Sep 2022

Planet Openmoko

Harald "LaF0rge" Welte: Progress on the ITU-T V5 access network front

Almost one year after my post regarding first steps towards a V5 implementation, some friends and I were finally able to visit Wobcom, a small German city carrier and pick up a lot of decommissioned POTS/ISDN/PDH/SDH equipment, primarily V5 access networks.

This means that a number of retronetworking enthusiasts now have a chance to play with Siemens Fastlink, Nokia EKSOS and DeTeWe ALIAN access networks/multiplexers.

My primary interest is in Nokia EKSOS, which looks like a rather easy, low-complexity target. As one of the first steps, I took PCB photographs of the various modules/cards in the shelf, took note of the main chip designations and started to search for the related data sheets.

The results can be found in the Osmocom retronetworking wiki, with https://osmocom.org/projects/retronetworking/wiki/Nokia_EKSOS being the main entry page, and sub-pages about

In short: Unsurprisingly, a lot of Infineon analog and digital ICs for the POTS and ISDN ports, as well as a number of Motorola M68k based QUICC32 microprocessors and several unknown ASICs.

So with V5 hardware at my disposal, I've slowly re-started my efforts to implement the LE (local exchange) side of the V5 protocol stack, with the goal of eventually being able to interface those V5 AN with the Osmocom Community TDM over IP network. Once that is in place, we should also be able to offer real ISDN Uk0 (BRI) and POTS lines at retrocomputing events or hacker camps in the coming years.

08 Sep 2022 10:00pm GMT

Harald "LaF0rge" Welte: Clock sync trouble with Digium cards and timing cables

If you have ever worked with Digium (now part of Sangoma) digital telephony interface cards such as the TE110/410/420/820 (single to octal E1/T1/J1 PRI cards), you will probably have seen that they always have a timing connector, where the timing information can be passed from one card to another.

In PDH/ISDN (or even SDH) networks, it is very important to have a synchronized clock across the network. If the clocks are drifting, there will be underruns or overruns, with associated phase jumps that are particularly dangerous when analog modem calls are transported.

In traditional ISDN use cases, the clock is always provided by the network operator, and any customer/user side equipment is expected to synchronize to that clock.

So this Digium timing cable is needed in applications where you have more PRI lines than fit on one card, but only a subset of your lines (spans) are connected to the public operator. The timing cable makes sure that the clock received from the public operator on one port is used as the transmit bit clock on all of the other ports, no matter on which card.

Unfortunately this decades-old Digium timing cable approach seems to suffer from some problems.

bursty bit clock changes until link is up

The first problem is that the downstream ports' transmit bit clock was jumping around in bursts every two or so seconds. You can see an oscillogram of the E1 master signal (yellow) received by one TE820 card and the transmit of the slave ports on the other card at https://people.osmocom.org/laforge/photos/te820_timingcable_problem.mp4

As you can see, for some seconds the two clocks seem to be in perfect lock/sync, but in between there are periods of immense clock drift.

What I'd have expected is the behavior that can be seen at https://people.osmocom.org/laforge/photos/te820_notimingcable_loopback.mp4 - which shows a similar setup but without the use of a timing cable: Both the master clock input and the clock output were connected on the same TE820 card.

As I found out much later, this problem only occurs until any of the downstream/slave ports is fully OK/GREEN.

This is surprising, as any other E1 equipment I've seen always transmits at a constant bit clock, irrespective of whether there's any signal in the opposite direction, and irrespective of whether any other ports are up/aligned or not.

But ok, once you adjust your expectations to this Digium peculiarity, you can actually proceed.

clock drift between master and slave cards

Once any of the spans of a slave card on the timing bus are fully aligned, the transmit bit clocks of all of its ports appear to be in sync/lock - yay - but unfortunately only at the very first glance.

When looking at it for more than a few seconds, one can see a slow, continuous drift of the slave bit clocks compared to the master :(

Some initial measurements show that the clock of the slave card of the timing cable is drifting at about 12.5 ppb (parts per billion) when compared against the master clock reference.
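
To put that figure in perspective: at the standard E1 bit rate of 2.048 Mbit/s, a drift of 12.5 ppb corresponds to 2.048e6 bit/s * 12.5e-9 ≈ 0.0256 bit/s, i.e. the slave clock slips by one full bit roughly every 39 seconds, and by a full 256-bit E1 frame in under three hours.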

This is rather disappointing, given that the whole point of a timing cable is to ensure you have one reference clock with all signals locked to it.

The work-around

If you are willing to sacrifice one port (span) of each card, you can work around that slow-clock-drift issue by connecting an external loopback cable. So the master card is configured to use the clock provided by the upstream provider. Its other ports (spans) will transmit at the exact recovered clock rate with no drift. You can use any of those ports to provide the clock reference to a port on the slave card using an external loopback cable.
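
For DAHDI users, the work-around maps onto span timing priorities roughly as in the sketch below (illustrative only, not a drop-in config: span numbers, framing and coding are placeholders, and the timing field is 0 to provide the clock or >= 1 to recover it with that priority):

# /etc/dahdi/system.conf -- illustrative sketch only
# Master card, port 1: recover the clock from the public operator (priority 1)
span=1,1,0,ccs,hdb3
# Master card, port 2: transmits at the card's clock; connect this port to the
# slave card via the external loopback cable
span=2,0,0,ccs,hdb3
# Slave card, first port: recover the clock from the loopback (priority 2)
span=5,2,0,ccs,hdb3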

In this setup, your slave card[s] will have perfect bit clock sync/lock.

It's just rather sad that you need to sacrifice ports just to achieve proper clock sync - something that the timing connectors and cables claim to do, but in reality don't achieve, at least not in my setup with the most modern and high-end octal-port PCIe cards (TE820).

08 Sep 2022 10:00pm GMT