10 Feb 2026

Planet Grep

Lionel Dricot: Offpunk 3.0 "A Community is Born" Release


For the last four years, I've been developing Offpunk, a command-line Web, Gemini, and Gopher browser that allows you to work offline. And I've just released version 3.0. It is probably not for everyone, but I use it every single day. I like it, and it seems I'm not alone!

Something wonderful happened on the road leading to 3.0: Offpunk became a true cooperative effort. Offpunk 3.0 is probably the first release that contains code I didn't review line-by-line. Unmerdify (by Vincent Jousse), all the translation infrastructure (by the always-present JMCS), and the community packaging effort are areas where I barely touched the code.

So, before anything else, I want to thank all the people involved for sharing their energy and motivation. I'm very grateful for every contribution the project received. I'm also really happy to see "old names" replying from time to time on the mailing list. It makes me feel like there's an emerging Offpunk community where everybody can contribute at their own pace.

There were a lot of changes between 2.8 and 3.0, which probably means some new bugs and some regressions. We count on you (yes, you!) to report them and make 3.1 a lot more stable. It's as easy as typing "bugreport" in Offpunk!

From the deepest of my terminal, thank you!

But enough with the cheering, let's jump to…

The 11 most important changes in Offpunk 3.0

0. Use Offpunk in your language.

Offpunk is now translatable and has been translated into Spanish, Galician, and Dutch. Step in to translate Offpunk into your language! (Awesome work by JMCS with the help of Bert Livens.)

1. Openk as a standalone tool

The "opnk" standalone tool has been renamed to "openk" to make its purpose more obvious. Openk is a command-line tool that tries to open any file in the terminal and, if that's not possible, opens it in your preferred software, falling back to xdg-open as a last resort.

People calling opnk directly should update the name everywhere; users who never type "opnk" in their terminal are not affected.

2. See XKCD comics in your terminal

"xkcdpunk" is a new standalone tool that allows displaying XKCD comics directly in your terminal.

XKCDpunk in action

3. Get only the good part and remove cruft for thousands of websites

Offpunk now integrates "unmerdify," a library written by Vincent Jousse that extracts the content of HTML articles using the "ftr-site-config" set of rules maintained by the FiveFilters community.

You can contribute by creating or improving rules for your frequently visited websites.

If no ftr rule is found, Offpunk falls back to "readability," as has been the case since 0.1. "info" will tell you if unmerdify or readability was used to display the content of a page.

To use unmerdify, users should manually clone the ftr-site-config repository:
git clone https://github.com/fivefilters/ftr-site-config.git
Then, in their offpunkrc:

set ftr_site_config /path/to/ftr-site-config

Automating this step is an objective for 3.1.

4. Offpunk goes social with "share" and "reply"

New social functions: "share" to send the URL of a page by email and "reply" to reply to the author if an email is found. "Reply" will remember the email used for each site/capsule/hole.

5. Browse websites while logged in

Offpunk doesn't support logging into websites. But the new "cookies" command allows you to import a cookie txt file to be used with a given HTTP domain.

From your traditional browser (Firefox, Librewolf, Chromium, … ), log into the website. Then export the cookie with the "cookie-txt" extension. Once you have this "mycookie.txt" text file, launch Offpunk and run:

cookies import mycookie.txt https://domain-of-the-cookie.net/

This allows you, for example, to read LWN.NET if you have a subscription. (contributed by Urja)

6. Bigger, better images, even in Gemini

Images are now displayed by default in Gemini, and their display size has been increased.

Gemini capsule of Thierry Crouzet displayed in Offpunk

This can be reverted with the following lines in offpunkrc:

set images_size 40
set gemini_images false

Remember that images are displayed as "blocks" when reading a page, but if you access the image URL directly (by following the yellow link beneath), the image will be displayed perfectly if you are using a sixel-compatible terminal.

7. Display hidden RSS/Atom links

If available, links to hidden RSS/Atom feeds are now displayed at the bottom of HTML pages.

This makes the "feed" command a lot less useful and allows you to quickly discover interesting new feeds.

8. Display blocked links

Links to blocked domains are now displayed in red by default.

A blocked link to X on standblog.org

This can be reverted with the following line in offpunkrc:

theme blocked_link none

9. Preset themes

Support for multiple themes with "theme preset." Existing themes are "offpunk1" (default), "cyan," "yellow" and "bw." Don't hesitate to contribute yours!

10. Better redirects and true blocks

"redirects" now operate at the netcache level. This means that no requests to blocked URLs should ever be made (which was still happening before).

And many changes, improvements and bugfixes

- "root" is now smarter and goes to the root of a website, not the domain.
Old behaviour can still be achieved with "root /"
- "ls" command is deprecated and has been replaced by "links"
- new "websearch" command configured to use wiby.me by default
- "set default_cmd" allows you to configure what Offpunk will do when pressing enter on an empty command line. By default, it is "links 10."
- "view switch" allows you to switch between normal and full view (contributed by Andrew Fowlie)
- "help help" will allow you to send an email to the offpunk-users mailing list
- "bugreport" will send a bug report to the offpunk-devel mailing list
- And, of course, multiple bugfixes…

About the author

I'm Ploum, a writer and an engineer. I like to explore how technology impacts society. You can subscribe by email or by RSS. I value privacy and never share your address.

I write science-fiction novels in French. For Bikepunk, my new post-apocalyptic-cyclist book, my publisher is looking for contacts in other countries to distribute it in languages other than French. If you can help, contact me!

10 Feb 2026 10:55am GMT

Frank Goossens: Are there Meshcore users in the Limburgse Maasvallei?

I blame the Fediverse, where for the past few weeks (months?) I've regularly seen posts about Meshcore, a technology/software for decentralized ad-hoc networks carrying text-based messages over LoRa radio. So I bought myself a Sensecap T1000-e, flashed Meshcore onto it (from Chrome, poof) and connected it to my Fairphone with the Meshcore app and… nothing.

Source

10 Feb 2026 10:55am GMT

Frank Goossens: As seen on YouTube; Angine de Poitrine live on KEXP

Angine de Poitrine live on KEXP. French-Canadian Dadaist instrumental rock-techno-jazz, maybe? A bit of Can, Primus, King Crimson and Sun Ra, and a whole lot of virtuoso drums and guitar loop-pedal juggling. If I could, I'd go straight for the moshpit, but it's just me in my home office, so… Some of the YouTube comments (not a cesspool, for once) are spot on. Anyway, just look & listen…

Source

10 Feb 2026 10:55am GMT

Planet Debian

Freexian Collaborators: Writing a new worker task for Debusine (by Carles Pina i Estany)

Debusine is a tool designed for Debian developers and Operating System developers in general. You can try out Debusine on debusine.debian.net, and follow its development on salsa.debian.org.

This post describes how to write a new worker task for Debusine. It can be used to add tasks to a self-hosted Debusine instance, or to submit to the Debusine project new tasks to add new capabilities to Debusine.

Tasks are the lower-level pieces of Debusine workflows. Examples of tasks are Sbuild, Lintian, Debdiff (see the available tasks).

This post will document the steps to write a new basic worker task. The example will add a worker task that runs reprotest and creates an artifact of the new type ReprotestArtifact with the reprotest log.

Tasks are usually used by workflows. Workflows solve high-level goals by creating and orchestrating different tasks (e.g. a Sbuild workflow would create different Sbuild tasks, one for each architecture).

Overview of tasks

A task usually does the following: resolve its input lookups into dynamic data, fetch the input artifacts on the worker, configure the execution environment, run a command, evaluate the result, and upload the output artifacts.

If you want to follow the tutorial and add the Reprotest task, your Debusine development instance should have at least one worker, one user, a debusine client set up, and permissions for the client to create tasks. All of this can be set up following the steps in the Contribute section of the documentation.

This blog post shows a functional Reprotest task. This task is not currently part of Debusine. The Reprotest task implementation is simplified (no error handling, unit tests, specific view, docs, some shortcuts in the environment preparation, etc.). At some point, in Debusine, we might add a debrebuild task which is based on buildinfo files and uses snapshot.debian.org to recreate the binary packages.

Defining the inputs of the task

The input of the reprotest task will be a source artifact (a Debian source package). We model the input with pydantic in debusine/tasks/models.py:

class ReprotestData(BaseTaskDataWithExecutor):
    """Data for Reprotest task."""

    source_artifact: LookupSingle


class ReprotestDynamicData(BaseDynamicTaskDataWithExecutor):
    """Reprotest dynamic data."""

    source_artifact_id: int | None = None

The ReprotestData is what the user will input. A LookupSingle is a lookup that resolves to a single artifact.

We would also have configuration for the desired variations to test, but we have left that out of this example for simplicity. Configuring variations is left as an exercise for the reader.

Since ReprotestData is a subclass of BaseTaskDataWithExecutor it also contains environment where the user can specify in which environment the task will run. The environment is an artifact with a Debian image.

The ReprotestDynamicData holds the resolution of all lookups. These can be seen in the "Internals" tab of the work request view.
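To make the split concrete, here is a minimal sketch (with invented values, not actual Debusine code) of the same request before and after lookup resolution:

```python
# Hypothetical example: what the user submits vs. what the server computes.
# A LookupSingle can be a plain artifact ID (as here) or a lookup string.
task_data = {"source_artifact": 42}          # user input (ReprotestData shape)

# Server-side resolution turns every lookup into a concrete artifact ID,
# which is what the worker later receives (ReprotestDynamicData shape).
dynamic_data = {"source_artifact_id": 42}

assert isinstance(dynamic_data["source_artifact_id"], int)
```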

Add the new Reprotest artifact data class

In order for the reprotest task to create a new artifact of the type DebianReprotest with the log and output metadata, add the new category to ArtifactCategory in debusine/artifacts/models.py:

    REPROTEST = "debian:reprotest"

In the same file add the DebianReprotest class:

class DebianReprotest(ArtifactData):
    """Data for debian:reprotest artifacts."""

    reproducible: bool | None = None

    def get_label(self) -> str:
        """Return a short human-readable label for the artifact."""
        return "reprotest analysis"

It could also include the package name or version.

In order to have the category listed in the work request output artifacts table, edit the file debusine/db/models/artifacts.py: In ARTIFACT_CATEGORY_ICON_NAMES add ArtifactCategory.REPROTEST: "folder", and in ARTIFACT_CATEGORY_SHORT_NAMES add ArtifactCategory.REPROTEST: "reprotest",.
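As plain Python, the two additions look roughly like this (the ArtifactCategory stub stands in for Debusine's real enum so the fragment is self-contained; the real edit goes into the existing dictionaries in debusine/db/models/artifacts.py):

```python
from enum import Enum

# Stand-in for Debusine's ArtifactCategory enum (the real one lives in
# debusine/artifacts/models.py).
class ArtifactCategory(str, Enum):
    REPROTEST = "debian:reprotest"

# The two entries to add alongside the existing ones:
ARTIFACT_CATEGORY_ICON_NAMES = {ArtifactCategory.REPROTEST: "folder"}
ARTIFACT_CATEGORY_SHORT_NAMES = {ArtifactCategory.REPROTEST: "reprotest"}
```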

Create the new Task class

In debusine/tasks/ create a new file reprotest.py.

reprotest.py
# Copyright © The Debusine Developers
# See the AUTHORS file at the top-level directory of this distribution
#
# This file is part of Debusine. It is subject to the license terms
# in the LICENSE file found in the top-level directory of this
# distribution. No part of Debusine, including this file, may be copied,
# modified, propagated, or distributed except according to the terms
# contained in the LICENSE file.

"""Task to use reprotest in debusine."""

from pathlib import Path
from typing import Any

from debusine import utils
from debusine.artifacts.local_artifact import ReprotestArtifact
from debusine.artifacts.models import (
    ArtifactCategory,
    CollectionCategory,
    DebianSourcePackage,
    DebianUpload,
    WorkRequestResults,
    get_source_package_name,
    get_source_package_version,
)
from debusine.client.models import RelationType
from debusine.tasks import BaseTaskWithExecutor, RunCommandTask
from debusine.tasks.models import ReprotestData, ReprotestDynamicData
from debusine.tasks.server import TaskDatabaseInterface


class Reprotest(
    RunCommandTask[ReprotestData, ReprotestDynamicData],
    BaseTaskWithExecutor[ReprotestData, ReprotestDynamicData],
):
    """Task to use reprotest in debusine."""

    TASK_VERSION = 1

    CAPTURE_OUTPUT_FILENAME = "reprotest.log"

    def __init__(
        self,
        task_data: dict[str, Any],
        dynamic_task_data: dict[str, Any] | None = None,
    ) -> None:
        """Initialize object."""
        super().__init__(task_data, dynamic_task_data)

        self._reprotest_target: Path | None = None

    def build_dynamic_data(
        self, task_database: TaskDatabaseInterface
    ) -> ReprotestDynamicData:
        """Compute and return ReprotestDynamicData."""
        input_source_artifact = task_database.lookup_single_artifact(
            self.data.source_artifact
        )

        assert input_source_artifact is not None
        self.ensure_artifact_categories(
            configuration_key="input.source_artifact",
            category=input_source_artifact.category,
            expected=(
                ArtifactCategory.SOURCE_PACKAGE,
                ArtifactCategory.UPLOAD,
            ),
        )
        assert isinstance(
            input_source_artifact.data, (DebianSourcePackage, DebianUpload)
        )
        subject = get_source_package_name(input_source_artifact.data)
        version = get_source_package_version(input_source_artifact.data)

        assert self.data.environment is not None

        environment = self.get_environment(
            task_database,
            self.data.environment,
            default_category=CollectionCategory.ENVIRONMENTS,
        )

        return ReprotestDynamicData(
            source_artifact_id=input_source_artifact.id,
            subject=subject,
            parameter_summary=f"{subject}_{version}",
            environment_id=environment.id,
        )

    def get_input_artifacts_ids(self) -> list[int]:
        """Return the list of input artifact IDs used by this task."""
        if not self.dynamic_data:
            return []

        return [
            self.dynamic_data.source_artifact_id,
            self.dynamic_data.environment_id,
        ]

    def fetch_input(self, destination: Path) -> bool:
        """Download the required artifacts."""
        assert self.dynamic_data

        artifact_id = self.dynamic_data.source_artifact_id
        assert artifact_id is not None
        self.fetch_artifact(artifact_id, destination)

        return True

    def configure_for_execution(self, download_directory: Path) -> bool:
        """
        Find a .dsc in download_directory.

        Install reprotest and other utilities used in _cmdline.
        Set self._reprotest_target to it.

        :param download_directory: where to search the files
        :return: True if valid files were found
        """
        self._prepare_executor_instance()

        if self.executor_instance is None:
            raise AssertionError("self.executor_instance cannot be None")

        self.run_executor_command(
            ["apt-get", "update"],
            log_filename="install.log",
            run_as_root=True,
            check=True,
        )
        self.run_executor_command(
            [
                "apt-get",
                "--yes",
                "--no-install-recommends",
                "install",
                "reprotest",
                "dpkg-dev",
                "devscripts",
                "equivs",
                "sudo",
            ],
            log_filename="install.log",
            run_as_root=True,
        )

        self._reprotest_target = utils.find_file_suffixes(
            download_directory, [".dsc"]
        )
        return True

    def _cmdline(self) -> list[str]:
        """
        Build the reprotest command line.

        Use configuration of self.data and self._reprotest_target.
        """
        target = self._reprotest_target
        assert target is not None

        cmd = [
            "bash",
            "-c",
            f"TMPDIR=/tmp ; cd /tmp ; dpkg-source -x {target} package/; "
            "cd package/ ; mk-build-deps ; apt-get install --yes ./*.deb ; "
            "rm *.deb ; "
            "reprotest --vary=-time,-user_group,-fileordering,-domain_host .",
        ]

        return cmd

    @staticmethod
    def _cmdline_as_root() -> bool:
        r"""apt-get install --yes ./\*.deb must be run as root."""
        return True

    def task_result(
        self,
        returncode: int | None,
        execute_directory: Path,  # noqa: U100
    ) -> WorkRequestResults:
        """
        Evaluate task output and return success.

        For a successful run of reprotest:
        -must have the output file
        -exit code is 0

        :return: WorkRequestResults.SUCCESS or WorkRequestResults.FAILURE.
        """
        reprotest_file = execute_directory / self.CAPTURE_OUTPUT_FILENAME

        if reprotest_file.exists() and returncode == 0:
            return WorkRequestResults.SUCCESS

        return WorkRequestResults.FAILURE

    def upload_artifacts(
        self, exec_directory: Path, *, execution_result: WorkRequestResults
    ) -> None:
        """Upload the ReprotestArtifact with the files and relationships."""
        if not self.debusine:
            raise AssertionError("self.debusine not set")

        assert self.dynamic_data is not None
        assert self.dynamic_data.parameter_summary is not None

        reprotest_artifact = ReprotestArtifact.create(
            reprotest_output=exec_directory / self.CAPTURE_OUTPUT_FILENAME,
            reproducible=execution_result == WorkRequestResults.SUCCESS,
            package=self.dynamic_data.parameter_summary,
        )

        uploaded = self.debusine.upload_artifact(
            reprotest_artifact,
            workspace=self.workspace_name,
            work_request=self.work_request_id,
        )

        assert self.dynamic_data is not None
        assert self.dynamic_data.source_artifact_id is not None
        self.debusine.relation_create(
            uploaded.id,
            self.dynamic_data.source_artifact_id,
            RelationType.RELATES_TO,
        )

In order for Debusine to discover the task, add "Reprotest" to the __all__ list in the file debusine/tasks/__init__.py.

Let's explain the main methods of the Reprotest class:

build_dynamic_data method

The worker has no access to Debusine's database. Lookups are all resolved before the task is dispatched to a worker, so all the worker has to do is download the specified input artifacts.

The build_dynamic_data method looks up the artifact, asserts that it has a valid category, extracts the package name and version, and gets the environment in which the task will be executed.

The environment is needed to run the task (reprotest will run in a container using unshare, incus…).

    def build_dynamic_data(
        self, task_database: TaskDatabaseInterface
    ) -> ReprotestDynamicData:
        """Compute and return ReprotestDynamicData."""
        input_source_artifact = task_database.lookup_single_artifact(
            self.data.source_artifact
        )

        assert input_source_artifact is not None
        self.ensure_artifact_categories(
            configuration_key="input.source_artifact",
            category=input_source_artifact.category,
            expected=(
                ArtifactCategory.SOURCE_PACKAGE,
                ArtifactCategory.UPLOAD,
            ),
        )
        assert isinstance(
            input_source_artifact.data, (DebianSourcePackage, DebianUpload)
        )
        subject = get_source_package_name(input_source_artifact.data)
        version = get_source_package_version(input_source_artifact.data)

        assert self.data.environment is not None

        environment = self.get_environment(
            task_database,
            self.data.environment,
            default_category=CollectionCategory.ENVIRONMENTS,
        )

        return ReprotestDynamicData(
            source_artifact_id=input_source_artifact.id,
            subject=subject,
            parameter_summary=f"{subject}_{version}",
            environment_id=environment.id,
        )

get_input_artifacts_ids method

Used to list the task's input artifacts in the web UI.

    def get_input_artifacts_ids(self) -> list[int]:
        """Return the list of input artifact IDs used by this task."""
        if not self.dynamic_data:
            return []

        return [
            self.dynamic_data.source_artifact_id,
            self.dynamic_data.environment_id,
        ]

fetch_input method

Download the required artifacts on the worker.

    def fetch_input(self, destination: Path) -> bool:
        """Download the required artifacts."""
        assert self.dynamic_data

        artifact_id = self.dynamic_data.source_artifact_id
        assert artifact_id is not None
        self.fetch_artifact(artifact_id, destination)

        return True

configure_for_execution method

Install the packages needed by the task and set _reprotest_target, which is used to build the task's command line.

    def configure_for_execution(self, download_directory: Path) -> bool:
        """
        Find a .dsc in download_directory.

        Install reprotest and other utilities used in _cmdline.
        Set self._reprotest_target to it.

        :param download_directory: where to search the files
        :return: True if valid files were found
        """
        self._prepare_executor_instance()

        if self.executor_instance is None:
            raise AssertionError("self.executor_instance cannot be None")

        self.run_executor_command(
            ["apt-get", "update"],
            log_filename="install.log",
            run_as_root=True,
            check=True,
        )
        self.run_executor_command(
            [
                "apt-get",
                "--yes",
                "--no-install-recommends",
                "install",
                "reprotest",
                "dpkg-dev",
                "devscripts",
                "equivs",
                "sudo",
            ],
            log_filename="install.log",
            run_as_root=True,
        )

        self._reprotest_target = utils.find_file_suffixes(
            download_directory, [".dsc"]
        )
        return True

_cmdline method

Return the command line to run the task.

In this case, and to keep the example simple, we will run reprotest directly in the worker's executor VM/container, without giving it an isolated virtual server.

So, this command installs the build dependencies required by the package (so reprotest can build it) and runs reprotest itself.

    def _cmdline(self) -> list[str]:
        """
        Build the reprotest command line.

        Use configuration of self.data and self._reprotest_target.
        """
        target = self._reprotest_target
        assert target is not None

        cmd = [
            "bash",
            "-c",
            f"TMPDIR=/tmp ; cd /tmp ; dpkg-source -x {target} package/; "
            "cd package/ ; mk-build-deps ; apt-get install --yes ./*.deb ; "
            "rm *.deb ; "
            "reprotest --vary=-time,-user_group,-fileordering,-domain_host .",
        ]

        return cmd

Some reprotest variations are disabled to keep the example simple, both in the set of packages to install and in the reprotest features exercised.

_cmdline_as_root method

Since the command needs to install packages during execution, it runs as root (in the container):

    @staticmethod
    def _cmdline_as_root() -> bool:
        r"""apt-get install --yes ./\*.deb must be run as root."""
        return True

task_result method

The task succeeds if the output log was generated and the return code is 0.

    def task_result(
        self,
        returncode: int | None,
        execute_directory: Path,  # noqa: U100
    ) -> WorkRequestResults:
        """
        Evaluate task output and return success.

        For a successful run of reprotest:
        -must have the output file
        -exit code is 0

        :return: WorkRequestResults.SUCCESS or WorkRequestResults.FAILURE.
        """
        reprotest_file = execute_directory / self.CAPTURE_OUTPUT_FILENAME

        if reprotest_file.exists() and returncode == 0:
            return WorkRequestResults.SUCCESS

        return WorkRequestResults.FAILURE

upload_artifacts method

Create the ReprotestArtifact with the log and the reproducible boolean, upload it, and then add a relation between the ReprotestArtifact and the source package:

    def upload_artifacts(
        self, exec_directory: Path, *, execution_result: WorkRequestResults
    ) -> None:
        """Upload the ReprotestArtifact with the files and relationships."""
        if not self.debusine:
            raise AssertionError("self.debusine not set")

        assert self.dynamic_data is not None
        assert self.dynamic_data.parameter_summary is not None

        reprotest_artifact = ReprotestArtifact.create(
            reprotest_output=exec_directory / self.CAPTURE_OUTPUT_FILENAME,
            reproducible=execution_result == WorkRequestResults.SUCCESS,
            package=self.dynamic_data.parameter_summary,
        )

        uploaded = self.debusine.upload_artifact(
            reprotest_artifact,
            workspace=self.workspace_name,
            work_request=self.work_request_id,
        )

        assert self.dynamic_data is not None
        assert self.dynamic_data.source_artifact_id is not None
        self.debusine.relation_create(
            uploaded.id,
            self.dynamic_data.source_artifact_id,
            RelationType.RELATES_TO,
        )

Execution example

To run this task in a local Debusine (see steps to have it ready with an environment, permissions and users created) you can do:

$ python3 -m debusine.client artifact import-debian -w System http://deb.debian.org/debian/pool/main/h/hello/hello_2.10-5.dsc

(get the artifact ID from the output of that command)

The artifact can be seen at http://$DEBUSINE/debusine/System/artifact/$ARTIFACTID/.

Then create a reprotest.yaml:

$ cat <<EOF > reprotest.yaml
source_artifact: $ARTIFACT_ID
environment: "debian/match:codename=bookworm"
EOF

Instead of debian/match:codename=bookworm, you could use an artifact ID.

Finally, create the work request to run the task:

$ python3 -m debusine.client create-work-request -w System reprotest --data reprotest.yaml

Using the Debusine web UI you can see the work request, which should go to Running status, then Completed with Success or Failure (depending on whether reprotest could reproduce the build). The Output tab will show an artifact of type debian:reprotest with one file: the log. The Metadata tab of the artifact shows its Data: the package name and reproducible (true or false).

What is left to do?

This was a simple example of creating a task. Other things that could be done:

10 Feb 2026 12:00am GMT

08 Feb 2026

Planet Debian

Colin Watson: Free software activity in January 2026

About 80% of my Debian contributions this month were sponsored by Freexian, as well as one direct donation via GitHub Sponsors (thanks!). If you appreciate this sort of work and are at a company that uses Debian, have a look to see whether you can pay for any of Freexian's services; as well as the direct benefits, that revenue stream helps to keep Debian development sustainable for me and several other lovely people.

You can also support my work directly via Liberapay or GitHub Sponsors.

Python packaging

New upstream versions:

Fixes for Python 3.14:

Fixes for pytest 9:

Porting away from the deprecated pkg_resources:

Other build/test failures:

I investigated several more build failures and suggested removing the packages in question:

Other bugs:

Other bits and pieces

Alejandro Colomar reported that man(1) ignored the MANWIDTH environment variable in some circumstances. I investigated this and fixed it upstream.

I contributed an ubuntu-dev-tools patch to stop recommending sudo.

I added forky support to the images used in Salsa CI pipelines.

I began working on getting a release candidate of groff 1.24.0 into experimental, though haven't finished that yet.

I worked on some lower-priority security updates for OpenSSH.

Code reviews

08 Feb 2026 7:30pm GMT

Dirk Eddelbuettel: chronometre: A new package (pair) demo for R and Python

Both R and Python make it reasonably easy to work with compiled extensions. But how to access objects in one environment from the other and share state or (non-trivial) objects remains trickier. Recently (and while r-forge was 'resting' so we opened GitHub Discussions) a question was asked concerning R and Python object pointer exchange.

This led to a pretty decent discussion, including arrow interchange demos (pretty ideal when dealing with data.frame-alike objects), but once the focus is on more 'library-specific' objects from a given (C or C++, say) library, it is less clear what to do, or how involved it may get.

R has external pointers, and these make it feasible to instantiate the same object in Python. To demonstrate, I created a pair of (minimal) packages wrapping a lovely (small) class from the excellent spdlog library by Gabi Melman, more specifically in an adapted-for-R version (to avoid some R CMD check nags) in my RcppSpdlog package. It is essentially a nicer/fancier C++ version of the tic() and toc() timing scheme. When an object is instantiated, it 'starts the clock', and when we access it later it prints the time elapsed in microsecond resolution. In modern C++ this takes little more than keeping an internal chrono object.

Which makes for a nice, small, yet specific object to pass to Python. So the R side of the package pair instantiates such an object and accesses its address. For different reasons, sending a 'raw' pointer across does not work so well, but a string with the printed address works fabulously (a paradigm used by other packages, so we did not invent it). Over on the Python side of the package pair, we take this string representation and pass it to a little bit of pybind11 code to instantiate a new object. This can of course also expose functionality such as the 'show time elapsed' feature of interest here, either formatted or just numerically.
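The core trick, stripped of R and pybind11, can be sketched in plain Python with ctypes (a hypothetical stand-alone illustration, not chronometre's actual code): an object's address is printed as a hex string, shipped across as text, and parsed back into a usable pointer on the other side.

```python
import ctypes

# Stand-in for a C++ object: a small C buffer owned by "side one".
buf = ctypes.create_string_buffer(b"hello")

# Side one prints the address as a string -- the exchange format.
addr_string = hex(ctypes.addressof(buf))

# Side two parses the string and reconstructs a typed pointer to the
# very same memory (no copy is made).
addr = int(addr_string, 16)
view = ctypes.cast(addr, ctypes.POINTER(ctypes.c_char * 6))
assert view.contents.value == b"hello"
```

The same caveat applies as in the package pair: both sides must agree on the object's actual type and lifetime, since nothing in the address string enforces either.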

And that is all that there is! Now this can be done from R as well thanks to reticulate as the demo() (also shown on the package README.md) shows:

> library(chronometre)
> demo("chronometre", ask=FALSE)


        demo(chronometre)
        ---- ~~~~~~~~~~~

> #!/usr/bin/env r
> 
> stopifnot("Demo requires 'reticulate'" = requireNamespace("reticulate", quietly=TRUE))

> stopifnot("Demo requires 'RcppSpdlog'" = requireNamespace("RcppSpdlog", quietly=TRUE))

> stopifnot("Demo requires 'xptr'" = requireNamespace("xptr", quietly=TRUE))

> library(reticulate)

> ## reticulate and Python in general these days really want a venv so we will use one,
> ## the default value is a location used locally; if needed create one
> ## check for existing virtualenv to use, or else set one up
> venvdir <- Sys.getenv("CHRONOMETRE_VENV", "/opt/venv/chronometre")

> if (dir.exists(venvdir)) {
+     use_virtualenv(venvdir, required = TRUE)
+ } else {
+     ## create a virtual environment, but make it temporary
+     Sys.setenv(RETICULATE_VIRTUALENV_ROOT=tempdir())
+     virtualenv_create("r-reticulate-env")
+     virtualenv_install("r-reticulate-env", packages = c("chronometre"))
+     use_virtualenv("r-reticulate-env", required = TRUE)
+ }


> sw <- RcppSpdlog::get_stopwatch()                   # we use a C++ struct as example

> Sys.sleep(0.5)                                      # imagine doing some code here

> print(sw)                                           # stopwatch shows elapsed time
0.501220 

> xptr::is_xptr(sw)                                   # this is an external pointer in R
[1] TRUE

> xptr::xptr_address(sw)                              # get address, format is "0x...."
[1] "0x58adb5918510"

> sw2 <- xptr::new_xptr(xptr::xptr_address(sw))       # cloned (!!) but unclassed

> attr(sw2, "class") <- c("stopwatch", "externalptr") # class it .. and then use it!

> print(sw2)                                          # `xptr` allows us close and use
0.501597 

> ch <- import("chronometre")
> sw3 <- ch$Stopwatch(  xptr::xptr_address(sw) )      # new Python object via string ctor

> print(sw3$elapsed())                                # shows output via Python I/O
datetime.timedelta(microseconds=502013)

> cat(sw3$count(), "\n")                              # shows double
0.502657 

> print(sw)                                           # object still works in R
0.502721 
> 

The same object, instantiated in R, is used in Python and thereafter again in R. While this object here is minimal in features, the concept of passing a pointer is universal. We could use it for any interesting object that R can access and that Python too can instantiate. Obviously, there be dragons when passing pointers, so one may want to ascertain that headers from corresponding, compatible versions are used on both sides, but the principle is unaffected and should just work.

Both parts of this pair of packages are now at the corresponding repositories: PyPI and CRAN. As I commonly do here on package (change) announcements, I include the (minimal so far) set of high-level changes for the R package.

Changes in version 0.0.2 (2026-02-05)

  • Removed the now-replaced unconditional virtualenv use in the demo, given the preceding conditional block

  • Updated README.md with badges and an updated demo

Changes in version 0.0.1 (2026-01-25)

  • Initial version and CRAN upload

Questions, suggestions, bug reports, … are welcome at either the (now awoken from the R-Forge slumber) Rcpp mailing list or the newer Rcpp Discussions.

This post by Dirk Eddelbuettel originated on his Thinking inside the box blog. If you like this or other open-source work I do, you can sponsor me at GitHub.

08 Feb 2026 5:11pm GMT

07 Feb 2026

Planet Lisp

Joe Marshall: Vibe Coded Scheme Interpreter

Mark Friedman just released his Scheme-JS interpreter which is a Scheme with transparent JavaScript interoperability. See his blog post at furious ideas.

This interpreter apparently uses the techniques of lightweight stack inspection - Mark consulted me a bit about how that hack works. I'm looking forward to seeing the vibe-coded architecture.

07 Feb 2026 12:28am GMT

02 Feb 2026

Planet Lisp

Gábor Melis: Untangling Literate Programming

Classical literate programming

A literate program consists of interspersed narrative and code chunks. From this, source code to be fed to the compiler is generated by a process called tangling, and documentation by weaving. The specifics of tangling vary, but the important point is that this puts the human narrative first and allows complete reordering and textual combination of chunks at the cost of introducing an additional step into the write-compile-run cycle.

The general idea

It is easy to mistake this classical implementation of literate programming for the more general idea that we want to

  1. present code to human readers in pedagogical order with narrative added, and

  2. make changing code and its documentation together easy.

The advantages of literate programming follow from these desiderata.

Untangled LP

In many languages today, code order is far more flexible than in the era of early literate programming, so the narrative order can be approximated to some degree using docstrings and comments. Code and its documentation are side by side, so changing them together should also be easy. Since the normal source code now acts as the LP source, there is no more tangling in the programming loop. This is explored in more detail here.

Pros and cons

Having no tangling is a great benefit, as we get to keep our usual programming environment and tooling. On the other hand, bare-bones untangled LP suffers from the following potential problems.

  1. Order mismatches: Things like inline functions and global variables may need to be defined before use. So, code order tends to deviate from narrative order to some degree.

  2. Reduced locality: Our main tool to sync code and narrative is factoring out small, meaningful functions, which is just good programming style anyway. However, this may be undesirable for reasons of performance or readability. In such a case, we might end up with a larger function. Now, if we have only a single docstring for it, then it can be non-obvious which part of the code a sentence in the docstring refers to because of their distance and the presence of other parts.

  3. No source code only view: Sometimes we want to see only the code. In classical LP, we can look at the tangled file. In untangled LP, editor support for hiding the narrative is the obvious solution.

  4. No generated documentation: There is no more tangling nor weaving, but we still need another tool to generate documentation. Crucially, generating documentation is not in the main programming loop.

In general, whether classical or untangled LP is better depends on the severity of the above issues in the particular programming environment.

The Lisp and PAX view

MGL-PAX, a Common Lisp untangled LP solution, aims to minimize the above problems and fill in the gaps left by dropping tangling.

  1. Order

    • Common Lisp is quite relaxed about the order of function definitions, but not so much about DEFMACRO, DEFVAR, DEFPARAMETER, DEFCONSTANT, DEFTYPE, DEFCLASS, DEFSTRUCT, DEFINE-COMPILER-MACRO, SET-MACRO-CHARACTER, SET-DISPATCH-MACRO-CHARACTER, DEFPACKAGE. However, code order can for the most part follow narrative order. In practice, we end up with some DEFVARs far from their parent DEFSECTIONs (but DECLAIM SPECIAL helps).

    • DEFSECTION controls documentation order. The references to Lisp definitions in DEFSECTION determine narrative order independently from the code order. This allows the few ordering problems to be patched over in the generated documentation.

    • Furthermore, because DEFSECTION can handle the exporting of symbols, we can declare the public interface piecemeal, right next to the relevant definitions, rather than in a monolithic DEFPACKAGE.

  2. Locality

    • Lisp macros replace chunks in the rare, complex cases where a chunk is not a straightforward text substitution but takes parameters. Unlike text-based LP chunks, macros must operate on valid syntax trees (S-expressions), so they cannot be used to inject arbitrary text fragments (e.g. an unclosed parenthesis).

      This constraint forces us to organize code into meaningful, syntactic units rather than arbitrary textual fragments, which results in more robust code. Within these units, macros allow us to reshape the syntax tree directly, handling scoping properly where text interpolation would fail.

    • PAX's NOTE is an extractable, named comment. NOTE can interleave with code within e.g. functions to minimize the distance between the logic and its documentation.

    • Also, PAX hooks into the development environment to provide easy navigation in the documentation tree.

  3. Source code only view: PAX supports hiding verbose documentation (sections, docstrings, comments) in the editor.

  4. Generating documentation

    • PAX extracts docstrings, NOTEs and combines them with narrative glue in DEFSECTIONs.

    • Documentation can be generated as static HTML/PDF files for offline reading or browsed live (in an Emacs buffer or via an in-built web server) during development.

    • LaTeX math is supported in both PDF and HTML (via MathJax, whether live or offline).

In summary, PAX accepts a minimal deviation in code/narrative order but retains the original, interactive Lisp environment (e.g. SLIME/Sly), through which it offers optional convenience features like extended navigation, live browsing, and hiding documentation in code. In return, we give up easy fine-grained control over typesetting the documentation - a price well worth paying in Common Lisp.

02 Feb 2026 12:00am GMT

01 Feb 2026

Planet Lisp

Joe Marshall: Some Libraries

Zach Beane has released the latest Quicklisp beta (January 2026), and I am pleased to have contributed to this release. Here are the highlights:

Selected Functions

Dual numbers

DERIVATIVE function → function

Returns a new unary function that computes the exact derivative of the given function at any point x.

The returned function utilizes Dual Number arithmetic to perform automatic differentiation. It evaluates f(x + ε), where ε is the dual unit (an infinitesimal such that ε² = 0). The result is extracted from the infinitesimal part of the computation.

f(x + ε) = f(x) + f'(x)ε

This method avoids the precision errors of numerical approximation (finite difference) and the complexity of symbolic differentiation. It works for any function composed of standard arithmetic operations and elementary functions supported by the dual-numbers library (e.g., sin, exp, log).

Example

(defun square (x) (* x x))

(let ((df (derivative #'square)))
  (funcall df 5)) 
;; => 10
    

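The dual-number mechanism behind DERIVATIVE can be sketched in a few lines. This Python rendition is illustrative only (the actual library is Common Lisp); it implements just addition and multiplication, which is enough to differentiate square:

```python
class Dual:
    """Dual number a + b*eps with eps**2 == 0."""
    def __init__(self, real, eps=0.0):
        self.real, self.eps = real, eps

    @staticmethod
    def _lift(x):
        return x if isinstance(x, Dual) else Dual(x)

    def __add__(self, other):
        o = Dual._lift(other)
        return Dual(self.real + o.real, self.eps + o.eps)
    __radd__ = __add__

    def __mul__(self, other):
        # (a + b*eps)(c + d*eps) = ac + (ad + bc)*eps, since eps**2 == 0
        o = Dual._lift(other)
        return Dual(self.real * o.real, self.real * o.eps + self.eps * o.real)
    __rmul__ = __mul__

def derivative(f):
    # f(x + eps) = f(x) + f'(x)*eps: read the derivative off the eps part
    return lambda x: f(Dual(x, 1.0)).eps

print(derivative(lambda x: x * x)(5.0))   # -> 10.0
```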
Implementation Note

The implementation relies on the generic-arithmetic system to ensure that mathematical operations within the function can accept and return dual-number instances seamlessly.

Function

BINARY-COMPOSE-LEFT binary-fn unary-fn → function
BINARY-COMPOSE-RIGHT binary-fn unary-fn → function

Composes a binary function B(x, y) with a unary function U(z) applied to one of its arguments.

(binary-compose-left B U)(x, y) ≡ B(U(x), y)
(binary-compose-right B U)(x, y) ≡ B(x, U(y))

These combinators are essential for "lifting" unary operations into binary contexts, such as when folding a sequence where elements need preprocessing before aggregation.

Example

;; Summing the squares of a list
(fold-left (binary-compose-right #'+ #'square) 0 '(1 2 3))
;; => 14  ; (+ (+ (+ 0 (sq 1)) (sq 2)) (sq 3))
    

FOLD

FOLD-LEFT function initial-value sequence → result

Iterates over sequence, calling function with the current accumulator and the next element. The accumulator is initialized to initial-value.

This is a left-associative reduction. The function is applied as:

(f ... (f (f initial-value x0) x1) ... xn)

Unlike CL:REDUCE, the argument order for function is strictly defined: the first argument is always the accumulator, and the second argument is always the element from the sequence. This explicit ordering eliminates ambiguity and aligns with the functional programming convention found in Scheme and ML.

Arguments

  • function: A binary function taking (accumulator, element).
  • initial-value: The starting value of the accumulator.
  • sequence: A list or vector to traverse.

Example

(fold-left (lambda (acc x) (cons x acc))
           nil
           '(1 2 3))
;; => (3 2 1)  ; Effectively reverses the list
    

Named Let

LET bindings &body body → result
LET name bindings &body body → result

Provides the functionality of the "Named Let" construct, commonly found in Scheme. This allows for the definition of recursive loops within a local scope without the verbosity of LABELS.

The macro binds the variables defined in bindings as in a standard let, but also binds name to a local function that can be called recursively with new values for those variables.

(let name ((var val) ...) ... (name new-val ...) ...)

This effectively turns recursion into a concise, iterative structure. It is the idiomatic functional alternative to imperative loop constructs.

While commonly used for tail recursive loops, the function bound by named let is a first-class procedure that can be called anywhere or used as a value.

Example

;; Standard Countdown Loop
(let recur ((n 10))
  (if (zerop n)
      'blastoff
      (progn
        (print n)
        (recur (1- n)))))
    

Implementation Note

The named-let library overloads the standard CL:LET macro to support this syntax directly if the first argument is a symbol. This allows users to use let uniformly for both simple bindings and recursive loops.

01 Feb 2026 11:15pm GMT

29 Jan 2026

FOSDEM 2026

Join the FOSDEM Treasure Hunt!

Are you ready for another challenge? We're excited to host the second yearly edition of our treasure hunt at FOSDEM! Participants must solve five sequential challenges to uncover the final answer. Update: the treasure hunt has been successfully solved by multiple participants, and the main prizes have now been claimed. But the fun doesn't stop here. If you still manage to find the correct final answer and go to Infodesk K, you will receive a small consolation prize as a reward for your effort. If you're still looking for a challenge, the 2025 treasure hunt is still unsolved, so…

29 Jan 2026 11:00pm GMT

26 Jan 2026

FOSDEM 2026

Guided sightseeing tours

If your non-geek partner and/or kids are joining you at FOSDEM, they may be interested in spending some time exploring Brussels while you attend the conference. Like in previous years, FOSDEM is organising sightseeing tours.

26 Jan 2026 11:00pm GMT

Call for volunteers

With FOSDEM just a few days away, it is time for us to enlist your help. Every year, an enthusiastic band of volunteers makes FOSDEM happen and makes it a fun and safe place for all our attendees. We could not do this without you. This year we again need as many hands as possible, especially for heralding during the conference, during the buildup (starting Friday at noon) and teardown (Sunday evening). No need to worry about missing lunch at the weekend, food will be provided. Would you like to be part of the team that makes FOSDEM tick?…

26 Jan 2026 11:00pm GMT