08 Feb 2026

Planet Grep

Lionel Dricot: The Disconnected Git Workflow

The Disconnected Git Workflow

Using git-send-email while being offline and with multiple email accounts

WARNING: the following is a technical reminder for my future self. If you don't use the "git" software, you can safely ignore this post.

The more I work with git-send-email, the more insufferable I find the GitHub interface.

Want to send a small patch to a GitHub project? You need to clone the repository, push your changes to your own branch, then ask for a pull request using the cumbersome web interface, replying to comments online while trying to avoid smileys.

With git send-email, I simply work offline, do my local commit, then:

git send-email HEAD^

And I'm done. I reply to comments by email, with Vim/Mutt. When the patch is accepted, getting a clean tree usually boils down to:

git pull
git rebase

Yay for git-send-email!

And, yes, I do that while offline and with multiple email accounts. That's one more reason to hate GitHub.

One mail account for each git repository

The secret is not to configure email accounts in git but to use "msmtp" to send email. Msmtp is a really cool sendmail replacement.

In .msmtprc, you can configure multiple accounts with multiple options, including calling a command to get your password.

# account 1 - pro
account work
host smtp.company.com
port 465
user login@company.com
from ploum@company.com
password SuPeRstr0ngP4ssw0rd
tls on
tls_starttls off

# personal account for FLOSS
account floss
host mail.provider.net
port 465
user ploum@mydomain.net
from ploum*@mydomain.net
passwordeval "cat ~/incredibly_encrypted_password.txt | rot13"
tls on
tls_starttls off

The important bit here is that you can set multiple "from" addresses for a given account, including a regexp to catch multiple aliases!

Now, we will ask git to automatically use the right msmtp account. In your global .gitconfig, set the following:

[sendemail]
   sendmailCmd = /usr/bin/msmtp --set-from-header=on
   envelopeSender = auto

The "envelopesender" option will ensure that the sendemail.from will be used and given to msmtp as a "from address." This might be redundant with "--set-from-header=on" in msmtp but, in my tests, having both was required. And, cherry on the cake, it automatically works for all accounts configured in msmtprc.

Older git versions (< 2.33) don't have sendmailCmd and should do:

[sendemail]
   smtpserver = /usr/bin/msmtp
   smtpserveroption = --set-from-header=on
   envelopesender = auto

I usually stick to a "ploum-PROJECT@mydomain.net" for each project I contribute to. This allows me to easily cut spam when needed. So far, the worst has been with a bug reported on the FreeBSD Bugzilla. The address used there (and nowhere else) has since been spammed to death.

In each git project, you need to do the following:

1. Set the email address used in your commit that will appear in "git log" (if different from the global one)

git config user.email "Ploum <ploum-PROJECT@mydomain.net>"

2. Set the email address that will be used to actually send the patch (could be different from the first one)

git config sendemail.from "Ploum <ploum-PROJECT@mydomain.net>"

3. Set the email address of the developer or the mailing list to which you want to contribute

git config sendemail.to project-devel@mailing-list.com

Damn, I did a commit with the wrong user.email!

Yep, I always forget to change it when working on a new project or from a fresh git clone. Not a problem. Just use "git config" like above, then:

git commit --amend --reset-author

And that's it.

Working offline

I told you I mostly work offline. And, as you might expect, msmtp requires a working Internet connection to send an email.

But msmtp comes with three wonderful little scripts: msmtp-enqueue.sh, msmtp-listqueue.sh and msmtp-runqueue.sh.

The first one saves your email to be sent in ~/.msmtpqueue, with the sending options in a separate file. The second one lists the unsent emails, and the third one actually sends all the emails in the queue.

All you need to do is change the msmtp line in your global .gitconfig to call the msmtp-enqueue.sh script:

[sendemail]
    sendmailcmd = /usr/libexec/msmtp/msmtpqueue/msmtp-enqueue.sh --set-from-header=on
    envelopeSender = auto 

In Debian, the scripts are available with the msmtp package. But the three are simple bash scripts that can be run from any path if your msmtp package doesn't provide them.

You can test sending a mail, then check the ~/.msmtpqueue folder for the email itself (.email file) and the related msmtp command line (.msmtp file). It happens nearly every day that I visit this folder to quickly add missing information to an email or simply remove it completely from the queue.

Of course, once connected, you need to remember to run:

/usr/libexec/msmtp/msmtpqueue/msmtp-runqueue.sh

If not connected, mails will not be sent and will be kept in the queue. This line is obviously part of my do_the_internet.sh script, along with "offpunk --sync".

It is not only git!

If it works for git, it works for any mail client. I use neomutt with the following configuration to use msmtp-enqueue and reply to email using the address it was sent to.

set sendmail="/usr/libexec/msmtp/msmtpqueue/msmtp-enqueue.sh --set-from-header=on"
unset envelope_from_address
set use_envelope_from
set reverse_name
set from="ploum@mydomain.net"
alternates ploum[A-Za-z0-9]*@mydomain.net

Of course, the whole config is a little more complex to handle multiple accounts that are all stored locally in Maildir format through offlineimap and indexed with notmuch. But this is a bit out of the scope of this post.

Anyway, you get the idea, and you could probably adapt it to your own mail client.

Conclusion

Sure, it's a whole blog post just to get the config right. But there's nothing really out of this world. And once the setup is done, it is done for good. No need to adapt to every change in a clumsy web interface, no need to use your mouse. Simple command lines and simple git flow!

Sometimes, I work late at night. When finished, I close the lid of my laptop and call it a day without reconnecting my laptop. This allows me not to see anything new before going to bed. When this happens, queued mails are sent the next morning, when I run the first do_the_internet.sh of the day.

And it always brings a smile to my face to see those bits being sent while I've completely forgotten about them…

About the author

I'm Ploum, a writer and an engineer. I like to explore how technology impacts society. You can subscribe by email or by rss. I value privacy and never share your address.

I write science-fiction novels in French. For Bikepunk, my new post-apocalyptic-cyclist book, my publisher is looking for contacts in other countries to distribute it in languages other than French. If you can help, contact me!

08 Feb 2026 10:16pm GMT

Frank Goossens: Are there any Meshcore users in the Limburg Maasvallei?

I blame the Fediverse, where for a few weeks (months?) now I have regularly seen posts about Meshcore, a technology/software stack for decentralized ad-hoc networks carrying text-based messages over LoRa radio. So I bought myself a Sensecap T1000-e, flashed Meshcore onto it (from Chrome, poof), connected it to my Fairphone with the Meshcore app and … nothing.

Source

08 Feb 2026 10:16pm GMT

Frank Goossens: As seen on YouTube; Angine de Poitrine live on KEXP

Angine de Poitrine live on KEXP. French-Canadian Dada-ist instrumental rock-techno-jazz, maybe? A bit of Can, Primus, King Crimson and Sun Ra, and a whole lot of virtuoso drums and guitar loop-pedal juggling. If I could, I'd go straight for the moshpit, but it's just me in my home office, so … Some of the YouTube comments (not a cesspool, for once) are spot on. Anyway, just look & listen…

Source

08 Feb 2026 10:16pm GMT

Planet Debian

Colin Watson: Free software activity in January 2026

About 80% of my Debian contributions this month were sponsored by Freexian, as well as one direct donation via GitHub Sponsors (thanks!). If you appreciate this sort of work and are at a company that uses Debian, have a look to see whether you can pay for any of Freexian's services; as well as the direct benefits, that revenue stream helps to keep Debian development sustainable for me and several other lovely people.

You can also support my work directly via Liberapay or GitHub Sponsors.

Python packaging

New upstream versions:

Fixes for Python 3.14:

Fixes for pytest 9:

Porting away from the deprecated pkg_resources:

Other build/test failures:

I investigated several more build failures and suggested removing the packages in question:

Other bugs:

Other bits and pieces

Alejandro Colomar reported that man(1) ignored the MANWIDTH environment variable in some circumstances. I investigated this and fixed it upstream.

I contributed an ubuntu-dev-tools patch to stop recommending sudo.

I added forky support to the images used in Salsa CI pipelines.

I began working on getting a release candidate of groff 1.24.0 into experimental, though I haven't finished that yet.

I worked on some lower-priority security updates for OpenSSH.

Code reviews

08 Feb 2026 7:30pm GMT

Dirk Eddelbuettel: chronometre: A new package (pair) demo for R and Python

Both R and Python make it reasonably easy to work with compiled extensions. But how to access objects in one environment from the other and share state or (non-trivial) objects remains trickier. Recently (and while r-forge was 'resting' so we opened GitHub Discussions) a question was asked concerning R and Python object pointer exchange.

This led to a pretty decent discussion including arrow interchange demos (pretty ideal if dealing with data.frame-alike objects), but once the focus is on more 'library-specific' objects from a given (C or C++, say) library it is less clear what to do, or how involved it may get.

R has external pointers, and these make it feasible to instantiate the same object in Python. To demonstrate, I created a pair of (minimal) packages wrapping a lovely (small) class from the excellent spdlog library by Gabi Melman, and more specifically in an adapted-for-R version (to avoid some R CMD check nags) in my RcppSpdlog package. It is essentially a nicer/fancier C++ version of the tic() and toc() timing scheme. When an object is instantiated, it 'starts the clock', and when we access it later it prints the time elapsed in microsecond resolution. In Modern C++ this takes little more than keeping an internal chrono object.

Which makes for a nice, small, yet specific object to pass to Python. So the R side of the package pair instantiates such an object, and accesses its address. For different reasons, sending a 'raw' pointer across does not work so well, but a string with the address printed works fabulously (and is a paradigm used around other packages so we did not invent this). Over on the Python side of the package pair, we then take this string representation and pass it to a little bit of pybind11 code to instantiate a new object. This can of course also expose functionality such as the 'show time elapsed' feature, either formatted or just numerically, of interest here.
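The address-as-string round trip can be sketched in plain Python with ctypes (a toy illustration only, not part of chronometre or pybind11; all names here are made up):

```python
import ctypes

# Toy stand-in for the C++ object: a double allocated via ctypes.
obj = ctypes.c_double(42.0)

# Format its memory address as a hex string, much like xptr::xptr_address()
# does on the R side.
addr_str = hex(ctypes.addressof(obj))

# "Other side": parse the string back into an integer address and rebuild a
# view onto the very same memory -- both names now alias one object.
same_obj = ctypes.c_double.from_address(int(addr_str, 16))
same_obj.value = 7.0

print(obj.value)  # 7.0 -- the original object was modified through the alias
```

The real packages do the same dance, except the reconstruction happens in pybind11 from a C++ pointer type rather than a ctypes view.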

And that is all that there is! Now this can be done from R as well thanks to reticulate as the demo() (also shown on the package README.md) shows:

> library(chronometre)
> demo("chronometre", ask=FALSE)


        demo(chronometre)
        ---- ~~~~~~~~~~~

> #!/usr/bin/env r
> 
> stopifnot("Demo requires 'reticulate'" = requireNamespace("reticulate", quietly=TRUE))

> stopifnot("Demo requires 'RcppSpdlog'" = requireNamespace("RcppSpdlog", quietly=TRUE))

> stopifnot("Demo requires 'xptr'" = requireNamespace("xptr", quietly=TRUE))

> library(reticulate)

> ## reticulate and Python in general these days really want a venv so we will use one,
> ## the default value is a location used locally; if needed create one
> ## check for existing virtualenv to use, or else set one up
> venvdir <- Sys.getenv("CHRONOMETRE_VENV", "/opt/venv/chronometre")

> if (dir.exists(venvdir)) {
+     use_virtualenv(venvdir, required = TRUE)
+ } else {
+     ## create a virtual environment, but make it temporary
+     Sys.setenv(RETICULATE_VIRTUALENV_ROOT=tempdir())
+     virtualenv_create("r-reticulate-env")
+     virtualenv_install("r-reticulate-env", packages = c("chronometre"))
+     use_virtualenv("r-reticulate-env", required = TRUE)
+ }


> sw <- RcppSpdlog::get_stopwatch()                   # we use a C++ struct as example

> Sys.sleep(0.5)                                      # imagine doing some code here

> print(sw)                                           # stopwatch shows elapsed time
0.501220 

> xptr::is_xptr(sw)                                   # this is an external pointer in R
[1] TRUE

> xptr::xptr_address(sw)                              # get address, format is "0x...."
[1] "0x58adb5918510"

> sw2 <- xptr::new_xptr(xptr::xptr_address(sw))       # cloned (!!) but unclassed

> attr(sw2, "class") <- c("stopwatch", "externalptr") # class it .. and then use it!

> print(sw2)                                          # `xptr` allows us close and use
0.501597 

> sw3 <- ch$Stopwatch(  xptr::xptr_address(sw) )      # new Python object via string ctor

> print(sw3$elapsed())                                # shows output via Python I/O
datetime.timedelta(microseconds=502013)

> cat(sw3$count(), "\n")                              # shows double
0.502657 

> print(sw)                                           # object still works in R
0.502721 
> 

The same object, instantiated in R, is used in Python and thereafter again in R. While this object here is minimal in features, the concept of passing a pointer is universal. We could use it for any interesting object that R can access and Python too can instantiate. Obviously, there be dragons when passing raw pointers, so one may want to ascertain that headers from compatible library versions are used on both sides, but the principle is unaffected and should just work.

Both parts of this pair of packages are now at the corresponding repositories: PyPI and CRAN. As I commonly do here on package (change) announcements, I include the (minimal so far) set of high-level changes for the R package.

Changes in version 0.0.2 (2026-02-05)

  • Removed the now-replaced unconditional virtualenv use in the demo given the preceding conditional block

  • Updated README.md with badges and an updated demo

Changes in version 0.0.1 (2026-01-25)

  • Initial version and CRAN upload

Questions, suggestions, bug reports, … are welcome at either the (now awoken from the R-Forge slumber) Rcpp mailing list or the newer Rcpp Discussions.

This post by Dirk Eddelbuettel originated on his Thinking inside the box blog. If you like this or other open-source work I do, you can sponsor me at GitHub.

08 Feb 2026 5:10pm GMT

Vincent Bernat: Fragments of an adolescent web

I have unearthed a few old articles typed during my adolescence, between 1996 and 1998. Unremarkable at the time, these pages now form, three decades later, the chronicle of a vanished era.[1]

The word "blog" does not exist yet. Wikipedia remains to come. Google has not been born. AltaVista reigns over searches, while already struggling to embrace the nascent immensity of the web.[2] To meet someone, you had to agree in advance and prepare your route on paper maps. 🗺️

The web is taking off. The CSS specification has just emerged; HTML tables still serve for page layout. Cookies and advertising banners are making their appearance. Pages are adorned with music and videos, forcing browsers to arm themselves with plugins. Netscape Navigator sits on 86% of the territory, but Windows 95 now bundles Internet Explorer to quickly catch up. Facing this offensive, Netscape open-sources its browser.

France falls behind. Outside universities, Internet access remains expensive and laborious. Minitel still reigns, offering phone directory, train tickets, remote shopping. This was not yet possible with the Internet: buying a CD online was a pipe dream. Encryption suffers from inappropriate regulation: the DES algorithm is capped at 40 bits and cracked in a few seconds.

These pages bear the trace of the web's adolescence. Thirty years have passed. The same battles continue: data selling, advertising, monopolies.


  1. Most articles linked here are not translated from French to English. ↩︎

  2. I recently noticed that Google no longer fully indexes my blog. For example, it is no longer possible to find the article on lanĉo. I assume this is a consequence of the explosion of AI-generated content or a change in priorities for Google. ↩︎

08 Feb 2026 2:51pm GMT

07 Feb 2026

Planet Lisp

Joe Marshall: Vibe Coded Scheme Interpreter

Mark Friedman just released his Scheme-JS interpreter which is a Scheme with transparent JavaScript interoperability. See his blog post at furious ideas.

This interpreter apparently uses the techniques of lightweight stack inspection - Mark consulted me a bit about how that hack works. I'm looking forward to seeing the vibe coded architecture.

07 Feb 2026 12:28am GMT

02 Feb 2026

Planet Lisp

Gábor Melis: Untangling Literate Programming

Classical literate programming

A literate program consists of interspersed narrative and code chunks. From this, source code to be fed to the compiler is generated by a process called tangling, and documentation by weaving. The specifics of tangling vary, but the important point is that this puts the human narrative first and allows complete reordering and textual combination of chunks at the cost of introducing an additional step into the write-compile-run cycle.

The general idea

It is easy to mistake this classical implementation of literate programming for the more general idea that we want to

  1. present code to human readers in pedagogical order with narrative added, and

  2. make changing code and its documentation together easy.

The advantages of literate programming follow from these desiderata.

Untangled LP

In many languages today, code order is far more flexible than in the era of early literate programming, so the narrative order can be approximated to some degree using docstrings and comments. Code and its documentation are side by side, so changing them together should also be easy. Since the normal source code now acts as the LP source, there is no more tangling in the programming loop. This is explored in more detail here.

Pros and cons

Having no tangling is a great benefit, as we get to keep our usual programming environment and tooling. On the other hand, bare-bones untangled LP suffers from the following potential problems.

  1. Order mismatches: Things like inline functions and global variables may need to be defined before use. So, code order tends to deviate from narrative order to some degree.

  2. Reduced locality: Our main tool to sync code and narrative is factoring out small, meaningful functions, which is just good programming style anyway. However, this may be undesirable for reasons of performance or readability. In such a case, we might end up with a larger function. Now, if we have only a single docstring for it, then it can be non-obvious which part of the code a sentence in the docstring refers to because of their distance and the presence of other parts.

  3. No source code only view: Sometimes we want to see only the code. In classical LP, we can look at the tangled file. In untangled LP, editor support for hiding the narrative is the obvious solution.

  4. No generated documentation: There is no more tangling nor weaving, but we still need another tool to generate documentation. Crucially, generating documentation is not in the main programming loop.

In general, whether classical or untangled LP is better depends on the severity of the above issues in the particular programming environment.

The Lisp and PAX view

MGL-PAX, a Common Lisp untangled LP solution, aims to minimize the above problems and fill in the gaps left by dropping tangling.

  1. Order

    • Common Lisp is quite relaxed about the order of function definitions, but not so much about DEFMACRO, DEFVAR, DEFPARAMETER, DEFCONSTANT, DEFTYPE, DEFCLASS, DEFSTRUCT, DEFINE-COMPILER-MACRO, SET-MACRO-CHARACTER, SET-DISPATCH-MACRO-CHARACTER, DEFPACKAGE. However, code order can for the most part follow narrative order. In practice, we end up with some DEFVARs far from their parent DEFSECTIONs (but DECLAIM SPECIAL helps).

    • DEFSECTION controls documentation order. The references to Lisp definitions in DEFSECTION determine narrative order independently from the code order. This allows the few ordering problems to be patched over in the generated documentation.

    • Furthermore, because DEFSECTION can handle the exporting of symbols, we can declare the public interface piecemeal, right next to the relevant definitions, rather than in a monolithic DEFPACKAGE.

  2. Locality

    • Lisp macros replace chunks in the rare, complex cases where a chunk is not a straightforward text substitution but takes parameters. Unlike text-based LP chunks, macros must operate on valid syntax trees (S-expressions), so they cannot be used to inject arbitrary text fragments (e.g. an unclosed parenthesis).

      This constraint forces us to organize code into meaningful, syntactic units rather than arbitrary textual fragments, which results in more robust code. Within these units, macros allow us to reshape the syntax tree directly, handling scoping properly where text interpolation would fail.

    • PAX's NOTE is an extractable, named comment. NOTE can interleave with code within e.g. functions to minimize the distance between the logic and its documentation.

    • Also, PAX hooks into the development environment to provide easy navigation in the documentation tree.

  3. Source code only view: PAX supports hiding verbose documentation (sections, docstrings, comments) in the editor.

  4. Generating documentation

    • PAX extracts docstrings, NOTEs and combines them with narrative glue in DEFSECTIONs.

    • Documentation can be generated as static HTML/PDF files for offline reading or browsed live (in an Emacs buffer or via an in-built web server) during development.

    • LaTeX math is supported in both PDF and HTML (via MathJax, whether live or offline).

In summary, PAX accepts a minimal deviation in code/narrative order but retains the original, interactive Lisp environment (e.g. SLIME/Sly), through which it offers optional convenience features like extended navigation, live browsing, and hiding documentation in code. In return, we give up easy fine-grained control over typesetting the documentation - a price well worth paying in Common Lisp.

02 Feb 2026 12:00am GMT

01 Feb 2026

Planet Lisp

Joe Marshall: Some Libraries

Zach Beane has released the latest Quicklisp beta (January 2026), and I am pleased to have contributed to this release. Here are the highlights:

Selected Functions

Dual numbers

DERIVATIVE function → function

Returns a new unary function that computes the exact derivative of the given function at any point x.

The returned function utilizes Dual Number arithmetic to perform automatic differentiation. It evaluates f(x + ε), where ε is the dual unit (an infinitesimal such that ε² = 0). The result is extracted from the infinitesimal part of the computation.

f(x + ε) = f(x) + f'(x)ε

This method avoids the precision errors of numerical approximation (finite difference) and the complexity of symbolic differentiation. It works for any function composed of standard arithmetic operations and elementary functions supported by the dual-numbers library (e.g., sin, exp, log).

Example

(defun square (x) (* x x))

(let ((df (derivative #'square)))
  (funcall df 5)) 
;; => 10
    

Implementation Note

The implementation relies on the generic-arithmetic system to ensure that mathematical operations within the function can accept and return dual-number instances seamlessly.
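For readers more comfortable with Python, here is a minimal sketch of the same dual-number trick (illustrative only; this is not the dual-numbers library's API, and only the operations needed for the example are defined):

```python
class Dual:
    """Dual number a + b*eps, where eps**2 == 0."""

    def __init__(self, real, infinitesimal=0.0):
        self.real = real
        self.infinitesimal = infinitesimal

    def _coerce(self, other):
        return other if isinstance(other, Dual) else Dual(other)

    def __add__(self, other):
        other = self._coerce(other)
        return Dual(self.real + other.real,
                    self.infinitesimal + other.infinitesimal)
    __radd__ = __add__

    def __mul__(self, other):
        other = self._coerce(other)
        # (a + b*eps)(c + d*eps) = ac + (ad + bc)*eps, since eps**2 == 0
        return Dual(self.real * other.real,
                    self.real * other.infinitesimal
                    + self.infinitesimal * other.real)
    __rmul__ = __mul__


def derivative(f):
    """Return a function computing f'(x) exactly, by evaluating f(x + eps)."""
    return lambda x: f(Dual(x, 1.0)).infinitesimal


def square(x):
    return x * x

print(derivative(square)(5))  # 10.0
```

The extracted infinitesimal part is exact (no finite-difference step size involved), which is precisely the point of the technique.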

Function

BINARY-COMPOSE-LEFT binary-fn unary-fn → function
BINARY-COMPOSE-RIGHT binary-fn unary-fn → function

Composes a binary function B(x, y) with a unary function U(z) applied to one of its arguments.

(binary-compose-left B U)(x, y) ≡ B(U(x), y)
(binary-compose-right B U)(x, y) ≡ B(x, U(y))

These combinators are essential for "lifting" unary operations into binary contexts, such as when folding a sequence where elements need preprocessing before aggregation.

Example

;; Summing the squares of a list
(fold-left (binary-compose-right #'+ #'square) 0 '(1 2 3))
;; => 14  ; (+ (+ (+ 0 (sq 1)) (sq 2)) (sq 3))
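A rough Python analogue of the two combinators, exercised with functools.reduce (illustrative only, not part of the library):

```python
from functools import reduce


def binary_compose_left(binary_fn, unary_fn):
    # (binary_compose_left(B, U))(x, y) == B(U(x), y)
    return lambda x, y: binary_fn(unary_fn(x), y)


def binary_compose_right(binary_fn, unary_fn):
    # (binary_compose_right(B, U))(x, y) == B(x, U(y))
    return lambda x, y: binary_fn(x, unary_fn(y))


def square(z):
    return z * z

# Sum of squares: reduce is a left fold whose function receives
# (accumulator, element), so the sequence element goes on the right.
print(reduce(binary_compose_right(lambda a, b: a + b, square), [1, 2, 3], 0))  # 14
```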
    

FOLD

FOLD-LEFT function initial-value sequence → result

Iterates over sequence, calling function with the current accumulator and the next element. The accumulator is initialized to initial-value.

This is a left-associative reduction. The function is applied as:

(f ... (f (f initial-value x0) x1) ... xn)

Unlike CL:REDUCE, the argument order for function is strictly defined: the first argument is always the accumulator, and the second argument is always the element from the sequence. This explicit ordering eliminates ambiguity and aligns with the functional programming convention found in Scheme and ML.

Arguments

  • function: A binary function taking (accumulator, element).
  • initial-value: The starting value of the accumulator.
  • sequence: A list or vector to traverse.

Example

(fold-left (lambda (acc x) (cons x acc))
           nil
           '(1 2 3))
;; => (3 2 1)  ; Effectively reverses the list
    

Named Let

LET bindings &body body → result
LET name bindings &body body → result

Provides the functionality of the "Named Let" construct, commonly found in Scheme. This allows for the definition of recursive loops within a local scope without the verbosity of LABELS.

The macro binds the variables defined in bindings as in a standard let, but also binds name to a local function that can be called recursively with new values for those variables.

(let name ((var val) ...) ... (name new-val ...) ...)

This effectively turns recursion into a concise, iterative structure. It is the idiomatic functional alternative to imperative loop constructs.

While commonly used for tail recursive loops, the function bound by named let is a first-class procedure that can be called anywhere or used as a value.

Example

;; Standard Countdown Loop
(let recur ((n 10))
  (if (zerop n)
      'blastoff
      (progn
        (print n)
        (recur (1- n)))))
    

Implementation Note

The named-let library overloads the standard CL:LET macro to support this syntax directly if the first argument is a symbol. This allows users to use let uniformly for both simple bindings and recursive loops.
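A rough Python analogue of the named-let pattern, using a local recursive function (illustrative only; the names are made up):

```python
def blastoff():
    # The "name" (recur) is a local function bound alongside the loop
    # variables' initial values, mirroring (let recur ((n 3) (seen '())) ...).
    def recur(n, seen):
        if n == 0:
            return seen + ["blastoff"]
        return recur(n - 1, seen + [n])
    return recur(3, [])

print(blastoff())  # [3, 2, 1, 'blastoff']
```

As in Scheme, recur is an ordinary first-class function; Python simply lacks the tail-call optimization that makes the Lisp version loop in constant space.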

01 Feb 2026 11:15pm GMT

29 Jan 2026

FOSDEM 2026

Join the FOSDEM Treasure Hunt!

Are you ready for another challenge? We're excited to host the second yearly edition of our treasure hunt at FOSDEM! Participants must solve five sequential challenges to uncover the final answer. Update: the treasure hunt has been successfully solved by multiple participants, and the main prizes have now been claimed. But the fun doesn't stop here. If you still manage to find the correct final answer and go to Infodesk K, you will receive a small consolation prize as a reward for your effort. If you're still looking for a challenge, the 2025 treasure hunt is still unsolved, so …

29 Jan 2026 11:00pm GMT

26 Jan 2026

FOSDEM 2026

Guided sightseeing tours

If your non-geek partner and/or kids are joining you to FOSDEM, they may be interested in spending some time exploring Brussels while you attend the conference. Like previous years, FOSDEM is organising sightseeing tours.

26 Jan 2026 11:00pm GMT

Call for volunteers

With FOSDEM just a few days away, it is time for us to enlist your help. Every year, an enthusiastic band of volunteers make FOSDEM happen and make it a fun and safe place for all our attendees. We could not do this without you. This year we again need as many hands as possible, especially for heralding during the conference, during the buildup (starting Friday at noon) and teardown (Sunday evening). No need to worry about missing lunch at the weekend, food will be provided. Would you like to be part of the team that makes FOSDEM tick?

26 Jan 2026 11:00pm GMT