19 Apr 2026

Planet Grep

Jeroen De Dauw: Boot Your Productivity With AI

My productivity has increased by at least 300% with AI assistance. You can get amazing results nowadays, if you use the tools right. In this post, discover the 4 key ingredients that make the tools work for you.

Many people have only tried free AI via ChatGPT or similar web chatbots. It's easy to dismiss those tools, since they lack all 4 ingredients.

1. Context. If I ask you how I could improve my business, you won't be able to provide a good answer. You don't know all the details about my business that matter. All you can do is offer generic advice or make guesses (hallucinations). It's the same with AI tools. Don't rely on the knowledge baked into the LLM base models. Either provide this knowledge, or provide the tools to obtain that knowledge. You can include "all relevant knowledge" in the prompt, but this is labor-intensive. This is why you want an agentic tool.

2. Agentic tools. I've been using Claude Code, a CLI tool that provides agentic AI for knowledge tasks (not just coding). There is also Claude Cowork, a desktop tool, and alternatives from vendors besides Anthropic. These tools use a loop in which the AI determines whether it needs more information and then goes looking for it. You can give these tools a task or a question, and then they will, if called for, run hundreds of searches and commands. They can look at your documents, codebases, and web resources. Tell these tools "Fix GitHub issue $link", and they'll look at the issue and anything referenced on it, examine your codebase, make changes, run tests, make more changes, check results via the browser, fix some final issues, create a draft pull request, and provide you with a summary of what was done and possible next steps.
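The loop these tools run can be sketched in a few lines of Python. This is a toy illustration only, not any vendor's actual implementation; `fake_model` and the `grep` tool below are hypothetical stand-ins for a real LLM call and real tools:

```python
def run_agent(task, model, tools, max_steps=10):
    """Minimal agentic loop: the model inspects the context and either
    answers or requests a tool call; tool output is fed back in."""
    context = [task]
    for _ in range(max_steps):
        action = model(context)  # the model decides the next step
        if action["type"] == "answer":
            return action["text"]
        # run the requested tool (search, file read, shell command, ...)
        result = tools[action["tool"]](action["input"])
        context.append(result)
    return None  # gave up after max_steps

# Toy stand-ins: a "model" that searches once, then answers.
def fake_model(context):
    if any(c.startswith("grep:") for c in context):
        return {"type": "answer", "text": "found it"}
    return {"type": "tool", "tool": "grep", "input": "TODO"}

print(run_agent("fix the bug", fake_model, {"grep": lambda q: f"grep: {q}"}))
```

Real tools add permission prompts, context management, and parallel tool calls, but the core shape is this decide-act-observe loop.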

3. Feedback harness. When writing code, you often don't get everything correct the first time. Which is why automated tests are great. More generally, fast feedback loops are great, regardless of whether you're doing software development. For software development, you'll get much better results if the AI tools can actually run the code and run tests and other CI tools to verify everything is correct.

4. Model. AI capabilities are increasing at an incredible pace. If you're using the latest models, your experience will be worlds apart from those using 2-year-old models. For maximum quality, there are 3 metrics to max out: model size/capability, model version, and effort parameters. In other words, use the latest version of the biggest model with "max effort". At the time of writing this post, that is Claude Opus 4.7 with max-effort when using Anthropic, or GPT-5.4 Pro with heavy thinking when using OpenAI. These settings eat tokens, so you will quickly run into the subscription limits of the basic tiers. Then again, paying 200 USD a month for the higher tiers so you can 5x your productivity is quite the bargain.

Those 4 points provide a conceptual framework. There is more to learn, and the AI space is evolving quickly. Ask your favorite AI tool how you can improve your AI workflows, starting from this post, to get specifics.

Some more tips:

You can stay up to speed with AI capabilities development via Don't Worry About the Vase and Astral Codex Ten, both of which I highly recommend.

Shameless plug: my company provides an AI Assistant for MediaWiki, giving you AI capabilities on top of collaborative knowledge management, ideal for organizations.

The post Boot Your Productivity With AI appeared first on Blog of Jeroen De Dauw.

19 Apr 2026 8:57am GMT

Frank Goossens: The new Harstad, "Onder de kasseien, het strand", is almost here, but not quite yet

I'm not much for lists, but if, under threat of torture, I had to give a top 3 of books, "Max, Mischa & the Tet Offensive" would certainly be in it. A new novel by author Johan Harstad already appeared in Norwegian in 2024, under the title "Under brosteinen, stranden!", and according to usually well-informed sources (I emailed the publisher), in the autumn of 2026 the…

Source

19 Apr 2026 8:57am GMT

Dries Buytaert: What does 'Buy European' even mean?

This post was co-authored with Nicholas Gates, senior policy advisor at OpenForum Europe. It was originally published on EUobserver, an independent online newspaper widely read by EU policymakers, journalists and advocacy groups. The article summarizes a series of posts I've been writing about digital sovereignty.

European digital assets have a habit of not staying European - a problem current discussions about sovereignty are overlooking.

For example, Skype had Swedish and Danish founders, Estonian engineers, a Luxembourg headquarters, and proprietary code.

Every sovereignty credential was correct on the day it would have been assessed - and meaningless after eBay acquired it, Microsoft bought it, and eventually shut it down in 2025.

This speaks to a core tension at the heart of Europe's digital sovereignty moment. The real story has to do with licensing, dependencies, and supply chains more than it has to do with ownership or operational control - both of which can (and often do) change in Europe.

The current conception of cloud sovereignty asks the right questions about where data is stored, where companies are headquartered, and whether supply chains are European.

What these frameworks don't yet ask is whether the sovereignty they assess is durable and resilient - for example, whether it will survive a change of ownership, a corporate acquisition, or a disruption in the infrastructure the software depends on.

The European Commission's Cloud Sovereignty Framework provides a non-legislative assessment tool designed to evaluate the digital independence of cloud services in Europe.

It enables public authorities to rank services based on factors such as immunity from non-EU laws, operational control, and data protection.

The forthcoming Cloud and AI Development Act (CAIDA) - expected at the end of May - will possibly go further.

That said, while both are serious and welcome efforts, they are likely to solve only part of the problem.

'Buy European' is a fragile concept

Europe's 'Buy European' strategy is being built on two fragile foundations it hasn't yet explicitly addressed, and this could have disastrous implications in the cloud domain in particular.

Proprietary software with a perfect sovereignty score today is one acquisition away from a different answer tomorrow. Open Source software means the question doesn't arise.

The legal right to fork changes the power dynamic entirely: it gives you leverage, lets a community step in, and means the technology cannot be held hostage.

This is the distinction the Cloud Sovereignty Framework currently misses.

When Oracle acquired Sun Microsystems in 2010, governments running MySQL faced an immediate question: what happens to this software now?

The answer turned on one thing - the licence. Because MySQL was GPL-licensed, the right to fork and maintain it independently was already being exercised before the acquisition even completed.

MySQL's creator, Monty Widenius, forked it in 2009 precisely because he saw the acquisition coming - that fork exists today as MariaDB. The licence didn't prevent Oracle from buying Sun. It meant the acquisition couldn't end the software, and anyone paying attention could act on that right before any harm materialised.

Getting the licence right is necessary, but it is not sufficient.

In 2024, a conflict between WordPress co-founder Matt Mullenweg and WP Engine disrupted updates for millions of websites.

The code was Open Source. The delivery infrastructure had a single point of control. Most programming-language ecosystems rely on a single central package registry, and most of those registries are controlled by US companies.

In 2019, GitHub restricted access for developers in sanctioned countries; since GitHub also owns npm, the JavaScript ecosystem's delivery infrastructure became subject to the same trade controls. These aren't interchangeable download sites you can swap out.

Sovereign software on fragile infrastructure is not sovereign. It is software waiting for a supply chain to break.

Both fragility problems point to the same conclusion: a 'Buy European' label is not a sovereignty guarantee unless it embraces licensing as a tool and helps to safeguard the supply chains the software depends on.

Consider two scenarios. A government running proprietary software on a European cloud has jurisdiction, but no exit if the provider is acquired - replacing the software could take years.

A government running Open Source software on Amazon Web Services (AWS) in Europe can move the same software to a European provider whenever it wants. Neither is ideal, but they are not equal.

Europe's sovereignty frameworks need to internalise this asymmetry. Structural sovereignty - the kind that survives change - requires open foundations that flow from licensing through the critical supply chains on which that software depends.

A call-to-action for the Cloud and AI Development Act

CAIDA should not repeat the mistakes of the Cloud Sovereignty Framework. Simply extending a 'Buy European' checklist would be a mistake; the legislation should instead define what makes sovereignty durable.

Two concrete steps would make an immediate difference.

First, it should make Open Source licensing a pass/fail gate for mission-critical procurement under the Cloud Sovereignty Framework - a condition of eligibility at the highest assurance levels, not a weighted factor in a composite score.

Second, it should require supply chain resilience assessments that distinguish between dependencies switchable in weeks and those that would take an entire language community years to replicate, with federated or mirrored European alternatives required where no fallback exists.

Yes, requiring Open Source for mission-critical systems narrows the field in the short term.

But the providers you lose are the ones whose sovereignty credentials don't survive change.

In the longer term, these requirements push European companies toward Open Source software - technology that no one can take away.

19 Apr 2026 8:57am GMT

Planet Debian

Russ Allbery: Review: Collision Course

Review: Collision Course, by Michelle Diener

Series: Class 5 #6
Publisher: Eclipse
Copyright: November 2024
ISBN: 1-7637844-0-1
Format: Kindle
Pages: 289

Collision Course is the sixth novel in the Class 5 science fiction series and the first that doesn't use the Dark X naming convention. There are lots of spoilers in this story for the earlier books, but you don't have to remember all the details of previous events. Like the novella, Dark Ambitions, this novel returns to Rose, Sazo, and Dav instead of introducing another Earth woman and Class 5 ship.

In Dark Class, Ellie discovered an interesting artifact of a previously-unknown space-faring civilization. Rose, Sazo, and Dav are on their way to make first contact when, during a routine shuttle flight between the Class 5 and Dav's Grih military ship, Rose is abducted. The aliens they came to contact have an aggressive, leverage-based negotiating strategy. They're also in the middle of a complicated war with more sides than are readily apparent.

What I liked most about Dark Horse, the first book of this series and our introduction to Rose, was the revealed ethical system and a tense plot that hinged primarily on establishing mutual trust when there were excellent reasons for the characters to not trust each other. As the series has continued, I think the plots have become more complicated but the ethical dilemmas and revealing moments of culture shock have become less common. That is certainly true of Collision Course; this is science fiction as thriller, with a complex factional conflict, a lot of events, more plot reversals than the earlier books, but also less ethics and philosophy.

I'm not sure if this is a complaint. I kind of miss the ethics and philosophy, but Diener also hasn't had much new to say for the past few books. The plot of Collision Course is quite satisfyingly twisty for a popcorn-style science fiction series. I was kept guessing about the merits of some of the factions quite late into the book, although admittedly I was in the mood for light entertainment and was not trying too hard to figure out where the book was going. I did read nearly the entire book in one sitting and stayed up until 2am to finish it, which is a solid indication that something Diener was doing worked.

I do have quibbles, though. One is that the ending is a bit unsatisfying. Like Sazo, I was getting quite annoyed at the people capturing (and recapturing) Rose and would have enjoyed somewhat more decisive consequences. Also, and here I have to be vague to avoid spoilers, I was expecting a bit more of a redemption arc for one of the players in the multi-sided conflict. The ending I did get was believable but rather sad, and I wish Diener had either chosen a different outcome (this is light happily-ever-after science fiction, after all) or wrestled more directly with the implications. There were a bit too many "wait, one more thing" ending reversals and not quite enough emotional payoff for me.

The other quibble is that Collision Course was a bit too damsel in distress for this series. Rose is pregnant, which Diener uses throughout the book as a way to raise the stakes of the plot and also make Rose more annoyed but also less capable than she was in her earlier novel. Both Sazo and Dav are in full heroic rescue mode, and while Diener still ensures Rose is primarily responsible for her own fate, there is some "military men attempt to protect the vulnerable woman" here. One of the things I like about this series is that it does not use that plot, so while the balance between Rose rescuing herself and other people rescuing her is still tilted towards Rose, I would have liked this book more if Rose were in firmer control of events.

I will mostly ignore the fact that a human and a Grih sexually reproducing makes little to no biological sense, since Star Trek did similar things routinely and it's an established genre trope. But I admit that it still annoys me a bit that the alien hunk is essentially human except that he's obsessed with Rose's singing and has pointy ears. Diener cares about Rose's pregnancy a lot more than I did, which added to my mild grumpiness at how often it came up.

Overall, this was fine. I prefer a bit more of a protagonist discovering how powerful she is by making ingenious use of the ethical dilemmas her captors have trapped themselves in, and a bit less of Rose untangling a complicated political situation by getting abducted by every player serially, but it still kept the pages turning. Any book that is sufficiently engrossing for me to read straight through is working at some level. Collision Course was highly readable, undemanding, and distracting, which is what I was looking for when I read it. I would put it about middle of pack in the series. If Rose's pregnancy is more interesting to you than it was to me, that might push it a bit higher.

If you have gotten this far in the series, you will probably enjoy this, although it does feel like Diener is running out of new things to say about this universe. That's unfortunate given the number of threads about AI sentience and rights that could still be followed, but I think tracing them properly would require more philosophical meat than Diener intends for these books. Which is why the next book I grabbed was a Culture novel.

Currently this is the final book in the Class 5 series, but there is no inherent reason why Diener couldn't write more of them.

Rating: 7 out of 10

19 Apr 2026 4:52am GMT

18 Apr 2026

Planet Debian

Charles Plessy: Thanks Branchable!

I was hosted for a long time, free of charge, on https://www.branchable.com/ by Joey and Lars. Branchable and Ikiwiki were wonderful ideas that never took off as much as they deserved. To avoid being a burden now that Branchable is nearing its end, I migrated to a VPS at Sakura.

However, I have not left Ikiwiki. I only use it as a site engine, but I haven't found any equivalent that gives me native Git integration, wiki syntax for a personal site, the creativity of its directives (you can do anything with inline and pagespec), and its multilingual support through the po plugin.

Joey and Lars, thank you for everything!

18 Apr 2026 1:37pm GMT

Matthias Klumpp: Hello old new “Projects” directory!

If you have recently installed a very up-to-date Linux distribution with a desktop environment, or upgraded your system on a rolling-release distribution, you might have noticed that your home directory has a new folder: "Projects"

Why?

With the recent 0.20 release of xdg-user-dirs we enabled the "Projects" directory by default. Support for this has already existed since 2007, but was never formally enabled. This closes a more than 11 year old bug report that asked for this feature.

The purpose of the Projects directory is to give applications a default location to place project files that do not cleanly belong in one of the existing categories (Documents, Music, Pictures, Videos). Examples are software engineering projects, scientific projects, 3D printing projects, CAD designs, or even things like video editing projects, where the project files would end up in the "Projects" directory, with the output video being more at home in "Videos".

By enabling this by default, and subsequently, in the coming months, adding support to GLib, Flatpak, desktops, and applications that want to make use of it, we hope to give applications that operate in a "project-centric" manner with mixed media a better default storage location. As of now, those tools either default to the home directory or clutter the "Documents" folder, neither of which is ideal. It also gives users a default organization structure, hopefully leading to less clutter overall and better storage layouts.

This sucks, I don't like it!

As usual, you are in control and can modify your system's behavior. If you do not like the "Projects" folder, simply delete it! The xdg-user-dirs utility will not try to create it again, and instead adjust the default location for this directory to your home directory. If you want more control, you can influence exactly what goes where by editing your ~/.config/user-dirs.dirs configuration file.

If you are a system administrator or distribution vendor and want to set default locations for the default XDG directories, you can edit the /etc/xdg/user-dirs.defaults file to set global defaults that affect all users on the system (users can still adjust the settings however they like though).
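For illustration, a per-user override might look like the following sketch. The exact key name for the new directory is an assumption here (check the user-dirs.dirs documentation shipped with your xdg-user-dirs version); the file format itself is the standard shell-style assignment syntax xdg-user-dirs reads:

```shell
# ~/.config/user-dirs.dirs - read by xdg-user-dirs.
# Values must be absolute paths or start with $HOME.
XDG_PROJECTS_DIR="$HOME/Projects"       # assumed key name for the new directory
XDG_DOCUMENTS_DIR="$HOME/Documents"
# Pointing a directory at $HOME effectively disables it:
# XDG_PROJECTS_DIR="$HOME/"
```

After editing, running xdg-user-dirs-update applies the configuration without recreating directories you have deleted.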

What else is new?

Besides this change, the 0.20 release of xdg-user-dirs brings full support for the Meson build system (dropping Automake), translation updates, and some robustness improvements to its code. We also fixed the "arbitrary code execution from unsanitized input" bug that the Arch Linux Wiki mentions here for the xdg-user-dirs utility, by replacing the shell script with a C binary.

Thanks to everyone who contributed to this release!

18 Apr 2026 8:06am GMT

16 Apr 2026

Planet Lisp

Tim Bradshaw: Structures of arrays

Or, second system.

A while ago, I decided that I'd like to test my intuition that Lisp (specifically implementations of Common Lisp) was not, in fact, bad at floating-point code and that the ease of designing languages in Lisp could make traditional Fortran-style array-bashing numerical code pretty pleasant to write.

I used an intentionally naïve numerical solution to a gravitating many-body system as a benchmark, so I could easily compare Lisp & C versions. The brief result is that the Lisp code is a little slower than C, but not much: Lisp is not, in fact, slow. Who knew?

The point here though, is that I wanted to dress up the array-bashing code so it looked a lot more structured. To do this I wrote a macro which hid what was in fact an array of (for instance) double floats behind a bunch of syntax which made it look like an array of structures. That macro took a couple of hours.

This was fine and pretty simple, but it only dealt with a single type for each conceptual array of objects, there was no inheritance and it was restricted in various other ways. In particular it really was syntactic sugar on a vector: there was no distinct implementational type at all. So I thought well, I could make it more general and nicer.

Big mistake.

The second system

Here is an example of what I wanted to be able to do (this is in fact the current syntax):

(define-soa-class example ()
  ((x :array t :type double-float)
   (y :array t :type double-float)
   (p :array t :type double-float :group pq)
   (q :array t :type double-float :group pq)
   (r :array t :type fixnum)
   (s)))

This defines a class, instances of which have five array slots and one scalar slot. Of the array slots: x and y are stored together in one shared double-float array; p and q share a second double-float array, because they are explicitly placed in the group pq; and r gets its own fixnum array.

The implementation will tell you this:

> (describe (make-instance 'example :dimensions '(2 2)))
#<example 8010059EEB> is an example
[...]
dimensions      (2 2)
total-size      4
rank            2
tick            1
its class example has a valid layout
it has 3 arrays:
 index 0, element type double-float, 2 slots
 index 1, element type (signed-byte 64), 1 slot
 index 2, element type double-float, 2 slots
it has 5 array slots:
 name x, index 0 offset 0
 name y, index 0 offset 1
 name r, index 1 offset 0
 name p, index 2 offset 0
 name q, index 2 offset 1

This is already too complicated: the ability to control sharing via groups is almost certainly never going to be useful: it's only even there because I thought of it quite early on and never removed it.

The class definition macro then needs to arrange life so that enough information is available for a macro which turns indexed slot access into indexed array access on the underlying arrays secretly stored in instances, inserting declarations to make this as fast as possible: anything slower than explicit array access is not acceptable. This might (and does) look like this, for example:

(with-array-slots (x y) (thing example)
  (for* ((i ...) (j ...))
    (setf (x i j) (- (y i j) (y j i)))))

As you can see from this, the resulting objects should be allowed to have rank other than 1. Inheritance should also work, including for array slots. Redefinition should be supported and obsolete macro expansions and instances at least detected.

In other words there are exactly two things I should have aimed at achieving: the ability to define fields of various types and have them grouped into (generally fewer) underlying arrays, and an implementational type to hold these things. Everything else was just unnecessary baggage which made the implementation much more complicated than it needed to be.

I had not finished making mistakes. The system needs to store some metadata about how slots map onto the underlying arrays, element types and so on, so the macro can use this to compile efficient code. There are two obvious ways to do this: use the property list of the class name, or subclass standard-class and store the metadata in the class. The first approach is simple, portable, has clear semantics, but it's 'hacky'; the second is more complicated, not portable, has unclear semantics1, but it's The Right Thing2. Another wrong decision I made without even trying.

The only thing that saved me was that the nature of software is that you can only make a finite number of bad decisions in a finite time.

More bad decisions

I was not done. Early on, I thought that, well, I could make this whole thing be a shim around defstruct: single inheritance was more than enough, and obviously I could store metadata on the property list of the type name as described above. And there's no nausea with multiple accessors or any of that nonsense.

But, somehow, I found writing a thing which would process the (structure-name ...) case of defstruct too painful, so I decided to go for the shim-around-defclass version instead. I even have a partly-complete version of the defstructy code which I abandoned. Another mistake.

I also decided that The Right Thing was to have the system support objects of rank 0. That constrains the underlying array representation (it needs to use rank \(n+1\) arrays for an object of rank \(n\)) in a way which I thought for a long time might limit performance.

Things I already knew

At any point during the implementation of this I could have told you that it was too general and the implementation was going to be too complicated for no real gain. I don't know why I made so many bad choices.

The whole process took weeks and I nearly just gave up several times.

The light at the end of the tunnel

Or: all-up testing.

Eventually, I had a thing I thought might work. The macro syntax was a bit ugly (that macro still exists, with a different name) but it seemed to work. But since the whole purpose of the thing was performance, that needed to be checked. I wasn't optimistic.

What I did was to write a version of my naïve gravitational many-body system using the new code, based closely on the previous one. The function that updates the state of the particles looks like this:

(defun/quickly step-pvs (source destination from below dt G &aux
                                (n (particle-vector-length source)))
  ;; Step a source particle vector into a destination one.
  ;;
  ;; Operation count:
  ;;  3
  ;;  + (below - from) * (n - 1) * (3 + 8 + 9)
  ;;  + (below - from) * (12 + 6)
  ;;  = (below - from) * (20 * (n - 1) + 18) + 3
  (declare (type particle-vector source destination)
           (type vector-index from)
           (type vector-dimension below)
           (type fpv dt G)
           (type vector-dimension n))
  (when (eq source destination)
    (error "botch"))
  (let*/fpv ((Gdt (* G dt))
             (Gdt^2/2 (/ (* Gdt dt) (fpv 2.0))))
    (binding-array-slots (((source particle-vector :check nil :rank 1 :suffix _s)
                           m x y z vx vy vz)
                          ((destination particle-vector :check nil :rank 1 :suffix _d)
                           m x y z vx vy vz))
      (for ((i1 (in-naturals :initially from :bound below :fixnum t)))
        (let/fpv ((ax/G zero.fpv)
                  (ay/G zero.fpv)
                  (az/G zero.fpv)
                  (x1 (x_s i1))
                  (y1 (y_s i1))
                  (z1 (z_s i1))
                  (vx1 (vx_s i1))
                  (vy1 (vy_s i1))
                  (vz1 (vz_s i1)))
          (for ((i2 (in-naturals n t)))
            (when (= i1 i2) (next))
            (let/fpv ((m2 (m_s i2))
                      (x2 (x_s i2))
                      (y2 (y_s i2))
                      (z2 (z_s i2)))
              (let/fpv ((rx (- x2 x1))
                        (ry (- y2 y1))
                        (rz (- z2 z1)))
                (let/fpv ((r^3 (let* ((r^2 (+ (* rx rx) (* ry ry) (* rz rz)))
                                      (r (sqrt r^2)))
                                 (declare (type nonnegative-fpv r^2 r))
                                 (* r r r))))
                  (incf ax/G (/ (* rx m2) r^3))
                  (incf ay/G (/ (* ry m2) r^3))
                  (incf az/G (/ (* rz m2) r^3))))))
          (setf (x_d i1) (+ x1 (* vx1 dt) (* ax/G Gdt^2/2))
                (y_d i1) (+ y1 (* vy1 dt) (* ay/G Gdt^2/2))
                (z_d i1) (+ z1 (* vz1 dt) (* az/G Gdt^2/2)))
          (setf (vx_d i1) (+ vx1 (* ax/G Gdt))
                (vy_d i1) (+ vy1 (* ay/G Gdt))
                (vz_d i1) (+ vz1 (* az/G Gdt)))))))
  destination)

And it not only worked, the performance was very close to the previous version, straight out of the gate. The syntax is not as nice as that of the initial, quick-and-dirty version, but it is much more general, so I think that's worth it on the whole.

There have been problems since then: in particular the dependency on when classes get defined. It will never be as portable as I'd like because of the unnecessary MOP dependencies3, but it is usable and quick4.

Was it worth it? Maybe, but it should have been simpler.


  1. When exactly do classes get defined? Right.

  2. Nothing that uses the AMOP MOP is ever The Right Thing, because the whole thing was designed by people who were extremely smart, but still not as smart as they needed to be and thought they were. It's unclear if any MOP for CLOS can ever be satisfactory, in part because CLOS itself suffers from the same smart-but-not-smart-enough problem, to a large extent not helped by being dropped wholesale into CL at the last minute: by the time CL was standardised people had written large systems in it, but almost nobody had written anything significant using CLOS, let alone the AMOP MOP.

  3. A mistake I somehow managed to avoid was using the whole slot-definition mechanism the MOP wants you to use.

  4. I will make it available at some point.

16 Apr 2026 11:01am GMT

14 Apr 2026

Planet Lisp

Robert Smith: Not all elementary functions can be expressed with exp-minus-log

By Robert Smith

All Elementary Functions from a Single Operator is a paper by Andrzej Odrzywołek that has been making the rounds on the internet lately, being called everything from a "breakthrough" to "groundbreaking". Some are going as far as to suggest that the entire foundations of computer engineering and machine learning should be re-built as a result of this. The paper says that the function

$$ E(x,y) := \exp x - \log y $$

together with variables and the constant $1$, which we will call EML terms, are sufficient to express all elementary functions, and proceeds to give constructions for many constants and functions, from addition to $\pi$ to hyperbolic trigonometry.
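For instance, the exponential itself is immediately an EML term, since $\log 1 = 0$:

$$ E(x, 1) = \exp x - \log 1 = \exp x. $$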

I think the result is neat and thought-provoking. Odrzywołek is explicit about his definition of "elementary function". His Table 1 fixes "elementary" as 36 specific symbols, and under that definition his theorem is correct and clever, so long as we accept some of his modifications to the conventional $\log$ function and do arithmetic with infinities.

My concern is that the word "elementary" in the title carries a much broader meaning in standard mathematical usage. Odrzywołek recognizes this, saying little more than "[t]hat generality is not needed here" and that his work takes "the ordinary scientific-calculator point of view". He does not offer further commentary.

What is this more general setting, and does his claim still hold? In modern pure mathematics, dating back to the 19th century, the definition of "elementary function" has been well established. We'll get to a definition shortly, but to cut to the chase, the titular result does not hold in this setting. As such, in layman's terms, I do not consider the "Exp-Minus-Log" function to be the continuous analog of the Boolean NAND gate or the universal quantum CCNOT/CSWAP gates.

The rough TL;DR is this: Elementary functions typically include arbitrary polynomial root functions, and EML terms cannot express them. Below, I'll give a relatively technical argument that EML terms are not sufficient to express what I consider standard elementary functions.

To avoid any confusion, the purpose of this blog post is manifold:

  1. To elucidate what many mathematicians consider to be an "elementary function", which is the foundation for a variety of rich and interesting math (especially if you like computer science).
  2. To prove a result about EML terms using topological Galois theory.
  3. To demonstrate how this result may be used to show an elementary function not expressible by EML terms.

This blog post is not a refutation of Odrzywołek's work, though the title might be considered just as clickbait (and accurate) as his, depending on where you sit in the hall of mathematics and computation.

Disclaimer: I audited graduate-level mathematics courses almost 20 years ago, and I am not a professional mathematician. Please email me if my statements are clumsy or incorrect.

The 19th century is where all modern understanding of elementary functions was developed, Liouville being one of the big names with countless theorems of analysis and algebra named after him. One such result is about integration: do the outputs of integrals look the same as their inputs? Well, what does "input" and "look the same" mean? Liouville defined a class of functions called elementary functions, and said that the integral of an elementary function will sometimes be elementary, and when it is, it will always resemble the input in a specific way, plus potential extra logarithmic factors.

Since then, elementary functions have been defined by starting with rational functions and closing under arithmetic operations, composition, exponentiation, logarithms, and polynomial roots. While EML terms are quite expressive, they are unable to capture the "polynomial roots" in full generality. We will show this by using Khovanskii's topological Galois theory: the monodromy group of a function built from rational functions by composition with $\exp$ and $\log$ is solvable. For anyone who has studied Galois theory in an algebra course, this will be familiar, as the destination here is effectively the same, but with more powerful intermediate tooling to wrangle exponentials and logarithms.

First, let's be more precise about what we mean by an EML term and by a standard elementary function.

Definition (EML Term): An EML term in the variables $x_1,\dots,x_n$ is any expression obtained recursively, starting from $\{1, x_1,\dots,x_n\}$, by the rule $$ T,S \mapsto \exp T-\log S. $$ Each such term, evaluated at a point where all the $\log$ arguments are nonzero, determines an analytic germ; we take $\mathcal T_n$ to be the class of germs representable this way, together with their maximal analytic continuations.
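To get a feel for the single rule, here are a few germs it generates (a worked illustration of mine, not from the paper). With the principal branch, $\log 1 = 0$, so subtracting $\log 1$ is a no-op:

$$ \exp x - \log 1 = e^{x}, \qquad \exp 1 - \log 1 = e, \qquad \exp\bigl(\exp x - \log 1\bigr) - \log\bigl(\exp x - \log 1\bigr) = e^{e^{x}} - x, $$

where the last equality holds for germs at real $x$, since $\log e^{x} = x$ on the principal branch there.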

Definition (Standard Elementary Function): The standard elementary functions $\mathcal{E}_n$ are the smallest class of multivalued analytic functions on domains in $\mathbb{C}^n$ containing the rational functions and closed under the arithmetic operations, composition, $\exp$, $\log$, and algebraic adjunction: adjoining any branch of a root of a polynomial whose coefficients already lie in the class.
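For example (a standard pair of identities, included purely for illustration), these closure operations already reach the trigonometric functions and their inverses:

$$ \sin x = \frac{e^{ix} - e^{-ix}}{2i}, \qquad \arcsin x = -i \log\!\left(ix + \sqrt{1 - x^2}\right), $$

where the branch of $\sqrt{1-x^2}$ arises by algebraic adjunction of a root of $y^2 - (1 - x^2)$.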

What we will show is that the class of elementary functions defined this way is strictly larger than the class induced by EML terms.

Lemma: Every EML term has solvable monodromy group. In particular, if $f\in\mathcal T_n$ is algebraic over $\mathbb C(x_1,\dots,x_n)$, then its monodromy group is a finite solvable group.

Proof: We proceed by induction on the construction of the EML term. For the base case, constants and coordinate functions are single-valued, so their monodromy groups are trivial (and trivially solvable).

For the inductive step, suppose $f = \exp A-\log B$ with $A,B\in\mathcal T_n$, and assume that $\mathrm{Mon}(A)$ and $\mathrm{Mon}(B)$ are solvable. We argue in three steps.

Step 1: $\mathrm{Mon}(\exp A)$ is solvable. The germs of $\exp A$ are images under $\exp$ of the germs of $A$, with germs of $A$ differing by $2\pi i\mathbb Z$ collapsing to the same value. So there is a surjection $\mathrm{Mon}(A)\twoheadrightarrow\mathrm{Mon}(\exp A)$, and a quotient of a solvable group is solvable.

Step 2: $\mathrm{Mon}(\log B)$ is solvable. At a generic point $p$, germs of $\log B$ are parameterized by pairs $(b,k)$ where $b$ is a germ of $B$ at $p$ and $k\in\mathbb Z$ selects the branch of $\log$. A loop $\gamma$ acts by $$ (b,k)\mapsto\bigl(\rho_B(\gamma)(b), k+n(\gamma,b)\bigr), $$ where $\rho_B(\gamma)$ is the monodromy action of $\gamma$ on germs of $B$, and $n(\gamma,b)\in\mathbb Z$ is the winding number around $0$ of the analytic continuation of $b$ along $\gamma$. The projection $\mathrm{Mon}(\log B)\to\mathrm{Mon}(B)$ onto the first component is a surjective homomorphism. Its kernel consists of the elements of $\mathrm{Mon}(\log B)$ induced by loops $\gamma$ with $\rho_B(\gamma)=\mathrm{id}$, which then act only by integer shifts on the $k$-coordinate. Let $S_B$ be the set of germs of $B$ at $p$. For each $b\in S_B$, such a loop determines an integer shift $n(\gamma,b)$, so the kernel embeds in the direct product $\mathbb Z^{S_B}$. In particular, the kernel is abelian. Hence $\mathrm{Mon}(\log B)$ is an extension of $\mathrm{Mon}(B)$ by an abelian group, and extensions of solvable groups by abelian groups are solvable.

Step 3: $\mathrm{Mon}(f)$ is solvable. At a generic point, a germ of $f=\exp A-\log B$ is obtained by subtraction from a pair (germ of $\exp A$, germ of $\log B$), and analytic continuation acts componentwise on such pairs. This gives a surjection of $\pi_1$ onto some subgroup $$ H \le \mathrm{Mon}(\exp A)\times\mathrm{Mon}(\log B), $$ and, since $f$ is obtained from the pair by subtraction, this descends to a surjection $H\twoheadrightarrow\mathrm{Mon}(f)$. So $\mathrm{Mon}(f)$ is a quotient of a subgroup of a direct product of solvable groups, hence solvable.

The second statement of the lemma follows: an algebraic function has finitely many branches, so its monodromy group is finite, and by the argument above it is also solvable; hence it is a finite solvable group. ∎
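The group-theoretic backbone of this argument — $S_2$, $S_3$, $S_4$ are solvable while $S_5$ is not — can be sanity-checked with SymPy's permutation groups (an illustration of mine, not part of the proof):

```python
from sympy.combinatorics.named_groups import SymmetricGroup, AlternatingGroup

# Degrees 2-4: S_n is solvable, mirroring the classical fact that
# polynomials of degree <= 4 are solvable in radicals.
small_solvable = [SymmetricGroup(n).is_solvable for n in (2, 3, 4)]

# Degree 5: S_5 is not solvable; its derived series stalls at the
# non-abelian simple group A_5, which is also not solvable.
s5_solvable = SymmetricGroup(5).is_solvable
a5_solvable = AlternatingGroup(5).is_solvable

print(small_solvable, s5_solvable, a5_solvable)
```

The same `is_solvable` check applies to any finite permutation group, so it also covers the quotients and subgroups that appear in Steps 1–3.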

Remark. This is the core of Khovanskii's topological Galois theory; see Topological Galois Theory: Solvability and Unsolvability of Equations in Finite Terms.

Theorem: $\mathcal T_n \subsetneq \mathcal E_n$.

Proof: $\mathcal E_n$ is closed under algebraic adjunction, so any local branch of an algebraic function is elementary. In particular, a branch of a root of the generic quintic $$ f^5+a_1f^4+a_2f^3+a_3f^2+a_4f+a_5=0 $$ is elementary.

Suppose for contradiction that at some point $p$ a germ of a branch of this root agrees with a germ of an EML term $T$. By uniqueness of analytic continuation, the Riemann surfaces obtained by maximally continuing these two germs coincide, so in particular their monodromy groups coincide. The monodromy group of the generic quintic is $S_5$, which is not solvable. But by the lemma, the monodromy group of any EML term is solvable. Contradiction.

Hence $\mathcal T_n$ is a strict subset of $\mathcal E_n$. ∎
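To make the monodromy in the proof concrete, the sketch below numerically continues the five roots of $w^5 - w = t$ (a specific member of the generic family; my choice, as are the loop radius and step count) around a small loop encircling one branch point in the $t$-plane, and reads off the resulting branch permutation:

```python
import numpy as np

# Branch points of w^5 - w - t = 0 are the critical values t = w^5 - w
# at points where 5*w**4 - 1 = 0; take the real positive critical point.
w_c = (1 / 5) ** 0.25
t_c = w_c ** 5 - w_c  # roughly -0.535

def loop_permutation(radius=0.05, steps=2000):
    """Track the 5 roots along the loop t = t_c + radius * exp(i*theta)."""
    t0 = t_c + radius
    roots = np.roots([1, 0, 0, 0, -1, -t0])  # coefficients of w^5 - w - t0
    start = roots.copy()
    for theta in np.linspace(0.0, 2 * np.pi, steps)[1:]:
        t = t_c + radius * np.exp(1j * theta)
        new = list(np.roots([1, 0, 0, 0, -1, -t]))
        # Greedy nearest-neighbour matching: valid because each step moves
        # the roots far less than their mutual separation.
        roots = np.array([new.pop(min(range(len(new)),
                                      key=lambda j: abs(new[j] - r)))
                          for r in roots])
    # After a full loop, tracked root i sits on start root perm[i].
    return [int(np.argmin(np.abs(start - r))) for r in roots]

perm = loop_permutation()
print("monodromy permutation:", perm)
```

Around a simple branch point the local monodromy is a transposition, so this loop should exchange exactly two of the five branches; composing such loops around all the branch points generates the full monodromy group $S_5$.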

Edit (15 April 2026): This article used to have an example proving that the real and complex absolute value cannot be expressed over their entire domain as EML terms under the conventional definition of $\log$. I wrote it to emphasize that Odrzywołek's approach required mathematical "patching" in order to work as intended. However, it ended up more distracting than illuminating, and was tangential to the point about the definition of "elementary", so it has been removed.

14 Apr 2026 12:00am GMT

13 Apr 2026

feedPlanet Lisp

Scott L. Burson: FSet v2.4.2: CHAMP Bags, and v1.0 of my FSet book!

A couple of weeks ago I released FSet 2.4.0, which brought a CHAMP implementation of bags, filling out the suite of CHAMP types. 🚀 FSet users should have a look at the release page, as it also contained a number of bug fixes and minor changes.

I've since released v2.4.1 and v2.4.2, with some more bug fixes.

But the big news is the book! It brings together all the introductory material I have written, plus a lot more, along with a complete API Reference chapter.

FSet is now in the state I decided last summer I wanted to get it into: faster, better tested and debugged, more feature-complete, and much better documented than it has ever been in its nearly two decades of existence. I am, of course, very much hoping that these months of work have made the library more interesting and accessible to CL programmers who haven't tried it yet. I am even hoping that its existence helps attract newcomers to the CL community. Time will tell!

13 Apr 2026 6:21am GMT

29 Jan 2026

feedFOSDEM 2026

Join the FOSDEM Treasure Hunt!

Are you ready for another challenge? We're excited to host the second annual edition of our treasure hunt at FOSDEM! Participants must solve five sequential challenges to uncover the final answer. Update: the treasure hunt has been successfully solved by multiple participants, and the main prizes have now been claimed. But the fun doesn't stop here. If you still manage to find the correct final answer and go to Infodesk K, you will receive a small consolation prize as a reward for your effort. If you're still looking for a challenge, the 2025 treasure hunt is still unsolved, so…

29 Jan 2026 11:00pm GMT

26 Jan 2026

feedFOSDEM 2026

Guided sightseeing tours

If your non-geek partner and/or kids are joining you at FOSDEM, they may be interested in spending some time exploring Brussels while you attend the conference. As in previous years, FOSDEM is organising sightseeing tours.

26 Jan 2026 11:00pm GMT

Call for volunteers

With FOSDEM just a few days away, it is time for us to enlist your help. Every year, an enthusiastic band of volunteers makes FOSDEM happen and makes it a fun and safe place for all our attendees. We could not do this without you. This year we again need as many hands as possible, especially for heralding during the conference, during the buildup (starting Friday at noon) and teardown (Sunday evening). No need to worry about missing lunch at the weekend, food will be provided. Would you like to be part of the team that makes FOSDEM tick?…

26 Jan 2026 11:00pm GMT