18 Apr 2025

Planet Lisp

Joe Marshall: Stupid reader tricks

Here are some stupid reader tricks for Lisp. I've tested them on SBCL, and they are of questionable portability and utility.

Run Shell Commands from the Lisp Prompt

(set-macro-character #\! 
    (lambda (stream char)
      (declare (ignore char))
      (uiop:run-program (read-line stream) :output *standard-output*))
    t)

> !ls -als
total 4068
   4 drwxr-x--- 21 jrm  jrm     4096 Apr 18 06:42 .
   4 drwxr-xr-x  4 root root    4096 Mar 22 17:27 ..
1900 -rwx--x--x  1 jrm  jrm  1940604 Apr 17 19:10 .bash_history
   4 -rw-r--r--  1 jrm  jrm      220 Mar 19 12:16 .bash_logout
   8 -rw-r--r--  1 jrm  jrm     4961 Apr  1 11:13 .bashrc
   4 drwx------  6 jrm  jrm     4096 Mar 21 07:52 .cache
   0 lrwxrwxrwx  1 jrm  jrm       51 Mar 24 05:20 .config -> /mnt/c/Users/JosephMarshall/AppData/Roaming/.config
   0 lrwxrwxrwx  1 jrm  jrm       50 Mar 26 03:12 .emacs -> /mnt/c/Users/JosephMarshall/AppData/Roaming/.emacs
   4 drwx------  6 jrm  jrm     4096 Apr 17 12:13 .emacs.d
      ... etc ...

>

Make λ an alias for lambda

(set-macro-character #\λ
    (lambda (stream char)
      (declare (ignore stream char))
      'cl:lambda)
    t)

> ((λ (x) (+ x 4)) 3)
7

If you do this you might want to add a print function for the lambda symbol:

(defmethod print-object ((obj (eql 'cl:lambda)) stream) ;; doubt this is portable
  (format stream "λ"))

> '(λ (x) (+ x 4))
(λ (x) (+ x 4))

> (symbol-name (car *))
"LAMBDA"

18 Apr 2025 2:38pm GMT

Joe Marshall: DES Machine

The MIT CADR Lisp Machine had a number of static RAMs that were used in the processor for various things such as state machines and register files. The core parts of the LMI Lambda Lisp Machine were similar to the CADR (similar enough that they could run the same microcode) but technology had advanced such that the static RAM chips were typically double the size of the CADR's. The LMI Lambda thus had twice as many registers as the CADR, but because there weren't any extra bits in the instruction set, you couldn't address half of them. The extra address bit from the RAM was wired to a status register. So the LMI Lambda had, in effect, two banks of registers which you could swap between by toggling the bit in the status register. This was not normally used - the microcode would set the bit to zero and leave it there.

A friend of mine was interested in security and he had written a high performance version of the encryption algorithm used by Unix to encrypt passwords. He was experimenting with dictionary attacks against passwords and one bottleneck was the performance of the password encryption algorithm.

It occurred to me that I could store the DES S-boxes in the alternate register bank of the LMI Lambda. With some special microcode, I could turn an LMI Lambda into a DES machine that could churn through a dictionary attack at a high speed. I added a special Lisp primitive that would swap the register banks and then run several hundred rounds of the DES algorithm before swapping back and returning to Lisp. Then I wrote a Lisp program that would feed a dictionary into the DES primitives when the processor was idle.

I was able to discover a few passwords this way, but I was more interested in the proof of concept than the actual results. A microcoded DES machine would work, but you'd get better performance out of dedicated hardware.

18 Apr 2025 6:13am GMT

Thomas Fitzsimmons: Lisp ELF toolkit

I recently needed to generate an ELF binary with both RPATH and RUNPATH entries. I could not figure out how to produce this using linker command line arguments.

I was considering attempting a linker script, but first I switched to my Lisp REPL buffer[1] and found that (ql:quickload "elf") loaded a promising-looking Common Lisp ELF library.

I created a stub library with RPATH using gcc and an empty C file, then loaded it with (elf:read-elf).

With the SLIME inspector (M-x slime-inspect) I could traverse the structure of the ELF headers. I eventually found the RPATH entry.

In the REPL I built up a function to search for RPATH then push a new RUNPATH entry alongside it.

It turned out the ELF library had no support for the RUNPATH entry, so I redefined its dyn-tag dictionary to include it.

After adding RUNPATH, I wrote the modified ELF structures to a file using (elf:write-elf). The generated ELF file sufficed for the test case.

I thought this was an interesting use case to share, demonstrating unique properties of the Lisp environment. I published the result (I realize now I should have written generate-example-library.sh in Lisp instead of shell; oh well).

  1. Which I have been trying to keep open lately, inspired by this post.

18 Apr 2025 4:35am GMT

13 Apr 2025

Planet Lisp

Joe Marshall: Mea Culpa

OH NO! There's something wrong on the Internet!

It is often me. Mea culpa. Mea maxima culpa. I'm only human.

But this is a blog, not a reference manual. I use it to organize my thoughts, get things off my chest, provoke discussion, maybe entertain a few fellow hackers, and show others how much fun I'm having playing with computers. Accuracy and precision aren't the primary objectives although I try not to be egregiously incorrect.

Mostly I see people misinterpreting something I say casually. I gloss over some details that some find important but I consider boring. I make the first-order error of assuming everyone has the same background as I do and will interpret what I say in the way I had in mind. Clearly this isn't the case, yet I persist in thinking this way. Oh well.

I'll try to correct errors that are brought to my attention and will elaborate things in more detail if asked, but if my blog irritates you with its broad generalizations, inattention to detail, omission of specifics, and things that are just plain wrong, why are you reading it?

13 Apr 2025 3:35pm GMT

Joe Marshall: Emacs and Lisp

The first editor I learned how to use was TECO on a line printer. You'd print the line of code you were on, then you'd issue commands to move the cursor around. You tried to avoid printing the line because that would be wasting paper. So you'd move the cursor around blind until you thought you got it to where you wanted, and then you'd start inserting characters. When you thought you had it, you'd print out the edited line.

Another undergrad saw me struggling with this and asked why I wasn't using vi. I had never heard of vi and I was amazed that you could view the code on the screen and move your cursor around visually before going into insert mode and adding text. With vi I was orders of magnitude more productive than with TECO.

When I came to the 'tute in the early 80s, I found that computer accounts were only routinely issued to students taking computer science courses. I hadn't decided on a computer science major, so I didn't have an account. However the Student Information Processing Board would give out a Multics account to interested students, so I signed up for that. The Multics terminal was in the basement and it had a dial up connection with an acoustically coupled modem: two rubber cups that the handset would cradle in.

Everyone at the AI Lab used a home grown editor called emacs, and there was a Multics port. I abandoned vi and learned emacs. The paradigm was different, but I didn't see one as superior to the other. When I declared computer science as my major, I got an account at the AI Lab on the Decsystem 20 machine. The editor was TECO with the editor macros (emacs) package loaded.

When I took S&ICP, we had a lab with HP9836 "Chipmunks" running MIT Scheme. The Chipmunks were PCs, not time-shared, and each acted as its own Scheme machine, complete with an emacs clone for editing the code.

At the AI Lab, there were a couple of machines running ITS, the Incompatible Timesharing System. You could run Maclisp on them, but Maclisp was such a pig, using over a megabyte of RAM, that a couple of instances of Maclisp would bring the machine to its knees. The Lab had developed the Lisp Machine, a single user computer that would run ZetaLisp (the successor to Maclisp). In addition to Lisp, the Lisp machine ran the ZWEI editor. Zwei Was Eine Initially, and Eine Is Not Emacs, but rather an emacs clone written in Zetalisp.

ZWEI was integrated with the Lisp environment. You could insert Lisp objects in the ZWEI editor and their printed representation would appear in the edited text. The printed representation was mouse sensitive and had its own context menu.

If you weren't using a Lisp Machine, your options were Unipress Emacs and Gosling Emacs, which you could run on this new OS called "Unix".

Around this time (in the late 80s) a hacker named Stallman decided to write a new OS. His first task was to write an editor and he decided on a new version of Emacs written in its own Lisp dialect.

If you wanted to use Lisp, you interacted with it via Emacs.

These days, I use GNU Emacs and I load up the sly package. Sly is a slime fork and it turns GNU Emacs into an IDE for a Lisp running in another process. The interface gives you much of what you used to get when using ZWEI on the Lisp machine. You can evaluate subexpressions, macroexpand and compile programs, gather output, and run the Lisp debugger from within GNU emacs.

Emacs and Lisp co-evolved at MIT and Emacs has always been used as a front-end to Lisp. I've never gone back to vi, and when I've had to use it I've found it frustrating (but this is because I have forgotten everything about how to use it).

I understand that some people find Emacs impossible to use and have a strong preference for vi. Is the experience of hacking Lisp in vi any good?

13 Apr 2025 2:54pm GMT

Joe Marshall: Lisp Still Has No Parentheses — Doubling Down

I wrote that Lisp Programs Don't Have Parentheses. A reader is unconvinced. He asserts that "Lisp programs are source code representations, and they contain parentheses." Allow me to double down and engage in some more "stupid pointless wrong-headed pedanticism". But don't take my word for it; try it out on a Lisp interpreter.

Here is a list. It consists of three elements: the symbol lambda, a sublist containing the single symbol x, and a sublist containing the symbol +, the symbol x, and the number 3.

> (defvar *my-list*
    (list 'lambda
          (list 'x)
          (list '+ 'x 3)))
*MY-LIST*

;; First element is symbol LAMBDA
> (car *my-list*)
LAMBDA

;; Second element is a list of one item:
> (format t "Items: ~{~#[~;~a~;~a and ~a~:;~@{~a~#[~;, and ~:;, ~]~}~]~}" (second *my-list*))
Items: X

;; Third element is list of three items:
> (format t "Items: ~{~#[~;~a~;~a and ~a~:;~@{~a~#[~;, and ~:;, ~]~}~]~}" (third *my-list*))
Items: +, X, and 3

This list structure only contains symbols and numbers.

For comparison, here is a string:

> (defvar *my-string* "(lambda (x) (+ x 3))")
*MY-STRING*

> (char *my-string* 8)
#\(
  
> (position #\) *my-string*)
10

This string obviously contains parentheses.

Let's try calling eval and apply:

> (eval *my-list*)
#<FUNCTION (LAMBDA (X)) {B800D8AA6B}>

> (apply (eval *my-list*) (list 6))
9

Looks like *my-list*, which contains symbols and numbers (but no parentheses), is a program.

> (eval *my-string*)
"(lambda (x) (+ x 3))"

> (apply (eval *my-string*) (list 6))
; Evaluation aborted on #<TYPE-ERROR expected-type: (OR FUNCTION SYMBOL) datum: "(lambda (x) (+ x 3))">

Looks like "(lambda (x) (+ x 3))", which contains parentheses, is not a program, but a string.

And this was my point. Lisp programs are made of s-expressions, which are nested lists of atoms (symbols, numbers, etc.). Lisp programs are not text strings with piles of parentheses.

At the end of the day, though, aren't you writing a .lisp file full of parentheses that gets turned into nested lists by the reader? Yes (with the important exception of macros, which generate code directly as list structure). When someone says "Lisp has a lot of parentheses," I'm the annoying guy saying "Actually …" Isn't that obvious from the title of the post?
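
To close the loop, here is the reader doing that conversion. This is just standard Common Lisp (run on SBCL like everything else here); *read-back* is a name I'm introducing for the demonstration:

> (defvar *read-back* (read-from-string *my-string*))
*READ-BACK*

> (equal *read-back* *my-list*)
T

> (funcall (eval *read-back*) 6)
9

Once the string has been through the reader, the parentheses are gone; what remains is the same list structure as *my-list*, and it behaves as a program in exactly the same way.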

13 Apr 2025 7:00am GMT

12 Apr 2025

Planet Lisp

Joe Marshall: Writing Your OS in Lisp

How to write an OS in Lisp is beyond the scope of a single post in this blog, and I don't have a series prepared. However, I can drop some hints of how you can overcome some problems.

If you are writing your OS in Lisp, it helps to have some idea of how Lisp compiles your program to machine code. The disassemble function prints out the disassembly of a Lisp function. It is obviously implementation dependent.

The inner dispatch of the compiler, when it is called, will be given a target location in which to deposit the result of evaluating the compiled expression. The target location given to the compiler will typically be a register, the top of stack, a fixed offset within the stack, or a global memory location, that sort of thing. The compiler should generate code that ultimately results in the value of the expression ending up in the target location.

A number of Lisp primitive functions can be implemented as a single CPU instruction. (Alternatively, you can create quirky Lisp "CPU subprimitive" functions out of each of the non-jump CPU instructions.) When compiling a function call to such a subprimitive function, the compiler emits code that fetches each argument from its location and places the value in a register, and then emits the single CPU instruction that the subprimitive consists of. It arranges for the result of the CPU instruction to be placed in the target location. What I'm describing here is basically what is known as a "compiler intrinsic" in other languages.

If you have a tree of compound expressions, each of which is one of these CPU subprimitive functions, the compiler will assign locations to all the intermediate computed values, determine an order in which to compile the primitives (linearized left to right), and then emit linear code that carries out the primitive operations. If all your code consists of calls to CPU primitives, then the compiler can generate straight-line assembly code. The compiler in this case only acts as a register allocator. From here, you can bootstrap yourself to a set of primitive procedures each of which is written in straight-line assembly code.
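
To see this concretely, you can ask the compiler directly. Here is a small sketch on SBCL (madd is just an example function; the actual instruction listing depends on the implementation and version, so I won't reproduce it):

(defun madd (a b c)
  ;; With fixnum declarations and safety off, the body compiles down to
  ;; a short run of arithmetic instructions: the straight-line code
  ;; described above, with the compiler acting as register allocator.
  (declare (type fixnum a b c)
           (optimize (speed 3) (safety 0)))
  (the fixnum (+ (the fixnum (* a b)) c)))

(disassemble #'madd)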

When writing an OS, you occasionally run into places where you want a very specific sequence of CPU instructions. You have a couple of options: some Lisp compilers offer a way to insert literal assembly code instructions into the compiled instruction stream in a progn-like manner. Naturally such code is "unsafe", but it nicely solves the problem where you need hand-crafted assembly code in your solution. The other option is to write the code as a compound expression of CPU primitives and let the compiler sort out the register allocation.

12 Apr 2025 11:37pm GMT

11 Apr 2025

Planet Lisp

Joe Marshall: Bloom Filter

This is the response time of the Google Mini search appliance on several thousand queries. There is an odd spike at 300 ms. A number of machines were taking exactly 300 ms to respond regardless of the query.

I soon tracked this down to the spell server. This is the microservice that puts "Did you mean foobar?" on the top of the page when you spell something wrong. The results page generator will wait up to 300 ms for the spell server to come up with its suggestions and then it will proceed without it. What appeared to be happening was that the spell server was giving a "no go" signal to the page generator at the beginning of page composition. The results page generator would then wait for the spell server to make a suggestion. The spell server would invariably take too long, so after 300 ms, the results page generator would time out and ship the page as is. This happened often enough that it showed up as a blip in the performance graph.

The spell server was based on a Bloom filter. A Bloom filter is a variation on a hash table where you only record that a bucket exists, but not its contents. A Bloom filter will quickly and reliably tell you if an entry is not in the table, but only probabilistically tell you if an entry is in the table. Bloom filters rely on having a good hash function and having a mostly empty hash table. If the hash table is mostly empty, the Bloom filter will usually end up hitting an empty bucket and returning false immediately.
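
To make that concrete, here is a toy sketch of a Bloom filter in Lisp. This is for illustration only, not the spell server's actual implementation; the size, salts, and names are all made up:

(defparameter *bloom-bits*
  (make-array 1000003 :element-type 'bit :initial-element 0))

(defun bloom-indices (word)
  ;; Derive several probe positions from one word by salting the hash.
  (loop for salt in '("a" "b" "c")
        collect (mod (sxhash (concatenate 'string salt word))
                     (length *bloom-bits*))))

(defun bloom-add (word)
  ;; Record only that the buckets are occupied, not what is in them.
  (dolist (i (bloom-indices word))
    (setf (sbit *bloom-bits* i) 1)))

(defun bloom-member-p (word)
  ;; NIL is definite: the word was never added.  T only means "probably".
  (every (lambda (i) (= 1 (sbit *bloom-bits* i)))
         (bloom-indices word)))

With a table this sparse, a word that was never added will almost always land on at least one zero bit and be rejected immediately; only words that were added (plus the occasional false positive) pass the test.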

A quick look at the spell server showed that the Bloom filter was almost always getting a hit, which would send the "no go" signal and trigger a slow lookup to find the misspelled word. The problem was that the Bloom filter was too small. It had too few buckets, so most buckets had an entry. Words would always get a hit in the Bloom filter, and so the search appliance thought that all words were misspelled.

I adjusted the size of the Bloom filter, giving it several million buckets. Now correctly spelled words would likely hit an empty bucket and the filter would give a "go" signal to the response page generator. Problem solved, and the spike at 300 ms went away.

11 Apr 2025 7:00am GMT

John Jacobsen: Lisp Testing Fun

I've been making some progress with steelcut, my Common Lisp template generator (does every Lisper write one of these as a rite of passage?), and I keep running into some interesting problems which have been fun to solve.

Whenever my code starts to accumulate any sort of complexity, I start trying to err on the side of writing more tests, even for "recreational" projects such as this one. The tests help make changes safely, as well as documenting the expected functionality. But test code can be fairly repetitive, often with many small variations and lots of boilerplate, in the midst of which it can be hard to see the forest for the trees.

Such was the case for steelcut tests. I found myself remembering Paul Graham's line, "Any other regularity in the code is a sign, to me at least, that I'm using abstractions that aren't powerful enough - often that I'm generating by hand the expansions of some macro that I need to write."

In my current non-Lisp programming work, I most often miss macros when writing test code. But in this case, the macro I (apparently) needed to write is here. Very roughly speaking, with-setup creates a temporary directory, runs the template generator with a given list of requested features, and allows the user to check properties of the generated files.

The macro grew and shrank as I DRYed out and simplified the test code. I realized many of my tests were checking the generated dependencies for the projects (from the .ASD file), so it would be helpful if the macro would make those available to the test code in the form of a function the user could name.

This wound up being a sweet bit of sugar for my tests, but not every test needed such a helper function. Stubborn "unused variable" warnings ensued for those tests that don't use the deps symbol bound by the macro. I went back and forth with ChatGPT on this one (it made several wrong suggestions) until I realized I could give the variable an explicit _ name and change the behavior based on the name. This is something I've seen in other languages and was tickled I could get it so easily here.
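
The actual with-setup lives in the steelcut repository (linked above); what follows is only a minimal sketch of the naming trick, with with-deps, deps-name, and deps-form being names I made up for the illustration:

(defmacro with-deps ((deps-name deps-form) &body body)
  ;; If the caller names the helper "_", don't generate it at all, so no
  ;; "unused" warning can arise; otherwise bind a local function.
  (if (string= (symbol-name deps-name) "_")
      `(progn ,@body)
      `(flet ((,deps-name () ,deps-form))
         ,@body)))

Calling it as (with-deps (_ ...) ...) expands to a plain progn, while (with-deps (deps ...) ... (deps) ...) gets the local accessor, which mirrors how the _ binding behaves in the tests below.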

I find the resulting tests pretty easy to read:

(test needed-files-created
  (with-setup appname "testingapp" appdir _ +default-features+
    (loop for file in '("Makefile"
                        "Dockerfile"
                        ".github/workflows/build.yml"
                        "build.sh"
                        "test.sh"
                        "src/main.lisp"
                        "src/package.lisp"
                        "test/test.lisp"
                        "test/package.lisp")
          do
             (is (uiop:file-exists-p (join/ appdir file))))))

(test cmd-feature-adds-dependency-and-example
  (with-setup appname "testingapp" appdir deps +default-features+
    (let ((main-text (slurp (join/ appdir "src/main.lisp"))))
      ;; :cmd is not there:
      (is (not (find :cmd (deps))))
      ;; It doesn't contain the example function:
      (is (not (has-cmd-example-p main-text)))))

  ;; Add :cmd to features, should get the new dependency:
  (with-setup appname "test2" appdir deps (cons :cmd +default-features+)
    (let ((main-text (slurp (join/ appdir "src/main.lisp"))))
      ;; :cmd is now there:
      (is (find :cmd (deps)))
      ;; It contains the example function:
      (is (has-cmd-example-p main-text)))))

N.B.: A keyword argument would also work, obviously. For a macro in production code, I would probably go that route instead.

Fun

I've also been thinking about a recent post on Hacker News, and the ensuing discussion, in which both the author and commenters point out that programming Lisp is fun. While I must acknowledge that Lisp has been a life-long attraction/distraction/diversion, I'm noticing that the way it is fun for me is evolving somewhat.

First, I used to find Common Lisp pretty painful because of all its sharp edges - the language is so powerful and flexible, you can shoot yourself in the foot pretty much any way you like, and it's not as easy as some newer languages to get started with. But now, with LLMs, it's much easier to get help when you get into the weeds. Sure, "the corpus of training data was smaller for Lisp than for mainstream languages," blah, blah. But often even the wrong answers from ChatGPT point me in the right direction (some of this is probably just rubberducking). When one's own ignorance is less of an obstacle, the pointy bits become less intimidating.

Second, there are some really good, and fun, books on Lisp. PAIP, Let Over Lambda, Practical Common Lisp, Land of Lisp (plus video!). Lisp has some pretty mind-bending things in it, and I'm enjoying taking the time to dig into some of these books again, understanding things better than the first time I swung by.

Third, I'm regularly stunned by how fast Lisp is. All the tests I've been writing run effectively instantly at the REPL on my M1 Macbook Air. And now that I'm starting to get good enough at the language that it is getting out of my way, that speed is more noticeable, and, well, fun.

Finally, I am intrigued by the history of computing, of which Lisp has been an important part. This is worth saying more about in a future post, but for the moment I find a stable, highly performant, fun language that has intrigued people for decades to be of more interest than the bleeding edge. (I'm not hating on Rust. Rust is nice. But I don't find it fun, at least not yet.)

Beyond fun, I'm also enjoying "escaping the hamster wheel of backwards incompatibility" for a while. While so many things around us crumble and whirl, and new forms of AI scare the pants out of us as much as they intrigue us, older tools and traditions start to feel more like comfortable tools that weigh down one's hands satisfyingly and invite calm creation.

11 Apr 2025 12:00am GMT

10 Apr 2025

Planet Lisp

Joe Marshall: Why I Program in Lisp

Lisp is not the most popular language. It never was. Other general purpose languages are more popular and ultimately can do everything that Lisp can (if Church and Turing are correct). They have more libraries and a larger user community than Lisp does. They are more likely to be installed on a machine than Lisp is.

Yet I prefer to program in Lisp. I keep a Lisp REPL open at all times, and I write prototypes and exploratory code in Lisp. Why do I do this? Lisp is easier to remember, has fewer limitations and hoops you have to jump through, has lower "friction" between my thoughts and my program, is easily customizable, and, frankly, more fun.

Lisp's dreaded Cambridge Polish notation is uniform and universal. I don't have to remember whether a form takes curly braces or square brackets or what the operator precedence is or some weird punctuated syntax that was invented for no good reason. It is (operator operands ...) for everything. Nothing to remember. I basically stopped noticing the parentheses 40 years ago. I can indent how I please.

I program mostly functionally, and Lisp has three features that help out tremendously here. First, if you avoid side effects, it directly supports the substitution model. You can tell Lisp that when it sees this simple form, it can just replace it with that more complex one. Lisp isn't constantly pushing you into thinking imperatively. Second, since the syntax is uniform and doesn't depend on the context, you can refactor and move code around at will. Just move things in balanced parentheses and you'll pretty much be ok.
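
A tiny illustration of the substitution point (nothing special about the example):

(defun square (x) (* x x))

;; With no side effects, a call can be replaced by its body with the
;; argument substituted in, and nothing changes:
(+ (square 3) 1)   ; => 10
(+ (* 3 3) 1)      ; => 10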

Third, in most computer languages, you can abstract a specific value by replacing it with a variable that names a value. But you can perform a further abstraction by replacing a variable that names a quantity with a function that computes a quantity. In functional programming, you often downplay the distinction between a value and a function that produces that value. After all, the difference is only one of time spent waiting for the answer. In Lisp, you can change an expression that denotes an object into an abstraction that computes an object by simply wrapping a lambda around it. It's less of a big deal these days, but until recently, properly working lambda expressions were available only in Lisp. Even so, lambda expressions are generally pretty clumsy in other languages.
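
Concretely, the progression from a value, to a variable that names the value, to a function that computes it is just a matter of wrapping (the names here are throwaways):

(* 2 3.14159)                      ; a literal value

(defvar *almost-pi* 3.14159)       ; abstract the value behind a name
(* 2 *almost-pi*)

(defvar *almost-pi-thunk*          ; abstract further: compute it on demand
  (lambda () 3.14159))
(* 2 (funcall *almost-pi-thunk*))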

Functional programming focuses on functions (go figure!). These are the ideal black box abstraction: values go in, answer comes out. What happens inside? Who knows! Who cares! But you can plug little simple functions together and get bigger more complex functions. There is no limit on doing this. If you can frame your problem as "I have this, I want that", then you can code it as a functional program. It is true that functional programming takes a bit of practice to get used to, but it allows you to build complex systems out of very simple parts. Once you get the hang of it, you start seeing everything as a function. (This isn't a limitation. Church's lambda calculus is a model of computation based on functional composition.)
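
For instance, a compose helper takes only a few lines (utility libraries such as Alexandria ship one; this single-argument version is just a sketch):

(defun compose (&rest fns)
  ;; Plug small functions together into a bigger one:
  ;; ((compose f g) x) is (f (g x)).
  (if (null fns)
      #'identity
      (lambda (x)
        (reduce (lambda (acc f) (funcall f acc))
                (reverse fns)
                :initial-value x))))

(funcall (compose #'1+ #'abs) -5)   ; => 6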

Lisp lets me try out new ideas as quickly as I can come up with them. New programs are indistinguishable from those built into the language, so I can build upon them just as easily. Lisp's debugger means I don't have to stop everything and restart the world from scratch every time something goes wrong. Lisp's safe memory model means that bugs don't trash my workspace as I explore the problem.

The REPL in lisp evaluates expressions, which are the fundamental fragments of Lisp programs. You can type in part of a Lisp program and see what it does immediately. If it works, you can simply embed the expression in a larger program. Your program takes shape in real time as you explore the problem.
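
A typical interaction looks something like this (a made-up example): try the fragment, then embed it once it does what you want.

> (remove-if-not #'evenp '(1 2 3 4 5 6))
(2 4 6)

(defun sum-of-evens (xs)
  (reduce #'+ (remove-if-not #'evenp xs)))

> (sum-of-evens '(1 2 3 4 5 6))
12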

Lisp's dynamic typing gives you virtually automatic ad hoc polymorphism. If you write a program that calls +, it will work on any pair of objects that have a well-defined + operator. Now this can be a problem if you are cavalier about your types, but if you exercise a little discipline (like not defining + on combinations of strings and numbers, for example), and if you avoid automatic type coercion, then you can write very generic code that works on a superset of your data types. (Dynamic typing is a two-edged sword. It allows for fast prototyping, but it can hide bugs that would be caught at compile time in a statically typed language.)
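
As a small example, a single definition written in terms of + covers the whole numeric tower without any type annotations (my example, nothing deep):

(defun average (xs)
  (/ (reduce #'+ xs) (length xs)))

(average '(1 2 3))          ; => 2
(average '(1/2 1/3 1/6))    ; => 1/3
(average '(1.0 2.5 3.5))    ; => 2.3333333
(average #(2 4 6))          ; => 4, sequences other than lists work too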

Other languages may share some of these features, but Lisp has them all together. It is a language that was designed to be used as a tool for thinking about problems, and that is the fun part of programming.

10 Apr 2025 7:00am GMT