09 Dec 2025

feedPlanet GNOME

Asman Malika: My Outreachy Journey: From curiosity to contribution

Hello! I'm Asman Malika, and I still can't quite believe I'm writing this as an Outreachy intern.

I'm working on the GNOME project: Improving document signing in GNOME Document Viewer (Papers), focusing on adding both manual and digital signing features and improving the user interface for a smoother signing experience.

Before I could even imagine working on a project like GNOME Papers, I was exploring a new side of software development. Just 19 months ago, I barely knew anything about coding. No degree, no bootcamp. Just curiosity, determination, and the urge to prove to myself that I belonged in tech.

Today, I work with Rust, Go, JavaScript, C, and React, and I contribute to open-source projects. But the road to this point? Let's just say it wasn't straight.

The struggles that led me here

I've applied to opportunities before, and been rejected. Sometimes because of my identity. Sometimes because I didn't have enough experience or a formal degree. Every rejection whispered the same doubt: maybe I wasn't ready yet.

But each rejection also pushed me to look for a space where effort, curiosity, and willingness to learn mattered more than credentials on paper. And then I found Outreachy. The moment I read about the program, it clicked: this was a place built for people like me.

Why Outreachy felt different

I didn't just apply because I wanted an internship. I applied because I wanted to contribute meaningfully to real-world projects. After months of learning, experimenting, and self-teaching, I wanted to show that persistence counts, that your journey doesn't need to follow a traditional path to matter.

The community aspect drew me in even more. Reading about past interns who started exactly where I was gave me hope. Every line of code I wrote during the application period felt like a building block towards improving myself. And the support from mentors and the wider community? I truly appreciate every bit of it.

The contribution phase: chaos, learning, and late nights

The contribution period tested my patience and resilience. Imagine this: working a full-time job (where I was still learning software development skills) during the day, then switching gears at night to contribute to Outreachy projects.

Most of my real contribution time came late at night, fueled by curiosity, determination, and maybe a little too much coffee. I had to adapt and learn quickly: understanding unfamiliar project structures, reading documentation, asking questions (which was terrifying at first), and sometimes struggling more than I expected.

Some tasks took hours longer than anticipated. Some pull requests needed multiple revisions. Some nights, imposter syndrome kicked in.

But every challenge taught me something meaningful. I learned how open-source communities operate: writing clean code, submitting patches, communicating clearly, and staying consistent. The biggest surprise? Collaborating in public. At first, it felt intimidating, every question, every mistake visible to everyone. But gradually, it became empowering. Asking for help isn't weakness; it's how real developers grow.

Contributions I'm proud of

I fixed bugs. Improved documentation. Implemented and tested features. Helped refine workflows.

But here's the truth: the real achievement wasn't the list of tasks; it was consistency. I showed up when it was hard. I learned to work efficiently in a community, and contributed in ways that genuinely helped me grow as a developer.

Even small contributions taught me big lessons. Each merged pull request felt like a win. Each piece of mentor feedback felt like progress. Every late night debugging was worth it because I was building something real.

What I hope to gain

I want to deepen my technical skills, learn best practices from my mentors, and make contributions that truly matter. I also hope to grow my confidence in open-source collaboration and continue growing as a software developer.

Throughout this journey, I want to document my progress and share my experiences with the community, reflecting on what I learn and hopefully inspiring others along the way.

09 Dec 2025 3:38pm GMT

Michael Catanzaro: Significant Drag and Drop Vulnerability in WebKitGTK

WebKitGTK 2.50.3 contains a workaround for CVE-2025-13947, an issue that allows websites to exfiltrate files from your filesystem. If you're using Epiphany or any other web browser based on WebKitGTK, then you should immediately update to 2.50.3.

Websites may attach file URLs to drag sources. When the drag source is dropped onto a drop target, the website can read the file data for its chosen files, without any restrictions. Oops. Suffice it to say, this is not how drag and drop is supposed to work. Websites should not be able to choose for themselves which files to read from your filesystem; only the user is supposed to be able to make that choice, by dragging the file from an external application. That is, drag sources created by websites should not receive file access.

I failed to find the correct way to fix this bug in the two afternoons I allowed myself to work on this issue, so instead my overly-broad solution was to disable file access for all drags. With this workaround, the website will only receive the list of file URLs rather than the file contents.

Apple platforms are not affected by this issue.

09 Dec 2025 3:29pm GMT

Laura Kramolis: Rewriting Cartridges

Gamepad support, collections, instant imports, and more!

Cartridges is, in my biased opinion, the best game launcher out there. To use it, you do not need to wait 2 minutes for a memory-hungry Electron app to start up before you can start looking for what you want to play. You don't need to sign into anything. You don't need to spend 20 minutes configuring it. You don't need to sacrifice your app menu, filling it with low-resolution icons designed for Windows that don't disappear after you uninstall a game. You install the app, click "Import", and all your games from anywhere on your computer magically appear.

It was also the first app I ever wrote. From this, you can probably already guess that it is an unmaintainable mess. It's both under- and over-engineered, it is full of bad practices, and most importantly, I don't trust it. I've learned a lot since then. I've learned so much that if I were to write the app again, I would approach it completely differently. Since Cartridges is the preferred way to launch games for so many other people as well, I feel it is my duty as a maintainer to give it my best shot and do just that: rewrite the app from scratch.

Zoey, Jamie, and I have been working on this for the past two weeks, and we've made really good progress so far. Beyond stability improvements, the new base has allowed us to work on the following new features:

Gamepad Support

Support for controller navigation has been something I've attempted in the past but had to give up as it proved too challenging. That's why I was overjoyed when Zoey stepped up to work on it. In the currently open pull request, you can already launch games and navigate many parts of the UI with a controller. You can donate to her on Ko-fi if you would like to support the feature's development. Planned future enhancements include navigating menus, remapping, and button prompts.

Collections

Easily the most requested feature: a lot of people asked for a way to manually organize their games. I initially rejected the idea because I wanted Cartridges to remain a single-click game launcher, but I softened up to it over time as more and more people requested it; after all, it's an optional dimension that you can simply ignore if you don't use it. As such, I'm happy to say that Jamie has been working on categorization, with an initial implementation ready for review as of writing this. You can support her on Liberapay or GitHub Sponsors.

Instant Imports

I mentioned that Cartridges' main selling point is being a single-click launcher. This is as good as it gets, right? Wrong: how about zero clicks?

The app has been reworked to be even more magical. Instead of pulling data into Cartridges from other apps at the user's request, it will now read data directly from other apps, without the need to keep games in sync manually. You will still be able to edit the details of any game, but only these edits will be saved, meaning that if any information on, say, Steam gets updated, Cartridges will automatically reflect those changes.

The existing app has settings to import and remove games automatically, but that has been a band-aid solution, and it will be nice to finally do this properly, saving you time and storage space, and sparing you conflicts.

To allow for this, I also changed the way the Steam source works to fetch all data from disk instead of making calls to Steam's web API. This was the only source that relied on the network, so all imports should now be instant. Just install the app and all your games from anywhere on your computer appear. How cool is that? :3

And More

We have some ideas for the longer term involving installing games, launching more apps, and viewing more game data. There are no concrete plans for any of these, but it would be nice to turn Cartridges into a less clunky replacement for most of Steam Big Picture.

And of course, we've already made many quality of life improvements and bug fixes. Parts of the interface have been redesigned to work better and look nicer. You can expect many issues to be closed once the rewrite is stable. Speaking of…

Timeline

We would like to have feature parity with the existing app. The new app will be released under the same name, as if it were just a regular update, so no action will be required to get the new features.

We're aiming to release the new version sometime next year; I'm afraid I can't be more precise than that. It could take three more months, or it could take twelve. We all have our own lives and work on the app as a side project, so we'll see how much time we can dedicate to it.

If you would like to keep up with development, you can watch open pull requests on Codeberg targeting the rewrite branch. You can also join the Cartridges Discord server.

Thank you again Jamie and Zoey for your efforts!

09 Dec 2025 12:00am GMT

08 Dec 2025

feedPlanet GNOME

Jakub Steiner: Dithering

One of the new additions to the GNOME 49 wallpaper set is Dithered Sun by Tobias. It uses dithering not as a technical workaround for color banding, but as an artistic device.

Halftone app

Tobias initially planned to use Halftone - a great example of a GNOME app with a focused scope and a pleasantly streamlined experience. However, I suggested that a custom dithering method and finer control over color depth would help execute the idea better. A long time ago, Hans Peter Jensen responded to my request for arbitrary color-depth dithering in GIMP by writing a custom GEGL op.

Now, since the younger generation may be understandably intimidated by GIMP's somewhat… vintage interface, I promised to write a short guide on how to process your images to get a nice ordered dither pattern without going overboard on reducing colors. And with only a bit of time passing since the amazing GUADEC in Brescia, I'm finally delivering on that promise. Better late than later.

GEGL dithering op

I've historically used the GEGL dithering operation to work around potential color banding on lower-quality displays. In Tobias' wallpaper, though, the dithering is a core element of the artwork itself. While it can cause issues when scaling (filtering can introduce moiré patterns), there's a real beauty to the structured patterns of Bayer dithering.

You will find the GEGL op in the Colors > Dither menu. The filter's parameters don't allow you to set the number of colors directly - only the per-channel color depth (in bits). For full-color dithers I tend to use 12-bit. I personally like the Bayer ordered dither, though there are plenty of algorithms to choose from, and depending on your artwork, another might suit you better. I usually save my preferred settings as a preset for easier recall next time (find Presets at the top of the dialog).

Happy dithering!

08 Dec 2025 2:35pm GMT

06 Dec 2025

feedPlanet GNOME

Javad Rahmatzadeh: AI and GNOME Shell Extensions

Since I joined the extensions team, I've had only one goal in mind: making extension developers' jobs easier by providing them documentation and help.

I started with the port guide, and then I became involved in reviews: providing developers with code samples, mentioning best practices, and even fixing issues myself and sending merge requests. Andy Holmes and I spent a lot of time writing all the necessary documentation for extension developers. We even made the review guidelines very strict and easy to understand, with code samples.

Today, extension developers have all the documentation they need to start writing extensions, a port guide for porting their extensions, and a very friendly place, the GNOME Extensions Matrix channel, to ask questions and get fast answers. Now we have a very strong GNOME Shell extensions community that can overcome the difficulties of learning and of keeping up with changes.

The number of packages submitted to EGO is growing every month, and we see more and more people joining the extensions community to create their own extensions. Some days, I spend more than 6 hours reviewing over 15,000 lines of extension code and answering questions from the community.

In the past two months, we have received many new extensions on EGO. This is a good thing since it can make the extensions community grow even more, but there is one issue with some packages. Some devs are using AI without understanding the code.

This has led to receiving packages with many unnecessary lines and bad practices. And once a bad practice is introduced in one package, it can create a domino effect, appearing in other extensions. That alone has increased the waiting time for all packages to be reviewed.

At the start, I was really curious about the increase in unnecessary try-catch block usage in many new extensions submitted on EGO. So I asked, and they answered that it is coming from AI.

Just to give you a gist of what this unnecessary code might look like:

destroy() {
    try {
        if (typeof super.destroy === 'function') {
            super.destroy();
        }
    } catch (e) {
        console.warn(`${e.message}`);
    }
}

Instead of simply calling `super.destroy()`, which you clearly know exists in the parent:

destroy() {
    super.destroy();
}

At this point, we have to add a new rule to the EGO review guidelines: packages with unnecessary code indicating they are AI-generated will be rejected.

This doesn't mean you cannot use AI for learning or for fixing some issues. AI is a fantastic tool for learning and for helping find and fix issues. Use it for that, not for generating the entire extension. Perhaps in the future AI will generate very high-quality code without any unnecessary lines, but until then, if you want to start writing extensions, you can always ask us in the GNOME Extensions Matrix channel.

06 Dec 2025 4:48pm GMT

03 Dec 2025

feedPlanet GNOME

Felipe Borges: One Project Selected for the December 2025 Outreachy Cohort with GNOME!

We are happy to announce that the GNOME Foundation is sponsoring an Outreachy project for the December 2025 Outreachy cohort.

Outreachy provides internships to people subject to systemic bias and impacted by underrepresentation in the tech industry where they are living.

Let's welcome Malika Asman! Malika will be working with Lucas Baudin on improving document signing in Papers, our document viewer.

The new contributor will soon have their blog added to Planet GNOME, making it easy for the GNOME community to get to know them and the project they will be working on. We would also like to thank our mentor, Lucas, for supporting Outreachy and helping new contributors enter our project.

If you have any questions, feel free to reply to this Discourse topic or message us privately at soc-admins@gnome.org.

03 Dec 2025 11:03am GMT

02 Dec 2025

feedPlanet GNOME

Cassidy James Blaede: Looking back on GNOME in 2025—and looking forward to 2026

This past year has been an exceptional one for GNOME. The project shipped two excellent releases on schedule: GNOME 48 in March and GNOME 49 in September. Contributors have been relentless in delivering new and improved default apps, constant performance improvements across the board that benefit everyone (but especially lower-specced hardware), a better experience on high-end hardware like HiDPI and HDR displays, refined design and refreshed typography, all-new digital wellbeing features and parental controls improvements, improved accessibility support across the entire platform, and much more.

Just take a look back through This Week in GNOME where contributors provided updates on development every single week of 2025 so far. (And a huge thank you to Felix, who puts This Week in GNOME together!)

All of these improvements were delivered for free to users of GNOME across distributions-and even beyond users of GNOME itself via GNOME apps running on any desktop thanks to Flatpak and distribution via Flathub.

Earlier this year the GNOME Foundation also relaunched Friends of GNOME, where you can set up a small recurring donation to help fund its initiatives.

While I'm proud of what GNOME has accomplished in 2025 and that the GNOME Foundation is operating sustainably, I'm personally even more excited to look ahead to what I hope the Foundation will be able to achieve in the coming year.

Let's Reach 1,500 Friends of GNOME

The newly-formed fundraising committee kicked off their efforts by announcing a simple goal to close out 2025: let's reach 1,500 Friends of GNOME! If we can reach this goal by the end of this year, it will help GNOME deliver even more in 2026; for example, by enabling the Foundation to sponsor more community travel for hackfests and conferences, and potentially even sponsoring specific, targeted development work.

But GNOME needs your help!

How You Can Help

First, if you're not already a Friend of GNOME, please consider setting up a small recurring donation at donate.gnome.org. Every little bit helps, and donating less but consistently is super valuable to not only keep the lights on at the GNOME Foundation, but to enable explicit budgeting for and delivering on more interesting initiatives that directly support the community and the development of GNOME itself.

Become a Friend of GNOME

If you're already a Friend of GNOME (or not able to commit to that at the moment-no hard feelings!), please consider sharing this message far and wide! I consistently hear that not only do so many users of GNOME not know that it's a nonprofit, but they don't know that the GNOME Foundation relies on individual donations-and that users can help out, too! Please share this post to your circles-especially outside of usual contributor spaces-to let them know the cool things GNOME does and that GNOME could use their help to be able to do even more in the coming year.

Lastly, if you represent an organization that relies on GNOME or is invested in its continued success, please consider a corporate sponsorship. While this sponsorship comes with no strings attached, it's a really powerful way to show that your organization supports Free and Open Source software - and puts its money where its mouth is.

Sponsor GNOME

Thank You!

Thank you again to all of the dedicated contributors to GNOME making everyone's computing experience that much better. As we close out 2025, I'm excited by the prospect of the GNOME Foundation being able to not just be sustainable, but-with your help-to take an even more active role in supporting our community and the development of GNOME.

And of course, thank you to all 700+ current Friends of GNOME; your gracious support has helped GNOME achieve everything in 2025 while ensuring the sustainability of the Foundation going forward. Let's see if we can close out the year with 1,500 Friends helping GNOME do even more!

02 Dec 2025 12:00am GMT

01 Dec 2025

feedPlanet GNOME

Christian Hergert: Status Week 48

This week was the Thanksgiving holiday in the US, and I spent the entire week quite sick with something that, according to rapid tests, was neither Covid nor flu.

Managed to get a few things done through sheer force of will. I don't recommend it.

Red Hat

Ptyxis

Libdex

Foundry

Builder

01 Dec 2025 9:53pm GMT

Federico Mena-Quintero: Mutation testing for librsvg

I was reading a blog post about the testing strategy for the Wild linker, when I came upon a link to cargo-mutants, a mutation testing tool for Rust. The tool promised to be easy to set up, so I gave it a try. I'm happy to find that it totally delivers!

Briefly: mutation testing catches cases where bugs are deliberately inserted in the source code, but the test suite fails to catch them: after making the incorrect changes, all the tests still pass. This indicates a gap in the test suite.

Previously I had only seen mentions of "mutation testing" in passing, as something exotic to be done when testing compilers. I don't recall seeing it as a general tool; maybe I have not been looking closely enough.

Setup and running

Setting up cargo-mutants is easy enough: you can cargo install cargo-mutants and run it with cargo mutants.

For librsvg this ran for a few hours, but I discovered a couple of things related to the way the librsvg repository is structured. The repo is a cargo workspace with multiple crates: the librsvg implementation and public Rust API, the rsvg-convert binary, and some utilities like rsvg-bench.

  1. By default cargo-mutants only seemed to pick up the tests for rsvg-convert. I think it may have done this because it is the only binary in the workspace that has a test suite (e.g. rsvg-bench does not have a test suite).

  2. I had to run cargo mutants --package librsvg to tell it to consider the test suite for the librsvg crate, which is the main library. I think I could have used cargo mutants --workspace to make it run all the things; maybe I'll try that next time.

Initial results

My initial run on rsvg-convert produced useful results; cargo-mutants found 32 mutations in the rsvg-convert source code that ought to have caused failures, but the test suite didn't catch them.

The running output of cargo-mutants on the librsvg crate.

The second run, on the librsvg crate, took about 10 hours. It is fascinating to watch it run. In the end it found 889 mutations with bugs that the test suite couldn't catch:

5243 mutants tested in 9h 53m 15s: 889 missed, 3663 caught, 674 unviable, 17 timeouts

What does that mean?

Starting to analyze the results

Due to the way cargo-mutants works, the "missed" results come in an arbitrary order, spread among all the source files:

rsvg/src/path_parser.rs:857:9: replace <impl fmt::Display for ParseError>::fmt -> fmt::Result with Ok(Default::default())
rsvg/src/drawing_ctx.rs:732:33: replace > with == in DrawingCtx::check_layer_nesting_depth
rsvg/src/filters/lighting.rs:931:16: replace / with * in Normal::bottom_left
rsvg/src/test_utils/compare_surfaces.rs:24:9: replace <impl fmt::Display for BufferDiff>::fmt -> fmt::Result with Ok(Default::default())
rsvg/src/filters/turbulence.rs:133:22: replace - with / in setup_seed
rsvg/src/document.rs:627:24: replace match guard is_mime_type(x, "image", "svg+xml") with false in ResourceType::from
rsvg/src/length.rs:472:57: replace * with + in CssLength<N, V>::to_points

So, I started by sorting the missed.txt file from the results. This is much better:

rsvg/src/accept_language.rs:136:9: replace AcceptLanguage::any_matches -> bool with false
rsvg/src/accept_language.rs:136:9: replace AcceptLanguage::any_matches -> bool with true
rsvg/src/accept_language.rs:78:9: replace <impl fmt::Display for AcceptLanguageError>::fmt -> fmt::Result with Ok(Default::default())
rsvg/src/angle.rs:40:22: replace < with <= in Angle::bisect
rsvg/src/angle.rs:41:56: replace - with + in Angle::bisect
rsvg/src/angle.rs:49:35: replace + with - in Angle::flip
rsvg/src/angle.rs:57:23: replace < with <= in Angle::normalize

With the sorted results, I can clearly see how cargo-mutants gradually does its modifications on (say) all the arithmetic and logic operators to try to find changes that would not be caught by the test suite.

Look at the first two lines from above, the ones that refer to AcceptLanguage::any_matches:

rsvg/src/accept_language.rs:136:9: replace AcceptLanguage::any_matches -> bool with false
rsvg/src/accept_language.rs:136:9: replace AcceptLanguage::any_matches -> bool with true

Now look at the corresponding lines in the source:

... impl AcceptLanguage {
135     fn any_matches(&self, tag: &LanguageTag) -> bool {
136         self.iter().any(|(self_tag, _weight)| tag.matches(self_tag))
137     }
... }

The two lines from missed.txt mean that if the body of this any_matches() function were replaced with just true or false, instead of its actual work, there would be no failed tests:

135     fn any_matches(&self, tag: &LanguageTag) -> bool {
136         false // or true, either version wouldn't affect the tests
137     }

This is bad! It indicates that the test suite does not check that this function, or the surrounding code, is working correctly. And yet, the test coverage report for those lines shows that they are indeed getting executed by the test suite. What is going on?

I think this is what is happening:

To get a bit pedantic about the purpose of tests: rsvg-convert assumes that the underlying librsvg library works correctly. The library advertises support in its API for matching based on AcceptLanguage, even though it doesn't test it internally.

On the other hand, rsvg-convert has a test for its own --accept-language option, in the sense of "did we implement this command-line option correctly", not in the sense of "does librsvg implement the AcceptLanguage functionality correctly".
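The kind of test that closes this gap needs to pin down both outcomes of the boolean. Here is a minimal, self-contained sketch using a toy stand-in, not librsvg's real types (the actual AcceptLanguage parses weighted language tags; this hypothetical version just stores strings with weights):

```rust
// Toy model of the gap described above: a weighted language list and a matcher.
// NOT librsvg's real implementation, only the shape of the missing test.
struct AcceptLanguage(Vec<(String, f32)>);

impl AcceptLanguage {
    fn any_matches(&self, tag: &str) -> bool {
        // A mutant that replaces this body with `true` or `false` survives
        // unless some test exercises both a matching and a non-matching case.
        self.0.iter().any(|(t, _weight)| t == tag)
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn any_matches_distinguishes_hits_from_misses() {
        let list = AcceptLanguage(vec![("en".to_string(), 1.0), ("de".to_string(), 0.5)]);
        assert!(list.any_matches("de")); // kills the `false` mutant
        assert!(!list.any_matches("fr")); // kills the `true` mutant
    }
}
```

The key is asserting both directions: a test that only checks the positive case would still let the always-true mutant slip through.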

After adding a little unit test for AcceptLanguage::any_matches in the librsvg crate, we can run cargo-mutants just for the accept_language.rs file again:

# cargo mutants --package librsvg --file accept_language.rs
Found 37 mutants to test
ok       Unmutated baseline in 24.9s build + 6.1s test
 INFO Auto-set test timeout to 31s
MISSED   rsvg/src/accept_language.rs:78:9: replace <impl fmt::Display for AcceptLanguageError>::fmt -> fmt::Result with Ok(Default::default()) in 4.8s build + 6.5s test
37 mutants tested in 2m 59s: 1 missed, 26 caught, 10 unviable

Great! As expected, we just have 1 missed mutant on that file now. Let's look into it.

The function in question is now <impl fmt::Display for AcceptLanguageError>::fmt, an error formatter for the AcceptLanguageError type:

impl fmt::Display for AcceptLanguageError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            Self::NoElements => write!(f, "no language tags in list"),
            Self::InvalidCharacters => write!(f, "invalid characters in language list"),
            Self::InvalidLanguageTag(e) => write!(f, "invalid language tag: {e}"),
            Self::InvalidWeight => write!(f, "invalid q= weight"),
        }
    }
}

What cargo-mutants means by "replace ... -> fmt::Result with Ok(Default::default())" is that if this function were modified to just be like this:

impl fmt::Display for AcceptLanguageError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        Ok(Default::default())
    }
}

then no tests would catch that. Now, this is just a formatter function; the fmt::Result it returns is just whether the underlying call to write!() succeeded. When cargo-mutants discovers that it can change this function to return Ok(Default::default()) it is because fmt::Result is defined as Result<(), fmt::Error>, which implements Default because the unit type () implements Default.

In librsvg, those AcceptLanguageError errors are just surfaced as strings for rsvg-convert, so that if you give it a command-line argument with an invalid value like --accept-language=foo, it will print the appropriate error. However, rsvg-convert does not make any promises as to the content of error messages, so I think it is acceptable not to test this error formatter - just to make sure it handles all the cases, which is already guaranteed by its match statement.

For cases like this, cargo-mutants allows marking code to be skipped. After marking this fmt implementation with #[mutants::skip], there are no more missed mutants in accept_language.rs.
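As a sketch of what that marking looks like (a fragment, not a complete compiling example; the #[mutants::skip] attribute comes from the small mutants helper crate, which has to be added as a dependency):

```rust
impl fmt::Display for AcceptLanguageError {
    // Tell cargo-mutants not to mutate this formatter; the match inside
    // is already checked for exhaustiveness by the compiler.
    #[mutants::skip]
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        // ... match on self as before ...
    }
}
```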

Yay!

Understanding the tool

You should absolutely read "using results" in the cargo-mutants documentation, which is very well-written. It gives excellent suggestions for how to deal with missed mutants. Again, these indicate potential gaps in your test suite. The documentation discusses how to think about what to do, and I found it very helpful.

Then you should read about genres of mutants. It tells you the kinds of modifications that cargo-mutants makes to your code. Apart from changing individual operators to try to compute incorrect results, it also does things like replacing whole function bodies to return a different value. What if a function returns Default::default() instead of your carefully computed value? What if a boolean function always returns true? What if a function that returns a HashMap always returns an empty hash table, or one filled with the product of all keys and values? That is, do your tests actually check your invariants, or your assumptions about the shape of the results of computations? It is really interesting stuff!

Future work for librsvg

The documentation for cargo-mutants suggests how to use it in CI, to ensure that no uncaught mutants are merged into the code. I will probably investigate this once I have fixed all the missed mutants; this will take me a few weeks at least.

Librsvg already has the GitLab incantation to show test coverage for patches in merge requests, so it would be nice to know whether the existing tests, or any newly added ones, miss any conditions in an MR. That can be caught with cargo-mutants.

Hackery relevant to my tests, but not to this article

If you are just reading about mutation testing, you can ignore this section. If you are interested in the practicalities of compilation, read on!

The source code for the librsvg crate uses a bit of conditional compilation to select whether to export functions that are used by the integration tests as well as the crate's internal tests. For example, there is some code for diffing two images, and this is used when comparing the pixel output of rendering an SVG to a reference image. For historical reasons, this code ended up in the main library, so that it can run its own internal tests, but then the rest of the integration tests also use this code to diff images. The librsvg crate exports the "diff two images" functions only if it is being compiled for the integration tests, and it doesn't export them for a normal build of the public API.

Somehow, cargo-mutants didn't understand this, and the integration tests did not build since the cargo feature to select that conditionally-compiled code... wasn't active, or something. I tried enabling it by hand with something like cargo mutants --package librsvg -- --features test-utils but that still didn't work.

So, I hacked up a temporary version of the source tree just for mutation testing, which always exports the functions for diffing images, without conditional compilation. In the future it might be possible to split out that code to a separate crate that is only used where needed and never exported. I am not sure how it would be structured, since that code also depends on librsvg's internal representation of pixel images. Maybe we can move the whole thing out to a separate crate? Stop using Cairo image surfaces as the way to represent pixel images? Who knows!

01 Dec 2025 1:06pm GMT

30 Nov 2025

feedPlanet GNOME

Sophie Herold: Weekly report #75

Hello world! Last week, I asked the people who financially support me whether I should post my updates publicly. A majority voted to release my future weekly reports to the public; some voted to make them public every other week. So I will try to post some of them publicly in the future.

These updates are made possible by the amazing people that support me on Ko-fi, Patreon, Open Collective, and GitHub! Thank you so much, folks!

Since this is my first public weekly report, let's maybe start with a short introduction: I started my GNOME-related work in 2018 by working a bit on the Gajim UI and starting the Pika Backup project. Since March 2024, I have slowly started asking for donations for my work on GNOME. I am disabled due to ME/CFS and being autistic. Working within the GNOME project allows me to earn a bit of extra money on top of my social assistance while also doing something that I love and can do at my own pace. I am working on too many things within GNOME: Pika Backup, Loupe, glycin, websites, the Release Team, the Circle Committee, and a bunch of other things, like trying to advocate for queer and disabled people within the GNOME community.

You will notice that my weekly reports will usually not contain giant achievements or huge promises. Maintenance work can be tedious, and generally, fundraisers have developed a frustrating habit of frequently over-promising and under-delivering. I don't want to be part of that.

So, let's finally get to the actual updates for last week. We landed translation support within www.gnome.org. At the moment, this is still practically invisible: we are waiting for the Translation Team to enable translators to do the work. Once we have some translations, we will also enable the language selection dialog. I also asked whether we want translations for donate.gnome.org, but have received no feedback so far.

A release of gst-thumbnailers is now out, so that distributions can package it. There is more about the thumbnailers in the Release Team issue. I updated the Circle benefits list and updated circle.gnome.org to the version that does not list apps and components on its own. That's something design people have wanted to see for a while. Since the old Circle page stopped building, it was a good moment to finally do it :)

I spent some time on Pika Backup, hoping that we are very close to a 0.8 beta release. However, I noticed that the current state of the setup dialog isn't what we want. After discussing the options with Fina, we are now sure that we have to rework what we have in some way. Not shying away from throwing code away and reworking something again is often very important for arriving at a good result. Maybe I will summarize this example once we have arrived at a solution.

Some of the less tangible work this week: I briefly discussed Emmanuele's GNOME Governance proposal in Matrix. From the outside, it might look like something that makes changes within GNOME more complicated. But the actual goal is the opposite: currently, it can be very hard to make changes within GNOME since there is no clear way to go about it. This not only slows people down but, at least for me, can also be quite emotionally draining. So, a very important proposal. Maybe we got a tiny step closer to making it a reality. I also contributed to an internal Release Team discussion and contacted an involved party.

That's all for this week! If you want to support my work financially, you can check my GitLab profile.

Hope you all have a great week!

30 Nov 2025 9:03pm GMT

29 Nov 2025

feedPlanet GNOME

This Week in GNOME: #227 Circle Benefits

Update on what happened across the GNOME project in the week from November 22 to November 29.

GNOME Circle Apps and Libraries

Sophie (she/her) says

The benefits for GNOME Circle projects now explicitly include participation in internship programs as well as inclusion on the help.gnome.org, welcome.gnome.org, apps.gnome.org, and developer.gnome.org pages.

The circle.gnome.org page has been redesigned to link to the respective pages instead of having its own app and component list.

NewsFlash feed reader

Follow your favorite blogs & news sites.

Jan Lukas reports

Last week NewsFlash 4.2 was released with the usual amount of small improvements and fixes. What makes it worthy of the minor version bump, in my opinion, is the new popover right above the article containing all relevant font and spacing options for more direct interaction, a new, slightly different layout for tablets (specifically the PineNote), and the combining of multiple image enclosures into a carousel.

newsflash-twig.png

Third Party Projects

Alain announces

Planify 4.16.1 is now available!

A small but polished update focused on improving the overall experience:

  • UX improvements and minor bug fixes
  • Shift+Enter is back for quick "keep adding"
  • Zoom links now supported in calendar events
  • Updated Donate page
  • The Planify website has been refreshed with a cleaner design and several improvements → https://www.useplanify.com

Thanks for following the journey of Planify 💙

Dzheremi announces

Mass Lyrics Downloading with Chronograph 5.3

Chronograph got an update with, as promised, mass lyrics downloading support. Users can now query LRClib for lyrics for all tracks in their current library, and Chronograph will try to find the closest lyrics among the results for each track. This makes the app more useful for users who don't want to sync lyrics themselves but just want to download them.

Sync lyrics of your loved songs 🕒

chronograph-mass-lyrics.png

Parabolic

Download web video and audio.

Nick announces

Parabolic V2025.11.1 is here! This release contains many bug fixes for issues users were experiencing.

A larger new feature update is in the works to address many long standing feature requests.

Here's the full changelog:

  • Fixed the sleep interval for multiple subtitle downloads
  • Fixed an issue where low-resolution media was being downloaded on Windows
  • Fixed an issue where aria2c couldn't download media from certain sites
  • Fixed an issue where Remove Source Data was not clearing all identifiable metadata fields

parabolic.png

Shell Extensions

boerdereinar announces

Hey everyone, this week I've released my clipboard manager extension Copyous with the following features:

  • Supports text, code, images, files, links, characters and colors.
  • Can be opened at mouse pointer or text cursor
  • Pin favorite items
  • Group items with 9 colored tags
  • Customizable clipboard actions
  • Highly customizable

copyous-screenshot.png

PakoVM announces

This week I published Tinted Shell, an extension that simply adds a splash of color to the GNOME Shell theme based on the user's current accent color while respecting the original look.

You can get it from GNOME Extensions and contribute to it on GitHub.

tinted-shell-preview.png

Miscellaneous

Krafting - Vincent says

Hey everyone, this week I pushed updates to use the GNOME 49 runtime and revamped the keyboard shortcuts pages in all my apps available on Flathub.

In addition, SemantiK got some version bumps, and PedantiK got a lot of bug fixes (including a fix for the broken Wikipedia API).

Also, work has started on a PedantiK English pack, to allow languages other than French.

GNOME Foundation

Allan Day announces

A new GNOME Foundation update is available, covering what has happened at the GNOME Foundation over the past two weeks. It covers a fairly long list of topics, including the recent budget report, funding for Outreachy, banking and finance changes, Flathub progress, and more.

Digital Wellbeing Project

Philip Withnall reports

This week in parental controls, Ignacy is working on changing the Shell lock screen to show when a child's screen time limit has been reached; and I've spent a bit of time writing up the technical details of how web filtering will work at https://tecnocode.co.uk/2025/11/27/parental-controls-web-filtering-backend/ (the backend is written, the UI integration is future work)

That's all for this week!

See you next week, and be sure to stop by #thisweek:gnome.org with updates on your own projects!

29 Nov 2025 12:00am GMT

28 Nov 2025

feedPlanet GNOME

Christian Hergert: Libdex Futures from asyncio

One of my original hopes for Libdex was to help us structure complex asynchronous code in the GNOME platform. If my work creating Foundry and Sysprof is any indicator, it has done leaps and bounds for my productivity and quality in that regard.

Always in the back of my mind I hoped we could make those futures integrate with language runtimes.

This morning I finally got around to learning enough of Python's asyncio module to write the extremely minimal glue code. Here is a commit that implements integration as a PyGObject introspection override. It will automatically load when you from gi.repository import Dex.

This only integrates with the asyncio side of things, so your application is still responsible for integrating asyncio and GMainContext, or else nothing will be pumping the DexScheduler. But that is application and toolkit specific, so I must leave it to you.

I would absolutely love it if someone could work on the same level of integration for GJS as I have even less experience with that platform.

28 Nov 2025 9:18pm GMT

Allan Day: GNOME Foundation Update, 2025-11-28

Welcome to another GNOME Foundation update; an overview of everything that's been happening at the Foundation. There was no update last week, due to me being away from my computer last Friday, so this post covers a two week period rather than the usual single week.

Many thanks to everyone who responded to my request for feedback in my previous post! It was great to hear your views on these posts, and it was extremely motivating to get positive feedback on the blog series.

Budget report

In case you didn't see it, last week Rob posted a detailed breakdown of the Foundation's current operating budget. This is the second year in a row that we have provided a budget report for the community, and I'm thrilled that we've been able to keep up the momentum around financial transparency. I'd encourage you to give it a read if you haven't already.

Community travel

One positive aspect of the current budget is that we have a healthy community travel budget, and I really want to encourage members to make use of the fund. The travel budget is there to be spent, and we absolutely want to see community members applying for travel. If you have been interested in organising a hackfest, or attending a GNOME conference, and finances have been a barrier, please do make use of the funding that is available. Information about how to apply can be found in the handbook.

Also on travel: we are currently looking to recruit additional volunteers to help administer the travel budget, as part of the Travel Committee. So, if you are interested in helping with GNOME and would like to get involved, please do get in touch using the comments below, or by messaging the Travel Committee.

Outreachy

The Foundation has a proud history of funding outreach efforts, and has regularly supported interns through Outreachy. The December to March round is almost upon us, and the Internship Committee has coordinated the selection of an intern who we will be sponsoring. We were pleased to release the funding for this internship this week. More details about the internship itself will follow.

Banking and finance systems

As mentioned in recent updates, we have been working through a round of improvements to our banking setup, which will give us enhanced fraud protection, as well as automatic finance management features. This week we had a training session with our bank, the fraud protection features were turned on, and I signed the last of the paperwork. As a result, this round of work is now complete.

I have also been going through the process of signing up for the new financial system that Dawn, our new finance advisor, will be setting up for us.

Bookkeeping meetings

Our regular monthly bookkeeping meeting happened last week, and we had another follow-up call more recently. We are still working through the 2024-25 financial year end accounts, which primarily involves resolving a list of small questions, to make sure that the accounts are 100% complete and accurate. Our bookkeeper has also been very patiently answering questions from Deepa, our treasurer, and myself as we continue to familiarise ourselves with the finance and accounting setup (thank you!)

Board meeting

The Board had a regular meeting this week. The topics under discussion included:

We came away with a healthy list of action items, and I'm looking forward to making progress in each of these areas.

GNOME.Asia

Our upcoming conference in Tokyo continues to be a focus, and Kristi is busy putting the final arrangements together. The event is just 15 days away! A reminder: if you want to participate, please do head over to the site and register.

Flathub

There has been some good progress around Flathub over the past two weeks. Bart has done quite a bit of work to improve the performance of the Flathub website, which I'm sure users will appreciate. We also received some key pieces of legal work, which are required as part of the roadmap to establish Flathub as its own financial/legal entity. With those legal documents in place we have turned our attention to planning Flathub's financial systems; discussions about this are ongoing.

Digital Wellbeing

There was another review call this week to check on progress as the current phase of the program reaches its final stages. The main focus right now is making sure that the new screen time limits feature is in good shape before we use up the remaining funding.

Progress is looking good in general: the main changes for GNOME Shell and Settings have all been merged. There are two more pieces of work to land before we can say that we are in a feature complete state. After that we will circle back to UX review and papercut fixing. If you want more information about these features, I would recommend Ignacy's recent post on the topic.

Philip has also published a fantastic post on the web filtering functionality that has been implemented as part of this program.

That's it for this week! Thanks for reading, and see you next week.

28 Nov 2025 7:26pm GMT

27 Nov 2025

feedPlanet GNOME

Philip Withnall: Parental controls web filtering backend

In my previous post I gave an overview of the backend for the screen time limits feature of parental controls in GNOME. In this post, I'll try and do the same for the web filtering feature.

We haven't said much about web filtering so far, because the user interface for it isn't finished yet. The backend is, though, and it will get plumbed up eventually. We don't currently have a GNOME release targeted for it.

When is web filtering? What is web filtering?

(Apologies to Radio 4 Friday Night Comedy.)

Firstly, what is the aim of web filtering? As with screen time limits, we've written a design document which (hopefully) covers everything. But the summary is that it should allow parents to filter out age-inappropriate content on the web when it's accessed by child accounts, while not breaking the web (for example, by breaking TLS for websites) and not requiring us (as a project) to become curators of filter lists. It needs to work for all apps on the system (lots of apps other than web browsers can show web content), and needs to be able to filter things differently for different users (two different children of different ages might use the same computer, as well as the parents themselves).

After looking at various different possible ways of implementing it, the best solution seemed to be to write an NSS module to respond to name resolution (i.e. DNS) requests and potentially block them according to a per-user filter list.

A brief introduction to NSS

NSS (Name Service Switch) is a standardised name lookup API in libc. It's used for hostname resolution, but also for user accounts and various other things. Names are resolved by various modules which are dlopen()ed into your process by libc and queried in the order given in /etc/nsswitch.conf. So for hostname resolution, a typical configuration in nsswitch.conf would cause libc to query the module which looks at /etc/hosts first, then the module which checks your machine's hostname, then the mDNS module, then systemd-resolved.

So, we can insert our NSS module into /etc/nsswitch.conf, have it run somewhere before systemd-resolved (which in this example does the actual DNS resolution), and have it return a sinkhole address for blocked domains. Because /etc/nsswitch.conf is read by libc within your process, this means that the configuration needs to be modified for containers (flatpak) as well as on the host system.
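For illustration, the resulting hosts line in /etc/nsswitch.conf could look roughly like the following. The filter module's name ("malcontent" here) is an assumption for the example; what matters is that it is listed before the module that performs real DNS resolution (systemd-resolved, via "resolve"):

```
hosts: files myhostname malcontent mdns4_minimal [NOTFOUND=return] resolve [!UNAVAIL=return] dns
```

Because libc reads this file inside each process, the same ordering has to be present inside Flatpak containers as well.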

Because the filter module is loaded into the name lookup layer, content filtering (as opposed to domain name filtering) is not possible with this approach. That's fine - content filtering is hard, I'm not sure it gives better results overall than domain name filtering, and it would mean we couldn't rely on existing domain name filter lists, which are well maintained and regularly updated. We're not planning on adding content filtering.

It also means that DNS-over-HTTPS/-TLS can be supported, as long as the app doesn't implement it natively (i.e. by talking HTTPS over a socket itself). Some browsers do that, so the module needs to set a canary to tell them to disable it. DNS-over-HTTPS/-TLS can still be used if it's implemented by one of the NSS modules, like systemd-resolved.

Nothing here stops apps from deliberately bypassing the filtering if they want, perhaps by talking DNS over UDP directly, or by calling secret internal glibc functions to override nsswitch.conf. In the future, we'd have to implement per-app network sandboxing to prevent bypasses. But for the moment, trusting the apps to cooperate with parental controls is fine.

Filter update daemon

So we have a way of blocking things; but how does it know what to block? There are a lot of filter lists out there on the internet, targeted at existing web filtering software. Basically, a filter list is a list of domain names to block. Some filter lists allow wildcards and regexps, others just allow plain strings. For simplicity, we've gone with plain strings.

We allow the parent to choose zero or more filter lists to build a web filtering policy for a child. Typically, these filter lists will correspond to categories of content, so the parent could choose a filter list for advertising, and another for violent content, for example. The web filtering policy is basically the set of these filter lists, plus some options like "do you want to enforce safe search". This policy is, like all other parental controls policies, stored against the child user in accounts-service.

Combine these filter lists, and you have the filter list to give to NSS in the child's session, right? Not quite - because the internet unfortunately keeps changing, filter lists need to be updated regularly. So actually what we need is a system daemon which can regularly check the filter lists for updates, combine them, and make them available as a compiled file to the child's NSS module - for each user on the system.
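The compilation step itself is conceptually just a set union over plain-string domain lists, with a lookup that sinkholes blocked names. Here is a toy Python sketch of the idea; all the names, the sinkhole address, and the in-memory set are illustrative assumptions, not malcontent's actual API or compiled file format:

```python
# Toy sketch: combine per-category filter lists into one blocklist and
# do an NSS-style lookup that sinkholes blocked domains.
# Illustrative only - not malcontent's actual API or file format.

SINKHOLE = "0.0.0.0"  # assumed sinkhole address for the example

def compile_policy(*filter_lists):
    """Union several plain-string domain lists into one lookup set."""
    blocked = set()
    for lst in filter_lists:
        blocked.update(d.strip().lower() for d in lst if d.strip())
    return blocked

def resolve(domain, blocked):
    """Return a sinkhole address for blocked domains, None to pass through."""
    return SINKHOLE if domain.lower() in blocked else None

ads = ["ads.example.com", "tracker.example.net"]
violence = ["gore.example.org"]
policy = compile_policy(ads, violence)

print(resolve("ads.example.com", policy))  # blocked -> sinkhole address
print(resolve("gnome.org", policy))        # not blocked -> next NSS module
```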

This daemon is malcontent-webd. It has a D-Bus interface to allow the parent to trigger compiling the filter for a child when changing the parental controls policy for that child in the UI, and to get detailed feedback on any errors. Since the filter lists come from third parties on the internet, there are various ways they could have an error.

It also has a timer unit trigger, malcontent-webd-update, which is what triggers it to periodically check the filter lists for all users for updates.

High-level diagram of the web filtering system, showing the major daemons and processes, files, and IPC calls. If it's not clear, the awful squiggled line in the bottom left is meant to be a cloud. Maybe this representation is apt.

And that's it! Hopefully it'll be available in a GNOME release once we've implemented the user interface for it and done some more end-to-end testing, but the screen time limits work is taking priority over it.

27 Nov 2025 1:25pm GMT

Sam Thursfield: Bollocks to Github

I am spending the evening deleting my Github.com account.

There are plenty of reasons you might want to delete your Github account. I'd love to say that this is a coherently orchestrated boycott on my part, in sympathy with the No Azure for Apartheid movement. Microsoft, owner of Github, is a big pile of cash happy to do business with an apartheid state. That's a great reason to delete your Github.com account.

I will be honest with you though, the thing that pushed me over the edge was a spam email they sent entitled "GitHub Copilot: What's in your free plan 🤖". I was in a petty mood this morning.

Offering free LLM access is a money loser. The long play is this: Microsoft would like to create a generation of computer users hooked on GitHub Copilot. And, I have to hand it to them, they have an excellent track record in monopolising how we interact with our PCs.

Deleting my Github.com account isn't going to solve any of that. But it feels good to be leaving, anyway. The one billionth Github repository was created recently and it has a single line README containing the word "shit". I think that summarizes the situation more poetically than I could.

I had 145 repositories in the ssssam/ namespace to delete. The oldest was Iris, forked in 2011.

Quite a story to that one. A fork of a project by legendary GNOME hacker Christian Hergert. In early 2011, I'd finished university, had no desire to work in the software industry, and was hacking on a GTK based music player app from time to time. A rite of passage that every developer has to go through, I suppose. At some point I decided to overcomplicate some aspect of the app and ended up integrating libiris, a library to manage concurrent tasks. Then I started working professionally and abandoned the thing.

It's fun to look at that with 13 years' perspective. I have since learned, largely thanks to Rust, that I cannot possibly ever write correct concurrent thread-based C code. (All my libiris changes had weird race conditions.) I met Christian various times. Christian created libdex, which does the same thing, but much better. I revived the music player app as a playlist generation toolkit. We all lived happily ever after.

Except for the Github fork, which is gone.

What else?

This guy was an early attempt at creating a sort of GObject mapping for SPARQL data as exposed by Localsearch (then Tracker). Also created around 2011. Years later, I did a much better implementation in TrackerResource which we still use today.

The Sam of 2011 would be surprised to hear that we organized GUADEC in Manchester only 6 years later. Back in those days we for some reason maintained our own registration system written in Node.js. I spent the first few weeks of 2017 hacking in support for accommodation bookings.

I discovered another 10 year old gem called "Software Integration Ontology." Nowadays we'd call that an SBOM. Did that term exist 10 years ago? I have spent too much time working on software integration.

Various other artifacts of research into software integration and complexity. A vestigial "Software Dependency Visualizer" project (which took on a life of its own; many years later the idea is alive in KDE Codevis). A fork of Aboriginal Linux, which we unwittingly brought to an end back in 2016. Bits of Baserock, which never went very far but also led to the beginning of BuildStream.

A fork of xdg-app, which is the original name of Flatpak. A library binding GLib to the MS RPC API on Windows, from the 2nd professional project I ever did. These things are now dust.

I had over 100 repositories on github.com. I sponsored one person, whom I can't sponsor any more as they only accept GitHub money. (I sponsor plenty of people on other platforms.)

Anyway, lots of nice people use Github, and you can keep using Github. I will probably have to create the occasional burner account to push PRs where projects haven't migrated away. Some of my projects are now on GitLab.com and in GNOME's GitLab. Others are gone!

It's a weird thing but in 2025 I'm actually happy knowing that there's a little bit less code in the world.

27 Nov 2025 12:27am GMT

25 Nov 2025

feedPlanet GNOME

Sam Thursfield: Status update — 23/11/2025

Bo día.

I am writing this from a high-speed train heading towards Madrid, en route to Manchester. I have a mild hangover and a two-hundred-page printout of "The STPA Handbook"… so I will have no problem sleeping through the journey. I think the only thing keeping me awake is the stunning view.

Sadly I haven't got time to go all the way by train; in Madrid I will transfer to easyJet. It is indeed easy compared to trying to get from Madrid into France by train. Apparently this is mainly the fault of France's SNCF.

On the Spain side, fair play. The ministro de fomento (I think this translates as "guy in charge of trains"?) just announced major works in Barcelona, including a new station in La Sagrera with space for more trains than they have now, more direct access from Madrid, and a speed boost via a new type of railway sleeper, which would raise the top speed from 300 km/h to 350 km/h. And some changes in Madrid, which would reduce the transfer time when arriving from the west and heading out further east. You can argue with many things about the trains in Spain… perhaps it would be useful if the regional trains here ran more than once per day… but you can't argue with the commitment to fast inter-city travel.

If only we had similar investment to fix the cross border links between Spain and France, which are something of a joke. Engineers around the world will know this story. The problem is social: two different organizations, who speak different languages, have to agree on something. There is already a perfectly usable modern train line across the border. How many trains per day? Uh… two. Hope you planned your trip in advance because they're fully booked next week.

Anyway, this isn't meant to be a post on the status of the railways of western Europe.

Digital Resilience Forum

Last month I hopped on another Madrid-bound train to attend the Digital Resilience Forum. It's a one day conference organized by Bitergia who you might know as world leaders in open source community analysis.

I have mixed feelings about "community metrics" projects. As Nick Wellnhofer said regarding libxml, when you participate as a volunteer in a project that is being monitored, it's easy to feel like you're being somehow manipulated by the corporations who sponsor these things. How come you guys will spend time and money analyzing my project's development processes and Git history, but won't spend time actually fixing bugs or contributing improvements upstream? As the ffmpeg developers said: how come you will pay top-calibre security researchers to read our code and find very specific exploits, but then wait for volunteers to fix them?

The Bitergia team are great people who genuinely care about open source, and I really enjoyed the conference. The main themes were: digital sovereignty, geopolitics, the rise of open source, and that XKCD where all our digital infrastructure depends on a single unpaid volunteer in Nebraska (https://xkcd.com/2347/). (Coincidentally, one of the Bitergia guys actually does live in Nebraska.)

It was a day in a world where I am not used to participating: less engineering, more politics and campaigning. Yes, the Sovereign Tech Agency were around. We played a cool role-play game simulating various hypothetical software crises that might happen in the year 2027 (spoiler: in most cases a vendor-neutral, state-funded organization focused on open source was able to save the day :-). It is amazing what they've done so far with a relatively small investment, but it is a small organization, and they maintain that citizens of every country should be campaigning and organizing to set up an equivalent. Let's not tie the health of open source infrastructure too closely to German politics.

Also present: various campaign groups with "Open" at the start of their name - OpenForum Europe, OpenUK, OpenIreland, OpenRail. When I think about the future of Free Software platforms, such as our beloved GNOME, my mind always goes to funding contributors. There's very little money here, while Apple and Microsoft have nearly all of it, and I feel like GNOME still succeeds largely thanks to the evenings and weekends of a small core of dedicated hackers, including some whose day job involves working on some other part of GNOME. It's a bit depressing sometimes to see things this way, because the global economy gets more unequal every day, and how do you convince people who are already squeezed for cash to pay for something that's freely available online? How do you get students facing a super-competitive job market to hack on GTK instead of studying for university exams?

There's another side which I talk about less, and that's education. There are more desktop Linux users than ever - apparently 5% of all desktop users, or something - but there's still very little agreement on, or understanding of, what "open source" is. Most computer users couldn't tell you what an "operating system" does, and don't know why "source code" can be an interesting thing to share and modify.

I don't like to espouse any dogmatic rule that the right way to solve any problem is to release software under the GPLv3. I think the problems society has today with technology come from over-complexity and under-study. (See also my rant from last month.) To tackle that, it is important to have software infrastructure like drivers and compilers available under free software licenses. The Free Software movement has spent the last 40 years doing a pretty amazing job of that, and I think it's surprising how widely software engineers accept that as normal and fight to maintain it. Things could easily be worse. But this effort is one part of a larger problem: helping those people who think of themselves as "non-technical" to understand the fundamentals of computing and not see it as a magic box. Most people alive today have learned to read and write one or more languages, to do mathematics, to operate a car, to build spreadsheets, and to operate a smartphone. Most people I know under 45 have learned to prompt a large language model in the last few years.

With a basic grounding in how a computer operates, you can understand what an operating system does. And then you can see that whoever controls your OS has complete control over your digital life. And you will start to think twice about leaving that control to Apple, Google and Microsoft - big piles of cash where the concept of "ethics" barely has a name.

Reading was once a special skill reserved largely for monks. And it was difficult: we only started putting spaces between the words later on. Now everyone knows what a capital letter is. We need to teach how computers work, we need to stop making them so complicated, and then the idea of open development will come into focus for everyone.

(And yes, I realize this sounds a bit like the permacomputing manifesto.)

Codethink work

This is a long rant, isn't it? My train only just left Zamora and I haven't fallen asleep yet, so there's more to come.

I had a nice few months hacking on Endless OS 7, which has progressed from an experiment to a working system, bootable on bare metal, albeit with various open issues that currently block a stable release. The overview docs in the repo tell you how to play with it.

This is now fully in the hands of the team at Endless, and my next move is going to be in some internal research that has been ongoing for a number of years. Not much of it is secret, in fact quite a lot is being developed in the open, and it relates in part to regulatory compliance and safety-critical software systems.

Codethink dedicates more to open source than most companies its size. We never have trouble getting sponsorship for events like GUADEC. But I do wish I could spend more time maintaining open infrastructure that I use every day, like, you know, GNOME.

This project isn't going to solve that tomorrow, but it does occupy an interesting space at the intersection between industry and open source. The education gap I talked about above is very much present in some industries where we work. Back in February a guy in a German car firm told me, "Nobody here wants open source. What they want is somebody to blame when the thing goes wrong."

Open source software comes with a big disclaimer that says, roughly, that if it breaks you get to keep both pieces. You get to blame yourself.

And that's a good thing! The people who understand a final, integrated system are the only people who can really define "correct behaviour". If you've worked in the same industries I have you might recognise a common anti-pattern: teams who spend all their time arguing about ownership of a particular bug, where team A are convinced it's a misbehaviour of component B and team B will try to prove the exact opposite. Meanwhile nobody actually spends the 15 minutes it would take to fix the bug. Another anti-pattern: team A would love to fix the bug in component B, but team B won't let them even look at the source code. This happens muuuuuuuch more than you might think.

So on this project we're not trying to teach the world how computers work, but we are trying to increase adoption and understanding at least in the software industry. There are some interesting ideas here about looking at software systems from new angles. This is where STPA comes in, by the way: it's a way of breaking a system down not into components but rather into one or more control loops. It's going to take a while to make sense of everything in this new space… but you can expect some more 1500 word blog posts on the topic.
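To make the "control loops, not components" idea a little more concrete, here is a minimal sketch in Python of how an STPA-style control structure might be written down. All the names (the `ControlLoop` class, the cruise-control example) are my own illustrative inventions, not part of STPA itself or of any real tool; STPA proper then examines how each control action could be hazardous (not provided, provided wrongly, too early or late, stopped too soon, and so on).

```python
from dataclasses import dataclass, field

@dataclass
class ControlLoop:
    """One STPA-style control loop: a controller issues control actions
    to a controlled process and receives feedback from it."""
    controller: str                      # who issues control actions
    controlled_process: str              # what is being controlled
    control_actions: list = field(default_factory=list)  # commands flowing down
    feedback: list = field(default_factory=list)         # observations flowing back

    def questions_for(self, action: str) -> list:
        # STPA asks, for each control action, in which contexts it
        # would be unsafe. These generic prompts are the starting point.
        return [
            f"When is '{action}' hazardous if NOT provided?",
            f"When is '{action}' hazardous if provided?",
            f"When is '{action}' hazardous if too early or too late?",
        ]

# A cruise-control system viewed as a loop, rather than as ECU + sensors:
loop = ControlLoop(
    controller="cruise control software",
    controlled_process="vehicle speed",
    control_actions=["increase throttle", "decrease throttle"],
    feedback=["wheel speed sensor", "driver brake input"],
)

for action in loop.control_actions:
    for q in loop.questions_for(action):
        print(q)
```

The point of the exercise is that the hazard analysis falls out of the loop structure itself, not out of arguing over which component "owns" a bug.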

25 Nov 2025 12:03am GMT