12 Feb 2026

feedPlanet GNOME

Olav Vitters: GUADEC 2026 accommodation

One of the things that I appreciate at a GUADEC (when available) is a common accommodation. Loads of attendees appreciated the shared accommodation in Vilanova i la Geltrú, Spain (GUADEC 2006). For GUADEC 2026, Deepesha announced one recommended accommodation, a student residence. GUADEC 2026 is in the same place as GUADEC 2012, meaning: A Coruña, Spain. I didn't go to the 2012 one, though I heard it also had shared accommodation. For those wondering where to stay, I suggest the recommended one.

12 Feb 2026 8:51am GMT

11 Feb 2026


Andy Wingo: free trade and the left

I came of age an age ago, in the end of history: the decade of NAFTA, the WTO, the battle of Seattle. My people hated it all, interpreting the global distribution of production as a race to the bottom leaving post-industrial craters in its wake. Our slogans were to buy local, to the extent that participation in capitalism was necessary; to support local businesses, local products, local communities. What were trade barriers to us? One should not need goods from far away in the first place; autarky is the bestarky.

Now, though, the vibe has shifted. If you told me then that most leftists would be aghast at the cancelling of NAFTA and subsequently relieved when it was re-instated as the USMCA, I think I would have used that most 90s of insults: sellouts. It has been a perplexing decade in many ways.

The last shoe to drop for me was when I realized that, given the seriousness of the climate crisis, it is now a moral imperative everywhere to import as many solar panels as cheaply as possible, from wherever they might be made. Solar panels will liberate us from fossil fuel consumption. China makes them cheapest. Neither geopolitical ("please believe China is our enemy") nor antiglobalist ("home-grown inverters have better vibes") arguments have any purchase on this reality. In this context, any trade barrier on Chinese solar panels means more misery for our grandchildren, as they suffer under global warming that we caused. Every additional half-kilowatt panel we install is that much less dinosaur that other people will choose to burn.

But if trade can be progressive, then when is it so, and when not? I know that the market faithful among my readers will see the question as nonsense, but that is not my home church. We on the left pride ourselves in our critical capacities, in our ability to challenge common understandings. However, I wonder about the extent to which my part of the left has calcified and become brittle, recycling old thoughts that we thunk about new situations that rhyme with things we remember, in which we spare ourselves the effort of engagement out of confidence in our past conclusions, but in doing so miss the mark. What is the real leftist take on the EU-Mercosur agreement, anyway?

This question has consumed me for some weeks. To help me think through it, I have enlisted four sources: Quinn Slobodian's Globalists (2018), an intellectual history of neoliberalism, from the first world war to the 1990s; Marc-William Palen's Pax Economica (2024), a history of arguments for free trade "from the left", from the 1840s to the beginnings of Slobodian's book; Starhawk's Webs of power (2002), a moment-in-time, first-person-plural account of the antiglobalization movement between Seattle and 9/11; and finally Florence Reese's Which side are you on? (1931), as recorded by the Weavers, a haunting refrain that dogs me all through this investigation, reminding me that the question is visceral and not abstract.

stave 1: mechanism

Before diving in though, I should admit that I don't actually understand the basics, and it is not for a lack of trying.

Let's work through some thought experiments together. Let's imagine you are in France and want to buy nails. To produce nails, a capitalist has to purchase machines and buildings and steel, has to pay the workers to operate the machines, then finally the nails have to get boxed and shipped to, you know, me. I could buy French-made nails, apparently there is one clouterie left. Even assuming that Creil is a hotbed of industrial development and that its network of suppliers and machine operators is such that they can produce nails very efficiently, the wage in France is higher than the wage in, say, China, or Vietnam, or Turkey. The cost of the labor that goes into each nail is more.

So if I have 1000€ that I would like to devote to making a thing in France, I might prefer to obtain my nails from Turkey for 50€ rather than from France for 100€, because it leaves me more left over in which I can do my thing.

Perhaps France sees this as a travesty; in response, France imposes a tariff of 100% on imported nails. In that way, it might be the same to me as a consumer whether the nails were from Turkey or France. But this imposes a cost on me; I have less left over with which to do my thing. In exchange, the French clouterie gets to keep on going, along with the workers that work there, their families and the community around the factory, the suppliers of the factory, and so on.

But do I actually care that my nails are made in France? Should the country care that it has a capacity to make nails? When I was 25, I thought so, that it would be better if everything were made locally. Perhaps that was a distortion from having grown up in the US, in which we consider goods originating between either ocean as "ours"; smaller polities cannot afford that conceit. There may or may not be a reason that France might want nails to be made locally; it is a political question, but one that is not without tradeoffs. If we always choose the local option, we will ultimately do worse overall than a country that is happy to import; in the limit case, some goods like coffee would be unobtainable, with a price floor established by the cost to artificially re-create in France the mountain climate of Perú.

Let us accept, however, the argument that free trade improves overall efficiency, yielding productivity gains that ultimately get spread around society, making everyone's lives better. We on the left are used to being more critical than constructive, but we do ourselves a disservice if we let our distaste of capitalism prevent us from appreciating how it works. Reading Marx's Capital, for example, one can't help but appreciate the awe which he has towards the dynamism of capitalism, apart from analysis and critique. In that spirit of dynamic awe, therefore, let us assume that free trade leads to higher productivity. How does it do this?

Here, I think, we must recognize a kind of ruthless cruelty. The one clouterie survives because of local demand; it can't compete on the global market. It will fold eventually. Its workers will be laid off, its factories vacated, its town... well Creil is on the outskirts of Paris, it will live on, but less as a center in and of itself.

The nail factory will try to hang on for a while, though. It will start by putting downward pressure on wages, and extract other concessions from workers, in the name of economic necessity. Otherwise, the boss says, we move the factory to Tunisia. This operates also on larger levels, with the chamber of commerce (MEDEF) arguing for regulatory "simplifications", new kinds of contracts with fewer protections, and so on. To the extent that a factory's product is a commodity - whether the nails are Malaysian or French doesn't matter - to that extent, free trade is indeed a race to the bottom, allowing mobile capital to play different jurisdictions off each other. The same goes for quality standards, environmental standards, and similar ways in which countries might choose to internalize what are elsewhere externalities.

I am a sentimental dude and have always found this sort of situation to be quite sad. Places that are on the up and up have a yuppie, rootless vibe, while those that are in decline have roots but no branches. Economics washes its hands of vibes, though; if it's not measurable in euros, it doesn't exist. What is the value of 19th-century India's textile industry against the low price of Manchester-milled cloth? Can we put a number on it? We did, of course; zero is a number after all.

Finally, before I close today's missive, a note on distribution. Most macro-economics is indifferent as regards distribution; gross domestic product is assessed relative to a country, not its parts. But if nails are producible for 50€ in Turkey, shipping included, that doesn't mean they get sold at that price in France; if they sell for 70€, the capitalist pockets the change. That's the basis of surplus value: the owner pays the input costs, and the input cost of a worker is what the worker needs to make a living (rent, food, etc.), but that cost is less than the value of what the worker produces. That productivity goes up doesn't necessarily mean that the gains make it to the producers; that requires a finer analysis.

to be continued

For me, the fundamental economic argument for free trade is not unambiguously positive. Yes, lower trade barriers should leave us all with more left over with which we can do cool things. But the mechanism is cruel, the benefits accrue unequally, and there is value in producing communities that is not captured in prices.

In my next missive, we go back to the 19th century to see what Marx and Cobden had to say about the topic. Until then, happy trading!

11 Feb 2026 10:20pm GMT

09 Feb 2026


Asman Malika: Career Opportunities: What This Internship Is Teaching Me About the Future

Before Outreachy, when I thought about career opportunities, I mostly thought about job openings, applications, and interviews. Opportunities felt like something you wait for, or hope to be selected for.

This internship has changed how I see that completely.

I'm learning that opportunities are often created through contribution, visibility, and community, not just applications.

Opportunities Look Different in Open Source

Working with GNOME has shown me that contributing to open source is not just about writing code, it's about building a public track record. Every merge request, every review cycle, every improvement becomes part of a visible body of work.

Through my work on Papers: implementing manual signature features, fixing issues, contributing to Poppler codebase and now working on digital signatures, I'm not just completing tasks. I'm building real-world experience in a production codebase used by actual users.

That kind of experience creates opportunities that don't always show up on job boards:

Skills That Expand My Career Options

This internship is also expanding what I feel qualified to do. I'm gaining experience with:

These are skills that apply across many roles, not just one job title. They open doors to remote collaboration, open-source roles, and product-focused engineering work.

Career Is Bigger Than Employment

One mindset shift for me is that career is no longer just about "getting hired." It's also about impact and direction.

I now think more about:

Open source makes career feel less like a ladder and more like a network.

Creating Opportunities for Others

Coming from a non-traditional path into tech, I'm especially aware of how powerful access and guidance can be. Programs like Outreachy don't just create opportunities for individuals, they multiply opportunities through community.

As I grow, I want to contribute not only through code, but also through sharing knowledge, documenting processes, and encouraging others who feel unsure about entering open source.

Looking Ahead

I don't have every step mapped out yet. But I now have something better: direction and momentum.

I want to continue contributing to open source, deepen my technical skills, and work on tools that people actually use. Outreachy and GNOME have shown me that opportunities often come from showing up consistently and contributing thoughtfully.

That's the path I plan to keep following.

09 Feb 2026 2:04pm GMT

Andy Wingo: six thoughts on generating c

So I work in compilers, which means that I write programs that translate programs to programs. Sometimes you will want to target a language at a higher level than just, like, assembler, and oftentimes C is that language. Generating C is less fraught than writing C by hand, as the generator can often avoid the undefined-behavior pitfalls that one has to be so careful about when writing C by hand. Still, I have found some patterns that help me get good results.

Today's note is a quick summary of things that work for me. I won't be so vain as to call them "best practices", but they are my practices, and you can have them too if you like.

static inline functions enable data abstraction

When I learned C, in the early days of GStreamer (oh bless its heart it still has the same web page!), we used lots of preprocessor macros. Mostly we got the message over time that many macro uses should have been inline functions; macros are for token-pasting and generating names, not for data access or other implementation.

But what I did not appreciate until much later was that always-inline functions remove any possible performance penalty for data abstractions. For example, in Wastrel, I can describe a bounded range of WebAssembly memory via a memory struct, and an access to that memory in another struct:

struct memory { uintptr_t base; uint64_t size; };
struct access { uint32_t addr; uint32_t len; };

And then if I want a writable pointer to that memory, I can do so:

#define static_inline \
  static inline __attribute__((always_inline))

static_inline void* write_ptr(struct memory m, struct access a) {
  BOUNDS_CHECK(m, a);
  char *base = __builtin_assume_aligned((char *) m.base, 4096);
  return (void *) (base + a.addr);
}

(Wastrel usually omits any code for BOUNDS_CHECK, and just relies on memory being mapped into a PROT_NONE region of an appropriate size. We use a macro there because if the bounds check fails and kills the process, it's nice to be able to use __FILE__ and __LINE__.)

Regardless of whether explicit bounds checks are enabled, the static_inline attribute ensures that the abstraction cost is entirely burned away; and in the case where bounds checks are elided, we don't need the size of the memory or the len of the access, so they won't be allocated at all.

If write_ptr wasn't static_inline, I would be a little worried that somewhere one of these struct values would get passed through memory. This is mostly a concern with functions that return structs by value; whereas on e.g. AArch64, returning a struct memory would use the same registers that a call to void (*)(struct memory) would use for the argument, the System V x86-64 ABI only allocates two general-purpose registers for return values. I would mostly prefer not to think about this flavor of bottleneck, and that is what static inline functions do for me.

avoid implicit integer conversions

C has an odd set of default integer conversions, for example promoting uint8_t to signed int, and also has weird boundary conditions for signed integers. When generating C, we should probably sidestep these rules and instead be explicit: define static inline u8_to_u32, s16_to_s32, etc conversion functions, and turn on -Wconversion.

Using static inline cast functions also allows the generated code to assert that operands are of a particular type. Ideally, you end up in a situation where all casts are in your helper functions, and no cast is in generated code.
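Such helpers are trivial to write, and centralizing them pays off. A minimal sketch of what they might look like (the narrowing helper and its assert are my own embellishment, not from the post):

```c
#include <assert.h>
#include <stdint.h>

/* Explicit conversion helpers: the goal is that every cast lives in an
   audited helper like these, and no cast appears in generated code. */
static inline uint32_t u8_to_u32(uint8_t x)  { return (uint32_t) x; }
static inline int32_t  s16_to_s32(int16_t x) { return (int32_t) x; }

/* A narrowing helper can assert (in debug builds) that nothing is lost. */
static inline uint8_t u32_to_u8(uint32_t x) {
  assert(x <= UINT8_MAX);   /* catch silent truncation early */
  return (uint8_t) x;
}
```

With -Wconversion enabled, any implicit conversion that slips into generated code becomes a warning, while these helpers document the intended direction at each site.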

wrap raw pointers and integers with intent

Whippet is a garbage collector written in C. A garbage collector cuts across all data abstractions: objects are sometimes viewed as absolute addresses, or ranges in a paged space, or offsets from the beginning of an aligned region, and so on. If you represent all of these concepts with size_t or uintptr_t or whatever, you're going to have a bad time. So Whippet has struct gc_ref, struct gc_edge, and the like: single-member structs whose purpose it is to avoid confusion by partitioning sets of applicable operations. A gc_edge_address call will never apply to a struct gc_ref, and so on for other types and operations.
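As a sketch of the pattern (the constructor and accessor names mirror Whippet's gc_ref and gc_ref_value, though the details here are mine):

```c
#include <stdint.h>

/* A single-member struct partitions the set of applicable operations:
   you can't accidentally pass a raw uintptr_t where a gc_ref is expected,
   and the wrapper costs nothing once the accessors are inlined. */
struct gc_ref { uintptr_t value; };

/* Struct tags and function names live in different C namespaces, so the
   constructor can share the name gc_ref with the type. */
static inline struct gc_ref gc_ref(uintptr_t addr) {
  return (struct gc_ref) { addr };
}

static inline uintptr_t gc_ref_value(struct gc_ref ref) {
  return ref.value;
}
```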

This is a great pattern for hand-written code, but it's particularly powerful for compilers: you will often end up compiling a term of a known type or kind and you would like to avoid mistakes in the residualized C.

For example, when compiling WebAssembly, consider struct.set's operational semantics: the textual rendering states, "Assert: Due to validation, val is some ref.struct structaddr." Wouldn't it be nice if this assertion could translate to C? Well in this case it can: with single-inheritance subtyping (as WebAssembly has), you can make a forest of pointer subtypes:

typedef struct anyref { uintptr_t value; } anyref;
typedef struct eqref { anyref p; } eqref;
typedef struct i31ref { eqref p; } i31ref;
typedef struct arrayref { eqref p; } arrayref;
typedef struct structref { eqref p; } structref;

So for a (type $type_0 (struct (mut f64))), I might generate:

typedef struct type_0ref { structref p; } type_0ref;

Then if I generate a field setter for $type_0, I make it take a type_0ref:

static inline void
type_0_set_field_0(type_0ref obj, double val) {
  ...
}

In this way the types carry through from source to target language. There is a similar type forest for the actual object representations:

typedef struct wasm_any { uintptr_t type_tag; } wasm_any;
typedef struct wasm_struct { wasm_any p; } wasm_struct;
typedef struct type_0 { wasm_struct p; double field_0; } type_0;
...

And we generate little cast routines to go back and forth between type_0ref and type_0* as needed. There is no overhead because all routines are static inline, and we get pointer subtyping for free: if a struct.set $type_0 0 instruction is passed a subtype of $type_0, the compiler can generate an upcast that type-checks.
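Concretely, the cast routines might look something like this (a sketch; the names a real generator emits would differ):

```c
#include <stdint.h>

typedef struct anyref { uintptr_t value; } anyref;
typedef struct eqref { anyref p; } eqref;
typedef struct structref { eqref p; } structref;
typedef struct type_0ref { structref p; } type_0ref;

typedef struct wasm_any { uintptr_t type_tag; } wasm_any;
typedef struct wasm_struct { wasm_any p; } wasm_struct;
typedef struct type_0 { wasm_struct p; double field_0; } type_0;

/* Ref -> object pointer: peel the wrappers down to the address. */
static inline type_0* type_0ref_deref(type_0ref r) {
  return (type_0*) r.p.p.p.value;
}

/* Object pointer -> ref: wrap the address back up. */
static inline type_0ref type_0_to_ref(type_0* obj) {
  return (type_0ref) { { { { (uintptr_t) obj } } } };
}

/* Upcast is free: a type_0ref contains a structref by construction. */
static inline structref type_0ref_upcast(type_0ref r) {
  return r.p;
}
```

Because everything is static inline and the structs are register-sized, the whole forest compiles away; only the type checking remains.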

fear not memcpy

In WebAssembly, accesses to linear memory are not necessarily aligned, so we can't just cast an address to (say) int32_t* and dereference. Instead we memcpy(&i32, addr, sizeof(int32_t)), and trust the compiler to just emit an unaligned load if it can (and it can). No need for more words here!
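For instance, a sketch of an unaligned i32 load from linear memory (the function name is mine):

```c
#include <stdint.h>
#include <string.h>

/* Unaligned load from linear memory: memcpy is well-defined for any
   alignment, and optimizing compilers lower a fixed-size memcpy like
   this to a single (unaligned) load on targets that permit one. */
static inline int32_t load_i32(const char *mem, uint32_t addr) {
  int32_t v;
  memcpy(&v, mem + addr, sizeof v);
  return v;
}
```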

for ABI and tail calls, perform manual register allocation

So, GCC finally has __attribute__((musttail)): praise be. However, when compiling WebAssembly, it could be that you end up compiling a function with, like, 30 arguments, or 30 return values; I don't trust a C compiler to reliably shuffle between different stack argument needs at tail calls to or from such a function. It could even refuse to compile a file if it can't meet its musttail obligations; not a good characteristic for a target language.

Really you would like it if all function parameters were allocated to registers. You can ensure this is the case if, say, you only pass the first n values in registers, and then pass the rest in global variables. You don't need to pass them on a stack, because you can make the callee load them back to locals as part of the prologue.

What's fun about this is that it also neatly enables multiple return values when compiling to C: simply go through the set of function types used in your program, allocate enough global variables of the right types to store all return values, make the function epilogue store any "excess" return values (those beyond the first, if any) in global variables, and have callers reload those values right after calls.
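A sketch of that convention for a two-result function (all names here are hypothetical):

```c
#include <stdint.h>

/* Global spill slot for the second i32 return value of any function
   whose type returns (i32 i32). */
static int32_t ret_i32_1;

/* A wasm function returning (i32 i32): the first result travels via the
   C return value, the second via the global. */
static int32_t divmod(int32_t a, int32_t b) {
  ret_i32_1 = a % b;   /* epilogue stores the "excess" return value */
  return a / b;
}

/* Callers reload excess return values right after the call. */
static void call_divmod(int32_t a, int32_t b, int32_t *q, int32_t *r) {
  *q = divmod(a, b);
  *r = ret_i32_1;
}
```

Since each spill slot is keyed by function type rather than by function, the number of globals stays proportional to the widest signature in the program, not to the number of functions.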

what's not to like

Generating C is a local optimum: you get the industrial-strength instruction selection and register allocation of GCC or Clang, you don't have to implement many peephole-style optimizations, and you get to link to possibly-inlinable C runtime routines. It's hard to improve over this design point in a marginal way.

There are drawbacks, of course. As a Schemer, my largest source of annoyance is that I don't have control of the stack: I don't know how much stack a given function will need, nor can I extend the stack of my program in any reasonable way. I can't iterate the stack to precisely enumerate embedded pointers (but perhaps that's fine). I certainly can't slice a stack to capture a delimited continuation.

The other major irritation is about side tables: one would like to be able to implement so-called zero-cost exceptions, but without support from the compiler and toolchain, it's impossible.

And finally, source-level debugging is gnarly. You would like to be able to embed DWARF information corresponding to the code you residualize; I don't know how to do that when generating C.

(Why not Rust, you ask? Of course you are asking that. For what it is worth, I have found that lifetimes are a frontend issue; if I had a source language with explicit lifetimes, I would consider producing Rust, as I could machine-check that the output has the same guarantees as the input. Likewise if I were using a Rust standard library. But if you are compiling from a language without fancy lifetimes, I don't know what you would get from Rust: fewer implicit conversions, yes, but less mature tail call support, longer compile times... it's a wash, I think.)

Oh well. Nothing is perfect, and it's best to go into things with your eyes wide open. If you got down to here, I hope these notes help you in your generations. For me, once my generated C type-checked, it worked: very little debugging has been necessary. Hacking is not always like this, but I'll take it when it comes. Until next time, happy hacking!

09 Feb 2026 1:47pm GMT

07 Feb 2026


Jussi Pakkanen: C and C++ dependencies, don't dream it, be it!

Bill Hoffman, the original creator of CMake, gave a presentation at CppCon. At approximately 49 minutes in, he starts talking about future plans for dependency management. He says, and I now quote him directly, that "in this future I envision" one should be able to do something like the following (paraphrasing).

Your project has dependencies A, B and C. Typically you get them from "the system" or a package manager. But then you'd need to develop one of the deps as well. So it would be nice if you could somehow download, say, A, build it as part of your own project and, once you are done, switch back to the system one.

Well, Mr. Hoffman, do I have wonderful news for you! You don't need to treasure these sensual daydreams any more. This so-called "future" you "envision" is not only the present, but in fact the ancient past. This method of dependency management has existed in Meson for so long that I don't even remember when it was added; at least five years, probably more.

How would you use such a wild and an untamed thing?

Let's assume you have a Meson project that is using some dependency called bob. The current build uses it from the system (typically via pkg-config, but the exact method is irrelevant). In order to build bob from source as part of your project, first you need to obtain it. Assuming it is available in WrapDB, all you need to do is run this command:

meson wrap install bob

If it is not, then you need to do some more work. You can even tell Meson to check out the project's Git repo and build against current trunk if you so prefer. See documentation for details.
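For reference, a subprojects/bob.wrap file for a Git checkout might look roughly like this (the URL and revision are placeholders; see the Meson wrap documentation for the exact fields):

```ini
[wrap-git]
url = https://example.com/bob.git
revision = head
depth = 1

[provide]
dependency_names = bob
```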

Then you need to tell Meson to use the internal one. There is a global option to switch all dependencies to be local, but in this case we want only this dependency to be built and get the remaining ones from the system. Meson has a builtin option for exactly this:

meson configure builddir -Dforce_fallback_for=bob

Starting a build will now reconfigure the project to use the local copy. Once you are done and want to go back to using system deps, run this command:

meson configure builddir -Dforce_fallback_for=

This is all you need to do. That is the main advantage of competently designed tools. They rose tint your world and keep you safe from trouble and pain. Sometimes you can see the blue sky through the tears in your eyes.

Oh, just one more deadly sting

If you keep watching the presenter first asks the audience if this is something they would like. Upon receiving a positive answer he then follows up with this [again quoting directly]:

So you should all complain to the DOE [presumably US Department of Energy] for not funding the SBIR [presumably some sort of grant or tender] for this.

Shaming your end users into lobbying an authoritarian/fascist government to give large sums of money in a tender that only one for-profit corporation can reasonably win is certainly a plan.

Instead of working on this kind of a muscle man you can alternatively do what we in the Meson project did: JFDI. The entire functionality was implemented by maybe 3 to 5 people, some working part time but most being volunteers. The total amount of work it took is probably a fraction of the clerical work needed to deal with all the red tape that comes with a DoE tender process.

In the interest of full disclosure

While writing this blog post I discovered a corner case bug in our current implementation. At the time of writing it is only seven hours old, and not particularly beautiful to behold as it has not been fixed yet. And, unfortunately, the only thing I've come to trust is that bugfixes take longer than you would want them to.

07 Feb 2026 5:43pm GMT

06 Feb 2026


Christian Hergert: Mid-life transitions

The past few months have been heavy for many people in the United States, especially families navigating uncertainty about safety, stability, and belonging. My own mixed family has been working through some of those questions, and it has led us to make a significant change.

Over the course of last year, my request to relocate to France while remaining in my role moved up and down the management chain at Red Hat for months without resolution, ultimately ending in a denial. That process significantly delayed our plans despite providing clear evidence of the risks involved to our family. At the beginning of this year, my wife and I moved forward by applying for long-stay visitor visas for France, a status that does not include work authorization.

During our in-person visa appointment in Seattle, a shooting involving CBP occurred just a few parking spaces from where we normally park for medical outpatient visits back in Portland. It was covered by the news internationally and you may have read about it. Moments like that have a way of clarifying what matters and how urgently change can feel necessary.

Our visas were approved quickly, which we're grateful for. We'll be spending the next year in France, where my wife has other Tibetan family. I'm looking forward to immersing myself in the language and culture and to taking that responsibility seriously. Learning French in mid-life will be humbling, but I'm ready to give it my full focus.

This move also means a professional shift. For many years, I've dedicated a substantial portion of my time to maintaining and developing key components across the GNOME platform and its surrounding ecosystem. These projects are widely used, including in major Linux distributions and enterprise environments, and they depend on steady, ongoing care.

For many years, I've been putting in more than forty hours each week maintaining and advancing this stack. That level of unpaid or ad-hoc effort isn't something I can sustain, and my direct involvement going forward will be very limited. Given how widely this software is used in commercial and enterprise environments, long-term stewardship really needs to be backed by funded, dedicated work rather than spare-time contributions.

If you or your organization depend on this software, now is a good time to get involved. Perhaps by contributing engineering time, supporting other maintainers, or helping fund long-term sustainability.

The following is a short list of important modules where I'm roughly the sole active maintainer:

There are, of course, many other modules I contribute to, but these are the ones most in need of attention. I'm committed to making the transition as smooth as possible and am happy to help onboard new contributors or teams who want to step up.

My next chapter is about focusing on family and building stability in our lives.

06 Feb 2026 11:37pm GMT

Lucas Baudin: Being a Mentor for Outreachy

I first learned about Outreachy reading Planet GNOME 10 (or 15?) years ago. At the time, I did not know much about free software and I was puzzled by this initiative, as it mixed politics and software in a way I was not used to.

Now I am a mentor for the December 2025 Outreachy cohort for Papers (aka GNOME Document Viewer), so I figured I would write a blog post to explain what Outreachy is and perpetuate the tradition! Furthermore, I thought it might be interesting to describe my experience as a mentor so far.

Papers and Outreachy logo

What is Outreachy?

Quoting the Outreachy website:

Outreachy provides [paid] internships to anyone from any background who faces underrepresentation, systemic bias, or discrimination in the technical industry where they are living.

These internships are paid and carried out in open-source projects. By way of anecdote, it was initially organized by the GNOME community around 2006-2009 to encourage women's participation in GNOME and was progressively expanded to other projects later on. It was formally renamed Outreachy in 2015 and is now managed independently of GNOME, apart from GNOME's continued participation as one of the open-source projects.

Compared to the well-funded Summer of Code program run by Google, Outreachy is in a much more precarious financial situation, especially in recent years. Unsurprisingly, the evolution of politics in the US and elsewhere over the last few years has not helped.

Therefore, most internships are nowadays funded directly by open-source projects (in our case the GNOME Foundation, you can donate and become a Friend of GNOME), and Outreachy still has to finance (at least) its staff (donations here).

Outreachy as a Mentor

So, I am glad that the GNOME Foundation was able to fund an Outreachy internship for the December 2025 cohort. As I am one of the Papers maintainers, I decided to volunteer to mentor an intern and came up with a project on document signatures. This was one of the first issues filed when Papers was forked from Evince, and I don't think I need to elaborate on how useful PDF signing is nowadays. Furthermore, Tobias had already made designs for this feature, so I knew that if we actually had an intern, we would know precisely what needed to be implemented.[1]

Once the GNOME Internship Committee for Outreachy approved the project, it was submitted on the Outreachy website, and applicants were invited to start making contributions during the month of October so that projects could then select interns (and applicants could decide whether they wanted to work for three months in this community). Applicants had already been screened by Outreachy (303 of the 3461 applications received were approved). We had several questions and contributions from around half a dozen applicants, and that alone was an enriching experience for me. For instance, it was interesting to see how newcomers to Papers could be puzzled by our documentation.

At this point, a crucial thing was labeling some issues as "Newcomers". It is much harder than it looks (because sometimes things that seem simple actually aren't), and it is necessary to make sure that issues are not ambiguous, as applicants typically do not dare to ask questions (even, of course, when it is specified that questions are welcome!). Communication is definitely one of the hardest things.

In the end, I had to grade applicants (another hard thing to do), and the Internship Committee selected Malika Asman, who accepted to participate as an intern! Malika has written about her experience so far in several posts on her blog.

1

Outreachy internships do not have to be centered around programming; however, that is what I could offer guidance for.

06 Feb 2026 8:03pm GMT

Allan Day: GNOME Foundation Update, 2026-02-06

Welcome to another GNOME Foundation weekly update! FOSDEM happened last week, and we had a lot of activity around the conference in Brussels. We are also extremely busy getting ready for our upcoming audit, so there's lots to talk about. Let's get started.

FOSDEM

FOSDEM happened in Brussels, Belgium, last weekend, from 31st January to 1st February. There were lots of GNOME community members in attendance, and plenty of activities around the event, including talks and several hackfests. The Foundation was busy with our presence at the conference, plus our own fringe events.

Board hackfest

Seven of our nine directors met for an afternoon and a morning prior to FOSDEM proper. Face-to-face hackfests are something that the Board has done at various times previously, and they have always been a very effective way to move forward on big-ticket items. This event was no exception, and I was really happy that we were able to make it happen.

During the event we took the time to review the Foundation's financials, and to make some detailed plans in a number of key areas. It's exciting to see some of the initiatives that we've been talking about starting to take more shape, and I'm looking forward to sharing more details soon.

Advisory Board meeting

The afternoon of Friday 30th January was occupied with a GNOME Foundation Advisory Board meeting. This is a regular occurrence on the day before FOSDEM, and is an important opportunity for the GNOME Foundation Board to meet with partner organizations and supporters.

Turnout for the meeting was excellent, with Canonical, Google, Red Hat, Endless and PostmarketOS all in attendance. I gave a presentation on how the Foundation is currently performing, which seemed to be well-received. We then had presentations and discussion amongst Advisory Board members.

I thought that the discussion was useful, and we identified a number of areas of shared interest. One of these was around how partners (companies, projects) can get clear points of contact for technical decision making in GNOME and beyond. Another positive theme was a shared interest in accessibility work, which was great to see.

We're hoping to facilitate further conversations on these topics in future, and will be holding our next Advisory Board meeting in the summer prior to GUADEC. If there are any organizations out there that would like to join the Advisory Board, we would love to hear from you.

Conference stand

GNOME had a stand during both FOSDEM days, which was really busy. I worked the stand on the Saturday and had great conversations with people who came to say hi. We also sold a lot of t-shirts and hats!

I'd like to give a huge thank you to Maria Majadas who organized and ran our stand this year. It is incredibly exhausting work and we are so lucky to have Maria in our community. Please say thank you to her!

We also had plenty of other notable volunteers, including Julian Sparber, Ignacy Kuchciński, and Sri Ramkrishna. Richard Litteaur, our previous Interim Executive Director, even took a shift on the stand.

Social

On the Saturday night there was a GNOME social event, hosted at a local restaurant. As always it was fantastic to get together with fellow contributors, and we had a good turnout with 40-50 people there.

Audit preparation

Moving on from FOSDEM, there has been plenty of other activity at the Foundation in recent weeks. The first of these is preparation for our upcoming audit. I have written a fair bit about this in these previous updates. The audit is a routine exercise, but this is also our first, so we are learning a lot.

The deadline for us to provide our documentation submission to the auditors is next Tuesday, so everyone on the finance side of the operation has been really busy getting all that ready. Huge thanks to everyone for their extra effort here.

GUADEC & LAS planning

Conference planning has been another theme in the past few weeks. For GUADEC, accommodation options have been announced, artwork has been produced, and local information is going up on the website.

Linux App Summit, which we co-organise with KDE, has been a bit delayed this year, but we have a venue now and are in the process of finalizing the budget. Announcements about the dates and location will hopefully be made quite soon.

Google verification

A relatively small task, but a good one to highlight: this week we facilitated (ie. paid for) the assessment process for GNOME's integration with Google services. This is an annual process we have to go through in order to keep Evolution Data Server working with Google.

Infrastructure optimization

Finally, Bart, along with Andrea, has been doing some work to optimize the resource usage of GNOME infrastructure. If you are using GNOME services you might have noticed some subtle changes as a result of this, like Anubis popping up more frequently.

That's it for this week. Thanks for reading; I'll see you next week!

06 Feb 2026 6:00pm GMT

Matthias Clasen: GTK hackfest, 2026 edition

As is by now a tradition, a few of the GTK developers got together in the days before FOSDEM to make plans and work on your favorite toolkit.

Code

We released gdk-pixbuf 2.44.5 with glycin-based XPM and XBM loaders, rounding out the glycin transition. Note that the XPM/XBM support will only appear in glycin 2.1. Another reminder is that gdk_pixbuf_new_from_xpm_data() was deprecated in gdk-pixbuf 2.44 and should not be used any more, as it does not allow for error handling in case the XPM loader is not available; if you still have XPM assets, please convert them to PNG, and use GResource to embed them into your application if you don't want to install them separately.
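For anyone converting XPM assets, embedding the replacement PNGs with GResource looks roughly like this; the resource prefix and file name below are hypothetical, not taken from any real application:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- app.gresource.xml — hypothetical prefix and file name -->
<gresources>
  <gresource prefix="/org/example/App">
    <file>icon.png</file>
  </gresource>
</gresources>
```

Compile the bundle with glib-compile-resources (or list the XML in Meson's gnome.compile_resources()), and the image can then be loaded at the corresponding path, e.g. with gdk_texture_new_from_resource("/org/example/App/icon.png") in GTK 4.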

We also released GTK 4.21.5, in time for the GNOME beta release. The highlights in this snapshot are still more SVG work (including support for SVG filters in CSS) and lots of GSK renderer refactoring. We decided to defer the session saving support, since early adopters found some problems with our APIs; once the main development branch opens for GTK 4.24, we will work on a new iteration and ask for more feedback.

Discussions

One topic that we talked about is unstable APIs, but no clear conclusion was reached. Keeping experimental APIs in the same shared object was seen as problematic (not just because of ABI checkers). Making a separate shared library (and a separate namespace, for bindings) might not be easy.

Still on the topic of APIs, we decided that we want to bump our C runtime requirement to C11 in the next cycle, to take advantage of standard atomics, integer types and booleans. At the moment, C11 is a soft requirement through GLib. We also talked about GLib's autoptrs, and were saddened by the fact that we still can't use them without dropping MSVC. The defer proposal for C2y would not really work with how we use automatic cleanup for types, either, so we can't count on the C standard to save us.

Mechanics

We collected some ideas for improving project maintenance. One idea that came up was to look at automating issue tagging, so it is easier for people to pay closer attention to a subset of all open issues and MRs. Having more accurate labels on merge requests would allow people to get better notifications and avoid watching the whole project.

We also talked about the state of GTK3 and agreed that we want to limit changes in this very mature code base to crash and build fixes: the chances of introducing regressions in code that has long since been frozen are too high.

Accessibility

On the accessibility side, we are somewhat worried about the state of AccessKit. The code upstream is maintained, but we haven't seen movement in the GTK implementation. We still default to the AT-SPI backend on Linux, but AccessKit is used on Windows and macOS (and possibly Android in the future); it would be nice to have consumers of the accessibility stack looking at the code and issues.

On the AT-SPI side we are still missing proper feature negotiation in the protocol; interfaces are now versioned on D-Bus, but there's no mechanism to negotiate the supported set of roles or events between toolkits, compositors, and assistive technologies, which makes running newer applications on older OS versions harder.

We discussed the problem of the ARIA specification being mostly "stringly" typed in the attributes values, and how it impacts our more strongly typed API (especially with bindings); we don't have a good generic solution, so we will have to figure out possible breaks or deprecations on a case by case basis.

Finally, we talked about a request by the LibreOffice developers on providing a wrapper for the AT-SPI collection interface; this API is meant to be used as a way to sidestep the array-based design, and perform queries on the accessible objects tree. It can be used to speed up iterating through large and sparse trees, like documents or spreadsheets. It's also very AT-SPI specific, which makes it hard to write in a platform-neutral way. It should be possible to add it as a platform-specific API, like we did for GtkAtSpiSocket.

Carlos is working on landing the pointer query API in Mutter, which would address the last remnant of X11 use inside Orca.

Outlook

Some of the plans and ideas that we discussed for the next cycle include:

Until next year, ❤

06 Feb 2026 10:43am GMT

This Week in GNOME: #235 Integrating Fonts

Update on what happened across the GNOME project in the week from January 30 to February 06.

GNOME Core Apps and Libraries

GTK

Cross-platform widget toolkit for creating graphical user interfaces.

Emmanuele Bassi says

The GTK developers published the report for the 2026 GTK hackfest on their development blog. Lots of work and plans for the next 12 months:

  • session save/restore
  • toolchain requirements
  • accessibility
  • project maintenance

and more!

Glycin

Sandboxed and extendable image loading and editing.

Sophie (she/her) reports

Glycin 2.1.beta has been released. Starting with this version, the JPEG 2000 image format is supported by default. This was made possible by a new JPEG 2000 implementation that is completely written in safe Rust.

While this image format isn't in widespread use for images directly, many PDFs contain JPEG 2000 images, since PDF 1.5 and PDF/A-2 support embedding them. Therefore, images extracted from PDFs frequently have the JPEG 2000 format.

GNOME Circle Apps and Libraries

Resources

Keep an eye on system resources

nokyan says

This week marks the release of Resources 1.10 with support for new hardware, software and improvements all around! Here are some highlights:

  • Added support for AMD NPUs using the amdxdna driver
  • Improved accessibility for screen reader users and keyboard users
  • Vastly improved app detection
  • Significantly cut down CPU usage
  • Searching for multiple process names at once is now possible using the "|" operator in the search field

In-depth release notes can be found on GitHub.

Resources is available on Flathub.

gtk-rs

Safe bindings to the Rust language for fundamental libraries from the GNOME stack.

Julian 🍃 announces

I've added another chapter for the gtk4-rs book. It describes how to use gettext to make your app available in other languages: https://gtk-rs.org/gtk4-rs/stable/latest/book/i18n.html

Third Party Projects

Ronnie Nissan announces

This week I released Sitra, an app to install and manage fonts from Google Fonts. It also helps developers integrate fonts into their projects using the Fontsource npm packages and CDN.

The app is a replacement for the Font Downloader app, which has been abandoned for a while.

Sitra can be downloaded from Flathub

Arnis (kem-a) announces

AppManager is a desktop utility written in Vala with GTK/Libadwaita that makes installing and uninstalling AppImages on the Linux desktop painless. It supports both SquashFS and DwarFS AppImage formats, features a seamless background auto-update process, and leverages zsync delta updates for efficient bandwidth usage. Double-click any .AppImage to open a macOS-style drag-and-drop window; just drag to install, and AppManager will move the app, wire up desktop entries, and copy icons.

And of course, it's available as an AppImage. Get it on GitHub

Parabolic

Download web video and audio.

Nick reports

Parabolic V2026.2.0 is here!

This release contains a complete overhaul of the downloading engine as it was rewritten from C++ to C#. This will provide us with more stable performance and faster iteration of highly requested features (see the long list below!!). The UIs for both Windows and Linux were also ported to C# and got a face lift, providing a smoother and more beautiful downloading experience.

Besides the rewrite, this release also contains many new features (including quality and subtitle options for playlists - finally!) and plenty of bug fixes with an updated yt-dlp.

Here's the full changelog:

  • Parabolic has been rewritten in C# from C++
  • Added arm64 support for Windows
  • Added support for playlist quality options
  • Added support for playlist subtitle options
  • Added support for reversing the download order of a playlist
  • Added support for remembering the previous Download Immediately selection in the add download dialog
  • Added support for showing yt-dlp's sleeping pauses within download rows
  • Added support for enabling nightly yt-dlp updates within Parabolic
  • Redesigned both platform application designs for a faster and smoother download experience
  • Removed documentation pages as Parabolic shows in-app documentation when needed
  • Fixed an issue where translator-credits were not properly displayed
  • Fixed an issue where Parabolic crashed when adding large amounts of downloads from a playlist
  • Fixed an issue where Parabolic crashed when validating certain URLs
  • Fixed an issue where Parabolic refused to start due to keyring errors
  • Fixed an issue where Parabolic refused to start due to VC errors
  • Fixed an issue where Parabolic refused to start due to version errors
  • Fixed an issue where opening the about dialog would freeze Parabolic for a few seconds
  • Updated bundled yt-dlp

Shell Extensions

subz69 reports

I just released Pigeon Email Notifier, a new GNOME Shell extension for Gmail and Microsoft email notifications using GNOME Online Accounts. Supports priority-only mode, persistent and sound notifications.

Miscellaneous

Arjan reports

PyGObject 3.55.3 has been released. It's the third development release (it's not available on PyPI) in the current GNOME release cycle.

The main achievements for this development cycle, leading up to GNOME 50, are:

  • Support for do_dispose and do_constructed methods in Python classes. do_constructed is called after an object has been constructed (as a post-init method), and do_dispose is called when a GObject is disposed.
  • Removal of duplicate marshalling code for fields, properties, constants, and signal closures.
  • Removal of old code, most notably pygtkcompat and wrappers for GLib.OptionContext/OptionGroup.
  • Under the hood, toggle references have been replaced by normal references, and PyGObject now sinks "floating" objects by default.

Notable changes for this release include:

  • Type annotations for the GLib and GObject overrides. This makes it easier for pygobject-stubs to generate type hints.
  • Updates to the asyncio support.

A special thanks to Jamie Gravendeel, Laura Kramolis, and K.G. Hammarlund for test-driving the unstable versions.

All changes can be found in the Changelog.

This release can be downloaded from GitLab and the GNOME download server. If you use PyGObject in your project, please give it a spin and see if everything works as expected.

That's all for this week!

See you next week, and be sure to stop by #thisweek:gnome.org with updates on your own projects!

06 Feb 2026 12:00am GMT

Cassidy James Blaede: ROOST at FOSDEM 2026

A few months ago I joined ROOST (Robust Open Online Safety Tools) to build our open source community that would be helping to create, distribute, and maintain common tools and building blocks for online trust and safety. One of the first events I wanted to make sure we attended in order to build that community was of course FOSDEM, the massive annual gathering of open source folks in Brussels, Belgium.

Luckily for us, the timing aligned nicely with the v1 release of our first major online safety tool, Osprey, as well as its adoption by Bluesky and the Matrix.org Foundation. I wrote and submitted a talk for the FOSDEM crowd and the decentralized communications track, which was accepted. Our COO Anne Bertucio and I flew out to Brussels to meet up with folks, make connections, and learn how our open source tools could best serve open protocols and platforms.

Brunch with the Christchurch Call Foundation

On Saturday, ROOST co-hosted a brunch with the Christchurch Call Foundation, where we invited folks to discuss the intersection of open source and online safety. The event was relatively small, but we engaged in meaningful conversations and came away with several recurring themes. Non-exhaustively, some areas attendees were interested in: novel classifiers for unique challenges like audio recordings and pixel art; how to ethically source and train classifiers; and ways to work better together across platforms and protocols.

Personally I enjoyed meeting folks from Mastodon, GitHub, ATproto, IFTAS, and more in person for the first time, and I look forward to continuing several conversations that were started over coffee and fruit.

Talk

Our Sunday morning talk "Stop Reinventing in Isolation" (which you can watch on YouTube or at fosdem.org) filled the room and was really well-received.

Cassidy and Anne giving a talk. | Photos from @matrix@mastodon.matrix.org

In it we tackled three major topics: a crash course on what is "trust and safety"; why the field needs an open source approach; and then a bit about Osprey, our self-hostable automated rules engine and investigation tool that started as an internal tool built at Discord.

Q&A

We had a few minutes for Q&A after the talk, and the folks in the room spurred some great discussions. If there's something you'd like to ask that isn't covered by the talk or this Q&A, feel free to start a discussion! Also note that this gets a bit nerdy; if you're not interested in the specifics of deploying Osprey, feel free to skip ahead to the Stand section.

When using Osprey with the decentralized Matrix protocol, would it be a policy server implementation?

Yes, in the Matrix model that's the natural place to handle it. Chat servers are designed to check with the policy server before sending room events to clients, so it's precisely where you'd want to run automated rules. The Matrix.org Foundation is actively investigating how exactly Osprey can be used with this setup, and already has it deployed in their staging environment for testing.

Does it make sense to use Osprey for smaller platforms with fewer events than something like Matrix, Bluesky, or Discord?

This one's a bit harder to answer, because Osprey is often the sort of tool you don't "need" until you suddenly and urgently do. That said, it is designed as an in-depth investigation tool, and if that's not something needed on your platform yet due to the types and volume of events you handle, it could be overkill. You might be better off starting with a moderation/review dashboard like Coop, which we expect to be able to release as v0 in the coming weeks. As your platform scales, you could then explore bringing Osprey in as a complementary tool to handle more automation and deeper investigation.

Does Osprey support account-level fraud detection?

Osprey itself is pretty agnostic to the types of events and metadata it handles; it's more like a piece of plumbing that helps you connect a firehose of events to one end, write rules and expose those events for investigation in the middle, and then connect outgoing actions on the other end. So while it's been designed for trust and safety uses, we've heard interest from platforms using it in a fraud prevention context as well.

What are the hosting requirements of Osprey, and what do deployments look like?

While you can spin Osprey up on a laptop for testing and development, it can be a bit beefy. Osprey is made up of four main components: worker, UI, database, and Druid as the analytics database. The worker and UI have low resource requirements, your database (e.g. Postgres) could have moderate requirements, but then Druid is what will have the highest requirements. The requirements will also scale with your total throughput of events being processed, as well as the TTLs you keep in Druid. As for deployments, Discord, Bluesky, and the Matrix.org Foundation have each integrated Osprey into their Kubernetes setups as the components are fairly standard Docker images. Osprey also comes with an optional coordinator, an action distribution and load-balancing service that can aid with horizontal scaling.
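The four-component shape described above could be sketched as a docker-compose file like the following. This is purely illustrative: the image names, ports, and environment variables are made up for the sketch and are not Osprey's actual configuration, which you should take from the project's own deployment docs.

```yaml
# Hypothetical sketch only — image names, ports, and env vars are invented
# to illustrate the worker / UI / database / Druid layout described above.
services:
  osprey-worker:
    image: example/osprey-worker:latest    # hypothetical image name
    environment:
      DATABASE_URL: postgres://osprey:secret@db:5432/osprey
      DRUID_URL: http://druid:8888
    depends_on: [db, druid]
  osprey-ui:
    image: example/osprey-ui:latest        # hypothetical image name
    ports: ["8080:8080"]
    depends_on: [db, druid]
  db:
    image: postgres:16                     # moderate resource requirements
    environment:
      POSTGRES_USER: osprey
      POSTGRES_PASSWORD: secret
      POSTGRES_DB: osprey
  druid:
    image: apache/druid:31.0.0             # the heavy component; size it for
                                           # your event throughput and TTLs
```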

Stand

This year we were unable to secure a stand (there were already nearly 100 stands in just 5 buildings!), but our friends at Matrix graciously hosted us for several hours at their stand near the decentralized communications track room so we could follow up with folks after our talk. We blew through our shiny sticker supply as well as our 3D printed ROOST keychains (which I printed myself at home!) in just one afternoon. We'll have to bring more to future FOSDEMs!

Stickers

When I handed people one of our hexagon stickers the reaction was usually some form of, "ooh, shiny!" but my favorite was when someone essentially said, "Oh, you all actually know open source!" That made me proud, at least. :)

Interesting Talks

Lastly, I always like to shout out interesting talks I attended or caught on video later so others can enjoy them on their own time. I recommend checking out:

06 Feb 2026 12:00am GMT

30 Jan 2026

feedPlanet GNOME

This Week in GNOME: #234 Annotated Documents

Update on what happened across the GNOME project in the week from January 23 to January 30.

GNOME Core Apps and Libraries

Document Viewer (Papers)

View, search or annotate documents in many different formats.

lbaudin announces

Papers can now be used to draw freehand annotations on PDF documents (ink), as well as add text to them! These features were merged this week and are now available in GNOME nightly; more details in this blog post.

GTK

Cross-platform widget toolkit for creating graphical user interfaces.

Emmanuele Bassi reports

As usual, a few GTK developers are meeting up before FOSDEM for the planning hackfest; we are discussing the current state of the project, and also where we want to go in the next 6-12 months:

  • the new SVG rendering code
  • accessibility
  • icons and other assets
  • platform support, especially Windows and Android
  • various improvements in the GLib code
  • the state of various dependencies, like gdk-pixbuf and accesskit
  • whether to introduce unstable API as an opt in for experimentation, before finalising it

You can follow the agenda and the notes here: https://pad.gnome.org/gtk-hackfest-2026

We are also going to be at the GNOME social event on Saturday in Brussels, so make sure to join us!

Emmanuele Bassi says

Matthias just released a new GTK 4.21 developers snapshot, in time for GNOME 50's beta release. This release brings various changes:

  • the state saving and restoring API has been made private; we have received feedback from early adopters, and we are going to need to go back to the drawing board in order to address some issues related to its use
  • GSK shaders are now autogenerated
  • GTK does not depend on librsvg any more, and implements its own SVG renderer, including various filters
  • the Inspector has a heat map generator
  • SVG filters can be used inside CSS data URLs
  • GtkAspectFrame's measurement has been fixed to properly (and efficiently) support more cases and fractional sizes

Additionally, we have multiple fixes for Windows, macOS, and Android. Lots of things to look forward to in the 4.22 stable release!

GNOME Circle Apps and Libraries

gtk-rs

Safe bindings to the Rust language for fundamental libraries from the GNOME stack.

Julian 🍃 announces

After quite a long hiatus, I continued writing the gtk4-rs book. This time we introduce the build system Meson. This sets the stage for more interesting features like internationalization: https://gtk-rs.org/gtk4-rs/stable/latest/book/meson.html

Mahjongg

Match tiles and clear the board

Mat announces

Mahjongg 49.1 has been released, and is available on Flathub. This release mainly focuses on usability improvements, and includes the following changes:

  • Implement pause menu with 'Resume' and 'Quit' buttons
  • Add Escape keyboard shortcut to pause game
  • Pause game when main window is obscured
  • Pause game when dialogs and menus are visible
  • Don't allow pausing completed games
  • Don't show confirmation dialog for layout change after completing game
  • Fix text entry not always receiving focus in Scores dialog
  • Translation updates

Third Party Projects

Danial reports

We are announcing an important update to Carburetor, our tool for easily setting up a Tor proxy. This release focuses on crucial improvements for users in Iran, where Tor remains one of the few reliable ways to stay connected.

Following the massacre of protesters by the Iranian state, which reportedly led to the killing of more than 60,000 individuals in a couple of days (including shooting injured people in the head in their hospital beds), the Internet and all other means of communication such as SMS and landlines suffered a total shutdown. After dozens of days, network access is now very fragile and heavily restricted there.

In response, this update adds support for Snowflake bridges with AMP cache rendezvous, which have proven more reliable under current conditions. To use them, ensure these two bridges are included in your inventory:

snowflake 192.0.2.5:80 2B280B23E1107BB62ABFC40DDCC8824814F80A72 url=https://snowflake-broker.torproject.net/ ampcache=https://cdn.ampproject.org/ front=www.google.com tls-imitate=hellorandomizedalpn
snowflake 192.0.2.6:80 8838024498816A039FCBBAB14E6F40A0843051FA url=https://snowflake-broker.torproject.net/ ampcache=https://cdn.ampproject.org/ front=www.google.com tls-imitate=hellorandomizedalpn

We've also removed the previous 90-second connection timeout, as establishing a connection now often takes much longer due to extreme throttling and filtering, sometimes more than 10 minutes.

Additionally, dependencies like Tor and pluggable transports have been updated to ensure better stability and security.

Stay safe. Keep connected.

justinrdonnelly announces

I've just released a new version of Bouncer. Launching Bouncer now opens a dashboard to show the status of required components and configurations. Longtime users may not notice, but this will be especially helpful for new users trying to get Bouncer up and running. You can get Bouncer from Flathub!

Jeffry Samuel says

Alpaca 9 is out! Users can now create character cards to set up role-play scenarios with their AI models. This update also brings changes to how Alpaca integrates Ollama instances, simplifying the process of running local AI even more. Check out the release discussion for more information -> https://github.com/Jeffser/Alpaca/discussions/1088

Daniel Wood reports

Design, 2D computer aided design (CAD) for GNOME sees a new release, highlights include:

  • Enable clipboard management (Cut, Copy, Paste, Copy with basepoint, Select All)
  • Add Cutclip Command (CUTCLIP)
  • Add Copyclip Command (COPYCLIP)
  • Add Copybase Command (COPYBASE)
  • Add Pasteclip Command (PASTECLIP)
  • Add Match Properties Command (MA)
  • Add Pan Command (P)
  • Add Zoom Command (Z)
  • Show context menu on right click
  • Enable Undo and Redo
  • Improved Trim (TR) command with Arc, Circle and Line entities
  • Indicate save state on tabs and header bar
  • Plus many fixes!

Design is available from Flathub:

https://flathub.org/apps/details/io.github.dubstar_04.design

slomo announces

GStreamer 1.28.0 has been released! This is a major new feature release, with lots of exciting new features and other improvements. Some highlights:

  • GTK4 is now shipped with the GStreamer binaries on macOS and Windows alongside the gtk4paintablesink video sink
  • vulkan plugin now supports AV1, VP9, HEVC-10 decoding and H264 encoding
  • glupload now has a udmabuf uploader to more efficiently share video buffers, leading to better perf when using, say, a software decoder and waylandsink or gtk4paintablesink
  • waylandsink has improved handling for HDR10 metadata
  • New AMD HIP plugin and integration library
  • Analytics (AI/ML) plugin suite has gained numerous new features
  • New plugins for transcription, translation and speech synthesis, etc
  • Enhanced RTMP/FLV support with HEVC support and multi-track audio
  • New vmaf element for perceptual video quality assessment using Netflix's VMAF framework
  • New source element to render a Qt6 QML scene
  • New GIF decoder element with looping support
  • Improved support for iOS and Android
  • And many, many more new features alongside the usual bug fixes

Check the extensive release notes for more details.

rat reports

Echo 3 is released! Echo is a GUI ping utility.

Version 3 brings along two notable features: instant cancelling of pings and a "Trips" tab showing details about each trip made in the ping.

There are also smaller changes to the layout: the ping options expander was removed, and error messages now appear below the address bar.

Get it on Flathub: https://flathub.org/en/apps/io.github.lo2dev.Echo

Pipeline

Follow your favorite video creators.

schmiddi reports

Pipeline 3.2.0 was released. This release updates the underlying video player, Clapper, to the latest version. This in particular allows specifying options passed to yt-dlp for video playback, including cookies files or extractor arguments. Besides that, it also adds some new keyboard shortcuts for toggling fullscreen and the sidebar, and fixes quite a few bugs.

One important note: Shortly before the release of this version, YouTube decided to break yt-dlp. We are working on updating the yt-dlp version, but as a temporary workaround, you can add the following string to the yt-dlp extraction arguments configurable in the preferences: youtube:player_client=default,-android_sdkless.

Shell Extensions

Just Perfection says

Just Perfection extension is now ported to GNOME Shell 50 and available on EGO. This update brings bug fixes and new features, including toggles for backlight and DND button visibility.

Internships

lbaudin announces

Malika is now halfway through her Outreachy internship about signatures in Papers and has made great progress! She just published a blog post about her experience so far, you can read it here.

That's all for this week!

See you next week, and be sure to stop by #thisweek:gnome.org with updates on your own projects!

30 Jan 2026 12:00am GMT

28 Jan 2026

feedPlanet GNOME

Mathias Bonn: The Hobby Lives On

Maintaining an open source project in your free time is incredibly rewarding. A large project full of interesting challenges, limited only by your time and willingness to learn. Years of work add up to something you've grown proud of. Who would've thought an old project on its last legs could turn into something beautiful?

The focus is intense. So many people using the project, always new things to learn and improve. Days fly by when time allows for it. That impossible feature sitting in the backlog for years, finally done. That slow part of the application, much faster now. This flow state is pretty cool, might as well tackle a few more issues while it lasts.

Then comes the day. The biggest release yet is out the door. More tasks remain on the list, but it's just too much. That release took so much effort, and the years are adding up. You can't keep going like this. You wonder, is this the beginning of the end? Will you finally burn out, like so many before you?

A smaller project catches your eye. Perhaps it would be fun to work on something else again. Maybe it doesn't have to be as intense? Looks like this project uses a niche programming language. Is it finally time to learn another one? It's an unfamiliar project, but it's pretty fun. It tickles the right spots. All the previous knowledge helps.

You work on the smaller project for a while. It goes well. That larger project you spent years on lingers. So much was accomplished. It's not done yet, but software is never done. The other day, someone mentioned this interesting feature they really wanted. Maybe it wouldn't hurt to look into it? It's been a while since the last feature release. Maybe the next one doesn't have to be as intense? It's pretty fun to work on other projects sometimes, too.

The hobby lives on. It's what you love doing, after all.

28 Jan 2026 4:17am GMT

Lucas Baudin: Drawing and Writing on PDFs in Papers (and new blog)

Nearly 10 years ago, I first looked into this for Evince but quickly gave up. A year and a half ago, I tried again, this time in Papers. After several merge requests in poppler and in Papers, ink and free text annotation support has just landed in the Papers repository!

Therefore, it is now possible to draw on documents and add text, for instance to fill forms. Here is a screenshot with the different tools:

Papers with the new drawing tools

This is the result of the joint work of several people who designed, developed, and tested all the little details. It required adding support for ink and free text annotations in the GLib bindings of poppler, then adding support for highlight ink annotations there. Then several things got in the way of adding those in Papers; among other things, it became clear that an undo/redo mechanism was necessary, and annotation management was entangled with the main view widget. It was also an opportunity to improve document forms, which are now more accessible.

This can be tested directly from the GNOME Nightly flatpak repository, and new issues are welcome.

Also, this is a new blog and I never quite introduced myself: I actually started developing with GTK on GTK 2, at a time when GTK 3 was looming. Then I took a long break and delved into desktop development again two years ago. The features that just got merged were, in fact, my first contributions to Papers. They are also the ones that took the longest to be merged! I became one of Papers' maintainers last March, joining Pablo (who welcomed me into this community and has since stepped back from maintenance), Markus, and Qiu.

Next time, a post about our participation in Outreachy with Malika's internship!

28 Jan 2026 12:00am GMT

26 Jan 2026

feedPlanet GNOME

Asman Malika: Mid-Point Project Progress: What I’ve Learned So Far

Dark mode: Manual Signature Implementation

Light mode: When there is no added signature

Reaching the midpoint of this project feels like a good moment to pause, not because the work is slowing down, but because I finally have enough context to see the bigger picture.

At the start, everything felt new: the codebase, the community, the workflow, and even the way problems are framed in open source. Now, halfway through, things are starting to connect.

Where I Started

When I began working on Papers, my main focus was understanding the codebase and how contributions actually happen in a real open-source project. Reading unfamiliar code, following discussions, and figuring out where my work fit into the larger system was challenging.

Early on, progress felt slow. Tasks that seemed small took longer than expected, mostly because I was learning how the project works, not just what to code. But that foundation has been critical.

Photo: Build failure I encountered during development

What I've Accomplished So Far

At this midpoint, I'm much more comfortable navigating the codebase and understanding the project's architecture. I've worked on the manual signature feature and related fixes, which required carefully reading existing implementations, asking questions, and iterating based on feedback. I'm now working on the digital signature implementation, which is one of the most complex parts of the project and builds directly on the foundation laid by the earlier work.

Beyond the technical work, I've learned how collaboration really functions in open source.

Those collaboration skills have been just as important as writing code.

Challenges Along the Way

One of the biggest challenges has been balancing confidence and humility, knowing when to try things independently and when to ask for help. I've also learned that progress in open source isn't always linear. Some days are spent coding, others reading, debugging, or revisiting decisions.

Another challenge has been shifting my mindset from "just making it work" to thinking about maintainability, users, and future contributors. That shift takes time, but it's starting to stick.

What's Changed Since the Beginning

The biggest change is how I approach problems.

I now think more about who will use the feature, who might read this code later, and how my changes fit into the overall project. Thinking about the audience, both users of Papers and fellow contributors, has influenced how I write code, documentation, and even this blog.

I'm also more confident participating in discussions and expressing uncertainty when I don't fully understand something. That confidence comes from realizing that learning in public is part of the process.

Looking Ahead

The second half of this project feels more focused. With the groundwork laid, I can move faster and contribute more meaningfully. My goal is to continue improving the quality of my contributions, take on more complex tasks, and deepen my understanding of the project.

Most importantly, I want to keep learning about open source, about collaboration, and about myself as a developer.

Final Thoughts

This midpoint has reminded me that growth isn't always visible day to day, but it becomes clear when you stop and reflect. I'm grateful for the support, feedback, and patience from the GNOME community, especially my mentor Lucas Baudin. And I'm so excited to see how the rest of the project unfolds.

26 Jan 2026 1:42pm GMT

24 Jan 2026

feedPlanet GNOME

Sam Thursfield: AI predictions for 2026

It's a crazy time to be part of the tech world. I'm happy to be sat on the fringes here, but I want to try and capture a bit of the madness, so in a few years we can look back on this blogpost and think "Oh yes, shit was wild in 2026".

(insert some AI slop image here of a raccoon driving a racing car or something)

I have read the blog of Geoffrey Huntley for about 5 years since he famously right-clicked all the NFTs. Smart & interesting guy. I've also known the name Steve Yegge for a while, he has done enough notable things to get the honour of an entry in Wikipedia. Recently they've both written a lot about generating code with LLMs. I mean, I hope in 2026 we've all had some fun feeding freeform text and code into LLMs and playing with the results, they are a fascinating tool. But these two dudes are going into what looks like a sort of AI psychosis, where you feed so many LLMs into each other that you can see into the future, and in the process give most of your money to Anthropic.

It's worth reading some of their articles if you haven't, there are interesting ideas in there, but I always pick up some bad energy. They're big on the hook that, if you don't study their techniques now, you'll be out of a job by summer 2026. (Mark Zuckerborg promised this would happen by summer 2025, but somehow I still have to show up for work five days every week). The more I hear this, the more it feels like a sort of alpha-male flex, except online and in the context of the software industry. The alpha tech-bro is here, and he will Vibe Code the fuck out of you. The strong will reign, and the weak will wither. Is that how these guys see the world? Is that the only thing they think we can do with these here computers, is compete with each other in Silicon Valley's Hunger Games?

I felt a bit dizzy when I saw Geoffrey's recent post about how he was now funded by cryptocurrency gamblers ("two AI researchers are now funded by Solana") who are betting on his project and gifting him the fees. I didn't manage to understand what the gamblers would win. It seemed for a second like an interesting way to fund open research, although "Patreon but it's also a casino" is definitely a turn for the weird. Steve Yegge jumped on the bandwagon the same week ("BAGS and the Creator Economy") and, without breaking any laws, gave us the faintest hint that something big is happening over there.

Well…

You'll be surprised to know that both of them bailed on it within a week. I'm not sure why - I suspect maybe the gamblers got too annoying to deal with - but it seems some people lost some money. Although that's really the only possible outcome from gambling. I'm sure the casino owners did OK out of it. Maybe it's still wise to be wary of people who message you out of the blue wanting to sell you cryptocurrency.

The excellent David Gerard had a write up immediately on Pivot To AI: "Steve Yegge's Gas Town: Vibe coding goes crypto scam". (David is not a crypto scammer and has a good old fashioned Patreon where you can support his journalism). He talks about addiction to AI, which I'm sure you know is a real thing.

Addictive software was perfected back in the 2010s by social media giants. The same people who had been iterating on gambling machines for decades moved to California and gifted us infinite scroll. OpenAI and Anthropic are based in San Francisco. There's something inherently addictive about a machine that takes your input, waits a second or two, and gives you back something that's either interesting or not. Next time you use ChatGPT, look at how the interface leans into that!

(Pivot To AI also have a great writeup of this: "Generative AI runs on gambling addiction - just one more prompt, bro!")

So, here we are in January 2026. There's something very special about this post "Stevey's Birthday Blog". Happy birthday, Steve, and I'm glad you're having fun. That said, I do wonder if we'll look back in years to come on this post as something of an inflection point in the AI bubble.

All through December I had weird sleeping patterns while I was building Gas Town. I'd work late at night, and then have to take deep naps in the middle of the day. I'd just be working along and boom, I'd drop. I have a pillow and blanket on the floor next to my workstation. I'll just dive in and be knocked out for 90 minutes, once or often twice a day. At lunch, they surprised me by telling me that vibe coding at scale has messed up their sleep. They get blasted by the nap-strike almost daily, and are looking into installing nap pods in their shared workspace.

Being addicted to something such that it fucks with your sleeping patterns isn't a new invention. Ask around the punks in your local area. Humans can do amazing things. That story starts way before computers were invented. Scientists in the 16th century were absolute nutters who would like… drink mercury in the name of discovery. Isaac Newton came up with his theory of optics by skewering himself in the eye. (If you like science history, have a read of Neal Stephenson's Baroque Cycle 🙂) Coding is fun and making computers do cool stuff can be very addictive. That story starts long before 2026 as well. Have you heard of the demoscene?

Part of what makes Geoffrey Huntley and Steve Yegge's writing compelling is that they are telling very interesting stories. They are leaning on existing cultural work to do that, of course. Every time I think about Geoffrey's 5-line bash loop that feeds an LLM's output back into its input, the name reminds me of my favourite TV show when I was 12.

Ralph Wiggum with his head glued to his shoulder. "Miss Hoover? I glued my head to my shoulder."

Which is certainly better than the "human centipede" metaphor I might have gone with. I wasn't built for this stuff.

The Gas Town blog posts are similarly filled with steampunk metaphors, and Steve Yegge's blog posts are interspersed with generated images that, at first glance, look really cool. "Gas Town" looks like a point and click adventure, at first glance. In fact it's a CLI that gives kooky names to otherwise dry concepts… but look at the pictures! You can imagine gold coins spewing out of a factory into its moat while you use it.

All the AI images in his posts look really cool at first glance. The beauty of real art is often in the details, so let's take a look.

What is that tower on the right? There's an owl wearing goggles about to land on a tower… which is also wearing goggles?

What's that tiny train on the left that has indistinct creatures about the size of a fox's fist? I don't know who on earth is on that bridge on the right, some horrific chimera of weasel and badger. The panda is stoically ignoring the horrors of his creation like a good industrialist.

What is the time on the clock tower? Where is the other half of the fox? Is the clock powered by …. oh no.

Gas Town here is a huge factory with 37 chimneys all emitting good old sulphur and carbon dioxide, as God intended. But one question: if you had a factory that could produce large quantities of gold nuggets, would you store them on the outside ?

Good engineering involves knowing when to look into the details, and when not to. Translating English to code with an LLM is fun and you can get some interesting results. But if you never look at the details, somewhere in your code is a horrific weasel badger chimera, a clock with crooked hands telling a time that doesn't exist, and half a fox. Your program could make money… or it could spew gold coins all around town where everyone can grab them.

So… my AI predictions for 2026. Let's not worry too much about code. People and communities and friendships are the thing.

The human world is 8 billion people. Many of us make a modest living growing and selling vegetables or fixing cars or teaching children to read and write. The tech industry is a big bubble that's about to burst. Computers aren't going anywhere, and our open source communities and foundations aren't going anywhere. People and communities and friendships are the main thing. Helping out in small ways with some of the bad shit going on in the world. You don't have to solve everything. Just one small step to help someone is more than many people do.

Pay attention to what you're doing. Take care of the details. Do your best to get a good night's sleep.

AI in 2026 is going to go about like this:

24 Jan 2026 8:32pm GMT