22 Jan 2026

Drupal.org aggregator

Four Kitchens: Don’t let SimpleSAMLphp block your Drupal upgrade

Thousands of Drupal sites use SimpleSAMLphp for SSO authentication. But with Drupal 11 around the corner, this setup won't be supported anymore. Here's what site owners and developers should know.

PHP applications that integrate with SAML have long relied on the SimpleSAMLphp library. Building on it, the Drupal community created the SimpleSAMLphp Authentication module, used by more than 14,000 sites (as of this publication) to provide SSO authentication integration for Drupal sites.

By migrating now, you'll ensure your Drupal site is ready for Drupal 11 without downtime or dependency conflicts, and you can take advantage of the latest features immediately.

Although this library and module are a great resource for SAML integrations, they conflict with Drupal upgrade paths, hindering efforts to keep Drupal sites up to date. This is the case for sites that want to upgrade to Drupal 11: if your site has this dependency, you may be stuck until there's a compatible version of the library.

The root cause is that the SimpleSAMLphp library depends on Symfony versions that collide with Drupal core's own Symfony dependencies.

Although a fix may be close, this issue will keep resurfacing and will make your site's upgrades dependent on this library. From a technical standpoint, the goal when developing sites is to carry as few dependencies as possible, or at least dependencies that are actively maintained. You can read this Drupal issue to learn more about the current issues with the module and D11.

The good news is that there's another PHP library for SAML integrations that has a Drupal contributed module, and it has no conflicting dependencies with Drupal Core! The module is called SAML Authentication, and it uses OneLogin SAML toolkit for PHP. This guide will provide you with the steps for an easy and seamless migration from the SimpleSAMLphp module to SAML Authentication!

Thousands of Drupal sites, especially in higher education, rely on SimpleSAMLphp for single sign-on (SSO). It's been a reliable solution for years. But with Drupal 11 here, that dependency has quietly become a blocker.

If your site depends on SimpleSAMLphp today, you may find yourself stuck waiting on library updates before you can safely upgrade Drupal. For institutions that prioritize security, accessibility, and long-term platform health, that delay isn't just inconvenient. It's risky.

The issue isn't SAML itself. It's the underlying dependency chain. SimpleSAMLphp relies on Symfony versions that conflict with Drupal core, making Drupal 11 compatibility uncertain until the library catches up.

The good news? You don't have to wait.

There's a supported, Drupal-friendly alternative, the SAML Authentication module, that avoids these conflicts entirely. Even better, the community has built tooling to make migration significantly easier than you might expect.

This guide walks through a practical, field-tested approach to migrating from SimpleSAMLphp to SAML Authentication - so you can unblock your Drupal 11 upgrade path without downtime or guesswork.

Who this guide is for

  • Higher ed Drupal teams planning for Drupal 11
  • Platform owners managing SSO across multiple environments
  • Engineers who want to reduce upgrade risk caused by third-party dependencies

SAML Authentication

The SAML Authentication module is a straightforward way to set up SSO on your site. From my point of view, it's far easier to configure and maintain than SimpleSAMLphp, so even if you're not worried about upgrades (though if you should be, read our post about upgrading to Drupal 11), this module will make the maintenance of your site less complicated. However, it's not perfect. It has its caveats, which I will cover!

Most of the configuration required for SAML Auth is already present in your site's SimpleSAMLphp settings, and better yet, there's already a tool that automates carrying it over!

SimpleSAMLphp to SAML Auth Migration helper module

If you read the Drupal 11 compatibility issue thread for SimpleSAMLphp, you will stumble upon Jay Beaton's comment: he created a helper module to automate the migration from SimpleSAMLphp to SAML Auth. This module makes the switch pretty fast! However, depending on your site's setup, you must be careful about which settings you're migrating, especially if the configuration varies by environment.

Migrating from SimpleSAMLphp to SAML Authentication

Before migrating, there are a few decisions worth making. Let's dive into them!

1. Install SAML Authentication

Naturally, because we want to migrate to SAML Authentication, we need to install it on our site.

  1. Run composer require drupal/samlauth

2. Install the helper module

The SimpleSAMLphp to SAML Auth Migration module is currently in development, so the module itself is a sandbox project on Drupal.org. In order to install it, you need to:

  1. Add the sandbox as a custom repository in your composer.json (inside the repositories section):
    "samlauth_helper": {
        "type": "package",
        "package": {
            "name": "drupal/samlauth_simplesamlphp_auth_migration",
            "version": "1.0.0",
            "type": "drupal-module",
            "source": {
                "url": "https://git.drupalcode.org/sandbox/jrb-3531935.git",
                "type": "git",
                "reference": "1.0.x"
            }
        }
    }
  2. Alternatively, add it in one step with this composer command: composer config repositories.samlauth_helper '{"type": "package", "package": {"name": "drupal/samlauth_simplesamlphp_auth_migration", "version": "1.0.0", "type": "drupal-module", "source": {"url": "https://git.drupalcode.org/sandbox/jrb-3531935.git", "type": "git", "reference": "1.0.x"}}}'
  3. Require the module: composer require drupal/samlauth_simplesamlphp_auth_migration
  4. Enable it: drush en samlauth_simplesamlphp_auth_migration -y

3. Migrate the configuration

There are two ways to migrate the configuration: through drush commands or through the UI. For the purpose of this article, we will migrate the configuration through the command line; however, you can also do it through the UI at this path: admin/config/people/saml/simplesamlphp-auth-migration.

That path also provides a nice visualization of the configuration mapping and migration that will be performed. You can also view it by running the drush samlauth-simplesamlphp-show-changes command.

3.1. Before migrating

Usually, sites have different setups for different environments, such as testing, development, or production. Each environment may have its own configuration, and if you're running a migration locally, SimpleSAMLphp might not even be enabled.

Knowing this, depending on where you're running the migration, you will need to set some of the settings manually. For example, suppose the staging environment uses a different certificate than production, and locally the staging SimpleSAMLphp settings are enabled by default. When you run the migration command, the settings for the active (staging) environment will be migrated over, not the ones for production.

Another example: the Entity ID of the SP may be dynamic, depending on the environment. If so, you will need to override the SAML Authentication settings, rather than the SimpleSAMLphp ones.

For the settings that need to be migrated manually, you can either set up a config split for each environment, or apply configuration overrides through settings.php. Just beware that, if you need to override arrays, it may not be possible to do so through settings.php. Read more about the issue.

I've used both approaches; which one fits best will depend on your site. I've set up different configuration splits and have overridden values through settings.php. In my case, I just create a settings.saml.php file and include it from settings.php.
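As an illustration, a settings.saml.php override might look like the sketch below. The environment detection and the samlauth.authentication keys shown (sp_entity_id, idp_certs) are assumptions for illustration; verify them against your site's exported configuration.

```php
<?php

// settings.saml.php (included from settings.php).
// Sketch only: the ENVIRONMENT variable and config keys are illustrative.
if (getenv('ENVIRONMENT') === 'staging') {
  // Scalar values override cleanly from settings.php.
  $config['samlauth.authentication']['sp_entity_id'] = 'https://staging.example.edu/saml';
  // Beware: nested arrays may not be fully overridable from settings.php.
  $config['samlauth.authentication']['idp_certs'] = ['../private/saml-certificates/idp-staging.crt'];
}
```

Because the file is plain PHP, you can include it near the bottom of settings.php with a simple include_once, keeping the SAML overrides isolated from the rest of your settings.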

3.2. Known caveats

Although SAML Authentication is much simpler to set up, it's missing a feature SimpleSAMLphp provides: linking to a remote IdP certificate. In SAML Authentication, the certificate files must physically exist on your server; you can only reference their path.

This means that if your site uses remote IdP certificates, you will have to copy their contents and create the files in your site folder. This may make your maintenance process a bit more involved: if the IdP certificates change, you will have to update the physical files on your site.

For the purposes of this migration, if you are linking to remote IdP certificates, you will have to create the files locally. I've encountered two situations, remote and local files; for the remote certificates, I decided to put them in web/private/saml-certificates.
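Creating those local copies can be as simple as the following sketch (the directory matches the one mentioned above; the certificate contents are a placeholder you paste in from the remote IdP URL):

```shell
# Hypothetical layout: adjust web/private/saml-certificates to your site.
mkdir -p web/private/saml-certificates

# Paste in the contents copied from the remote IdP certificate URL:
cat > web/private/saml-certificates/idp.crt <<'EOF'
-----BEGIN CERTIFICATE-----
...certificate body...
-----END CERTIFICATE-----
EOF

# Keep the file readable by the web server but not world-readable.
chmod 640 web/private/saml-certificates/idp.crt
```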

3.3. Migrate the configuration

With all that context in mind, let's migrate the config! It's very simple:

  1. Run drush samlauth-simplesamlphp-migrate
    1. You can provide the --cert-path for your certificate path; or
    2. You can go to /admin/config/people/saml/saml and add it into the X.509 certificate(s) field
  2. Confirm it was migrated correctly
    1. You can compare against your SimpleSAMLphp configuration

3.4. The Auth Map table

Both SimpleSAMLphp and SAML Authentication use the External Authentication module, which allows a SAML-authenticated user with an authname matching one in the authmap table to be logged in as that Drupal user. Among those values, the table stores the provider, i.e., the module that performed the authentication.

The SAML helper module automatically migrates the SimpleSAMLphp authmap values over to SAML Authentication. However, this is a database migration, so if you migrated locally and then pushed your changes to a remote environment, the change won't be reflected there. To make sure it happens there too, you can:

  1. If you keep the helper module in your remote environments, you can create an update hook that calls the service that makes the changes:
    /**
     * Enable SAML auth migration module and update authmap table.
     */
    function hook_update_N(): void {
      if (!\Drupal::moduleHandler()->moduleExists('samlauth_simplesamlphp_auth_migration')) {
        \Drupal::service('module_installer')->install(['samlauth_simplesamlphp_auth_migration']);
      }
      /** @var \Drupal\samlauth_simplesamlphp_auth_migration\Migrate $saml_auth_utility */
      $saml_auth_utility = \Drupal::service('samlauth_simplesamlphp_auth_migration.migrate');
      $saml_auth_utility->updateAuthmapTable();
    }
    

Or, you can just do it yourself:

/**
 * Enable SAML auth module and update authmap table.
 */
function hook_update_N(): void {
  if (!\Drupal::moduleHandler()->moduleExists('samlauth')) {
    \Drupal::service('module_installer')->install(['samlauth']);
  }

  \Drupal::database()
    ->update('authmap')
    ->fields(['provider' => 'samlauth'])
    ->condition('provider', 'simplesamlphp_auth')
    ->execute();
}

4. Testing the new SAML Authentication

Once you've migrated everything over, the last step is to test that the configuration migration actually works. To test it, you will need to update your IdP with the values of the new SAML implementation: the metadata URL, the Assertion Consumer Service, and the Single Logout Service.

If you don't want to override the existing values in your current SAML IdP, you can change the Entity ID for SAML Authentication to create a new one; that way you can keep two entries in your IdP, one for SimpleSAMLphp and one for SAML Authentication.

Regardless, making such a change could take some time if it's handled by another team, and you would have to wait until those changes are implemented to test the new SAML implementation. Fortunately, the SAML Auth helper module provides a way to masquerade the SimpleSAMLphp endpoints with their SAML Auth implementations, so you can test right away without making any changes to the IdP. You will eventually need to update the IdP, but for testing purposes the helper module is enough. To do so, you have to:

  1. Remove the simplesaml folder from your web/docroot folder
  2. Disable the simplesamlphp_auth module
  3. Remove any composer script/automated script that symlinks any SimpleSAMLphp related behavior
  4. Enable the Use SimpleSAMLphp Authentication Routes option offered by the SAML Auth helper module (find it on admin/config/people/saml/saml)

Once that's done, you can try logging in via SSO to the site you're migrating. If it works, fantastic! You only need to (i) request the IdP changes to completely move away from SimpleSAMLphp, and (ii) remove the helper module.

4.1. Remote host provider issues

In some cases, SSO is only enabled in specific environments. To test there, you only need to deploy and let the environment build with the new changes. However, it's not as straightforward as you might think; in fact, the helper module's route masquerading may not work at all.

Before diving into these issues, here are the remote host providers where the masquerading will work:

  1. Acquia
  2. Platform.sh (though you will have to edit the .platform.app.yml file)

4.1.1. Pantheon special routing rule

If your site is hosted on Pantheon, the helper module masquerading won't work. This is due to a specific routing rule Pantheon seems to have.

If you try to test the new SAML implementation with the helper module, you will get an HTTP 404 error stating that the file was not found: a plain-text 404, completely different from your Drupal 404 page. The helper module's own instructions specify: "Make sure that SimpleSAMLphp is not available on the server at https://site/simplesaml/module.php".

Even after removing the library from your web/docroot, the helper module won't work on Pantheon. After some testing and investigation, and knowing that Pantheon uses Nginx for routing, I was able to confirm that Pantheon has a special rule for paths matching /simplesaml/*.php.

Any path following that pattern won't be routed through Drupal. Nginx instead looks for a file inside your web/docroot simplesaml folder, hence the plain-text 404 instead of Drupal's 404 page. I ran a test to confirm this: I placed a hello.php file inside that folder that returns a simple text message, and when I visited /simplesaml/hello.php, I got that message.
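That probe file can be as small as the sketch below (the filename and behavior follow the test just described; remove the file once you've confirmed the routing):

```php
<?php

// web/simplesaml/hello.php - throwaway probe for Pantheon's routing rule.
// If this message is returned at /simplesaml/hello.php, Nginx served the
// file directly and the request never reached Drupal.
echo 'Hello from Nginx, not Drupal.';
```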

Because of this, the helper module's route masquerading won't work: those are paths expected to be handled by Drupal, and on Pantheon they never reach it.

But the good news is that there's a workaround you can use to confirm your migration worked! Before diving into it, I want to give a shout-out to Jay Beaton, the helper module maintainer. I reached out to him on Drupal Slack about the route overrides not working, and he was kind enough to listen and help me. Talking with him is how I discovered this Pantheon issue, and the workaround I'm about to mention was his recommendation. Thanks so much for such a great tool, and for the help!

4.1.2. Testing workaround

Even if you cannot test remotely, you can do it locally. SAML validation and redirects happen at the browser level, meaning that if you change your machine's DNS to point an SSO-enabled domain at your local environment, you will be able to test it there. This is how you do it:

  1. For DDEV, you need to edit the config.yaml file and:
    1. Add the domain you want to override in the additional_fqdns property
    2. Switch to false the use_dns_when_possible property, so DDEV always updates the /etc/hosts file with the project hostname instead of using DNS for name resolution
    3. Restart DDEV
    4. If for some reason the custom domain is not reaching your local environment, make sure the custom domain is in your /etc/hosts file.
  2. For Lando, you can add proxies with custom domains.
    1. You will need to edit your /etc/hosts file.
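The DDEV edits described above might look like this in the project's config.yaml (a sketch: sso.example.edu is a placeholder for your real SSO-enabled domain):

```yaml
# .ddev/config.yaml (fragment)
# Route the SSO-enabled domain to this local project.
additional_fqdns:
  - sso.example.edu
# Write project hostnames to /etc/hosts instead of relying on DNS.
use_dns_when_possible: false
```

After editing, run ddev restart so the new hostname takes effect.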

And that's it! You can now go to the custom domain while using your local environment!

5. You're almost there

With the configuration migrated over to SAML Authentication, you should be able to use SSO. However, if your site has custom code that uses SimpleSAMLphp services or implements one of its hooks, you will need to adjust that code accordingly.

For example, from the sites I've migrated:

  1. There was code that was using the simplesamlphp_auth.manager service, specifically to call the isAuthenticated() function.
    1. I changed it to use the SAML Authentication service, samlauth.saml, and call its getAttributes() function, which serves the same purpose as SimpleSAMLphp's isAuthenticated(): it returns the attributes if the current user logged in using SSO, and an empty array otherwise.
  2. The hook_simplesamlphp_auth_user_attributes and hook_simplesamlphp_auth_existing_user hooks were being used.
    1. I had to create an Event Subscriber (as SAML Authentication doesn't use hooks for these) that subscribes to SamlauthEvents::USER_LINK, equivalent to the existing-user hook, and SamlauthEvents::USER_SYNC, equivalent to the user-attributes hook.
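A subscriber for the USER_SYNC case might look like the sketch below. The my_module namespace, class name, and the mail attribute are assumptions for illustration; verify the samlauth event class and method names against your installed version, and remember the subscriber still needs to be registered in your module's services.yml (not shown).

```php
<?php

namespace Drupal\my_module\EventSubscriber;

use Drupal\samlauth\Event\SamlauthEvents;
use Drupal\samlauth\Event\SamlauthUserSyncEvent;
use Symfony\Component\EventDispatcher\EventSubscriberInterface;

/**
 * Replaces hook_simplesamlphp_auth_user_attributes() with an event subscriber.
 */
class SamlUserSyncSubscriber implements EventSubscriberInterface {

  public static function getSubscribedEvents(): array {
    // USER_LINK would similarly replace hook_simplesamlphp_auth_existing_user.
    return [SamlauthEvents::USER_SYNC => 'onUserSync'];
  }

  public function onUserSync(SamlauthUserSyncEvent $event): void {
    $attributes = $event->getAttributes();
    // Example: copy a SAML attribute onto the Drupal account.
    if (!empty($attributes['mail'][0])) {
      $event->getAccount()->setEmail($attributes['mail'][0]);
      // Tell samlauth the account was modified so it gets saved.
      $event->markAccountChanged();
    }
  }

}
```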

You will have to search your codebase for any SimpleSAMLphp usage and update it!

6. Hurray!

That's it. With SimpleSAMLphp out of the way, your Drupal site is no longer blocked by upstream dependencies - and you're free to plan your Drupal 11 upgrade on your timeline. Happy coding!

The post Don't let SimpleSAMLphp block your Drupal upgrade appeared first on Four Kitchens.

22 Jan 2026 11:15pm GMT

Talking Drupal: TD Cafe #013 - Hilmar & Martin - Drupal in a Day

In this episode, we discuss the 'Drupal in a Day' initiative, aimed at introducing computer science students to Drupal and invigorating the community with new energy. Martin Anderson-Clutz and Hilmar Hallbjörnsson talk about its origins, development, and the specifics of condensing a comprehensive university course into a single-day curriculum. They also cover the enthusiasm and logistics behind the events, insights from past sessions in Vienna and Drupal Jam, and future plans for expanding the scope of this program. Tune in to hear the vision for bringing more students into the Drupal community and the benefits for universities and organizations alike.

For show notes visit: https://www.talkingDrupal.com/cafe013

Topics

Hilmar Hallbjörnsson

Hilmar Kári Hallbjörnsson is a senior Drupal developer, educator, and open-source advocate based in Iceland. He works as a Senior Drupal Developer at the University of Iceland and is the CEO/CTO of the Drupal consultancy Um að gera. Hilmar is also an adjunct professor at Reykjavík University, where he teaches "Designing open-sourced web software with Drupal and PHP."

Deeply involved in the Drupal ecosystem, Hilmar is an active contributor and community organizer, with a particular focus on Drupal 11, modern configuration management, and the emerging Recipes initiative. He is a co-founder of the Drupal Open University Initiative and Drupal-in-a-Day, and has served on the organizing committee for DrupalCon Europe.

His work bridges real-world engineering, teaching, and community leadership, with a strong interest in both the technical evolution and philosophical direction of Drupal as an open-source platform.

Martin Anderson-Clutz

Martin is a highly respected figure in the Drupal community, known for his extensive contributions as a developer, speaker, and advocate for open-source innovation. Based in London, Ontario, Canada, Martin began his career as a graphic designer before transitioning into web development. His journey with Drupal started in late 2005 when he was seeking a robust multilingual CMS solution, leading him to embrace Drupal's capabilities.

Martin holds the distinction of being the world's first Triple Drupal Grand Master, certified across Drupal 7, 8, and 9 as a Developer, Front-End Specialist, and Back-End Specialist. (TheDropTimes) He also possesses certifications in various Acquia products and is UX certified by the Nielsen Norman Group.

Currently serving as a Senior Solutions Engineer at Acquia, Martin has been instrumental in advancing Drupal's ecosystem. He has developed and maintains several contributed modules, including Smart Date and Search Overrides, and has been actively involved in the Drupal Recipes initiative, particularly focusing on event management solutions. His current work on the Event Platform aims to streamline the creation and management of event-based websites within Drupal.

Beyond development, Martin is a prominent speaker and educator, having presented at numerous Drupal events such as DrupalCon Barcelona and EvolveDrupal. He is also a co-host of the "Talking Drupal" podcast, where he leads the "Module of the Week" segment, sharing insights on various Drupal modules. Martin's dedication to the Drupal community is evident through his continuous efforts to mentor, innovate, and promote best practices within the open-source landscape.

Guests

Hilmar Hallbjörnsson - drupalviking
Martin Anderson-Clutz - mandclu

22 Jan 2026 3:00pm GMT

Très Bien Blog: I tried Claude Code for a month with Drupal

I tried Claude Code for a month with Drupal

Large Language Models are a pretty fascinating piece of technology. Back in December I saw some chatter about Claude Code and I wanted to try it out myself. It is really fun; it's jQuery-in-2010 levels of fun. And, more concerning, it's fun even when it wastes my time. For example, I tried it on a few things in the past 4 weeks:

theodore

22 Jan 2026 9:00am GMT

Dripyard Premium Drupal Themes: Dripyard on the Pantheon Livestream

I had a great time chatting with Pantheon's Chris Reynolds earlier today about what is coming next for Drupal. We dug into Drupal's upcoming marketplace and where Dripyard fits into that ecosystem.

We also talked through site templates, recipes, single directory components, and how all of this is shaping the future of Drupal.

If you are curious about where Drupal is headed and how the commercial ecosystem is evolving, this one is worth a watch!

22 Jan 2026 1:04am GMT

21 Jan 2026


Drupal Association blog: Join Drupal for a week of innovation at EU Open Source Week 2026

The Drupal Community will have a large showing at EU Open Source Week 2026 in Brussels. You are invited to join Drupal Association board members Baddy Sonja Breidert, Tiffany Farriss, Sachiko Muto, Imre Gmelig, and Alex Moreno at the following events throughout the week. Drupal Association CEO Tim Doyle, CTO Tim Lehnen, and Head of Global Programs, Meghan Harrell will also be in attendance.

Happening from 26 January to 1 February, 2026 in Brussels, the global open source community is gearing up for the EU Open Source Week. We are proud to highlight the significant presence of the Drupal project throughout the week.

Here's where you can find Drupal making an impact:

Play to impact - Drupal AI Hackathon: The event will kick off online on 22 January 2026 and continue during EU Open Source Week on 27 and 28 January 2026. During the two-day event, developers, designers, and other digital innovators will work side by side to create smarter, faster, and more open digital solutions built with Drupal and AI.

Learn more and register here.

Drupal Pivot: A 1.5-day, peer-led un-conference, to be held on 27 and 28 January 2026, for Drupal agency CEOs, founders, and senior executives to collaboratively explore the most pressing strategic questions shaping the future of the Drupal business ecosystem.

Learn more and register here.

Drupal EU Government Day: A unique free one-day event, scheduled for 29 January 2026, bringing together policymakers, technologists, and digital leaders from across Europe's public sector.

Learn more and register here.

EU Open Source Policy Summit: An invite-only one-day event with free online access, hosted by OpenForum Europe (OFE), bringing together leaders from the public and private sectors to focus on digital sovereignty.

The Drupal Association is honoured to support the EU Open Source Policy Summit 2026 as a Bronze sponsor.

In addition to our sponsorship, we are pleased to highlight two members of our Board of Directors who will be sharing insights during the program:

  • Baddy Sonja Breidert will join the panel: Europe as the World's Home for Open Source
  • Sachiko Muto will provide an intervention on: UN Open Source Week 2026 and the UN Open Source Principles

Learn more and register here.

FOSDEM: A two-day, volunteer-run, free event for the global open source community. Held every year at the end of January in Brussels, it is a massive gathering where contributors share code, host community-led "devrooms," and collaborate face-to-face without registration fees or corporate barriers.

  • HackerTrain to FOSDEM: An event that's meant to coordinate several train routes to Brussels to bring hackers from all over Europe to FOSDEM and the EU Open Source Week. This means that there will not be one HackerTrain, but several at the same time on different routes and different dates - all converging in Brussels.

Learn more and register here.

EU Open Source Week is going to be an immersive experience for developers, policymakers, and industry leaders shaping the future of open technology, offering unparalleled opportunities to:

  • Engage directly with EU policymakers,
  • Learn from and collaborate with leading developers,
  • Network and connect with open source communities,
  • Explore new partnerships

Explore other events happening during the EU Open Source Week on their official website.

21 Jan 2026 6:26pm GMT

ImageX: A Step-by-Step Guide to the OpenAI Module Integration on Your Drupal Website

Supercharging your workflows with artificial intelligence is one of the hottest trends of content editing on a Drupal website. When it comes to AI, Drupal is ahead of the rest, rapidly integrating advanced machine-learning capabilities into its ecosystem.

21 Jan 2026 5:52pm GMT

Centarro: How API Extensibility Future-Proofs B2B Commerce

In B2B ecommerce, one size never fits all. Every business develops unique workflows and processes organically over the years. These vary but include negotiated pricing, custom catalogs, bulk ordering, purchase approvals, and complex fulfillment requirements.

An extensible eCommerce platform provides the flexibility to adapt and customize to specific business needs while keeping pace with evolving market demands.

We can modify a famous quote, usually attributed to General Robert H. Barrow: "Amateurs talk about tactics, but professionals study logistics."

Tactics are important, but it's logistics that enable those tactics and keep an army agile and well-supplied. Long-term thinking and planning are required. It's the difference between winning a single battle and winning a long, protracted war.

So let's modify this principle for B2B companies:

Amateurs shop for features. Professionals architect for tomorrow.

And it's APIs and data structures that help professionals architect for that tomorrow.

Read more

21 Jan 2026 4:28pm GMT

Dries Buytaert: Funding Open Source for Digital Sovereignty

One red cube stands out in a grid of gray cubes.

As global tensions rise, governments are waking up to the fact that they've lost digital sovereignty. They depend on foreign companies that can change terms, cut off access, or be weaponized against them. A decision in Washington can disable services in Brussels overnight.

Last year, the International Criminal Court ditched Microsoft 365 after a dispute over access to the chief prosecutor's email. Denmark's Ministry of Digitalisation is moving to LibreOffice. And Germany's state of Schleswig-Holstein is migrating 30,000 workstations off Microsoft.

Reclaiming digital sovereignty doesn't require building the European equivalent of Microsoft or Google. That approach hasn't worked in the past, and there is no time to make it work now. Fortunately, Europe has something else: some of the world's strongest Open Source communities, regulatory reach, and public sector scale.

Open Source is the most credible path to digital sovereignty. It's the only software you can run without permission. You can audit, host, modify, and migrate it yourself. No vendor, no government, and no sanctions regime can ever take it away.

But there is a catch. When governments buy Open Source services, the money rarely reaches the people who actually build and maintain it. Procurement rules favor large system integrators, not the maintainers of the software itself. As a result, public money flows to companies that package and resell Open Source, not to the ones who do the hard work of writing and sustaining it.

I've watched this pattern repeat for over two decades in Drupal, the Open Source project I started and that is now widely used across European governments. A small web agency spends months building a new feature. They design it, implement it, and shepherd it through review until it's merged.

Then the government puts out a tender for a new website, and that feature is a critical requirement. A much larger company, with no involvement in Drupal, submits a polished proposal. They have the references, the sales team, and the compliance certifications. They win the contract. The feature exists because the small agency built it. But apart from new maintenance obligations, the original authors get nothing in return.

Public money flows around Open Source instead of into it.

Multiply that by every Open Source project in Europe's software stack, and you start to see both the scale of the problem and the scale of the opportunity.

This is the pattern we need to break. Governments should be contracting with maintainers, not middlemen.

Public money flows around Open Source instead of into it. Governments should contract with maintainers and builders, not middlemen.

Skipping the maintainers is not just unfair, it is bad governance. Vendors who do not contribute upstream can still deliver projects, but they are much less effective at fixing problems at the source or shaping the software's future. You end up spending public money on short-term integration, while underinvesting in the long-term quality, security, and resilience of the software you depend on.

If Europe wants digital sovereignty and real innovation, procurement must invest in upstream maintainers where security, resilience, and new capabilities are actually built. Open Source is public infrastructure. It's time we funded it that way.

The fix is straightforward: make contribution count in procurement scoring. When evaluating vendors, ask what they put back into the Open Source projects they are selling. Code, documentation, security fixes, funding.

Of course, all vendors will claim they contribute. I've seen companies claim credit for work they barely touched, or count contributions from employees who left years ago.

So how does a procurement officer tell who is real? By letting Open Source projects vouch for contributors directly. Projects know who does the work. We built Drupal's credit system to solve for exactly this. It's not perfect, but it's transparent. And transparency is hard to fake.

We use the credit system to maintain a public directory of companies that provide Drupal services, ranked by their contributions to the project. It shows, at a glance, which companies actually help build and maintain Drupal. If a vendor isn't on that list, they're most likely not contributing in a meaningful way. For a procurement officer, this turns a hard governance problem into a simple check: you can literally see which service providers help build Drupal. This is what contribution-based procurement looks like when it's made practical.

Fortunately, the momentum is building. APELL, an association of European Open Source companies, has proposed making contribution a procurement criterion. EuroStack, a coalition of 260+ companies, is lobbying for a "Buy Open Source Act". The European Commission has embraced an Open Source roadmap with procurement recommendations.

Europe does not need to build the next hyperscaler. It needs to shift procurement toward Open Source builders and maintainers. If Europe gets this right, it will mean better software, stronger local vendors, and public money that actually builds public code. Not to mention the autonomy that comes with it.

I submitted this post as feedback to the European Commission's call for evidence on Towards European Open Digital Ecosystems. If you work in Open Source, consider adding your voice. The feedback period ends February 3, 2026.

Special thanks to Taco Potze, Sachiko Muto, and Gábor Hojtsy for their review and contributions to this blog post.

21 Jan 2026 3:59pm GMT

CKEditor: What’s new in CKEditor Drupal modules: Footnotes, Restricted Editing and more

New versions of the CKEditor contributed modules bring new formatting and content control features to Drupal. Premium Features module 1.7.0 introduces Footnotes and Line Height support, expanding the toolkit for structured writing and print-optimized formatting. The release also includes support for watermarks in Word exports and several fixes to improve authoring experiences and compatibility. Plugin Pack module 1.5.0 delivers the long-awaited Restricted Editing feature for locked-down templates and controlled collaboration, along with configuration improvements to Layout Tables and general stability fixes.

21 Jan 2026 1:53pm GMT

UI Suite Initiative website: Announcement - Display Builder beta 1 has been released

On Monday, January 12, 2026, the UI Suite team proudly released the first beta of the Display Builder module, a display building tool for ambitious site builders: a single tool, deeply integrated with Drupal, whose unified UI can be used instead of Layout Builder for entity view displays, instead of Block Layout for page displays, and as a replacement for Views' display building feature. No more struggling with overcomplicated, inconsistent UIs!

21 Jan 2026 12:49pm GMT

Tag1 Insights: Speed up your testing with Deuteros

Take Away:

At Tag1, we believe in proving AI within our own work before recommending it to clients. This post is part of our AI Applied content series, where team members share real stories of how they're using artificial intelligence and the insights and lessons they learn along the way. Here, Francesco Placella, Senior Architect | Technical Lead, explores how AI supported his creation of Deuteros, an open-source PHP library that makes Drupal's entity system far easier to unit test, speeding up test runs by orders of magnitude. It is now freely available for anyone to use and contribute to.

Speed up your testing with Deuteros

At Tag1 we like performance.

We also like not being yelled at because a recently released production build introduced a major regression and suddenly the dev team has to scramble to hot-fix the issue.

Our industry settled quite a while ago on a solid strategy to avoid that scenario: automated testing. It's not a silver bullet but, when done right, it can hugely increase confidence in making changes to large and/or complex codebases, where even minimal tweaks can sometimes trigger side effects that are very hard to predict.

Doing it Right

Automated testing is a vast and complex subject with some grey areas, especially around terminology; however, there's one key principle worth sticking to: always run your automated tests after any change.

If you are embracing Test Driven Development, you'll have written new tests covering the changes you are working on before implementing them: this will require running tests multiple times before the task is over. However, even if you are following a more traditional approach, writing test coverage after implementing changes, you will still end up running tests very often.

Admittedly, in the cases above we are talking about a very limited set of tests; however, if tests run slowly, even those will add a significant overhead to development time.

Another widely accepted best practice is to run the entire test suite before merging a change into the project's repository. This both ensures that the new code behaves as expected and that no regression was introduced in the process. Once again, if tests run slowly, it may take a long time to get feedback, adding further overhead due to context switching.

Aside from the impact on developer experience, which at Tag1 is taken very seriously, the larger the overhead, the harder it is to justify to clients the amount of time spent implementing a change. I guess I don't need to describe the cascading effects.

The Test Pyramid

An approach that's very effective in addressing test speed issues is the test pyramid: without going too much into detail, the idea is that a test suite should be organized in a pyramidal shape with multiple layers getting smaller and slower to run as they approach the top.

This commonly translates into having:

  • A broad base of fast, isolated unit tests

  • A smaller middle layer of integration tests exercising multiple components together

  • A small number of slow end-to-end tests covering full user flows

What about Drupal?

At Tag1 we do many things, but Drupal has a special place in our hearts. However, running Drupal tests quickly has often been challenging.

Support for automated tests was initially added to Drupal 7 and, as awesome as it was at the time, it involved running a full site installation in the test setup phase before being able to perform a single assertion. The Simpletest framework focused mainly on UI testing and, even if it was sort of possible to use it to write unit tests, it was like driving to your favorite movie theater during rush hour just to check whether the car battery was wired properly. As you can imagine, it took time.

In Drupal 8 and above the situation vastly improved with the adoption of the PHPUnit testing framework. Multiple types of tests were introduced:

  • Functional tests still install a full Drupal site before running, which makes them the slowest of the bunch; they are best reserved for end-to-end coverage. It should be noted that the introduction by Moshe Weitzman of Drupal Test Traits helped to speed up these tests, as it allows them to run against an existing database instead of installing an empty site from scratch every time.

  • Kernel tests run much faster, as they only require installing the subsystems needed to run the test code. These are ideal for integration testing, as they allow you to pick and choose the parts of Drupal available to the test.

  • Unit tests, lightning-fast and narrow-scoped, are what should be used to implement the large majority of the test coverage. By default, they do close to nothing in the setup phase, allowing individual codebase units to be tested in isolation by providing test doubles of all the objects the test subject interacts with.

So we are good now, right? Well, not exactly. The entity subsystem - which powers users, nodes, media items, and taxonomy terms, among others - has historically been problematic when it comes to unit testing, because instantiating an entity object involves instantiating a significant amount of dependencies. In most cases this meant that, to test entity-related code, a Kernel test with all its baggage was needed. Since entities are very prevalent in any Drupal codebase, a wide range of test scenarios could not rely on fast unit tests.

Meet Deuteros

This issue has been bugging me for a long time, so I decided to experiment with an AI-assisted solution, since long and tedious tasks are exactly what LLMs are supposed to be good at.

The result is Deuteros - Drupal Entity Unit Test Extensible Replacement Object Scaffolding - a PHP library allowing the use of fast unit tests (extending UnitTestCase, for those in the know) when interacting with entities. And, yes, Claude helped me come up with the name as well.

Deuteros is fully open source and available today. You can explore the project, file issues, or contribute improvements on GitHub. The repository includes examples, docs, and instructions to start using it right away.

This is achieved by:

  • Allowing the creation of test doubles implementing various entity and field interfaces. These can be passed around in test code as if they were real entity objects.
use Drupal\Core\Entity\EntityInterface;
use PHPUnit\Framework\TestCase;
use Deuteros\Double\EntityDoubleFactory;
use Deuteros\Double\EntityDoubleDefinitionBuilder;

class MyServiceTest extends TestCase {

  public function testMyService(): void {
    // Get a factory (auto-detects PHPUnit or Prophecy)
    $factory = EntityDoubleFactory::fromTest($this);

    // Create an entity double
    $entity = $factory->create(
      EntityDoubleDefinitionBuilder::create('node')
        ->bundle('article')
        ->id(42)
        ->label('Test Article')
        ->field('field_body', 'Article content here')
        ->build()
    );

    // Use it in your test (MyService stands in for your own code under test)
    $myService = new MyService();
    $myService->doStuff($entity);
    
    // These assertions would all pass
    $this->assertInstanceOf(EntityInterface::class, $entity);
    $this->assertSame('node', $entity->getEntityTypeId());
    $this->assertSame('article', $entity->bundle());
    $this->assertSame(42, $entity->id());
    $this->assertSame('Article content here', $entity->get('field_body')->value);
  }

}
  • Allowing the instantiation of real entity objects while providing test doubles for dependencies such as field items and Entity Field API services. This allows unit testing of custom entity logic, such as the logic typically added to bundle classes.
use Drupal\node\Entity\Node;
use PHPUnit\Framework\TestCase;

// SubjectEntityFactory is provided by the Deuteros library.
class MyNodeTest extends TestCase {

  private SubjectEntityFactory $factory;

  protected function setUp(): void {
    $this->factory = SubjectEntityFactory::fromTest($this);
    $this->factory->installContainer();
  }

  protected function tearDown(): void {
    $this->factory->uninstallContainer();
  }

  public function testNodeCreation(): void {
    $node = $this->factory->create(Node::class, [
      'nid' => 42,
      'type' => 'article',
      'title' => 'Test Article',
      'body' => ['value' => 'Body text', 'format' => 'basic_html'],
      'status' => 1,
    ]);

    // Real Node instance with DEUTEROS field doubles.
    $this->assertInstanceOf(Node::class, $node);
    $this->assertSame('node', $node->getEntityTypeId());
    $this->assertSame('article', $node->bundle());
    $this->assertSame('Test Article', $node->get('title')->value);
    $this->assertSame('Body text', $node->get('body')->value);
  }

}

How AI Enabled a New, Faster Class of Drupal Tests

This all sounds good, but does it actually work? The library is already being adopted by some Tag1 client projects and seems to be working nicely. To get an idea of the speed gains Deuteros allows, we implemented some benchmark tests: a simple test trait performing a few node manipulation operations was used to measure the performance of multiple test types. In my DDEV local environment, 1,000 iterations of the test complete in less than 1 second as Unit tests, while they take ~12 minutes as Kernel tests - details and replication steps available here. Of course, running the Kernel tests in parallel could likely reduce the gap by a factor of ten, but I'll still take the former, thanks.

What's Next?

Complex projects very often require patches, either to implement new functionality not yet available in the Drupal codebase or to apply bug fixes before they are eventually released. Applying patches means, for all intents and purposes, running a fork of the original project, so even if individual patches don't break tests, there is no guarantee that their combined behavior won't introduce regressions. Ideally, to be fully confident about codebase quality, one would need to run the entire Drupal core + contrib test suite and confirm that nothing breaks. Currently, even on the drupal.org infrastructure - which is fully optimized for the job - you need to wait ~15 minutes on average for test results; adding the test suites of a hundred contrib modules increases that by a lot. Running this on every pull request or commit push does not seem viable at the moment.

Deuteros was written as a standalone PHP library, so that it could easily be adopted without having to make significant changes to the host project. If it were widely adopted both in Drupal core and in contrib-land, running all unit test suites could probably happen in less than one minute, increasing the confidence in our changes by a lot, and possibly reducing our energy footprint as a bonus. Should we try?

Image by Google DeepMind from pexels.

21 Jan 2026 12:00am GMT

20 Jan 2026

feedDrupal.org aggregator

Dries Buytaert: Software as clay on the wheel

Two people shape a clay pot on a spinning pottery wheel, their hands covered in wet clay.

A few weeks ago, Simon Willison started a coding agent, went to decorate a Christmas tree with his family, watched a movie, and came back to a working HTML5 parser.

That sounds like a party trick. It isn't.

It worked because the result was easy to check. The parser tests either pass or they don't. The type checker either accepts the code or it doesn't. In that kind of environment, the work can keep moving without much supervision.

Geoffrey Huntley's Ralph Wiggum loop is probably the cleanest expression of this idea I've seen, and it's becoming more popular quickly. In his demonstration video, he describes creating specifications through conversation with an AI agent, then letting the loop run. Each iteration starts fresh: the agent reads the specification, picks the most important remaining task, implements it, runs the tests. If they pass, it commits and exits. The next iteration begins with empty context, reads the current state from disk, and picks up where the previous run left off.
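The control flow described above can be sketched in a few lines of PHP. Everything here is a hypothetical illustration: the three callables stand in for a real coding agent CLI, a test runner such as PHPUnit, and version control commands.

```php
// Minimal sketch of a Ralph-style loop (hypothetical; the callables
// stand in for an agent CLI, a test runner, and git).
function ralphIteration(callable $runAgent, callable $testsPass, callable $commit): bool {
  // Fresh context: the agent re-reads the specification from disk,
  // picks the most important remaining task, and implements it.
  $runAgent();
  // Tests are the source of truth: commit only when they pass.
  if ($testsPass()) {
    $commit();
    return TRUE;
  }
  return FALSE;
}

// The outer loop just repeats iterations; all state lives in files and
// version control, never in the agent's context window.
function ralphLoop(callable $runAgent, callable $testsPass, callable $commit, int $iterations): void {
  for ($i = 0; $i < $iterations; $i++) {
    ralphIteration($runAgent, $testsPass, $commit);
  }
}
```

The key design choice is that no state is passed from one iteration to the next: each call to the agent starts from an empty context and recovers everything it needs from the spec file and the repository.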

If you think about it, that's what human prompting already looks like: prompt, wait, review, prompt again. You're shaping the code or text the way a potter shapes clay: push a little, spin the wheel, look, push again. The Ralph loop just automates the spinning, which makes much more ambitious tasks practical.

The difference is how state is handled. When you work this way by hand, the whole conversation comes along for the ride. In the Ralph loop, it doesn't. Each iteration starts clean.

Why? Because carrying everything with you all the time is a great way to stop getting anywhere. If you're going to work on a problem for hundreds of iterations, things start to pile up. As tokens accumulate, the signal can get lost in noise. By flushing context between iterations and storing state in files, each run can start clean.

Simon Willison's port of an HTML5 parsing library from Python to JavaScript showed the principle at larger scale. Using GPT-5.2 through Codex CLI with the --yolo flag for uninterrupted execution, he gave a handful of directional prompts: API design, milestones, CI setup. Then he let it run while he decorated a Christmas tree with his family and watched a movie.

Four and a half hours later, the agent had produced a working HTML5 parser. It passed over 9,200 tests from the html5lib-tests suite. HTML5 parsing is notoriously complex. The specification precisely defines how even malformed markup should be handled, with thousands of edge cases accumulated over years. But the agent had constant grounding: each test run pulled it back to reality before errors could compound.

As Willison put it: "If you can reduce a problem to a robust test suite you can set a coding agent loop loose on it with a high degree of confidence that it will eventually succeed". Ralph loops and Willison's approach differ in structure, but both depend on tests as the source of truth.

Cursor's research on scaling agents confirms this is starting to work at enterprise scale. Their team explored what happens when hundreds of agents work concurrently on a single codebase for weeks. In one experiment, they built a web browser from scratch. Over a million lines of code across a thousand files, generated in a week. And the browser worked.

That doesn't mean it's secure, fast, or something you'd ship. It means it met the criteria they gave it. If you decide to check for security or performance, it will work toward that as well. But the pattern is the same: clear tests, constant verification, agents that know when they're done.

From solo loops to hundreds of agents running in parallel, the same pattern keeps emerging. It feels like something fundamental is crystallizing: autonomous AI is starting to work well when you can accurately define success upfront.

Willison's success criteria were "simple": all 9,200 tests pass. That is a lot of tests, but the agent got there. Clear criteria made autonomy possible.

As I argued in AI flattens interfaces and deepens foundations, this changes where humans add value:

Humans are moving to where they set direction at the start and refine results at the end. AI handles everything in between.

The title of this post comes from Geoffrey Huntley. He describes software as clay on the pottery wheel, and once you've worked this way, it's hard to think about it any other way. As Huntley wrote: "If something isn't right, you throw it back on the wheel and keep going". That's exactly how it feels. Throw it back, refine it, spin again until it's right.

Of course, the Ralph Wiggum loop has limits. It works well when verification is unambiguous. A unit test returns pass or fail. But not all problems come with clear tests. And writing tests can be a lot of work.

For example, I've been thinking about how such loops could work for Drupal, where non-technical users build pages. "Make this page more on-brand" isn't a test you can run.

Or maybe it is? An AI agent could evaluate a page against brand guidelines and return pass or fail. It could check reading level and even do some basic accessibility tests. The verifier doesn't have to be a traditional test suite. It just has to provide clear feedback.

All of this just exposes something we already intuitively understand: defining success is hard. Really hard. When people build pages manually, they often iterate until it "feels right". They know what they want when they see it, but can't always articulate it upfront. Or they hire experts who carry that judgment from years of experience. This is the part of the work that's hardest to automate. The craft is moving upstream, from implementation to specification and validation.

The question for any task is becoming: can you tell, reliably, whether the result is getting better or worse? Where you can, the loop takes over. Where you can't, your judgment still matters.

The boundary keeps moving fast. A year ago, I was wrestling with local LLMs to generate good alt-text for images. Today, AI agents build working HTML5 parsers while you watch a movie. It's hard not to find that a little absurd. And hard not to be excited.

20 Jan 2026 7:39pm GMT

Drupal AI Initiative: The Future of AI-Powered Web Creation Is People-First, Not Prompt-First

Aidan Foster - Strategy Lead, Foster Interactive

The Big Shift in Web Creation

For years, the hardest part of building a website was technical execution. Slow development cycles, code barriers, and long timelines created bottlenecks.

AI has changed this.

Execution is no longer the limiting factor. Understanding is. The new challenge is knowing your audience, clarifying your message, and structuring the story your website needs to tell.

The future is not prompt-first. It is people-first: strategy, insight, empathy, and structure.

This was the core message of my talk AI Page Building with Drupal Canvas. It is also why Foster Interactive joined the Drupal AI Makers initiative.

But none of this works unless the human layer comes first.

Why People-First AI Matters and Why AI Slop Happens

A graphic showing 'authentic human insight' surrounded by a sea of AI slop
In a market characterized by AI slop, human insight is the key to being both authentic and distinct.

AI is a powerful assistant, but it cannot replace human judgment. Large language models can synthesize patterns, but they cannot invent your strategy.

When teams skip the foundational work such as audience research, messaging clarity, and brand systems, AI produces generic output that feels shallow and off-brand.

This is what we call AI slop. The issue is not the model. The issue is unclear inputs.

AI can only accelerate the parts you already understand. The human layer must come first. Audience insight. Value propositions. Tone and language rules. Page-level content strategy.

Without this structure, every output becomes guesswork.

The New Tools: Canvas and the Context Control Center

Drupal's new AI features are powerful because they finally support how marketers work.

Canvas: The visual editor built for marketers

Canvas allows anyone to build pages using drag and drop. It offers instant previews, mobile and desktop views, simple undo and redo, and AI built directly into the editor.

You can ask Canvas to assemble a campaign landing page and it uses your brand components, design system, content rules, and tone to create useful starting points.

This is the most marketer-friendly Drupal experience ever made.

The Context Control Center: The AI knowledge base

This is where strategy becomes usable by AI. It allows teams to load audience personas, value propositions, tone guides, brand rules, page templates, messaging frameworks, and content strategy documents.

With this context available, the AI produces work that is aligned, accurate, and consistent.

Instead of guessing, it draws from your organization's strategic foundation. For the first time, brand and audience knowledge can be reused across the entire website.

The Demo: How We Built the FinDrop Landing Page

The demo shows Canvas AI building and populating a landing page.

To demonstrate what is possible, we built a fictional SaaS company called FinDrop.

We created product stories, value props, audience personas, PPC ads, content strategy, and a visual system that matched the Mercury design system.

We generated all of this using strategy first, then AI. We crafted brand rules, used Nano Banana for consistent imagery, built campaign assets, and generated full landing pages for three stages of a funnel.

AI gave us speed, but only because the human structure was already in place. Without strategy the output collapsed. With structure it accelerated.

The Real Lesson: AI Only Works When the Inputs Are Strong

Drupal Canvas is a drag-and-drop page builder
Demo of the Drupal Canvas UI with Mercury component library.

The FinDrop demo made something clear. AI did not save time because it is smart. It saved time because the rules were defined. Your success depends on the strength of your foundations.

Clear value propositions. Real audience insight. A defined tone. Predictable page patterns. Brand rules the AI can follow. Without this, AI slows teams down.

At Foster Interactive we are testing the best models for Drupal workflows, refining content strategy structures for the Context Control Center, creating systems to make AI-ready brands easier to build, and bringing the marketer's perspective into the AI Makers roadmap.

Our goal is simple. Make AI genuinely useful for small marketing teams without sacrificing accuracy or authenticity.

What Is Coming Next for Drupal AI and Why It Matters

Drupal CMS 2 is coming in early 2026. It will include deeper Canvas integration, more intuitive site templates, a lighter AI suite, reusable design systems, expanded knowledge base support, and better tools for auditing and maintaining content.

But the biggest change is this. It will become easy to install the tools and it will be obvious who has done the strategic work. Teams relying solely on AI will blend into the noise.

Teams grounded in human insight will stand out.

Now Is the Time to Unlearn What Is Possible

A few months ago, I did not believe a CMS could generate usable landing pages in minutes or create consistent AI imagery. Then we built FinDrop.

The tools have changed. The pace has changed.

Human insight cannot be outsourced to AI.

We want our AI tools to take care of boring, repetitive jobs to free up our time for creative and strategic work.

The role of marketers is shifting away from production bottlenecks and toward clarity, empathy, positioning, narrative, and audience understanding.

AI can accelerate execution and remove repetitive tasks. But it cannot replace the strategy behind them.

If we get the human foundations right, we create a future where imagination becomes the bottleneck, not time.

That's the future I want to live in.

Ready to put people-first AI into practice?

Start with your foundations. Sit down with your team and audit your brand guidelines. Talk to front-line support and sales - the people closest to your customers. Update your tone, messaging, and audience details. This is the work that makes AI useful.

Then try Canvas. Once your foundations are solid, test what's possible with the upcoming Drupal CMS 2.0 demo at drupalforge.org. (Or if you're a little more technical, test the Driesnote Demo which is available right now).

20 Jan 2026 6:05pm GMT

Droptica: AGENTS.md Tool: How AI Actually Speeds Up Drupal Work

Friday, 2:00 PM. New developer, production bug. Something's broken with a custom queue worker. In the past, this meant tracking down the previous developer, consultations, wasted time - all on a Friday. Now? The developer asks the AI, and it responds with useful answers because it knows the project. How? Just one file: AGENTS.md.
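To make the idea concrete, here is what a minimal AGENTS.md might contain. Every path, command, and convention below is invented for illustration; a real file should describe your actual project.

```markdown
# AGENTS.md

## Project overview
Drupal 10 site. Custom code lives in web/modules/custom/.

## Commands
- Run tests: ./vendor/bin/phpunit -c web/core web/modules/custom
- Check coding standards: ./vendor/bin/phpcs --standard=Drupal web/modules/custom

## Conventions
- Queue workers live in src/Plugin/QueueWorker/ of each custom module.
- Never edit web/core or web/modules/contrib directly; use Composer patches.
```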

20 Jan 2026 12:14pm GMT

Undpaul.de: Converting PHP class annotations to PHP attributes

For years, Drupal relied on the doctrine/annotations library to add metadata to plugin classes. This approach works through docblock comments: the /** ... */ blocks that appear above class definitions. While convenient, this method has several inherent limitations and is being gradually replaced by PHP attributes.
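As a before-and-after sketch, consider a hypothetical "Hello" block plugin. The attribute class shown (Drupal\Core\Block\Attribute\Block) is the one Drupal core introduced for block plugins in 10.3; the plugin itself is invented for illustration.

```php
use Drupal\Core\Block\Attribute\Block;
use Drupal\Core\Block\BlockBase;
use Drupal\Core\StringTranslation\TranslatableMarkup;

// Before: metadata lived in a doctrine annotation inside a docblock:
//
// /**
//  * @Block(
//  *   id = "hello_block",
//  *   admin_label = @Translation("Hello block"),
//  * )
//  */

// After: the same metadata as a native PHP attribute.
#[Block(
  id: 'hello_block',
  admin_label: new TranslatableMarkup('Hello block'),
)]
class HelloBlock extends BlockBase {

  /**
   * {@inheritdoc}
   */
  public function build(): array {
    return ['#markup' => 'Hello'];
  }

}
```

Note how the annotation's `@Translation("...")` wrapper becomes an explicit `new TranslatableMarkup(...)` object, and the metadata is now parsed by PHP itself rather than by a docblock parser.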

20 Jan 2026 8:03am GMT

Specbee: Understanding Entity Reference Revisions in Drupal

Losing track of revisions after repeated edits in Drupal? Learn how Entity Reference Revisions preserves content integrity by ensuring safe edits, accurate history, and reliable workflows.

20 Jan 2026 5:44am GMT