17 Feb 2026

Drupal.org aggregator

Nonprofit Drupal posts: February 2026 Drupal for Nonprofits Chat

Join us THURSDAY, February 19 at 1pm ET / 10am PT, for our regularly scheduled call to chat about all things Drupal and nonprofits. (Convert to your local time zone.)

We don't have anything specific on the agenda this month, so we'll have plenty of time to discuss anything that's on our minds at the intersection of Drupal and nonprofits. Got something specific you want to talk about? Feel free to share ahead of time in our collaborative Google document at https://nten.org/drupal/notes!

All nonprofit Drupal devs and users, regardless of experience level, are always welcome on this call.

This free call is sponsored by NTEN.org and open to everyone.

Information on joining the meeting can be found in our collaborative Google document.

17 Feb 2026 6:15pm GMT

Specbee: 8 Critical considerations for a successful Drupal 7 to 10/11 migration

Your Drupal 7 site has reached its end of life as of January 2025. If you're planning a move to Drupal 10 or 11, this blog will help you prepare for a smooth and well-planned migration.

17 Feb 2026 9:15am GMT

Metadrop: Managing Drupal Status Report requirements

You know the drill. You visit the Drupal Status Report to check if anything needs attention, and you're greeted by a wall of warnings you've seen dozens of times before.

Some warnings are important. Others? Not so much. Maybe you're tracking an update notification in your GitLab and don't need the constant reminder. Perhaps there's a PHP deprecation notice you're already aware of and planning to address during your next scheduled upgrade. Or you're seeing environment-specific warnings that simply don't apply to your infrastructure setup.

The noisy status report problem

The problem is that all these warnings sit alongside genuine issues that actually need your attention. The noise drowns out the signal. You end up scrolling past the same irrelevant messages every time, increasing the chance you'll miss something that matters.

Over time, you develop warning blindness. Your brain learns to ignore the status report page entirely because the signal-to-noise ratio is too low. Then, when a genuine security update appears or a database schema issue emerges, it gets lost in the familiar sea of orange and red.

This problem multiplies across teams. Each developer independently decides which warnings to ignore. New team members have no way to know which warnings matter and which ones are environmental noise. The status report becomes unreliable, defeating its entire purpose.

17 Feb 2026 8:11am GMT

DDEV Blog: Xdebug in DDEV: Understanding, Debugging, and Troubleshooting Step Debugging

Illustration showing how Xdebug connects from PHP container to IDE debugger

For most people, Xdebug step debugging in DDEV just works: ddev xdebug on, set a breakpoint, start your IDE's debug listener, and go. DDEV handles all the Docker networking automatically. If you're having trouble, run ddev utility xdebug-diagnose or ddev utility xdebug-diagnose --interactive - they check your configuration and connectivity and tell you exactly what to fix.

This post explains how the pieces fit together and what to do if things do go wrong.

The Quick Version

  1. ddev xdebug on
  2. Start listening in your IDE (PhpStorm: click the phone icon; VS Code: press F5)
  3. Set a breakpoint in your entry point (index.php or web/index.php)
  4. Visit your site

If it doesn't work:

ddev utility xdebug-diagnose

Or for guided, step-by-step troubleshooting:

ddev utility xdebug-diagnose --interactive

The diagnostic checks port 9003 listener status, host.docker.internal resolution, WSL2 configuration, xdebug_ide_location, network connectivity, and whether Xdebug is loaded. It gives actionable fix recommendations.

How Xdebug Works

Xdebug lets you set breakpoints, step through code, and inspect variables - interactive debugging instead of var_dump().

The connection model is a reverse connection: your IDE listens on port 9003 (it's the TCP server), and PHP with Xdebug initiates the connection (it's the TCP client). Your IDE must be listening before PHP tries to connect.

:::note The Xdebug documentation uses the opposite terminology, calling the IDE the "client." We use standard TCP terminology here. :::

How DDEV Makes It Work

DDEV configures Xdebug to connect to host.docker.internal:9003. This special hostname resolves to the host machine's IP address from inside the container, so PHP can reach your IDE across the Docker boundary.
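
Inside the web container, the effective settings look roughly like this (an illustrative sketch - DDEV generates and manages its own Xdebug configuration, so exact values and file locations may differ):

; illustrative values only - DDEV manages the real configuration
xdebug.mode=debug
xdebug.client_host=host.docker.internal
xdebug.client_port=9003
xdebug.start_with_request=yes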

The tricky part is that host.docker.internal works differently across platforms; DDEV handles this automatically on each one.

You can verify the resolution with:

ddev exec getent hosts host.docker.internal
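
And you can check from inside the container whether anything on the host is actually listening on port 9003 - a rough manual version of what the diagnostic does (assuming bash's /dev/tcp support and the timeout utility in the web container):

ddev exec bash -c 'timeout 2 bash -c "</dev/tcp/host.docker.internal/9003" && echo "IDE is listening on 9003" || echo "nothing is listening on 9003"'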

DDEV Xdebug Commands
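
The day-to-day commands are minimal; run ddev help xdebug for the full list:

ddev xdebug on       # enable Xdebug in the web container
ddev xdebug off      # disable it again (the default state)
ddev xdebug toggle   # switch between enabled and disabled
ddev xdebug status   # report whether Xdebug is currently enabled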

IDE Setup

PhpStorm

Zero-configuration debugging works out of the box:

  1. Run → Start Listening for PHP Debug Connections
  2. Set a breakpoint
  3. Visit your site

PhpStorm auto-detects the server and path mappings. If mappings are wrong, check Settings → PHP → Servers and verify /var/www/html maps to your project root.

The PhpStorm DDEV Integration plugin handles this automatically.

VS Code

Install the PHP Debug extension and create .vscode/launch.json:

{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Listen for Xdebug",
      "type": "php",
      "request": "launch",
      "port": 9003,
      "hostname": "0.0.0.0",
      "pathMappings": {
        "/var/www/html": "${workspaceFolder}"
      }
    }
  ]
}

The VS Code DDEV Manager extension can set this up for you.

WSL2 + VS Code with WSL extension: Install the PHP Debug extension in WSL, not Windows.

Common Issues

Most problems fall into a few categories. The ddev utility xdebug-diagnose tool checks for all of these automatically.

Breakpoint in code that doesn't execute: The #1 issue. Start with a breakpoint in your entry point (index.php) to confirm Xdebug works, then move to the code you actually want to debug.

IDE not listening: Make sure you've started the debug listener. PhpStorm: click the phone icon. VS Code: press F5.

Incorrect path mappings: Xdebug reports container paths (/var/www/html), and your IDE needs to map them to your local project. PhpStorm usually auto-detects this; VS Code needs the pathMappings in launch.json.

Firewall blocking the connection: Especially common on WSL2, where Windows Defender Firewall blocks connections from the Docker container. Quick test: temporarily disable your firewall. If debugging works, add a firewall rule for port 9003.
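
For example, a rule along these lines from an elevated PowerShell on the Windows host opens the port (the rule name is arbitrary; adjust the scope to your own security requirements):

New-NetFirewallRule -DisplayName "DDEV Xdebug 9003" -Direction Inbound -Protocol TCP -LocalPort 9003 -Action Allow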

WSL2 Notes

WSL2 adds networking complexity; the most common problems (Windows firewall rules, host.docker.internal resolution, and the xdebug_ide_location setting) are all covered by ddev utility xdebug-diagnose.

Special Cases

Container-based IDEs (VS Code Remote Containers, JetBrains Gateway):

ddev config global --xdebug-ide-location=container

Command-line debugging: Works the same way - ddev xdebug on, start your IDE listener, then ddev exec php myscript.php. Works for Drush, WP-CLI, Artisan, and any PHP executed in the container.

Debugging Composer: Composer disables Xdebug by default. Override with:

ddev exec COMPOSER_ALLOW_XDEBUG=1 composer install

Custom port: Create .ddev/php/xdebug_client_port.ini with xdebug.client_port=9000 (rarely needed).
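
The file needs just that one line; remember to point your IDE at the same port (PhpStorm's debug port, or port in launch.json):

.ddev/php/xdebug_client_port.ini:
xdebug.client_port=9000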

Debugging host.docker.internal resolution: Run DDEV_DEBUG=true ddev start to see how DDEV determines the IP.

Advanced Features

xdebugctl: DDEV includes the xdebugctl utility for dynamically querying and modifying Xdebug settings, switching modes (debug, profile, trace), and more. Run ddev exec xdebugctl --help. See the xdebugctl documentation.

Xdebug map feature: Recent Xdebug versions can remap file paths during debugging, useful when container paths don't match local paths in complex ways. This complements IDE path mappings.

Performance: Xdebug adds overhead. Use ddev xdebug off or ddev xdebug toggle when you're not actively debugging.

More Information

Claude Code was used to create an initial draft for this blog, and for subsequent reviews.

17 Feb 2026 12:00am GMT

16 Feb 2026

Drupal.org aggregator

DrupalCon News & Updates: Must-See DrupalCon Chicago 2026 Sessions for Marketing and Content Leaders

If you are a marketing or content leader, DrupalCon Chicago 2026 is already calling your name. You are the special audience whose creative spark and unique perspective shine a light on Drupal in ways developers alone never could. You promote Drupal's capabilities to the world and ensure the platform reaches the users who need it. You translate technical innovation into stories that resonate with everyone.

Drupal is increasingly built with you in mind. Making Drupal more editor‑friendly has been a clear priority in recent years. Thanks to your feedback and insights, great strides have been made in providing tools and workflows that truly support your creative vision.

This year's DrupalCon sessions are set to spark bold insights, fresh strategies, and lively discussions. Expect those unforgettable "aha!" moments you'll want to carry back and weave into your own marketing and content playbook. Here is a curated list of standout sessions designed to help marketing and content leaders turn inspiration into action, build meaningful connections, and discover new ways to make the most out of Drupal's strengths.

Top DrupalCon Chicago 2026 sessions for marketing or content leaders

"Generative engine optimization tactics for discoverability" - by Jeffrey McGuire and Tracy Evans

Search Engine Optimization (SEO) has long been one of the web's most familiar acronyms when it comes to boosting content visibility. But new times bring new terms, and it's time to meet "GEO" (Generative Engine Optimization).

Indeed, traditional SEO alone is no longer enough in a world where tools like ChatGPT, Perplexity, and Google's AI Overviews are everyday sources of advice. Today, SEO and GEO must work hand in hand. DrupalCon Chicago 2026 has an insightful session designed to introduce you to a new way of helping your content reach its audience in the age of AI-driven recommendations.

Join brilliant speakers Jeffrey McGuire (horncologne) and Tracy Evans (kanadiankicks) to stay ahead of the curve. Jeffrey A. "jam" McGuire has been one of the most influential voices in the Drupal community for over two decades, recognized as a marketing strategy and communications expert. With their combined expertise, this session is tailored for marketing and content leaders who want practical, actionable guidance.

You'll explore how to make your agency, SaaS product, or company stand out when large language models decide which names to surface. Practical strategies will follow, helping you position your expertise, strengthen credibility signals, and align your content with the data sources LLMs rely on. The session will draw from real-world research, client projects, and observations.

"Context is Everything: Configuring Drupal AI for Consistent, On-Brand Content" - by Kristen Pol and Aidan Foster

It shouldn't come as a surprise that the next session on this list is also about AI. Of course, you already know that artificial intelligence can churn out content in seconds. But how do you make sure it's consistent with your brand's voice, feels authentic to your organization, and resonates with your audience?

That's where Drupal's latest innovations, Context Control Center and Drupal Canvas, step in. Expect more exciting details at this session at DrupalCon Chicago 2026, which is a must‑see for marketing and content leaders.

This talk will be led by Kristen Pol (kristen pol) and Aidan Foster (afoster), the maintainers behind Context Control Center and Drupal Canvas. Through live demos, you'll see landing pages, service pages, and blog posts come to life with clear context rules.

You'll also leave with a practical starter framework for building your own context files, giving you the confidence to guide AI toward content that supports your marketing goals and strengthens your brand presence.

"From Chaos to Clarity: Building a Sustainable Content Governance Model with Drupal" - by Richard Nosek and C.J. Pagtakhan

Content chaos is something every marketing and content leader has faced: fragmented messaging, inconsistent standards, and editorial bottlenecks that slow campaigns down. At DrupalCon Chicago 2026, you'll discover an actionable plan to make your content consistent, organized, and aligned with your brand's goals.

Join this compelling session by Richard Nosek and C.J. Pagtakhan, seasoned experts in digital strategy. They'll show how structured governance can scale across departments without stifling creativity. Explore workflows that make life easier for authors, editors, and administrators, including approval processes, audits, and lifecycle management. Discover clear frameworks for roles, responsibilities, and standards.

And because theory is best paired with practice, you'll see real-world examples of how this approach improves quality, strengthens collaboration, and supports long‑term digital strategy on Drupal websites of every size and scope.

"Selling Drupal: How to win projects, and not alienate delivery teams" - by Hannah O'Leary and Hannah McDermott

Within agencies, sales and delivery departments share the same ultimate goal: client success. However, sales teams chase ambitious targets, while delivery teams focus on scope, sustainability, and the realities of open-source implementation. Too often, this push and pull leads to friction, misaligned expectations, and even dips in client satisfaction.

At DrupalCon Chicago 2026, Hannah O'Leary (hannaholeary) and Hannah McDermott (hannah mcdermott) will share how they turned that challenge into a partnership at the Zoocha team. Through transparent handovers, joint scoping, and shared KPIs, they built a framework where both sides thrive together.

This session will highlight how open communication improved forecasting, reduced "us vs. them" dynamics, and directly boosted the quality of Drupal delivery. You'll leave with practical strategies to apply in your own organization. This includes fostering empathy across teams, aligning metrics, and creating a culture of transparency.

"A Dashboard that Works: Giving Editors What They Want, But Focusing on What They Need" - by Albert Hughes and Dave Hansen-Lange

Imagine logging in and instantly seeing what matters most to your content team: recent edits, accessibility checks, broken links, permissions, and so on. That's the power of a dashboard built not just to look good, but to truly support editors in their daily work.

Join Albert Hughes (ahughes3) and Dave Hansen-Lange (dalin) at their session as they share the journey of shaping a dashboard for 500 editors across 130 sites. You'll hear how priorities were set, how editor needs were balanced with technical realities, and how decisions shaped a tool that keeps content teams focused and confident.

You'll walk away with practical lessons you can apply to your own platform and a fresh perspective on how smart dashboards can empower editors and strengthen content leadership.

"Drupal CMS Spotlights" - by Gábor Hojtsy

As marketing and content leaders, you will appreciate a session on Drupal's latest innovations that can make a difference in your work. One of the greatest presentations for this purpose at DrupalCon Chicago 2026 is the Drupal CMS Spotlights.

Drupal CMS is a curated version of Drupal packed with pre-configured features, many of which are focused on content experiences. For example, you can instantly spin up a ready-to-go blog, SEO tools, events, and more.

The session brings together key Drupal CMS leaders to share insights on recent developments and plans for the future. You'll hear about Site Templates, the new Drupal Canvas page builder, AI, user experience, usability, documentation, and more.

Gábor Hojtsy (gábor hojtsy), Drupal core committer and initiative coordinator, is known for his engaging style, so you'll enjoy the session even if some details get technical.

"Launching the Drupal Site Template Marketplace" - by Tim Hestenes Lehnen

For marketing and content leaders, the launch of the Drupal Site Template Marketplace is big news. Each template combines recipes (pre‑configured feature sets), demo content, and a Canvas‑compatible theme, making it faster than ever to launch a professional, polished website. For anyone focused on storytelling, campaigns, or digital experiences, this is a game‑changer.

The pilot program at DrupalCon Vienna 2025 introduced the first templates, built with the support of Drupal Certified Partners. Now, the Marketplace is expanding, offering a streamlined way to discover, select, and implement templates that align with your goals.

Join Tim Hestenes Lehnen (hestenet), a renowned Drupal core contributor, for a session that dives deeper. He'll share lessons learned from the pilot, explain how the Marketplace connects to the roadmap for Drupal CMS and Drupal Canvas, and explore what's next as more templates become available.

Driesnote - by Dries Buytaert

The inspiring keynote by Dries Buytaert, Drupal's founder, is a session that can't be missed. Driesnote closes the opening program at Chicago 2026 and sets the tone for the entire conference. It's your perfect chance to see where Drupal is headed, and how those changes make your work easier, faster, and more creative.

At DrupalCon Vienna 2025, the main auditorium's audience was the first to hear Dries' announcements. Among other things, they heard about the rise in AI Initiative funding, doubled contributions to Drupal CMS, and the site templates coming to the Marketplace.

Marketers and content editors were especially amazed to see what's becoming possible in their work: content templates in Drupal Canvas, a Context Control Center to help AI capture brand voice, and autonomous Drupal agents keeping content up to date automatically.

This year, the mystery of what's next is yours to uncover. Follow the crowd to the main auditorium at DrupalCon Chicago and expect that signature "wow" moment that leaves the audience buzzing.

Final thoughts

Step into DrupalCon Chicago 2026 and reignite your marketing and content vision. Connect with peers, recharge your ideas, and see how Drupal continues to evolve. The sessions are designed to spark creativity and provide tools that can be put to work right away. As you head into the event, keep an open mind, lean into the conversations, and enjoy the energy that comes from sharing ideas across our amazing community.

Authored By: Nadiia Nykolaichuk, DrupalCon Chicago 2026 Marketing & Outreach Committee Member

16 Feb 2026 8:03pm GMT

Talking Drupal: Talking Drupal #540 - Acquia Source

Today we are talking about Acquia Source, Acquia's fully managed Drupal SaaS: what you can do with it, and how it could change your organization, with guest Matthew Grasmick. We'll also cover AI Single Page Importer as our module of the week.

For show notes visit: https://www.talkingDrupal.com/540

Topics

Resources

Guests

Matthew Grasmick - grasmash

Hosts

Nic Laflin - nLighteneddevelopment.com nicxvan John Picozzi - epam.com johnpicozzi Catherine Tsiboukas - mindcraftgroup.com bletch

MOTW Correspondent

Martin Anderson-Clutz - mandclu.com mandclu

16 Feb 2026 7:00pm GMT

Drupal AI Initiative: Four Weeks of High Velocity Development for Drupal AI

Authors: Arian, Christoph, Piyuesh, Rakhi (alphabetical)

While Artificial Intelligence is evolving rapidly, many applications remain experimental and difficult to implement in professional production environments. The Drupal AI Initiative addresses this directly, driving responsible AI innovation by channelling the community's creative energy into a clear, coordinated product vision for Drupal.

Dries Buytaert presenting the status of Drupal AI Initiative at DrupalCon Vienna 2025

In this article, the third in a series, we highlight the outcomes of the latest development sprints of the Drupal AI Initiative. Part one outlines the 2026 roadmap presented by Dries Buytaert. Part two addresses the organisation and new working model for the delivery of AI functionality.

Converting ambition into measurable progress

To turn the potential of AI into a reliable reality for the Drupal ecosystem, we have developed a repeatable, high-velocity production model that has already delivered significant results in its first four weeks.

A Dual-Workstream Approach to Innovation

To maximize efficiency and scale, development is organized into two closely collaborating workstreams. Together, they form a clear pipeline from exploration and prototyping to stable functionality:

  • The Innovation Workstream: Led by QED42, this stream explores emerging areas - evaluating Symfony AI, building AI-driven page creation, managing prompt context, and tracking the latest LLM capabilities - to define what is possible within the ecosystem.
  • The Product Workstream: Led by 1xINTERNET, this team takes proven innovations and refines, tests, and integrates them into a stable Drupal AI product, ensuring they are ready for enterprise use.

Sustainable Management through the RFP Model

This structure is powered by a Request for Proposal (RFP) model, sponsored by 28 organizations partnering with the Drupal AI Initiative.

The management of these workstreams is designed to rotate every six months via a new RFP process. Currently, 1xINTERNET provides the Product Owner for Product Development and QED42 provides the Product Owner for Innovation, while FreelyGive provides core technical architecture. This model ensures the initiative remains sustainable and neutral, while benefiting from the consistent professional expertise provided by the partners of the Drupal AI Initiative.

Professional Expertise from our AI Partners

The professional delivery of the initiative is driven by our AI Partners, who provide the specialized resources required for implementation. To maintain high development velocity, we operate in two-week sprint iterations. This predictable cadence allows our partners to effectively plan their staff allocations and ensures consistent momentum.

The Product Owners for each workstream work closely with the AI Initiative Leadership to deliver on the one-year roadmap. They maintain well-prepared backlogs, ensuring that participating organizations can contribute where their specific technical strengths are most impactful.

By managing the complete development lifecycle, including software engineering, UX design, quality assurance, and peer reviews, the sprint teams ensure the delivery of stable and well-architected solutions that are ready for production environments.


The Strategic Role of AI in Drupal CMS

The work of the AI Initiative provides important functionality to the recently launched Drupal CMS 2.0. This release represents one of the most significant evolutions in Drupal's 25-year history, introducing Drupal Canvas and a suite of AI-powered tools within a visual-first platform designed for marketing teams and site builders alike.

The strategic cooperation between the Drupal AI Initiative and the Drupal CMS team ensures that our professional-grade AI framework delivers critical functionality while aligning with the goals of Drupal CMS.

Results from our first sprints

The initial sprints demonstrate the high productivity of this dual-workstream approach, driven directly by the specialized staff of our partnering organizations. In the first two weeks, the sprint teams resolved 143 issues, creating significant momentum right from the first sprint.

Screenshot: the Drupal AI Dashboard

This surge of activity resulted in the largest regular patch release in the history of the Drupal AI module. This achievement was made possible by the intensive collaboration between several expert companies working in sync. Increased contribution from our partners will allow us to further accelerate development velocity, improving the capacity to deliver more advanced technical features in the coming months.

Screen recording: the AI Agents Debugger

Highlights from the first sprints

While the volume of work is significant, some new features stand out. Here are a few highlights from our recent sprint reviews:

  • AI Dashboard in Drupal CMS 2.0: Artem from 1xINTERNET presented the AI Dashboard functionality. This central hub for managing AI features and providers has been officially moved into the Drupal CMS 2.0 release, serving as the user interface for AI site management.
  • Advanced Automation: Anjali from QED42 presented new JSON and Audio field automators, which enable Drupal to process complex data types via AI.
  • The Context Control Center: Kristen from Salsa Digital presented the evolution of our context governance, converting config entities into content entities to enable revision management and better targeting.
  • The New Markdown Editor: Bruno from 1xINTERNET demonstrated a sleek new Markdown editor for prompt fields, featuring type-ahead autocomplete for variables and tokens. This will be released with the 1.3 version of the Drupal AI module.
  • Agents Debugger: To help developers see "under the hood," Marcus from FreelyGive introduced the new debugger module to trace AI agent interactions in real-time.
  • Technical Deep-Dives: We've seen steady progress on Symfony AI (presented by Akhil from QED42), Model Context Protocol (presented by Abhisek from DropSolid), and Reranking Models (led by Sergiu from DropSolid) to improve search quality.

Become a Drupal AI Partner

Our success so far is thanks to the companies who have stepped up as Drupal AI Partners. These organizations are leading the way in defining how AI and the Open Web intersect.

A huge thank you to our main contributors of the first two sprints (alphabetical order):

We invite further participation from the community. If your organization is interested in contributing expert resources to the forefront of AI development, we encourage you to join the initiative.

Sign up to join the Drupal AI Initiative

16 Feb 2026 4:00pm GMT

Debug Academy: Four PHP Communities, One Uncomfortable Conversation

Four PHP Communities, One Uncomfortable Conversation

Drupal, Joomla, Magento, Mautic. All PHP-based, all use Composer, all have talented & passionate communities. And all share the same problems around growth and sustainability. There is a solution.

No, we should not merge the codebases. Sure, you could have AI "Ralph-Wiggum" its way to a monstrosity with passing tests. But these frameworks are trusted for their code quality and security, and using AI to Frankenstein-smush them together would destroy that trust instantly.

What I'm proposing is merging the communities behind a single framework.

Why now? Because (yes, I'm going there) while AI can't merge codebases, it can help developers who already know PHP, Composer, and open source ramp up on a new framework far faster than before. The barrier to a knowledgeable human using a different technology has never been lower.

ashrafabed

16 Feb 2026 1:58pm GMT

The Drop Times: What Keeps You Invested in Drupal?

ORGANIZATION NEWS

DISCOVER DRUPAL

EVENT

FREE SOFTWARE

16 Feb 2026 1:48pm GMT

UI Suite Initiative website: Live show - Display Builder & UI Suite DaisyUI by WebWash!

It was fantastic to see Ivan Zugec diving deep into our ecosystem! We actually popped into the chat during the stream (he jokingly referred to us as the "End Boss"), and it was great to see him exploring the capabilities of Display Builder alongside UI Suite DaisyUI. Here is our recap of his session, available on the WebWash YouTube channel:

16 Feb 2026 12:14pm GMT

1xINTERNET blog: DrupalCamp England coming to Salford, Manchester

The UK's largest Drupal event arrives at the University of Salford, Manchester on 28th February - 1st March - and there's something for everyone, whether you're a seasoned Drupal professional or simply curious about what open source technology can do for your organisation.

16 Feb 2026 12:00pm GMT

Drupal.org blog: GitLab issue migration: how to use the new workflow

Now that some of the projects that opted in to GitLab issues are using them, their maintainers and contributors are getting real-world experience with how the issue workflow in GitLab is slightly different. More and more projects are being migrated each week, so sooner or later you will probably run into the following situations.

Creating new issues

When creating issues, the form is very simple: add a title and a description, save, and that's it!

GitLab has different work items when working on projects, like "Incidents", "Tasks" and "Issues". Our matching type will always be "Issue". Maintainers might choose to use the other types, but all integrations with Drupal.org will be made against "Issue" items.

Type of item

Labels (issue metadata)

As mentioned in the previous blog post GitLab issue migration: the new workflow for migrated projects, all the metadata for issues is managed via labels. Maintainers will select the labels once the issue is created.

Users without sufficient privileges cannot set things like priority or tags. Maintainers can grant the "reporter" role to some users to help with this metadata for the issues; reporters will be able to add and edit metadata when adding or editing issues. We acknowledge that this is probably the biggest difference from working with Drupal.org issues. We are listening to feedback and trying to identify the real needs first (thanks to the projects that opted in) before implementing anything permanent.

Reporters will be able to add or edit labels on issue creation or edit:

issue with metadata


So far, we have identified the biggest missing piece: the ability to mark an issue as RTBC (Reviewed & Tested by the Community). Bouncing between "Needs work" and "Needs review" tends to happen organically via comments among the contributors participating in the issue, but RTBC is probably what some maintainers look for before merging an issue.

These are conventions that we agreed on as a community a while back. RTBC is one; NW (Needs Work) vs. NR (Needs Review) is another. We could use this transition to GitLab issues to define their equivalents.

GitLab merge requests offer several choices that we could easily leverage.

  • Draft vs ready could mimic "Needs work" vs "Needs review". Contributors will switch this back and forth depending on the feedback needed on the code.
  • Merge request approvals could mimic "RTBC". This is something that can even be required (number of approvals or approval rules) per project, depending on the maintainer's preferences.

merge request approvals

We encourage maintainers to look at the merge requests listing instead (like this one). Both "draft" vs. "ready" and "approved" are features you can filter by when viewing merge requests for a project.

Automated messages

Automated messages are posted when issues are opened or closed. They provide links related to fork management, fork information, and access requests when creating forks, as well as reminders to update the contribution record links on the issue so that credit information is tracked.

Crosslinking between Drupal.org issues and GitLab issues

When referring to a Drupal.org issue from another Drupal.org issue, you can continue to use the [#123] syntax in the summary and comments, but enter the full URL in the "related issues" entry box.

When referring to a GitLab issue from another GitLab issue, use the #123 syntax, without the enclosing [ ].

For cross-platform references (Drupal to GitLab or GitLab to Drupal), you need to use the full URL.

What's next

Same as before, we want to go and review more of the already opted-in projects, collect feedback, act on it when needed, and then we will start to batch-migrate the next set: low-usage projects, projects with a low number of issues, etc.

The above should get us 80% of the way regarding the total number of projects to migrate, and once we have gathered more feedback and iterated over it, we'll be ready to target higher-volume, higher-usage projects.


Related blog posts:

16 Feb 2026 9:28am GMT

14 Feb 2026

Drupal.org aggregator

DDEV Blog: Mutagen in DDEV: Functionality, Issues, and Debugging

Friendly illustration of how Mutagen sync works between host and container filesystems

Mutagen has been a part of DDEV for years, providing dramatic performance improvements for macOS and traditional Windows users. It's enabled by default on these platforms, but understanding how it works, what can go wrong, and how to debug issues is key to getting the most out of DDEV.

Just Need to Debug Something?

If you're here because you just need to debug a Mutagen problem, this will probably help:

ddev utility mutagen-diagnose

See more below.

Contributor Training Video

This blog is based on the Mutagen Fundamentals and Troubleshooting Contributor Training held on January 22, 2026.

See the slides for the training video.

What Mutagen Does

Mutagen is an asynchronous file synchronization tool that decouples in-container reads and writes from reads and writes on the host machine. Each filesystem enjoys near-native speed because neither is stuck waiting on the other.

Traditional Docker bind-mounts check every file access against the file on the host. On macOS and Windows, Docker's implementation of these checks is not performant. Mutagen solves this by maintaining a cached copy of your project files in a Docker volume, syncing changes between host and container asynchronously.

Mostly for PHP

The primary target of Mutagen syncing is PHP files. They were the fundamental problem with Docker bind-mounts: as Docker-hosted PHP sites grew into the Composer generation, with tens of thousands of files, php-fpm had to open an enormous number of them all at once. Now, with DDEV on macOS using Mutagen, php-fpm opens files on its local Linux filesystem instead of opening ten thousand files that all have to be verified on the host.

Webserving Performance Improvement

Mutagen has delighted many developers with its web-serving performance. One dev said "the first time I tried it I cried."

Filesystem Notifications

Mutagen supports filesystem notifications (inotify/fsnotify), so file-watchers on both the host and inside the container are notified when changes occur. This is a significant advantage for development tools that would otherwise need to poll for changes.

How Mutagen Works in DDEV

When Mutagen is enabled, DDEV:

  1. Mounts a Docker volume onto /var/www inside the web container
  2. Installs a Linux Mutagen daemon inside the web container
  3. Starts a host-side Mutagen daemon
  4. Lets the two daemons keep each other up to date with changes on either side

Lifecycle

Upload Directories

DDEV automatically excludes user-generated files in upload_dirs from Mutagen syncing, using bind-mounts instead. For most CMS types, this is configured automatically; for Drupal, for example, sites/default/files is excluded by default.

If your project has non-standard locations, override defaults by setting upload_dirs in .ddev/config.yaml.

We do note that upload_dirs is no longer an adequate name for this behavior: it was originally intended for user-generated files, but is now also used for heavy directories like node_modules.

Common Issues and Caveats

Initial Sync Time

The first-time Mutagen sync takes 5-60 seconds depending on project size. A Magento 2 site with sample data might take 48 seconds initially, 12 seconds on subsequent starts. If sync takes longer than a minute, you're likely syncing large files or directories unnecessarily.

Large node_modules Directories

Frontend build tools create massive node_modules directories that slow Mutagen sync significantly. Solution: Add node_modules to upload_dirs:

upload_dirs: #upload_dirs entries are relative to docroot
  - sites/default/files # Keep existing CMS defaults
  - ../node_modules # Exclude from Mutagen

Then run ddev restart. The directory remains available in the container via Docker bind-mount.

File Changes When DDEV is Stopped

If you change files (checking out branches, running git pull, deleting files) while DDEV is stopped, Mutagen has no awareness of these changes. When you start again, it may restore old files from the volume.

Solution: Run ddev mutagen reset before restarting if you've made significant changes while stopped. That removes the volume so everything comes first from the host side in a fresh sync.
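
For example, after switching branches while the project was stopped (the branch name here is just illustrative):

ddev stop
git checkout feature/redesign   # or git pull, deleting files, etc.
ddev mutagen reset              # discard the stale Mutagen volume
ddev start                      # fresh sync, with the host side as the source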

Simultaneous Changes

If the same file changes on both host and container while out of sync, conflicts can occur. This is quite rare, but possible when both sides write to the same file before a sync completes.

The simplest precaution is to let a sync finish (check with ddev mutagen status) before editing the same files from the other side.

Disk Space Considerations

Mutagen increases disk usage because project code exists both on your computer and inside a Docker volume. The upload_dirs directories are excluded to mitigate this.

Watch for volumes larger than 5GB (warning) or 10GB (critical). Use ddev utility mutagen-diagnose --all to check all projects.
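
If you want to eyeball volume sizes yourself, Docker can report them directly (Mutagen volume names vary by project):

docker system df -v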

Debugging Mutagen Issues

The New ddev utility mutagen-diagnose Command

DDEV now includes a diagnostic tool that automatically checks for common issues:

ddev utility mutagen-diagnose

This command analyzes sync status, Mutagen volume size, and common problems such as large directories (node_modules, for example) that are not excluded from sync, and reports what it finds.

Use --all flag to analyze all Mutagen volumes system-wide:

ddev utility mutagen-diagnose --all

The diagnostic provides actionable recommendations like:

⚠ 3 node_modules directories exist but are not excluded from sync (can cause slow sync)
  → Add to .ddev/config.yaml:
    upload_dirs:
      - sites/default/files
      - web/themes/custom/mytheme/node_modules
      - web/themes/contrib/bootstrap/node_modules
      - app/node_modules
  → Then run: ddev restart

Debugging Long Startup Times

If ddev start takes longer than a minute and ddev utility mutagen-diagnose doesn't give you clues about why, watch what Mutagen is syncing:

ddev mutagen reset  # Start from scratch
ddev start

In another terminal:

while true; do ddev mutagen st -l | grep "^Current"; sleep 1; done

This shows which files Mutagen is working on:

Current file: vendor/bin/large-binary (306 MB/5.2 GB)
Current file: vendor/bin/large-binary (687 MB/5.2 GB)
Current file: vendor/bin/large-binary (1.1 GB/5.2 GB)

Add problem directories to upload_dirs or move them to .tarballs (automatically excluded).

Monitoring Sync Activity

Watch real-time sync activity:

ddev mutagen monitor

This shows when Mutagen responds to changes and helps identify sync delays.

Manual Sync Control

Force an explicit sync:

ddev mutagen sync

Check sync status:

ddev mutagen status

View detailed status:

ddev mutagen status -l

Troubleshooting Steps

  1. Verify that your project works without Mutagen:

    ddev config --performance-mode=none && ddev restart
    
  2. Run diagnostics:

    ddev utility mutagen-diagnose
    
  3. Reset to clean .ddev/mutagen/mutagen.yml:

    # Backup customizations first
    mv .ddev/mutagen/mutagen.yml .ddev/mutagen/mutagen.yml.bak
    ddev restart
    
  4. Reset Mutagen volume and recreate it:

    ddev mutagen reset
    ddev restart
    
  5. Enable debug output:

    DDEV_DEBUG=true ddev start
    
  6. View Mutagen logs:

    ddev mutagen logs
    
  7. Restart Mutagen daemon:

    ddev utility mutagen daemon stop
    ddev utility mutagen daemon start
    

Advanced Configuration

Excluding Directories from Sync

Recommended approach: Use upload_dirs in .ddev/config.yaml:

upload_dirs:
  - sites/default/files # CMS uploads
  - ../node_modules # Build dependencies
  - ../vendor/bin # Large binaries

Advanced approach: Edit .ddev/mutagen/mutagen.yml after removing the #ddev-generated line:

ignore:
  paths:
    - "/web/themes/custom/mytheme/node_modules"
    - "/vendor/large-package"

Then add corresponding bind-mounts in .ddev/docker-compose.bindmount.yaml:

services:
  web:
    volumes:
      - "../web/themes/custom/mytheme/node_modules:/var/www/html/web/themes/custom/mytheme/node_modules"

Always run ddev mutagen reset after changing mutagen.yml.

Git Hooks for Automatic Sync

Add .git/hooks/post-checkout and make it executable:

#!/usr/bin/env bash
ddev mutagen sync || true

Then make the hook executable:

chmod +x .git/hooks/post-checkout

Use Global Configuration for performance_mode

The standard practice is to set performance_mode in your global configuration, so that each user keeps what is normal for them and the project configuration doesn't carry a setting that might not work for another team member.

Most people don't have to change this anyway; macOS and traditional Windows default to performance_mode: mutagen and Linux/WSL default to performance_mode: none.
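
If you do want to set it explicitly, set it globally rather than in the project:

ddev config global --performance-mode=mutagen   # or "none" to stick with plain bind-mounts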

When to Disable Mutagen

Disable Mutagen if its caveats outweigh the performance gains for your project - for example, if the extra disk usage is a problem or if the asynchronous sync model doesn't fit your workflow.

Disable per-project:

ddev mutagen reset && ddev config --performance-mode=none && ddev restart

Disable globally:

ddev config global --performance-mode=none

Mutagen Data and DDEV

DDEV uses its own Mutagen installation, normally in ~/.ddev, but using $XDG_CONFIG_HOME when that is defined.

Access Mutagen directly:

ddev utility mutagen sync list
ddev utility mutagen sync monitor <projectname>

Summary

Mutagen provides dramatic performance improvements for macOS and traditional Windows users, but understanding its asynchronous nature is key to avoiding issues.

The benefits far outweigh the caveats for most projects, especially with the new diagnostic tools that identify and help resolve common issues automatically.

For more information, see the DDEV Performance Documentation and the Mutagen documentation.

Copilot was used to create an initial draft for this blog, and for subsequent reviews.

14 Feb 2026 1:39am GMT

13 Feb 2026

Drupal.org aggregator

A Drupal Couple: Why I Do Not Trust Independent AI Agents Without Strict Supervision

A human hand drawing a single clean glowing line through complex AI circuit patterns, representing human supervision guiding AI toward simpler solutions

I use Claude Code almost exclusively. Every day, for hours. It allowed me to get back into developing great tools, and I have published several results that are working very well. Plugins, skills, frameworks, development workflows. Real things that real people can use. The productivity is undeniable.

So let me be clear about what this post is. This is not a take on what AI can do. This is about AI doing it completely alone.

The results are there. But under supervision.

The laollita.es Moment

When we were building laollita.es, something happened that I documented in a previous post. We needed to apply some visual changes to the site. The AI agent offered a solution: a custom module with a preprocess function. It would work. Then we iterated, and it moved to a theme-level implementation with a preprocess function. That would also work. Both approaches would accomplish the goal.

Until I asked: isn't it easier to just apply CSS to the new classes?

Yes. It was. Simple CSS. No module, no preprocess, no custom code beyond what was needed.

Here is what matters. All three solutions would have accomplished the goal. The module approach, the theme preprocess, the CSS. They all would have worked. But two of them create technical debt and maintenance load that was completely unnecessary. The AI did not choose the simplest path because it does not understand the maintenance burden. It does not think about who comes after. It generates a solution that works and moves on.

This is what I see every time I let the AI make decisions without questioning them. It works... and it creates problems you only discover later.

Why This Happens

I have been thinking about this for a while. I have my own theories, and they keep getting confirmed the more I work with these tools. Here is what I think is going on.

AI Cannot Form New Memories

Eddie Chu made this point at the latest AI Tinkerers meeting, and it resonated with me because I live it every day.

I use frameworks. Skills. Plugins. Commands. CLAUDE.md files. I have written before about my approach to working with AI tools. I have built an entire organization of reference documents, development guides, content frameworks, tone guides, project structure plans. All of this exists to create guardrails, to force best practices, to give AI the context it needs to do good work.

And it will not keep the memory.

We need to force it. Repeat it. Say it again.

This is not just about development. It has the same problem when creating content. I built a creative brief step into my workflow because the AI was generating content that reflected its own patterns instead of my message. I use markdown files, state files, reference documents, the whole structure in my projects folder. And still, every session starts from zero. The AI reads what it reads, processes what it processes, and the rest... it is as if it never existed.

The Expo.dev engineering team described this perfectly after using Claude Code for a month [1]. They said the tool "starts fresh every session" like "a new hire who needs onboarding each time." Pre-packaged skills? It "often forgets to apply them without explicit reminders." Exactly my experience.

Context Is Everything (And Context Is the Problem)

Here is something I have noticed repeatedly. In a chat interaction, in agentic work, the full history is the context. Everything that was said, every mistake, every correction, every back-and-forth. That is what the AI is working with.

When the AI is already confused and I have asked for the same correction three times and it is going in strange ways... starting a new session and asking it to analyze the code fresh, to understand what is there, it magically finds the solution.

Why? Because the previous mistakes are in the context. The AI does not read everything from top to bottom. It scans for what seems relevant, picks up fragments, skips over the rest. Which means even the guardrails I put in MD files, the frameworks, the instructions... they are not always read. They are not always in the window of what the AI is paying attention to at that moment.

And when errors are in the context, they compound. Research calls this "cascading failures" [2]. A small mistake becomes the foundation for every subsequent decision, and by the time you review the output, the error has propagated through multiple layers. An inventory agent hallucinated a nonexistent product, then called four downstream systems to price, stock, and ship the phantom item [3]. One hallucinated fact, one multi-system incident.

Starting fresh clears the poison. But an unsupervised agent never gets to start fresh. It just keeps building on what came before.

The Dunning-Kruger Effect of AI

The Dunning-Kruger effect is a cognitive bias where people with limited ability in a task overestimate their competence. AI has its own version of this.

When we ask AI to research, write, or code something, it typically responds with "this is done, production ready" or some variation of "this is done, final, perfect!" But it is not. And going back to the previous point, that false confidence is now in the context. So no matter if you discuss it later and explain what was wrong or that something is missing... it is already "done." If the AI's targeted search through the conversation does not bring the correction back into focus... there you go.

Expo.dev documented the same pattern [1]. Claude "produces poorly architected solutions with surprising frequency, and the solutions are presented with confidence." It never says "I am getting confused, maybe we should start over." It just keeps going, confidently wrong.

The METR study puts hard numbers on this [4]. In a randomized controlled trial with experienced developers, AI tools made them 19% slower. Not faster. Slower. But the developers still believed AI sped them up by 20%. The perception-reality gap is not just an AI problem. It is a human problem too. Both sides of the equation are miscalibrated.

The Training Data Problem

The information or memory that AI has is actually not all good. Usually it is "cowboy developers" who really go ahead and respond to most social media questions, Stack Overflow answers, blog posts, tutorials. And that is the training. That is the information AI learned from.

The same principle applies beyond code. The information we produce as a society is biased, and AI absorbs all of it. That is why you see discriminatory AI systems across industries. AI resume screeners favor white-associated names 85% of the time [5]. UnitedHealthcare's AI denied care and was overturned on appeal 90% of the time [6]. A Dutch algorithm wrongly accused 35,000 parents of fraud, and the scandal toppled the entire government [7].

For my own work, I create guides to counteract this. Content framework guides that extract proper research on how to use storytelling, inverted pyramid, AIDA structures. Tone guides with specific instructions. I put them in skills and reference documents so I can point the AI to them when we are working. And still I have to remind it. Every time.

What I See Every Day

I have seen AI do what it did in laollita.es across multiple projects. In development, it created an interactive chat component, and the next time we used it on another screen, it almost wrote another one from scratch instead of reusing the one it had just built. Same project. Same session sometimes.

In content creation, I have a tone guide with specific stylistic preferences. And I still have to explicitly ask the AI to review it. No matter how directive the language in the instructions is. "Always load this file before writing content." It does not always load the file.

And it is not just my experience.

A Replit agent deleted a production database during a code freeze, then fabricated fake data and falsified logs to cover it up [8]. Google's Antigravity agent wiped a user's entire hard drive when asked to clear a cache [9]. Klarna's CEO said "we went too far" after cutting 700 jobs for AI and is now rehiring humans [10]. Salesforce cut 4,000 support staff and is now facing lost institutional knowledge [11]. The pattern keeps repeating. Companies trust the agent, remove the human, discover why the human was there in the first place.

What This Means for AI Supervision

I am not against AI. I am writing this post on a system largely built with AI assistance. The tools I publish, the workflows I create, the content I produce. AI is deeply embedded in my work. It makes me more productive.

At Palcera, I believe AI is genuinely great for employees and companies. When AI helps a developer finish faster, that time surplus benefits everyone. The developer gets breathing room. The company gets efficiency. And the customer can get better value, better pricing, faster delivery. That is real. I see it every day.

But all of that requires the human in the loop. Questioning the choices. Asking "isn't CSS simpler?" Clearing the context when things go sideways. Pointing to the tone guide when the AI forgets. Starting fresh when the conversation gets poisoned with old mistakes.

The results are there. But under supervision. And that distinction matters more than most people realize.


References

[1] Expo.dev, "What Our Web Team Learned Using Claude Code for a Month"

[2] Adversa AI, "Cascading Failures in Agentic AI: OWASP ASI08 Security Guide 2026"

[3] Galileo, "7 AI Agent Failure Modes and How To Fix Them"

[4] METR, "Measuring the Impact of Early-2025 AI on Experienced Developer Productivity"

[5] The Interview Guys / University of Washington, "85% of AI Resume Screeners Prefer White Names"

[6] AMA, "How AI Is Leading to More Prior Authorization Denials"

[7] WBUR, "What Happened When AI Went After Welfare Fraud"

[8] The Register, "Vibe Coding Service Replit Deleted Production Database"

[9] The Register, "Google's Vibe Coding Platform Deletes Entire Drive"

[10] Yahoo Finance, "After Firing 700 Humans For AI, Klarna Now Wants Them Back"

[11] Maarthandam, "Salesforce Regrets Firing 4,000 Experienced Staff and Replacing Them with AI"

Abstract: A developer who uses AI coding tools daily shares real examples of why autonomous AI agents still need human supervision. From unnecessary technical debt to context pollution to confidently wrong outputs, AI works best when a human is asking the right questions.

13 Feb 2026 6:20pm GMT

Dripyard Premium Drupal Themes: Dripyard's Meridian + Drupal CMS Webinar Recording is Up

Our webinar on Drupal CMS + the Meridian theme is up on YouTube! In it, we talked about the new theme, demoed various example sites built with it, and ran through the new components.

We also talked about how Meridian differs from Drupal CMS's built-in Byte theme and site template.

Enjoy!

13 Feb 2026 4:18pm GMT

1xINTERNET blog: 1xINTERNET and React Online join forces

1xINTERNET and React Online have joined forces: React Online will become the Dutch subsidiary of 1xINTERNET. Same great people, same trusted partnerships, now backed by a team of 90+ experts across Europe.

13 Feb 2026 12:00pm GMT