
14 Feb 2026
Drupal.org aggregator
DDEV Blog: Mutagen in DDEV: Functionality, Issues, and Debugging

Mutagen has been a part of DDEV for years, providing dramatic performance improvements for macOS and traditional Windows users. It's enabled by default on these platforms, but understanding how it works, what can go wrong, and how to debug issues is key to getting the most out of DDEV.
Just Need to Debug Something?
If you're here because you just need to debug a Mutagen problem, this will probably help:
ddev utility mutagen-diagnose
See more below.
Contributor Training Video
This blog is based on the Mutagen Fundamentals and Troubleshooting Contributor Training held on January 22, 2026.
See the slides for the training video.
What Mutagen Does
Mutagen is an asynchronous file synchronization tool that decouples in-container reads and writes from reads and writes on the host machine. Each filesystem enjoys near-native speed because neither is stuck waiting on the other.
Traditional Docker bind-mounts check every file access against the file on the host. On macOS and Windows, Docker's implementation of these checks is not performant. Mutagen solves this by maintaining a cached copy of your project files in a Docker volume, syncing changes between host and container asynchronously.
Mostly for PHP
The primary target of Mutagen syncing is PHP files. These became the fundamental problem with Docker as Composer-era PHP sites grew to tens of thousands of files, all of which php-fpm had to open at once. Now with DDEV on macOS using Mutagen, php-fpm opens files that live on its local Linux filesystem, instead of opening ten thousand files that each have to be verified against the host.
Webserving Performance Improvement
Mutagen has delighted many developers with its web-serving performance. One dev said "the first time I tried it I cried."
Filesystem Notifications
Mutagen supports filesystem notifications (inotify/fsnotify), so file-watchers on both the host and inside the container are notified when changes occur. This is a significant advantage for development tools that would otherwise need to poll for changes.
How Mutagen Works in DDEV
When Mutagen is enabled, DDEV:
- Mounts a Docker volume onto /var/www inside the web container
- A Linux Mutagen daemon is installed inside the web container
- A host-side Mutagen daemon is started by DDEV
- The two daemons keep each other up-to-date with changes on either side
Lifecycle
- ddev start: Starts the Mutagen daemon on the host if not running, creates or resumes the sync session
- ddev stop: Flushes the sync session to ensure consistency, then pauses it
- ddev composer: Triggers a synchronous flush after completion to sync massive filesystem changes
- ddev mutagen reset: Removes the Docker volume; the sync session is then recreated from scratch (from the host-side contents) on ddev start
Upload Directories
DDEV automatically excludes user-generated files in upload_dirs from Mutagen syncing, using bind-mounts instead. For most CMS types, this is configured automatically, for example:
- Drupal: sites/default/files
- WordPress: wp-content/uploads
- TYPO3: fileadmin, uploads
If your project has non-standard locations, override defaults by setting upload_dirs in .ddev/config.yaml.
Admittedly, upload_dirs is no longer an adequate name for this behavior. It was originally intended for user-generated files, but it is now also used for heavy directories like node_modules, etc.
Common Issues and Caveats
Initial Sync Time
The first-time Mutagen sync takes 5-60 seconds depending on project size. A Magento 2 site with sample data might take 48 seconds initially, 12 seconds on subsequent starts. If sync takes longer than a minute, you're likely syncing large files or directories unnecessarily.
Large node_modules Directories
Frontend build tools create massive node_modules directories that slow Mutagen sync significantly. Solution: Add node_modules to upload_dirs:
upload_dirs: #upload_dirs entries are relative to docroot
- sites/default/files # Keep existing CMS defaults
- ../node_modules # Exclude from Mutagen
Then run ddev restart. The directory remains available in the container via Docker bind-mount.
File Changes When DDEV is Stopped
If you change files (checking out branches, running git pull, deleting files) while DDEV is stopped, Mutagen has no awareness of these changes. When you start again, it may restore old files from the volume.
Solution: Run ddev mutagen reset before restarting if you've made significant changes while stopped. That removes the volume so everything comes first from the host side in a fresh sync.
Simultaneous Changes
If the same file changes on both host and container while out of sync, conflicts can occur. This is quite rare but possible with:
- Scripts running simultaneously on host and in container
- Massive branch changes
- Large npm install or yarn install operations
Best practices:
- Do Git operations on the host, not in the container
- Use ddev composer for most Composer operations
- Run ddev mutagen sync after major Git branch changes
- Run ddev mutagen sync after manual Composer operations done inside the container
Disk Space Considerations
Mutagen increases disk usage because project code exists both on your computer and inside a Docker volume. The upload_dirs directories are excluded to mitigate this.
Watch for volumes larger than 5GB (warning) or 10GB (critical). Use ddev utility mutagen-diagnose --all to check all projects.
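As a rough illustration of those thresholds, here is a sketch in shell. This is a hypothetical helper, not DDEV's actual implementation; it assumes the volume size is given in bytes and that 1 GB means 1024^3 bytes:

```shell
#!/usr/bin/env bash
# Hypothetical helper mirroring the thresholds above:
# warn above 5 GB, critical above 10 GB (volume size in bytes).
classify_volume_size() {
  local bytes=$1
  local gib=$((1024 * 1024 * 1024))
  if [ "$bytes" -gt $((10 * gib)) ]; then
    echo critical
  elif [ "$bytes" -gt $((5 * gib)) ]; then
    echo warning
  else
    echo ok
  fi
}

classify_volume_size $((3 * 1024 * 1024 * 1024))   # prints "ok"
classify_volume_size $((7 * 1024 * 1024 * 1024))   # prints "warning"
classify_volume_size $((12 * 1024 * 1024 * 1024))  # prints "critical"
```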
Debugging Mutagen Issues
The New ddev utility mutagen-diagnose Command
DDEV now includes a diagnostic tool that automatically checks for common issues:
ddev utility mutagen-diagnose
This command analyzes:
- Volume sizes: Warns if >5GB, critical if >10GB
- Upload directories configuration: Checks if properly configured for your CMS
- Sync session status: Reports problems with the sync session
- Large directories: Identifies node_modules and other large directories being synced
- Ignore patterns: Validates Mutagen exclusion configuration
Use --all flag to analyze all Mutagen volumes system-wide:
ddev utility mutagen-diagnose --all
The diagnostic provides actionable recommendations like:
⚠ 3 node_modules directories exist but are not excluded from sync (can cause slow sync)
→ Add to .ddev/config.yaml:
upload_dirs:
- sites/default/files
- web/themes/custom/mytheme/node_modules
- web/themes/contrib/bootstrap/node_modules
- app/node_modules
→ Then run: ddev restart
Debugging Long Startup Times
If ddev start takes longer than a minute and ddev utility mutagen-diagnose doesn't give you clues about why, watch what Mutagen is syncing:
ddev mutagen reset # Start from scratch
ddev start
In another terminal:
while true; do ddev mutagen st -l | grep "^Current"; sleep 1; done
This shows which files Mutagen is working on:
Current file: vendor/bin/large-binary (306 MB/5.2 GB)
Current file: vendor/bin/large-binary (687 MB/5.2 GB)
Current file: vendor/bin/large-binary (1.1 GB/5.2 GB)
Add problem directories to upload_dirs or move them to .tarballs (automatically excluded).
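For the vendor/bin example above, the fix might look like this in .ddev/config.yaml (the paths are illustrative for this hypothetical project):

```yaml
# .ddev/config.yaml
upload_dirs:
  - sites/default/files   # keep existing CMS defaults
  - ../vendor/bin         # the large-binary directory seen in the sync output
```

Remember to run ddev restart afterward so the new bind-mount takes effect.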
Monitoring Sync Activity
Watch real-time sync activity:
ddev mutagen monitor
This shows when Mutagen responds to changes and helps identify sync delays.
Manual Sync Control
Force an explicit sync:
ddev mutagen sync
Check sync status:
ddev mutagen status
View detailed status:
ddev mutagen status -l
Troubleshooting Steps
- Verify that your project works without Mutagen:
  ddev config --performance-mode=none && ddev restart
- Run diagnostics:
  ddev utility mutagen-diagnose
- Reset to a clean .ddev/mutagen/mutagen.yml:
  # Back up customizations first
  mv .ddev/mutagen/mutagen.yml .ddev/mutagen/mutagen.yml.bak
  ddev restart
- Reset the Mutagen volume and recreate it:
  ddev mutagen reset
  ddev restart
- Enable debug output:
  DDEV_DEBUG=true ddev start
- View Mutagen logs:
  ddev mutagen logs
- Restart the Mutagen daemon:
  ddev utility mutagen daemon stop
  ddev utility mutagen daemon start
Advanced Configuration
Excluding Directories from Sync
Recommended approach: Use upload_dirs in .ddev/config.yaml:
upload_dirs:
- sites/default/files # CMS uploads
- ../node_modules # Build dependencies
- ../vendor/bin # Large binaries
Advanced approach: Edit .ddev/mutagen/mutagen.yml after removing the #ddev-generated line:
ignore:
paths:
- "/web/themes/custom/mytheme/node_modules"
- "/vendor/large-package"
Then add corresponding bind-mounts in .ddev/docker-compose.bindmount.yaml:
services:
web:
volumes:
- "../web/themes/custom/mytheme/node_modules:/var/www/html/web/themes/custom/mytheme/node_modules"
Always run ddev mutagen reset after changing mutagen.yml.
Git Hooks for Automatic Sync
Add .git/hooks/post-checkout with the following contents:
#!/usr/bin/env bash
ddev mutagen sync || true
Then make it executable:
chmod +x .git/hooks/post-checkout
Use Global Configuration for performance_mode
The standard practice is to set performance_mode in global configuration, so that each user keeps what's normal for their platform and the project configuration doesn't carry settings that might not work for another team member.
Most people don't have to change this anyway; macOS and traditional Windows default to performance_mode: mutagen and Linux/WSL default to performance_mode: none.
When to Disable Mutagen
Disable Mutagen if:
- You're on Linux or WSL2 (already has native performance)
- Filesystem consistency is more critical than webserving performance
- You're troubleshooting other DDEV issues
Disable per-project:
ddev mutagen reset && ddev config --performance-mode=none && ddev restart
Disable globally:
ddev config global --performance-mode=none
Mutagen Data and DDEV
DDEV uses its own Mutagen installation, normally in ~/.ddev, but using $XDG_CONFIG_HOME when that is defined.
- Binary location: $HOME/.ddev/bin/mutagen or ${XDG_CONFIG_HOME}/ddev/bin/mutagen
- Data directory: $HOME/.ddev_mutagen_data_directory
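The binary lookup can be sketched in shell. This is illustrative only; it simply mirrors the two locations listed above:

```shell
#!/usr/bin/env bash
# Resolve the DDEV-managed Mutagen binary path, per the locations above.
resolve_mutagen_bin() {
  if [ -n "${XDG_CONFIG_HOME:-}" ]; then
    echo "${XDG_CONFIG_HOME}/ddev/bin/mutagen"
  else
    echo "${HOME}/.ddev/bin/mutagen"
  fi
}

echo "DDEV's Mutagen binary: $(resolve_mutagen_bin)"
```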
Access Mutagen directly:
ddev utility mutagen sync list
ddev utility mutagen sync monitor <projectname>
Summary
Mutagen provides dramatic performance improvements for macOS and traditional Windows users, but understanding its asynchronous nature is key to avoiding issues:
- Use ddev utility mutagen-diagnose as your first debugging step
- Configure upload_dirs to exclude large directories like node_modules or heavy user-generated file directories
- Run ddev mutagen reset after file changes made while DDEV is stopped
- Do Git operations on the host, not in the container
- Monitor sync activity with ddev mutagen monitor when troubleshooting
The benefits far outweigh the caveats for most projects, especially with the new diagnostic tools that identify and help resolve common issues automatically.
For more information, see the DDEV Performance Documentation and the Mutagen documentation.
Copilot was used to create an initial draft for this blog, and for subsequent reviews.
14 Feb 2026 1:39am GMT
13 Feb 2026
Drupal.org aggregator
A Drupal Couple: Why I Do Not Trust Independent AI Agents Without Strict Supervision

I use Claude Code almost exclusively. Every day, for hours. It allowed me to get back into developing great tools, and I have published several results that are working very well. Plugins, skills, frameworks, development workflows. Real things that real people can use. The productivity is undeniable.
So let me be clear about what this post is. This is not a take on what AI can do. This is about AI doing it completely alone.
The results are there. But under supervision.
The laollita.es Moment
When we were building laollita.es, something happened that I documented in a previous post. We needed to apply some visual changes to the site. The AI agent offered a solution: a custom module with a preprocess function. It would work. Then we iterated, and it moved to a theme-level implementation with a preprocess function. That would also work. Both approaches would accomplish the goal.
Until I asked: isn't it easier to just apply CSS to the new classes?
Yes. It was. Simple CSS. No module, no preprocess, no custom code beyond what was needed.
Here is what matters. All three solutions would have accomplished the goal. The module approach, the theme preprocess, the CSS. They all would have worked. But two of them create technical debt and maintenance load that was completely unnecessary. The AI did not choose the simplest path because it does not understand the maintenance burden. It does not think about who comes after. It generates a solution that works and moves on.
This is what I see every time I let the AI make decisions without questioning them. It works... and it creates problems you only discover later.
Why This Happens
I have been thinking about this for a while. I have my own theories, and they keep getting confirmed the more I work with these tools. Here is what I think is going on.
AI Cannot Form New Memories
Eddie Chu made this point at the latest AI Tinkerers meeting, and it resonated with me because I live it every day.
I use frameworks. Skills. Plugins. Commands. CLAUDE.md files. I have written before about my approach to working with AI tools. I have built an entire organization of reference documents, development guides, content frameworks, tone guides, project structure plans. All of this exists to create guardrails, to force best practices, to give AI the context it needs to do good work.
And it will not keep the memory.
We need to force it. Repeat it. Say it again.
This is not just about development. It has the same problem when creating content. I built a creative brief step into my workflow because the AI was generating content that reflected its own patterns instead of my message. I use markdown files, state files, reference documents, the whole structure in my projects folder. And still, every session starts from zero. The AI reads what it reads, processes what it processes, and the rest... it is as if it never existed.
The Expo.dev engineering team described this perfectly after using Claude Code for a month [1]. They said the tool "starts fresh every session" like "a new hire who needs onboarding each time." Pre-packaged skills? It "often forgets to apply them without explicit reminders." Exactly my experience.
Context Is Everything (And Context Is the Problem)
Here is something I have noticed repeatedly. In a chat interaction, in agentic work, the full history is the context. Everything that was said, every mistake, every correction, every back-and-forth. That is what the AI is working with.
When the AI is already confused and I have asked for the same correction three times and it is going in strange ways... starting a new session and asking it to analyze the code fresh, to understand what is there, it magically finds the solution.
Why? Because the previous mistakes are in the context. The AI does not read everything from top to bottom. It scans for what seems relevant, picks up fragments, skips over the rest. Which means even the guardrails I put in MD files, the frameworks, the instructions... they are not always read. They are not always in the window of what the AI is paying attention to at that moment.
And when errors are in the context, they compound. Research calls this "cascading failures" [2]. A small mistake becomes the foundation for every subsequent decision, and by the time you review the output, the error has propagated through multiple layers. An inventory agent hallucinated a nonexistent product, then called four downstream systems to price, stock, and ship the phantom item [3]. One hallucinated fact, one multi-system incident.
Starting fresh clears the poison. But an unsupervised agent never gets to start fresh. It just keeps building on what came before.
The Dunning-Kruger Effect of AI
The Dunning-Kruger effect is a cognitive bias where people with limited ability in a task overestimate their competence. AI has its own version of this.
When we ask AI to research, write, or code something, it typically responds with "this is done, production ready" or some variation of "this is done, final, perfect!" But it is not. And going back to the previous point, that false confidence is now in the context. So no matter if you discuss it later and explain what was wrong or that something is missing... it is already "done." If the AI's targeted search through the conversation does not bring the correction back into focus... there you go.
Expo.dev documented the same pattern [1]. Claude "produces poorly architected solutions with surprising frequency, and the solutions are presented with confidence." It never says "I am getting confused, maybe we should start over." It just keeps going, confidently wrong.
The METR study puts hard numbers on this [4]. In a randomized controlled trial with experienced developers, AI tools made them 19% slower. Not faster. Slower. But the developers still believed AI sped them up by 20%. The perception-reality gap is not just an AI problem. It is a human problem too. Both sides of the equation are miscalibrated.
The Training Data Problem
The information or memory that AI has is actually not all good. Much of it comes from "cowboy developers" who charge ahead and answer most social media questions, Stack Overflow threads, blog posts, and tutorials. And that is the training. That is the information AI learned from.
The same principle applies beyond code. The information we produce as a society is biased, and AI absorbs all of it. That is why you see discriminatory AI systems across industries. AI resume screeners favor white-associated names 85% of the time [5]. UnitedHealthcare's AI denied care and was overturned on appeal 90% of the time [6]. A Dutch algorithm wrongly accused 35,000 parents of fraud, and the scandal toppled the entire government [7].
For my own work, I create guides to counteract this. Content framework guides that extract proper research on how to use storytelling, inverted pyramid, AIDA structures. Tone guides with specific instructions. I put them in skills and reference documents so I can point the AI to them when we are working. And still I have to remind it. Every time.
What I See Every Day
I have seen AI do what it did in laollita.es across multiple projects. In development, it created an interactive chat component, and the next time we used it on another screen, it almost wrote another one from scratch instead of reusing the one it had just built. Same project. Same session sometimes.
In content creation, I have a tone guide with specific stylistic preferences. And I still have to explicitly ask the AI to review it. No matter how directive the language in the instructions is. "Always load this file before writing content." It does not always load the file.
And it is not just my experience.
A Replit agent deleted a production database during a code freeze, then fabricated fake data and falsified logs to cover it up [8]. Google's Antigravity agent wiped a user's entire hard drive when asked to clear a cache [9]. Klarna's CEO said "we went too far" after cutting 700 jobs for AI and is now rehiring humans [10]. Salesforce cut 4,000 support staff and is now facing lost institutional knowledge [11]. The pattern keeps repeating. Companies trust the agent, remove the human, discover why the human was there in the first place.
What This Means for AI Supervision
I am not against AI. I am writing this post on a system largely built with AI assistance. The tools I publish, the workflows I create, the content I produce. AI is deeply embedded in my work. It makes me more productive.
At Palcera, I believe AI is genuinely great for employees and companies. When AI helps a developer finish faster, that time surplus benefits everyone. The developer gets breathing room. The company gets efficiency. And the customer can get better value, better pricing, faster delivery. That is real. I see it every day.
But all of that requires the human in the loop. Questioning the choices. Asking "isn't CSS simpler?" Clearing the context when things go sideways. Pointing to the tone guide when the AI forgets. Starting fresh when the conversation gets poisoned with old mistakes.
The results are there. But under supervision. And that distinction matters more than most people realize.
References
[1] Expo.dev, "What Our Web Team Learned Using Claude Code for a Month"
[2] Adversa AI, "Cascading Failures in Agentic AI: OWASP ASI08 Security Guide 2026"
[3] Galileo, "7 AI Agent Failure Modes and How To Fix Them"
[4] METR, "Measuring the Impact of Early-2025 AI on Experienced Developer Productivity"
[5] The Interview Guys / University of Washington, "85% of AI Resume Screeners Prefer White Names"
[6] AMA, "How AI Is Leading to More Prior Authorization Denials"
[7] WBUR, "What Happened When AI Went After Welfare Fraud"
[8] The Register, "Vibe Coding Service Replit Deleted Production Database"
[9] The Register, "Google's Vibe Coding Platform Deletes Entire Drive"
[10] Yahoo Finance, "After Firing 700 Humans For AI, Klarna Now Wants Them Back"
[11] Maarthandam, "Salesforce Regrets Firing 4,000 Experienced Staff and Replacing Them with AI"
13 Feb 2026 6:20pm GMT
Dripyard Premium Drupal Themes: Dripyard's Meridian + Drupal CMS Webinar Recording is Up
Our webinar on Drupal CMS + Meridian theme is up on YouTube! In it, we talked about the new theme, demoed various example sites built with it, and ran through new components.
We also talked about our differences with Drupal CMS's built-in Byte theme and site template.
Enjoy!
13 Feb 2026 4:18pm GMT
1xINTERNET blog: 1xINTERNET and React Online join forces
1xINTERNET and React Online have joined forces: React Online will become the Dutch subsidiary of 1xINTERNET. Same great people, same trusted partnerships, now backed by a team of 90+ experts across Europe.
13 Feb 2026 12:00pm GMT
Droptica: Intelligent Taxonomy Mapping for AI-Powered Drupal Systems: A Practical Guide

Integrating AI with Drupal content creation works well for text fields, but taxonomy mapping remains a significant challenge. AI extracts concepts using natural language, while Drupal taxonomies require exact predefined terms, and the two rarely match. This article explores why common approaches like string matching and keyword mapping fail, and presents context injection as a production-proven solution that leverages AI's semantic understanding to select correct taxonomy terms directly from the prompt.
13 Feb 2026 11:00am GMT
ImageX: How to Manage Your Related Content in Drupal Instantly with Inline Entity Form
On a busy Drupal website, content rarely lives on its own in a silo. Presentations and webinars are linked to speakers, academic programs reference courses, events are tied to locations; the list goes on. Update one of these pieces on its own page, and the change shows up everywhere it's used, reflecting Drupal's strength as a coherent, interconnected system.
13 Feb 2026 12:19am GMT
12 Feb 2026
Drupal.org aggregator
Undpaul.de: Drupal’s Vision 2026: Why the Future of AI Is Structured, Secure, and Surprisingly Human
12 Feb 2026 6:18pm GMT
Talking Drupal: TD Cafe #014 - AmyJune and Avi - Navigating Community, Safety, and Accessibility
Join AmyJune and Avi as they discuss the complexities of organizing large events in changing times. The discussion covers topics from past DrupalCons, the crucial coordination behind community health and safety, accessibility, and the evolving challenges involving inclusivity. They also touch on the intersection of community dynamics, the importance of creating shared realities, and the engaging experience of the Drupal community. Additionally, expect an overview of upcoming events, including keynotes and fun activities like the Drupal Coffee Exchange.
For show notes visit: https://www.talkingDrupal.com/cafe014
Topics
- Catching Up with Abby and June
- Memories of DrupalCon and Camps
- The $2 Bill Tradition
- Open Y and Community Contributions
- Community Working Group and Governance
- Initial Reactions and Reflections
- Challenges of Organizing DrupalCon
- Accessibility and Safety Concerns
- Event Planning and Community Involvement
- Learning from Other Events
- Upcoming Keynote and Event Highlights
- Community and Collaboration
AmyJune Hineline
AmyJune works with the Linux Foundation as the Certification Community Architect, supporting the Education team in developing and maintaining exams and related documentation across the foundation's certification portfolio.
She's also a DrupalCamp organizer (Florida DrupalCamp, DrupalCamp Asheville, and DrupalCamp Colorado), a member of the Community Working Group's Conflict Resolution Team, and serves on the board of the Colorado Drupal Association.
Avi Schwab
Avi came to Drupal for the community and has been active in it since 2008. He is a founding organizer of MidCamp, Midwest Open Source Alliance, and the Event Organizer Working Group. In his role as a Technical Product Consultant at ImageX Media, he builds and supports Drupal sites for over 40 YMCA associations in the USA and Canada. For fun, he bikes, bakes, and enjoys time with his family.
Guests
AmyJune Hineline - volkswagenchick Avi Schwab - froboy
12 Feb 2026 12:00pm GMT
DDEV Blog: New `ddev share` Provider System: Cloudflare tunnel with no login or token

Sharing your local development environment with clients, colleagues, or testing services has always been a valuable DDEV feature. DDEV v1.25.0 makes this easier and more flexible than ever with a complete redesign of ddev share. The biggest news is that you can now share your projects for free using Cloudflare Tunnel, with no account signup or token setup required.
What Changed in ddev share
Previous versions of DDEV relied exclusively on ngrok for sharing. While ngrok remains a solid choice with advanced features, v1.25.0 introduces a modular provider system allowing more options and flexibility. DDEV now ships with two built-in providers:
- ngrok: The traditional option (requires free account and authtoken)
- cloudflared: A new, cost-free option using Cloudflare Tunnel (requires no account or token)
You can select providers via command flags, project configuration, or global defaults. Existing projects using ngrok continue working unchanged, and ngrok remains the default provider.
Free Sharing with Cloudflare Tunnel
Cloudflare Tunnel provides production-grade infrastructure for sharing your local environments at zero cost. After installing the cloudflared CLI, getting started takes a single command:
ddev share --provider=cloudflared
No account creation, no authentication setup, no subscription tiers; just immediate access to share your work. This removes barriers for individual developers and teams who need occasional sharing without the overhead of managing service accounts.
When should you use cloudflared vs ngrok? Use cloudflared for quick, free sharing during development and testing. Choose ngrok if you need stable subdomains, custom domains, or advanced features like IP allowlisting and OAuth protection. (However, if you control a domain registered at Cloudflare you can use that for stable domains. This will be covered in a future blog.)
Configuration Flexibility
You can set your preferred provider at multiple levels:
# Use a specific provider for this session
ddev share --provider=cloudflared
# Set default provider for the current project
ddev config --share-default-provider=cloudflared
# Set global default for all projects
ddev config global --share-default-provider=cloudflared
This flexibility lets you use different providers for different projects or standardize across your entire development setup.
Tip: Your CMS or framework may have a "trusted host patterns" configuration that denies access when the site is served from an unknown URL. You'll need to configure it to allow all URLs or the specific share URL. For example, in Drupal, $settings['trusted_host_patterns'] = ['.*']; or in TYPO3, 'trustedHostsPattern' => '.*.*'.
Automation for difficult CMSs using pre-share hooks and $DDEV_SHARE_URL
When you run ddev share, DDEV now exports the tunnel URL as the DDEV_SHARE_URL environment variable. This enables automation through hooks, particularly useful for integration testing, webhooks, or CI workflows that need the public URL.
WordPress Example
WordPress is always difficult because it embeds the site URL right in the database. The wp search-replace tool is the classic way to point a site at a different URL, and the hooks demonstrated below use it to make ddev share work even though the share URL is dynamic.
# .ddev/config.yaml
hooks:
pre-share:
# provide DDEV_SHARE_URL in container
- exec-host: echo "${DDEV_SHARE_URL}" >.ddev/share_url.txt
# Save database for restore later
- exec-host: ddev export-db --file=/tmp/tmpdump.sql.gz
# Change the URL in the database
- exec: wp search-replace ${DDEV_PRIMARY_URL} $(cat /mnt/ddev_config/share_url.txt) | grep Success
# Fix the wp-config-ddev.php to use the DDEV_SHARE_URL
- exec: cp wp-config-ddev.php wp-config-ddev.php.bak
- exec: sed -i.bak "s|${DDEV_PRIMARY_URL}|$(cat /mnt/ddev_config/share_url.txt)|g" wp-config-ddev.php
- exec: wp cache flush
post-share:
# Put back the things we changed
- exec: cp wp-config-ddev.php.bak wp-config-ddev.php
- exec-host: ddev import-db --file=/tmp/tmpdump.sql.gz
This approach works for any CMS that stores base URLs in its configuration or database. The pre-share hook updates URLs automatically, and you can use post-share hooks to restore them when sharing ends. This eliminates the manual configuration work that sharing previously required for many CMSs.
TYPO3 Example
TYPO3 usually puts the site URL into config/sites/*/config.yaml as base: <url>, and then it won't respond to the different URLs in a ddev share. The hooks here temporarily remove the base: element:
hooks:
pre-share:
# Make a backup of config/sites
- exec: cp -r ${DDEV_APPROOT}/config/sites ${DDEV_APPROOT}/config/sites.bak
- exec-host: echo "removing 'base' from site config for sharing to ${DDEV_SHARE_URL}"
# Remove `base:` from the various site configs
- exec: sed -i 's|^base:|#base:|g' ${DDEV_APPROOT}/config/sites/*/config.yaml
- exec-host: echo "shared on ${DDEV_SHARE_URL}"
post-share:
# Restore the original configuration
- exec: rm -rf ${DDEV_APPROOT}/config/sites
- exec: mv ${DDEV_APPROOT}/config/sites.bak ${DDEV_APPROOT}/config/sites
- exec-host: ddev mutagen sync
- exec-host: echo "changes to config/sites reverted"
Magento 2 Example
Magento 2 makes its base URL easy to change, so the hooks are pretty simple:
hooks:
pre-share:
# Switch magento to the share URL
- exec-host: ddev magento setup:store-config:set --base-url="${DDEV_SHARE_URL}"
post-share:
# Switch back to the normal local URL
- exec-host: ddev magento setup:store-config:set --base-url="${DDEV_PRIMARY_URL}"
Extensibility: Custom Share Providers
The new provider system is script-based, allowing you to create custom providers for internal tunneling solutions or other services. Place Bash scripts in .ddev/share-providers/ (project-level) or $HOME/.ddev/share-providers/ (global), and DDEV will recognize them as available providers.
For details on creating custom providers, see the sharing documentation.
An example of a share provider for localtunnel is provided in .ddev/share-providers/localtunnel.sh.example and you can experiment with it by just copying that to .ddev/share-providers/localtunnel.sh.
Questions
- Do I need to change anything in existing projects?
- No. Ngrok remains the default provider, so existing projects continue working without any changes. Your ngrok authtokens and configurations are fully compatible with v1.25+.
- When should I use cloudflared vs ngrok?
- Use cloudflared for quick, free sharing during development and testing. Use ngrok if you need stable subdomains, custom domains, or advanced features like IP allowlisting and OAuth protection.
- Can I create my own share provider?
- Yes! Place Bash scripts in .ddev/share-providers/ (project-level) or $HOME/.ddev/share-providers/ (global). See the sharing documentation for implementation details.
Try It Today
DDEV v1.25.0 is now available. Use the techniques above, and try out Cloudflared to see if you like it.
For complete details on the new sharing system, see the sharing documentation.
Join us on Discord, follow us on Mastodon, Bluesky, or LinkedIn, and subscribe to our newsletter for updates.
This blog was drafted and reviewed with the help of AI, including Claude Code.
12 Feb 2026 12:00am GMT
11 Feb 2026
Drupal.org aggregator
Drupal blog: Drupal's AI Roadmap for 2026
- This is cross-posted from Dries Buytaert's blog

For the past several months, the AI Initiative Leadership Team has been working with our contributing partners to define what the Drupal AI initiative should focus on in 2026. That plan is now ready, and I want to share it with the community.
This roadmap builds directly on the strategy we outlined in Accelerating AI Innovation in Drupal. That post described the direction. This plan turns it into concrete priorities and execution for 2026.
The full plan is available as a PDF, but let me explain the thinking behind it.
Producing consistently high-quality content and pages is really hard. Excellent content requires a subject matter expert who actually knows the topic, a copywriter who can translate expertise into clear language, someone who understands your audience and brand, someone who knows how to structure pages with your component library, good media assets, and an SEO/AEO specialist so people actually discover what you made.
Most organizations are missing at least some of these skillsets, and even when all the people exist, coordinating them is where everything breaks down. We believe AI can fill these gaps, not by replacing these roles but by making their expertise available to every content creator on the team.
For large organizations, this means stronger brand consistency, better accessibility, and improved compliance across thousands of pages. For smaller ones, it means access to skills that were previously out of reach: professional copywriting, SEO, and brand-consistent design without needing a specialist for each.
Used carelessly, AI just makes these problems worse by producing fast, generic content that sounds like everything else on the internet. But used well, with real structure and governance behind it, AI can help organizations raise the bar on quality rather than just volume.
Drupal has always been built around the realities of serious content work: structured content, workflows, permissions, revisions, moderation, and more. These capabilities are what make quality possible at scale. They're also exactly the foundation AI needs to actually work well.
Rather than bolting on a chatbot or a generic text generator, we're embedding AI into the content and page creation process itself, guided by the structure, governance, and brand rules that already live in Drupal.
For website owners, the value is faster site building, faster content delivery, smarter user journeys, higher conversions, and consistent brand quality at scale. For digital agencies, it means delivering higher-quality websites in less time. And for IT teams, it means less risk and less overhead: automated compliance, auditable changes, and fewer ad hoc requests to fix what someone published.
We think the real opportunity goes further than just adding AI to what we already have. It's also about connecting how content gets created, how it performs, and how it gets governed into one loop, so that what you learn from your content actually shapes what you build next.
The things that have always made Drupal good at content are the same things that make AI trustworthy. That is not a coincidence, and it's why we believe Drupal is the right place to build this.
What we're building in 2026
The 2026 plan identifies eight capabilities we'll focus on. Each is described in detail in the full plan, but here is a quick overview:
- Page generation - Describe what you need and get a usable page, built from your actual design system components
- Context management - A central place to define brand voice, style guides, audience profiles, and governance rules that AI can use
- Background agents - AI that works without being prompted, responding to triggers and schedules while respecting editorial workflows
- Design system integration - AI that builds with your components and can propose new ones when needed
- Content creation and discovery - Smarter search, AI-powered optimization, and content drafting assistance
- Advanced governance - Batch approvals, branch-based versioning, and comprehensive audit trails for AI changes
- Intelligent website improvements - AI that learns from performance data, proposes concrete changes, and gets smarter over time through editorial review
- Multi-channel campaigns - Create content for websites, social, email, and automation platforms from a single campaign goal
These eight capabilities are where the official AI Initiative is focusing its energy, but they're not the whole picture for AI in Drupal. There is a lot more we want to build that didn't make this initial list, and we expect to revisit the plan in six months to a year.
We also want to be clear: community contributions outside this scope are welcome and important. Work on migrations, chatbots, and other AI capabilities continues in the broader Drupal community. If you're building something that isn't in our 2026 plan, keep going.
How we're making this happen
Over the past year, we've brought together organizations willing to contribute people and funding to the AI initiative. Today, 28 organizations support the initiative, collectively pledging more than 23 full-time equivalent contributors. That is over 50 individual contributors working across time zones and disciplines.
Coordinating 50+ people across organizations takes real structure, so we've hired two dedicated teams from among our partners:
- QED42 is focused on innovation, pushing forward on what is next.
- 1xINTERNET is focused on productization, taking what we've built and making it stable, intuitive, and easy to install.
Both teams are creating backlogs, managing issues, and giving all our contributors clear direction. You can read more about how contributions are coordinated.
This is a new model for Drupal. We're testing whether open source can move faster when you pool resources and coordinate professionally.
Get involved
If you're a contributing partner, we're asking you to align your contributions with this plan. The prioritized backlogs are in place, so pick up something that fits and let's build.
If you're not a partner but want to contribute, jump in. The prioritized backlogs are open to everyone.
And if you want to join the initiative as an official partner, we'd absolutely welcome that.
This plan wasn't built in a room by itself. It's the result of collaboration across 28 sponsoring organizations who bring expertise in UX, core development, QA, marketing, and more. Thank you.
We're building something new for Drupal, in a new way, and I'm excited to see where it goes.
- Dries Buytaert
11 Feb 2026 8:10pm GMT
Drupal Association blog: Drupal's AI Roadmap for 2026
- This is cross-posted from Dries Buytaert's blog

11 Feb 2026 8:10pm GMT
Drupal AI Initiative: From Strategy to Execution: How the Drupal AI Initiative is Scaling Delivery for 2026
Scaling the Drupal AI Initiative
The Drupal AI Initiative officially launched in June 2025 with the release of the Drupal AI Strategy 1.0 and a shared commitment to advancing AI capabilities in an open, responsible way. What began as a coordinated effort among a small group of committed organizations has grown into a substantial, sponsor-funded collaboration across the Drupal ecosystem.
Today, 28 organizations support the initiative, collectively pledging more than 23 full-time equivalent contributors representing over 50 individual contributors working across time zones and disciplines. Together, sponsors have committed more than $1.5 million in combined cash and in-kind contributions to move Drupal AI forward.
The initiative now operates across multiple focused areas, including leadership, marketing, UX, QA, core development, innovation, and product development. Contributors are not only exploring what's possible with AI in Drupal, but are building capabilities designed to be stable, well-governed, and ready for real-world adoption in Drupal CMS.
Eight months in, this is more than a collection of experiments. It is a coordinated, community-backed investment in shaping how AI can strengthen content creation, governance, and measurable outcomes across the Drupal platform.
Strengthening Delivery to Support Growth
As outlined in the 2026 roadmap, this year focuses on delivering eight key capabilities that will shape how AI works in Drupal CMS. Achieving that level of focus and quality requires more than enthusiasm and good ideas. It requires coordination at scale.
From the beginning, sponsors contributed both people and funding so the initiative could be properly organized and managed. With 28 organizations contributing more than 23 people across multiple workstreams, it was clear that sustained progress would depend on dedicated delivery management to align priorities, organize backlogs, support contributors, and maintain predictable execution.
To support this growth, the initiative ran a formal Request for Proposal (RFP) process to select delivery management partners to help coordinate work across both innovation and product development workstreams. This was not a shift in direction, but a continuation of our original commitment: to build AI capabilities for Drupal in a way that is structured, sustainable, and ready for real-world adoption.
Selecting Partners to Support Our Shared Goals
To identify the right delivery partners, we launched the RFP process in October 2025 at DrupalCon Vienna. The RFP was open exclusively to sponsors of the Drupal AI Initiative. From the start, our goal was to run a process that reflected the responsibility we carry as a sponsor-funded, community-driven initiative.
The timeline included a pre-proposal briefing, an open clarification period, and structured review and interview phases. Proposals were independently evaluated against clearly defined criteria tailored to both innovation and production delivery. These criteria covered governance, roadmap and backlog management, delivery approach, quality assurance, financial oversight, and demonstrated experience contributing to Drupal and AI initiatives.
Following an independent review, leadership held structured comparison sessions to discuss scoring, explore trade-offs, clarify open questions, and ensure decisions were made thoughtfully and consistently. Final discussions were held with shortlisted vendors in December, and contracts were awarded in early January.
The selected partners are engaged for an initial six-month period. At the end of that term, the RFP process will be repeated.
This process was designed not only to select capable partners but to steward sponsor contributions responsibly and align with Drupal's values of openness, collaboration, and accountability.
Delivery Partners Now in Place
Following the structured selection process, two contributing partners were selected to support delivery across the initiative's key workstreams.
QED42 will focus on the Innovation workstream, helping coordinate forward-looking capabilities aligned with the 2026 roadmap. QED42 has been an active contributor to Drupal AI efforts from the earliest stages and has played a role in advancing AI adoption across the Drupal ecosystem. Their contributions to initiatives such as Drupal Canvas AI, AI-powered agents, and other community-driven efforts demonstrate both technical depth and a strong commitment to open collaboration. In this role, QED42 will support structured experimentation, prioritization, and delivery alignment across innovation work.
1xINTERNET will lead the Product Development workstream, supporting the transition of innovation into stable, production-ready capabilities within Drupal CMS. As a founding sponsor and co-leader within the initiative, 1xINTERNET brings deep experience in distributed Drupal delivery and governance. Their longstanding involvement in Drupal AI and broader community leadership positions them well to guide roadmap execution, release planning, backlog coordination, and predictable productization.
We are grateful to QED42 and 1xINTERNET for their continued commitment to the initiative and for stepping into this role in service of the broader Drupal community. We also want to acknowledge the strong level of interest in this RFP and the high standard of submissions received, and to thank all participating organizations for the time, thought, and care invested in the process. The level of interest and quality of submissions reflect the caliber of agencies and contributors engaged in advancing Drupal AI.
Both organizations were selected not only for their delivery expertise but for their demonstrated investment in Drupal AI and their alignment with the initiative's goals. Their role is to support coordination, roadmap alignment, and disciplined execution across contributors, ensuring that sponsor investment and community effort translate into tangible, adoptable outcomes.
Contracts began in early January. Two development sprints have already been completed, and a third sprint is now underway, establishing a clear and predictable delivery cadence.
QED42 and 1xINTERNET will share more details about their processes and early progress in an upcoming blog post.
Ready to Deliver on the 2026 Roadmap
With the 2026 roadmap now defined and structured delivery teams in place, the Drupal AI Initiative is positioned to execute with greater clarity and focus. The eight capabilities outlined in the one-year plan provide direction. Dedicated delivery management provides the coordination needed to turn that direction into measurable progress.
Predictable sprint cycles, clearer backlog management, and improved cross-workstream alignment allow contributors to focus on building, refining, and shipping capabilities that can be adopted directly within Drupal CMS. Sponsor investment and community contribution are now supported by a delivery model designed for scale and sustainability.
This next phase is about disciplined execution. It means shipping stable, well-governed AI capabilities that site owners can enable with confidence. It means connecting innovation to production in a way that reflects Drupal's strengths in structure, governance, and long-term maintainability.
We are grateful to the sponsors and contributors who have made this possible. As agencies and organizations continue to join the initiative, we remain committed to transparency, collaboration, and delivering meaningful value to the broader Drupal community.
We are entering a year of focused execution, and we are ready to deliver.
Moving Forward Together
The Drupal AI Initiative is built on collaboration. Sponsors contribute funding and dedicated team members. Contributors bring expertise across UX, core development, QA, marketing, innovation, and production. Leadership provides coordination and direction. Together, this shared investment makes meaningful progress possible.
We extend our thanks to the 28 sponsoring organizations and the more than 50 contributors who are helping shape the future of AI in Drupal. Their commitment reflects a belief that open source can lead in building AI capabilities that are stable, governed, and built for real-world use.
As we move into 2026, we invite continued participation. Contributing partners are encouraged to align their work with the roadmap and engage in the active workstreams. Organizations interested in joining the initiative are welcome to connect and explore how they can contribute.
We have laid the foundation. The roadmap is clear. Structured delivery is in place. With continued collaboration, we are well-positioned to deliver meaningful AI capabilities for the Drupal community and the organizations it serves.
11 Feb 2026 7:51pm GMT
Dries Buytaert: Drupal's AI roadmap for 2026

11 Feb 2026 7:41pm GMT
The Drop Times: EverLMS Offers a Self-Hosted Enterprise LMS Built on Drupal
EverLMS, developed by Hai Nguyen, is an open-source, Drupal-based learning management framework designed for self-hosted deployment. Built for agencies and enterprises, it integrates AI-assisted authoring tools, SCORM and H5P support, structured reporting, and role-based access control within a single-tenant architecture. Rather than functioning as an LMS add-on, EverLMS presents a full learning system built directly on Drupal.
11 Feb 2026 4:42pm GMT
Droptica: WordPress vs Drupal – Comparing 5 Key Tools and Their Equivalents

Switching from WordPress to Drupal raises many concerns. Will the migration be too complicated? Will I find equivalents for the tools I use every day? In this article, I compare five of the most important WordPress tools with their Drupal counterparts: Custom Post Types, ACF, WP Query, WP Forms, and Page Builders like Elementor. For each one, I show what working in Drupal looks like with real examples and a live demo in Drupal CMS. After reading this post, you'll see that Drupal isn't hard at all, and concepts familiar from WordPress translate to Drupal almost one-to-one. Feel free to read the article or watch the episode from the Nowoczesny Drupal series.
11 Feb 2026 12:10pm GMT
DrupalCon News & Updates: What’s going on in Chicago?
The Drupal Community Working Group (CWG) recently posted about Health and Safety at DrupalCon Chicago. We encourage all attendees to review their blog post and the updated DrupalCon Chicago 2026 Health & Safety information.
Have questions or concerns about DrupalCon Chicago? Feel free to drop by the Community Working Group's public office hours this Friday, February 13 at 10am ET / 1200 UTC.
Join the #community-health Drupal Slack channel for more information. A meeting link will be posted there a few minutes before office hours.
Chicago was booked as the venue for DrupalCon North America in late 2024. Since then, there has been a lot of news from Chicago.
Chicago has been the target of "enhanced" immigration enforcement operations by the US federal government under the name "Operation Midway Blitz". Local news outlet Block Club Chicago has written in-depth about the ongoing situation:
- Chicago Under Siege: How Operation Midway Blitz Changed Our City
- Feds Used Chemical Weapons On Chicagoans At Least 49 Times - Even After Judge Said To Stop
While the government shifted the bulk of its operation to Minneapolis in December, operations continue around the city and across the country. We encourage attendees to inform themselves on the political climate in the USA before making the decision to attend DrupalCon.
Put very simply: If you feel unsafe attending an event in the United States, please do not attend. We regret the impact the current climate is having on our international community, LGBTQIA+ community, and others, and hope the US can be safe and welcoming for you in the future once again.
More information can be found at Chicago local news outlets.
The CWG and the volunteer DrupalCon Steering Committee are monitoring the situation.
If you choose to travel to DrupalCon from abroad, we encourage you to take the following steps to ensure your trip is as safe as possible:
- Even if it is not required for your visa, request a Visa Letter and carry it with your travel documents at all times when in transit and at the event.
- Review information from the City of Chicago's Protecting Chicago Initiative.
- Save the information for the ICIRR Family Support Network & Hotline in your phone. 1-855-HELP-MY-FAMILY (1-855-435-7693)
- Know your rights before you travel.
Need Help?
If you have a harassment concern or need to report a Code of Conduct violation, notify the event staff or contact us at conduct@association.drupal.org.
If you need help resolving a conflict, contact the Drupal Community Working Group: drupal-cwg@drupal.org.
In case of emergency, call 9-1-1.
11 Feb 2026 1:30am GMT
