16 Mar 2026
Django community aggregator: Community blog posts
Built with Django - Weekly Roundup (Mar 09-Mar 16, 2026)
Hey, Happy Monday!
Why are you getting this: You signed up to receive this newsletter on Built with Django. I promised to send you the latest projects and jobs on the site as well as any other interesting Django content I encountered during the month. If you don't want to receive this newsletter, feel free to unsubscribe anytime.
Sponsor
This issue is sponsored by TuxSEO - your AI content team on auto-pilot.
- Plan and ship SEO content faster
- Generate practical, publish-ready drafts
- Keep your content pipeline moving every week
Projects
- Table of Contents Generator - Generate a clickable Table of Contents for any PDF in seconds. Free, private, no account required. Works with reports, eBooks, manuals, and papers.
Jobs
From the Community
- Django Admin: Building a Production-Ready Back Office Without Starting From Scratch | Medium by Anas Issath
- Beginner's Guide to Open Source Contribution | Djangonaut Space 2026 - DEV Community
- Django Tutorial - GeeksforGeeks
Support
You can support this project by using one of the affiliate links below. These are always going to be projects I use and love! No "Bluehost" crap here!
- Buttondown - Email newsletter tool I use to send you this newsletter.
- Readwise - Best reading software company out there. If you want to up your e-reading game, this is definitely for you! It also so happens that I work for Readwise. Best company out there!
- Hetzner - IMHO the best place to buy a VPS or a server for your projects. I'll be doing a tutorial on how to use this in the future.
- SaaS Pegasus is one of the best (if not the best) ways to quickstart your Django Project. If you have a business idea but don't want to set up all the boring stuff (Auth, Payments, Workers, etc.) this is for you!
16 Mar 2026 6:00pm GMT
15 Mar 2026
How I deploy my projects to a single VPS with Gitea, NGINX and Docker
Hello everyone!
A few weeks ago, the team behind Jmail (a Gmail-styled interface for browsing the publicly released Epstein files) shared that they had racked up a $46,485 bill on Vercel. The site had gone viral with ~450 million pageviews, and Vercel's pricing structure turned that into a five-figure invoice. Vercel's CEO ended up covering the bill personally, which is nice, but not exactly a scalable solution.
When I saw that story, my first thought was: this is an efficiency problem. Jmail is essentially a search interface on top of mostly static content. An SRE on Hacker News mentioned they handle 200x Jmail's request load on just two Hetzner servers. The whole thing could have been served from a moderately sized VPS for a fraction of the cost.
That got me thinking about my own setup. I run everything on a single VPS: my blog, my side projects, my git server, analytics, a wiki, a forum, a secret sharing tool, and more. The whole thing is held together by NGINX, Gitea, some bash scripts, and Docker. No Kubernetes, no Terraform, no CI/CD platform with a $500/month bill. Just a cheap VPS, some config files, and a deployment flow that's simple enough that I can fix it from my phone at the beach (I've written about that before).
I get asked about my deployment setup more often than I expected, so I figured I'd write it all down. Let me walk you through the whole thing.
The VPS
I'm running a Hetzner Cloud CPX21 in Nuremberg, Germany. Here are the specs:
| Spec | Value |
|---|---|
| vCPUs | 3 |
| RAM | 4 GB |
| Disk | 80 GB SSD |
| OS | Ubuntu |
| Price | ~€7-8/month |
The CPX21 is one of Hetzner's shared vCPU instances. It's cheap, reliable, and more than enough for what I need. I'm usually sitting at around ~10% CPU and ~2GB RAM, so there's plenty of headroom.
I set up the VPS manually. No Ansible, no configuration management, just plain old SSH and installing things by hand. I know, I know, "infrastructure as code" and all that. But for a single server that I manage myself, the overhead of automating the setup isn't worth it. If the server dies, I can set it up again in a couple of hours and restore from backups.
What's running on it
Here's everything running on this single VPS:
Bare metal (directly on the server)
| Service | Purpose |
|---|---|
| Gitea | Self-hosted git server |
| NGINX | Web server / reverse proxy |
| Certbot | SSL/TLS certificates |
| PHP-FPM | For WordPress sites |
| DokuWiki | Personal wiki |
| fail2ban | Brute force protection |
| UFW | Firewall |
| A couple WordPress sites | Various projects |
Docker
| Service | Purpose |
|---|---|
| ntfy | Push notifications |
| shhh | Secret sharing |
| SearXNG | Privacy-respecting search engine |
| WireGuard | VPN |
| phpBB | YAMS community forum |
| Umami | Privacy-respecting analytics |
| Gitea Actions runner | CI/CD runner |
| Watchtower | Automatic Docker image updates |
Static sites (Hugo, served by NGINX)
| Site | Purpose |
|---|---|
| rogs.me | This blog! |
| montevideo.restaurant | Restaurant directory |
| yams.media | YAMS documentation site |
That's a lot of stuff for a 4GB VPS. But static sites are basically free in terms of resources, and the Docker services are all lightweight. The heaviest things are probably Gitea and the WordPress sites, and even those barely register.
The web server: NGINX
Every site and service gets its own NGINX config file in /etc/nginx/conf.d/. One file per site, nice and clean. No sites-available / sites-enabled symlink dance.
Here's what a typical config looks like for one of my Hugo sites:
server {
root /var/www/rogs.me;
index index.html;
server_name rogs.me;
location / {
try_files $uri $uri/ =404;
}
listen 443 ssl; # managed by Certbot
ssl_certificate /etc/letsencrypt/live/rogs.me/fullchain.pem;
ssl_certificate_key /etc/letsencrypt/live/rogs.me/privkey.pem;
include /etc/letsencrypt/options-ssl-nginx.conf;
ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem;
}
server {
if ($host = rogs.me) {
return 301 https://$host$request_uri;
}
server_name rogs.me;
listen 80;
return 404;
}
Nothing fancy. Serve files from /var/www/rogs.me, redirect HTTP to HTTPS, done. The SSL bits are all managed by Certbot (more on that later).
For Docker services, the config looks slightly different because NGINX acts as a reverse proxy:
server {
server_name analytics.rogs.me;
location / {
proxy_pass http://localhost:3000;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
}
listen 443 ssl; # managed by Certbot
# ... SSL config same as above
}
Same pattern: one file per service, NGINX handles SSL termination, and proxies to whatever port the Docker container exposes on localhost.
SSL/TLS with Let's Encrypt
All certificates come from Let's Encrypt via Certbot. I installed it with apt and used the NGINX plugin:
sudo apt install certbot python3-certbot-nginx
sudo certbot --nginx -d rogs.me
Certbot modifies the NGINX config automatically to add the SSL directives (that's why you see those # managed by Certbot comments).
Certificates auto-renew daily at 3 AM via a cron job:
0 3 * * * certbot renew -q
The -q flag keeps it quiet: no output unless something goes wrong. Certbot is smart enough to only renew certificates that are close to expiring, so running it daily is fine.
Self-hosted git with Gitea
I use Gitea as my primary git server. It runs bare metal on the VPS (not in Docker) and lives at git.rogs.me.
Why Gitea instead of just using GitHub? I want to own my git infrastructure. GitHub is great for collaboration, but I like having control over where my code lives. If GitHub goes down or decides to change their terms, my repos are safe on my own server.
That said, I mirror everything to both GitHub and GitLab so other people can collaborate, open issues, and submit PRs. Best of both worlds: I own the primary, and the mirrors handle the social coding side.
Gitea Actions
Gitea has a built-in CI/CD system called Gitea Actions that's compatible with GitHub Actions workflows. The runner is the official gitea/act_runner Docker image, running on the same VPS. Pretty vanilla setup, no custom configuration.
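The post doesn't show the runner's configuration, but a minimal docker-compose sketch for the official image could look like this (the instance URL and registration token are placeholders; I'm assuming the documented environment variables for the act_runner image):

```
# Hypothetical compose file for a Gitea Actions runner
services:
  runner:
    image: gitea/act_runner:latest
    restart: unless-stopped
    environment:
      GITEA_INSTANCE_URL: "https://git.example.com"
      GITEA_RUNNER_REGISTRATION_TOKEN: "REPLACE_ME"
    volumes:
      - ./data:/data
      # The runner spawns job containers via the host's Docker daemon
      - /var/run/docker.sock:/var/run/docker.sock
```

The registration token comes from the Gitea admin UI (Site Administration, Actions, Runners).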
This is the core of my deployment pipeline. Every time I push to master, Gitea Actions picks up the workflow and deploys the site.
Deploying Hugo sites
This is where it all comes together. All three of my Hugo sites follow the exact same deployment pattern. Here's the flow:
┌──────────┐   push    ┌──────────┐   Gitea Actions   ┌──────────┐
│  Local   │ ────────▶ │  Gitea   │ ────────────────▶ │  Runner  │
│ machine  │           │(git.rogs)│                   │ (Docker) │
└──────────┘           └──────────┘                   └────┬─────┘
                                                           │
                                                  SSH into same VPS
                                                           │
                                                           ▼
                                                     ┌──────────┐
                                                     │   VPS    │
                                                     │ git pull │
                                                     │ build.sh │
                                                     └────┬─────┘
                                                          │
                                                  Hugo builds to
                                                  /var/www/domain/
                                                          │
                                                          ▼
                                                    ┌──────────┐
                                                    │  NGINX   │
                                                    │  serves  │
                                                    └──────────┘
Yes, the Gitea Actions runner SSHes into the same server it's running on. I know that's a bit redundant, but I designed it this way on purpose: if I ever move my hosting somewhere else (or switch back to GitHub Actions), the workflow doesn't need to change. The SSH target is just a secret, so I swap an IP address and everything keeps working.
The Gitea Actions workflow
Here's the workflow file that lives in .gitea/workflows/deploy.yml in each repo:
name: deploy
on:
push:
branches:
- master
jobs:
deploy:
runs-on: ubuntu-latest
steps:
- name: Deploy via SSH
uses: appleboy/ssh-action@v1
with:
host: ${{ secrets.SSH_HOST }}
username: ${{ secrets.SSH_USER }}
key: ${{ secrets.SSH_PRIVATE_KEY }}
port: ${{ secrets.SSH_PORT }}
script: |
cd repo && git stash && git pull --force origin master && ./build.sh
It's beautifully simple:
- Push to master triggers the workflow
- The runner uses appleboy/ssh-action to SSH into the server
- On the server: stash any local changes, pull the latest code, and run the build script
The git stash is there as a safety net. The WebP conversion in the build script modifies tracked files (more on that in a second), so without the stash, git pull would complain about a dirty working tree.
All four secrets (SSH_HOST, SSH_USER, SSH_PRIVATE_KEY, SSH_PORT) are configured in Gitea's repository settings. The SSH key has access to the server but is locked down to only what the deployment needs.
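One common way to lock a deploy key down like this (my sketch, not necessarily the author's exact setup) is a forced command in authorized_keys, so the key can only ever trigger the deployment (the key, user, and paths below are illustrative):

```
# ~/.ssh/authorized_keys on the VPS: whatever the client asks to run,
# this key always executes the fixed deploy command and nothing else.
command="cd /home/deploy/repo && git stash && git pull --force origin master && ./build.sh",no-port-forwarding,no-agent-forwarding,no-X11-forwarding,no-pty ssh-ed25519 AAAA...example deploy-key
```

Note that this changes the contract slightly: with a forced command, the script block in the workflow is effectively ignored and the server decides what runs, which is arguably even safer if the CI system is ever compromised.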
The build script
Every Hugo site has a build.sh in the repo root. Here's the one for this blog:
#!/bin/bash
# Convert all images to WebP for better performance
for file in $(git ls-files --others --cached --exclude-standard \
| grep -v '.git' \
| grep -E '\.(png|jpg|jpeg)$'); do
cwebp -lossless "$file" -o "${file%.*}.webp"
done
# Update all references from png/jpg/jpeg to webp
for tracked_file in $(git ls-files --others --cached --exclude-standard \
| grep -v '.git'); do
sed -i 's/\.png/.webp/g' "$tracked_file"
sed -i 's/\.jpeg/.webp/g' "$tracked_file"
sed -i 's/\.jpg/.webp/g' "$tracked_file"
done
# Build the site
hugo -s . -d /var/www/rogs.me/ --minify --cacheDir $PWD/hugo-cache
Three things happen here:
- Image optimization: Every PNG, JPG, and JPEG gets converted to WebP using cwebp (lossless mode, so no quality loss). WebP files are significantly smaller than their originals.
- Reference rewriting: All file references get updated from .png/.jpg/.jpeg to .webp. This is why we need git stash in the workflow; this step modifies tracked files.
- Hugo build: Generates the static site with minification enabled and outputs it directly to /var/www/rogs.me/. NGINX is already configured to serve from that directory, so the site is live immediately.
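The WebP filenames come from a bash parameter expansion in the build script: ${file%.*} strips the shortest trailing match of ".*", i.e. the final extension. A quick illustration:

```shell
# ${file%.*} removes the last extension; appending .webp gives
# the converted file's name, as used in the cwebp loop above.
file="static/images/photo.jpg"
webp="${file%.*}.webp"
echo "$webp"   # -> static/images/photo.webp
```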
The --cacheDir flag keeps Hugo's build cache in the repo directory, which speeds up subsequent builds.
Each site's build.sh is essentially identical, just with a different output path (montevideo.restaurant, yams.media, etc.).
Variations across sites
While the pattern is the same, there are small differences:
- yams.media has a two-job workflow: a test_build job runs Hugo in a Docker container first to make sure the build succeeds, and only then does the deploy job run. This is because the YAMS docs site has more contributors, so I want to catch build errors before they hit production.
- yams.media also uses the --cleanDestinationDir and --gc flags for a cleaner build output.
Docker services and Watchtower
Most of my non-static services run in Docker with docker-compose. Each service has its own directory in /opt/:
/opt/
├── analytics.rogs.me/    # Umami
│   └── docker-compose.yml
├── ntfy/
│   └── docker-compose.yml
├── shhh/
│   └── docker-compose.yml
├── searx/
│   └── docker-compose.yml
└── ...
For updates, I use Watchtower. It runs as a Docker container itself and periodically checks if there are newer images available for my running containers. If there are, it pulls the new image, stops the old container, and starts a new one with the same configuration.
version: "3"
services:
watchtower:
image: containrrr/watchtower
volumes:
- /var/run/docker.sock:/var/run/docker.sock
restart: unless-stopped
Is this a bit risky? Sure. An automatic update could break something. But in practice, it hasn't failed me once, and the services I'm running are stable enough that breaking changes in Docker images are rare. For a personal setup, the convenience of never having to manually update containers is worth the small risk.
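If one container ever proves too fragile for unattended updates, Watchtower can be told to skip it with its opt-out label. A compose-file sketch (the service name and image are placeholders):

```
services:
  fragile-service:
    image: example/image:latest
    labels:
      # Watchtower leaves this container alone; everything else
      # keeps updating automatically.
      - "com.centurylinklabs.watchtower.enable=false"
```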
Security
I'm not running a bank here, but I do take basic security seriously:
- UFW (Uncomplicated Firewall): Only NGINX ports (80, 443) and SSH are open. Everything else is blocked.
- fail2ban: Watches SSH logs and bans IPs after too many failed login attempts. Essential if your SSH port is exposed to the internet.
- SSH keys only: Password authentication is disabled. If you don't have the key, you're not getting in.
- Let's Encrypt everywhere: Every site and service gets HTTPS. No exceptions.
- Docker services on localhost: All Docker containers bind to localhost. They're only accessible through the NGINX reverse proxy, which handles SSL termination.
# Quick UFW setup
sudo ufw default deny incoming
sudo ufw default allow outgoing
sudo ufw allow 'Nginx Full'
sudo ufw allow ssh
sudo ufw enable
DNS
All my domains use Cloudflare for DNS. But only DNS for most of them. I'm not using Cloudflare's CDN or proxy features on my main sites. The DNS records point directly to my VPS IP with the proxy toggle set to "DNS only" (the grey cloud, not the orange one).
Why Cloudflare for DNS? Two reasons. First, it's free, fast, and the dashboard is easy to use. Second, and more importantly: if something goes wrong, I can switch to using Cloudflare's full proxy and DDoS protection with the flick of a button. Just toggle the grey cloud to orange and you're behind Cloudflare's network instantly.
I've already had to do this once. forum.yams.media (the YAMS community forum) was getting DDoSed and swarmed by bots constantly. Flipping that toggle to orange solved the problem immediately. The rest of my sites run without Cloudflare's proxy because they don't need it, but knowing I can turn it on in seconds gives me peace of mind.
Backups
This is the part that most people skip. Don't be most people.
My backup strategy has two stages:
┌───────────────┐   11 PM cron   ┌───────────────────┐
│      VPS      │ ─────────────▶ │  /home/backups/   │
│  (services)   │   tar + GPG    │ (encrypted .gpg)  │
└───────────────┘                └─────────┬─────────┘
                                           │
                                    midnight cron
                                     (SSH pull)
                                           │
                                           ▼
                                 ┌──────────────────┐
                                 │   Home Server    │
                                 │   (NAS + S3)     │
                                 └──────────────────┘
Stage 1: Backup on the VPS (11 PM)
Every night at 11 PM, a series of cron jobs run backup scripts for each service. Each script follows the same pattern:
#!/bin/bash
BACKUP_DIR="/home/backups/servicename"
TARGET_DIR="/path/to/service"
DATE=$(date +%Y-%m-%d-%s)
BACKUP_FILE="$BACKUP_DIR/backup-servicename-$DATE.tar.zst"
ENCRYPTED_FILE="$BACKUP_FILE.gpg"
LOG_FILE="/var/log/backup_servicename.log"
GPG_RECIPIENT="your-email@example.com"
log_message() {
echo "$(date +'%Y-%m-%d %H:%M:%S') - $1" | tee -a "$LOG_FILE"
}
log_message "=== Starting backup ==="
mkdir -p "$BACKUP_DIR"
# For Docker services: stop containers first
docker compose stop
# Create compressed archive
tar -caf "$BACKUP_FILE" -C "$TARGET_DIR" .
# Encrypt with GPG
gpg --encrypt --armor -r "$GPG_RECIPIENT" -o "$ENCRYPTED_FILE" "$BACKUP_FILE"
rm -f "$BACKUP_FILE" # Remove unencrypted version
# For Docker services: restart containers
docker compose up -d
log_message "=== Backup completed ==="
Key points:
- Compression: I use tar.zst (Zstandard) for compression. It's faster than gzip and produces smaller files.
- Encryption: Every backup gets GPG-encrypted before it touches the network. Even if someone gets access to the backup files, they're useless without my private key.
- Docker services: For services running in Docker, the script stops the containers before backing up to ensure data consistency, then starts them again. This causes a brief downtime (usually a few seconds), which is fine for personal services at 11 PM.
- Database dumps: For services with databases (like Gitea, which uses MySQL), the script dumps the database separately with mysqldump before creating the archive.
- Logging: Every step is logged to /var/log/, so I can check if something went wrong.
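One thing the script above doesn't show is retention: it keeps writing new archives forever. A minimal pruning step (hypothetical, not part of the original scripts; a temp directory stands in for the real backup path) could look like this:

```shell
# Stand-in for /home/backups/servicename
BACKUP_DIR=$(mktemp -d)
touch -d '40 days ago' "$BACKUP_DIR/backup-old.tar.zst.gpg"
touch "$BACKUP_DIR/backup-new.tar.zst.gpg"

# Keep 30 days of encrypted backups, delete anything older
find "$BACKUP_DIR" -name '*.tar.zst.gpg' -mtime +30 -delete
```

Dropped into the nightly script, this caps disk usage on the VPS while the NAS and S3 copies keep the long-term history.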
Stage 2: Pull to home server (midnight)
At midnight, my home server SSHes into the VPS and pulls all the encrypted backup files to my local NAS. From there, they also get pushed to an S3 bucket.
This gives me the classic 3-2-1 backup strategy: 3 copies of the data (VPS, NAS, S3), on 2 different media types, with 1 offsite copy. If Hetzner's datacenter burns down, I have everything locally. If my house burns down, I have everything in S3.
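On the home-server side, that midnight pull can be as simple as a single cron entry. A sketch (the host alias, NAS path, and bucket name are all hypothetical):

```
# Pull encrypted backups from the VPS to the NAS, then mirror to S3
0 0 * * * rsync -az vps:/home/backups/ /mnt/nas/vps-backups/ && aws s3 sync /mnt/nas/vps-backups/ s3://example-backups/vps/
```

Because the archives are already GPG-encrypted on the VPS, neither the NAS nor the S3 bucket ever sees plaintext data.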
Monitoring
I run Uptime Kuma on my home server to monitor all my services. It checks every site and service periodically and sends me a notification (via ntfy, naturally) if something goes down.
It's not fancy, but it works. I've caught a few issues before anyone else noticed them, which is the whole point.
The big picture
Here's what the whole setup looks like:
┌──────────────────────────────────────────────────────────┐
│                      Hetzner CPX21                       │
│                                                          │
│  ┌──────────┐   ┌────────────────────────────────────┐   │
│  │  Gitea   │   │               NGINX                │   │
│  │  Actions │   │  ┌──────────┐   ┌──────────────┐   │   │
│  │  Runner  │   │  │  Static  │   │  Reverse     │   │   │
│  │ (Docker) │   │  │  sites   │   │  proxy to    │   │   │
│  └────┬─────┘   │  │/var/www/ │   │  Docker svcs │   │   │
│       │         │  └──────────┘   └──────────────┘   │   │
│       │ SSH     │        ▲               │           │   │
│       │         └────────┼───────────────┼───────────┘   │
│       │                  │               │               │
│       ▼                  │               ▼               │
│  ┌─────────┐         ┌───────┐     ┌───────────┐         │
│  │   Git   │──build─▶│ Hugo  │     │  Docker   │         │
│  │  repos  │         │ sites │     │ services  │         │
│  └─────────┘         └───────┘     └───────────┘         │
│                                                          │
│  ┌─────────────┐   ┌──────────┐   ┌────────────┐         │
│  │    Gitea    │   │ Certbot  │   │  fail2ban  │         │
│  │ (bare metal)│   │  (SSL)   │   │   + UFW    │         │
│  └─────────────┘   └──────────┘   └────────────┘         │
└──────────────────────────────────────────────────────────┘
Conclusion
The whole philosophy here is simplicity. There's no orchestration tool, no container registry, no deployment platform. It's just:
- Push code to Gitea
- A workflow SSHes into the server
- Git pull + bash script builds the site
- NGINX serves it
Could I make this more sophisticated? Sure. Could I use Ansible to manage the server config, or Kubernetes to orchestrate the containers, or a proper CI/CD platform with build artifacts and rollbacks? Absolutely. But for a personal setup that hosts a blog, some side projects, and a handful of services, this is more than enough.
The setup has been running for years with minimal maintenance. The most time I spend on it is writing backup scripts for new services and adding NGINX configs when I deploy something new. Everything else is automated: deployments, SSL renewals, Docker updates, backups.
If you're thinking about self-hosting your projects, my advice is: start simple. A VPS, NGINX, and a bash script can take you surprisingly far. You can always add complexity later if you need it, but in my experience, you probably won't.
If you have questions about any part of this setup, feel free to reach out on the Contact page. I'm always happy to help people get started with self-hosting.
See you in the next one!
15 Mar 2026 5:00am GMT
14 Mar 2026
10 Years of Jazzband
Jazzband is sunsetting. Before moving on, here's a look at what 10 years of cooperative coding actually looked like.
By the numbers
Five years in, we had about 1,350 members and 55 projects. Here's where things stand now:
Members
- 3,135 total members over the years
- 2,133 members currently - a 68% retention rate over 10 years
- New members every year, peaking at 424 in 2022
- Members who left stayed an average of 510 days
- Based on GitHub profiles (only ~28% of members list a location), members from at least 56 countries across every continent but Antarctica - 36% Europe, 30% Asia, 22% North America, 7% South America, 3% Africa, 1% Oceania. Real numbers are likely higher. And given how widely Python is used in research, someone in Antarctica has probably pip-installed a Jazzband project at some point
Projects
- 84 projects total, 71 still active
- 13 projects left again over the years
- ~93,000 GitHub stars across all projects
- ~16,000 forks
Activity
- ~43,800 commits across all repositories
- ~15,600 pull requests
- ~12,200 issues
Releases
- 1,429 package uploads via Jazzband's release pipeline
- 1,312 releases to PyPI across 56 projects and 390 versions
- 281 MB of release artifacts total
- First upload in November 2017, most recent in March 2026
Project teams
- 470 project team memberships
- 105 lead roles across 81 project leads
- Most prolific leads: aleksihakli, hramezani, claudep, and camilonova each maintained 4 projects
How Jazzband was actually used
The numbers above only tell part of the story. Here's what's more interesting.
Not everyone used the release pipeline
20 active projects never shipped a single release through it. Projects like Watson (2,515 stars), django-rest-knox (1,255), and django-admin2 (1,187) used Jazzband as a collaborative home - for shared access, triage, and maintenance - not for releases. The pipeline was useful for the projects that used it, but it wasn't what made Jazzband work for most people.
Old projects stayed alive
django-avatar's repo was created in 2008 and shipped its most recent Jazzband release in January 2026 - a 17-year-old repo still getting releases. django-axes (2009), sorl-thumbnail (2010), django-constance (2010), and 18 other projects created before 2015 were all still getting releases in 2025 or 2026. Jazzband kept old projects alive long after their original authors moved on. That was the whole point.
Release cadence varied wildly
django-axes had the most active release cadence: 253 release files across 127 versions, peaking at 28 versions in 2019 - roughly one every 13 days. pip-tools was second at 138 releases / 69 versions.
Meanwhile, 7 active projects have no team members at all - django-permission, django-mongonaut, and five others. Nobody was actively working on them, but they had a home and stayed installable.
pip-tools was its own community
With 69 team members it dwarfed every other project (the next largest, djangorestframework-simplejwt, had 24). It was basically a sub-organization within Jazzband. And two projects joined as recently as 2024 (django-tagging, django-summernote) with single-digit stars and zero releases - people were still finding value in the model right up to the end.
The open access model was genuinely controversial
When django-newsletter transferred in, its author @dokterbob worried that giving 800 members write access would "dissolve the responsibility so much that it might actually reduce participation." I wrote a long reply defending the open model.
An earlier project, Collectfast, actually left Jazzband after a member pushed directly to master without review - merging commits the author had been holding off on. That incident led to real discussions about code review processes, branch protection, and what "open access" should actually mean. The tension between openness and control was never fully resolved.
Moderation was another solo job
Over the years I had to block 10 accounts from the GitHub organization - first crypto spammers who joined just to be in the org, then community conflicts that needed real moderation decisions, and finally the AI-driven spam that made the open model untenable. None of that is unusual for an organization this size, but it all went through one person.
The onboarding bottleneck
Every transferred project got an onboarding checklist - a webhook automatically opened an "Implement Jazzband guidelines" issue with TODOs like fixing links, adding badges, setting up CI, adding jazzband to PyPI, deciding on a project lead. 41 projects got one of these. 28 completed it. 13 are still open.
The pattern in those 13 is telling: contributors would do every item they could, then get stuck on things that required admin access - configuring webhooks, fixing CI checks, setting up the release pipeline - and wait for me. Sometimes for months.
django-user-sessions' original author pinged me five times over two months about broken CI checks only an admin could fix. Watson's lead asked twice to remove legacy CI tools blocking PR merges. The checklist was good. The bottleneck was me.
Projects that moved on
One of the earliest and most visible Jazzband projects was django-debug-toolbar, transferred in back in 2016. It grew to over 8,000 stars under Jazzband before it moved to Django Commons in 2024.
django-simple-history, django-oauth-toolkit, PrettyTable, and tablib all moved on too, for similar reasons - they needed more autonomy than Jazzband's structure could provide.
Downloads
For context on how widely these projects are used, here are some numbers from PyPI. All projects that were ever part of Jazzband account for over 150 million downloads a month. Current projects alone are around 95 million.
Top 15 by monthly downloads:
| Project | Downloads/month | Note |
|---|---|---|
| prettytable | 42.4M | left Jazzband |
| pip-tools | 23.3M | |
| contextlib2 | 10.7M | |
| django-redis | 9.6M | |
| django-debug-toolbar | 7.3M | left, now Django Commons |
| djangorestframework-simplejwt | 6.1M | |
| dj-database-url | 5.5M | |
| pathlib2 | 4.9M | |
| django-model-utils | 4.8M | |
| geojson | 4.6M | |
| tablib | 4.1M | |
| django-oauth-toolkit | 3.7M | left |
| django-simple-history | 3.1M | left, now Django Commons |
| django-silk | 2.7M | |
| django-formtools | 2.1M |
One thing that surprised me: prettytable alone accounts for 42 million downloads a month, and it isn't even a Django package. contextlib2, pathlib2, and geojson aren't either. Jazzband ended up being broader than the Django ecosystem it started in.
django-debug-toolbar ranked in the top three most used third-party packages in the Django Developers Survey and is featured in the official Django tutorial. It spent 8 years under Jazzband before moving to Django Commons.
If you've come across Jazzband projects before, it was probably through the Django News newsletter, Python Weekly, or Opensource.com's 2020 piece on how Jazzband worked.
Top 10 projects by stars
| Project | Stars |
|---|---|
| pip-tools | 7,997 |
| django-silk | 4,939 |
| tablib | 4,752 |
| djangorestframework-simplejwt | 4,310 |
| django-taggit | 3,429 |
| django-redis | 3,059 |
| django-model-utils | 2,759 |
| Watson | 2,515 |
| django-push-notifications | 2,384 |
| django-widget-tweaks | 2,165 |
14 Mar 2026 4:02pm GMT
Wind-Down Plan
This post outlines the plan for winding down Jazzband. If you haven't read them yet, see the sunsetting announcement for context on why this is happening, and the 10-year retrospective for the full story.
Timeline
The wind-down will happen in phases over the course of 2026.
Phase 1: Announcement (March 2026)
- New member signups are disabled immediately
- This announcement and wind-down plan are published
- Existing members retain access to the GitHub organization and all repositories
Phase 2: Outreach (March - May 2026)
- All 80 project leads will be contacted via email to discuss transferring their projects
- The goal is to have initial conversations with every lead before PyCon US 2026 (May 13-19 in Long Beach, CA)
- Leads who don't respond will be followed up with at PyCon US and through other channels
Phase 3: Project Transfers (June - December 2026)
- Projects will be transferred out of the Jazzband GitHub organization to their new homes - whether that's a lead's personal account, a new organization, or another collaborative group
- For each project, the transfer includes:
- GitHub repository: transferred to the new owner
- PyPI package ownership: existing maintainers added, Jazzband credentials removed
- CI/CD configuration: updated to work outside Jazzband
- Projects without an active lead or willing recipient will be archived in the Jazzband GitHub organization
Phase 4: Wind Down (Early 2027)
- Remaining repositories archived
- The Jazzband GitHub organization set to read-only
- The jazzband.co website archived (with a redirect or static notice)
- PSF Fiscal Sponsorship status concluded, remaining funds donated to the PSF general fund
What happens toβ¦
β¦existing members?
You remain a member of the GitHub organization until it is archived. No action is needed on your part. If you'd like to leave earlier, you can do so from your account dashboard.
β¦projects I contribute to?
The projects aren't going away - they're moving. Your contributions, issues, and pull requests will transfer with the repository to its new home. Git history is preserved.
β¦PyPI packages?
Package ownership on PyPI will be transferred to the project leads before the Jazzband release credentials are deactivated. If you're a project lead, we'll coordinate this with you directly.
β¦the Jazzband release pipeline?
The Jazzband-specific release pipeline (uploading via Twine to jazzband.co, then releasing to PyPI) will remain functional during the transition period. After transfer, projects will publish to PyPI directly using standard tooling.
β¦the website?
The jazzband.co website will remain online through the transition. After wind-down, it will be replaced with a static page linking to this announcement and an archive of the project list.
β¦the PSF Fiscal Sponsorship?
Jazzband's PSF Fiscal Sponsorship status will be formally concluded. Any remaining funds will be donated to the Python Software Foundation's general fund.
For project leads
If you're a project lead, here's what to expect:
- You'll receive an email with details specific to your project(s)
- Decide on a new home for your project - your personal GitHub account, a new organization, or another collaborative group like Django Commons
- Coordinate the transfer - we'll handle the GitHub repo transfer and help with PyPI ownership changes
- Update your project - CI/CD, documentation links, and any Jazzband-specific references
Several projects have already successfully transferred to Django Commons, including django-debug-toolbar and django-simple-history. If you're looking for a place with shared maintenance and multiple admins, it's a good option.
If you have questions or want to start the process early, please contact the roadies.
14 Mar 2026 4:01pm GMT
Sunsetting Jazzband
Over 10 years ago, Jazzband started as a cooperative experiment to reduce the stress of maintaining Open Source software projects. The idea was simple - everyone who joins gets access to push code, triage issues, merge pull requests. "We are all part of this."
It had a good run. More than 10 years, actually.
But it's time to wind things down.
What happened
There's a short answer and a long answer.
The slopocalypse
GitHub's slopocalypse - the flood of AI-generated spam PRs and issues - has made Jazzband's model of open membership and shared push access untenable.
Jazzband was designed for a world where the worst case was someone accidentally merging the wrong PR. In a world where only 1 in 10 AI-generated PRs meets project standards, where curl had to shut down its bug bounty because confirmation rates dropped below 5%, and where GitHub's own response was a kill switch to disable pull requests entirely - an organization that gives push access to everyone who joins simply can't operate safely anymore.
The one-roadie problem
But honestly, the cracks have been showing for much longer than that.
Jazzband was always a one-roadie operation. People asked for more roadies and offered to help over the years, and I tried a number of times to make it work - but it never stuck. I dropped the ball on organizing it properly, and when volunteers did step up they'd quietly step back after a while. That's not a criticism of them, it's just how volunteer work goes when there's no structure to support it.
The result was the same though: every release request, every project transfer, every lead assignment, every PyPI permission change - it all went through me.
The warnings
The sustainability question was raised as early as 2017. I gave a keynote at DjangoCon Europe 2021 about it - five years in. In that talk I said out loud that the "social coding" experiment had failed to create an equitable community, and that a sustainable solution didn't exist without serious financial support.
The roadmap I presented - revamp infrastructure, grow the management team, formalize guidelines, reach out for funding - none of that happened. The PSF fiscal sponsorship was the one thing that did.
In the years since, I've been on the PSF board - which faced its own crises - and now serve as PSF chair. That work matters and I don't regret prioritizing it, but it meant Jazzband got even less of my time.
GitHub went the other way
Meanwhile, GitHub moved in the opposite direction. Copilot launched in 2022, trained on open source code that maintainers were burning out maintaining for free. GitHub Sponsors participation sits at 0.0014%. 60% of maintainers are still unpaid.
The XZ Utils backdoor in 2024 showed what happens when a lone maintainer burns out and someone malicious fills the gap. And Jazzband's own infrastructure started getting in the way of the projects it was supposed to help - the release pipeline couldn't support trusted publishing, projects that needed admin access were stuck.
So projects started leaving. And that's OK - that was always supposed to be part of the deal.
Django Commons
I want to specifically thank Django Commons and Tim Schilling for picking up where Jazzband fell short. They have 5 admins, 15 active projects (including django-debug-toolbar, django-simple-history, and django-cookie-consent from Jazzband), and django-polymorphic is transferring over right now. They solved the governance problem from day one. If you're a Jazzband project lead looking for a new home for your Django project, start there.
For non-Django projects like pip-tools, contextlib2, geojson, or tablib - I'm not aware of an equivalent. If someone wants to build one for the broader Python tooling ecosystem, I'd love to see it.
By the numbers
Over 10 years, Jazzband grew to 3,135 members from every continent but Antarctica, maintained 84 projects with ~93,000 GitHub stars, and shipped 1,312 releases to PyPI.
Projects that passed through Jazzband are downloaded over 150 million times a month - pip-tools at 23 million, prettytable at 42 million. django-debug-toolbar spent 8 years under Jazzband and ended up in the official Django tutorial. django-avatar, a repo from 2008, was still getting releases in 2026. And django-axes shipped 127 versions - a release every 13 days in its peak year.
The full 10-year retrospective has all the numbers, the stories, and what actually happened.
What happens next
I'm not pulling the plug overnight. There is a detailed wind-down plan that covers the timeline, but the short version:
- New signups are disabled as of today
- Project leads will be contacted before PyCon US 2026 to coordinate transferring projects to new homes
- The GitHub organization and website will remain available during the transition period through end of 2026
If you're a project lead, expect an email soon.
Thank you
None of this would have been possible without the people who showed up - strangers on the internet who decided to maintain something together. Thanks to the 81 project leads who kept things going despite the bottlenecks I created, and to everyone who joined, contributed, filed issues, and shipped releases over the years.
I started Jazzband because maintaining Open Source alone was exhausting. The irony of then becoming a single point of failure for 71 projects is not lost on me. But the experiment worked in the ways that mattered - projects got maintained, releases got shipped, people collaborated.
Anyways, the projects will move on to new homes, and that's fine. That was always the point.
We are all part of this.
14 Mar 2026 4:00pm GMT
13 Mar 2026
Django community aggregator: Community blog posts
Django News - 21 PRs in One Week to Django Core! - Mar 13th 2026
News
The Call for Proposals for DjangoCon US 2026 has been extended one week!
DjangoCon US 2026 has extended its Call for Proposals deadline by one week to March 23 at 11 AM CDT, giving prospective speakers a little more time to submit their talk ideas.
CPython: 36 Years of Source Code
An analysis of the growth of CPython's codebase from its first commits to the present day
Releases
Python 3.15.0 alpha 7
Python 3.15.0 alpha 7 introduces explicit lazy imports, a new frozendict type, improved profiling tools, and JIT upgrades that deliver modest performance gains while development continues toward the upcoming beta.
Django Software Foundation
DSF member of the month - Theresa Seyram Agbenyegah
Theresa Seyram Agbenyegah features as DSF member of the month for March 2026, highlighting her Django community leadership and PyCon organization work.
Updates to Django
Today, "Updates to Django" is presented by Johanan from Djangonaut Space! π
Last week we had 21 pull requests merged into Django by 11 different contributors - including 2 first-time contributors! Congratulations to KhadyotTakale and Lakshya Prasad for having their first commits merged into Django - welcome on board!
This week's Django highlights:
- Fixed TypeError in deprecation warnings if Django is imported by namespace. (#36961)
- Improved admin changelist layout for object-tools button. (#36887)
- Fixed migrate --run-syncdb crash for existing models with truncated db_table names. (#12529)
Django Newsletter
Django Fellow Reports
Fellow Report - Jacob
Two cool features landed this week: @Antoliny0919's more standard vertical layout for inputs and labels in admin forms, and Artyom Kotovskiy's work to make RenameModel migration operations update permission names as well.
Lots of tickets triaged, reviewed, and authored!
Fellow Report - Natalia
This week had as the main attraction the security releases I issued on Tuesday (6.0.3, 5.2.12, and 4.2.29), which required the usual coordination, strong focus, and intense follow-up.
Beyond that, a significant part of the week was spent navigating the continuing wave of LLM-generated pull requests, which adds a fair amount of noise to the review queue. After prioritizing the security work, I tried to reclaim some joy in the day-to-day Fellow work by digging through long-snoozed notification emails and picking off a number of lingering tickets and PRs that had been waiting for attention.
Sponsored Link 1
The deployment service for developers and teams.
Articles
New Feature Proposal for Django - AddConstraintConcurrently
More context on a recent proposal suggesting a pair of opt-in contrib.postgres operations - AddConstraintConcurrently and RemoveConstraintConcurrently - to allow unique indexes created via UniqueConstraint to be created and dropped concurrently.
Avoiding empty strings in non-nullable Django string-based model fields
Django silently converts None values in non-nullable string fields into empty strings, but a simple CheckConstraint can enforce truly required values and prevent empty data from slipping into your database.
Buttondown - How we check every link in your email
The machinery behind Buttondown's link checker is more involved than you might expect.
The State of OpenSSL for pyca/cryptography with Alex Gaynor and Paul Kehrer
The written transcript of an interview all about Python security/cryptography, current features in cryptography, as well as some of what's coming in the future.
Year of the Snake Recap
Mariatta's review of the year showcases how prolific she was, with conferences, documentaries, ice cream selfies, and much more.
What is `self`?
Eric Matthes tackles the age-old question that newcomers ask again and again, but which is always worth revisiting.
I Ditched Elasticsearch for Meilisearch. Here's What Nobody Tells You.
A practical deep dive into replacing Elasticsearch with Meilisearch, showing how a simpler Rust-based search engine cut costs from $120 to $14 a month while delivering faster, typo-tolerant search for typical application workloads.
Videos
From Kenya to London - Velda Kiara
The video version of Django Chat and this week's guest, Velda. We won't always do a double-feature of episodes, but Velda is always sunny and uplifting even amidst these last legs of winter.
Python Unplugged on PyTV - Free Online Python Conference
If you missed it live last week, there was a digital conference hosted by PyCharm featuring several Django speakers including Sarah Boyce (Fellow), Carlton Gibson (podcast host), and Sheena O'Connell (PSF Member). Timestamps in the description!
Podcasts
Django Chat #197: From Kenya to London with Django - Velda Kiara
Velda is a software engineer at RevSys based in London and an extremely active member of the Python and Django communities. She is a PSF Fellow, former Djangonaut, co-maintainer of django-debug-toolbar, regular conference speaker, and Microsoft MVP.
Django Job Board
Explore new opportunities this week including a Solutions Architect role at JetBrains, an Infrastructure Engineer position at the Python Software Foundation, and a Lead Backend Engineer opening at TurnTable.
Solutions Architect - Python (Client-facing) at JetBrains π
Infrastructure Engineer at Python Software Foundation
Lead Backend Engineer at TurnTable
Django Newsletter
Projects
Lupus/django-lumen
Visualize your Django models as an interactive ERD diagram in the browser. No external diagram library - the diagram is pure vanilla JS + SVG rendered at request time from the live Django model registry.
paradedb/django-paradedb
Official extension to Django for use with ParadeDB.
This RSS feed is published on https://django-news.com/. You can also subscribe via email.
13 Mar 2026 3:00pm GMT
11 Mar 2026
Django community aggregator: Community blog posts
Weeknotes (2026 week 11)
Weeknotes (2026 week 11)
Last time I wrote that I seem to be publishing weeknotes monthly. Now, a quarter of a year has passed since the last entry. I do enjoy the fact that I have published more posts focused on a single topic. That said, what has been going on in open source land is certainly interesting too.
LLMs in Open Source
I have started a longer piece to think about my stance regarding using LLMs in Open Source. The argument I'm thinking about is that there's a balance between LLMs having ingested all of my published open source code and myself using them now to help myself and others again.
The happenings in the last two weeks (think Pentagon, Iran, and the bombings of schools) have again brought to the foreground the perils of using those tools. I therefore haven't been motivated to pursue this train of thought for the moment. When the upsides are somewhat questionable and tentative and the downsides are so clear and impossible to miss, it's hard to use my voice to speak in favor of these tools.
That said, all the shaming when someone uses an LLM that I see in my Mastodon feed also annoys me. I'll quote part of a post here which I liked and leave it at that for the moment:
The AI hype-cyclone is bad, but so is the anti-AI witch hunt. Commits co-authored by Claude do not mean that a project has "abandoned engineering as a serious endeavor"
[β¦]
Other goings-on
- Health: My back continues to improve. Some days are still bad, but the idea that the herniation may go away entirely doesn't sound totally unreasonable anymore.
- Gardening: We started weeding the garden last week. Lots to do! Being outside is fun. Weeding isn't the greatest part ever, but it's meditative.
Releases since December
- django-json-schema-editor 0.12.1: CSS fixes. I have again looked at other, more modern JSON schema editor implementations but all of them are more limited than is acceptable to act as a replacement.
- django-debug-toolbar 6.2: I haven't done much work here! Just some reviewing.
- django-content-editor 8.1: Started emitting warnings when using non-abstract base classes for plugins. Using multi table inheritance is mostly an accident and not intended in my experience when using django-content-editor, therefore we have started detecting this case and emitting system checks (warnings, not errors).
- django-imagefield 0.23.0a3: We have done some work on supporting libvips as an alternative backend to Pillow because I hoped that memory usage in Kubernetes pods might go down a bit. Results are not conclusive yet, and I'm not yet convinced the additional code complexity is worth it. Debugging and monitoring continues.
- FeinCMS 26.2.1: Released a few bugfixes. FeinCMS is still being maintained ~17 years later!
- django-auto-admin-fieldsets 0.3: Added a helper to remove fields from the fieldsets structure.
- django-tree-queries 0.23.1: Shipped a small bugfix for {% recursetree %} which unintentionally cached children across invocations.
- feincms3-downloads: Used PATH from the environment instead of a very restricted allowlist, so that convert and pdftocairo are detected in more locations. This should help with local development, for example on macOS.
- django-prose-editor 0.24.1: Read the CHANGELOG; there's too much in there for a short notice.
- form-designer 0.27.3: Mosparo captcha support, bugfixes and additional translations.
- feincms3 5.5: Started using the OrderableTreeNode from django-tree-queries.
11 Mar 2026 5:00pm GMT
From Kenya to London with Django - Velda Kiara
Links
- Velda at RevSys
- Velda's Substack: The Storyteller's Byte Tales
- Velda on GitHub
- Optimal Performance Over Basic as a Perfectionist with Deadlines
- More about me
- Neapolitan
Projects
Books
- A History of the Bible by John Barton
- Python Mastery, a course from David Beazley
- Kite Runner by Khaled Hosseini
YouTube
Sponsor
This episode was brought to you by Buttondown, the easiest way to start, send, and grow your email newsletter. New customers can save 50% off their first year with Buttondown using the coupon code DJANGO.
11 Mar 2026 4:00pm GMT
10 Mar 2026
Django community aggregator: Community blog posts
The highs and lows of running a business
The past few weeks have been both amazing and frustrating. If you listen to or read my weekly in-progress reports, I have already mentioned some of this. In the last week of February, I appeared on the Django Chat podcast, which was released to positive feedback. I considered that a personal win for my career; it was fun to chat with Will and Carlton, and I hope to do more like this in future. However, two days after the episode aired, I received news that a client had passed away, causing that income source to vanish overnight. My condolences go to their family. I was also in shock, first personally and then from the perspective of my business.
This was especially frustrating because I thought my cashflow for 2026 was solid, and I expected to be salaried before the end of the year once Hamilton Rock secures funding, which we plan on doing this year. From a business perspective, income can vanish overnight, and I accept this. The risk is inherent in running a business, unlike an employed job in the UK, which offers protections in this regard. The risk and potential downside are countered by greater flexibility: I can work where and when I like, take long lunch breaks, and take holidays without needing permission - aside from checking in with the wife!
However, if you lose a job, all your income can disappear at once - the classic "all your eggs in one basket" scenario. With a business, there is the opportunity to diversify, so thankfully this lost income wasn't everything for me, but it does significantly shorten my cashflow runway. My next steps are to explore several options: revisiting Comfort Monitor Live and improving its marketing with the help of Claude, and experimenting with agents to generate additional revenue. Finally, there are of course new freelance contracts or projects.
For now, that's where things stand with me: I'm available for freelance contracts or projects. If you need a reliable developer with Django expertise, feel free to get in touch. I am open to part-time or project-based work. Whether you need something built or maintained that your team lacks the capacity for, or you want advice on AI, development, or anything in between, drop me an email or book a call.
10 Mar 2026 5:00am GMT
06 Mar 2026
Django community aggregator: Community blog posts
Django News - Django Security Fixes, Python Releases, and New Tools - Mar 6th 2026
News
Django security releases issued: 6.0.3, 5.2.12, and 4.2.29
Django 6.0.3, 5.2.12, and 4.2.29 were released to fix two security issues: URLField DoS on Windows and file permission race conditions.
Releases
Python 3.12.13, 3.11.15 and 3.10.20 are now available!
Python 3.12.13, 3.11.15, and 3.10.20 fix security and denial-of-service vulnerabilities in email, HTTP cookies, WSGI headers, XML parsing, and SSL.
Python Software Foundation
PEP 827 - Type Manipulation
PEP 827 proposes extensive type-level introspection and construction APIs in typing to enable computed types for ORMs, dataclass-style transforms, and decorator typing.
The Python Insider Blog Has Moved!
Python Insider moved to a Git backed Markdown workflow with a static Astro site, GitHub Actions, and RSS, simplifying contributions and versioned posts.
Djangonaut Space News
2026 Session 6 Team Introductions!
Djangonaut Space introduces the six teams for its sixth session, pairing volunteers and new contributors to collaborate on projects ranging from Django core and accessibility improvements to django CMS, BeeWare, and deployment tools.
Wagtail CMS News
Our projects for Google Summer of Code 2026
Wagtail will mentor GSoC 2026 projects, including bakerydemo redesign, starter kit overhaul, and multilingual improvements to core and wagtail-localize for CMS contributors.
Our roadmap for the next 6 months
Wagtail roadmap targets UX and editor improvements, Django modelsearch enhancements, customizable page models, SEO and AI content checks, autosave polish, and LTS stability.
Updates to Django
Today, "Updates to Django" is presented by Johanan from Djangonaut Space! π
Last week we had 23 pull requests merged into Django by 17 different contributors - including 6 first-time contributors! Congratulations to Pierre Sassoulas, Abhimanyu Singh Negi, Sam.An, Anurag Verma, Zac Iloka and Elias Hernandis for having their first commits merged into Django - welcome on board!
This week's Django highlights:
- Removed empty exc_info from log_task_finished signal handler. (#36951)
- Renamed permissions upon model renaming in migrations. (#27489) This ticket was created 9 years ago. Thanks to everyone who worked on it!
- Improved the accessibility of admin form labels. (#34643)
Django Newsletter
Sponsored Link 1
Sponsor Django News
Reach 4,300+ highly-engaged and experienced Django developers.
Articles
Making Django unique constraints case-insensitive (with no downtime)
Fix Django's case-sensitive unique constraint pitfalls by cleaning duplicates, adding Lower() constraints, and safely migrating with PostgreSQL CONCURRENTLY to avoid downtime.
Row Locks With Joins Can Produce Surprising Results in PostgreSQL
A subtle PostgreSQL concurrency edge case shows how SELECT ... FOR UPDATE with joins can unexpectedly return missing or partial results under Read Committed isolation, and explores safer query patterns to avoid it.
Pytest parameter functions
Use helper functions that return pytest.param to preprocess multiline strings or file contents, and assign concise IDs to make parametrized pytest test cases clearer.
I Checked 5 Security Skills for Claude Code. Only One Is Worth Installing
A deep dive into five Claude Code security review skills reveals that most are shallow checklists prone to false positives, while Sentry's standout skill delivers a context-aware methodology that actually finds real vulnerabilities.
State of WASI support for CPython: March 2026
PEP 816 locks WASI and WASI SDK versions for CPython 3.15, enabling stable build targets while work continues on packaging, deps, and socket support.
Videos
Python Unplugged on PyTV - Free Online Python Conference livestream available
The first PyTV, a global online Python conference, occurred as a livestream on Wednesday. Django speakers included Sarah Boyce, Sheena O'Connell, Carlton Gibson, Mark Smith, Paul Everitt, and others. Time stamps in the description!
Django Job Board
The Python Software Foundation is hiring an Infrastructure Engineer to help maintain the systems that power Python's infrastructure.
TurnTable is seeking a Lead Backend Engineer to build and scale backend systems for its music collaboration platform.
Projects
Django (anti)patterns
Django Antipatterns is a community-maintained reference that highlights common mistakes in Django projects and explains better patterns developers can use instead.
yassi/dj-control-room
The control room for your Django app.
trottomv/django-never-cache
A lightweight Django package to simplify Cache-Control configuration for sensitive views.
Sponsorship
Reach 4,300+ Django Developers Every Week
Want to reach developers who actually read what they subscribe to?
Django News lands in the inboxes of 4,300+ Django and Python developers every week. With a 52% open rate and 15% click rate, sponsors get their message in front of builders who actively use Django.
Promote your product, service, event, job, or open source project to a highly engaged developer audience while supporting the newsletter.
Explore sponsorship options: https://django-news.com/sponsorship
This RSS feed is published on https://django-news.com/. You can also subscribe via email.
06 Mar 2026 5:00pm GMT
05 Mar 2026
Django community aggregator: Community blog posts
Smoother translations in Django
I've been working for roughly 5 years now in an app that is localized to Swedish, so I have built up some opinions on how to manage translation of a Django project. Here's my list of things I do currently:
Always use gettext_lazy
I've been bitten many times by accidentally using gettext when I should have used gettext_lazy, resulting in strings that were stuck in English or Swedish randomly because a user with a specific language caused that piece of code to be imported.
I realize that there are some performance implications here, but compared to stuff like database access this is tiny and has never shown up in profiler outputs, so I will gladly take this hit and avoid these bugs that tend to be hard to track down (if they even get reported by users at all!).
A simple naive hand-rolled static analysis test that forbids usages of plain gettext in the code base is easy to implement and stops a whole class of bugs.
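Such a check can be a tiny test that scans each source file for imports of plain gettext; here's a naive sketch (the regex and function name are mine, not from the post):

```python
import re
from pathlib import Path

# Matches imports of plain gettext from django.utils.translation,
# e.g. "from django.utils.translation import gettext as _".
# The trailing \b means "gettext_lazy" and "ngettext" do NOT match.
PLAIN_GETTEXT = re.compile(
    r"from\s+django\.utils\.translation\s+import\s.*\bgettext\b"
)

def find_plain_gettext(source: str) -> list[int]:
    """Return the 1-based line numbers that import plain gettext."""
    return [
        lineno
        for lineno, line in enumerate(source.splitlines(), start=1)
        if PLAIN_GETTEXT.search(line)
    ]

# In a test, loop over the code base and fail on any hit:
# for path in Path("src").rglob("*.py"):
#     assert not find_plain_gettext(path.read_text()), path
```

This won't catch every creative import, but as a cheap tripwire it stops the common case before it reaches production.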
Django models
The Okrand setting django_model_upgrade dynamically sets verbose_name for all fields correctly with the normal default, and on the model sets up verbose_name and verbose_name_plural. When you then run the Okrand collect command, you get strings to translate without polluting your source with silly stuff like
class Foo(Model):
    user = ForeignKey(User, verbose_name=gettext_lazy('user'))

    class Meta:
        verbose_name = gettext_lazy('foo')
        verbose_name_plural = gettext_lazy('foos')
and you can instead have models like:
class Foo(Model):
    user = ForeignKey(User)
You can still write them out explicitly if you need them to differ from the defaults.
Elm
There's a built-in regex pattern for ML-style languages in Okrand that makes it quite easy to collect strings from Elm code.
Menu translations
I use the iommi MainMenu system which looks something like this:
menu = MainMenu(
    items=dict(
        albums=M(view=albums_view),
        artists=M(view=artists_view),
    ),
)
Since Okrand has a plugin system, I can build a little function that loops over this menu and collects these identifiers into translation strings. In the example above this would be "albums" and "artists". I enjoy not having to write the English base string that is 99% the exact same as the identifier (after replacing _ with space), which keeps the business logic clean.
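A minimal sketch of that idea, independent of Okrand's actual plugin API (the function name here is mine, purely illustrative):

```python
def menu_identifiers_to_strings(items):
    """Turn menu item identifiers like 'top_albums' into translatable
    base strings like 'top albums' (underscores become spaces)."""
    return [key.replace("_", " ") for key in items]
```

For the menu above this yields "albums" and "artists", which the collect step can then write into the translation files.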
Stick to lowercase as far as possible
I was frustrated by the translation files ending up with translations for "album" and "Album", "artist" and "Artist" over and over. The solution I came up with was to define two simple functions:
from django.utils.text import capfirst
from django.utils.translation import gettext_lazy

def Trans(s):
    return capfirst(gettext_lazy(s))

def trans(s):
    return gettext_lazy(s)
I like the semantic weight of having Trans("album") mean that the word should start with uppercase in that place, while trans("album") means that it should stay lowercase. One could also add TRANS("album") if one wants a string in all uppercase, for example.
05 Mar 2026 6:00am GMT
Write the docs meetup: digital sovereignty for writers - Olufunke Moronfolu
(One of my summaries of the Amsterdam *write the docs* meetup).
Full title: digital sovereignty for writers: your data, your decisions. Olufunke Moronfolu has her website at https://writerwhocodes.com/ .
"Digital sovereignty is the ability to have control over your own digital destiny: the data, hardware and software that you rely on and create" (quote from the World Economic Forum).
What do writers want? Mostly: to be read. For this you could for instance start looking for (commercial) blogging platforms, searching for the best one. And after a while you start looking for a different one. On and on. You can run into problems. Substack might ban your newsletter. A google workspace domain being blocked. A Medium story getting deleted without feedback.
Tim Berners-Lee intended for the web to be universal and open. But now it is mostly a collection of isolated silos.
There are some questions you can ask yourself to test your sovereignty. If your current platform deletes your account, is your content completely lost? Second question: can you export your work in some portable format (like markdown).
If you are a technical writer, you have to do the test twice. Once for your own content and once for your company's documentation.
Own your content. Most sovereign for your own website/blog would be hugo/jekyll or other static generators. In the middle are (self-hosted?) wordpress sites. Least sovereign is something like linkedin/medium/substack. For company content, confluence/notion would be least sovereign. Wiki.js/bookstack middle. The best is docs as code like some markdown in git.
So: review the platform's policy. What is the ease of export? Do you have control? What's the stability? Do you have an identity there? Perhaps even a domain?
Own your identity. Having your own domain is best. If you're some-platform.com/name, your identity goes away if the site disappears.
Decide how to share. Sovereign would be an email list, an RSS feed or something like the POSSE approach (Publish (on your) Own Site, Syndicate Elsewhere).
Build for the future: build something. Start. It doesn't have to be perfect. Your own domain name and a single static page is already much more sovereign than a million followers on a site that could vanish tomorrow.
If you want to do more, join the "independent web" (indieweb, https://indieweb.org) movement.
Personal note: I've got my own domain. This is a blog entry that ends up in an RSS/atom feed. The site is .rst files in a git repo. Statically generated with Sphinx. So: yeah, pretty sovereign :-)
05 Mar 2026 5:00am GMT
Write the docs meetup: developers documentation, your hidden strength - FrΓ©dΓ©ric Harper
(One of my summaries of the Amsterdam *write the docs* meetup).
If you have a product, you need good developer documentation. "It is an integral part of your product: one cannot exist without the other". You might have the best product, but if people don't know how to use it, it doesn't matter.
What he tells developers: good documentation reduces support tickets and angry customers. You should be able to "sell" good documentation to your company: it saves money and results in more sales.
Some notes on documentation contents:
- You need a search function. The first thing you need to add.
- Think about Jon Snow (Game of Thrones): "you know nothing, Jon Snow". Be detailed in your instructions; readers will need it. Start with the assumption that the user knows nothing about your program. Advanced users can easily skip those parts.
- Have a proper architecture/structure. Simply having a "home" link to get back to the start already helps. Add a "getting started" section with step-by-step instructions to get something simple running. And detailed how-to guides where you go into depth.
- Show a table of contents of the current page.
- Keep the docs of previous versions available.
- Take great screenshots. Docs should have great quality and it especially shows in the screenshots.
- Don't show off your language skills too much. Keep the language simple. Not everyone will have your documentation's language as their native language.
- Test the code in your documentation! There's nothing more irritating than errors in example code. And keep it up to date. Especially watch out when the software gets updated. Do you give your documentation time to get updated?
Some extra notes:
- Make your docs accessible for people with disabilities.
- Are your docs fast? Load times help you get ranked higher in search engines.
- Some people read your documentation on their phones: does it work there?
- Try to make your docs open source. You might get an occasional fix. And perhaps more feedback.
05 Mar 2026 5:00am GMT
02 Mar 2026
Django community aggregator: Community blog posts
DjangoCon 2025 The Attendee's Experience
This post is the second in a three-part series reflecting on DjangoCon US 2025. In this post, I'm reflecting on experiencing DjangoCon 2025 from the audience while serving as conference chair.
02 Mar 2026 9:00pm GMT
27 Feb 2026
Django community aggregator: Community blog posts
Using tox to Test a Django App Across Multiple Django Versions

Recently, I developed a reusable Django app, django-clearplaintext, for normalizing plain text in Django templates. To package and test it properly, I took a fresh look at Tox.
Tox is the standard testing tool that creates isolated virtual environments, installs the exact dependencies you specify, and runs your test suite in each one - all from a single command.
This post walks through a complete, working setup using a minimal example app called django-shorturl.
The Example App: django-shorturl
django-shorturl is a self-contained Django app with one model and one view.
shorturl/models.py
from django.db import models
from django.utils.translation import gettext_lazy as _
class ShortLink(models.Model):
    slug = models.SlugField(_("slug"), unique=True)
    target_url = models.URLField(_("target URL"))
    created_at = models.DateTimeField(_("created at"), auto_now_add=True)

    class Meta:
        verbose_name = _("short link")
        verbose_name_plural = _("short links")

    def __str__(self):
        return self.slug
shorturl/views.py
from django.shortcuts import get_object_or_404, redirect
from .models import ShortLink
def redirect_link(request, slug):
    link = get_object_or_404(ShortLink, slug=slug)
    return redirect(link.target_url)
shorturl/urls.py
from django.urls import path
from . import views
urlpatterns = [
    path("<slug:slug>/", views.redirect_link, name="redirect_link"),
]
shorturl/admin.py
from django.contrib import admin
from .models import ShortLink
admin.site.register(ShortLink)
Project Layout
django-shorturl/
├── src/
│   └── shorturl/
│       ├── __init__.py
│       ├── admin.py
│       ├── models.py
│       ├── views.py
│       └── urls.py
├── tests/
│   ├── __init__.py
│   └── test_views.py
├── pyproject.toml
├── test_settings.py
└── tox.ini
The source lives under src/ and the tests are at the top level, separate from the package. This separation prevents the tests from accidentally being shipped inside the installed package.
Packaging: pyproject.toml
Tox needs a properly packaged app to install into each environment. With isolated_build = true (more on that below), Tox builds a wheel from your pyproject.toml before running any tests.
pyproject.toml
[project]
name = "django-shorturl"
version = "1.0.0"
requires-python = ">=3.8"
dependencies = [
    "Django>=4.2",
]

[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"

[tool.setuptools.packages.find]
where = ["src"]
The dependencies list here declares the runtime minimum - your app needs Django, but you don't pin a specific version because that is Tox's job during testing.
For the [build-system] section, we can also use uv_build to gain some performance improvements:
[build-system]
requires = ["uv_build >= 0.10.0, <0.11.0"]
build-backend = "uv_build"
[tool.uv.build-backend]
module-name = "shorturl"
Here, module-name tells uv_build which module to package, so it isn't confused by the mismatch between the distribution name django-shorturl and the module name shorturl.
Test Settings: test_settings.py
Django requires a settings module to run. Since a reusable app isn't tied to a project that would normally provide one, we create a minimal settings module dedicated to testing. It lives at the repo root so it's easy to point to from anywhere.
test_settings.py
SECRET_KEY = "test"

INSTALLED_APPS = [
    "shorturl",
]

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.sqlite3",
        "NAME": ":memory:",
    }
}

ROOT_URLCONF = "shorturl.urls"

DEFAULT_AUTO_FIELD = "django.db.models.AutoField"
A few deliberate choices here:
- SECRET_KEY = "test" - A fixed value, fine for tests, but never use this in production.
- INSTALLED_APPS - Only include apps that your tests actually need. No django.contrib.admin, no auth, nothing extra.
- SQLite in-memory database - ":memory:" means the database is created fresh for every test run and disappears when the process exits. No files left behind, no teardown needed, and it is fast.
- ROOT_URLCONF - The test client resolves URLs through this setting. Without it, reverse() raises NoReverseMatch and the test client has no URL configuration to dispatch against. Point it at your app's urls.py.
- DEFAULT_AUTO_FIELD - Suppresses Django's system check warning about the implicit primary key type. Setting it explicitly keeps the test output clean and makes the expectation clear.
The Core: tox.ini
This is where Tox is configured.
tox.ini
[tox]
envlist =
    py{38,39,310,311,312}-django42,
    py{310,311,312}-django50,
    py{310,311,312,313}-django51,
    py{310,311,312,313,314}-django52,
    py{312,313,314}-django60
isolated_build = true

[testenv]
deps =
    django42: Django>=4.2,<4.3
    django50: Django>=5.0,<5.1
    django51: Django>=5.1,<5.2
    django52: Django>=5.2,<6.0
    django60: Django>=6.0,<6.1
commands =
    python -m django test
setenv =
    DJANGO_SETTINGS_MODULE = test_settings
envlist - the matrix
py{38,39,310,311,312}-django42 uses Tox's brace-expansion shorthand.
The numbers inside {} are expanded automatically. Tox combines each Python version with django42, creating 5 environments:
- py38-django42
- py39-django42
- py310-django42
- py311-django42
- py312-django42
The full envlist simply lists all Python and Django combinations you want to test, so you can check that your project works in each setup.
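For intuition, the brace expansion can be sketched in plain Python. This is a hypothetical helper written for illustration, not part of tox itself:

```python
from itertools import product


def expand(entry):
    """Expand a tox envlist entry like 'py{38,39}-django42' into env names."""
    groups = []
    for part in entry.split("-"):
        if "{" in part:
            head, rest = part.split("{", 1)
            body, tail = rest.split("}", 1)
            # One candidate name per option inside the braces
            groups.append([head + option + tail for option in body.split(",")])
        else:
            groups.append([part])
    # Cartesian product of all factor groups, joined back with dashes
    return ["-".join(combo) for combo in product(*groups)]


print(expand("py{38,39,310,311,312}-django42"))
# -> ['py38-django42', 'py39-django42', 'py310-django42',
#     'py311-django42', 'py312-django42']
```

Entries with braces in more than one factor expand to the full cross product, which is why a short envlist can describe a large test matrix.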
Each part separated by a dash in an environment name is called a "factor". You can have as many factors as you like, and they can be named anything. pyXY factors are a convention Tox recognizes for selecting the Python interpreter; custom factors like django42 only acquire meaning where you reference them, for example in the [testenv] deps conditions below.
isolated_build = true
This tells tox to build a proper wheel from your pyproject.toml before installing into each environment. Without it, tox would try to install your package with pip install -e ., which bypasses the build system and can hide packaging bugs. With it, each environment tests the package exactly as a user would receive it after pip install django-shorturl.
deps - conditional dependencies
The django42: prefix is a Tox factor condition: the dependency on that line is only installed when the environment name contains the django42 factor. This is how a single [testenv] block handles all Django versions without needing a separate section for each one.
Tox also installs your package itself into each environment (because of isolated_build), so you don't need to list it here.
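The matching rule behind factor conditions can be illustrated with a tiny Python sketch. This is a hypothetical helper, simplified from what tox actually does:

```python
def dep_applies(env_name, factor):
    """A 'factor:' prefixed deps line applies when the env name contains that factor."""
    return factor in env_name.split("-")


print(dep_applies("py312-django42", "django42"))  # True
print(dep_applies("py312-django52", "django42"))  # False
```

So the line `django42: Django>=4.2,<4.3` is installed into py38-django42 through py312-django42, and skipped everywhere else.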
commands
commands =
    python -m django test
python -m django test is Django's built-in test runner. It discovers tests by looking for files matching test*.py under the current directory, which picks up everything in your tests/ folder automatically.
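The discovery rule amounts to a filename pattern match. A simplified sketch (Django's real discovery walks packages via unittest, but the pattern is the same):

```python
import fnmatch

# Files Django's test runner might encounter under the repo root
files = ["test_views.py", "test_models.py", "models.py", "conftest.py"]

# Only names matching the default "test*.py" pattern are collected
discovered = [f for f in files if fnmatch.fnmatch(f, "test*.py")]
print(discovered)  # ['test_views.py', 'test_models.py']
```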
setenv
setenv =
    DJANGO_SETTINGS_MODULE = test_settings
Django refuses to run without a settings module. This environment variable tells it where to find yours. Because test_settings.py is at the repo root and tox runs from the repo root, the module name test_settings resolves correctly without any path manipulation.
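Under the hood, Django resolves this variable roughly like so. This is a simplified sketch for illustration, not Django's actual code:

```python
import os

# tox sets this for every environment via the setenv block
os.environ["DJANGO_SETTINGS_MODULE"] = "test_settings"

settings_module = os.environ["DJANGO_SETTINGS_MODULE"]
# Django then imports the module by name, roughly:
#     importlib.import_module(settings_module)
# Because tox runs from the repo root, "test_settings" resolves to
# ./test_settings.py with no sys.path manipulation needed.
print(settings_module)  # test_settings
```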
Writing the Tests
Create test cases for each (critical) component of your app. For example, if you have models, views, and template tags, create tests/test_models.py, tests/test_views.py, and tests/test_templatetags.py.
tests/test_views.py
from django.test import TestCase
from django.urls import reverse

from shorturl.models import ShortLink


class RedirectLinkViewTest(TestCase):
    def setUp(self):
        ShortLink.objects.create(
            slug="dt",
            target_url="https://www.djangotricks.com",
        )

    def test_redirects_to_target_url(self):
        response = self.client.get(
            reverse("redirect_link", kwargs={"slug": "dt"})
        )
        self.assertRedirects(
            response,
            "https://www.djangotricks.com",
            fetch_redirect_response=False,
        )

    def test_returns_404_for_unknown_slug(self):
        response = self.client.get(
            reverse("redirect_link", kwargs={"slug": "nope"})
        )
        self.assertEqual(response.status_code, 404)
Installing Python Versions with pyenv
Tox needs the actual Python binaries for every version in your envlist. If you try to run tox without them installed, it will fail immediately with an InterpreterNotFound error. pyenv is the standard way to install and manage multiple Python versions side by side.
Install pyenv
Use Homebrew on macOS (or follow the official instructions for Linux):
brew install pyenv
Add the following to your shell config (~/.zshrc, ~/.bashrc, etc.) and restart your shell:
export PYENV_ROOT="$HOME/.pyenv"
export PATH="$PYENV_ROOT/bin:$PATH"
eval "$(pyenv init -)"
Install each Python version
Install every version that appears in your envlist:
pyenv install 3.8
pyenv install 3.9
pyenv install 3.10
pyenv install 3.11
pyenv install 3.12
pyenv install 3.13
pyenv install 3.14
Make them all reachable at once
Tox resolves py312 by looking for a binary named python3.12 on PATH. The trick is pyenv global, which accepts multiple versions and places all of their binaries on your PATH simultaneously:
pyenv global 3.14 3.13 3.12 3.11 3.10 3.9 3.8
List your primary version first (the one python3 and python resolve to) and work downward. After running this, confirm every interpreter is visible:
python3.8 --version # Python 3.8.x
python3.9 --version # Python 3.9.x
python3.10 --version # Python 3.10.x
python3.11 --version # Python 3.11.x
python3.12 --version # Python 3.12.x
python3.13 --version # Python 3.13.x
python3.14 --version # Python 3.14.x
Now tox can find all of them and the full matrix will run without InterpreterNotFound errors.
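That manual check can also be scripted. A hypothetical helper using only the standard library (not part of tox or pyenv); the fake lookup keeps the example deterministic:

```python
import shutil


def missing_interpreters(versions, which=shutil.which):
    """Return the versions whose pythonX.Y binary is not found on PATH."""
    return [v for v in versions if which(f"python{v}") is None]


# In real use: missing_interpreters(["3.8", "3.9", "3.10", "3.11", "3.12"])
# Injecting a fake PATH lookup here so the output is reproducible:
fake_path = {"python3.12": "/usr/bin/python3.12"}.get
print(missing_interpreters(["3.12", "3.8"], which=fake_path))  # ['3.8']
```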
Running tox
Run the full matrix:
tox
Or run a single environment:
tox -e py312-django52
tox will print a summary at the end showing which environments passed and which failed.
py38-django42: OK (3.25=setup[2.32]+cmd[0.93] seconds)
py39-django42: OK (2.88=setup[2.16]+cmd[0.72] seconds)
py310-django42: OK (2.61=setup[2.02]+cmd[0.59] seconds)
py311-django42: OK (2.70=setup[2.09]+cmd[0.61] seconds)
py312-django42: OK (3.28=setup[2.46]+cmd[0.82] seconds)
py310-django50: OK (2.67=setup[2.09]+cmd[0.58] seconds)
py311-django50: OK (2.61=setup[2.02]+cmd[0.59] seconds)
py312-django50: OK (2.85=setup[2.25]+cmd[0.60] seconds)
py310-django51: OK (2.81=setup[2.27]+cmd[0.54] seconds)
py311-django51: OK (2.85=setup[2.30]+cmd[0.55] seconds)
py312-django51: OK (2.70=setup[2.09]+cmd[0.61] seconds)
py313-django51: OK (2.97=setup[2.29]+cmd[0.68] seconds)
py310-django52: OK (3.03=setup[2.31]+cmd[0.72] seconds)
py311-django52: OK (2.88=setup[2.22]+cmd[0.66] seconds)
py312-django52: OK (2.80=setup[2.13]+cmd[0.67] seconds)
py313-django52: OK (4.70=setup[3.66]+cmd[1.04] seconds)
py314-django52: OK (6.41=setup[5.18]+cmd[1.23] seconds)
py312-django60: OK (5.13=setup[4.06]+cmd[1.07] seconds)
py313-django60: OK (5.35=setup[4.15]+cmd[1.21] seconds)
py314-django60: OK (6.01=setup[4.65]+cmd[1.37] seconds)
congratulations :) (70.59 seconds)
Final Words
What makes this setup robust?
- No shared state between environments. Each Tox environment is its own virtualenv with its own Django installation.
- The package is built, not symlinked. isolated_build = true catches packaging mistakes before they reach users.
- The database never persists between runs. SQLite in-memory means no stale data, no cleanup scripts, no CI-specific teardown.
- The test settings are minimal by design. Fewer installed apps means faster startup, fewer implicit dependencies, and tests that fail for clear, local reasons rather than configuration noise from elsewhere in the project.
This setup is not the only way to test a Django app with Tox, but it is a solid starting point that balances comprehensiveness with maintainability. With a little effort upfront, you can ensure your app works across a wide range of Python and Django versions - and catch packaging bugs before they hit real users.
27 Feb 2026 6:00pm GMT
Django News - Google Summer of Code 2026 with Django - Feb 27th 2026
News
Google Summer of Code 2026 with Django
All the information you need to apply for Django's 21st consecutive year in the program.
Django Software Foundation
DSF member of the month - Baptiste Mispelon
Baptiste is a long-time Django and Python contributor who co-created the Django Under the Hood conference series and serves on the Ops team maintaining its infrastructure. He has been a DSF member since November 2014. You can learn more about Baptiste by visiting Baptiste's website and his GitHub Profile.
Wagtail CMS News
The *1000 most popular* Django packages
Based on GitHub stars and PyPI download numbers.
Updates to Django
Today, "Updates to Django" is presented by Johanan from Djangonaut Space!
Last week we had 11 pull requests merged into Django by 10 different contributors - including 4 first-time contributors! Congratulations to Saish Mungase, Marco AurΓ©lio da Rosa Haubrich, μ‘°νμ€ and Muhammad Usman for having their first commits merged into Django - welcome on board!
This week's Django highlights:
- BuiltinLookup.as_sql() now correctly handles parameters returned as tuples, ensuring consistency with release note guidance for custom lookups. This avoids the need for developers to audit both process_lhs() and as_sql() for tuple/list resilience when subclassing BuiltinLookup. (#36934) (#35972)
- SessionBase.__bool__() has been implemented, allowing session objects to be evaluated directly in boolean contexts instead of relying on truthiness checks. (#36899)
Django Newsletter
Django Fellow Reports
Django Fellow Report - Jacob
A short week with a US holiday and some travel to visit family, but still 4 tickets triaged, 12 reviewed, 3 authored, security report, and more.
Django Fellow Report - Natalia
Roughly 70% of my time this week went into security work, which continues being quite demanding. The remaining time was primarily dedicated to Mike's excellent write-up on the dictionary-based EMAIL_PROVIDERS implementation and migration, along with a smaller amount of ticket triage and PR review.
Also 2 tickets triaged, 9 reviewed, and other misc.
Sponsored Link 1
PyTV - Free Online Python Conference (March 4th)
1 Day, 15 Speakers, 6 hours of live talks including from Sarah Boyce, Sheena O'Connell, Carlton Gibson, and Will Vincent. Sign up and save the date!
Articles
Django ORM Standalone: Querying an existing database
A practical step-by-step guide to using Django ORM in standalone mode to connect to and query an existing database using inspectdb.
Using tox to Test a Django App Across Multiple Django Versions
A practical, production-ready guide to using tox to test your reusable Django app across multiple Python and Django versions, complete with packaging, minimal test settings, and a full version matrix.
How I Use django-simple-nav for Dashboards, Command Palettes, and More
Jeff shares how he uses django-simple-nav to define navigation once in Python and reuse it across dashboards and even a lightweight HTMX-powered command palette.
Serving Private Files with Django and S3
Django's FileField and ImageField are good at storing files, but on their own they don't let us control access. When β¦
CLI subcommands with lazy imports
In case you didn't hear, PEP 810 got accepted, which means Python 3.15 is going to support lazy imports! One of the selling points of lazy imports is CLI code: you only import what a given subcommand actually needs, making the app a bit more snappy.
Events
DjangoCon US Updated Dates
The conference is now August 24-28, 2026 in Chicago, Illinois. The Call for Proposals (CFP) is open until March 16. And Early Bird Tickets are now available!
Sponsored Link 2
Sponsor Django News
Reach 4,300+ highly-engaged and experienced Django developers.
Podcasts
Django Chat #196: Freelancing & Community - Andrew Miller
Andrew is a prolific software developer based out of Cambridge, UK. He runs the solo agency Software Crafts, writes regularly, is a former Djangonaut, and co-founder of the AI banking startup Hamilton Rock.
PyPodcats Episode 11 with Sheena O'Connell
Sheena O'Connell tells us about her journey, the importance of community, good practices for teachers and educators in Python, and organizational psychology. We talk about how to enable a 10x team and how to support the community through a guild of educators.
Django Job Board
This week there is a very rare Infrastructure Engineer position for the PSF.
Infrastructure Engineer at Python Software Foundation
Lead Backend Engineer at TurnTable
Backend Software Developer at Chartwell Resource Group Ltd.
Django Newsletter
Projects
yassi/dj-control-room
The control room for your Django app.
adamchainz/icu4py
Python bindings to the ICU (International Components for Unicode) library (ICU4C).
matagus/awesome-django-articles
π Articles explaining topics about Django like admin, ORM, views, forms, scaling, performance, testing, deployments, APIs, and more!
Sponsorship
Reach 4,300+ Django Developers Every Week
Want to reach developers who actually read what they subscribe to?
Django News is opened by thousands of engaged Django and Python developers every week. A 52% open rate and 15% click rate means your message lands in front of people who pay attention.
Support the newsletter and promote your product, service, event, or job to builders who use Django daily.
Explore sponsorship options: https://django-news.com/sponsorship
This RSS feed is published on https://django-news.com/. You can also subscribe via email.
27 Feb 2026 5:00pm GMT