23 Mar 2026
Django community aggregator: Community blog posts
Built with Django — Weekly Roundup (Mar 16–Mar 23, 2026)
Hey, Happy Monday!
Why are you getting this: You signed up to receive this newsletter on Built with Django. I promised to send you the latest projects and jobs on the site, as well as any other interesting Django content I encounter during the month. If you don't want to receive this newsletter, feel free to unsubscribe anytime.
Sponsor
This issue is sponsored by TuxSEO - your AI content team on auto-pilot.
- Plan and ship SEO content faster
- Generate practical, publish-ready drafts
- Keep your content pipeline moving every week
Projects
- Reckot - Speak and we make it: event management, reinvented.
- Your Cloud Hub - YourCloudHub.ai is a technology and digital solutions company offering IT staffing and outsourcing services, along with software development, website development, and digital marketing.
From the Community
- Django Apps vs Projects Explained: A Complete Production Guide - DEV Community
- Building a Seamless JWT Onboarding Flow with React Router v7 and Django - DEV Community
- How to Show a Waitlist Until Your Wagtail Site Is Ready
Support
You can support this project by using one of the affiliate links below. These are always going to be projects I use and love! No "Bluehost" crap here!
- Buttondown - Email newsletter tool I use to send you this newsletter.
- Readwise - Best reading software company out there. If you want to up your e-reading game, this is definitely for you! It also happens that I work for Readwise. Best company out there!
- Hetzner - IMHO the best place to buy a VPS or a server for your projects. I'll be doing a tutorial on how to use this in the future.
- SaaS Pegasus is one of the best (if not the best) ways to quickstart your Django Project. If you have a business idea but don't want to set up all the boring stuff (Auth, Payments, Workers, etc.) this is for you!
23 Mar 2026 6:00pm GMT
21 Mar 2026
Human.json
I have seen more and more people talk about human.json lately and I think it is a pretty neat idea. From what I can tell it checks all the boxes I would expect from a protocol like this.
The fact that it relies on browser extensions right now makes sense, but might become a limiting factor in the future. Either way, the number of extensions needs to grow beyond the two easy ones and come to mobile as well. I am not sure this will go anywhere beyond a few enthusiastic people, but you never know.
Implementing the protocol was not much work, which is expected considering it only consists of two required values and an optional list of two more values. If you want to add it to your Django based site, I packaged everything up and you can find it on PyPI.
Should you use the package? Eh, that is not an easy question. From a supply chain perspective I would say "no". It is only a few lines of code. But you never know how the protocol will evolve, so things might look more complicated in a month. I will do my best to keep up with the protocol and not ship crypto miners.
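If you decide against the package, serving such a file yourself really is only a few lines. A plain-Python sketch of the idea (the author_name and contact keys are placeholders I invented for illustration; the protocol defines its own two required values and optional list, so check the spec for the real keys):

```python
import json

def build_human_json(author_name, contact, extras=None):
    # Placeholder keys for illustration only - substitute the
    # protocol's actual required values and optional list.
    payload = {"author_name": author_name, "contact": contact}
    if extras is not None:
        payload["extras"] = list(extras)
    return json.dumps(payload, indent=2)

print(build_human_json("Jane Doe", "jane@example.com", extras=["site", "blog"]))
```

In a Django project the returned string could then be served from a single JsonResponse view wired to a /human.json URL.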
I am still not a fan of Python packaging, but I have to admit uv makes it almost bearable, even if it still has its little gotchas.
21 Mar 2026 5:05pm GMT
Wagtail Routable Pages and Layout Configuration

If you are familiar with Wagtail CMS for Django, you know that you can create Wagtail pages and control their content and layout with blocks inside of stream fields. But what if you have entries coming from normal Django models through a routable page? In this article, I will explore how you can control the dynamic layout of a detail view in a routable page.
Routable pages in Wagtail are dynamic pages in your CMS page tree that can have their own URL subpaths and views. You can use them for filtered list and detail views, multi-step forms, multiple formats for the same data, etc. Here I will show you a routable ArticleIndexPage with list and detail views for Article instances, rendering the detail views based on the block layout in a detail_layout stream field.

1. Project Setup
Create a Wagtail project myproject and articles app:
pip install wagtail
wagtail start myproject
cd myproject
python manage.py startapp articles
Add to INSTALLED_APPS in your Django project settings:
INSTALLED_APPS = [
    # ...
    "wagtail.contrib.routable_page",  # required for RoutablePage
    "myproject.apps.articles",
]
2. File Structure
The articles app:
myproject/apps/articles/
├── __init__.py
├── apps.py
├── models.py # Article, Category, ArticleIndexPage
├── blocks.py # All StreamField block definitions
└── admin.py # Register Article and Category in Django admin
The articles templates:
myproject/templates/articles/
├── article_list.html # List view
├── article_detail.html # Detail view
└── blocks/
├── cover_image_block.html
├── description_block.html
└── related_articles_block.html
3. Models
myproject/apps/articles/models.py
Create the Category and Article Django models, and the ArticleIndexPage routable Wagtail page with article list and detail views:
from django.core.paginator import EmptyPage, PageNotAnInteger, Paginator
from django.db import models
from django.shortcuts import get_object_or_404
from django.utils.translation import gettext_lazy as _
from wagtail.admin.panels import FieldPanel, ObjectList, TabbedInterface
from wagtail.contrib.routable_page.models import RoutablePageMixin, path
from wagtail.fields import StreamField
from wagtail.models import Page

from .blocks import article_detail_layout_blocks


class Category(models.Model):
    name = models.CharField(max_length=100, verbose_name=_("name"))
    slug = models.SlugField(unique=True, verbose_name=_("slug"))

    class Meta:
        verbose_name = _("category")
        verbose_name_plural = _("categories")

    def __str__(self):
        return self.name


class Article(models.Model):
    title = models.CharField(max_length=255, verbose_name=_("title"))
    slug = models.SlugField(unique=True, verbose_name=_("slug"))
    category = models.ForeignKey(
        Category,
        null=True,
        blank=True,
        on_delete=models.SET_NULL,
        related_name="articles",
        verbose_name=_("category"),
    )
    cover_image = models.ForeignKey(
        "wagtailimages.Image",
        null=True,
        blank=True,
        on_delete=models.SET_NULL,
        related_name="+",
        verbose_name=_("cover image"),
    )
    description = models.TextField(blank=True, verbose_name=_("description"))
    created_at = models.DateTimeField(auto_now_add=True, verbose_name=_("created at"))

    class Meta:
        verbose_name = _("article")
        verbose_name_plural = _("articles")

    def __str__(self):
        return self.title


class ArticleIndexPage(RoutablePageMixin, Page):
    """
    A single Wagtail page that owns:
    - /articles/ → paginated list of all Articles
    - /articles/<slug>/ → detail view for one Article
    The StreamField is edited once in the Wagtail admin and
    defines the layout for every detail view.
    """
    articles_per_page = models.IntegerField(default=10, verbose_name=_("articles per page"))
    detail_layout = StreamField(
        article_detail_layout_blocks(),
        blank=True,
        use_json_field=True,
        verbose_name=_("detail layout"),
        help_text=_(
            "Configure the layout for all article detail pages. "
            "Add, remove, and reorder blocks to change what appears "
            "on every article detail view."
        ),
    )

    # TabbedInterface gives List View and Detail View their own tabs.
    # promote_panels and settings_panels must be added explicitly here
    # because edit_handler takes full ownership of the admin UI structure.
    edit_handler = TabbedInterface([
        ObjectList(Page.content_panels + [FieldPanel("articles_per_page")], heading=_("List View")),
        ObjectList([FieldPanel("detail_layout")], heading=_("Detail View")),
        ObjectList(Page.promote_panels, heading=_("SEO / Promote")),
        ObjectList(Page.settings_panels, heading=_("Settings")),
    ])

    class Meta:
        verbose_name = _("article index page")
        verbose_name_plural = _("article index pages")

    @path("")
    def article_list(self, request):
        all_articles = Article.objects.select_related("category", "cover_image").order_by("-created_at")
        paginator = Paginator(all_articles, self.articles_per_page)
        page_number = request.GET.get("page")
        try:
            articles = paginator.page(page_number)
        except PageNotAnInteger:
            articles = paginator.page(1)
        except EmptyPage:
            articles = paginator.page(paginator.num_pages)
        return self.render(
            request,
            context_overrides={"articles": articles, "paginator": paginator},
            template="articles/article_list.html",
        )

    @path("<slug:article_slug>/")
    def article_detail(self, request, article_slug):
        article = get_object_or_404(
            Article.objects.select_related("category", "cover_image"),
            slug=article_slug,
        )
        return self.render(
            request,
            context_overrides={"article": article},
            template="articles/article_detail.html",
        )
4. StreamField Blocks
myproject/apps/articles/blocks.py
Create Wagtail stream-field blocks for the cover image, description, and related articles of a given article. Each block can have settings controlling how its content is presented.
from django.utils.translation import gettext_lazy as _
from wagtail import blocks


class CoverImageBlock(blocks.StructBlock):
    aspect_ratio = blocks.ChoiceBlock(
        choices=[
            ("16-9", _("16:9 Widescreen")),
            ("4-3", _("4:3 Standard")),
            ("1-1", _("1:1 Square")),
            ("3-1", _("3:1 Banner")),
        ],
        default="16-9",
        label=_("Aspect ratio"),
        help_text=_("Controls the cropping of the cover image."),
    )

    class Meta:
        template = "articles/blocks/cover_image_block.html"
        icon = "image"
        label = _("Cover Image")


class DescriptionBlock(blocks.StructBlock):
    max_lines = blocks.IntegerBlock(
        min_value=0,
        default=0,
        label=_("Maximum lines"),
        help_text=_("Clamp the description to this many lines. Set to 0 to show all."),
        required=False,
    )

    class Meta:
        template = "articles/blocks/description_block.html"
        icon = "pilcrow"
        label = _("Description")


class RelatedArticlesBlock(blocks.StructBlock):
    sort_order = blocks.ChoiceBlock(
        choices=[
            ("newest", _("Newest first")),
            ("oldest", _("Oldest first")),
            ("title_asc", _("Title A → Z")),
            ("title_desc", _("Title Z → A")),
        ],
        default="newest",
        label=_("Sort order"),
        help_text=_("Order in which related articles are listed."),
    )

    def get_context(self, value, parent_context=None):
        context = super().get_context(value, parent_context=parent_context)
        article = (parent_context or {}).get("article")
        if not article or not article.category_id:
            context["related_articles"] = []
            return context
        from .models import Article

        sort_map = {
            "newest": "-created_at",
            "oldest": "created_at",
            "title_asc": "title",
            "title_desc": "-title",
        }
        context["related_articles"] = (
            Article.objects.select_related("category", "cover_image")
            .filter(category=article.category)
            .exclude(pk=article.pk)
            .order_by(sort_map.get(value["sort_order"], "-created_at"))[:3]
        )
        return context

    class Meta:
        template = "articles/blocks/related_articles_block.html"
        icon = "list-ul"
        label = _("Related Articles")


def article_detail_layout_blocks():
    """
    Returns the list of (name, block) tuples used in ArticleIndexPage.detail_layout.
    Defined as a function so models.py can import it without circular issues.
    """
    return [
        ("cover_image", CoverImageBlock()),
        ("description", DescriptionBlock()),
        ("related_articles", RelatedArticlesBlock()),
    ]
The RelatedArticlesBlock here also has a customized context: we pass a related_articles variable containing up to three other articles of the same category, sorted by the sort order defined in the block.
5. Templates
articles/article_list.html
This will be the template for the paginated article list. Later you could augment it with a search form and filters.
{% extends "base.html" %}
{% load wagtailcore_tags wagtailimages_tags i18n wagtailroutablepage_tags %}
{% block content %}
<main class="article-index">
<h1>{{ page.title }}</h1>
<ul class="article-list">
{% for article in articles %}
<li class="article-card">
{% if article.cover_image %}{% image article.cover_image width-400 as img %}
<img src="{{ img.url }}" alt="{{ article.title }}">
{% endif %}
<h2>
<a href="{% routablepageurl page "article_detail" article.slug %}">{{ article.title }}</a>
</h2>
{% if article.category %}<span class="badge">{{ article.category.name }}</span>{% endif %}
<p>{{ article.description|truncatewords:30 }}</p>
</li>
{% empty %}
<li>{% trans "No articles yet." %}</li>
{% endfor %}
</ul>
{% if articles.has_other_pages %}
<nav class="pagination" aria-label="{% trans 'Article pagination' %}">
{% if articles.has_previous %}
<a href="?page={{ articles.previous_page_number }}">{% trans "← Previous" %}</a>
{% endif %}
<span>{% blocktrans with num=articles.number total=articles.paginator.num_pages %}Page {{ num }} of {{ total }}{% endblocktrans %}</span>
{% if articles.has_next %}
<a href="?page={{ articles.next_page_number }}">{% trans "Next →" %}</a>
{% endif %}
</nav>
{% endif %}
</main>
{% endblock %}
articles/article_detail.html
The detail page uses {% include_block page.detail_layout with article=article page=page %} to pass the article into the context of each block:
{% extends "base.html" %}
{% load i18n wagtailcore_tags wagtailroutablepage_tags %}
{% block content %}
<article class="article-detail">
<header>
<h1>{{ article.title }}</h1>
{% if article.category %}<span class="badge">{{ article.category.name }}</span>{% endif %}
</header>
{% include_block page.detail_layout with article=article page=page %}
<p>
<a href="{% routablepageurl page "article_list" %}">{% trans "← Back to all articles" %}</a>
</p>
</article>
{% endblock %}
articles/blocks/cover_image_block.html
Cover image block would show the article cover image with the aspect ratio set in the block:
{% load wagtailimages_tags %}
{% if article.cover_image %}
<div class="cover-image cover-image--{{ value.aspect_ratio }}">
{% image article.cover_image width-1200 as img %}
<img src="{{ img.url }}" alt="{{ article.title }}">
</div>
{% endif %}
articles/blocks/description_block.html
Description block would hide the article description text overflow based on the max lines set in the block:
<section class="article-description">
<p{% if value.max_lines > 0 %} class="line-clamp" style="-webkit-line-clamp: {{ value.max_lines }};"{% endif %}>
{{ article.description }}
</p>
</section>
articles/blocks/related_articles_block.html
The related articles block would list the related articles as defined in the extra context of the block:
{% load i18n wagtailimages_tags wagtailroutablepage_tags %}
{% if related_articles %}
<section class="related-articles">
<h2>{% trans "Related Articles" %}</h2>
<ul class="related-articles__list">
{% for rel in related_articles %}
<li class="related-card">
{% if rel.cover_image %}{% image rel.cover_image width-400 as img %}
<img src="{{ img.url }}" alt="{{ rel.title }}">
{% endif %}
<div class="related-card__body">
{% if rel.category %}<span class="badge">{{ rel.category.name }}</span>{% endif %}
<h3>
<a href="{% routablepageurl page "article_detail" rel.slug %}">{{ rel.title }}</a>
</h3>
<p>{{ rel.description|truncatewords:20 }}</p>
</div>
</li>
{% endfor %}
</ul>
</section>
{% endif %}
6. Django Admin Registration
articles/admin.py
Let's not forget to register admin views for the categories and articles so that we can add some data there:
from django.contrib import admin

from .models import Article, Category


@admin.register(Category)
class CategoryAdmin(admin.ModelAdmin):
    list_display = ("name", "slug")
    prepopulated_fields = {"slug": ("name",)}


@admin.register(Article)
class ArticleAdmin(admin.ModelAdmin):
    list_display = ("title", "category", "created_at")
    list_filter = ("category",)
    search_fields = ("title", "description")
    prepopulated_fields = {"slug": ("title",)}
7. Migrations and Initial Data
python manage.py makemigrations articles
python manage.py migrate
python manage.py createsuperuser
python manage.py runserver
8. Wagtail Admin Setup
- Open http://localhost:8000/cms/ and log in.
- In the Pages explorer, create an Article Index Page as a child of the root page.
- Set the Slug to articles.
- On the List View tab, set Articles per page (e.g. 24).
- On the Detail View tab, open the Detail Layout StreamField and add blocks in your preferred order:
  - Cover Image - choose an aspect ratio.
  - Description - optionally set a maximum line count to clamp long descriptions.
  - Related Articles - choose the sort order for the three related articles shown.
- Publish the page.
- In the Django admin (/django-admin/), create some Categories and Articles with cover images and descriptions.
- Visit http://localhost:8000/articles/ for the paginated list.
- Click any article to see the detail view rendered using the StreamField layout you configured in step 4.
Final words
Using stream fields, we can render not only editorial content, such as images or rich-text descriptions, but also dynamic content based on values from other models and the context of the given template.
The approach illustrated in this article lets us create Wagtail pages where content editors have the freedom to adjust page layouts or insert blocks, such as ads or info texts, into specific places based on real-time events.
21 Mar 2026 5:00pm GMT
20 Mar 2026
How to Show a Waitlist Until Your Wagtail Site Is Ready

This year, I want to bring my centralized gamified donation platform www.make-impact.org to life (at least technically). Earlier I had the version I was developing separate from the waiting list, but I decided to merge them and have a switch between the waitlist and an early preview.
This allows me to avoid data duplication, create user accounts immediately, and save hosting and maintenance costs.
This guide walks through a pattern that lets you ship a temporary waitlist page while your Wagtail site is still being built, with the ability to show your progress to chosen people. If you are building a Software as a Service (SaaS) or a web platform with Django, this article is for you.
The Concept
A custom start page view will check for a specific cookie value. If it is unset, the visitor will be redirected to a waitlist form at /waitlist/. If it is set, the visitor will be served the Wagtail home page.
All views under development will have a decorator that checks the cookie value and redirects to the start page if it is unset.
There will be a special view at /preview-access/ with a passphrase form that allows the visitor to gain preview access by setting the mentioned cookie. This view will also allow preview access to be deactivated.
These are the steps to implement this:
1. Generate and store two secrets
You will need two secret values, either set manually or generated with a cryptographically secure random generator (e.g. Python's secrets module):
- PREVIEW_ACCESS_PASSPHRASE - the human-readable passphrase typed into the form. Share this with the people who need site access.
- PREVIEW_ACCESS_TOKEN - the opaque random value stored in the cookie. Never exposed to users; only the server compares against it.
>>> import secrets
>>> print(secrets.token_urlsafe(16)) # passphrase
dI5nGNftZOBx8m-r0m6glg
>>> print(secrets.token_hex(32)) # cookie token
c1b7a76e3ad5cbfb1657fa4e9885a3c8baa6a5a869f49a136abd0e873a9be9ee
Add both to environment variables or a secrets file untracked by Git, and load them in the Django project settings:
# myproject/settings/_base.py
PREVIEW_ACCESS_PASSPHRASE = get_secret("PREVIEW_ACCESS_PASSPHRASE")
PREVIEW_ACCESS_TOKEN = get_secret("PREVIEW_ACCESS_TOKEN")
The get_secret() here is my custom function to retrieve a secret from the secrets source.
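For readers without such a helper, one minimal stand-in (my sketch, not the author's actual implementation) reads from environment variables and fails fast when a required secret is missing:

```python
import os

def get_secret(name):
    # Fail at startup when a required secret is absent, instead of
    # silently comparing against None at request time.
    try:
        return os.environ[name]
    except KeyError as exc:
        raise RuntimeError(f"Set the {name} environment variable") from exc
```

Failing loudly here means a misconfigured deployment never accidentally runs with an unset (and therefore trivially matchable) token.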
2. Create the access-control decorator
Create myproject/apps/misc/decorators.py. Every protected view will import from here.
# myproject/apps/misc/decorators.py
from functools import wraps

from django.conf import settings
from django.shortcuts import redirect


def preview_access_required(view_func):
    @wraps(view_func)
    def wrapper(request, *args, **kwargs):
        if request.COOKIES.get("preview_access") == settings.PREVIEW_ACCESS_TOKEN:
            return view_func(request, *args, **kwargs)
        return redirect("misc:home_page")
    return wrapper
The decorator compares the cookie against the opaque unguessable token from settings, so unless the token value is known, a random attacker cannot gain access by setting the cookie manually in DevTools.
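The gate's behaviour is easy to check even without Django's test client. A dependency-free sketch with a stand-in request object and a sentinel string in place of the redirect response (all names here are invented for illustration):

```python
from functools import wraps

PREVIEW_ACCESS_TOKEN = "c1b7-stand-in"  # stands in for settings.PREVIEW_ACCESS_TOKEN

def preview_access_required(view_func):
    # Same shape as the Django decorator above; redirect() is replaced
    # by a sentinel string so the logic can run without Django.
    @wraps(view_func)
    def wrapper(request, *args, **kwargs):
        if request.COOKIES.get("preview_access") == PREVIEW_ACCESS_TOKEN:
            return view_func(request, *args, **kwargs)
        return "redirected to misc:home_page"
    return wrapper

class FakeRequest:
    def __init__(self, cookies):
        self.COOKIES = cookies

@preview_access_required
def protected_view(request):
    return "secret content"

# No cookie, or a guessed cookie value: redirected.
assert protected_view(FakeRequest({})) == "redirected to misc:home_page"
assert protected_view(FakeRequest({"preview_access": "guess"})) == "redirected to misc:home_page"
# The exact token: the view runs.
assert protected_view(FakeRequest({"preview_access": PREVIEW_ACCESS_TOKEN})) == "secret content"
```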
3. Create the passphrase form
Create myproject/apps/misc/forms.py. The form will have a single required password field. Validation will reject any value that does not match the setting.
# myproject/apps/misc/forms.py
from django import forms
from django.conf import settings
from django.utils.translation import gettext_lazy as _


class PreviewAccessForm(forms.Form):
    passphrase = forms.CharField(
        label=_("Passphrase"),
        widget=forms.PasswordInput(
            attrs={"autocomplete": "current-password"}
        ),
        required=True,
    )

    def clean_passphrase(self):
        value = self.cleaned_data["passphrase"]
        if value != settings.PREVIEW_ACCESS_PASSPHRASE:
            raise forms.ValidationError(
                _("Incorrect passphrase.")
            )
        return value
4. Build the cookie toggle view
Point your browser to /preview-access/. When access is off it shows a passphrase form; when access is on it shows a disable button.
# myproject/apps/misc/views.py
from django.conf import settings
from django.shortcuts import redirect, render

from .forms import PreviewAccessForm


def preview_access(request):
    has_access = request.COOKIES.get("preview_access") == settings.PREVIEW_ACCESS_TOKEN
    if request.method == "POST":
        if has_access:
            response = redirect("misc:home_page")
            response.delete_cookie("preview_access")
            return response
        form = PreviewAccessForm(request.POST)
        if form.is_valid():
            response = redirect("misc:home_page")
            response.set_cookie(
                "preview_access",
                settings.PREVIEW_ACCESS_TOKEN,
                httponly=True,
                samesite="Strict",
            )
            return response
    else:
        form = PreviewAccessForm()
    return render(
        request,
        "preview_access/preview_access.html",
        {"has_access": has_access, "form": form},
    )
Key points:
- Disabling never requires the passphrase - the cookie is already proof of prior access.
- The cookie is set with httponly=True (not readable by JavaScript) and samesite="Strict" (not sent on cross-site requests).
- The cookie value is the opaque token, not "1", so it cannot be guessed.
The template renders the passphrase input only when not has_access, and shows field-level errors from the form if the passphrase is wrong.
5. Wrap the Wagtail catch-all with the decorator
Replace the default Wagtail catch-all route handler with a thin wrapper that enforces the same cookie check.
# myproject/apps/misc/views.py
from wagtail.views import serve as wagtail_serve

from myproject.apps.misc.decorators import preview_access_required


@preview_access_required
def serve_wagtail_page(request, path=""):
    return wagtail_serve(request, path)
Without this, a visitor who knows any Wagtail page URL could bypass the gate by typing it directly into the browser.
6. Build the proxy home page view
This view is the only entry point to the site. It decides what every visitor sees first.
# myproject/apps/misc/views.py
from django.conf import settings
from django.shortcuts import redirect
from wagtail.views import serve as wagtail_serve


def home_page(request):
    if request.COOKIES.get("preview_access") == settings.PREVIEW_ACCESS_TOKEN:
        # Serve the Wagtail home page directly
        return wagtail_serve(request, "")
    # Otherwise redirect to the waiting list
    return redirect("waiting_list")
Key point: the waiting_list view and a Wagtail Site and page must exist and be matched to the request domain before wagtail_serve is called.
7. Wire up the URLs
Django project URL rules:
# myproject/urls.py
from django.conf.urls.i18n import i18n_patterns
from django.urls import re_path
from wagtail.coreutils import WAGTAIL_APPEND_SLASH

from myproject.apps.misc import views as misc_views

if WAGTAIL_APPEND_SLASH:
    wagtail_serve_pattern = r"^((?:[\w\-]+/)*)$"
else:
    wagtail_serve_pattern = r"^([\w\-/]*)$"

urlpatterns += i18n_patterns(
    # ... all your other app URLs above ...
    # Catch-all - must be last
    re_path(
        wagtail_serve_pattern,
        misc_views.serve_wagtail_page,
        name="wagtail_serve",
    ),
)
The misc app URLs:
# myproject/apps/misc/urls.py
from django.urls import path

from . import views

app_name = "misc"

urlpatterns = [
    path("", views.home_page, name="home_page"),
    path("preview-access/", views.preview_access, name="preview_access"),
]
The waiting_list app URLs:
# myproject/apps/waiting_list/urls.py
from django.urls import path

from . import views

urlpatterns = [
    path("waitlist/", views.show_waiting_list_form, name="waiting_list"),
]
8. Protect every other app view
Import and apply @preview_access_required to every view that belongs to the real site. Class-based views can be wrapped at assignment time:
from myproject.apps.misc.decorators import preview_access_required


# Function-based view
@preview_access_required
def event_list(request):
    ...


# Class-based view
event_list = preview_access_required(
    EventListView.as_view()
)
Waiting-list views, API views, social authentication views, and static/legal pages (/imprint/, /privacy/, etc.) must not receive this decorator - they need to remain publicly accessible.
Final words
You get a lot of benefits from this setup. The waitlist measures demand for your website while you are still building. Invited test users can evaluate your progress at any time. While you are developing the website, you do not necessarily need multiple servers. Launching later is also easier - no hassle or delays with domain IP updates and SSL certificates.
20 Mar 2026 5:00pm GMT
Django News - Sunsetting Jazzband - Mar 20th 2026
News
Sunsetting Jazzband
After more than a decade maintaining 80+ Python projects, Jazzband is winding down as AI-generated spam and long-standing sustainability challenges make its open, shared-maintenance model no longer viable.
Astral to join OpenAI
Astral, creators of Ruff and uv, are joining OpenAI's Codex team to push the future of AI-powered Python development while continuing to support their open source tools.
Wagtail CMS News
Wagtail Security team no longer accepts GPG-encrypted emails
Wagtail's security team has dropped GPG-encrypted email support, citing zero real-world use and modern encryption making it unnecessary while simplifying their workflow.
Updates to Django
Today, "Updates to Django" is presented by Raffaella from Djangonaut Space! 🚀
Last week we had 18 pull requests merged into Django by 15 different contributors - including a first-time contributor! Congratulations to dcsid for having their first commits merged into Django - welcome on board!
The undocumented get_placeholder method of Field is deprecated in favor of the newly introduced get_placeholder_sql method, which has the same input signature but is expected to return a two-element tuple composed of an SQL format string and a tuple of associated parameters. This method should now expect to be provided expressions meant to be compiled via the provided compiler argument. (#36727)
Django Newsletter
Sponsored Link 1
The deployment service for developers and teams.
Articles
DjangoCon US Talks I'd Like to See 2026 Edition
A curated wishlist of timely, thought-provoking DjangoCon US 2026 talk ideas, from Python's future and deployment wins to Rust, LLMs, and real-world team productivity.
Defense in Depth: A Practical Guide to Python Supply Chain Security
A practical, defense-in-depth guide to securing Python's supply chain, covering everything from linting and dependency pinning to SBOMs, vulnerability scanning, and trusted publishing.
Python Unplugged on PyTV Recap
A behind-the-scenes post on this first-ever digital Python conference that featured a host of Django speakers.
django-security-label: A third-party package to anonymize data in your models
Define data masking rules directly on your Django models and let PostgreSQL enforce anonymization automatically, keeping sensitive data out of your app layer by design.
Djangonaut Diaries: Week 1, part 2 - Creating and debugging a Django project - DEV Community
A hands-on guide to spinning up a local Django project, generating realistic test data, and using VS Code's debugger to step into Django internals and understand how admin delete views work.
Typing Your Django Project in 2026
Typing Django in 2026 is still a tradeoff between slower, accurate mypy + django-stubs and faster tools that struggle with Django's dynamic magic, though native typing support may finally be on the horizon.
Python 3.15's JIT is now back on track
Python 3.15's once-struggling JIT is finally delivering real speedups, thanks to a scrappy, community-driven effort and a few surprisingly lucky design bets.
Thoughts on OpenAI acquiring Astral and uv/ruff/ty
Simon Willison provides some timely insights on the recent acquisition making waves in our community.
OpenAI Acquiring Astral: A 4th Option for Funding Open Source
Thoughts on the three traditional ways to fund open source and the new fourth option (VC funding) that is currently making waves.
Events
How DjangoCon US Selects Talk Proposals
A behind-the-scenes look at how DjangoCon US turns anonymous proposals and community reviews into a balanced, inclusive conference lineup.
PyCon US 2026 Conference Schedule is live!
PyCon US 2026's Conference Schedule is live!
Podcasts
Django Chat #198: PyCon US 2026 - Elaine Wong & Jon Banafato
Elaine and Jon are the chair/co-chair respectively of PyCon US, the largest Python conference in North America, happening this May in Long Beach, CA. We discuss what to expect at the conference, new additions from last year, tips on where to stay, and generally how to maximize your PyCon experience.
Django Job Board
Two standout Python roles this week include a client-facing Solutions Architect position at JetBrains and an Infrastructure Engineer opening at the Python Software Foundation.
Solutions Architect - Python (Client-facing) at JetBrains
Infrastructure Engineer at Python Software Foundation
Django Forum
Improve free-threading performance - Django Internals
A CPython core developer is proposing small but impactful changes to help Django scale better under free-threaded Python, sparking early collaboration on tackling shared state, caching, and lock contention issues.
Projects
codingjoe/django-mail-auth
Django authentication via login URLs, no passwords required.
duriantaco/skylos
Yet another static analysis tool for Python codebases written in Python that detects dead code.
This RSS feed is published on https://django-news.com/. You can also subscribe via email.
20 Mar 2026 3:00pm GMT
19 Mar 2026
OpenAI Acquiring Astral: A 4th Option for Funding Open Source
Thoughts on the recent acquisition and what it portends for open source software.
19 Mar 2026 10:56am GMT
18 Mar 2026
PyCon US 2026 - Elaine Wong & Jon Banafato
🔗 Links
- PyCon US website
- Volunteer mailing list
- Elaine on LinkedIn and Jon's personal site
🎥 YouTube
Sponsor
This episode is brought to you by Six Feet Up, the Python, Django, and AI experts who solve hard software problems. Whether it's scaling an application, deriving insights from data, or getting results from AI, Six Feet Up helps you move forward faster.
See what's possible at sixfeetup.com.
18 Mar 2026 10:00pm GMT
Tombi, pre-commit, prek and uv.lock
In almost all my Python projects, I'm using pre-commit to handle/check formatting and linting. The advantage: pre-commit is the only tool you need to install. Pre-commit itself reads its config file and installs the formatters and linters you defined in there.
Here's a typical .pre-commit-config.yaml:
default_language_version:
  python: python3
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v6.0.0
    hooks:
      - id: trailing-whitespace
      - id: end-of-file-fixer
      - id: check-yaml
        args: [--allow-multiple-documents]
      - id: check-toml
      - id: check-added-large-files
  - repo: https://github.com/astral-sh/ruff-pre-commit
    # Ruff version.
    rev: v0.15.6
    hooks:
      # Run the linter.
      - id: ruff
        args: ["--fix"]
      # Run the formatter.
      - id: ruff-format
  - repo: https://github.com/tombi-toml/tombi-pre-commit
    rev: v0.9.6
    hooks:
      - id: tombi-format
        args: ["--offline"]
      - id: tombi-lint
        args: ["--offline"]
The "tombi" at the end might be a bit curious. There's already the built-in "check-toml" toml syntax checker, right? Well, tombi also does formatting and schema validation. And in a recent project, I handled configuration through toml files.
It was for a Django website where several geographical maps were shown, each with its own title, description, legend yes/no, etcetera. I made up a .toml configuration format so that a colleague could configure all those maps without needing to deal with the python code. I created a json schema as format specification (yes, json is funnily used for that purpose). With tombi, I could make sure the config files were valid.
Oh, and tombi has an LSP plugin, so my colleague got autocomplete and syntax help out of the box. Nice.
I'm also using uv a lot. That generates a uv.lock file, in .toml format, with all the version pins. It is a toml file, but without the .toml extension. So pre-commit ignored it. Until suddenly it started complaining about the indentation. But only in a github action, not locally.
Note: the complaint about the indentation is probably correct, as there's an issue in the uv bugtracker about changing the indentation from 4 to 2 in the lockfile.
The weird thing for me was that I pin the versions of the plugins. So the behaviour locally and on github should be the same. Some observations:
- Running tombi from the commandline on uv.lock resulted in re-formatting to two spaces, whatever the tombi version.
- Pre-commit locally did not re-format the file, but pre-commit on the server did.
- I tried it with the new rust-based alternative for pre-commit, prek (see https://github.com/j178/prek), which did re-format uv.lock.
Some further debugging showed that pre-commit was actually skipping the uv.lock file. But apparently not on github. I did some searching in pre-commit's source code and tombi's pre-commit hook definition. The only relevant part there was types: [toml]. So somehow pre-commit has a definition of what a toml file is. But I couldn't find anything.
Until I spotted that pre-commit uses identify as the means to detect file types. (Looks like a handy library, btw!). And that project had a change a couple of weeks ago that identifies uv.lock as a toml file!
- My colleague updated his pre-commit installation and yes: uv.lock was getting re-formatted.
- So: github actions had a newer version than we had.
- Weird, as I just updated my python tool install this morning. Ah: I had installed it with homebrew instead of uv tool, which is why it was still older.
Anyway: small mystery solved.
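For anyone puzzling over a similar mismatch, the core mechanism is just a filename-to-tags lookup. This hypothetical sketch (not the real identify library's implementation; the tag tables here are made up) mimics how a newer file-type table can suddenly pull a file into a hook's types: [toml] filter even when every hook version is pinned:

```python
# Hypothetical mimic of filename-based file-type tagging (NOT the real
# "identify" library): a table maps special filenames to type tags, and
# a hook restricted to `types: [toml]` runs on any file tagged "toml".

# Older table: uv.lock is unknown, so it never gets a "toml" tag.
OLD_NAMES = {"Pipfile": {"toml"}, "poetry.lock": {"toml"}}
# Newer table: uv.lock was added as a TOML file.
NEW_NAMES = {**OLD_NAMES, "uv.lock": {"toml"}}

EXTENSIONS = {".toml": {"toml"}, ".yaml": {"yaml"}}

def tags_for(filename, known_names):
    """Return the set of type tags for a filename."""
    if filename in known_names:
        return known_names[filename]
    for ext, tags in EXTENSIONS.items():
        if filename.endswith(ext):
            return tags
    return set()

def hook_runs_on(filename, known_names, types=frozenset({"toml"})):
    """Would a hook restricted to `types` pick this file up?"""
    return types <= tags_for(filename, known_names)

# Same file, same pinned hook config -- a newer tag table flips the result:
print(hook_runs_on("uv.lock", OLD_NAMES))  # False: the hook skips uv.lock
print(hook_runs_on("uv.lock", NEW_NAMES))  # True: the hook reformats uv.lock
```

That's exactly the symmetry break in the story: identical pinned hook versions on both machines, but a different version of the file-type table underneath.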
18 Mar 2026 4:00am GMT
16 Mar 2026
Django community aggregator: Community blog posts
Built with Django — Weekly Roundup (Mar 09–Mar 16, 2026)
Hey, Happy Monday!
Why are you getting this: *You signed up to receive this newsletter on Built with Django. I promised to send you the latest projects and jobs on the site as well as any other interesting Django content I encountered during the month. If you don't want to receive this newsletter, feel free to unsubscribe anytime.
Sponsor
This issue is sponsored by TuxSEO - your AI content team on auto-pilot.
- Plan and ship SEO content faster
- Generate practical, publish-ready drafts
- Keep your content pipeline moving every week
Projects
- Table of Contents Generator - Generate a clickable Table of Contents for any PDF in seconds. Free, private, no account required. Works with reports, eBooks, manuals, and papers.
Jobs
From the Community
- Django Admin: Building a Production-Ready Back Office Without Starting From Scratch | Medium by Anas Issath
- Beginner's Guide to Open Source Contribution | Djangonaut Space 2026 - DEV Community
- Django Tutorial - GeeksforGeeks
Support
You can support this project by using one of the affiliate links below. These are always going to be projects I use and love! No "Bluehost" crap here!
- Buttondown - Email newsletter tool I use to send you this newsletter.
- Readwise - Best reading software company out there. If you want to up your e-reading game, this is definitely for you! It just so happens that I work for Readwise. Best company out there!
- Hetzner - IMHO the best place to buy a VPS or a server for your projects. I'll be doing a tutorial on how to use this in the future.
- SaaS Pegasus is one of the best (if not the best) ways to quickstart your Django Project. If you have a business idea but don't want to set up all the boring stuff (Auth, Payments, Workers, etc.) this is for you!
16 Mar 2026 6:00pm GMT
15 Mar 2026
Django community aggregator: Community blog posts
How I deploy my projects to a single VPS with Gitea, NGINX and Docker
Hello everyone 👋
A few weeks ago, the team behind Jmail (a Gmail-styled interface for browsing the publicly released Epstein files) shared that they had racked up a $46,485 bill on Vercel. The site had gone viral with ~450 million pageviews, and Vercel's pricing structure turned that into a five-figure invoice. Vercel's CEO ended up covering the bill personally, which is nice, but not exactly a scalable solution 😅
When I saw that story, my first thought was: this is an efficiency problem. Jmail is essentially a search interface on top of mostly static content. An SRE on Hacker News mentioned they handle 200x Jmail's request load on just two Hetzner servers. The whole thing could have been served from a moderately sized VPS for a fraction of the cost.
That got me thinking about my own setup. I run everything on a single VPS: my blog, my side projects, my git server, analytics, a wiki, a forum, a secret sharing tool, and more. The whole thing is held together by NGINX, Gitea, some bash scripts, and Docker. No Kubernetes, no Terraform, no CI/CD platform with a $500/month bill. Just a cheap VPS, some config files, and a deployment flow that's simple enough that I can fix it from my phone at the beach (I've written about that before).
I get asked about my deployment setup more often than I expected, so I figured I'd write it all down. Let me walk you through the whole thing.
The VPS
I'm running a Hetzner Cloud CPX21 in Nuremberg, Germany. Here are the specs:
| Spec | Value |
|---|---|
| vCPUs | 3 |
| RAM | 4 GB |
| Disk | 80 GB SSD |
| OS | Ubuntu |
| Price | ~€7-8/month |
The CPX21 is one of Hetzner's shared vCPU instances. It's cheap, reliable, and more than enough for what I need. I'm usually sitting at around ~10% CPU and ~2GB RAM, so there's plenty of headroom.
I set up the VPS manually. No Ansible, no configuration management, just plain old SSH and installing things by hand. I know, I know, "infrastructure as code" and all that. But for a single server that I manage myself, the overhead of automating the setup isn't worth it. If the server dies, I can set it up again in a couple of hours and restore from backups.
What's running on it
Here's everything running on this single VPS:
Bare metal (directly on the server)
| Service | Purpose |
|---|---|
| Gitea | Self-hosted git server |
| NGINX | Web server / reverse proxy |
| Certbot | SSL/TLS certificates |
| PHP-FPM | For WordPress sites |
| DokuWiki | Personal wiki |
| fail2ban | Brute force protection |
| UFW | Firewall |
| A couple WordPress sites | Various projects |
Docker
| Service | Purpose |
|---|---|
| ntfy | Push notifications |
| shhh | Secret sharing |
| SearXNG | Privacy-respecting search engine |
| WireGuard | VPN |
| phpBB | YAMS community forum |
| Umami | Privacy-respecting analytics |
| Gitea Actions runner | CI/CD runner |
| Watchtower | Automatic Docker image updates |
Static sites (Hugo, served by NGINX)
| Site | Purpose |
|---|---|
| rogs.me | This blog! |
| montevideo.restaurant | Restaurant directory |
| yams.media | YAMS documentation site |
That's a lot of stuff for a 4GB VPS. But static sites are basically free in terms of resources, and the Docker services are all lightweight. The heaviest things are probably Gitea and the WordPress sites, and even those barely register.
The web server: NGINX
Every site and service gets its own NGINX config file in /etc/nginx/conf.d/. One file per site, nice and clean. No sites-available / sites-enabled symlink dance.
Here's what a typical config looks like for one of my Hugo sites:
server {
root /var/www/rogs.me;
index index.html;
server_name rogs.me;
location / {
try_files $uri $uri/ =404;
}
listen 443 ssl; # managed by Certbot
ssl_certificate /etc/letsencrypt/live/rogs.me/fullchain.pem;
ssl_certificate_key /etc/letsencrypt/live/rogs.me/privkey.pem;
include /etc/letsencrypt/options-ssl-nginx.conf;
ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem;
}
server {
if ($host = rogs.me) {
return 301 https://$host$request_uri;
}
server_name rogs.me;
listen 80;
return 404;
}
Nothing fancy. Serve files from /var/www/rogs.me, redirect HTTP to HTTPS, done. The SSL bits are all managed by Certbot (more on that later).
For Docker services, the config looks slightly different because NGINX acts as a reverse proxy:
server {
server_name analytics.rogs.me;
location / {
proxy_pass http://localhost:3000;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
}
listen 443 ssl; # managed by Certbot
# ... SSL config same as above
}
Same pattern: one file per service, NGINX handles SSL termination, and proxies to whatever port the Docker container exposes on localhost.
SSL/TLS with Let's Encrypt
All certificates come from Let's Encrypt via Certbot. I installed it with apt and used the NGINX plugin:
sudo apt install certbot python3-certbot-nginx
sudo certbot --nginx -d rogs.me
Certbot modifies the NGINX config automatically to add the SSL directives (that's why you see those # managed by Certbot comments).
Certificates auto-renew daily at 3 AM via a cron job:
0 3 * * * certbot renew -q
The -q flag keeps it quiet: no output unless something goes wrong. Certbot is smart enough to only renew certificates that are close to expiring, so running it daily is fine.
Self-hosted git with Gitea
I use Gitea as my primary git server. It runs bare metal on the VPS (not in Docker) and lives at git.rogs.me.
Why Gitea instead of just using GitHub? I want to own my git infrastructure. GitHub is great for collaboration, but I like having control over where my code lives. If GitHub goes down or decides to change their terms, my repos are safe on my own server.
That said, I mirror everything to both GitHub and GitLab so other people can collaborate, open issues, and submit PRs. Best of both worlds: I own the primary, and the mirrors handle the social coding side.
Gitea Actions
Gitea has a built-in CI/CD system called Gitea Actions that's compatible with GitHub Actions workflows. The runner is the official gitea/act_runner Docker image, running on the same VPS. Pretty vanilla setup, no custom configuration.
This is the core of my deployment pipeline. Every time I push to master, Gitea Actions picks up the workflow and deploys the site.
Deploying Hugo sites
This is where it all comes together. All three of my Hugo sites follow the exact same deployment pattern. Here's the flow:
┌──────────┐ push ┌──────────┐ Gitea Actions ┌──────────┐
│ Local │────────────────▶ │ Gitea │ ────────────────────▶│ Runner │
│ machine │ │(git.rogs)│ │ (Docker) │
└──────────┘ └──────────┘ └────┬─────┘
│
SSH into same VPS
│
▼
┌──────────┐
│ VPS │
│ git pull │
│ build.sh │
└────┬─────┘
│
Hugo builds to
/var/www/domain/
│
▼
┌──────────┐
│ NGINX │
│ serves │
└──────────┘
Yes, the Gitea Actions runner SSHes into the same server it's running on. I know that's a bit redundant, but I designed it this way on purpose: if I ever move my hosting somewhere else (or switch back to GitHub Actions), the workflow doesn't need to change. The SSH target is just a secret, so I swap an IP address and everything keeps working.
The Gitea Actions workflow
Here's the workflow file that lives in .gitea/workflows/deploy.yml in each repo:
name: deploy
on:
push:
branches:
- master
jobs:
deploy:
runs-on: ubuntu-latest
steps:
- name: Deploy via SSH
uses: appleboy/ssh-action@v1
with:
host: ${{ secrets.SSH_HOST }}
username: ${{ secrets.SSH_USER }}
key: ${{ secrets.SSH_PRIVATE_KEY }}
port: ${{ secrets.SSH_PORT }}
script: |
cd repo && git stash && git pull --force origin master && ./build.sh
It's beautifully simple:
- Push to master triggers the workflow
- The runner uses appleboy/ssh-action to SSH into the server
- On the server: stash any local changes, pull the latest code, and run the build script
The git stash is there as a safety net. The WebP conversion in the build script modifies tracked files (more on that in a second), so without the stash, git pull would complain about a dirty working tree.
All four secrets (SSH_HOST, SSH_USER, SSH_PRIVATE_KEY, SSH_PORT) are configured in Gitea's repository settings. The SSH key has access to the server but is locked down to only what the deployment needs.
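The post doesn't show how the deploy key is actually restricted, but one common way to do it (shown here as an illustration only; the paths and repo location are assumptions, not the author's exact setup) is a forced command in the deploy user's ~/.ssh/authorized_keys on the server. The key then runs only the deploy sequence, no matter what the client requests:

```
# /home/deploy/.ssh/authorized_keys (illustrative -- paths are assumed)
# The forced command runs regardless of what the client asks for; the
# extra options disable port/agent/X11 forwarding and PTY allocation.
command="cd /home/deploy/repo && git stash && git pull --force origin master && ./build.sh",no-port-forwarding,no-agent-forwarding,no-X11-forwarding,no-pty ssh-ed25519 AAAA... deploy@ci
```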
The build script
Every Hugo site has a build.sh in the repo root. Here's the one for this blog:
#!/bin/bash
# Convert all images to WebP for better performance
for file in $(git ls-files --others --cached --exclude-standard \
| grep -v '.git' \
| grep -E '\.(png|jpg|jpeg)$'); do
cwebp -lossless "$file" -o "${file%.*}.webp"
done
# Update all references from png/jpg/jpeg to webp
for tracked_file in $(git ls-files --others --cached --exclude-standard \
| grep -v '.git'); do
sed -i 's/\.png/.webp/g' "$tracked_file"
sed -i 's/\.jpg/.webp/g' "$tracked_file"
sed -i 's/\.jpeg/.webp/g' "$tracked_file"
done
# Build the site
hugo -s . -d /var/www/rogs.me/ --minify --cacheDir $PWD/hugo-cache
Three things happen here:
- Image optimization: Every PNG, JPG, and JPEG gets converted to WebP using cwebp (lossless mode, so no quality loss). WebP files are significantly smaller than their originals.
- Reference rewriting: All file references get updated from .png/.jpg/.jpeg to .webp. This is why we need git stash in the workflow; this step modifies tracked files.
- Hugo build: Generates the static site with minification enabled and outputs it directly to /var/www/rogs.me/. NGINX is already configured to serve from that directory, so the site is live immediately.
The --cacheDir flag keeps Hugo's build cache in the repo directory, which speeds up subsequent builds.
Each site's build.sh is essentially identical, just with a different output path (montevideo.restaurant, yams.media, etc.).
Variations across sites
While the pattern is the same, there are small differences:
- yams.media has a two-job workflow: a test_build job runs Hugo in a Docker container first to make sure the build succeeds, and only then does the deploy job run. This is because the YAMS docs site has more contributors, so I want to catch build errors before they hit production.
- yams.media also uses the --cleanDestinationDir and --gc flags for a cleaner build output.
Docker services and Watchtower
Most of my non-static services run in Docker with docker-compose. Each service has its own directory in /opt/:
/opt/
├── analytics.rogs.me/ # Umami
│ └── docker-compose.yml
├── ntfy/
│ └── docker-compose.yml
├── shhh/
│ └── docker-compose.yml
├── searx/
│ └── docker-compose.yml
└── ...
For updates, I use Watchtower. It runs as a Docker container itself and periodically checks if there are newer images available for my running containers. If there are, it pulls the new image, stops the old container, and starts a new one with the same configuration.
version: "3"
services:
watchtower:
image: containrrr/watchtower
volumes:
- /var/run/docker.sock:/var/run/docker.sock
restart: unless-stopped
Is this a bit risky? Sure. An automatic update could break something. But in practice, it hasn't failed me once, and the services I'm running are stable enough that breaking changes in Docker images are rare. For a personal setup, the convenience of never having to manually update containers is worth the small risk.
Security
I'm not running a bank here, but I do take basic security seriously:
- UFW (Uncomplicated Firewall): Only NGINX ports (80, 443) and SSH are open. Everything else is blocked.
- fail2ban: Watches SSH logs and bans IPs after too many failed login attempts. Essential if your SSH port is exposed to the internet.
- SSH keys only: Password authentication is disabled. If you don't have the key, you're not getting in.
- Let's Encrypt everywhere: Every site and service gets HTTPS. No exceptions.
- Docker services on localhost: All Docker containers bind to localhost. They're only accessible through the NGINX reverse proxy, which handles SSL termination.
# Quick UFW setup
sudo ufw default deny incoming
sudo ufw default allow outgoing
sudo ufw allow 'Nginx Full'
sudo ufw allow ssh
sudo ufw enable
DNS
All my domains use Cloudflare for DNS. But only DNS for most of them. I'm not using Cloudflare's CDN or proxy features on my main sites. The DNS records point directly to my VPS IP with the proxy toggle set to "DNS only" (the grey cloud, not the orange one).
Why Cloudflare for DNS? Two reasons. First, it's free, fast, and the dashboard is easy to use. Second, and more importantly: if something goes wrong, I can switch to using Cloudflare's full proxy and DDoS protection with the flick of a button. Just toggle the grey cloud to orange and you're behind Cloudflare's network instantly.
I've already had to do this once. forum.yams.media (the YAMS community forum) was getting DDoSed and swarmed by bots constantly. Flipping that toggle to orange solved the problem immediately. The rest of my sites run without Cloudflare's proxy because they don't need it, but knowing I can turn it on in seconds gives me peace of mind.
Backups
This is the part that most people skip. Don't be most people.
My backup strategy has two stages:
┌─────────────┐ 11 PM cron ┌───────────────────┐
│ VPS │ ───────────────▶│ /home/backups/ │
│ (services) │ tar + GPG │ (encrypted .gpg) │
└─────────────┘ └─────────┬─────────┘
│
midnight cron
(SSH pull)
│
▼
┌──────────────────┐
│ Home Server │
│ (NAS + S3) │
└──────────────────┘
Stage 1: Backup on the VPS (11 PM)
Every night at 11 PM, a series of cron jobs run backup scripts for each service. Each script follows the same pattern:
#!/bin/bash
BACKUP_DIR="/home/backups/servicename"
TARGET_DIR="/path/to/service"
DATE=$(date +%Y-%m-%d-%s)
BACKUP_FILE="$BACKUP_DIR/backup-servicename-$DATE.tar.zst"
ENCRYPTED_FILE="$BACKUP_FILE.gpg"
LOG_FILE="/var/log/backup_servicename.log"
GPG_RECIPIENT="your-email@example.com"
log_message() {
echo "$(date +'%Y-%m-%d %H:%M:%S') - $1" | tee -a "$LOG_FILE"
}
log_message "=== Starting backup ==="
mkdir -p "$BACKUP_DIR"
# For Docker services: stop containers first
docker compose stop
# Create compressed archive
tar -caf "$BACKUP_FILE" -C "$TARGET_DIR" .
# Encrypt with GPG
gpg --encrypt --armor -r "$GPG_RECIPIENT" -o "$ENCRYPTED_FILE" "$BACKUP_FILE"
rm -f "$BACKUP_FILE" # Remove unencrypted version
# For Docker services: restart containers
docker compose up -d
log_message "=== Backup completed ==="
Key points:
- Compression: I use tar.zst (Zstandard) for compression. It's faster than gzip and produces smaller files.
- Encryption: Every backup gets GPG-encrypted before it touches the network. Even if someone gets access to the backup files, they're useless without my private key.
- Docker services: For services running in Docker, the script stops the containers before backing up to ensure data consistency, then starts them again. This causes a brief downtime (usually a few seconds), which is fine for personal services at 11 PM.
- Database dumps: For services with databases (like Gitea, which uses MySQL), the script dumps the database separately with mysqldump before creating the archive.
- Logging: Every step is logged to /var/log/, so I can check if something went wrong.
Stage 2: Pull to home server (midnight)
At midnight, my home server SSHes into the VPS and pulls all the encrypted backup files to my local NAS. From there, they also get pushed to an S3 bucket.
This gives me the classic 3-2-1 backup strategy: 3 copies of the data (VPS, NAS, S3), on 2 different media types, with 1 offsite copy. If Hetzner's datacenter burns down, I have everything locally. If my house burns down, I have everything in S3.
Monitoring
I run Uptime Kuma on my home server to monitor all my services. It checks every site and service periodically and sends me a notification (via ntfy, naturally) if something goes down.
It's not fancy, but it works. I've caught a few issues before anyone else noticed them, which is the whole point.
The big picture
Here's what the whole setup looks like:
┌─────────────────────────────────────────────────────────┐
│ Hetzner CPX21 │
│ │
│ ┌─────────┐ ┌──────────────────────────────────┐ │
│ │ Gitea │ │ NGINX │ │
│ │ Actions │ │ ┌──────────┐ ┌──────────────┐ │ │
│ │ Runner │ │ │ Static │ │ Reverse │ │ │
│ │ (Docker) │ │ │ sites │ │ proxy to │ │ │
│ └────┬─────┘ │ │/var/www/ │ │ Docker svcs │ │ │
│ │ │ └──────────┘ └──────────────┘ │ │
│ │ SSH │ ▲ │ │ │
│ │ └────────┼──────────────┼──────────┘ │
│ │ │ │ │
│ ▼ │ ▼ │
│ ┌─────────┐ ┌───────┐ ┌───────────┐ │
│ │ Git │──build──│ Hugo │ │ Docker │ │
│ │ repos │ │ sites │ │ services │ │
│ └─────────┘ └───────┘ └───────────┘ │
│ │
│ ┌─────────────┐ ┌──────────┐ ┌────────────┐ │
│ │ Gitea │ │ Certbot │ │ fail2ban │ │
│ │ (bare metal)│ │ (SSL) │ │ + UFW │ │
│ └─────────────┘ └──────────┘ └────────────┘ │
└─────────────────────────────────────────────────────────┘
Conclusion
The whole philosophy here is simplicity. There's no orchestration tool, no container registry, no deployment platform. It's just:
- Push code to Gitea
- A workflow SSHes into the server
- Git pull + bash script builds the site
- NGINX serves it
Could I make this more sophisticated? Sure. Could I use Ansible to manage the server config, or Kubernetes to orchestrate the containers, or a proper CI/CD platform with build artifacts and rollbacks? Absolutely. But for a personal setup that hosts a blog, some side projects, and a handful of services, this is more than enough.
The setup has been running for years with minimal maintenance. The most time I spend on it is writing backup scripts for new services and adding NGINX configs when I deploy something new. Everything else is automated: deployments, SSL renewals, Docker updates, backups.
If you're thinking about self-hosting your projects, my advice is: start simple. A VPS, NGINX, and a bash script can take you surprisingly far. You can always add complexity later if you need it, but in my experience, you probably won't.
If you have questions about any part of this setup, feel free to reach out on the Contact page. I'm always happy to help people get started with self-hosting.
See you in the next one!
15 Mar 2026 5:00am GMT
14 Mar 2026
Django community aggregator: Community blog posts
10 Years of Jazzband
Jazzband is sunsetting. Before moving on, here's a look at what 10 years of cooperative coding actually looked like.
By the numbers
Five years in, we had about 1,350 members and 55 projects. Here's where things stand now:
Members
- 3,135 total members over the years
- 2,133 members currently - a 68% retention rate over 10 years
- New members every year, peaking at 424 in 2022
- Members who left stayed an average of 510 days
- Based on GitHub profiles (only ~28% of members list a location), members from at least 56 countries across every continent but Antarctica - 36% Europe, 30% Asia, 22% North America, 7% South America, 3% Africa, 1% Oceania. Real numbers are likely higher. And given how widely Python is used in research, someone in Antarctica has probably pip-installed a Jazzband project at some point
Projects
- 84 projects total, 71 still active
- 13 projects left again over the years
- ~93,000 GitHub stars across all projects
- ~16,000 forks
Activity
- ~43,800 commits across all repositories
- ~15,600 pull requests
- ~12,200 issues
Releases
- 1,429 package uploads via Jazzband's release pipeline
- 1,312 releases to PyPI across 56 projects and 390 versions
- 281 MB of release artifacts total
- First upload in November 2017, most recent in March 2026
Project teams
- 470 project team memberships
- 105 lead roles across 81 project leads
- Most prolific leads: aleksihakli, hramezani, claudep, and camilonova each maintained 4 projects
How Jazzband was actually used
The numbers above only tell part of the story. Here's what's more interesting.
Not everyone used the release pipeline
20 active projects never shipped a single release through it. Projects like Watson (2,515 stars), django-rest-knox (1,255), and django-admin2 (1,187) used Jazzband as a collaborative home - for shared access, triage, and maintenance - not for releases. The pipeline was useful for the projects that used it, but it wasn't what made Jazzband work for most people.
Old projects stayed alive
django-avatar's repo was created in 2008 and shipped its most recent Jazzband release in January 2026 - a 17-year-old repo still getting releases. django-axes (2009), sorl-thumbnail (2010), django-constance (2010), and 18 other projects created before 2015 were all still getting releases in 2025 or 2026. Jazzband kept old projects alive long after their original authors moved on. That was the whole point.
Release cadence varied wildly
django-axes had the most active release cadence: 253 release files across 127 versions, peaking at 28 versions in 2019 - roughly one every 13 days. pip-tools was second at 138 releases / 69 versions.
Meanwhile, 7 active projects have no team members at all - django-permission, django-mongonaut, and five others. Nobody was actively working on them, but they had a home and stayed installable.
pip-tools was its own community
With 69 team members it dwarfed every other project (the next largest, djangorestframework-simplejwt, had 24). It was basically a sub-organization within Jazzband. And two projects joined as recently as 2024 (django-tagging, django-summernote) with single-digit stars and zero releases - people were still finding value in the model right up to the end.
The open access model was genuinely controversial
When django-newsletter transferred in, its author @dokterbob worried that giving 800 members write access would "dissolve the responsibility so much that it might actually reduce participation." I wrote a long reply defending the open model.
An earlier project, Collectfast, actually left Jazzband after a member pushed directly to master without review - merging commits the author had been holding off on. That incident led to real discussions about code review processes, branch protection, and what "open access" should actually mean. The tension between openness and control was never fully resolved.
Moderation was another solo job
Over the years I had to block 10 accounts from the GitHub organization - first crypto spammers who joined just to be in the org, then community conflicts that needed real moderation decisions, and finally the AI-driven spam that made the open model untenable. None of that is unusual for an organization this size, but it all went through one person.
The onboarding bottleneck
Every transferred project got an onboarding checklist - a webhook automatically opened an "Implement Jazzband guidelines" issue with TODOs like fixing links, adding badges, setting up CI, adding jazzband to PyPI, deciding on a project lead. 41 projects got one of these. 28 completed it. 13 are still open.
The pattern in those 13 is telling: contributors would do every item they could, then get stuck on things that required admin access - configuring webhooks, fixing CI checks, setting up the release pipeline - and wait for me. Sometimes for months.
django-user-sessions' original author pinged me five times over two months about broken CI checks only an admin could fix. Watson's lead asked twice to remove legacy CI tools blocking PR merges. The checklist was good. The bottleneck was me.
Projects that moved on
One of the earliest and most visible Jazzband projects was django-debug-toolbar, transferred in back in 2016. It grew to over 8,000 stars under Jazzband before it moved to Django Commons in 2024.
django-simple-history, django-oauth-toolkit, PrettyTable, and tablib all moved on too, for similar reasons - they needed more autonomy than Jazzband's structure could provide.
Downloads
For context on how widely these projects are used, here are some numbers from PyPI. All projects that were ever part of Jazzband account for over 150 million downloads a month. Current projects alone are around 95 million.
Top 15 by monthly downloads:
| Project | Downloads/month | Note |
|---|---|---|
| prettytable | 42.4M | left Jazzband |
| pip-tools | 23.3M | |
| contextlib2 | 10.7M | |
| django-redis | 9.6M | |
| django-debug-toolbar | 7.3M | left, now Django Commons |
| djangorestframework-simplejwt | 6.1M | |
| dj-database-url | 5.5M | |
| pathlib2 | 4.9M | |
| django-model-utils | 4.8M | |
| geojson | 4.6M | |
| tablib | 4.1M | |
| django-oauth-toolkit | 3.7M | left |
| django-simple-history | 3.1M | left, now Django Commons |
| django-silk | 2.7M | |
| django-formtools | 2.1M |
One thing that surprised me: prettytable alone accounts for 42 million downloads a month, and it isn't even a Django package. contextlib2, pathlib2, and geojson aren't either. Jazzband ended up being broader than the Django ecosystem it started in.
django-debug-toolbar ranked in the top three most used third-party packages in the Django Developers Survey and is featured in the official Django tutorial. It spent 8 years under Jazzband before moving to Django Commons.
If you've come across Jazzband projects before, it was probably through the Django News newsletter, Python Weekly, or Opensource.com's 2020 piece on how Jazzband worked.
Top 10 projects by stars
| Project | Stars |
|---|---|
| pip-tools | 7,997 |
| django-silk | 4,939 |
| tablib | 4,752 |
| djangorestframework-simplejwt | 4,310 |
| django-taggit | 3,429 |
| django-redis | 3,059 |
| django-model-utils | 2,759 |
| Watson | 2,515 |
| django-push-notifications | 2,384 |
| django-widget-tweaks | 2,165 |
14 Mar 2026 4:02pm GMT
Wind-Down Plan
This post outlines the plan for winding down Jazzband. If you haven't read them yet, see the sunsetting announcement for context on why this is happening, and the 10-year retrospective for the full story.
Timeline
The wind-down will happen in phases over the course of 2026.
Phase 1: Announcement (March 2026)
- New member signups are disabled immediately
- This announcement and wind-down plan are published
- Existing members retain access to the GitHub organization and all repositories
Phase 2: Outreach (March - May 2026)
- All 80 project leads will be contacted via email to discuss transferring their projects
- The goal is to have initial conversations with every lead before PyCon US 2026 (May 13-19 in Long Beach, CA)
- Leads who don't respond will be followed up with at PyCon US and through other channels
Phase 3: Project Transfers (June - December 2026)
- Projects will be transferred out of the Jazzband GitHub organization to their new homes - whether that's a lead's personal account, a new organization, or another collaborative group
- For each project, the transfer includes:
- GitHub repository: transferred to the new owner
- PyPI package ownership: existing maintainers added, Jazzband credentials removed
- CI/CD configuration: updated to work outside Jazzband
- Projects without an active lead or willing recipient will be archived in the Jazzband GitHub organization
Phase 4: Wind Down (Early 2027)
- Remaining repositories archived
- The Jazzband GitHub organization set to read-only
- The jazzband.co website archived (with a redirect or static notice)
- PSF Fiscal Sponsorship status concluded, remaining funds donated to the PSF general fund
What happens to…
…existing members?
You remain a member of the GitHub organization until it is archived. No action is needed on your part. If you'd like to leave earlier, you can do so from your account dashboard.
…projects I contribute to?
The projects aren't going away - they're moving. Your contributions, issues, and pull requests will transfer with the repository to its new home. Git history is preserved.
…PyPI packages?
Package ownership on PyPI will be transferred to the project leads before the Jazzband release credentials are deactivated. If you're a project lead, we'll coordinate this with you directly.
…the Jazzband release pipeline?
The Jazzband-specific release pipeline (uploading via Twine to jazzband.co, then releasing to PyPI) will remain functional during the transition period. After transfer, projects will publish to PyPI directly using standard tooling.
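As a concrete illustration of what "standard tooling" can look like after a transfer, here is a minimal sketch of a GitHub Actions workflow that publishes to PyPI via trusted publishing. The project name, filename, and trigger are placeholders, not taken from any Jazzband project:

```yaml
# .github/workflows/release.yml -- hypothetical example
name: Release
on:
  release:
    types: [published]

jobs:
  pypi:
    runs-on: ubuntu-latest
    environment: release
    permissions:
      id-token: write  # required for PyPI trusted publishing (OIDC)
    steps:
      - uses: actions/checkout@v4
      - run: pipx run build  # builds sdist and wheel into dist/
      - uses: pypa/gh-action-pypi-publish@release/v1  # no API token needed
```

With trusted publishing configured on PyPI for the repository, no long-lived credentials (like the old shared Jazzband ones) need to exist at all.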
…the website?
The jazzband.co website will remain online through the transition. After wind-down, it will be replaced with a static page linking to this announcement and an archive of the project list.
…the PSF Fiscal Sponsorship?
Jazzband's PSF Fiscal Sponsorship status will be formally concluded. Any remaining funds will be donated to the Python Software Foundation's general fund.
For project leads
If you're a project lead, here's what to expect:
- You'll receive an email with details specific to your project(s)
- Decide on a new home for your project - your personal GitHub account, a new organization, or another collaborative group like Django Commons
- Coordinate the transfer - we'll handle the GitHub repo transfer and help with PyPI ownership changes
- Update your project - CI/CD, documentation links, and any Jazzband-specific references
Several projects have already successfully transferred to Django Commons, including django-debug-toolbar and django-simple-history. If you're looking for a place with shared maintenance and multiple admins, it's a good option.
If you have questions or want to start the process early, please contact the roadies.
14 Mar 2026 4:01pm GMT
Sunsetting Jazzband
Over 10 years ago, Jazzband started as a cooperative experiment to reduce the stress of maintaining Open Source software projects. The idea was simple - everyone who joins gets access to push code, triage issues, merge pull requests. "We are all part of this."
It had a good run. More than 10 years, actually.
But it's time to wind things down.
What happened
There's a short answer and a long answer.
The slopocalypse
GitHub's slopocalypse - the flood of AI-generated spam PRs and issues - has made Jazzband's model of open membership and shared push access untenable.
Jazzband was designed for a world where the worst case was someone accidentally merging the wrong PR. In a world where only 1 in 10 AI-generated PRs meets project standards, where curl had to shut down its bug bounty because confirmation rates dropped below 5%, and where GitHub's own response was a kill switch to disable pull requests entirely - an organization that gives push access to everyone who joins simply can't operate safely anymore.
The one-roadie problem
But honestly, the cracks have been showing for much longer than that.
Jazzband was always a one-roadie operation. People asked for more roadies and offered to help over the years, and I tried a number of times to make it work - but it never stuck. I dropped the ball on organizing it properly, and when volunteers did step up they'd quietly step back after a while. That's not a criticism of them, it's just how volunteer work goes when there's no structure to support it.
The result was the same though: every release request, every project transfer, every lead assignment, every PyPI permission change - it all went through me.
The warnings
The sustainability question was raised as early as 2017. I gave a keynote at DjangoCon Europe 2021 about it - five years in. In that talk I said out loud that the "social coding" experiment had failed to create an equitable community, and that a sustainable solution didn't exist without serious financial support.
The roadmap I presented - revamp infrastructure, grow the management team, formalize guidelines, reach out for funding - none of that happened. The PSF fiscal sponsorship was the one thing that did.
In the years since, I've been on the PSF board - which faced its own crises - and now serve as PSF chair. That work matters and I don't regret prioritizing it, but it meant Jazzband got even less of my time.
GitHub went the other way
Meanwhile, GitHub moved in the opposite direction. Copilot launched in 2022, trained on open source code that maintainers were burning out maintaining for free. GitHub Sponsors participation sits at 0.0014%. 60% of maintainers are still unpaid.
The XZ Utils backdoor in 2024 showed what happens when a lone maintainer burns out and someone malicious fills the gap. And Jazzband's own infrastructure started getting in the way of the projects it was supposed to help - the release pipeline couldn't support trusted publishing, projects that needed admin access were stuck.
So projects started leaving. And that's OK - that was always supposed to be part of the deal.
Django Commons
I want to specifically thank Django Commons and Tim Schilling for picking up where Jazzband fell short. They have 5 admins, 15 active projects (including django-debug-toolbar, django-simple-history, and django-cookie-consent from Jazzband), and django-polymorphic is transferring over right now. They solved the governance problem from day one. If you're a Jazzband project lead looking for a new home for your Django project, start there.
For non-Django projects like pip-tools, contextlib2, geojson, or tablib - I'm not aware of an equivalent. If someone wants to build one for the broader Python tooling ecosystem, I'd love to see it.
By the numbers
Over 10 years, Jazzband grew to 3,135 members from every continent but Antarctica, maintained 84 projects with ~93,000 GitHub stars, and shipped 1,312 releases to PyPI.
Projects that passed through Jazzband are downloaded over 150 million times a month - pip-tools at 23 million, prettytable at 42 million. django-debug-toolbar spent 8 years under Jazzband and ended up in the official Django tutorial. django-avatar, a repo from 2008, was still getting releases in 2026. And django-axes shipped 127 versions - a release every 13 days in its peak year.
The full 10-year retrospective has all the numbers, the stories, and what actually happened.
What happens next
I'm not pulling the plug overnight. There is a detailed wind-down plan that covers the timeline, but the short version:
- New signups are disabled as of today
- Project leads will be contacted before PyCon US 2026 to coordinate transferring projects to new homes
- The GitHub organization and website will remain available during the transition period through end of 2026
If you're a project lead, expect an email soon.
Thank you
None of this would have been possible without the people who showed up - strangers on the internet who decided to maintain something together. Thanks to the 81 project leads who kept things going despite the bottlenecks I created, and to everyone who joined, contributed, filed issues, and shipped releases over the years.
I started Jazzband because maintaining Open Source alone was exhausting. The irony of then becoming a single point of failure for 71 projects is not lost on me. But the experiment worked in the ways that mattered - projects got maintained, releases got shipped, people collaborated.
Anyways, the projects will move on to new homes, and that's fine. That was always the point.
We are all part of this.
14 Mar 2026 4:00pm GMT
13 Mar 2026
Django community aggregator: Community blog posts
Django News - 21 PRs in One Week to Django Core! - Mar 13th 2026
News
The Call for Proposals for DjangoCon US 2026 has been extended one week!
DjangoCon US 2026 has extended its Call for Proposals deadline by one week to March 23 at 11 AM CDT, giving prospective speakers a little more time to submit their talk ideas.
CPython: 36 Years of Source Code
An analysis of the growth of CPython's codebase from its first commits to the present day
Releases
Python 3.15.0 alpha 7
Python 3.15.0 alpha 7 introduces explicit lazy imports, a new frozendict type, improved profiling tools, and JIT upgrades that deliver modest performance gains while development continues toward the upcoming beta.
Django Software Foundation
DSF member of the month - Theresa Seyram Agbenyegah
Theresa Seyram Agbenyegah features as DSF member of the month for March 2026, highlighting her Django community leadership and PyCon organization work.
Updates to Django
Today, "Updates to Django" is presented by Johanan from Djangonaut Space! 🚀
Last week we had 21 pull requests merged into Django by 11 different contributors - including 2 first-time contributors! Congratulations to KhadyotTakale and Lakshya Prasad for having their first commits merged into Django - welcome on board!
This week's Django highlights:
- Fixed TypeError in deprecation warnings if Django is imported by namespace. (#36961)
- Improved admin changelist layout for the object-tools button. (#36887)
- Fixed a migrate --run-syncdb crash for existing models with truncated db_table names. (#12529)
Django Newsletter
Django Fellow Reports
Fellow Report - Jacob
Two cool features landed this week: @Antoliny0919's more standard vertical layout for inputs and labels in admin forms, and Artyom Kotovskiy's work to make RenameModel migration operations update permission names as well.
Lots of tickets triaged, reviewed, and authored!
Fellow Report - Natalia
The main attraction this week was the security releases I issued on Tuesday (6.0.3, 5.2.12, and 4.2.29), which required the usual coordination, strong focus, and intense follow-up.
Beyond that, a significant part of the week was spent navigating the continuing wave of LLM-generated pull requests, which adds a fair amount of noise to the review queue. After prioritizing the security work, I tried to reclaim some joy in the day-to-day Fellow work by digging through long-snoozed notification emails and picking off a number of lingering tickets and PRs that had been waiting for attention.
Sponsored Link 1
The deployment service for developers and teams.
Articles
New Feature Proposal for Django - AddConstraintConcurrently
More context on a recent proposal suggesting a pair of opt-in contrib.postgres operations - AddConstraintConcurrently and RemoveConstraintConcurrently - to allow unique indexes created via UniqueConstraint to be created and dropped concurrently.
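For background on why dedicated operations are wanted here, this is roughly the raw SQL dance they would wrap on Postgres (table and column names below are invented for illustration):

```sql
-- Build the unique index without holding a long write lock on the table.
-- Note: CONCURRENTLY cannot run inside a transaction block, which is why
-- such migration operations must be non-atomic.
CREATE UNIQUE INDEX CONCURRENTLY users_email_uniq ON users (email);

-- Attach the finished index to the table as a constraint.
ALTER TABLE users ADD CONSTRAINT users_email_uniq UNIQUE USING INDEX users_email_uniq;
```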
Avoiding empty strings in non-nullable Django string-based model fields
Django silently converts None values in non-nullable string fields into empty strings, but a simple CheckConstraint can enforce truly required values and prevent empty data from slipping into your database.
Buttondown - How we check every link in your email
The machinery behind Buttondown's link checker is more involved than you might expect.
The State of OpenSSL for pyca/cryptography with Alex Gaynor and Paul Kehrer
The written transcript of an interview all about Python security/cryptography, current features in cryptography, as well as some of what's coming in the future.
Year of the Snake Recap
Mariatta's review of the year showcases how prolific she was, with conferences, documentaries, ice cream selfies, and much more.
What is `self`?
Eric Matthes tackles the age-old question that newcomers ask again and again, but that is always worth revisiting.
I Ditched Elasticsearch for Meilisearch. Here's What Nobody Tells You.
A practical deep dive into replacing Elasticsearch with Meilisearch, showing how a simpler Rust-based search engine cut costs from $120 to $14 a month while delivering faster, typo-tolerant search for typical application workloads.
Videos
From Kenya to London - Velda Kiara
The video version of this week's Django Chat episode with Velda. We won't always do a double-feature of episodes, but Velda is always sunny and uplifting, even amidst these last legs of winter.
Python Unplugged on PyTV - Free Online Python Conference
If you missed it live last week, there was a digital conference hosted by PyCharm featuring several Django speakers including Sarah Boyce (Fellow), Carlton Gibson (podcast host), and Sheena O'Connell (PSF Member). Timestamps in the description!
Podcasts
Django Chat #197: From Kenya to London with Django - Velda Kiara
Velda is a software engineer at RevSys based in London and an extremely active member of the Python and Django communities. She is a PSF Fellow, former Djangonaut, co-maintainer of django-debug-toolbar, regular conference speaker, and Microsoft MVP.
Django Job Board
Explore new opportunities this week including a Solutions Architect role at JetBrains, an Infrastructure Engineer position at the Python Software Foundation, and a Lead Backend Engineer opening at TurnTable.
Solutions Architect - Python (Client-facing) at JetBrains 🆕
Infrastructure Engineer at Python Software Foundation
Lead Backend Engineer at TurnTable
Django Newsletter
Projects
Lupus/django-lumen
Visualize your Django models as an interactive ERD diagram in the browser. No external diagram library - the diagram is pure vanilla JS + SVG rendered at request time from the live Django model registry.
paradedb/django-paradedb
Official extension to Django for use with ParadeDB.
This RSS feed is published on https://django-news.com/. You can also subscribe via email.
13 Mar 2026 3:00pm GMT
11 Mar 2026
Django community aggregator: Community blog posts
Weeknotes (2026 week 11)
Weeknotes (2026 week 11)
Last time I wrote that I seem to be publishing weeknotes monthly. Now, a quarter of a year has passed since the last entry. I do enjoy the fact that I have published more posts focused on a single topic. That said, what has been going on in open source land is certainly interesting too.
LLMs in Open Source
I have started a longer piece to think about my stance regarding using LLMs in Open Source. The argument I'm thinking about is that there's a balance between LLMs having ingested all of my published open source code and myself using them now to help myself and others again.
The happenings in the last two weeks (think Pentagon, Iran, and the bombings of schools) have again brought to the foreground the perils of using those tools. I therefore haven't been motivated to pursue this train of thought for the moment. When the upsides are somewhat questionable and tentative and the downsides are so clear and impossible to miss, it's hard to use my voice to speak in favor of these tools.
That said, all the shaming when someone uses an LLM that I see in my Mastodon feed also annoys me. I'll quote part of a post here which I liked and leave it at that for the moment:
The AI hype-cyclone is bad, but so is the anti-AI witch hunt. Commits co-authored by Claude do not mean that a project has "abandoned engineering as a serious endeavor"
[…]
Other goings-on
- Health: My back continues to improve. Some days are still bad, but the idea that the herniation may go away entirely doesn't sound totally unreasonable anymore.
- Gardening: We started weeding the garden last week. Lots to do! Being outside is fun. Weeding isn't the greatest part ever, but it's meditative.
Releases since December
- django-json-schema-editor 0.12.1: CSS fixes. I have again looked at other, more modern JSON schema editor implementations but all of them are more limited than is acceptable to act as a replacement.
- django-debug-toolbar 6.2: I haven't done much work here! Just some reviewing.
- django-content-editor 8.1: Started emitting warnings when using non-abstract base classes for plugins. Using multi table inheritance is mostly an accident and not intended in my experience when using django-content-editor, therefore we have started detecting this case and emitting system checks (warnings, not errors).
- django-imagefield 0.23.0a3: We have done some work on supporting libvips as an alternative backend to Pillow because I hoped that memory usage in Kubernetes pods might go down a bit. Results are not conclusive yet, and I'm not yet convinced the additional code complexity is worth it. Debugging and monitoring continues.
- FeinCMS 26.2.1: Released a few bugfixes. FeinCMS is still being maintained ~17 years later!
- django-auto-admin-fieldsets 0.3: Added a helper to remove fields from the fieldsets structure.
- django-tree-queries 0.23.1: Shipped a small bugfix for `{% recursetree %}` which unintentionally cached children across invocations.
- feincms3-downloads: Used `PATH` from the environment instead of using a very restricted allowlist so that `convert` and `pdftocairo` are detected in more locations. This should help with local development, for example on macOS.
- django-prose-editor 0.24.1: Read the CHANGELOG; there's too much in there for a short notice.
- form-designer 0.27.3: Mosparo captcha support, bugfixes, and additional translations.
- feincms3 5.5: Started using the `OrderableTreeNode` from django-tree-queries.
11 Mar 2026 5:00pm GMT
From Kenya to London with Django - Velda Kiara
🔗 Links
- Velda at RevSys
- Velda's Substack: The Storyteller's Byte Tales
- Velda on GitHub
- Optimal Performance Over Basic as a Perfectionist with Deadlines
- More about me
- Neapolitan
📦 Projects
📚 Books
- A History of the Bible by John Barton
- Python Mastery, a course from David Beazley
- Kite Runner by Khaled Hosseini
🎥 YouTube
Sponsor
This episode was brought to you by Buttondown, the easiest way to start, send, and grow your email newsletter. New customers can save 50% off their first year with Buttondown using the coupon code DJANGO.
11 Mar 2026 4:00pm GMT


