05 Jul 2025
Django community aggregator: Community blog posts
Weeknotes (2025 week 27)
Weeknotes (2025 week 27)
I have again missed a few weeks, so the releases section will be longer than usual since it covers six weeks.
django-prose-editor
I have totally restructured the documentation to make it clearer. The configuration chapter is shorter and more focussed, and the custom extensions chapter actually shows all required parts now.
The most visible change is probably the refactored menu system. Extensions now have an addMenuItems method where they can add their own buttons to the menu bar. I wanted to do this for a long time, but only found an approach I actually like this week.
I've reported a bug to Tiptap where a .can() chain always succeeded even though the actual operation could fail (#6306).
Finally, I have also switched from esbuild to rslib; I'm a heavy user of rspack anyway and am more at home with its configuration.
django-content-editor
The 7.4 release mostly contains minor changes; one new feature is the content_editor.admin.RefinedModelAdmin class. It includes tweaks to Django's standard behavior, such as supporting a Ctrl-S shortcut for the "Save and continue editing" functionality and an additional warning for people who want to delete inlines but are about to delete the whole object instead. This seems to happen often even though people are shown the full list of objects which will be deleted.
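If you want to opt into these tweaks for a regular model, a minimal sketch could look like this (the Article model is hypothetical; the import path is taken from the description above):

from django.contrib import admin
from content_editor.admin import RefinedModelAdmin

from .models import Article


@admin.register(Article)
class ArticleAdmin(RefinedModelAdmin):
    # Inherits the Ctrl-S "save and continue editing" shortcut and the extra
    # warning before deleting inlines/objects described above.
    list_display = ["title"]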
Releases
- django-prose-editor 0.15: See above.
- django-content-editor 7.4.1: See above.
- django-json-schema-editor 0.5.1: Now supports customizing the prose editor configuration (when using format: "prose") and also includes validation support for foreign key references in the JSON data.
- html-sanitizer 2.6: The sanitizer started crashing when used with lxml>=6 when being fed strings with control characters inside.
- django-recent-objects 0.1.1: Changed the code to use UNION ALL instead of UNION when determining which objects to fetch from all tables.
- feincms3 5.4.1: Added experimental support for rendering sections. Sections can be nested, so they are more powerful than subregions. Also added warnings when registering plugin proxies for rendering and fetching, since that will most likely lead to duplicated objects in the rendered output.
- django-tree-queries 0.20: Added tree_info and recursetree template tags. Optimized performance by avoiding the rank table where easily possible. Added stronger recommendations to pre-filter the table using .tree_filter() or .tree_exclude() when working with small subsets of large datasets (see the sketch after this list).
- django-ckeditor 6.7.3: Added a trove identifier for recent Django versions. It still works fine, but it's deprecated and shouldn't be used since it still uses the unmaintained CKEditor 4 line (since we do not ship the commercial LTS version).
- feincms3-cookiecontrol 1.6.1: Golfed the generated CSS and JavaScript bundle down to below 4000 bytes again, including the YouTube/Vimeo/etc. wrapper which only loads external content when users consent.
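As a quick illustration of the django-tree-queries pre-filtering recommendation, here is a minimal sketch (the Category model and its fields are hypothetical, and the exact chaining of tree_filter()/tree_exclude() is per the package documentation):

from myapp.models import Category

# Pre-filter the base table so the recursive CTE only walks a small subset.
nodes = Category.objects.tree_filter(site_id=1).with_tree_fields()
for node in nodes:
    print("    " * node.tree_depth + node.name)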
05 Jul 2025 5:00pm GMT
04 Jul 2025
Django community aggregator: Community blog posts
Django News - Django 2024 Annual Impact Report and Django 5.2.4 - Jul 4th 2025
News
Django 5.2.4 bugfix release
Django 5.2.4 fixes regressions in media type preference, JSON null serialization, and composite primary key lookups to improve framework robustness.
Django Joins curl in Pushing Back on AI Slop Security Report...
Django updates its security guidelines to mandate verified AI-assisted vulnerability reports, reducing fabricated submissions and ensuring human oversight in vulnerability triage.
W2D Special Event Station announcement
For amateur radio operators (or anyone interested) who also use Django: the special event callsign W2D has been reserved to celebrate Django's 20th birthday.
Django Software Foundation
Django's 2024 Annual Impact Report
Django Software Foundation's annual impact report details community milestones, funding initiatives, and strategic support to drive continued growth and innovation in Django development.
Django's Ecosystem
The Django project now has an ecosystem page featuring third-party apps and add-ons.
Updates to Django
Today 'Updates to Django' is presented by Pradhvan from the Djangonaut Space!
Last week we had 8 pull requests merged into Django by 7 different contributors.
This week's Django highlights
Content Security Policy lands in Django core: built-in CSP middleware and nonce support finally arrive, closing ticket #15727. Shoutout to Rob Hudson for bringing CSP to Django core.
Enhanced MariaDB GIS support: Added __coveredby lookups plus Collect, GeoHash, and IsValid functions for MariaDB 12.0.1+.
Admin messaging gets visual polish: INFO and DEBUG messages now have proper styles and icons in the admin interface, closing ticket #36386.
Django Newsletter
Sponsored Link 1
Scout Monitoring: Logs, Traces, Error (coming soon). Made for devs who own products, not just tickets.
Articles
Drag and Drop and Django
Integrates custom HTML drag and drop components with Django template data and CSRF tokens to create interactive scheduling interfaces using fetch and custom elements.
Hosting your Django sites with Coolify
This post details how a complex, self-managed Django deployment stack was replaced with Coolify, an open-source self-hosted PaaS that offers zero downtime deployments, built-in backups, and a Git-based workflow, all while running on personal hardware.
From Rock Bottom to Production Code
Matthew Raynor's transformation from personal adversity to constructing production-level Django applications using full-stack development, custom authentication, and integrated AI features.
Pycliché & Djereo
Starting a Python or Django project? Steal Alberto Morón Hernández's templates: pycliché & djereo, opinionated project templates for Python and Django, respectively.
Django Fellow Report
Django Fellow Report - Sarah Boyce
5 tickets triaged, 22 reviewed, 1 authored, and set up mssql-django to test a ticket.
Django Fellow Report - Natalia Bidart
3 tickets triaged, 7 reviewed, 1 authored, and other misc.
Events
Visa to DjangoCon Africa
From the DSF President, tips on how to get a visa application done for DjangoCon Africa in Tanzania.
DjangoCon Videos
Passkeys in Django: the best of all possible worlds - Tom Carrick
Secure, accessible, usable - pick any three.
Why compromise when you can have it all? This talk shows how easy it is to integrate support for passkeys (Face ID, fingerprint scans, etc.) into your Django app in almost no time at all.
100 Million Parking Transactions Per Year with Django - Wouter Steenstra
For several Dutch municipalities, Django applications power the monitoring of both on-street and off-street parking transactions. What started as a straightforward tool for extracting data from parking facilities has evolved into a robust ETL platform with a feature-rich dashboard. This talk delves into how Django remains the backbone of our operations and why it continues to be the foundation of our business success.
How we make decisions in Django - Carlton Gibson
Django is an inclusive community. We seek to invite contributions, and aim for consensus in our decision making. As the project has grown - and as with all large open source projects - that's led to difficulties, as even simple proposals get drawn out into long discussions that suck time, energy, and enthusiasm from all. It's time we refreshed our approach. We're going to look at how we got here, what we need to maintain, and how we can move forwards towards a better process.
Podcasts
Episode 9: with Tamara Atanasoska
Learn about Tamara's journey. Tamara has been contributing to open source projects since 2012. She participated in Google Summer of Code to contribute to projects like Gnome and e-cidadania.
Django News Jobs
Senior Backend Python Developer at Gravitas Recruitment
Senior/Staff Software Engineer at Clerq
Full Stack Software Engineer at Switchboard
Django Fellow at Django Software Foundation
Senior Software Engineer at Simons Foundation
Django Newsletter
Projects
wsvincent/official-django-polls-tutorial
Source code for the official Django Polls tutorial.
justinmayer/typogrify
A set of Django template filters to make caring about typography on the web a bit easier.
This RSS feed is published on https://django-news.com/. You can also subscribe via email.
04 Jul 2025 3:00pm GMT
03 Jul 2025
Django community aggregator: Community blog posts
Rate Limiting for Django Websites
Sometimes, certain pages of a Django website might receive unwanted traffic from crawlers or malicious bots. These traffic spikes consume server resources and can make the website unusable for legitimate users. In this article, I will explore Nginx's rate-limiting capabilities to prevent such performance issues.
What is rate limiting, and why use it?
A typical Django website is deployed using a Gunicorn (or Uvicorn for ASGI) application server, with an Nginx web server in front of it. When a request comes to Nginx, it goes through various checks, gets filtered by domain, protocol, and path, and is finally passed to Gunicorn. Gunicorn then runs Django, which parses the URL and returns a response from the appropriate view.
Nginx rate limiting allows you to limit how often certain pages (based on URL paths) or all Django endpoints can be accessed. It prevents quick reloading or scripted attacks that flood your site with requests. This is especially useful for pages with forms, faceted list views, REST APIs, or GraphQL endpoints.
A typical rate-limiting configuration in Nginx defines a zone with a specific memory size (in megabytes), a rate (requests per second or per minute), and one or more locations that apply that zone using options like burst and nodelay. When the rate limit is exceeded, Nginx returns a 429 Too Many Requests response.
You can imagine the burst as a funnel. The requests are like small balls coming in quickly. The first request passes through immediately, and the others get stacked in the funnel. Any requests that don't fit into the funnel are rejected with a 429 response. The rest are either delayed or processed immediately, depending on the configuration. If more requests come in and there's space in the funnel, they're stacked and processed according to the defined rate.
Practical example
Here is an example configuration that limits list views to 1 request per second, allowing up to 2 additional requests in a short burst before rejecting further ones. It also limits POST requests to authentication paths to 10 per minute, allowing bursts of up to 5 requests at a time, and defines a generous fallback for all other URL paths. All excess requests beyond these limits will be rejected. Each zone has 10 MB of storage for metadata such as IP addresses, timestamps, and the current state of the funnel.
map $request_method $limit_key {
    POST $binary_remote_addr;
    default "";
}
limit_req_zone $binary_remote_addr zone=list_pages:10m rate=1r/s;
limit_req_zone $limit_key zone=auth_conditional:10m rate=10r/m;
limit_req_zone $binary_remote_addr zone=global_fallback:10m rate=20r/s;
server {
    listen 443 ssl;
    server_name www.pybazaar.com;
    charset utf-8;
    # ...

    location ~* "^/(|resources|profiles|opportunities|buzz)/?$" {
        limit_req zone=list_pages burst=2 nodelay;
        limit_req_status 429;
        proxy_pass http://127.0.0.1:8000;
        proxy_http_version 1.1;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_buffering off;
        proxy_redirect off;
        proxy_read_timeout 300s;
    }

    location ~* "^/(signup|login)/?$" {
        limit_req zone=auth_conditional burst=5 nodelay;
        limit_req_status 429;
        proxy_pass http://127.0.0.1:8000;
        proxy_http_version 1.1;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_buffering off;
        proxy_redirect off;
        proxy_read_timeout 300s;
    }

    location / {
        limit_req zone=global_fallback burst=40 nodelay;
        limit_req_status 429;
        proxy_pass http://127.0.0.1:8000;
        proxy_http_version 1.1;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_buffering off;
        proxy_redirect off;
        proxy_read_timeout 300s;
    }

    # ...
}
Checking if it worked
Once you have this set up, you can test the rate limiting using the following bash script:
#!/usr/bin/env bash
for i in {1..20}; do
  {
    status=$(curl -w "%{http_code}" -o /dev/null -s \
      https://www.pybazaar.com/opportunities/)
    echo "Request $i: $status"
  } &
done
wait
This script will output the HTTP status codes for 20 requests sent simultaneously, for example:
Request 7: 429
Request 4: 429
Request 8: 429
Request 3: 429
Request 19: 429
Request 15: 429
Request 9: 429
Request 13: 429
Request 6: 429
Request 20: 429
Request 18: 429
Request 10: 429
Request 17: 429
Request 11: 429
Request 16: 429
Request 14: 429
Request 12: 429
Request 5: 200
Request 1: 200
Request 2: 200
Keep in mind that the order of the requests, and which ones succeed or fail, won't follow a simple sequence: the requests are sent concurrently, and network paths and timing vary on the way to and from the server.
Next steps
The next step could be setting up the fail2ban service to ban IP addresses that receive the 429 response code too frequently. Should I write an article about that?
Conclusion
Rate limiting with Nginx is an effective way to protect your Django website from traffic spikes caused by crawlers or malicious bots. By setting limits on requests per second or minute, using zones and bursts, you can control how many requests each IP can make to different parts of your site. This helps prevent server overload and keeps the website responsive for real users.
03 Jul 2025 5:00pm GMT
Hosting your Django sites with Coolify
I currently run four Django apps in production, with staging and production environments for each. For years, I've managed this on a single server with a stack that has served me well, but has grown increasingly complex:
- A Hetzner VPS running Debian 12
- Nginx as a reverse proxy
- Gunicorn processes managed by systemd
- More systemd services for background task runners
- A custom deploy service that listens for GitHub webhooks to pull and restart the right app
- A custom backup script that archives configs and databases and ships them offsite to rsync.net.
I've written about this setup in detail, both in "Setting up a Debian 11 server for SvelteKit and Django" and more recently in "Automatically deploy your site when you push the main branch".
While not rocket science, it's a non-trivial setup. Each new app requires a checklist of configuration steps, and it's easy to miss one. Worse, my deploy script involves a brief moment of downtime as Gunicorn restarts. It's only a second or two, but it's not ideal. I'm more of a developer than an operations person, and building a zero-downtime, rolling deploy script myself feels like a step too far.
On the other end of the spectrum, I have a bunch of static sites hosted with Netlify. Its workflow is a dream: connect a GitHub repo, push changes, and Netlify automatically builds and deploys the site. It handles complex build steps for static site generators, even ones built with Swift like Saga. But its magic is limited to static sites; you can't run a backend service like a Django app.
I've been looking for a solution that combines the best of both worlds: the simplicity of Netlify with the power to run my own backend services, all self-hosted, open-source, and on my own hardware.
Enter Coolify. It promises a Heroku-like experience you can run yourself. It can:
- Deploy static sites from a Git repository, with a build step.
- Run backend services like Node.js and Python via Dockerfiles.
- Manage databases like PostgreSQL and Redis.
- Handle backups, HTTPS certificates, and more.
This looked like exactly what I needed. Here's how I moved my Django apps to it.
Step 1: prepare a fresh server
Before installing Coolify, it's wise to perform some basic server hardening. I spun up a new VPS on Hetzner and logged in as root to get it ready.
First, I disabled password-based SSH login in favor of public key authentication. In /etc/ssh/sshd_config, I made these changes:
PasswordAuthentication no
PubkeyAuthentication yes
I supplied my SSH public key to Hetzner during the server creation process, so it was already stored on the server for me. If you didn't do that, you'll need to copy your public key to the server yourself. The easiest way is with the ssh-copy-id command from your local machine:
$ ssh-copy-id root@YOUR_SERVER_IP
Next, I set up UFW (Uncomplicated Firewall) to control network traffic:
$ apt install ufw
$ ufw default deny incoming
$ ufw default allow outgoing
$ ufw allow ssh
$ ufw allow http
$ ufw allow https
$ ufw enable
To protect against brute-force attacks, I installed Fail2ban.
$ apt install fail2ban python3-systemd
$ cd /etc/fail2ban
$ cp jail.conf jail.local
$ nano jail.local
I then enabled the SSH jail in jail.local and configured it to be quite strict, banning an IP after a single failed attempt. After all, we're using SSH keys, not passwords.
# /etc/fail2ban/jail.local
[sshd]
enabled = true
port = ssh
logpath = %(sshd_log)s
backend = systemd
maxretry = 1
After saving, I enabled and started the service:
$ systemctl enable fail2ban
$ service fail2ban start
Finally, I enabled automatic security updates to keep the system patched without manual intervention.
$ apt install unattended-upgrades
$ dpkg-reconfigure --priority=low unattended-upgrades
With the server secured, it was time for the main event.
Step 2: install Coolify
This is the easiest part. Coolify provides a simple installation script that handles everything.
$ curl -fsSL https://cdn.coollabs.io/coolify/install.sh | sudo bash
After a few minutes, Coolify is up and running, accessible via the server's IP address. I created a CNAME DNS entry for my server so that I can easily access it with a memorable domain.
Step 3: containerize the Django app
Coolify works by building and running your applications in Docker containers. This is a departure from my old setup of running Gunicorn directly on the host. The central piece of this is the Dockerfile, a recipe for creating your application's image.
Here is a Dockerfile I've put together for a typical Django project. (It uses uv, because it's awesome. I've written a bunch of articles about it.)
# Use a slim, modern Python base image
FROM python:3.13-slim
# Set the working directory inside the container
WORKDIR /app
# Arguments needed at build-time, to be provided by Coolify
ARG DEBUG
ARG SECRET_KEY
ARG DATABASE_URL
# Install system dependencies needed by our app (e.g., for psycopg2)
RUN apt-get update && apt-get install -y \
    postgresql-client \
    && rm -rf /var/lib/apt/lists/*
# Install uv, the fast Python package manager
RUN pip install uv
# Copy only the dependency definitions first to leverage Docker's layer caching
COPY pyproject.toml uv.lock ./
# Install Python dependencies for production
RUN uv sync --no-group dev --group prod
# Copy the rest of the application code into the container
COPY . .
# Run build steps. These are baked into the final image.
RUN uv run --no-sync ./manage.py tailwind build
RUN uv run --no-sync ./manage.py collectstatic --noinput
# Migrate the database
RUN uv run --no-sync ./manage.py migrate
# Expose the port Gunicorn will run on
EXPOSE 8000
# Run with gunicorn
CMD ["uv", "run", "--no-sync", "gunicorn", "--bind", "0.0.0.0:8000", "--workers", "3", "config.wsgi:application"]
This file defines every step needed to run the application. It installs dependencies, builds static assets, and runs database migrations. The final image is a self-contained, runnable artifact. When Coolify deploys this, it's simply a matter of stopping the old container and starting the new one, which is how it achieves zero-downtime deploys.
Note: I run database migrations as part of the build process. Some prefer to run migrations at container startup [citation needed], but since we're rebuilding on every Git push anyway, it fits perfectly with this workflow. Feel free to tell me in the comments below if I am wrong.
Within the Coolify UI, I can now create a new application, point it to my GitHub repository, and tell it to use the "Dockerfile" build pack. Coolify automatically detects pushes to my main branch, pulls the code, builds the new image, and deploys it.
I no longer have an .env file on the server with the environment variables (like DATABASE_URL); instead I use Coolify's Environment Variables within the project settings. The way I configure my Django projects hasn't changed, only the .env file part has been replaced with Coolify's UI. However, there is one small gotcha: by default these Coolify environment variables are only available at runtime, but because I use code like os.getenv("DATABASE_URL") in my settings.py, these variables also need to be available at build time when Django commands like collectstatic run. This is why we explicitly expose these three variables as build arguments in the Dockerfile with the ARG declarations, making them available during the Docker build process.
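For reference, a minimal sketch of the corresponding reads in settings.py (the variable names match the ARG declarations in the Dockerfile above; the fallback values are assumptions so that build-time commands don't crash):

import os

DEBUG = os.getenv("DEBUG", "false").lower() == "true"
SECRET_KEY = os.getenv("SECRET_KEY", "insecure-placeholder-for-builds")
DATABASE_URL = os.getenv("DATABASE_URL", "")  # parsed elsewhere, e.g. with dj-database-url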
(For a non-Python example: the Dockerfile for this very website, which is built with Swift, can be found on GitHub.)
Step 4: configure backups
My old custom backup script is no longer needed because Coolify has backups built-in. First, you need to configure a destination, which Coolify calls an "S3 Storage" target.
I'm using Cloudflare R2 for this, as it offers a generous 10 GB of S3-compatible object storage for free. Here's how to set it up:
- In Cloudflare: Navigate to R2 from your dashboard. Create a new bucket, giving it a unique name (e.g., coolify-backups-your-name).
- Once the bucket is created, go to the R2 overview page and click Manage R2 API Tokens.
- Click Create API Token. Give it a descriptive name, grant it "Object Read & Write" permissions, and specify the bucket you just created.
- After you create the token, Cloudflare will display the Access Key ID and Secret Access Key. Copy these immediately, as the Secret Key won't be shown again. You will also need your Account ID and the S3 endpoint URL, which is shown on the R2 bucket page.
With these credentials in hand, head back to Coolify:
- In Coolify: Go to the Storages tab in the main navigation.
- Click Add a new S3 Storage.
- Fill in the form with the credentials from Cloudflare. The region can typically be ignored, just leave it as-is.
- Save the new storage configuration.
With the S3 storage now configured, we can set up our backups.
- Go to Settings -> Backup, and make sure backups are turned on. Then enable the "S3 Enabled" checkmark. You can choose the local and remote retention; I keep 30 days of backups both locally and remotely.
- Go to your Django project, then to its database, then to the Backups tab. Here you can create a new scheduled backup, which will be stored locally. Enable the "Save to S3" checkmark to also store it remotely.
Step 5: remaining Coolify config
To make sure you get important alerts, you'll want to configure the email settings in Settings -> Transactional Email, using an SMTP server. Then go to the Notification menu and enable the "use system wide (transactional) email settings" checkbox. You can choose when to receive notifications, for example when a build fails, a backup fails, or when disk usage gets too high.
The way forward
Moving to Coolify is a significant simplification of my infrastructure. It replaces my collection of custom scripts with a unified, robust system that provides the modern, git-based workflow I love from Netlify. The shift to containerization was long overdue, and Coolify makes it approachable.
Another major benefit is that all the configuration of how to run an app now lives directly in the project's repository, in the form of a Dockerfile, rather than only on the server as a bunch of config files, systemd services, and crontabs.
Best of all, I get zero-downtime deployments out of the box, all while still running everything on my own server. It's the self-hosted PaaS I've been looking for.
03 Jul 2025 2:22am GMT
01 Jul 2025
Django community aggregator: Community blog posts
I am back
Well, it's been more than a few weeks since I last managed to get some words out the door or even have the headspace for writing. This was mostly down to the new startup picking up steam and client work taking up time, but also a wedding, holidays, and other priorities coming in for me to deal with! All that to say, I'm back for a little while at least; summer is fast approaching, which means holidays again.
I'll keep today short as I don't have a particular topic fully developed in mind, but few things I have in my head recently are:
- AI has finally clicked in my head, especially with agentic stuff coming through. This is what I expected a few years back when ChatGPT was released. The true skill these days seems to be ticket writing (although AI developing PRD documents is something I need to try out). I'm also using git worktrees to allow multiple agents to work on my projects at the same time, and creating my own custom AI agents.
- I have been enjoying, with the help of AI, the Python attrs and cattrs libraries to validate data going to and from Stripe.
- I'm excited by some active Django topics that are exploring what it means for some items to be partially included in core, perhaps via pip-extras or some other mechanism
- How payments work
- How card transactions work
- Open-sourcing a codebase that would typically be closed-source.
- The rough edges of AI usage and how my work is changing (could it all be mobile based?)
- Building on my last blog post, could we document the Django API's enough so that it could be implemented in another language? The point being the complete documentation, not necessarily the rewrite itself
That's all for now! Hopefully another post soon on another Django related topic!
01 Jul 2025 5:00am GMT
27 Jun 2025
Django community aggregator: Community blog posts
Django News - Fellow Deadline, Native Pooling, and Debugging in Production - Jun 27th 2025
News
DSF calls for applicants for a Django Fellow [last call]
Applications for the new Django Fellow position are open until midnight AOE on July 1st.
Updates to Django
Today 'Updates to Django' is presented by Pradhvan from the Djangonaut Space!
Last week we had 14 pull requests merged into Django by 8 different contributors - including 1 first-time contributor! Congratulations to lukas-komischke-ameos for having their first commit merged into Django - welcome on board!
This week's Django highlights
PostgreSQL safety check: Added a helpful system check to catch when you forget django.contrib.postgres in INSTALLED_APPS, saving developers from confusing database errors.
Cleaner column aliases: Use of % in column aliases is now deprecated.
Better content negotiation: Media type selection now properly considers quality values when choosing the preferred type for more reliable API responses.
Special shoutout to Clifford Gama who brought closure to ticket #32770 after 4 years.
Django Newsletter
Wagtail CMS
htmx accessibility gaps: data and recommendations
Analysis of htmx accessibility in Django sites reveals mixed Lighthouse scores, with specific ARIA gaps, and recommends using UI components, testing tools, and implementing carefully.
Sponsored Link 1
Scout Monitoring: Logs, Traces, Error (coming soon). Made for devs who own products, not just tickets.
Articles
Disable runserver warning in Django 5.2
Use the DJANGO_RUNSERVER_HIDE_WARNING=true environment variable in Django 5.2 (including via django-environ .env files) to suppress the default development server warning.
Production-ready cache-busting for Django and Tailwind CSS
Use a custom ManifestStaticFilesStorage to skip hashing Tailwind's source.css so django-tailwind-cli builds produce cache-busted tailwind.css via a two-step tailwind build and collectstatic for Django projects.
Native connection pooling in Django 5 with PostgreSQL
Django 5 adds native PostgreSQL connection pooling via OPTIONS pool=True in DATABASES, eliminating external tools like PgBouncer and delivering over fivefold performance gains.
Markdown Virtual Table: Implementing SELECT
Demonstrates using custom Django schema editor hooks to register a Rust-based SQLite virtual table for Markdown content and frontmatter as unmanaged models.
Django: Introducing inline-snapshot-django
Inline snapshot Django provides snapshot testing of SQL queries in Django tests, automating fingerprint capture and inline updates with a Rust-based SQL fingerprinting engine.
Switching pip to uv in a Dockerized Flask / Django App
Switch from pip to uv in Dockerized Django apps to replace requirements.txt with pyproject.toml and leverage uv lock/sync commands for faster, deterministic builds.
How to Migrate your Python & Django Projects to uv
Migrate Django projects from requirements files to uv by defining dependencies in pyproject.toml, syncing environments locally, in Docker, and CI with pre-commit integration.
DjangoCon Videos
KEYNOTE The Most Bizarre Software Bugs in History - Mia Bajić
We've all heard that we should test our software, but what happens when we don't? Sometimes, it leads to strange and unexplainable events.
Is 'testing more' always the right solution? What do these bugs reveal about software and its failures? And how can we use these lessons to build more resilient systems? Let's take a look at the most bizarre software bugs in history.
KEYNOTE Django for Data Science: Deploying Machine Learning Models with Django - Will Vincent
A keynote talk on learning how to train your own ML model and publicly share it using Django. Comes with a demo website and GitHub repos with source code for the ML model and Django website.
How to Enjoy Debugging in Production - Karen Tracey
While launch day is often anticipated as a satisfying completion to a project, the reality is often different. Real users in the production environment may test our code in unanticipated ways, leading to surprising bugs that need to be addressed, often under time pressure and with fewer debugging resources than we're used to having in our development environment.
Podcasts
Abstractions: Perverse Cargo Cult
Django made the Abstractions podcast this week with its commit adding guidance on AI-assisted security reports.
Django News Jobs
Django Fellow at Django Software Foundation
Senior/Staff Software Engineer at Clerq
Full Stack Software Engineer at Switchboard
Senior Software Engineer at Simons Foundation
Django Newsletter
Projects
django-wiki/django-nyt
Notification system for Django with batteries included: Email digests, user settings, JSON API.
charettes/django-fk-constraint
Django app providing foreign key constraint support for multiple fields.
Sponsorship
Sponsor Django News
Looking to reach over 4,200 active Django developers? This summer, promote your product or service while supporting the Django community. Sponsorship spots are available now!
View Sponsorship Opportunities →
Django Newsletter
This RSS feed is published on https://django-news.com/. You can also subscribe via email.
27 Jun 2025 3:00pm GMT
Django: hide the development server warning
From Django 5.2 (April 2025), the runserver management command outputs a warning:
$ ./manage.py runserver
...
Starting development server at http://127.0.0.1:8000/
Quit the server with CONTROL-C.
WARNING: This is a development server. Do not use it in a production setting. Use a production WSGI or ASGI server instead.
For more information on production servers see: https://docs.djangoproject.com/en/5.2/howto/deployment/
The warning uses bold, yellow text for emphasis.
Here's the relevant release note:
A new warning is displayed when running runserver, indicating that it is unsuitable for production. This warning can be suppressed by setting the DJANGO_RUNSERVER_HIDE_WARNING environment variable to "true".
I think the warning is somewhat useful for new users, who do sometimes skip the deployment checklist that states that runserver is not designed for production use. That said, the warning is rather annoying for the majority of users who have correctly set up a production server. I also fear it will lead to warning fatigue, normalizing that runserver outputs lots of warning text, making it easier to skip over other, more pressing messages that may be logged.
If you've set up your production environment to use a secure server, like gunicorn, then you probably want to hide this warning. As the release note states, you can do this by setting the environment variable DJANGO_RUNSERVER_HIDE_WARNING to "true".
While you can manage environment variables in a bunch of different ways, I think the simplest method here is to set it at the top of your settings file:
import os
# Hide development server warning
# https://docs.djangoproject.com/en/stable/ref/django-admin/#envvar-DJANGO_RUNSERVER_HIDE_WARNING
os.environ["DJANGO_RUNSERVER_HIDE_WARNING"] = "true"
(I would rather this warning was configured with a setting, or it came through the system check framework, but that's a change for a future Django release, at most.)
With those lines in place, runserver will be a little bit more peaceful:
$ ./manage.py runserver
...
Starting development server at http://127.0.0.1:8000/
Quit the server with CONTROL-C.
27 Jun 2025 4:00am GMT
26 Jun 2025
Django community aggregator: Community blog posts
Production-ready cache-busting for Django and Tailwind CSS
I'm a big fan of the django-tailwind-cli package. It makes integrating Tailwind CSS into a Django project incredibly simple. By managing the Tailwind watcher process for you, it streamlines development, especially when paired with django-browser-reload for live updates. It's a fantastic developer experience.
However, when I first deployed a project using this setup, I ran into a classic problem: caching. You see, django-tailwind-cli creates a single tailwind.css file that you load in your base template. In production, browsers and CDNs will aggressively cache this file to improve performance. This is normally a good thing! But when you deploy an update, like adding a new Tailwind class to a template, your users might not see the changes. Their browser will continue to serve the old, cached tailwind.css file, leading to broken or outdated styling.
Luckily, Django has a built-in cache-busting mechanism in the form of ManifestStaticFilesStorage. But there's one important caveat: you can't use this class directly. The Tailwind build process relies on a source file (typically css/source.css) that contains this line:
@import "tailwindcss";
When collectstatic runs, ManifestStaticFilesStorage tries to be helpful and process this file, too. It attempts to find and hash source.css, and it also attempts to hash the imported tailwindcss, which won't work.
The solution is to create a custom storage class that tells Django to leave source.css alone.
storage.py
from django.contrib.staticfiles.storage import ManifestStaticFilesStorage


class CustomManifestStaticFilesStorage(ManifestStaticFilesStorage):
    def hashed_name(self, name, content=None, filename=None):
        # Skip hashing for source.css - it's only used during Tailwind compilation
        if name == 'css/source.css':
            return name
        return super().hashed_name(name, content, filename)

    def post_process(self, paths, **options):
        # Exclude source.css from post-processing
        paths = {k: v for k, v in paths.items() if k != 'css/source.css'}
        return super().post_process(paths, **options)
Then configure it in settings.py:
settings.py
STATIC_ROOT = BASE_DIR / "static_root"
STATIC_URL = "/static/"
STORAGES = {
    "default": {
        "BACKEND": "django.core.files.storage.FileSystemStorage",
    },
    "staticfiles": {
        "BACKEND": "django.contrib.staticfiles.storage.StaticFilesStorage"
        if DEBUG else "storage.CustomManifestStaticFilesStorage",
    },
}
The last thing to do is update your base template. Replace the {% tailwind_css %} tag with:
base.html
<link rel="preload" href="{% static 'css/tailwind.css' %}" as="style">
<link href="{% static 'css/tailwind.css' %}" rel="stylesheet" />
With everything configured, your deployment process for static files will now be a two-step command:
./manage.py tailwind build
./manage.py collectstatic --noinput
First, tailwind build creates the final tailwind.css file. Then, collectstatic picks it up, hashes it with a unique name like tailwind.4e3e58f1a4a4.css, and places it in your STATIC_ROOT directory, ready to be served.
That's it! Your Tailwind styles are now production-ready and properly cache-busted.
26 Jun 2025 5:32pm GMT
Building a Multi-tenant App with Django
This tutorial looks at how to implement multi-tenancy in Django.
26 Jun 2025 3:28am GMT
Native connection pooling in Django 5 with PostgreSQL
Enabling native connection pooling in Django 5 gives me a 5.4x speedup.
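A minimal settings sketch of what that looks like, assuming Django 5.1+ with psycopg 3 and its connection-pool support installed (connection details are placeholders):

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "myapp",
        "USER": "myapp",
        "PASSWORD": "change-me",
        "HOST": "127.0.0.1",
        "OPTIONS": {
            "pool": True,  # or a dict of psycopg_pool.ConnectionPool options
        },
    }
}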
26 Jun 2025 2:36am GMT
24 Jun 2025
Django community aggregator: Community blog posts
Django: Introducing inline-snapshot-django
I recently released a new package called inline-snapshot-django. It's a tool for snapshot testing SQL queries in Django projects, described shortly.
Snapshot testing and inline-snapshot
inline-snapshot-django builds on top of the excellent inline-snapshot, a nifty snapshot testing library.
Snapshot testing is a type of testing that compares a result against a previously captured "snapshot" value. It can enable you to write tests quickly and assert on many details that you'd otherwise miss.
Snapshot testing is quite popular in other languages, such as JavaScript, but it has been underused in Python. This may be due to a lack of good tools to help automate snapshot management. inline-snapshot provides nice workflows tied into pytest, so I hope it will popularize the technique among Python and Django developers.
The inline-snapshot developer, Frank Hoffmann, is working intensely on it, and he provides some extra tools and features for his GitHub sponsors.
inline-snapshot-django
I created inline-snapshot-django to provide an advanced yet easier-to-use alternative to Django's assertNumQueries(). While assertNumQueries() is handy, it is a rather blunt tool that leaves you in the dark when debugging failures. For example, say you encountered this test:
from django.test import TestCase


class IndexTests(TestCase):
    def test_success(self):
        with self.assertNumQueries(1):
            response = self.client.get("/")
        assert response.status_code == 200
If it starts failing with an incorrect query count, you'll see an error like:
$ pytest
========================= test session starts =========================
...
example/tests.py F [100%]
============================== FAILURES ===============================
_______________________ IndexTests.test_success _______________________
self = <example.tests.IndexTests testMethod=test_success>
def test_success(self):
> with self.assertNumQueries(1):
^^^^^^^^^^^^^^^^^^^^^^^^
E AssertionError: 2 != 1 : 2 queries executed, 1 expected
E Captured queries were:
E 1. SELECT COUNT(*) AS "__count" FROM "example_character"
E 2. SELECT "example_character"."id", "example_character"."name", "example_character"."strength", "example_character"."charisma", "example_character"."klass_id" FROM "example_character" LIMIT 10
The message makes it clear that the query count is incorrect, but it doesn't help you narrow down what changed. You might glean the change from the captured queries, but without a record of what the queries were before, it's hard to know what to look for. This problem becomes particularly challenging in larger projects with many queries.
One client project that I have been working on tried to alleviate this problem by adding a docstring under the assertNumQueries() call, recording the expected query fingerprints:
from django.test import TestCase


class IndexTests(TestCase):
    def test_success(self):
        with self.assertNumQueries(1):
            """
            1. SELECT ... FROM example_character
            """
            response = self.client.get("/")
        assert response.status_code == 200
This was a great idea, but failures still required manual comparison and updating of the query fingerprints. Managing and checking these fingerprints required a considerable effort, particularly when I worked on a Django 5.2 upgrade that added and removed some SQL queries in the admin interface. That work inspired inline-snapshot-django to automate such fingerprint comments, using inline-snapshot.
Here's the above test rewritten using inline-snapshot-django:
from django.test import TestCase
from inline_snapshot import snapshot
from inline_snapshot_django import snapshot_queries


class IndexTests(TestCase):
    def test_success(self):
        with snapshot_queries() as snap:
            response = self.client.get("/")
        assert response.status_code == 200
        assert snap == snapshot(
            [
                "SELECT ... FROM example_character LIMIT ...",
            ]
        )
The test now uses the snapshot_queries() context manager to capture the fingerprints of the executed SQL queries. The fingerprints are then compared with a snapshot() call in the final assertion.
inline-snapshot manages values within snapshot() calls. It wraps comparisons, and if they fail, it can automatically update the call inside the test file with the new value.
For example, if we repeat the previous failure, inline-snapshot will prompt us to update the snapshot:
$ pytest
...
example/tests.py .E [100%]
───────────────────────── inline-snapshot ─────────────────────────
────────────────────────── Fix snapshots ──────────────────────────
╭──────────────────────── example/tests.py ────────────────────────╮
│ @@ -11,6 +11,7 @@                                                 │
│                                                                   │
│          assert response.status_code == 200                       │
│          assert snap == snapshot(                                 │
│              [                                                    │
│ +                "SELECT ... FROM example_character",             │
│                  "SELECT ... FROM example_character LIMIT ...",   │
│              ]                                                    │
│          )                                                        │
╰───────────────────────────────────────────────────────────────────╯
Do you want to fix these snapshots? [y/n] (n):
As the message implies, answering "y" updates the test file to insert the new query fingerprint:
@@ -11,6 +11,7 @@
         assert response.status_code == 200
         assert snap == snapshot(
             [
+                "SELECT ... FROM example_character",
                 "SELECT ... FROM example_character LIMIT ...",
             ]
         )
inline-snapshot only does this prompting when running in an interactive terminal. On non-interactive runs, such as in CI, it will simply fail the test.
inline-snapshot also provides other modes, such as --inline-snapshot=update, to automatically update snapshots without prompting. I covered the workflows for such modes with inline-snapshot-django over in its usage documentation.
History and fun with Rust
inline-snapshot-django is a new package, but it's inspired by my previous package django-perf-rec (Django performance recorder). I created django-perf-rec nearly 9 years ago, in 2016, to perform similar SQL query snapshot testing. However, it works differently by storing the snapshots in a separate YAML file.
For example, you might write the previous test with django-perf-rec like:
import django_perf_rec
from django.test import TestCase


class IndexTests(TestCase):
    def test_success(self):
        with django_perf_rec.record():
            response = self.client.get("/")
        assert response.status_code == 200
The record() context manager captures SQL query fingerprints and stores them in an entry in a YAML file, copying the name from the test file, such as tests.perf.yml. Its contents would look like:
IndexTests.test_success:
- db: SELECT COUNT(*) AS "__count" FROM "example_character"
- db: 'SELECT ... FROM "example_character" LIMIT #'
While it can be nice to get long query fingerprint lists out of the test file, the old package has many disadvantages:
- Separate snapshots are harder to review, as they don't make it super clear what changed, where.
- Renaming test files, classes, or methods requires manually updating the YAML file.
- YAML syntax is not always obvious and can be hard to format nicely.
- The SQL fingerprinting process was relatively slow and inflexible, being built in Python on top of sqlparse.
Starting afresh in inline-snapshot-django allowed me to address these issues. Building on top of inline-snapshot solves the first three issues, as it stores the snapshots inline in the test file. I tackled the final issue of poor SQL fingerprinting by building a new SQL fingerprinting library in Rust, called sql-fingerprint.
sql-fingerprint is my second open source Rust project, after Djade, my Django template formatter. It provides a fast way to fingerprint SQL queries, based on the excellent sqlparser crate, a production-grade SQL parser used by many Rust-based databases and tools.
I made sql-fingerprint as a Rust crate, and intended to publish it to PyPI as a Python package with the same name. However, at the last second, PyPI rejected the name sql-fingerprint because it was too similar to an existing package, sqlfingerprint (no hyphen). Therefore, the Python wrapper of sql-fingerprint is called sql-impressao, based on the Portuguese word for "fingerprint" (impressão digital).
Using Rust has worked well for sql-fingerprint/sql-impressao, as it is much faster to process strings in Rust than in Python. While the previous SQL fingerprinting in django-perf-rec was rather visible in test profiles, taking perhaps 1-5% of the total test time, sql-impressao rounds to 0% in the test profiles I've done.
Response snapshots would be cool too
While inline-snapshot-django currently only supports SQL query fingerprints, I think it can be extended to support snapshot testing of other Django features. For example, it might be possible to snapshot response details from the test client, allowing instant tests like:
from django.test import TestCase
from inline_snapshot import snapshot
from inline_snapshot_django import details, ResponseDetails


class IndexTests(TestCase):
    def test_success(self):
        response = self.client.get("/")
        assert details(response) == snapshot(
            ResponseDetails(
                status_code=200,
                content_type="text/html",
            )
        )
This idea is tracked in Issue #11 if you'd like to weigh in on it.
Fin
Please give inline-snapshot-django a try today in your Django projects. The documentation awaits your perusal, and I'd love to hear your feedback.
Happy snapshot testing,
-Adam
24 Jun 2025 4:00am GMT
20 Jun 2025
Django community aggregator: Community blog posts
Django News - Python 3.14.0 beta 3 - Jun 20th 2025
News
DSF member of the month - Elena Williams
Elena Williams, Django community stalwart and DSF Code of Conduct WG member, reflects on her contributions, favorite Django features, and community leadership.
International Travel to DjangoCon US 2025
Are you attending DjangoCon US 2025 in Chicago, Illinois, but you are not from US and need some travel information? Here are some things to consider when planning your trip.
Python 3.14.0 beta 3 is released!
Python 3.14.0 beta 3 adds deferred annotation evaluation, template string literals, multiple interpreters, zstd compression module, free-threaded support and other core improvements, ready for testing.
2025 PSF Board Election Schedule
PSF defines 2025 board election schedule with nomination, voter affirmation, voting dates, membership eligibility, and candidate resources for the Python community.
Updates to Django
Today 'Updates to Django' is presented by Abigail Afi Gbadago from the DSF Board and Djangonaut Space!
Last week we had 18 pull requests merged into Django by 15 different contributors - including 6 first-time contributors! Congratulations to Viliam Mihálik, Sulove Bista, ruvilonix, Jericho Serrano, nakanoh and Jeff Cho for having their first commits merged into Django - welcome on board!
This week's Django highlights
- A follow-up to CVE-2025-48432 (see the security archive), which addresses internal HTTP response logging, has been added.
- The "q" used in the internal MediaType.params property has been restored.
- Inline JavaScript has been removed in Geometry widgets for refactoring purposes.
Special thanks to Claude Paroz for the long work on the PR.
Django Newsletter
Articles
Cut Django Database Latency by 50-70ms With Native Connection Pooling
Native connection pooling in Django 5.1 cuts 50-70ms PostgreSQL connection latency, simplifies deployment without external tools, and boosts response times by 10-30%.
Make Django show dates and times in the visitor's local timezone
Combine middleware, a custom template tag, and client-side JavaScript to detect the visitor's timezone and consistently render Django DateTimeField values in the user's local time.
Avoiding Timezone Traps: Correctly Extracting Date/Time Subfields in Django with PostgreSQL
When extracting date subfields in Django with PostgreSQL, avoid applying AT TIME ZONE to DATE types under UTC session to prevent offset-based year miscalculations.
Avoiding PostgreSQL Pitfalls: The Hidden Cost of Failing Inserts
Using Django's create() with exception handling for unique constraint violations causes expensive rollbacks and bloat; use ON CONFLICT DO NOTHING via bulk_create(ignore_conflicts) or raw SQL instead (a minimal sketch follows below).
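A minimal sketch of the bulk_create approach (the Subscriber model and its unique email field are hypothetical):

from myapp.models import Subscriber

Subscriber.objects.bulk_create(
    [Subscriber(email="a@example.com"), Subscriber(email="b@example.com")],
    ignore_conflicts=True,  # rendered as INSERT ... ON CONFLICT DO NOTHING on PostgreSQL
)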
Django Fellow Report
Django Fellow Report - Natalia Bidart
3 tickets triaged, 8 reviewed, 6 authored, security reports triage, and more.
Django Fellow Report - Sarah Boyce
10 tickets triaged, 15 reviewed, 3 authored, released Django 5.2.3, 5.1.11, 4.2.23, and more.
Events
Django on the Med - October 7th - 9th in Palafrugell, Spain
A new website with FAQs is now live. Django Development Sprints. Three days to get together and work on Django. Twice a year, in Pescara, Italy, and Palafrugell, Spain. Spring and Autumn.
DjangoCon Videos
How to solve a Python mystery - Aivars Kalvāns
This talk introduces useful Linux performance and observability tools and covers real-world mysteries that this approach has helped to solve.
Bulletproof Data Pipelines: Django, Celery, and the Power of Idempotency - Ricardo Morato Rocha
Learn how to build resilient data pipelines with Django, Celery, and idempotent consumers. We'll dive into robust error-handling techniques and the role of idempotency in ensuring reliable and consistent data processing.
Logs, shells, caches and other strange words we use daily - Slawa Gladkov
Have you ever stopped to think about where the words we use in software engineering come from? Terms like "bug" and "debugging" are familiar to most of us, but what about "daemon" or "cache"? This talk takes a trip down memory lane to explore the surprising and often quirky origins of some of the most common words in computing.
Django News Jobs
Full Stack Software Engineer at Switchboard
Django Fellow at Django Software Foundation
Senior Software Engineer at Simons Foundation
Senior Backend Engineer at Wasmer
Django Newsletter
Projects
justinmayer/smartypants.py
Translate plain ASCII quotation marks and other characters into "smart" typographic HTML entities.
mikeckennedy/jinja_partials
Simple reuse of partial HTML page templates in the Jinja template language for Python web frameworks. #pypackage
niwinz/django-jinja
Simple and nonobstructive jinja2 integration with Django.
Sponsorship
Sponsor Django News
Are you interested in connecting with a vibrant community of over 4,200 active Django developers? We have sponsorship opportunities for this summer. Reach an engaged audience and support the Django community!
Explore Sponsorship Options →
Django Newsletter
This RSS feed is published on https://django-news.com/. You can also subscribe via email.
20 Jun 2025 3:00pm GMT
17 Jun 2025
Django community aggregator: Community blog posts
Avoiding Timezone Traps: Correctly Extracting Date/Time Subfields in Django with PostgreSQL
Working with timezones can sometimes lead to confusing results, especially when combining Django's ORM, raw SQL for performance (like in PostgreSQL materialized views), and specific timezone requirements. I recently had an issue while aggregating traffic stop data by year, where all yearly calculations needed to reflect the 'America/New_York' (EST/EDT) timezone, even though our original data contained timestamp with time zone fields. We were using django-pgviews-redux to manage materialized views, and I mistakenly attempted to apply timezone logic to a date field that had no time or timezone information.
The core issue stemmed from a misunderstanding of how PostgreSQL handles EXTRACT operations on date types when combined with AT TIME ZONE, especially within a Django environment that defaults database connections to UTC.
PostgreSQL's Handling of Timestamps and Timezones
PostgreSQL's timestamp with time zone (often abbreviated as timestamptz) type is a common database type for storing date and time information. As per the PostgreSQL documentation:
For timestamp with time zone values, an input string that includes an explicit time zone will be converted to UTC (Universal Coordinated Time) using the appropriate offset for that time zone.
When you query a timestamptz column, PostgreSQL converts the stored UTC value back to the current session's TimeZone. You can see your session's timezone with SHOW TIME ZONE;. Django, by default, sets this session TimeZone to 'UTC' for all database connections. This is a sensible default for consistency but can be a source of confusion if you're also interacting with the database via psql or other clients that might use your system's local timezone (e.g., 'America/New_York' on my Mac via Postgres.app).
You can change the session timezone and observe its effect:
tztest=# SHOW TIME ZONE;
-- TimeZone
-- ------------------
-- America/New_York (If running from my Mac via Postgres.app and psql)
tztest=# SELECT '2025-01-01 00:00:00 EST'::timestamptz;
-- SET
-- timestamptz
-- ------------------------
-- 2025-01-01 00:00:00-05 (Stored as UTC, displayed in session TZ which is America/New_York)
tztest=# SET TIME ZONE 'UTC'; SELECT '2025-01-01 00:00:00 EST'::timestamptz;
-- SET
-- timestamptz
-- ------------------------
-- 2025-01-01 05:00:00+00 (Stored as UTC, displayed in session TZ which is now UTC)
The AT TIME ZONE clause is used to convert a timestamp with time zone to a timestamp without time zone in a specified timezone, or a timestamp without time zone to a timestamp with time zone by assuming the naive timestamp is in the specified zone.
-- Assuming session timezone is UTC
tztest=# SELECT '2025-01-01 05:00:00+00'::timestamptz AT TIME ZONE 'America/New_York';
-- timezone
-- ---------------------
-- 2025-01-01 00:00:00 (Result is timestamp WITHOUT time zone)
The Pitfall: Extracting Subfields from DATE Types with AT TIME ZONE
Here's where things got tricky for me. My goal was to extract the year of traffic stops in 'America/New_York' time. The original data was timestamptz, but at some point in my raw SQL query construction with several Common Table Expressions (CTEs) for the materialized view, I was working with a date type.
Consider this scenario, which mirrors the confusion: your application (session TimeZone is 'UTC') executes a query like this:
-- Session TimeZone is 'UTC'
SELECT
'2025-01-01'::date AS the_date,
EXTRACT('year' FROM '2025-01-01'::date)::integer AS extract_year_simple,
EXTRACT('year' FROM ('2025-01-01'::date AT TIME ZONE 'America/New_York'))::integer AS extract_year_at_new_york;
You might expect extract_year_at_new_york to be 2025. However, it is 2024:
-[ RECORD 1 ]------------+-----------
the_date | 2025-01-01
extract_year_simple | 2025
extract_year_at_new_york | 2024
2024? What happened?
- '2025-01-01'::date is simply the date January 1st, 2025.
- When AT TIME ZONE 'America/New_York' is applied to this date type, PostgreSQL implicitly converts the date to a timestamp at the beginning of that day in the current session's timezone. Since Django sets the session to 'UTC', '2025-01-01'::date becomes 2025-01-01 00:00:00 UTC.
- Then, 2025-01-01 00:00:00 UTC is converted to the 'America/New_York' timezone. 2025-01-01 00:00:00 UTC is actually 2024-12-31 19:00:00 EST (UTC-5).
- EXTRACT('year' ...) from 2024-12-31 19:00:00 EST correctly yields 2024.
This behavior occurs because applying AT TIME ZONE to a date type (or a timestamp with time zone) performs a conversion based on the session timezone.
In my case, I was aggregating dates by year in a materialized view and mistakenly extracted years using AT TIME ZONE 'America/New_York' when it wasn't necessary. This led to incorrect results when aggregating the data in Django, because traffic stops on January 1st were being grouped into the wrong year, causing the counts to be off from other queries that are grouped by year.
When I was debugging the issue, I was confused because I was using psql with the session timezone set to 'America/New_York', which made it appear that the EXTRACT was working as I expected. It wasn't until I switched to a UTC session that the issue became clear.
For example, when the same query above is re-run with the timezone set to 'America/New_York', the extracted year is consistent:
tztest=# \x
Expanded display is on.
tztest=# SET TIME ZONE 'America/New_York';
SET
tztest=# SELECT
'2025-01-01'::date
, EXTRACT('year' FROM '2025-01-01'::date)::integer AS extract_year_display_est
, EXTRACT('year' FROM '2025-01-01'::date AT TIME ZONE 'America/New_York')::integer AS extract_year_at_est_display_est;
-[ RECORD 1 ]-------------------+-----------
date | 2025-01-01
extract_year_display_est | 2025
extract_year_at_est_display_est | 2025
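Back on the Django side, a safer pattern is to extract the year from the original timestamptz column with an explicit time zone, rather than from an intermediate date. Here is a rough sketch; TrafficStop and its stop_time field are hypothetical names standing in for the real schema:
from zoneinfo import ZoneInfo

from django.db.models import Count
from django.db.models.functions import ExtractYear

from myapp.models import TrafficStop  # hypothetical model with a DateTimeField named stop_time

ny = ZoneInfo("America/New_York")

# ExtractYear with tzinfo makes PostgreSQL convert the timestamptz to
# America/New_York before extracting the year, regardless of the session TimeZone.
stops_per_year = (
    TrafficStop.objects
    .annotate(year=ExtractYear("stop_time", tzinfo=ny))
    .values("year")
    .annotate(num_stops=Count("pk"))
    .order_by("year")
)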
Conclusion
When working with time zones, dates, and timestamps in Django and PostgreSQL, it's important to be aware of how time zones are handled in each system. Be mindful of the time zone settings in your database and Django, and be careful when extracting subfields from dates and timestamps. Hopefully, this post will help you avoid the pitfalls I encountered when working with time zones and dates!
17 Jun 2025 12:00am GMT
13 Jun 2025
Django community aggregator: Community blog posts
Django News - New Django Fellow Position! - Jun 13th 2025
News
DSF calls for applicants for a Django Fellow
DSF invites experienced Django developers to apply for a new Django Fellow position focused on framework maintenance, mentoring, and security oversight.
Django bugfix releases issued: 5.2.3, 5.1.11, and 4.2.23
Django issues bugfix releases for 5.2.3, 5.1.11, and 4.2.23 to finalize mitigation for potential log injection using safer logging practices.
Python Release Python 3.13.5
Python 3.13.5 resolves critical bugs in extension building and generator expressions, complementing Python 3.13's experimental free-threaded mode and JIT for improved performance.
Updates to Django
Hello there! Today 'Updates to Django' is presented by Raffaella from Djangonaut Space!
Last week we had 11 pull requests merged into Django by 10 different contributors - including 2 first-time contributors! Congratulations to myoungjinGo and Blayze for having their first commits merged into Django - welcome on board!
Fixes from last week include:
- A log injection possibility: the remaining response logging is migrated to django.utils.log.log_response(), which safely escapes arguments such as the request path to prevent unsafe log output (CVE-2025-48432). This is released within 5.2.3, 5.1.11, and 4.2.23.
- An issue where bulk_create() would raise an IntegrityError due to null values in the _order column when used with models having the order_with_respect_to Meta option is now fixed. The fix ensures proper order values are assigned to objects during bulk creation. Special thanks to myoungjinGo for the first contribution and the long work on the PR, and to everyone who helped with the review 🥳
Django Newsletter
Sponsored Link 1
Open a Django office in Bulgaria with HackSoft!
Looking to expand your operations? We offer end-to-end support in setting up your Django development office. Learn more!
Articles
Announcing django-rq-cron
A Django app for running cron jobs with RQ.
Beyond htmx: building modern Django apps with Alpine AJAX
Leveraging Alpine AJAX, Django developers can achieve progressive enhancement with concise, server-rendered partial updates that simplify frontend complexity and ensure graceful degradation.
Better Django management commands with django-click and django-typer
Streamline Django management commands using django-click and django-typer for cleaner syntax, built-in argument parsing, and richer output via type annotations and customizable CLI styling.
Django, JavaScript modules and importmaps
Integrating JavaScript modules in Django with importmaps simplifies cache busting and app integration while exposing challenges with static asset storage and bundling.
Python: a quick cProfile recipe with pstats
Learn how to efficiently profile Django migrations and other Python scripts using cProfile and pstats to analyze slow functions and optimize database calls.
The currency of open-source
Using recognition as a strategic tool aligns individual motivations to streamline community efforts and guide open-source project direction.
DjangoCon Videos
Turn back time: Converting integer fields to bigint using Django migrations at scale
Django migrations enable converting IntegerField to BigIntegerField with minimal downtime using RunSQL for large-scale PostgreSQL upgrades on money and primary key fields.
Data-Oriented Django Drei
The talk demonstrates efficient application of Data Oriented Design for leveraging Django tools to optimize database indexes for faster query performance.
The fine print in Django release notes
Uncover overlooked Django 5.0+ features and their code improvements such as URL query modifications, LoginRequiredMiddleware, efficient Django Admin display and bulk_create conflict handling.
Sponsored Link 2
Scout Monitoring: Logs, Traces, Error (coming soon). Made for devs who own products, not just tickets.
Django News Jobs
Full Stack Software Engineer at Switchboard
Django Fellow at Django Software Foundation
Senior Software Engineer at Simons Foundation
Senior Backend Engineer at Wasmer
Django Newsletter
Projects
alexandercollins/turbodrf
The dead simple Django REST Framework API generator with role-based permissions.
buttondown/django-rq-cron
A cron runner built atop rq.
This RSS feed is published on https://django-news.com/. You can also subscribe via email.
13 Jun 2025 3:00pm GMT
iSAQB meetup: software architecture decision making in practice
I attended a meetup of the Dutch iSAQB community in Utrecht (NL). The location was in the old industrial buildings of the former Werkspoor train manufacturer, something I personally like :-)
(At least three people asked me during dinner whether there were any Dutch Python meetups, so I'll post the three active ones that I know of here for ease of finding them: Utrecht, Leiden and Amsterdam. And don't forget the two one-day conferences, PyGrunn (Groningen) and PyCon NL (Utrecht).)
Making significant software architecture decisions - Bert Jan Schrijver
Software architecture. What is software? Algorithms, code-that-works, the-part-you-cannot-kick. And what is software architecture? The part that is expensive to change, the structure of the system, best practices. The decisions that are important and hard and expensive to change. Software architecture is about making decisions. Decisions that hurt when you get them wrong.
There are bad reasons for architecture decisions:
- We've always done it like this.
- We don't want to depend on XYZ.
- We need to be future-proof. (You often get elaborate complex systems with this reasoning. Isn't a simple solution more changeable and future-proof?)
- Because the product owner wants it.
- Because the architect wants it. (If the architect wants something without real feedback from the team that has to build it.)
Some input you can use for architecture decisions:
- 5xW. Why, why, why, why, why. After asking "why" five times, you really get to the core.
- Every architecture decision should have a business component. (Don't pick a fancy framework when there's no business need.)
- Requirements.
- Constraints.
You also have to look at quality. ISO 25010 is a great checklist for software quality: self-descriptiveness, availability, recoverability, capacity, integrity, modifiability, testability, etc.
The perfect architecture doesn't exist; there are always trade-offs. Trade-off analysis can help you: gather requirements, figure out quality attributes and constraints, select potential solutions, discover and weigh trade-offs, pick the best-fitting solution. You can look at the book Fundamentals of Software Architecture.
An example? Security versus usability: 2FA is great for security, but a pain to use. Availability versus costs: more redundancy and more servers also mean it costs more. He recommends this video.
Something to keep in mind: organisational factors. What is the developer experience for your idea? The learning curve? IDE support? Does it integrate with the current landscape? How popular is it in the industry as a whole? What is the long-term viability? Will it still be actively developed and is there a community?
And there are business factors. Support. Labour pool. License costs. What are the costs of migration versus the benefits after migration? Productivity. Is there an exit strategy if you want to move away from a provider or technology?
Some trade-offs shouldn't even need to be considered. For instance when something risks irreversible damage to your business.
Creating effective and objective architectural decision records (ADRs) - Piet van Dongen
Nothing is static. People change jobs, business goals change, budgets change, etc. Time goes on and during this time you are making decisions. When a new colleague joins, is it clear which decisions have been made beforehand? Are decisions discoverable? And can the decisions be explained? Are they appropriate?
He asked "did you ever disagree with a decision that involved you?". Almost all hands went up. Bad decisions might have been made in the past because better solutions weren't known or available at the time. Or there was time pressure. Unclarity on the requirements. All reasons for decisions to be bad.
Decisions should be made when you really understand the context, which should be clear. And the decision should be objective and logical and clear and well-explained. And they should be made by the right stakeholders: was it a shared decision?
Note: architecture work isn't only done by official architects.
Some archetypes of wrong decision-making:
- Aristocrats. A decision made by a small group of self-appointed experts. Ivory tower. They only explain why their decision is perfect, but they don't concern themselves with trade-offs.
- Unobtainium. A theoretically perfect decision, but that totally isn't implementable.
- Ivory tower dump. Even more ivory tower than the aristocrats. Totally no input from the team.
- Pros and cons. Endless lists of pros and cons.
- Polder model. Consensus-based. A decision made by a huge group. Endless meetings.
Now... how to make decisions in the right way? ADRs, Architecture Decision Records. A structured/standardised document that documents the decision. Structure? For instance:
- Title + state + summary. Handy for easy scanning. State is something like "decided" or "rejected".
- Stakeholders. Names plus the roles they had when the decision was made. Find stakeholders using a 2x2 matrix: high/low power, high/low interest. A boss might be high power, low interest: keep him appropriately informed. High power, high interest: really involve them.
- Context of the decision. Clear and sharp. What is in scope, what not?
- Requirements. It is easy to come up with 1000 requirements. Stick to what is significant. What is significant? Requirements with high risk. Huge interest to high-power stakeholders. Hard-to-change items. The fewer requirements, the sharper the decision.
- Options. Nicely listed and scored on the requirements. And don't just give scores, but weigh them by the importance of the requirements. This also helps in understanding the decision afterwards. Options should be distinct: don't pick very similar solutions, you should have something to choose. And drop options that you know are never going to satisfy the requirements; this clears up clutter. But... watch out for tweaking the weights to get to the desired decision...
- Decision. The logical conclusion.
In case the decision turned out to be wrong, you now have a nice document and you can re-evaluate it. Perhaps you missed a stakeholder? Perhaps a requirement was missed? Or a weight should be changed? You can then make a 2.0 of the ADR. You learned from your mistakes.
13 Jun 2025 4:00am GMT
12 Jun 2025
Django community aggregator: Community blog posts
Make Django show dates and times in the visitor's local timezone
When you're building a web app with Django, handling timezones is a common hurdle. You're likely storing timestamps in your database in UTC (which is best practice), but your users are scattered across the globe. Showing them a UTC timestamp for when they left a comment isn't very friendly. They want to see it in their own local time.
Let's start with a typical scenario. You have a Comment model that stores when a comment was added:
models.py
from django.db import models

# Post is this app's blog post model; User is e.g. the django.contrib.auth user model.
class Comment(models.Model):
    post = models.ForeignKey(Post, on_delete=models.CASCADE)
    user = models.ForeignKey(User, on_delete=models.CASCADE)
    comment = models.TextField()
    added = models.DateTimeField(auto_now_add=True)
In your Django settings, you've correctly set TIME_ZONE = "UTC".
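For reference, the relevant settings look roughly like this; the USE_TZ line is an assumption based on the modern default, where it is already on:
settings.py
TIME_ZONE = "UTC"  # default timezone used when no other timezone is activated
USE_TZ = True      # keep datetimes timezone-aware; the default since Django 5.0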
When you render these comments in a template, you'll find the problem right away:
post.html
{% for comment in post.comment_set.all %}
<div>
<h3>From {{ comment.user.name }} on {{ comment.added }}</h3>
<p>{{ comment.comment }}</p>
</div>
{% endfor %}
The output for {{ comment.added }} will be in UTC, not the visitor's local time. Let's fix that.
The Server-Side Fix: A Timezone Middleware
The most robust way to solve this is on the server. If Django knows the user's timezone, it can automatically convert all datetime objects during rendering. The plan is simple:
- Use JavaScript to get the visitor's timezone from their browser.
- Store it in a cookie.
- Create a Django middleware to read this cookie on every request and activate the timezone.
First, let's create the middleware. This small piece of code will check for a timezone cookie and, if it exists, activate it for the current request.
myapp/middleware.py
from zoneinfo import ZoneInfo

from django.utils import timezone


class TimezoneMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        tzname = request.COOKIES.get("timezone")
        if tzname:
            try:
                # Activate the timezone for this request
                timezone.activate(ZoneInfo(tzname))
            except Exception:
                # Fall back to the default timezone (UTC) if the name is invalid
                timezone.deactivate()
        else:
            timezone.deactivate()
        return self.get_response(request)
Don't forget to add the middleware to your settings.py:
settings.py
# settings.py
MIDDLEWARE = [
    # ...
    "myapp.middleware.TimezoneMiddleware",
]
Next, we need to set that cookie. A tiny snippet of JavaScript in your base template is all it takes. The Intl object in modern browsers makes this incredibly easy.
base.html
<script>
document.cookie = "timezone=" + Intl.DateTimeFormat().resolvedOptions().timeZone + "; path=/";
</script>
With this in place, every rendered datetime object will now be in the user's local timezone. Brilliant!
Except for one small catch: it only works after the first page load. On the very first visit, the browser hasn't sent the cookie yet. Django renders the page in UTC, then the JavaScript runs and sets the cookie for the next request. This means new visitors get UTC times on their first impression. We can do better.
Fixing the First-Visit Problem with a Template Tag and JavaScript
To create a seamless experience, we need to handle that first visit gracefully. The solution is to combine our server-side middleware with a little client-side enhancement. We'll render the time in a way that JavaScript can easily find and format it, ensuring the correct time is shown even on the first load.
First, we create a custom template tag that wraps our timestamp in a semantically-correct <time> element. This element includes a machine-readable datetime attribute, which is perfect for our JavaScript to hook into.
myapp/templatetags/local_time.py
from django import template
from django.template.defaultfilters import date
from django.utils.html import format_html
from django.utils.timezone import localtime
register = template.Library()
@register.filter
def local_time(value):
    """
    Renders a <time> element with an ISO 8601 datetime and a fallback display value.

    Example:
        {{ comment.added|local_time }}
    Outputs:
        <time datetime="2024-05-19T10:34:00+02:00" class="local-time">May 19, 2024 at 10:34 AM</time>
    """
    if not value:
        return ""

    # Localize the time based on the active timezone (from middleware)
    localized = localtime(value)

    # Format for the datetime attribute (ISO 8601)
    iso_format = date(localized, "c")

    # A user-friendly format for the initial display
    display_format = f"{date(localized, 'DATE_FORMAT')} at {date(localized, 'h:i A')}"

    return format_html('<time datetime="{}" class="local-time">{}</time>', iso_format, display_format)
Now, update your template to use this new filter. Remember to load your custom tags first.
post.html
{% load local_time %}
{% for comment in post.comment_set.all %}
<div>
<h3>From {{ comment.user.name }} on {{ comment.added|local_time }}</h3>
<p>{{ comment.comment }}</p>
</div>
{% endfor %}
Finally, add a bit of JavaScript to your base template. This script will find all our <time> elements and re-format their content using the browser's knowledge of the local timezone.
base.html
<script>
document.addEventListener('DOMContentLoaded', () => {
    document.querySelectorAll('.local-time').forEach((el) => {
        const utcDate = new Date(el.getAttribute('datetime'));
        el.textContent = utcDate.toLocaleString(undefined, {
            dateStyle: 'medium',
            timeStyle: 'short'
        });
    });
});
</script>
The Best of Both Worlds
So why use both the middleware and the JavaScript? Because together, they cover all bases and provide the best user experience.
- On the first visit: The user has no timezone cookie and the middleware does nothing. The local_time template tag renders the time in your server's default timezone (UTC). Immediately after the page loads, the JavaScript runs, finds the .local-time element, and instantly rewrites its content to the user's actual local time. There might be a barely-perceptible flicker, but only on this very first page view.
- On all subsequent visits: The user has the cookie. The TimezoneMiddleware activates their timezone. The local_time template tag now renders the time correctly, right from the server. The JavaScript still runs, but it essentially replaces the already-correct time with the same correct time, resulting in no visible change.
This two-part approach gives you the best of server-side rendering (no content-shifting for returning visitors) while using client-side JavaScript as a progressive enhancement to fix the one edge case where the server can't know better.
12 Jun 2025 8:38pm GMT