13 Jun 2025
Django community aggregator: Community blog posts
Django News - New Django Fellow Position! - Jun 13th 2025
News
DSF calls for applicants for a Django Fellow
DSF invites experienced Django developers to apply for a new Django Fellow position focused on framework maintenance, mentoring, and security oversight.
Django bugfix releases issued: 5.2.3, 5.1.11, and 4.2.23
Django issues bugfix releases for 5.2.3, 5.1.11, and 4.2.23 to finalize mitigation for potential log injection using safer logging practices.
Python Release Python 3.13.5
Python 3.13.5 resolves critical bugs in extension building and generator expressions, complementing Python 3.13's experimental free-threaded mode and JIT for improved performance.
Updates to Django
Hello there! Today 'Updates to Django' is presented by Raffaella from Djangonaut Space!
Last week we had 11 pull requests merged into Django by 10 different contributors - including 2 first-time contributors! Congratulations to myoungjinGo and Blayze for having their first commits merged into Django - welcome on board!
Fixes from last week include:
- A log injection possibility: the remaining response logging is migrated to django.utils.log.log_response(), which safely escapes arguments such as the request path to prevent unsafe log output (CVE-2025-48432). This is released within 5.2.3, 5.1.11, and 4.2.23.
- An issue where bulk_create() would raise an IntegrityError due to null values in the _order column, when used with models having the order_with_respect_to Meta option, is now fixed. The fix ensures proper order values are assigned to objects during bulk creation. Special thanks to myoungjinGo for the first contribution and the long work on the PR, and to everyone who helped with the review!
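The first fix above, the log-injection escape, can be sketched in plain Python. This is an illustration only, not Django's actual log_response() implementation: it shows why an unescaped newline in a request path lets an attacker forge extra log lines.

```python
# Illustrative only: not Django's actual implementation of
# django.utils.log.log_response(). An unescaped newline in the
# request path would start a second, forged log line.
malicious_path = "/page\n2025-06-13 INFO forged log entry"

def escape_for_log(value: str) -> str:
    # Keep the value on a single log line by escaping control characters.
    return value.replace("\r", "\\r").replace("\n", "\\n")

print("Bad Request: %s" % escape_for_log(malicious_path))
```

With the escaping in place, the whole path stays on one line and the forged entry is visible as literal `\n` text instead of a new log record.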
Django Newsletter
Sponsored Link 1
Open a Django office in Bulgaria with HackSoft!
Looking to expand your operations? We offer end-to-end support in setting up your Django development office. Learn more!
Articles
Announcing django-rq-cron
A Django app for running cron jobs with RQ.
Beyond htmx: building modern Django apps with Alpine AJAX
Leveraging Alpine AJAX, Django developers can achieve progressive enhancement with concise, server-rendered partial updates that simplify frontend complexity and ensure graceful degradation.
Better Django management commands with django-click and django-typer
Streamline Django management commands using django-click and django-typer for cleaner syntax, built-in argument parsing, and richer output via type annotations and customizable CLI styling.
Django, JavaScript modules and importmaps
Integrating JavaScript modules in Django with importmaps simplifies cache busting and app integration while exposing challenges with static asset storage and bundling.
Python: a quick cProfile recipe with pstats
Learn how to efficiently profile Django migrations and other Python scripts using cProfile and pstats to analyze slow functions and optimize database calls.
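As a rough sketch of that recipe (the profiled function here is a made-up stand-in, not from the article): run any callable under cProfile, then summarise the result with pstats, sorted by cumulative time.

```python
import cProfile
import io
import pstats

def slow_sum(n):
    # Stand-in for a Django migration or any slow script
    return sum(i * i for i in range(n))

profiler = cProfile.Profile()
profiler.enable()
slow_sum(100_000)
profiler.disable()

# Collect the stats into a string instead of stdout
stream = io.StringIO()
stats = pstats.Stats(profiler, stream=stream)
stats.sort_stats("cumulative").print_stats(5)  # five slowest entries
report = stream.getvalue()
print(report)
```

The report lists total call counts and the functions dominating cumulative time, which is usually enough to spot a hot database call or loop.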
The currency of open-source
Using recognition as a strategic tool aligns individual motivations to streamline community efforts and guide open-source project direction.
DjangoCon Videos
Turn back time: Converting integer fields to bigint using Django migrations at scale
Django migrations enable converting IntegerField to BigIntegerField with minimal downtime using RunSQL for large-scale PostgreSQL upgrades on money and primary key fields.
Data-Oriented Django Drei
The talk demonstrates efficient application of Data Oriented Design for leveraging Django tools to optimize database indexes for faster query performance.
The fine print in Django release notes
Uncover overlooked Django 5.0+ features and their code improvements such as URL query modifications, LoginRequiredMiddleware, efficient Django Admin display and bulk_create conflict handling.
Sponsored Link 2
Scout Monitoring: Logs, Traces, Errors (coming soon). Made for devs who own products, not just tickets.
Django News Jobs
Full Stack Software Engineer at Switchboard
Django Fellow at Django Software Foundation
Senior Software Engineer at Simons Foundation
Senior Backend Engineer at Wasmer
Django Newsletter
Projects
alexandercollins/turbodrf
The dead simple Django REST Framework API generator with role-based permissions.
buttondown/django-rq-cron
A cron runner built atop rq.
This RSS feed is published on https://django-news.com/. You can also subscribe via email.
13 Jun 2025 3:00pm GMT
iSAQB meetup: software architecture decision making in practice
I attended a meetup of the Dutch iSAQB community in Utrecht (NL). The location was in the old industrial buildings of the former Werkspoor train manufacturer, something I personally like :-)
(At least three people asked me during dinner whether there were any Dutch Python meetups, so I'll post the three active ones that I know of here for ease of finding them: Utrecht, Leiden and Amsterdam. And don't forget the two one-day conferences, PyGrunn (Groningen) and PyCon NL (Utrecht).)
Making significant software architecture decisions - Bert Jan Schrijver
Software architecture. What is software? Algorithms, code-that-works, the-part-you-cannot-kick. And what is software architecture? The part that is expensive to change, the structure of the system, best practices. The decisions that are important and hard and expensive to change. Software architecture is about making decisions. Decisions that hurt when you get them wrong.
There are bad reasons for architecture decisions:
- We've always done it like this.
- We don't want to depend on XYZ.
- We need to be future-proof. (You often get elaborate complex systems with this reasoning. Isn't a simple solution more changeable and future-proof?)
- Because the product owner wants it.
- Because the architect wants it. (If the architect wants something without real feedback from the team that has to build it.)
Some input you can use for architecture decisions:
- 5xW. Why, why, why, why, why. After asking "why" five times, you really get to the core.
- Every architecture decision should have a business component. (Don't pick a fancy framework when there's no business need.)
- Requirements.
- Constraints.
You also have to look at quality. ISO 25010 is a great checklist for software quality: self-descriptiveness, availability, recoverability, capacity, integrity, modifiability, testability, etc.
The perfect architecture doesn't exist; there are always trade-offs. Trade-off analysis can help you: gather requirements, figure out quality attributes and constraints, select potential solutions, discover and weigh trade-offs, pick the best fitting solution. You can look at the book Fundamentals of Software Architecture.
An example? Security versus usability: 2FA is great for security, but a pain to use. Availability versus costs: more redundancy and more servers also mean it costs more. He recommends this video.
Something to keep in mind: organisational factors. What is the developer experience for your idea? The learning curve? IDE support? Does it integrate with the current landscape? How popular is it in the industry as a whole? What is the long-term viability? Will it still be actively developed and is there a community?
And there are business factors. Support. Labour pool. License costs. What are the costs of migration versus the benefits after migration? Productivity. Is there an exit strategy if you want to move away from a provider or technology?
Some trade-offs shouldn't even need to be considered. For instance when something risks irreversible damage to your business.
Creating effective and objective architectural decision records (ADRs) - Piet van Dongen
Nothing is static. People change jobs, business goals change, budgets change, etc. Time goes on and during this time you are making decisions. When a new colleague joins, is it clear which decisions have been made beforehand? Are decisions discoverable? And can the decisions be explained? Are they appropriate?
He asked "did you ever disagree with a decision that involved you?". Almost all hands went up. Bad decisions might have been made in the past because better solutions weren't known or available at the time. Or there was time pressure. Or the requirements were unclear. All reasons for decisions to be bad.
Decisions should be made when you really understand the context, which should be clear. And the decision should be objective and logical and clear and well-explained. And they should be made by the right stakeholders: was it a shared decision?
Note: architecture work isn't only done by official architects.
Some archetypes of wrong decision-making:
- Aristocrats. A decision made by a small group of self-appointed experts. Ivory tower. They only explain why their decision is perfect, but they don't concern themselves with trade-offs.
- Unobtainium. A theoretically perfect decision that totally isn't implementable.
- Ivory tower dump. Even more ivory tower than the aristocrats. Totally no input from the team.
- Pros and cons. Endless lists of pros and cons.
- Polder model. Consensus-based. A decision made by a huge group. Endless meetings.
Now... how to make decisions in the right way? ADRs, Architecture Decision Records. A structured/standardised document that documents the decision. Structure? For instance:
- Title + state + summary. Handy for easy scanning. State is something like "decided" or "rejected".
- Stakeholders. Names plus the roles they had when the decision was made. Find stakeholders by looking with a 2x2 matrix: high/low power, high/low interest. A boss might be high power, low interest: keep him appropriately informed. High power, high interest: really involve them.
- Context of the decision. Clear and sharp. What is in scope, what not?
- Requirements. It is easy to come up with 1000 requirements. Stick to what is significant. What is significant? Requirements with high risk. Huge interest to high-power stakeholders. Hard-to-change items. The fewer requirements, the sharper the decision.
- Options. Nicely listed and scored on the requirements. And don't just give scores, but weigh them by the importance of the requirements. This also helps in understanding the decision afterwards. Options should be distinct: don't pick very similar solutions, you should have something to choose between. And drop options that you know are never going to satisfy the requirements, this clears up clutter. But... watch out for tweaking the weights to get to the desired decision...
- Decision. The logical conclusion.
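The scoring-and-weighting idea can be illustrated in a few lines of Python. The requirements, weights, and options here are invented for the example:

```python
# Hypothetical ADR trade-off matrix: each option is scored per
# requirement, and each requirement has an importance weight.
requirements = {"security": 3, "cost": 1, "maintainability": 2}  # weights

options = {
    "managed_service": {"security": 4, "cost": 2, "maintainability": 5},
    "self_hosted":     {"security": 3, "cost": 4, "maintainability": 2},
}

def weighted_total(scores):
    # Multiply each score by the weight of its requirement, then sum.
    return sum(requirements[req] * score for req, score in scores.items())

totals = {name: weighted_total(scores) for name, scores in options.items()}
best = max(totals, key=totals.get)
print(totals, "->", best)  # managed_service wins: 24 vs 17
```

Making the weights explicit like this is exactly what exposes the "tweaking the weights to get the desired decision" temptation: a changed weight is visible in the record.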
In case the decision turned out to be wrong, you now have a nice document and you can re-evaluate it. Perhaps you missed a stakeholder? Perhaps a requirement was missed? Or a weight should be changed? You can then make a 2.0 of the ADR. You learned from your mistakes.
13 Jun 2025 4:00am GMT
12 Jun 2025
Django community aggregator: Community blog posts
Make Django show dates and times in the visitor's local timezone
When you're building a web app with Django, handling timezones is a common hurdle. You're likely storing timestamps in your database in UTC (which is best practice), but your users are scattered across the globe. Showing them a UTC timestamp for when they left a comment isn't very friendly. They want to see it in their own, local time.
Let's start with a typical scenario. You have a Comment model that stores when a comment was added:
models.py
from django.db import models


class Comment(models.Model):
    post = models.ForeignKey(Post, on_delete=models.CASCADE)
    user = models.ForeignKey(User, on_delete=models.CASCADE)
    comment = models.TextField()
    added = models.DateTimeField(auto_now_add=True)
In your Django settings, you've correctly set TIME_ZONE = "UTC".
When you render these comments in a template, you'll find the problem right away:
post.html
{% for comment in post.comment_set.all %}
<div>
<h3>From {{ comment.user.name }} on {{ comment.added }}</h3>
<p>{{ comment.comment }}</p>
</div>
{% endfor %}
The output for {{ comment.added }} will be in UTC, not the visitor's local time. Let's fix that.
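The fix we're after is, at bottom, an ordinary timezone conversion. With the standard library alone it looks like this (the timestamp and the Amsterdam zone are just examples):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# A UTC timestamp as stored in the database
added = datetime(2024, 5, 19, 8, 34, tzinfo=timezone.utc)

# The same instant rendered in a visitor's timezone
# (Amsterdam observes UTC+2 in May)
local = added.astimezone(ZoneInfo("Europe/Amsterdam"))
print(local.isoformat())  # 2024-05-19T10:34:00+02:00
```

The hard part isn't the conversion itself, it's learning which timezone each visitor is in, which is what the rest of this post solves.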
The Server-Side Fix: A Timezone Middleware
The most robust way to solve this is on the server. If Django knows the user's timezone, it can automatically convert all datetime objects during rendering. The plan is simple:
- Use JavaScript to get the visitor's timezone from their browser.
- Store it in a cookie.
- Create a Django middleware to read this cookie on every request and activate the timezone.
First, let's create the middleware. This small piece of code will check for a timezone cookie and, if it exists, activate it for the current request.
myapp/middleware.py
from zoneinfo import ZoneInfo

from django.utils import timezone


class TimezoneMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        tzname = request.COOKIES.get("timezone")
        if tzname:
            try:
                # Activate the timezone for this request
                timezone.activate(ZoneInfo(tzname))
            except Exception:
                # Fall back to the default timezone (UTC) if the name is invalid
                timezone.deactivate()
        else:
            timezone.deactivate()
        return self.get_response(request)
Don't forget to add the middleware to your settings.py:
settings.py
# settings.py
MIDDLEWARE = [
    # ...
    "myapp.middleware.TimezoneMiddleware",
]
Next, we need to set that cookie. A tiny snippet of JavaScript in your base template is all it takes. The Intl object in modern browsers makes this incredibly easy.
base.html
<script>
document.cookie = "timezone=" + Intl.DateTimeFormat().resolvedOptions().timeZone + "; path=/";
</script>
With this in place, every rendered datetime object will now be in the user's local timezone. Brilliant!
Except for one small catch: it only works after the first page load. On the very first visit, the browser hasn't sent the cookie yet. Django renders the page in UTC, then the JavaScript runs and sets the cookie for the next request. This means new visitors get UTC times on their first impression. We can do better.
Fixing the First-Visit Problem with a Template Tag and JavaScript
To create a seamless experience, we need to handle that first visit gracefully. The solution is to combine our server-side middleware with a little client-side enhancement. We'll render the time in a way that JavaScript can easily find and format it, ensuring the correct time is shown even on the first load.
First, we create a custom template tag that wraps our timestamp in a semantically-correct <time> element. This element includes a machine-readable datetime attribute, which is perfect for our JavaScript to hook into.
myapp/templatetags/local_time.py
from django import template
from django.template.defaultfilters import date
from django.utils.html import format_html
from django.utils.timezone import localtime

register = template.Library()


@register.filter
def local_time(value):
    """
    Renders a <time> element with an ISO 8601 datetime and a fallback display value.

    Example:
        {{ comment.added|local_time }}

    Outputs:
        <time datetime="2024-05-19T10:34:00+02:00" class="local-time">May 19, 2024 at 10:34 AM</time>
    """
    if not value:
        return ""
    # Localize the time based on the active timezone (from middleware)
    localized = localtime(value)
    # Format for the datetime attribute (ISO 8601)
    iso_format = date(localized, "c")
    # A user-friendly format for the initial display
    display_format = f"{date(localized, 'DATE_FORMAT')} at {date(localized, 'h:i A')}"
    return format_html('<time datetime="{}" class="local-time">{}</time>', iso_format, display_format)
Now, update your template to use this new filter. Remember to load your custom tags first.
post.html
{% load local_time %}
{% for comment in post.comment_set.all %}
<div>
<h3>From {{ comment.user.name }} on {{ comment.added|local_time }}</h3>
<p>{{ comment.comment }}</p>
</div>
{% endfor %}
Finally, add a bit of JavaScript to your base template. This script will find all our <time> elements and re-format their content using the browser's knowledge of the local timezone.
base.html
<script>
document.addEventListener('DOMContentLoaded', () => {
document.querySelectorAll('.local-time').forEach((el) => {
const utcDate = new Date(el.getAttribute('datetime'));
el.textContent = utcDate.toLocaleString(undefined, {
dateStyle: 'medium',
timeStyle: 'short'
});
});
});
</script>
The Best of Both Worlds
So why use both the middleware and the JavaScript? Because together, they cover all bases and provide the best user experience.
- On the first visit: The user has no timezone cookie and the middleware does nothing. The local_time template tag renders the time in your server's default timezone (UTC). Immediately after the page loads, the JavaScript runs, finds the .local-time element, and instantly rewrites its content to the user's actual local time. There might be a barely-perceptible flicker, but only on this very first page view.
- On all subsequent visits: The user has the cookie. The TimezoneMiddleware activates their timezone. The local_time template tag now renders the time correctly, right from the server. The JavaScript still runs, but it essentially replaces the already-correct time with the same correct time, resulting in no visible change.
This two-part approach gives you the best of server-side rendering (no content-shifting for returning visitors) while using client-side JavaScript as a progressive enhancement to fix the one edge case where the server can't know better.
12 Jun 2025 8:38pm GMT
11 Jun 2025
Django community aggregator: Community blog posts
Beyond htmx: building modern Django apps with Alpine AJAX
I've recently been rethinking how I build web applications. For the past few years my default has been a Django backend serving a JSON API to a frontend built with SvelteKit. And I am not alone; many (if not most) sites now use a complex JavaScript frontend and a JSON API. This pattern, the Single-Page Application (SPA), brought us amazing user experiences, but it also brought a mountain of complexity: state management, API versioning, client-side routing, duplicate form validation, build tools, and the endless churn of the JavaScript ecosystem.
And then I came across htmx, which promises to enhance HTML to the point where your old-fashioned Multi-Page Application (MPA) feels modern, without having to write a single line of JavaScript. We can have the smooth, modern UX of a SPA but with the simplicity and robustness of traditional, server-rendered Django applications.
This article is about why I believe this "Hypermedia-Driven Application" approach is a better fit for many Django projects than a full-blown SPA, and why I ultimately chose Alpine AJAX over the more popular htmx.
Returning to true REST and hypermedia
To understand why this "new" approach feels so simple, we need to look back at the original principles of the web. The term everyone knows is REST, but most of us associate "REST API" with "JSON API."
When Roy Fielding defined REST in his 2000 dissertation, JSON didn't even exist. REST was a description of the web itself, where hypermedia (i.e., HTML with links and forms) is the Engine of Application State (HATEOAS).
In a true RESTful system, a client (like a browser) doesn't need to know any specific API endpoints besides a single entry point. It discovers what it can do next simply by parsing the HTML it receives. The links and forms are the API, and they fully describe the available actions. This is why Fielding gets frustrated with what we call REST APIs today:
"I am getting frustrated by the number of people calling any HTTP-based interface a REST API. Today's example is the SocialSite REST API. That is RPC. It screams RPC. There is so much coupling on display that it should be given an X rating."
- Roy Fielding
If you've ever built a standard server-rendered Django app, congratulations: you've built something more RESTful than 99.9% of JSON APIs. The only problem is that the full-page reloads of these Multi-Page Applications feel clunky. This is the exact problem that libraries like htmx and Alpine AJAX solve: they let us keep the robust, simple, and truly RESTful architecture of an MPA, while adding the smooth user experience of an SPA.
(For a much deeper dive into the philosophy of hypermedia as the engine of state, I highly recommend the essays on the htmx.org website, as well as the book Hypermedia Systems by the creator of htmx.)
The promise of htmx
htmx is a brilliant library that "completes" HTML as a hypertext. It lets you trigger AJAX requests from any element, not just links and forms, and swap the response HTML into any part of the page.
For example, here's a classic "click-to-edit" pattern. Initially, the page shows user details with an "Edit" button:
<!-- Initial state -->
<html>
<body>
<div hx-target="this" hx-swap="outerHTML">
<div><label>First Name</label>: Joe</div>
<div><label>Last Name</label>: Blow</div>
<div><label>Email</label>: joe@blow.com</div>
<button hx-get="/contact/1/edit" class="btn primary">
Click To Edit
</button>
</div>
</body>
</html>
When you click the button, htmx sends a GET request to /contact/1/edit. The server responds not with JSON, but with a snippet of HTML for an edit form:
<!-- HTML returned from the server -->
<form hx-put="/contact/1" hx-target="this" hx-swap="outerHTML">
<div>
<label>First Name</label>
<input type="text" name="firstName" value="Joe">
</div>
<div>
<label>Last Name</label>
<input type="text" name="lastName" value="Blow">
</div>
<div>
<label>Email Address</label>
<input type="email" name="email" value="joe@blow.com">
</div>
<button class="btn">Submit</button>
<button class="btn" hx-get="/contact/1">Cancel</button>
</form>
htmx then swaps this form into the DOM, replacing the original div. No JSON, no client-side templating, no virtual DOM. It's simple and fast.
You can build incredible features like infinite scroll, active search, and more with just a few HTML attributes.
The downside: a crack in the foundation
htmx really is a fantastic library, but there is one big downside: it encourages you to add behavior to elements that have no native function. Look at that "Click To Edit" button again:
<button hx-get="/contact/1/edit" class="btn primary">
Click To Edit
</button>
If JavaScript is disabled or fails to load, this button does... nothing. It's not wrapped in a form, so it has no default action. The same is true for the "Cancel" button in the edit form. The application is broken. This violates the principle of Progressive Enhancement, where a site should be functional at a baseline level (plain HTML) and enhanced with JavaScript.
You can write progressively enhanced code with htmx, but it often requires attribute repetition and constant vigilance from you, the developer.
My preferred alternative: Alpine.js + Alpine AJAX
Alpine.js is a rugged, minimal JavaScript framework for composing behavior directly in your HTML. If you've used Vue, it will feel very familiar. It's very often used alongside htmx to handle things htmx doesn't, like toggling modals or managing simple client-side state.
<!-- Simple Alpine.js counter -->
<div x-data="{ count: 0 }">
<button x-on:click="count++">Increment</button>
<span x-text="count"></span>
</div>
<!-- Alpine.js dropdown -->
<div x-data="{ open: false }">
<button @click="open = ! open">Toggle</button>
<div x-show="open" @click.outside="open = false">Contents ..</div>
</div>
I was already including Alpine for this kind of light interactivity, and then I discovered its Alpine AJAX plugin. It does most of what htmx does, but with two key differences:
- It's smaller (3kB vs 14kB for htmx). A nice bonus, but not the deciding factor.
- It only enhances <a> and <form> tags.
This second point is the game-changer. By design, Alpine AJAX prevents you from making the progressive enhancement mistake. Your application must work with plain HTML first. Any AJAX functionality is purely an enhancement. For me, that's a win-win: a more resilient site with less JavaScript, built with a tool I'm already using.
Let's rebuild it with Alpine AJAX
Here is the same "click-to-edit" feature, now built with Alpine AJAX.
First, the initial state. The <button> is now an <a> tag, which has a meaningful href:
<!-- Initial state with Alpine AJAX -->
<html>
<body>
<div id="user_details">
<div><label>First Name</label>: Joe</div>
<div><label>Last Name</label>: Blow</div>
<div><label>Email</label>: joe@blow.com</div>
<a href="/contact/1/edit"
x-target="user_details"
class="btn primary">
Click To Edit
</a>
</div>
</body>
</html>
Without JavaScript, this is a standard link that takes you to the edit page (a full page refresh). With JavaScript, the x-target="user_details" attribute tells Alpine AJAX to fetch the content from the link's href and use the response to replace the element with the ID user_details.
The server returns the edit form. This is a standard HTML <form> that works perfectly without JavaScript:
<!-- HTML returned from server -->
<form method="post"
action="/contacts/1"
id="user_details"
x-target="user_details">
<div>
<label>First Name</label>
<input type="text" name="firstName" value="Joe">
</div>
<!-- ... other fields ... -->
<button type="submit">Submit</button>
<a class="btn" href="/contact/1" x-target="user_details">Cancel</a>
</form>
When JavaScript is enabled, the x-target on the form intercepts the submission, sends it via AJAX, and replaces the target with the result. The "Cancel" link works the same way. It's progressively enhanced by default.
Making it sing with Django
This is all great, but how do we handle this on the Django side? An AJAX request for a partial update needs just a snippet of HTML, while a full-page refresh (JS disabled) needs the full base template.
The simple approach
Alpine AJAX (and htmx) sends a special header with its requests. We can check for this header in our view to decide what to render.
# views.py
from django.template.response import TemplateResponse


def contact_view(request, pk: int):
    contact = Contact.objects.get(pk=pk)
    context = {"contact": contact}
    if "X-Alpine-Request" in request.headers:
        # It's an AJAX request, return just the partial
        return TemplateResponse(request, "partial.html", context)
    # It's a normal request, return the full page
    return TemplateResponse(request, "full.html", context)
This works, but maintaining two separate templates (full.html and partial.html) is a pain. Yes, we can use Django's include tag to include the partial template in the full template, but we can do better.
A better way: django-template-partials
A fantastic third-party package called django-template-partials lets us define reusable blocks within a single template. We can then render just a specific block.
First, we define our partial block in the main template:
{# full.html #}
<html>
<body>
{% partialdef details inline %}
<div id="user_details">
... contact details and edit link ...
</div>
{% endpartialdef %}
</body>
</html>
Now, our view can choose to render the whole template or just the details partial from it:
# views.py
def contact_view(request, pk: int):
    contact = Contact.objects.get(pk=pk)
    context = {"contact": contact}
    if "X-Alpine-Request" in request.headers:
        return TemplateResponse(request, "full.html#details", context)
    return TemplateResponse(request, "full.html", context)
Much cleaner! We only have one template to maintain.
The best way: an abstracted TemplateResponse
We can abstract this logic away into a custom TemplateResponse class to make our views even cleaner. Alpine AJAX sends another header, X-Alpine-Target, which tells us which partial it's expecting. We can use this to automatically determine the partial name.
# a custom lib.py or utils.py
from django.http import HttpRequest
from django.template.response import TemplateResponse as BaseTemplateResponse


def is_alpine(request: HttpRequest) -> bool:
    return "X-Alpine-Request" in request.headers


class AlpineTemplateResponse(BaseTemplateResponse):
    def get_ajax_template(self, request: HttpRequest, template: str) -> str:
        if is_alpine(request):
            # Use the target ID from the request as the partial name.
            # This allows one view to serve multiple, distinct partials.
            # We fall back to "alpine" as a sensible default.
            partial = request.headers.get("X-Alpine-Target") or "alpine"
            return f"{template}#{partial}"
        return template

    def __init__(self, request: HttpRequest, template: str, *args, **kwargs):
        template_name = self.get_ajax_template(request, template)
        super().__init__(request, template_name, *args, **kwargs)
Now our view is blissfully unaware of the implementation details:
# views.py
from .lib import AlpineTemplateResponse


def contact_view(request, pk: int):
    contact = Contact.objects.get(pk=pk)
    return AlpineTemplateResponse(request, "full.html", {"contact": contact})
Final example: search-as-you-type
Here's how a "search-as-you-type" feature looks with our Alpine stack. Alpine handles the user input events (like debouncing), and Alpine AJAX handles the form submission.
<h3>Search Contacts</h3>
<form x-target="search-results" action="/contacts" autocomplete="off">
<input class="form-control" type="search"
name="search" placeholder="Begin Typing To Search Users..."
@input.debounce="$el.form.requestSubmit()"
@search="$el.form.requestSubmit()">
<button x-show="false">Search</button>
</form>
<table class="table">
<thead>
<tr>
<th>First Name</th>
<th>Last Name</th>
<th>Email</th>
</tr>
</thead>
<tbody id="search-results">
{# Initial results rendered by Django #}
</tbody>
</table>
This degrades perfectly. Without JS, it's a standard search form with a submit button. With JS, the submit button is hidden, @input.debounce triggers a form submission via AJAX after the user stops typing, and the results are injected into the <tbody>.
Compare this with the htmx version:
<h3>Search Contacts</h3>
<input class="form-control" type="search"
name="search" placeholder="Begin Typing To Search Users..."
hx-post="/search"
hx-trigger="input changed delay:500ms, keyup[key=='Enter'], load"
hx-target="#search-results">
<table class="table">
<thead>
<tr>
<th>First Name</th>
<th>Last Name</th>
<th>Email</th>
</tr>
</thead>
<tbody id="search-results">
{# Initial results rendered by Django #}
</tbody>
</table>
Instead of leaning on Alpine for the trigger logic, htmx has its own DSL for triggers. And like I said before: most people who use htmx also use Alpine, so it's a bit strange to use two different syntaxes side by side. But more importantly, this version doesn't work without JavaScript; it's not a progressive enhancement.
Yes, you can make this htmx example work without JavaScript, but it's not enforced, none of the official examples do so, and it results in a lot of added HTML attributes. It's not as ergonomic as Alpine AJAX in my experience.
Make Django messages work with Alpine AJAX
It's incredibly easy to make Django's messages framework work with Alpine AJAX. Let's say we have a view that sets a success message:
messages.success(request, "Success!")
How do you make this message appear when you're only returning a partial HTML template as a response to an AJAX request?
The trick is to use Alpine AJAX's x-sync attribute. Change your base.html to include the following snippet:
{% partialdef messages inline %}
<div id="messages" x-sync x-merge="append" class="toast toast-top toast-end">
{% for message in messages %}
<div class="alert alert-{{ message.tags }} flex"
x-data="{ open: false }"
x-show="open"
x-init="$nextTick(() => open = true); setTimeout(() => open = false, 3000)"
x-transition.duration.500ms>
<div>{{ message.message }}</div>
<button class="btn btn-circle btn-ghost" @click="open = false">x</button>
</div>
{% endfor %}
</div>
{% endpartialdef %}
And then include the following middleware in your project (and add it to MIDDLEWARE in settings.py):
from django.contrib.messages import get_messages
from django.template.loader import render_to_string


class AlpineMessageMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        response = self.get_response(request)
        if (
            "X-Alpine-Request" in request.headers
            and not 300 <= response.status_code < 400
            and (messages := get_messages(request))
            and not response.text.endswith("</html>")
        ):
            response.write(
                render_to_string(
                    "base.html#messages",
                    {"messages": messages},
                )
            )
        return response
This appends the messages partial from base.html to any partial template response that results from an Alpine AJAX request. Alpine AJAX sees the x-sync attribute, finds the same element in the webpage, and merges the content.
The result is that you can use Django's messages framework and those messages are shown as expected, even when you return a partial template that doesn't include those messages. The middleware takes care of all of that.
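To see the whole flow end to end, a form submitted via Alpine AJAX might look roughly like this (a sketch; the action URL, target id, and field name are assumptions):

```html
<!-- x-target tells Alpine AJAX to submit the form via AJAX and merge the
     response into #post-list; the middleware above then appends the
     messages partial, which x-sync merges into #messages. -->
<form method="post" action="/posts/create/" x-target="post-list">
  {% csrf_token %}
  <input type="text" name="title">
  <button>Create</button>
</form>
```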
Closing thoughts
I've been building with this stack for a few weeks, and it feels like a revelation. I get to stay in Django, writing Python and standard HTML templates. All my validation and business logic live on the server where they belong. There's no API layer to maintain, no over-fetching, no build steps.
This approach also champions Locality of Behavior. When you look at a template, the behavior is right there in the HTML attributes (x-target, @input), not hidden away in a separate JavaScript file. It's the same reason I love Tailwind CSS. It might seem to violate "Separation of Concerns," but I've found it dramatically reduces the mental overhead of switching contexts.
This isn't to say SPAs are dead. For highly interactive, application-like experiences (think Figma or a complex dashboard), a framework like SvelteKit or Vue is still the right tool.
But for the vast majority of websites-the content sites, the e-commerce stores, the blogs-that are mostly pages of content with forms and a sprinkle of interactivity, this hypermedia approach feels like a return to sanity. It combines the stability and simplicity of Web 1.0 with the slick user experience of Web 2.0.
If you're a Django developer feeling the fatigue of the modern frontend, I highly recommend you give Alpine.js and Alpine AJAX a try. You might be surprised how productive and fun it is to build for the web again.
11 Jun 2025 4:54pm GMT
Migrating Python & Django Projects to uv
I recently migrated a legacy project that used requirements files to uv. I undertook the project in hopes of streamlining the setup process and to help to ensure that the versions of packages installed locally, in CI/CD, and in production are all consistent.
uv manages everything about your Python environment, so I found it's best to start with uv's approach and integrate other tools as needed.
Local development
To migrate a legacy project to uv, I followed these steps.
- First, I added a project definition to our project's pyproject.toml:

  [project]
  name = "my-product"
  version = "1.2.3"
  description = "Our amazing product."
  readme = "README.md"
  requires-python = "~=3.12"
  dependencies = []
- Then, I moved the requirements from our pre-existing requirements files to the project dependencies and removed the old files:

  uv add -r requirements/base.txt
  uv add -r requirements/dev.txt --group dev
  uv add -r requirements/deploy.txt --group deploy
  git rm requirements/*.txt
This adds the base requirements to the dependencies list in pyproject.toml, and the dev and deploy requirements to the dev and deploy groups, respectively.
- Next, I installed and pinned a Python version, and synced the dependencies:

  uv python install 3.12
  uv python pin 3.12
  uv sync
This installs a Python 3.12 interpreter, ensures that uv uses Python 3.12 for the current directory and all subdirectories (through the .python-version file it creates), and downloads and installs the necessary dependencies in a .venv virtual environment that uv manages for you.
Note that uv installs your dev requirements by default but not other groups (this can be customized with default-groups); you can include a group name on the command line to also install the dependencies for that group. For example, if I install the deploy group, uv will install the gunicorn package along with the already-installed packages for my project:
❯ uv sync --locked --group deploy
Resolved 223 packages in 6ms
Installed 1 package in 7ms
 + gunicorn==23.0.0
It's worth noting that (as in the example below) one needs to specify --no-dev when calling uv sync in your deployed environment (Docker-based or otherwise). If you don't, then your deployed environment will include all of your local development dependencies.
- Finally, I use direnv for managing environment variables, so I updated my .envrc to remove the old layout python command and add the new venv to my PATH:

  sed -e '/layout python/ s/^#*/#/' -i .envrc
  echo 'export PATH="$(pwd)/.venv/bin:${PATH}"' >> .envrc
  direnv allow
Note that if you installed uv in the same terminal session, you may need to restart your terminal; otherwise, direnv allow might wipe out the PATH changes that the uv installation script made and prevent you from running uv.
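After moving the requirements with uv add, the resulting pyproject.toml ends up with the base requirements under dependencies and the rest under dependency groups, roughly like this (an illustrative sketch; the package names are placeholders, except gunicorn, which the post mentions in the deploy group):

```toml
[project]
name = "my-product"
version = "1.2.3"
requires-python = "~=3.12"
dependencies = [
    "django>=4.2",  # placeholder
]

[dependency-groups]
dev = [
    "pre-commit",  # placeholder
]
deploy = [
    "gunicorn",
]

# Optional: make `uv sync` install the deploy group by default, too.
[tool.uv]
default-groups = ["dev", "deploy"]
```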
Updating the project's Dockerfile
uv comes with a nice uv-docker-example that shows how to integrate uv with Docker.
In our case:
- I changed the base image to use the uv image:

  FROM ghcr.io/astral-sh/uv:python3.12-bookworm
- I enabled bytecode compilation, copying from the cache, and set a custom virtual environment path:

  # Enable bytecode compilation
  ENV UV_COMPILE_BYTECODE=1

  # Copy from the cache instead of linking since it's a mounted volume
  ENV UV_LINK_MODE=copy

  # Use a custom VIRTUAL_ENV with uv to avoid conflicts with local developers'
  # .venv/ while running tests in Docker
  ENV VIRTUAL_ENV=/venv
Setting a custom VIRTUAL_ENV is not recommended in the uv example, but I found it necessary in our case to avoid conflicts with the local developer's .venv/ when running tests locally inside the Docker container (our project supports running tests locally both with and without Docker).
- In our build step that installs requirements, I mount the cache and necessary files as recommended in the example, and adjust the uv sync command to target my custom virtual environment path:

  ARG UV_OPTS="--no-dev --group deploy"
  RUN --mount=type=cache,target=/root/.cache/uv \
      --mount=type=bind,source=uv.lock,target=uv.lock \
      --mount=type=bind,source=pyproject.toml,target=pyproject.toml \
      set -ex \
      && BUILD_DEPS=" \
          build-essential \
          git \
          libpq-dev \
      " \
      && apt-get update && apt-get install -y --no-install-recommends $BUILD_DEPS \
      && uv venv $VIRTUAL_ENV \
      && uv sync --active --locked --no-install-project $UV_OPTS \
      && apt-get purge -y --auto-remove -o APT::AutoRemove::RecommendsImportant=false $BUILD_DEPS \
      && rm -rf /var/lib/apt/lists/*

  # Add uv venv to PATH
  ENV PATH="$VIRTUAL_ENV/bin:$PATH"
I hit an odd qemu/Docker bug with bytecode compilation during this process, but fortunately it happened only on a self-hosted runner that we control, so I was able to switch from the docker-container driver to the docker driver and avoid the issue altogether.
Integrating pre-commit
Our project also uses pre-commit for managing code quality checks and formatting.
I integrated the pre-commit check for the uv.lock file by adding the following to our .pre-commit-config.yaml:
repos:
  - repo: https://github.com/astral-sh/uv-pre-commit
    # uv version.
    rev: 0.7.12
    hooks:
      - id: uv-lock
This makes sure that when the dependencies are updated, the uv.lock file is also updated.
There are a number of other pre-commit hooks available for uv which may be useful for your project.
Integrating GitHub Actions
I switched to using setup-uv to install and manage Python and our project's requirements:
name: CI
# <snip>
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: astral-sh/setup-uv@v1
        with:
          python-version: 3.12
          enable-cache: true
          cache-dependency-glob: "uv.lock"
      - run: uv sync --locked
      - run: uv run pre-commit run --show-diff-on-failure --color=always --all-files
      - run: uv run ... # your test command here
Our CI workflow is more complex than is worth sharing here, but hopefully this snippet helps demonstrate how you can use setup-uv in place of setup-python to manage your Python environment in CI/CD.
Local setup instructions (documentation!)
Last but certainly not least, I updated our project's local setup instructions to work both for new developers creating a fresh environment, and for pre-existing local environments that need to be migrated to use uv.
Additionally, we found the workflows for updating dependent packages are still being worked on, so we added some documentation to our project repo to explain these steps.
For example, if you need to update a dependent package that's not listed in pyproject.toml, you can run the following commands to update the uv.lock file and sync your local venv:
uv lock --upgrade-package <package-name>
uv sync --locked
From time to time, we found it may also be helpful to regenerate the uv.lock file with all the indirect dependencies upgraded. You can do this by running:
uv lock --upgrade
uv sync --locked
My colleagues Simon and Mariatta reviewed the changes and tested the setup instructions on their own machines, and they provided valuable feedback that helped refine the documentation.
Conclusion
We're still early in the migration process (the related PR hasn't been merged yet as I write this!), but my initial impressions are that uv will help streamline the Python setup process for our project. As an added bonus, the base Docker image is hosted on GitHub Container Registry, which as far as I can tell does not (yet) enforce a rate limit on pulls.
I hope this has provided a helpful outline for testing uv in your projects. If you have any questions or suggestions, feel free to reach out to us directly!
11 Jun 2025 10:00am GMT
06 Jun 2025
Django community aggregator: Community blog posts
PHP Web Frameworks - Roman Pronskiy
- Roman's personal site, GitHub, and YouTube channel
- The PHP Foundation
- The New Life of PHP - The PHP Foundation
- PHP Annotated Newsletter
- PHP 3 to 8: The Evolution of a Codebase
- PHPStorm 2025.1
Sponsor
This episode was brought to you by Buttondown, the easiest way to start, send, and grow your email newsletter. New customers can save 50% off their first year with Buttondown using the coupon code DJANGO.
06 Jun 2025 3:16pm GMT
Django News - Django security releases issued: 5.2.2, 5.1.10, and 4.2.22 - Jun 6th 2025
News
Django security releases issued: 5.2.2, 5.1.10, and 4.2.22
Django issues security patches in 5.2.2, 5.1.10, and 4.2.22, resolving a moderate severity log injection vulnerability in internal logging via unescaped request.path.
Python 3.13.4, 3.12.11, 3.11.13, 3.10.18 and 3.9.23 are now available!
The Python 3.13.4 release includes over 300 bug fixes, and every version of Python has received three security updates.
django-unicorn - Request for maintainer(s)
Django-unicorn seeks new maintainers to help evolve its interactive component library, address complexity, improve features, and support continued development within Django projects.
Python Packaging Ecosystem Survey
Participate in Anaconda's survey to share your Python packaging experiences and resource preferences, helping guide future improvements in the packaging ecosystem.
Updates to Django
Today 'Updates to Django' is presented by Pradhvan from the Djangonaut Space! Last week we had 3 pull requests merged into Django by 3 different contributors - including 1 first-time contributor! Congratulations to Jason Judkins for having their first commit merged into Django - welcome on board!
This week's Django highlight:
- Jason updated the package name in Django's reusable apps tutorial to follow PEP 625 standards.
Django Newsletter
Wagtail CMS
Closing the gap: strict CSP in the Django world
Inching closer to strict CSP compatibility for the Django ecosystem.
Articles
Optimizing Django Docker Builds with Astral's `uv`
Astral's uv dramatically accelerates and secures Django Docker builds by leveraging multi-stage images, cache mounts, and strict lockfile verification for deterministic dependency management.
Give Your Django Admin X-Ray Vision - Automatic Dead-Link Detection
django-linkcheck is a Django app that automates dead link detection in URLFields and HTML content, schedules checks, and provides an admin interface.
Loopwerk: An easy way to use different serializers for different actions and request methods in Django REST Framework
ActionSerializerModelViewSet lets Django REST Framework developers assign specific read and write serializers per viewset action or method with fallback logic.
Django: Deferred constraint enforcement
PostgreSQL deferrable unique constraints in Django ORM allow postponing integrity checks until transaction commit to avoid transient conflicts when bulk updating related records.
Validating a new project
Leveraging PyCon open spaces and sprints delivers actionable early feedback and use cases for new Python tools like py-bugger and django-simple-deploy.
SQLite Virtual Tables from Django
Integrate custom Rust-based SQLite virtual tables into Django by loading extensions on connection creation and auto-defining virtual tables for models to query external data.
How to split up a Django monolith without using microservices
django-queuebie offers a synchronous command and event message queue to decouple, modularize and test complex Django business logic across internal apps without microservices.
Preserving referential integrity with JSON fields and Django
Use django-json-schema-editor's JSONField with JSON schema definitions and register_data_reference to enforce on_delete=PROTECT referential integrity for model IDs in JSON, illustrated with galleries.
Django Fellow Report
Fellow Report - Natalia Bidart
Seven tickets triaged, six reviewed, three authored, and monthly Security Council call.
DjangoCon Videos
Keynote: Django needs you! (to do code review)
Django Fellow Sarah Boyce's keynote on how/why to contribute to Django.
Django + HTMX: Patterns to Success with Lucas Pires
A talk on tried and tested patterns for building applications using Django and HTMX.
End-to-end testing Django applications using Pytest with Playwright by Jacob Rief
How to evolve unit tests using Beautifulsoup into end-to-end tests using Playwright.
Sponsored Link 2
AI-Powered Django Development & Consulting
REVSYS specializes in seamlessly integrating powerful AI technologies, including GPT-4, into your existing Django applications. Your Django project deserves modern, intelligent features that enhance user engagement and streamline content workflows.
Podcasts
Django Chat #184: PHP Web Frameworks - Roman Pronskiy
Roman Pronskiy is the Executive Director of the PHP Foundation and a Developer Advocate at JetBrains. We discuss PHP's evolution over the years, Laravel vs Symfony, and what Python can learn from the PHP ecosystem.
Django News Jobs
Senior Backend Engineer at Wasmer
Python / Django Software Developer - full-time employee - No visa sponsorship at Off Duty Management
Django Newsletter
Django Forum
Proposal: Lazy loading for `django.contrib.gis.gdal`
Proposal to implement lazy loading in django.contrib.gis.gdal, matching GEOS behavior, so django.setup() won't fail when GDAL isn't installed unless needed.
Projects
AndrewIngram/django-extra-views
Django's class-based generic views are awesome, let's have more of them.
simonw/django-plugin-datasette
Django plugin to run Datasette inside of Django.
This RSS feed is published on https://django-news.com/. You can also subscribe via email.
06 Jun 2025 3:00pm GMT
04 Jun 2025
Django community aggregator: Community blog posts
Preserving referential integrity with JSON fields and Django
Preserving referential integrity with JSON fields and Django
Motivation
The great thing about using feincms3 and django-content-editor is that CMS plugins are Django models - if using them you immediately have access to the power of Django's ORM and Django's administration interface.
However, using one model per content type can be limiting on larger sites. Because of this we like using JSON plugins with schemas for more fringe use cases or for places where we have richer data but do not want to write a separate Django app for it. This works well as long as you only work with text, numbers etc. but gets a bit ugly once you start referencing Django models because you never know if those objects are still around when actually using the data stored in those JSON fields.
Django has a nice on_delete=models.PROTECT feature, but that of course only works when using real models. So, let's bridge this gap and allow using foreign key protection with data stored in JSON fields!
Models
First, you have to start using django-json-schema-editor and specifically its JSONField instead of the standard Django JSONField. The most important difference between the two is that the schema editor's field wants a JSON schema. So, for the sake of an example, let's assume that we have a model with images and a model with galleries. Note that we're omitting many of the fields that actually make the interface nice, such as titles.
from django.db import models

from django_json_schema_editor.fields import JSONField


class Image(models.Model):
    image = models.ImageField(...)


gallery_schema = {
    "type": "object",
    "properties": {
        "caption": {"type": "string"},
        "images": {
            "type": "array",
            "format": "table",
            "minItems": 3,
            "items": {
                "type": "string",
                "format": "foreign_key",
                "options": {
                    # raw_id_fields URL:
                    "url": "/admin/myapp/image/?_popup=1&_to_field=id",
                },
            },
        },
    },
}


class Gallery(models.Model):
    data = JSONField(schema=gallery_schema)
Now, if we were to do it by hand, we'd define a through model for a ManyToManyField linking galleries to images, set an on_delete=models.PROTECT foreign key on that through model's image field, and update the many-to-many table whenever the Gallery object changes. Since that's somewhat boring but also tricky code, I have already written it (including unit tests, of course) and all that's left to do is define the linking:
Gallery.register_data_reference(
    # The model we're referencing:
    Image,
    # The name of the ManyToManyField:
    name="images",
    # The getter which returns a list of stringified primary key values or nothing:
    getter=lambda obj: obj.data.get("images"),
)
Now, attempting to delete an image which is still used in a gallery somewhere will raise ProtectedError exceptions. That's what we wanted to achieve.
Using a gallery instance
When you have a gallery instance you can now use the images field to fetch all images and use the order from the JSON data:
def gallery_context(gallery):
    images = {str(image.pk): image for image in gallery.images.all()}
    return {
        "caption": gallery.data["caption"],
        "images": [images[pk] for pk in gallery.data["images"]],
    }
JSONPluginBase and JSONPluginInline
I would generally do the instantiation of models slightly differently and use django-json-schema-editor's JSONPluginBase and JSONPluginInline, which offer additional niceties such as streamlined JSON models with only one backing database table (using proxy models) and support for showing not just the primary key of referenced model instances but also their __str__ value.
The example above would have to be changed to look more like this:
from django_json_schema_editor import JSONPluginBase


class JSONPlugin(JSONPluginBase, ...):
    pass


JSONPlugin.register_data_reference(...)

Gallery = JSONPlugin.proxy("gallery", schema=gallery_schema)
However, that's not documented yet, so for now you unfortunately have to read the code and the test suite - sorry for that. It's used heavily in production though, so if you start using it, it won't suddenly break in the future.
04 Jun 2025 5:00pm GMT
03 Jun 2025
Django community aggregator: Community blog posts
An easy way to use different serializers for different actions and request methods in Django REST Framework
Imagine a simple Django REST Framework serializer and view like this:
from rest_framework import serializers
from rest_framework import viewsets

from .models import Post


class PostSerializer(serializers.ModelSerializer):
    class Meta:
        model = Post
        fields = "__all__"


class PostViewSet(viewsets.ModelViewSet):
    serializer_class = PostSerializer

    def get_queryset(self):
        return Post.objects.all()
The PostSerializer class is used for everything: the list of posts, retrieving a single post, the payload when creating or updating a post, and the response when creating or updating a post.
I find that this is often not what I want; for example, I often want a simple version of the model to be returned in the list endpoint (/posts/), while the full model is returned in the retrieve endpoint (/posts/{post_id}/). And I also often want the input serializer to be different from the output serializer when creating or updating something (especially when using DRF's built-in Browsable API, because it includes all the read-only fields in the example input payload, causing confusion).
Using different serializers in the list and retrieve endpoints isn't too hard:
class PostViewSet(viewsets.ModelViewSet):
    def get_serializer_class(self):
        if self.action == "list":
            return PostListSerializer
        return PostDetailSerializer
But when you also want to use different input and output serializers when creating and updating models, then you need to override a lot more code:
class PostViewSet(viewsets.ModelViewSet):
    def get_serializer_class(self):
        if self.action == "list":
            return PostListSerializer
        return PostDetailSerializer

    def create(self, request, *args, **kwargs):
        serializer = self.get_serializer(data=request.data)
        serializer.is_valid(raise_exception=True)
        self.perform_create(serializer)
        response_serializer = PostWriteSerializer(
            instance=serializer.instance,
            context=self.get_serializer_context(),
        )
        headers = self.get_success_headers(response_serializer.data)
        return Response(response_serializer.data, status=status.HTTP_201_CREATED, headers=headers)

    def update(self, request, *args, **kwargs):
        partial = kwargs.pop("partial", False)
        instance = self.get_object()
        serializer = self.get_serializer(instance, data=request.data, partial=partial)
        serializer.is_valid(raise_exception=True)
        self.perform_update(serializer)
        if getattr(instance, "_prefetched_objects_cache", None):
            # If 'prefetch_related' has been applied to a queryset, we need to
            # forcibly invalidate the prefetch cache on the instance.
            instance._prefetched_objects_cache = {}
        response_serializer = PostWriteSerializer(
            instance=serializer.instance,
            context=self.get_serializer_context(),
        )
        return Response(response_serializer.data)
This is starting to get pretty unwieldy for something that comes up all the time. And what about different serializers for different router actions within a viewset? You keep adding more and more code to handle all the different actions within the get_serializer_class method.
Today I want to present a better way, inspired by rest-framework-actions and drf-rw-serializers.
The first project, rest-framework-actions, allows you to specify different serializers for different actions (so you can have a list_serializer_class which is different from the serializer_class), which is super useful, as well as different serializers for input versus output. It's almost perfect, but not quite. For example, you can't specify different serializers for added actions, and since there's no serializer fallback logic you end up being forced to add six properties to your ViewSets.
The second project, drf-rw-serializers, allows you to specify different serializers for the write and read actions (write_serializer_class and read_serializer_class), and it handles serializer fallbacks a lot better. But it doesn't allow you to specify different serializers for different actions; it's a bit too simple.
So I took these ideas, evolved them, and now your view can look like this:
class PostViewSet(ActionSerializerViewSet):
    serializer_class = PostDetailSerializer
    list_serializer_class = PostListSerializer
    write_serializer_class = PostWriteSerializer
Or you can get super specific, like this:
class PostViewSet(ActionSerializerViewSet):
    list_read_serializer_class = PostListSerializer
    retrieve_read_serializer_class = PostDetailSerializer
    create_write_serializer_class = PostWriteSerializer
    create_read_serializer_class = PostListSerializer
    update_write_serializer_class = PostWriteSerializer
    update_read_serializer_class = PostDetailSerializer
And it also works for any extra actions you add onto the ViewSet. So you can have different serializers for each action, you can have different serializers for input and output, and a different serializer for every combination of action and method, with sensible fallback logic so you don't have to specify a serializer for every possible combination (like you're forced to do with rest-framework-actions).
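The fallback resolution just described can be illustrated with a small standalone sketch (a hypothetical helper, not part of the actual viewset; it only mirrors the attribute-name priority, from most specific to least specific):

```python
# Hypothetical illustration of the serializer attribute lookup priority.
# Most specific (action + method) wins; bare serializer_class is the last resort.
def candidate_names(action, method):
    return [
        f"{action}_{method}_serializer_class",
        f"{action}_read_serializer_class",
        f"{action}_serializer_class",
        f"{method}_serializer_class",
        "read_serializer_class",
        "serializer_class",
    ]


def resolve_serializer_attr(defined, action, method):
    """Return the first candidate name present in `defined`, or None."""
    for name in candidate_names(action, method):
        if name in defined:
            return name
    return None
```

With only list_serializer_class and serializer_class defined, for example, a write to the list action resolves to list_serializer_class, while retrieve falls back to serializer_class.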
Here's the full code of ActionSerializerViewSet. Just drop it into your project (mine lives in a lib.py file) and use this instead of ModelViewSet.
from rest_framework import permissions, status, viewsets
from rest_framework.response import Response


class ActionSerializerViewSet(viewsets.ModelViewSet):
    """
    A ModelViewSet that enables the use of different serializers for responses and
    requests for update/create, as well as different serializers for different actions.

    The create and update actions use a special write serializer, while the response of these
    actions use the read serializer.
    """

    def get_action_serializer(self, method):
        result = (
            getattr(self, f"{self.action}_{method}_serializer_class", None)
            or getattr(self, f"{self.action}_read_serializer_class", None)
            or getattr(self, f"{self.action}_serializer_class", None)
            or getattr(self, f"{method}_serializer_class", None)
            or getattr(self, "read_serializer_class", None)
            or getattr(self, "serializer_class", None)
        )
        assert result is not None, (
            f"{self.__class__.__name__} should either include one of "
            f"`{self.action}_{method}_serializer_class`, `{self.action}_read_serializer_class`, "
            f"`{self.action}_serializer_class`, `{method}_serializer_class`, "
            "`read_serializer_class`, and `serializer_class` attribute, "
            "or override the `get_serializer_class()` method"
        )
        return result

    def get_serializer_class(self):
        method = "read" if self.request.method in permissions.SAFE_METHODS else "write"
        return self.get_action_serializer(method)

    def create(self, request, *args, **kwargs):
        serializer = self.get_serializer(data=request.data)
        serializer.is_valid(raise_exception=True)
        self.perform_create(serializer)
        response_serializer = self.get_action_serializer("read")(
            instance=serializer.instance,
            context=self.get_serializer_context(),
        )
        headers = self.get_success_headers(response_serializer.data)
        return Response(response_serializer.data, status=status.HTTP_201_CREATED, headers=headers)

    def update(self, request, *args, **kwargs):
        partial = kwargs.pop("partial", False)
        instance = self.get_object()
        serializer = self.get_serializer(instance, data=request.data, partial=partial)
        serializer.is_valid(raise_exception=True)
        self.perform_update(serializer)
        if getattr(instance, "_prefetched_objects_cache", None):
            # If 'prefetch_related' has been applied to a queryset, we need to
            # forcibly invalidate the prefetch cache on the instance.
            instance._prefetched_objects_cache = {}
        response_serializer = self.get_action_serializer("read")(
            instance=serializer.instance,
            context=self.get_serializer_context(),
        )
        return Response(response_serializer.data)
Disclaimer: this code could (and probably should) be split up into multiple mixins, so you don't always get the full set of actions that come with ModelViewSet when you use ActionSerializerViewSet. Once I have a need for that in my real-world project I'll make the changes and update this post. For now I don't want to post this code to GitHub - maybe later.
03 Jun 2025 10:13pm GMT
Changing Directions
Two announcements: (1) I'm leaving the tech industry. Hopefully "for good"; if not, at least "for now". (2) As such, the content on this blog is going to shift, perhaps dramatically. I'm going to be writing about a broader range of topics that interest me (projects around my hobby farm, wilderness trips, emergency medicine) - more writing for me, less writing for some imagined audience. (I'll probably still end up writing about some of the same topics as I've been covering since 2020, just less often.) I'm writing this post mostly to give myself permission to make that change, and to give readers the opportunity to unsubscribe/unfollow if they're not interested.
03 Jun 2025 5:00am GMT
30 May 2025
Django community aggregator: Community blog posts
Django News - DjangoCon US Early Bird Tickets - May 30th 2025
News
Python Release Python 3.14.0b2
Python 3.14.0b2 beta introduces deferred type annotations, t-string templating, improved error messages, and remote debugging support that may influence Django project testing.
Updates to Django
Fixed #35629 -- Added support for async database connections and cursors.
Enhances Django's ORM with asynchronous database connections and low-level cursor support for executing raw SQL queries, improving async performance and transaction management.
Wagtail CMS
What's new in Wagtail - May 2025 highlights
May 2025 Wagtail update for Django developers details LTS release enhancements with autosave progress, dynamic StreamField previews, improved accessibility and active community contributions.
Sponsored Link 1
Open a Django office in Bulgaria with HackSoft!
Looking to expand your operations? We offer end-to-end support in setting up your Django development office. Learn more!
Articles
Faster Python Docker Builds
Optimize Django and Python Docker builds by caching dependencies, using uv pip, and multi-stage builds to drastically reduce fresh build and rebuild times.
How I'm bundling frontend assets using Django and rspack these days
Using rspack for frontend asset bundling in Django enables efficient hot module reloading, content-based cache busting, and streamlined production builds via reusable configuration snippets.
Another Great PyCon
PyCon US 2025 showcased dynamic community engagement, rapid problem-solving, creative events, and inclusive practices that resonate with Python and Django developers.
Loading Pydantic models from JSON without running out of memory
Pydantic's JSON loading uses a huge amount of memory; here's how to reduce it.
Pygrunn: django template LSP, smarter completion for django templates
Django template LSP enhances editor support with auto-completion, custom tag detection, and docker integration for improved Django template development.
Docker: disable "What's next" adverts
Disable intrusive Docker CLI adverts by setting DOCKER_CLI_HINTS to false to streamline output during Django tests and development.
Django Fellow Report
Django Fellow Report - Natalia Bidart
One ticket triaged, four reviewed, two authored, security work, t-strings research, and more.
Django Fellow Report - Sarah Boyce
Eleven tickets triaged, fifteen reviewed, two authored, security work, etc.
Events
PyBay2025: Call for Speakers @ Sessionize.com
PyBay2025 invites Python community members including Django developers to deliver innovative talks and network during its tenth anniversary celebration in San Francisco.
DjangoCon US Early Bird Tickets - Sale ends May 31st
The conference will take place from September 8 to 12 in Chicago, Illinois. Early bird ticket prices end May 31st.
Sponsored Link 2
Sponsor Django News
Podcasts
Django Chat #183: Django Deployments in 2025 - Eric Matthes
Eric is the author of Python Crash Course, the Mostly Python newsletter, and the django-simple-deploy package. We talk about rewriting the Django deployment story, different hosting providers, and teaching Python & Django to newcomers.
Episode 5: Chocolately Django REST APIs
Django Brew Episode 5 explains building robust REST APIs in Django using JsonResponse, Django REST Framework, and Django Ninja for efficient backend API development.
Django News Jobs
Senior Backend Engineer at Wasmer
Python / Django Software Developer - full-time employee: No visa sponsorship at Off Duty Management
Django Newsletter
Django Forum
AI Agent Rules
A lively discussion of current best practices for using agent rules for Django.
Supporting t-strings from Python 3.14
t-strings have been merged into Python 3.14. Adam Johnson leads off a discussion about how Django could use t-strings.
Projects
tmb/django-svg-sprite
A Django template tag for easy use of SVG sprites in templates.
koladev32/drf-simple-apikey
A simple package for API Key authentication in Django Rest with rotation integrated.
This RSS feed is published on https://django-news.com/. You can also subscribe via email.
30 May 2025 3:00pm GMT
28 May 2025
Django community aggregator: Community blog posts
Django Deployments in 2025 - Eric Matthes
- Python Crash Course, 3rd Edition
- django-simple-deploy
- Mostly Python Newsletter
- Django From First Principles Series
- DJP: A Plugin System for Django
- dsd-vps
- django-production
Sponsor
This episode was brought to you by Buttondown, the easiest way to start, send, and grow your email newsletter. New customers can save 50% off their first year with Buttondown using the coupon code DJANGO.
28 May 2025 4:00pm GMT
27 May 2025
Django community aggregator: Community blog posts
DjangoCon Europe 2025 Highlights
Three Cakti recently attended DjangoCon Europe 2025 in Dublin and it was a wonderful experience! It was great to see and chat with various Django community members we usually only see once or twice a year. Beyond that, we were most impressed by the consistently high quality of the talks throughout all three days of the conference. It was a pleasure to listen to so many excellent presentations, including the lightning talks at the end of each day. Here are some of our favorite talks.
Karen Tracey
It is hard to pick out a single favorite, so I am going to mention a few:
Tim Bell from Kraken Technologies Australia gave a talk on Converting integer fields to bigint using Django migrations at scale, followed up by a lightning talk a couple of days later that revealed how the real life production situation which was the inspiration for the talk was quite a near thing and required some daring creativity to sidestep disaster. I hope never to be in a similar situation but I do find talks on solving challenging production problems very enjoyable.
In a similar vein, Mia Bajić presented a keynote on The Most Bizarre Software Bugs in History which was fascinating. One takeaway: integration tests are invaluable and could have saved the NASA Mars Climate Orbiter from disaster. They probably wouldn't have added that much to the $327 million cost either.
Finally, I enjoyed Graham Knapp's talk on Feature Flags: Deploy to some of the people all of the time, and all of the people some of the time!. I particularly appreciated Graham's focus on the need to clean up the flags when they've outlived their usefulness. Keeping with the theme of my favorite talks from the conference, Graham noted that re-use of a feature flag had led to massive financial loss for a high frequency trading company several years ago.
Tobias McNulty
The talks Karen mentioned are some of my favorites, too, but I'll try to pick out a few more!
Sarah Boyce kicked off the conference with a keynote titled Why Django Needs You! (to do code review). I learned that the average time to merge a PR in Django is 319 days (!). I posted a summary of the key steps on LinkedIn. I encourage you to check them out and help spread the word!
Karen Jex's talk on the Anatomy of a Database Operation was also fascinating. Understanding the steps that the database goes through to "1. Parse, 2. Transform & Rewrite, 3. Plan, and 4. Execute" a query is helpful when debugging or reading query plans. The video of the talk isn't available yet, but she gave another talk on Tuning PostgreSQL to work even better at DjangoCon 2023 which is on my list to watch.
Lastly, Agnès Haasser gave a talk titled Europe, Django and two-factor authentication that went in depth on the standards and best practices for multifactor authentication. I particularly appreciated the quote, "You should worry about [multi-factor authentication] before your customer [or employer] asks you to, because when they do, it will be too late." I also posted this on LinkedIn, and I encourage you to advocate for multi-factor authentication wherever and whenever it may be needed!
Colin Copeland
To add to the talks above, I also enjoyed the following:
Haki Benita spoke on How to get Foreign Keys horribly wrong in Django. I appreciated his recommendation to always check the SQL generated by migrations, using sqlmigrate
, to ensure that the expected changes are being made to the database schema. Also, \di+
in psql is a great way to see the indexes on a table, including the foreign keys and sizes of the indexes.
How we make decisions in Django by Carlton Gibson was a great talk about the decision-making process in Django. He highlighted many topics, including the challenges presented by pushing more code into core vs keeping it in third-party packages, and the trade-offs involved in each approach. His discussion on thinking of community as trust resonated with me. In smaller groups, trust is easier, but that changes when groups grow larger. I appreciated his ideas around smaller working groups and bringing back the space for trust in larger communities.
Lastly, I also enjoyed Django for Data Science: Deploying Machine Learning Models with Django by William Vincent. Having only dabbled in training models with sample datasets from Hugging Face, I appreciated the overview of the process of not only deploying a model with Django, but also the basics of using the model to make predictions in a web context.
Conclusion
Finally, we'd be remiss for not mentioning Karen's talk, How to Enjoy Debugging in Production. Be sure to check it out once a video is available!
DjangoCon Europe 2025 was well worth the (long) trip for us, and we encourage anyone who was unable to attend the conference to check out these (and all the other) talks online once they become available!
27 May 2025 10:00am GMT
26 May 2025
Django community aggregator: Community blog posts
How I'm bundling frontend assets using Django and rspack these days
I last wrote about configuring Django with bundlers in 2018: Our approach to configuring Django, Webpack and ManifestStaticFilesStorage. An update has been a long time coming. I've wanted to write this down for a while, but each time I started explaining how configuring rspack is actually nice, I looked at the files we're using and switched to writing about something else. This time I managed to get through it - it's not that bad, I promise.
This is quite a long post. A project where all of this can be seen in action is Traduire, a platform for translating gettext catalogs. I announced it on the Django forum.
Our requirements
The requirements were still basically the same:
- Hot module reloading during development
- A process which produces hashed filenames depending on their content so that we can use far-future expiry headers to cache assets in browsers
- While running Node.js in development is fine we do not want Node.js on the server (in the general case)
- We still want transpiling and bundling for now
We have old projects using SASS. These days we're only using PostCSS (especially autoprefixer and maybe postcss-nesting). Rewriting everything is out of the question, so we needed a tool which handled all that as well.
People in the frontend space seem to like tools like Vite or Next.js a lot. I have also looked at Parcel, esbuild, rsbuild and others. Either they didn't support our old projects, were too limited in scope (e.g. no HMR), too opinionated or I hit bugs or had questions about their maintenance. I'm sure all of them are great for some people, and I don't intend to talk badly about any of them!
In the end, the flexibility, speed and trustworthiness of rspack won me over, even though I have a love-hate relationship with the Webpack/rspack configuration. We already had a reusable library of configuration snippets for webpack, though, and moving that library over to rspack was straightforward.
That being said, configuring rspack from scratch is no joke; that's why tools such as rsbuild exist. If you already know Webpack well or really need the flexibility, going low level can be good.
High-level project structure
The high-level overview is:
- Frontend assets live in their own folder, `frontend/`.
- We're using fabric and rspack; their configuration resides in the root folder of the project, as does Django's `manage.py`.
- The frontend is transpiled and bundled directly into `static/` for production and into `tmp/` during development.
- We use the HTML plugin of rspack to emit snippets containing `<link>` and `<script>` tags. The HTML snippet can be included as-is, without any postprocessing.
- `frontend/` or `frontend/static` is optionally added to `STATICFILES_DIRS` so that some of the files from the frontend can easily be referenced in `{% static %}` tags.
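For reference, the optional `STATICFILES_DIRS` hookup from the last point could look like this in the Django settings (a sketch; adjust the path to your own project layout):

```python
# settings.py sketch: make files in frontend/ available to {% static %}.
# BASE_DIR here follows the default Django settings convention.
from pathlib import Path

BASE_DIR = Path(__file__).resolve().parent.parent

STATICFILES_DIRS = [BASE_DIR / "frontend"]
```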
During development:
- We use the dev server of rspack/node to handle `127.0.0.1:8000`. This server handles requests for frontend assets and the websocket for hot module reloading, and proxies everything else to the Django backend running on a different random port.
During deployment:
- The assets are compiled to `static/` and either rsynced to the server or added to the container, separately from the standard `./manage.py collectstatic --noinput`.
In production:
- Separate cache-busting filenames from `ManifestStaticFilesStorage` and rspack allow us to set far-future expiry headers on all static assets.
- I'm serving static assets from the same origin as the website itself. (rspack can be configured for different requirements!)
- I don't worry anymore about duplicating assets which are referenced from both frontend code and backend code. This doesn't affect many assets after all.
- The HTML snippet is loaded once only.
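The `ManifestStaticFilesStorage` part of this setup is plain Django configuration; since Django 4.2 it is selected via the `STORAGES` setting. A sketch (not the author's exact settings):

```python
# settings.py sketch: hashed static filenames for cache busting, so
# far-future expiry headers are safe to set on everything under /static/.
STORAGES = {
    "default": {
        "BACKEND": "django.core.files.storage.FileSystemStorage",
    },
    "staticfiles": {
        "BACKEND": "django.contrib.staticfiles.storage.ManifestStaticFilesStorage",
    },
}
```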
Example configuration
Here's an example configuration which works well for us. What follows is the rspack configuration itself, building on our snippet library `rspack.library.js`. We mostly do not change anything in here except for the list of PostCSS plugins:
rspack.config.js:
module.exports = (env, argv) => {
  const { base, devServer, assetRule, postcssRule, swcWithPreactRule } =
    require("./rspack.library.js")(argv.mode === "production")

  return {
    ...base,
    devServer: devServer({ backendPort: env.backend }),
    module: {
      rules: [
        assetRule(),
        postcssRule({
          plugins: [
            "postcss-nesting",
            "autoprefixer",
          ],
        }),
        swcWithPreactRule(),
      ],
    },
  }
}
The default entry point is `main` and loads `frontend/main.js`. The rest of the JavaScript and styles are loaded from there.
The HTML snippet loader works by adding `WEBPACK_ASSETS = BASE_DIR / "static"` to the Django settings and adding the following tags to the `<head>` of the website, most often in `base.html`:
{% load webpack_assets %}
{% webpack_assets 'main' %}
The corresponding template tag in `webpack_assets.py` follows:
from functools import cache

from django import template
from django.conf import settings
from django.utils.html import mark_safe

register = template.Library()


def webpack_assets(entry):
    path = settings.BASE_DIR / ("tmp" if settings.DEBUG else "static") / f"{entry}.html"
    return mark_safe(path.read_text())


if not settings.DEBUG:
    webpack_assets = cache(webpack_assets)

register.simple_tag(webpack_assets)
Last but not least, the fabfile contains the following task definition:
import random

from fabric import task


@task
def dev(ctx, host="127.0.0.1", port=8000):
    backend = random.randint(50000, 60000)
    jobs = [
        f".venv/bin/python manage.py runserver {backend}",
        f"HOST={host} PORT={port} yarn run rspack serve --mode=development --env backend={backend}",
    ]
    # Run these two jobs at the same time:
    _concurrently(ctx, jobs)
The fh-fablib repository contains the `_concurrently` implementation we're using at this time.
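A simplified stand-in for that helper, using only the standard library, might look like this (a hypothetical sketch, not the actual fh-fablib implementation; the real helper also deals with output and shutdown more carefully):

```python
# Hypothetical, simplified stand-in for fh-fablib's _concurrently helper:
# start all commands, wait until the first one exits, then stop the rest.
import subprocess


def run_concurrently(jobs):
    procs = [subprocess.Popen(job, shell=True) for job in jobs]
    try:
        # Block until the first job terminates (e.g. Ctrl-C on runserver).
        procs[0].wait()
    finally:
        for proc in procs:
            proc.terminate()
    return [proc.poll() for proc in procs]
```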
The library which enables the nice configuration above
Of course, the whole library of snippets has to be somewhere. The fabfile automatically updates the library when we release a new version, and the library is the same in all the dozens of projects we're working on. Here's the current version of `rspack.library.js`:
const path = require("node:path")
const HtmlWebpackPlugin = require("html-webpack-plugin")
const rspack = require("@rspack/core")
const assert = require("node:assert/strict")
const semver = require("semver")
assert.ok(semver.satisfies(rspack.rspackVersion, ">=1.1.3"), "rspack outdated")
const truthy = (...list) => list.filter((el) => !!el)
module.exports = (PRODUCTION) => {
  const cwd = process.cwd()

  function swcWithPreactRule() {
    return {
      test: /\.(j|t)sx?$/,
      loader: "builtin:swc-loader",
      exclude: [/node_modules/],
      options: {
        jsc: {
          parser: {
            syntax: "ecmascript",
            jsx: true,
          },
          transform: {
            react: {
              runtime: "automatic",
              importSource: "preact",
            },
          },
          externalHelpers: true,
        },
      },
      type: "javascript/auto",
    }
  }

  function swcWithReactRule() {
    return {
      test: /\.(j|t)sx?$/,
      loader: "builtin:swc-loader",
      exclude: [/node_modules/],
      options: {
        jsc: {
          parser: {
            syntax: "ecmascript",
            jsx: true,
          },
          transform: {
            react: {
              runtime: "automatic",
              // importSource: "preact",
            },
          },
          externalHelpers: true,
        },
      },
      type: "javascript/auto",
    }
  }

  function htmlPlugin(name = "", config = {}) {
    return new HtmlWebpackPlugin({
      filename: name ? `${name}.html` : "[name].html",
      inject: false,
      templateContent: ({ htmlWebpackPlugin }) =>
        `${htmlWebpackPlugin.tags.headTags}`,
      ...config,
    })
  }

  function htmlSingleChunkPlugin(chunk = "") {
    return htmlPlugin(chunk, chunk ? { chunks: [chunk] } : {})
  }

  function postcssLoaders(plugins) {
    return [
      { loader: rspack.CssExtractRspackPlugin.loader },
      { loader: "css-loader" },
      { loader: "postcss-loader", options: { postcssOptions: { plugins } } },
    ]
  }

  function cssExtractPlugin() {
    return new rspack.CssExtractRspackPlugin({
      filename: PRODUCTION ? "[name].[contenthash].css" : "[name].css",
      chunkFilename: PRODUCTION ? "[name].[contenthash].css" : "[name].css",
    })
  }

  return {
    truthy,
    base: {
      context: path.join(cwd, "frontend"),
      entry: { main: "./main.js" },
      output: {
        clean: PRODUCTION,
        path: path.join(cwd, PRODUCTION ? "static" : "tmp"),
        publicPath: "/static/",
        filename: PRODUCTION ? "[name].[contenthash].js" : "[name].js",
        // Same as the default but prefixed with "_/[name]."
        assetModuleFilename: "_/[name].[hash][ext][query][fragment]",
      },
      plugins: truthy(cssExtractPlugin(), htmlSingleChunkPlugin()),
      target: "browserslist:defaults",
    },
    devServer(proxySettings) {
      return {
        host: "0.0.0.0",
        hot: true,
        port: Number(process.env.PORT || 4000),
        allowedHosts: "all",
        client: {
          overlay: {
            errors: true,
            warnings: false,
            runtimeErrors: true,
          },
        },
        devMiddleware: {
          headers: { "Access-Control-Allow-Origin": "*" },
          index: true,
          writeToDisk: (path) => /\.html$/.test(path),
        },
        proxy: [
          proxySettings
            ? {
                context: () => true,
                target: `http://127.0.0.1:${proxySettings.backendPort}`,
              }
            : {},
        ],
      }
    },
    assetRule() {
      return {
        test: /\.(png|webp|woff2?|svg|eot|ttf|otf|gif|jpe?g|mp3|wav)$/i,
        type: "asset",
        parser: { dataUrlCondition: { maxSize: 512 /* bytes */ } },
      }
    },
    postcssRule(cfg) {
      return {
        test: /\.css$/i,
        type: "javascript/auto",
        use: postcssLoaders(cfg?.plugins),
      }
    },
    sassRule(options = {}) {
      let { cssLoaders } = options
      if (!cssLoaders) cssLoaders = postcssLoaders(["autoprefixer"])
      return {
        test: /\.scss$/i,
        use: [
          ...cssLoaders,
          {
            loader: "sass-loader",
            options: {
              sassOptions: {
                includePaths: [path.resolve(path.join(cwd, "node_modules"))],
              },
            },
          },
        ],
        type: "javascript/auto",
      }
    },
    swcWithPreactRule,
    swcWithReactRule,
    resolvePreactAsReact() {
      return {
        resolve: {
          alias: {
            react: "preact/compat",
            "react-dom/test-utils": "preact/test-utils",
            "react-dom": "preact/compat", // Must be below test-utils
            "react/jsx-runtime": "preact/jsx-runtime",
          },
        },
      }
    },
    htmlPlugin,
    htmlSingleChunkPlugin,
    postcssLoaders,
    cssExtractPlugin,
  }
}
Closing thoughts
Several utilities from this library aren't used in the example above, for example the `sassRule` or the HTML plugin utilities, which are useful when you require several entry points on your website, e.g. an entry point for the public-facing website and an entry point for a dashboard used by members of the staff.
Most of the code in here is freely available in our fh-fablib repo under an open source license. Anything in this blog post can also be used under the CC0 license, so feel free to steal everything. If you do, I'd be happy to hear your thoughts about this post, and please share your experiences and suggestions for improvement - if you have any!
26 May 2025 5:00pm GMT
23 May 2025
Django community aggregator: Community blog posts
Django News - Django Sprints on the Med? - May 23rd 2025
News
Django sprints? On the Med?
A new initiative from Carlton Gibson and Paolo Melchiorre to organize three-day development sprints to get together and work on Django.
DjangoCon US early-bird tickets are going fast!
DjangoCon US 2025 early-bird tickets are now available at discounted rates through May for individuals and corporate attendees in Chicago.
Django Commons launched a website!
Django Commons launched their new website.
Django Fellow Report
Django Fellow Report - Natalia Bidart
3 triaged tickets, 8 reviewed, 1 authored, plus a lot of misc!
Django Fellow Report - Sarah Boyce
18 tickets triaged, 19 reviewed, plus misc!
Django Software Foundation
Our Google Summer of Code 2025 contributors
Google Summer of Code 2025 contributors will implement keyboard shortcuts and command palette in Django Admin, integrate template partials into core and automate contribution workflows.
Updates to Django
Today 'Updates to Django' is presented by Pradhvan from the Djangonaut Space!
Last week we had 20 pull requests merged into Django by 16 different contributors - including 4 first-time contributors! Congratulations to savanto, Kashemir001, Pablo Bengoechea and Samuel Cormier-Iijima for having their first commits merged into Django - welcome on board!
This week's Django highlights coming in Django 6.0:
- Migration serialization now handles `functools.{partial,partialmethod}` with non-identifier keyword arguments.
- Added support for negative indexing in JSONField on SQLite backends.
- Django admin's SVG icons have been refreshed and now use Font Awesome Free 6.7.2.
- `RoutePattern.match()` is now up to 45% faster for converter-less patterns; thanks to Jake Howard for working on the long-standing PR that's finally landed.
Django Newsletter
Wagtail CMS
Wagtail and Django join forces at PyConUS 2025
Bringing together Django ponies and Wagtail birds made a powerful team of champions for Python web development
Articles
Why, in 2025, do we still need a 3rd party app to write a REST API with Django?
Use Django generic class-based views and ModelForms to build simple JSON CRUD REST endpoints without third-party libraries in under 100 lines.
Another Perspective Of The Django Triage Workflow
The proposed update to Django ticket triage introduces distinct stage names and horizontal progress mapping to streamline workflow, improve clarity, and reduce overhead.
Implement Text Similarity with Embeddings in Django
Integrates BERT embeddings and pgvector in Django for semantic product matching via vector search, combining two vector fields to enhance accuracy despite data inconsistencies.
Remote Single-file Python Scripts with uv
uv enables remote, single-file Python script execution with automated Python installation, dependency handling, and inline metadata, simplifying script sharing even from private repositories.
Too much magic
Explores the balance between declarative API magic and procedural clarity, emphasizing that increased abstraction is essential in overcoming verbose implementation within frameworks such as ORMs.
Dataclass For Django Custom Command Arguments
Leveraging dataclasses for Django custom command arguments centralizes default settings and URL query construction, streamlining code and reducing potential mismatches.
GitHub Actions: avoid double runs from on: [push, pull_request]
Using separate rules for GitHub actions events prevents redundant CI runs when both push and pull_request triggers fire, reducing costs, delays, and flaky failures.
Djangonaut Space Financial Report 2024
Djangonaut Space reported donations and a year-end balance, supporting the open-source Django community through conference aid and operational tools.
Events
DjangoCon US 2025 - Tickets now available!
Early-bird ticket prices are available until June 1st.
Videos
PyCon US 2025 videos are up!
The PyCon US 2025 videos are now available and being published in batches.
PyTexas 2025 videos are up!
PyTexas 2025 videos are up and offers advanced Python presentations on async processing, testing strategies, and tooling with insights applicable to Django development.
Sponsored Link 2
AI-Powered Django Development & Consulting
REVSYS specializes in seamlessly integrating powerful AI technologies, including GPT-4, into your existing Django applications. Your Django project deserves modern, intelligent features that enhance user engagement and streamline content workflows.
Django News Jobs
Check out these three new backend Python and Django roles at Wasmer, Off Duty Management, and Paytree.
Senior Backend Engineer at Wasmer
Python / Django Software Developer - full-time employee: No visa sponsorship at Off Duty Management
Backend Python Developer (Django/DRF) at Paytree
Django Newsletter
Projects
radiac/django-style
Basic tasteful designs for your Django project with plain CSS, Tailwind 4, or Bootstrap 5.
smattymatty/django_spellbook
Transforms markdown files into fully-rendered Django templates with auto-generated views and URLs, eliminating boilerplate code while maintaining Django's flexibility.
This RSS feed is published on https://django-news.com/. You can also subscribe via email.
23 May 2025 3:00pm GMT
22 May 2025
Django community aggregator: Community blog posts
Django, JavaScript modules and importmaps
How I'm using Django, JavaScript modules and importmaps together
I have been spending a lot of time in the last few months working on django-prose-editor. First I rebuilt the editor on top of Tiptap because I wanted a framework for extending the underlying ProseMirror and didn't want to reinvent this particular wheel. While doing that work I noticed that using JavaScript modules in the browser would be really nice, but Django's `ManifestStaticFilesStorage` doesn't yet support rewriting `import` statements in modules out of the box without opting into the experimental support accessible through subclassing the storage. A better way to use JavaScript modules with the cache busting offered by `ManifestStaticFilesStorage` would be importmaps.
Motivation
Developing Django applications that include JavaScript has always been challenging when it comes to properly distributing, loading, and versioning those assets. The traditional approach using Django's `forms.Media` works well for simple use cases, but falls short when dealing with modern JavaScript modules.
The ability to ship reusable JavaScript utilities in third-party Django apps has been a pain point for years. Often developers resort to workarounds like bundling all JS into a single file, using jQuery-style global variables, or requiring complex build processes for consumers of their apps.
Importmaps offer a cleaner solution that works with native browser modules, supports cache busting, and doesn't require complex bundling for simple use cases.
The history
The conversation around better JavaScript handling in Django has been ongoing for years. Thibaud Colas' DEP draft comes to mind, as does the discussion about whether to improve or deprecate `forms.Media`.
A few packages exist which offer solutions in this space:
- django-esm provides a solution for using ES modules with Django without bundling.
- django-js-asset provides helpers for delivering JavaScript modules, importmaps, JSON blobs etc. to the browser through Django's `forms.Media`. The blog post Object-based assets for Django's forms.Media explores this in more detail.
- The article on Content Security Policy compliance explores better approaches to use JavaScript in the Django admin while avoiding inline JavaScript.
django-js-asset came before Django added official support for object-based media CSS and JS paths but has since been changed to take advantage of that official support. It has enabled the removal of ugly hacks. In the meantime, Django has even added official support for object-based `Script` tags.
My DEP draft
Building on these efforts, I've been thinking about submitting my own DEP draft for importmap support. It hasn't come far yet, though; I'm still occupied with verifying and using my existing solution, especially learning whether it has limitations that would make the implemented approach unworkable for official inclusion.
The current effort
As alluded to above, I already have a working solution for using importmaps (in django-js-asset) and I'm actively using it in django-prose-editor. Here's how it works:
importmap.update({
    "imports": {
        "django-prose-editor/editor": static_lazy("django_prose_editor/editor.js"),
    }
})
A minimal editor implementation using this:
import {
  // Tiptap extensions
  Document, Paragraph, HardBreak, Text, Bold, Italic,
  // Prose editor utilities
  Menu, createTextareaEditor, initializeEditors,
} from "django-prose-editor/editor"

const extensions = [
  Document, Paragraph, HardBreak, Text, Bold, Italic, Menu,
]

initializeEditors((textarea) => {
  createTextareaEditor(textarea, extensions)
})
The importmap looks as follows when using Django's `ManifestStaticFilesStorage`, which produces filenames containing the hash of the file's contents for cache busting (edited for readability):
<script type="importmap">
{"imports": {
"django-prose-editor/editor": "/static/django_prose_editor/editor.6e8dd4c12e2e.js"
}}
</script>
This means that when your code has `import { ... } from "django-prose-editor/editor"`, the browser automatically loads the file from `/static/django_prose_editor/editor.6e8dd4c12e2e.js`. The hashed filename provides cache busting while the import statement remains clean and consistent.
Problems with the current implementation
While this approach works, there are several issues to address:
- I don't really like global variables, but there doesn't seem to be a way around it. Browsers want to use a single importmap only (even though the algorithm for merging importmaps exists in the spec!) and the importmap has to be included above all ES modules.
- The importmap may be added twice to the HTML when using a widget that works in both the admin and frontend contexts. Currently, if you want to avoid this problem or ugliness you have to determine in your Django form field if the code is requesting an admin widget or another widget, either by inspecting the callstack (very ugly) or by checking if the `widget` argument to the form field constructor is set to an admin-specific widget (also somewhat ugly, since widgets can be classes, instances, or not provided at all).
- It would be nice if the installation of django-prose-editor didn't have more steps than installing any other Django widget integration. I'd like a more elegant solution, but haven't found one yet that doesn't introduce too much magic.
Comparison to django-esm
django-esm takes a different approach. It assumes you're using JavaScript modules everywhere and solves the problem of exposing the correct paths to those modules to the browser. It supports both private modules from your repository and modules installed in `node_modules`.
However, it doesn't fully address the scenario where a third-party Django app (a Python package) ships JavaScript modules that need to be integrated into your application.
I still use a bundler for most of my JavaScript from `node_modules`, so I don't need this specific functionality yet. That will probably change in the future.
Using bundlers
If you're still using a bundler, as I do, you want to ensure that the `import` isn't actually evaluated by the bundler but left as-is. The rspack configuration I'm using at the moment is also documented in the django-prose-editor README, but I'm duplicating it here for convenience:
module.exports = {
  // ...
  experiments: { outputModule: true },
  externals: {
    "django-prose-editor/editor": "module django-prose-editor/editor",
    // Or the following, I'm never sure.
    "django-prose-editor/editor": "import django-prose-editor/editor",
  },
}
This configuration marks the dependency as "external" (so it won't be bundled) and specifies that it should be loaded as a module using a static `import` statement.
For browser compatibility, you can also include es-module-shims to support browsers that don't yet handle importmaps natively (around 5% at the time of writing according to caniuse.com).
Using django-compressor or similar packages
Tools like django-compressor aren't well-suited for modern JavaScript modules as they typically produce old-style JavaScript files rather than ES modules. They're designed for a different era of web development and don't integrate well with the importmap approach.
Conclusion
Using importmaps with Django provides a clean solution for managing JavaScript modules in Django applications, especially for third-party apps that need to ship their own JavaScript. While there are still some rough edges to smooth out, this approach works well and offers a path forward that aligns with modern web standards.
Have you tried using importmaps with Django? I'd be interested to hear about your experiences and approaches.
22 May 2025 5:00pm GMT