07 Nov 2025

Django community aggregator: Community blog posts

Django News - Django security releases issued: 5.2.8, 5.1.14, and 4.2.26 - Nov 7th 2025

News

Django security releases issued: 5.2.8, 5.1.14, and 4.2.26

Django 5.2.8, 5.1.14, and 4.2.26 fix a Windows NFKC-normalization redirect DoS and a high-severity SQL injection via the QuerySet/Q _connector argument; upgrade now.

djangoproject.com

2026 DSF Board Candidates

DSF presents 19 candidates for three 2026 board seats with voting open to eligible members until November 26, 2025, 23:59 AOE.

djangoproject.com

Announcing DjangoCon Europe 2026 in Athens, Greece! ☀️🏖️🏛️🇬🇷

DjangoCon Europe 2026 will take place in Athens April 15 to 17 with Django and Python talks, workshops, sprints, and community engagement opportunities.

djangoproject.com

Django Software Foundation

Django Developers Survey 2025 results

2025 Django Developers Survey reveals key trends, tooling preferences, and actionable recommendations for Django development, with a full report and JetBrains analysis.

djangoproject.com

Python Software Foundation

Connecting the Dots: Understanding the PSF's Current Financial Outlook

PSF reports declining revenue, increased costs, and paused grants, urging community support and sponsorships to sustain PyPI, PyCon US, and core Python infrastructure.

blogspot.com

Wagtail CMS News

A new approach to search and more in Wagtail 7.2

With fully revamped search, readability checks, and more, this is a collection of new features you don't want to miss

wagtail.org

The 2025 State of Django's top packages

The State of Django 2025 shows Django Ninja and Wagtail rank highly by downloads, highlighting popular API, auth, and integration needs for Django projects.

wagtail.org

Updates to Django

Today, "Updates to Django" is presented by Raffaella from Djangonaut Space! 🚀

Last week we had 14 pull requests merged into Django by 7 different contributors - including 2 first-time contributors! Congratulations to Michal Mládek and varunkasyap for having their first commits merged into Django - welcome on board!

Django 6.1 news:

Django Newsletter

Sponsored Link 1

Until November 9, 2025, get PyCharm for 30% off. All money goes to the Django Software Foundation!

This annual promotion has raised over $330,000 for the Django Software Foundation over the years, by far the single biggest fundraiser for Django. If you're interested in trying out PyCharm Pro for the first time, this is the way to do it.

jetbrains.com

Articles

</> htmx ~ The fetch()ening

htmx 4.0 rewrites internals to use fetch, explicit :inherited attributes, network-backed history, and streaming swaps, simplifying interactions with server-rendered Django templates.

htmx.org

An Annual Release Cycle for Django

Proposal to move Django to an annual calendar-based release cycle with each release as an LTS, tighter Python support window, and more explicit stability guarantees.

buttondown.com

Rippling's Gunicorn pre-fork journey

Rippling converted their Django monolith to Gunicorn pre-fork with lifecycle hooks, proxies, and gc.freeze to cut memory by over 70% and costs by 30%.

rippling.com

Your first django PR - from scratch to improved patch

Step-by-step git and local development workflow for improving existing Django patches, running tests, using pre-commit hooks, squashing commits, and creating a proper PR.

dev.to

Thoughts about Django-based content management systems

Prefer building a lightweight Django CMS on top of the Django admin, using modular components to reduce maintenance, enable faster upgrades, and support structured content editing.

406.ch

Finding (implicitly) inherited HTMX attributes

Locate and convert implicit HTMX attribute inheritance to explicit hx-inherit by logging in getAttributeValueWithDisinheritance before upgrading to HTMX v4 in Django projects.

noumenal.es

Forum

PEP 810: Explicit lazy imports - PEPs

The Python Steering Council unanimously approved PEP 810, Explicit Lazy Imports, praising the authors for improving on past proposals and delivering a well-balanced design. The Council endorsed the use of the lazy keyword, made minor recommendations for clarity and completeness, and expressed appreciation for the authors' work.

python.org

Events

PyCon US 2026 - Call for Proposals Now Open!

We're so excited to announce that PyCon US 2026 is heading to California for our first year in our sunny new host city of Long Beach, CA! W...

blogspot.com

DjangoCon Videos

DjangoCon US 2025 Videos

DjangoCon US 2025 was held in Chicago, Illinois, USA, in September 2025.

djangotv.com

Django News Jobs

Senior Python Developer at Basalt Health 🆕

Senior Back-End Developer at Showcare 🆕

Software Engineer Lead at Center for Academic Innovation, University of Michigan

Part-Time Senior Full-Stack Engineer (Python/Django) (gn) at voiio

Founding Backend Engineer (On-site San Francisco) - Python • AWS • LLM/RAG at Purrfect Hire

Senior Python Developer at Basalt Health

Senior Software Engineer (Python and Solidity) at LiquidFi

Django Newsletter

Projects

Arfey/django-async-backend

Mykhailo Havelia has pulled @fcurella's initial work on async cursors for the Django ORM out into a separate DB backend. That means you can try it out on your projects and give feedback. Via Carlton

github.com

feincms/django-tree-queries

Adjacency-list trees for Django using recursive common table expressions. Supports PostgreSQL, SQLite, MySQL, and MariaDB.

github.com


This RSS feed is published on https://django-news.com/. You can also subscribe via email.

07 Nov 2025 5:00pm GMT

06 Nov 2025


Hitting Limits and Noticing Clues in Graphs

Sometimes the limit you hit when dealing with high traffic on a website isn't the limit that needs to be raised. We encountered this recently on a site we're helping to maintain and upgrade. The site has been around since the very early days of Django. It was built back in the days when Apache with mod_wsgi (or even mod_python!) was one of the more common Django deployment environments.

06 Nov 2025 4:53am GMT

Cursor vs. Claude for Django Development

This article looks at how Cursor and Claude compare when developing a Django application.

06 Nov 2025 4:28am GMT

05 Nov 2025


Thoughts about Django-based content management systems


I have almost exclusively used Django for implementing content management systems (and other backends) since 2008.

In this time, content management systems have come and gone. The big three systems many years back were django CMS, Mezzanine and our own FeinCMS.

During all this time I have always kept an eye open for other CMS than our own but have steadily continued working in my small corner of the Django space. I think it's time to write down why I have been doing this all this time, for myself and possibly also for other interested parties.

Why not use Wagtail, django CMS or any of those alternatives?

Let's start with the big one. Why not use Wagtail?

The Django administration interface is actually great. Even though some people say that it should be treated as a tool for developers only, recent improvements to the accessibility and the general usability suggest otherwise. I have written more about my views on this in The Django admin is a CMS. Using and building on top of the Django admin is a great way to immediately profit from all current and future improvements without having to reimplement anything.

I don't want to have to reimplement Django's features, I want to add what I need on top.

Faster updates

Everyone implementing and maintaining other CMS is doing a great job and I don't want to throw any shade. I still feel that it's important to point out that systems can make it hard to adopt new Django versions on release day:

These larger systems have many more (very talented) people working on them. I'm not saying I'm doing a better job. I'm only pointing out that I'm following a different philosophy where I'm conservative about running code in production and I'd rather have less features when the price is a lot of maintenance later. I'm always thinking about long term maintenance. I really don't want to maintain some of these larger projects, or even parts of them. So I'd rather not adopt them for projects which hopefully will be developed and maintained for a long time to come. By the way: This experience has been earned the hard way.

The rule of least power

From Wikipedia:

In programming, the rule of least power is a design principle that "suggests choosing the least powerful [computer] language suitable for a given purpose". Stated alternatively, given a choice among computer languages, classes of which range from descriptive (or declarative) to procedural, the less procedural, more descriptive the language one chooses, the more one can do with the data stored in that language.

Django itself already provides lots and lots of power. I'd argue that a very powerful platform on top of Django may be too much of a good thing. I'd rather keep it simple and stupid.

Editing heterogeneous collections of content

Django admin's inlines are great, but they are not sufficient for building a CMS. You need something to manage different types. django-content-editor does that and has done that since 2009.

When Wagtail introduced the StreamField in 2015 it was definitely a great update to an already great CMS, but it wasn't a new idea in general, nor a new thing in Django land. They never claimed it was, and I welcomed the fact that they, too, adopted a better way of structuring content.

Structured content is great. Putting everything into one large rich text area isn't what I want. Django's ORM and admin interface are great for actually modelling the data in a reusable way. And when you need more flexibility than what's offered by Django's forms, the community offers many projects extending the admin. These days, I really like working with the django-json-schema-editor component; I even reference other model instances in the database and let the JSON editor handle the referential integrity transparently for me (so that referenced model instances do not silently disappear).

More reading

The future of FeinCMS and the feincms category may be interesting. Also, I'd love to talk about these thoughts, either by email or on Mastodon.

05 Nov 2025 6:00pm GMT

04 Nov 2025


Weeknotes (2025 week 45)


Autumn is nice

I love walking through the forest with all the colors and the rustling when you walk through the leaves on the ground.

Updated packages since 2025-10-23

04 Nov 2025 6:00pm GMT

PyUtrecht (NL) meetup: the future of Python typing - Victorien Plot

(One of my summaries of the PyUtrecht meetup in Utrecht, NL).

Note: Victorien is currently the number one person maintaining Pydantic. Pydantic is basically "dataclasses with validation".

There was a show of hands: about 70% use type hints. Type hints have been around since Python 3.5. There have been improvements over the years, like str | None instead of Union[str, None] in 3.10, for instance.

Something I didn't know: you can always introspect type hints at runtime in your Python code: typing.get_type_hints(my_func).
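A quick illustration with a made-up function - the evaluated annotations come back as a plain dict at runtime:

import typing


def greet(name: str, times: int = 1) -> str | None:
    return ' '.join([f'Hello {name}'] * times) if times else None


print(typing.get_type_hints(greet))
# roughly: {'name': <class 'str'>, 'times': <class 'int'>, 'return': str | None}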

Getting typing-related changes into Python takes a lot of work. You need to implement the changes in CPython, update the spec, and get them supported by the major type checkers. That's a real difference from TypeScript, where typing is built in from the start.

Something that will help typing in the future is the lazy import syntax planned for 3.15 (lazy from xxx import yyy).

There's an upcoming PEP 764, "inline typed dictionaries":

def get_movie() -> {"name": str, "year": int}:
    # At least something like this ^^^, I can't type that quickly :-)
    ...

He has some suggestions for a new syntax, using something like <{ .... }>, but getting a syntax change into Python takes a lot of talking and a really solid proposal.

04 Nov 2025 5:00am GMT

PyUtrecht (NL) meetup: streaming telemetry (network monitoring with gRPC and gNMI) - Maurice Stoof

(One of my summaries of the PyUtrecht meetup in Utrecht, NL).

"From SNMP to gRPC". Maurice is working on network automation. (The link goes to his github account, the presentation's demo code is there).

SNMP, the Simple Network Management Protocol, has been the standard for network monitoring since the late 1980s. But its age is showing. It is polling-based, which is wasteful: the mechanism continually polls the endpoints. It is like checking for new messages on your phone every minute instead of relying on push messaging.

The better way is streaming telemetry, the push model. He uses gRPC, "A high performance, open source universal RPC framework" and gNMI, "gRPC Network Management Interface".

You can ask for capabilities: used in the discovery phase. Get is a simple one-time request for a specific value. With set you can do a bit of configuring. The magic is in subscribe: it creates a persistent connection, allowing the device to continuously stream data back to the client (according to the settings done with "set").

(For the demo, he uses pygnmi, a handy Python library for gNMI.)
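A hedged sketch of what that can look like, assuming pygnmi's gNMIclient interface and a placeholder device address and credentials:

from pygnmi.client import gNMIclient

# Placeholder target; in the demo this would be a real network device.
host = ('192.0.2.1', 57400)

with gNMIclient(target=host, username='admin', password='admin', insecure=True) as gc:
    print(gc.capabilities())                                  # discovery phase
    print(gc.get(path=['openconfig-interfaces:interfaces']))  # one-time read
    # A subscription would keep this connection open and let the device
    # stream updates back, instead of the client polling it.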

When to use streaming?

  • With high-frequency monitoring. If you need data more frequently than once every 10 seconds.
  • When you need real-time alerting.
  • Large-scale deployments. With lots of devices, the efficiency gains over polling start to pay off.

SNMP is still fine when you have a small setup and high frequency isn't really needed.

04 Nov 2025 5:00am GMT

03 Nov 2025


The silent mistake that's slowing down your Django app

Hey, Happy Monday!

Why are you getting this: You signed up to receive this newsletter on Built with Django. I promised to send you the latest projects and jobs on the site, as well as any other interesting Django content I encountered during the month. If you don't want to receive this newsletter, feel free to unsubscribe anytime.

News and Updates

Sponsors

Instead of doing a sponsoring block this week, I want to share a launch of my product: TuxSEO.

TuxSEO - Your Content Team on Auto-Pilot | Product Hunt

If you could check it out, it would mean the world to me ❤️

Projects

Jobs

Blog Posts from the Community

Support

You can support this project by using one of the affiliate links below. These are always going to be projects I use and love! No "Bluehost" crap here!

03 Nov 2025 8:00pm GMT

31 Oct 2025


Django News - Django 2025 Survey Results and Django's annual fundraiser - Oct 31st 2025

News

PyCharm & Django annual fundraiser

Boost productivity and contribute to Django initiatives by purchasing PyCharm at a 30% discount while supporting Django Fellows, the DSF Foundation, and many conferences and events including Django Girls.

djangoproject.com

Django Developers Survey 2025 Results

The Django Developer Survey 2025 highlights widespread adoption of recent Django versions, increased async integration, and robust community preferences in tools, testing, and infrastructure.

Have thoughts? Share them on the State of Django 2025 forum post.

jetbrains.com

Django is now a CVE Numbering Authority (CNA)

Django Software Foundation has been authorized by the CVE Program as a CVE Numbering Authority (CNA)! This means Django can be more autonomous as part of the process for assigning CVE IDs to vulnerabilities and creating/publishing info about the vulnerability in the associated CVE Record.

djangoproject.com

2026 DSF Board Nominations (last call)

LAST CALL: If you are interested in helping to support the development of Django we'd enjoy receiving your application for the Board of Directors. Please fill out the 2026 DSF Board Nomination form by 23:59 on October 31, 2025 Anywhere on Earth to be considered.

djangoproject.com

State of MariaDB 2025 Survey

If you use MariaDB with Django, please take a moment to fill out their annual survey on usage.

typeform.com

Django Software Foundation

On the Air for Django's 20th Birthday: Special Event Station W2D

Adam Fast writes about how three amateur radio operators spent two weeks broadcasting a special event call sign, W2D, making 1,026 radio contacts with radio operators in 47 geopolitical entities.

djangoproject.com

DSF member of the month - Anna Makarudze

Anna Makarudze is the DSF member of the month for her dedicated leadership as Former President and Chair of DjangoCon Africa. Discover her journey and the significant contributions she has made to strengthen the Django community.

djangoproject.com

Python Software Foundation

The PSF has withdrawn $1.5 million proposal to US government grant program

Kudos to the PSF for this stance. As they note in this blog post "the PSF simply can't agree to a statement that we won't operate any programs that 'advance or promote' diversity, equity, and inclusion, as it would be a betrayal of our mission and our community."

blogspot.com

Improving security and integrity of Python package archives

PSF white paper details archive vulnerabilities undermining Python package integrity and recommends enhancing security in ZIP and tar implementations and reproducible builds.

blogspot.com

Open Infrastructure is Not Free: PyPI, the Python Software Foundation, and Sustainability

Sustainable funding for PyPI requires long-term vendor partnerships, optimized caching, and expanded PSF investments to support exponential usage growth and ensure reliable operations.

blogspot.com

Updates to Django

Today, "Updates to Django" is presented by Rim Choi from Djangonaut Space 🚀

Last week we had 12 pull requests merged into Django by 9 different contributors - including 4 first-time contributors! Congratulations to Annabelle Wiegart 🚀, Emmanuel Ferdman, Matt Shirley and nzioker for having their first commits merged into Django - welcome on board!

News for this week:

A crucial fix for a potential log injection vulnerability in Django's development server (runserver) was merged this week. The patch improves security by escaping control characters in user inputs before they're passed to the logging utility.

The long-running discussion about adding analytics to djangoproject.com - using privacy-friendly tools like Plausible or Umami - has advanced with the creation of an official GitHub issue. The goal is to collect insights that will help guide future documentation and website improvements.

Django Newsletter

Sponsored Link 1

Peace of Mind for Your Django Projects

Great code doesn't keep you up at night. From maintenance to scalability, we've got your Django project under control. 🧑‍💻 Partner with HackSoft today!

hacksoft.io

Articles

Three times faster with lazy imports

Python 3.14/3.15 release manager Hugo van Kemenade benchmarks Python's proposed explicit lazy imports (PEP 810) showing that enabling lazy loading can make command-line tools like pypistats start up nearly three times faster by deferring module initialization until actually needed.

hugovk.dev

Reliable Django Signals

Using background tasks to reliably execute signal receivers

hakibenita.com

Building a Foundation: Migrating pyOpenSci to Django

Migrating pyOpenSci from Jekyll to Django leverages Wagtail, Tailwind CSS, and CI/CD to create a scalable, Python-native dynamic website foundation.

quansight.org

Loopwerk: Async Django: a solution in search of a problem?

Django async support introduces added complexity while delivering minimal performance gains; offloading to background workers remains a more pragmatic solution for most typical applications.

loopwerk.io

Time deltas are not intuitive

Understanding Python timedelta normalization is crucial for accurate duration computations in Django projects, allowing precise time tracking across days, negative durations, and display formatting.

mostlypython.com

uv is the best thing to happen to the Python ecosystem in a decade

uv automates Python version management, virtual environment creation, and dependency resolution with remarkable speed, offering Django developers a streamlined tool for consistent development environments.

emily.space

Why UUIDs won't protect your secrets

Django applications must secure sensitive resources by enforcing explicit authorization rather than relying solely on unguessable UUIDs, which expose inherent guessing vulnerabilities.

alexsci.com

Django Fellow Report

Django Fellow Report - Natalia

A big week marked by the Django 6.0 beta 1 release, an important step toward the final 6.0 milestone.

The week was heavy on debugging tricky test failures related to Python 3.14 and our parallel runner. Then, the usual: plenty of coordination, a few rabbit holes, but good progress overall.

djangoproject.com

Django Fellow Report - Jacob

I helped land two major 6.1 features this week: model field fetching modes, and database-level delete options. I also advanced some reviews for Djangonaut Space participants.

djangoproject.com

Forum

PEP 810: Explicit lazy imports

Proposal for an opt-in lazy import syntax that defers module loading until first use, aiming for faster startup, lower memory, and clear semantics with zero overhead when not used.

python.org

Sponsored Link 2

The State of Django 2025

Insights from 4,600 Django developers worldwide.

jetbrains.com

Podcasts

Django Chat #188: Django Survey 2025 with Jeff Triplett

Django Board Member Jeff Triplett joins us to discuss the results from the Django Survey, highlighting key trends, packages, and actionable ideas.

djangochat.com

Django News Jobs

Software Engineer Lead at Center for Academic Innovation, University of Michigan

Part-Time Senior Full-Stack Engineer (Python/Django) (gn) at voiio

Founding Backend Engineer (On-site San Francisco) - Python • AWS • LLM/RAG at Purrfect Hire

Senior Python Developer at Basalt Health

Senior Software Engineer (Python and Solidity) at LiquidFi

Django/Python Full-stack Engineer at JoinTriple.com

Django Newsletter

Projects

wagtail/queryish: A library for constructing queries on arbitrary data sources following Django's QuerySet API

A library for constructing queries on arbitrary data sources following Django's QuerySet API - wagtail/queryish

github.com

kraken-tech/django-subatomic

Precise control over transaction logic in Django.

github.com


This RSS feed is published on https://django-news.com/. You can also subscribe via email.

31 Oct 2025 3:00pm GMT

30 Oct 2025


Reliable Django Signals


Django signals are extremely useful for decoupling modules and implementing complicated workflows. However, the underlying transport for signals makes them unreliable and subject to unexpected failures.

In this article, I present an alternative transport implementation for Django signals using background tasks which makes them reliable and safer to use in mission critical workflows.

Table of Contents

A Common Workflow

Say you have an application that accepts payments from users. Usually, you don't go and implement your own payment solution. Instead, you integrate with some 3rd-party provider.

Creating a Payment Process

This is a common workflow for integrating with a 3rd-party payment provider:

  1. You create some payment process in the provider's system
  2. You redirect the user to some URL, or you get something to pass to the provider's client SDK
  3. Sometime in the future you get notified about the status of the payment, usually by webhook or redirect

A simple state machine for a payment process can look like this:

(figure: payment process state machine)

To keep track of payments you create a simple Django module:

# payment/models.py

from typing import Literal, Self
from django.db import models, transaction


class Error(Exception):
    # Abstract.
    pass

class StateError(Error):
    pass


class PaymentProcess(models.Model):
    id = models.BigAutoField(primary_key=True)
    amount = models.BigIntegerField()
    status: Literal['initiated', 'succeeded', 'failed'] = models.CharField(max_length=20)

    @classmethod
    def create(cls, *, amount: int) -> Self:
        assert amount > 0
        return cls.objects.create(amount=amount, status='initiated')

    @classmethod
    def set_status(cls, id: int, *, succeeded: bool) -> Self:
        with transaction.atomic():
            payment_process = cls.objects.select_for_update(of=('self', ), no_key=True).get(id=id)
            if payment_process.status not in {'initiated'}:
                raise StateError()
            if succeeded:
                payment_process.status = 'succeeded'
            else:
                payment_process.status = 'failed'
            payment_process.save()

        return payment_process

To create a new payment process you provide an amount:

>>> from payment.models import PaymentProcess
>>> pp = PaymentProcess.create(amount=100_00)
>>> print(vars(pp))
{'id': 1, 'amount': 10000, 'status': 'initiated'}

The initial status is "initiated". At some point you'll make an API call to your payment provider, get some ID and pass it over to the client - this is outside the scope of this article.

Next, the user interacts with the 3rd-party to provide their payment details. When the user is done, you get an update on the outcome of the payment, usually a webhook or a redirect, and you set the status of the payment process in your local database:

>>> pp = PaymentProcess.set_status(1, succeeded=True)
>>> print(vars(pp))
{'id': 1, 'amount': 10000, 'status': 'succeeded'}

The payment succeeded and the status is set to "succeeded". So far so good!

Now that you can process payment, you can move on to handling orders.

Placing an Order

On your website, the user browses around until they find something they want and proceeds to checkout. At this point, you calculate the amount to be paid and create an order with a payment process. The user then interacts with the payment process to complete the payment. Based on the outcome of the payment, you decide whether the order should be fulfilled or cancelled.

A state machine for an order can look like this:

(figure: order state machine)

To keep track of orders you create a new "orders" module:

from django.db import models, transaction
from payment.models import PaymentProcess
from typing import Literal, Self

class Order(models.Model):
    id = models.BigAutoField(primary_key=True)
    payment_process = models.ForeignKey(PaymentProcess, on_delete=models.PROTECT)
    amount = models.BigIntegerField()
    status: Literal['pending_payment', 'completed', 'cancelled'] = models.CharField(max_length=20)

    @classmethod
    def create(cls, *, amount: int) -> Self:
        assert amount > 0
        with transaction.atomic():
            payment_process = PaymentProcess.create(amount=amount)
            order = cls.objects.create(
                payment_process=payment_process,
                amount=amount,
                status='pending_payment',
            )
        return order

To create the order you provide an amount to charge. The module then creates a payment process for the same amount and associates it with your order via a foreign key:

>>> o = Order.create(amount=120_00)
>>> print(vars(o))
{ 'id': 1, 'payment_process_id': 2, 'amount': 12000, 'status': 'pending_payment'}
>>> print(vars(o.payment_process))
{ 'id': 2, 'amount': 12000, 'status': 'initiated'}

In real life, when you create an order you keep a lot more information such as the user who placed the order, the items, shipping information and so on. All of this is not important for this article, so we ignore it.


Decoupling Modules

The initial state of an order is "pending_payment" and the current state of the payment is "initiated". The next step is for the user to complete the payment.

When a payment is updated, we need to update the state of the order. Here is a function that given a payment process, sets the status of the order:

# order/models.py
class Order(models.Model):
    # ...

    @classmethod
    def on_payment_completed(cls, *, payment_process_id: int) -> Self:
        """Update the order status based on the payment process status."""
        with transaction.atomic():
            order = (
                cls.objects
                .select_related('payment_process')
                .select_for_update(of=('self', ), no_key=True)
                .get(payment_process_id=payment_process_id)
            )
            if order.status not in {'pending_payment'}:
                return order

            match order.payment_process.status:
                case 'succeeded':
                    order.status = 'completed'
                case 'failed':
                    order.status = 'cancelled'
                case 'initiated':
                    assert False, f'Unexpected payment process status "{order.payment_process.status}"'
                case ever:
                    assert_never(ever)

            order.save()

        return order

The function first looks for the order that references the payment process. If the status of the order is not "pending_payment", we assume this function was already called, and we return the order. This provides some level of idempotency. In real life, you probably should verify that the current state of the order matches the state of the provided payment process.

Next, update the status of the order based on the status of the payment process, save to the database, and return the updated order.

This is where it gets hairy...

Who's in charge of calling this function? The order is not aware of changes to the payment process, so what's triggering this function?

Circular Dependency

When you create an order, the order creates a payment process. The order module is referencing the payment module using a foreign key, therefore, the order module depends on the payment module:

(figure: module dependencies)

In our workflow, after the user completes the payment, the payment module receives a webhook with the outcome of the payment, and the status of the payment process is updated. Our order is not aware of changes to the payment process, so at what point do we trigger a change in the order?

A naive way of doing this is to simply update the order directly from the payment process using the reverse relation:

diff --git i/payment/models.py w/payment/models.py
 from typing import Literal, Self
 from django.db import models, transaction
+from order.models import Order

@@ -13,23 +13,25 @@  class PaymentProcess(models.Model):
     @classmethod
     def set_status(cls, id: int, *, succeeded: bool) -> Self:
         with transaction.atomic():
             payment_process = cls.objects.select_for_update(of=('self', ), no_key=True).get(id=id)
             if payment_process.status not in {'initiated'}:
                 raise StateError()
             if succeeded:
                 payment_process.status = 'succeeded'
             else:
                 payment_process.status = 'failed'
             payment_process.save()

+            Order.on_payment_completed(payment_process_id=payment_process.id)

         return payment_process

Now, when the payment process is updated, it explicitly goes to the order and attempts to update it as well. However, if you try to execute this, you'll get an exception:

$./manage.py check
Traceback (most recent call last):
    ... snipped ...
ImportError: cannot import name 'PaymentProcess' from partially initialized
module 'payment.models' (most likely due to a circular import) (payment/models.py)

Python is warning us about a circular dependency! An order currently references a payment process. With this change, the payment process is referencing the order back - this creates a circular dependency:

(figure: circular dependency)

There is a way to make this work - you can import the order inside the function - but you should really avoid that. A circular dependency is usually a symptom of bad design!

Another reason this is a bad approach is that the payment process is a low-level module - it can potentially be used by modules other than order. Should a low-level module like payment be aware of all the modules that use it? This won't scale well and will create a web of dependencies within the application.

Polling Changes

To avoid a circular dependency we can't have the payment process reference the order directly. Another approach is for the order to periodically check for changes in the relevant payment processes:

from django.core.management.base import BaseCommand
from ...models import Order

class Command(BaseCommand):
    help = 'Check all orders with pending payment and update their status if needed.'

    def handle(self, *args, **options):
        for payment_process_id in Order.objects.filter(
            status='pending_payment',
            payment_process__status__in=['succeeded', 'failed'],
        ).values_list('payment_process_id', flat=True):
            order = Order.on_payment_completed(payment_process_id=payment_process_id)
            self.stdout.write(self.style.SUCCESS(f'order {order.id} status changed to "{order.status}"'))

This Django management command looks for orders pending payment whose payment process has reached either the "failed" or "succeeded" state, and triggers a status update for each such order.

To demonstrate, create an order and mark the payment as successful:

>>> o = Order.create(amount=120_00)
>>> o.payment_process_id
3
>>> pp = PaymentProcess.set_status(3, succeeded=True)
>>> print(vars(pp))
{'id': 3, 'amount': 12000, 'status': 'succeeded'}
>>> o.refresh_from_db()
>>> print(vars(o))
{'id': 2, 'payment_process_id': 3, 'amount': 12000, 'status': 'pending_payment'}

Notice that the payment completed successfully, but the order is pending payment. Let's use the management command to sync the state:

$ ./manage.py sync_orders_pending_payment
order 2 status changed to "completed"

Great! You can now execute this task on a schedule and your orders will eventually reach the correct state.

This approach has several advantages and disadvantages:

Using scheduled tasks is a good and reliable solution. However, for user-facing workflows that require a quick response, it's often not a good fit.
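For completeness, here is a small sketch (not from the original post) of how you might wire the command into whatever scheduler you already run, using Django's call_command:

from django.core.management import call_command


def sync_orders_pending_payment_job() -> None:
    # Invoke the management command programmatically; schedule this function
    # with cron, Celery beat, or any other periodic job runner you use.
    call_command('sync_orders_pending_payment')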

Django Signals

So far we tried to trigger a change in the order from the payment it references. This caused a circular dependency, so we decided it's a bad idea. We then tried polling for changes, which proved reliable and simple but introduced unacceptable delays in the workflow.

To address these challenges, Django provides a signal dispatcher as a way to communicate between modules in the system:

Django includes a "signal dispatcher" which helps decoupled applications get notified when actions occur elsewhere in the framework. In a nutshell, signals allow certain senders to notify a set of receivers that some action has taken place.

Using the signal dispatcher, we can dispatch a signal and have one or more receivers subscribe to it. In our case, the payment process can send a signal when it completes, and the order can subscribe to it and update its status. Using signals, the payment module can communicate with other modules in the system without explicitly depending on them!

(figure: decoupling modules using signals)

First, define the signal:

# payment/signals.py.
from django.dispatch import Signal

payment_process_completed = Signal()

Next, send the signal when the payment completes:

diff --git i/payment/models.py w/payment/models.py
 from typing import Literal, Self
 from django.db import models, transaction

+from . import signals

@@ -13,23 +15,28 @@ class PaymentProcess(models.Model):
     @classmethod
     def set_status(cls, id: int, *, succeeded: bool) -> Self:
         with transaction.atomic():
             payment_process = cls.objects.select_for_update(of=('self', ), no_key=True).get(id=id)
             if payment_process.status not in {'initiated'}:
                 raise StateError()
             if succeeded:
                 payment_process.status = 'succeeded'
             else:
                 payment_process.status = 'failed'
             payment_process.save()

+            signals.payment_process_completed.send(
+                sender=None,
+                payment_process_id=payment_process.id,
+            )
+
         return payment_process

The order can now register a receiver that will be executed when the "payment_process_completed" signal is sent. This requires some minor adjustments on the receiving end:

diff --git a/order/models.py b/order/models.py
+from __future__ import annotations
 from django.db import models, transaction
 from payment.models import PaymentProcess
 from typing import Literal, Self, assert_never
+from django.dispatch import receiver

+import payment.signals

@@ -21,16 +24,22 @@ class Order(models.Model):
-    @classmethod
-    def on_payment_completed(cls, *, payment_process_id: int) -> Self:
+    @staticmethod
+    @receiver(payment.signals.payment_process_completed, dispatch_uid='1da6190f-0cf1-45e1-8481-0d1e27bf6e6f')
+    def on_payment_completed(payment_process_id: int, *args, **kwargs) -> Order | None:
         """Update the order status based on the payment process status."""
         with transaction.atomic():
-            order = (
-                cls.objects
-                .select_related('payment_process')
-                .select_for_update(of=('self', ), no_key=True)
-                .get(payment_process_id=payment_process_id)
-            )
+            try:
+                order = (
+                    Order.objects
+                    .select_related('payment_process')
+                    .select_for_update(of=('self', ), no_key=True)
+                    .get(payment_process_id=payment_process_id)
+                )
+            except Order.DoesNotExist:
+                # Not related to order.
+                return None
+
             if order.status not in {'pending_payment'}:
                 return order

Let's break it down:

We can now see it in action:

>>> o = Order.create(amount=150_00)
>>> print(vars(o))
{'id': 3, 'payment_process_id': 4, 'amount': 15000, 'status': 'pending_payment'}
>>> PaymentProcess.set_status(4, succeeded=True)
>>> o.refresh_from_db()
>>> print(vars(o))
{'id': 3, 'payment_process_id': 4, 'amount': 15000, 'status': 'completed'}

Notice how the state of the order changes to "completed" even though we did not explicitly call Order.on_payment_completed. This function was invoked implicitly when PaymentProcess.set_status dispatched the signal.

Using signals we can trigger changes in other modules without creating direct dependencies between them. As the documentation promised, signals allow us to keep modules decoupled. In our scenario, using signals, payment processes can trigger changes in orders without explicitly depending on them - problem solved!

This principle of keeping modules decoupled should also extend to how we name signals. It's tempting to name our signal something like "complete_order", but that creates an implicit dependency between the modules because the name implies intent - the payment process should not be aware of how its signal is being used. Instead, we name signals in a way that only reflects what happened, in our case "payment completed". Each receiver can then do whatever it wants with that!

Another advantage of signals is that they can have many receivers. If for example we have an "analytics" module and we want to keep track of how many payment processes succeeded or failed, we can simply register another receiver for the same signal and increment some counter.

In the next sections we are going to challenge the signals approach and demonstrate when it falls short of its promise!

Robust Django Signals

In the previous section we used signals as a way to communicate between two modules without creating an explicit dependency between them. But, did we really achieve that?

Consider what happens when a receiver encounters an error and raises an exception:

>>> o = Order.create(amount=160_00)
>>> PaymentProcess.set_status(o.payment_process_id, succeeded=True)
---------------------------------------------------------------------------
Exception                                 Traceback (most recent call last)
Cell In[2], line 1
----> 1 PaymentProcess.set_status(o.payment_process_id, succeeded=True)

File payment/models.py:37, in PaymentProcess.set_status(cls, id, succeeded)
File .venv/lib/python3.13/site-packages/django/dispatch/dispatcher.py:209, in Signal.send(self, sender, **named)
File order/models.py:31, in Order.on_payment_completed(payment_process_id, *args, **kwargs)
     27 @staticmethod
     28 @receiver(payment.signals.payment_process_completed, dispatch_uid='1da6190f-0cf1-45e1-8481-0d1e27bf6e6f')
     29 def on_payment_completed(payment_process_id: int, *args, **kwargs) -> Order | None:
     30     """Update the order status based on the payment process status."""
---> 31     raise Exception("on_payment_completed FAILED!!")
     32     with transaction.atomic():
     33         try:

Exception: on_payment_completed FAILED!!

Oh no! An error in the order caused the payment process to fail. We thought the payment process had nothing to do with orders anymore, but we were wrong! To keep modules truly decoupled we can't have exceptions from signal receivers propagate to the signal sender.

Django provides another way of sending a signal, one that does not propagate errors to the sender:

--- a/payment/models.py
+++ b/payment/models.py
@@ -34,7 +34,7 @@ class PaymentProcess(models.Model):
                 payment_process.status = 'failed'
             payment_process.save()

-            signals.payment_process_completed.send(
+            signals.payment_process_completed.send_robust(
                 sender=None,
                 payment_process_id=payment_process.id,
             )

The documentation for Signal.send_robust explains the difference very well:

send() differs from send_robust() in how exceptions raised by receiver functions are handled. send() does not catch any exceptions raised by receivers; it simply allows errors to propagate. Thus not all receivers may be notified of a signal in the face of an error.

send_robust() catches all errors derived from Python's Exception class, and ensures all receivers are notified of the signal. [...]

Using send_robust() we can make sure that our modules remain decoupled even when errors happen.
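As an aside (a sketch, not part of the article's code): send_robust() returns a list of (receiver, response) pairs, where response is the raised exception when a receiver failed, so the sender can at least log failures without being affected by them. This assumes the signals module and payment_process instance from the examples above:

import logging

logger = logging.getLogger(__name__)

responses = signals.payment_process_completed.send_robust(
    sender=None,
    payment_process_id=payment_process.id,
)
for receiver_fn, response in responses:
    if isinstance(response, Exception):
        # Log the failing receiver along with its exception traceback.
        logger.error('signal receiver %r failed', receiver_fn, exc_info=response)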

So, are we finally truly decoupled?

Django Signals and Database Transactions

In the previous section we found that we weren't as decoupled as we thought when a receiver raises an exception. We switched from Signal.send to Signal.send_robust, which doesn't propagate errors. So now we are no longer affected by anything the receivers do, right? Not really!

Imagine we have another module, "analytics", to keep track of metrics in our system. To count how many payment processes succeeded or failed, we set up this simple receiver:

# analytics/handlers.py
import urllib.parse
import urllib.request
from django.dispatch import receiver

from payment.models import PaymentProcess
import payment.signals

@receiver(payment.signals.payment_process_completed, dispatch_uid='a4e3cd9c-1314-40c1-8251-955c20dd5d93')
def on_payment_process_completed(payment_process_id: int, *args, **kwargs) -> None:
    status = PaymentProcess.objects.values_list('status', flat=True).get(id=payment_process_id)
    # urlopen() expects bytes for POST data, so encode the payload.
    data = urllib.parse.urlencode({'key': f'payment_process:{status}'}).encode()
    response = urllib.request.urlopen('https://myanalytics.com/metric/inc', data=data)
    if response.status != 200:
        raise Exception('Failed to increase metric')

The function fetches the status of the payment process and reports to some 3rd-party analytics service. We already know that if this fails the sender will not be affected. But what will happen if this request takes a very long time?

Receiver functions are called immediately by the signals framework when the signal is sent. This means where and when we send the signal is significant. This is where we send the payment_process_completed signal:

class PaymentProcess(models.Model):
    # ...
    @classmethod
    def set_status(cls, id: int, *, succeeded: bool) -> Self:
        with transaction.atomic():
            payment_process = cls.objects.select_for_update(of=('self', ), no_key=True).get(id=id)
            if payment_process.status not in {'initiated'}:
                raise StateError()
            if succeeded:
                payment_process.status = 'succeeded'
            else:
                payment_process.status = 'failed'
            payment_process.save()

            signals.payment_process_completed.send_robust(
                sender=None,
                payment_process_id=payment_process.id,
            )

        return payment_process

The signal is sent inside a database transaction. This can cause some problems:

The most straightforward solution here is to simply send the signal outside of the transaction:

--- a/payment/models.py
@@ -15,28 +15,28 @@ class PaymentProcess(models.Model):
     @classmethod
     def set_status(cls, id: int, *, succeeded: bool) -> Self:
         with transaction.atomic():
             payment_process = cls.objects.select_for_update(of=('self', ), no_key=True).get(id=id)
             if payment_process.status not in {'initiated'}:
                 raise StateError()
             if succeeded:
                 payment_process.status = 'succeeded'
             else:
                 payment_process.status = 'failed'
             payment_process.save()

-            signals.payment_process_completed.send_robust(
-                sender=None,
-                payment_process_id=payment_process.id,
-            )
+        signals.payment_process_completed.send_robust(
+            sender=None,
+            payment_process_id=payment_process.id,
+        )

         return payment_process

Sending the signal outside of the database transaction prevents prolonged transactions and issues caused by unexpected side effects; however, the solution is still not 100% reliable!

transaction.on_commit

Django provides a nice way of executing something only after the database transaction has completed successfully, without having to move the call down: transaction.on_commit. Using on_commit we can trust that the signal is only sent after the transaction has been successfully committed. If the transaction rolls back, the callable registered with on_commit will not be executed, and the signal will not be sent.
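A minimal sketch of what set_status() could look like with on_commit, assuming the model and signal from the earlier listings (the article moves on to a different mechanism below):

from functools import partial
from django.db import transaction

@classmethod
def set_status(cls, id: int, *, succeeded: bool) -> Self:
    with transaction.atomic():
        payment_process = cls.objects.select_for_update(of=('self', ), no_key=True).get(id=id)
        if payment_process.status not in {'initiated'}:
            raise StateError()
        payment_process.status = 'succeeded' if succeeded else 'failed'
        payment_process.save()

        # Registered inside the transaction, executed only after a successful commit;
        # discarded entirely if the transaction rolls back.
        transaction.on_commit(partial(
            signals.payment_process_completed.send_robust,
            sender=None,
            payment_process_id=payment_process.id,
        ))

    return payment_process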


Fault Tolerance

To understand how reliable our approach really is, we need to evaluate what happens when it fails at different points in the process. The easiest way of thinking about it is to imagine the server crashing while your process is running.

Simulating Failures

Consider the following places where the server might crash during the execution of the function:

@classmethod
def set_status(cls, id: int, *, succeeded: bool) -> Self:
    # πŸ’₯ Before the transaction
    with transaction.atomic():
        payment_process = cls.objects.select_for_update(of=('self', ), no_key=True).get(id=id)
        if payment_process.status not in {'initiated'}:
            raise StateError()
        if succeeded:
            payment_process.status = 'succeeded'
        else:
            payment_process.status = 'failed'

        payment_process.save()
        # πŸ’₯ Inside the transaction

    # πŸ’₯ After the transaction, before the signal is sent

    signals.payment_process_completed.send_robust(
        sender=None,
        payment_process_id=payment_process.id,
    )

    return payment_process

Let's analyze what happens if the server crashes in each of these points:

Our approach is not resilient to a server crash at every point in the process, so we have to consider it unreliable!

Atomicity

Database transactions provide atomicity - changes to multiple rows inside a single transaction are committed all at once or not at all. Ideally, we want the change to the payment and the following change to the order to be executed "all or nothing", otherwise we risk leaving the process in an inconsistent state.

In our case, the change to the order is triggered by the signal, which is sent outside the database transaction, so we cannot guarantee atomic execution of both changes. As a result, if the payment is updated and we crash before we send the signal, the system will charge the user but the order will never be marked as completed! You'll end up with very angry users, and for good reason.


Reliable Execution

We started by using Django signals to decouple modules. We then refined our implementation to minimize the impact of receivers on callers by sending signals outside the database transaction. As a result, we introduced scenarios that can leave the process in an inconsistent state.

Despite our best efforts so far, we are still left with a few significant problems:

None of these problems is new, but they are all rooted in the way Django signals work. An ideal solution should provide the following guarantees:

A lot of work and careful thought went into the Django signals framework, and the API is actually quite nice! So with that in mind, we'll try to adjust the execution mechanism for Django signals so that it's reliable and as compatible as possible with the existing framework.

Django Tasks

Django 6.0 introduces a new "Tasks Framework":

Background Tasks can offload work to be run outside the request-response cycle, to be run elsewhere, potentially at a later date. This keeps requests fast, reduces latency, and improves the user experience.

Django tasks in its initial release is mostly an interface - it comes with two built-in backends, dummy and immediate, which are mostly intended for debugging and development. The idea behind this approach is that developers can implement their own backends and have seamless integration with other applications using Django tasks.

One prominent backend that has been developed in parallel with the tasks framework is the DatabaseBackend of django-tasks. The database backend maintains a queue in a database table, and provides a worker implementation to dequeue and execute tasks. It also comes with a built-in retry mechanism and a nice admin panel.

Using a database queue we can make changes to database objects and enqueue tasks atomically.

This is the idea:

This approach checks all of our requirements!

Using Django Tasks

First, since we're using the new tasks framework and the database backend, we need to install Django version 6 and django-tasks:

$ uv add "Django>=6"
$ uv add django-tasks

Next, configure django-tasks and set the default backend to be the DatabaseBackend:

+++ settings.py
@@ -38,6 +38,8 @@ INSTALLED_APPS = [
     'order.apps.OrderConfig',
     'payment.apps.PaymentConfig',
+    'django_tasks',
+    'django_tasks.backends.database',
 ]

@@ -117,3 +119,10 @@ USE_TZ = True
+
+TASKS = {
+    "default": {
+        "BACKEND": "django_tasks.backends.database.DatabaseBackend",
+        "ENQUEUE_ON_COMMIT": False,
+    }
+}

Make sure you are using a PostgreSQL database backend:

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'djangoreliablesignals',
        'USER': 'djangoreliablesignals',
    }
}

Run the migrations to create the queue tables:

$ ./manage.py migrate

Tasks are executed by a worker process. This means in addition to the processes that run Django itself, you also need a worker process running in the background. In another shell:

$ ./manage.py db_worker

Make sure to check the options for the worker if this ever makes it to production!

Great, on to the actual implementation...

Execute Receivers as Django Tasks

A Django signal is essentially a registry of receiver functions. When you use the receiver decorator, the wrapped function is added to a list of receivers on the signal instance. When you send the signal, it iterates over its internal list of receivers and executes them.
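A tiny illustration of that registry idea, using a throwaway signal and receiver (hypothetical names, not part of the article's code):

from django.dispatch import Signal

demo_signal = Signal()


def demo_receiver(sender, **kwargs):
    return f"got {kwargs.get('payload')}"


# connect() adds the function to the signal's internal receiver registry;
# send() then calls each registered receiver and collects (receiver, response) pairs.
demo_signal.connect(demo_receiver, dispatch_uid='demo-receiver')
print(demo_signal.send(sender=None, payload=42))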

One of the limitations of a database queue is that to enqueue a task, you must save all of the necessary information for executing it to the database. This means you need to serialize all the information to JSON - this includes the arguments as well.

In our case, to execute a receiver function we need to be able to tell the worker what function to execute. Since we can't persist a function object to the database, we need to find another way of referencing it. One way to reference a function is to generate a string with the name of the module and the fully qualified name of the function:

from typing import Callable, Any

def callable_to_qualname(f: Callable[..., Any]) -> str:
    """Return the <module>::<qualname> identifier of a function."""
    return f'{f.__module__}::{f.__qualname__}'

The function produces a string that includes the module name and the fully qualified name of the function we want to reference:

>>> callable_to_qualname(Order.on_payment_completed)
'order.models::Order.on_payment_completed'

To get the function from the string, we implement the opposite function:

import importlib

def qualname_to_callable(qualname: str) -> Callable[..., Any]:
    """Get a callable from its <module>::<qualname> identifier."""
    module_name, func_qualname = qualname.split('::', 1)
    module = importlib.import_module(module_name)

    # Handle nested attributes (e.g., 'ClassName.method_name')
    obj = module
    for attr in func_qualname.split('.'):
        obj = getattr(obj, attr)

    return obj  # type: ignore[return-value]

Given the fully qualified name we generated, the function returns the callable:

>>> receiver = qualname_to_callable('order.models::Order.on_payment_completed')
>>> receiver
<function order.models.Order.on_payment_completed(payment_process_id: 'int', *args, **kwargs) -> 'Order | None'>

Now that we are able to persist a reference to our receiver function, we can create a task to execute an arbitrary receiver:

from collections.abc import Mapping
from django_tasks import task

@task()
def execute_task_signal_receiver(
    *,
    receiver_qualname: str,
    named: Mapping[str, object],
) -> None:
    receiver = qualname_to_callable(receiver_qualname)
    receiver(signal=None, sender=None, **named)

This registers a new django-tasks task that accepts a receiver qualified name and arguments, and executes it. Simple as that!

To change the way signals are sent, we provide an alternative implementation of a Django Signal that, instead of executing receivers immediately, enqueues a task for each one:

# reliable_signal/__init__.py
from django.dispatch import Signal as DjangoSignal
from django.dispatch.dispatcher import NO_RECEIVERS

class Signal(DjangoSignal):
    """A django-workers-capable signal."""

    def send_reliable(self, sender: None, **named) -> None:
        """Like send_robust(), but enqueues a task for each registered receiver."""
        if not self.receivers:
            return
        if self.sender_receivers_cache.get(sender) is NO_RECEIVERS:
            return
        sync_receivers, async_receivers = self._live_receivers(sender)
        assert not async_receivers, 'Async receivers not supported by task'
        for receiver in sync_receivers:
            execute_task_signal_receiver.enqueue(
                receiver_qualname=callable_to_qualname(receiver),
                named=named,
            )

Our reliable signal extends Django's built-in Signal class and adds a method called send_reliable. It works like send_robust, but instead of executing the receiver functions immediately, it enqueues a task for each receiver. We discuss this approach further later on.

Finally, to adjust our code, all we need to do is switch to our new reliable signal and replace send_robust with send_reliable:

diff --git i/payment/signals.py w/payment/signals.py
@@ -1,3 +1,3 @@
-from django.dispatch import Signal
+from reliable_signal import Signal

 payment_process_completed = Signal()

diff --git i/payment/models.py w/payment/models.py
@@ -34,9 +34,9 @@ class PaymentProcess(models.Model):
                 payment_process.status = 'failed'
             payment_process.save()

-        signals.payment_process_completed.send_robust(
-            sender=None,
-            payment_process_id=payment_process.id,
-        )
+            signals.payment_process_completed.send_reliable(
+                sender=None,
+                payment_process_id=payment_process.id,
+            )

         return payment_process

Notice that we enqueue the task inside the database transaction. Previously, we said this might cause some issues, but with a database-backed queue you actually do want to enqueue the task inside the sender's transaction. This way, the task will be executed only after the sender commits. If the sender rolls back, the task is never enqueued and will not be executed.
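
With the database backend, enqueueing a task is just an INSERT into the tasks table, so it participates in the surrounding transaction. A minimal sketch of the idea (the function is hypothetical and simplified from the model method above):

from django.db import transaction
from payment import signals

def complete_payment_sketch(payment_process, succeeded: bool) -> None:
    with transaction.atomic():
        payment_process.status = 'succeeded' if succeeded else 'failed'
        payment_process.save()
        # The task row is written in the same transaction as the status change.
        signals.payment_process_completed.send_reliable(
            sender=None,
            payment_process_id=payment_process.id,
        )
    # Only after the transaction commits can the worker see the task row;
    # a rollback discards both the status change and the queued task.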

Now we are ready to test this out. Make sure you have a worker running and execute this in the Django shell:

>>> o = Order.create(amount=170_00)
>>> PaymentProcess.set_status(o.payment_process_id, succeeded=True)
<PaymentProcess: PaymentProcess object (7)>
>>> o.refresh_from_db()
>>> o.status
'completed'

Amazing! Shortly after we set the status of the payment process, the worker picked up the task and updated the status of the order. You can see it in the worker logs as well:

$ ./manage.py db_worker
Watching for file changes with StatReloader
Starting worker worker_id=4tLA6TEzAdIZ7W620DrVnHuC342wiDfs queues=default
Task id=c34fa024-db4d-41b4-b875-723b2436a346 path=reliable_signal.execute_task_signal_receiver_simple state=RUNNING
Task id=c34fa024-db4d-41b4-b875-723b2436a346 path=reliable_signal.execute_task_signal_receiver_simple state=SUCCEEDED

We now have a reliable execution engine for Django signals.

Testing Reliable Django Signals

When we test workflows, we usually don't care much about the execution engine, but rather about the business logic. We also want to keep our test suite as simple and deterministic as possible - this means we don't want to execute receivers in another worker; we want to execute them immediately.

If you recall, we mentioned that Django comes with two built-in backends for testing and development. One of them is the ImmediateBackend. This backend will execute tasks immediately when they are enqueued - exactly what we need in tests.

from django.test import TestCase, override_settings
from order.models import Order
from payment.models import PaymentProcess

@override_settings(TASKS={'default': {'BACKEND': 'django_tasks.backends.immediate.ImmediateBackend'}})
class OrderTestCase(TestCase):
    def test_order_happy_path(self):
        order = Order.create(amount=100_00)
        self.assertEqual(order.status, 'pending_payment')
        self.assertEqual(order.amount, 100_00)
        self.assertEqual(order.payment_process.status, 'initiated')
        self.assertEqual(order.payment_process.amount, 100_00)

        # This will trigger the signal, which should execute the receiver immediately
        PaymentProcess.set_status(order.payment_process_id, succeeded=True)
        order.refresh_from_db()
        self.assertEqual(order.status, 'completed')
        self.assertEqual(order.payment_process.status, 'succeeded')

This tests the common "happy path" for an order: create an order, the payment succeeds, and the order status is updated. To provide an alternative task backend during the test, we use the @override_settings decorator with the path to the built-in ImmediateBackend.
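
As an alternative to decorating each test case, the backend could be configured once in a settings module used only by the test runner; a sketch, assuming a settings/test.py layout that is not part of the article:

# settings/test.py - hypothetical test-only settings module
from .base import *  # noqa: F403

TASKS = {
    'default': {
        'BACKEND': 'django_tasks.backends.immediate.ImmediateBackend',
    },
}

Either way, tasks run inline, so the receiver's effects are visible as soon as the enqueueing call returns.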

Running the test:

$ ./manage.py test
Found 1 test(s).
Creating test database for alias 'default'...
System check identified no issues (0 silenced).
.
----------------------------------------------------------------------
Ran 1 test in 0.058s

OK
Destroying test database for alias 'default'...

Great! We now have reliable execution that we can easily test.

Reliable Signals Limitations

Reliable signals provide great benefits, but they also come with some restrictions and limitations:

Due to these limitations, we think it is crucial that reliable signals co-exist with the built-in signal system:

Future Work

The current implementation is mostly offered as a reference. While operational under the restrictions mentioned above, there are a few bits we did not address:


Final Thoughts

We have a lot of custom workflows in our systems - this is really most of what we do! We use Django signals extensively in situations where two decoupled modules need to communicate with each other. As our system grew, we experienced first-hand the issues that can come from having unreliable signals. Eventually, we developed our own database task queue and integrated it with Django signals. So far it's been working pretty well with moderate traffic.

This article was motivated by our pains and lessons learned from implementing reliable signals in our systems. The release of the Django tasks framework (and its backends) is surely a welcome addition for the increasing number of large systems built with Django that need reliable and durable workflows.

30 Oct 2025 3:00am GMT

27 Oct 2025

feedDjango community aggregator: Community blog posts

Django Survey 2025 - Jeff Triplett

🔗 Links

📦 Projects

📚 Books

🎥 YouTube

Sponsor

This episode was brought to you by Buttondown, the easiest way to start, send, and grow your email newsletter. New customers can save 50% off their first year with Buttondown using the coupon code DJANGO.

27 Oct 2025 6:00pm GMT

24 Oct 2025

feedDjango community aggregator: Community blog posts

Django News - Django 6.0 beta 1 released - Oct 24th 2025

News

Django 6.0 beta 1 released

Django 6.0 beta 1 is now available. It represents the second stage in the 6.0 release cycle and is an opportunity to try out the changes coming in Django 6.0.

djangoproject.com

PyCharm & Django annual fundraiser

JetBrains and the Django Software Foundation have launched their annual "Buy PyCharm, Support Django" fundraiser, running from October 23 to November 11, 2025, offering 30% off PyCharm with all proceeds donated to support Django's development and community programs.

djangoproject.com

Announcing Python Software Foundation Fellow Members for Q3 2025! 🎉

Quite a few friends of Django are newly-announced Fellows!

blogspot.com

CPython Core Dev Sprint 2025 at Arm Cambridge: The biggest one yet

For one week, Arm's Cambridge headquarters became the heart of Python development. Contributors from around the world came together for the CPython Core Developer Sprint. It was the largest gathering in the project's history, with 35 core developers and 13 invited guests collaborating in person.

blogspot.com

Updates to Django

Today, "Updates to Django" is presented by Raffaella from Djangonaut Space! πŸš€

Last week we had 26 pull requests merged into Django by 15 different contributors - including 4 first-time contributors! Congratulations to Lev Zlobin, Segni Mekonnen, Augusto Pontes and aj2s for having their first commits merged into Django - welcome on board!

News for this week:

Fixed a bug in Django 5.2 where QuerySet.first() and QuerySet.last() raised an error on querysets performing aggregation that selected all fields of a composite primary key.

In Django 6.0:

In Django 6.1:

Python 3.14 is now supported in Django 6.0 and Django 5.2

Django Newsletter

Wagtail CMS

Wagtail 7.2rc1 and 7.1.2 Released

🪶 Wagtail 7.2 Release Candidate 1 adds Python 3.14 support, drops Python 3.8, introduces new admin keyboard shortcuts, a usage count filter, and improved comment handling.

🛠️ Wagtail 7.1.2 is a maintenance release with fixes for label formatting, userbar loading on multi-site setups, header icon handling, cross-origin content metrics, and a small documentation update.

Django Newsletter

Sponsored Link 1

Until November 9, 2025, get PyCharm for 30% off. All money goes to the Django Software Foundation!

This annual promotion has raised over $330,000 for the Django Software Foundation over the years, by far the single biggest fundraiser for Django. If you're interested in trying out PyCharm Pro for the first time, this is the way to do it.

jetbrains.com

Articles

My favorite Django packages - Matthias Kestenholz

Some old classics as well as a few newer/slightly more obscure picks in this list from Matthias. Worth a read!

406.ch

How Functional Programming Shaped (and Twisted) Frontend Development

A thoughtful essay on how functional programming principles like immutability, purity, and determinism reshaped modern frontend development.

alfy.blog

Per-object Permissions for Elasticsearch Lists in Django Websites

Aidas Bendoraitis explains how to implement efficient per-object permissions in Elasticsearch-powered Django list views using django-guardian and django-elasticsearch-dsl.

djangotricks.com

Using Async Functions in Celery with Django Connection Pooling

A deeply technical walkthrough by Don Brown showing how to properly run async Django code inside Celery tasks, using ThreadSensitiveContext to manage connection pooling and cleanup, bridging the gap between Django's async ORM and Celery's sync execution model.

blogspot.com

[2401.06889] Invisible Labor in Open Source Software Ecosystems

An academic article worth reading examining all the invisible labor in open source.

arxiv.org

My First DjangoCon Africa 2025 Experience: A Chaos Engineering Story.

Impressions from a speaker and first-time attendee at DjangoCon Africa 2025.

scribe.rip

Django Fellow Report

Django Fellow Report - Natalia

A security-heavy week with a steady flow of incoming reports keeping things quite busy (and sadly not that fun). The CNA process also moved forward, with hands-on testing and API study taking a fair share of focus. I also started work on the release checklist generator to update the CVE management process, in preparation for when CNA status is fully confirmed.

Add to that a full lineup of meetings and follow-ups, and it made for a packed but hopefully productive week. The new auto-magic roadmap pages also landed in djangoproject.com, with links from the Download page: this reduces the manual work required for future feature freezes/alpha releases.

djangoproject.com

Django Fellow Report - Jacob

I helped land two major 6.1 features this week: model field fetching modes, and database-level delete options. I also advanced some reviews for Djangonaut Space participants.

djangoproject.com

Videos

"Django, what the JOIN?"

Simon Charette presents his talk, "Django, what the JOIN?" to the Djangonaut Space 2025 Session 5 team.

djangotv.com

Sponsored Link 2

AI-Powered Django Development & Consulting

REVSYS specializes in integrating powerful AI technologies, including GPT-5, directly into your Django applications. We help bring modern, intelligent features to your project that boost user engagement and streamline content workflows.

revsys.com

Podcasts

Django Chat #187: Django on the Med - Paolo Melchiorre

Paolo and Carlton have just returned from the inaugural Django on the Med event and are here to discuss how it came to pass, the code improvements from just three days, and plans for the future.

djangochat.com

Episode #454 It's some form of Elvish - [Python Bytes Podcast]

A reference to Emma Levit's new djrest2 library, a small and simple REST library for Django based on class-based views.

pythonbytes.fm

Django News Jobs

This week's Django job picks span academia, startups, and cutting-edge tech. From a lead engineering role at the University of Michigan to opportunities in AI and health tech, there's something for every Django developer ready for their next move.

Software Engineer Lead at Center for Academic Innovation, University of Michigan 🆕

Part-Time Senior Full-Stack Engineer (Python/Django) (gn) at voiio 🆕

Founding Backend Engineer (On-site San Francisco) - Python • AWS • LLM/RAG at Purrfect Hire

Senior Python Developer at Basalt Health

Senior Software Engineer (Python and Solidity) at LiquidFi

Django/Python Full-stack Engineer at JoinTriple.com

Django Newsletter

Projects

CuriousLearner/django-keel

A versatile, production-ready Django project template for any use case. Build SaaS applications, API backends, web apps, or internal tools with one template.

github.com

timonweb/easy-django-cli

A modern CLI tool that simplifies Django development by replacing python manage.py and django-admin commands with simpler django or dj commands.

github.com

marlenezw/django-girls-offline

An offline version of the Django Girls tutorial.

github.com

adamghill/dj-spinners

Pure SVG loading spinners for Django.

github.com

Sponsorship

🔖 Sponsor Django News for Q3 2025!

Each week, Django News lands in the inboxes of almost 4,300 Django developers. Our 52% open rate and 15% click-through rate show just how engaged our readers are. Want to reach developers who actually read and click?

Sponsor an issue and get your product, service, or job in front of them.

👉 See sponsorship options

django-news.com


This RSS feed is published on https://django-news.com/. You can also subscribe via email.

24 Oct 2025 3:00pm GMT

23 Oct 2025

feedDjango community aggregator: Community blog posts

Weeknotes (2025 week 43)


I published the last weeknotes entry in the first half of September.

Drama in OSS

I have been following the Ruby gems debacle a bit. Initially at Feinheit we used our own PHP-based framework swisdk2 to build websites. This obviously didn't scale and I was very annoyed with PHP, so I was looking for alternatives.

I remember comparing Ruby on Rails and Django, and decided to switch from PHP/swisdk2 to Python/Django for two reasons: the automatically generated admin interface, and the fact that Ruby source code just had too many punctuation characters for my taste. It's a very whimsical reason and I don't put any weight on it. That being said, given how some of the exponents in Ruby/Rails land behave, I'm very very glad to have chosen Python and Django. While not everything is perfect (it never is), at least those communities agree that trying to behave nicely to each other is something to be cheered and not something to be sneered at.

Copilot

I assigned some GitHub issues to Copilot. The result wasn't very useful. I don't know if I want to repeat it; local tools work fine for when I really need them.

Python and Django compatibility

It's that time again to update the GitHub Actions matrix and Trove classifiers. I do not like doing it. You can expect all maintained packages to be compatible with the latest and best versions, no upper bounds necessary. Man, if only AI could automate those tasks…

Updated packages since 2025-09-10

23 Oct 2025 5:00pm GMT

22 Oct 2025

feedDjango community aggregator: Community blog posts

My favorite Django packages


Inspired by other posts, I also wanted to write up a list of my favorite Django packages. Since I've been working in this space for so long, and since I'm maintaining quite a large list of packages, I worry a bit about tooting my own horn too much here; that said, the reasons for choosing some packages hopefully speak for themselves.

Also, I'm sure I'm forgetting many many packages here. Sorry for that in advance.

Core Django

Data structures

CMS building

I have been working on FeinCMS since 2009. So, it shouldn't surprise anyone that this is still my favorite way to build a CMS on top of Django. I like that it's basically a thin layer on top of Django's administration interface and doesn't want to take over the whole admin interface or even the whole website.

Working with external content

PDF generation

Testing and development

Last but not least, I really like django-debug-toolbar - so much that I've even been helping with its maintenance since 2016.

Serving

We mostly use Kubernetes to serve websites these days. Inside the pods, I'm working with the granian RSGI/ASGI server and with blacknoise for serving static files.

22 Oct 2025 5:00pm GMT

An Introduction of sorts

I realised this week after a short conversation on Mastodon that I haven't ever shared my personal Django journey, so here goes!

As a recent graduate in 2012, I first encountered Django while working at Ocado Technology, tasked with building internal tools for other developers. I was shown the ropes of Django by Ben Cardy (@benbacardi) and Mike Bryant. Essentially this was about putting a frontend on some scripts which would provision users and allow them to upload their SSH keys. This progressed to automating application provisioning with some hackery using the rubypython package, if I remember correctly, and storing data in an LDAP database. I started using Django 1.4, explored packaging these projects into .deb files, set up an internal PyPI instance, and tried to create a unified UI package across multiple projects. Finally, we did start to open source a few packages on GitHub.

2015 saw me leave Ocado to join a small charity startup using Django. Here I joined Bruno Alla (@browniebroke) building out a Django application hosted on Heroku. For a long time it was just the two of us as developers, but this eventually grew to a team of six. Again we published a few packages that we could split from the main codebase, although most of these have not grown in popularity. It was around this time that I was becoming more aware of the mailing lists and contributing back to packages we used in the ecosystem.

I left the startup in 2019 to go freelance, setting up Software Crafts, my own limited company. At the time I still wasn't sure what my ideal client would be and agonised over this for years before realising Django was the through line of my career and my passion when it came to building software! I have had numerous clients over the last six years, mostly small teams or startups, either building something fresh or continuing from where other developers had left off with little to zero handover. During this time I realised the community had started a Discord server, so I joined in July 2022 and started answering questions and helping out.

The next year, in 2023, I was asked to be a moderator and also attended DjangoCon Europe for the first time, which was an absolute blast. It was in this year that I started hosting a monthly Django Social event in Cambridge, becoming much more involved in the community generally. I also started donating regularly to the DSF in mid 2023. 2024 saw me start contributing in the form of code when I applied and got into one of the Djangonaut Sessions.

Towards the end of 2024 I proposed the Online Community Working Group, which, after many rounds of comments and iteration, was approved earlier this year. We are still getting the ball rolling on this, but I hope to have an announcement before the year finishes! I also had the opportunity to attend DjangoCon Europe again this year, where it was again lovely to meet new friends from online and old friends from the last conference, as well as take the stage for a lightning talk on the last day. I was also recently made an admin of the Discord server! In terms of my career, 2024 saw me accept the opportunity to become co-founder of a new startup, which I have been building this year. This, of course, is built using Django and leverages the latest patterns (HTMX & partials, etc.).

I see a bright future for Django. I'm going to continue to contribute my time, energy, and finances to the community, currently focusing on improving our online spaces, which are key for those who cannot join the conferences or other in-person meetups (of which there are many). I also hope to get to more in-person events myself in due course!

22 Oct 2025 5:00am GMT

21 Oct 2025

feedDjango community aggregator: Community blog posts

Django on the Med - Paolo Melchiorre

🔗 Links

📦 Projects

📚 Books

🎥 YouTube

Sponsor

This episode was brought to you by HackSoft, your development partner beyond code. From custom software development to consulting, team augmentation, or opening an office in Bulgaria, they're ready to take your Django project to the next level!

21 Oct 2025 5:00pm GMT