13 Feb 2026

Django community aggregator: Community blog posts

Django News - The Post-Heroku Django World - Feb 13th 2026

News

Django Steering Council 2025 Year in Review

They've been busy! A new-features repo, Community Ecosystem page, administrative bits, and more.

djangoproject.com

Read the Docs: Making search faster for all projects

Read the Docs massively improved search latency by reindexing into multiple shards, tuning Elasticsearch queries and client, and fixing Django ORM N+1s and caching.

readthedocs.com

Releases

Python Insider: Python 3.15.0 alpha 6

Python 3.15.0a6 preview highlights a new low-overhead sampling profiler, UTF-8 default encoding, JIT performance gains, unpacking in comprehensions, and typing improvements.

blogspot.com

Python Software Foundation

Python is for Everyone

Georgi from the PSF Diversity and Inclusion Working Group talks about the history of these efforts and, most importantly, why it matters for all of us.

georgiker.com

Django Fellow Reports

Fellow Report - Natalia

3 tickets triaged, 2 reviewed, 1 authored, security work, and other misc.

djangoproject.com

Fellow Report - Jacob

8 tickets triaged, 18 reviewed, 6 authored, 2 discussed, and other misc.

djangoproject.com

Wagtail CMS News

Wagtail nominated for TWO CMS Critic Awards! πŸ†

Wagtail CMS is up for some trophies.

wagtail.org

Updates to Django

Today, "Updates to Django" is presented by Hwayoung from Djangonaut Space! πŸš€

Last week we had 11 pull requests merged into Django by 8 different contributors - including 2 first-time contributors! Congratulations to Patryk Bratkowski and ar3ph for having their first commits merged into Django - welcome on board!

Among the merged work: fixed horizontal form field alignment issues within <fieldset> in the admin. (#36788)

Django Newsletter

Sponsored Link 1

PyTV - Free Online Python Conference (March 4th)

1 Day, 15 Speakers, 6 hours of live talks including from Sarah Boyce, Sheena O'Connell, Carlton Gibson, and Will Vincent. Sign up and save the date!

jetbrains.com

Articles

Django Developer Salary Report 2026

An annual report from Foxley Talent on what's actually happening in the market.

foxleytalent.com

Sorting Strategies for Optional Fields in Django

How to control NULL value placement when sorting Django QuerySets using F() expressions.

blog.maksudul.bd

How to dump Django ORM data to JSON while debugging?

Sometimes, I need to debug specific high-level tests by inspecting what gets created in the database as a side effect. I could use a debugger and poke around the Django ORM at a breakpoint - but quite often it's simply faster to dump the entire table to JSON, see what's there, and then apply fixes accordingly.

github.io

Introducing: Yapping, Yet Another Python Packaging (Manager)

Yapping automates adding dependencies to pyproject.toml and running pip-tools compile/install, providing a simple, non-lockfile Python dependency workflow for Django projects.

jovell.dev

Python: introducing icu4py, bindings to the Unicode ICU library

icu4py provides Pythonic bindings to ICU4C for locale-aware text boundary analysis and MessageFormat pluralization, enabling precise internationalization in Django apps.

adamj.eu

Loopwerk: It's time to leave Heroku

Heroku is winding down; migrate Django apps now to alternatives like Fly.io, Render, or self-hosted Coolify and Hetzner to regain control, reliability, and lower costs.

loopwerk.io

Heroku Is (Finally, Officially) Dead

Analyzing the official announcement and reviewing hosting alternatives in 2026.

wsvincent.com

Videos

django-bolt - Rust-powered API Framework for Django

An overview from BugBytes on the new django-bolt package, describing what it is and how to use it!

youtube.com

Sponsored Link 2

Sponsor This Newsletter!

Reach 4,300+ highly-engaged and experienced Django developers.

django-news.com

Podcasts

Django Chat #195: Improving Django with Adam Hill

Adam is the co-host of the Django Brew podcast and a prolific contributor to the Django ecosystem, author of a multitude of Django projects including django-unicorn, coltrane, dj-angles, and many more.

djangochat.com

Django Job Board

Lead Backend Engineer at TurnTable πŸ†•

Python Developer REST APIs - Immediate Start at Worx-ai

Backend Software Developer at Chartwell Resource Group Ltd.

Senior Django Developer at SKYCATCHFIRE

Django Newsletter

Projects

JohananOppongAmoateng/django-migration-audit

A forensic Django tool that verifies whether a live database schema is historically consistent with its applied migrations.

github.com

G4brym/django-cf

A set of tools to integrate Django with Cloudflare Developer platform.

github.com

DjangoAdminHackers/django-linkcheck

An app that analyzes and reports on links in any model you register with it. Links can be bare (URLs or image and file fields) or embedded in HTML (linkcheck handles the parsing). It's fairly easy to override methods of the Linkcheck object should you need anything more complicated (like generating URLs from slug fields, etc.).

github.com


This RSS feed is published on https://django-news.com/. You can also subscribe via email.

13 Feb 2026 5:00pm GMT

Use your Claude Max subscription as an API with CLIProxyAPI

So here's the thing: I'm paying $100/month for Claude Max. I use it a lot, it's worth it. But then I wanted to use my subscription with my Emacs packages - specifically forge-llm (which I wrote!) for generating PR descriptions in Forge, and magit-gptcommit for auto-generating commit messages in Magit. Both packages use the llm package, which supports OpenAI-compatible endpoints.

The problem? Anthropic blocks OAuth tokens from being used directly with third-party API clients. You have to pay for API access separately. πŸ€”

That felt wrong. I'm already paying for the subscription, why can't I use it however I want?

Turns out, there's a workaround. The Claude Code CLI can use OAuth tokens. So if you put a proxy in front of it that speaks the OpenAI API format, you can use your Max subscription with basically anything that supports OpenAI endpoints. And that's exactly what CLIProxyAPI does.

Your App (Emacs llm package, scripts, whatever)
↓
HTTP Request (OpenAI format)
↓
CLIProxyAPI
↓
OAuth Token (from your Max subscription)
↓
Anthropic API
↓
Response β†’ OpenAI format β†’ Your App

No extra API costs. Just your existing subscription. Sweet!
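Conceptually, the proxy's job is the reshaping step in the middle of that diagram: accept an OpenAI-format request, forward it to Anthropic, and translate the response back. A toy sketch of the request-side idea (this is NOT CLIProxyAPI's actual code, just an illustration of the format difference):

```python
# Illustrative only: reshape an OpenAI chat/completions body into the
# rough shape of Anthropic's Messages API. The real proxy handles far
# more (auth, streaming, tool calls, error mapping).

def openai_to_anthropic(body: dict) -> dict:
    """Reshape an OpenAI-format chat request for Anthropic's API."""
    # Anthropic takes the system prompt as a top-level field,
    # not as a "system" message inside the messages list.
    system = [m["content"] for m in body["messages"] if m["role"] == "system"]
    messages = [m for m in body["messages"] if m["role"] != "system"]
    out = {
        "model": body["model"],
        "messages": messages,
        # Anthropic requires max_tokens; OpenAI treats it as optional.
        "max_tokens": body.get("max_tokens", 1024),
    }
    if system:
        out["system"] = "\n".join(system)
    return out

req = {
    "model": "claude-sonnet-4-20250514",
    "messages": [
        {"role": "system", "content": "Be brief."},
        {"role": "user", "content": "Say hello."},
    ],
}
print(openai_to_anthropic(req))
```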

Why CLIProxyAPI and not something else?

I actually tried claude-max-api-proxy first. It worked! But the model list was outdated (no Opus 4.5, no Sonnet 4.5), it's a Node.js project that wraps the CLI as a subprocess, and it felt a bit… abandoned.

CLIProxyAPI is a completely different story: a single binary with a YAML config, an up-to-date model list, and an active community behind it.

What you'll need

A Claude Max subscription and a machine to run the proxy on (the steps below cover Linux and macOS).

Installation

Linux

There's a community installer that does everything for you: downloads the latest binary to ~/cliproxyapi/, generates API keys, creates a systemd service:

curl -fsSL https://raw.githubusercontent.com/brokechubb/cliproxyapi-installer/refs/heads/master/cliproxyapi-installer | bash

If you're on Arch (btw):

yay -S cli-proxy-api-bin

macOS

Homebrew. Easy:

brew install cliproxyapi

Authenticating with Claude

Before the proxy can use your subscription, you need to log in:

# Linux
cd ~/cliproxyapi
./cli-proxy-api --claude-login

# macOS (Homebrew)
cliproxyapi --claude-login

This opens your browser for the OAuth flow. Log in with your Claude account, authorize it, done. The token gets saved to ~/.cli-proxy-api/.

If you're on a headless machine, add --no-browser and it'll print the URL for you to open elsewhere:

./cli-proxy-api --claude-login --no-browser

Configuration

The installer generates a config.yaml with random API keys. These are keys that clients use to authenticate to your proxy, not Anthropic keys.

Here's what I'm running:

# Bind to localhost only since I'm using it locally
host: "127.0.0.1"

# Server port
port: 8317

# Authentication directory
auth-dir: "~/.cli-proxy-api"

# No client auth needed for local-only use
api-keys: []

# Keep it quiet
debug: false

The important bit is api-keys: []. Setting it to an empty list disables client authentication, which means any app on your machine can hit the proxy without needing a key. This is fine if you're only using it locally.

If you're exposing the proxy to your network (e.g., you want to hit it from your phone or another machine), keep the generated API keys and also set host: "" so it binds to all interfaces. You don't want random people on your network burning through your subscription.
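For the network-exposed case, here's how a client would pass one of those generated keys, assuming CLIProxyAPI follows the standard OpenAI convention of a Bearer token in the Authorization header (a sketch; `sk-generated-key` is a placeholder):

```python
# Sketch: building client headers for the proxy. With `api-keys: []`
# in config.yaml no key is needed; with keys configured, pass one of
# the generated keys as a Bearer token (the usual OpenAI-style auth).

def proxy_headers(api_key=None):
    """Build request headers for CLIProxyAPI."""
    headers = {"Content-Type": "application/json"}
    if api_key:
        headers["Authorization"] = f"Bearer {api_key}"
    return headers

# Local-only setup (api-keys: []): no auth header required.
print(proxy_headers())
# Network-exposed setup: pass a key from config.yaml.
print(proxy_headers("sk-generated-key"))
```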

Starting the service

Linux (systemd)

The installer creates a systemd user service for you:

systemctl --user enable --now cliproxyapi.service
systemctl --user status cliproxyapi.service

Or just run it manually to test first:

cd ~/cliproxyapi
./cli-proxy-api

macOS (Homebrew)

brew services start cliproxyapi

Testing it

Let's make sure everything works:

# List available models
curl http://localhost:8317/v1/models

# Chat completion
curl -X POST http://localhost:8317/v1/chat/completions \
 -H "Content-Type: application/json" \
 -d '{
 "model": "claude-sonnet-4-20250514",
 "messages": [{"role": "user", "content": "Say hello in one sentence."}]
 }'

# Streaming (note the -N flag to disable curl buffering)
curl -N -X POST http://localhost:8317/v1/chat/completions \
 -H "Content-Type: application/json" \
 -d '{
 "model": "claude-sonnet-4-20250514",
 "messages": [{"role": "user", "content": "Say hello in one sentence."}],
 "stream": true
 }'

If you get a response from Claude, you're golden. πŸŽ‰
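If you want to consume that streaming output from a script instead of eyeballing curl, the events follow the standard OpenAI server-sent-events shape: each line is `data: {json}` and the stream ends with `data: [DONE]`. A small parser for that format (assuming the proxy matches the standard shape, which the curl example suggests):

```python
# Sketch: join the content deltas from an OpenAI-style chat-completions
# SSE stream into the full reply text.
import json

def collect_stream(lines):
    """Concatenate delta content from `data: ...` event lines."""
    parts = []
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank keep-alive lines
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break  # end-of-stream sentinel
        chunk = json.loads(payload)
        delta = chunk["choices"][0].get("delta", {})
        parts.append(delta.get("content", ""))
    return "".join(parts)

# A canned sample in the shape the proxy emits:
sample = [
    'data: {"choices":[{"delta":{"content":"Hello"}}]}',
    'data: {"choices":[{"delta":{"content":" there!"}}]}',
    "data: [DONE]",
]
print(collect_stream(sample))  # Hello there!
```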

Using it with Emacs

This is the fun part. Both forge-llm and magit-gptcommit use the llm package for their LLM backend. The llm package has an OpenAI-compatible provider, so we just need to point it at our proxy.

Setting up the llm provider

First, make sure you have the llm package installed. Then configure an OpenAI provider that points to CLIProxyAPI:

(require 'llm-openai)

(setq my/claude-via-proxy
 (make-llm-openai-compatible
 :key "not-needed"
 :chat-model "claude-sonnet-4-20250514"
 :url "http://localhost:8317/v1"))

That's it. That's the whole LLM setup. Now we can use it everywhere.

forge-llm (PR descriptions)

I wrote forge-llm to generate PR descriptions in Forge using LLMs. It analyzes the git diff, picks up your repository's PR template, and generates a structured description. To use it with CLIProxyAPI:

(use-package forge-llm
 :after forge
 :config
 (forge-llm-setup)
 (setq forge-llm-llm-provider my/claude-via-proxy))

Now when you're creating a PR in Forge, you can hit SPC m g (Doom) or run forge-llm-generate-pr-description and Claude will write the description based on your diff. Using your subscription. No API key needed.

magit-gptcommit (commit messages)

magit-gptcommit does the same thing but for commit messages. It looks at your staged changes and generates a conventional commit message. Setup:

(use-package magit-gptcommit
 :after magit
 :config
 (setq magit-gptcommit-llm-provider my/claude-via-proxy)
 (magit-gptcommit-mode 1)
 (magit-gptcommit-status-buffer-setup))

Now in the Magit commit buffer, you can generate a commit message with Claude. Again, no separate API costs.

Any other llm-based package

The beauty of the llm package is that any Emacs package that uses it can benefit from this setup. Just pass my/claude-via-proxy as the provider. Some other packages that use llm: ellama, ekg, llm-refactoring. They'll all work with your Max subscription through the proxy.

Using it with other tools

Since CLIProxyAPI speaks the OpenAI API format, it works with anything that supports custom OpenAI endpoints. The magic three settings are always the same: the base URL (http://localhost:8317/v1), a dated Claude model name, and a placeholder API key.

Here's a Python example using the OpenAI SDK:

from openai import OpenAI

client = OpenAI(
 base_url="http://localhost:8317/v1",
 api_key="not-needed"
)

response = client.chat.completions.create(
 model="claude-sonnet-4-20250514",
 messages=[{"role": "user", "content": "Hello!"}]
)

print(response.choices[0].message.content)

Available models

CLIProxyAPI exposes all models available through your subscription. The names use the full dated format. You can always check the list with:

curl -s http://localhost:8317/v1/models | jq '.data[].id'

At the time of writing, you'll get Claude Opus 4, Sonnet 4, Sonnet 4.5, Haiku 4.5, and whatever else Anthropic has made available to Max subscribers.
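Because the ids carry that YYYYMMDD date suffix, a small helper can pick the newest snapshot of a model family out of the /v1/models output (a sketch; the ids below just illustrate the dated format):

```python
# Sketch: choose the most recently dated id for a model family.
# Dated ids look like "claude-sonnet-4-20250514"; the YYYYMMDD suffix
# sorts correctly as a plain string.

def newest(model_ids, family):
    """Return the most recently dated id starting with `family`, or None."""
    candidates = [m for m in model_ids if m.startswith(family)]
    return max(candidates, key=lambda m: m.rsplit("-", 1)[-1], default=None)

ids = [
    "claude-sonnet-4-20250514",
    "claude-opus-4-20250514",
]
print(newest(ids, "claude-sonnet"))  # claude-sonnet-4-20250514
```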

How much does this save?

If you're already paying for Claude Max, this is basically free API access. For context:

| Usage                    | API Cost | With CLIProxyAPI |
| ------------------------ | -------- | ---------------- |
| 1M input tokens/month    | ~$15     | $0 (included)    |
| 500K output tokens/month | ~$37.50  | $0 (included)    |
| Monthly Total            | ~$52.50  | $0 extra         |

And those numbers add up quick when you're generating PR descriptions and commit messages all day. I was getting to the point where my API costs were approaching the subscription price, which is silly when you think about it.
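The arithmetic behind those numbers, using the pay-as-you-go rates the table implies (~$15 per 1M input tokens and ~$75 per 1M output tokens):

```python
# Back-of-envelope: what the same monthly usage would cost on the
# metered API. Rates are inferred from the table above.

INPUT_RATE = 15.00   # USD per 1M input tokens
OUTPUT_RATE = 75.00  # USD per 1M output tokens

def monthly_api_cost(input_tokens, output_tokens):
    """Metered-API cost in USD for a month's token usage."""
    return (input_tokens / 1_000_000) * INPUT_RATE + \
           (output_tokens / 1_000_000) * OUTPUT_RATE

cost = monthly_api_cost(1_000_000, 500_000)
print(f"${cost:.2f}/month")  # $52.50/month, vs $0 extra through the proxy
```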

Conclusion

The whole setup took me about 10 minutes. Download binary, authenticate, edit config, start service, point my Emacs llm provider at it. That's it.

What I love about CLIProxyAPI is that it's exactly the kind of tool I appreciate: a single binary, a YAML config, does one thing well, and gets out of your way. No magic, no framework, no runtime dependencies. And since it's OpenAI-compatible, it plays nicely with the entire llm package ecosystem in Emacs.

The project is at https://github.com/router-for-me/CLIProxyAPI and the community is very active. If you run into issues, their GitHub issues are responsive.

See you in the next one!

13 Feb 2026 6:00am GMT

11 Feb 2026

Django community aggregator: Community blog posts

Improving Django - Adam Hill

πŸ”— Links

πŸ“¦ Projects

πŸ“š Books

πŸŽ₯ YouTube

Sponsor

This episode was brought to you by Buttondown, the easiest way to start, send, and grow your email newsletter. New customers can save 50% off their first year with Buttondown using the coupon code DJANGO.

11 Feb 2026 5:00pm GMT