13 Mar 2026
Docker
Achieving Test Reliability for Native E2E Testing: Beyond Fixing Broken Tests
End-to-end (E2E) tests are particularly important for native applications that run on various platforms (Android/iOS), screen sizes, and OS versions. E2E testing picks up differences in behavior across this fragmented ecosystem. But keeping E2E tests reliable is often more challenging than writing them in the first place. The fragmented device ecosystem, gaps in test frameworks,...
13 Mar 2026 1:00pm GMT
How to Run Claude Code with Docker: Local Models, MCP Servers, and Secure Sandboxes
Claude Code is quickly becoming a go-to AI coding assistant for developers and increasingly for non-developers who want to build with code. But to truly unlock its potential, it needs the right local infrastructure, tool access, and security boundaries. In this blog, we'll show you how to run Claude Code with Docker to gain full...
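As a minimal sketch of the idea (the image, flags, and workflow here are illustrative assumptions, not the article's exact setup), Claude Code can be run inside a disposable container so its file and tool access is confined to a mounted project directory:

```shell
# Illustrative sketch only: run Claude Code in a throwaway container.
# Assumes Docker is installed and ANTHROPIC_API_KEY is exported on the host.
# File access is limited to the current directory, mounted at /workspace.
docker run -it --rm \
  -e ANTHROPIC_API_KEY \
  -v "$PWD":/workspace \
  -w /workspace \
  node:22-bookworm \
  bash -c "npm install -g @anthropic-ai/claude-code && claude"
```

This command fragment is environment-dependent (it requires a running Docker daemon and an API key), so treat it as a starting point rather than a complete setup; the full post covers local models, MCP servers, and sandboxing in more depth.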
13 Mar 2026 12:17pm GMT
Secure Agent Execution with NanoClaw and Docker Sandboxes
Agents have enormous potential to power secure, personal AI assistants that automate complex tasks and workflows. Realizing that potential, however, requires strong isolation, a codebase that teams can easily inspect and understand, and clear control boundaries they can trust. Today, NanoClaw, a lightweight agent framework, is integrating with Docker Sandboxes to deliver secure-by-design agent execution....
13 Mar 2026 12:01pm GMT
12 Mar 2026
Docker
Flexibility Over Lock-In: The Enterprise Shift in Agent Strategy
Building agents is now a strategic priority for 95% of respondents in our latest State of Agentic AI research, which surveyed more than 800 developers and decision-makers worldwide. The shift is happening quickly: agent adoption has moved beyond experiments and demos into early operational maturity. But the road to enterprise-scale adoption is still complex. The...
12 Mar 2026 12:50pm GMT
11 Mar 2026
Docker
Building AI Teams: How Docker Sandboxes and Docker Agent Transform Development
It's 11 PM. You've got a JIRA ticket open, an IDE with three unsaved files, a browser tab on Stack Overflow, and another on documentation. You're context-switching between designing UI, writing backend APIs, fixing bugs, and running tests. You're wearing all the hats: product manager, designer, engineer, QA specialist, and it's exhausting. What if instead...
11 Mar 2026 1:00pm GMT
10 Mar 2026
Docker
What’s Holding Back AI Agents? It’s Still Security
It's hard to find a team today that isn't talking about agents. For most organizations, this isn't a "someday" project anymore. Building agents is a strategic priority for 95% of the 800+ developers and decision makers we surveyed across the globe in our latest State of Agentic AI research. The shift is happening...
10 Mar 2026 12:59pm GMT
06 Mar 2026
Docker
Celebrating Women in AI: 3 Questions with Cecilia Liu on Leading Docker’s MCP Strategy
To celebrate International Women's Day, we sat down with Cecilia Liu, Senior Product Manager at Docker, for three questions about the vision and strategy behind Docker's MCP solutions. From shaping product direction to driving AI innovation, Cecilia plays a key role in defining how Docker enables secure, scalable AI tooling. Cecilia leads product management for...
06 Mar 2026 12:59pm GMT
03 Mar 2026
Docker
Announcing Docker Hardened System Packages
Your Package Manager, Now with a Security Upgrade. Last December, we made Docker Hardened Images (DHI) free because we believe secure, minimal, production-ready images should be the default. Every developer deserves strong security at no cost. It should not be complicated or locked behind a paywall. From the start, flexibility mattered just as much as...
03 Mar 2026 8:30pm GMT
26 Feb 2026
Docker
Docker Model Runner Brings vLLM to macOS with Apple Silicon
vLLM has quickly become the go-to inference engine for developers who need high-throughput LLM serving. We brought vLLM to Docker Model Runner for NVIDIA GPUs on Linux, then extended it to Windows via WSL2, but macOS was left out. That changes today. Docker Model Runner now supports vllm-metal, a new backend that brings vLLM inference to macOS using Apple Silicon's...
26 Feb 2026 2:42pm GMT
25 Feb 2026
Docker
Open WebUI + Docker Model Runner: Self-Hosted Models, Zero Configuration
We're excited to share a seamless new integration between Docker Model Runner (DMR) and Open WebUI, bringing together two open source projects to make working with self-hosted models easier than ever. With this update, Open WebUI automatically detects and connects to Docker Model Runner running at localhost:12434. If Docker Model Runner is enabled, Open WebUI...
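For context on what the auto-detection replaces, the manual equivalent is to point Open WebUI at Docker Model Runner's OpenAI-compatible endpoint yourself. A hedged sketch follows; the environment variable, endpoint path, and image tag reflect the two projects' published conventions, but treat the exact values as assumptions:

```shell
# Illustrative: manually connect Open WebUI to Docker Model Runner.
# Assumes DMR has host TCP access enabled on port 12434.
# From inside the Open WebUI container, localhost refers to the container
# itself, so the host is reached via host.docker.internal.
docker run -d -p 3000:8080 \
  -e OPENAI_API_BASE_URL=http://host.docker.internal:12434/engines/v1 \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```

The integration described in the post removes this step: when Open WebUI finds Docker Model Runner listening at localhost:12434, it connects without any of the configuration above.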
25 Feb 2026 2:37pm GMT