Dev Containers in VS Code

Dev containers run your development environment inside Docker, so every team member is on the same Node version, same database, same tools — without the "works on my machine" problem.

March 30, 2026 · 5 min read

Dev containers run your VS Code development environment inside a Docker container. Your editor lives on your machine. Your code, your tools, your runtime — those live in the container.

The result: everyone on the team is running the same Node version, the same database, the same CLI tools, configured the same way. "Works on my machine" stops being an excuse because it's everyone's machine.

Do You Actually Need This?

Honest answer: maybe not.

You probably don't need dev containers if:

  • You're writing frontend code only
  • You connect to remote APIs (no local database)
  • You work on one project all the time

You probably do need them if:

  • Different projects need different Node/Ruby/Python versions
  • You have server-side dependencies (Postgres, Redis, Temporal, etc.)
  • Onboarding new team members is painful ("step 1: install Homebrew, step 2: install NVM, step 3: install...")
  • You want to safely run AI agents that can execute shell commands

The complexity cost is real. Don't add it unless it solves a real problem. But when it does solve a real problem, it's a significant relief.

How It Works

When you open a project with a dev container configuration, VS Code offers to "Reopen in Container." When you accept:

  1. Docker builds (or pulls) the container image
  2. VS Code connects its backend to that container
  3. Your editor looks and feels the same — but terminals, file operations, and extensions run inside the container

Your files: VS Code mounts your local project directory into the container, so changes are reflected immediately. You're editing the same files.
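Beyond the default workspace mount, devcontainer.json has a mounts field for extra mounts. One common use is a named Docker volume for node_modules, which keeps heavy dependency I/O inside the container instead of going through the bind mount. A sketch — the volume name and target path are placeholders for your project:

```jsonc
{
  // "myapp-node-modules" and the target path are illustrative;
  // the target should match your workspace folder inside the container.
  "mounts": [
    "source=myapp-node-modules,target=/workspaces/myapp/node_modules,type=volume"
  ]
}
```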

Setting One Up

Command Palette → "Dev Containers: Add Dev Container Configuration Files"

You'll be asked:

  1. Where to save: project (checked into git, shared with team) or user (personal, all projects)
  2. Base image: the OS/runtime you want — Node, Python, Ubuntu, etc.
  3. Node version: which LTS to use
  4. Features: additional tools to install (GitHub CLI, AWS CLI, Docker-in-Docker, etc.)

VS Code generates .devcontainer/devcontainer.json.

devcontainer.json

The generated file looks like this:

JSON
{
  "name": "Node.js Dev",
  "image": "mcr.microsoft.com/devcontainers/typescript-node:22",
  "features": {
    "ghcr.io/devcontainers/features/github-cli:1": {},
    "ghcr.io/devcontainers/features/aws-cli:1": {}
  },
  "forwardPorts": [3000, 5173],
  "customizations": {
    "vscode": {
      "extensions": [
        "esbenp.prettier-vscode",
        "dbaeumer.vscode-eslint",
        "GitHub.copilot"
      ],
      "settings": {
        "editor.formatOnSave": true,
        "editor.defaultFormatter": "esbenp.prettier-vscode"
      }
    }
  },
  "postCreateCommand": "npm install"
}

Key fields:

Field                               What it does
image                               The base Docker image
features                            Pre-built tooling layers to add on top
forwardPorts                        Ports to expose from the container to localhost
customizations.vscode.extensions    Extensions installed automatically for everyone
customizations.vscode.settings      VS Code settings applied inside the container
postCreateCommand                   Command to run after the container is created
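Since postCreateCommand runs once after the container is created, it's a natural place to chain setup steps. An illustrative sketch — the db:migrate script is a hypothetical name, not from this project:

```jsonc
{
  // Runs once after creation; "npm run db:migrate" is a placeholder
  // for whatever setup your project needs.
  "postCreateCommand": "npm install && npm run db:migrate"
}
```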

Port Forwarding

The forwardPorts field makes container ports available on your local machine. If your app runs on port 3000 inside the container, "forwardPorts": [3000] makes http://localhost:3000 work in your browser.

VS Code also detects open ports automatically and offers to forward them — a notification appears when it detects a new port.
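Forwarded ports can also be labeled and given per-port behavior with the portsAttributes field (a standard devcontainer.json property; the labels here are illustrative):

```jsonc
{
  "forwardPorts": [3000, 5173],
  "portsAttributes": {
    "3000": { "label": "App", "onAutoForward": "notify" },
    "5173": { "label": "Vite", "onAutoForward": "openBrowser" }
  }
}
```

This keeps the Ports panel readable when several services are running, and controls whether VS Code notifies you or opens a browser when a port comes up.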

Extensions and Settings

This is one of the most underrated features. With customizations.vscode, you specify which extensions every person working on this project should have. When they open the container, those extensions install automatically.

New team member → open in container → they have ESLint, Prettier, your language support, and your team's settings — without setup steps.

Using a Dockerfile Instead

For more control, use a Dockerfile instead of a pre-built image:

JSON
{
  "build": {
    "dockerfile": "Dockerfile",
    "context": ".."
  }
}
DOCKERFILE
FROM mcr.microsoft.com/devcontainers/node:22

# Install additional tools
RUN apt-get update && apt-get install -y postgresql-client

# Install global npm tools
RUN npm install -g tsx

# Set up workspace
WORKDIR /workspace

Use a Dockerfile when you need custom configuration beyond what features provide: specific OS packages, custom scripts, environment-specific setup.

Multiple Services with Docker Compose

For projects with multiple services (API + database + cache), use Docker Compose:

JSON
{
  "dockerComposeFile": "docker-compose.yml",
  "service": "app",
  "workspaceFolder": "/workspace"
}
YAML
# docker-compose.yml
services:
  app:
    build: .
    volumes:
      - ..:/workspace
    command: sleep infinity
  postgres:
    image: postgres:16
    environment:
      POSTGRES_DB: myapp_dev
      POSTGRES_USER: dev
      POSTGRES_PASSWORD: dev
    ports:
      - "5432:5432"
  redis:
    image: redis:7
    ports:
      - "6379:6379"

Now your app container runs Node, with Postgres and Redis alongside it as separate services, identical for every developer. DATABASE_URL=postgresql://dev:dev@postgres:5432/myapp_dev works on every machine, because postgres resolves to the database service on the Compose network.
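One way to wire that up is to set the connection strings directly on the app service in docker-compose.yml, pointing at the other services by hostname. A sketch matching the example above (REDIS_URL is an illustrative addition):

```yaml
# docker-compose.yml (app service excerpt)
services:
  app:
    build: .
    environment:
      # "postgres" and "redis" resolve to their services on the Compose network
      DATABASE_URL: postgresql://dev:dev@postgres:5432/myapp_dev
      REDIS_URL: redis://redis:6379
```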

Opening in the Container

Once configured, VS Code shows a notification: "Reopen in Container." Click it.

Or: Command Palette → "Dev Containers: Reopen in Container."

The first time, Docker has to build the image. Subsequent opens are faster (the image is cached).

To get back to local: Command Palette → "Dev Containers: Reopen Folder Locally."

Environment Variables

For secrets and credentials, use a .devcontainer/.env file (never committed):

ENV
DATABASE_URL=postgresql://dev:dev@postgres:5432/myapp_dev
STRIPE_SECRET_KEY=sk_test_...

Reference it in devcontainer.json:

JSON
{
  "runArgs": ["--env-file", ".devcontainer/.env"]
}

Or use Docker Compose's env_file key.
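With Compose, the equivalent is an env_file entry on the app service. A sketch — env_file paths are relative to the compose file's location, so adjust the path to your layout:

```yaml
# docker-compose.yml (app service excerpt)
services:
  app:
    build: .
    env_file:
      # path relative to this compose file; adjust for your layout
      - .devcontainer/.env
```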

The Real Benefits

Consistent environment, zero negotiation. Everyone runs Node 22.5. Nobody's on 18, nobody's on a version with a known memory leak. The Postgres version matches production. The environment variables point to the same local services.

Safe AI agent sandbox. If you're using Claude Code, Cursor agent, or any tool that runs shell commands, running inside a container limits blast radius. The agent can rm -rf whatever it wants inside the container — rebuild and you're back to clean state.

Contractor/consultant workflows. One machine, six client projects, each in its own container. Jump between Ruby 3.1 and Ruby 3.3, Postgres 14 and Postgres 16, without juggling rbenv and version managers.

Onboarding. git clone, open in container, npm run dev. That's the onboarding. No wiki page of setup instructions, no "oh also you need to install libvips" surprises.

The tradeoff is Docker complexity and slightly slower initial startup. For the right project, it's completely worth it. For a solo frontend project connecting to a remote API, it's overkill. You'll know which one you're in.
