A complete Docker Compose setup for Apache Airflow 3.1.0 optimized for Windows 11 development.
## Project Structure

```
./
├── dags/                 # Place your DAG files here
├── logs/                 # Airflow logs (auto-generated)
├── plugins/              # Custom Airflow plugins
├── config/               # Airflow configuration files
├── docker-compose.yml    # Main Docker Compose configuration
├── .env                  # Environment variables (customize this)
├── .env.example          # Environment template
└── README.md             # This file
```
## Prerequisites
- Docker Desktop for Windows 11
- At least 4GB RAM allocated to Docker
- WSL2 backend enabled (recommended)
## Setup

```shell
# Copy environment template
cp .env.example .env

# Edit .env file with your settings
# Generate a new Fernet key:
python -c "from cryptography.fernet import Fernet; print(Fernet.generate_key().decode())"
```
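If the cryptography package isn't installed locally, a stdlib-only sketch can still sanity-check a key before it goes into `.env`. A Fernet key must decode to exactly 32 url-safe base64 bytes; `generate_secret_key` is a hypothetical helper for producing an `AIRFLOW_SECRET_KEY` value.

```python
import base64
import secrets


def generate_secret_key() -> str:
    """Generate a random value suitable for AIRFLOW_SECRET_KEY."""
    return secrets.token_urlsafe(32)


def is_valid_fernet_key(key: str) -> bool:
    """A Fernet key is 32 random bytes, url-safe base64-encoded."""
    try:
        return len(base64.urlsafe_b64decode(key.encode())) == 32
    except Exception:
        return False
```

This only checks the key's shape, not how Airflow will use it; generate real keys with the `cryptography` one-liner above.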
## Start Airflow

```shell
# Initialize database (first time only)
docker-compose up airflow-init

# Start all services
docker-compose up -d
```
## Access Airflow
- Web UI: http://localhost:8080
- Get admin password: `scripts/get-admin-password.bat` (Windows) or `scripts/get-admin-password.sh` (Linux/Mac)
- Username: admin
- Password: Auto-generated by Simple Auth Manager (changes on restart)
## Services

- airflow-api-server: Web UI and REST API (port 8080)
- airflow-scheduler: Task scheduling and monitoring
- airflow-dag-processor: DAG file parsing
- airflow-triggerer: Deferred and async task handling
- postgres: Database backend (PostgreSQL 17)
- redis: Message broker for task distribution
- airflow-worker: Celery worker processes (scalable via `NUMBER_OF_WORKERS`)
- airflow-scheduler-celery: Scheduler with Celery configuration
- airflow-api-server-celery: API server with Celery configuration
- flower: Celery worker monitoring dashboard (port 5555)
## Configuration

Key configuration options in your .env file:
```shell
# Executor Configuration
AIRFLOW_EXECUTOR=LocalExecutor   # or CeleryExecutor
COMPOSE_PROFILES=celery,flower   # Enable Celery + Flower

# Worker Scaling (CeleryExecutor only)
NUMBER_OF_WORKERS=2              # Number of worker replicas

# Security Keys (Generate new ones for production)
AIRFLOW_FERNET_KEY=your-fernet-key
AIRFLOW_SECRET_KEY=your-secret-key

# Service Ports
AIRFLOW_WEBSERVER_PORT=8080
FLOWER_PORT=5555

# Database & Redis
POSTGRES_PASSWORD=airflow
REDIS_PASSWORD=airflow
```

## Docker Compose Profiles

Control which services start using profiles:
- Default: Core Airflow services with LocalExecutor
- celery: Adds Redis, Celery workers, and Celery-configured services
- flower: Adds Flower monitoring dashboard
Examples:
```shell
# LocalExecutor only
docker-compose up -d

# CeleryExecutor with monitoring
COMPOSE_PROFILES=celery,flower docker-compose up -d

# Or set in .env file
echo "COMPOSE_PROFILES=celery,flower" >> .env
docker-compose up -d
```

## Executors

### LocalExecutor

- Single-node execution
- Good for development and small workloads
- No additional services required
### CeleryExecutor

- Multi-worker distributed execution
- Scalable for production workloads
- Requires Redis message broker
To enable CeleryExecutor:
- Set `AIRFLOW_EXECUTOR=CeleryExecutor` in `.env`
- Set `COMPOSE_PROFILES=celery,flower` in `.env`
- Restart services: `docker-compose up -d`
Worker Scaling:
- Configure `NUMBER_OF_WORKERS=2` in `.env` to set the worker count
- All workers are identical and share the same configuration
- Monitor workers via Flower at http://localhost:5555
## Simple Auth Manager

- Auto-generated passwords for enhanced security
- JWT token support for API access
- Lightweight and fast
- Passwords stored in `/opt/airflow/simple_auth_manager_passwords.json.generated`
```shell
# Start with Simple Auth Manager (default)
docker-compose up -d

# Get current admin password
scripts/get-admin-password.bat   # Windows
scripts/get-admin-password.sh    # Linux/Mac
```

## Migrating from Airflow 2.x

This setup uses the correct Airflow 3.1.0 configuration parameters. If you're migrating from Airflow 2.x, note these important changes:
### Secret Key Configuration

Airflow 2.x (Deprecated):

```shell
AIRFLOW__WEBSERVER__SECRET_KEY=your-secret-key
```

Airflow 3.x (Current):

```shell
AIRFLOW__API__SECRET_KEY=your-secret-key
AIRFLOW__SIMPLE_AUTH_MANAGER__JWT_SECRET_KEY=your-secret-key
```

### Architecture Changes

Airflow 2.x:
- Single webserver component
Airflow 3.x:
- Separate API server (replaces webserver)
- Dedicated DAG processor service
- Enhanced triggerer for async operations
### Database Migration Command

Airflow 2.x (Deprecated):

```shell
airflow db upgrade
```

Airflow 3.x (Current):

```shell
airflow db migrate
```

### Authentication

Airflow 2.x:
- Flask-AppBuilder based authentication
Airflow 3.x:
- SimpleAuthManager (default)
- FastAPI-based authentication
- Improved JWT token handling
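The key renames above can be applied mechanically to an environment mapping. A hedged sketch covering only the keys discussed in this README (the rename table is not exhaustive):

```python
# Airflow 2.x -> 3.x configuration key renames covered in this README.
RENAMES = {
    "AIRFLOW__WEBSERVER__SECRET_KEY": "AIRFLOW__API__SECRET_KEY",
}


def migrate_config(env: dict[str, str]) -> dict[str, str]:
    """Return a copy of env with deprecated Airflow 2.x keys renamed."""
    out = {RENAMES.get(key, key): value for key, value in env.items()}
    # Airflow 3.x also expects the JWT secret; reuse the API secret if unset,
    # matching the configuration shown above.
    if "AIRFLOW__API__SECRET_KEY" in out:
        out.setdefault("AIRFLOW__SIMPLE_AUTH_MANAGER__JWT_SECRET_KEY",
                       out["AIRFLOW__API__SECRET_KEY"])
    return out
```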
If you see "JWT token is not valid: Signature verification failed" errors:

1. Check secret key consistency:

   ```shell
   # Verify all services use the same secret key
   docker-compose exec airflow-api-server-celery printenv | grep SECRET
   docker-compose exec airflow-worker printenv | grep SECRET
   ```

2. Regenerate secret keys:

   ```shell
   # Generate new secure secret key
   python -c "import secrets; print(secrets.token_urlsafe(32))"

   # Update AIRFLOW_SECRET_KEY in .env file
   ```

3. Restart services:

   ```shell
   docker-compose down
   docker-compose up -d
   ```
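When chasing signature failures, it can help to inspect a token's claims (expiry, subject) before worrying about keys. A stdlib-only sketch that decodes the payload segment without verifying the signature (never use this for authentication decisions):

```python
import base64
import json


def jwt_claims(token: str) -> dict:
    """Decode a JWT's payload segment WITHOUT verifying its signature."""
    payload = token.split(".")[1]
    payload += "=" * (-len(payload) % 4)  # restore stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(payload))
```

Comparing the decoded `exp` and issuer claims against your clock and configuration often distinguishes an expired token from a genuine key mismatch.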
If you see deprecation warnings about configuration parameters:
- Update your configuration to use the Airflow 3.x parameter names shown above
- Remove any old Airflow 2.x configuration parameters
- Restart services after configuration changes
## Scaling Workers

Method 1: Environment Variable (Recommended)

```shell
# In .env file
NUMBER_OF_WORKERS=4
```

Method 2: Docker Compose Scale

```shell
docker-compose up --scale airflow-worker=3
```

Each worker is configured with:
- Concurrency: 16 parallel tasks per worker
- Autoscaling: 16 max, 4 min processes
- Memory Limit: 8GB per worker
- Task Limit: 1000 tasks before worker restart
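Those numbers multiply out to the cluster's headroom. A rough estimate, assuming tasks are limited only by worker concurrency (function name is illustrative):

```python
def max_parallel_tasks(workers: int, concurrency: int = 16) -> int:
    """Upper bound on simultaneously running tasks across all workers."""
    return workers * concurrency


# With the defaults above, NUMBER_OF_WORKERS=2 gives 2 * 16 = 32 task slots.
```

In practice DAG- and pool-level limits, plus the 8GB memory cap per worker, may bind first.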
## Monitoring

- Flower Dashboard: http://localhost:5555 (CeleryExecutor only)
  - Real-time worker status
  - Task execution monitoring
  - Worker resource usage
- Airflow Web UI: http://localhost:8080
  - DAG execution status
  - Task logs and details
  - System health metrics
## Troubleshooting

Workers not starting:
- Check `COMPOSE_PROFILES=celery,flower` in `.env`
- Verify `AIRFLOW_EXECUTOR=CeleryExecutor` in `.env`
- Ensure Redis is healthy: `docker-compose ps redis`
Tasks stuck in queued state:
- Check worker status in Flower: http://localhost:5555
- Verify worker logs: `docker-compose logs airflow-worker`
- Restart workers: `docker-compose restart airflow-worker`
Flower not accessible:
- Ensure the `flower` profile is enabled in `COMPOSE_PROFILES`
- Check if port 5555 is available
- Verify the Flower service is running: `docker-compose ps flower`
## Performance Tuning

Increase worker capacity:

```shell
# In .env
NUMBER_OF_WORKERS=4   # More workers
```

Adjust worker concurrency: edit `AIRFLOW__CELERY__WORKER_CONCURRENCY` in `docker-compose.yml`.
## Windows 11 Notes

- Uses forward slashes for cross-platform volume compatibility
- Optimized for Docker Desktop with WSL2 backend
- Compatible with existing conda environments
- CeleryExecutor works seamlessly with WSL2 Docker backend
## Documentation

- Complete Usage Guide - Comprehensive guide covering all aspects of using Airflow 3.1.0 with Docker Compose
- DAG Development Best Practices - Best practices for developing DAGs in Airflow 3.1.0
- Windows 11 Troubleshooting Guide - Solutions for common Windows 11 specific issues
- Simple Auth Manager Usage - Using the default Simple Auth Manager with JWT tokens
This setup is ready for development. Add your DAG files to the dags/ directory and they will be automatically detected by Airflow.
For detailed usage instructions, DAG development best practices, and troubleshooting help, see the comprehensive documentation guides above.