
Run Apache Airflow Locally On Windows

4 min read · Jul 23, 2025

Apache Airflow is one of the most popular tools for building and scheduling data pipelines. While cloud-managed Airflow (like Cloud Composer or MWAA) is great for production, it’s often cumbersome and slow for local development. That’s where running Airflow locally with Docker really shines.

If you prefer a live setup demo to this blog post, check out the accompanying video on my YouTube channel.

What is Apache Airflow?

Apache Airflow is an open-source workflow orchestration tool that lets you define, schedule, and monitor complex data pipelines as DAGs (Directed Acyclic Graphs) using Python.

Key Features:

  • Code your workflows (DAGs) using Python
  • Schedule them to run periodically
  • Monitor them via a rich web UI
  • Supports task dependencies, retries, SLAs, and external triggers
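
To make this concrete, here's a minimal sketch of a DAG. The dag_id, task names, and schedule are illustrative (they aren't part of this post's setup), but the snippet shows scheduling, retries, and a task dependency:

from datetime import datetime, timedelta
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="example_local_dag",  # illustrative name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # run once per day
    catchup=False,  # don't backfill missed runs
    default_args={"retries": 2, "retry_delay": timedelta(minutes=1)},
) as dag:
    extract = BashOperator(task_id="extract", bash_command="echo extracting")
    load = BashOperator(task_id="load", bash_command="echo loading")

    extract >> load  # "load" runs only after "extract" succeeds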

Why Set Up Airflow Locally?

Problem with Cloud Composer / MWAA

If you’re using Airflow in cloud-managed services like Google Cloud Composer or AWS MWAA, then:

  • Every time you make a change to a DAG, you need to upload it to a Cloud Storage bucket
  • Wait for it to parse & sync, which can take minutes
  • Repeat for every small code change 😩

This makes development slow and frustrating.

With Airflow running locally:

  • You get instant feedback on code changes
  • DAGs show up in UI immediately after saving
  • You can add breakpoints, test Python logic, simulate failures
  • It’s fast, flexible, and just like running any Python project
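
For instance, Airflow 2.5+ ships dag.test(), which runs a DAG in-process so your IDE debugger and breakpoints work normally. A minimal sketch, assuming the example_local_dag file from earlier (an illustrative name), is to append this at the bottom of the DAG file:

if __name__ == "__main__":
    dag.test()  # runs the whole DAG in a single process, no scheduler or webserver needed

Then run it like any script: python example_dag.py.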

Using Docker, this setup becomes super easy and OS-independent.

Why Use Docker for Airflow?

Apache Airflow can be installed using pip, but it has a lot of dependencies like PostgreSQL, Celery, Redis, etc., making the local install complex and error-prone — especially on Windows.

Docker simplifies everything:

  • ✅ All dependencies are bundled
  • ✅ Just 1 command to start your Airflow instance
  • ✅ No need to install Python environments, Postgres, etc.
  • ✅ Works the same on any OS — Windows, macOS, or Linux

Can I install Airflow directly without Docker?
Yes, you can use pip install apache-airflow (the official docs recommend pairing it with a constraints file to pin compatible dependency versions), but it's officially recommended only for Linux-based systems. It often fails on Windows and requires additional setup such as WSL or conda-based environments.

If you’re on Windows, Docker is by far the simplest and most reliable way to run Airflow locally.

Prerequisites

Before getting started, make sure you have the following installed on your system:

Docker Desktop → you can set up/install Docker Desktop by following the install guide on Docker's official website.

🧱 Step 1: Create a Dockerfile

This sets up your custom Airflow image with additional packages (like Git):

Dockerfile

FROM apache/airflow:2.8.4
USER root
RUN apt-get update && \
    apt-get install -y git && \
    apt-get clean
USER airflow

Save this as Dockerfile in your project root.
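
The same pattern works for Python dependencies: if your DAGs need extra libraries, switch back to the airflow user and add a RUN pip install line for each package (the Git install above is just one example of customizing the base image).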

Step 2: Create docker-compose.yml

This will start Airflow in standalone mode, which initializes the metadata database, creates an admin user, and runs all the core components in a single container. That makes it perfect for local testing.

version: '3'
services:
  airflow:
    image: my-airflow:latest
    volumes:
      - ./airflow:/opt/airflow
    ports:
      - "8080:8080"
    command: airflow standalone

🔁 Any DAGs inside ./airflow/dags will be automatically picked up.
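
For example, if you save the example_local_dag sketch from earlier as ./airflow/dags/example_dag.py (an illustrative filename), it should appear in the UI shortly after you save the file.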

Step 3: Build & Run

Run the following commands from the terminal:

docker build -t my-airflow .
docker-compose up
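
Note that editing DAGs doesn't require a rebuild, since ./airflow is volume-mounted into the container; you only need to rerun docker build (and restart with docker-compose up) when the Dockerfile itself changes.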

Verify that the image was created locally (docker images) and that the container started (docker ps).

Then open your browser and go to:
http://localhost:8080


If the UI isn't showing yet, wait a while; depending on your system's configuration and performance, it can take 10–15 minutes for all the Airflow components to start.

You can check the container logs and look for the message that prints the auto-generated admin password. (If you don't find it in the logs, don't worry; check the next step below.)

If you can't find the password in the log output, use "admin" as the username; the password is saved to the file airflow/standalone_admin_password.txt.


Log in and BOOM! You're now ready to use Airflow locally. Go ahead: create, break, and experiment with your DAGs!

About Me

As an experienced, fully certified (11x) Google Cloud Architect and Google Developer Expert (GDE) with over 9 years of expertise in Google Cloud networking, data, DevOps, security, and ML, I am passionate about technology and innovation. As a Champion Innovator and Google Cloud Architect, I am always exploring new ways to leverage cloud technologies to deliver innovative solutions that make a difference.

If you have any queries or would like to get in touch, you can reach me by email at vishal.bulbule@techtrapture.com or connect with me on LinkedIn at https://www.linkedin.com/in/vishal-bulbule/. For a more personal connection, you can also find me on Instagram at https://www.instagram.com/vishal_bulbule/?hl=en.

Additionally, please check out my YouTube Channel at https://www.youtube.com/@techtrapture for tutorials and demos on Google Cloud & Data Engineering.
