Docker Compose file for Airflow 2 (version 2.0.0)

I am looking to write a docker-compose file to run Airflow locally in a production-like environment.

For the older Airflow v1.10.14, this docker-compose works fine, but the same docker-compose does not work for the latest stable version: the Airflow scheduler and webserver keep failing. The error message says it is unable to create audit tables.

docker-compose.yaml:

 version: "2.1"
services:
  postgres:
    image: postgres:12
    environment:
      - POSTGRES_USER=airflow
      - POSTGRES_PASSWORD=airflow
      - POSTGRES_DB=airflow
    ports:
      - "5433:5432"

  scheduler:
    image: apache/airflow:1.10.14
    restart: always
    depends_on:
      - postgres
      - webserver
    env_file:
      - .env
    ports:
      - "8793:8793"
    volumes:
      - ./dags:/opt/airflow/dags
      - ./airflow-logs:/opt/airflow/logs
      - ./plugins:/opt/airflow/plugins
    command: scheduler
    healthcheck:
      test: ["CMD-SHELL", "[ -f /usr/local/airflow/airflow-webserver.pid ]"]
      interval: 30s
      timeout: 30s
      retries: 5

  webserver:
    image: apache/airflow:1.10.14
    hostname: webserver
    restart: always
    depends_on:
      - postgres
    env_file:
      - .env
    volumes:
      - ./dags:/opt/airflow/dags
      - ./scripts:/opt/airflow/scripts
      - ./airflow-logs:/opt/airflow/logs
      - ./plugins:/opt/airflow/plugins
    ports:
      - "8080:8080"
    entrypoint: ./scripts/airflow-entrypoint.sh
    healthcheck:
      test: ["CMD-SHELL", "[ -f /usr/local/airflow/airflow-webserver.pid ]"]
      interval: 30s
      timeout: 30s
      retries: 5

.env:

AIRFLOW__CORE__LOAD_DEFAULT_CONNECTIONS=False
AIRFLOW__CORE__SQL_ALCHEMY_CONN=postgres+psycopg2://airflow:airflow@postgres:5432/airflow
AIRFLOW__CORE__FERNET_KEY=81HqDtbqAywKSOumSha3BhWNOdQ26slT6K0YaZeZyPs=
AIRFLOW_CONN_METADATA_DB=postgres+psycopg2://airflow:airflow@postgres:5432/airflow
AIRFLOW_VAR__METADATA_DB_SCHEMA=airflow
AIRFLOW__SCHEDULER__SCHEDULER_HEARTBEAT_SEC=10

./scripts/airflow-entrypoint.sh:

#!/usr/bin/env bash
airflow upgradedb
airflow webserver
Phrenology answered 25/1, 2021 at 17:57 Comment(1)
Can you try changing airflow upgradedb to airflow db upgrade in entrypoint.sh? The upgradedb command was removed in v2.0.0. - Turgeon
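Following that suggestion, an Airflow 2.x entrypoint could look roughly like the sketch below. The airflow users create step and its credentials are an addition for illustration (Airflow 2 serves the RBAC UI by default, so a login user is needed); they are not part of the original script:

#!/usr/bin/env bash
# "upgradedb" was replaced by "db upgrade" in Airflow 2.0
airflow db upgrade

# Placeholder credentials - change them before real use
airflow users create \
  --username admin \
  --password admin \
  --firstname Admin \
  --lastname User \
  --role Admin \
  --email admin@example.com

exec airflow webserver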

There is an official docker-compose.yaml; see here.

You will also find more information about Docker and Kubernetes deployments in the official docs.
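If it helps, fetching and starting the official file usually looks something like this (the version segment in the URL is an assumption; match it to your Airflow release):

# Assumption: the docs for your release host the compose file at this path
curl -LfO 'https://airflow.apache.org/docs/apache-airflow/2.0.1/docker-compose.yaml'
docker-compose up airflow-init   # one-shot DB migration and default user creation
docker-compose up                # start webserver, scheduler, worker, redis, postgres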

Overissue answered 26/1, 2021 at 17:6 Comment(2)
I am getting an unexpected error with this image: 67d927dd9b8a: Waiting ERROR: error pulling image configuration: received unexpected HTTP status: 503 Service Unavailable - Galore
Do you know if this is working for the latest version of Airflow 2.0.2? image: ${AIRFLOW_IMAGE_NAME:-apache/airflow:|version|} - Galore

The official Docker image for Airflow 2.0 is available now. Here is the list of 2.0.0 Docker images.

Example Docker image:

# example:
apache/airflow:2.0.0-python3.8
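As a quick sanity check before wiring it into compose, you can pull the tag and print its version:

docker pull apache/airflow:2.0.0-python3.8
docker run --rm apache/airflow:2.0.0-python3.8 airflow version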

The docker-compose below handles:

  • airflow db schema upgrade
  • admin user creation

Here is a version that I use:

version: "3.7"

# Common sections extracted out
x-airflow-common:
  &airflow-common
  image: ${AIRFLOW_IMAGE_NAME:-apache/airflow:2.0.0-python3.8}
  environment:
    &airflow-common-env
    AIRFLOW__CORE__EXECUTOR: CeleryExecutor
    AIRFLOW__CORE__SQL_ALCHEMY_CONN: postgresql+psycopg2://airflow:airflow@postgres/airflow
    AIRFLOW__CELERY__RESULT_BACKEND: db+postgresql://airflow:airflow@postgres/airflow
    AIRFLOW__CELERY__BROKER_URL: redis://:@redis:6379/0
    AIRFLOW__CORE__FERNET_KEY: ''
    AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION: 'true'
    # Change log level when needed
    AIRFLOW__LOGGING__LOGGING_LEVEL: 'INFO'
  volumes:
    - ./airflow/dags:/opt/airflow/dags
  depends_on:
    - postgres
    - redis
  networks:
    - default_net

volumes: 
  postgres-db-volume:

services:
  postgres:
    image: postgres:13
    environment:
      POSTGRES_USER: airflow
      POSTGRES_PASSWORD: airflow
      POSTGRES_DB: airflow
    volumes:
      - postgres-db-volume:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD", "pg_isready", "-U", "airflow"]
      interval: 5s
      retries: 5
    restart: always
    networks:
      - default_net

  redis:
    image: redis:6.0.10
    ports:
      - 6379:6379
    healthcheck:
      test: ["CMD", "redis-cli", "ping"]
      interval: 5s
      timeout: 30s
      retries: 50
    restart: always
    networks:
      - default_net

  airflow-webserver:
    <<: *airflow-common
    command: webserver
    ports:
      - 8080:8080
    healthcheck:
      test: ["CMD", "curl", "--fail", "http://localhost:8080/health"]
      interval: 10s
      timeout: 10s
      retries: 5
    restart: always
    environment:
      <<: *airflow-common-env
      # It is sufficient to run the db upgrade and create the admin user from the webserver service only
      _AIRFLOW_WWW_USER_PASSWORD: 'yourAdminPass'
      _AIRFLOW_DB_UPGRADE: 'true'
      _AIRFLOW_WWW_USER_CREATE: 'true'


  airflow-scheduler:
    <<: *airflow-common
    command: scheduler
    restart: always

  airflow-worker:
    <<: *airflow-common
    command: celery worker
    restart: always
    
networks:
  default_net:
    attachable: true
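To bring this stack up, something along these lines should work (a sketch; the admin user and its password come from the _AIRFLOW_WWW_USER_* variables in the webserver service, and the UI listens on port 8080):

docker-compose up -d    # starts postgres, redis, webserver, scheduler and worker
docker-compose ps       # wait until the webserver healthcheck reports healthy
# then open http://localhost:8080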
Faith answered 3/2, 2021 at 21:41 Comment(1)
With this docker-compose I always get airflow-worker_1 | sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) FATAL: password authentication failed for user "airflow". How can I fix this? - Bullfight

Update the docker-compose file version to version: "3.7".
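For the compose file in the question, that means changing its first line (a minimal sketch of the top of the file):

version: "3.7"   # was: version: "2.1"
services:
  postgres:
    image: postgres:12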

Halftruth answered 27/6, 2022 at 9:54 Comment(0)

First run:

docker compose up airflow-init 

Note that this is docker compose with a space (the Compose v2 plugin), not docker-compose with a hyphen,

and then:

docker compose up 

It should work!
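The airflow-init target is a service defined in the official docker-compose.yaml; roughly, it is a one-shot container along these lines (a sketch, not the exact official definition; the _AIRFLOW_* variables are the same ones used in the compose file earlier in this thread):

  airflow-init:
    <<: *airflow-common              # same image and connections as the other services
    command: version                 # exits once the entrypoint has done its work
    environment:
      <<: *airflow-common-env
      _AIRFLOW_DB_UPGRADE: 'true'        # run the metadata DB migration
      _AIRFLOW_WWW_USER_CREATE: 'true'   # create the admin UI user
      _AIRFLOW_WWW_USER_USERNAME: airflow
      _AIRFLOW_WWW_USER_PASSWORD: airflow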

Rasheedarasher answered 20/10, 2023 at 22:39 Comment(0)