Command Line Interface Reference

Airflow has a rich command line interface that allows many types of operations on a DAG, starts services, and supports development and testing.

Note

For more information on CLI usage, see Using the Command Line Interface

usage: airflow [-h] GROUP_OR_COMMAND ...

Positional Arguments

GROUP_OR_COMMAND

Possible choices: celery, config, connections, dags, db, info, kerberos, plugins, pools, roles, rotate_fernet_key, scheduler, sync_perm, tasks, users, variables, version, webserver

Sub-commands:

celery

Start celery components. Works only when using CeleryExecutor. For more information, see https://airflow.readthedocs.io/en/stable/executor/celery.html

airflow celery [-h] COMMAND ...

Positional Arguments

COMMAND

Possible choices: flower, stop, worker

Sub-commands:

flower

Start a Celery Flower

airflow celery flower [-h] [-A BASIC_AUTH] [-a BROKER_API] [-D]
                      [-c FLOWER_CONF] [-H HOSTNAME] [-l LOG_FILE]
                      [--pid [PID]] [-p PORT] [--stderr STDERR]
                      [--stdout STDOUT] [-u URL_PREFIX]
Named Arguments
-A, --basic-auth

Secure Flower with Basic Authentication. Accepts user:password pairs separated by a comma. Example: flower_basic_auth = user1:password1,user2:password2

Default: “”

-a, --broker-api

Broker API

-D, --daemon

Daemonize instead of running in the foreground

Default: False

-c, --flower-conf

Configuration file for flower

-H, --hostname

Set the hostname on which to run the server

Default: “0.0.0.0”

-l, --log-file

Location of the log file

--pid

PID file location

-p, --port

The port on which to run the server

Default: 5555

--stderr

Redirect stderr to this file

--stdout

Redirect stdout to this file

-u, --url-prefix

URL prefix for Flower

Default: “”
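
For example, a typical invocation that exposes Flower on the default port behind basic authentication (the credentials and URL prefix below are placeholders):

airflow celery flower --port 5555 --basic-auth user1:password1 --url-prefix flower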

stop

Stop the Celery worker gracefully

airflow celery stop [-h]
worker

Start a Celery worker node

airflow celery worker [-h] [-a AUTOSCALE] [-H CELERY_HOSTNAME]
                      [-c CONCURRENCY] [-D] [-p] [-l LOG_FILE] [--pid [PID]]
                      [-q QUEUES] [-s] [--stderr STDERR] [--stdout STDOUT]
                      [-u UMASK]
Named Arguments
-a, --autoscale

Minimum and maximum number of workers to autoscale

-H, --celery-hostname

Set the hostname of the Celery worker if you have multiple workers on a single machine

-c, --concurrency

The number of worker processes

Default: 8

-D, --daemon

Daemonize instead of running in the foreground

Default: False

-p, --do-pickle

Attempt to pickle the DAG object to send over to the workers, instead of letting workers run their version of the code

Default: False

-l, --log-file

Location of the log file

--pid

PID file location

-q, --queues

Comma-delimited list of queues to serve

Default: “default”

-s, --skip-serve-logs

Don’t start the serve logs process along with the workers

Default: False

--stderr

Redirect stderr to this file

--stdout

Redirect stdout to this file

-u, --umask

Set the umask of the Celery worker in daemon mode

Default: “0o077”
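
For example, to start a daemonized worker that serves two queues with reduced concurrency (the queue names and hostname are placeholders):

airflow celery worker --queues default,reporting --concurrency 4 --celery-hostname worker-1 --daemon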

config

View the configuration options.

airflow config [-h] COMMAND ...

Positional Arguments

COMMAND

Possible choices: get-value, list

Sub-commands:

get-value

Print the value of a configuration option

airflow config get-value [-h] section option
Positional Arguments
section

The section name

option

The option name
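
For example, to print a single option (core and dags_folder are a commonly available section and option, shown here only for illustration):

airflow config get-value core dags_folder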

list

List options for the configuration.

airflow config list [-h] [--color {off,on,auto}]
Named Arguments
--color

Possible choices: off, on, auto

Whether to emit colored output (default: auto)

Default: “auto”

connections

List/Add/Delete connections

airflow connections [-h] COMMAND ...

Positional Arguments

COMMAND

Possible choices: add, delete, list

Sub-commands:

add

Add a connection

airflow connections add [-h] [--conn-extra CONN_EXTRA] [--conn-host CONN_HOST]
                        [--conn-login CONN_LOGIN]
                        [--conn-password CONN_PASSWORD]
                        [--conn-port CONN_PORT] [--conn-schema CONN_SCHEMA]
                        [--conn-type CONN_TYPE] [--conn-uri CONN_URI]
                        conn_id
Positional Arguments
conn_id

Connection id, required to add/delete a connection

Named Arguments
--conn-extra

Connection Extra field, optional when adding a connection

--conn-host

Connection host, optional when adding a connection

--conn-login

Connection login, optional when adding a connection

--conn-password

Connection password, optional when adding a connection

--conn-port

Connection port, optional when adding a connection

--conn-schema

Connection schema, optional when adding a connection

--conn-type

Connection type, required to add a connection without conn_uri

--conn-uri

Connection URI, required to add a connection without conn_type
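
For example, a connection can be added either from a single URI or from individual fields (the connection id, credentials, and host below are placeholders):

airflow connections add my_postgres --conn-uri 'postgres://user:password@db.example.com:5432/mydb'

airflow connections add my_postgres --conn-type postgres --conn-host db.example.com --conn-login user --conn-password password --conn-port 5432 --conn-schema mydb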

delete

Delete a connection

airflow connections delete [-h] conn_id
Positional Arguments
conn_id

Connection id, required to add/delete a connection

list

List connections

airflow connections list [-h] [--conn-id CONN_ID] [--include-secrets]
                         [--output FORMAT]
Named Arguments
--conn-id

If passed, only items with the specified connection ID will be displayed

--include-secrets

If passed, the connections in the secrets backend will also be displayed. To use this option you must also pass the --conn-id option.

Default: False

--output

Possible choices: fancy_grid, github, grid, html, jira, latex, latex_booktabs, latex_raw, mediawiki, moinmoin, orgtbl, pipe, plain, presto, pretty, psql, rst, simple, textile, tsv, youtrack

Output table format. The specified value is passed to the tabulate module (https://pypi.org/project/tabulate/).

Default: “plain”

dags

List and manage DAGs

airflow dags [-h] COMMAND ...

Positional Arguments

COMMAND

Possible choices: backfill, delete, list, list_jobs, list_runs, next_execution, pause, report, show, state, test, trigger, unpause

Sub-commands:

backfill

Run subsections of a DAG for a specified date range. If the reset_dag_run option is used, backfill will first prompt users whether Airflow should clear all the previous dag_runs and task_instances within the backfill date range. If rerun_failed_tasks is used, backfill will auto re-run the previously failed task instances within the backfill date range

airflow dags backfill [-h] [-c CONF] [--delay-on-limit DELAY_ON_LIMIT] [-x]
                      [-n] [-e END_DATE] [-i] [-I] [-l] [-m] [--pool POOL]
                      [--rerun-failed-tasks] [--reset-dagruns] [-B]
                      [-s START_DATE] [-S SUBDIR] [-t TASK_REGEX] [-v] [-y]
                      dag_id
Positional Arguments
dag_id

The id of the dag

Named Arguments
-c, --conf

JSON string that gets pickled into the DagRun’s conf attribute

--delay-on-limit

Amount of time in seconds to wait when the limit on maximum active dag runs (max_active_runs) has been reached before trying to execute a dag run again

Default: 1.0

-x, --donot-pickle

Do not attempt to pickle the DAG object to send over to the workers, just tell the workers to run their version of the code

Default: False

-n, --dry-run

Perform a dry run for each task. Only renders Template Fields for each task, nothing else

Default: False

-e, --end-date

Override end_date YYYY-MM-DD

-i, --ignore-dependencies

Skip upstream tasks, run only the tasks matching the regexp. Only works in conjunction with task_regex

Default: False

-I, --ignore-first-depends-on-past

Ignores depends_on_past dependencies for the first set of tasks only (subsequent executions in the backfill DO respect depends_on_past)

Default: False

-l, --local

Run the task using the LocalExecutor

Default: False

-m, --mark-success

Mark jobs as succeeded without running them

Default: False

--pool

Resource pool to use

--rerun-failed-tasks

If set, the backfill will auto-rerun all the failed tasks for the backfill date range instead of throwing exceptions

Default: False

--reset-dagruns

If set, the backfill will delete existing backfill-related DAG runs and start anew with fresh, running DAG runs

Default: False

-B, --run-backwards

If set, the backfill will run tasks from the most recent day first. If there are tasks that depend_on_past, this option will throw an exception

Default: False

-s, --start-date

Override start_date YYYY-MM-DD

-S, --subdir

File location or directory from which to look for the DAG. Defaults to ‘[AIRFLOW_HOME]/dags’, where [AIRFLOW_HOME] is the value you set for ‘AIRFLOW_HOME’ in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

-t, --task-regex

The regex to filter specific task_ids to backfill (optional)

-v, --verbose

Make logging output more verbose

Default: False

-y, --yes

Do not prompt to confirm reset. Use with care!

Default: False
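
For example, to backfill one week of a DAG and automatically re-run any previously failed task instances (the dag id and dates are placeholders):

airflow dags backfill -s 2020-01-01 -e 2020-01-07 --rerun-failed-tasks example_dag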

delete

Delete all DB records related to the specified DAG

airflow dags delete [-h] [-y] dag_id
Positional Arguments
dag_id

The id of the dag

Named Arguments
-y, --yes

Do not prompt to confirm reset. Use with care!

Default: False

list

List all the DAGs

airflow dags list [-h] [--output FORMAT] [-S SUBDIR]
Named Arguments
--output

Possible choices: fancy_grid, github, grid, html, jira, latex, latex_booktabs, latex_raw, mediawiki, moinmoin, orgtbl, pipe, plain, presto, pretty, psql, rst, simple, textile, tsv, youtrack

Output table format. The specified value is passed to the tabulate module (https://pypi.org/project/tabulate/).

Default: “plain”

-S, --subdir

File location or directory from which to look for the DAG. Defaults to ‘[AIRFLOW_HOME]/dags’, where [AIRFLOW_HOME] is the value you set for ‘AIRFLOW_HOME’ in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

list_jobs

List the jobs

airflow dags list_jobs [-h] [-d DAG_ID] [--limit LIMIT] [--output FORMAT]
                       [--state STATE]
Named Arguments
-d, --dag-id

The id of the dag to run

--limit

Return a limited number of records

--output

Possible choices: fancy_grid, github, grid, html, jira, latex, latex_booktabs, latex_raw, mediawiki, moinmoin, orgtbl, pipe, plain, presto, pretty, psql, rst, simple, textile, tsv, youtrack

Output table format. The specified value is passed to the tabulate module (https://pypi.org/project/tabulate/).

Default: “plain”

--state

Only list the dag runs corresponding to the state

list_runs

List dag runs given a DAG id. If state option is given, it will only search for all the dagruns with the given state. If no_backfill option is given, it will filter out all backfill dagruns for given dag id. If start_date is given, it will filter out all the dagruns that were executed before this date. If end_date is given, it will filter out all the dagruns that were executed after this date.

airflow dags list_runs [-h] [-d DAG_ID] [-e END_DATE] [--no-backfill]
                       [--output FORMAT] [-s START_DATE] [--state STATE]
Named Arguments
-d, --dag-id

The id of the dag to run

-e, --end-date

Override end_date YYYY-MM-DD

--no-backfill

Filter out all the backfill dagruns for the given dag id

Default: False

--output

Possible choices: fancy_grid, github, grid, html, jira, latex, latex_booktabs, latex_raw, mediawiki, moinmoin, orgtbl, pipe, plain, presto, pretty, psql, rst, simple, textile, tsv, youtrack

Output table format. The specified value is passed to the tabulate module (https://pypi.org/project/tabulate/).

Default: “plain”

-s, --start-date

Override start_date YYYY-MM-DD

--state

Only list the dag runs corresponding to the state
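
For example, to list only the failed, non-backfill runs of a DAG (the dag id is a placeholder):

airflow dags list_runs -d example_dag --state failed --no-backfill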

next_execution

Get the next execution datetimes of a DAG. It returns one execution unless the num-executions option is given

airflow dags next_execution [-h] [-n NUM_EXECUTIONS] [-S SUBDIR] dag_id
Positional Arguments
dag_id

The id of the dag

Named Arguments
-n, --num-executions

The number of next execution datetimes to show

Default: 1

-S, --subdir

File location or directory from which to look for the DAG. Defaults to ‘[AIRFLOW_HOME]/dags’, where [AIRFLOW_HOME] is the value you set for ‘AIRFLOW_HOME’ in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

pause

Pause a DAG

airflow dags pause [-h] [-S SUBDIR] dag_id
Positional Arguments
dag_id

The id of the dag

Named Arguments
-S, --subdir

File location or directory from which to look for the DAG. Defaults to ‘[AIRFLOW_HOME]/dags’, where [AIRFLOW_HOME] is the value you set for ‘AIRFLOW_HOME’ in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

report

Show DagBag loading report

airflow dags report [-h] [--output FORMAT] [-S SUBDIR]
Named Arguments
--output

Possible choices: fancy_grid, github, grid, html, jira, latex, latex_booktabs, latex_raw, mediawiki, moinmoin, orgtbl, pipe, plain, presto, pretty, psql, rst, simple, textile, tsv, youtrack

Output table format. The specified value is passed to the tabulate module (https://pypi.org/project/tabulate/).

Default: “plain”

-S, --subdir

File location or directory from which to look for the DAG. Defaults to ‘[AIRFLOW_HOME]/dags’, where [AIRFLOW_HOME] is the value you set for ‘AIRFLOW_HOME’ in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

show

The --imgcat option only works in iTerm.

For more information, see: https://www.iterm2.com/documentation-images.html

The --save option saves the result to the indicated file.

The file format is determined by the file extension. For more information about supported formats, see: https://www.graphviz.org/doc/info/output.html

If you want to create a PNG file then you should execute the following command: airflow dags show <DAG_ID> --save output.png

If you want to create a DOT file then you should execute the following command: airflow dags show <DAG_ID> --save output.dot

airflow dags show [-h] [--imgcat] [-s SAVE] [-S SUBDIR] dag_id
Positional Arguments
dag_id

The id of the dag

Named Arguments
--imgcat

Displays graph using the imgcat tool.

Default: False

-s, --save

Saves the result to the indicated file.

-S, --subdir

File location or directory from which to look for the DAG. Defaults to ‘[AIRFLOW_HOME]/dags’, where [AIRFLOW_HOME] is the value you set for ‘AIRFLOW_HOME’ in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

state

Get the status of a dag run

airflow dags state [-h] [-S SUBDIR] dag_id execution_date
Positional Arguments
dag_id

The id of the dag

execution_date

The execution date of the DAG

Named Arguments
-S, --subdir

File location or directory from which to look for the DAG. Defaults to ‘[AIRFLOW_HOME]/dags’, where [AIRFLOW_HOME] is the value you set for ‘AIRFLOW_HOME’ in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

test

Execute one single DagRun for a given DAG and execution date, using the DebugExecutor.

The --imgcat-dagrun option only works in iTerm.

For more information, see: https://www.iterm2.com/documentation-images.html

If --save-dagrun is used, then, after completing the dag run, the diagram for the current DAG Run is saved to the indicated file. The file format is determined by the file extension. For more information about supported formats, see: https://www.graphviz.org/doc/info/output.html

If you want to create a PNG file then you should execute the following command: airflow dags test <DAG_ID> <EXECUTION_DATE> --save-dagrun output.png

If you want to create a DOT file then you should execute the following command: airflow dags test <DAG_ID> <EXECUTION_DATE> --save-dagrun output.dot

airflow dags test [-h] [--imgcat-dagrun] [--save-dagrun SAVE_DAGRUN]
                  [--show-dagrun] [-S SUBDIR]
                  dag_id execution_date
Positional Arguments
dag_id

The id of the dag

execution_date

The execution date of the DAG

Named Arguments
--imgcat-dagrun

After completing the dag run, prints a diagram on the screen for the current DAG Run using the imgcat tool.

Default: False

--save-dagrun

After completing the dag run, saves the diagram for the current DAG Run to the indicated file.

--show-dagrun

After completing the dag run, shows the diagram for the current DAG Run.

The diagram is in DOT language

Default: False

-S, --subdir

File location or directory from which to look for the DAG. Defaults to ‘[AIRFLOW_HOME]/dags’, where [AIRFLOW_HOME] is the value you set for ‘AIRFLOW_HOME’ in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

trigger

Trigger a DAG run

airflow dags trigger [-h] [-c CONF] [-e EXEC_DATE] [-r RUN_ID] [-S SUBDIR]
                     dag_id
Positional Arguments
dag_id

The id of the dag

Named Arguments
-c, --conf

JSON string that gets pickled into the DagRun’s conf attribute

-e, --exec-date

The execution date of the DAG

-r, --run-id

Helps to identify this run

-S, --subdir

File location or directory from which to look for the DAG. Defaults to ‘[AIRFLOW_HOME]/dags’, where [AIRFLOW_HOME] is the value you set for ‘AIRFLOW_HOME’ in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”
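
For example, to trigger a run with a custom run id and a JSON conf payload (the dag id, run id, and payload are placeholders):

airflow dags trigger -r manual_2020_01_01 -c '{"param": "value"}' example_dag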

unpause

Resume a paused DAG

airflow dags unpause [-h] [-S SUBDIR] dag_id
Positional Arguments
dag_id

The id of the dag

Named Arguments
-S, --subdir

File location or directory from which to look for the DAG. Defaults to ‘[AIRFLOW_HOME]/dags’, where [AIRFLOW_HOME] is the value you set for ‘AIRFLOW_HOME’ in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

db

Database operations

airflow db [-h] COMMAND ...

Positional Arguments

COMMAND

Possible choices: check, check-migrations, init, reset, shell, upgrade

Sub-commands:

check

Check if the database can be reached.

airflow db check [-h]
check-migrations

Check if migrations have finished (or continually check until timeout)

airflow db check-migrations [-h] [-t MIGRATION_WAIT_TIMEOUT]
Named Arguments
-t, --migration-wait-timeout

Timeout to wait for the db to migrate

Default: 0

init

Initialize the metadata database

airflow db init [-h]
reset

Burn down and rebuild the metadata database

airflow db reset [-h] [-y]
Named Arguments
-y, --yes

Do not prompt to confirm reset. Use with care!

Default: False

shell

Run a shell to access the database

airflow db shell [-h]
upgrade

Upgrade the metadata database to the latest version

airflow db upgrade [-h]

info

Show information about current Airflow and environment

airflow info [-h] [--anonymize] [--file-io]

Named Arguments

--anonymize

Minimize any personally identifiable information. Use it when sharing output with others.

Default: False

--file-io

Send output to the file.io service and return the link.

Default: False

kerberos

Start a kerberos ticket renewer

airflow kerberos [-h] [-D] [-k [KEYTAB]] [-l LOG_FILE] [--pid [PID]]
                 [--stderr STDERR] [--stdout STDOUT]
                 [principal]

Positional Arguments

principal

kerberos principal

Named Arguments

-D, --daemon

Daemonize instead of running in the foreground

Default: False

-k, --keytab

keytab

Default: “airflow.keytab”

-l, --log-file

Location of the log file

--pid

PID file location

--stderr

Redirect stderr to this file

--stdout

Redirect stdout to this file
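
For example, to run the ticket renewer as a daemon with an explicit keytab and principal (both values are placeholders):

airflow kerberos --daemon --keytab /etc/security/airflow.keytab airflow@EXAMPLE.COM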

plugins

Dump information about loaded plugins

airflow plugins [-h]

pools

CRUD operations on pools

airflow pools [-h] COMMAND ...

Positional Arguments

COMMAND

Possible choices: delete, export, get, import, list, set

Sub-commands:

delete

Delete pool

airflow pools delete [-h] [--output FORMAT] NAME
Positional Arguments
NAME

Pool name

Named Arguments
--output

Possible choices: fancy_grid, github, grid, html, jira, latex, latex_booktabs, latex_raw, mediawiki, moinmoin, orgtbl, pipe, plain, presto, pretty, psql, rst, simple, textile, tsv, youtrack

Output table format. The specified value is passed to the tabulate module (https://pypi.org/project/tabulate/).

Default: “plain”

export

Export all pools

airflow pools export [-h] [--output FORMAT] FILEPATH
Positional Arguments
FILEPATH

Export all pools to JSON file

Named Arguments
--output

Possible choices: fancy_grid, github, grid, html, jira, latex, latex_booktabs, latex_raw, mediawiki, moinmoin, orgtbl, pipe, plain, presto, pretty, psql, rst, simple, textile, tsv, youtrack

Output table format. The specified value is passed to the tabulate module (https://pypi.org/project/tabulate/).

Default: “plain”

get

Get pool size

airflow pools get [-h] [--output FORMAT] NAME
Positional Arguments
NAME

Pool name

Named Arguments
--output

Possible choices: fancy_grid, github, grid, html, jira, latex, latex_booktabs, latex_raw, mediawiki, moinmoin, orgtbl, pipe, plain, presto, pretty, psql, rst, simple, textile, tsv, youtrack

Output table format. The specified value is passed to the tabulate module (https://pypi.org/project/tabulate/).

Default: “plain”

import

Import pools

airflow pools import [-h] [--output FORMAT] FILEPATH
Positional Arguments
FILEPATH

Import pools from JSON file

Named Arguments
--output

Possible choices: fancy_grid, github, grid, html, jira, latex, latex_booktabs, latex_raw, mediawiki, moinmoin, orgtbl, pipe, plain, presto, pretty, psql, rst, simple, textile, tsv, youtrack

Output table format. The specified value is passed to the tabulate module (https://pypi.org/project/tabulate/).

Default: “plain”

list

List pools

airflow pools list [-h] [--output FORMAT]
Named Arguments
--output

Possible choices: fancy_grid, github, grid, html, jira, latex, latex_booktabs, latex_raw, mediawiki, moinmoin, orgtbl, pipe, plain, presto, pretty, psql, rst, simple, textile, tsv, youtrack

Output table format. The specified value is passed to the tabulate module (https://pypi.org/project/tabulate/).

Default: “plain”

set

Configure pool

airflow pools set [-h] [--output FORMAT] NAME slots description
Positional Arguments
NAME

Pool name

slots

Pool slots

description

Pool description

Named Arguments
--output

Possible choices: fancy_grid, github, grid, html, jira, latex, latex_booktabs, latex_raw, mediawiki, moinmoin, orgtbl, pipe, plain, presto, pretty, psql, rst, simple, textile, tsv, youtrack

Output table format. The specified value is passed to the tabulate module (https://pypi.org/project/tabulate/).

Default: “plain”
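
For example, to create or update a pool and then export all pools to a file (the pool name, size, and file path are placeholders):

airflow pools set etl_pool 10 "Pool for ETL tasks"

airflow pools export /tmp/pools.json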

roles

Create/List roles

airflow roles [-h] COMMAND ...

Positional Arguments

COMMAND

Possible choices: create, list

Sub-commands:

create

Create role

airflow roles create [-h] [role [role ...]]
Positional Arguments
role

The name of a role

list

List roles

airflow roles list [-h] [--output FORMAT]
Named Arguments
--output

Possible choices: fancy_grid, github, grid, html, jira, latex, latex_booktabs, latex_raw, mediawiki, moinmoin, orgtbl, pipe, plain, presto, pretty, psql, rst, simple, textile, tsv, youtrack

Output table format. The specified value is passed to the tabulate module (https://pypi.org/project/tabulate/).

Default: “plain”

rotate_fernet_key

Rotate all encrypted connection credentials and variables; see https://airflow.readthedocs.io/en/stable/howto/secure-connections.html#rotating-encryption-keys

airflow rotate_fernet_key [-h]

scheduler

Start a scheduler instance

airflow scheduler [-h] [-D] [-d DAG_ID] [-p] [-l LOG_FILE] [-n NUM_RUNS]
                  [--pid [PID]] [--stderr STDERR] [--stdout STDOUT]
                  [-S SUBDIR]

Named Arguments

-D, --daemon

Daemonize instead of running in the foreground

Default: False

-d, --dag-id

The id of the dag to run

-p, --do-pickle

Attempt to pickle the DAG object to send over to the workers, instead of letting workers run their version of the code

Default: False

-l, --log-file

Location of the log file

-n, --num-runs

Set the number of runs to execute before exiting

Default: -1

--pid

PID file location

--stderr

Redirect stderr to this file

--stdout

Redirect stdout to this file

-S, --subdir

File location or directory from which to look for the DAG. Defaults to ‘[AIRFLOW_HOME]/dags’, where [AIRFLOW_HOME] is the value you set for ‘AIRFLOW_HOME’ in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”
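
For example, to run a daemonized scheduler that exits after a fixed number of scheduling loops (the run count is a placeholder):

airflow scheduler --daemon --num-runs 100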

sync_perm

Update permissions for existing roles and DAGs

airflow sync_perm [-h]

tasks

List and manage tasks

airflow tasks [-h] COMMAND ...

Positional Arguments

COMMAND

Possible choices: clear, failed_deps, list, render, run, state, states_for_dag_run, test

Sub-commands:

clear

Clear a set of task instances, as if they never ran

airflow tasks clear [-h] [-R] [-d] [-e END_DATE] [-X] [-x] [-f] [-r]
                    [-s START_DATE] [-S SUBDIR] [-t TASK_REGEX] [-u] [-y]
                    dag_id
Positional Arguments
dag_id

The id of the dag

Named Arguments
-R, --dag-regex

Search dag_id as regex instead of exact string

Default: False

-d, --downstream

Include downstream tasks

Default: False

-e, --end-date

Override end_date YYYY-MM-DD

-X, --exclude-parentdag

Exclude ParentDAGs if the task cleared is part of a SubDAG

Default: False

-x, --exclude-subdags

Exclude subdags

Default: False

-f, --only-failed

Only failed jobs

Default: False

-r, --only-running

Only running jobs

Default: False

-s, --start-date

Override start_date YYYY-MM-DD

-S, --subdir

File location or directory from which to look for the DAG. Defaults to ‘[AIRFLOW_HOME]/dags’, where [AIRFLOW_HOME] is the value you set for ‘AIRFLOW_HOME’ in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

-t, --task-regex

The regex to filter specific task_ids to clear (optional)

-u, --upstream

Include upstream tasks

Default: False

-y, --yes

Do not prompt to confirm reset. Use with care!

Default: False
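
For example, to clear only the failed task instances matching a task regex over a date range, without a confirmation prompt (the dag id, regex, and dates are placeholders):

airflow tasks clear -t "extract_.*" -s 2020-01-01 -e 2020-01-07 --only-failed -y example_dag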

failed_deps

Returns the unmet dependencies for a task instance from the perspective of the scheduler. In other words, it explains why a task instance doesn't get scheduled and then queued by the scheduler, and then run by an executor.

airflow tasks failed_deps [-h] [-S SUBDIR] dag_id task_id execution_date
Positional Arguments
dag_id

The id of the dag

task_id

The id of the task

execution_date

The execution date of the DAG

Named Arguments
-S, --subdir

File location or directory from which to look for the DAG. Defaults to ‘[AIRFLOW_HOME]/dags’, where [AIRFLOW_HOME] is the value you set for ‘AIRFLOW_HOME’ in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

list

List the tasks within a DAG

airflow tasks list [-h] [-S SUBDIR] [-t] dag_id
Positional Arguments
dag_id

The id of the dag

Named Arguments
-S, --subdir

File location or directory from which to look for the DAG. Defaults to ‘[AIRFLOW_HOME]/dags’, where [AIRFLOW_HOME] is the value you set for ‘AIRFLOW_HOME’ in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

-t, --tree

Tree view

Default: False

render

Render a task instance’s template(s)

airflow tasks render [-h] [-S SUBDIR] dag_id task_id execution_date
Positional Arguments
dag_id

The id of the dag

task_id

The id of the task

execution_date

The execution date of the DAG

Named Arguments
-S, --subdir

File location or directory from which to look for the DAG. Defaults to ‘[AIRFLOW_HOME]/dags’, where [AIRFLOW_HOME] is the value you set for ‘AIRFLOW_HOME’ in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

run

Run a single task instance

airflow tasks run [-h] [--cfg-path CFG_PATH] [-f] [-A] [-i] [-I] [-N] [-l]
                  [-m] [-p PICKLE] [--pool POOL] [--ship-dag] [-S SUBDIR]
                  dag_id task_id execution_date
Positional Arguments
dag_id

The id of the dag

task_id

The id of the task

execution_date

The execution date of the DAG

Named Arguments
--cfg-path

Path to config file to use instead of airflow.cfg

-f, --force

Ignore previous task instance state and rerun regardless of whether the task already succeeded or failed

Default: False

-A, --ignore-all-dependencies

Ignores all non-critical dependencies, including ignore_ti_state and ignore_task_deps

Default: False

-i, --ignore-dependencies

Ignore task-specific dependencies, e.g. upstream, depends_on_past, and retry delay dependencies

Default: False

-I, --ignore-depends-on-past

Ignore depends_on_past dependencies (but respect upstream dependencies)

Default: False

-N, --interactive

Do not capture standard output and error streams (useful for interactive debugging)

Default: False

-l, --local

Run the task using the LocalExecutor

Default: False

-m, --mark-success

Mark jobs as succeeded without running them

Default: False

-p, --pickle

Serialized pickle object of the entire dag (used internally)

--pool

Resource pool to use

--ship-dag

Pickles (serializes) the DAG and ships it to the worker

Default: False

-S, --subdir

File location or directory from which to look for the DAG. Defaults to ‘[AIRFLOW_HOME]/dags’, where [AIRFLOW_HOME] is the value you set for ‘AIRFLOW_HOME’ in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”
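
For example, to run a single task instance locally, ignoring all non-critical dependencies (the dag id, task id, and execution date are placeholders):

airflow tasks run --local --ignore-all-dependencies example_dag extract_data 2020-01-01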

state

Get the status of a task instance

airflow tasks state [-h] [-S SUBDIR] dag_id task_id execution_date
Positional Arguments
dag_id

The id of the dag

task_id

The id of the task

execution_date

The execution date of the DAG

Named Arguments
-S, --subdir

File location or directory from which to look for the DAG. Defaults to ‘[AIRFLOW_HOME]/dags’, where [AIRFLOW_HOME] is the value you set for ‘AIRFLOW_HOME’ in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

states_for_dag_run

Get the status of all task instances in a dag run

airflow tasks states_for_dag_run [-h] [--output FORMAT] dag_id execution_date
Positional Arguments
dag_id

The id of the dag

execution_date

The execution date of the DAG

Named Arguments
--output

Possible choices: fancy_grid, github, grid, html, jira, latex, latex_booktabs, latex_raw, mediawiki, moinmoin, orgtbl, pipe, plain, presto, pretty, psql, rst, simple, textile, tsv, youtrack

Output table format. The specified value is passed to the tabulate module (https://pypi.org/project/tabulate/).

Default: “plain”

test

Test a task instance. This will run a task without checking for dependencies or recording its state in the database

airflow tasks test [-h] [-n] [--env-vars ENV_VARS] [-m] [-S SUBDIR]
                   [-t TASK_PARAMS]
                   dag_id task_id execution_date
Positional Arguments
dag_id

The id of the dag

task_id

The id of the task

execution_date

The execution date of the DAG

Named Arguments
-n, --dry-run

Perform a dry run for each task. Only renders Template Fields for each task, nothing else

Default: False

--env-vars

Set environment variables at both parse time and runtime for each entry supplied in a JSON dict

-m, --post-mortem

Open debugger on uncaught exception

Default: False

-S, --subdir

File location or directory from which to look for the DAG. Defaults to ‘[AIRFLOW_HOME]/dags’, where [AIRFLOW_HOME] is the value you set for ‘AIRFLOW_HOME’ in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

-t, --task-params

Sends a JSON params dict to the task
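
For example, to test a task with extra params and environment variables, without recording state in the database (the dag id, task id, execution date, and JSON values are placeholders):

airflow tasks test --task-params '{"limit": 100}' --env-vars '{"ENVIRONMENT": "dev"}' example_dag extract_data 2020-01-01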

users

CRUD operations on users

airflow users [-h] COMMAND ...

Positional Arguments

COMMAND

Possible choices: add_role, create, delete, export, import, list, remove_role

Sub-commands:

add_role

Add role to a user

airflow users add_role [-h] [-e EMAIL] -r ROLE [-u USERNAME]
Named Arguments
-e, --email

Email of the user

-r, --role

Role of the user. Existing roles include Admin, User, Op, Viewer, and Public

-u, --username

Username of the user

create

Create a user

airflow users create [-h] -e EMAIL -f FIRSTNAME -l LASTNAME [-p PASSWORD] -r
                     ROLE [--use-random-password] -u USERNAME
Named Arguments
-e, --email

Email of the user

-f, --firstname

First name of the user

-l, --lastname

Last name of the user

-p, --password

Password of the user, required to create a user without --use-random-password

-r, --role

Role of the user. Existing roles include Admin, User, Op, Viewer, and Public

--use-random-password

Do not prompt for a password. Use a random string instead. Required to create a user without --password

Default: False

-u, --username

Username of the user
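
For example, to create an admin user without being prompted for a password (the name, email, and username are placeholders):

airflow users create -u admin -f Jane -l Doe -e jane.doe@example.com -r Admin --use-random-password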

delete

Delete a user

airflow users delete [-h] -u USERNAME
Named Arguments
-u, --username

Username of the user

export

Export all users

airflow users export [-h] FILEPATH
Positional Arguments
FILEPATH

Export all users to JSON file

import

Import users

airflow users import [-h] FILEPATH
Positional Arguments
FILEPATH

Import users from JSON file. Example format:

[
    {
        "email": "foo@bar.org",
        "firstname": "Jon",
        "lastname": "Doe",
        "roles": ["Public"],
        "username": "jondoe"
    }
]
list

List users

airflow users list [-h] [--output FORMAT]
Named Arguments
--output

Possible choices: fancy_grid, github, grid, html, jira, latex, latex_booktabs, latex_raw, mediawiki, moinmoin, orgtbl, pipe, plain, presto, pretty, psql, rst, simple, textile, tsv, youtrack

Output table format. The specified value is passed to the tabulate module (https://pypi.org/project/tabulate/).

Default: “plain”

remove_role

Remove role from a user

airflow users remove_role [-h] [-e EMAIL] -r ROLE [-u USERNAME]
Named Arguments
-e, --email

Email of the user

-r, --role

Role of the user. Existing roles include Admin, User, Op, Viewer, and Public

-u, --username

Username of the user

variables

CRUD operations on variables

airflow variables [-h] COMMAND ...

Positional Arguments

COMMAND

Possible choices: delete, export, get, import, list, set

Sub-commands:

delete

Delete variable

airflow variables delete [-h] key
Positional Arguments
key

Variable key

export

Export all variables

airflow variables export [-h] file
Positional Arguments
file

Export all variables to JSON file

get

Get variable

airflow variables get [-h] [-d VAL] [-j] key
Positional Arguments
key

Variable key

Named Arguments
-d, --default

Default value returned if variable does not exist

-j, --json

Deserialize JSON variable

Default: False

import

Import variables

airflow variables import [-h] file
Positional Arguments
file

Import variables from JSON file

list

List variables

airflow variables list [-h]
set

Set variable

airflow variables set [-h] [-j] key VALUE
Positional Arguments
key

Variable key

VALUE

Variable value

Named Arguments
-j, --json

Deserialize JSON variable

Default: False
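
For example, to store a JSON value and read it back deserialized, with a fallback default (the key and values are placeholders):

airflow variables set -j config '{"env": "prod"}'

airflow variables get -j -d '{}' config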

version

Show the version

airflow version [-h]

webserver

Start an Airflow webserver instance

airflow webserver [-h] [-A ACCESS_LOGFILE] [-D] [-d] [-E ERROR_LOGFILE]
                  [-H HOSTNAME] [-l LOG_FILE] [--pid [PID]] [-p PORT]
                  [--ssl-cert SSL_CERT] [--ssl-key SSL_KEY] [--stderr STDERR]
                  [--stdout STDOUT] [-t WORKER_TIMEOUT]
                  [-k {sync,eventlet,gevent,tornado}] [-w WORKERS]

Named Arguments

-A, --access-logfile

The logfile to store the webserver access log. Use ‘-’ to print to stderr

Default: “-”

-D, --daemon

Daemonize instead of running in the foreground

Default: False

-d, --debug

Use the server that ships with Flask in debug mode

Default: False

-E, --error-logfile

The logfile to store the webserver error log. Use ‘-’ to print to stderr

Default: “-”

-H, --hostname

Set the hostname on which to run the web server

Default: “0.0.0.0”

-l, --log-file

Location of the log file

--pid

PID file location

-p, --port

The port on which to run the server

Default: 8080

--ssl-cert

Path to the SSL certificate for the webserver

Default: “”

--ssl-key

Path to the key to use with the SSL certificate

Default: “”

--stderr

Redirect stderr to this file

--stdout

Redirect stdout to this file

-t, --worker-timeout

The timeout for waiting on webserver workers

Default: 120

-k, --workerclass

Possible choices: sync, eventlet, gevent, tornado

The worker class to use for Gunicorn

Default: “sync”

-w, --workers

Number of workers to run the webserver on

Default: 4
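
For example, to run the webserver as a daemon on a non-default port with SSL enabled (the port and certificate paths are placeholders):

airflow webserver --port 8443 --ssl-cert /etc/ssl/airflow.crt --ssl-key /etc/ssl/airflow.key --daemon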