Command Line Interface Reference

Airflow provides a rich command line interface for operating on DAGs, starting services, and supporting development and testing.

Note

For more information on CLI usage, see Using the Command Line Interface

usage: airflow [-h]
               {celery,config,connections,dags,db,kerberos,pools,roles,rotate_fernet_key,scheduler,sync_perm,tasks,users,variables,version,webserver}
               ...

Positional Arguments

subcommand

Possible choices: celery, config, connections, dags, db, kerberos, pools, roles, rotate_fernet_key, scheduler, sync_perm, tasks, users, variables, version, webserver

sub-command help

Sub-commands:

celery

Start Celery components

airflow celery [-h] {flower,stop,worker} ...

Positional Arguments

subcommand

Possible choices: flower, stop, worker

Sub-commands:

flower

Start a Celery Flower

airflow celery flower [-h] [-ba BASIC_AUTH] [-a BROKER_API] [-D]
                      [-fc FLOWER_CONF] [-hn HOSTNAME] [-l LOG_FILE]
                      [--pid [PID]] [-p PORT] [--stderr STDERR]
                      [--stdout STDOUT] [-u URL_PREFIX]
Named Arguments
-ba, --basic_auth

Secure Flower with basic authentication. Accepts user:password pairs separated by commas. Example: flower_basic_auth = user1:password1,user2:password2

-a, --broker_api

Broker API

-D, --daemon

Daemonize instead of running in the foreground

Default: False

-fc, --flower_conf

Configuration file for flower

-hn, --hostname

Set the hostname on which to run the server

Default: “0.0.0.0”

-l, --log-file

Location of the log file

--pid

PID file location

-p, --port

The port on which to run the server

Default: 5555

--stderr

Redirect stderr to this file

--stdout

Redirect stdout to this file

-u, --url_prefix

URL prefix for Flower
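
For example, Flower could be started on its default port with basic authentication enabled as follows; the user1:password1 credentials are placeholders.

# user1:password1 is a placeholder credential pair
airflow celery flower --port 5555 --basic_auth user1:password1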

stop

Stop the Celery worker gracefully

airflow celery stop [-h]
worker

Start a Celery worker node

airflow celery worker [-h] [-a AUTOSCALE] [-cn CELERY_HOSTNAME]
                      [-c CONCURRENCY] [-D] [-p] [-l LOG_FILE] [--pid [PID]]
                      [-q QUEUES] [-s] [--stderr STDERR] [--stdout STDOUT]
Named Arguments
-a, --autoscale

Minimum and maximum number of workers to autoscale

-cn, --celery_hostname

Set the hostname of the Celery worker if you have multiple workers on a single machine

-c, --concurrency

The number of worker processes

Default: 8

-D, --daemon

Daemonize instead of running in the foreground

Default: False

-p, --do_pickle

Attempt to pickle the DAG object to send over to the workers, instead of letting workers run their version of the code

Default: False

-l, --log-file

Location of the log file

--pid

PID file location

-q, --queues

Comma delimited list of queues to serve

Default: “default”

-s, --skip_serve_logs

Don’t start the serve logs process along with the workers

Default: False

--stderr

Redirect stderr to this file

--stdout

Redirect stdout to this file
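
As an illustration, a daemonized worker serving two queues with a reduced process count might be started as follows; the second queue name is a placeholder.

# "reporting" is a placeholder queue name
airflow celery worker --queues default,reporting --concurrency 4 -D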

config

Show current application configuration

airflow config [-h]

connections

List/Add/Delete connections

airflow connections [-h] {add,delete,list} ...

Positional Arguments

subcommand

Possible choices: add, delete, list

Sub-commands:

add

Add a connection

airflow connections add [-h] [--conn_extra CONN_EXTRA] [--conn_host CONN_HOST]
                        [--conn_login CONN_LOGIN]
                        [--conn_password CONN_PASSWORD]
                        [--conn_port CONN_PORT] [--conn_schema CONN_SCHEMA]
                        [--conn_type CONN_TYPE] [--conn_uri CONN_URI]
                        conn_id
Positional Arguments
conn_id

Connection id, required to add/delete a connection

Named Arguments
--conn_extra

Connection Extra field, optional when adding a connection

--conn_host

Connection host, optional when adding a connection

--conn_login

Connection login, optional when adding a connection

--conn_password

Connection password, optional when adding a connection

--conn_port

Connection port, optional when adding a connection

--conn_schema

Connection schema, optional when adding a connection

--conn_type

Connection type, required to add a connection without conn_uri

--conn_uri

Connection URI, required to add a connection without conn_type
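
For instance, the same hypothetical Postgres connection could be added either via conn_uri or via the individual conn_* fields; the connection id, host, and credentials below are placeholders.

# my_postgres, db.example.com, and the credentials are placeholders
airflow connections add --conn_uri 'postgres://user:secret@db.example.com:5432/mydb' my_postgres
airflow connections add --conn_type postgres --conn_host db.example.com --conn_login user \
    --conn_password secret --conn_port 5432 --conn_schema mydb my_postgres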

delete

Delete a connection

airflow connections delete [-h] conn_id
Positional Arguments
conn_id

Connection id, required to add/delete a connection

list

List connections

airflow connections list [-h]
                         [--output {fancy_grid,github,grid,html,jira,latex,latex_booktabs,latex_raw,mediawiki,moinmoin,orgtbl,pipe,plain,presto,psql,rst,simple,textile,tsv,youtrack}]
Named Arguments
--output

Possible choices: fancy_grid, github, grid, html, jira, latex, latex_booktabs, latex_raw, mediawiki, moinmoin, orgtbl, pipe, plain, presto, psql, rst, simple, textile, tsv, youtrack

Output table format. The specified value is passed to the tabulate module (https://pypi.org/project/tabulate/).

Default: “fancy_grid”

dags

List and manage DAGs

airflow dags [-h]
             {backfill,delete,list,list_jobs,list_runs,next_execution,pause,show,state,test,trigger,unpause}
             ...

Positional Arguments

subcommand

Possible choices: backfill, delete, list, list_jobs, list_runs, next_execution, pause, show, state, test, trigger, unpause

Sub-commands:

backfill

Run subsections of a DAG for a specified date range. If the reset_dagruns option is used, backfill will first prompt the user whether Airflow should clear all previous dag_run and task_instance records within the backfill date range. If rerun_failed_tasks is used, backfill will automatically re-run the previously failed task instances within the backfill date range

airflow dags backfill [-h] [-c CONF] [--delay_on_limit DELAY_ON_LIMIT] [-x]
                      [-dr] [-e END_DATE] [-i] [-I] [-l] [-m] [--pool POOL]
                      [--rerun_failed_tasks] [--reset_dagruns] [-B]
                      [-s START_DATE] [-sd SUBDIR] [-t TASK_REGEX] [-v] [-y]
                      dag_id
Positional Arguments
dag_id

The id of the dag

Named Arguments
-c, --conf

JSON string that gets pickled into the DagRun’s conf attribute

--delay_on_limit

Amount of time in seconds to wait when the limit on maximum active dag runs (max_active_runs) has been reached before trying to execute a dag run again

Default: 1.0

-x, --donot_pickle

Do not attempt to pickle the DAG object to send over to the workers, just tell the workers to run their version of the code

Default: False

-dr, --dry_run

Perform a dry run

Default: False

-e, --end_date

Override end_date YYYY-MM-DD

-i, --ignore_dependencies

Skip upstream tasks, run only the tasks matching the regexp. Only works in conjunction with task_regex

Default: False

-I, --ignore_first_depends_on_past

Ignores depends_on_past dependencies for the first set of tasks only (subsequent executions in the backfill DO respect depends_on_past)

Default: False

-l, --local

Run the task using the LocalExecutor

Default: False

-m, --mark_success

Mark jobs as succeeded without running them

Default: False

--pool

Resource pool to use

--rerun_failed_tasks

If set, the backfill will auto-rerun all the failed tasks for the backfill date range instead of throwing exceptions

Default: False

--reset_dagruns

If set, the backfill will delete existing backfill-related DAG runs and start anew with fresh, running DAG runs

Default: False

-B, --run_backwards

If set, the backfill will run tasks from the most recent day first. If there are tasks that depend_on_past, this option will throw an exception

Default: False

-s, --start_date

Override start_date YYYY-MM-DD

-sd, --subdir

File location or directory from which to look for the dag. Defaults to ‘[AIRFLOW_HOME]/dags’, where [AIRFLOW_HOME] is the value you set for the ‘AIRFLOW_HOME’ config in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

-t, --task_regex

The regex to filter specific task_ids to backfill (optional)

-v, --verbose

Make logging output more verbose

Default: False

-y, --yes

Do not prompt to confirm reset. Use with care!

Default: False
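
For example, one week of a hypothetical DAG could be backfilled, restricted to tasks matching a regex and re-running previously failed task instances, as follows.

# example_dag, the dates, and the task regex are placeholders
airflow dags backfill -s 2020-01-01 -e 2020-01-07 -t "^load_" --rerun_failed_tasks example_dag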

delete

Delete all DB records related to the specified DAG

airflow dags delete [-h] [-y] dag_id
Positional Arguments
dag_id

The id of the dag

Named Arguments
-y, --yes

Do not prompt to confirm reset. Use with care!

Default: False

list

List all the DAGs

airflow dags list [-h] [-r] [-sd SUBDIR]
Named Arguments
-r, --report

Show DagBag loading report

Default: False

-sd, --subdir

File location or directory from which to look for the dag. Defaults to ‘[AIRFLOW_HOME]/dags’, where [AIRFLOW_HOME] is the value you set for the ‘AIRFLOW_HOME’ config in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

list_jobs

List the jobs

airflow dags list_jobs [-h] [-d DAG_ID] [--limit LIMIT]
                       [--output {fancy_grid,github,grid,html,jira,latex,latex_booktabs,latex_raw,mediawiki,moinmoin,orgtbl,pipe,plain,presto,psql,rst,simple,textile,tsv,youtrack}]
                       [--state STATE]
Named Arguments
-d, --dag_id

The id of the dag to run

--limit

Return a limited number of records

--output

Possible choices: fancy_grid, github, grid, html, jira, latex, latex_booktabs, latex_raw, mediawiki, moinmoin, orgtbl, pipe, plain, presto, psql, rst, simple, textile, tsv, youtrack

Output table format. The specified value is passed to the tabulate module (https://pypi.org/project/tabulate/).

Default: “fancy_grid”

--state

Only list the dag runs corresponding to the state

list_runs

List dag runs for a given DAG id. If the state option is given, only the dagruns with that state are listed. If the no_backfill option is given, all backfill dagruns for the given dag id are filtered out. If start_date is given, dagruns executed before this date are filtered out. If end_date is given, dagruns executed after this date are filtered out.

airflow dags list_runs [-h] [-d DAG_ID] [-e END_DATE] [--no_backfill]
                       [--output {fancy_grid,github,grid,html,jira,latex,latex_booktabs,latex_raw,mediawiki,moinmoin,orgtbl,pipe,plain,presto,psql,rst,simple,textile,tsv,youtrack}]
                       [-s START_DATE] [--state STATE]
Named Arguments
-d, --dag_id

The id of the dag to run

-e, --end_date

Override end_date YYYY-MM-DD

--no_backfill

Filter out all backfill dagruns for the given dag id

Default: False

--output

Possible choices: fancy_grid, github, grid, html, jira, latex, latex_booktabs, latex_raw, mediawiki, moinmoin, orgtbl, pipe, plain, presto, psql, rst, simple, textile, tsv, youtrack

Output table format. The specified value is passed to the tabulate module (https://pypi.org/project/tabulate/).

Default: “fancy_grid”

-s, --start_date

Override start_date YYYY-MM-DD

--state

Only list the dag runs corresponding to the state
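
For instance, to list only the failed, non-backfill runs of a hypothetical DAG as a plain table:

# example_dag is a placeholder DAG id
airflow dags list_runs -d example_dag --state failed --no_backfill --output plain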

next_execution

Get the next execution datetime of a DAG

airflow dags next_execution [-h] [-sd SUBDIR] dag_id
Positional Arguments
dag_id

The id of the dag

Named Arguments
-sd, --subdir

File location or directory from which to look for the dag. Defaults to ‘[AIRFLOW_HOME]/dags’, where [AIRFLOW_HOME] is the value you set for the ‘AIRFLOW_HOME’ config in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

pause

Pause a DAG

airflow dags pause [-h] [-sd SUBDIR] dag_id
Positional Arguments
dag_id

The id of the dag

Named Arguments
-sd, --subdir

File location or directory from which to look for the dag. Defaults to ‘[AIRFLOW_HOME]/dags’, where [AIRFLOW_HOME] is the value you set for the ‘AIRFLOW_HOME’ config in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

show

Displays DAG’s tasks with their dependencies

airflow dags show [-h] [--imgcat] [-s SAVE] [-sd SUBDIR] dag_id
Positional Arguments
dag_id

The id of the dag

Named Arguments
--imgcat

Displays graph using the imgcat tool.

For more information, see: https://www.iterm2.com/documentation-images.html

Default: False

-s, --save

Saves the result to the indicated file.

The file format is determined by the file extension. For more information about supported formats, see: https://www.graphviz.org/doc/info/output.html

To create a PNG file, execute: airflow dags show <DAG_ID> --save output.png

To create a DOT file, execute: airflow dags show <DAG_ID> --save output.dot

-sd, --subdir

File location or directory from which to look for the dag. Defaults to ‘[AIRFLOW_HOME]/dags’, where [AIRFLOW_HOME] is the value you set for the ‘AIRFLOW_HOME’ config in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

state

Get the status of a dag run

airflow dags state [-h] [-sd SUBDIR] dag_id execution_date
Positional Arguments
dag_id

The id of the dag

execution_date

The execution date of the DAG

Named Arguments
-sd, --subdir

File location or directory from which to look for the dag. Defaults to ‘[AIRFLOW_HOME]/dags’, where [AIRFLOW_HOME] is the value you set for the ‘AIRFLOW_HOME’ config in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

test

Execute one run of a DAG

airflow dags test [-h] [-sd SUBDIR] dag_id execution_date
Positional Arguments
dag_id

The id of the dag

execution_date

The execution date of the DAG

Named Arguments
-sd, --subdir

File location or directory from which to look for the dag. Defaults to ‘[AIRFLOW_HOME]/dags’, where [AIRFLOW_HOME] is the value you set for the ‘AIRFLOW_HOME’ config in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

trigger

Trigger a DAG run

airflow dags trigger [-h] [-c CONF] [-e EXEC_DATE] [-r RUN_ID] [-sd SUBDIR]
                     dag_id
Positional Arguments
dag_id

The id of the dag

Named Arguments
-c, --conf

JSON string that gets pickled into the DagRun’s conf attribute

-e, --exec_date

The execution date of the DAG

-r, --run_id

Helps to identify this run

-sd, --subdir

File location or directory from which to look for the dag. Defaults to ‘[AIRFLOW_HOME]/dags’, where [AIRFLOW_HOME] is the value you set for the ‘AIRFLOW_HOME’ config in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”
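
As an illustration, a hypothetical DAG could be triggered with an explicit run id and a small conf payload as follows.

# example_dag, manual_run_1, and the conf JSON are placeholders
airflow dags trigger -r manual_run_1 -c '{"retries": 1}' example_dag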

unpause

Resume a paused DAG

airflow dags unpause [-h] [-sd SUBDIR] dag_id
Positional Arguments
dag_id

The id of the dag

Named Arguments
-sd, --subdir

File location or directory from which to look for the dag. Defaults to ‘[AIRFLOW_HOME]/dags’, where [AIRFLOW_HOME] is the value you set for the ‘AIRFLOW_HOME’ config in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

db

Database operations

airflow db [-h] {check,init,reset,shell,upgrade} ...

Positional Arguments

subcommand

Possible choices: check, init, reset, shell, upgrade

Sub-commands:

check

Check if the database can be reached.

airflow db check [-h]
init

Initialize the metadata database

airflow db init [-h]
reset

Burn down and rebuild the metadata database

airflow db reset [-h] [-y]
Named Arguments
-y, --yes

Do not prompt to confirm reset. Use with care!

Default: False

shell

Runs a shell to access the database

airflow db shell [-h]
upgrade

Upgrade the metadata database to the latest version

airflow db upgrade [-h]

kerberos

Start a kerberos ticket renewer

airflow kerberos [-h] [-D] [-kt [KEYTAB]] [-l LOG_FILE] [--pid [PID]]
                 [--stderr STDERR] [--stdout STDOUT]
                 [principal]

Positional Arguments

principal

kerberos principal

Named Arguments

-D, --daemon

Daemonize instead of running in the foreground

Default: False

-kt, --keytab

keytab

Default: “airflow.keytab”

-l, --log-file

Location of the log file

--pid

PID file location

--stderr

Redirect stderr to this file

--stdout

Redirect stdout to this file

pools

CRUD operations on pools

airflow pools [-h] {delete,export,get,import,list,set} ...

Positional Arguments

subcommand

Possible choices: delete, export, get, import, list, set

Sub-commands:

delete

Delete pool

airflow pools delete [-h]
                     [--output {fancy_grid,github,grid,html,jira,latex,latex_booktabs,latex_raw,mediawiki,moinmoin,orgtbl,pipe,plain,presto,psql,rst,simple,textile,tsv,youtrack}]
                     NAME
Positional Arguments
NAME

Pool name

Named Arguments
--output

Possible choices: fancy_grid, github, grid, html, jira, latex, latex_booktabs, latex_raw, mediawiki, moinmoin, orgtbl, pipe, plain, presto, psql, rst, simple, textile, tsv, youtrack

Output table format. The specified value is passed to the tabulate module (https://pypi.org/project/tabulate/).

Default: “fancy_grid”

export

Export all pools

airflow pools export [-h]
                     [--output {fancy_grid,github,grid,html,jira,latex,latex_booktabs,latex_raw,mediawiki,moinmoin,orgtbl,pipe,plain,presto,psql,rst,simple,textile,tsv,youtrack}]
                     FILEPATH
Positional Arguments
FILEPATH

Export all pools to JSON file

Named Arguments
--output

Possible choices: fancy_grid, github, grid, html, jira, latex, latex_booktabs, latex_raw, mediawiki, moinmoin, orgtbl, pipe, plain, presto, psql, rst, simple, textile, tsv, youtrack

Output table format. The specified value is passed to the tabulate module (https://pypi.org/project/tabulate/).

Default: “fancy_grid”

get

Get pool size

airflow pools get [-h]
                  [--output {fancy_grid,github,grid,html,jira,latex,latex_booktabs,latex_raw,mediawiki,moinmoin,orgtbl,pipe,plain,presto,psql,rst,simple,textile,tsv,youtrack}]
                  NAME
Positional Arguments
NAME

Pool name

Named Arguments
--output

Possible choices: fancy_grid, github, grid, html, jira, latex, latex_booktabs, latex_raw, mediawiki, moinmoin, orgtbl, pipe, plain, presto, psql, rst, simple, textile, tsv, youtrack

Output table format. The specified value is passed to the tabulate module (https://pypi.org/project/tabulate/).

Default: “fancy_grid”

import

Import pools

airflow pools import [-h]
                     [--output {fancy_grid,github,grid,html,jira,latex,latex_booktabs,latex_raw,mediawiki,moinmoin,orgtbl,pipe,plain,presto,psql,rst,simple,textile,tsv,youtrack}]
                     FILEPATH
Positional Arguments
FILEPATH

Import pools from JSON file

Named Arguments
--output

Possible choices: fancy_grid, github, grid, html, jira, latex, latex_booktabs, latex_raw, mediawiki, moinmoin, orgtbl, pipe, plain, presto, psql, rst, simple, textile, tsv, youtrack

Output table format. The specified value is passed to the tabulate module (https://pypi.org/project/tabulate/).

Default: “fancy_grid”

list

List pools

airflow pools list [-h]
                   [--output {fancy_grid,github,grid,html,jira,latex,latex_booktabs,latex_raw,mediawiki,moinmoin,orgtbl,pipe,plain,presto,psql,rst,simple,textile,tsv,youtrack}]
Named Arguments
--output

Possible choices: fancy_grid, github, grid, html, jira, latex, latex_booktabs, latex_raw, mediawiki, moinmoin, orgtbl, pipe, plain, presto, psql, rst, simple, textile, tsv, youtrack

Output table format. The specified value is passed to the tabulate module (https://pypi.org/project/tabulate/).

Default: “fancy_grid”

set

Configure pool

airflow pools set [-h]
                  [--output {fancy_grid,github,grid,html,jira,latex,latex_booktabs,latex_raw,mediawiki,moinmoin,orgtbl,pipe,plain,presto,psql,rst,simple,textile,tsv,youtrack}]
                  NAME slots description
Positional Arguments
NAME

Pool name

slots

Pool slots

description

Pool description

Named Arguments
--output

Possible choices: fancy_grid, github, grid, html, jira, latex, latex_booktabs, latex_raw, mediawiki, moinmoin, orgtbl, pipe, plain, presto, psql, rst, simple, textile, tsv, youtrack

Output table format. The specified value is passed to the tabulate module (https://pypi.org/project/tabulate/).

Default: “fancy_grid”
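
For example, a hypothetical pool with 8 slots could be created (or updated) as follows.

# ml_training and its description are placeholders
airflow pools set ml_training 8 "Slots reserved for ML training tasks"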

roles

Create/List roles

airflow roles [-h] {create,list} ...

Positional Arguments

subcommand

Possible choices: create, list

Sub-commands:

create

Create role

airflow roles create [-h] [role [role ...]]
Positional Arguments
role

The name of a role

list

List roles

airflow roles list [-h]
                   [--output {fancy_grid,github,grid,html,jira,latex,latex_booktabs,latex_raw,mediawiki,moinmoin,orgtbl,pipe,plain,presto,psql,rst,simple,textile,tsv,youtrack}]
Named Arguments
--output

Possible choices: fancy_grid, github, grid, html, jira, latex, latex_booktabs, latex_raw, mediawiki, moinmoin, orgtbl, pipe, plain, presto, psql, rst, simple, textile, tsv, youtrack

Output table format. The specified value is passed to the tabulate module (https://pypi.org/project/tabulate/).

Default: “fancy_grid”

rotate_fernet_key

Rotate all encrypted connection credentials and variables; see https://airflow.readthedocs.io/en/stable/howto/secure-connections.html#rotating-encryption-keys

airflow rotate_fernet_key [-h]

scheduler

Start a scheduler instance

airflow scheduler [-h] [-D] [-d DAG_ID] [-p] [-l LOG_FILE] [-n NUM_RUNS]
                  [--pid [PID]] [--stderr STDERR] [--stdout STDOUT]
                  [-sd SUBDIR]

Named Arguments

-D, --daemon

Daemonize instead of running in the foreground

Default: False

-d, --dag_id

The id of the dag to run

-p, --do_pickle

Attempt to pickle the DAG object to send over to the workers, instead of letting workers run their version of the code

Default: False

-l, --log-file

Location of the log file

-n, --num_runs

Set the number of runs to execute before exiting

Default: -1

--pid

PID file location

--stderr

Redirect stderr to this file

--stdout

Redirect stdout to this file

-sd, --subdir

File location or directory from which to look for the dag. Defaults to ‘[AIRFLOW_HOME]/dags’, where [AIRFLOW_HOME] is the value you set for the ‘AIRFLOW_HOME’ config in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”
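
As an illustration, the scheduler could be run as a daemon that writes a PID file and exits after a fixed number of runs; the path and run count below are placeholders.

# the PID file path and run count are placeholders
airflow scheduler -D --pid /var/run/airflow-scheduler.pid -n 100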

sync_perm

Update permissions for existing roles and DAGs

airflow sync_perm [-h]

tasks

List and manage tasks

airflow tasks [-h]
              {clear,failed_deps,list,render,run,state,states_for_dag_run,test}
              ...

Positional Arguments

subcommand

Possible choices: clear, failed_deps, list, render, run, state, states_for_dag_run, test

Sub-commands:

clear

Clear a set of task instances, as if they never ran

airflow tasks clear [-h] [-dx] [-d] [-e END_DATE] [-xp] [-x] [-f] [-r]
                    [-s START_DATE] [-sd SUBDIR] [-t TASK_REGEX] [-u] [-y]
                    dag_id
Positional Arguments
dag_id

The id of the dag

Named Arguments
-dx, --dag_regex

Search dag_id as regex instead of exact string

Default: False

-d, --downstream

Include downstream tasks

Default: False

-e, --end_date

Override end_date YYYY-MM-DD

-xp, --exclude_parentdag

Exclude ParentDAGs if the task cleared is part of a SubDAG

Default: False

-x, --exclude_subdags

Exclude subdags

Default: False

-f, --only_failed

Only failed jobs

Default: False

-r, --only_running

Only running jobs

Default: False

-s, --start_date

Override start_date YYYY-MM-DD

-sd, --subdir

File location or directory from which to look for the dag. Defaults to ‘[AIRFLOW_HOME]/dags’, where [AIRFLOW_HOME] is the value you set for the ‘AIRFLOW_HOME’ config in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

-t, --task_regex

The regex to filter specific task_ids to clear (optional)

-u, --upstream

Include upstream tasks

Default: False

-y, --yes

Do not prompt to confirm reset. Use with care!

Default: False
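
For example, only the failed task instances matching a regex over one week of a hypothetical DAG could be cleared, skipping the confirmation prompt, as follows.

# example_dag, the task regex, and the dates are placeholders
airflow tasks clear -t "^transform_" -s 2020-01-01 -e 2020-01-07 --only_failed -y example_dag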

failed_deps

Returns the unmet dependencies for a task instance from the perspective of the scheduler. In other words, why a task instance doesn’t get scheduled and then queued by the scheduler, and then run by an executor.

airflow tasks failed_deps [-h] [-sd SUBDIR] dag_id task_id execution_date
Positional Arguments
dag_id

The id of the dag

task_id

The id of the task

execution_date

The execution date of the DAG

Named Arguments
-sd, --subdir

File location or directory from which to look for the dag. Defaults to ‘[AIRFLOW_HOME]/dags’, where [AIRFLOW_HOME] is the value you set for the ‘AIRFLOW_HOME’ config in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

list

List the tasks within a DAG

airflow tasks list [-h] [-sd SUBDIR] [-t] dag_id
Positional Arguments
dag_id

The id of the dag

Named Arguments
-sd, --subdir

File location or directory from which to look for the dag. Defaults to ‘[AIRFLOW_HOME]/dags’, where [AIRFLOW_HOME] is the value you set for the ‘AIRFLOW_HOME’ config in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

-t, --tree

Tree view

Default: False

render

Render a task instance’s template(s)

airflow tasks render [-h] [-sd SUBDIR] dag_id task_id execution_date
Positional Arguments
dag_id

The id of the dag

task_id

The id of the task

execution_date

The execution date of the DAG

Named Arguments
-sd, --subdir

File location or directory from which to look for the dag. Defaults to ‘[AIRFLOW_HOME]/dags’, where [AIRFLOW_HOME] is the value you set for the ‘AIRFLOW_HOME’ config in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

run

Run a single task instance

airflow tasks run [-h] [--cfg_path CFG_PATH] [-f] [-A] [-i] [-I] [-int] [-l]
                  [-m] [-p PICKLE] [--pool POOL] [--ship_dag] [-sd SUBDIR]
                  dag_id task_id execution_date
Positional Arguments
dag_id

The id of the dag

task_id

The id of the task

execution_date

The execution date of the DAG

Named Arguments
--cfg_path

Path to config file to use instead of airflow.cfg

-f, --force

Ignore previous task instance state and rerun the task regardless of whether it already succeeded or failed

Default: False

-A, --ignore_all_dependencies

Ignores all non-critical dependencies, including ignore_ti_state and ignore_task_deps

Default: False

-i, --ignore_dependencies

Ignore task-specific dependencies, e.g. upstream, depends_on_past, and retry delay dependencies

Default: False

-I, --ignore_depends_on_past

Ignore depends_on_past dependencies (but respect upstream dependencies)

Default: False

-int, --interactive

Do not capture standard output and error streams (useful for interactive debugging)

Default: False

-l, --local

Run the task using the LocalExecutor

Default: False

-m, --mark_success

Mark jobs as succeeded without running them

Default: False

-p, --pickle

Serialized pickle object of the entire dag (used internally)

--pool

Resource pool to use

--ship_dag

Pickles (serializes) the DAG and ships it to the worker

Default: False

-sd, --subdir

File location or directory from which to look for the dag. Defaults to ‘[AIRFLOW_HOME]/dags’, where [AIRFLOW_HOME] is the value you set for the ‘AIRFLOW_HOME’ config in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

state

Get the status of a task instance

airflow tasks state [-h] [-sd SUBDIR] dag_id task_id execution_date
Positional Arguments
dag_id

The id of the dag

task_id

The id of the task

execution_date

The execution date of the DAG

Named Arguments
-sd, --subdir

File location or directory from which to look for the dag. Defaults to ‘[AIRFLOW_HOME]/dags’, where [AIRFLOW_HOME] is the value you set for the ‘AIRFLOW_HOME’ config in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

states_for_dag_run

Get the status of all task instances in a dag run

airflow tasks states_for_dag_run [-h]
                                 [--output {fancy_grid,github,grid,html,jira,latex,latex_booktabs,latex_raw,mediawiki,moinmoin,orgtbl,pipe,plain,presto,psql,rst,simple,textile,tsv,youtrack}]
                                 dag_id execution_date
Positional Arguments
dag_id

The id of the dag

execution_date

The execution date of the DAG

Named Arguments
--output

Possible choices: fancy_grid, github, grid, html, jira, latex, latex_booktabs, latex_raw, mediawiki, moinmoin, orgtbl, pipe, plain, presto, psql, rst, simple, textile, tsv, youtrack

Output table format. The specified value is passed to the tabulate module (https://pypi.org/project/tabulate/).

Default: “fancy_grid”

test

Test a task instance. This will run a task without checking for dependencies or recording its state in the database

airflow tasks test [-h] [-dr] [-pm] [-sd SUBDIR] [-tp TASK_PARAMS]
                   dag_id task_id execution_date
Positional Arguments
dag_id

The id of the dag

task_id

The id of the task

execution_date

The execution date of the DAG

Named Arguments
-dr, --dry_run

Perform a dry run

Default: False

-pm, --post_mortem

Open debugger on uncaught exception

Default: False

-sd, --subdir

File location or directory from which to look for the dag. Defaults to ‘[AIRFLOW_HOME]/dags’, where [AIRFLOW_HOME] is the value you set for the ‘AIRFLOW_HOME’ config in ‘airflow.cfg’

Default: “[AIRFLOW_HOME]/dags”

-tp, --task_params

Sends a JSON params dict to the task
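
For instance, a single hypothetical task could be test-run for a given execution date with an ad-hoc params dict as follows.

# example_dag, extract_data, the date, and the params JSON are placeholders
airflow tasks test -tp '{"limit": 100}' example_dag extract_data 2020-01-01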

users

CRUD operations on users

airflow users [-h] {add_role,create,delete,export,import,list,remove_role} ...

Positional Arguments

subcommand

Possible choices: add_role, create, delete, export, import, list, remove_role

Sub-commands:

add_role

Add role to a user

airflow users add_role [-h] [--email EMAIL] --role ROLE [--username USERNAME]
Named Arguments
--email

Email of the user

--role

Role of the user. Existing roles include Admin, User, Op, Viewer, and Public

--username

Username of the user

create

Create a user

airflow users create [-h] --email EMAIL --firstname FIRSTNAME --lastname
                     LASTNAME [--password PASSWORD] --role ROLE
                     [--use_random_password] --username USERNAME
Named Arguments
--email

Email of the user

--firstname

First name of the user

--lastname

Last name of the user

--password

Password of the user, required to create a user without --use_random_password

--role

Role of the user. Existing roles include Admin, User, Op, Viewer, and Public

--use_random_password

Do not prompt for password. Use a random string instead. Required to create a user without --password

Default: False

--username

Username of the user
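
For example, a Viewer account for the sample user shown in the import section below could be created with a randomly generated password as follows.

# jondoe and the contact details are placeholders
airflow users create --username jondoe --firstname Jon --lastname Doe \
    --email foo@bar.org --role Viewer --use_random_password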

delete

Delete a user

airflow users delete [-h] --username USERNAME
Named Arguments
--username

Username of the user

export

Export all users

airflow users export [-h] FILEPATH
Positional Arguments
FILEPATH

Export all users to JSON file

import

Import users

airflow users import [-h] FILEPATH
Positional Arguments
FILEPATH

Import users from JSON file. Example format:

[
    {
        "email": "foo@bar.org",
        "firstname": "Jon",
        "lastname": "Doe",
        "roles": ["Public"],
        "username": "jondoe"
    }
]
list

List users

airflow users list [-h]
                   [--output {fancy_grid,github,grid,html,jira,latex,latex_booktabs,latex_raw,mediawiki,moinmoin,orgtbl,pipe,plain,presto,psql,rst,simple,textile,tsv,youtrack}]
Named Arguments
--output

Possible choices: fancy_grid, github, grid, html, jira, latex, latex_booktabs, latex_raw, mediawiki, moinmoin, orgtbl, pipe, plain, presto, psql, rst, simple, textile, tsv, youtrack

Output table format. The specified value is passed to the tabulate module (https://pypi.org/project/tabulate/).

Default: “fancy_grid”

remove_role

Remove role from a user

airflow users remove_role [-h] [--email EMAIL] --role ROLE
                          [--username USERNAME]
Named Arguments
--email

Email of the user

--role

Role of the user. Existing roles include Admin, User, Op, Viewer, and Public

--username

Username of the user

variables

CRUD operations on variables

airflow variables [-h] {delete,export,get,import,list,set} ...

Positional Arguments

subcommand

Possible choices: delete, export, get, import, list, set

Sub-commands:

delete

Delete variable

airflow variables delete [-h] key
Positional Arguments
key

Variable key

export

Export all variables

airflow variables export [-h] file
Positional Arguments
file

Export all variables to JSON file

get

Get variable

airflow variables get [-h] [-d VAL] [-j] key
Positional Arguments
key

Variable key

Named Arguments
-d, --default

Default value returned if variable does not exist

-j, --json

Deserialize JSON variable

Default: False

import

Import variables

airflow variables import [-h] file
Positional Arguments
file

Import variables from JSON file

list

List variables

airflow variables list [-h]
set

Set variable

airflow variables set [-h] [-j] key VALUE
Positional Arguments
key

Variable key

VALUE

Variable value

Named Arguments
-j, --json

Deserialize JSON variable

Default: False
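
For instance, a JSON string could be stored under a hypothetical key and read back deserialized as follows.

# env_config and its value are placeholders
airflow variables set env_config '{"retries": 3}'
airflow variables get -j env_config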

version

Show the version

airflow version [-h]

webserver

Start an Airflow webserver instance

airflow webserver [-h] [-A ACCESS_LOGFILE] [-D] [-d] [-E ERROR_LOGFILE]
                  [-hn HOSTNAME] [-l LOG_FILE] [--pid [PID]] [-p PORT]
                  [--ssl_cert SSL_CERT] [--ssl_key SSL_KEY] [--stderr STDERR]
                  [--stdout STDOUT] [-t WORKER_TIMEOUT]
                  [-k {sync,eventlet,gevent,tornado}] [-w WORKERS]

Named Arguments

-A, --access_logfile

The logfile to store the webserver access log. Use ‘-’ to print to stderr

Default: “-”

-D, --daemon

Daemonize instead of running in the foreground

Default: False

-d, --debug

Use the server that ships with Flask in debug mode

Default: False

-E, --error_logfile

The logfile to store the webserver error log. Use ‘-’ to print to stderr

Default: “-”

-hn, --hostname

Set the hostname on which to run the web server

Default: “0.0.0.0”

-l, --log-file

Location of the log file

--pid

PID file location

-p, --port

The port on which to run the server

Default: 8080

--ssl_cert

Path to the SSL certificate for the webserver

--ssl_key

Path to the key to use with the SSL certificate

--stderr

Redirect stderr to this file

--stdout

Redirect stdout to this file

-t, --worker_timeout

The timeout for waiting on webserver workers

Default: 120

-k, --workerclass

Possible choices: sync, eventlet, gevent, tornado

The worker class to use for Gunicorn

Default: “sync”

-w, --workers

Number of workers to run the webserver on

Default: 4
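
As an illustration, the webserver could be run as a daemon on the default port with four workers and a dedicated access log; the log file path below is a placeholder.

# the access log path is a placeholder
airflow webserver --port 8080 --workers 4 --access_logfile /var/log/airflow/webserver-access.log -D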