The easiest way to install the latest stable version of Airflow is with pip:

```shell
pip install apache-airflow
```

You can also install Airflow with support for extra features such as `s3` or `postgres`:

```shell
pip install "apache-airflow[s3,postgres]"
```
By default, one of Apache Airflow's dependencies pulls in a GPL library (`unidecode`).
If this is a concern, you can force a non-GPL library by issuing
`export SLUGIFY_USES_TEXT_UNIDECODE=yes` and then proceeding with the normal installation.
Please note that this needs to be specified at every upgrade. Also note that if `unidecode`
is already present on the system, that dependency will still be used.
The `apache-airflow` PyPI basic package only installs what's needed to get started.
Subpackages can be installed depending on what will be useful in your
environment. For instance, if you don't need connectivity with Postgres,
you won't have to go through the trouble of installing the corresponding
yum package, or whatever equivalent applies on the distribution you are using.
Behind the scenes, Airflow does conditional imports of operators that require these extra dependencies.
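The conditional-import pattern described above can be sketched roughly as follows. This is an illustrative helper, not Airflow's actual code, and the module names are placeholders:

```python
import importlib


def optional_import(module_name):
    """Try to import an optional dependency; return None if unavailable.

    Airflow guards operator imports in this spirit, so operators that
    need an extra dependency are only exposed when it is installed.
    """
    try:
        return importlib.import_module(module_name)
    except ImportError:
        return None


# A stdlib module is always importable, so we get the module object back.
json_mod = optional_import("json")

# A package that is not installed fails to import and yields None,
# so the operators depending on it would simply not be registered.
missing = optional_import("definitely_not_installed_pkg")
```

Installing an extra such as `postgres` makes the corresponding optional import succeed, which is what activates the related operators and hooks.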
Here’s the list of the subpackages and what they enable:
| Subpackage | Install command | Enables |
|---|---|---|
| all | `pip install apache-airflow[all]` | All Airflow features known to man |
| all_dbs | `pip install apache-airflow[all_dbs]` | All databases integrations |
| async | `pip install apache-airflow[async]` | Async worker classes for gunicorn |
| devel | `pip install apache-airflow[devel]` | Minimum dev tools requirements |
| devel_hadoop | `pip install apache-airflow[devel_hadoop]` | Airflow + dependencies on the Hadoop stack |
| crypto | `pip install apache-airflow[crypto]` | Encrypt connection passwords in metadata db |
| druid | `pip install apache-airflow[druid]` | Druid.io related operators & hooks |
| gcp_api | `pip install apache-airflow[gcp_api]` | Google Cloud Platform hooks and operators |
| jdbc | `pip install apache-airflow[jdbc]` | JDBC hooks and operators |
| hdfs | `pip install apache-airflow[hdfs]` | HDFS hooks and operators |
| hive | `pip install apache-airflow[hive]` | All Hive related operators |
| kerberos | `pip install apache-airflow[kerberos]` | Kerberos integration for kerberized Hadoop |
| ldap | `pip install apache-airflow[ldap]` | LDAP authentication for users |
| mssql | `pip install apache-airflow[mssql]` | Microsoft SQL operators and hook, support as an Airflow backend |
| mysql | `pip install apache-airflow[mysql]` | MySQL operators and hook, support as an Airflow backend. The version of the MySQL server has to be 5.6.4+. The exact version upper bound depends on the version of the mysqlclient package. |
| password | `pip install apache-airflow[password]` | Password authentication for users |
| postgres | `pip install apache-airflow[postgres]` | Postgres operators and hook, support as an Airflow backend |
| qds | `pip install apache-airflow[qds]` | Enable QDS (Qubole Data Service) support |
| rabbitmq | `pip install apache-airflow[rabbitmq]` | RabbitMQ support as a Celery backend |
| vertica | `pip install apache-airflow[vertica]` | Vertica hook support as an Airflow backend |
| redis | `pip install apache-airflow[redis]` | Redis hooks and sensors |
Initiating Airflow Database
Airflow requires a database to be initiated before you can run tasks. If you're just experimenting and learning Airflow, you can stick with the default SQLite option. If you don't want to use SQLite, then take a look at Initializing a Database Backend to set up a different database.
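The database is selected through the `sql_alchemy_conn` setting in `airflow.cfg`; the following fragment is only an illustration, and the file paths and credentials are placeholders:

```ini
[core]
# Default: a local SQLite file, fine for experimenting and learning.
sql_alchemy_conn = sqlite:////home/user/airflow/airflow.db

# For anything beyond experimentation, point at a real backend instead,
# e.g. Postgres (requires the "postgres" extra):
# sql_alchemy_conn = postgresql+psycopg2://user:password@localhost/airflow
```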
After configuration, you’ll need to initialize the database before you can run tasks: