Open your IDE or source code editor and select the option to clone the repository.
Paste the copied clone URL into the URL field and submit.
- Configuring Airflow database connection
  - By default, Airflow is configured to use a SQLite database. The configuration can be seen on the local machine in `~/airflow/airflow.cfg` under `sql_alchemy_conn`.
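
    If you want to confirm the value without opening the file, it can be read programmatically. A minimal sketch, assuming the `airflow-env` virtualenv with Airflow installed is active:

    ```python
    # Print the configured metadata-database connection string.
    # Assumption: run inside the airflow-env virtualenv where Airflow is installed.
    from airflow.configuration import conf

    print(conf.get("core", "sql_alchemy_conn"))
    ```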
  - Install the required dependency for a MySQL connection in `airflow-env` on the local machine:

    ```bash
    $ pyenv activate airflow-env
    $ pip install PyMySQL
    ```
  - Now set `sql_alchemy_conn = mysql+pymysql://root:@127.0.0.1:23306/airflow?charset=utf8mb4` in the file `~/airflow/airflow.cfg` on the local machine.
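
    Before pointing Airflow at MySQL, it can help to verify that the database is reachable with this connection string. A minimal sketch, assuming the Breeze MySQL container is listening on 127.0.0.1:23306 and PyMySQL is installed as above:

    ```python
    # Sanity-check the MySQL connection string before restarting Airflow.
    # Assumption: SQLAlchemy (an Airflow dependency) and PyMySQL are available.
    from sqlalchemy import create_engine, text

    engine = create_engine(
        "mysql+pymysql://root:@127.0.0.1:23306/airflow?charset=utf8mb4"
    )
    with engine.connect() as conn:
        print(conn.execute(text("SELECT VERSION()")).scalar())
    ```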
- Debugging an example DAG
  - Open the airflow project in Visual Studio Code. The directory `/files/dags` of the local machine is by default mounted into the Docker machine when Breeze airflow is started, so any DAG file placed in this directory is picked up automatically by the scheduler running in the Docker machine and can be seen at http://127.0.0.1:28080.
  - Copy any example DAG present in the `/airflow/example_dags` directory to `/files/dags/`.
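
    If you prefer to do the copy from Python, here is a hypothetical sketch; `~/airflow-src` is a placeholder for wherever your Airflow checkout lives, not a path from the docs:

    ```python
    # Copy one example DAG into the mounted DAGs directory.
    # Assumption: "~/airflow-src" is a placeholder for your Airflow checkout.
    import shutil
    from pathlib import Path

    repo = Path.home() / "airflow-src"
    src = repo / "airflow" / "example_dags" / "example_bash_operator.py"
    shutil.copy(src, repo / "files" / "dags" / src.name)
    ```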
  - Add a `__main__` block at the end of your DAG file to make it runnable. It will run a `back_fill` job:

    ```python
    if __name__ == "__main__":
        dag.clear()
        dag.run()
    ```
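
    For context, this is what a complete, minimal DAG file with the `__main__` block appended could look like. The DAG itself is illustrative (made-up `dag_id` and task), not one of the shipped examples:

    ```python
    # A minimal runnable DAG file (a sketch; dag_id and task are invented).
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="example_debug",
        start_date=datetime(2021, 1, 1),
        schedule_interval=None,
    ) as dag:
        BashOperator(task_id="hello", bash_command="echo hello")

    if __name__ == "__main__":
        # With the DebugExecutor configured, the backfill runs in-process,
        # so breakpoints set in task code are hit.
        dag.clear()
        dag.run()
    ```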
  - Add `"AIRFLOW__CORE__EXECUTOR": "DebugExecutor"` to the `"env"` field of the Debug configuration:

    - Using the `Run` view, click on `Create a launch.json file`.

    - Change `"program"` to point to an example DAG, and add `"env"` and `"python"` fields to the new Python configuration:

      ```json
      {
          "configurations": [
              {
                  "program": "${workspaceFolder}/files/dags/example_bash_operator.py",
                  "env": {
                      "PYTHONUNBUFFERED": "1",
                      "AIRFLOW__CORE__EXECUTOR": "DebugExecutor"
                  },
                  "python": "${env:HOME}/.pyenv/versions/airflow-env/bin/python"
              }
          ]
      }
      ```
  - Now debug an example DAG and view the entries in tables such as `dag_run` and `xcom` in MySQL Workbench.
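
    If MySQL Workbench is not at hand, the same tables can be inspected from Python by reusing the connection string configured above. A minimal sketch:

    ```python
    # List DAG runs straight from the Airflow metadata database.
    # Assumption: the Breeze MySQL connection string from airflow.cfg above.
    from sqlalchemy import create_engine, text

    engine = create_engine(
        "mysql+pymysql://root:@127.0.0.1:23306/airflow?charset=utf8mb4"
    )
    with engine.connect() as conn:
        rows = conn.execute(text("SELECT dag_id, run_id, state FROM dag_run"))
        for dag_id, run_id, state in rows:
            print(dag_id, run_id, state)
    ```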
Click on the branch symbol in the status bar.
Give a name to the branch and check it out.
Follow the Quick start for typical development tasks.