Passing parameters to Airflow's jobs through UI

4

23

Is it possible to pass parameters to Airflow's jobs through UI?

AFAIK, the 'params' argument of a DAG is defined in Python code, so it can't be changed at runtime.

Counterglow answered 20/11, 2017 at 16:44 Comment(2)
I need something similar, did you find how to do it?Tonsure
@LuisLeal you can consider Airflow variables from Bryan's answer.Counterglow
11

Depending on what you're trying to do, you might be able to leverage Airflow Variables. These can be defined or edited in the UI under the Admin tab. Then your DAG code can read the value of the variable and pass the value to the DAG(s) it creates.

Note, however, that although Variables let you decouple values from code, all runs of a DAG will read the same value for the variable. If you want different runs to receive different values, your best bet is probably to use Airflow templating macros and differentiate runs with the run_id macro or similar.
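A minimal sketch of the Variables approach (the variable name my_param is hypothetical; this assumes Airflow is installed and the variable has been created under Admin > Variables in the UI):

```python
from airflow.models import Variable

# Read the UI-managed value at DAG-parse time; default_var keeps the
# DAG file parseable even if the variable has not been created yet.
my_param = Variable.get("my_param", default_var="fallback-value")
```

Because the value is resolved when the scheduler parses the DAG file, every run created from that parse sees the same value, which is exactly the limitation described above.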

Tedman answered 20/11, 2017 at 18:31 Comment(2)
In general, I want to run one script in parallel with different parameters. I can't do it with global variables. As I understood, macros package contains constants and some functions like date and uuid, but I want to pass a general string. So, all in all, I see this solution: create n scripts and n global variables. In this case, it will be possible to run n jobs in parallel. Anyway, thanks for the answer.Counterglow
@Bryan, @AlexanderErshov I'm sufficiently sold on template macros: [1] offloading processing from scheduler to executors [2] custom arguments (are there more [3], [4] .. ?). But even after a fine bit of research, I'm not clear on how macros can produce the effect of passing params to DAGs / Operators from Airflow's WebUI. Any pointers on this?Acropetal
13

Two ways to change your DAG behavior:

  1. Use Airflow variables like mentioned by Bryan in his answer.
  2. Use Airflow JSON Conf to pass JSON data to a single DAG run. JSON can be passed either from

UI: manual trigger from the tree view, or create a new DAG run from Browse > DAG Runs > Create new record (screenshots omitted)

or from

CLI (Airflow 1.x syntax):

airflow trigger_dag 'MY_DAG' -r 'test-run-1' --conf '{"exec_date":"2021-09-14"}'

In Airflow 2.x the equivalent command is:

airflow dags trigger 'MY_DAG' --run-id 'test-run-1' --conf '{"exec_date":"2021-09-14"}'

Within the DAG, this JSON can be accessed via Jinja templates or through the context passed to an operator's Python callable.

from airflow.operators.python_operator import PythonOperator
from airflow.operators.bash_operator import BashOperator


def do_some_task(**context):
    # The conf supplied at trigger time is available on the dag_run object
    print(context['dag_run'].conf['exec_date'])


task1 = PythonOperator(
    task_id='task1_id',
    provide_context=True,  # Airflow 1.x only; in 2.x context is always passed
    python_callable=do_some_task,
    dag=dag,
)

# Access the same conf in templated fields via Jinja
task2 = BashOperator(
    task_id="task2_id",
    bash_command="echo {{ dag_run.conf['exec_date'] }}",
    dag=dag,
)

Note that the JSON conf is not present during scheduled runs; its best use case is overriding default DAG behavior on manual triggers. Set meaningful defaults in the DAG code so that scheduled runs work without any JSON conf.
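The defaults-plus-override pattern can be sketched in plain Python, without Airflow itself (DEFAULTS and resolve_conf are illustrative names, not Airflow APIs):

```python
# Defaults used when a run is scheduled and no conf is supplied.
DEFAULTS = {"exec_date": "2021-01-01"}

def resolve_conf(dag_run_conf):
    """Overlay the (possibly absent) trigger conf on top of the defaults."""
    merged = dict(DEFAULTS)
    merged.update(dag_run_conf or {})
    return merged

# Scheduled run: dag_run.conf is empty, so defaults apply.
print(resolve_conf(None))                       # {'exec_date': '2021-01-01'}
# Manual trigger with --conf '{"exec_date": "2021-09-14"}': the override wins.
print(resolve_conf({"exec_date": "2021-09-14"}))  # {'exec_date': '2021-09-14'}
```

Inside a real task callable the same pattern would read `context['dag_run'].conf` instead of taking the dict as an argument.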

Diablerie answered 20/9, 2021 at 16:47 Comment(0)
1

A form builder will be built in to Airflow 2.6.0, without the need for a plugin, thanks to AIP-50 (Airflow Improvement Proposal 50).

Sample views (screenshots omitted): a Yes/No switch, a date picker, and a select of the recent configs.
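The AIP-50 trigger form is driven by the DAG's params. A minimal sketch (the DAG id and param names are hypothetical; assumes Airflow 2.6+, where Param fields accept JSON-schema keywords):

```python
import pendulum
from airflow import DAG
from airflow.models.param import Param

with DAG(
    dag_id="param_form_demo",
    start_date=pendulum.datetime(2023, 5, 1),
    schedule=None,
    params={
        # Rendered in the trigger form as a yes/no switch
        "enabled": Param(True, type="boolean"),
        # Rendered as a date picker
        "run_date": Param("2023-05-01", type="string", format="date"),
    },
) as dag:
    ...
```

When the DAG is triggered from the UI, Airflow generates the form from these declarations and validates the submitted values against the declared types.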

Burgenland answered 1/4, 2023 at 19:0 Comment(0)
0

It is possible to improve the usability of ns15's answer above (the JSON conf approach) by building a user interface within the Airflow web UI. Airflow's interface can be extended with plugins, for instance custom web views. Plugins are placed in the Airflow plugins folder, normally $AIRFLOW_HOME/plugins.
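A hypothetical plugin skeleton, assuming Airflow 2's Flask-AppBuilder-based UI (the names TriggerView, trigger_ui_plugin, and trigger_form.html are illustrative, and the template file is not shown):

```python
from airflow.plugins_manager import AirflowPlugin
from flask import Blueprint
from flask_appbuilder import BaseView, expose

# Blueprint pointing at a templates/ folder next to this plugin file
bp = Blueprint("trigger_ui", __name__, template_folder="templates")


class TriggerView(BaseView):
    default_view = "form"

    @expose("/")
    def form(self):
        # trigger_form.html (not shown) would render a form whose submit
        # action triggers the DAG with the user-supplied conf.
        return self.render_template("trigger_form.html")


class TriggerUIPlugin(AirflowPlugin):
    name = "trigger_ui_plugin"
    flask_blueprints = [bp]
    appbuilder_views = [
        {"name": "Trigger with params", "category": "Custom", "view": TriggerView()}
    ]
```

Dropping this file into $AIRFLOW_HOME/plugins and restarting the webserver would add the view under a "Custom" menu entry.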

A full worked example is given in the article linked from the original answer; its UI provides a manual DAG trigger with a custom parameter (screenshot omitted).

Burgenland answered 21/3, 2023 at 15:37 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.