I have several multi-purpose shell scripts stored in .sh files. My intention is to build a few Airflow DAGs on Cloud Composer that leverage these scripts. The DAGs would consist mostly of BashOperators that call the scripts with specific arguments.
Here's a simple example, greeter.sh:
#!/bin/bash
echo "Hello, $1!"
I can run it locally like this:
bash greeter.sh world
> Hello, world!
Let's write a simple DAG:
from airflow import DAG
from airflow.operators.bash_operator import BashOperator

# define default_args (start_date, etc.) as usual

dag = DAG('bash_test',
          description='Running a local bash script',
          default_args=default_args,
          schedule_interval='0,30 5-23 * * *',
          catchup=False,
          max_active_runs=1)

bash_task = BashOperator(
    task_id='run_command',
    bash_command='bash greeter.sh world',
    dag=dag
)
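For reference, one variant I considered (just a sketch, assuming greeter.sh is uploaded to a dependencies/ folder next to the DAG file; I have not verified this on Composer) is to build an absolute path from the DAG file's own location rather than relying on the worker's working directory:

```python
import os

# Hypothetical layout: greeter.sh deployed in a dependencies/
# folder alongside this DAG file in the Composer dags/ bucket.
dag_dir = os.path.dirname(os.path.abspath(__file__))
script_path = os.path.join(dag_dir, "dependencies", "greeter.sh")

# The absolute path could then be used in the operator:
# bash_command=f"bash {script_path} world"
```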
But where should I put the script greeter.sh? I tried placing it in both the dags/ folder and the data/ folder, at the top level and nested within a dependencies/ directory. I also tried writing the path as ./greeter.sh. No luck: the file is never found.
I also tried using sh in place of bash and got a different error: sh: 0: Can't open greeter.sh. But that error also appears when the file is genuinely absent, so it seems to be the same issue. The same goes for any attempt to run chmod +rx first.
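For completeness, I also came across the claim that Airflow Jinja-templates bash_command, and that a string ending in .sh is treated as a template *file* to load, which itself fails with a "file not found"-style error. A commonly mentioned workaround (which I have not verified for my Airflow version) is a trailing space so the string is executed literally:

```python
# Airflow templates bash_command: a value ending in ".sh" is
# interpreted as a path to a Jinja template file, which Airflow then
# tries (and may fail) to resolve relative to the DAG folder.
# Appending a space makes the string be run as-is instead.
bash_command = "bash greeter.sh world "  # trailing space is deliberate

# then: BashOperator(task_id='run_command', bash_command=bash_command, dag=dag)
```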
How can I make my file available to Airflow?