
Is there a way to pass a parameter to:

airflow trigger_dag dag_name {param}

?

I have a script that monitors a directory for files - when a file gets moved into the target directory, I want to trigger the DAG, passing the file path as a parameter.

bsd

2 Answers


You can pass it like this (the JSON must be quoted so the shell treats it as a single argument):

airflow trigger_dag dag_id --conf '{"file_variable": "/path/to/file"}'

Then in your dag, you can access this variable using templating as follows:

{{ dag_run.conf.file_variable }}

If this doesn't work, sharing a simple version of your dag might help in getting better answers.

Him
  • what if you don't want to use templating...? – melchoir55 Aug 23 '18 at 00:06
  • @melchoir55 I played some trick on non-template cases. Setting up a global variable, say `param`, at the top of the DAG definition file, `param = "{{dag_run.conf.get('file_variable')}}"`. Then you're able to use the `param` variable anywhere in the file. – Old Panda Oct 29 '18 at 23:09
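For the non-template case raised in the comments, one common approach is to read `dag_run.conf` directly inside a PythonOperator callable instead of through Jinja. A minimal sketch, assuming a hypothetical `process_file` callable and a `file_variable` conf key; the operator wiring is shown in comments since it only matters inside a DAG file:

```python
# Hypothetical sketch: accessing --conf values without templating.
# Airflow injects the DagRun into the task context, and its .conf attribute
# holds the dict passed via `airflow trigger_dag --conf '...'`.

def process_file(**context):
    dag_run = context.get("dag_run")
    # .conf may be None or missing keys when the DAG is run on a schedule,
    # so guard the lookup.
    file_path = dag_run.conf.get("file_variable") if dag_run else None
    print("processing:", file_path)
    return file_path

# Wiring inside a DAG file (Airflow 1.x style):
# from airflow.operators.python_operator import PythonOperator
# task = PythonOperator(
#     task_id="process_file",
#     python_callable=process_file,
#     provide_context=True,  # passes dag_run into **context
#     dag=dag,
# )
```

This keeps the value available as an ordinary Python object rather than a rendered string.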

Yes, you can. Your DAG file should define a DAG and a BashOperator task like this:

from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

args = {
    'start_date': datetime.now(),
    'owner': 'airflow',
}

dag = DAG(
    dag_id='param_dag',
    default_args=args,
    schedule_interval=None,
)

bash_task = BashOperator(
    task_id='bash_task',
    # bashscript.sh is the script you want to run; dag_run.conf holds the
    # parameter passed with --conf at trigger time. The `if dag_run` guard
    # avoids a template error on scheduled (non-triggered) runs.
    bash_command='bash ~/path/bashscript.sh {{ dag_run.conf["parameter"] if dag_run else "" }} ',
    dag=dag,
)

Now on your command line just type the command:

 airflow trigger_dag dag_id --conf '{"parameter":"~/path" }'
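Since the question is about a directory watcher driving the trigger, the watcher script just needs to build that same command with the detected path JSON-encoded into --conf. A minimal sketch in Python; the dag_id `param_dag` and conf key `parameter` match this answer, while the example path is an assumption:

```python
# Hypothetical sketch: construct the CLI invocation a watcher script would
# run when a new file appears. json.dumps handles quoting/escaping of the
# path inside the --conf payload.
import json
import shlex
import subprocess  # the actual call is left commented out below


def build_trigger_command(dag_id, file_path):
    conf = json.dumps({"parameter": file_path})
    return ["airflow", "trigger_dag", dag_id, "--conf", conf]


cmd = build_trigger_command("param_dag", "/home/user/incoming/data.csv")
print(" ".join(shlex.quote(c) for c in cmd))
# In the real watcher: subprocess.run(cmd, check=True)
```

Passing the command as a list (rather than a shell string) avoids quoting problems when file names contain spaces.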
Anmol Karki