I have this Airflow code that uses an SSHExecuteOperator to run a Python script inside a Docker container:
    a02_retrieve_2 = SSHExecuteOperator(
        task_id='a02_retrieve_2',
        ssh_hook=sshHook,
        dag=dag,
        bash_command='docker exec myDocker -i bash -c " /opt/conda/bin/python aretrieve_2.py --myarg 45 --myarg 35 " '
    )
Unfortunately, it does not work. The version without arguments, however, does work:
    a02_retrieve_2 = SSHExecuteOperator(
        task_id='a02_retrieve_2',
        ssh_hook=sshHook,
        dag=dag,
        bash_command='docker exec myDocker -i bash -c " /opt/conda/bin/python aretrieve_2.py " '
    )
The symptom: the Python script starts to run, but the arguments never reach it; Airflow does not seem to pass the script's parameters through.
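For reference, the argument handling I assume on the script side looks roughly like this (a sketch only; aretrieve_2.py itself is not shown here, and argparse with a repeatable --myarg is my assumption, chosen because --myarg is passed twice):

    # Minimal sketch of the argument parsing I assume inside aretrieve_2.py.
    # argparse with a repeatable --myarg (action="append") is an assumption,
    # since --myarg appears twice on the command line.
    import argparse

    parser = argparse.ArgumentParser()
    parser.add_argument("--myarg", action="append", type=int, required=True)
    args = parser.parse_args()
    print("received myarg values:", args.myarg)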
How can I make the version with the Python script arguments work?
Is this related to Jinja templating? (Related question: BashOperator doen't run bash file apache airflow.) To make that concrete, a templated variant I have in mind is sketched below.
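This is a sketch only: it passes the values through Jinja `params` instead of hard-coding them in the command string, and it assumes SSHExecuteOperator renders `bash_command` as a Jinja template the way BashOperator does (unverified here).

    # Sketch: values supplied via Jinja `params` rather than inline literals.
    # Assumes `bash_command` is a templated field on SSHExecuteOperator.
    a02_retrieve_2 = SSHExecuteOperator(
        task_id='a02_retrieve_2',
        ssh_hook=sshHook,
        dag=dag,
        bash_command=(
            'docker exec myDocker -i bash -c '
            '"/opt/conda/bin/python aretrieve_2.py '
            '--myarg {{ params.arg1 }} --myarg {{ params.arg2 }}"'
        ),
        params={'arg1': 45, 'arg2': 35},
    )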
bash -c " /opt/conda/bin/python aretrieve_2.py --myarg 45 --myarg 35 "directly on the machine?