Summary
There's a bug in entrypoint.sh: when DAGs are downloaded from S3, the working directory is changed. As a result, the stored_env file is created in the dags folder instead of $AIRFLOW_HOME, and the subsequent import fails.
Description
We use mwaa-local-runner to emulate MWAA behaviour for ephemeral deployments. We recently discovered that environment variables defined in startup.sh are never actually imported into the environment of the local-runner.
The issue lives in this block of code; I am copying the lines here:
So if S3_DAGS_PATH is provided (1), then before the download happens we change directory to ./dags (2), so when the startup script runs (3) we actually write stored_env (see run-startup.sh) to the dags folder instead of $AIRFLOW_HOME.
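The sequence above can be reproduced in isolation. The snippet below is a minimal, self-contained demonstration of the working-directory bug, using temporary directories in place of the real MWAA paths; all names and steps are illustrative reconstructions from the description, not the actual entrypoint.sh source:

```shell
# Simulate the layout the entrypoint works with (illustrative paths only):
AIRFLOW_HOME="$(mktemp -d)"
mkdir -p "$AIRFLOW_HOME/dags"

# (1)/(2): when S3_DAGS_PATH is set, the entrypoint cd's into dags/
# before syncing DAGs from S3:
cd "$AIRFLOW_HOME/dags"

# (3): run-startup.sh then writes stored_env with a relative path,
# so it lands in dags/ instead of $AIRFLOW_HOME:
echo "export STARTUP_VAR=hello" > stored_env

# Later the entrypoint is back in $AIRFLOW_HOME and tries to import it:
cd "$AIRFLOW_HOME"
[ -f stored_env ] || echo "stored_env: No such file or directory"
```

Because the relative path `stored_env` is resolved against the current working directory, the file ends up next to the DAGs and the later import from $AIRFLOW_HOME finds nothing.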
Unfortunately, execute_startup_script only switches back to $AIRFLOW_HOME just before finishing, after stored_env has already been written.
Then we see the following error in the logs:
/entrypoint.sh: line 130: stored_env: No such file or directory
Expected behaviour
stored_env should be successfully imported, thus making the variables exported from startup.sh available to the local-runner process.
Suggested fix
I am not a shell ninja, but I think ensuring that stored_env is available should happen in the same directory context as the call to execute_startup_script.
Therefore, something like this in that place should work:
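One possible shape for the fix is sketched below. This is a hedged illustration, not the author's original snippet: the idea is simply to reference stored_env by an absolute path anchored to $AIRFLOW_HOME, so the cd performed by the S3 download logic no longer matters. The setup lines only simulate a startup script having run; the `if` block is the suggested change:

```shell
# Setup: simulate run-startup.sh having written stored_env under
# $AIRFLOW_HOME (temp dir stands in for the real path):
AIRFLOW_HOME="$(mktemp -d)"
echo "export STARTUP_VAR=hello" > "$AIRFLOW_HOME/stored_env"

# Suggested import, anchored to $AIRFLOW_HOME instead of the cwd,
# so it works regardless of where the S3 download left us:
if [ -f "${AIRFLOW_HOME}/stored_env" ]; then
    . "${AIRFLOW_HOME}/stored_env"
fi
echo "$STARTUP_VAR"
```

An equivalent alternative would be to `cd "${AIRFLOW_HOME}"` before the startup script runs, so that the relative path run-startup.sh uses resolves to the right place; either way, the write and the read must agree on one directory.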