Mark Success but remote task continue running #40342
Replies: 6 comments
-
Thanks for opening your first issue here! Be sure to follow the issue template! If you are willing to raise a PR to address this issue, please do so; no need to wait for approval.
-
@dotrungkien3210 I'm afraid you are going to have to provide more information and context for anyone to be able to assist you with this. If this is a bug, then you'll need to demonstrate it in a way that can be reproduced. If this is more of a support/implementation problem, then you will be much better served by speaking to the community in Slack.
-
Hi @nathadfield, this is the full DAG code:

```python
config = create_dag.getConfig(path="/airflow/crawler-system")
for key, value in config.items():
    server_bs = create_dag.select_server('', '')
    command_bs = 'cd ' + path_project_bs + project_name_bs + ' && bash bin/schedule/run_check_coverage_storybook.sh '
    config_bs = create_dag.getConfig(path="/airflow/bs-core")
    command_build_repush = 'bash bin/airflow_build.sh '
    command_repush = 'cd ' + path_project + project_name + ' && bash bin/run_repush_storybook.sh '
    config_repush = create_dag.getConfig(path="/airflow/bs-core")
    [build_ope, build_ope_bs, build_ope_repush] >> run_ope >> run_ope_bs >> run_ope_repush
```
-
Thanks, but this does not give us the ability to reproduce the problem, if indeed there even is a bug behind the problem you are facing and it is not something to do with your code or your implementation. You will have to try and pinpoint the problem a bit more on your side before anyone can really assess whether there truly is a bug or not.
-
If you are using SSHOperator and the task gets killed, the SSH connection is closed (by the nature of an SSH connection) - but whether the action started over SSH continues running depends solely on what command you send. For example, if your command detaches from the session (say, via nohup or by running in the background), it will keep running after the connection drops. Another thing with SSH is that the command on the other end might not realize that the connection has been broken - which is the nature of a broken TCP connection - so you should look into your code. Also, I see that SSHOperator does not have on_kill implemented, which would gently and deliberately close the SSH connection - that might help in some cases. I will open a separate task for it.
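To illustrate the point about the command, here is a minimal sketch (the task ids, connection id and exact commands are assumptions, not taken from the DAG above) of two SSHOperator tasks: one that stays attached to the SSH session and one that deliberately detaches from it.

```python
from airflow.providers.ssh.operators.ssh import SSHOperator

# Runs in the foreground of the SSH session; with a PTY allocated the remote
# process receives SIGHUP when the connection closes, so killing the Airflow
# task should also stop it (unless the script ignores the signal).
attached = SSHOperator(
    task_id="run_attached",
    ssh_conn_id="my_ssh_conn",  # assumed connection id
    command="bash bin/schedule/run_check_coverage_storybook.sh",
    get_pty=True,
)

# Detaches from the session (nohup + background), so the remote process keeps
# running even after the SSH connection is closed or the task is marked success.
detached = SSHOperator(
    task_id="run_detached",
    ssh_conn_id="my_ssh_conn",
    command="nohup bash bin/run_repush_storybook.sh >/tmp/repush.log 2>&1 &",
)
```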
-
Created the feature request and marked it as a good first issue: #40343 - that might help in some cases, so feel free to implement the task. But in the general case, it might simply be that your command is not reacting to the connection being closed by the SSH server (ignoring the signals). Reviewing your commands and implementing on_kill to gently close the client should likely help in all cases.
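For reference, a rough sketch of what such an on_kill could look like - this is not the actual SSHOperator implementation (see #40343 for the real task), just an assumed custom operator that keeps a reference to the paramiko client so it can close it when the task is killed:

```python
from airflow.models.baseoperator import BaseOperator
from airflow.providers.ssh.hooks.ssh import SSHHook


class SSHCommandOperator(BaseOperator):
    """Hypothetical operator: runs a command over SSH and closes the client in on_kill."""

    def __init__(self, *, ssh_conn_id: str, command: str, **kwargs):
        super().__init__(**kwargs)
        self.ssh_conn_id = ssh_conn_id
        self.command = command
        self._client = None  # paramiko.SSHClient, set in execute()

    def execute(self, context):
        hook = SSHHook(ssh_conn_id=self.ssh_conn_id)
        self._client = hook.get_conn()  # returns a paramiko SSHClient
        # Allocate a PTY so the remote process is sent SIGHUP if the session closes.
        _, stdout, stderr = self._client.exec_command(self.command, get_pty=True)
        exit_status = stdout.channel.recv_exit_status()
        if exit_status != 0:
            raise RuntimeError(f"Command failed with {exit_status}: {stderr.read().decode()}")
        return stdout.read().decode()

    def on_kill(self):
        # Gently and deliberately close the SSH connection; whether the remote
        # process actually stops still depends on how the command handles SIGHUP.
        if self._client is not None:
            self._client.close()
```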
-
Apache Airflow version
Other Airflow 2 version (please specify below)
If "Other Airflow 2 version" selected, which one?
2.9.0
What happened?
I manage an Airflow HA Celery cluster with nearly 150 DAGs, almost all of them working correctly. But one DAG has trouble: when I mark a task as success, the remote process triggered by the SSH operator continues running. The problem happens with only this one DAG; other DAGs work correctly although they have the same logic.
What you think should happen instead?
I think there is some trouble in the code that gets stuck and keeps waiting.
How to reproduce
Hopefully you can reproduce it by creating multiple SSHHooks and running processes on different servers within one DAG.
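A minimal sketch of such a reproduction DAG (the dag id, connection ids and commands are assumptions, not taken from the original code) could look like:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.ssh.operators.ssh import SSHOperator

with DAG(
    dag_id="ssh_mark_success_repro",  # assumed dag id
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
):
    # Long-running commands on two different servers, so a task can be marked
    # as success while the remote processes are still running.
    run_on_server_a = SSHOperator(
        task_id="run_on_server_a",
        ssh_conn_id="ssh_server_a",  # assumed connection id
        command="sleep 600",
    )
    run_on_server_b = SSHOperator(
        task_id="run_on_server_b",
        ssh_conn_id="ssh_server_b",  # assumed connection id
        command="sleep 600",
    )

    run_on_server_a >> run_on_server_b
```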
Operating System
Ubuntu 20.04
Versions of Apache Airflow Providers
2.9.0
Deployment
Other
Deployment details
Airflow HA Celery cluster on 2 nodes, with 2 schedulers, 2 consumers, and 2 workers.
Anything else?
No response
Are you willing to submit PR?
Code of Conduct