apache/airflow #22282 gist
Scheduler log:
/home/airflow/.local/lib/python3.8/site-packages/airflow/configuration.py:357 DeprecationWarning: The dag_concurrency option in [core] has been renamed to max_active_tasks_per_dag - the old setting has been used, but please update your config.
/home/airflow/.local/lib/python3.8/site-packages/airflow/configuration.py:357 DeprecationWarning: The processor_poll_interval option in [scheduler] has been renamed to scheduler_idle_sleep_time - the old setting has been used, but please update your config.
/home/airflow/.local/lib/python3.8/site-packages/airflow/utils/cli.py:149 SAWarning: relationship 'DagRun.serialized_dag' will copy column serialized_dag.dag_id to column dag_run.dag_id, which conflicts with relationship(s): 'BaseXCom.dag_run' (copies xcom.dag_id to dag_run.dag_id). If this is not the intention, consider if these relationships should be linked with back_populates, or if viewonly=True should be applied to one or more if they are read-only. For the less common case that foreign key constraints are partially overlapping, the orm.foreign() annotation can be used to isolate the columns that should be written towards. To silence this warning, add the parameter 'overlaps="dag_run"' to the 'DagRun.serialized_dag' relationship. (Background on this error at: https://sqlalche.me/e/14/qzyx)
/home/airflow/.local/lib/python3.8/site-packages/airflow/utils/cli.py:149 SAWarning: relationship 'SerializedDagModel.dag_runs' will copy column serialized_dag.dag_id to column dag_run.dag_id, which conflicts with relationship(s): 'BaseXCom.dag_run' (copies xcom.dag_id to dag_run.dag_id). If this is not the intention, consider if these relationships should be linked with back_populates, or if viewonly=True should be applied to one or more if they are read-only. For the less common case that foreign key constraints are partially overlapping, the orm.foreign() annotation can be used to isolate the columns that should be written towards. To silence this warning, add the parameter 'overlaps="dag_run"' to the 'SerializedDagModel.dag_runs' relationship. (Background on this error at: https://sqlalche.me/e/14/qzyx)
  ____________       _____________
 ____    |__( )_________  __/__  /________      __
____  /| |_  /__  ___/_  /_ __  /_  __ \_ | /| / /
___  ___ |  / _  /   _  __/_  / / /_/ /_ |/ |/ /
 _/_/  |_/_/  /_/    /_/    /_/  \____/____/|__/
[2022-03-19 01:19:12,052] {scheduler_job.py:596} INFO - Starting the scheduler
[2022-03-19 01:19:12,052] {scheduler_job.py:601} INFO - Processing each file at most -1 times
[2022-03-19 01:19:12,169] {manager.py:163} INFO - Launched DagFileProcessorManager with pid: 8
[2022-03-19 01:19:12,172] {scheduler_job.py:1114} INFO - Resetting orphaned tasks for active dag runs
[2022-03-19 01:19:12,175] {settings.py:52} INFO - Configured default timezone Timezone('UTC')
/home/airflow/.local/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py:1127 SAWarning: TypeDecorator UtcDateTime(timezone=True) will not produce a cache key because the ``cache_ok`` attribute is not set to True. This can have significant performance implications including some performance degradations in comparison to prior SQLAlchemy versions. Set this attribute to True if this type object's state is safe to use in a cache key, or False to disable this warning. (Background on this error at: https://sqlalche.me/e/14/cprf)
[2022-03-19 01:19:12,242] {scheduler_job.py:1137} INFO - Marked 1 SchedulerJob instances as failed
/home/airflow/.local/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py:847 SAWarning: TypeDecorator UtcDateTime(timezone=True) will not produce a cache key because the ``cache_ok`` attribute is not set to True. This can have significant performance implications including some performance degradations in comparison to prior SQLAlchemy versions. Set this attribute to True if this type object's state is safe to use in a cache key, or False to disable this warning. (Background on this error at: https://sqlalche.me/e/14/cprf)
/home/airflow/.local/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py:879 SAWarning: TypeDecorator UtcDateTime(timezone=True) will not produce a cache key because the ``cache_ok`` attribute is not set to True. This can have significant performance implications including some performance degradations in comparison to prior SQLAlchemy versions. Set this attribute to True if this type object's state is safe to use in a cache key, or False to disable this warning. (Background on this error at: https://sqlalche.me/e/14/cprf)
/home/airflow/.local/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py:944 SAWarning: TypeDecorator UtcDateTime(timezone=True) will not produce a cache key because the ``cache_ok`` attribute is not set to True. This can have significant performance implications including some performance degradations in comparison to prior SQLAlchemy versions. Set this attribute to True if this type object's state is safe to use in a cache key, or False to disable this warning. (Background on this error at: https://sqlalche.me/e/14/cprf)
/home/airflow/.local/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py:960 SAWarning: TypeDecorator UtcDateTime(timezone=True) will not produce a cache key because the ``cache_ok`` attribute is not set to True. This can have significant performance implications including some performance degradations in comparison to prior SQLAlchemy versions. Set this attribute to True if this type object's state is safe to use in a cache key, or False to disable this warning. (Background on this error at: https://sqlalche.me/e/14/cprf)
/home/airflow/.local/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py:791 SAWarning: TypeDecorator UtcDateTime(timezone=True) will not produce a cache key because the ``cache_ok`` attribute is not set to True. This can have significant performance implications including some performance degradations in comparison to prior SQLAlchemy versions. Set this attribute to True if this type object's state is safe to use in a cache key, or False to disable this warning. (Background on this error at: https://sqlalche.me/e/14/cprf)
/home/airflow/.local/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py:273 SAWarning: TypeDecorator UtcDateTime(timezone=True) will not produce a cache key because the ``cache_ok`` attribute is not set to True. This can have significant performance implications including some performance degradations in comparison to prior SQLAlchemy versions. Set this attribute to True if this type object's state is safe to use in a cache key, or False to disable this warning. (Background on this error at: https://sqlalche.me/e/14/cprf)
/home/airflow/.local/lib/python3.8/site-packages/airflow/dag_processing/manager.py:1072 SAWarning: TypeDecorator UtcDateTime(timezone=True) will not produce a cache key because the ``cache_ok`` attribute is not set to True. This can have significant performance implications including some performance degradations in comparison to prior SQLAlchemy versions. Set this attribute to True if this type object's state is safe to use in a cache key, or False to disable this warning. (Background on this error at: https://sqlalche.me/e/14/cprf)
/home/airflow/.local/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py:1200 SAWarning: TypeDecorator UtcDateTime(timezone=True) will not produce a cache key because the ``cache_ok`` attribute is not set to True. This can have significant performance implications including some performance degradations in comparison to prior SQLAlchemy versions. Set this attribute to True if this type object's state is safe to use in a cache key, or False to disable this warning. (Background on this error at: https://sqlalche.me/e/14/cprf)
/home/airflow/.local/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py:1200 SAWarning: TypeDecorator ExtendedJSON() will not produce a cache key because the ``cache_ok`` attribute is not set to True. This can have significant performance implications including some performance degradations in comparison to prior SQLAlchemy versions. Set this attribute to True if this type object's state is safe to use in a cache key, or False to disable this warning. (Background on this error at: https://sqlalche.me/e/14/cprf)
[2022-03-19 01:22:43,609] {scheduler_job.py:288} INFO - 1 tasks up for execution:
    <TaskInstance: pinkdolphin-clinicinfo.start manual__2022-03-19T01:21:48+00:00 [scheduled]>
[2022-03-19 01:22:43,613] {scheduler_job.py:317} INFO - Figuring out tasks to run in Pool(name=default_pool) with 125 open slots and 1 task instances ready to be queued
[2022-03-19 01:22:43,614] {scheduler_job.py:345} INFO - DAG pinkdolphin-clinicinfo has 1/50 running and queued tasks
[2022-03-19 01:22:43,614] {scheduler_job.py:410} INFO - Setting the following tasks to queued state:
    <TaskInstance: pinkdolphin-clinicinfo.start manual__2022-03-19T01:21:48+00:00 [scheduled]>
/home/airflow/.local/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py:414 SAWarning: TypeDecorator UtcDateTime(timezone=True) will not produce a cache key because the ``cache_ok`` attribute is not set to True. This can have significant performance implications including some performance degradations in comparison to prior SQLAlchemy versions. Set this attribute to True if this type object's state is safe to use in a cache key, or False to disable this warning. (Background on this error at: https://sqlalche.me/e/14/cprf)
[2022-03-19 01:22:43,625] {scheduler_job.py:450} INFO - Sending TaskInstanceKey(dag_id='pinkdolphin-clinicinfo', task_id='start', run_id='manual__2022-03-19T01:21:48+00:00', try_number=1) to executor with priority 103 and queue default
[2022-03-19 01:22:43,625] {base_executor.py:82} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'pinkdolphin-clinicinfo', 'start', 'manual__2022-03-19T01:21:48+00:00', '--local', '--subdir', 'DAGS_FOLDER/clinicinfo_raw.py']
[2022-03-19 01:22:44,034] {scheduler_job.py:504} INFO - Executor reports execution of pinkdolphin-clinicinfo.start run_id=manual__2022-03-19T01:21:48+00:00 exited with status queued for try_number 1
[2022-03-19 01:22:44,049] {scheduler_job.py:538} INFO - Setting external_id for <TaskInstance: pinkdolphin-clinicinfo.start manual__2022-03-19T01:21:48+00:00 [queued]> to e0d64119-cd59-4020-b6f9-538b84877cc1
/home/airflow/.local/lib/python3.8/site-packages/airflow/models/serialized_dag.py:276 SAWarning: TypeDecorator UtcDateTime(timezone=True) will not produce a cache key because the ``cache_ok`` attribute is not set to True. This can have significant performance implications including some performance degradations in comparison to prior SQLAlchemy versions. Set this attribute to True if this type object's state is safe to use in a cache key, or False to disable this warning. (Background on this error at: https://sqlalche.me/e/14/cprf)
[2022-03-19 01:23:11,121] {scheduler_job.py:288} INFO - 1 tasks up for execution:
    <TaskInstance: pinkdolphin-clinicinfo.start manual__2022-03-19T01:22:46+00:00 [scheduled]>
[2022-03-19 01:23:11,123] {scheduler_job.py:317} INFO - Figuring out tasks to run in Pool(name=default_pool) with 124 open slots and 1 task instances ready to be queued
[2022-03-19 01:23:11,123] {scheduler_job.py:345} INFO - DAG pinkdolphin-clinicinfo has 2/50 running and queued tasks
[2022-03-19 01:23:11,123] {scheduler_job.py:410} INFO - Setting the following tasks to queued state:
    <TaskInstance: pinkdolphin-clinicinfo.start manual__2022-03-19T01:22:46+00:00 [scheduled]>
[2022-03-19 01:23:11,125] {scheduler_job.py:450} INFO - Sending TaskInstanceKey(dag_id='pinkdolphin-clinicinfo', task_id='start', run_id='manual__2022-03-19T01:22:46+00:00', try_number=1) to executor with priority 103 and queue default
[2022-03-19 01:23:11,126] {base_executor.py:82} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'pinkdolphin-clinicinfo', 'start', 'manual__2022-03-19T01:22:46+00:00', '--local', '--subdir', 'DAGS_FOLDER/clinicinfo_raw.py']
[2022-03-19 01:23:11,228] {scheduler_job.py:504} INFO - Executor reports execution of pinkdolphin-clinicinfo.start run_id=manual__2022-03-19T01:22:46+00:00 exited with status queued for try_number 1
[2022-03-19 01:23:11,235] {scheduler_job.py:538} INFO - Setting external_id for <TaskInstance: pinkdolphin-clinicinfo.start manual__2022-03-19T01:22:46+00:00 [queued]> to df0a46c0-9516-4542-861d-8541d6affd0c
[2022-03-19 01:23:22,604] {scheduler_job.py:288} INFO - 1 tasks up for execution:
    <TaskInstance: pinkdolphin-clinicinfo.start manual__2022-03-19T01:23:09+00:00 [scheduled]>
[2022-03-19 01:23:22,606] {scheduler_job.py:317} INFO - Figuring out tasks to run in Pool(name=default_pool) with 123 open slots and 1 task instances ready to be queued
[2022-03-19 01:23:22,607] {scheduler_job.py:345} INFO - DAG pinkdolphin-clinicinfo has 3/50 running and queued tasks
[2022-03-19 01:23:22,607] {scheduler_job.py:410} INFO - Setting the following tasks to queued state:
    <TaskInstance: pinkdolphin-clinicinfo.start manual__2022-03-19T01:23:09+00:00 [scheduled]>
[2022-03-19 01:23:22,610] {scheduler_job.py:450} INFO - Sending TaskInstanceKey(dag_id='pinkdolphin-clinicinfo', task_id='start', run_id='manual__2022-03-19T01:23:09+00:00', try_number=1) to executor with priority 103 and queue default
[2022-03-19 01:23:22,610] {base_executor.py:82} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'pinkdolphin-clinicinfo', 'start', 'manual__2022-03-19T01:23:09+00:00', '--local', '--subdir', 'DAGS_FOLDER/clinicinfo_raw.py']
[2022-03-19 01:23:22,709] {scheduler_job.py:504} INFO - Executor reports execution of pinkdolphin-clinicinfo.start run_id=manual__2022-03-19T01:23:09+00:00 exited with status queued for try_number 1
[2022-03-19 01:23:22,718] {scheduler_job.py:538} INFO - Setting external_id for <TaskInstance: pinkdolphin-clinicinfo.start manual__2022-03-19T01:23:09+00:00 [queued]> to e45982f2-fb71-4242-b6bc-f5c4a9c92c70
[2022-03-19 01:24:12,944] {scheduler_job.py:1114} INFO - Resetting orphaned tasks for active dag runs
[2022-03-19 01:29:13,648] {scheduler_job.py:1114} INFO - Resetting orphaned tasks for active dag runs
[2022-03-19 01:34:14,026] {scheduler_job.py:1114} INFO - Resetting orphaned tasks for active dag runs
[2022-03-19 01:39:14,136] {scheduler_job.py:1114} INFO - Resetting orphaned tasks for active dag runs
[2022-03-19 01:44:14,640] {scheduler_job.py:1114} INFO - Resetting orphaned tasks for active dag runs
[2022-03-19 01:49:14,817] {scheduler_job.py:1114} INFO - Resetting orphaned tasks for active dag runs
[2022-03-19 01:54:15,754] {scheduler_job.py:1114} INFO - Resetting orphaned tasks for active dag runs
[2022-03-19 01:59:16,361] {scheduler_job.py:1114} INFO - Resetting orphaned tasks for active dag runs
[2022-03-19 02:00:06,005] {scheduler_job.py:288} INFO - 3 tasks up for execution:
    <TaskInstance: pinkdolphin-clinicinfo.start manual__2022-03-19T02:00:03.578440+00:00 [scheduled]>
    <TaskInstance: pinkdolphin-clinicinfo.start manual__2022-03-19T02:00:04.171370+00:00 [scheduled]>
    <TaskInstance: pinkdolphin-clinicinfo.start manual__2022-03-19T02:00:04.737649+00:00 [scheduled]>
[2022-03-19 02:00:06,009] {scheduler_job.py:317} INFO - Figuring out tasks to run in Pool(name=default_pool) with 122 open slots and 3 task instances ready to be queued
[2022-03-19 02:00:06,009] {scheduler_job.py:345} INFO - DAG pinkdolphin-clinicinfo has 4/50 running and queued tasks
[2022-03-19 02:00:06,009] {scheduler_job.py:345} INFO - DAG pinkdolphin-clinicinfo has 5/50 running and queued tasks
[2022-03-19 02:00:06,009] {scheduler_job.py:345} INFO - DAG pinkdolphin-clinicinfo has 6/50 running and queued tasks
[2022-03-19 02:00:06,010] {scheduler_job.py:410} INFO - Setting the following tasks to queued state:
    <TaskInstance: pinkdolphin-clinicinfo.start manual__2022-03-19T02:00:03.578440+00:00 [scheduled]>
    <TaskInstance: pinkdolphin-clinicinfo.start manual__2022-03-19T02:00:04.171370+00:00 [scheduled]>
    <TaskInstance: pinkdolphin-clinicinfo.start manual__2022-03-19T02:00:04.737649+00:00 [scheduled]>
[2022-03-19 02:00:06,015] {scheduler_job.py:450} INFO - Sending TaskInstanceKey(dag_id='pinkdolphin-clinicinfo', task_id='start', run_id='manual__2022-03-19T02:00:03.578440+00:00', try_number=1) to executor with priority 103 and queue default
[2022-03-19 02:00:06,015] {base_executor.py:82} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'pinkdolphin-clinicinfo', 'start', 'manual__2022-03-19T02:00:03.578440+00:00', '--local', '--subdir', 'DAGS_FOLDER/clinicinfo_raw.py']
[2022-03-19 02:00:06,016] {scheduler_job.py:450} INFO - Sending TaskInstanceKey(dag_id='pinkdolphin-clinicinfo', task_id='start', run_id='manual__2022-03-19T02:00:04.171370+00:00', try_number=1) to executor with priority 103 and queue default
[2022-03-19 02:00:06,016] {base_executor.py:82} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'pinkdolphin-clinicinfo', 'start', 'manual__2022-03-19T02:00:04.171370+00:00', '--local', '--subdir', 'DAGS_FOLDER/clinicinfo_raw.py']
[2022-03-19 02:00:06,016] {scheduler_job.py:450} INFO - Sending TaskInstanceKey(dag_id='pinkdolphin-clinicinfo', task_id='start', run_id='manual__2022-03-19T02:00:04.737649+00:00', try_number=1) to executor with priority 103 and queue default
[2022-03-19 02:00:06,016] {base_executor.py:82} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'pinkdolphin-clinicinfo', 'start', 'manual__2022-03-19T02:00:04.737649+00:00', '--local', '--subdir', 'DAGS_FOLDER/clinicinfo_raw.py']
[2022-03-19 02:00:06,522] {scheduler_job.py:504} INFO - Executor reports execution of pinkdolphin-clinicinfo.start run_id=manual__2022-03-19T02:00:03.578440+00:00 exited with status queued for try_number 1
[2022-03-19 02:00:06,522] {scheduler_job.py:504} INFO - Executor reports execution of pinkdolphin-clinicinfo.start run_id=manual__2022-03-19T02:00:04.171370+00:00 exited with status queued for try_number 1
[2022-03-19 02:00:06,522] {scheduler_job.py:504} INFO - Executor reports execution of pinkdolphin-clinicinfo.start run_id=manual__2022-03-19T02:00:04.737649+00:00 exited with status queued for try_number 1
[2022-03-19 02:00:06,539] {scheduler_job.py:538} INFO - Setting external_id for <TaskInstance: pinkdolphin-clinicinfo.start manual__2022-03-19T02:00:03.578440+00:00 [queued]> to 886337af-26c1-419a-a623-0c8b256c96e4
[2022-03-19 02:00:06,540] {scheduler_job.py:538} INFO - Setting external_id for <TaskInstance: pinkdolphin-clinicinfo.start manual__2022-03-19T02:00:04.171370+00:00 [queued]> to 96cde597-817e-495f-a1cb-536dd35e84bc
[2022-03-19 02:00:06,540] {scheduler_job.py:538} INFO - Setting external_id for <TaskInstance: pinkdolphin-clinicinfo.start manual__2022-03-19T02:00:04.737649+00:00 [queued]> to d1b48bfd-20d3-41e5-a4ce-42968807c7cd
[2022-03-19 02:04:16,848] {scheduler_job.py:1114} INFO - Resetting orphaned tasks for active dag runs
[2022-03-19 02:09:17,634] {scheduler_job.py:1114} INFO - Resetting orphaned tasks for active dag runs
[2022-03-19 02:14:18,211] {scheduler_job.py:1114} INFO - Resetting orphaned tasks for active dag runs
[2022-03-19 02:19:18,801] {scheduler_job.py:1114} INFO - Resetting orphaned tasks for active dag runs
[2022-03-19 02:24:19,359] {scheduler_job.py:1114} INFO - Resetting orphaned tasks for active dag runs
[2022-03-19 02:29:19,941] {scheduler_job.py:1114} INFO - Resetting orphaned tasks for active dag runs
[2022-03-19 02:30:09,230] {scheduler_job.py:288} INFO - 3 tasks up for execution:
    <TaskInstance: pinkdolphin-clinicinfo.start manual__2022-03-19T02:30:04.182303+00:00 [scheduled]>
    <TaskInstance: pinkdolphin-clinicinfo.start manual__2022-03-19T02:30:04.754661+00:00 [scheduled]>
    <TaskInstance: pinkdolphin-clinicinfo.start manual__2022-03-19T02:30:05.318652+00:00 [scheduled]>
[2022-03-19 02:30:09,232] {scheduler_job.py:317} INFO - Figuring out tasks to run in Pool(name=default_pool) with 119 open slots and 3 task instances ready to be queued
[2022-03-19 02:30:09,233] {scheduler_job.py:345} INFO - DAG pinkdolphin-clinicinfo has 7/50 running and queued tasks
[2022-03-19 02:30:09,233] {scheduler_job.py:345} INFO - DAG pinkdolphin-clinicinfo has 8/50 running and queued tasks
[2022-03-19 02:30:09,233] {scheduler_job.py:345} INFO - DAG pinkdolphin-clinicinfo has 9/50 running and queued tasks
[2022-03-19 02:30:09,233] {scheduler_job.py:410} INFO - Setting the following tasks to queued state:
    <TaskInstance: pinkdolphin-clinicinfo.start manual__2022-03-19T02:30:04.182303+00:00 [scheduled]>
    <TaskInstance: pinkdolphin-clinicinfo.start manual__2022-03-19T02:30:04.754661+00:00 [scheduled]>
    <TaskInstance: pinkdolphin-clinicinfo.start manual__2022-03-19T02:30:05.318652+00:00 [scheduled]>
[2022-03-19 02:30:09,239] {scheduler_job.py:450} INFO - Sending TaskInstanceKey(dag_id='pinkdolphin-clinicinfo', task_id='start', run_id='manual__2022-03-19T02:30:04.182303+00:00', try_number=1) to executor with priority 103 and queue default
[2022-03-19 02:30:09,239] {base_executor.py:82} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'pinkdolphin-clinicinfo', 'start', 'manual__2022-03-19T02:30:04.182303+00:00', '--local', '--subdir', 'DAGS_FOLDER/clinicinfo_raw.py']
[2022-03-19 02:30:09,239] {scheduler_job.py:450} INFO - Sending TaskInstanceKey(dag_id='pinkdolphin-clinicinfo', task_id='start', run_id='manual__2022-03-19T02:30:04.754661+00:00', try_number=1) to executor with priority 103 and queue default
[2022-03-19 02:30:09,239] {base_executor.py:82} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'pinkdolphin-clinicinfo', 'start', 'manual__2022-03-19T02:30:04.754661+00:00', '--local', '--subdir', 'DAGS_FOLDER/clinicinfo_raw.py']
[2022-03-19 02:30:09,240] {scheduler_job.py:450} INFO - Sending TaskInstanceKey(dag_id='pinkdolphin-clinicinfo', task_id='start', run_id='manual__2022-03-19T02:30:05.318652+00:00', try_number=1) to executor with priority 103 and queue default
[2022-03-19 02:30:09,240] {base_executor.py:82} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'pinkdolphin-clinicinfo', 'start', 'manual__2022-03-19T02:30:05.318652+00:00', '--local', '--subdir', 'DAGS_FOLDER/clinicinfo_raw.py']
[2022-03-19 02:30:09,739] {scheduler_job.py:504} INFO - Executor reports execution of pinkdolphin-clinicinfo.start run_id=manual__2022-03-19T02:30:04.182303+00:00 exited with status queued for try_number 1
[2022-03-19 02:30:09,740] {scheduler_job.py:504} INFO - Executor reports execution of pinkdolphin-clinicinfo.start run_id=manual__2022-03-19T02:30:04.754661+00:00 exited with status queued for try_number 1
[2022-03-19 02:30:09,740] {scheduler_job.py:504} INFO - Executor reports execution of pinkdolphin-clinicinfo.start run_id=manual__2022-03-19T02:30:05.318652+00:00 exited with status queued for try_number 1
[2022-03-19 02:30:09,751] {scheduler_job.py:538} INFO - Setting external_id for <TaskInstance: pinkdolphin-clinicinfo.start manual__2022-03-19T02:30:04.182303+00:00 [queued]> to fe9e9573-92c4-49fc-8651-8b75f0b7bcf6
[2022-03-19 02:30:09,751] {scheduler_job.py:538} INFO - Setting external_id for <TaskInstance: pinkdolphin-clinicinfo.start manual__2022-03-19T02:30:04.754661+00:00 [queued]> to 3e0fb5dc-5ff1-4c7e-854a-e3058c413852
[2022-03-19 02:30:09,752] {scheduler_job.py:538} INFO - Setting external_id for <TaskInstance: pinkdolphin-clinicinfo.start manual__2022-03-19T02:30:05.318652+00:00 [queued]> to 4392f897-9654-4966-925c-4494fc7931eb
[2022-03-19 02:34:20,444] {scheduler_job.py:1114} INFO - Resetting orphaned tasks for active dag runs
[2022-03-19 02:39:21,160] {scheduler_job.py:1114} INFO - Resetting orphaned tasks for active dag runs
[2022-03-19 02:44:22,213] {scheduler_job.py:1114} INFO - Resetting orphaned tasks for active dag runs
[2022-03-19 02:49:22,754] {scheduler_job.py:1114} INFO - Resetting orphaned tasks for active dag runs
[2022-03-19 02:54:23,523] {scheduler_job.py:1114} INFO - Resetting orphaned tasks for active dag runs
[2022-03-19 02:59:23,617] {scheduler_job.py:1114} INFO - Resetting orphaned tasks for active dag runs
[2022-03-19 03:00:08,644] {scheduler_job.py:288} INFO - 3 tasks up for execution:
    <TaskInstance: pinkdolphin-clinicinfo.start manual__2022-03-19T03:00:04.065904+00:00 [scheduled]>
    <TaskInstance: pinkdolphin-clinicinfo.start manual__2022-03-19T03:00:04.684939+00:00 [scheduled]>
    <TaskInstance: pinkdolphin-clinicinfo.start manual__2022-03-19T03:00:05.219618+00:00 [scheduled]>
[2022-03-19 03:00:08,646] {scheduler_job.py:317} INFO - Figuring out tasks to run in Pool(name=default_pool) with 116 open slots and 3 task instances ready to be queued
[2022-03-19 03:00:08,646] {scheduler_job.py:345} INFO - DAG pinkdolphin-clinicinfo has 10/50 running and queued tasks
[2022-03-19 03:00:08,646] {scheduler_job.py:345} INFO - DAG pinkdolphin-clinicinfo has 11/50 running and queued tasks
[2022-03-19 03:00:08,646] {scheduler_job.py:345} INFO - DAG pinkdolphin-clinicinfo has 12/50 running and queued tasks
[2022-03-19 03:00:08,647] {scheduler_job.py:410} INFO - Setting the following tasks to queued state:
    <TaskInstance: pinkdolphin-clinicinfo.start manual__2022-03-19T03:00:04.065904+00:00 [scheduled]>
    <TaskInstance: pinkdolphin-clinicinfo.start manual__2022-03-19T03:00:04.684939+00:00 [scheduled]>
    <TaskInstance: pinkdolphin-clinicinfo.start manual__2022-03-19T03:00:05.219618+00:00 [scheduled]>
[2022-03-19 03:00:08,650] {scheduler_job.py:450} INFO - Sending TaskInstanceKey(dag_id='pinkdolphin-clinicinfo', task_id='start', run_id='manual__2022-03-19T03:00:04.065904+00:00', try_number=1) to executor with priority 103 and queue default
[2022-03-19 03:00:08,650] {base_executor.py:82} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'pinkdolphin-clinicinfo', 'start', 'manual__2022-03-19T03:00:04.065904+00:00', '--local', '--subdir', 'DAGS_FOLDER/clinicinfo_raw.py']
[2022-03-19 03:00:08,651] {scheduler_job.py:450} INFO - Sending TaskInstanceKey(dag_id='pinkdolphin-clinicinfo', task_id='start', run_id='manual__2022-03-19T03:00:04.684939+00:00', try_number=1) to executor with priority 103 and queue default
[2022-03-19 03:00:08,651] {base_executor.py:82} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'pinkdolphin-clinicinfo', 'start', 'manual__2022-03-19T03:00:04.684939+00:00', '--local', '--subdir', 'DAGS_FOLDER/clinicinfo_raw.py']
[2022-03-19 03:00:08,651] {scheduler_job.py:450} INFO - Sending TaskInstanceKey(dag_id='pinkdolphin-clinicinfo', task_id='start', run_id='manual__2022-03-19T03:00:05.219618+00:00', try_number=1) to executor with priority 103 and queue default
[2022-03-19 03:00:08,651] {base_executor.py:82} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'pinkdolphin-clinicinfo', 'start', 'manual__2022-03-19T03:00:05.219618+00:00', '--local', '--subdir', 'DAGS_FOLDER/clinicinfo_raw.py']
[2022-03-19 03:00:09,020] {scheduler_job.py:504} INFO - Executor reports execution of pinkdolphin-clinicinfo.start run_id=manual__2022-03-19T03:00:04.065904+00:00 exited with status queued for try_number 1
[2022-03-19 03:00:09,021] {scheduler_job.py:504} INFO - Executor reports execution of pinkdolphin-clinicinfo.start run_id=manual__2022-03-19T03:00:04.684939+00:00 exited with status queued for try_number 1
[2022-03-19 03:00:09,021] {scheduler_job.py:504} INFO - Executor reports execution of pinkdolphin-clinicinfo.start run_id=manual__2022-03-19T03:00:05.219618+00:00 exited with status queued for try_number 1
[2022-03-19 03:00:09,030] {scheduler_job.py:538} INFO - Setting external_id for <TaskInstance: pinkdolphin-clinicinfo.start manual__2022-03-19T03:00:04.065904+00:00 [queued]> to a111541f-2897-4575-998c-af8a78cb3b39
[2022-03-19 03:00:09,030] {scheduler_job.py:538} INFO - Setting external_id for <TaskInstance: pinkdolphin-clinicinfo.start manual__2022-03-19T03:00:04.684939+00:00 [queued]> to b49763c1-2f84-405f-be40-93c99987e9e1
[2022-03-19 03:00:09,030] {scheduler_job.py:538} INFO - Setting external_id for <TaskInstance: pinkdolphin-clinicinfo.start manual__2022-03-19T03:00:05.219618+00:00 [queued]> to 6688e8a5-cf40-4c54-a521-a83833feabfd
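
Aside: every SAWarning in this log spells out its own fix, namely setting cache_ok = True on the custom TypeDecorator. A minimal sketch of what that looks like, on a hypothetical stand-in for Airflow's UtcDateTime (not the project's actual class, which lives in airflow.utils.sqlalchemy):

    from sqlalchemy.types import DateTime, TypeDecorator

    class UtcDateTime(TypeDecorator):
        # Hypothetical stand-in for Airflow's UtcDateTime; only the
        # cache_ok line matters here.
        impl = DateTime(timezone=True)
        # The attribute the SAWarning asks for: opt in to SQLAlchemy 1.4's
        # statement caching. Safe only because this type keeps no mutable
        # per-instance state.
        cache_ok = True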
celery status run from the worker pod:
airflow@airflow-worker-5845f7bd45-7dxf5:/opt/airflow$ celery status
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.8/site-packages/amqp/transport.py", line 172, in _connect
    entries = socket.getaddrinfo(
  File "/usr/local/lib/python3.8/socket.py", line 918, in getaddrinfo
    for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
socket.gaierror: [Errno -9] Address family for hostname not supported

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.8/site-packages/kombu/connection.py", line 447, in _reraise_as_library_errors
    yield
  File "/home/airflow/.local/lib/python3.8/site-packages/kombu/connection.py", line 434, in _ensure_connection
    return retry_over_time(
  File "/home/airflow/.local/lib/python3.8/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
    return fun(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.8/site-packages/kombu/connection.py", line 878, in _connection_factory
    self._connection = self._establish_connection()
  File "/home/airflow/.local/lib/python3.8/site-packages/kombu/connection.py", line 813, in _establish_connection
    conn = self.transport.establish_connection()
  File "/home/airflow/.local/lib/python3.8/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
    conn.connect()
  File "/home/airflow/.local/lib/python3.8/site-packages/amqp/connection.py", line 323, in connect
    self.transport.connect()
  File "/home/airflow/.local/lib/python3.8/site-packages/amqp/transport.py", line 113, in connect
    self._connect(self.host, self.port, self.connect_timeout)
  File "/home/airflow/.local/lib/python3.8/site-packages/amqp/transport.py", line 181, in _connect
    raise (e
  File "/home/airflow/.local/lib/python3.8/site-packages/amqp/transport.py", line 197, in _connect
    self.sock.connect(sa)
ConnectionRefusedError: [Errno 111] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/airflow/.local/bin/celery", line 8, in <module>
    sys.exit(main())
  File "/home/airflow/.local/lib/python3.8/site-packages/celery/__main__.py", line 15, in main
    sys.exit(_main())
  File "/home/airflow/.local/lib/python3.8/site-packages/celery/bin/celery.py", line 213, in main
    return celery(auto_envvar_prefix="CELERY")
  File "/home/airflow/.local/lib/python3.8/site-packages/click/core.py", line 1128, in __call__
    return self.main(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.8/site-packages/click/core.py", line 1053, in main
    rv = self.invoke(ctx)
  File "/home/airflow/.local/lib/python3.8/site-packages/click/core.py", line 1659, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/airflow/.local/lib/python3.8/site-packages/click/core.py", line 1395, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/airflow/.local/lib/python3.8/site-packages/click/core.py", line 754, in invoke
    return __callback(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.8/site-packages/click/decorators.py", line 26, in new_func
    return f(get_current_context(), *args, **kwargs)
  File "/home/airflow/.local/lib/python3.8/site-packages/celery/bin/base.py", line 134, in caller
    return f(ctx, *args, **kwargs)
  File "/home/airflow/.local/lib/python3.8/site-packages/celery/bin/control.py", line 80, in status
    replies = ctx.obj.app.control.inspect(timeout=timeout,
  File "/home/airflow/.local/lib/python3.8/site-packages/celery/app/control.py", line 294, in ping
    return self._request('ping')
  File "/home/airflow/.local/lib/python3.8/site-packages/celery/app/control.py", line 106, in _request
    return self._prepare(self.app.control.broadcast(
  File "/home/airflow/.local/lib/python3.8/site-packages/celery/app/control.py", line 741, in broadcast
    return self.mailbox(conn)._broadcast(
  File "/home/airflow/.local/lib/python3.8/site-packages/kombu/pidbox.py", line 328, in _broadcast
    chan = channel or self.connection.default_channel
  File "/home/airflow/.local/lib/python3.8/site-packages/kombu/connection.py", line 896, in default_channel
    self._ensure_connection(**conn_opts)
  File "/home/airflow/.local/lib/python3.8/site-packages/kombu/connection.py", line 434, in _ensure_connection
    return retry_over_time(
  File "/usr/local/lib/python3.8/contextlib.py", line 131, in __exit__
    self.gen.throw(type, value, traceback)
  File "/home/airflow/.local/lib/python3.8/site-packages/kombu/connection.py", line 451, in _reraise_as_library_errors
    raise ConnectionError(str(exc)) from exc
kombu.exceptions.OperationalError: [Errno 111] Connection refused
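
Worth noting: the traceback goes through kombu's pyamqp transport, suggesting the bare celery CLI here is not picking up Airflow's broker setting (the worker banners below show transport redis://redis:6379/0) and is falling back to the default amqp broker. A throwaway connectivity probe against the broker the workers actually use; this is a diagnostic sketch, not part of the original gist:

    from kombu import Connection

    # redis://redis:6379/0 is taken from the worker banners below; raises
    # kombu.exceptions.OperationalError when the broker is unreachable,
    # mirroring the failure above.
    with Connection("redis://redis:6379/0") as conn:
        conn.ensure_connection(max_retries=3)
        print("broker reachable")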
kubectl describe output for the worker pod:
Name:          airflow-worker-5845f7bd45-7dxf5
Namespace:     default
Priority:      0
Node:          gk3-redacted-node-name/10.162.0.59
Start Time:    Fri, 18 Mar 2022 21:17:05 -0400
Labels:        app=airflow-worker
               pod-template-hash=5845f7bd45
Annotations:   kubectl.kubernetes.io/restartedAt: 2022-03-18T21:15:25-04:00
               seccomp.security.alpha.kubernetes.io/pod: runtime/default
Status:        Running
IP:            10.48.0.131
IPs:
  IP:          10.48.0.131
Controlled By: ReplicaSet/airflow-worker-5845f7bd45
Containers:
  worker:
    Container ID:
    Image:       gcr.io/project/airflow:release-20220318-9b2632e85
    Image ID:
    Port:        8793/TCP
    Host Port:   0/TCP
    Command:
      airflow
    Args:
      celery
      worker
    State:       Running
      Started:   Fri, 18 Mar 2022 21:17:53 -0400
    Ready:       True
    Restart Count: 0
    Limits:
      cpu:               1
      ephemeral-storage: 1Gi
      memory:            2560Mi
    Requests:
      cpu:               1
      ephemeral-storage: 1Gi
      memory:            2560Mi
    Startup:     tcp-socket :8793 delay=0s timeout=1s period=5s #success=1 #failure=12
    Environment Variables from:
      airflow-cfg ConfigMap Optional: false
    Environment:
      AIRFLOW__CELERY__RESULT_BACKEND: <set to the key 'backend-connection' in secret 'airflow-cfg'> Optional: false
      AIRFLOW__CORE__SQL_ALCHEMY_CONN: <set to the key 'connection' in secret 'airflow-cfg'> Optional: false
      ...
    Mounts:
      /secrets from airflow-sa-creds (ro)
      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-pzscf (ro)
Conditions:
  Type             Status
  Initialized      True
  Ready            True
  ContainersReady  True
  PodScheduled     True
Volumes:
  airflow-sa-creds:
    Type:       Secret (a volume populated by a Secret)
    SecretName: airflow-sa-creds
    Optional:   false
  kube-api-access-pzscf:
    Type:                    Projected (a volume that contains injected data from multiple sources)
    TokenExpirationSeconds:  3607
    ConfigMapName:           kube-root-ca.crt
    ConfigMapOptional:       <nil>
    DownwardAPI:             true
QoS Class:       Guaranteed
Node-Selectors:  <none>
Tolerations:     node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
                 node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
Events:
  Type     Reason            Age                From                                   Message
  ----     ------            ----               ----                                   -------
  Warning  FailedScheduling  54m                gke.io/optimize-utilization-scheduler  0/8 nodes are available: 1 node(s) had taint {ToBeDeletedByClusterAutoscaler: 1647652429}, that the pod didn't tolerate, 1 node(s) had taint {ToBeDeletedByClusterAutoscaler: 1647652450}, that the pod didn't tolerate, 2 Insufficient memory, 6 Insufficient cpu.
  Warning  FailedScheduling  53m (x1 over 54m)  gke.io/optimize-utilization-scheduler  0/8 nodes are available: 1 node(s) had taint {ToBeDeletedByClusterAutoscaler: 1647652429}, that the pod didn't tolerate, 1 node(s) had taint {ToBeDeletedByClusterAutoscaler: 1647652450}, that the pod didn't tolerate, 2 Insufficient memory, 6 Insufficient cpu.
  Warning  FailedScheduling  52m                gke.io/optimize-utilization-scheduler  0/7 nodes are available: 1 node(s) had taint {node.kubernetes.io/not-ready: }, that the pod didn't tolerate, 2 Insufficient memory, 6 Insufficient cpu.
  Warning  FailedScheduling  52m                gke.io/optimize-utilization-scheduler  0/8 nodes are available: 2 Insufficient memory, 2 node(s) had taint {node.kubernetes.io/not-ready: }, that the pod didn't tolerate, 6 Insufficient cpu.
  Normal   Scheduled         52m                gke.io/optimize-utilization-scheduler  Successfully assigned default/airflow-worker-5845f7bd45-7dxf5 to gk3-pinkdolphin-non-prod-nap-174z7o1z-3fd828af-dkpj
  Normal   TriggeredScaleUp  53m                cluster-autoscaler                     pod triggered scale-up: [{https://www.googleapis.com/compute/v1/projects/project/zones/northamerica-northeast1-b/instanceGroups/gk3-pinkdolphin-non-prod-nap-174z7o1z-bd4be871-grp 0->1 (max: 1000)} {https://www.googleapis.com/compute/v1/projects/project/zones/northamerica-northeast1-c/instanceGroups/gk3-redacted-node-name 0->1 (max: 1000)}]
  Normal   Pulling           52m                kubelet                                Pulling image "gcr.io/project/airflow:release-20220318-9b2632e85"
  Normal   Pulled            51m                kubelet                                Successfully pulled image "gcr.io/project/airflow:release-20220318-9b2632e85" in 45.084169585s
  Normal   Created           51m                kubelet                                Created container worker
  Normal   Started           51m                kubelet                                Started container worker
  Warning  Unhealthy         51m                kubelet                                Startup probe failed: dial tcp 10.48.0.131:8793: connect: connection refused
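
The Unhealthy event is the tcp-socket startup probe on :8793, the port the worker's gunicorn log server binds in the logs below; it failed once while the container was still booting. A minimal sketch reproducing that probe by hand, with the pod IP and port taken from the description above:

    import socket

    # Same check the kubelet startup probe performs: a bare TCP connect to
    # the worker's log-server port.
    socket.create_connection(("10.48.0.131", 8793), timeout=1).close()
    print("port 8793 open")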
Worker log, pod airflow-worker-5845f7bd45-xnz86:
/home/airflow/.local/lib/python3.8/site-packages/airflow/configuration.py:357 DeprecationWarning: The dag_concurrency option in [core] has been renamed to max_active_tasks_per_dag - the old setting has been used, but please update your config.
[2022-03-19 03:05:33 +0000] [8] [INFO] Starting gunicorn 20.1.0
[2022-03-19 03:05:33 +0000] [8] [INFO] Listening at: http://0.0.0.0:8793 (8)
[2022-03-19 03:05:33 +0000] [8] [INFO] Using worker: sync
[2022-03-19 03:05:33 +0000] [9] [INFO] Booting worker with pid: 9
[2022-03-19 03:05:33 +0000] [10] [INFO] Booting worker with pid: 10
/home/airflow/.local/lib/python3.8/site-packages/airflow/configuration.py:357 DeprecationWarning: The processor_poll_interval option in [scheduler] has been renamed to scheduler_idle_sleep_time - the old setting has been used, but please update your config.
 -------------- celery@airflow-worker-5845f7bd45-xnz86 v5.2.2 (dawn-chorus)
--- ***** -----
-- ******* ---- Linux-5.4.170+-x86_64-with-glibc2.2.5 2022-03-19 03:05:34
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app:         airflow.executors.celery_executor:0x7f96510af850
- ** ---------- .> transport:   redis://redis:6379/0
- ** ---------- .> results:     postgresql://postgres:**@postgres:5432/postgres
- *** --- * --- .> concurrency: 5 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
                .> default exchange=default(direct) key=default
[tasks]
  . airflow.executors.celery_executor.execute_command
[2022-03-19 03:05:36,627: INFO/MainProcess] Connected to redis://redis:6379/0
[2022-03-19 03:05:36,639: INFO/MainProcess] mingle: searching for neighbors
[2022-03-19 03:05:37,654: INFO/MainProcess] mingle: all alone
[2022-03-19 03:05:37,667: INFO/MainProcess] celery@airflow-worker-5845f7bd45-xnz86 ready.
[2022-03-19 03:05:37,670: INFO/MainProcess] Task airflow.executors.celery_executor.execute_command[e0d64119-cd59-4020-b6f9-538b84877cc1] received
[2022-03-19 03:05:37,674: INFO/MainProcess] Task airflow.executors.celery_executor.execute_command[df0a46c0-9516-4542-861d-8541d6affd0c] received
[2022-03-19 03:05:37,689: INFO/MainProcess] Task airflow.executors.celery_executor.execute_command[e45982f2-fb71-4242-b6bc-f5c4a9c92c70] received
[2022-03-19 03:05:37,700: INFO/MainProcess] Task airflow.executors.celery_executor.execute_command[96cde597-817e-495f-a1cb-536dd35e84bc] received
[2022-03-19 03:05:37,708: INFO/MainProcess] Task airflow.executors.celery_executor.execute_command[d1b48bfd-20d3-41e5-a4ce-42968807c7cd] received
[2022-03-19 03:05:37,728: INFO/ForkPoolWorker-3] Executing command in Celery: ['airflow', 'tasks', 'run', 'pinkdolphin-clinicinfo', 'start', 'manual__2022-03-19T01:21:48+00:00', '--local', '--subdir', 'DAGS_FOLDER/clinicinfo_raw.py']
[2022-03-19 03:05:37,728: INFO/ForkPoolWorker-3] Celery task ID: e0d64119-cd59-4020-b6f9-538b84877cc1
[2022-03-19 03:05:37,814: INFO/ForkPoolWorker-1] Executing command in Celery: ['airflow', 'tasks', 'run', 'pinkdolphin-clinicinfo', 'start', 'manual__2022-03-19T02:00:04.171370+00:00', '--local', '--subdir', 'DAGS_FOLDER/clinicinfo_raw.py']
[2022-03-19 03:05:37,815: INFO/ForkPoolWorker-1] Celery task ID: 96cde597-817e-495f-a1cb-536dd35e84bc
[2022-03-19 03:05:37,830: INFO/ForkPoolWorker-4] Executing command in Celery: ['airflow', 'tasks', 'run', 'pinkdolphin-clinicinfo', 'start', 'manual__2022-03-19T01:22:46+00:00', '--local', '--subdir', 'DAGS_FOLDER/clinicinfo_raw.py']
[2022-03-19 03:05:37,831: INFO/ForkPoolWorker-4] Celery task ID: df0a46c0-9516-4542-861d-8541d6affd0c
[2022-03-19 03:05:37,844: INFO/ForkPoolWorker-2] Executing command in Celery: ['airflow', 'tasks', 'run', 'pinkdolphin-clinicinfo', 'start', 'manual__2022-03-19T02:00:04.737649+00:00', '--local', '--subdir', 'DAGS_FOLDER/clinicinfo_raw.py']
[2022-03-19 03:05:37,847: INFO/ForkPoolWorker-2] Celery task ID: d1b48bfd-20d3-41e5-a4ce-42968807c7cd
[2022-03-19 03:05:37,903: INFO/ForkPoolWorker-5] Executing command in Celery: ['airflow', 'tasks', 'run', 'pinkdolphin-clinicinfo', 'start', 'manual__2022-03-19T01:23:09+00:00', '--local', '--subdir', 'DAGS_FOLDER/clinicinfo_raw.py']
[2022-03-19 03:05:37,903: INFO/ForkPoolWorker-5] Celery task ID: e45982f2-fb71-4242-b6bc-f5c4a9c92c70
[2022-03-19 03:05:38,125: WARNING/ForkPoolWorker-4] /home/airflow/.local/lib/python3.8/site-packages/airflow/utils/cli.py:149: SAWarning: relationship 'DagRun.serialized_dag' will copy column serialized_dag.dag_id to column dag_run.dag_id, which conflicts with relationship(s): 'BaseXCom.dag_run' (copies xcom.dag_id to dag_run.dag_id). If this is not the intention, consider if these relationships should be linked with back_populates, or if viewonly=True should be applied to one or more if they are read-only. For the less common case that foreign key constraints are partially overlapping, the orm.foreign() annotation can be used to isolate the columns that should be written towards. To silence this warning, add the parameter 'overlaps="dag_run"' to the 'DagRun.serialized_dag' relationship. (Background on this error at: https://sqlalche.me/e/14/qzyx)
  log = Log(
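
The SAWarning at the tail end again carries its own suggested fix, the overlaps parameter. A hypothetical, heavily simplified pair of models showing the shape of that change; Airflow's real DagRun/SerializedDagModel mappings are richer than this sketch:

    from sqlalchemy import Column, ForeignKey, String
    from sqlalchemy.orm import declarative_base, relationship

    Base = declarative_base()

    class DagRun(Base):
        __tablename__ = "dag_run"
        dag_id = Column(String, primary_key=True)
        # Naming the overlapping relationship tells the ORM the shared
        # column writes are intentional, which silences the SAWarning.
        serialized_dag = relationship("SerializedDagModel", overlaps="dag_run")

    class SerializedDagModel(Base):
        __tablename__ = "serialized_dag"
        dag_id = Column(String, ForeignKey("dag_run.dag_id"), primary_key=True)
        dag_run = relationship("DagRun", overlaps="serialized_dag")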
Worker log, pod airflow-worker-5845f7bd45-7dxf5:
/home/airflow/.local/lib/python3.8/site-packages/airflow/configuration.py:357 DeprecationWarning: The dag_concurrency option in [core] has been renamed to max_active_tasks_per_dag - the old setting has been used, but please update your config.
[2022-03-19 01:17:59 +0000] [8] [INFO] Starting gunicorn 20.1.0
[2022-03-19 01:17:59 +0000] [8] [INFO] Listening at: http://0.0.0.0:8793 (8)
[2022-03-19 01:17:59 +0000] [8] [INFO] Using worker: sync
[2022-03-19 01:17:59 +0000] [9] [INFO] Booting worker with pid: 9
[2022-03-19 01:17:59 +0000] [10] [INFO] Booting worker with pid: 10
/home/airflow/.local/lib/python3.8/site-packages/airflow/configuration.py:357 DeprecationWarning: The processor_poll_interval option in [scheduler] has been renamed to scheduler_idle_sleep_time - the old setting has been used, but please update your config.
 -------------- celery@airflow-worker-5845f7bd45-7dxf5 v5.2.2 (dawn-chorus)
--- ***** -----
-- ******* ---- Linux-5.4.170+-x86_64-with-glibc2.2.5 2022-03-19 01:18:00
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app:         airflow.executors.celery_executor:0x7ff5a52ae700
- ** ---------- .> transport:   redis://redis:6379/0
- ** ---------- .> results:     postgresql://postgres:**@postgres:5432/postgres
- *** --- * --- .> concurrency: 5 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
                .> default exchange=default(direct) key=default
[tasks]
  . airflow.executors.celery_executor.execute_command
[2022-03-19 01:18:01,862: INFO/MainProcess] Connected to redis://redis:6379/0
[2022-03-19 01:18:01,871: INFO/MainProcess] mingle: searching for neighbors
[2022-03-19 01:18:02,888: INFO/MainProcess] mingle: all alone
[2022-03-19 01:18:02,904: INFO/MainProcess] celery@airflow-worker-5845f7bd45-7dxf5 ready.
[2022-03-19 01:18:06,148: INFO/MainProcess] Events of group {task} enabled by remote.
[2022-03-19 01:18:22,696: WARNING/MainProcess] consumer: Connection to broker lost. Trying to re-establish the connection...
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.8/site-packages/celery/worker/consumer/consumer.py", line 326, in start
    blueprint.start(self)
  File "/home/airflow/.local/lib/python3.8/site-packages/celery/bootsteps.py", line 116, in start
    step.start(parent)
  File "/home/airflow/.local/lib/python3.8/site-packages/celery/worker/consumer/consumer.py", line 618, in start
    c.loop(*c.loop_args())
  File "/home/airflow/.local/lib/python3.8/site-packages/celery/worker/loops.py", line 97, in asynloop
    next(loop)
  File "/home/airflow/.local/lib/python3.8/site-packages/kombu/asynchronous/hub.py", line 362, in create_loop
    cb(*cbargs)
  File "/home/airflow/.local/lib/python3.8/site-packages/kombu/transport/redis.py", line 1266, in on_readable
    self.cycle.on_readable(fileno)
  File "/home/airflow/.local/lib/python3.8/site-packages/kombu/transport/redis.py", line 504, in on_readable
    chan.handlers[type]()
  File "/home/airflow/.local/lib/python3.8/site-packages/kombu/transport/redis.py", line 847, in _receive
    ret.append(self._receive_one(c))
  File "/home/airflow/.local/lib/python3.8/site-packages/kombu/transport/redis.py", line 857, in _receive_one
    response = c.parse_response()
  File "/home/airflow/.local/lib/python3.8/site-packages/redis/client.py", line 3505, in parse_response
    response = self._execute(conn, conn.read_response)
  File "/home/airflow/.local/lib/python3.8/site-packages/redis/client.py", line 3479, in _execute
    return command(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.8/site-packages/redis/connection.py", line 739, in read_response
    response = self._parser.read_response()
  File "/home/airflow/.local/lib/python3.8/site-packages/redis/connection.py", line 324, in read_response
    raw = self._buffer.readline()
  File "/home/airflow/.local/lib/python3.8/site-packages/redis/connection.py", line 256, in readline
    self._read_from_socket()
  File "/home/airflow/.local/lib/python3.8/site-packages/redis/connection.py", line 201, in _read_from_socket
    raise ConnectionError(SERVER_CLOSED_CONNECTION_ERROR)
redis.exceptions.ConnectionError: Connection closed by server.
[2022-03-19 01:18:22,703: WARNING/MainProcess] /home/airflow/.local/lib/python3.8/site-packages/celery/worker/consumer/consumer.py:361: CPendingDeprecationWarning:
In Celery 5.1 we introduced an optional breaking change which
on connection loss cancels all currently executed tasks with late acknowledgement enabled.
These tasks cannot be acknowledged as the connection is gone, and the tasks are automatically redelivered back to the queue.
You can enable this behavior using the worker_cancel_long_running_tasks_on_connection_loss setting.
In Celery 5.1 it is set to False by default. The setting will be set to True by default in Celery 6.0.
  warnings.warn(CANCEL_TASKS_BY_DEFAULT, CPendingDeprecationWarning)
[2022-03-19 01:18:22,707: ERROR/MainProcess] consumer: Cannot connect to redis://redis:6379/0: Error 111 connecting to redis:6379. Connection refused..
Trying again in 2.00 seconds... (1/100)
[2022-03-19 01:18:25,737: ERROR/MainProcess] consumer: Cannot connect to redis://redis:6379/0: Error 111 connecting to redis:6379. Connection refused..
Trying again in 4.00 seconds... (2/100)
[2022-03-19 01:18:30,793: ERROR/MainProcess] consumer: Cannot connect to redis://redis:6379/0: Error 111 connecting to redis:6379. Connection refused..
Trying again in 6.00 seconds... (3/100)
[2022-03-19 01:18:37,833: ERROR/MainProcess] consumer: Cannot connect to redis://redis:6379/0: Error 111 connecting to redis:6379. Connection refused..
Trying again in 8.00 seconds... (4/100)
[2022-03-19 01:18:46,857: ERROR/MainProcess] consumer: Cannot connect to redis://redis:6379/0: Error 111 connecting to redis:6379. Connection refused..
Trying again in 10.00 seconds... (5/100)
[2022-03-19 01:18:57,929: ERROR/MainProcess] consumer: Cannot connect to redis://redis:6379/0: Error 111 connecting to redis:6379. Connection refused..
Trying again in 12.00 seconds... (6/100)
[2022-03-19 01:19:09,949: INFO/MainProcess] Connected to redis://redis:6379/0
[2022-03-19 01:19:09,953: INFO/MainProcess] mingle: searching for neighbors
[2022-03-19 01:19:10,960: INFO/MainProcess] mingle: all alone
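
Separate from the broker flakiness, every process in this gist repeats the same two DeprecationWarnings. A one-off sketch of the renames they ask for, assuming a stock airflow.cfg in the working directory (the path and file layout are assumptions):

    import configparser

    # The two renames the warnings name; section/option names come straight
    # from the warning text. interpolation=None avoids tripping over
    # '%'-style values elsewhere in airflow.cfg.
    RENAMES = [
        ("core", "dag_concurrency", "max_active_tasks_per_dag"),
        ("scheduler", "processor_poll_interval", "scheduler_idle_sleep_time"),
    ]

    cfg = configparser.ConfigParser(interpolation=None)
    cfg.read("airflow.cfg")  # adjust to your deployment's config path
    for section, old, new in RENAMES:
        if cfg.has_option(section, old):
            cfg.set(section, new, cfg.get(section, old))
            cfg.remove_option(section, old)
    with open("airflow.cfg", "w") as fh:
        cfg.write(fh)  # note: rewrites the file, dropping comments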