airflow: S3 Remote Logging not working
Apache Airflow version: v2.0.0b3
Kubernetes version (if you are using kubernetes) (use kubectl version): 1.16.15
Environment:
- Cloud provider or hardware configuration: AWS
- OS (e.g. from /etc/os-release):
- Kernel (e.g. uname -a):
- Install tools: Custom Helm Chart
- Others:
What happened:
S3 remote logging is not working. Below is the stack trace:
Running <TaskInstance: canary_dag.print_date 2020-12-09T19:46:17.200838+00:00 [queued]> on host canarydagprintdate-9fafada4409d4eafb5e6e9c7187810ae
[2020-12-09 19:54:09,825] {s3_task_handler.py:183} ERROR - Could not verify previous log to append: 'NoneType' object is not callable
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/amazon/aws/log/s3_task_handler.py", line 179, in s3_write
    if append and self.s3_log_exists(remote_log_location):
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/amazon/aws/log/s3_task_handler.py", line 141, in s3_log_exists
    return self.hook.check_for_key(remote_log_location)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/amazon/aws/hooks/s3.py", line 57, in wrapper
    connection = self.get_connection(self.aws_conn_id)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/hooks/base.py", line 63, in get_connection
    conn = Connection.get_connection_from_secrets(conn_id)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/connection.py", line 351, in get_connection_from_secrets
    conn = secrets_backend.get_connection(conn_id=conn_id)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/session.py", line 64, in wrapper
    with create_session() as session:
  File "/usr/local/lib/python3.7/contextlib.py", line 112, in __enter__
    return next(self.gen)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/session.py", line 29, in create_session
    session = settings.Session()
TypeError: 'NoneType' object is not callable
[2020-12-09 19:54:09,826] {s3_task_handler.py:193} ERROR - Could not write logs to s3://my-favorite-airflow-logs/canary_dag/print_date/2020-12-09T19:46:17.200838+00:00/2.log
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/amazon/aws/log/s3_task_handler.py", line 190, in s3_write
    encrypt=conf.getboolean('logging', 'ENCRYPT_S3_LOGS'),
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/amazon/aws/hooks/s3.py", line 57, in wrapper
    connection = self.get_connection(self.aws_conn_id)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/hooks/base.py", line 63, in get_connection
    conn = Connection.get_connection_from_secrets(conn_id)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/connection.py", line 351, in get_connection_from_secrets
    conn = secrets_backend.get_connection(conn_id=conn_id)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/session.py", line 64, in wrapper
    with create_session() as session:
  File "/usr/local/lib/python3.7/contextlib.py", line 112, in __enter__
    return next(self.gen)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/session.py", line 29, in create_session
    session = settings.Session()
TypeError: 'NoneType' object is not callable
stream closed
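Both traces bottom out at session = settings.Session() in airflow/utils/session.py, which means settings.Session is still None in the process that uploads the logs, presumably because settings.configure_orm() never ran (or was torn down) there. A minimal standalone sketch of that failure mode; the names mirror airflow.settings and airflow.utils.session, but this is an illustration, not Airflow's actual code:

```python
# Sketch: a module-level Session factory that stays None until
# configure_orm() runs. If the log handler fires in a process where
# the ORM was never configured, calling the factory raises the exact
# TypeError seen in the traces above.

Session = None  # normally replaced by a sessionmaker() in configure_orm()

def configure_orm():
    global Session
    Session = lambda: "a sqlalchemy session"  # stand-in for sessionmaker()

def create_session():
    # First thing airflow.utils.session.create_session does is call the factory.
    return Session()

try:
    create_session()  # ORM was never configured in this process
except TypeError as exc:
    print(exc)  # -> 'NoneType' object is not callable
```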
What you expected to happen:
To be able to see the task instance logs in the Airflow UI, read from the S3 remote location.
How to reproduce it:
Pulled the latest master and created an Airflow image from the Dockerfile in the repo, then ran a DAG with S3 remote logging enabled (configuration sketch below).
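For reference, S3 remote logging in Airflow 2.0 is enabled with settings along these lines in airflow.cfg. The bucket name is taken from the error message above; remote_log_conn_id = aws_default is an assumption for illustration, as the report does not state which connection id was used:

```ini
[logging]
remote_logging = True
remote_base_log_folder = s3://my-favorite-airflow-logs
# aws_default is illustrative; the actual connection id is not given in the report
remote_log_conn_id = aws_default
encrypt_s3_logs = False
```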
Fix for the local executor coming.