airflow: Unable to load custom logging from log_config.LOGGING_CONFIG due to No module named 'log_config'

Apache Airflow version

2.2.2 (latest released)

Operating System

Ubuntu 18.04.4 LTS

Versions of Apache Airflow Providers

Providers info
apache-airflow-providers-amazon          | 2.4.0
apache-airflow-providers-celery          | 2.1.0
apache-airflow-providers-cncf-kubernetes | 2.1.0
apache-airflow-providers-docker          | 2.3.0
apache-airflow-providers-elasticsearch   | 2.1.0
apache-airflow-providers-ftp             | 2.0.1
apache-airflow-providers-google          | 6.1.0
apache-airflow-providers-grpc            | 2.0.1
apache-airflow-providers-hashicorp       | 2.1.1
apache-airflow-providers-http            | 2.0.1
apache-airflow-providers-imap            | 2.0.1
apache-airflow-providers-microsoft-azure | 3.3.0
apache-airflow-providers-mysql           | 2.1.1
apache-airflow-providers-odbc            | 2.0.1
apache-airflow-providers-postgres        | 2.3.0
apache-airflow-providers-redis           | 2.0.1
apache-airflow-providers-sendgrid        | 2.0.1
apache-airflow-providers-sftp            | 2.2.0
apache-airflow-providers-slack           | 4.1.0
apache-airflow-providers-sqlite          | 2.0.1
apache-airflow-providers-ssh             | 2.3.0

Deployment

Docker-Compose

Deployment details

Structure of my local directory:

.
├── airflow.sh
├── config
│   └── log_config.py
├── dags
│   └── test.py
├── docker-compose.yaml
├── logs
├── Makefile
├── parameters.json
├── plugins
└── test.py

docker-compose.yaml

The only changes I have made are in the common env section:

  • added a PYTHONPATH env variable
  • mounted the config folder where my log customization code lives
  environment:
    &airflow-common-env
    PYTHONPATH: /opt/airflow/config
    AIRFLOW__LOGGING__LOGGING_CONFIG_CLASS: 'log_config.LOGGING_CONFIG'
    AIRFLOW__CORE__EXECUTOR: CeleryExecutor
    AIRFLOW__CORE__SQL_ALCHEMY_CONN: postgresql+psycopg2://airflow:airflow@postgres/airflow
    AIRFLOW__CELERY__RESULT_BACKEND: db+postgresql://airflow:airflow@postgres/airflow
    AIRFLOW__CELERY__BROKER_URL: redis://:@redis:6379/0
    AIRFLOW__CORE__FERNET_KEY: ''
    AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION: 'false'
    AIRFLOW__CORE__LOAD_EXAMPLES: 'false'
    AIRFLOW__API__AUTH_BACKEND: 'airflow.api.auth.backend.basic_auth'
    _PIP_ADDITIONAL_REQUIREMENTS: ${_PIP_ADDITIONAL_REQUIREMENTS:-}
  volumes:
    - ./dags:/opt/airflow/dags
    - ./logs:/opt/airflow/logs
    - ./plugins:/opt/airflow/plugins
    - ./config:/opt/airflow/config
  user: "${AIRFLOW_UID:-50000}:0"
  depends_on:
    &airflow-common-depends-on
    redis:
      condition: service_healthy
    postgres:
      condition: service_healthy

log_config.py (nothing special for now):

from copy import deepcopy
from airflow.config_templates.airflow_local_settings import DEFAULT_LOGGING_CONFIG

LOGGING_CONFIG = deepcopy(DEFAULT_LOGGING_CONFIG)
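
For context, the eventual goal is to customize this dict, along these lines (an illustrative sketch only; the override below is hypothetical, though the "airflow" formatter key does exist in DEFAULT_LOGGING_CONFIG):

# Hypothetical example of the kind of customization intended,
# once the module actually imports:
LOGGING_CONFIG["formatters"]["airflow"]["format"] = (
    "[%(asctime)s] {%(filename)s:%(lineno)d} %(levelname)s - %(message)s"
)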

What happened

When executing the docker-compose up command, the program exits with the following errors:

airflow_postgres_1 is up-to-date
airflow_redis_1 is up-to-date
Recreating airflow_airflow-init_1 ... done

ERROR: for airflow-scheduler  Container "2b3fbb5a9e97" exited with code 1.

ERROR: for airflow-webserver  Container "2b3fbb5a9e97" exited with code 1.

ERROR: for airflow-worker  Container "2b3fbb5a9e97" exited with code 1.

ERROR: for flower  Container "2b3fbb5a9e97" exited with code 1.

ERROR: for airflow-triggerer  Container "2b3fbb5a9e97" exited with code 1.
ERROR: Encountered errors while bringing up the project.

When looking at the error using the docker logs command:

Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/logging_config.py", line 41, in configure_logging
    logging_config = import_string(logging_class_path)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/module_loading.py", line 32, in import_string
    module = import_module(module_path)
  File "/usr/local/lib/python3.7/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1006, in _gcd_import
  File "<frozen importlib._bootstrap>", line 983, in _find_and_load
  File "<frozen importlib._bootstrap>", line 965, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'log_config'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/airflow/.local/bin/airflow", line 5, in <module>
    from airflow.__main__ import main
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/__init__.py", line 46, in <module>
    settings.initialize()
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/settings.py", line 483, in initialize
    LOGGING_CLASS_PATH = configure_logging()
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/logging_config.py", line 50, in configure_logging
    raise ImportError(f'Unable to load custom logging from {logging_class_path} due to {err}')
ImportError: Unable to load custom logging from log_config.LOGGING_CONFIG due to No module named 'log_config'

ERROR!!!: Too old Airflow version !
The minimum Airflow version supported: 2.2.0. Only use this or higher!
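
For context, Airflow resolves AIRFLOW__LOGGING__LOGGING_CONFIG_CLASS with importlib, roughly as in the simplified sketch below (not the exact Airflow source). The import fails with ModuleNotFoundError whenever the directory containing log_config.py is missing from sys.path inside the container:

from importlib import import_module

def import_string(dotted_path):
    # Simplified take on airflow.utils.module_loading.import_string:
    # "log_config.LOGGING_CONFIG" splits into module "log_config"
    # and attribute "LOGGING_CONFIG".
    module_path, _, attr_name = dotted_path.rpartition(".")
    # Raises ModuleNotFoundError when the module's directory
    # is not on sys.path (i.e. not covered by PYTHONPATH).
    module = import_module(module_path)
    return getattr(module, attr_name)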

What you expected to happen

Being able to use my custom logger definition.

How to reproduce

No response

Anything else

No response

Are you willing to submit PR?

  • Yes I am willing to submit a PR!

About this issue

  • Original URL
  • State: closed
  • Created 3 years ago
  • Comments: 19 (9 by maintainers)

Most upvoted comments

Of course. You should use the “standalone” command of Airflow rather than docker-compose.

https://airflow.apache.org/docs/apache-airflow/stable/cli-and-env-variables-ref.html#standalone

You are trying to shoot yourself in the foot by complicating your setup (and dragging a busy maintainer into it).

I also think it’s good to read some advice:

First of all, you are on your own there (so any custom configuration there is definitely not a bug). Discussions are much better for that. You were advised to do that when you opened a “bug”.

Secondly - you should also listen to the helpful messages printed. Your problem with docker-compose might, for example, come from this very straightforward error that you have:

ERROR!!!: Too old Airflow version !
The minimum Airflow version supported: 2.2.0. Only use this or higher!

I do not know if that’s the problem, but if the compose tells you that you are using the wrong version, I think you should think twice before opening an issue classified as a “bug”.

You could patch your docker-compose.yml by modifying the “command” section of the airflow-init service (see the last three lines of the following excerpt). When the airflow-init container starts, the config directory is not in the Python path…

airflow-init:
  <<: *airflow-common
  entrypoint: /bin/bash
  # yamllint disable rule:line-length
  command:
    - -c
    - |
      function ver() {
        printf "%04d%04d%04d%04d" $${1//./ }
      }
      export PYTHONPATH=${PYTHONPATH}:/sources/config
      mkdir -p /sources/logs /sources/dags /sources/plugins /sources/config
      chown -R "${AIRFLOW_UID}:0" /sources/{logs,dags,plugins,config}
    
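To verify that the module then resolves inside a container, a quick check like the following can help (a hypothetical snippet; run it with, e.g., docker-compose exec airflow-scheduler python):

import sys

# Mirror what PYTHONPATH=/opt/airflow/config is supposed to provide
# in the running containers.
sys.path.insert(0, "/opt/airflow/config")

import log_config

# If this prints <class 'dict'>, Airflow's import_string will succeed too.
print(type(log_config.LOGGING_CONFIG))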

Hi there,

I’m facing the same issue as @xelita. I want to implement custom logging based on the official documentation: https://airflow.apache.org/docs/apache-airflow/stable/logging-monitoring/logging-tasks.html#advanced-configuration

Sure, for development it would be easier to test this locally without containers, but I will use the new logging configuration in a Kubernetes cluster and a container-based setup, which means the compose stack is perfect for testing it. I didn’t modify the compose file; I just added the config folder and two files based on the documentation, and set a variable. This is not a change, it is an extension. So when we’re using the same containers anywhere (Docker, Compose, Kubernetes), this issue will happen. I don’t see why this is a compose-related problem.

Summarizing: following the steps in the documentation leads to an error during container startup. @xelita did you find a solution?

No, I have followed @potiuk’s recommendation and we are now using the official Helm chart to deploy Airflow, which actually works fine.