Airflow: ValueError: Unable to configure handler 'processor' - wasb logger

I am trying to configure remote logging with Azure blob.

Airflow version: 1.10.2
Python: 3.6.5
Ubuntu: 18.04

Following are the steps I took:

  1. In $AIRFLOW_HOME/config/log_config.py, I have put REMOTE_BASE_LOG_FOLDER = 'wasb-airflow-logs' (this is a folder inside the blob container, whose name is airflow-logs)
  2. An empty __init__.py is in $AIRFLOW_HOME/config/
  3. $AIRFLOW_HOME/config/ is added in $PYTHONPATH
  4. Renamed DEFAULT_LOGGING_CONFIG to LOGGING_CONFIG everywhere in $AIRFLOW_HOME/config/log_config.py
  5. The user defined in the Airflow blob connection has read/write access to REMOTE_BASE_LOG_FOLDER
  6. $AIRFLOW_HOME/airflow.cfg has the following settings (a sketch of the resulting log_config.py follows this list):

remote_logging = True
logging_config_class = log_config.LOGGING_CONFIG
remote_log_conn_id =
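
For reference, here is roughly what the wasb fragment of the copied template looks like in 1.10.x. This is a sketch from memory rather than my verified file: the local folders, filename templates, and container name are illustrative placeholders, and LOGGING_CONFIG is the renamed copy of DEFAULT_LOGGING_CONFIG defined earlier in the file.

# Sketch only: the wasb-related fragment of $AIRFLOW_HOME/config/log_config.py,
# after copying airflow/config_templates/airflow_local_settings.py and renaming
# DEFAULT_LOGGING_CONFIG to LOGGING_CONFIG. Values below are placeholders.
import os

BASE_LOG_FOLDER = os.path.expanduser('~/airflow/logs')            # placeholder
PROCESSOR_LOG_FOLDER = os.path.expanduser('~/airflow/logs/scheduler')
FILENAME_TEMPLATE = '{{ ti.dag_id }}/{{ ti.task_id }}/{{ ts }}/{{ try_number }}.log'
PROCESSOR_FILENAME_TEMPLATE = '{{ filename }}.log'
REMOTE_BASE_LOG_FOLDER = 'wasb-airflow-logs'   # folder inside the blob container

# These entries point the task and processor handlers at Azure blob storage.
LOGGING_CONFIG['handlers'].update({
    'task': {
        'class': 'airflow.utils.log.wasb_task_handler.WasbTaskHandler',
        'formatter': 'airflow',
        'base_log_folder': BASE_LOG_FOLDER,
        'wasb_log_folder': REMOTE_BASE_LOG_FOLDER,
        'wasb_container': 'airflow-logs',      # blob container name
        'filename_template': FILENAME_TEMPLATE,
        'delete_local_copy': False,
    },
    'processor': {
        'class': 'airflow.utils.log.wasb_task_handler.WasbTaskHandler',
        'formatter': 'airflow',
        'base_log_folder': PROCESSOR_LOG_FOLDER,
        'wasb_log_folder': REMOTE_BASE_LOG_FOLDER,
        'wasb_container': 'airflow-logs',
        'filename_template': PROCESSOR_FILENAME_TEMPLATE,
        'delete_local_copy': False,
    },
})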

Following is the error:

Unable to load the config, contains a configuration error.
Traceback (most recent call last):
  File "/home/gsingh/anaconda3/lib/python3.6/logging/config.py", line 382, in resolve
    found = getattr(found, frag)
AttributeError: module 'airflow.utils.log' has no attribute 'wasb_task_handler'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/gsingh/anaconda3/lib/python3.6/logging/config.py", line 384, in resolve
    self.importer(used)
  File "/home/gsingh/venv/lib/python3.6/site-packages/airflow/utils/log/wasb_task_handler.py", line 23, in <module>
    from airflow.contrib.hooks.wasb_hook import WasbHook
  File "/home/gsingh/venv/lib/python3.6/site-packages/airflow/contrib/hooks/wasb_hook.py", line 22, in <module>
    from airflow.hooks.base_hook import BaseHook
  File "/home/gsingh/venv/lib/python3.6/site-packages/airflow/hooks/base_hook.py", line 28, in <module>
    from airflow.models import Connection
  File "/home/gsingh/venv/lib/python3.6/site-packages/airflow/models.py", line 86, in <module>
    from airflow.utils.dag_processing import list_py_file_paths
  File "/home/gsingh/venv/lib/python3.6/site-packages/airflow/utils/dag_processing.py", line 49, in <module>
    from airflow.settings import logging_class_path
ImportError: cannot import name 'logging_class_path'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/gsingh/anaconda3/lib/python3.6/logging/config.py", line 558, in configure
    handler = self.configure_handler(handlers[name])
  File "/home/gsingh/anaconda3/lib/python3.6/logging/config.py", line 708, in configure_handler
    klass = self.resolve(cname)
  File "/home/gsingh/anaconda3/lib/python3.6/logging/config.py", line 391, in resolve
    raise v
  File "/home/gsingh/anaconda3/lib/python3.6/logging/config.py", line 384, in resolve
    self.importer(used)
  File "/home/gsingh/venv/lib/python3.6/site-packages/airflow/utils/log/wasb_task_handler.py", line 23, in <module>
    from airflow.contrib.hooks.wasb_hook import WasbHook
  File "/home/gsingh/venv/lib/python3.6/site-packages/airflow/contrib/hooks/wasb_hook.py", line 22, in <module>
    from airflow.hooks.base_hook import BaseHook
  File "/home/gsingh/venv/lib/python3.6/site-packages/airflow/hooks/base_hook.py", line 28, in <module>
    from airflow.models import Connection
  File "/home/gsingh/venv/lib/python3.6/site-packages/airflow/models.py", line 86, in <module>
    from airflow.utils.dag_processing import list_py_file_paths
  File "/home/gsingh/venv/lib/python3.6/site-packages/airflow/utils/dag_processing.py", line 49, in <module>
    from airflow.settings import logging_class_path
ValueError: Cannot resolve 'airflow.utils.log.wasb_task_handler.WasbTaskHandler': cannot import name 'logging_class_path'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/gsingh/venv/bin/airflow", line 21, in <module>
    from airflow import configuration
  File "/home/gsingh/venv/lib/python3.6/site-packages/airflow/__init__.py", line 36, in <module>
    from airflow import settings, configuration as conf
  File "/home/gsingh/venv/lib/python3.6/site-packages/airflow/settings.py", line 262, in <module>
    logging_class_path = configure_logging()
  File "/home/gsingh/venv/lib/python3.6/site-packages/airflow/logging_config.py", line 73, in configure_logging
    raise e
  File "/home/gsingh/venv/lib/python3.6/site-packages/airflow/logging_config.py", line 68, in configure_logging
    dictConfig(logging_config)
  File "/home/gsingh/anaconda3/lib/python3.6/logging/config.py", line 795, in dictConfig
    dictConfigClass(config).configure()
  File "/home/gsingh/anaconda3/lib/python3.6/logging/config.py", line 566, in configure
    '%r: %s' % (name, e))
ValueError: Unable to configure handler 'processor': Cannot resolve 'airflow.utils.log.wasb_task_handler.WasbTaskHandler': cannot import name 'logging_class_path'

I am not sure which configuration I am missing. Has anyone faced the same issue?

Shotwell answered 21/3, 2019 at 6:4 Comment(0)

You need to install the azure package.

pip install 'apache-airflow[azure_blob_storage,azure_data_lake,azure_cosmos,azure_container_instances]'

As per UPDATING.md, this now should be installed with

pip install 'apache-airflow[azure]'

But this didn't work for me.
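
Either way, a quick smoke test (my own suggestion, not from the docs) is to try the exact import the logging config fails on; if this fails, the azure extras are still missing:

# Attempts the same import that appears in the traceback above.
python -c "from airflow.contrib.hooks.wasb_hook import WasbHook; print('wasb import OK')"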

Butterfat answered 27/5, 2019 at 8:49 Comment(2)
pip install apache-airflow[azure] has been deprecated in airflow 1.10.3. – Denial
Good to know, poorly documented. – Butterfat

I had the same error; however, when I scrolled up higher I could see that another exception had been thrown before the ValueError: a PermissionError.

PermissionError: [Errno 13] Permission denied: '/usr/local/airflow/logs/scheduler'

The reason I got that error is that I didn't create the initial three folders (dags, logs, plugins) before running the Airflow Docker container. Docker seems to have created them automatically, but with the wrong permissions.

Steps to fix (a consolidated shell version follows the list):

  1. Stop the current containers:
docker-compose down --volumes --remove-orphans
  2. Delete the folders dags, logs, plugins
  3. Just in case, destroy the images and volumes already created (in Docker Desktop)
  4. Create the folders again from the command line:
mkdir logs dags plugins
  5. Run the Airflow Docker setup again:
docker-compose up airflow-init
docker-compose up
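
Put together as one shell session (a sketch that assumes the official docker-compose.yaml from the Airflow quick start is in the current directory):

# Stop everything and drop the volumes created with wrong permissions.
docker-compose down --volumes --remove-orphans
# Remove the mis-owned folders and recreate them as your own user.
rm -rf dags logs plugins
mkdir -p dags logs plugins
# Re-initialize the metadata database, then start Airflow.
docker-compose up airflow-init
docker-compose up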
Einsteinium answered 18/1, 2023 at 16:59 Comment(1)
@Shotwell perhaps you want to reconsider marking my answer as the solution as it seems to help more people? – Einsteinium

In my case the fix was:

sudo chown 50000:0 dags logs plugins

I tried running the official docker-compose.yml with all its containers (which depend on these three volume mounts), and also wrapping airflow standalone into a single container for debugging. It turned out the volumes had been created with root ownership instead of the airflow user's.
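
To check whether this is your problem (a quick diagnostic of my own; the official image runs as UID 50000):

# Numeric listing: the owner UID of each folder should be 50000, group 0.
ls -lnd dags logs plugins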

Internationalism answered 18/8, 2022 at 23:40 Comment(0)

For operating systems other than Linux, you may get a warning that AIRFLOW_UID is not set; you can safely ignore it. You can also manually create a .env file in the same folder as docker-compose.yaml with this content to get rid of the warning:

AIRFLOW_UID=50000
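
For example, from the folder containing docker-compose.yaml (on Linux the quick start instead suggests using your own user id, i.e. AIRFLOW_UID=$(id -u)):

# Create .env next to docker-compose.yaml; 50000 matches the image default.
echo "AIRFLOW_UID=50000" > .env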

Winger answered 23/5 at 6:22 Comment(0)
