Set logging levels
I'm trying to use the standard library to debug my code:

This works fine:

import logging
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)
logger.info('message')

I can't make the logger work at the lower levels:

logging.basicConfig(level=logging.DEBUG)
logger = logging.getLogger(__name__)
logger.info('message')

logging.basicConfig(level=logging.DEBUG)
logger = logging.getLogger(__name__)
logger.debug('message')

I don't get any output from either of those.

Pooler answered 23/7, 2016 at 3:38 Comment(0)
What Python version? This works for me on 3.4. But note that basicConfig() won't affect the root handler if it's already set up:

This function does nothing if the root logger already has handlers configured for it.

To set the level on root explicitly, do logging.getLogger().setLevel(logging.DEBUG). But make sure you've called basicConfig() beforehand so the root logger initially has some handler set up. That is:

import logging
logging.basicConfig()
logging.getLogger().setLevel(logging.DEBUG)
logging.getLogger('foo').debug('bah')
logging.getLogger().setLevel(logging.INFO)
logging.getLogger('foo').debug('bah')

Also note that loggers and their handlers have distinct, independent log levels. So if you've previously loaded some complex logger configuration in your Python script, and it has messed with the root logger's handler(s), then just changing the logger's level with logging.getLogger().setLevel(..) may not be enough, because the attached handler may have its own level set independently. This is unlikely to be the case and not something you'd normally have to worry about.

Corkhill answered 23/7, 2016 at 3:55 Comment(1)
You can now pass basicConfig a force=True to make it work even if the root logger is already set up. – Gasperoni
I use the following setup for logging.

YAML-based config

Create a YAML file called logging.yml like this:

version: 1

formatters:
    simple:
        format: "%(name)s - %(lineno)d -  %(message)s"

    complex:
        format: "%(asctime)s - %(name)s - %(lineno)d -  %(message)s"


handlers:
    console:
        class: logging.StreamHandler
        level: DEBUG
        formatter: simple

    file:
        class: logging.handlers.TimedRotatingFileHandler
        when: midnight
        backupCount: 5
        level: DEBUG
        formatter: simple
        filename : Thrift.log

loggers:

    qsoWidget:
        level: INFO
        handlers: [console,file]
        propagate: yes

    __main__:   
        level: DEBUG
        handlers: [console]
        propagate: yes

Python - The main

The "main" module should look like this:

import logging
import logging.config

import yaml

# the config file created above is logging.yml
with open('logging.yml', 'rt') as f:
    config = yaml.safe_load(f)

logging.config.dictConfig(config)
logger = logging.getLogger(__name__)
logger.info("Contest is starting")
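If you'd rather not depend on PyYAML, the same configuration can be passed to dictConfig as a plain dict. A minimal sketch covering just the console handler from the YAML above:

```python
import logging
import logging.config

config = {
    'version': 1,
    'formatters': {
        'simple': {'format': '%(name)s - %(lineno)d -  %(message)s'},
    },
    'handlers': {
        'console': {
            'class': 'logging.StreamHandler',
            'level': 'DEBUG',
            'formatter': 'simple',
        },
    },
    'loggers': {
        '__main__': {'level': 'DEBUG', 'handlers': ['console'], 'propagate': True},
    },
}

logging.config.dictConfig(config)
logging.getLogger('__main__').info('configured without yaml')
```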

Sub Modules/Classes

These should start like this:

import logging

class locator(object):
    def __init__(self):
        self.logger = logging.getLogger(__name__)
        self.logger.debug('%s initialized', type(self).__name__)

Hope that helps you...

Lenes answered 23/7, 2016 at 5:2 Comment(6)
Is qsoWidget just the name of your app? – Polarity
Is it not best practice to include all logging configs in __init__.py? This is a question. – Anemophilous
I would say you should try to put your logging setup at the highest possible point in your project, and then get the dependent modules to use it. This way you have consistent logging. Doing it module by module will result in different levels and differently formatted output, which makes downstream log processing (Humio, Splunk, PIG) much more difficult. – Lenes
Great example! Thank you. Setting up the logger config via .yaml sounds like a very efficient solution. – Spiceberry
@AlexSkorokhod I think yaml is readable, and having the logging in an external file allows you to vary the log level without touching the code. – Lenes
@TimSeed: agree. I have just directed the INFO-only messages from all modules into the main app's joint logs, and the DEBUG-level messages into module-specific logs, without changing the code, only the YAML. It looks very transparent from a git-commit point of view as well! – Spiceberry
In my opinion, this is the best approach for the majority of cases.

Configuration via an INI file

Create a file named logging.ini in the project root directory as below:

[loggers]
keys=root

[logger_root]
level=DEBUG
handlers=screen,file

[formatters]
keys=simple,verbose

[formatter_simple]
format=%(asctime)s [%(levelname)s] %(name)s: %(message)s

[formatter_verbose]
format=[%(asctime)s] %(levelname)s [%(filename)s %(name)s %(funcName)s (%(lineno)d)]: %(message)s

[handlers]
keys=file,screen

[handler_file]
class=handlers.TimedRotatingFileHandler
formatter=verbose
level=WARNING
# fileConfig only passes args (and, on Python 3.7+, kwargs) to the handler
# constructor; keys like interval= are ignored. Rotation options therefore
# go in args: (filename, when, interval, backupCount)
args=('debug.log', 'midnight', 1, 5)

[handler_screen]
class=StreamHandler
formatter=simple
level=DEBUG
args=(sys.stdout,)
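For reference, the args tuple in [handler_file] maps positionally onto the TimedRotatingFileHandler constructor. The section amounts to roughly this plain-Python call (delay=True is added here only so the sketch doesn't create the file):

```python
import logging
import logging.handlers

# equivalent of [handler_file]: TimedRotatingFileHandler(filename, when, interval, backupCount)
file_handler = logging.handlers.TimedRotatingFileHandler(
    'debug.log',
    when='midnight',   # rotate at midnight
    interval=1,
    backupCount=5,     # keep five rotated files
    delay=True,        # don't open the file until the first record is emitted
)
file_handler.setLevel(logging.WARNING)
file_handler.setFormatter(logging.Formatter(
    '[%(asctime)s] %(levelname)s [%(filename)s %(name)s %(funcName)s (%(lineno)d)]: %(message)s'))
```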

Then configure it as below:

import logging

from logging.config import fileConfig

fileConfig('logging.ini')
logger = logging.getLogger('dev')


name = "stackoverflow"

logger.info(f"Hello {name}!")
logger.critical('This message should go to the log file.')
logger.error('So should this.')
logger.warning('And this, too.')
logger.debug('Bye!')

If you run the script, stdout will show:

2021-01-31 03:40:10,241 [INFO] dev: Hello stackoverflow!
2021-01-31 03:40:10,242 [CRITICAL] dev: This message should go to the log file.
2021-01-31 03:40:10,243 [ERROR] dev: So should this.
2021-01-31 03:40:10,243 [WARNING] dev: And this, too.
2021-01-31 03:40:10,243 [DEBUG] dev: Bye!

And the debug.log file should contain:

[2021-01-31 03:40:10,242] CRITICAL [my_loger.py dev <module> (12)]: This message should go to the log file.
[2021-01-31 03:40:10,243] ERROR [my_loger.py dev <module> (13)]: So should this.
[2021-01-31 03:40:10,243] WARNING [my_loger.py dev <module> (14)]: And this, too.

All done.
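One related gotcha: by default, fileConfig() disables any named logger that already existed before it ran. Passing disable_existing_loggers=False avoids that. A self-contained sketch (mini.ini is a hypothetical throwaway config written just for the demo):

```python
import logging
from logging.config import fileConfig
from pathlib import Path

# write a minimal throwaway config so the sketch runs on its own
Path('mini.ini').write_text(
    '[loggers]\nkeys=root\n\n'
    '[logger_root]\nlevel=DEBUG\nhandlers=screen\n\n'
    '[formatters]\nkeys=simple\n\n'
    '[formatter_simple]\nformat=%(levelname)s %(message)s\n\n'
    '[handlers]\nkeys=screen\n\n'
    '[handler_screen]\nclass=StreamHandler\nformatter=simple\nlevel=DEBUG\nargs=(sys.stdout,)\n'
)

early = logging.getLogger('created.before.config')
fileConfig('mini.ini', disable_existing_loggers=False)
assert not early.disabled  # with the default (True), this logger would be silenced
early.info('still works')
```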

Loden answered 31/1, 2021 at 2:42 Comment(1)
This is a fantastic setup. Thank you!Intenerate
I wanted to leave the default logger at the WARNING level but have detailed lower-level loggers for my own code, yet they wouldn't show anything. Building on the other answer, it's critical to run logging.basicConfig() beforehand.

import logging
logging.basicConfig()
logging.getLogger('foo').setLevel(logging.INFO)
logging.getLogger('foo').info('info')
logging.getLogger('foo').debug('info')
logging.getLogger('foo').setLevel(logging.DEBUG)
logging.getLogger('foo').info('info')
logging.getLogger('foo').debug('debug')

Expected output:

INFO:foo:info
INFO:foo:info
DEBUG:foo:debug

For a logging solution across modules, I did this:

# cfg.py

import logging
logging.basicConfig()
logger = logging.getLogger('foo')
logger.setLevel(logging.INFO)
logger.info('active')

# main.py

import cfg
cfg.logger.info('main')
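A variant of the same idea that avoids importing the cfg module everywhere is to rely on the dotted-name hierarchy: a child logger with no level of its own inherits its effective level from the nearest configured ancestor ('foo.bar' here is just an illustrative name):

```python
import logging

logging.basicConfig()
logging.getLogger('foo').setLevel(logging.INFO)

# 'foo.bar' has no level set, so it falls back to the level on 'foo'
child = logging.getLogger('foo.bar')
assert child.getEffectiveLevel() == logging.INFO
child.info('propagates up to the root handler')
```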
Footwear answered 12/6, 2021 at 16:50 Comment(0)
This works for me, including in IPython/Jupyter notebooks (tested on Python 3.10):

import logging as log
from datetime import datetime

time_hash = datetime.now().strftime('%Y-%m-%d_%H-%M-%S')  # filesystem-safe timestamp
outfile = './out_' + time_hash + '.log'  # if you need to log to file
log.basicConfig(
    level=log.INFO,
    # format="%(asctime)s [%(levelname)s] %(message)s",  # with timestamps
    format="[%(levelname)s] %(message)s",  # don't need timing
    handlers=[
        # log.FileHandler(outfile),  # don't need file logs
        log.StreamHandler()
    ],
    force=True
)

Note the force=True at the end (from https://mcmap.net/q/65882/-get-output-from-the-logging-module-in-ipython-notebook), which is needed for IPython notebooks.

Cambria answered 15/9, 2023 at 12:21 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.