I am trying to start using logging in Python and have read several blogs. One issue that is causing confusion for me is whether to create the logger per function or per module. In this blog, Good logging practice in Python, it is recommended to get a logger per function. For example:
import logging

def foo():
    logger = logging.getLogger(__name__)
    logger.info('Hi, foo')

class Bar(object):
    def __init__(self, logger=None):
        self.logger = logger or logging.getLogger(__name__)

    def bar(self):
        self.logger.info('Hi, bar')
The reasoning given is that
The logging.config.fileConfig and logging.config.dictConfig disable existing loggers by default. So, those settings in the file will not be applied to your logger. It’s better to get the logger when you need it. It’s cheap to create or get a logger.
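If I understand the pitfall correctly, a logger that already exists when the configuration is applied gets disabled, and getting it again later returns the same disabled object. Here is a minimal sketch of what I think happens (the logger names are just for illustration):

import logging
import logging.config

logger = logging.getLogger('mymodule')  # created before configuration is applied

logging.config.dictConfig({
    'version': 1,
    # 'disable_existing_loggers' defaults to True
    'formatters': {'simple': {'format': '%(name)s: %(message)s'}},
    'handlers': {'console': {'class': 'logging.StreamHandler',
                             'formatter': 'simple'}},
    'root': {'handlers': ['console'], 'level': 'INFO'},
})

logger.info('Hi, foo')                       # silenced: this logger was disabled
logging.getLogger('othermodule').info('Hi')  # printed: created after configuration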
The recommended way I read everywhere else is as shown below. The blog states that this approach "looks harmless, but actually, there is a pitfall".
import logging

logger = logging.getLogger(__name__)

def foo():
    logger.info('Hi, foo')

class Bar(object):
    def bar(self):
        logger.info('Hi, bar')
I find the former approach tedious, as I would have to remember to get the logger in each function. Additionally, getting the logger in each function is surely more expensive than doing it once per module. Is the author of the blog advocating a non-issue? Would following logging best practices avoid this issue?
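For example, I assume that passing disable_existing_loggers=False when applying the configuration would keep module-level loggers working, though I am not sure whether that counts as best practice (the config file name here is hypothetical):

import logging
import logging.config

logger = logging.getLogger(__name__)  # module-level logger, created at import time

logging.config.dictConfig({
    'version': 1,
    'disable_existing_loggers': False,  # the default is True
    'handlers': {'console': {'class': 'logging.StreamHandler'}},
    'root': {'handlers': ['console'], 'level': 'INFO'},
})

logger.info('Hi, foo')  # emitted: the pre-existing logger stays enabled

# The fileConfig equivalent would be:
# logging.config.fileConfig('logging.conf', disable_existing_loggers=False)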
"The logging.config.fileConfig and logging.config.dictConfig disable existing loggers by default"; does this mean that if you use these functions to configure logging, loggers initialized elsewhere that are not configured by the dict/file will not work? – Courtney