Using logging in multiple modules
I have a small Python project with the following structure:

Project 
 -- pkg01
   -- test01.py
 -- pkg02
   -- test02.py
 -- logging.conf

I plan to use the standard logging module to print messages to stdout and a log file. To use the logging module, some initialization is required:

import logging.config

logging.config.fileConfig('logging.conf')
logger = logging.getLogger('pyApp')

logger.info('testing')

At present, I perform this initialization in every module before I start logging messages. Is it possible to perform this initialization only once, in one place, so that the same settings are reused for logging throughout the project?

Morganstein answered 31/3, 2013 at 7:39 Comment(4)
In response to your comment on my answer: you don't have to call fileConfig in every module that does logging, unless you have if __name__ == '__main__' logic in all of them. prost's answer is not good practice if the package is a library, though it might work for you - one should not configure logging in library packages, other than to add a NullHandler.Peachy
prost implied that we need to call the import and logger statements in every module, and only call the fileConfig statement in the main module. Isn't that similar to what you are saying?Morganstein
prost is saying that you should put the logging config code in package/__init__.py. That's not normally the place you put if __name__ == '__main__' code. Also, prost's example looks like it will call the config code unconditionally on import, which doesn't look right to me. Generally, logging config code should be done in one place and should not happen as a side-effect of import except when you're importing __main__.Peachy
How about using builtins, https://mcmap.net/q/82070/-how-to-define-logging-in-one-place-in-python?Himyaritic

Best practice is, in each module, to have a logger defined like this:

import logging
logger = logging.getLogger(__name__)

near the top of the module, and then in other code in the module do e.g.

logger.debug('My message with %s', 'variable data')

If you need to subdivide logging activity inside a module, use e.g.

loggerA = logging.getLogger(__name__ + '.A')
loggerB = logging.getLogger(__name__ + '.B')

and log to loggerA and loggerB as appropriate.

In your main program or programs, do e.g.:

def main():
    "your program code"

if __name__ == '__main__':
    import logging.config
    logging.config.fileConfig('/path/to/logging.conf')
    main()

or

def main():
    import logging.config
    logging.config.fileConfig('/path/to/logging.conf')
    # your program code

if __name__ == '__main__':
    main()

See here for logging from multiple modules, and here for logging configuration for code which will be used as a library module by other code.

Update: When calling fileConfig(), you may want to specify disable_existing_loggers=False if you're using Python 2.6 or later (see the docs for more information). The default value is True for backward compatibility, which causes all existing loggers to be disabled by fileConfig() unless they or their ancestor are explicitly named in the configuration. With the value set to False, existing loggers are left alone. If using Python 2.7/Python 3.2 or later, you may wish to consider the dictConfig() API which is better than fileConfig() as it gives more control over the configuration.
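As an illustration of the update above, here is a minimal dictConfig() sketch (the handler and formatter names are my own choices, not from the question) roughly equivalent to a logging.conf with one console handler on the root logger:

```python
import logging
import logging.config

# A dict-based configuration; the schema keys below are part of the
# dictConfig schema, the names "standard" and "console" are arbitrary.
LOGGING_CONFIG = {
    "version": 1,
    "disable_existing_loggers": False,  # leave module-level loggers alone
    "formatters": {
        "standard": {"format": "%(asctime)s %(name)s %(levelname)s %(message)s"},
    },
    "handlers": {
        "console": {
            "class": "logging.StreamHandler",
            "formatter": "standard",
            "level": "DEBUG",
        },
    },
    "root": {"handlers": ["console"], "level": "INFO"},
}

logging.config.dictConfig(LOGGING_CONFIG)
logger = logging.getLogger(__name__)
logger.info("configured via dictConfig")
```

Unlike fileConfig(), this keeps the configuration in Python, so paths and levels can be computed at runtime.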

Peachy answered 31/3, 2013 at 22:9 Comment(15)
If you look at my example, I am already doing what you suggest above. My question was how to centralize this logging initialization so that I don't have to repeat those 3 statements. Also, in your example you missed the logging.config.fileConfig('logging.conf') statement. This statement is actually the root cause of my concern: if I have to initialize the logger in every module, I would have to type this statement in every module, which would mean tracking the path of the conf file in every module. That does not look like best practice to me (imagine the havoc when changing module/package locations).Morganstein
Calling fileConfig after creating the logger, whether in the same or another module (e.g. when you create the logger at the top of the file), does not work: the logging configuration only applies to loggers created afterwards. So this approach is not viable for multiple modules. @Quest Monger: You can always create another file that holds the location of the config file..;)Hindermost
@Oxidator: Not necessarily - see the disable_existing_loggers flag which is True by default but can be set to False.Peachy
@Vinay Sajip, thank you! Finally I can make it work in my main..:) I'm not entirely sure I understand why this works though, care to enlighten me? And perhaps add it to your post..;)Hindermost
@Vinay Sajip, thank you. Do you have recommendations for loggers that work in modules but outside classes as well? Since imports are done before the main function is called, those logs will have already been logged. I guess setting up your logger before all imports in the main module is the only way? This logger could then be overwritten in main, if you like.Hindermost
@Oxidator - no, you can create loggers anywhere, whether inside or outside classes. Ask a separate question if you need to - comments aren't the appropriate place for asking new questions.Peachy
Note that the official 2.6 docs are misleading and do not mention disable_existing_loggers. You can check the pydoc or the source to reassure yourself that it does in fact exist in that version.Simonize
Hi Vinay, good answer. I just want to know what you mean by With the value set to False, existing loggers are left alone. Does that mean a previously defined logger could also load settings from a dictConfig() invoked later?Cantillate
If I want to have all my module specific loggers have a logging level different from the default WARNING, will I have to make that setting on each module? Say, I want to have all my modules log at INFO.Zeeba
This doesn't seem to work for me (using Flask with Python 3.6). The only way it logs is when I hard-code the string "flask.app" as the logger's name, for example: logger = logging.getLogger('flask.app'). Is there a way to make it work with __name__? My Flask app is also configured with __name__: app = Flask(__name__)Gaselier
@BenYitzhaki what is the module name of the Flask app, i.e. the value of __name__ for the logger? If it is e.g. foo.bar, then you would need to ensure that appropriate handlers are configured for loggers foo.bar, foo or the root logger.Peachy
@VinaySajip yeah, thanks. I actually added my own answer as well that describes how I handled this one.Gaselier
I think it is very important to clarify whether initializing the logger with getLogger, or configuring the logger in the main file, should happen before importing other modules that get their loggers by calling getLogger.Wingspread
It seems like even if I import + getLogger before configuring the logger, the configured changes do get applied on those existing loggers.Nich
Also make sure you are using logging.getLogger() and not logging.Logger()... this confused me for a good few hoursNich

Actually every logger is a child of its parent package's logger (i.e. package.subpackage.module inherits configuration from package.subpackage), so all you need to do is configure the root logger. This can be achieved with logging.config.fileConfig (your own config for loggers) or logging.basicConfig (which sets up the root logger). Set up logging in your entry module (__main__.py, or whatever you want to run, e.g. main_script.py; __init__.py works as well).

using basicConfig:

# package/__main__.py
import logging
import sys

logging.basicConfig(stream=sys.stdout, level=logging.INFO)

using fileConfig:

# package/__main__.py
import logging
import logging.config

logging.config.fileConfig('logging.conf')

and then create every logger using:

# package/submodule.py
# or
# package/subpackage/submodule.py
import logging
log = logging.getLogger(__name__)

log.info("Hello logging!")

For more information see Advanced Logging Tutorial.
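To see the inheritance described above in action, here is a small self-contained sketch (the logger names are illustrative): a module logger with no handlers of its own still reaches the handler attached to the root logger via propagation:

```python
import io
import logging

# Attach a single handler to the root logger, much as basicConfig would.
buf = io.StringIO()
root = logging.getLogger()
root.addHandler(logging.StreamHandler(buf))
root.setLevel(logging.INFO)

# This child logger has no handlers of its own; the record propagates up
# the dotted-name hierarchy (package.subpackage.module -> ... -> root).
log = logging.getLogger("package.subpackage.module")
log.info("Hello logging!")

print(buf.getvalue())  # the message arrived via the root handler
```

This is why configuring only the entry module is enough: records from every module logger bubble up to the root logger's handlers.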

Charlena answered 31/3, 2013 at 13:11 Comment(16)
This is, by far, the simplest solution to the problem, not to mention it exposes & leverages the parent-child relationship between modules, something that I as a noob was unaware of. Danke.Morganstein
You are right, and as Vinay pointed out in his post, your solution is right as long as it's not in the __init__.py module. Your solution did work when I applied it to the main module (point of entry).Morganstein
Actually a much more relevant answer, since the question is concerned with separate modules.Cartload
Dumb question perhaps: if there is no logger in __main__.py (e.g. If I want to use the module in a script that has no logger) will logging.getLogger(__name__) still do some kind of logging in the module or will it raise an exception?Olly
@Olly I'm not sure if I understand your question. Do you mean that you have no logging.basicConfig or logging.config.fileConfig? You can definitely use logging.getLogger and do some logging, it just would not print anything anywhere. Many libraries do logging, but they leave the logging setup (like where the logging messages go) to their users.Charlena
Thanks @StanProkop. Yeah I guess that's my issue with this. I like the idea of the module 'inheriting' the logger and logging config from the main script but if its a general purpose module I might not always have logging in my main script. (In my case I am user and 'developer'). What if I'm importing the module from the interpreter for instance?Olly
Is there a way to do something like if logging.getLogger(__name__): ... else: logging.basicConfig(... )?Olly
@Olly when I play with my own or a third party libraries which use logging I use ipython with autoreload (so any saved changes in my libraries are auto reloaded) and I set logging just once in the beginning using basicConfig. In tests I use pytest, it captures logging and prints it on failures. Sometimes I even fail tests intentionally just to see logging.Charlena
Anyway, to check if logging was set - you can try to see if logging.root.handlers are empty. But I don't recommend it, there might be some caveats.Charlena
Finally. I had a working logger, but it failed in Windows for Parallel runs with joblib. I guess this is a manual tweak to the system -- something else is wrong with Parallel. But, it surely works! ThanksBarton
@StanProkop Here you are proposing using root logger for all submodules. But, what if I want to configure the logger of individual submodules differently?Shrubbery
@GopinathS well, I was trying to solve the problem stated in the question, where actually all submodules were configured individually (although with the very same configuration). You can definitely tweak the configuration of any logger by getting it with logging.getLogger and tuning its configuration using Logger.setLevel and other methods. It's mentioned in the documentation I linked, section: "Configuring Logging for a Library". Hope that helps.Charlena
The newer version of the linked document: docs.python.org/3/howto/logging.htmlVitric
thanks @aderchox, I updated all links to Python 3 documentationCharlena
Can you add an example of overriding the config file? This method does not seem to make that easy.Amylene
hey @Amylene , I'm not sure if I understand your question completely, but this could help: docs.python.org/3/howto/logging.html#configuring-logging Feel free to post a stackoverflow question if that's not what you were asking for.Charlena

For me, a simple way of using one instance of the logging library in multiple modules was the following solution:

base_logger.py

import logging

logger = logging
logger.basicConfig(format='%(asctime)s - %(message)s', level=logging.INFO)

Other files

from base_logger import logger

if __name__ == '__main__':
    logger.info("This is an info message")
Bushman answered 2/1, 2020 at 7:52 Comment(3)
For my small project, this was the solution to go. Note that the root logger is a singleton, conveniently, which adds to the texture of this simple solution.Carolinacaroline
This is an underrated answer. If your project only needs a single logger, there's no need to create multiple loggers with getLogger(__name__). With this answer, you only need a single line to import/config the logger. I also prefer using basicConfig in code instead of fileConfig(logging.conf) because you can do dynamic configurations. Another variation is that you could delete logger = logging alias and just use logging.info() directly. Or you create a shorter alias like log=logging to use log.info().Burdett
Thanks - simple and wonderful. Could you somehow make this code enable naming of the log file, driven by the main function?Surrealism

I always do it as below.

Use a single Python file to configure my log as a singleton pattern, named 'log_conf.py':

#-*-coding:utf-8-*-

import logging.config

def singleton(cls):
    instances = {}
    def get_instance():
        if cls not in instances:
            instances[cls] = cls()
        return instances[cls]
    return get_instance()

@singleton
class Logger():
    def __init__(self):
        logging.config.fileConfig('logging.conf')
        self.logr = logging.getLogger('root')

In another module, just import the config.

from log_conf import Logger

Logger.logr.info("Hello World")

This is a singleton pattern for logging, simple and efficient.
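Worth noting when weighing this pattern: logging.getLogger itself already caches logger instances by name, so loggers behave as singletons out of the box. A quick sketch (the name "app" is arbitrary):

```python
import logging

# getLogger() returns the same cached instance for a given name,
# so loggers are effectively singletons already.
a = logging.getLogger("app")
b = logging.getLogger("app")
assert a is b

# logging.Logger(), by contrast, constructs a detached instance that
# the logging manager does not track - avoid calling it directly.
c = logging.Logger("app")
assert c is not a
```

The wrapper class above mainly adds the fileConfig call at a fixed entry point; the instance caching itself comes for free.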

Ppm answered 31/3, 2013 at 7:54 Comment(3)
Thanks for detailing the singleton pattern. I was planning on implementing this, but then the @prost solution is much simpler and suits my needs perfectly. I do however see your solution being useful in bigger projects that have multiple points of entry (other than main). Danke.Morganstein
This is useless. The root logger is already a singleton. Just use logging.info instead of Logger.logr.info.Intramuscular
Is it useless though? In a complex project, when you have multiple components (sets of modules) and you want each component to have its own logger, with all modules of that component sharing the same logger, this I think would help.Found

I would like to add my solution (based on the logging cookbook and other articles and suggestions from this thread). However, it took me quite a while to figure out why it wasn't immediately working as I expected, so I created a little test project to learn how logging works.

Since I have figured it out, I wanted to share my solution, maybe it can be of help to someone.

I know some of my code might not be best practice, but I am still learning. I left the print() functions in there, as I used them, while logging was not working as expected. Those are removed in my other application. Also I welcome any feedback on any parts of the code or structure.

my_log_test project structure (cloned/simplified from another project I work on)

my_log_test
├── __init__.py
├── __main__.py
├── daemon.py
├── common
│   ├── my_logger.py
├── pkg1
│   ├── __init__.py
│   └── mod1.py
└── pkg2
    ├── __init__.py
    └── mod2.py

Requirements

A few things different or that I have not seen explicitly mentioned in the combination I use:

  • the main module is daemon.py, which is called by __main__.py
  • I want to be able to call the modules mod1.py and mod2.py separately while in development/testing
  • At this point I did not want to use basicConfig() or fileConfig(), but keep it like in the logging cookbook

So basically, that means, I need to initialize the root logger in daemon.py (always) and in the modules mod1.py and mod2.py (only when calling them directly).

To make this init in several modules easier, I created my_logger.py which does, what is described in the cookbook.

My mistakes

Beforehand, my mistake in that module was to init the logger with logger = logging.getLogger(__name__) (module logger) instead of using logger = logging.getLogger() (to get the root logger).

The first problem was, that when called from daemon.py the logger's namespace was set to my_log_test.common.my_logger. The module logger in mod1.py with an "unmatching" namespace my_log_test.pkg1.mod1 could hence not attach to the other logger and I would see no log output from mod1.

The second "problem" was, that my main program is in daemon.py and not in __main__.py. But after all not a real problem for me, but it added to the namespace confusion.

Working solution

This is from the cookbook but in a separate module. I also added a logger_cleanup function that I can call from daemon, to remove logs older than x days.

## my_logger.py
from datetime import datetime
import time
import os

## Init logging start 
import logging
import logging.handlers

def logger_init():
    print("print in my_logger.logger_init()")
    print("print my_logger.py __name__: " +__name__)
    path = "log/"
    filename = "my_log_test.log"

    ## get logger
    #logger = logging.getLogger(__name__) ## this was my mistake, to init a module logger here
    logger = logging.getLogger() ## root logger
    logger.setLevel(logging.INFO)

    # File handler
    logfilename = datetime.now().strftime("%Y%m%d_%H%M%S") + f"_{filename}"
    file = logging.handlers.TimedRotatingFileHandler(f"{path}{logfilename}", when="midnight", interval=1)
    #fileformat = logging.Formatter("%(asctime)s [%(levelname)s] %(message)s")
    fileformat = logging.Formatter("%(asctime)s [%(levelname)s]: %(name)s: %(message)s")
    file.setLevel(logging.INFO)
    file.setFormatter(fileformat)

    # Stream handler
    stream = logging.StreamHandler()
    #streamformat = logging.Formatter("%(asctime)s [%(levelname)s:%(module)s] %(message)s")
    streamformat = logging.Formatter("%(asctime)s [%(levelname)s]: %(name)s: %(message)s")
    stream.setLevel(logging.INFO)
    stream.setFormatter(streamformat)

    # Adding all handlers to the logs
    logger.addHandler(file)
    logger.addHandler(stream)


def logger_cleanup(path, days_to_keep):
    lclogger = logging.getLogger(__name__)
    logpath = f"{path}"
    now = time.time()
    for filename in os.listdir(logpath):
        filestamp = os.stat(os.path.join(logpath, filename)).st_mtime
        filecompare = now - days_to_keep * 86400
        if  filestamp < filecompare:
            lclogger.info("Delete old log " + filename)
            try:
                os.remove(os.path.join(logpath, filename))
            except Exception as e:
                lclogger.exception(e)
                continue

To run daemon.py (through __main__.py), use python3 -m my_log_test

## __main__.py
from  my_log_test import daemon

if __name__ == '__main__':
    print("print in __main__.py")
    daemon.run()

To run daemon.py (directly), use python3 -m my_log_test.daemon

## daemon.py
from datetime import datetime
import time
import logging
import my_log_test.pkg1.mod1 as mod1
import my_log_test.pkg2.mod2 as mod2

## init ROOT logger from my_logger.logger_init()
from my_log_test.common.my_logger import logger_init
logger_init() ## init root logger
logger = logging.getLogger(__name__) ## module logger

def run():
    print("print in daemon.run()")
    print("print daemon.py __name__: " +__name__)
    logger.info("Start daemon")
    loop_count = 1
    while True:
        logger.info(f"loop_count: {loop_count}")
        logger.info("do stuff from pkg1")
        mod1.do1()
        logger.info("finished stuff from pkg1")

        logger.info("do stuff from pkg2")
        mod2.do2()
        logger.info("finished stuff from pkg2")

        logger.info("Waiting a bit...")
        time.sleep(30)


if __name__ == '__main__':
    try:
        print("print in daemon.py if __name__ == '__main__'")
        logger.info("running daemon.py as main")
        run()
    except KeyboardInterrupt as e:
        logger.info("Program aborted by user")
    except Exception as e:
        logger.info(e)

To run mod1.py (directly) use python3 -m my_log_test.pkg1.mod1

## mod1.py
import logging
# mod1_logger = logging.getLogger(__name__)
mod1_logger = logging.getLogger("my_log_test.daemon.pkg1.mod1") ## for testing, namespace set manually

def do1():
    print("print in mod1.do1()")
    print("print mod1.py __name__: " +__name__)
    mod1_logger.info("Doing something in pkg1.do1()")

if __name__ == '__main__':
    ## Also enable this pkg to be run directly while in development with
    ## python3 -m my_log_test.pkg1.mod1

    ## init root logger
    from my_log_test.common.my_logger import logger_init
    logger_init() ## init root logger

    print("print in mod1.py if __name__ == '__main__'")
    mod1_logger.info("Running mod1.py as main")
    do1()

To run mod2.py (directly) use python3 -m my_log_test.pkg2.mod2

## mod2.py
import logging
logger = logging.getLogger(__name__)

def do2():
    print("print in pkg2.do2()")
    print("print mod2.py __name__: " +__name__) # setting namespace through __name__
    logger.info("Doing something in pkg2.do2()")

if __name__ == '__main__':
    ## Also enable this pkg to be run directly while in development with
    ## python3 -m my_log_test.pkg2.mod2

    ## init root logger
    from my_log_test.common.my_logger import logger_init
    logger_init() ## init root logger

    print("print in mod2.py if __name__ == '__main__'")
    logger.info("Running mod2.py as main")
    do2()

Happy if it helps. Happy to receive feedback as well!

Underdeveloped answered 2/11, 2021 at 18:6 Comment(1)
Thank you; as you mentioned, it worked when I used the root logger instead.Monazite

Throwing in another solution.

In my module's __init__.py I have something like:

# mymodule/__init__.py
import logging

def get_module_logger(mod_name):
  logger = logging.getLogger(mod_name)
  handler = logging.StreamHandler()
  formatter = logging.Formatter(
        '%(asctime)s %(name)-12s %(levelname)-8s %(message)s')
  handler.setFormatter(formatter)
  logger.addHandler(handler)
  logger.setLevel(logging.DEBUG)
  return logger

Then in each module I need a logger, I do:

# mymodule/foo.py
from mymodule import get_module_logger
logger = get_module_logger(__name__)

When the logs are mixed, you can differentiate their source by the module they came from.

Corner answered 12/11, 2015 at 1:11 Comment(4)
What does "my module's main init" mean? And "Then in each class I need a logger, I do:"? Can you provide a sample called_module.py, and an example of its usage as an import from module caller_module.py ? See this answer for an inspiration of the format I'm asking about. Not trying to be patronising. I am trying to understand your answer and I know I would if you wrote it that way.Heritable
@Heritable I clarified.Corner
Thanks - this helped me finally get it working. How would you adapt it to what I want: I have one main file (call it main.py) where I am calling different modules, and I would like this main.py to set the name of the log file. With your solution this is not possible.Surrealism
I figured it out. In the main.py I use logger = get_module_logger('filename') and this is done before importing any modulesSurrealism

Several of these answers suggest that at the top of a module you do

import logging
logger = logging.getLogger(__name__)

It is my understanding that this is considered very bad practice. The reason is that fileConfig will disable all existing loggers by default. E.g.

#my_module
import logging

logger = logging.getLogger(__name__)

def foo():
    logger.info('Hi, foo')

class Bar(object):
    def bar(self):
        logger.info('Hi, bar')

And in your main module :

#main
import logging

# load my module - this now configures the logger
import my_module

# This will now disable the logger in my_module by default (see the docs)
logging.config.fileConfig('logging.ini')

my_module.foo()
bar = my_module.Bar()
bar.bar()

Now the log specified in logging.ini will be empty, as the existing logger was disabled by the fileConfig call.

While it is certainly possible to get around this (disable_existing_loggers=False), realistically many clients of your library will not know about this behavior and will not receive your logs. Make it easy for your clients by always calling logging.getLogger locally. Hat tip: I learned about this behavior from Victor Lin's website.

So good practice is instead to always call logging.getLogger locally. E.g.

#my_module
import logging

logger = logging.getLogger(__name__)

def foo():
    logging.getLogger(__name__).info('Hi, foo')

class Bar(object):
    def bar(self):
        logging.getLogger(__name__).info('Hi, bar')    

Also, if you use fileConfig in your main, set disable_existing_loggers=False, just in case your library designers use module-level logger instances.
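The disabling behavior can be observed directly with a small sketch. I use dictConfig here, which accepts the same disable_existing_loggers flag, so it runs without a separate .ini file (the logger name "my_module" is illustrative):

```python
import logging
import logging.config

# A logger created "at import time", before any configuration runs.
early = logging.getLogger("my_module")

CONFIG = {
    "version": 1,
    "disable_existing_loggers": True,  # fileConfig()'s default behavior
    "handlers": {"console": {"class": "logging.StreamHandler"}},
    "root": {"handlers": ["console"], "level": "INFO"},
}

logging.config.dictConfig(CONFIG)
was_disabled = early.disabled
print(was_disabled)  # True: the pre-existing logger was silenced

# Re-run with the flag off: existing loggers are left alone (re-enabled).
CONFIG["disable_existing_loggers"] = False
logging.config.dictConfig(CONFIG)
print(early.disabled)  # False
```

Only loggers explicitly named in the configuration (or their descendants) survive a config call with the flag at its default of True.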

Conjunctivitis answered 18/3, 2016 at 11:34 Comment(14)
Can you not run logging.config.fileConfig('logging.ini') before import my_module? As suggested in this answer.Heritable
Not sure - but it would definitely also be considered bad practice to mix imports and executable code in that way. You also don't want your clients to have to check whether they need to configure logging before they import, especially when there is a trivial alternative! Imagine if a widely used library like requests had done that....!Conjunctivitis
"Not sure - but it would definitely also be considered bad practice to mix imports and executable code in that way." - why?Heritable
I'm not too clear on why that is bad. And I don't fully understand your example. Can you post your config for this example and show some usage?Heritable
PEP 8 says that imports should come before module-level variables: see e.g. #1189140, and this is basically the same idea in principle - setting module-level state before finishing imports.Conjunctivitis
There is a fully worked out example at the link in the original answer.Conjunctivitis
I would definitely call your "good practice" a bad practice. You try taking control from the user, but in Python we expect the user to be a reasonable person. If I now do a configuration and use disable_existing_loggers=True, I will be very surprised to find out that you ignore this and still log.Helmut
You don't really need line 4 (logger = logging.getLogger(__name__)) in the last "good practice" example.Fy
You seem to contradict the official docs: 'A good convention to use when naming loggers is to use a module-level logger, in each module which uses logging, named as follows: logger = logging.getLogger(__name__)'Asclepiadean
Suppose we have a lot of logging statements in my_module; writing logging.getLogger(__name__).info("some message") every time is tedious. I do not think that is good practice.Gymnastic
@Gymnastic obviously the idea here is to do, inside a function, logger = logging.getLogger(__name__), and then use logger.info("message")Dusty
@Asclepiadean - the official docs refer to the name of the logger. The name of the logger is the same either way since both use the module name variable to name them. The explanation in the docs makes it clear that is the reason.Conjunctivitis
@Conjunctivitis The mentioned docs suggest 'to use a module-level logger' which is not what you're suggesting.Asclepiadean
In both cases there is exactly one module level logger. The logging framework stores it after first call, and so getLogger(name) gets the single module level logger every single time you call it.Conjunctivitis

You could also come up with something like this!

import logging

def get_logger(name=None):
    default = "__app__"
    formatter = logging.Formatter('%(levelname)s: %(asctime)s %(funcName)s(%(lineno)d) -- %(message)s',
                              datefmt='%Y-%m-%d %H:%M:%S')
    log_map = {"__app__": "app.log", "__basic_log__": "file1.log", "__advance_log__": "file2.log"}
    name = name or default  # fall back to the default app logger
    logger = logging.getLogger(name)
    fh = logging.FileHandler(log_map[name])
    fh.setFormatter(formatter)
    logger.addHandler(fh)
    logger.setLevel(logging.DEBUG)
    return logger

Now you could use multiple loggers in the same module, and across the whole project, if the above is defined in a separate module and imported into other modules where logging is required.

a = get_logger("__app__")
b = get_logger("__basic_log__")
a.info("Starting logging!")
b.debug("Debug Mode")
Spongin answered 11/11, 2013 at 18:29 Comment(0)

@Yarkee's solution seemed better. I would like to add some more to it:

class Singleton(type):
    _instances = {}

    def __call__(cls, *args, **kwargs):
        if cls not in cls._instances.keys():
            cls._instances[cls] = super(Singleton, cls).__call__(*args, **kwargs)
        return cls._instances[cls]


class LoggerManager(object):
    __metaclass__ = Singleton

    _loggers = {}

    def __init__(self, *args, **kwargs):
        pass

    @staticmethod
    def getLogger(name=None):
        if not name:
            logging.basicConfig()
            return logging.getLogger()
        elif name not in LoggerManager._loggers.keys():
            logging.basicConfig()
            LoggerManager._loggers[name] = logging.getLogger(str(name))
        return LoggerManager._loggers[name]    


log=LoggerManager().getLogger("Hello")
log.setLevel(level=logging.DEBUG)

So LoggerManager can be pluggable to the entire application. Hope it makes sense and adds value.

Spongin answered 6/1, 2014 at 9:28 Comment(1)
The logging module already deals with singletons. logging.getLogger("Hello") will get the same logger across all of your modules.Intramuscular

There are several answers here. I ended up with a similar yet different solution that makes sense to me; maybe it will make sense to you as well. My main objective was to be able to route logs to handlers by their level (debug-level logs to the console, warnings and above to files):

from flask import Flask
import logging
from logging.handlers import RotatingFileHandler

app = Flask(__name__)

# make default logger output everything to the console
logging.basicConfig(level=logging.DEBUG)

rotating_file_handler = RotatingFileHandler(filename="logs.log")
rotating_file_handler.setLevel(logging.INFO)

app.logger.addHandler(rotating_file_handler)

I created a nice util file named logger.py:

import logging

def get_logger(name):
    return logging.getLogger("flask.app." + name)

flask.app is a hardcoded value in Flask: the application logger's name always starts with flask.app, as that is the module's name.

Now, in each module, I'm able to use it in the following way:

from logger import get_logger
logger = get_logger(__name__)

logger.info("new log")

This will create a new logger under "flask.app.MODULE_NAME" with minimum effort.

Gaselier answered 19/12, 2018 at 10:13 Comment(0)

The best practice would be to create a separate module which has only one method, whose task is to give a logger handler to the calling method. Save this file as m_logger.py:

import logging

def getlogger():
    # logger
    logger = logging.getLogger(__name__)
    logger.setLevel(logging.DEBUG)
    # guard so repeated calls don't add duplicate handlers
    if not logger.handlers:
        # create file handler and set level to debug
        #ch = logging.StreamHandler()
        ch = logging.FileHandler(r'log.txt')
        ch.setLevel(logging.DEBUG)
        # create formatter
        formatter = logging.Formatter('%(asctime)s - %(levelname)s - %(message)s')
        # add formatter to ch
        ch.setFormatter(formatter)
        # add ch to logger
        logger.addHandler(ch)
    return logger

Now call the getlogger() method whenever a logger handler is needed.

from m_logger import getlogger
logger = getlogger()
logger.info('My mssg')
Podium answered 6/6, 2019 at 9:39 Comment(3)
This is good if you don't have any additional parameters. But if, let's say, you have a --debug option in the app and want to set the logging level in all loggers in your app based on this parameter...Durman
@TheGodfather Yes, this is hard to achieve with this methodology. What we can do in this situation is create a class which would take the formatter as a parameter at object-creation time and would have a similar function to return the logger handler. What are your views on this?Podium
Yes, I did a similar thing: I made get_logger(level=logging.INFO) return a kind of singleton, so when it is called first from the main app, it initializes the logger and handlers with the proper level, and then returns the same logger object to all other methods.Durman

New to Python, so I don't know if this is advisable, but it works great for not re-writing boilerplate.

Your project must have an __init__.py so it can be loaded as a module:

# Put this in your module's __init__.py
import logging.config
import sys

# I used this dictionary test, you would put:
# logging.config.fileConfig('logging.conf')
# The "" entry in loggers is the root logger, tutorials always 
# use "root" but I can't get that to work
logging.config.dictConfig({
    "version": 1,
    "formatters": {
        "default": {
            "format": "%(asctime)s %(levelname)s %(name)s %(message)s"
        },
    },
    "handlers": {
        "console": {
            "level": 'DEBUG',
            "class": "logging.StreamHandler",
            "stream": "ext://sys.stdout"
        }
    },
    "loggers": {
        "": {
            "level": "DEBUG",
            "handlers": ["console"]
        }
    }
})

def logger():
    # Get the name from the caller of this function
    return logging.getLogger(sys._getframe(1).f_globals['__name__'])

sys._getframe(1) suggestion comes from here

Then to use your logger in any other file:

from [your module name here] import logger

logger().debug("FOOOOOOOOO!!!")

Caveats:

  1. You must run your files as modules, otherwise import [your module] won't work:
    • python -m [your module name].[your filename without .py]
  2. The name of the logger for the entry point of your program will be __main__, but any solution using __name__ will have that issue.
Livvie answered 14/10, 2019 at 10:33 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.