I'm trying to get Celery logging working with Django. I have logging set up in settings.py to go to the console (that works fine, as I'm hosting on Heroku). At the top of each module, I have:
import logging
logger = logging.getLogger(__name__)
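For reference, the console-only logging configuration in settings.py is along these lines (a rough sketch, not my exact settings; the handler and formatter names are only illustrative):

# settings.py -- sketch of a console-only Django LOGGING config
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'formatters': {
        'simple': {
            'format': '[%(levelname)s] %(name)s %(funcName)s(): %(message)s',
        },
    },
    'handlers': {
        'console': {
            'class': 'logging.StreamHandler',
            'formatter': 'simple',
        },
    },
    'root': {
        'handlers': ['console'],
        'level': 'INFO',
    },
}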
And in my tasks.py, I have:
from celery.utils.log import get_task_logger
logger = get_task_logger(__name__)
That works fine for logging calls from a task and I get output like this:
2012-11-13T18:05:38+00:00 app[worker.1]: [2012-11-13 18:05:38,527: INFO/PoolWorker-2] Syc feed is starting
But if that task then calls a method in another module, e.g. a queryset method, I get duplicate log entries, e.g.
2012-11-13T18:00:51+00:00 app[worker.1]: [INFO] utils.generic_importers.ftp_processor process(): File xxx.csv already imported. Not downloaded
2012-11-13T18:00:51+00:00 app[worker.1]: [2012-11-13 18:00:51,736: INFO/PoolWorker-6] File xxx.csv already imported. Not downloaded
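If I understand Python logging propagation right, the duplication itself is just the record travelling up from the module logger, so a handler from my Django config and the handler Celery installs when it hijacks the root logger each emit it once. A self-contained sketch of that effect (nothing here is my actual code):

import logging
import sys

# Handler like the one Celery installs on the root logger (illustrative).
root = logging.getLogger()
root.addHandler(logging.StreamHandler(sys.stdout))
root.setLevel(logging.INFO)

# Handler from a Django-style config on a package logger (illustrative).
pkg = logging.getLogger('utils')
pkg.addHandler(logging.StreamHandler(sys.stdout))

# A module-level logger, as created by logging.getLogger(__name__).
logger = logging.getLogger('utils.generic_importers.ftp_processor')
logger.info('File xxx.csv already imported. Not downloaded')
# The record propagates through 'utils' and the root logger,
# so both handlers print it -> two lines for one logging call.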
I think I could use CELERY_HIJACK_ROOT_LOGGER = False to just use the Django logging, but this didn't work when I tried it, and even if I did get it to work I would lose the "PoolWorker-6" bit, which I do want. (Incidentally, I can't figure out how to get the task name to display in the log entry from Celery, as the docs seem to indicate that it should.)
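From what I can tell, the "PoolWorker-6" part and the task name both come from Celery's own log-format settings rather than from Django's LOGGING. The setting names and format strings below are my reading of the Celery 3.x defaults, so treat them as assumptions to check against your version:

# settings.py -- sketch of the Celery 3.x-era log format settings (assumed names)

# Format for general worker messages (the 'celery' logger);
# %(processName)s is what produces the "PoolWorker-6" part.
CELERYD_LOG_FORMAT = '[%(asctime)s: %(levelname)s/%(processName)s] %(message)s'

# Format for messages logged from inside tasks (the 'celery.task' logger);
# %(task_name)s and %(task_id)s are filled in by Celery's task formatter.
CELERYD_TASK_LOG_FORMAT = (
    '[%(asctime)s: %(levelname)s/%(processName)s] '
    '%(task_name)s[%(task_id)s]: %(message)s'
)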
I suspect I'm missing something simple here.
There's the celery logger, which all other loggers inherit from (you can create a new one with celery.utils.get_logger), and there's the celery.task logger, which also inherits from the celery logger but does not propagate to its handlers; this is because it has a custom logging format (it includes the task id and so on). If you set up logging manually you should configure them both, with a custom logger format for celery.task. – Wassyngton
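Roughly, "configure them both" could look something like the sketch below. It is only an illustration: it assumes celery.app.log.TaskFormatter and the celery.signals.setup_logging signal are available in your Celery version, and the handler and level choices are arbitrary.

# Sketch only: manual configuration of the 'celery' and 'celery.task' loggers,
# run from somewhere Django imports early (e.g. the settings module).
import logging
import sys

from celery.app.log import TaskFormatter   # assumed import path
from celery.signals import setup_logging   # connecting this stops Celery's own setup


@setup_logging.connect
def configure_celery_logging(**kwargs):
    # Plain format for the 'celery' logger (worker-level messages).
    celery_handler = logging.StreamHandler(sys.stdout)
    celery_handler.setFormatter(logging.Formatter(
        '[%(asctime)s: %(levelname)s/%(processName)s] %(message)s'))
    celery_logger = logging.getLogger('celery')
    celery_logger.handlers = [celery_handler]
    celery_logger.setLevel(logging.INFO)

    # Task-aware format for the 'celery.task' logger; TaskFormatter fills in
    # %(task_name)s and %(task_id)s for records emitted inside a task.
    task_handler = logging.StreamHandler(sys.stdout)
    task_handler.setFormatter(TaskFormatter(
        '[%(asctime)s: %(levelname)s/%(processName)s] '
        '%(task_name)s[%(task_id)s]: %(message)s'))
    task_logger = logging.getLogger('celery.task')
    task_logger.handlers = [task_handler]
    task_logger.propagate = False   # keep task records off the 'celery' handlers
    task_logger.setLevel(logging.INFO)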