Celery beat not picking up periodic tasks

I am trying to get started with Celery, but I can't get my task up and running. I have installed django-celery-beat and Celery 4.

My settings file:

Installed apps (with the Celery packages):

...
'django_celery_beat',
'django_celery_results' 

The Celery configuration:

CELERY_BROKER_URL = 'redis://localhost:6379/0'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'

celery.py

from __future__ import absolute_import, unicode_literals
import os
from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'sandbox.settings')

app = Celery('sandbox')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()


@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))

And my simple task, which I configured to run through the django-celery-beat admin panel:

from __future__ import absolute_import, unicode_literals
from sandbox.celery import app


@app.task()
def try_celery():
    print "Trying out Celery"

I am trying to run this task as a periodic task (beat) with the crontab */2 * * * * (every two minutes).

The log I am getting is:

$ celery -A sandbox worker --loglevel=debug

[2017-10-24 14:28:02,999: DEBUG/MainProcess] | Worker: Preparing bootsteps.
[2017-10-24 14:28:03,001: DEBUG/MainProcess] | Worker: Building graph...
[2017-10-24 14:28:03,002: DEBUG/MainProcess] | Worker: New boot order: {Beat, Timer, Hub, Pool, Autoscaler, StateDB, Consumer}
[2017-10-24 14:28:03,017: DEBUG/MainProcess] | Consumer: Preparing bootsteps.
[2017-10-24 14:28:03,017: DEBUG/MainProcess] | Consumer: Building graph...
[2017-10-24 14:28:03,038: DEBUG/MainProcess] | Consumer: New boot order: {Connection, Events, Mingle, Tasks, Control, Agent, Gossip, Heart, event loop}

 -------------- celery@mypc v4.1.0 (latentcall)
---- **** ----- 
--- * ***  * -- Linux-4.9.0-kali3-amd64-x86_64-with-Kali-kali-rolling-kali-rolling 2017-10-24 14:28:03
-- * - **** --- 
- ** ---------- [config]
- ** ---------- .> app:         sandbox:0x7fe519d38610
- ** ---------- .> transport:   redis://localhost:6379/0
- ** ---------- .> results:     redis://localhost:6379/
- *** --- * --- .> concurrency: 2 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** ----- 
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery


[tasks]
  . celery.accumulate
  . celery.backend_cleanup
  . celery.chain
  . celery.chord
  . celery.chord_unlock
  . celery.chunks
  . celery.group
  . celery.map
  . celery.starmap
  . sandbox.applications.cats.try_celery
  . sandbox.celery.debug_task

[2017-10-24 14:28:03,053: DEBUG/MainProcess] | Worker: Starting Hub
[2017-10-24 14:28:03,053: DEBUG/MainProcess] ^-- substep ok
[2017-10-24 14:28:03,053: DEBUG/MainProcess] | Worker: Starting Pool
[2017-10-24 14:28:03,197: DEBUG/MainProcess] ^-- substep ok
[2017-10-24 14:28:03,198: DEBUG/MainProcess] | Worker: Starting Consumer
[2017-10-24 14:28:03,199: DEBUG/MainProcess] | Consumer: Starting Connection
[2017-10-24 14:28:03,216: INFO/MainProcess] Connected to redis://localhost:6379/0
[2017-10-24 14:28:03,217: DEBUG/MainProcess] ^-- substep ok
[2017-10-24 14:28:03,217: DEBUG/MainProcess] | Consumer: Starting Events
[2017-10-24 14:28:03,228: DEBUG/MainProcess] ^-- substep ok
[2017-10-24 14:28:03,229: DEBUG/MainProcess] | Consumer: Starting Mingle
[2017-10-24 14:28:03,229: INFO/MainProcess] mingle: searching for neighbors
[2017-10-24 14:28:04,255: INFO/MainProcess] mingle: all alone
[2017-10-24 14:28:04,256: DEBUG/MainProcess] ^-- substep ok
[2017-10-24 14:28:04,256: DEBUG/MainProcess] | Consumer: Starting Tasks
[2017-10-24 14:28:04,273: DEBUG/MainProcess] ^-- substep ok
[2017-10-24 14:28:04,274: DEBUG/MainProcess] | Consumer: Starting Control
[2017-10-24 14:28:04,277: DEBUG/MainProcess] ^-- substep ok
[2017-10-24 14:28:04,277: DEBUG/MainProcess] | Consumer: Starting Gossip
[2017-10-24 14:28:04,281: DEBUG/MainProcess] ^-- substep ok
[2017-10-24 14:28:04,282: DEBUG/MainProcess] | Consumer: Starting Heart
[2017-10-24 14:28:04,284: DEBUG/MainProcess] ^-- substep ok
[2017-10-24 14:28:04,284: DEBUG/MainProcess] | Consumer: Starting event loop
[2017-10-24 14:28:04,285: DEBUG/MainProcess] | Worker: Hub.register Pool...
[2017-10-24 14:28:04,294: WARNING/MainProcess] /home/alexd/.virtualenvs/skate/local/lib/python2.7/site-packages/celery/fixups/django.py:202: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments!
  warnings.warn('Using settings.DEBUG leads to a memory leak, never '
[2017-10-24 14:28:04,294: INFO/MainProcess] celery@mypc ready.
[2017-10-24 14:28:04,295: DEBUG/MainProcess] basic.qos: prefetch_count->8

Why are my tasks not working?

Update

Log of running Celery as beat:

$ celery -A sandbox beat --loglevel=debug
celery beat v4.1.0 (latentcall) is starting.
Stale pidfile exists - Removing it.
__    -    ... __   -        _
LocalTime -> 2017-10-24 15:07:20
Configuration ->
    . broker -> redis://localhost:6379/0
    . loader -> celery.loaders.app.AppLoader
    . scheduler -> celery.beat.PersistentScheduler
    . db -> celerybeat-schedule
    . logfile -> [stderr]@%DEBUG
    . maxinterval -> 5.00 minutes (300s)
[2017-10-24 15:07:20,216: DEBUG/MainProcess] Setting default socket timeout to 30
[2017-10-24 15:07:20,217: INFO/MainProcess] beat: Starting...
[2017-10-24 15:07:20,372: DEBUG/MainProcess] Current schedule:

[2017-10-24 15:07:20,373: DEBUG/MainProcess] beat: Ticking with max interval->5.00 minutes
[2017-10-24 15:07:20,373: DEBUG/MainProcess] beat: Waking up in 5.00 minutes.
[2017-10-24 15:15:43,232: DEBUG/MainProcess] beat: Synchronizing schedule...
[2017-10-24 15:15:43,245: DEBUG/MainProcess] beat: Waking up in 5.00 minutes.

Creation of the task (screenshot of the django-celery-beat admin form omitted): this is how the task is created as a periodic task in the admin.

P.S. I switched to the interval option in case there was a problem with my crontab, but it is still not working.

Steinberg answered 24/10, 2017 at 14:37 Comment(7)
According to the documentation, to start a celery beat schedule you run $ celery -A proj beat. It looks like the problem could be that you specified worker, not beat. – Conferee
I see you updated your question with the output from the beat scheduler. It looks like you're using the default celery.beat.PersistentScheduler. According to the docs, for this extension you want to use the database scheduler. So try adding the option the documentation suggests, --scheduler django_celery_beat.schedulers:DatabaseScheduler, and see if that makes a difference. Also, don't forget to perform the database migrations (a sketch of that command follows these comments). – Conferee
Well, thank you @sytech, that solved my problem. Would you write an answer so that I can accept it and close this question? – Steinberg
Is there any way to add this to the settings file? – Steinberg
Yes, towards the very bottom of the page the documentation says you may also add this as a settings option directly. From a quick look at the docs, it seems like specifying beat_scheduler="django_celery_beat.schedulers:DatabaseScheduler" should do the trick. – Conferee
You need to add the -B argument to the celery start command. – Tam
In a Django configuration you can use: CELERY_BEAT_SCHEDULER = "django_celery_beat.schedulers:DatabaseScheduler" – Spitball
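
As mentioned in the comments above, django_celery_beat stores periodic tasks in its own database tables, so its migrations need to be applied before the database scheduler can pick anything up. A minimal sketch, assuming the project name from the question:

$ python manage.py migrate django_celery_beat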

The root cause, in this case, is that the beat scheduler needs to be started with the appropriate arguments. You supplied the following command:

$ celery -A sandbox worker --loglevel=debug

However, to start Celery with a beat schedule (as opposed to a regular Celery worker) you must specify beat rather than worker. Moreover, when using the django_celery_beat extension, it is necessary to use the database scheduler django_celery_beat.schedulers:DatabaseScheduler rather than the default scheduler celery.beat.PersistentScheduler.

So the corrected command would be:

$ celery -A sandbox beat --loglevel=debug --scheduler django_celery_beat.schedulers:DatabaseScheduler

Supporting documentation
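
As noted in the comments, the scheduler can also be configured in your Django settings instead of on the command line. A minimal sketch, using the same CELERY_ namespace convention as the settings shown in the question:

CELERY_BEAT_SCHEDULER = 'django_celery_beat.schedulers:DatabaseScheduler'

With that in place, celery -A sandbox beat --loglevel=debug should use the database scheduler without the --scheduler flag.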

Conferee answered 24/10, 2017 at 15:39 Comment(2)
From the documentation, there should be no = sign after --scheduler: docs.celeryproject.org/en/latest/userguide/…. Not quite sure if that matters, though. – Prehensile
As far as I know, the parser is not picky about this, but since that's what's reflected in the docs, I think it merits a change. If you'd like to edit my answer, I'd happily accept the change @ailionx, or I'll get around to it eventually. – Conferee

Add the code below to the __init__.py of the "sandbox" project so that the app (and its tasks) is picked up and the tasks appear in the Django admin.

This makes sure the app is always imported when Django starts, so that tasks will use this app.

from __future__ import absolute_import, unicode_literals
from .celery import app as celery_app
__all__ = ('celery_app',)
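
With the app importable at the package level, a task can also be declared with Celery's generic decorator instead of importing the app object directly. A minimal sketch of the same task from the question (shared_task is the standard Celery helper; the task body is unchanged):

from celery import shared_task

@shared_task
def try_celery():
    # Same behaviour as the @app.task version in the question
    print("Trying out Celery")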
Riojas answered 3/2, 2020 at 12:21 Comment(0)

If you want to run Beat & Worker together, use the --beat flag.

celery -A sandbox worker --beat --loglevel=debug
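
If the periodic tasks are stored via django-celery-beat, the embedded beat presumably needs the database scheduler as well. A sketch, assuming the worker command accepts the same --scheduler option as beat:

celery -A sandbox worker --beat --scheduler django_celery_beat.schedulers:DatabaseScheduler --loglevel=debug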
Astrogation answered 23/2, 2022 at 23:4 Comment(0)

I think you did not define the cron schedule. Where is it stored? Usually it is on disk or in a database (django_celery_beat). See http://docs.celeryproject.org/en/latest/userguide/periodic-tasks.html

Also, you have to run your worker with the beat option.
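
For reference, a schedule can also be defined directly in code rather than in the admin. A minimal sketch based on the linked periodic-tasks docs; the entry name is arbitrary and the task path is taken from the worker log in the question:

from celery.schedules import crontab

app.conf.beat_schedule = {
    'try-celery-every-two-minutes': {
        'task': 'sandbox.applications.cats.try_celery',
        'schedule': crontab(minute='*/2'),  # same */2 * * * * cron expression
    },
}

This would go in sandbox/celery.py after the app is created.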

Stuck answered 24/10, 2017 at 14:50 Comment(1)
This should be a comment; I have updated the question. – Steinberg

Add this to your Django settings:

CELERY_BEAT_SCHEDULER="django_celery_beat.schedulers:DatabaseScheduler"
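This is the settings-file equivalent of passing --scheduler django_celery_beat.schedulers:DatabaseScheduler on the command line, as described in the accepted answer.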
Paynter answered 16/7, 2024 at 17:31 Comment(0)
