Celery task always PENDING

I'm trying to run a Celery example on Windows with a Redis backend. The code looks like this:

from celery import Celery

app = Celery('risktools.distributed.celery_tasks',
             backend='redis://localhost',
             broker='redis://localhost')

@app.task(ignore_result=False)
def add(x, y):
    return x + y

@app.task(ignore_result=False)
def add_2(x, y):
    return x + y

I start the tasks from an IPython console:

>>> result_1 = add.delay(1, 2)    
>>> result_1.state
'PENDING'
>>> result_2 = add_2.delay(2, 3)    
>>> result_2.state
'PENDING'

It seems that neither task was executed, but the Celery worker output shows that both succeeded:

[2014-12-08 15:00:09,262: INFO/MainProcess] Received task: risktools.distributed.celery_tasks.add[01dedca1-2db2-48df-a4d6-2f06fe285e45]
[2014-12-08 15:00:09,267: INFO/MainProcess] Task celery_tasks.add[01dedca1-2db2-48df-a4d6-2f06fe285e45] succeeded in 0.0019998550415s: 3
[2014-12-08 15:00:24,219: INFO/MainProcess] Received task: risktools.distributed.celery_tasks.add[cb5505ce-cf93-4f5e-aebb-9b2d98a11320]
[2014-12-08 15:00:24,230: INFO/MainProcess] Task celery_tasks.add[cb5505ce-cf93-4f5e-aebb-9b2d98a11320] succeeded in 0.010999917984s: 5

I've tried to troubleshoot this issue using the Celery documentation, but none of the advice helped. What am I doing wrong, and how can I receive results from a Celery task?

UPD: I've added a task without the ignore_result parameter, but nothing changed:

@app.task
def add_3(x, y):
    return x + y

>>> r = add_3.delay(2, 2)
>>> r.state
'PENDING'
Rubious answered 8/12, 2014 at 12:11 Comment(4)
.get() will return the result. Not sure why you are always getting PENDING, though.Patentee
@Patentee .get() fails with TimeoutError: The operation timed out.Rubious
Did you set the BROKER_URL anywhere?Patentee
Possible duplicate of Celery 'Getting Started' not able to retrieve results; always pendingSubsist

According to Celery 'Getting Started' not able to retrieve results; always pending and https://github.com/celery/celery/issues/2146, this is a Windows issue.

Starting the Celery worker with the -P threads or --pool=solo option solves the issue.
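
For example, the worker from the question could be started like this (a sketch assuming the tasks module is importable as risktools.distributed.celery_tasks, as in the question):

celery -A risktools.distributed.celery_tasks worker --loglevel=info --pool=solo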

Rubious answered 8/12, 2014 at 13:21 Comment(2)
Mac encounters this issue as well, thanks for the tipsDeification
Lifesaver on Windows, thanks manCrossstaff

Instead of the Celery --pool=solo option, try -P threads on Windows.
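
For instance (a sketch reusing the module path from the question; the thread pool is available in Celery 4.4 and later):

celery -A risktools.distributed.celery_tasks worker --loglevel=info -P threads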

Mccune answered 10/4, 2020 at 8:38 Comment(1)
This sounds like a comment, not an answer.Robomb

Setting CELERY_TASK_TRACK_STARTED = True (or track_started=True on individual tasks) can also help; this enables the STARTED state.
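
A minimal sketch of both forms, assuming the app object from the question (the task name add_tracked is purely illustrative):

# Globally, via the lowercase new-style setting:
app.conf.task_track_started = True

# Or per task:
@app.task(track_started=True)
def add_tracked(x, y):
    return x + y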

Rihana answered 29/5, 2020 at 6:20 Comment(0)

Remove the ignore_result=False. From the Celery docs:

Task.ignore_result

Don’t store task state. Note that this means you can’t use AsyncResult to check if the task is ready, or get its return value.
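
With results stored, the return value can then be fetched by blocking on the AsyncResult (using the add task from the question; the 10-second timeout is an arbitrary choice):

result = add.delay(1, 2)
print(result.get(timeout=10))  # blocks until the worker stores the result, or raises TimeoutError
print(result.state)            # 'SUCCESS' once the result has been stored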
Patentee answered 8/12, 2014 at 12:35 Comment(2)
I still get the PENDING state (see the add_3 method in the UPD)Rubious
this worked for me nicely, thanks! (note that CELERY_TASK_TRACK_STARTED was already set to True)Translunar

Thanks, everyone.

My Celery worker startup banner:

-------------- celery@DESKTOP-FD38GOO v4.4.2 (cliffs)
--- ***** -----
-- ******* ---- Windows-10-10.0.18362-SP0 2020-04-17 06:58:18
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app:         mysite:0x25cfd40d208
- ** ---------- .> transport:   redis://localhost:6379//
- ** ---------- .> results:     redis://localhost:6379/1
- *** --- * --- .> concurrency: 8 (thread)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery


[tasks]
  . mysite.celery.debug_task
  . supplier.tasks.add
  . supplier.tasks.count_widgets
  . supplier.tasks.count_widgets2
  . supplier.tasks.mul
  . supplier.tasks.xsum

I have fixed this issue.

I was stuck on it for about a day, and tried uninstalling and reinstalling Redis on Windows 10 several times.

At last I found that no concurrency pool was configured.

First solution:

celery -A mysite worker -l info -P threads

Second solution:

celery -A mysite worker -l info --pool=solo

My Celery config:

CELERY_BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'redis://localhost:6379/1'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TASK_SERIALIZER = 'json'
CELERY_IGNORE_RESULT = False
CELERY_TIMEZONE = TIME_ZONE
CELERY_TRACK_STARTED = True
CELERYD_LOG_FILE = os.path.join(
    BASE_DIR, 'celery', 'logs')   
CELERYD_LOG_LEVEL = "INFO"
Limonite answered 16/4, 2020 at 23:5 Comment(0)

Another way to do it, if you are setting the configuration on the Celery object:

app = Celery("MyTasks")
app.conf.task_track_started = True
app.conf.task_ignore_result = False
..
..
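
The same pair of settings can also be applied in a single call via conf.update, e.g.:

app.conf.update(
    task_track_started=True,
    task_ignore_result=False,
)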
Washboard answered 17/3, 2023 at 16:3 Comment(0)
