Celery: Redis as broker leaving task meta keys

I have a Celery app with Redis as the broker.

The code consists of the following, with the first part run in a loop:

running = []

# `add` is a Celery task defined elsewhere; tasks are submitted from the loop
res = add.apply_async([1, 2], queue='add')
running.append(res)

while running:
    r = running.pop()
    if r.ready():
        print(r.get())
    else:
        running.insert(0, r)

Everything works fine, but when I open redis-cli and run keys * I see a bunch of celery-task-meta keys.

Why aren't they cleaned up?
What are they for?

--

[EDIT]

I've read about the CELERY_TASK_RESULT_EXPIRES setting.
Is it possible for the task keys in Redis to be cleaned up right after the result is read, rather than waiting until the expiration time?

Coburn answered 7/12, 2015 at 6:44 Comment(3)
Do you know the task_id, or do you have the AsyncResult of your job? If so, you can call AsyncResult.forget(). From the docs: Forget about (and possibly remove the result of) this task.Dextrorse
Yes, I do have the AsyncResult (it's the variable 'r'). So would I have to call r.get() and then r.forget() right afterwards?Coburn
Please give it a shot, because I have not tried it myself. I was looking into it because I have a similar issue.Dextrorse

From the Celery docs:

AsyncResult.forget()
   Forget about (and possibly remove the result of) this task.

You have to call r.get() first and then r.forget().
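
For example, the polling loop from the question could be adjusted along these lines (a minimal sketch, assuming r is the AsyncResult from the question):

while running:
    r = running.pop()
    if r.ready():
        value = r.get()
        r.forget()  # deletes this task's celery-task-meta key from the result backend
        print(value)
    else:
        running.insert(0, r)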

But you needn't clean up the keys yourself, because the docs say:

CELERY_TASK_RESULT_EXPIRES

Default is to expire after 1 day.
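
If you want a shorter lifetime, the setting can be lowered on the app configuration (a sketch using the old uppercase setting name from the question; newer Celery versions call it result_expires, and one hour is an arbitrary value):

app.conf.CELERY_TASK_RESULT_EXPIRES = 3600  # seconds; the Redis keys then expire after one hour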

Halfcocked answered 21/9, 2016 at 9:19 Comment(0)

I was having the same issue. What fixed it for me was adding app.autodiscover_tasks() to my celery.py file.
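
For reference, a celery.py along those lines might look like this (a minimal sketch assuming a Django project; the project name proj and the Redis URL are placeholders):

import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('proj',
             broker='redis://localhost:6379/0',
             backend='redis://localhost:6379/0')

# with no arguments this looks for a tasks.py module in every installed Django app
app.autodiscover_tasks()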

Droshky answered 26/8, 2019 at 5:16 Comment(0)

I think what you are looking for is to ignore the result completely, which can be done by setting the 'task_ignore_result' flag to True. With that setting, the result is not stored at all.

https://docs.celeryq.dev/en/stable/userguide/configuration.html#task-ignore-result
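
A minimal sketch of that setting (the app name and the broker URL are placeholders):

from celery import Celery

app = Celery('proj', broker='redis://localhost:6379/0')
app.conf.task_ignore_result = True  # no celery-task-meta keys are written at all

# it can also be disabled per task instead of globally:
@app.task(ignore_result=True)
def add(x, y):
    return x + y

Note that with results ignored there is nothing to fetch, so calls like r.get() in the question's loop would no longer return task results.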

Halibut answered 12/8, 2022 at 16:2 Comment(0)
