How to disable retry in celery?

I am running a celerybeat schedule every 15 minutes that fetches data from an API (rate limit = 300 requests/min max) and stores the results in the database. I would like to fetch the URLs in parallel while respecting the rate limit. If any of the tasks fails, I don't want it to retry, since I will poll the API again in 15 minutes anyway. Any suggestions on how this can be accomplished in Celery?

from time import time
from celery import chain, group

@celery.task(bind=True)
def fetch_store(self):
    start = time()
    # fan out the API calls as a group of fetch() tasks, then reduce with store()
    return chain(group(fetch.s() for _ in range(2000)), store.s(start)).apply_async()

@celery.task(rate_limit='300/m')
def fetch():
    #... requests data from external API
    return data

@celery.task
def store(numbers, start):
    end = time()
    logger.info("Received %s results in %.2f seconds", numbers, end - start)
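
For context, the beat entry that triggers fetch_store every 15 minutes looks roughly like the sketch below; the schedule key and the 'tasks.fetch_store' dotted path are placeholders rather than my exact config.

from celery.schedules import crontab

# sketch of a beat schedule entry; `celery` is assumed to be the app instance
celery.conf.beat_schedule = {
    'fetch-store-every-15-min': {
        'task': 'tasks.fetch_store',         # assumed dotted path to fetch_store
        'schedule': crontab(minute='*/15'),  # runs at :00, :15, :30, :45
    },
}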
Gabriellagabrielle answered 3/4, 2018 at 12:0
Did you get any solution for this? I am looking for the same. – Duenas

I typically define a custom Task subclass and set max_retries to 0 (not None, which would make it retry forever):

from celery import Task

class NoRetryTask(Task):
    max_retries = 0
    ...
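
Tasks then use it through the base argument of the task decorator (the task name below is just a placeholder for illustration):

@app.task(base=NoRetryTask)
def fetch_data():
    ...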

You could also set it in a single line via the task decorator:

@app.task(max_retries=0)
def my_func():
    ...

See the docs for more information.
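
Applied to the fetch task from the question, that could look like the sketch below. Keep in mind that Celery only retries a task when self.retry() is called or autoretry_for is configured, so max_retries=0 mainly acts as a safeguard if retry logic is added later:

@celery.task(rate_limit='300/m', max_retries=0)
def fetch():
    data = ...  # request data from the external API, as in the question
    # if the request raises, the task fails once and is not retried
    return data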

Graduate answered 2/4, 2023 at 21:6
