asyncio and coroutines vs task queues

I've been reading about the asyncio module in Python 3, and more broadly about coroutines in Python, and I can't see what makes asyncio such a great tool.

I have the feeling that anything you can do with coroutines, you can do better with task queues based on the multiprocessing module (Celery, for example).

Are there use cases where coroutines are better than task queues?

Hippodrome answered 23/12, 2015 at 14:31 Comment(0)

Not a proper answer, but a list of hints that could not fit into a comment:

  • You mention the multiprocessing module (and let's consider threading too). Suppose you have to handle hundreds of sockets: can you spawn hundreds of processes or threads?

  • Again, with threads and processes: how do you handle concurrent access to shared resources? What is the overhead of mechanisms like locking?

  • Frameworks like Celery also add significant overhead. Can you use it, e.g., for handling every single request on a high-traffic web server? By the way, in that scenario, who is responsible for handling sockets and connections (Celery by its nature can't do that for you)?

  • Be sure to read the rationale behind asyncio. That rationale (among other things) mentions a system call: writev() -- isn't that much more efficient than multiple write()s?
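To make the first points concrete, here is a minimal sketch of one thread serving hundreds of concurrent clients. The handle coroutine is hypothetical, and asyncio.sleep stands in for real socket I/O:

```python
import asyncio

async def handle(client_id: int) -> str:
    # Simulate one socket conversation; the await hands control back to
    # the event loop so other clients can make progress meanwhile.
    await asyncio.sleep(0.01)
    return f"client-{client_id} done"

async def main() -> list:
    # 500 concurrent "connections" in a single thread: no per-client
    # process or thread to spawn, and no locks to manage.
    return await asyncio.gather(*(handle(i) for i in range(500)))

results = asyncio.run(main())
print(len(results))
```

All 500 coroutines finish in roughly the time of one, because they overlap while waiting.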
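As for the last point, os.writev() exposes that system call from Python on POSIX systems. A pipe-based illustration (the HTTP-ish buffers are just example data): one call flushes several buffers that would otherwise take several write() calls:

```python
import os

r, w = os.pipe()
buffers = [b"GET / HTTP/1.1\r\n", b"Host: example.com\r\n", b"\r\n"]
# One system call writes all three buffers (scatter/gather I/O),
# instead of three separate write() calls.
written = os.writev(w, buffers)
os.close(w)
payload = os.read(r, 1024)
os.close(r)
print(written)
```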

Loquacious answered 23/12, 2015 at 15:13 Comment(0)

Adding to the above answer:

If the task at hand is I/O-bound and operates on shared data, coroutines and asyncio are probably the way to go.
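For example (hypothetical fetch coroutine; asyncio.sleep stands in for network I/O), many coroutines can mutate one dict without a lock, because only one coroutine runs at a time between awaits:

```python
import asyncio

hits = {}  # shared state, mutated by many coroutines

async def fetch(url: str) -> None:
    await asyncio.sleep(0.001)  # stand-in for a network round trip
    # Only one coroutine runs at a time between awaits, so this
    # read-modify-write needs no lock.
    hits[url] = hits.get(url, 0) + 1

async def main() -> None:
    urls = ["/a", "/b", "/a"] * 10
    await asyncio.gather(*(fetch(u) for u in urls))

asyncio.run(main())
print(hits)
```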

If, on the other hand, you have CPU-bound tasks where data is not shared, a multiprocessing-based system like Celery should be better.

If the task at hand is both CPU- and I/O-bound and sharing of data is not required, I would still use Celery. You can use async I/O from within Celery!
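A minimal sketch of that, with the Celery plumbing omitted: a hypothetical crawl task is a plain sync function that drives its own event loop via asyncio.run() (asyncio.sleep stands in for an async HTTP client):

```python
import asyncio

async def fetch(url: str) -> str:
    await asyncio.sleep(0.001)  # stand-in for an async HTTP request
    return f"fetched {url}"

async def fetch_all(urls: list) -> list:
    return await asyncio.gather(*(fetch(u) for u in urls))

# In a real project this function would carry Celery's @app.task
# decorator; the task body is ordinary sync code that runs a
# private event loop for the I/O-bound part.
def crawl(urls: list) -> list:
    return asyncio.run(fetch_all(urls))

print(crawl(["example.com/a", "example.com/b"]))
```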

If you have a CPU-bound task that also needs to share data, the only viable option I see right now is to save the shared data in a database. There have been recent attempts like pyparallel, but they are still a work in progress.

Corse answered 11/2, 2017 at 19:6 Comment(2)
How do you use async I/O from within Celery? Can you direct me to some useful material on this? Thanks. (Polinski)
@bhaskrc @ravimalhotra If you found out how to use asyncio with Celery, please share. I am doing this with the async_to_sync function from the asgiref package. E.g. github.com/aman199002/task_schedular/blob/main/celery_stuff/… (Tiemroth)

© 2022 - 2024 — McMap. All rights reserved.