We were recently forced to replace Celery with RQ because it is simpler and Celery was giving us too many problems. We are now unable to find a way to create multiple queues dynamically, which we need because multiple jobs have to run concurrently. Basically, every request to one of our routes should start a job, and it doesn't make sense to make multiple users wait for one user's job to finish before we can move on to the next ones. We periodically send a request to the server to get the job's status and some metadata, which lets us show the user a progress bar (the process can be lengthy, so this is necessary for the sake of UX).
We are using Django and Python's rq library. We are not using django-rq (please let me know if there are advantages to using it).
So far we start a task in one of our controllers like this:
from redis import Redis
from rq import Queue

redis_conn = Redis()
q = Queue(connection=redis_conn)
job = q.enqueue(render_task, new_render.pk, domain=domain, data=csv_data, timeout=1200)
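The periodic status request needs some handle to find the job again; one way to do that (purely illustrative, the helper name enqueue_response is hypothetical and not our exact code) is to hand the RQ job id back to the client:

# Illustrative only: return the RQ job id so the client-side poller
# knows which job to ask about later.
from django.http import JsonResponse

def enqueue_response(job):
    return JsonResponse({'job_id': job.get_id()})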
Then, in our render_task method, we add metadata to the job based on the state of the long-running task:
from rq import get_current_job

current_job = get_current_job()
current_job.meta['state'] = 'PROGRESS'
current_job.meta['process_percent'] = process_percent
current_job.meta['message'] = 'YOUTUBE'
current_job.save()  # persist the updated meta to Redis
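For context, the surrounding task is shaped roughly like the skeleton below (trimmed and hypothetical; the CSV row loop and the final 'DONE' state are illustrative, not our exact code).

# Hypothetical skeleton of render_task, showing where the meta updates
# above are made during the long-running work.
from rq import get_current_job

def render_task(render_pk, domain=None, data=None):
    current_job = get_current_job()
    rows = list(data or [])
    for i, row in enumerate(rows, start=1):
        # ... do the actual rendering work for this row ...
        current_job.meta['state'] = 'PROGRESS'
        current_job.meta['process_percent'] = int(i * 100 / len(rows))
        current_job.save()
    current_job.meta['state'] = 'DONE'  # illustrative terminal state
    current_job.save()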
Now we have another endpoint that fetches the current job and its metadata and passes them back to the client (this happens through a periodic AJAX request).
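Roughly, that endpoint looks like the sketch below (a minimal sketch assuming the client sends the RQ job id; the view name, URL parameter, and JSON shape are illustrative).

# Minimal sketch of the status endpoint (names and response shape are
# illustrative, not our exact code).
from django.http import JsonResponse
from redis import Redis
from rq.job import Job

def job_status(request, job_id):
    redis_conn = Redis()
    job = Job.fetch(job_id, connection=redis_conn)
    return JsonResponse({
        'status': job.get_status(),  # queued / started / finished / failed
        'state': job.meta.get('state'),
        'process_percent': job.meta.get('process_percent'),
        'message': job.meta.get('message'),
    })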
How do we go about running jobs concurrently without blocking other jobs? Should we create queues dynamically? Is there a way to make use of workers to achieve this?
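To make the last question concrete: would it be enough to start several identical worker processes that all consume the same queue, something like the sketch below (run once per desired level of concurrency), or do we really need to create a queue per job?

# Hypothetical sketch: one of several identical worker processes,
# each started separately, all consuming the same default queue.
from redis import Redis
from rq import Queue, Worker

redis_conn = Redis()
queue = Queue(connection=redis_conn)
worker = Worker([queue], connection=redis_conn)
worker.work()  # blocks and processes jobs from the queue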