#380811 · asked 1 year ago by sid0710
How to force Celery tasks to be processed by a single Docker container?
I am currently running a Celery chord task with a number of group tasks inside it. The group tasks write files to disk and pass the filenames on to the chord callback task, which processes those files to produce the final result. All of this runs inside a Docker container that is accessed via a Flask API. The system works fine as long as only one Docker container is running Celery.
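To make the setup concrete, here is a minimal sketch of the kind of workflow described above. The task names, file paths, and broker URL are illustrative placeholders, not my actual code:

```python
from celery import Celery, chord

app = Celery("tasks", broker="redis://localhost:6379/0")

@app.task
def write_part(part_id):
    # Each group task writes an intermediate file to the LOCAL disk
    # and returns its path for the chord callback.
    path = f"/tmp/part_{part_id}.dat"
    with open(path, "w") as f:
        f.write(f"data for part {part_id}")
    return path

@app.task
def combine(paths):
    # The chord callback reads every intermediate file -- this is
    # exactly what breaks if the paths live on another container.
    results = []
    for path in paths:
        with open(path) as f:
            results.append(f.read())
    return "\n".join(results)

# Kick off the workflow: run the group, then combine the results.
result = chord(write_part.s(i) for i in range(4))(combine.s())
```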
As soon as I scale Celery up by adding another Docker container, things start breaking: the group tasks are distributed across multiple containers, so the files are unavailable to the chord callback task.
How can I ensure that this complete workflow runs on a single container, and that if another task comes in while the first one is being processed, it goes to the idle Celery container and runs there? (See the routing sketch below for the kind of behavior I mean.)
Adding a shared volume will not work for my use case, since I want to add auto-scaling, which will force the containers onto different machines/instances.
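For illustration, this is roughly the behavior I am after, expressed with Celery's queue routing (reusing the task names from the sketch above). The queue names are hypothetical, and this does not solve the second half of my question, i.e. picking whichever container is currently idle:

```python
# Each container's worker would listen only on its own queue, e.g.:
#   celery -A tasks worker -Q worker_1
from celery import chord

def run_workflow_on(queue_name, n_parts=4):
    # Route the group tasks AND the chord callback to the same
    # dedicated queue, so a single worker (container) handles the
    # whole workflow and the files stay on one disk.
    header = [write_part.s(i).set(queue=queue_name) for i in range(n_parts)]
    return chord(header)(combine.s().set(queue=queue_name))

# Hypothetical dispatch: something would still have to pick the
# idle container's queue before calling this.
result = run_workflow_on("worker_1")
```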
python
docker
celery
autoscaling