TL;DR: Check out celery-s3, which lets you store Celery task results in S3.
Celery has good support for a variety of different message brokers – RabbitMQ, Redis, SQS, etc. – but support for result storage is somewhat more limited.
This is particularly apparent if you’re running Celery on an EC2 server and wish to take advantage of the distributed services AWS provides. SQS works as a message broker, but there’s nowhere to store results (and using the AMQP result store alongside SQS creates a separate queue for every result).
Celery-S3 lets you store Celery results in an S3 bucket, which means you can run a fully-functioning Celery installation on AWS with nothing but a Python install.
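In practice this is just Celery configuration. Here is a minimal sketch of an all-AWS setup, assuming the setting names from the celery-s3 README; the credentials and bucket name are placeholders you would fill in yourself:

```python
# celeryconfig.py -- sketch of a Celery setup running entirely on AWS:
# SQS as the message broker, celery-s3 for result storage.

# With no credentials in the URL, the SQS transport falls back to the
# environment (or an IAM role) for AWS credentials.
BROKER_URL = 'sqs://'

# Point the result backend at celery-s3's S3Backend.
CELERY_RESULT_BACKEND = 'celery_s3.backends.S3Backend'

# Placeholder credentials and bucket -- replace with your own.
CELERY_S3_BACKEND_SETTINGS = {
    'aws_access_key_id': '<your-access-key>',
    'aws_secret_access_key': '<your-secret-key>',
    'bucket': '<your-bucket-name>',
}
```

With a config module like this, results from `task.delay(...)` end up as objects in the named S3 bucket rather than in a broker-side queue.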
(Note: Opinions expressed in this article and its replies are the opinions of their respective authors and not those of DZone, Inc.)