Rob Golding is a web applications developer from the UK specializing in Python and Django, with 3 years of experience in the trade as a systems administrator. Rob is a DZone MVB (not an employee of DZone) and has posted 8 posts at DZone. You can read more from him at his website.

Announcing Celery-S3

03.05.2013

TL;DR: Check out celery-s3, it lets you store Celery task results in S3.

Celery has good support for a variety of message brokers – RabbitMQ, Redis, SQS, and others – but its support for result storage is more limited.

This is particularly apparent if you’re running Celery on an EC2 server and wish to take advantage of the distributed services AWS provides. SQS works as a message broker, but there’s nowhere to store results (and using the AMQP result store with SQS results in a queue for each result).
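To illustrate the gap, here is a minimal sketch of an SQS-only Celery configuration (setting names are Celery's standard ones; the credentials are placeholders, and in practice boto can also read them from the environment or an IAM role):

```python
# Celery configuration using Amazon SQS as the message broker.
# The URL-embedded credentials below are placeholders.
BROKER_URL = 'sqs://AWS_ACCESS_KEY_ID:AWS_SECRET_ACCESS_KEY@'

# With no CELERY_RESULT_BACKEND configured, task results are
# simply discarded -- and pointing the AMQP result backend at
# SQS creates one queue per result, which is what celery-s3
# is designed to avoid.
```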

Celery-S3 lets you store Celery results in an S3 bucket, which means you can run a fully-functioning Celery installation on AWS with nothing but a Python install.
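A minimal configuration might look like the following (a sketch based on the settings shown in the celery-s3 README; the access keys and bucket name are placeholders):

```python
# Use the S3 result backend provided by the celery-s3 package.
CELERY_RESULT_BACKEND = 'celery_s3.backends.S3Backend'

# Credentials and bucket for storing task results.
CELERY_S3_BACKEND_SETTINGS = {
    'aws_access_key_id': 'AWS_ACCESS_KEY_ID',
    'aws_secret_access_key': 'AWS_SECRET_ACCESS_KEY',
    'bucket': 'my-result-bucket',
}
```

Combined with the SQS broker, both halves of the setup then run on AWS-managed services, with no broker or result-store server to maintain yourself.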

Published at DZone with permission of Rob Golding, author and DZone MVB. (source)

(Note: Opinions expressed in this article and its replies are the opinions of their respective authors and not those of DZone, Inc.)