
I currently have Postgres set up with the Flask-SQLAlchemy extension on AWS using Elastic Beanstalk (EB). Postgres is running on RDS. Now I want to set up some background tasks. I have read about Celery, and it seems to fit the use case reasonably well.

I want to understand how I can set that up on AWS so that it talks to the same database. For the actual queues I want to use Redis. The business logic for the background process and what I have in the Flask web server is very intertwined. What would the deployment process look like (with or without EB)? I am okay with setting up a new instance for Celery and Redis if needed, as long as I don't have to separate the business logic much.
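
To make that concrete, the wiring I have in mind follows the Celery integration pattern from the Flask docs, so the worker reuses the same app config (including the RDS connection string). make_celery and CELERY_BROKER_URL below are illustrative names, not code I have running:

from celery import Celery

def make_celery(app):
    # Point Celery at Redis and copy the Flask config (including the
    # SQLAlchemy/RDS settings) so worker and web server stay in sync.
    celery = Celery(app.import_name,
                    broker=app.config['CELERY_BROKER_URL'])  # e.g. redis://localhost:6379/0
    celery.conf.update(app.config)

    TaskBase = celery.Task

    class ContextTask(TaskBase):
        abstract = True

        def __call__(self, *args, **kwargs):
            # Run every task inside the Flask app context so
            # Flask-SQLAlchemy sessions work exactly as in views.
            with app.app_context():
                return TaskBase.__call__(self, *args, **kwargs)

    celery.Task = ContextTask
    return celery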

Another hacky solution I have been considering is to set up cron jobs on a node that hit certain URLs in the Flask application to execute background tasks. But I would rather have a more scalable solution.
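
For illustration, that cron approach would amount to entries like the one below (the endpoint is a placeholder), which is part of why it feels hacky: no retries, no queueing, no visibility into failures:

# Hypothetical crontab entry: trigger a background job by hitting a Flask URL.
*/5 * * * * curl -s http://localhost/tasks/run-nightly-report > /dev/null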


1 Answer


I'm using Flask with a similar setup and I followed this answer:

How do you run a worker with AWS Elastic Beanstalk?

I also set up Redis using this .config file:

https://gist.github.com/yustam/9086610
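
That gist installs and runs Redis on the instance through .ebextensions. As a rough sketch of the shape (not the gist's exact contents), such a config looks something like this:

# Hypothetical .ebextensions/redis.config sketch: build Redis from source
# and start it on the instance. The linked gist's details may differ.
packages:
  yum:
    gcc: []
    make: []
    wget: []

commands:
  01_install_redis:
    test: "[ ! -x /usr/local/bin/redis-server ]"
    command: |
      wget -q http://download.redis.io/redis-stable.tar.gz -O /tmp/redis-stable.tar.gz
      tar xzf /tmp/redis-stable.tar.gz -C /tmp
      make -C /tmp/redis-stable install
  02_start_redis:
    command: "/usr/local/bin/redis-server --daemonize yes"
    ignoreErrors: true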

However, for my setup, I changed the command to be this:

command=/opt/python/run/venv/bin/python2.7 manage.py celery
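
For context, that line lives in the supervisord program entry the linked answer sets up. A hedged sketch of the full stanza (paths assume EB's legacy Python platform; adjust for your environment):

; Sketch of the supervisord program entry for the Celery worker.
[program:celeryd]
command=/opt/python/run/venv/bin/python2.7 manage.py celery
directory=/opt/python/current/app
user=wsgi
numprocs=1
autostart=true
autorestart=true
stdout_logfile=/var/log/celery-worker.log
stderr_logfile=/var/log/celery-worker.log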

My manage.py has:

from celery.bin.celery import main as celery_main  # Celery 3.x CLI entry point

@manager.command
def celery():
    """Start the Celery worker inside the Flask app context."""
    with app.app_context():
        return celery_main(['celery', 'worker'])
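
Because the worker runs under app.app_context(), tasks can use Flask-SQLAlchemy models directly against the same RDS database. A hypothetical example (celery, User, and db are assumed to come from your app module):

# Hypothetical task: sessions behave as they do in request handlers.
@celery.task
def deactivate_user(user_id):
    user = User.query.get(user_id)
    user.active = False
    db.session.commit()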