
Is there some way to configure multiple worker and/or web processes to run within a single Heroku app? Or does this have to be broken up into multiple Heroku apps?

For example:

```
worker: node capture.js
worker: node process.js
worker: node purge.js
web: node api.js
web: node web.js
```
– tobius
  • For further clarification: when I try to do this, all of the .js files are executed, but they show up under foreman as the same process (e.g. worker.1 and web.1). I am interested in having them run on their own dynos so I can control their scaling individually. – tobius Apr 10 '14 at 13:26

1 Answer


All processes must have unique names. Additionally, names like worker carry no special meaning; the only process name that is significant is web, as stated in the Heroku docs:

The web process type is special as it’s the only process type that will receive HTTP traffic from Heroku’s routers. Other process types can be named arbitrarily. -- (https://devcenter.heroku.com/articles/procfile)

So you want a Procfile like so:

```
capture: node capture.js
process: node process.js
purge: node purge.js
api: node api.js
web: node web.js
```
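
Only the process named web will receive routed HTTP traffic, and Heroku hands that process its port through the PORT environment variable. As a minimal, illustrative sketch (not taken from the question's actual code), web.js could be as simple as:

```js
// web.js -- minimal sketch of a Heroku web process.
// Heroku injects the listening port via the PORT environment variable.
const http = require('http');

http.createServer((req, res) => {
  res.end('hello from the web process');
}).listen(process.env.PORT || 3000);
```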

You can then scale each process separately:

```
$ heroku ps:scale purge=4
```
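
Several process types can also be scaled in a single command; the dyno counts below are purely illustrative:

```
$ heroku ps:scale capture=2 process=1 purge=1 api=1 web=2
```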
– Yuval Adam
  • It seems that you were right that worker has no special meaning, but only the process named "web" will respond to web requests. If you update your answer I will accept. This was very helpful. – tobius Apr 10 '14 at 15:47
  • @tobius you're absolutely right, I'll edit my answer. – Yuval Adam Apr 10 '14 at 17:18
  • Also, Heroku's maintenance mode will only alter the function of "web" dynos (it will divert their traffic to a maintenance page). – Patrick Jan 06 '15 at 21:16
  • Is there a way to route traffic to the 'api' process in this example? – joerick Mar 04 '15 at 15:34
  • Looks like we will have to have a "web" proxy that streams `/api/*` requests to the "api" worker and streams all other requests to an "app" worker. This could get a little complicated when there are multiple instances of those workers; you would then need to implement or npm-install a load balancer. Or, you could just run multiple Heroku apps, but then the repository organisation gets a little complicated (submodules?). – Dan Ross Feb 23 '16 at 07:03
  • Heroku already implements a load balancer. Workers run on separate CPUs that do not affect the web worker, so you can use RabbitMQ, for example, to send work to a worker and then communicate back using something like Redis. If you are using Node.js you can also cluster your app to take advantage of more CPUs, but watch your RAM. – droid-zilla Oct 21 '16 at 15:40
  • In OP's example, the `api` process could bind on an arbitrary port, say `3456`, and the `web` process could proxy `/api` to `localhost:3456` with the help of something like `node-http-proxy` – Christophe Marois Jun 23 '17 at 18:13
  • What happens if you don't use unique names? I did this and it appeared to deploy okay but I haven't been able to test it on the server. – intcreator Jul 11 '17 at 17:56
  • I have done this test previously, and the last declaration of a duplicated name in the Procfile is the one that gets used. As an example, I tried running two dynos named `worker` from two different PHP files plus a single `web` dyno. If you are keeping count that should have been three dynos, but only two were activated. The `worker` dyno was running the PHP script from the second declaration, but when I named them `worker1` and `worker2` I wound up with all the dynos I was trying to use. – Chris Rutherfurd Mar 04 '18 at 06:22
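
Following up on the proxy idea in the comments above, here is a minimal sketch using the node-http-proxy package (`http-proxy` on npm). The port `3456` and the `/api` prefix are illustrative assumptions, and the sketch presumes the api process is actually reachable from the web process at that address (for example, both started in the same dyno), since separate Heroku dynos cannot reach one another over localhost.

```js
// web.js -- sketch of a web process that proxies /api/* to the api process.
// Assumes api.js listens on localhost:3456; adjust the target as needed.
const http = require('http');
const httpProxy = require('http-proxy');

const proxy = httpProxy.createProxyServer({});

http.createServer((req, res) => {
  if (req.url.startsWith('/api')) {
    // Forward API requests to the api process
    proxy.web(req, res, { target: 'http://localhost:3456' });
  } else {
    // Everything else is handled by the web process itself
    res.end('handled by web.js');
  }
}).listen(process.env.PORT || 3000);
```

If the api process really does run in its own dyno, it cannot be reached this way; in that case the queue-based approach droid-zilla describes (e.g. RabbitMQ or Redis) or a separate Heroku app is the way to go.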