
I have 2 scripts: let's say Worker.py and Monitor.py.

Worker.py collects some data and dumps the metrics into a temp file. Monitor.py watches the temp file and sends the metrics to a remote server running Graphite. Essentially I am trying to create a real-time, time-series graph of the data points collected by Worker.py, but I am unsure how to coordinate these two scripts, or whether this workflow is even a good one. I was looking into multi-threading and Celery and am not sure if those ideas apply to my situation. Any advice guys?

EDIT: the reason I have Monitor.py as a separate script is that I also have Worker1.py, Worker2.py, ..., WorkerN.py. So I would like to have a single Monitor script that can be configured, via the command line, to watch different temp files and plot them to different graphs.
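To make the question concrete, here is a minimal sketch of what such a configurable Monitor.py could look like, assuming Graphite's plaintext protocol on TCP port 2003; the `--file`, `--prefix`, and `--interval` options, the single-numeric-value file format, and the metric-path names are all hypothetical stand-ins, not part of the original setup:

```python
import argparse
import socket
import time


def format_metric(path, value, timestamp):
    """Build one line of Graphite's plaintext protocol: "<path> <value> <ts>\n"."""
    return "%s %s %d\n" % (path, value, timestamp)


def send_metric(line, host="localhost", port=2003):
    """Open a short-lived TCP connection and push one metric line to Graphite."""
    sock = socket.socket()
    sock.connect((host, port))
    sock.sendall(line.encode("ascii"))
    sock.close()


if __name__ == "__main__":
    # Hypothetical CLI: python Monitor.py --file /tmp/worker1.out --prefix workers.worker1
    parser = argparse.ArgumentParser()
    parser.add_argument("--file", required=True, help="temp file written by a Worker script")
    parser.add_argument("--prefix", required=True, help="Graphite metric path prefix")
    parser.add_argument("--interval", type=float, default=5.0, help="seconds between polls")
    args = parser.parse_args()

    while True:
        with open(args.file) as f:
            value = f.read().strip()  # assumes the worker writes a single numeric value
        send_metric(format_metric(args.prefix, value, int(time.time())))
        time.sleep(args.interval)
```

Each worker then gets its own invocation (e.g. one per temp file with a different `--prefix`), which matches the "same Monitor script, different temp files" requirement above.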

lollerskates
  • One of the scripts (probably Monitor in this case) could start the other with `subprocess.Popen` and receive the data through the `stdout` pipe; looking for an example to link to now... – Tadhg McDonald-Jensen May 29 '16 at 19:30
  • @TadhgMcDonald-Jensen, I think you are looking for http://stackoverflow.com/questions/2715847/python-read-streaming-input-from-subprocess-communicate/17698359#17698359 – Gurupad Hegde May 29 '16 at 19:41
  • @TadhgMcDonald-Jensen, so you are saying have Monitor.py start the workers as a subprocess? – lollerskates May 29 '16 at 19:50
  • yes, unless you need the temp file for other reasons pushing the data directly to `monitor` through the stdout seems to be the best option in my opinion. – Tadhg McDonald-Jensen May 29 '16 at 19:52
  • My advice is to try it and see if it works. I suspect it will. – Warren Dew May 29 '16 at 19:54
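The `subprocess.Popen` suggestion from the comments above can be sketched roughly like this: Monitor spawns a worker and reads metrics line by line from its stdout instead of polling a temp file. The inline `-c` worker is a stand-in for a real Worker.py that prints one metric per line:

```python
import subprocess
import sys

# Stand-in worker: a real Worker.py would print one metric per line to stdout.
# -u keeps the child's stdout unbuffered so lines arrive as they are produced.
worker_cmd = [sys.executable, "-u", "-c",
              "for i in range(3): print(i * 10)"]

proc = subprocess.Popen(worker_cmd, stdout=subprocess.PIPE,
                        universal_newlines=True)

# Read each metric as soon as the worker emits it; no temp file needed.
for line in proc.stdout:
    value = line.strip()
    print("got metric:", value)  # here Monitor would forward the value to Graphite

proc.wait()
```

One process per worker keeps the Monitor-side configuration simple: the command line passed to `Popen` is the only thing that changes between Worker1 and WorkerN.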

0 Answers