Suppose a server dumps log lines to a file and we need to read the last 100 lines. In the meantime, the file keeps growing as more lines are written. How should cases like this be handled?

Arnab Mukherjee
  • It isn't very clear what requirements need to be met. For all you know, you could just dump all the log contents into an array or a string, append to it, and flush all the contents later. – WorkingRobot Nov 20 '17 at 17:13
  • You might want to look [at this question](https://stackoverflow.com/questions/5419888/reading-from-a-frequently-updated-file), and [this one too](https://stackoverflow.com/questions/44407834/python-detect-log-file-rotation-while-watching-log-file-for-modification/44411621#44411621). – bgse Nov 20 '17 at 17:19
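The linked questions cover watching a frequently updated file. As a minimal sketch of that idea (a simple polling loop, like `tail -f`; the function name and poll interval are my own choices, and this does not handle log rotation):

```python
import os
import time

def follow(path, poll_interval=0.5):
    """Yield lines appended to the file at `path` after we start watching.

    Opens the file, seeks to the end, then polls for new lines.
    """
    with open(path, "r") as f:
        f.seek(0, os.SEEK_END)  # skip existing content; only report new lines
        while True:
            line = f.readline()
            if line:
                yield line.rstrip("\n")
            else:
                time.sleep(poll_interval)  # no new data yet; wait and retry
```

This polls rather than using OS-level notifications (inotify, kqueue), which keeps it portable at the cost of a small delay.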

1 Answer


Well, what you shouldn't do is dump the entire contents of the file into a string. That would be horrible, as log files can grow to sizes of 100 MB or more. If you are only allowed one file handle, what I would recommend is keeping a queue of lines to write: every once in a while, dump the queued lines into the log, and in the free time read the last 100 lines.
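For the "read the last 100 lines" part, a common technique is to seek backwards from the end of the file in blocks, so memory use stays bounded no matter how large the log gets. A minimal Python sketch of that approach (the function name and block size are my own choices, not from the answer):

```python
import os

def tail_lines(path, n=100, block_size=4096):
    """Return the last n lines of a file without reading it all into memory."""
    with open(path, "rb") as f:
        f.seek(0, os.SEEK_END)
        pos = f.tell()
        data = b""
        # Read backwards in blocks until we have more than n newlines
        # (or reach the start of the file).
        while pos > 0 and data.count(b"\n") <= n:
            step = min(block_size, pos)
            pos -= step
            f.seek(pos)
            data = f.read(step) + data
        lines = data.splitlines()
        return [line.decode("utf-8", errors="replace") for line in lines[-n:]]
```

Because only the trailing blocks are read, a 100 MB log costs roughly the same as a 1 KB one. If the process also holds the file open for writing, the reader should reopen or re-seek on each read, since the end offset keeps moving.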

WorkingRobot