
I have the following setup:

  1. Perl service running in a container and writing logs out to STDERR
  2. logspout to ship those logs out to a remote server for archiving

on a machine with 600 MB of RAM.

I also truncate the logs periodically at:

/var/lib/docker/containers/CID/CID-json.log

as suggested here to avoid 100% disk scenarios.
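The truncation step can be sketched as follows. This is a minimal sketch, not the exact cron job from the setup above: a temp file stands in for the real json log so the snippet runs without a Docker host, and `truncate` (coreutils) is used because deleting a file the daemon still holds open would not free the disk space.

```shell
#!/bin/sh
# Sketch of the periodic truncation step. The real target would be
# /var/lib/docker/containers/CID/CID-json.log; a temp file stands in
# here so the snippet is runnable anywhere.
LOG="$(mktemp)"
printf 'line 1\nline 2\n' > "$LOG"   # stand-in for accumulated log data

# Truncate in place rather than `rm`: the daemon keeps the file handle
# open, so deleting the file would leave the space allocated until the
# daemon restarts.
truncate -s 0 "$LOG"
wc -c < "$LOG"   # 0 bytes remain
rm -f "$LOG"
```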

Problem

The docker daemon starts off with low memory usage, about 1%, and slowly climbs to 40% after 2 days of running the container.

Reference

The Docker daemon memory leak has been discussed in this issue and this issue, but both are now closed as merged at a commit. I am running the latest major version of Docker (version 1.4.0, build 4595d4f), but I still see monotonically increasing memory usage.

EDIT: I ran this experiment: just start a bash process in the container and print a lot of lines to STDERR; the docker daemon process's memory usage climbs very quickly.
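A minimal sketch of that experiment, with my assumptions noted in comments (the busybox image, the line count, and the `pidof` lookup are placeholders, not from the original post); the Docker step is guarded so the script still runs on a host without Docker:

```shell
#!/bin/sh
# Repro sketch: flood a container's STDERR, then sample the daemon's
# resident memory the way `top` reports it.
if command -v docker >/dev/null 2>&1; then
    # Every line written here passes through the daemon's json-file
    # logging path. Image and line count are arbitrary choices.
    docker run --rm busybox sh -c 'yes flood | head -n 200000 1>&2' || true
fi

# RSS in KB; falls back to this shell's own PID so the sampling line
# always runs (on Docker 1.4 the daemon process is `docker`, not
# `dockerd`, hence the hedged lookup).
PID="$(pidof docker 2>/dev/null || echo $$)"
ps -o rss= -p "$PID" | head -n 1
```

Sampling RSS repeatedly while the flood runs should show whether the daemon's resident memory actually grows, which separates a real leak from VmSize noise.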

Does Docker do some log buffering and fail to release memory even after the underlying log file (/var/lib/docker/containers/CID/CID-json.log) is cleared?

There's apparently no way to clear the logs. Will this commit solve the issue for long-running tasks?

I don't know why the docker daemon's memory usage keeps increasing. How do I debug this issue?

alpha_cod
  • At least try to send the output of the application to /dev/null or a logfile, instead of sending to STDOUT/STDERR. And see if this solves your problems. Then you can pinpoint if the logging causes memory consumption. – RoyB Dec 31 '14 at 12:06
  • How are you measuring memory usage? On my system I get a VmSize of 330MB straight away, but RSS of 14MB. RSS is not very good for this kind of discussion because it will go up and down depending on what else the machine is doing. – Bryan Jan 05 '15 at 10:14
  • I measured the memory usage using the Linux top command. The docker daemon starts off with low memory usage, and the growth is visible within minutes of running a bash container and echoing lots of text to stdout. The accelerating memory usage eventually causes an OOM crash – alpha_cod Jan 05 '15 at 21:35
  • 1
    I tried it using `yes`; I could get Docker to use a lot of CPU but no increase in memory. Maybe I used different options - which of `-dti` did you use on `docker run`? Maybe post a complete minimal repro? – Bryan Jan 06 '15 at 10:04

3 Answers


There is still at least one outstanding issue relating to memory leaks with logs: https://github.com/docker/docker/issues/9139

mhsmith

This may not be what you are looking for, but I usually run a cron job to restart my containers after a set amount of time each day. This ensures the container always has enough RAM, and I also restrict the maximum RAM a container can use when creating it.

Containers take only a few seconds to restart and serve data again, so if you are not running a high-availability service and can afford a few seconds of downtime, consider restarting the container (assuming you don't have persistent volumes).
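As a sketch of that suggestion, assuming a container named `perl-service` (the name, schedule, and memory cap below are placeholders, not values from this answer):

```shell
# Hypothetical /etc/cron.d/docker-restart entry: restart the container
# nightly at 03:00 so its memory is reclaimed.
0 3 * * * root /usr/bin/docker restart perl-service >/dev/null 2>&1
```

The memory cap mentioned above would be set at creation time with something like `docker run -d -m 256m --name perl-service my-perl-image`, so a misbehaving container is killed before it exhausts the host's 600 MB.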

However, if you do find a solution to your problem, do let us know.

Sohail
  • docker rm $(docker ps -a -q)
  • docker rmi --force $(docker images -q)
  • docker system prune --force

You need to be the root user for the following:

  • systemctl stop docker
  • rm -rf /var/lib/docker/aufs
  • apt-get autoclean
  • apt-get autoremove
  • systemctl start docker
khushbu