Docker Log File Size Issue


Gico

My Docker log keeps filling up. It is caused by the following messages, which appear every second or two.

 

My issues/questions:
1. Docker doesn't comply with the Docker log size limit: the settings are 10MB max, 2 log files, yet the log clogs my tmpfs, growing to 128MB every day or so.
2. Why/how did the Docker log level change to debug, and how do I change it back?
3. Is it normal for Docker to be queried for info so intensively? Should I search for the cause?

 

Repeating messages in docker log:
time="2018-09-26T10:39:51.711023129+03:00" level=debug msg="Calling GET /_ping"
time="2018-09-26T10:39:51.711452020+03:00" level=debug msg="Calling GET /v1.38/version"
time="2018-09-26T10:39:51.712261697+03:00" level=debug msg="Calling GET /v1.38/events?filters=%7B%22type%22%3A%7B%22container%22%3Atrue%7D%7D"
time="2018-09-26T10:39:51.712697853+03:00" level=debug msg="Calling GET /v1.38/containers/json?limit=0"
time="2018-09-26T10:39:51.720401485+03:00" level=debug msg="Calling GET /v1.38/containers/865dbb780dfb/stats?stream=0"
time="2018-09-26T10:39:51.720474027+03:00" level=debug msg="Calling GET /v1.38/containers/3db916e81f42/stats?stream=0"
time="2018-09-26T10:39:51.720529815+03:00" level=debug msg="Calling GET /v1.38/containers/a0cfb32dcac1/stats?stream=0"
time="2018-09-26T10:39:51.720559173+03:00" level=debug msg="Calling GET /v1.38/containers/1ed2bb3c89db/stats?stream=0"
time="2018-09-26T10:39:51.720589840+03:00" level=debug msg="Calling GET /v1.38/containers/5df94a2abd15/stats?stream=0"
time="2018-09-26T10:39:51.720728577+03:00" level=debug msg="Calling GET /v1.38/containers/693476194b3d/stats?stream=0"
time="2018-09-26T10:39:53.641488701+03:00" level=debug msg="Client context cancelled, stop sending events"

Edited by Gico
Link to comment
  • 2 weeks later...
On 9/26/2018 at 3:59 AM, Gico said:

1. The docker doesn't comply with the docker log size limit: The settings are 10MB max, 2 log files. The log clogs my tmpfs: up to 128MB every day or so.

 

The Docker log size limit refers to the logs from a container, not from the Docker system itself.
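
For anyone unsure where each of the two logs lives, a rough way to see the difference (the daemon log path is an assumption based on a typical Unraid setup, where /var/log sits on tmpfs):

# per-container log, which the 10MB / 2-file limit rotates
docker inspect --format '{{.LogPath}}' <container-name>
# the daemon's own log, which that limit does not touch
ls -lh /var/log/docker.log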

On 9/26/2018 at 3:59 AM, Gico said:

Should I search for the cause?

I would.
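
If it helps narrow things down, one simple thing to try (plain shell, nothing Unraid-specific assumed): if whatever is polling the API is a docker CLI invocation, it will show up briefly in the process list, so a loop like this can catch it in the act:

while true; do ps -ef | grep '[d]ocker '; sleep 1; done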

Edited by Squid
Link to comment
  • 1 year later...

My Docker log often gets filled with these messages every few seconds:

time="2019-11-14T10:27:47.142030231+02:00" level=debug msg="Calling GET /v1.37/containers/f3c02ffa8cde0ef9623d4455236370f72084a2a166a10297b6124177fd80353c/json"
time="2019-11-14T10:27:47.142796464+02:00" level=debug msg="Calling GET /v1.37/containers/a7cca2b1634e5c33b7568d673d036f183d65d6d58acb6a4ff76be8faa10ef231/json"
time="2019-11-14T10:27:47.143377579+02:00" level=debug msg="Calling GET /v1.37/containers/716254a6834843da1450c22ecb5c7f46a3fdc131363e9fe6a41424ae3e748028/json"
time="2019-11-14T10:27:47.144050634+02:00" level=debug msg="Calling GET /v1.37/containers/d5a9683ea707d2eae142bda82d83201fc8b5d4502ea3ed1c59de4258cccc021e/json"
time="2019-11-14T10:27:47.144669639+02:00" level=debug msg="Calling GET /v1.37/containers/add57c9a2682fe3c199d6f3edf76df2f0e1628bfaa7f5d65b1919fb4ba657938/json"
time="2019-11-14T10:27:47.145322148+02:00" level=debug msg="Calling GET /v1.37/containers/d504ae998b721566680d7a41cd86ea1a755d6424bced611344cd05bfe6ba570b/json"
time="2019-11-14T10:27:47.146126158+02:00" level=debug msg="Calling GET /v1.37/containers/6458706016cc098ab5342f82446a1140cb7b97d4f191f60670d8b2a2dea1d91d/json"
time="2019-11-14T10:27:47.146880602+02:00" level=debug msg="Calling GET /v1.37/containers/9371149b4a9c477f1a2832936df9c555764a73ade300a2886054c0df68c31469/json"
time="2019-11-14T10:27:47.147601801+02:00" level=debug msg="Calling GET /v1.37/containers/f15da2ef0722efd5a742a2f2d0be94748e4543660ac83950383797eba2de5b92/json"

I assumed I needed to disable debug mode, and following these instructions I tried adding DOCKER_DEBUG="no" and DOCKER_DEBUG="false" to the docker.cfg file, but debug is still enabled.
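
For completeness, this is how I am checking the state after each attempt (the docker.cfg path is the one on my flash drive; nothing here should be specific to a particular Unraid release):

docker info 2>/dev/null | grep -i 'debug'
grep -i 'debug' /boot/config/docker.cfg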

 

[Attached screenshot: CMDs.jpg]

 

Edited by Gico
Link to comment
  • 4 weeks later...

This has been an ongoing issue on my server for quite some time. I already posted it here and here in the Docker Engine forum, but there was little response and it is not solved yet.

My Docker log keeps filling up with the following debug messages every second or two.

time="2019-12-11T07:54:28.125897846+02:00" level=debug msg="Client context cancelled, stop sending events"
time="2019-12-11T07:54:28.163582755+02:00" level=debug msg="Calling GET /_ping"
time="2019-12-11T07:54:28.164229066+02:00" level=debug msg="Calling GET /v1.39/version"
time="2019-12-11T07:54:28.165000240+02:00" level=debug msg="Calling GET /v1.39/events?filters=%7B%22type%22%3A%7B%22container%22%3Atrue%7D%7D"
time="2019-12-11T07:54:28.165421501+02:00" level=debug msg="Calling GET /v1.39/containers/json?limit=0"
time="2019-12-11T07:54:28.175312376+02:00" level=debug msg="Calling GET /v1.39/containers/f15da2ef0722/stats?stream=0"
time="2019-12-11T07:54:28.175719189+02:00" level=debug msg="Calling GET /v1.39/containers/afebf568de86/stats?stream=0"
time="2019-12-11T07:54:28.177248098+02:00" level=debug msg="Calling GET /v1.39/containers/36b6cc31a520/stats?stream=0"
time="2019-12-11T07:54:28.177301408+02:00" level=debug msg="Calling GET /v1.39/containers/d6b5b1f59817/stats?stream=0"
time="2019-12-11T07:54:28.177403166+02:00" level=debug msg="Calling GET /v1.39/containers/d5f3d5caa135/stats?stream=0"
time="2019-12-11T07:54:28.177499741+02:00" level=debug msg="Calling GET /v1.39/containers/5888248d4de5/stats?stream=0"
time="2019-12-11T07:54:28.177694344+02:00" level=debug msg="Calling GET /v1.39/containers/50e7e412aba9/stats?stream=0"
time="2019-12-11T07:54:30.166188765+02:00" level=debug msg="Client context cancelled, stop sending events"
time="2019-12-11T07:54:30.199804739+02:00" level=debug msg="Calling GET /_ping"
time="2019-12-11T07:54:30.200733367+02:00" level=debug msg="Calling GET /v1.39/version"
time="2019-12-11T07:54:30.201967325+02:00" level=debug msg="Calling GET /v1.39/events?filters=%7B%22type%22%3A%7B%22container%22%3Atrue%7D%7D"
time="2019-12-11T07:54:30.202688682+02:00" level=debug msg="Calling GET /v1.39/containers/json?limit=0"
time="2019-12-11T07:54:30.213254645+02:00" level=debug msg="Calling GET /v1.39/containers/f15da2ef0722/stats?stream=0"
time="2019-12-11T07:54:30.213381857+02:00" level=debug msg="Calling GET /v1.39/containers/afebf568de86/stats?stream=0"
time="2019-12-11T07:54:30.214664222+02:00" level=debug msg="Calling GET /v1.39/containers/36b6cc31a520/stats?stream=0"
time="2019-12-11T07:54:30.214754288+02:00" level=debug msg="Calling GET /v1.39/containers/50e7e412aba9/stats?stream=0"
time="2019-12-11T07:54:30.214858638+02:00" level=debug msg="Calling GET /v1.39/containers/d6b5b1f59817/stats?stream=0"
time="2019-12-11T07:54:30.214910970+02:00" level=debug msg="Calling GET /v1.39/containers/d5f3d5caa135/stats?stream=0"
time="2019-12-11T07:54:30.214937826+02:00" level=debug msg="Calling GET /v1.39/containers/5888248d4de5/stats?stream=0"
time="2019-12-11T07:54:32.202934933+02:00" level=debug msg="Client context cancelled, stop sending events"
time="2019-12-11T07:54:32.248469161+02:00" level=debug msg="Calling GET /_ping"
time="2019-12-11T07:54:32.249427812+02:00" level=debug msg="Calling GET /v1.39/version"
time="2019-12-11T07:54:32.250282086+02:00" level=debug msg="Calling GET /v1.39/events?filters=%7B%22type%22%3A%7B%22container%22%3Atrue%7D%7D"
time="2019-12-11T07:54:32.251172942+02:00" level=debug msg="Calling GET /v1.39/containers/json?limit=0"
time="2019-12-11T07:54:32.259211926+02:00" level=debug msg="Calling GET /v1.39/containers/d6b5b1f59817/stats?stream=0"
time="2019-12-11T07:54:32.261015242+02:00" level=debug msg="Calling GET /v1.39/containers/f15da2ef0722/stats?stream=0"
time="2019-12-11T07:54:32.261227812+02:00" level=debug msg="Calling GET /v1.39/containers/36b6cc31a520/stats?stream=0"
time="2019-12-11T07:54:32.261383225+02:00" level=debug msg="Calling GET /v1.39/containers/afebf568de86/stats?stream=0"
time="2019-12-11T07:54:32.261488331+02:00" level=debug msg="Calling GET /v1.39/containers/d5f3d5caa135/stats?stream=0"
time="2019-12-11T07:54:32.261707089+02:00" level=debug msg="Calling GET /v1.39/containers/5888248d4de5/stats?stream=0"
time="2019-12-11T07:54:32.262745556+02:00" level=debug msg="Calling GET /v1.39/containers/50e7e412aba9/stats?stream=0" 

The docker engine is indeed in debug mode:

root@Juno:~# docker info | grep -i debug.*server
Debug Mode (server): true

I guess it is caused by the "-debug" added to the end of the containerd-shim commands, but where does this come from?

root@Juno:~# ps -ef | grep docker
root      3581     1  0 10:36 ?        00:00:00 /bin/bash /usr/local/emhttp/plugins/dynamix.docker.manager/scripts/docker_load
root      4865 23132  0 10:36 ?        00:00:00 /usr/bin/docker-proxy -proto tcp -host-ip 0.0.0.0 -host-port 3306 -container-ip 172.17.0.3 -container-port 3306
root      6053 23132  0 10:37 ?        00:00:00 /usr/bin/docker-proxy -proto tcp -host-ip 0.0.0.0 -host-port 7910 -container-ip 172.17.0.4 -container-port 5900
root      6068 23132  0 10:37 ?        00:00:00 /usr/bin/docker-proxy -proto tcp -host-ip 0.0.0.0 -host-port 7810 -container-ip 172.17.0.4 -container-port 5800
root      6626 23154  0 10:37 ?        00:00:00 containerd-shim -namespace moby -workdir /var/lib/docker/containerd/daemon/io.containerd.runtime.v1.linux/moby/d5f3d5caa1350d062f2517202deee495c2546c6ae90cb0a2be0e8b365d1eb9ec -address /var/run/docker/containerd/containerd.sock -containerd-binary /usr/bin/containerd -runtime-root /var/run/docker/runtime-runc -debug
root      8640 23154  0 10:37 ?        00:00:00 containerd-shim -namespace moby -workdir /var/lib/docker/containerd/daemon/io.containerd.runtime.v1.linux/moby/36b6cc31a520a851372059264e45e568ff8bbe8676f1a33ec2f18032e2a82d90 -address /var/run/docker/containerd/containerd.sock -containerd-binary /usr/bin/containerd -runtime-root /var/run/docker/runtime-runc -debug
root      9010 23132  0 10:37 ?        00:00:00 /usr/bin/docker-proxy -proto tcp -host-ip 0.0.0.0 -host-port 6080 -container-ip 172.17.0.5 -container-port 6080
root     10159 23154  0 10:37 ?        00:00:00 containerd-shim -namespace moby -workdir /var/lib/docker/containerd/daemon/io.containerd.runtime.v1.linux/moby/5888248d4de57eed593cdf2ed5b5f98fc9d14523a0e0435753d591cc47eae454 -address /var/run/docker/containerd/containerd.sock -containerd-binary /usr/bin/containerd -runtime-root /var/run/docker/runtime-runc -debug
root     10485 23132  0 10:37 ?        00:00:00 /usr/bin/docker-proxy -proto tcp -host-ip 0.0.0.0 -host-port 8181 -container-ip 172.17.0.6 -container-port 8181
root     12035 23154  0 10:37 ?        00:00:00 containerd-shim -namespace moby -workdir /var/lib/docker/containerd/daemon/io.containerd.runtime.v1.linux/moby/f15da2ef0722efd5a742a2f2d0be94748e4543660ac83950383797eba2de5b92 -address /var/run/docker/containerd/containerd.sock -containerd-binary /usr/bin/containerd -runtime-root /var/run/docker/runtime-runc -debug
root     13312 23154  0 10:37 ?        00:00:00 containerd-shim -namespace moby -workdir /var/lib/docker/containerd/daemon/io.containerd.runtime.v1.linux/moby/afebf568de86ad2b42114dac2f9d869e31e56108a960314c1af15938ce11a6ff -address /var/run/docker/containerd/containerd.sock -containerd-binary /usr/bin/containerd -runtime-root /var/run/docker/runtime-runc -debug
root     23132     1  1 10:30 ?        00:00:15 /usr/bin/dockerd -p /var/run/dockerd.pid --mtu=900 --log-opt max-size=10m --log-opt max-file=2 --storage-driver=btrfs -D --storage-driver=btrfs
root     23154 23132  2 10:30 ?        00:00:26 containerd --config /var/run/docker/containerd/containerd.toml --log-level debug
root     29857 23154  0 10:31 ?        00:00:00 containerd-shim -namespace moby -workdir /var/lib/docker/containerd/daemon/io.containerd.runtime.v1.linux/moby/50e7e412aba948d9f02c8ab6a5ea58eaae3e66fd766fd21e9fca6095200b6c1c -address /var/run/docker/containerd/containerd.sock -containerd-binary /usr/bin/containerd -runtime-root /var/run/docker/runtime-runc -debug
root     30385  3581  2 10:49 ?        00:00:00 docker stats --no-stream --format={{.ID}};{{.CPUPerc}};{{.MemUsage}}
root     30528  8819  0 10:49 pts/0    00:00:00 grep docker
root     30876 23132  0 10:31 ?        00:00:00 /usr/bin/docker-proxy -proto tcp -host-ip 0.0.0.0 -host-port 9443 -container-ip 172.17.0.2 -container-port 9443
root     30925 23132  0 10:31 ?        00:00:00 /usr/bin/docker-proxy -proto udp -host-ip 0.0.0.0 -host-port 1194 -container-ip 172.17.0.2 -container-port 1194
root     30963 23132  0 10:31 ?        00:00:00 /usr/bin/docker-proxy -proto tcp -host-ip 0.0.0.0 -host-port 943 -container-ip 172.17.0.2 -container-port 943
root     31813 23154  0 10:31 ?        00:00:00 containerd-shim -namespace moby -workdir /var/lib/docker/containerd/daemon/io.containerd.runtime.v1.linux/moby/d6b5b1f598175b787039994da22dea95cd0c367680efc4ba947c2de5198ef0c6 -address /var/run/docker/containerd/containerd.sock -containerd-binary /usr/bin/containerd -runtime-root /var/run/docker/runtime-runc -debug
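
Looking more closely at that listing, dockerd itself is also running with -D, so my working assumption is that containerd and the containerd-shim processes simply inherit the debug setting from the daemon. A quick way to pick that flag out of the process list (plain grep, nothing Unraid-specific):

ps -ef | grep '[d]ockerd ' | grep -o ' -D '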

I followed these instructions and added an /etc/docker/daemon.json file with "debug": false, but that didn't help either:

root@Juno:/etc/docker# ls
daemon.json  key.json
root@Juno:/etc/docker# less daemon.json
{
    "debug": false
}
root@Juno:/etc/docker# sudo kill -SIGHUP $(pidof dockerd)
root@Juno:/etc/docker# docker info | grep -i debug.*server
Debug Mode (server): true
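
My current guess, and I have not been able to confirm it anywhere, is that because dockerd was started with -D on its command line, flipping "debug" to false in daemon.json and sending SIGHUP is not enough to turn it off. If that is right, the thing to chase is wherever that command line gets built, which on my system means going back over /boot/config/docker.cfg line by line:

cat /boot/config/docker.cfg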

 

Link to comment
44 minutes ago, Gico said:

Already posted it here and here in the Docker Engine forum,

Posting in a new location is unlikely to get any better response unless you can provide new information.

 

As far as I know, a standard Unraid Docker installation does not have the daemon.json file (at least mine does not). Have you identified what is putting it there on your system? Have you tried posting your system diagnostics zip file to see if anyone can spot a cause?

Link to comment
3 hours ago, Gico said:

This is an on going issue on my server for quite some time. Already posted it here and here in the Docker Engine forum,

I have merged the 3 threads. Don't crosspost! How can we coordinate our replies when you have posted multiple threads?

 

Crossposting has been considered a bad thing on message boards since before the World Wide Web.

 

If you think you haven't gotten attention, just try bumping the thread by making a new post in it so it will show up as unread again.

 

Link to comment

After deleting and recreating the Docker image, the Docker service would not start. I had to restart the server, which solved that issue.

However, even without any containers, Docker is still in debug mode:

root@Juno:~# docker info | grep -i debug.*server
Debug Mode (server): true

Can anyone check their Docker service and report whether they get Debug Mode true or false?

Edited by Gico
Link to comment
  • 2 years later...
  • Solution

So after 3 years of tolerating this issue, and for a while restarting the Docker service every 3 days or so in order to delete the log file, I finally found the cause: /boot/config/docker.cfg had this line:

DOCKER_OPTS="-D --storage-driver=btrfs"
 

The -D flag turns on debug mode, so debug messages were written to the log. Removing the -D resolved the issue.
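
For anyone else who lands here, the corrected line (keeping the btrfs storage driver as set on my system) is simply:

DOCKER_OPTS="--storage-driver=btrfs"

followed by a restart of the Docker service so that dockerd is relaunched without -D.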

Link to comment
