JohanSF Posted March 12, 2019

Hi there. Some of my containers were turned off this morning, while others are still running. When I try to turn a container on I get this:

Running a Fix Common Problems test I get this:

I searched a bit and found someone suggesting a balance, although that didn't fix it for me:

What should I do?

hal9000-diagnostics-20190312-0617.zip
JohanSF Posted March 12, 2019 (Author)

Hey GilbN, I searched for how to do that and found this way to increase the log size:

However, I still get the docker error. I guess I will have to delete my docker image and do this:

Unless someone here has another suggestion before I do that?
GilbN Posted March 12, 2019

JohanSF said: "Hey GilbN, I searched for how to do that and found this way to increase the log size. However, I still get the docker error. I guess I will have to delete my docker image and do this: Unless someone here has another suggestion before I do that?"

A quick forum search gave me this:

du -ah /var/lib/docker/containers/ | grep -v "/$" | sort -rh | head -60

Then delete the biggest files. You should limit the docker log size, though:

--log-opt max-size=50m --log-opt max-file=1
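If the du output above is noisy, a narrower variant of the same idea (a sketch, assuming the standard Docker root path used in the command above) lists only the per-container JSON log files, largest first:

```shell
# List only Docker's per-container *-json.log files, sorted largest first.
# Path assumption: /var/lib/docker/containers/, as in the du command above.
find /var/lib/docker/containers/ -name "*-json.log" \
    -exec du -h {} + 2>/dev/null | sort -rh | head -20
```

The `sort -rh` step orders human-readable sizes (K, M, G) numerically in reverse, so the biggest offenders land at the top.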
GilbN Posted March 12, 2019

You can match the container ID in the Docker tab to find the culprit.
JohanSF Posted March 12, 2019 (Author)

Most of them give "cannot read directory: input/output error". I can't screenshot it right now as I'm not home, though. For "--log-opt max-size=50m --log-opt max-file=1", is something missing in this command? Those are only options.
GilbN Posted March 12, 2019

Does it not show any large JSON log files?

--log-opt max-size=50m --log-opt max-file=1

That goes in the extra parameters field on each container.
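As an engine-wide alternative to setting those flags per container, the same limits can be configured once in Docker's daemon.json. This is a sketch for a stock Docker install (where the file lives at /etc/docker/daemon.json); it only applies to containers created after the daemon is restarted, and Unraid manages the Docker service its own way, so per-container extra parameters may still be the practical route there:

```json
{
  "log-driver": "json-file",
  "log-opts": {
    "max-size": "50m",
    "max-file": "1"
  }
}
```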
JohanSF Posted March 12, 2019 (Author)

The largest one that I can see is 75 MB; that is the lidarr container. I remember that one was running with the Debug log level, and I had recently started using it more. But of course there are quite a lot that I cannot see.
GilbN Posted March 12, 2019

JohanSF said: "The largest one that I can see is 75 MB; that is the lidarr container. I remember that one was running with the Debug log level, and I had recently started using it more. But of course there are quite a lot that I cannot see."

Try:

docker system df -v

This command can take a while.
JohanSF Posted March 12, 2019 (Author)

Running it now. It's done: lots of lines of output. What are we looking for?
GilbN Posted March 12, 2019

You can also run:

docker ps -s

The "size" column shows the amount of data (on disk) used for the writable layer of each container. The "virtual size" is the total disk space used for the read-only image data plus the writable layer.
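Once a runaway log file has been identified, one option (a sketch, not from this thread; the container ID below is a hypothetical placeholder) is to empty the log in place rather than delete it, so a running container keeps a valid file handle:

```shell
# CID is a hypothetical placeholder; substitute the container ID
# matched in the Docker tab.
CID="<container-id>"
truncate -s 0 "/var/lib/docker/containers/$CID/$CID-json.log"
```

Truncating to zero length frees the space immediately without removing the file Docker is still writing to.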
Squid Posted March 13, 2019

Your docker.img file is trashed, and the syslog is being spammed with errors from it, hence the error FCP is throwing at you. Your recourse is to stop the docker service via Settings > Docker and delete the image from there, then re-enable the service, hit the Apps tab > Previous Apps, check off whatever you want, and hit Install Multi.