snusnu1987 Posted October 6, 2018

I've allocated 30GB to my Docker image and have the following containers installed: binhex-delugevpn, binhex-sonarr, couchpotato, Duckdns, Krusader, letsencrypt, mariadb, nextcloud, plex, plexpy, transmission.

The output of btrfs filesystem show is:

Label: none  uuid: ###
    Total devices 1  FS bytes used 27.61GiB
    devid 1  size 30.00GiB  used 30.00GiB  path /dev/loop2

Is ~27.61GiB of usage normal for this many containers? I've previously gotten warnings that I'm running out of space inside my Docker image. I suspect I have a misconfiguration somewhere and files are being stored in a container's writable layer rather than on a volume mount. How do I track this down? I've SSH-ed into the containers, but I'm not sure what to look for.
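One way to check for exactly this kind of misconfiguration from the host (a sketch, not Unraid-specific; the container name is taken from the list above and is just an example):

```shell
# "docker ps -s" adds a SIZE column showing how much data lives in each
# container's writable layer -- data written to volume mounts is excluded,
# so a large SIZE here points at a bad path mapping.
docker ps -s --format 'table {{.Names}}\t{{.Size}}'

# For a suspect container, list files added (A) or changed (C) inside its
# writable layer; downloads landing here instead of a mount will show up.
docker diff binhex-delugevpn | head -n 40
```

Note that `docker diff` will not reveal oversized container *logs*, since Docker keeps those under /var/lib/docker/containers rather than in the writable layer.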
DZMM Posted October 6, 2018

Nope - that's not usual. This script will help you find the culprit - the functionality should be built into 6.6.2. My money is on delugevpn or transmission being mapped to an incorrect folder somewhere.
snusnu1987 Posted October 7, 2018 (Author)

Thanks DZMM, that was very helpful. The output of the script is:

Script location: /tmp/user.scripts/tmpScripts/Docker misconfiguration/script
Note that closing this window will abort the execution of this script

binhex-delugevpn  Size: 893M
binhex-sonarr     Size: 1.1G
Duckdns           Size: 288M
Krusader          Size: 1.2G
plexpy            Size: 280M
couchpotato       Size: 164M
letsencrypt       Size: 265M
mariadb           Size: 351M
nextcloud         Size: 186M
plex              Size: 400M
transmission      Size: 47M

Nothing looks unusual there, but then I checked the Docker log sizes and found:

Script location: /tmp/user.scripts/tmpScripts/viewDockerLogSize/script

14G  /var/lib/docker/containers/3b8340160f81ee5b9127d94d159fb8accd2bc8139469167b7344caa1bf3f1bff/3b8340160f81ee5b9127d94d159fb8accd2bc8139469167b7344caa1bf3f1bff-json.log
7.9G /var/lib/docker/containers/1fbd79eae2ad28e70c26f3e277c621d15bd564a809e401b7188da84226cd7b4a/1fbd79eae2ad28e70c26f3e277c621d15bd564a809e401b7188da84226cd7b4a-json.log
6.4M /var/lib/docker/containers/c8dd376d9a85b1dbfea7e0f68b37e0f5b19ec036e387c3618dbd551525aa1a0e/c8dd376d9a85b1dbfea7e0f68b37e0f5b19ec036e387c3618dbd551525aa1a0e-json.log
3.6M /var/lib/docker/containers/1ca822d4b4c2190189dd1c30a4797d4e056b3ca9a3c9d025cd7f002df1663e3e/1ca822d4b4c2190189dd1c30a4797d4e056b3ca9a3c9d025cd7f002df1663e3e-json.log
1.2M /var/lib/docker/containers/a21f6ce12a151ebff496a6f86119b4b29c1bc2f66ac87e3961de2545f679513f/a21f6ce12a151ebff496a6f86119b4b29c1bc2f66ac87e3961de2545f679513f-json.log
72K  /var/lib/docker/containers/812c20071c63a27a5ea39ebd1637163411cd4effa74cc7748427bd6c309b6614/812c20071c63a27a5ea39ebd1637163411cd4effa74cc7748427bd6c309b6614-json.log
16K  /var/lib/docker/containers/5f3870f71ee769b55fd6a0dd762d90ea88ece5aee3e2ad0cdf84cad1ed9c9661/5f3870f71ee769b55fd6a0dd762d90ea88ece5aee3e2ad0cdf84cad1ed9c9661-json.log
8.0K /var/lib/docker/containers/99c5f149bd4eaf06bb939ef610fd4dc37eb1690532c44960dc21e37ab1f23a1e/99c5f149bd4eaf06bb939ef610fd4dc37eb1690532c44960dc21e37ab1f23a1e-json.log
8.0K /var/lib/docker/containers/3fe9fda689de666020a695a9fc0fb91f6d19ce9cda978b9cbb46b7b490ffca43/3fe9fda689de666020a695a9fc0fb91f6d19ce9cda978b9cbb46b7b490ffca43-json.log
4.0K /var/lib/docker/containers/bf99c1296b092be69fbb9b07ed72d54e1955068639467b5e4d80803cf48abf78/bf99c1296b092be69fbb9b07ed72d54e1955068639467b5e4d80803cf48abf78-json.log

It looks like Sonarr and Krusader are writing their log output into the Docker image. I'm now researching how to trim these down - any ideas?
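For anyone without the viewDockerLogSize user script, the same listing can be produced with a one-liner (the path assumes Docker's default /var/lib/docker root):

```shell
# List Docker's per-container JSON logs, largest first. Each container's
# stdout/stderr is captured into a <container-id>-json.log file here.
du -ah /var/lib/docker/containers/*/*-json.log 2>/dev/null | sort -rh | head -n 10
```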
DZMM Posted October 7, 2018

Have you got log rotation turned on in the Docker settings? If not, that should fix your problem - set it to 50MB. I think you have to delete your Docker image and reinstall the containers via CA for it to take effect - not 100% certain on that, but when I've had problems in the past I've done this to make sure I have a clean 'image'.
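For context on why a reinstall can be needed: on a stock Docker host (outside Unraid's GUI, which manages this for you) the global equivalent of that setting is a daemon.json sketch like the one below. The key point is that log options are baked in when a container is created, so existing containers keep their old settings until they are recreated.

```shell
# Sketch (run as root on a generic Docker host, not Unraid-specific):
# configure global json-file log rotation in /etc/docker/daemon.json.
cat > /etc/docker/daemon.json <<'EOF'
{
  "log-driver": "json-file",
  "log-opts": { "max-size": "50m", "max-file": "1" }
}
EOF
# Restart the Docker daemon afterwards; only containers created from
# that point on pick up the new rotation settings.
```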
snusnu1987 Posted October 7, 2018 (Author)

I added the following options to Configuration -> Advanced View -> Extra Parameters:

--log-opt max-size=50m --log-opt max-file=1

This shrunk the usage to:

Label: none  uuid: ###
    Total devices 1  FS bytes used 5.22GiB
    devid 1  size 30.00GiB  used 30.00GiB  path /dev/loop2

Hope this helps anyone looking for a fix to their Docker woes - I'd been googling for a few days before I decided to start this thread.
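Rotation caps future growth but does not shrink logs that are already huge. A hedged way to reclaim that space immediately, without recreating the containers, is to empty the log files in place (truncating a live json-file log is generally safe because the file handle stays valid, but treat this as a sketch and only do it if you don't need the log contents):

```shell
# Empty all existing container JSON logs without deleting the files,
# so Docker can keep writing to the same handles.
truncate -s 0 /var/lib/docker/containers/*/*-json.log
```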
snusnu1987 Posted October 7, 2018 (Author)

10 minutes ago, DZMM said: Have you got log rotation turned on in the Docker settings? If not, that should fix your problem - set it to 50MB. I think you have to delete your Docker image and reinstall the containers via CA for it to take effect - not 100% certain on that, but when I've had problems in the past I've done this to make sure I have a clean 'image'.

Log rotation was already enabled and the Docker log maximum file size was set to 10MB. I'm not sure why it wasn't being respected, but adding those options to Extra Parameters seems to have cleared up my issue.
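One likely explanation for the ignored 10MB setting is the creation-time behaviour mentioned above: a container keeps whatever LogConfig it was created with until it is recreated. You can verify what a running container actually has with `docker inspect` (the container name here is just one from this thread):

```shell
# Print the log driver and options this container was created with;
# if "max-size" is missing here, rotation is not active for it,
# regardless of what the global setting now says.
docker inspect --format '{{json .HostConfig.LogConfig}}' binhex-sonarr
```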
DZMM Posted October 7, 2018

Yeah, sometimes containers don't seem to respect that setting - it happened to me with letsencrypt, and I had to add the options manually.
Archived
This topic is now archived and is closed to further replies.