NAS Posted September 13, 2014

Something that is not documented well yet is how to find out where your disk space is going in Docker land. Obviously this is complicated by the fact that BTRFS and Docker do their volume magic, but it should still be possible for users to determine space usage without too many hoops.

So for example:

    Label: none  uuid: a64c0b52-f437-4526-bfa0-69841c0b28ae
        Total devices 1 FS bytes used 1.62GiB
        devid    1 size 10.00GiB used 5.04GiB path /dev/loop8

    Btrfs v3.16

contains solely:

    TAG                       VIRTUAL SIZE
    gfjardim/nzbget:latest    583 MB
    gfjardim/owncloud:latest  700.9 MB
    needo/deluge:latest       493.8 MB
    needo/mariadb:latest      563.3 MB

which obviously doesn't match the 5GB above.

Then, because of the way unRAID PHP converts the raw number, this doesn't even match what Docker itself tells you for the same images (docker images):

    REPOSITORY          TAG     IMAGE ID      CREATED       VIRTUAL SIZE
    gfjardim/owncloud   latest  23af6b7b201b  9 days ago    734.9 MB
    gfjardim/nzbget     latest  5e5dc2d88026  4 weeks ago   611.3 MB
    needo/mariadb       latest  566c91aa7b1e  9 weeks ago   590.6 MB
    needo/deluge        latest  1ce59257693f  10 weeks ago  517.7 MB

If you try to drill down with (docker images -a) you get a list of every intermediate step, which is just noise to most people. If you try to (du) /var/lib/docker/btrfs/ it takes an absolute age and comes back with stuff that's not very interesting anyway:

    0       /var/lib/docker/btrfs/subvolumes/511136ea3c5a64f264b78b5433614aec563103b4d4702f3ba7d4d2698e22c158
    214M    /var/lib/docker/btrfs/subvolumes/5e66087f3ffe002664507d225d07b6929843c3f0299f5335a70c1727c8833737
    214M    /var/lib/docker/btrfs/subvolumes/4d26dd3ebc1c823cfa652280eca0230ec411fb6a742983803e49e051fe367efe
    214M    /var/lib/docker/btrfs/subvolumes/d4010efcfd86c7f59f6b83b90e9c66d4cc4d78cd2266e853b95d464ea0eb73e6
    262M    /var/lib/docker/btrfs/subvolumes/99ec81b80c55d906afd8179560fdab0ee93e32c52053816ca1d531597c1ff48f
    262M    /var/lib/docker/btrfs/subvolumes/82c9a67413361a13cd99f2140d7568d989171309d693ac74ec41179be71225df
    ...
I can go on but I think the point has been made that this info needs to be standardized, simplified and centralized for users.
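One rough sanity check (a sketch, not an exact accounting) is to total the VIRTUAL SIZE column of (docker images). Keep in mind that virtual sizes count shared layers once per image, so the total overstates real on-disk usage; it is only an upper bound.

```shell
# Sketch: sum the VIRTUAL SIZE column of `docker images` output.
# Assumes sizes are reported in MB or GB; shared layers are
# double-counted, so this is a ceiling, not actual usage.
sum_virtual_sizes() {
  awk 'NR > 1 {
         size = $(NF-1); unit = $NF
         if (unit == "GB") size *= 1024   # normalise to MB
         total += size
       }
       END { printf "~%.1f MB (shared layers double-counted)\n", total }'
}

# Usage (assumes a running Docker daemon):
#   docker images | sum_virtual_sizes
# Demo with the listing from the post above:
printf '%s\n' \
  'REPOSITORY TAG IMAGE ID CREATED VIRTUAL SIZE' \
  'gfjardim/owncloud latest 23af6b7b201b 9 days ago 734.9 MB' \
  'gfjardim/nzbget latest 5e5dc2d88026 4 weeks ago 611.3 MB' \
  'needo/mariadb latest 566c91aa7b1e 9 weeks ago 590.6 MB' \
  'needo/deluge latest 1ce59257693f 10 weeks ago 517.7 MB' \
| sum_virtual_sizes
# prints: ~2454.5 MB (shared layers double-counted)
```

Even that ceiling (~2.4GB) is well short of the 5GB the filesystem reports allocated, which is the point being made here.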
NAS Posted September 18, 2014

With only normal starts and stops, and no additional dockers added, look at the consumption of disk space in only a few days:

    Label: none  uuid: a64c0b52-f437-4526-bfa0-69841c0b28ae
        Total devices 1 FS bytes used 1.62GiB
        devid    1 size 10.00GiB used 5.04GiB path /dev/loop8

    Btrfs v3.16

    Label: none  uuid: a64c0b52-f437-4526-bfa0-69841c0b28ae
        Total devices 1 FS bytes used 3.25GiB
        devid    1 size 10.00GiB used 6.29GiB path /dev/loop8

    Btrfs v3.16
NAS Posted September 22, 2014

And after another 4 days of continuous operation, no more space has been consumed:

    Label: none  uuid: a64c0b52-f437-4526-bfa0-69841c0b28ae
        Total devices 1 FS bytes used 3.65GiB
        devid    1 size 10.00GiB used 6.29GiB path /dev/loop8

    Btrfs v3.16

Anyone have an idea how to delve deeper into where this space has gone?
NAS Posted October 6, 2014

And now:

    Label: none  uuid: a64c0b52-f437-4526-bfa0-69841c0b28ae
        Total devices 1 FS bytes used 4.36GiB
        devid    1 size 10.00GiB used 7.29GiB path /dev/loop8

    Btrfs v3.16.1

Something is consuming docker.img space during normal operation, which means eventually Docker will fail. The problem is how on earth do you track it down on a filesystem like BTRFS. Can anyone confirm this is specific to me, or could it be a creeping space bug?

Update. Another command, (btrfs fi df /var/lib/docker):

    Data, single: total=4.01GiB, used=3.04GiB
    System, DUP: total=8.00MiB, used=16.00KiB
    System, single: total=4.00MiB, used=0.00
    Metadata, DUP: total=1.62GiB, used=1.33GiB
    Metadata, single: total=8.00MiB, used=0.00
    GlobalReserve, single: total=96.00MiB, used=0.00

Why is it that with all things Docker and BTRFS, nothing ever matches up space-wise? lol
ljm42 Posted October 7, 2014

Interesting. I don't think the space reported on the docker page is accurate. I noticed my system said "size 10.00GiB used 10.00GiB" a few days ago, yet I have done a considerable amount of work on plexWatch the last few days and didn't have any problems. I also just upgraded to the latest Plex docker a few minutes ago (which is the first to switch to phusion/baseimage:0.9.15) and didn't have any space issues with that either. So I'm not really sure what "size 10.00GiB used 10.00GiB" means, but it doesn't seem to be all that important.

As of right now, my "docker service settings" says:

    Total devices 1 FS bytes used 5.12GiB
    devid 1 size 10.00GiB used 10.00GiB path /dev/loop8

and according to "df":

    /dev/loop8  10485760  5413012  3638252  60%  /var/lib/docker

and btrfs fi df:

    root@Tower:~# btrfs fi df /var/lib/docker
    Data, single: total=7.97GiB, used=4.50GiB
    System, DUP: total=8.00MiB, used=16.00KiB
    System, single: total=4.00MiB, used=0.00
    Metadata, DUP: total=1.00GiB, used=337.55MiB
    Metadata, single: total=8.00MiB, used=0.00
    GlobalReserve, single: total=112.00MiB, used=0.00
NAS Posted October 7, 2014

Similar results here. Let me replicate your commands just for completeness:

    Label: none  uuid: a64c0b52-f437-4526-bfa0-69841c0b28ae
        Total devices 1 FS bytes used 4.46GiB
        devid    1 size 10.00GiB used 7.29GiB path /dev/loop8

    Btrfs v3.16.1

    # df -h /var/lib/docker/
    Filesystem      Size  Used  Avail  Use%  Mounted on
    /dev/loop8       10G  5.8G   3.6G   62%  /var/lib/docker

    btrfs fi df /var/lib/docker
    Data, single: total=4.01GiB, used=3.13GiB
    System, DUP: total=8.00MiB, used=16.00KiB
    System, single: total=4.00MiB, used=0.00
    Metadata, DUP: total=1.62GiB, used=1.33GiB
    Metadata, single: total=8.00MiB, used=0.00
    GlobalReserve, single: total=112.00MiB, used=0.00

So on the face of it they are all different, although if you add up the figures from the "btrfs fi df /var/lib/docker" command it comes "close" to the "df -h /var/lib/docker/" figure. For sure, the number presented in the unRAID GUI means something other than the real free space users will assume it means.
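There is a plausible way to reconcile these numbers (an assumption, not something confirmed in this thread): df appears to count DUP profiles at their raw cost, i.e. both copies of metadata, while "FS bytes used" counts everything once. On that assumption, df's used figure is roughly Data used + 2 × Metadata(DUP) used. A sketch of the arithmetic, using the output above:

```shell
# Sketch (assumption: df charges DUP metadata twice, `btrfs fi show`
# "FS bytes used" charges it once). Assumes Data and Metadata figures
# are both in GiB; System and GlobalReserve are ignored as negligible.
btrfs_df_used() {
  awk -F'[=,]' '
    /^Data, single/  { data = $5 + 0 }   # used= value, GiB assumed
    /^Metadata, DUP/ { meta = $5 + 0 }
    END { printf "df-style used: ~%.2fGiB  fi-show FS bytes used: ~%.2fGiB\n", data + 2 * meta, data + meta }'
}

# Usage: btrfs fi df /var/lib/docker | btrfs_df_used
# Demo with the figures from the post above:
printf '%s\n' \
  'Data, single: total=4.01GiB, used=3.13GiB' \
  'System, DUP: total=8.00MiB, used=16.00KiB' \
  'Metadata, DUP: total=1.62GiB, used=1.33GiB' \
| btrfs_df_used
# prints: df-style used: ~5.79GiB  fi-show FS bytes used: ~4.46GiB
```

The ~5.79GiB lines up with the 5.8G from df -h, and 4.46GiB matches the "FS bytes used" figure, which at least suggests the tools agree with each other even if no two of them print the same number.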
gshlomi Posted February 2, 2015

Hi. Anyone figured it out? It seems one of my dockers is eating up all the free space. I've already set my docker image size to 50GB and it's all filled up :-(

Thanks
dalben Posted February 2, 2015

I've now killed my docker image and recreated everything from scratch (no real drama other than waiting for the containers to download) a couple of times to get everything back to a working state.
gshlomi Posted February 2, 2015

I'm using Docker on a ReiserFS-formatted cache drive. Is that a problem? Doing a "docker images -a" I receive:

    REPOSITORY              TAG      IMAGE ID      CREATED        VIRTUAL SIZE
    gfjardim/crashplan      latest   5b583eacd453  2 weeks ago    628 MB
    <none>                  <none>   eb077b14f199  2 weeks ago    289 MB
    <none>                  <none>   2d1823e6de38  2 weeks ago    289 MB
    <none>                  <none>   52eb55d59b0c  2 weeks ago    289 MB
    <none>                  <none>   d78cf1e9c99a  2 weeks ago    289 MB
    <none>                  <none>   3005bbb94ca8  2 weeks ago    289 MB
    <none>                  <none>   96600b849981  2 weeks ago    289 MB
    <none>                  <none>   7e366a533777  2 weeks ago    289 MB
    <none>                  <none>   53ca977e8f12  2 weeks ago    289 MB
    <none>                  <none>   8db050c227fd  2 weeks ago    289 MB
    <none>                  <none>   231c359ab85c  2 weeks ago    289 MB
    <none>                  <none>   6ef425b64862  2 weeks ago    289 MB
    <none>                  <none>   697a0bf1007a  2 weeks ago    289 MB
    <none>                  <none>   0475aad443c3  2 weeks ago    289 MB
    <none>                  <none>   f3bd11b950d3  2 weeks ago    289 MB
    <none>                  <none>   baf02df43d4a  3 weeks ago    17.79 MB
    google/cadvisor         0.8.0    4cde3eeb4b8d  3 weeks ago    17.79 MB
    google/cadvisor         beta     4cde3eeb4b8d  3 weeks ago    17.79 MB
    google/cadvisor         canary   4cde3eeb4b8d  3 weeks ago    17.79 MB
    google/cadvisor         latest   4cde3eeb4b8d  3 weeks ago    17.79 MB
    <none>                  <none>   9d117f7fa32c  3 weeks ago    17.79 MB
    gfjardim/btsync         latest   b8f9c682b5d4  3 weeks ago    298.3 MB
    <none>                  <none>   b50331c3cfad  3 weeks ago    289 MB
    <none>                  <none>   a59792bcb13a  3 weeks ago    289 MB
    <none>                  <none>   16e9940ebd8a  3 weeks ago    289 MB
    <none>                  <none>   2be6a44cf02d  3 weeks ago    289 MB
    <none>                  <none>   d1c8f163f1af  3 weeks ago    289 MB
    <none>                  <none>   b79efc5ee28b  3 weeks ago    289 MB
    <none>                  <none>   c14748741c44  3 weeks ago    289 MB
    <none>                  <none>   1e709d386e14  3 weeks ago    289 MB
    <none>                  <none>   5b25a1c620a8  3 weeks ago    289 MB
    <none>                  <none>   97f68553b69e  3 weeks ago    289 MB
    <none>                  <none>   deddbac9616c  3 weeks ago    289 MB
    <none>                  <none>   19df77b26991  3 weeks ago    289 MB
    <none>                  <none>   3e072bc7e323  3 weeks ago    289 MB
    <none>                  <none>   b45e3de9adfb  3 weeks ago    289 MB
    <none>                  <none>   ef69e383a1cb  3 weeks ago    289 MB
    <none>                  <none>   ce4b6caebf0b  3 weeks ago    289 MB
    <none>                  <none>   8e01015dfabc  3 weeks ago    289 MB
    <none>                  <none>   4705567c2bcc  3 weeks ago    289 MB
    needo/couchpotato       latest   840e64c45261  3 weeks ago    333.3 MB
    <none>                  <none>   fed5ac31f765  3 weeks ago    289 MB
    <none>                  <none>   4cd1910f74b6  3 weeks ago    289 MB
    <none>                  <none>   0d557a08f1c0  3 weeks ago    289 MB
    <none>                  <none>   5096dc28f11d  3 weeks ago    289 MB
    <none>                  <none>   7aa7dfd2edbf  3 weeks ago    289 MB
    <none>                  <none>   7d37bfd3850d  3 weeks ago    289 MB
    <none>                  <none>   5e0de68daac2  3 weeks ago    289 MB
    <none>                  <none>   6a0987090ad7  3 weeks ago    289 MB
    <none>                  <none>   7aae71dc982d  3 weeks ago    289 MB
    <none>                  <none>   b28876c71dff  3 weeks ago    289 MB
    <none>                  <none>   78a76b54258a  3 weeks ago    289 MB
    <none>                  <none>   e3006539cbbf  3 weeks ago    289 MB
    <none>                  <none>   5a61cce1175d  3 weeks ago    289 MB
    needo/sabnzbd           latest   5b0d14faab57  6 weeks ago    665.9 MB
    <none>                  <none>   49e00d99e75f  6 weeks ago    665.9 MB
    <none>                  <none>   832419b2b087  6 weeks ago    665.9 MB
    <none>                  <none>   df02e34b2d6d  6 weeks ago    665.9 MB
    <none>                  <none>   8ff24c2cc5fe  6 weeks ago    665.9 MB
    <none>                  <none>   f0de1bcba7f7  6 weeks ago    665.9 MB
    <none>                  <none>   a80c8a83add2  6 weeks ago    665.9 MB
    <none>                  <none>   b2f621e88952  6 weeks ago    665.3 MB
    <none>                  <none>   214019d8f2f2  6 weeks ago    665.1 MB
    <none>                  <none>   0e5255a1258b  6 weeks ago    664.1 MB
    <none>                  <none>   2920a001f842  6 weeks ago    506.4 MB
    <none>                  <none>   471c01d4b922  6 weeks ago    391.8 MB
    <none>                  <none>   490fa3711e7a  6 weeks ago    391.8 MB
    <none>                  <none>   fad9a8bc4d01  6 weeks ago    391.8 MB
    <none>                  <none>   ea29d642fba0  6 weeks ago    391.8 MB
    <none>                  <none>   9f61648c9e93  6 weeks ago    391 MB
    <none>                  <none>   834f770169f8  6 weeks ago    391 MB
    <none>                  <none>   63387162a2b0  6 weeks ago    391 MB
    <none>                  <none>   96a42a5c3a0b  6 weeks ago    391 MB
    <none>                  <none>   5cd8f24b1a69  6 weeks ago    391 MB
    <none>                  <none>   6c021fa80814  6 weeks ago    391 MB
    <none>                  <none>   06ba2b469f02  10 weeks ago   4.808 MB
    <none>                  <none>   7acf13620725  3 months ago   4.808 MB
    <none>                  <none>   6f114d3139e3  3 months ago   4.808 MB
    <none>                  <none>   7fff0c6f0b8d  3 months ago   4.347 MB
    <none>                  <none>   8db8f013bfca  3 months ago   4.278 MB
    <none>                  <none>   21082221cb6e  3 months ago   4.269 MB
    <none>                  <none>   bbd692fe2ca1  3 months ago   4.269 MB
    <none>                  <none>   e2fb46397934  3 months ago   0 B
    <none>                  <none>   015fb409be0d  3 months ago   4.268 MB
    <none>                  <none>   cf39b476aeec  4 months ago   289 MB
    <none>                  <none>   64463062ff22  4 months ago   289 MB
    <none>                  <none>   d5199f68b2fe  4 months ago   194.8 MB
    <none>                  <none>   9d9561782335  4 months ago   194.8 MB
    <none>                  <none>   bad562ead0dc  4 months ago   194.8 MB
    <none>                  <none>   c27763e1f3e5  4 months ago   194.8 MB
    <none>                  <none>   6b4e8a7373fe  4 months ago   194.8 MB
    <none>                  <none>   c900195dcbf3  4 months ago   194.8 MB
    <none>                  <none>   8b1c48305638  4 months ago   192.8 MB
    <none>                  <none>   25c4824a5268  4 months ago   192.7 MB
    <none>                  <none>   67b66f26d423  4 months ago   192.7 MB
    <none>                  <none>   b18d0a2076a1  4 months ago   192.6 MB
    gfjardim/transmission   latest   5a0c5c6db90d  4 months ago   476.1 MB
    <none>                  <none>   c45425691914  4 months ago   476.1 MB
    <none>                  <none>   8c93a94f2126  4 months ago   476.1 MB
    <none>                  <none>   d31290d5f24e  4 months ago   476.1 MB
    <none>                  <none>   cf0e9da91e7e  4 months ago   476.1 MB
    <none>                  <none>   3f5cf35acef8  4 months ago   476.1 MB
    <none>                  <none>   25fee1d4386c  4 months ago   476.1 MB
    <none>                  <none>   78f33a8b8d2a  4 months ago   476.1 MB
    <none>                  <none>   54a97efbd006  4 months ago   476.1 MB
    <none>                  <none>   1ecd55d25782  4 months ago   476.1 MB
    <none>                  <none>   9df2d56ec00b  4 months ago   476.1 MB
    <none>                  <none>   1b6d9190a596  4 months ago   476.1 MB
    <none>                  <none>   271bb41524cf  4 months ago   470.8 MB
    <none>                  <none>   c483035d1492  4 months ago   391.8 MB
    <none>                  <none>   483f896706ae  4 months ago   391.8 MB
    <none>                  <none>   576129a8323b  4 months ago   391.1 MB
    <none>                  <none>   4600a672c28f  4 months ago   391.1 MB
    <none>                  <none>   3200f75e724a  4 months ago   391.1 MB
    <none>                  <none>   e8be2ebbb88a  4 months ago   391 MB
    <none>                  <none>   47f337d7d254  4 months ago   391 MB
    <none>                  <none>   cc1f04df8886  4 months ago   391 MB
    <none>                  <none>   6ed767dec3d4  4 months ago   391 MB
    <none>                  <none>   df6b87e75886  4 months ago   391 MB
    <none>                  <none>   b4985d085e9c  4 months ago   391 MB
    needo/mariadb           latest   566c91aa7b1e  6 months ago   590.6 MB
    <none>                  <none>   fede856341ad  6 months ago   590.6 MB
    <none>                  <none>   864b9f0054e7  6 months ago   590.6 MB
    <none>                  <none>   23c87de41786  6 months ago   590.6 MB
    <none>                  <none>   5d7fe409bfe4  6 months ago   590.6 MB
    <none>                  <none>   da7bfdf0e1b6  6 months ago   590.6 MB
    <none>                  <none>   6470f811e73f  6 months ago   590.6 MB
    <none>                  <none>   594f8b77e732  6 months ago   590.6 MB
    <none>                  <none>   9243c9adab8b  6 months ago   590.6 MB
    <none>                  <none>   ee0b1941d320  6 months ago   590.6 MB
    <none>                  <none>   4fe585a340e7  6 months ago   590.6 MB
    <none>                  <none>   31bfb552afaf  6 months ago   468.4 MB
    <none>                  <none>   56b9543fe345  6 months ago   391 MB
    <none>                  <none>   776eac27ad9d  6 months ago   391 MB
    <none>                  <none>   4132f6005419  6 months ago   391 MB
    <none>                  <none>   7418ba5f7369  6 months ago   391 MB
    <none>                  <none>   30867c4ff0fd  6 months ago   391 MB
    needo/sickbeard         latest   cbe714aa54d9  7 months ago   526.1 MB
    <none>                  <none>   9e69e640405d  7 months ago   526.1 MB
    <none>                  <none>   16801c812d8b  7 months ago   526.1 MB
    <none>                  <none>   0153b63264d3  7 months ago   526.1 MB
    <none>                  <none>   675a9968eae8  7 months ago   526.1 MB
    <none>                  <none>   096105a05253  7 months ago   526.1 MB
    <none>                  <none>   b34f41cb7a4c  7 months ago   526.1 MB
    <none>                  <none>   164579f5b74f  7 months ago   526.1 MB
    <none>                  <none>   a3f00ed0390e  7 months ago   526.1 MB
    <none>                  <none>   bdaca3438d9b  7 months ago   526.1 MB
    <none>                  <none>   1824cf6bdce5  7 months ago   526.1 MB
    <none>                  <none>   fbe3989c3547  7 months ago   526.1 MB
    <none>                  <none>   36882b0d8d0c  7 months ago   519.7 MB
    <none>                  <none>   4a7279e87c7a  7 months ago   517.2 MB
    <none>                  <none>   3eaff2bf39aa  7 months ago   517.2 MB
    <none>                  <none>   3fba2009e2c2  7 months ago   502.3 MB
    <none>                  <none>   84e3c7159595  7 months ago   391.8 MB
    <none>                  <none>   b8b792c22c13  7 months ago   391.8 MB
    <none>                  <none>   7168855fb6f8  7 months ago   391 MB
    <none>                  <none>   25d4dd3357bf  7 months ago   391 MB
    <none>                  <none>   742e5c8f30b0  7 months ago   391 MB
    <none>                  <none>   41869ebdb5c0  7 months ago   391 MB
    <none>                  <none>   d06d5dbe65ca  7 months ago   391 MB
    <none>                  <none>   f3a6e653deaf  7 months ago   391 MB
    <none>                  <none>   dabfc8a44cb5  7 months ago   391 MB
    <none>                  <none>   37fd06241067  7 months ago   391 MB
    <none>                  <none>   68bde466ffab  7 months ago   266 MB
    <none>                  <none>   82c9a6741336  8 months ago   266 MB
    <none>                  <none>   5cbfee875b7b  8 months ago   266 MB
    <none>                  <none>   afc7fc2f17c1  8 months ago   266 MB
    <none>                  <none>   99ec81b80c55  9 months ago   266 MB
    <none>                  <none>   4d26dd3ebc1c  9 months ago   192.7 MB
    <none>                  <none>   d4010efcfd86  9 months ago   192.7 MB
    <none>                  <none>   5e66087f3ffe  9 months ago   192.5 MB
    <none>                  <none>   511136ea3c5a  19 months ago  0 B
saarg Posted February 2, 2015

It looks like you have many copies of the same containers. If you look at the time of creation and the size, you see that it is almost the same for many of them. The easiest way to solve this is to delete the docker.img and start over. There should not be any containers with repository and tag set to none, as is the case in your situation.
BRiT Posted February 2, 2015

Quote: It looks like you have many copies of the same containers. If you look at the time of creation and the size, you see that it is almost the same for many of them. The easiest way to solve this is to delete the docker.img and start over. There should not be any containers with repository and tag set to none, as is the case in your situation.

That's not true, not even in the least. Those are all the related layers, a result of the copy-on-write aspect of Docker. It is perfectly normal.

This is just a pain of using Docker in unRAID. Each day the dockers grow ever so slightly in size, so you end up doing a monthly dance of nuking it all and restarting with a fresh docker.img. This makes no difference to what the filesystem of the cache drive is, since the dockers are stored inside a BTRFS image.
gshlomi Posted February 2, 2015

Well, at least it's not just me ;-) Anyway, did you post it in the beta thread, so it'll get fixed?
BRiT Posted February 2, 2015

Quote: Well, at least it's not just me ;-) Anyway, did you post it in the beta thread, so it'll get fixed?

It's well known, but I did post more about this in the Docker forum: http://lime-technology.com/forum/index.php?topic=37407.0