
Docker image is 200GB and now won't start! Please help.


unRaide


Hi,

 

My docker instance shut down earlier today because it ran out of space (the image was 200GB). I increased the size to 300GB and was able to get it running again. I've been trying to figure out what the issue is by browsing the forum, where I came across this thread and was tackling the suggestion of rotating logs.

 

I stopped docker, enabled log rotation, and set it to 20MB with 2 files. However, when I tried enabling docker again it wouldn't start. I tried toggling log rotation and docker off/on with no luck :( 
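For reference, rotation settings like these map onto Docker's daemon configuration roughly as follows (a sketch only; unRAID manages this through its GUI, so the file location and exact values on your system may differ):

```json
{
  "log-driver": "json-file",
  "log-opts": {
    "max-size": "20m",
    "max-file": "2"
  }
}
```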

 

Does anyone know how I can get docker up and running again?

 

Another piece of info: I also noticed that Fix Common Problems was reporting that "/var/log is getting full (currently 100 % used)", so I looked in there and saw that I have two docker log files, one of which is 128M!
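When /var/log fills up like this, a couple of stock commands show what is consuming it (a sketch assuming a standard Linux layout; run from the unRAID console):

```shell
# How full is the filesystem backing /var/log?
df -h /var/log

# The ten largest files/directories under it, biggest first
du -ah /var/log 2>/dev/null | sort -rh | head -n 10
```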

[screenshot: contents of /var/log]

It also seems that most of that file is showing the same error over and over

time="2018-08-29T04:12:44.148202018-07:00" level=error msg="Failed to log msg \"\" for logger json-file: error writing log entry: write /var/lib/docker/containers/353c70186d935e4f278b66f9eab0f981e835f0ee8a643ad33130667175d80e87/353c70186d935e4f278b66f9eab0f981e835f0ee8a643ad33130667175d80e87-json.log: no space left on device"
time="2018-08-29T04:12:44.164261825-07:00" level=error msg="Failed to log msg \"\" for logger json-file: error writing log entry: write /var/lib/docker/containers/353c70186d935e4f278b66f9eab0f981e835f0ee8a643ad33130667175d80e87/353c70186d935e4f278b66f9eab0f981e835f0ee8a643ad33130667175d80e87-json.log: no space left on device"
[...the same line repeats for the rest of the file, with only the timestamp changing...]

What's strange is that I don't have a container at that path!

 

Any ideas? Can I just delete that docker log?

 

Also attached my diagnostics.

unraid-diagnostics-20180830-2209.zip

11 hours ago, unRaide said:

My docker instance shutdown earlier today because it ran out of space (currently 200GB). I increased the size to 300GB and was able to get it running again.

 

Are you saying your docker.img file was 200GB and you increased it to 300GB? 

 

That is incredibly large for a docker image file. One or more of your dockers may be misconfigured, resulting in data being stored in the docker.img file that should instead be stored in a user share on the array, an unassigned device, or another location. You may have logging issues as well, but I suspect something that downloads files is storing them in the docker image. 

 

If logging does not appear to be the sole cause of your issue, double-check the paths in your docker configurations and the settings in your download clients, and make sure any temp, incomplete, or completed download paths point at locations on the array, an unassigned device, etc. Otherwise, those files could end up in the docker.img file.

 

Running docker ps -s may give you an idea of what is using the space in your docker image file.
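The suggestion above can be made easier to read by sorting on size. This is a sketch assuming GNU sort; the --format fields Size and Names are standard docker ps template fields, and sort -rh orders human-readable sizes largest first:

```shell
# All containers (running and stopped) with their on-disk size, largest first
docker ps -as --format '{{.Size}}\t{{.Names}}' | sort -rh
```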

 

There are several threads in these forums discussing ballooning docker image files and how to control it.

 

I have a dozen or so dockers configured.  My docker.img file is 20GB in size and less than half of that is currently being used. 


Thanks for the reply @Hoopster!

 

Yeah, there is definitely something going on with my containers... which I was trying to figure out before my docker service refused to start up.

 

My primary concern now is getting docker enabled so I can continue investigating further... I can’t survive without my dockers 😩

 

Any ideas on how to debug why my docker service won't start?

59 minutes ago, unRaide said:

Any ideas on how to debug why my docker service won't start?

Problems with the docker.img, a full cache drive (assuming that is where appdata is stored), BTRFS errors, etc. may cause docker to fail to start. Below is one thread on the subject. It contains a lot of good tips for troubleshooting docker issues, some of which you have likely already tried.

 

 

If there is no data stored in the docker.img that you really need (of course, you may not know what is stored there now), you can delete docker.img and recreate it at a smaller size such as 20GB, reinstall all your dockers as currently configured through the Previous Apps tab in Community Applications, and start troubleshooting your docker configurations and logging.
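The delete-and-recreate steps look roughly like this on unRAID (a hedged sketch: the image path shown is a common default, so verify yours under Settings -> Docker before deleting anything):

```shell
# 1. Stop the service: Settings -> Docker -> Enable Docker: No
# 2. Delete the bloated image (confirm this path on your own system first!)
rm -f /mnt/user/system/docker/docker.img
# 3. Set the new size (e.g. 20GB) in Settings -> Docker and re-enable the service
# 4. Reinstall your dockers via the Previous Apps tab in Community Applications
```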

 

If no other options work, deleting and recreating the docker.img is a good fallback and should allow docker to start again.


After recreating docker.img at only 20GB, turn off all dockers except one. Troubleshoot them one at a time. Go to the support thread for the docker you are troubleshooting and see if you can understand how it is supposed to be set up. You can go directly to the support thread of any docker by clicking its icon and selecting Support. You will get plenty of help, but you will have to be patient and work through this one application at a time.

 

Yours is the largest docker.img I have ever heard of, and it is definitely a sign of one or more of your applications being misconfigured. Almost nobody has needed more than 20GB when they had things set up right.

 

You will probably also have to reboot to clear up the /var/log.

 

Have you looked at the Docker FAQ which is pinned near the top of this same subforum?

 

 

2 weeks later...

Thanks for the replies! Per @trurl's comment, restarting the server got Docker up and running again.

 

That said, I clearly have an issue with one or more dockers and will need to spend some time working through the steps you mentioned. Before recreating the docker.img file, which as you mentioned would delete data that I may not be happy to lose, is there a docker command I can run for each docker to see how big the container is?

 

Never realized it would be so difficult to debug this issue.

1 hour ago, unRaide said:

Before recreating the docker.img file, which as you mentioned would delete data that I may not be happy to lose

I think it's unlikely there is any data in there that you want to keep, and it would be pretty difficult to get it out unless you are fairly adept at the command line and have some idea where in each docker the data might be. The working data of each of your dockers is probably stored in that docker's appdata, since that is typically how they are set up by default. Depending on the specific application, you may have downloaded data inside the docker image. Are you missing any downloads because they didn't wind up in unRAID storage like you intended?

 

Maybe not worth the trouble unless you have something specific in mind to recover.

 

Have you looked at the Docker FAQ? It has some discussion of exactly these problems with misconfigured dockers.


Thanks @trurl, super helpful!!

 

I ran the script and the total comes out to ~20GB which is clearly a lot less than the 300GB that’s allocated!!!

 

Is there a way to tell how much of the allocated 300GB Docker is actually using? Looking at the advanced Docker settings, I noticed two different figures: 23GB used and 100GB used. Is either of those the amount actually being used within the image?

 

Does that mean I can safely disable Docker, change the size to say 30GB, and re-enable it? I'm not sure I've fixed the problem, but I have deleted some older dockers and added the global log cap for all dockers as described in the Docker FAQ.
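One way to check actual usage (an assumption about the standard mount point, not something confirmed in this thread): while the docker service is running, docker.img is loop-mounted at /var/lib/docker, so plain df reports the real usage inside the image:

```shell
# Shows size, used, and available space for the mounted docker.img
df -h /var/lib/docker
```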

 

 

[screenshot: advanced Docker settings showing utilization]


Archived

This topic is now archived and is closed to further replies.
