
RAM peaks at 97% = dockers freeze



Hi all,

 

I have a recurring issue with my Unraid server, which runs on my Dell T20.

 

Every so often the dockers become unavailable and their web applications stop responding. Looking at the dashboard, RAM is 97% used (it is around 50% at boot) and all CPUs are at 100%.

 

The only solution is to reboot, which works until the next freeze.

 

Attached are my diagnostics, taken a few minutes ago after the last reboot.

 

Can someone please help me? 

joshua-diagnostics-20200509-2208.zip

Link to comment

My guess is you have one or more of your dockers misconfigured.

 

One reason I say that is that you have given the docker image 50G, when 20G should be more than enough, and you have already used 23G of the 50G you allocated. That most likely means an application is writing to a path that isn't mapped. An unmapped path is a path inside the docker image.

 

However, that won't actually cause you to run out of RAM. It will just cause your docker image to fill up. Making it larger than 20G will not fix that problem; it will just take longer to fill.
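
If you want to see which containers are doing that writing, something like this from the Unraid console shows per-container usage:

docker ps -s            # the SIZE column is data written inside each running container, i.e. inside the docker image
docker system df -v     # breaks down image, container and volume usage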

 

What will fill RAM though is a docker mapping that isn't to actual storage. Any host path that isn't to the user shares or the disks is a path in RAM.
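
If in doubt about any host path, you can check it from the console. For example:

df -h /mnt/user/appdata   # a user share - real storage
df -h /tmp                # part of the RAM-backed root filesystem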

 

So, you probably have both situations.

 

Also, your appdata has files on the array even though it is cache-prefer. Not sure how you got them there unless you had changed the cache setting for appdata, or else you filled cache and appdata had to overflow to the array. Those probably can't be moved back to cache while dockers are running because mover can't move open files.
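
You can see what ended up where with something like:

ls -d /mnt/disk*/appdata/*   # appdata folders that have landed on array disks
ls -d /mnt/cache/appdata/*   # appdata folders on the cache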

 

We will probably have to tear down your dockers and figure out what you have done with them.

 

Which dockers do you run?

Link to comment

Hi, thank you for the quick response.

 

Below is the list of all my dockers (with their on/off status) and all my mapped paths (container path -> host path):

 

 

add_6a3d001f_assistant_relay
started     up-to-date    default    :3000/TCP -> :3000
/data -> /mnt/user/appdata/hassio/adds/data/6a3d001f_assistant_relay

add_15ef4d2f_esphome
started     up-to-date    host
/data -> /mnt/user/appdata/hassio/adds/data/15ef4d2f_esphome
/cfig -> /mnt/user/appdata/hassio/homeassistant
/ssl -> /mnt/user/appdata/hassio/ssl

add_243ffc37_valetudomapper
started     up-to-date    default
/data -> /mnt/user/appdata/hassio/adds/data/243ffc37_valetudomapper
/ssl -> /mnt/user/appdata/hassio/ssl

add_a0d7b954_appdaem
started     up-to-date    default    :5050/TCP -> :5050
/data -> /mnt/user/appdata/hassio/adds/data/a0d7b954_appdaem
/cfig -> /mnt/user/appdata/hassio/homeassistant
/ssl -> /mnt/user/appdata/hassio/ssl
/share -> /mnt/user/appdata/hassio/share

add_a0d7b954_grafana
started     up-to-date    default
/data -> /mnt/user/appdata/hassio/adds/data/a0d7b954_grafana
/ssl -> /mnt/user/appdata/hassio/ssl

add_a0d7b954_influxdb
started     up-to-date    default    :8086/TCP -> :8086
/data -> /mnt/user/appdata/hassio/adds/data/a0d7b954_influxdb
/ssl -> /mnt/user/appdata/hassio/ssl
/share -> /mnt/user/appdata/hassio/share

add_a0d7b954_logviewer
started     up-to-date    default
/data -> /mnt/user/appdata/hassio/adds/data/a0d7b954_logviewer
/cfig -> /mnt/user/appdata/hassio/homeassistant
/ssl -> /mnt/user/appdata/hassio/ssl
/share -> /mnt/user/appdata/hassio/share

add_a0d7b954_motieye
started     up-to-date    host
/data -> /mnt/user/appdata/hassio/adds/data/a0d7b954_motieye
/ssl -> /mnt/user/appdata/hassio/ssl
/share -> /mnt/user/appdata/hassio/share

add_a0d7b954_nodered
stopped     up-to-date    host
/data -> /mnt/user/appdata/hassio/adds/data/a0d7b954_nodered
/cfig -> /mnt/user/appdata/hassio/homeassistant
/ssl -> /mnt/user/appdata/hassio/ssl
/share -> /mnt/user/appdata/hassio/share

add_a0d7b954_vscode
started     up-to-date    default
/data -> /mnt/user/appdata/hassio/adds/data/a0d7b954_vscode
/cfig -> /mnt/user/appdata/hassio/homeassistant
/ssl -> /mnt/user/appdata/hassio/ssl
/adds -> /mnt/user/appdata/hassio/adds/local
/backup -> /mnt/user/appdata/hassio/backup
/share -> /mnt/user/appdata/hassio/share

add_core_decz
started     up-to-date    default    :5900/TCP -> :5900
/data -> /mnt/user/appdata/hassio/adds/data/core_decz
/lib/modules -> /lib/modules

add_core_mosquitto
started     up-to-date    default
:1883/TCP -> :1883
:1884/TCP -> :1884
:8883/TCP -> :8883
:8884/TCP -> :8884
/data -> /mnt/user/appdata/hassio/adds/data/core_mosquitto
/ssl -> /mnt/user/appdata/hassio/ssl
/share -> /mnt/user/appdata/hassio/share

bazarr
started     up-to-date    bridge    172.17.0.2:6767/TCP -> 192.168.2.100:6767
/movies -> /mnt/user/movies/
/tv -> /mnt/user/tv/
/cfig -> /mnt/user/appdata/bazarr

bitwardenrs
started     up-to-date    proxynet    172.18.0.2:80/TCP -> 192.168.2.100:8086
/data -> /mnt/user/appdata/bitwarden

deluge
started     up-to-date    host
192.168.2.100:58846/TCP -> 192.168.2.100:58846
192.168.2.100:58946/TCP -> 192.168.2.100:58946
192.168.2.100:58946/UDP -> 192.168.2.100:58946
192.168.2.100:8112/TCP -> 192.168.2.100:8112
/downloads -> /mnt/user/downloads/Temp/
/File_Torrent -> /mnt/user/downloads/File_Torrent
/Completed -> /mnt/user/downloads/Completed/
/copied -> /mnt/user/downloads/Copied/
/cfig -> /mnt/user/appdata/deluge

emby
stopped     up-to-date    host
0.0.0.0:8096/TCP -> 0.0.0.0:8096
0.0.0.0:8920/TCP -> 0.0.0.0:8920
/movies -> /mnt/user/movies/
/tv -> /mnt/user/tv/
/music -> /mnt/user/music/
/photo -> /mnt/user/photos/Marco - Irene - Adele/
/adele -> /mnt/user/documents/Adele/
/cfig -> /mnt/user/appdata/emby

FlexTV
stopped     up-to-date    bridge    0.0.0.0:80/TCP -> 0.0.0.0:5666
/cfig -> /mnt/user/appdata/FlexTV

freshrss
started     up-to-date    proxynet    172.18.0.3:80/TCP -> 192.168.2.100:8044
/cfig -> /mnt/user/appdata/freshrss

full-text-rss
started     apply update    default    :80/TCP -> :80

ha-dockerm
started     up-to-date    bridge    172.17.0.4:8126/TCP -> 192.168.2.100:8126
/var/run/docker.sock -> /var/run/docker.sock
/cfig -> /mnt/user/appdata/ha-dockerm

hassio_audio
started     up-to-date    default
/data -> /mnt/user/appdata/hassio/audio
/run/dbus -> /run/dbus

hassio_cli
started     up-to-date    default

hassio_dns
started     up-to-date    default
/cfig -> /mnt/user/appdata/hassio/dns

hassio_multicast
started     up-to-date    host

hassio_supervisor
started     up-to-date    proxynet
/data -> /mnt/user/appdata/hassio/
/var/run/docker.sock -> /var/run/docker.sock
/var/run/dbus -> /var/run/dbus

homeassistant
started     up-to-date    host
/cfig -> /mnt/user/appdata/hassio/homeassistant
/ssl -> /mnt/user/appdata/hassio/ssl
/share -> /mnt/user/appdata/hassio/share
/etc/pulse/client.cf -> /mnt/user/appdata/hassio/tmp/homeassistant_pulse
/run/audio -> /mnt/user/appdata/hassio/audio/external
/etc/asound.cf -> /mnt/user/appdata/hassio/audio/asound

hydra2
started     up-to-date    bridge    172.17.0.5:5076/TCP -> 192.168.2.100:5076
/downloads -> /mnt/user/downloads/Completed/
/cfig -> /mnt/user/appdata/hydra2

jackett
started     up-to-date    bridge    172.17.0.6:9117/TCP -> 192.168.2.100:9117
/downloads -> /mnt/user/downloads/
/cfig -> /mnt/user/appdata/jackett

jellyfin
stopped     up-to-date    host
0.0.0.0:8096/TCP -> 0.0.0.0:8096
0.0.0.0:8920/TCP -> 0.0.0.0:8920
/music -> /mnt/user/music/
/cfig -> /mnt/user/appdata/jellyfin
/movies -> /mnt/user/movies/
/tv -> /mnt/user/tv/

letsencrypt
started     up-to-date    proxynet
172.18.0.5:443/TCP -> 192.168.2.100:1443
172.18.0.5:80/TCP -> 192.168.2.100:180
/cfig -> /mnt/user/appdata/letsencrypt

lidarr
started     up-to-date    proxynet    172.18.0.6:8686/TCP -> 192.168.2.100:8686
/downloads -> /mnt/user/downloads/Completed/
/music -> /mnt/user/music/_SISTEMATA/
/cfig -> /mnt/user/appdata/lidarr

mariadb
started     up-to-date    bridge    172.17.0.7:3306/TCP -> 192.168.2.100:3306
/cfig -> /mnt/user/appdata/mariadb

nextcloud
started     up-to-date    bridge    172.17.0.8:443/TCP -> 192.168.2.100:444
/data -> /mnt/user/nextcloud/
/cfig -> /mnt/user/appdata/nextcloud

plex
started     up-to-date    host
192.168.2.100:1900/UDP -> 192.168.2.100:1900
192.168.2.100:3005/TCP -> 192.168.2.100:3005
192.168.2.100:32400/TCP -> 192.168.2.100:32400
192.168.2.100:32410/UDP -> 192.168.2.100:32410
192.168.2.100:32412/UDP -> 192.168.2.100:32412
192.168.2.100:32413/UDP -> 192.168.2.100:32413
192.168.2.100:32414/UDP -> 192.168.2.100:32414
192.168.2.100:32469/TCP -> 192.168.2.100:32469
192.168.2.100:8324/TCP -> 192.168.2.100:8324
/photos -> /mnt/user/photos/
/adele -> /mnt/user/documents/Adele/
/cfig -> /mnt/user/appdata/plex
/movies -> /mnt/user/movies/
/tv -> /mnt/user/tv/
/music -> /mnt/user/music/
/transcode -> /tmp

pptag_pptag_1
started     up-to-date    pptag_default
/Photos -> /mnt/user/photos
/app/pptag/cfig.py -> /mnt/user/appdata/pptag/cfig.py

projectsend
stopped     up-to-date    bridge    0.0.0.0:80/TCP -> 0.0.0.0:8070
/data -> /mnt/user/appdata/projectsend
/cfig -> /mnt/user/appdata/projectsend

radarr
started     up-to-date    proxynet    172.18.0.7:7878/TCP -> 192.168.2.100:7878
/downloads -> /mnt/user/downloads/
/movies -> /mnt/user/movies/
/cfig -> /mnt/user/appdata/radarr

radarr-sma
stopped     up-to-date    bridge    0.0.0.0:7878/TCP -> 0.0.0.0:7879
/cfig -> /mnt/user/appdata/radarr-sma
/storage -> /mnt/user/movies/2001 Odissea nello spazio (1968)/

sonarr
started     up-to-date    proxynet    172.18.0.8:8989/TCP -> 192.168.2.100:8989
/dev/rtc -> /dev/rtc
/tv -> /mnt/user/tv/
/downloads -> /mnt/user/downloads/
/cfig -> /mnt/user/appdata/sarr

TasmoAdmin
started     up-to-date    bridge    172.17.0.9:80/TCP -> 192.168.2.100:9541
/data -> /mnt/user/appdata/tasmoadmin/

tautulli
started     up-to-date    bridge    172.17.0.10:8181/TCP -> 192.168.2.100:8181
/logs -> /mnt/user/appdata/tautulli/
/cfig -> /mnt/user/appdata/tautulli

tdarr_aio
stopped     up-to-date    bridge    0.0.0.0:8265/TCP -> 0.0.0.0:8265
/home/Tdarr/Documents -> /mnt/user/appdata/tdarr
/home/Tdarr/Media -> /mnt/user/downloads/Copied/
/var/lib/mgodb -> /mnt/user/appdata/tdarr-db
/home/Tdarr/cache -> /mnt/cache/tdarr-cache
/home/Tdarr/De -> /mnt/user/downloads/Post_Processing/

unifi-controller
started     up-to-date    host
192.168.2.100:8080/TCP -> 192.168.2.100:8080
192.168.2.100:8081/TCP -> 192.168.2.100:8081
192.168.2.100:8443/TCP -> 192.168.2.100:8443
192.168.2.100:8843/TCP -> 192.168.2.100:8843
192.168.2.100:8880/TCP -> 192.168.2.100:8880
/cfig -> /mnt/user/appdata/unifi-ctroller

unmanic
stopped     up-to-date    bridge    0.0.0.0:8888/TCP -> 0.0.0.0:8888
/cfig -> /mnt/user/appdata/unmanic
/library/movies -> /mnt/user/movies/Aquaman (2018)/
/tmp/unmanic -> /mnt/cache/unmanic
/library/tv -> /mnt/user/Media/TV

Unraid-API
started     up-to-date    bridge    172.17.0.11:80/TCP -> 192.168.2.100:3005
/app/cfig -> /mnt/user/appdata/Unraid-API

video2telegram
started     up-to-date    bridge
/video -> /mnt/user/appdata/hassio/share/motieye

 

 

I hope it will be enough!

Link to comment

Way too difficult to read. If someone else gave something like that to you, would you even try to make sense of it? Maybe a screenshot would be better, unless you want to go to the trouble of reformatting it into clear columns.

 

Really, all I wanted was a list without all those details for each one. A screenshot of the Docker section on the Dashboard page would do. I thought we could get into the other details as needed. And the details you dumped there will not really be sufficient for the problem of filling the docker image, since that also depends on settings within each application.

 

Do you understand docker mappings of host paths to container paths? Do you know Linux is case sensitive? Do you know to use absolute and not relative paths to refer to the container paths within the application? Do you know why I'm asking these questions and what their importance is?

Link to comment

Sorry, it was more readable in Notepad. I tried to share a simple screenshot earlier, but it was difficult to fit all those docker containers in a single image. Below is my best result.

 

(screenshot of the Docker containers attached)

 

 

It's not the first time I've used docker, so I have some experience. But I'm just an amateur with a passion for IT, so I certainly have a lot to learn.

I know the link between host and container paths, but I was not aware of case sensitivity (I will check my containers now, looking for an unmapped path). Maybe it is plex using RAM to transcode, but I rarely transcode in plex (I always prefer direct play).

Link to comment
3 minutes ago, Jokerigno said:

not aware of case sensitivity

If an application writes to a path that doesn't exactly match the container path, including upper/lower case, or writes to a path that isn't absolute, then that path is inside the docker image. This is how you can fill the docker image.
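
As a hypothetical example, with a container path /config mapped to host storage, only the exact container path leaves the image:

docker exec someapp sh -c 'echo ok > /config/test.txt'                       # exact match - lands on the mapped host path
docker exec someapp sh -c 'mkdir -p /Config && echo ok > /Config/test.txt'   # capital C - no mapping, written inside the docker image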

 

If you map a host path that isn't an actual disk or user share, then that host path is in RAM just like the rest of the Unraid OS. This is how you can fill RAM with docker.
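
In docker run terms, the difference looks like this (illustrative mappings, not necessarily yours):

-v /mnt/user/downloads:/downloads   # host path is a user share - writes go to the array/cache
-v /tmp/transcode:/transcode        # host path is under /tmp - writes go to RAM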

 

You certainly have a lot of dockers. Do you really need all of them, or are you just experimenting? Are you really using both emby and plex, for example? Many of those I have no experience with and don't even know what they are for. We are probably not going to get into the details of many of them, so it would be best if you can figure this out on your own.

Link to comment

Well, most of them (at the top) are containers related to my Home Assistant instance.

 

Right now I'm experimenting. I have issues with plex and I'm testing emby to see if it has better performance. Other containers like unmanic and tdarr are involved in the same experiment, but they are normally stopped, so I will start with the other containers and check that every path is correct.

Link to comment
  • 2 weeks later...

Ok, I may have found one docker with an issue.

 

This is the container https://hub.docker.com/r/heussd/fivefilters-full-text-rss

 

It allows me to get full RSS feeds even when a website doesn't provide them.

 

I built it from the console, so I cannot manage it via the Unraid web UI, and looking at the docker compose file here https://github.com/heussd/fivefilters-full-text-rss-docker/blob/master/docker-compose.yml there is a volume I didn't create.

 

So I tried to build another container from the same source using an Unraid template.

 

I mounted the path /var/www/html/cache to the share /mnt/user/appdata/full-text-rss/ like this

 

(screenshot of the template path mapping attached)
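
For reference, the equivalent mapping on the command line would be something like this (illustrative only, since I actually created it through the Unraid template; the port is the one from my docker list above):

docker run -d --name full-text-rss \
  -p 80:80 \
  -v /mnt/user/appdata/full-text-rss/:/var/www/html/cache \
  heussd/fivefilters-full-text-rss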

 

The web UI of the container works, but when I try to build a custom RSS feed I see this error in the browser:

 

Quote

Fatal error: Uncaught exception 'Zend_Cache_Exception' with message 'cache_dir "/var/www/html/cache/rss/" must be a directory' in /var/www/html/libraries/Zend/Cache.php:209 Stack trace: #0 /var/www/html/libraries/Zend/Cache/Backend/File.php(181): Zend_Cache::throwException('cache_dir "/var...') #1 /var/www/html/libraries/Zend/Cache/Backend/File.php(132): Zend_Cache_Backend_File->setCacheDir('/var/www/html/c...') #2 /var/www/html/libraries/Zend/Cache.php(153): Zend_Cache_Backend_File->__construct(Array) #3 /var/www/html/libraries/Zend/Cache.php(94): Zend_Cache::_makeBackend('File', Array, false, false) #4 /var/www/html/makefulltextfeed.php(1640): Zend_Cache::factory('Core', 'File', Array, Array) #5 /var/www/html/makefulltextfeed.php(506): get_cache() #6 {main} thrown in /var/www/html/libraries/Zend/Cache.php on line 209

I think that is related to that folder. But it is a directory, right?
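
To double check from inside the container, I can run something like this (assuming the container is named full-text-rss):

docker exec full-text-rss ls -ld /var/www/html/cache /var/www/html/cache/rss   # both should exist as directories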

 

PS: if I remove the path, the error disappears.

 

Can you please support me again?

Link to comment
  • 2 years later...
On 5/23/2020 at 2:23 PM, Jokerigno said:

Ok, I may have found one docker with an issue. [...]

Did you manage to resolve this issue?

Link to comment
