(Unraid 6.3.2) Too many open files...



Disable the Docker Service and reboot. Set your appdata share to cache-prefer, then run mover to get your appdata back onto cache. Then enable Docker Service.

 

Not sure if any of that will help, but it will at least make your configuration more correct and less confusing to diagnose.
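
For reference, a rough console sketch of that sequence (paths are the standard Unraid 6.x locations; the cache-prefer setting itself is changed on the share's page in the webUI, and on 6.x the mover script normally lives at /usr/local/sbin/mover):

# Where does appdata currently live? (cache vs. array disks)
du -sh /mnt/cache/appdata /mnt/disk*/appdata 2>/dev/null

# After setting the share to cache-prefer, mover can be invoked from the
# console (same as pressing "Move now" in the webUI).
/usr/local/sbin/mover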

Link to comment
  • 4 weeks later...
  • 2 weeks later...

This is my output for this:

core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
scheduling priority             (-e) 0
file size               (blocks, -f) unlimited
pending signals                 (-i) 128178
max locked memory       (kbytes, -l) unlimited
max memory size         (kbytes, -m) unlimited
open files                      (-n) 40960
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) 819200
real-time priority              (-r) 0
stack size              (kbytes, -s) unlimited
cpu time               (seconds, -t) unlimited
max user processes              (-u) 128178
virtual memory          (kbytes, -v) unlimited
file locks                      (-x) unlimited
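
The listing above is the format produced by ulimit -a; for just the open-files figures, these two checks are enough (the second shows allocated/unused/maximum handles system-wide):

ulimit -n                  # per-shell soft limit on open file descriptors
cat /proc/sys/fs/file-nr   # system-wide: allocated, unused, maximum handles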

 

Link to comment
But will it stay like that permanently? I've tried that, but it keeps reverting back to the old setting after a reboot.

Did it fix it, though?
And no, it's not permanent. For the time being, you can install the User Scripts plugin and have it run the command at array start.

Sent from my LG-D852 using Tapatalk
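
A minimal sketch of the kind of user script meant here, run at array start (the sysctl/prlimit approach, the numbers, and prlimit being available on the system are illustrative assumptions, not necessarily what a future plugin option will do):

#!/bin/bash
# Illustrative only: raise the system-wide handle cap and the open-files
# limit of the running shfs process(es) behind /mnt/user.
sysctl -w fs.file-max=2097152
for pid in $(pidof shfs); do
    prlimit --pid "$pid" --nofile=65536:65536   # soft:hard
done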

Link to comment
23 minutes ago, Squid said:


Did it fix it, though?
And no, it's not permanent. For the time being, you can install the User Scripts plugin and have it run the command at array start.

Sent from my LG-D852 using Tapatalk
 

 

Gonna try and replicate the problem and see if it works. Might take some time.

 

Edit: Had mixed results; sometimes it works, sometimes it just hits the limit again.

Edited by kentromox
Link to comment

Then you've got to narrow it down by stopping docker apps until the problem disappears...  Then you know what's out of control...  

 

Or keep increasing the number. But something is running amok, and increasing the number to obscene values is only masking the problem...

 

But I did ask @dlandon to incorporate max open files into the Tips and Tweaks plugin, as I can see that with the ever-increasing complexity and number of apps running, more users will begin to need an increase from the 'nix defaults.
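
One hedged way to do that narrowing from the console: watch how many descriptors shfs (the process serving /mnt/user, and the one logging these errors) is holding while stopping containers one at a time. The container name below is a placeholder.

# Show descriptor count and soft limit for each shfs instance.
for pid in $(pidof shfs); do
    echo "shfs $pid: $(ls /proc/$pid/fd | wc -l) open fds, soft limit $(awk '/Max open files/ {print $4}' /proc/$pid/limits)"
done

# Then stop suspects one at a time and re-run the loop above.
docker stop headphones    # placeholder container name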

Link to comment

Okay, I've managed to narrow it down to Headphones and Jackett. It only starts when those are on, and then it starts affecting everything else. I also get "bad file descriptor" error messages.

 

Also tried to up the limit from 70000 to 80000, then just added more zeroes to the end. The problem still persists.

 

Apr 17 13:36:33 Ravanor shfs/user: err: shfs_create: open: /mnt/disk6/appdata/sickrage/sickbeard.db-journal (24) Too many open files
Apr 17 13:36:34 Ravanor shfs/user: err: shfs_create: open: /mnt/disk6/appdata/sickrage/sickbeard.db-journal (24) Too many open files
Apr 17 13:36:35 Ravanor shfs/user: err: shfs_open: open: /mnt/cache/appdata/sickrage/cache.db (24) Too many open files
Apr 17 13:36:35 Ravanor shfs/user: err: shfs_open: open: /mnt/cache/appdata/sickrage/cache.db (24) Too many open files
Apr 17 13:36:35 Ravanor shfs/user: err: shfs_flush: close: (9) Bad file descriptor
Apr 17 13:36:38 Ravanor shfs/user: err: shfs_create: open: /mnt/disk6/appdata/sickrage/sickbeard.db-journal (24) Too many open files
Apr 17 13:36:39 Ravanor shfs/user: err: shfs_open: open: /mnt/disk6/appdata/deluge_alex/state/torrents.state.tmp (24) Too many open files
Apr 17 13:36:39 Ravanor shfs/user: err: shfs_create: open: /mnt/disk6/appdata/sickrage/sickbeard.db-journal (24) Too many open files
Apr 17 13:36:40 Ravanor shfs/user: err: shfs_flush: close: (9) Bad file descriptor
Apr 17 13:36:40 Ravanor shfs/user: err: shfs_create: open: /mnt/disk6/appdata/sickrage/sickbeard.db-journal (24) Too many open files
Apr 17 13:36:41 Ravanor shfs/user: err: shfs_create: open: /mnt/disk6/appdata/sickrage/sickbeard.db-journal (24) Too many open files
Apr 17 13:36:42 Ravanor shfs/user: err: shfs_create: open: /mnt/disk6/appdata/sickrage/sickbeard.db-journal (24) Too many open files
Apr 17 13:36:43 Ravanor shfs/user: err: shfs_open: open: /mnt/cache/appdata/sickrage/cache.db (24) Too many open files
Apr 17 13:36:43 Ravanor shfs/user: err: shfs_open: open: /mnt/cache/appdata/sickrage/cache.db (24) Too many open files
Apr 17 13:36:43 Ravanor shfs/user: err: shfs_flush: close: (9) Bad file descriptor
Apr 17 13:36:45 Ravanor shfs/user: err: shfs_create: open: /mnt/disk6/appdata/sickrage/sickbeard.db-journal (24) Too many open files
Apr 17 13:36:46 Ravanor shfs/user: err: shfs_create: open: /mnt/disk6/appdata/sickrage/sickbeard.db-journal (24) Too many open files
Apr 17 13:36:47 Ravanor shfs/user: err: shfs_create: open: /mnt/disk6/appdata/sickrage/sickbeard.db-journal (24) Too many open files
Apr 17 13:36:48 Ravanor shfs/user: err: shfs_create: open: /mnt/disk6/appdata/sickrage/sickbeard.db-journal (24) Too many open files
Apr 17 13:36:49 Ravanor shfs/user: err: shfs_readdir: opendir: /mnt/disk2/Torrent/Syncthing/Dokumenter (24) Too many open files
Apr 17 13:36:49 Ravanor shfs/user: err: shfs_readdir: opendir: /mnt/disk4/Torrent/Syncthing/Dokumenter (24) Too many open files
Apr 17 13:36:49 Ravanor shfs/user: err: shfs_flush: close: (9) Bad file descriptor
Apr 17 13:36:49 Ravanor shfs/user: err: shfs_create: open: /mnt/disk6/appdata/sickrage/sickbeard.db-journal (24) Too many open files
Apr 17 13:36:49 Ravanor shfs/user: err: shfs_open: open: /mnt/disk6/appdata/radarr/nzbdrone.db-wal (24) Too many open files
Apr 17 13:36:49 Ravanor shfs/user: err: shfs_open: open: /mnt/disk6/appdata/radarr/nzbdrone.db-wal (24) Too many open files
Apr 17 13:36:49 Ravanor shfs/user: err: shfs_open: open: /mnt/disk8/appdata/radarr/logs/radarr.txt (24) Too many open files
Apr 17 13:36:49 Ravanor shfs/user: err: shfs_open: open: /mnt/disk8/appdata/radarr/logs/radarr.txt (24) Too many open files
Apr 17 13:36:50 Ravanor shfs/user: err: shfs_open: open: /mnt/cache/appdata/sickrage/cache.db (24) Too many open files

These are some of the errors that come up.
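
To see which processes are driving that (each file a container holds open through the user share also corresponds to a descriptor held by shfs), a rough tally with lsof, assuming /mnt/user is the FUSE mount point as on stock Unraid:

# Count open files on the /mnt/user filesystem, grouped by process name.
lsof /mnt/user 2>/dev/null | awk 'NR > 1 {print $1}' | sort | uniq -c | sort -rn | head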

ravanor-diagnostics-20170417-1346.zip

Edited by kentromox
Added Diagnostic zip
Link to comment

Jackett:

root@localhost:# /usr/local/emhttp/plugins/dynamix.docker.manager/scripts/docker run -d --name="jackett" --net="bridge" -e TZ="Europe/Berlin" -e HOST_OS="unRAID" -e "PUID"="99" -e "PGID"="100" -p 9117:9117/tcp -v "/mnt/user/appdata/jackett":"/config":rw -v "/mnt/user/Torrent/blackhole":"/downloads":rw linuxserver/jackett
f0da71ab9b3aa2398abfcaaba166b616b733985329b75b3ee9167a58e3850573

The command finished successfully!

Headphones:

root@localhost:# /usr/local/emhttp/plugins/dynamix.docker.manager/scripts/docker run -d --name="Headphones" --net="bridge" -e TZ="Europe/Berlin" -e HOST_OS="unRAID" -e "PUID"="99" -e "PGID"="100" -p 8181:8181/tcp -v "/mnt/user/appdata/headphones":"/config":rw -v "/mnt/user/Media/Ill Music/":"/music":rw -v "/mnt/user/Torrent/data/":"/data":rw linuxserver/headphones
02366f2f72e41757789a2335a9f83f6646ecbbf40574fc335734bb9f2ba93d70

The command finished successfully!
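
A rough per-container check of descriptor usage for those two (container names are taken from the commands above; assumes the images ship a POSIX sh and find):

for c in jackett Headphones; do
    echo "$c: $(docker exec "$c" sh -c 'find /proc/[0-9]*/fd -maxdepth 1 -type l 2>/dev/null | wc -l') open fds"
done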

 

Link to comment

Okay, found something interesting in the logs when the problem occurred; don't know if this is the cause.

Apr 22 08:17:01 Ravanor shfs/user: err: shfs_flush: close: (9) Bad file descriptor
Apr 22 08:17:01 Ravanor shfs/user: err: shfs_flush: close: (9) Bad file descriptor
Apr 22 08:17:01 Ravanor shfs/user: err: shfs_create: open: /mnt/disk7/appdata/ombi/Ombi.sqlite-journal (24) Too many open files
Apr 22 08:17:01 Ravanor shfs/user: err: shfs_open: open: /mnt/cache/appdata/ombi/Ombi.sqlite (24) Too many open files
Apr 22 08:17:01 Ravanor shfs/user: err: shfs_open: open: /mnt/cache/appdata/ombi/Ombi.sqlite (24) Too many open files
Apr 22 08:17:01 Ravanor shfs/user: err: shfs_open: open: /mnt/cache/appdata/ombi/Ombi.sqlite (24) Too many open files
Apr 22 08:17:01 Ravanor shfs/user: err: shfs_open: open: /mnt/cache/appdata/ombi/Ombi.sqlite (24) Too many open files
Apr 22 08:17:01 Ravanor shfs/user: err: shfs_open: open: /mnt/cache/appdata/ombi/Ombi.sqlite (24) Too many open files
Apr 22 08:17:01 Ravanor shfs/user: err: shfs_open: open: /mnt/cache/appdata/ombi/Ombi.sqlite (24) Too many open files
Apr 22 08:17:01 Ravanor shfs/user: err: shfs_open: open: /mnt/cache/appdata/ombi/Ombi.sqlite (24) Too many open files
Apr 22 08:17:01 Ravanor shfs/user: err: shfs_open: open: /mnt/cache/appdata/ombi/Ombi.sqlite (24) Too many open files
Apr 22 08:17:01 Ravanor shfs/user: err: shfs_open: open: /mnt/cache/appdata/ombi/Ombi.sqlite (24) Too many open files
Apr 22 08:17:01 Ravanor shfs/user: err: shfs_open: open: /mnt/cache/appdata/ombi/Ombi.sqlite (24) Too many open files
Apr 22 08:17:01 Ravanor shfs/user: err: shfs_flush: close: (9) Bad file descriptor
Apr 22 08:17:02 Ravanor shfs/user: err: shfs_readdir: opendir: /mnt/cache/. (24) Too many open files
Apr 22 08:17:02 Ravanor shfs/user: err: shfs_readdir: opendir: /mnt/disk1/. (24) Too many open files
Apr 22 08:17:02 Ravanor shfs/user: err: shfs_readdir: opendir: /mnt/disk2/. (24) Too many open files
Apr 22 08:17:02 Ravanor shfs/user: err: shfs_readdir: opendir: /mnt/disk3/. (24) Too many open files
Apr 22 08:17:02 Ravanor shfs/user: err: shfs_readdir: opendir: /mnt/disk4/. (24) Too many open files
Apr 22 08:17:02 Ravanor shfs/user: err: shfs_readdir: opendir: /mnt/disk5/. (24) Too many open files
Apr 22 08:17:02 Ravanor shfs/user: err: shfs_readdir: opendir: /mnt/disk6/. (24) Too many open files
Apr 22 08:17:02 Ravanor shfs/user: err: shfs_readdir: opendir: /mnt/disk7/. (24) Too many open files
Apr 22 08:17:02 Ravanor shfs/user: err: shfs_readdir: opendir: /mnt/disk8/. (24) Too many open files
Apr 22 08:17:02 Ravanor shfs/user: err: shfs_flush: close: (9) Bad file descriptor

The problem starts at 08:17:01.
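
A quick way to tally which paths dominate those errors (syslog location as on stock Unraid):

grep 'Too many open files' /var/log/syslog | grep -o '/mnt/[^ ]*' | sort | uniq -c | sort -rn | head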

Also, I can't quite understand the OOM on the user share; could you give me an explanation of how the problem comes up?

Link to comment
1 hour ago, kentromox said:

Also, I can't quite understand the OOM on the user share; could you give me an explanation of how the problem comes up?

If Linux runs Out Of Memory, it will begin to kill processes. In V5 it wasn't uncommon for the webUI or SMB to get killed by this. It hasn't usually been much of a problem with V6, since it has 64-bit addressing.

 

Maybe that's not what happened in your case. The diagnostics you posted don't go back far enough to see the start of the problem, so there isn't a log entry that actually says OOM.
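
For what it's worth, a direct way to check whether the kernel's OOM killer has fired since boot (stock syslog path assumed):

grep -iE 'out of memory|oom-killer' /var/log/syslog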

 

Looks like you may be onto the real problem. Don't know if the Open Files plugin would provide a clue or not.

Link to comment
35 minutes ago, trurl said:

If Linux runs Out Of Memory, it will begin to kill processes. In V5 it wasn't uncommon for the webUI or SMB to get killed by this. It hasn't usually been much of a problem with V6, since it has 64-bit addressing.

 

Maybe that's not what happened in your case. The diagnostics you posted don't go back far enough to see the start of the problem, so there isn't a log entry that actually says OOM.

 

Looks like you may be onto the real problem. Don't know if the Open Files plugin would provide a clue or not.

 

Interesting, where can I find the "Open Files" plugin? This is all new to me. I can't see it in the Plugins tab in the webUI.

 

Edit: Might also be related to Ombi; the problem has started at the same time for the last few days, around 8 in the morning. I keep seeing that Ombi can't open files because of the "Too many open files" error.

 

Second edit: Just installed the Open Files plugin; why the hell does Ombi need so many files open??

Uploading a pic of an example of what I'm being presented with; this goes on for pages.

2017-04-23 17_24_05-Ravanor_OpenFiles.png
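
For anyone without the plugin, a rough console equivalent (the container name "ombi" is assumed, and this only looks at the container's main process, so the plugin's view is more complete):

OMBI_PID=$(docker inspect -f '{{.State.Pid}}' ombi 2>/dev/null)
[ -n "$OMBI_PID" ] && ls -l /proc/$OMBI_PID/fd | awk 'NR > 1 {print $NF}' | sort | uniq -c | sort -rn | head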

 

Third edit: Killed the Ombi Docker; things seem to be back to normal according to the Open Files plugin. Gonna let it run for a couple of days to see if the problem still exists. Thanks for the help, trurl. Gonna be interesting to see now.

Edited by kentromox
Third edit, Holy god why...
Link to comment

This is a very interesting read, as I just migrated my Ombi container (about 3 or so hours ago) from a Synology to unRAID, which is running... well, this will do the talking for me:

 

Capture.thumb.PNG.234584220b7c0e6a53fed326b5267d29.PNG

 

Lo and behold, will you look at that -- my first ever OOM with unRAID.

First noticed this while browsing GitLab -- my images weren't loading.

Then it was WinSCP -- my NFS shares were going bye bye (this part is scary lol).

 

I wasn't able to capture the moment when it triggers, but Ombi is the only new element in my system.

I'm speculating (since this is what it was doing after installation) that the episode searching is what's causing the OOM issue (related: https://github.com/tidusjar/Ombi/issues/1256).

 

It doesn't help that I already have a high number of open files (looking at you, SonarQube), so it didn't take much to tip it over the edge.

 

This system stays fairly busy so I won't be able to provide any more info than this -- just wanted to confirm that Ombi is the common element.

I'm hoping those guys can move away from Mono and onto .NET Core -- I can see a good app, just not under Linux.

Edited by avluis
Link to comment
