Report Comments posted by TheDon
-
I am also still seeing crashes, seemingly more frequently than before.
IPv4 only, ipvlan since 6.12.3, bridging and bonding set to No.
oxygen-diagnostics-20230903-2316.zip oxygen-diagnostics-20230904-2039.zip
-
On 8/10/2023 at 5:59 PM, Mainfrezzer said:
You can save yourself the time with safe mode. It's broken even on a completely fresh install with absolutely nothing running.
Stay on 6.11.5 unless you really want the exclusive shares or ZFS.

Actually @Mainfrezzer, I have achieved 2 days, 10 hours of stability with safe mode on (v6.11.5), so I guess this might mean my issue is plugin related?
-
I have seen a lot of people mention IPv6, but I have had IPv6 disabled for a long time, since before I started seeing this issue. When I found this thread and read through it, I went to check whether I had IPv6 enabled, but it was completely disabled, and I think I did that pretty early on.
I am running 6.11.5 now and still can't keep the GUI running for more than 12-24 hours.
I am attempting safe mode right now to see if it helps at all.
oxygen-diagnostics-20230810-1137.zip syslog-10.0.0.8-20230810.log
-
On 8/4/2023 at 10:38 AM, TheDon said:
I have been performing downgrades:
[6.12.3] - "current stable"; this is where I started. I don't think my issues started on this version, but it's when I started dealing with them.
[6.12.2] - Issue was still present.
[6.12.0] - Issue seemed to take longer to present itself; I got to over 19 hours of runtime.
[6.11.5] - This broke all of my docker containers' autostart; I had to change the network to something else and then back to the correct setting to launch all my docker containers.
Just booted into 6.11.5, so I haven't been able to give it a 24-hour stability test.
Problem still exists in 6.11.5 for me; I'm really not sure what else I can do at this point. On the next GUI crash I can try restarting nginx. That didn't work for me in 6.12.x, but maybe now?
-
I have been performing downgrades:
[6.12.3] - "current stable"; this is where I started. I don't think my issues started on this version, but it's when I started dealing with them.
[6.12.2] - Issue was still present.
[6.12.0] - Issue seemed to take longer to present itself; I got to over 19 hours of runtime.
[6.11.5] - This broke all of my docker containers' autostart; I had to change the network to something else and then back to the correct setting to launch all my docker containers.
Just booted into 6.11.5, so I haven't been able to give it a 24-hour stability test.
-
root@oxygen:~# ps -aux | grep nginx
root      1104  0.0  0.0   7928  5016 ?      Ss   Jul29   0:00 nginx: master process /usr/sbin/nginx
nobody    1129  0.0  0.0   8520  4852 ?      S    Jul29   0:00 nginx: worker process
nobody    1130  0.0  0.0   8520  4852 ?      S    Jul29   0:00 nginx: worker process
nobody    1131  0.0  0.0   8520  4776 ?      S    Jul29   0:00 nginx: worker process
nobody    1132  0.0  0.0   8520  4780 ?      S    Jul29   0:00 nginx: worker process
root      9633  0.0  0.0 147024  4016 ?      Ss   Jul29   0:00 nginx: master process /usr/sbin/nginx -c /etc/nginx/nginx.conf
root      9634  0.0  0.0 148236  8096 ?      S    Jul29   0:13 nginx: worker process
root     13064  0.0  0.0   4052  2224 pts/0  S+   13:16   0:00 grep nginx
root     15478  0.0  0.0    212    20 ?      S    Jul29   0:00 s6-supervise svc-nginx
root     15826  0.0  0.0   7812  3932 ?      Ss   Jul29   0:00 nginx: master process /usr/sbin/nginx
nobody   15932  0.0  0.0   8160  2988 ?      S    Jul29   0:00 nginx: worker process
nobody   15933  0.0  0.0   8160  2160 ?      S    Jul29   0:00 nginx: worker process
nobody   15934  0.0  0.0   8160  2984 ?      S    Jul29   0:00 nginx: worker process
nobody   15935  0.0  0.0   8160  2984 ?      S    Jul29   0:00 nginx: worker process
nobody   21461  0.0  0.0  48488 11428 pts/0  Ss+  Jul29   0:00 nginx: master process nginx
nobody   26207  0.0  0.0  49152  9440 pts/0  S+   12:20   0:00 nginx: worker process
nobody   26208  0.0  0.0  48724  6464 pts/0  S+   12:20   0:00 nginx: worker process
nobody   26209  0.0  0.0  48724  6464 pts/0  S+   12:20   0:00 nginx: worker process
nobody   26210  0.0  0.0  48724  6464 pts/0  S+   12:20   0:00 nginx: worker process
nobody   26211  0.0  0.0  48724  6464 pts/0  S+   12:20   0:00 nginx: worker process
nobody   26212  0.0  0.0  48724  6464 pts/0  S+   12:20   0:00 nginx: worker process
nobody   26214  0.0  0.0  48724  6464 pts/0  S+   12:20   0:00 nginx: worker process
nobody   26215  0.0  0.0  48724  6464 pts/0  S+   12:20   0:00 nginx: worker process
nobody   26216  0.0  0.0  47956  6604 pts/0  S+   12:20   0:00 nginx: cache manager process
16 hours ago, srirams said:
ps -aux | grep nginx
kill -9 <process id of nginx master process and maybe the s6-supervise nginx>
I kept the nginx process stopped (/etc/rc.d/rc.nginx stop) unless I needed to use the web ui, in which case I would start it and immediately stop it after I was done.
@srirams From the long list of nginx processes that I have going, do I need to kill all of the ones that say master (1104, 9633, 15826, 19862, 21461)?
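For reference, a hedged one-liner to pull just the master-process PIDs out of a `ps` listing like the one above (in the standard `ps -aux` layout, field 2 is the PID; verify the list before killing anything):

```shell
# List PIDs of every nginx master process (field 2 of `ps -aux` output).
# The !/awk/ exclusion keeps this pipeline's own process out of the list.
ps -aux | awk '/nginx: master process/ && !/awk/ {print $2}'
```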
/etc/rc.d/rc.nginx stop
^ just hangs on "Shutdown Nginx gracefully..."
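When the graceful stop hangs like that, one workaround sketch is to bound the graceful attempt and fall back to force-killing the masters (this assumes the coreutils `timeout` command is available, and that the only nginx processes on the box are the ones you want gone):

```shell
# Try a graceful stop for 30s; if it hangs or fails, force-kill the
# master processes, then start nginx fresh.
timeout 30 /etc/rc.d/rc.nginx stop || {
    for pid in $(ps -aux | awk '/nginx: master process/ && !/awk/ {print $2}'); do
        kill -9 "$pid"
    done
}
/etc/rc.d/rc.nginx start
```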
-
Can anyone provide guidance on how to work around this issue without shutting down?
My current solution has been to SSH in, capture diagnostics, and then run "poweroff". Sometimes I'll wait and nothing seems to happen, so I send poweroff again, and the machine seems to shut down way too quickly; on the next boot (when I push the power button), it reports an unclean shutdown. I cancel the parity check, and then rinse and repeat in 12-24 hours.
Am I using the wrong command? Is there a better way to get nginx to restart properly so I don't have to do this every day?
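A hedged sketch of a gentler version of that routine: Unraid ships a `powerdown` helper that stops the array before halting, which should avoid the unclean-shutdown/parity-check cycle that a second raw `poweroff` triggers. The helper's presence is an assumption about your install, so this probes for it first:

```shell
# Prefer Unraid's clean-shutdown helper when it exists; fall back to
# plain poweroff only as a last resort.
pick_shutdown_cmd() {
    if command -v powerdown >/dev/null 2>&1; then
        echo powerdown      # stops the array first (Unraid helper)
    else
        echo poweroff       # last resort; may register as unclean
    fi
}
# Typical sequence over SSH:
#   diagnostics             # capture a dated zip under /boot/logs first
#   "$(pick_shutdown_cmd)"  # then shut down
```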
-
[6.12] Unraid webui stop responding then Nginx crash
in Stable Releases
Posted
So my Unraid server was acting similar to what was described here, and I wanted to post a solution in case those watching this thread (as I did for weeks) don't see their problems solved by the recent docker networking changes in 6.12.4.
A symptom I missed from this whole ordeal was that the Unraid GUI was crashing (docker containers all appeared fine, but Deluge was ALWAYS not functioning). If this sounds familiar, check out the post below.
TL;DR for below: the binhex torrent container has a special release tag (used in place of :latest) to help prevent this issue from happening:
":libtorrentv1"