jbrukardt

Members • 62 posts
Everything posted by jbrukardt

  1. The docker rollback in 6.12.2 fixed this issue thus far. Been stable for 4 days now; couldn't stay up for more than 40 minutes before that.
  2. Been having exactly this problem through all the 6.12 RCs, with no resolution; link here. I tracked mine down to the cloudflared docker container, but it seems many dockers trigger the signal 6 error. https://forums.unraid.net/topic/138726-612-webui-inaccessible-and-varlog-full/
  3. Good to get rid of those too, but this is what filled the log; it's almost 50 errors a second until the log fills and crashes:
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: worker process 5834 exited on signal 6
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: shared memory zone "memstore" was locked by 5834
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: worker process 5852 exited on signal 6
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: shared memory zone "memstore" was locked by 5852
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: worker process 5867 exited on signal 6
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: shared memory zone "memstore" was locked by 5867
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: worker process 5891 exited on signal 6
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: shared memory zone "memstore" was locked by 5891
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: worker process 5907 exited on signal 6
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: shared memory zone "memstore" was locked by 5907
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: worker process 5912 exited on signal 6
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: shared memory zone "memstore" was locked by 5912
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: worker process 5935 exited on signal 6
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: shared memory zone "memstore" was locked by 5935
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: worker process 5947 exited on signal 6
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: shared memory zone "memstore" was locked by 5947
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: worker process 5967 exited on signal 6
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: shared memory zone "memstore" was locked by 5967
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: worker process 5981 exited on signal 6
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: shared memory zone "memstore" was locked by 5981
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: worker process 5999 exited on signal 6
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: shared memory zone "memstore" was locked by 5999
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: worker process 6015 exited on signal 6
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: shared memory zone "memstore" was locked by 6015
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: worker process 6038 exited on signal 6
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: shared memory zone "memstore" was locked by 6038
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: worker process 6048 exited on signal 6
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: shared memory zone "memstore" was locked by 6048
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: worker process 6063 exited on signal 6
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: shared memory zone "memstore" was locked by 6063
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: worker process 6079 exited on signal 6
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: shared memory zone "memstore" was locked by 6079
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: worker process 6084 exited on signal 6
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: shared memory zone "memstore" was locked by 6084
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: worker process 6129 exited on signal 6
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: shared memory zone "memstore" was locked by 6129
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: worker process 6143 exited on signal 6
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: shared memory zone "memstore" was locked by 6143
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: worker process 6153 exited on signal 6
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: shared memory zone "memstore" was locked by 6153
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: worker process 6183 exited on signal 6
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: shared memory zone "memstore" was locked by 6183
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: worker process 6191 exited on signal 6
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: shared memory zone "memstore" was locked by 6191
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: worker process 6207 exited on signal 6
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: shared memory zone "memstore" was locked by 6207
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: worker process 6221 exited on signal 6
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: shared memory zone "memstore" was locked by 6221
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: worker process 6232 exited on signal 6
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: shared memory zone "memstore" was locked by 6232
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: worker process 6249 exited on signal 6
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: shared memory zone "memstore" was locked by 6249
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: worker process 6257 exited on signal 6
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: shared memory zone "memstore" was locked by 6257
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: worker process 6276 exited on signal 6
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: shared memory zone "memstore" was locked by 6276
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: worker process 6292 exited on signal 6
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: shared memory zone "memstore" was locked by 6292
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: worker process 6301 exited on signal 6
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: shared memory zone "memstore" was locked by 6301
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: worker process 6319 exited on signal 6
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: shared memory zone "memstore" was locked by 6319
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: worker process 6328 exited on signal 6
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: shared memory zone "memstore" was locked by 6328
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: worker process 6338 exited on signal 6
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: shared memory zone "memstore" was locked by 6338
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: worker process 6357 exited on signal 6
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: shared memory zone "memstore" was locked by 6357
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: worker process 6364 exited on signal 6
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: shared memory zone "memstore" was locked by 6364
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: worker process 6384 exited on signal 6
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: shared memory zone "memstore" was locked by 6384
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: worker process 6395 exited on signal 6
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: shared memory zone "memstore" was locked by 6395
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: worker process 6411 exited on signal 6
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: shared memory zone "memstore" was locked by 6411
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: worker process 6426 exited on signal 6
     Jun 2 10:50:04 Media nginx: 2023/06/02 10:50:04 [alert] 13955#13955: shared memory zone "memstore" was locked by 6426
  4. I'm guessing this might be the /var/log fill issue that there are now about 6 or 7 different posts about. OP, if you're not headless, you should still have command-line access physically on the server. Check your /var/log fullness with du -sm /var/log/* (see the sketch below). You should also be able to cat /var/log/syslog from there and see if it's filled with NGINX error 6 spam. If it is, check my post here: https://forums.unraid.net/topic/138726-612-webui-inaccessible-and-varlog-full/#comment-1267931 @Eddie Seelke also seems to be having a similar issue here:
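     A minimal sketch of those checks (standard Unraid paths, where /var/log is a small RAM-backed tmpfs; the grep pattern is just one way to spot the spam, not part of the original post):

     # how full is the log partition overall?
     df -h /var/log
     # per-file breakdown in MB, largest offenders last
     du -sm /var/log/* | sort -n
     # count the nginx signal 6 entries without paging through the whole file
     grep -c 'exited on signal 6' /var/log/syslog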
  5. Please see my post here... I found your cloudflared script in another post from earlier this year, and I expect you may be using cloudflared to connect to Unraid. I also am, and whether it's the docker or your script, that's what triggers the webUI issue. Disable cloudflared, and the problem goes away. Something changed in the 6.11 release to cause this, or has changed in cloudflared.
  6. wget -nc http://s3.syncd.tech/files/unraid/supervisord ^^ This seems to be an invalid URL.
  7. I did; no response unfortunately, since it's not a universal problem with the container, but rather one with its interaction with Unraid.
  8. Caught it in process. Diagnostics attached, taken before the server locked up with a full log several minutes later. At this point, if this can't be resolved, I will have to switch off Unraid. scarif-vault-diagnostics-20230601-1435.zip
  9. I would also like to understand, at a higher level, why Unraid fails so badly when logs fill up. Ideally logrotate or something similar should kick in rather than the server locking up.
  10. I have isolated this issue on my setup to the cloudflared docker. No error 6 without that docker active.
  11. This is 100% tied to the cloudflared docker. Logs do not fill and there are no signal 6 errors when it's off. I would like to understand, at a more strategic level, why Unraid bricks itself when the log partition is full rather than having logrotate or something kick in (see the sketch below). That's very poor behavior, to brick the OS when logs fill up.
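     For illustration, the kind of size-based logrotate stanza the post is asking for. Unraid ships its own logrotate configuration, so the path, size, and retention here are assumptions, not the actual shipped file:

     # hypothetical /etc/logrotate.d/syslog-size
     /var/log/syslog {
         # rotate as soon as syslog exceeds 10 MB, keep at most two old copies
         size 10M
         rotate 2
         compress
         missingok
         notifempty
     }

     With something like this in place (and logrotate run frequently enough), a runaway logger would cost some churn instead of filling the log tmpfs and taking the webUI down with it.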
  12. I'm also experiencing this, with no resolution.
  13. Any reason why the Unraid-Cloudflared-Tunnel docker would be spitting thousands of NGINX errors into syslog? I've been chasing the problem here, and the only thing that makes it reliably go away is turning off the Unraid-Cloudflared-Tunnel docker. I'm testing now with just the raw docker command straight from the Zero Trust dashboard (see the sketch below).
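     For reference, the raw command in question looks like this. The dashboard supplies the real tunnel token; the token below is a placeholder, and the name/restart flags are additions for convenience:

     docker run -d --name cloudflared --restart unless-stopped \
       cloudflare/cloudflared:latest tunnel --no-autoupdate run --token <TUNNEL_TOKEN>

     Running it this way takes the community template out of the equation, so if the signal 6 spam continues, the template itself is not the variable.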
  14. Same issue, syslog attached. I have completely recreated all my docker containers, and removed and replaced the docker image file, and still have the issue. This time I caught it live, and it filled up the log in less than 5 minutes with:
     May 18 13:13:29 scarif-vault nginx: 2023/05/18 13:13:29 [alert] 8098#8098: shared memory zone "memstore" was locked by 25732
     May 18 13:13:29 scarif-vault nginx: 2023/05/18 13:13:29 [alert] 8098#8098: worker process 25733 exited on signal 6
     May 18 13:13:29 scarif-vault nginx: 2023/05/18 13:13:29 [alert] 8098#8098: shared memory zone "memstore" was locked by 25733
     May 18 13:13:29 scarif-vault nginx: 2023/05/18 13:13:29 [alert] 8098#8098: worker process 25734 exited on signal 6
     May 18 13:13:29 scarif-vault nginx: 2023/05/18 13:13:29 [alert] 8098#8098: shared memory zone "memstore" was locked by 25734
     May 18 13:13:29 scarif-vault nginx: 2023/05/18 13:13:29 [alert] 8098#8098: worker process 25735 exited on signal 6
     May 18 13:13:29 scarif-vault nginx: 2023/05/18 13:13:29 [alert] 8098#8098: shared memory zone "memstore" was locked by 25735
     syslog-192.168.1.10.log
  15. Not resolved yet. It took longer, but the server just went inaccessible again. No telnet, no SSH, no webUI; shares and dockers are down. But somehow the VMs are still up. Will pull the off-system syslog later. This is getting very frustrating.
  16. Went away with docker off. Turning each docker back on one by one to see which is the offender.
  17. Booted into safe mode @ May 10 07:31:43. Logs look slightly different, but still a lot of NGINX spam. Attached: syslog-192.168.1.10 (1).log
  18. Logs starting to fill up again. Thousands of entries like this. Hasn't crashed yet:
     May 9 14:59:23 scarif-vault nginx: 2023/05/09 14:59:23 [alert] 8081#8081: worker process 2807 exited on signal 6
     May 9 14:59:25 scarif-vault nginx: 2023/05/09 14:59:25 [alert] 8081#8081: worker process 2972 exited on signal 6
     May 9 14:59:27 scarif-vault nginx: 2023/05/09 14:59:27 [alert] 8081#8081: worker process 3008 exited on signal 6
     May 9 14:59:29 scarif-vault nginx: 2023/05/09 14:59:29 [alert] 8081#8081: worker process 3203 exited on signal 6
     May 9 14:59:31 scarif-vault nginx: 2023/05/09 14:59:31 [alert] 8081#8081: worker process 3238 exited on signal 6
     May 9 14:59:33 scarif-vault nginx: 2023/05/09 14:59:33 [alert] 8081#8081: worker process 3275 exited on signal 6
     May 9 14:59:35 scarif-vault nginx: 2023/05/09 14:59:35 [alert] 8081#8081: worker process 3680 exited on signal 6
     May 9 14:59:37 scarif-vault nginx: 2023/05/09 14:59:37 [alert] 8081#8081: worker process 3760 exited on signal 6
     May 9 14:59:39 scarif-vault nginx: 2023/05/09 14:59:39 [alert] 8081#8081: worker process 4079 exited on signal 6
     May 9 14:59:41 scarif-vault nginx: 2023/05/09 14:59:41 [alert] 8081#8081: worker process 4097 exited on signal 6
     May 9 14:59:43 scarif-vault nginx: 2023/05/09 14:59:43 [alert] 8081#8081: worker process 4132 exited on signal 6
     May 9 14:59:45 scarif-vault nginx: 2023/05/09 14:59:45 [alert] 8081#8081: worker process 4258 exited on signal 6
     May 9 14:59:47 scarif-vault nginx: 2023/05/09 14:59:47 [alert] 8081#8081: worker process 4294 exited on signal 6
     May 9 14:59:49 scarif-vault nginx: 2023/05/09 14:59:49 [alert] 8081#8081: worker process 4346 exited on signal 6
     May 9 14:59:51 scarif-vault nginx: 2023/05/09 14:59:51 [alert] 8081#8081: worker process 4741 exited on signal 6
     May 9 14:59:53 scarif-vault nginx: 2023/05/09 14:59:53 [alert] 8081#8081: worker process 4776 exited on signal 6
     May 9 14:59:55 scarif-vault nginx: 2023/05/09 14:59:55 [alert] 8081#8081: worker process 5152 exited on signal 6
     May 9 14:59:57 scarif-vault nginx: 2023/05/09 14:59:57 [alert] 8081#8081: worker process 5187 exited on signal 6
     May 9 14:59:59 scarif-vault nginx: 2023/05/09 14:59:59 [alert] 8081#8081: worker process 5223 exited on signal 6
     May 9 15:00:01 scarif-vault nginx: 2023/05/09 15:00:01 [alert] 8081#8081: worker process 5417 exited on signal 6
     May 9 15:00:03 scarif-vault nginx: 2023/05/09 15:00:03 [alert] 8081#8081: worker process 5642 exited on signal 6
     May 9 15:00:05 scarif-vault nginx: 2023/05/09 15:00:05 [alert] 8081#8081: worker process 5677 exited on signal 6
     May 9 15:00:07 scarif-vault nginx: 2023/05/09 15:00:07 [alert] 8081#8081: worker process 6129 exited on signal 6
     May 9 15:00:09 scarif-vault nginx: 2023/05/09 15:00:09 [alert] 8081#8081: worker process 6223 exited on signal 6
     May 9 15:00:11 scarif-vault nginx: 2023/05/09 15:00:11 [alert] 8081#8081: worker process 6385 exited on signal 6
     May 9 15:00:13 scarif-vault nginx: 2023/05/09 15:00:13 [alert] 8081#8081: worker process 6558 exited on signal 6
     May 9 15:00:15 scarif-vault nginx: 2023/05/09 15:00:15 [alert] 8081#8081: worker process 6593 exited on signal 6
     May 9 15:00:17 scarif-vault nginx: 2023/05/09 15:00:17 [alert] 8081#8081: worker process 6733 exited on signal 6
     May 9 15:00:19 scarif-vault nginx: 2023/05/09 15:00:19 [alert] 8081#8081: worker process 6772 exited on signal 6
     May 9 15:00:21 scarif-vault nginx: 2023/05/09 15:00:21 [alert] 8081#8081: worker process 6800 exited on signal 6
     May 9 15:00:23 scarif-vault nginx: 2023/05/09 15:00:23 [alert] 8081#8081: worker process 7227 exited on signal 6
     May 9 15:00:26 scarif-vault nginx: 2023/05/09 15:00:26 [alert] 8081#8081: worker process 7390 exited on signal 6
     May 9 15:00:28 scarif-vault nginx: 2023/05/09 15:00:28 [alert] 8081#8081: worker process 7659 exited on signal 6
     May 9 15:00:30 scarif-vault nginx: 2023/05/09 15:00:30 [alert] 8081#8081: worker process 7694 exited on signal 6
     May 9 15:00:32 scarif-vault nginx: 2023/05/09 15:00:32 [alert] 8081#8081: worker process 7729 exited on signal 6
     May 9 15:00:34 scarif-vault nginx: 2023/05/09 15:00:34 [alert] 8081#8081: worker process 7847 exited on signal 6
     May 9 15:00:36 scarif-vault nginx: 2023/05/09 15:00:36 [alert] 8081#8081: worker process 7883 exited on signal 6
     May 9 15:00:38 scarif-vault nginx: 2023/05/09 15:00:38 [alert] 8081#8081: worker process 8095 exited on signal 6
     May 9 15:00:40 scarif-vault nginx: 2023/05/09 15:00:40 [alert] 8081#8081: worker process 8428 exited on signal 6
     May 9 15:00:42 scarif-vault nginx: 2023/05/09 15:00:42 [alert] 8081#8081: worker process 8479 exited on signal 6
     May 9 15:00:44 scarif-vault nginx: 2023/05/09 15:00:44 [alert] 8081#8081: worker process 8575 exited on signal 6
     May 9 15:00:46 scarif-vault nginx: 2023/05/09 15:00:46 [alert] 8081#8081: worker process 8610 exited on signal 6
     May 9 15:00:48 scarif-vault nginx: 2023/05/09 15:00:48 [alert] 8081#8081: worker process 8671 exited on signal 6
     May 9 15:00:50 scarif-vault nginx: 2023/05/09 15:00:50 [alert] 8081#8081: worker process 8796 exited on signal 6
     May 9 15:00:52 scarif-vault nginx: 2023/05/09 15:00:52 [alert] 8081#8081: worker process 8832 exited on signal 6
     May 9 15:00:54 scarif-vault nginx: 2023/05/09 15:00:54 [alert] 8081#8081: worker process 9160 exited on signal 6
     May 9 15:00:56 scarif-vault nginx: 2023/05/09 15:00:56 [alert] 8081#8081: worker process 9446 exited on signal 6
     May 9 15:00:58 scarif-vault nginx: 2023/05/09 15:00:58 [alert] 8081#8081: worker process 9566 exited on signal 6
     May 9 15:01:00 scarif-vault nginx: 2023/05/09 15:01:00 [alert] 8081#8081: worker process 9743 exited on signal 6
     May 9 15:01:02 scarif-vault nginx: 2023/05/09 15:01:02 [alert] 8081#8081: worker process 10140 exited on signal 6
     May 9 15:01:04 scarif-vault nginx: 2023/05/09 15:01:04 [alert] 8081#8081: worker process 10231 exited on signal 6
     May 9 15:01:06 scarif-vault nginx: 2023/05/09 15:01:06 [alert] 8081#8081: worker process 10304 exited on signal 6
     May 9 15:01:08 scarif-vault nginx: 2023/05/09 15:01:08 [alert] 8081#8081: worker process 10342 exited on signal 6
     May 9 15:01:10 scarif-vault nginx: 2023/05/09 15:01:10 [alert] 8081#8081: worker process 10826 exited on signal 6
     May 9 15:01:12 scarif-vault nginx: 2023/05/09 15:01:12 [alert] 8081#8081: worker process 11028 exited on signal 6
     May 9 15:01:14 scarif-vault nginx: 2023/05/09 15:01:14 [alert] 8081#8081: worker process 11064 exited on signal 6
     May 9 15:01:16 scarif-vault nginx: 2023/05/09 15:01:16 [alert] 8081#8081: worker process 11196 exited on signal 6
     May 9 15:01:18 scarif-vault nginx: 2023/05/09 15:01:18 [alert] 8081#8081: worker process 11232 exited on signal 6
     May 9 15:01:20 scarif-vault nginx: 2023/05/09 15:01:20 [alert] 8081#8081: worker process 11272 exited on signal 6
     May 9 15:01:22 scarif-vault nginx: 2023/05/09 15:01:22 [alert] 8081#8081: worker process 11352 exited on signal 6
     May 9 15:01:24 scarif-vault nginx: 2023/05/09 15:01:24 [alert] 8081#8081: worker process 11432 exited on signal 6
     May 9 15:01:26 scarif-vault nginx: 2023/05/09 15:01:26 [alert] 8081#8081: worker process 11803 exited on signal 6
     May 9 15:01:28 scarif-vault nginx: 2023/05/09 15:01:28 [alert] 8081#8081: worker process 11972 exited on signal 6
     May 9 15:01:30 scarif-vault nginx: 2023/05/09 15:01:30 [alert] 8081#8081: worker process 12023 exited on signal 6
     May 9 15:01:32 scarif-vault nginx: 2023/05/09 15:01:32 [alert] 8081#8081: worker process 12108 exited on signal 6
     May 9 15:01:34 scarif-vault nginx: 2023/05/09 15:01:34 [alert] 8081#8081: worker process 12161 exited on signal 6
     May 9 15:01:36 scarif-vault nginx: 2023/05/09 15:01:36 [alert] 8081#8081: worker process 12212 exited on signal 6
     May 9 15:01:38 scarif-vault nginx: 2023/05/09 15:01:38 [alert] 8081#8081: worker process 12256 exited on signal 6
     May 9 15:01:40 scarif-vault nginx: 2023/05/09 15:01:40 [alert] 8081#8081: worker process 12491 exited on signal 6
     May 9 15:01:42 scarif-vault nginx: 2023/05/09 15:01:42 [alert] 8081#8081: worker process 12741 exited on signal 6
     May 9 15:01:44 scarif-vault nginx: 2023/05/09 15:01:44 [alert] 8081#8081: worker process 12949 exited on signal 6
     May 9 15:01:46 scarif-vault nginx: 2023/05/09 15:01:46 [alert] 8081#8081: worker process 13125 exited on signal 6
     May 9 15:01:48 scarif-vault nginx: 2023/05/09 15:01:48 [alert] 8081#8081: worker process 13178 exited on signal 6
     May 9 15:01:50 scarif-vault nginx: 2023/05/09 15:01:50 [alert] 8081#8081: worker process 13232 exited on signal 6
     May 9 15:01:52 scarif-vault nginx: 2023/05/09 15:01:52 [alert] 8081#8081: worker process 13338 exited on signal 6
     May 9 15:01:54 scarif-vault nginx: 2023/05/09 15:01:54 [alert] 8081#8081: worker process 13377 exited on signal 6
     May 9 15:01:56 scarif-vault nginx: 2023/05/09 15:01:56 [alert] 8081#8081: worker process 13569 exited on signal 6
     May 9 15:01:58 scarif-vault nginx: 2023/05/09 15:01:58 [alert] 8081#8081: worker process 13729 exited on signal 6
     May 9 15:02:00 scarif-vault nginx: 2023/05/09 15:02:00 [alert] 8081#8081: worker process 13901 exited on signal 6
     May 9 15:02:02 scarif-vault nginx: 2023/05/09 15:02:02 [alert] 8081#8081: worker process 13987 exited on signal 6
     May 9 15:02:04 scarif-vault nginx: 2023/05/09 15:02:04 [alert] 8081#8081: worker process 14079 exited on signal 6
     May 9 15:02:06 scarif-vault nginx: 2023/05/09 15:02:06 [alert] 8081#8081: worker process 14114 exited on signal 6
     May 9 15:02:08 scarif-vault nginx: 2023/05/09 15:02:08 [alert] 8081#8081: worker process 14198 exited on signal 6
     May 9 15:02:10 scarif-vault nginx: 2023/05/09 15:02:10 [alert] 8081#8081: worker process 14268 exited on signal 6
     May 9 15:02:12 scarif-vault nginx: 2023/05/09 15:02:12 [alert] 8081#8081: worker process 14475 exited on signal 6
     May 9 15:02:14 scarif-vault nginx: 2023/05/09 15:02:14 [alert] 8081#8081: worker process 14621 exited on signal 6
     May 9 15:02:16 scarif-vault nginx: 2023/05/09 15:02:16 [alert] 8081#8081: worker process 14783 exited on signal 6
     May 9 15:02:18 scarif-vault nginx: 2023/05/09 15:02:18 [alert] 8081#8081: worker process 14874 exited on signal 6
     May 9 15:02:20 scarif-vault nginx: 2023/05/09 15:02:20 [alert] 8081#8081: worker process 14909 exited on signal 6
     May 9 15:02:22 scarif-vault nginx: 2023/05/09 15:02:22 [alert] 8081#8081: worker process 15350 exited on signal 6
     May 9 15:02:24 scarif-vault nginx: 2023/05/09 15:02:24 [alert] 8081#8081: worker process 15391 exited on signal 6
     May 9 15:02:26 scarif-vault nginx: 2023/05/09 15:02:26 [alert] 8081#8081: worker process 15552 exited on signal 6
     May 9 15:02:28 scarif-vault nginx: 2023/05/09 15:02:28 [alert] 8081#8081: worker process 15623 exited on signal 6
     May 9 15:02:30 scarif-vault nginx: 2023/05/09 15:02:30 [alert] 8081#8081: worker process 15785 exited on signal 6
     May 9 15:02:32 scarif-vault nginx: 2023/05/09 15:02:32 [alert] 8081#8081: worker process 15999 exited on signal 6
     May 9 15:02:34 scarif-vault nginx: 2023/05/09 15:02:34 [alert] 8081#8081: worker process 16037 exited on signal 6
     May 9 15:02:36 scarif-vault nginx: 2023/05/09 15:02:36 [alert] 8081#8081: worker process 16072 exited on signal 6
     May 9 15:02:38 scarif-vault nginx: 2023/05/09 15:02:38 [alert] 8081#8081: worker process 16151 exited on signal 6
     May 9 15:02:40 scarif-vault nginx: 2023/05/09 15:02:40 [alert] 8081#8081: worker process 16186 exited on signal 6
     May 9 15:02:42 scarif-vault nginx: 2023/05/09 15:02:42 [alert] 8081#8081: worker process 16432 exited on signal 6
     May 9 15:02:44 scarif-vault nginx: 2023/05/09 15:02:44 [alert] 8081#8081: worker process 16467 exited on signal 6
     May 9 15:02:46 scarif-vault nginx: 2023/05/09 15:02:46 [alert] 8081#8081: worker process 16628 exited on signal 6
     May 9 15:02:48 scarif-vault nginx: 2023/05/09 15:02:48 [alert] 8081#8081: worker process 16828 exited on signal 6
     May 9 15:02:50 scarif-vault nginx: 2023/05/09 15:02:50 [alert] 8081#8081: worker process 16847 exited on signal 6
     May 9 15:02:52 scarif-vault nginx: 2023/05/09 15:02:52 [alert] 8081#8081: worker process 16884 exited on signal 6
     May 9 15:02:54 scarif-vault nginx: 2023/05/09 15:02:54 [alert] 8081#8081: worker process 16950 exited on signal 6
     May 9 15:02:56 scarif-vault nginx: 2023/05/09 15:02:56 [alert] 8081#8081: worker process 17133 exited on signal 6
     May 9 15:02:58 scarif-vault nginx: 2023/05/09 15:02:58 [alert] 8081#8081: worker process 17191 exited on signal 6
     May 9 15:03:00 scarif-vault nginx: 2023/05/09 15:03:00 [alert] 8081#8081: worker process 17226 exited on signal 6
     May 9 15:03:02 scarif-vault nginx: 2023/05/09 15:03:02 [alert] 8081#8081: worker process 17389 exited on signal 6
     May 9 15:03:04 scarif-vault nginx: 2023/05/09 15:03:04 [alert] 8081#8081: worker process 17646 exited on signal 6
     May 9 15:03:06 scarif-vault nginx: 2023/05/09 15:03:06 [alert] 8081#8081: worker process 17681 exited on signal 6
     May 9 15:03:08 scarif-vault nginx: 2023/05/09 15:03:08 [alert] 8081#8081: worker process 17765 exited on signal 6
     May 9 15:03:10 scarif-vault nginx: 2023/05/09 15:03:10 [alert] 8081#8081: worker process 17800 exited on signal 6
     May 9 15:03:12 scarif-vault nginx: 2023/05/09 15:03:12 [alert] 8081#8081: worker process 17960 exited on signal 6
     May 9 15:03:14 scarif-vault nginx: 2023/05/09 15:03:14 [alert] 8081#8081: worker process 18085 exited on signal 6
     May 9 15:03:16 scarif-vault nginx: 2023/05/09 15:03:16 [alert] 8081#8081: worker process 18120 exited on signal 6
     scarif-vault-diagnostics-20230510-0518.zip
  19. Enabled and writing to a survivable location. Will update, and then post the syslog when it happens again.
  20. Looks like it's not yet resolved after the above. That got rid of most of the errors in the log, but I woke up this morning to no webUI and no network. The gaming VM was still running, but only on the local console, and I couldn't even get a graphical console on the port. New diags attached. Had to hard reboot. scarif-vault-diagnostics-20230508-2020.zip
  21. Will give it a few more days, since it was an aperiodic problem, but that looks to have greatly reduced both types of errors in the syslog. I saw that 6.12 RC5 has this change: "Restrict avahidaemon to primary interface." I wonder if that would have fixed it?
  22. Did that. It seems far more of syslog.1 is filled with this, though: nginx: 2023/05/02 17:08:04 [alert] 9336#9336: shared memory zone "memstore" was locked by ... 55 MB of that vs 2 MB for the Avahi stuff.
  23. Periodically, once every 2-3 days, I lose access to the webUI. Dockers remain up, as do VMs, and the shares are accessible. Physical login from the console also works. Unfortunately, I am unable to run diagnostics when it occurs; running the diagnostics command from the CLI just hangs the CLI until I ctrl-c. The problem resolves itself on reboot, and I have attached post-reboot diagnostics, although I suspect they will not be terribly useful. Things I have tried already:
     1) Increasing the /var/log size before rebooting, so it is no longer 100% full (see the sketch below). This did not resolve the issue.
     2) Rebooting the webGUI.
     3) Saving off /var/log.
     Attached to this post are the diagnostics file from after reboot and the logs that were taking up all of /var/log prior to reboot. scarif-vault-diagnostics-20230503-1142.zip syslog.1 additional logs.zip
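     A sketch of the resize in 1). On Unraid, /var/log is a RAM-backed tmpfs, so it can be grown in place without a reboot; the 384m figure is just an example size, not what the post used:

     # remount the log tmpfs with a larger ceiling
     mount -o remount,size=384m /var/log
     # confirm the new size and current usage
     df -h /var/log

     Note this only buys time; it does not stop whatever is writing dozens of entries a second.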
  24. So after fighting with this for literally dozens of hours, with everything appearing perfect when I was logged into Parsec (all the devices showing up, drivers installed, no errors, etc.), there was still no display. I even went and bought a new monitor. I had the damn HDMI cable backwards. I have super fancy fiber-optic HDMI cables, and they're active cables which are unidirectional (but apparently not for EDID info...). I had it flipped around, and that means no signal. I'm dumber than I look; thanks for your patience and help, all. I now deeply understand VFIO XML and passing parameters.