Drives have been getting disabled seemingly at random 2-3 times a week. I posted in the forum previously, received advice, and have been testing. I've run copious tests to try to rule out variables, and here is where I've landed.
Previously, I had a GPU (3070) in the PCIEx8 slot and the HBA in the PCIEx16 slot. To eliminate variables, I removed the GPU altogether, relocated the HBA to the x8 slot, and ran the server for a full week. Not a single drive was disabled and everything was great. Even threw a parity check at it with no disabled drives.
Tonight, I reinstalled the GPU, changing nothing else. The server booted up and ran fine for about an hour; then a drive was disabled. Either the GPU is the issue, or the motherboard does not like having two cards installed. I checked the motherboard manual (Z390) and it states that if cards are installed in both the x16 and x8 slots, both slots will run at x8. I've also set both slots to PCIe 3.0 in the BIOS. According to the manual, my current arrangement should work just fine. I'd really like to keep my GPU installed for transcoding, but it seems I may not have a choice.
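In case it helps anyone reproduce my check: a quick way to confirm what the slots actually negotiated (rather than what the manual promises) is to read the link attributes Linux exposes under sysfs. This is a generic sketch using standard sysfs paths, not an Unraid-specific tool; not every PCI device exposes these files, so the loop skips the ones that don't.

```shell
#!/bin/sh
# Print negotiated PCIe link speed and width for every device that reports them.
# Compare the HBA's and GPU's lines against the expected x8 @ 8 GT/s (PCIe 3.0).
for dev in /sys/bus/pci/devices/*; do
  # Some devices (e.g. integrated endpoints) don't expose link attributes; skip them.
  if [ -r "$dev/current_link_speed" ] && [ -r "$dev/current_link_width" ]; then
    printf '%s  speed=%s  width=x%s\n' \
      "${dev##*/}" \
      "$(cat "$dev/current_link_speed")" \
      "$(cat "$dev/current_link_width")"
  fi
done
```

If either card shows a narrower width or lower speed than expected, that would point at the slot/bifurcation config rather than the drives themselves.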
Diagnostics attached.
Hoping someone can point me to a bios setting or something that will solve this.
theark-diagnostics-20240312-1915.zip