About fmp4m
  1. Dockers are running at the moment; however, Parity2 is being added now. nas-diagnostics-20190304-1914.zip
  2. I will grab diags when rebuilding the 2nd parity and upgrading the 2TB drive to 8TB. This check just finished. Thanks!
  3. Alright, got everything into the Behemoth this past weekend. Started a parity check and am waiting for it to finish; it's slow for some reason, but I'll need it to finish before checking on what the issue is. Data R/W tests before booting Unraid show over 2 GB/s with all drives reading and writing at once, which is about 0.9 GB/s faster than before. A few questions:
     1. I need to upgrade my last 2TB drive to an 8TB, and at the same time I need to add two 8TB drives and a second 8TB parity. Can I do all of this at once, since only one drive will be rebuilt and the rest are new?
     2. Is there a way to speed up parity, or to troubleshoot what is taking so long? Even with zero Dockers running, the parity check runs at around 200-300 MB/s across all drives at once. I have all new controllers, cords, and expanders now, as seen above.
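For what it's worth, here is a rough back-of-the-envelope for why per-drive speed drops when many drives share one expander uplink. This is a sketch with illustrative assumptions (a single SAS2 x4 uplink, 8b/10b encoding overhead, 21 drives behind the expander), not measured figures from this build:

```python
# Rough shared-uplink bandwidth estimate (illustrative assumptions,
# not measurements): a SAS2 x4 uplink is 4 lanes x 6 Gb/s, and 8b/10b
# encoding leaves roughly 80% of the raw rate as usable payload.
LANES = 4
GBPS_PER_LANE = 6      # SAS2 per-lane link rate, in Gb/s
EFFICIENCY = 0.8       # 8b/10b encoding overhead
DRIVES = 21            # assumed number of drives behind one expander

usable_mb_s = LANES * GBPS_PER_LANE * 1000 / 8 * EFFICIENCY
per_drive_mb_s = usable_mb_s / DRIVES
print(f"uplink ~{usable_mb_s:.0f} MB/s shared, ~{per_drive_mb_s:.0f} MB/s per drive")
# -> uplink ~2400 MB/s shared, ~114 MB/s per drive
```

So even a healthy x4 uplink spread across 21 drives lands near typical single-spindle speeds; a degraded or single-lane link would sit far below that.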
  4. I have two issues going on and I know I need diagnostics; I don't have access to the physical machine until later this evening, so I will add them then.
     1. When booting up (cold or warm restart), the boot stalls at the haveged line for about 5-6 minutes before proceeding. I am unsure what is causing this.
     2. When I am on the same network, or even on VPN, I cannot access the web interface: it redirects to a xxxxxxxxxxxxxxxx.unraid.net address that is unreachable. This happens whether I use http://IP:PORT or https://IP:PORT. I cannot reach the host at all unless physically attached via a monitor.
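On the haveged stall: haveged exists to feed the kernel entropy pool, and a long pause at that point in boot often means the pool is starving. A quick way to check once the system is up (a sketch assuming a standard Linux /proc layout; thresholds are rough rules of thumb, not Unraid-specific values):

```shell
# Show how much entropy the kernel currently has available.
# Persistently low values at boot (well under ~1000 on older kernels)
# can stall services that block waiting on /dev/random.
entropy=$(cat /proc/sys/kernel/random/entropy_avail)
echo "available entropy: ${entropy}"
```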
  5. If your intent is to pass all ports to the same Docker/VM, you will be fine. If you want to separate the ports, that card is not so nice. Overall throughput will not be 1 Gb/s per port and will max out around 700-750 Mb/s per port. If that is acceptable, you will be fine. The Intel cards are worth the price difference.
  6. I will chime in and say I have the first-gen Threadripper and love it. I can't seem to throw too much at it. I'm about to install it in a 42-bay case to try, lol. I currently have 22 drives and 19 Dockers with 6 VMs running 24/7, and I rarely see more than 20-30% CPU utilization, with 3 of the VMs running Windows Server. One of the Dockers is a 4K Plex server with at times 18 viewers. I do have 128 GB of RAM, though, so I'm biased.
  7. Cables and the two 8i cards came in; now waiting on the expanders. I hope I wired the two bottom bays correctly: I used SFF-8643-to-SATA breakouts and put P1 and P2 of what will be expander 1 on the left, and P1 and P2 of what will be expander 2 on the right. That makes 2 ports for the parity SAS trays and then 5x4 rows of HDDs per expander: 21 drives per expander, 22 of 24 ports used per expander, and two uplinks to the 8i card each. So many wires to cable manage.
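A quick sanity check of the per-expander port budget, using only the figures stated above (illustrative arithmetic, not taken from any expander datasheet). Counting the parity as one drive on two ports, 21 drives consume 22 drive-facing ports, and the two uplinks account for the remainder:

```python
# Per-expander port budget, using the figures from the post above.
parity_tray_ports = 2    # parity drive wired to 2 ports (dual-ported tray)
hdd_ports = 5 * 4        # 5x4 rows of single-ported HDDs
uplink_ports = 2         # uplinks back to the 8i card

drive_facing = parity_tray_ports + hdd_ports
total_used = drive_facing + uplink_ports
print(f"{drive_facing} drive-facing ports + {uplink_ports} uplinks = {total_used} of 24")
# -> 22 drive-facing ports + 2 uplinks = 24 of 24
```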
  8. PSU recommendation

     It seems as if one of your rails is losing integrity. Go with a gold- or platinum-rated 750 W unit; you're at the right wattage.
  9. I have hit a small wall with the bottom two hot-swap bays. The manufacturer says "Two bottom drive bays can be dedicated for OS Redundancy", which I initially took to mean "independent SAS/SATA drives"... It turns out they are, but with 2 SATA ports per bay: 1 blue, 1 black. I have never seen this before. I reached out to iStarUSA, and they have zero documentation on them; the tech sheet is useless. Does anyone have any experience with them? Google shows nothing about this specific backplane. Some other searches suggest that "dual port" means the HDD/SSD has additional pins connected to a second port, which can be wired to another HBA/RAID controller for high availability. So it seems I would only need to use the one port, and if I wanted HA/redundancy (I don't think Unraid would like it), I could connect the second port to a different controller than the first. I was able to snap some pics:
  10. Thank you Johnnie yet again. Do you have a donate link?
  11. Alright, I have ordered 2x 9300-8i cards and 2x RES2CV360 expanders. This should equal the bandwidth and be my best option overall. I really appreciate the follow-up and the answers you have provided. The PSU came in and is installed; now I'm waiting on the cards and expanders to begin moving to the new case. I have not really worked with SAS-to-SAS before, and the connectors on the backplanes do not look familiar to me. With the 9300-8i's, the expanders, and the backplanes pictured above, what cables will I need? I could wait until the cards come in, but I want to get started on the move when they do, with minimal downtime.