
Supermicro mobo with 3 Supermicro SATA controllers



I have a hardware problem that I hope someone can help me with, and I hope this is the right area to post in. I just finished building my new 24-disk hotswap server (the Supermicro/Intel version) and transferring my entire 16-disk array over to it. The problem is that I can only get any 2 of the Supermicro 8-port SATA cards to work at one time. That is, if I leave cards 2 and 3 in place, the system recognizes both cards and boots fine. If I leave cards 1 and 2 in place, the system boots fine. But if I have all 3 cards installed at the same time, the motherboard BIOS only recognizes 2 of the cards and then will not go past the two-card POST screens (where you can see the controllers recognized and the active ports checked), and at that point I cannot get into the motherboard BIOS - basically I am locked up. I figure that I must have a wrong jumper or some setting wrong in the motherboard BIOS which is stopping all 3 cards from working at the same time. Any ideas?

 

Motherboard – SUPERMICRO MBD-X8SIL-O LGA 1156 Intel 3400 Micro ATX Intel Xeon X3400/L3400 series Server Motherboard

SATA Controllers – SUPERMICRO AOC-SASLP-MV8 PCI Express x4 Low Profile SAS RAID Controller
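
For anyone comparing notes: each AOC-SASLP-MV8 shows up as a Marvell-based controller once the system is up, so you can confirm from the unRAID console how many cards the kernel actually sees. A rough sketch (the exact vendor string can vary by kernel version):

    # count the Marvell controllers the kernel detected
    lspci | grep -i marvell

    # full detail per device (IRQ, memory ranges, expansion ROM)
    lspci -v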

 


I have been wrestling with this and I have found a solution, but I don't know if I should continue with it, as my system is booting up way too fast now compared to the past. I found a setting in the BIOS/Advanced/PCI-PnP Configuration section labeled "PCIe Slots 5, 6, 7 & PCI Slot 4 OPROM". The description says "Use this feature to enable or disable PCI slot Option ROMs. The options are Disabled and Enabled (default)." I changed all 3 of the PCIe slots to "Disabled" and now the system boots up with all 3 cards in at the same time. The thing is, the machine boots a LOT faster because I no longer have to wait while the POST screen goes through the controllers and the disks attached to them. Another thing that troubles me is that when I run ifconfig from the console I no longer see the IP address being reported, though I am still able to connect via the web just fine. And the array comes up as valid and running.

 

Does this seem normal?

 

Edit - It seems that ifconfig is working just fine now, though I don't know why it was not reporting correctly before...maybe I wasn't waiting long enough for all the boot processes to complete?
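
In case it happens again, the address can be watched from the console instead of guessing. A rough sketch, assuming the stock eth0 interface name:

    # show the current state of the primary interface
    ifconfig eth0

    # re-check every 2 seconds until an inet address shows up
    watch -n 2 'ifconfig eth0 | grep inet'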

 

My assumption at this point is that the reason the 3 cards were not working originally was some sort of conflict in the ROMs (BIOSes) of the cards...maybe IRQ? By disabling the ROMs on the cards through the motherboard BIOS, I am now bypassing the 3 cards' BIOSes, but I have not had a reason to change anything in the card BIOSes anyway, so perhaps I will be fine with them disabled.

 

Does this make sense or am I completely out to lunch?
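
For what it's worth, that theory is close: the usual cause of this symptom is not an IRQ clash but option ROM space. A legacy BIOS has to copy every card's boot ROM into a small real-mode memory window below 1 MB (on the order of 128 KB, shared with the onboard devices), and three 8-port HBA ROMs can overflow it and hang POST. Disabling the ROMs only costs you the cards' boot-time setup screens and the ability to boot from drives attached to them; the kernel driver finds the cards either way. A sketch of how to check the ROM mappings from a running system:

    # "[disabled]" on the Expansion ROM line is expected with OPROMs off
    lspci -v | grep -i -e marvell -e "expansion rom"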



You should be fine with the BIOS disabled for the PCI slots as long as all the disks are showing up in the unRAID web interface.
You could try turning only one on and see if that helps.

I'll bet that I can turn on any 2 of them, but is there any particular reason I should?

You should be fine with the BIOS disabled for the PCI slots as long as all the disks are showing up in the unRAID web interface.

Yup, everything looks normal, but I am going to give it some time before making any changes. I have another 6 disks to add to the array and I don't want to make those changes until I know for sure that the current array is working 100%. To be honest with you, it might be my imagination, but the whole array seems to be responding much better than in the past. When I stop the array, or start it, or shut down, or reboot, it only takes a matter of seconds (under a minute), whereas previously it would take 5 or 6 minutes. Scary... :)
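
Before adding the next 6 disks, it is easy to double-check from the console that the kernel sees every drive, not just the web interface. A rough sketch (device names will differ per system):

    # one line per detected disk, with size and device node
    fdisk -l 2>/dev/null | grep '^Disk /dev/'

    # model/serial view, handy for matching drives to bays
    # (assumes the build populates /dev/disk/by-id)
    ls -l /dev/disk/by-id/ | grep -v part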

 

Thanks guys! I will report back with more information once I hear from Supermicro (I emailed tech support).


Got questions. Three 8-port cards, correct? That gives you 24 ports - the max in unRAID is 20 + 2 = 22. How many ports does your motherboard have? That would give you 24 + 6 = 30 potentially. Why do you need the extras? If you DO have 6 on the motherboard you could get by with just 2 cards anyway: 16 on the cards plus 6 on the motherboard = 22, the max that unRAID handles.


You are very close...the SM mobo only has 4 SATA ports, not 6. I understand that the current version of unRAID Pro only supports 20 data disks, but Tom assured me that he will send me a beta version to handle 24 disks when and if I need it. My case houses 24 hotswap disks, so my current plan is to expand to 24 data disks and then use 2 more disks physically outside of the array (1 for parity and 1 for cache). So I need a total of 26 SATA ports, thus the reason for using 3 cards.

 

Here was the official answer from Supermicro:

It could be due to the option ROM. Please try disabling the option ROM of one of the PCI-E slots to see if the problem still happens?

As you all know from reading this thread, I disabled all 3 card ROMs and the machine now boots correctly. I tried the tech's suggestion (and dgaschk's) of turning on the ROMs of 2 of the cards...no go. And I also tried turning on just one ROM...again no go. The ONLY way that this mobo boots with 3 cards is to have ALL 3 ROMs DISABLED! As soon as any one is enabled, the mobo BIOS finds all 3 cards and runs into the conflict again and will not boot. However, my machine is running better than ever with all 3 ROMs turned off!

 

Another strange thing I ran into is that the 3 cards recognize 15 of my original array's 16 drives perfectly, but they would not recognize my parity drive in any slot whatsoever. BUT the motherboard SATA controller not only recognizes the parity drive, it seems to be reading and writing data MUCH faster than it did in my old server. Normally it would take between 36 and 40 hours to do a complete parity check of my server, but now that I have the parity drive connected to the mobo SATA port, I just did a complete check in less than 8 hours. The drive is nothing special, just 1 of 6 Hitachi 2TB drives that are about 1-2 years old. The other 5 are recognized by the cards perfectly, but the parity drive is only recognized by the mobo SATA port... What's up with that?
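
One way to chase the speed difference: the kernel logs the negotiated link speed for every SATA port at detection time, so the mobo port can be compared against the MV8 ports (the MV8's driver may word its messages differently). A sketch:

    # negotiated speed per port, e.g. "SATA link up 3.0 Gbps"
    dmesg | grep -i 'sata link up'

    # identity of the parity drive on the mobo port
    # (replace sdX with the actual device)
    smartctl -i /dev/sdX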
