miogpsrocks Posted May 8, 2017

Will this controller card work with Unraid?

LSI SAS31601E L3-01143-03D 16-port PCIe SAS/SATA controller card 3G HBA 3Gb/s

The motherboard I purchased lists these specs:

Expansion Slots
1 x PCI Express 3.0 x16
2 x PCIe 3.0 x1

https://www.newegg.com/Product/Product.aspx?Item=N82E16813130894

So how many (if any) of these cards can my motherboard support? I'm confused by the specs. I guess "x16" refers to the slot's speed (16 lanes), not 16 ports, and "x1" must mean one lane, not one port. So does this mean there are 3 PCI Express slots? Is PCIe the same as PCI Express? And since Unraid is the OS, are there drivers for the LSI SAS card? Thanks.
heffe2001 Posted May 8, 2017

I believe you can use one of those, in the x16 slot.
miogpsrocks Posted May 8, 2017

48 minutes ago, heffe2001 said: I believe you can use one of those, in the x16 slot.

So does LSI make drivers for Unraid or generic Linux, as well as Windows?
heffe2001 Posted May 8, 2017

I'm not 100% sure on Unraid, but as far as your board goes, it'll take one. I have one of those cards on order, so hopefully I can tell you one way or another in a few days as far as Unraid is concerned. LSI has Linux drivers for most of their cards, so I'm fairly certain this one will work.
ken-ji Posted May 8, 2017

That card needs an x8 PCI-E slot. Your motherboard has only one that will work: the x16 slot (usually used for graphics cards). It's a relatively old controller (circa 2007), and is more likely to be well supported by the Linux kernel, probably under the mptsas driver. Yes, most (if not all) of LSI's HBAs are supported under Linux.
JorgeB Posted May 8, 2017

It works, but it doesn't support >2TB drives. You'll want an LSI SAS2 controller, e.g. the LSI 9201-16E.
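The >2TB ceiling JorgeB mentions is the classic 32-bit LBA limit on SAS1-generation controllers like the twin 1068E chips on this card. A quick sanity check of the math (just a sketch, assuming 512-byte sectors):

```python
# SAS1-era controllers (like the 1068E) address disks with 32-bit
# LBAs and 512-byte sectors, which caps the usable size of any one drive.
SECTOR_BYTES = 512
MAX_LBAS = 2 ** 32              # 32-bit logical block address space

limit_bytes = MAX_LBAS * SECTOR_BYTES
limit_tib = limit_bytes // 2 ** 40

print(f"{limit_bytes} bytes = {limit_tib} TiB")   # 2199023255552 bytes = 2 TiB
```

So anything bigger than 2TiB either isn't recognised or shows up truncated, which is why a SAS2 card is the usual recommendation.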
heffe2001 Posted May 8, 2017

Just noticed myself that it's basically a single card with two 1068E's built on. Guess it won't work for my current application (I need it to run several 8TB drives in an MD1000 chassis). Good thing they aren't expensive, lol. Oh, and the 1068E controllers ARE well supported with Unraid; I'm using one in a Cisco C200 M2 at the moment for testing, and it works perfectly fine with Unraid (aside from the size limitation).
JorgeB Posted May 8, 2017

6 minutes ago, heffe2001 said: Just noticed myself that it's basically a single card with 2 1068e's built on. Guess it won't work for my current application (need it to run several 8tb drives in a MD1000 chassis). Good thing they aren't expensive, lol.

Best options for you are the 9200-8e or the 9216-4i4e.
Fireball3 Posted May 8, 2017

9 minutes ago, johnnie.black said: Best options for you are the 9200-8e or the 9216-4i4e

I can't see why you recommend the cards with external connectors!?
JorgeB Posted May 8, 2017

Just now, Fireball3 said: I can't see why you recommend the cards with external connectors!?

19 minutes ago, heffe2001 said: (need it to run several 8tb drives in a MD1000 chassis)
heffe2001 Posted May 8, 2017

Yeah, I'm looking at the 9200-8e at the moment; still relatively cheap too. I've got an Areca ARC-1231ML in my current server that works great with this setup (2 x 4TB WD Red drives in a stripe set for parity, several 8TB Seagate drives plus a couple of 4TBs in the array, a 300GB Raptor for cache, and a 512GB SSD for all my Docker stuff; everything but the SSD is connected to the Areca card). I'm contemplating using 4 x 2TB Reds on the onboard 1068E controller for parity on the new setup, with the external box containing all the 8TBs (plus a couple of new ones; I'm running low on space at the moment, lol).
heffe2001 Posted May 8, 2017

3 minutes ago, johnnie.black said:

Yep, that's what I need for my situation; not sure about the OP though. These Cisco C200 M2 boxes are pretty nice for what they are: a 1U chassis with two 5650s, capable of up to 192GB of RAM, with 4 hot-swap bays. If you use the onboard 1068E controller you're limited to 2TB drives per slot up front, 8TB max (6TB with parity), but you can always put a different controller in the x8 PCIe slot and plug the front drives into that, and use a controller with external ports in the x16 slot going to something like an MD1000/MD3000 external chassis. That'd give you an additional 15 slots for drives and, depending on what controller you use, the larger 4TB+ drives, pushing upwards of 150TB depending on the drives used.
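heffe2001's capacity figures work out as follows (a rough sketch; the 10TB drive size behind the 150TB ceiling is an assumption inferred from the numbers in the post):

```python
# Capacity math for the Cisco C200 M2 + MD1000 setup sketched above.
# Onboard 1068E controller: 4 front bays, 2TB max per drive.
front_bays, front_max_tb = 4, 2
front_total = front_bays * front_max_tb          # 8 TB raw
front_usable = (front_bays - 1) * front_max_tb   # 6 TB with one parity drive

# MD1000 behind a SAS2 HBA (no 2TB limit): 15 slots.
md1000_slots, drive_tb = 15, 10                  # assumes 10TB drives
md1000_total = md1000_slots * drive_tb           # 150 TB

print(front_total, front_usable, md1000_total)   # 8 6 150
```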
Fireball3 Posted May 8, 2017

1 hour ago, johnnie.black said: need it to run several 8tb drives in a MD1000 chassis

I'm not familiar with that chassis, so I did a Google on it. Found a site with details and it seemed to be a full server. But from your answer I gather it's just an empty enclosure without a board etc.
JorgeB Posted May 8, 2017

10 minutes ago, Fireball3 said: But from your answer I get it's just an empty enclosure w/o board etc.

Yes, it's just a chassis with a built-in expander.
heffe2001 Posted May 8, 2017

Yeah, it's just a storage box without any sort of CPU. It has redundant power supplies and, most times, redundant interfaces on the back (or you can split the array into two halves, one with 8 drives and one with 7, each set controlled by one of the rear controllers; that's probably how I'll use it, with each controller on the back connected to a different port on the 8e card). If I remember correctly, you can chain 3 of them together (that may be an MD3000 + 2 x MD1000s, I can't exactly remember). I just wish they offered it in a tower version instead of just rack-mount. I had to 3D print a set of feet for it to sit vertically at our office for use with a T610 Hyper-V box that needed more drives.
ken-ji Posted May 8, 2017

Reminds me of mine; had some issues sourcing it from out of country, but it does the job nicely.
heffe2001 Posted May 9, 2017

Just around half the bays, but probably a whole lot quieter, lol. The MD1000 sounds like 3 vacuums running at once when it's at full fan speed, lol.
CyberSkulls Posted May 9, 2017

The LSI 9201-16e was mentioned earlier, and I happen to run two of them in unRAID; they work perfectly fine. I only bring that card back up since they're fairly cheap on eBay these days.
JorgeB Posted May 9, 2017

11 hours ago, ken-ji said: Reminds me of mine - had some issues sourcing it from out of country... but it does the job nicely.

I'd love one of those but they are so expensive...
miogpsrocks Posted May 9, 2017

On 5/8/2017 at 0:02 AM, heffe2001 said: I'm not 100% on Unraid, but as far as your board, it'll take one. I have one of those cards on order, so hopefully I can tell you one way or another in a few days as far as Unraid is concerned. LSI has Linux drivers for most of their cards, so I'm fairly certain this one will work though..

Did you get your card in? When do you think it will come in? Please let me know if they work. Thanks.
miogpsrocks Posted May 10, 2017

23 hours ago, CyberSkulls said: The LSI 9201-16e was mentioned earlier and I happen to run two of them in unRAID and they work perfectly fine. I only bring that card back up since they are fairly cheap on eBay these days.

Are you using these cards? Do you know if they have a 2TB limit? Thanks.
miogpsrocks Posted May 10, 2017

On 5/8/2017 at 3:39 AM, johnnie.black said: It works but it doesn't support >2TB, you'll want a LSI SAS2 controller, e.g., LSI 9201-16E

When you say "it works", are you referring to the LSI SAS31601E? Thanks.
JorgeB Posted May 10, 2017

1 hour ago, miogpsrocks said: When you say that " IT works", are you referring to the LSI SAS31601E ? Thanks.

Yes, it works but it won't recognise disks >2TB.
CyberSkulls Posted May 10, 2017

18 hours ago, miogpsrocks said: Are you using these cards? Do you know if they have a 2TB limit? Thanks.

They are SAS2 cards, and I am using two of these exact cards with 8TB Reds. So no, they have no such limit.
1812 Posted May 10, 2017

On 5/8/2017 at 10:06 AM, heffe2001 said: going to something like a MD1000/MD3000 external chassis (that'd give you an additional 15 slots for drives, and depending on what controller use

I use an MD1000 with an HP H220 HBA. There are 2 issues. The first is that either unRaid or my HBA absolutely hates having all 15 disks from the enclosure presented to it in unified mode, and it locks up unRaid. So I use it in split mode, with half as my primary array and the other half as a backup target for another server. It works for my needs. The second issue is that it connects to the HBA via an x4 cable, which should support over 1000MB/s. But it doesn't; I only see half the throughput. I believe (and this is not corroborated anywhere) that the MD1000 is wired to split the total bandwidth between the two sides of the enclosure, so you'll only ever get about 500MB/s out of each half max. If you get it working in unified mode then I'm sure the full bandwidth makes it less of an issue, but 6 disks splitting 500MB/s of bandwidth creates a bottleneck for parity checks, one that is still present with the bandwidth split.
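The "over 1000MB/s" figure for the x4 cable checks out on paper (a sketch; it assumes a 3Gb/s SAS1 wide link with 8b/10b encoding, which matches the MD1000's interface):

```python
# Theoretical throughput of an x4 SAS1 (3Gb/s) wide link between
# an HBA and the MD1000's enclosure management module.
lanes = 4
mbit_per_lane = 3000                 # SAS1 line rate: 3 Gb/s per lane
raw_mbit = lanes * mbit_per_lane     # 12000 Mb/s raw
usable_mbit = raw_mbit * 8 // 10     # 8b/10b encoding overhead
usable_mb_s = usable_mbit // 8       # ~1200 MB/s on paper

# If the enclosure splits bandwidth between its two halves (as
# speculated above), ~500 MB/s shared by 6 drives leaves roughly
# 83 MB/s each during a parity check.
per_disk_mb_s = 500 / 6

print(usable_mb_s, round(per_disk_mb_s))   # 1200 83
```

That per-disk figure is well below what a modern 8TB drive can stream sequentially, which is consistent with the parity-check bottleneck described above.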