jbrodriguez (Author) Posted September 10, 2012

Hi notandor, your proposal seems OK; I might look into doing it. I haven't had a chance to mess around with the DAS, since I'm now building a desk (based on some instructions) and switching from XBMC to JRiver Media Center on my HTPC, so I've had my hands full (and I need a working array for the latter). I'll post as soon as I'm able to work on the DAS.
notandor Posted September 10, 2012

Hi lboregard and Johnm, I just wanted to bounce some ideas off you that address two goals: maximum throughput from a PCIe x8 slot (to avoid slowdowns and bottlenecks) to an unRAID DAS, and a clean installation with an ESXi head plus one or more unRAID DAS boxes.

Do you think the following configuration would work?

ESXi head box (could be whatever you have, a Norco 2112, etc.):
- IBM M1015 in a PCIe x8 slot
- 2-port pitstop adapter (both inside ports internally connected to the two SAS ports of the M1015)

unRAID DAS (Norco 4224, 24 bays):
- Chenbro CK23601, which has 6 SAS ports inside for the backplane, i.e. all 24 bays connected
- 1-port pitstop adapter (connected to the internal IN port of the CK23601)

With this you essentially connect both machines with two SFF-8088 cables:
- ESXi head's 1st outside pitstop port -> the Chenbro CK23601's external IN port on the unRAID DAS
- ESXi head's 2nd outside pitstop port -> the single outside pitstop port on the unRAID DAS

This configuration can easily be extended to an ESXi head plus two DAS boxes by using a 4-port (instead of 2-port) pitstop in the head.

Advantages:
- Should allow a clean installation, clean cabling, etc.
- Maximum possible bandwidth from a PCIe x8 card (both M1015 ports) to the unRAID DAS, avoiding parity slowdowns; parity and cache can now live in the DAS itself.

The question is whether the Chenbro will allow both inputs to be used (from the M1015) and still work. I think other SAS expanders that don't specifically designate INs and OUTs (e.g. the Intel RES) allow this. Chenbro defines its ports as:

Internal ports
- Input from RAID/HBA: 4 ports (1x Mini-SAS)
- Output to backplane: 24 ports (4x Mini-SAS)

External ports
- Input from host: 4 ports (1x Mini-SAS)
- Output to JBOD: 4 ports (1x Mini-SAS)

Let me know what you guys think. Also, lboregard: aside from the original experiment of putting parity in the head, you might just connect a second cable from an M1015 port to the other IN on the Chenbro and run parity again (still in the DAS) to see if this speeds things up.

Johnm Posted September 10, 2012

From what I read on [H]ard, the Chenbro OUT ports only work as outputs and the IN ports only work as inputs, unlike the Intel units.
mrow Posted September 10, 2012

> From what I read on [H]ard, the Chenbro OUT ports only work as outputs and the IN ports only work as inputs, unlike the Intel units.

So what advantage does the Chenbro have over the Intel, then? Because it looks like the Intel is cheaper, too.
notandor Posted September 11, 2012

> From what I read on [H]ard, the Chenbro OUT ports only work as outputs and the IN ports only work as inputs, unlike the Intel units.

So you mean there's a chance the above proposal might work, since it doesn't actually modify the Chenbro's OUTs or INs; it just tries to use two INs (external and internal) at the same time? If this works, I think it will be a sweet way of keeping the head and DAS with clean boundaries and tidy cable/rack management, without worrying about any slowdown, by essentially passing the full PCIe x8 bandwidth through to unRAID. The ideal goal would be a Norco 4224 as a DAS with 2 parity and 22 data drives, once some future version of unRAID supports that, with the cache on an NFS RAID-Z pool in the ESXi head! :-) Until then, the empty hot-swap slots can be used for hot/warm spares. lboregard, eagerly awaiting your results...
notandor Posted September 11, 2012

> So what advantage does the Chenbro have over the Intel, then? Because it looks like the Intel is cheaper, too.

If you are building ESXi and unRAID in the same box, then really no advantage: you get 4 SATA ports from one SAS port of the M1015 and 20 SATA ports from five SAS ports of the Intel, so essentially 24 SATA ports; the other M1015 port is connected to one SAS port of the Intel.

If you are building ESXi and unRAID in different boxes AND you plan to keep parity and cache in the ESXi box, then also no advantage as of today: parity and cache go on one port of the M1015, and with the Intel you can populate 20 data drives, the maximum number of drives unRAID supports today.

If you are building ESXi and unRAID in different boxes AND want to keep all unRAID-related hardware in one box (I do), then the Chenbro comes to the rescue: it allows 24 ports, so 1 parity, 1 cache, 20 data, leaving 2 free for the future.

If one wants to be futuristic, they could buy the next Intel version, the RES2CV360, which has 36 ports at almost the same price as the Chenbro, though all of them are internal and 1 or 2 would be used to connect to the M1015, and hope that someday unRAID will use them all!

The Chenbro is the solution if you want clean boundaries in a rack, for example one with sliding rails. In that case you will want external cables you can plug and unplug to connect and disconnect the two boxes.
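The port arithmetic in the three scenarios above can be sketched as follows (a rough tally, assuming one Mini-SAS port fans out to 4 drives as described in the thread):

```python
# Port-budget tally for the build options discussed above.

def budget(name: str, ports: int, parity: int = 1, cache: int = 1, spares: int = 0) -> int:
    """Return the number of data-drive slots left after parity/cache/spares."""
    data = ports - parity - cache - spares
    print(f"{name}: {ports} ports -> {parity} parity + {cache} cache + "
          f"{data} data + {spares} spare")
    return data

# Same-box build: 1 M1015 port direct (4 drives) + Intel RES2SV240,
# 5 of its 6 ports to drives (20 drives); the 6th port uplinks to the M1015.
same_box_ports = 4 + 20            # 24 SATA connections total

# Chenbro CK23601 DAS: all 24 backplane connections live in the DAS,
# keeping everything unRAID-related in one box.
chenbro_data = budget("Chenbro DAS", 24, parity=1, cache=1, spares=2)
```

So the Chenbro layout gives 20 data drives today with 2 bays held back as spares, matching the "1 parity, 1 cache, 20 data, 2 free" breakdown above.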
mrow Posted September 11, 2012

When I looked at the picture of the Chenbro, I missed the 3 SAS ports in the center of the card and only saw the 4 off the back. That's why I didn't understand the advantage, haha.
notandor Posted September 17, 2012

> Hi notandor, your proposal seems OK; I might look into doing it... I'll post as soon as I'm able to work on the DAS.

Hi lboregard, any new update?
jbrodriguez (Author) Posted September 23, 2012

I tried today... that's when I realized I don't have an SFF-8087 cable long enough to go from the M1015 port to the internal input port on the Chenbro.
jbrodriguez (Author) Posted October 11, 2012

I finally connected an additional cable from the second port on the M1015 to the second input on the Chenbro. I think the speed improvement was marginal. FWIW, I didn't reboot the server, just plugged the cable in at both ends, so I'm not sure whether some initialization is required for it to acknowledge the presence of both links.
Johnm Posted October 12, 2012

You might want to check whether the Chenbro supports link aggregation; I'm not sure it does. I'd hate to see you fry something.
ldasilva Posted October 12, 2012

Nice, lots of data... as drives fail, are you going with 3TBs?
jbrodriguez (Author) Posted October 12, 2012

> You might want to check whether the Chenbro supports link aggregation; I'm not sure it does.

Thanks Johnm, I'll look into that, although I don't really plan to run both M1015 ports into the Chenbro.
jbrodriguez (Author) Posted October 12, 2012

> Nice, lots of data... as drives fail, are you going with 3TBs?

Yes, I'm going with 3TB drives, but on the second DAS. I've already purchased 3; they should be here next week.
notandor Posted November 2, 2012

> Yes, I'm going with 3TB drives, but on the second DAS. I've already purchased 3; they should be here next week.

lboregard, which 3TB drives did you purchase? Brand? Model? Type?
technocoma Posted November 3, 2012

Nice to see someone has done this! I have just started buying parts to do it myself. I'm currently running unRAID under ESXi in a Norco 4220 and have pretty much filled it up, so I'm buying a Norco 4224; the only other difference is that I'm planning to use the Intel expanders. I'm updating my motherboard/processor while I'm at it. It seems I'll have to order some bits from the States, since Supermicro motherboards are a pain to find over here, plus a PE-2SD1-R10 for the DAS (if it will work?), so it might take a while to get it all collected. Will keep an eye on this thread!
jbrodriguez (Author) Posted November 3, 2012

Hi, I purchased one WD Red (made it parity) and two WD Greens. It has been running fine so far!
th0r Posted November 13, 2012

Does anyone know of any way to pool two unRAID shares into one? Or whether there are any plans for unRAID to do it? Or whether unRAID will ever support 2 parity / 40 drives?
Johnm Posted November 13, 2012

Limetech put up a poll asking whether multi-server shares were something people would want, so there might still be a possibility of that happening after the stable 5.0 release. Many media players, like XBMC, will let you pool several media shares into one library. I have been talking about virtualizing my second unRAID and going DAS for a long time, and that time is coming way too fast: my main 4224 is at maximum capacity (with 3TB drives) and has less than 4TB free, so the only way to expand is head + DAS. If (when) I do that, I'll plan on going head + DAS + DAS. At this rate I'll have to do it before Christmas.
technocoma Posted November 13, 2012

Very interesting; it would be nice for that to happen. I use XBMC, so it is pretty easy to add all the different locations, but using Sick Beard means always having to make sure there's space on the original unRAID for the TV shows that will continue to download there.
jbrodriguez (Author) Posted March 13, 2013

A quick update: I had a four-year-old 1.5TB drive die on me. It was attached to the hermes VM (NZB downloading), which was running FreeBSD 9 with a ZFS pool on each disk: no redundancy, no nothing. I thought long and hard about what to do, and in the end I decided to go for a complete overhaul:

- install ESXi 5.1 from scratch
- install FreeBSD 9.1 for hermes, with a 500GB disk for temp files and such, and a 2x1.5TB ZFS mirror pool as an "unRAID cache"
- install Solaris 11.1 for atlas, with two ZFS pools: a 2x2TB encrypted mirror for backups and a 4x1TB RAID-Z pool (previously 5x1TB) to serve as an NFS datastore and for other random purposes

I backed up the contents of the VMs, but not the VMs themselves (I really didn't feel like going the ghettoVCB route). I also tried to update the Supermicro BIOS to 2.0b, but it didn't work: neither from IPMI with the servethehome.com ISO, nor from a physical USB stick with the BIOS straight from Supermicro's site. I was worried that my three M1015s would throw a fit, but they handled it like champs: passthrough went just fine, and attaching them to the unRAID VMs worked flawlessly as well. All in all, a very smooth "upgrade", with no hardware changes at all except for the new hard drives.
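As a sanity check on the layout above, here's a quick sketch of the usable capacity of each pool (raw marketing TB; real formatted capacity is lower and ZFS reserves some space):

```python
# Usable-capacity estimate for the ZFS pools described above.

def mirror(*disks: float) -> float:
    """A mirror's capacity is that of its smallest member."""
    return min(disks)

def raidz1(*disks: float) -> float:
    """RAID-Z1 gives up one disk's worth of space to parity."""
    return min(disks) * (len(disks) - 1)

cache   = mirror(1.5, 1.5)                  # hermes: 2x1.5TB mirror
backups = mirror(2.0, 2.0)                  # atlas: 2x2TB encrypted mirror
nfs     = raidz1(1.0, 1.0, 1.0, 1.0)        # atlas: 4x1TB RAID-Z
old_nfs = raidz1(1.0, 1.0, 1.0, 1.0, 1.0)   # previous 5x1TB RAID-Z

print(cache, backups, nfs, old_nfs)  # 1.5 2.0 3.0 4.0
```

So the rebuild trades about 1TB of NFS datastore space (4TB down to 3TB) for freeing up a drive, while the backup and cache pools both survive a single-disk failure.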
mrow Posted March 14, 2013

> install Solaris 11.1 for atlas, with two ZFS pools: a 2x2TB encrypted mirror for backups and a 4x1TB RAID-Z pool (previously 5x1TB) to serve as an NFS datastore and for other random purposes

Solaris or OpenIndiana? If Solaris, why choose it over OpenIndiana?
jbrodriguez (Author) Posted March 15, 2013

Solaris, because it's currently the only implementation with native ZFS encryption. All the other Solaris derivatives have to handle encryption through a separate layer (GELI, etc.).
jbrodriguez (Author) Posted January 23, 2014

I made some changes to this setup. Read more about it here: http://lime-technology.com/forum/index.php?topic=31476.0