The Monolith - 17-drive Fractal Design XL R2 build [initial build complete]



UPDATE 2018/06/07 Pictures re-uploaded

UPDATE 2014/05/19 Power outage

UPDATE 2014/03/16 Installed additional exhaust fan, HDD cage, and fan controller, parts list and price updated accordingly.

UPDATE 2014/03/03 Initial build complete, post partly re-written. DIY fan bracket. Cache drive added. Ethernet speed woes fixed. Fixed images in OP, added new ones.

UPDATE 2014/02/01 Confirmation that the 2nd SATA card works, addendum to label section

UPDATE 2014/01/23 Added section on mobo surgery and HDD labels under BUILD section

UPDATE 2014/01/19 Likes, Dislikes, Add-ons, Future plans, Cost, and Current Status updated

UPDATE 2014/01/11 BUILD section below updated

 

This post is split into sections:

  • PLANNING,  OVERVIEW, AND PARTS LIST
  • BUILD
  • UNRAID INSTALLATION AND SETUP
  • CURRENT STATUS

 


PLANNING, OVERVIEW, AND PARTS LIST

 

After years of hesitation and planning, I finally decided to pull the trigger on an unRAID box. The array is mainly for XBMC video streaming and photo storage. The build is heavily based on the UK version of the budget box in the wiki.

 

The biggest hurdle during planning was picking the case. After hours and hours of reading reviews and watching YouTube videos, I had three on my short list: the Antec Twelve Hundred, the Cooler Master Storm Trooper, and the Fractal Design XL R2.

 

I ended up with the Fractal Design, mainly due to its noise-dampening features and minimalistic, sexy looks. It has eight 3.5" bays and four 5.25" bays, so with a 4-into-3 cage and a 5.25"-to-3.5" bracket it holds 13 drives (8 + 4 + 1). You can also add a third 4x3.5" lower cage (sold separately), confirmed by Fractal Design tech support, which brings the total to 17 drives.

 

The build is partly from existing parts, as I'm upgrading my desktop at the same time.

 

CPU: Pentium E6500 (from current computer)

Motherboard: Asus P5Q SE PLUS (from current computer)

RAM: 8GB (from current computer)

Case: Fractal Design XL R2

Drive Cage(s): Silverstone CFP52B (4-into-3), StarTech 3.5"-to-5.25" bracket, extra Fractal Design lower cage

Power Supply: Seasonic Platinum Series 660 Watt

SATA Expansion Card(s): 2 x Supermicro AOC-SASLP-MV8, each supporting 8 drives

Cables: 4 x 3Ware Multi-lane SATA cable CBL-SFF8087OCF-05M

Fans: the case comes with 3 x 140mm fans; added two be quiet! Silent Wings 2 140mm fans (front and top)

Fan Controller: NZXT LXE

UPS: APC Back-UPS Power-Saving ES 8 outlet 700VA (does not support gigabit ethernet, see "Notes and Warnings" below)

Ethernet surge protector: APC ProtectNet

GPU: Sapphire HD 6450 (my mobo won't boot without a GPU, and doesn't have integrated graphics. This is dirt cheap at 33 EUR, and has low power consumption at 5W idle.)

CPU cooler: Cooler Master Hyper TX3 EVO

 

Drives: 13 assorted 2TB drives

Cache Drive: yes

Total Drive Capacity: ~22TB. Maximum capacity with 17x2TB drives (no cache drive) is 29TB based on the unRAID capacity calculator: one drive goes to parity, and 16 data drives x 2 TB = 32 TB decimal, or roughly 29 TiB as the OS counts it.

 

Primary Use: XBMC streaming; movie, TV series, MP3, and photo storage

Likes: The case looks beautifully understated and is very solidly built. The box is whisper quiet on the fan controller's lowest setting with five fans, and only rises to a gentle hum on the highest.

Dislikes: The case doesn't have much space for cables behind the motherboard tray, which makes cable management unnecessarily difficult - the Define R4 actually has much more room. The second lower HDD cage also needs to be removed completely to work on its drives or cables.

Add Ons Used: Dynamix with various add-ons, namely directory cache, temps, stats, and monthly parity check.

Future Plans: Install another fan in the top of the case. I will also try to get CrashPlan working after the file migration.

 

Cost: 1,347 EUR excluding shipping, including 2 x unRAID Pro keys, 4 x 2TB WD and Toshiba green drives, and various SATA and power cables and splitters. Does not include the cost of parts taken from my current computer.

 

Note that the CPU, Mobo and RAM are from my current computer, so they are probably overkill for an unRAID box that doesn't need to do transcoding.

 


BUILD

 

Parts arrived. I didn't have to pay extra for shipping, although the parts came in three large boxes (Blu-ray case for size reference):

 

DSC_0335.jpg

 

Portrait of everything that arrived, cases excluded. This includes parts for a desktop upgrade I'm doing at the same time (thread on building Spaceman Spiff on HardOCP Forums).

 

DSC_0337.jpg 

 

And here's the family portrait. Spaceman Spiff v1 (desktop) up front (notice the pro cable management) in a no-name case, which is headed for a well-deserved retirement once I'm done. Its parts will be split between the Fractal Design Define R4 for Spaceman Spiff v2 in the middle, and the Fractal Design XL R2 at the back for The Monolith.

 

[photo lost]

 

I first built Spaceman Spiff v2 and installed Linux Mint on it. Then on to The Monolith. Below is the Fractal Design XL R2 case after installing the PSU. On the left is the nice cable bag that came with the Seasonic Platinum PSU.

 

I later moved the bottom fan to the front face, as I'm planning to install a second lower drive cage in that location as prep for future expansion.

 

[photo lost]

 

Here's the front face. The lower fan is the one I moved from the bottom of the case; the top fan came with the case. No tools are needed to install these fans, but you have to pull the cables partway out of their guides for the fans to seat properly.

 

The Silverstone 4-into-3 sits in the 5.25" bays, no modification necessary. The plastic (!) 5.25"-to-3.5" bracket in the top bay is also visible; I swapped it for a metal one later.

 

DSC_0374.jpg

 

First stage: mobo, memory, CPU, GPU, and ethernet card installed, along with three new 2TB drives, which are running a pre-clear at this moment. I wasn't sure if the ethernet port on the mobo was working, but it is, so I'm pulling the add-on card next time I shut her down.

 

DSC_0361.jpg 

 

Initial build complete below. It features HDD labels, a mobo that went through surgery to fit the x4 SATA cards into x1 slots, and the 4-into-3 cage popping out to accommodate the DIY fan bracket. Details on each are in the updates below.

 

DSC_0382.jpg 

 

INITIAL BUILD UPDATE 1:

 

I broke a mounting pin on the stock cooler, so I bought a Cooler Master Hyper TX3 EVO, which was cheap at 17 EUR. I don't expect the CPU to get very hot, so I was mainly concerned about noise; this cooler is supposed to be very quiet thanks to its low-RPM fan (edit: quietness confirmed). The store also had a green 5900 rpm drive for dirt cheap at 66 EUR, so I bought one as well.

 

Mobo surgery:

It quickly dawned on me that the two SATA cards I ordered are PCI-E x4 cards, and my mobo has two x1 slots. After research, there were a few options:

- Buy a new mobo (€€€, whole build would have to be re-done)

- Exchange the SATA cards for x1 cards (stock status at retailers: 4-6 weeks)

- Buy a riser card

- Perform surgery on the PCI-E x1 slot on my mobo

 

I dismissed the first two options for the reasons in parentheses, so I was left with the latter two. I was surprised to learn that fitting x2/x4/x8/x16 cards in x1 slots is not only possible, but done quite routinely - enthusiasts who mine Bitcoins with GPUs do it to fit multiple GPUs on one mobo. You of course lose the bandwidth offered by the extra lanes, but that doesn't matter for Bitcoin mining.

 

The first of these was to buy a riser card, and I found some locally on eBay. The other was to perform surgery on my mobo by cutting one end of the x1 slots open so that an x4 card could fit. When I first heard about this (credit goes to zorinac) I thought it was a joke, but after further research it didn't sound like such a hare-brained idea after all. Risky, yes.

 

Below is photographic documentation of the surgery.

 

DSC_0356.jpg

DSC_0357.jpg

DSC_0359.jpg

DSC_0347.jpg

DSC_0348.jpg 

 

Since I don't have a fancy Dremel tool, I found some grinding stones for my power drill. I took The Monolith apart, removed the mobo, and put it in a cardboard box to contain the ground-off plastic. Safety first, wear goggles! I carefully ground away the ends of the two slots as in the link above, holding the drill and grinding stone on the same axis as the slot. It only took a few minutes.

 

As the drill is quite big, I had to grind at a slight angle, so the bottom part of the grind didn't open the slot fully; I used a knife to cut open the rest. Yet another roadblock was that the SATA cards were not full height. I jury-rigged a professional-grade retainer out of duct tape, as seen in the pictures, and attached the multi-lane SATA cable.

 

Before I tested the card, I confirmed that the array works. I moved some cables around (parity drive to mobo SATA port 1, just in case it makes a difference), and the sda/b/c/... assignments were different after I booted up. I confirmed on the forums that they don't matter - as long as the drives themselves are in the correct order (disk1 is still the same Western Digital, etc.), you're OK. I started the array, tested a file, and everything worked.
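
For reference, if you want to double-check which serial number is currently behind which sdX letter without pulling anything, listing the by-id symlinks is one way to do it - a minimal sketch, run on the unRAID console or over a telnet session:

    ls -l /dev/disk/by-id/ | grep -v part

Each symlink name contains the drive model and serial and points at whatever sdX the kernel assigned it on this boot.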

 

Then I shut it down again, moved the single 1TB drive I had over to the SATA card, replaced it on the mobo port with a brand-new 2TB drive I picked up, and fired up the array. The card didn't require any BIOS or unRAID tweaks, it just started working. The 1TB drive worked, which means the surgery was successful! I was glad I wasn't the victim of some elaborate hoax. The second card and the other x1 port work as well.
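
If you want to confirm a card is actually detected in the modified slot before trusting it with drives, lspci is a quick sanity check. As far as I understand the MV8 is Marvell-based, so something along these lines should show one entry per card:

    lspci | grep -i marvell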

 

This brings the maximum number of HDDs to 22: six on the mobo and eight on each of the two SATA cards. The case holds "only" 17 drives, though.

 

HDD labels:

It occurred to me that with 10+ drives in the array, figuring out which drive to pull in case of a drive failure could be a major PITA. To limit that P, I labeled the drives.

 

I used two separate labels, one for the unRAID designator and the other for the last four digits of the serial number. That way I don't have to redo the entire label if I decide to change the order of the disks. I also noticed that the sda/b/c/... designation can change and doesn't really matter, so it's probably unnecessary information - it was already outdated by the time I booted her up.
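
Side note: you can read the full serials without pulling any drives, which is handy when writing the labels. Assuming smartctl is available on the box, something like this does it, with /dev/sdb just an example device:

    smartctl -i /dev/sdb | grep -i serial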

 

If you don't label the drives, you have to pull a drive out to confirm it's the right one - and Mr Murphy will make sure you have to check several. Even with the nice caddies the Fractal Design case has, that's a hassle, as the drives won't come out unless you unplug the power and SATA cables first.

 

It might also be handy to know where each drive's cable goes, so I will be adding that to the labels, e.g. "SATA3" for mobo SATA port 3, or "CARD 2/2/4" for SATA card 2, port 2, cable 4.

 

DSC_0364.jpg 

 

INITIAL BUILD UPDATE 2:

 

DIY Fan Bracket

 

The top four 5.25" bays hold the 4-into-3 and 3.5"-into-5.25" cages. The FD case doesn't have a fan slot for these drives, and I wasn't surprised when temps rose to well above 40 degrees even with only one drive in the 4-into-3 cage. I had to build my own fan bracket.

 

Here are the parts: a be quiet! Silent Wings 2 PWM 140mm fan, screws, and brackets. I made the four L-shaped brackets out of aluminum - sawed them from a longer piece, marked the holes carefully to align with the case and fan, and drilled them with a power drill.

 

 

[photo lost]

 

The nuts and bolts attach to the case and are quite difficult to get in place, so I opted for a semi-permanent solution: when I need to make changes to the 4-into-3 cage, I remove it from the back. There is just enough clearance between it and the mobo, thanks to the large case.

 

Below you can see the top of the case, with the drive cage popping out. The cage is only mounted on one row of screws, but it is secure.

 

[photo lost]

 

Here's a front view of the fan in place. Instead of screws, the fan is mounted with the push-pins that came with it. It is very secure and doesn't vibrate at all, even at the highest speed.

 

[photo lost]

 

Unfortunately this is not enough to keep the temps at bay; even at the highest speed setting they climb close to 40 degrees. I will be adding a top fan to the case.

 

Cache drive

 

Although my initial plan did not include a cache drive, I elected to use one anyway. My reasons were its use as a warm spare, speeding up torrent transfers, and avoiding wear and tear on the parity drive - the cache drive sits outside parity protection, so writes to it don't touch the parity disk until the mover runs.

 

WARNING: Ethernet speeds and UPS

 

I hadn't bothered to check what speeds to expect from an unRAID array, and happily moved several terabytes over at 10 MB/s or lower. When I installed the cache drive, I wondered why it didn't improve transfer speeds at all. I also found out that I should have been expecting at least twice that speed even when writing straight to the parity-protected array.

 

After a lot of testing and head-scratching, I tracked the culprit down to my APC UPS. The UPS is in spec - it never claims to support gigabit ethernet - but routing the LAN cable through its data-line protection ports limits the link to 100 Mbit/s, i.e. roughly the 10 MB/s I was seeing. Instead of replacing two UPSs, I ended up buying APC ProtectNet units, and now my speeds are as expected.
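
If you want to check for this kind of problem directly instead of guessing from transfer rates, look at the negotiated link speed. From any Linux machine on the link (ethtool ships with most distros; eth0 is just the usual interface name):

    ethtool eth0 | grep -i speed

100Mb/s instead of 1000Mb/s means something in the path - in my case the UPS - is dragging the link down.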

 

So before you buy a UPS, double-check that its data-line protection supports gigabit ethernet, or be prepared to protect the line some other way, like I did.

 


UNRAID INSTALLATION AND SETUP

 

I installed unRAID on a SanDisk Cruzer 8GB (the smallest I could find) low-profile USB stick. The Asus P5Q SE Plus's BIOS is a bit strange about booting from USB: the stick is listed under hard drives, not in a separate USB device category. Once I figured that out, I just moved it to the top of the boot order and unRAID booted up. So with this mobo you do need a monitor for the initial setup.

 

Installation of unRAID itself was a breeze - after I managed to find a way to make a bootable USB stick in Linux. I followed the tutorial here. I noted the IP address of the unRAID box, fired up Firefox on Spaceman Spiff, and was connected right away - no need to touch the router or configure any software other than plugging it in. The IP address wasn't even needed; browsing to //tower worked. Really user-friendly!
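
For reference, here is roughly what the Linux route boils down to - treat it as a sketch, the tutorial is the authoritative version, and /dev/sdX1 is a placeholder for your actual stick (check with lsblk first!):

    sudo mkfs.vfat -F 32 -n UNRAID /dev/sdX1   # FAT32; the volume label must be UNRAID
    sudo syslinux /dev/sdX1                    # install the bootloader
    sudo mount /dev/sdX1 /mnt
    # copy the contents of the unRAID release zip onto the stick, then:
    sudo umount /mnt

Depending on the stick you may also need to mark the partition bootable (e.g. with fdisk) before the BIOS will pick it up.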

 

What wasn't very user-friendly was the installation of unMenu. I installed PuTTY on Spaceman Spiff and was able to install unMenu with these instructions. It was easy, but probably a bit intimidating for those who aren't used to a CLI (it's been a few years for me).

 

Then I started three telnet sessions to run the pre-clear script on the three disks. Now I can't reboot Spaceman Spiff without killing those sessions, so I'm a bit stuck with making progress on the upgrade. My friend told me there's a way to just have the pre-clears running in the background, but I haven't looked into that yet. edit: you can do it with an unfortunately-named Linux utility called screen (good luck googling that :P). Here is a very easy step-by-step guide to pre-clearing with screen, which I highly recommend over a straight telnet session. /edit
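
In practice the screen workflow looks something like this (assuming screen is installed on the unRAID box as per the guide, and that the pre-clear script sits on the flash drive - the device name is just an example):

    screen -S preclear1              # start a named session
    /boot/preclear_disk.sh /dev/sdc  # kick off the pre-clear
    # detach with Ctrl-A then d; the script keeps running on the server
    screen -r preclear1              # re-attach later to check progress

The telnet session, or the whole client PC, can then come and go without killing the pre-clear.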

 

I didn't have the fans running on the open case, and the HDDs have been running between 35 and 42 degrees Centigrade. I'll need to buy some Molex adapters for the fan leads, since the mobo doesn't have enough headers to run three case fans. edit: I opened the instruction booklet :o which came with the case, and it clarified this: the built-in fan controller is connected to the PSU with a Molex connector, and the fan controller has three fan connectors of its own. /edit
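
By the way, you don't need the web GUI to keep an eye on temps during a pre-clear; the drives report them over SMART. A quick check from the console, with sdb again just an example device:

    smartctl -A /dev/sdb | grep -i temperature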

 

Pre-clearing the three 2TB drives took from 20 hrs 20 mins to 21 hrs 5 mins. The slowest drive I've successfully pre-cleared without errors took 25 hours. Formatting was done in a few minutes, and then I set one of the WD drives as the parity drive and the rest as data drives, and started the parity sync, which took 3-4 hours.

 

"Shares" was not available for setting up user shares; you have to turn user shares on from the Settings tab for it to show up - this was not mentioned in any of the guides I've seen. I set the first test user share to a split level of 100 (i.e. effectively unrestricted), with high-water allocation, and excluded disk1, which I use for "the rest": MP3s, photos, ebooks, and personal stuff that fits on one disk.

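If you want to test a share from a Linux desktop, mounting it over SMB is one way. A minimal sketch - "Videos" is a made-up share name, and it assumes the share is public/guest-accessible and that cifs-utils is installed on the Mint side:

    sudo mkdir -p /mnt/monolith
    sudo mount -t cifs //tower/Videos /mnt/monolith -o guest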
 

Notes and warnings

  • The Fractal Design XL R2 comes with "only" 32 HDD screws (i.e. enough for 8 drives). Fortunately the screws are the same as for the Define R4 case I bought at the same time. Your existing HDD screws might not be long enough, since there are anti-vibration grommets in the quick-swap bays.
  • The Silverstone 4-into-3 comes with a 120mm fan, but without modding it only mounts in certain Silverstone cases. I didn't install it.
  • The XL R2 and the Seasonic Platinum PSU are both fully painted, so it's hard to find bare metal to clip an anti-static strap to. I finally found an exposed screw end.
  • The blue power LED is really bright and annoying, so I pulled that lead out.
  • I was shocked to learn that if you stop a file transfer to the array mid-stream, the incomplete file will not be deleted under Linux; there are no incomplete files when doing the same in Windows. Therefore I ended up running md5deep hashes to ensure critical data is identical to the source (see the sketch after this list), and checking for duplicates in non-critical data. More on that topic here.
  • I had three (!) drive failures during the first weeks of the build among the seven or eight old HDDs I migrated. One failed pre-clear outright (it would not complete at all), and two failed during the first few days. Therefore I recommend running two pre-clear cycles even on used drives.
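
Here is the kind of md5deep check I mean - a minimal sketch with made-up paths; -r recurses, -l stores relative paths, and -x prints any file whose hash is not in the known list:

    # on the source machine
    md5deep -r -l /path/to/source > source.md5
    # then against the copy on the array (or its network share)
    md5deep -r -x source.md5 /mnt/user/Videos

An empty result means every file checked on the array hashes to something in the source list; note it won't catch files that are missing from the copy altogether.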

 


CURRENT STATUS

 

Currently at 16 drives, including parity and cache, for a total of 35TB of space.


For a pure storage server your repurposed CPU, mobo, and RAM should be more than sufficient. It should even be sufficient for most plugins. If you ever decided to go the Plex route (assuming not as you use XBMC) it should be sufficient for at least a single transcoded stream.

 

As for the fans, IMHO it's not overkill. You want to keep your drives as cool as possible.


I've owned both products - admittedly only the Antec 900 case, not the Twelve Hundred you're considering.

 

I would go for the Fractal if you want build quality and a beautifully polished product, but with the downside of trickier maintenance and drive access.

 

I would go for the Antec case if you want quick and easy access to the drives (if you buy a 5-in-3 cage - see my blog for a review of the ones I got from X-Case UK: http://blog.ktz.me/?p=145). The product isn't as refined, but then neither is the price. You'll find the Antec keeps things a lot cooler (in my Fractal the drives got toasty, upwards of 40 C on some days; they are now 28-34 C in the same physical location).

 

Enjoy your decision making!

 

 

PS - I might be tempted to sell my Supermicro SAS card in the near future; it's the same model as you listed, I think. PM me if interested.

If you ever decided to go the Plex route (assuming not as you use XBMC) it should be sufficient for at least a single transcoded stream.

 

Most of my content is 720p or 1080p native, which gets streamed to the HTPC and on to my 1080p projector, so no transcoding is necessary. I watch BDs on a PS4. Or am I missing something where transcoding would be necessary to get from unRAID through XBMC to the screen?

 

I've owned both products - admittedly only the Antec 900 case, not the Twelve Hundred you're considering.

 

I would go for the Fractal if you want build quality and a beautifully polished product, but with the downside of trickier maintenance and drive access.

 

I would go for the Antec case if you want quick and easy access to the drives (if you buy a 5-in-3 cage - see my blog for a review of the ones I got from X-Case UK: http://blog.ktz.me/?p=145). The product isn't as refined, but then neither is the price. You'll find the Antec keeps things a lot cooler (in my Fractal the drives got toasty, upwards of 40 C on some days; they are now 28-34 C in the same physical location).

 

Enjoy your decision making!

 

PS - I might be tempted to sell my Supermicro SAS card in the near future; it's the same model as you listed, I think. PM me if interested.

 

With the Twelve Hundred you'd have to break those tabs to fit the cages. It's also more expensive here (Netherlands) than the Fractal Design XL, and with the cages that need to be bought on top, the price ends up quite a bit higher than the Fractal Design.

 

How does the noise level compare between the Antec and the Fractal? I have a small apartment, and the box would be in the same room where I watch my videos. The thing is, based on what you and others have said, more fans might be necessary in the FD case, which could bring its noise level up to par with the Antec. Tough one.

 

I might take you up on the offer, so I might be dropping you that PM :) I'll be doing the build in Jan/Feb.


If you ever decided to go the Plex route (assuming not as you use XBMC) it should be sufficient for at least a single transcoded stream.

 

Most of my content is 720p or 1080p native, which gets streamed to the HTPC and on to my 1080p projector, so no transcoding is necessary. I watch BDs on a PS4. Or am I missing something where transcoding would be necessary to get from unRAID through XBMC to the screen?

 

You're not missing anything. In your current setup transcoding is not necessary; unRAID will simply stream the raw file to your HTPC.

 

I was just stating that if you ever wanted to use Plex in the future for on-the-fly transcoding, to devices like iPads for instance, your existing CPU would only be good for a single concurrent stream.


If you ever decided to go the Plex route (assuming not as you use XBMC) it should be sufficient for at least a single transcoded stream.

 

Most of my content is 720p or 1080p native, which gets streamed to the HTPC and on to my 1080p projector, so no transcoding is necessary. I watch BDs on a PS4. Or am I missing something where transcoding would be necessary to get from unRAID through XBMC to the screen?

 

You're not missing anything. In your current setup transcoding is not necessary; unRAID will simply stream the raw file to your HTPC.

 

I was just stating that if you ever wanted to use Plex in the future for on-the-fly transcoding, to devices like iPads for instance, your existing CPU would only be good for a single concurrent stream.

 

Got it, thanks. No plans for that, at least at the moment.

  • 2 weeks later...

OP modified: added the GPU choice, a Sapphire HD 6450. It has passive cooling and idles at around 5 W. Added a capacity calculation update and updated the cost estimate.

 

edit: updated with a PSU upgrade to the Seasonic Platinum Series 660 Watt. It's only a few EUR more than the X-650, has a similar or slightly better feature set (fanless idle being important for me) and similarly ridiculously great reviews, but comes with a 7-year (!) warranty as opposed to 5 years for the X-650.


Trigger pulled! Stock status is up to five days.

 

OP updated with final component list, cost updated.

 

Thank you for all the help, really appreciate it! I'm sure I'll need more when the parts arrive :)

Post pics!

 

I will, definitely. It'll be a week or two due to shipping, and I'm upgrading my desktop at the same time, building it first.


The parts arrived on Thursday! That was within the 3-5 day delivery window, and I didn't have to pay any extra despite the rather large box plus two separate boxes for the case and the UPS :)

 

The Silverstone CFP52B 4-into-3 bay comes with a 120mm fan, which I didn't know. The "detailed installation guide we have prepared meticulously for you" consists of seven images and a few accompanying words. It gets the job done, but mentions nothing about fan installation.

 

After some googling: unfortunately the fan does not mount onto the cage itself, but into certain Silverstone cases. One might be able to mod it to fit, but I'll forgo that for now. The HDD brackets are thick plastic; I'll see how they fare. Otherwise it seems to be a solid cage, and it slid into the XL case without a problem.

 

The Seasonic Platinum Series 660 came in a pretty nice box, but when you open it you're in for a surprise: the unit is packed in a felt pouch, like a fine bottle of spirits. The modular cables come in two nice bags as well.

 

Installed the PSU and the three HDDs that I ordered. Tomorrow I'll continue with the rest - although I'll probably start by building my desktop, which I'm upgrading at the same time.

 

Posting more tomorrow, with pics!


OP updated with build and installation info, and pics! Split the post into sections.

 

edit: section OPEN QUESTIONS added at the bottom of the OP. I'm trying to figure out how to get all five case fans running off the case's built-in fan controller, so any help with that is welcome!

 

edit2: question above settled: the fan controller is connected to the PSU with a Molex connector; further detail in the OP. Parts list updated with the CPU cooler and a fourth HDD, cost updated.


I've run into a major roadblock. The SATA cards arrived and they are PCI-E x4 cards, but the mobo only has two PCI-E x1 slots.

 

Is there an adapter, or do I have to get a new mobo? Perhaps there are SATA cards that fit PCI-E x1 slots - this card seems to fit. Or is it worth upgrading the mobo? How slow is x1 compared to x4 with up to four HDDs attached to one card?

 

edit: Looks like I'm fucked. The only controller available over here is the Highpoint 2640, which has not been tested to work with unRAID. Looking at the unRAID compatibility page, Highpoint cards often seem to need scripts or other tweaks to get them to work. The PCI-E x1 cards have a 4-6 week delivery estimate, which essentially means "you might get them if we can find them."

 

Changing the mobo is also not an option. None of the Socket 775 mobos available over here have PCI-E x4 slots.  :'(

 

edit2: The Dawicontrol 3410 (pdf) uses the Silicon Image Sil3124-2, which is on the compatibility list. It's a PCI card with four SATA ports. This might be an option, though probably quite a downgrade in speed.


Clearly you need PCIe x4 slots for the SATA controller cards.  This requirement is clearly stated in the specifications for the cards, so it shouldn't have been a surprise.

 

The best solution is to just replace the motherboard and CPU with a new Socket 1150 board with the appropriate expansion slots, and get a Haswell CPU for it.    ... or a Socket 1155 board and CPU, if those are more available for you.

 

A PCIe x1 controller will work fine, but will only support a max of 4 drives; and will be bandwidth limited when multiple drives are accessed simultaneously (as in parity checks).

PCI controllers are even more bandwidth-restricted, although they'd work okay, if you really don't want to change the motherboard/CPU.

 


Clearly you need PCIe x4 slots for the SATA controller cards.  This requirement is clearly stated in the specifications for the cards, so it shouldn't have been a surprise.

 

It was. Not everyone is as knowledgeable as you are about computers.

 

The best solution is to just replace the motherboard and CPU with a new Socket 1150 board with the appropriate expansion slots, and get a Haswell CPU for it.    ... or a Socket 1155 board and CPU, if those are more available for you.

 

That's another few hundred more EUR :( And that's not including memory, if I need to change that as well.

 

A PCIe x1 controller will work fine, but will only support a max of 4 drives; and will be bandwidth limited when multiple drives are accessed simultaneously (as in parity checks).

PCI controllers are even more bandwidth-restricted, although they'd work okay, if you really don't want to change the motherboard/CPU.

 

Yeah, I'll have to re-assess things, as this really throws a monkey wrench in the plans.

 

edit: apparently there is such an adapter. A PCI-E 16x to 1x riser is used by Bitcoin miners to attach multiple GPUs to a motherboard.


Depending on what's available in your area, and the cost, clearly you have two options:

 

(1)  Replace motherboard/CPU and (possibly) memory

 

or

 

(2)  Use a PCIe x1 controller and some PCI controllers

 

#2 is simplest ... and PROBABLY a lot less expensive, depending on what's available for you.  Used PCI controllers are often available on eBay for $10-15, but I'm not sure what they'd cost shipped to you.

 

Note that BOTH PCI and PCIe x1 controllers will NOT have the bandwidth needed to support modern drives at full speed when you're accessing multiple drives at once ... but they will work just fine for "normal" unRAID use: when you're streaming media or writing to a single drive, there's ample bandwidth for that drive.  Clearly I'd make sure parity was on a motherboard port  :)
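
For rough numbers: a PCIe 1.x x1 link carries roughly 250 MB/s each way, so eight drives sharing it during a parity check get about 250 / 8 = ~31 MB/s apiece, well under the ~100-150 MB/s a modern drive can sustain on its own. For streaming or writing a single file, that same 250 MB/s is far more than any one drive needs.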

 

I'd compare the total cost you'll incur for both #1 and #2 => obviously #1 is by far the best technical solution ... but if the cost is prohibitive, and you can get the controllers you need for a lot less, then #2 will indeed work.

 


edit: apparently there is such an adapter. A PCI-E 16x to 1x riser is used by Bitcoin miners to attach multiple GPUs to a motherboard.

 

An adapter should work; but of course you'll still be restricted to x1 bandwidth ... and this will REALLY limit the bandwidth when you're accessing 8 drives.    But it should work -- and, except for parity checks and drive rebuilds (and the initial parity sync) won't have a major impact on your performance.

 


edit: apparently there is such an adapter. A PCI-E 16x to 1x riser is used by Bitcoin miners to attach multiple GPUs to a motherboard.

 

An adapter should work; but of course you'll still be restricted to x1 bandwidth ... and this will REALLY limit the bandwidth when you're accessing 8 drives.    But it should work -- and, except for parity checks and drive rebuilds (and the initial parity sync) won't have a major impact on your performance.

 

Yes, that's what I figured. It looks like PCI-E x1 has roughly double the bandwidth of plain PCI, so risers on the current cards would be a decent option vs returning the x4 cards and finding a pair of PCI cards.

 

I was reading some old threads on here, which also echoed what you said: it most likely won't make a difference in everyday use, since I'd rarely be hitting multiple drives at once, but parity checks and emulating a failed drive would be dead slow.

 

I found some adapters on eBay for 17.50 EUR - cheap. There'd be some jury-rigging to mount the cards in the case with the risers, but it looks like a decent compromise compared to spending several hundred more EUR and re-doing the whole damn build.

 

There's an even cheaper but a lot scarier option: cutting open the x1 port. It looks like it's not a joke, since it's been reported to work elsewhere. I think that's a bit too geeky for me :o

 

I just can't believe I've done 20+ years of PC upgrades and never run into problems with x1/x2/x4/x8/x16 compatibility, so it never even occurred to me to check. I must have been lucky. Live and learn.

 

Thank you for the help in clarifying my options, garycase, really appreciate it!

