10Gb Ethernet?


jeffreywhunter


I'm pretty sure someone tested this before and confirmed it worked for them too. Usually the NICs aren't what kills your budget for 10GbE, it's the switches. That said, if you've got the budget and the SSDs for your cache to warrant it, go for it!!


Those HP cards are just Mellanox ConnectX-2 cards that you can get on eBay for $35. If you only need two 10Gb ports, you can get a Quanta switch and a couple of DAC cables, and you'll have 10Gb to your server and workstation for uploading media, plus 48 1Gb ports for all other connections. I spent about $200 for everything, so quite cheap. Unfortunately, the pitiful driver support in unRAID doesn't cover Mellanox cards. I'm hoping that someone will post some good instructions for building a custom kernel for unRAID 6 so that I can finally see how fast SSD-to-SSD transfers can be.
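Roughly, I'd expect the build to look something like the sketch below -- this is a guess only, I haven't verified it on unRAID. The ConnectX-2 driver lives under the kernel's mlx4 options:

    # Sketch only -- not verified on unRAID. Assumes kernel source
    # matching your unRAID release, with unRAID's stock .config in place.
    cd /usr/src/linux
    # Enable the ConnectX-2 driver as modules in the config:
    #   CONFIG_MLX4_CORE=m
    #   CONFIG_MLX4_EN=m
    make oldconfig
    make -j4 bzImage modules
    make modules_install
    # then copy the new bzImage over the one on the unRAID flash drive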


Did you post a feature request for that driver support? I'm sure we could add that for you if you just asked.

FYI, I took the liberty of asking Tom to include this in the next release and he did just that.  Please be sure to report back after the next release to let us know how it works.


JonP's right -- there were definitely a couple of folks on the forum who set up some 10Gb systems. But as he also noted, be sure you look at the switches you'll need, as a 10Gb switch is PRICEY!!

 

... You don't need a switch, by the way, if you've only got one client => you can direct-connect between a PC and the server (you'll need to use static IPs). ... But that's an unlikely scenario unless you're just testing [although with the adapters available so inexpensively, buying two just to test how much better the network "feels" before committing to a switch might be worthwhile].
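For the curious, the direct connection is just two static addresses on the same subnet with no gateway -- something like this on the Linux side (eth1 is a placeholder for whatever your 10Gb NIC is called):

    # eth1 is a placeholder -- find your 10Gb NIC with: ip link
    ip addr add 10.0.0.1/24 dev eth1
    ip link set eth1 up
    # On the PC, set the matching static address (e.g. 10.0.0.2,
    # netmask 255.255.255.0, no gateway) and test with: ping 10.0.0.1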

 

 


... (okay this isn't for home use) ...

 

They'll work just fine for home use  :) :)

 

... At least for those whose "home" is a multi-million dollar estate, where a $13,000 switch isn't a big deal  8)

http://www.cdw.com/shop/products/Cisco-Catalyst-4900M-switch-8-ports-managed-rack-mountable/1392267.aspx

 

... I concede, of course, this is probably not a very typical UnRAID user  :) :)

 

And just in case that isn't enough and you need something bigger, have a look at the Cisco Nexus 7000 series. They work perfectly :)

 


If you only need a few ports, get an older server board with enough PCIe slots, fill it with $35 eBay 10Gb cards, and run pfSense on it. Find something with dual onboard gigabit Ethernet and 7 PCIe slots, and you could end up with seven 10Gb ports and two 1Gb ports for a dual-homed internet connection.

 

Presto: a managed-firewall 10Gb switch for MUCH less money.

 

At $35 per port, it may even make sense to find a used PCIe expansion chassis and see if it would work, if you need more than seven 10Gb clients.
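If pfSense feels like overkill, the same roll-your-own idea sketched on a plain Linux box is just bridging the ports together (pfSense would do the equivalent through its Bridges page). Interface names below are placeholders:

    # Placeholders eth1..eth3 -- bridging makes the box forward
    # frames between ports like a (software, CPU-bound) switch.
    ip link add br0 type bridge
    for nic in eth1 eth2 eth3; do
        ip link set "$nic" master br0
        ip link set "$nic" up
    done
    ip link set br0 up

Worth remembering that every frame gets forwarded in software, so throughput depends on CPU and PCIe bandwidth, unlike a hardware switch.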


jonp - thanks for adding the drivers, I never knew there was an option to request drivers be added.

 

garycase - 10GBASE-T is expensive, but 10Gb doesn't need to be. As I said, I have $200 in my setup. On eBay you can find a Quanta LB4M for $100 or a Dell 5524 for $400; both have two SFP+ ports. If you need more high-speed connections, both of those switches support link aggregation, so it's easy to add a few 2Gb or 4Gb connections for other PCs or servers. A new D-Link with four SFP+ ports is around $600. Mellanox and Brocade cards are $30-35, and as long as you can use DAC cables, those are cheap as well.

 

 


And just in case that isn't enough and you need something bigger, have a look at the Cisco Nexus 7000 series. They work perfectly :)

 

Great idea!! Never hurts to have a few spare ports :) :)


Great idea!! Never hurts to have a few spare ports :) :)

 

Yeap, just ordered one ...

 

[Image: Prod_Bulletin_Nexus_7000_DC-1.jpg]

 


If you only need a few ports, get an older server board with enough PCIe slots, fill it with $35 eBay 10Gb cards, and run pfSense on it. ...
Interesting idea. Note also that you likely don't actually need a lot of 10Gb connections. One 10Gb link to the server and possibly one to the main PC are probably the only 10Gb connections you'd actually need => 1Gb to all other clients is just fine in the vast majority of cases. There are, of course, some relatively low-cost switches that provide a single 10Gb connection and a bunch of 1Gb connections, so they would work as well ... but your idea to "roll your own" is intriguing.

 

  • 1 month later...
  • 1 year later...

I have the ConnectX-2 cards with a DAC between my old server (using SnapRAID and an external enclosure) and my new unRAID box. Private network for the 10Gb stuff on 10.0.0.1 and .2.

I have two 1Gb ports bonded together (802.3ad) in a port channel on my Cisco switch.

They are on the main home LAN, 192.168.1.0/24.
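For anyone copying this, the Linux side of the 802.3ad bond looks roughly like the sketch below -- NIC names and the address are placeholders, and unRAID can also set this up from its network settings. The Cisco side needs a matching port-channel with "channel-group ... mode active" on the two ports:

    # Sketch only -- eth0/eth1 are placeholders for the two 1Gb NICs.
    ip link add bond0 type bond mode 802.3ad
    ip link set eth0 down
    ip link set eth0 master bond0
    ip link set eth1 down
    ip link set eth1 master bond0
    ip link set bond0 up
    ip addr add 192.168.1.10/24 dev bond0    # example LAN address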

 

I configure NFS exports on the old server, and use the Unassigned Devices plugin to map them in unRAID.

I also mount the shares from unRAID on the old server.

Now I can transfer files quickly between the old and new servers, and even use the old server's /media share with Plex on unRAID until I get it all moved over (about 18TB).
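The cross-mounting is plain NFS under the hood -- roughly this, with paths and options that are just examples:

    # On the old server -- example line for /etc/exports
    # (path and options are assumptions, adjust to taste):
    /mnt/media 10.0.0.0/24(rw,async,no_subtree_check)
    # then reload the exports: exportfs -ra

    # On unRAID, Unassigned Devices effectively does:
    mount -t nfs 10.0.0.1:/mnt/media /mnt/disks/oldserver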

 

Just thought I would share. You CAN have bonded Ethernet as long as you are not trying to include the 10Gb adapter.

 

Two questions: How do I enable NFSv4 on unRAID? They seem to be standard NFS shares.

How can I create a RAM drive for transcoding? Is it just the /tmp directory?
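For the RAM drive, I'm guessing it's something like the following, since unRAID's root filesystem lives in RAM -- corrections welcome:

    # Guesswork: a dedicated RAM disk, capped so transcodes
    # can't eat all the memory (4g is just an example size):
    mkdir -p /tmp/transcode
    mount -t tmpfs -o size=4g tmpfs /tmp/transcode
    # then point Plex's transcoder temp directory at /tmp/transcode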
 

  • 3 weeks later...

Archived

This topic is now archived and is closed to further replies.
