Help Configuring 10GbE Network?



 

Basically, I'm trying to achieve what's done in the video above, with the only exception being that rather than going from one Windows PC to another Windows PC, I want to go from a Windows PC to an unRAID server.

 

My configuration looks like this:

 

unRAID server: NetApp Chelsio T320 dual-port 10GbE NIC

Windows PC 1: Mellanox ConnectX-2 single-port 10GbE NIC

Windows PC 2: Mellanox ConnectX-2 single-port 10GbE NIC

 

Connected via SFP+ direct-attach copper (DAC) cables: unRAID --> Windows PC 1 and unRAID --> Windows PC 2. Pretty basic direct connections, no expensive 10GbE switches.

 

Running latest unRAID 6.3.2, which recently added support for the NetApp Chelsio NIC I'm using. However, I have two problems at the moment.

 

On <server>/Settings/NetworkSettings, only one of my two 10GbE interfaces gets listed, and I'm wondering if this is a bug in the webUI. I have 11 network interfaces in total, eth0-eth10, and only eth9 (the tenth) gets the full listing where I can specify settings. See the screenshot below, and notice how it jumps from Interface eth9 straight to the interface rules (which do list eth10).

 

[Screenshot: unRAID Network Settings page, jumping from Interface eth9 straight to the interface rules]

 

So with that issue, at the moment I can't specify network settings for one of my ports, and one of my two PCs will go unconnected. Not the end of the world for the moment...
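For reference, both ports do show up at the OS level. From the unRAID terminal, something like the following lists every interface and checks link state on the port the webUI is skipping; eth9/eth10 are the names on my box, and ethtool may or may not be present on your build, so treat this as a sketch:

# list every interface the kernel sees (eth0-eth10 in my case)
ip link show

# check link detection / negotiated speed on the port missing from the webUI
ethtool eth10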

 

However, I've gone ahead and configured the port that does show up with the settings shown above. The rest of my network (router, Wi-Fi, switch, and all the 1GbE devices) lives on 192.168.1.x, so I've manually assigned 192.168.0.0 as the IP address for the 10GbE network I'm trying to set up. On Windows, I've done the same thing:

 

[Screenshot: Windows adapter IPv4 properties with the manually assigned 10GbE address]
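For what it's worth, the same thing can also be set from an elevated command prompt instead of the adapter GUI. A rough sketch for Windows PC 1; the adapter name "Ethernet 2" is just a guess for whatever the Mellanox port is called on your machine:

:: assign a static IP to the 10GbE adapter; no gateway, since it's a direct link
netsh interface ipv4 set address name="Ethernet 2" static 192.168.0.2 255.255.255.0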

 

So for this 10GbE network, ideally it'd look like this:

192.168.0.0 - unRAID 10GbE port 1

192.168.0.1 - unRAID 10GbE port 2 (can't configure yet due to the bug discussed above)

192.168.0.2 - Windows PC 1 10GbE port 1

192.168.0.3 - Windows PC 2 10GbE port 1

 

When I try to ping 192.168.0.0 (unRAID port 1) from Windows PC 1, manually specifying the source address, I get this, even with Windows Firewall completely disabled:

[Screenshot: ping output on Windows PC 1 showing the requests failing]

 

I'm also unable to ping the Windows PC from the unRAID terminal.
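For reference, these are roughly the commands I'm testing with; the interface name eth9 and the addresses are just from my setup above. On Windows PC 1, forcing the 10GbE port as the source address:

ping -S 192.168.0.2 192.168.0.0

And from the unRAID terminal:

ping -I eth9 -c 4 192.168.0.2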

 

What am I missing here? I'm no networking pro, and this is my first time dabbling with 10GbE and with direct connections (no switch/router).

 

 

 


Well, you can't use the .0 address (that's the network address, not a usable host address), and using two IPs on the same subnet under unRAID can cause weird things... like packets going in and out of the wrong ports.

 

So ideally, your unRAID 10GbE ports are bonded (probably LACP or balance-alb) and assigned a single IP.

The Windows clients then connect to that single IP, and the server should be able to use the full 20Gbps while serving the two clients...
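One caveat: 802.3ad/LACP needs a link partner (normally a switch) that speaks LACP, so with two direct-attached clients balance-alb or active-backup is the more realistic mode. If the bond comes up with the default name bond0, you can sanity-check the mode and active slaves from the unRAID terminal:

cat /proc/net/bonding/bond0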

 

If bonding doesn't quite work, you can always fall back to a different subnet per port,

kinda like:

unRAID port 1: 192.168.0.1 / 255.255.255.128

unRAID port 2: 192.168.0.129 / 255.255.255.128

Win 1: 192.168.0.2 / 255.255.255.128

Win 2: 192.168.0.130 / 255.255.255.128
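Whichever way you go, you can confirm what actually got applied on the unRAID side from the terminal; eth9/eth10 here are just whatever the two 10GbE ports enumerate as on your box:

ip -4 addr show eth9

ip -4 addr show eth10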

 


Thank you all for the help; I got this resolved. For anyone who wants to do a similar setup, here's what I ultimately did:

 

Main 1GbE network - still at 192.168.1.x

 

unRAID 10GbE port 1 - 192.168.2.1

unRAID 10GbE port 2 - 192.168.3.1

Windows PC 1 10GbE - 192.168.2.2

Windows PC 2 10GbE - 192.168.3.2

 

And then you'll want to edit the C:\Windows\System32\drivers\etc\hosts file on the Windows PCs to force traffic to the server's hostname over the 10GbE link; otherwise Windows may prefer the 1GbE path. Do it like so:

Windows PC 1 hosts file - add: 192.168.2.1 <HOSTNAME OF SERVER>

Windows PC 2 hosts file - add: 192.168.3.1 <HOSTNAME OF SERVER>
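After editing the hosts files, it's worth flushing the resolver cache and confirming the name now resolves to the 10GbE address; for example, on Windows PC 1:

ipconfig /flushdns

ping -4 <HOSTNAME OF SERVER>

The replies should come from 192.168.2.1 if the hosts entry is being picked up.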

Edited by OChrisJonesO
1 hour ago, OChrisJonesO said:

Thank you all for the help; I got this resolved. For anyone who wants to do a similar setup, here's what I ultimately did:

 

Main 1GbE network - still at 192.168.1.x

 

unRAID 10GbE port 1 - 192.168.2.1

unRAID 10GbE port 2 - 192.168.3.1

Windows PC 1 10GbE - 192.168.2.2

Windows PC 2 10GbE - 192.168.2.3

 

And then you'll want to edit the C:\Windows\System32\drivers\etc\hosts file on the Windows PCs to force traffic to the server's hostname over the 10GbE link; otherwise Windows may prefer the 1GbE path. Do it like so:

Windows PC 1 hosts file - add: 192.168.2.1 <HOSTNAME OF SERVER>

Windows PC 2 hosts file - add: 192.168.3.1 <HOSTNAME OF SERVER>

 

I hope you meant

Windows PC 2 10GbE - 192.168.3.2

 

Because I don't see how that would work unless you have a router on the 10Gbps network. And even then you won't be able to do 10Gbps on both Windows machines at the same time, as they would bottleneck on a single 10Gbps port on the unRAID server.

  • 7 months later...

Hi, I fully understand the networking side of things, as I have been doing the same thing between Windows 10 clients for over a year now. I was wondering what your hardware and software configuration is on your unRAID system.

 

Currently I don't see 10GbE speeds with my configuration; I hardly even see 1GbE speeds. I wanted to know what drive configuration you used (array, cache, and parity) to see whether my setup would be capable of that.

 

Right now I have a direct connection to a FreeNAS box that provides 10GbE speeds (well, close, since I have 5 drives with 2 used for parity). I would really like to see unRAID reach those speeds with over 12 drives in it (utilizing two SSDs for cache).

 

Thanks!

1 hour ago, strikermed said:

would really like to see unRAID reach those speeds with over 12 drives in it (utilizing two SSDs for cache)

 

unRAID doesn't stripe data disks, so the maximum write speed will be limited by the speed of a single disk. You can get 10GbE speeds writing to a fast cache device or pool, like an NVMe device or a striped SSD pool (raid0 or raid10).
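If you do want the cache pool itself to stripe, a two-SSD btrfs pool can be converted so the data profile is raid0. A rough sketch from the unRAID terminal, assuming the pool is mounted at /mnt/cache and accepting that raid0 means no redundancy for anything sitting on the cache:

# stripe data across the pool members, keep metadata mirrored
btrfs balance start -dconvert=raid0 -mconvert=raid1 /mnt/cache

# confirm the resulting profiles
btrfs filesystem df /mnt/cache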

  • 4 weeks later...
On 11/16/2017 at 4:10 PM, johnnie.black said:

 

unRAID doesn't stripe data disks, so the maximum write speed will be limited by the speed of a single disk. You can get 10GbE speeds writing to a fast cache device or pool, like an NVMe device or a striped SSD pool (raid0 or raid10).

 

Thanks for the confirmation, johnnie.black. I figured as much. I noticed Linus Tech Tips had some kind of configuration that I wasn't 100% sure was ever released or not. I'll be reconfiguring my network cards to optimize my 10GbE workflow and just have unRAID using 1GbE.

12 minutes ago, johnnie.black said:

He was using the cache pool.

Ahhh, I gotcha... For my purposes that doesn't really help :(... I'd rather utilize the drives, and if I add a cache pool, I would use it to flush writes to the array as quickly as possible. I could write a terabyte of data, and it would take hours to transfer the way he was doing it.

 

Fortunately I got SMB shares to fly on FreeNAS via a Virtual Machine, so I guess they fixed those issues since he made those videos.

 

Thanks again!

