10Gb/s networking project


2Piececombo


My goal is to achieve a 10Gb/s link between my desktop and my unraid box. I've narrowed down my options to some extent based on certain limitations. I'll try to be thorough yet brief in explaining my setup, so as to avoid a wall of text. I've already got a Cat6 cable run from my PC to my network closet. At this point I mostly just want someone with more knowledge to double-check my thought process in selecting parts, to make sure I'm not going to end up with a whole bunch of buyer's remorse.

 

My PC: 4th gen Intel, a 4790K on an Asus Maximus VII Hero. I plan on using this NIC in my PC, as I have an available PCIe 3.0 slot to accommodate it.

 

My unraid server: dual Xeon X5670s in a Tyan S7012. It has several open PCIe 2.0 x8 slots. As I understand it, the previously mentioned NIC wouldn't be a good choice for my server, as it would limit me to 5Gb/s, since it requires a PCIe 3.0 x4 slot to achieve 10Gb/s. My next thought was to find a PCIe 2.0 x8 NIC. I assume that due to the power inefficiency of 10G RJ45 cards (making them less common and much more expensive) I'll end up with an SFP+ card and need a transceiver, which is fine with me. My search led me to this card, which appears to be a PCIe 2.0 x8 NIC. In addition to that, I assume any SFP+ to RJ45 transceiver, such as this one, will work fine?

 

If you see any glaring issues that I overlooked, please let me know what I can do to rectify them.

Thanks in advance for any input/suggestions.

23 minutes ago, 2Piececombo said:

As I understand it, the previously mentioned NIC wouldn't be a good choice for my server, as it would limit me to 5Gb/s, since it requires a PCIe 3.0 x4 slot to achieve 10Gb/s.

A PCIe 2.0 x4 link is still good for 2000MB/s max. There's always some overhead, but that's easily fast enough to carry a full 10Gb/s (1.25GB/s) link.
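For anyone following along, the slot arithmetic is easy to sanity-check. A quick sketch (the per-lane figures are the standard effective rates: PCIe 1.x/2.0 use 8b/10b encoding, PCIe 3.0 uses 128b/130b):

```python
# Effective per-lane PCIe throughput in MB/s, after encoding overhead
# (8b/10b for gen 1/2, 128b/130b for gen 3).
PER_LANE_MBPS = {"PCIe 1.x": 250, "PCIe 2.0": 500, "PCIe 3.0": 985}

# 10GbE carries at most 10 Gbit/s on the wire, i.e. 1250 MB/s of payload.
TEN_GBE_MBPS = 10_000 / 8

for gen, per_lane in PER_LANE_MBPS.items():
    for lanes in (1, 2, 4, 8):
        bandwidth = per_lane * lanes
        verdict = "enough" if bandwidth >= TEN_GBE_MBPS else "too slow"
        print(f"{gen} x{lanes}: {bandwidth:>5} MB/s -> {verdict} for 10GbE")
```

So PCIe 2.0 x4 and anything wider clears the bar for a single 10GbE port; only x1 and x2 links fall short.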


Thanks for the quick reply! I guess I trusted the wrong source for that information. After looking at several PCIe bandwidth charts, I now realize the Asus NIC should work on both ends. This leads me to one more question: unraid compatibility. It seems there's quite robust support for most NICs, but is there a way to find out definitively whether this one will work with unraid? Perhaps an official compatibility list somewhere I'm not aware of?

 

Thanks again, I appreciate the help!

 

EDIT: Found this thread, which seems to answer my question. I believe my next and final issue will be setting up the link. Two YouTubers I watch, Byte My Bits and SpaceInvader One, have both talked about doing a 10GbE setup. IIRC, SpaceInvader One did a video using the NICs only for a direct link between two machines, while Byte My Bits bridged his internet through his unraid server to avoid having two network cables run to his PC. Is there any advantage to doing it one way or the other?
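For reference, the direct-link option is just a second, self-contained subnet: each 10G NIC gets a static address, no gateway, and only storage traffic takes that path. A minimal sketch of the idea (the addresses are made-up examples, not anything from the videos):

```python
# The point-to-point setup in a nutshell: both 10G NICs sit on a private
# subnet of their own, so SMB/NFS traffic to the server takes the fast
# path while everything else keeps using the regular LAN.
# These addresses are hypothetical examples.
import ipaddress

desktop_10g = ipaddress.ip_interface("10.10.10.2/24")  # 10G NIC in the PC
server_10g = ipaddress.ip_interface("10.10.10.1/24")   # 10G NIC in unraid

# Same subnet, so the hosts reach each other directly -- no bridge,
# no router, and no second cable run back out of the server.
assert desktop_10g.network == server_10g.network
print(f"map shares against {server_10g.ip} to force the 10G path")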


Update: got the NICs today, and I'm hitting a major wall. It looks like PCIe 2.0 x4 slots aren't capable of 10GbE bandwidth after all? iperf topped out at ~3Gb/s. When both cards are in PCIe 3.0 x4 slots, iperf hits roughly the full 10Gb/s. So it does appear that a PCIe 2.0 x4 slot just isn't enough to support a 10GbE link, unless there's something obvious I'm missing.
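(For anyone who wants a second data point besides iperf, a single-stream throughput test is only a few lines of Python. This is a rough sketch, not a replacement for iperf; one Python stream is often CPU-bound below 10Gb/s line rate, but it's handy for confirming a link is wildly off. The port number is arbitrary; 5201 is just the iperf3 default reused for convenience.)

```python
# Minimal single-stream TCP throughput check.
# Usage:  python net_test.py recv             (on the server)
#         python net_test.py send 10.10.10.1  (on the PC, server's address)
import socket
import sys
import time

PORT = 5201
CHUNK = 1 << 20  # 1 MiB buffers

def recv() -> None:
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("", PORT))
    srv.listen(1)
    conn, addr = srv.accept()
    total = 0
    start = time.monotonic()
    while data := conn.recv(CHUNK):
        total += len(data)
    elapsed = time.monotonic() - start
    print(f"{total * 8 / elapsed / 1e9:.2f} Gbit/s from {addr[0]}")

def send(host: str) -> None:
    buf = b"\x00" * CHUNK
    with socket.create_connection((host, PORT)) as s:
        for _ in range(10_000):  # ~10 GiB total
            s.sendall(buf)

if __name__ == "__main__":
    send(sys.argv[2]) if sys.argv[1] == "send" else recv()
```

A result far below what the slot should manage, like ~3Gb/s on a nominally x4 slot, often points at the slot negotiating fewer lanes rather than at the card itself.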


You were right (again) :P

Turns out the slot I was using in my PC (which ended up being PCIe 2.0 x4, because the PCIe 3.0 x4 slot I was going to use was already occupied by a PCIe-to-NVMe adapter) was being limited by my other NVMe drive. When the onboard M.2 slot is enabled, it cuts the bandwidth of that PCIe slot down to x2 or x1 (can't remember which). After disabling the M.2, iperf was able to hit full speed. Next I put my spare NVMe into the second PC I was testing with and dropped a 12GB movie over the link, and WOW: ~850MB/s (about 6.8Gb/s on the wire) the whole time.

 

I didn't end up having enough time last night to put the NIC back in my server, so I'll do that tonight, but I did discover one more unfortunate thing: my server only has SATA II ports, so my 1TB 860 Evo is limited to ~300MB/s max. My only workaround is either to get a PCIe SATA III card and connect my cache drive to that (which I'd rather not do), or to get another PCIe-to-NVMe adapter and replace my current cache drive with an NVMe. I think I'll go with an Intel 660p, seeing as they've dropped in price so significantly recently. Or perhaps I'll put the 660p in my PC and move my 1TB Samsung NVMe into the server. Either way, I think I'm on the right track now. Thanks again for your help throughout my struggle :P
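(Side note on where that SATA ceiling comes from: SATA uses 8b/10b encoding, so only 80% of the line rate is payload. A quick sketch:)

```python
# SATA line rates vs. usable payload: 8b/10b encoding puts 10 bits on
# the wire per data byte, so payload = line_rate_gbps * 0.8 / 8.
for name, line_rate_gbps in (("SATA II", 3.0), ("SATA III", 6.0)):
    payload_mbps = line_rate_gbps * 0.8 / 8 * 1000
    print(f"{name} ({line_rate_gbps:.0f} Gb/s): {payload_mbps:.0f} MB/s max payload")
# SATA II (3 Gb/s): 300 MB/s    SATA III (6 Gb/s): 600 MB/s
```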

23 minutes ago, 2Piececombo said:

I think I'll go with an Intel 660p, seeing as they've dropped in price so significantly recently.

Careful: that's a QLC SSD. IIRC, sustained write speed once the SLC cache is full maxes out at around 100MB/s, so it's not a good option for large writes; read speed is no problem.

