intel64gamer Posted February 4, 2021

Hi, I am currently using Mellanox ConnectX-3 VPI cards in 40 Gigabit Ethernet mode on Unraid 6.8.3 / 6.9.0-rc2. I tested the cards by temporarily installing Windows Server 2019 on both ends; in that setup RDMA works and I can nearly max out the SSDs. Here are the speeds I am seeing on Windows to the drives I am using as cache in Unraid.

With Unraid on the same hardware I max out around 1.8 GB/s, typically dropping to 1.0 - 1.5 GB/s. This seems to come down to RSS and RDMA not working out of the box. I have followed this thread to try to at least get RSS working, without much success so far.

My SMB config looks like this:

server multi channel support = yes
interfaces = "10.10.10.10;capability=RSS,capability=RDMA,speed=40000000000"
aio read size = 1
aio write size = 1
strict locking = No
use sendfile = Yes

However, there seem to be additional problems with RSS. As per the thread, I ran the following command to check whether RSS is supported:

egrep 'CPU|eth*' /proc/interrupts
          CPU0  CPU1  CPU2  CPU3  CPU4  CPU5  CPU6  CPU7
RTR:         6     0     0     0     0     0     0     0  APIC ICR read retries

System Information:

Mellanox Firmware Info:

On Windows:

PS C:\Windows\system32> Get-SmbClientNetworkInterface

Interface Index RSS Capable RDMA Capable Speed   IpAddresses                               Friendly Name
--------------- ----------- ------------ -----   -----------                               -------------
26              False       False        0 bps   {fe80::edcd:2f13:ae5a:8a31}               Ethernet 4
4               True        True         40 Gbps {fe80::bd00:9b94:dde:bd6d, 10.10.10.9}    Ethernet 14
17              True        True         10 Gbps {fe80::3045:74bf:a53d:a6d5, 10.0.101.202} Ethernet 15
55              True        False        10 Gbps {fe80::a9f3:43dc:fb3b:e2c8, 172.29.208.1} vEthernet (Default Switch)
22              False       False        1 Gbps  {fe80::d468:1d0:d904:b7a8}                Local Area Connection

Thanks for your help.
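The interrupt listing above is the telltale sign: with RSS active, the driver registers one MSI vector per receive queue (lines like eth0-0 … eth0-3 in /proc/interrupts), while a listing with no per-interface lines at all means no queues were set up. As a quick check, here is a small shell sketch (the helper name and the sample file are my own for illustration, not from the thread) that counts per-queue vectors for an interface:

```shell
# Hypothetical helper: count the per-queue MSI vectors a NIC exposes.
# A card with working RSS shows one <iface>-<queue> line per RX queue.
count_rss_queues() {
  # $1 = interface name, $2 = interrupts table (defaults to /proc/interrupts)
  grep -c "${1}-" "${2:-/proc/interrupts}"
}

# Demo against a captured table (values mirror the working output
# posted later in this thread):
cat > /tmp/interrupts.sample <<'EOF'
           CPU0 CPU1 CPU2 CPU3
 48:          0  311    2    0  PCI-MSI 524289-edge  eth0-0
 49:        126    3    0  214  PCI-MSI 524290-edge  eth0-1
 50:         10    0  105  184  PCI-MSI 524291-edge  eth0-2
 51:          0   34  244    0  PCI-MSI 524292-edge  eth0-3
EOF
count_rss_queues eth0 /tmp/interrupts.sample
```

Run against the live /proc/interrupts (omit the second argument), a result of 2 or more suggests RSS can work; 0 matches the symptom described in this post.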
kacper91 Posted February 13, 2021

I had exactly the same problem. Maybe the in-kernel mlx4 module doesn't support RSS on the ConnectX-3? Try installing the drivers from https://www.mellanox.com/products/ethernet-drivers/linux/mlnx_en (choose the LTS download, 4.9-2.2.4.0; the newer drivers do not support the ConnectX-3). After installing the drivers I got:

egrep 'CPU|eth*' /proc/interrupts
          CPU0  CPU1  CPU2  CPU3
 48:         0   311     2     0  PCI-MSI 524289-edge  eth0-0
 49:       126     3     0   214  PCI-MSI 524290-edge  eth0-1
 50:        10     0   105   184  PCI-MSI 524291-edge  eth0-2
 51:         0    34   244     0  PCI-MSI 524292-edge  eth0-3
RTR:         0     0     0     0  APIC ICR read retries
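For anyone trying to reproduce this, a rough sketch of how the out-of-tree install usually goes with Mellanox's tarball releases. The archive name below matches the 4.9-2.2.4.0 LTS version mentioned above, but the exact file name and installer options vary between releases, so treat this as an outline and check the README shipped inside the tarball:

```shell
# Hedged sketch of the out-of-tree MLNX_EN driver install described above.
# The archive name matches the 4.9-2.2.4.0 LTS download; installer details
# vary per release -- consult the README in the tarball. Requires build
# tools (gcc, make, kernel headers), which stock Unraid does not ship.
install_mlx4_en() {
  tarball=$(ls mlnx-en-4.9-2.2.4.0-*.tgz 2>/dev/null | head -n1)
  if [ -z "$tarball" ]; then
    echo "download the LTS 4.9-2.2.4.0 tarball from Mellanox first"
    return 1
  fi
  tar xzf "$tarball"
  cd "${tarball%.tgz}" || return 1
  ./install                                # builds mlx4_en/mlx4_core from source
  modprobe -r mlx4_en && modprobe mlx4_en  # reload so the new module takes over
}
install_mlx4_en || true
```

Run without the tarball present, the function just prints a reminder to download it; a reboot after installing also works in place of the module reload.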
intel64gamer Posted February 13, 2021

How did you install the driver? I didn't find any packages for Slackware or anything based on it. Did you compile and install from source?
jj1987 Posted September 25, 2021

@intel64gamer: Were you able to solve this in the meantime? I have the same problem with my ConnectX-3.

@kacper91: Would you mind explaining how you got these drivers working under Unraid/Slackware?

Thanks in advance
intel64gamer Posted September 25, 2021

@jj1987 Unfortunately not.