snowmirage Posted March 11

What I'm trying to do

I recently did a network upgrade and installed a 10-gig NIC in my Unraid server. My intention is to separate my network into several VLANs. I have all the networking gear working (router, switches, etc.), and I believe I have set up the networking in Unraid correctly under Settings > Network. I can reach the Unraid server on that VLAN (2) as expected.

I would like to be able to assign a given Docker container (or VM, but I'm troubleshooting just Docker here) to that VLAN. I started with the defaults I already had in the Docker settings:

Docker custom network type: ipvlan
Host access to custom networks: Disabled

I then added a new custom network on interface br2.2 and removed the previous custom network. I made sure my physical DHCP server was handing out IP addresses in a range on the same subnet that does not overlap the DHCP pool configured in the Docker settings. Then, in each container I had running, I changed the network type and set a fixed IP address. For several of my Docker containers this seemed to work as expected. However, I have two issues that I can't explain.

Issue 1

I have an nzbgetvpn container running; when it starts, the logs indicate it's working normally. I can ping the container's fixed IP address, but when I attempt to connect to the web interface on its IP + port, the connection times out. Digging deeper with Wireshark, I can see the container's IP respond as expected to ICMP ping requests, but it doesn't respond at all to TCP traffic on the web interface's port.

During my troubleshooting I changed that container's network type back to "bridge". When I do so, the Docker page shows it now has a 172.17.x.x IP address. To my knowledge I don't have anything on my network using a 172.17.x.x address, so I have no idea where it is getting that from. On top of that, if I then try to access http://10.2.0.16:6789, I can pull up the web interface of the container??? (That's the IP address of the Unraid host on VLAN 2 plus the port of the container's web interface.)

Issue 2

Moving on to another container: I am running "nut-influxdb-exporter", which tries to connect to the Unraid host running the NUT plugin (UPS power stuff) on port 3493. I had this container running on the same custom network I just set up. When the container boots, I see errors that it can't connect to the Unraid host. Dropping into a console for the container, I noticed that it can successfully ping other Docker containers configured on the same network (10.2.0.68, for example, is my InfluxDB container), and it can ping other hosts on that network (10.2.0.1, for example, is my physical router's interface on this VLAN). But it can't ping 10.2.0.16, the IP of the Unraid host on that VLAN.

Even with all my searching over the last week, I suspect I'm missing something fundamental about how to get this type of configuration correct. Might anyone be able to point me in the right direction?

phoenix-diagnostics-20240311-0624.zip
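For context on the two surprises above: 172.17.0.0/16 is Docker's built-in default subnet for its "bridge" network, which is why a 172.17.x.x address appears even though nothing on the LAN uses that range, and in bridge mode published ports are exposed on the host's own IP, which is why http://10.2.0.16:6789 reaches the container. A rough sketch of the setup described in the post, as plain Docker CLI commands (the /24 subnet and the network name are assumptions; Unraid generates its own equivalents from the Settings > Docker page):

```shell
# 172.17.0.0/16 is Docker's out-of-the-box default bridge subnet;
# this prints it, confirming where the 172.17.x.x address comes from:
docker network inspect bridge --format '{{(index .IPAM.Config 0).Subnet}}'

# Approximate equivalent of Unraid's custom network on br2.2
# (subnet/gateway taken from the addresses in this thread; assumed /24):
docker network create -d ipvlan \
  --subnet=10.2.0.0/24 \
  --gateway=10.2.0.1 \
  -o parent=br2.2 \
  br2.2

# Attach a container to that network with a fixed IP:
docker run -d --network=br2.2 --ip=10.2.0.68 influxdb
```

This is only an illustration of what the GUI settings translate to, not a recommendation to create networks by hand on Unraid.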
Vr2Io Posted March 12 (edited)

17 hours ago, snowmirage said:
Host access to custom networks: Disabled

For Issue 2, "Host access to custom networks" needs to be enabled. However, I checked your diagnostics and it is actually already enabled.

17 hours ago, snowmirage said:
For several of my docker containers this seemed to work as expected.

If some containers work on the same custom network, the problem is likely with the individual Docker containers rather than the network setup.

Edited March 12 by Vr2Io
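The "Host access to custom networks" setting matters here because of a property of ipvlan (and macvlan) itself: child interfaces on a parent NIC cannot exchange traffic with the host's own stack on that same parent, so containers on br2.2 can ping everything on the VLAN except the Unraid host. Enabling the setting makes Unraid create a shim interface to bridge that gap. A heavily hedged sketch of what such a shim looks like (all interface names, the shim address, and the routed range are illustrative assumptions, not Unraid's actual generated names):

```shell
# Create an ipvlan child on the same parent; this gives the host a leg
# on the VLAN that CAN talk to the containers' ipvlan interfaces.
ip link add shim-br2.2 link br2.2 type ipvlan mode l2
ip addr add 10.2.0.250/32 dev shim-br2.2   # hypothetical host-side shim address
ip link set shim-br2.2 up

# Route the container range (hypothetical pool) via the shim, so host
# traffic to those containers leaves through the ipvlan child:
ip route add 10.2.0.128/25 dev shim-br2.2
```

Unraid manages this automatically when the setting is enabled; the sketch is only to show why host-to-container traffic fails without it.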