
Best way to access a gaming VM from another room: HDMI/USB over Ethernet? VDI? etc.


Recommended Posts

I am upgrading my server and now have enough resources for multiple VMs. I plan on building a few gaming VMs, but I'm trying to figure out the best way to get video and input to them, because I really don't want all the displays and peripherals in the same room as the server. I happen to have a few AMX FG1010-310 and FG1010-500 boxes, but I have never used them. I've looked at a few similar extender solutions, but it looks like they all need a dedicated point-to-point Ethernet cable between the transmitter and receiver.

 

I don't know if there is a solution that can use my existing router, although that would have to operate at a different layer, since it would be IP based. I don't know much about VDI, or whether it would even be viable for a gaming VM. I have a Steam Link box I haven't really used, but I think it uses some kind of remote-desktop-style streaming, which I imagine would be slower than HDMI/USB over IP. Since these are already VMs, I wonder if there is a way to pass HDMI and USB over Ethernet and bypass the dedicated physical link entirely. I'm not sure what realistic options exist now, and most of the discussion I've seen about this is a bit outdated. I'm just looking for options.
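For what it's worth, the USB half of that idea already exists on Linux as USB/IP (the usbip tools), which tunnels USB devices over an ordinary IP network, so it would happily go through the existing router. A minimal sketch, assuming the box in the remote room runs Linux at the hypothetical address 192.168.1.60 and shares its keyboard (the bus ID 1-1.2 is just an example); a Windows client exists but is less mature:

```
# On the remote-room box (the side with the physical keyboard/mouse):
sudo modprobe usbip-host
sudo usbipd -D                 # start the USB/IP daemon
usbip list -l                  # find the device's bus ID, e.g. 1-1.2
sudo usbip bind -b 1-1.2       # share that device on the network

# On the VM (or any Linux machine on the LAN):
sudo modprobe vhci-hcd         # virtual host controller that receives it
usbip list -r 192.168.1.60     # see what the remote box is sharing
sudo usbip attach -r 192.168.1.60 -b 1-1.2
```

That only covers input, and adds a little latency per event; video would still need an extender or a streaming protocol.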


I'd be interested to know if anyone comes up with a good solution over Ethernet or the like, as I'd love to do something like this myself in the future. In the meantime, you can use Parsec on both machines and get pretty low latency (under 10 ms) with relatively minimal video compression artifacts, especially if both the host and the client are wired.


Here's something I discovered recently that may help you decide what not to do. Steam Link (both the physical device and the app) somehow ties itself to the NIC, so if you're running multiple VMs on one bridge, it can't tell them apart. I have to exit Steam on one VM in order to use Steam Link with the other.


For the absolute bleeding edge, I'd suggest passing through a Thunderbolt PCIe card and using a fiber-optic Thunderbolt cable with a Thunderbolt dock at the other end. This gives you video outputs and USB inputs at basically raw performance. It's a bit on the expensive side.

For the budget oriented, I would head in the direction of using Moonlight to access the VMs remotely. This limits your I/O options a bit and introduces some latency (the Thunderbolt dock option would be effectively zero latency), but Moonlight has clients on basically every OS imaginable at this point.
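To give a sense of how simple the client end is, this is roughly what pairing and streaming look like with moonlight-embedded on a Linux thin client. The host address 192.168.1.50 is hypothetical, and the VM needs a Moonlight-compatible host such as Sunshine running ("Desktop" below is Sunshine's default app entry):

```
# one-time pairing; a PIN is shown to confirm on the host side
moonlight pair 192.168.1.50

# stream the desktop at 1080p60, with a higher bitrate for a wired LAN
moonlight stream -1080 -fps 60 -bitrate 20000 -app Desktop 192.168.1.50
```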

The KVM/HDMI/USB-over-IP solutions will also be fairly low latency, but they are heavily limited in what resolutions they support and what I/O they enable.

In all cases you will need a "client" box at the display end to handle the display output and I/O. I think some of the fanless Braswell units available from China would be attractive Moonlight thin clients, since they are fairly low cost, silent, and capable of decoding and outputting 4K60 natively.

In theory you could lower the cost further by buying them barebones with no RAM or storage, adding a single 2GB SODIMM, and setting up a PXE server on Unraid to hand out the thin-client image. There'd be a lot of legwork involved in that, but it would be cheaper and pretty slick.
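For the PXE piece, a proxy-DHCP setup with dnsmasq is one way to netboot the thin clients without touching the router's existing DHCP server. A rough sketch, with an assumed 192.168.1.0/24 subnet and BIOS (pxelinux) clients; UEFI boxes would need an EFI loader instead:

```
# write a minimal proxy-DHCP + TFTP config for dnsmasq
cat > /etc/dnsmasq.d/pxe.conf <<'EOF'
port=0                          # disable DNS; only PXE services wanted
dhcp-range=192.168.1.0,proxy    # proxy mode: leave the router's DHCP alone
enable-tftp
tftp-root=/tftpboot             # pxelinux.0 and the boot image live here
pxe-service=x86PC,"Thin client",pxelinux
EOF
systemctl restart dnsmasq
```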


I have my server in the basement, about 100 ft away from my office, and I've been using a 4KEX70-H2 and a U2EX50 for a few years now with a Win10 VM. I use two Cat5e cables terminated to keystone jacks in my office and a patch panel in the basement, then regular patch cables to connect the transmitters and receivers. The 4KEX70-H2 transmitter is connected via a 6 ft HDMI cable to a 3060 (recently upgraded from a 1050) running 3440x1440 @ 60 Hz. The receiving unit does get quite hot, as many reviewers have noted. I plan to run a new Cat6 or Cat7 cable for the 4KEX70-H2 soon, since I occasionally lose the signal, which I attribute to the length of the Cat5e run.

 

If your run is shorter, you can save some $$ with the 4KEX60-H2.

