Squid Posted December 17, 2018 In a nutshell, does anyone know the difference between Mode 1 and Mode 2 in the HDMI settings on 4K TVs? We have a compatibility problem at work between some TVs and Intel NUCs (using native HDMI as well as Plugable USB 3.0 and USB-C adapters). The little information I can find online is conflicting about which mode gives the best compatibility across the widest variety of equipment. Maximum resolution and refresh rate are not a concern of mine at all (i.e. 1080p is more than sufficient for our application).
saarg Posted December 17, 2018 From what I found, Mode 2 enables HDMI 2.0 on that port (it might apply to all ports; I don't know your TV) for higher bandwidth. Mode 1 allows up to 3840x2160 at 30 Hz, while Mode 2 allows up to 3840x2160 at 60 Hz. I've left out the chroma subsampling details for now, but check post #5 here.
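To put rough numbers on why the mode matters, here is a back-of-the-envelope sketch (Python, assuming standard CTA-861 4K timings, 8-bit RGB, and 8b/10b TMDS overhead; the figures are approximations, not spec quotes):

# Rough TMDS link-rate estimate for 3840x2160 at 30 Hz vs 60 Hz (8-bit RGB).
# CTA-861 total 4K timings including blanking: 4400 x 2250 pixels per frame.
H_TOTAL, V_TOTAL = 4400, 2250
BITS_PER_PIXEL = 24          # 8 bits per channel, RGB / 4:4:4
TMDS_OVERHEAD = 10 / 8       # 8b/10b encoding on the TMDS link

for refresh in (30, 60):
    pixel_clock = H_TOTAL * V_TOTAL * refresh                    # Hz
    link_rate = pixel_clock * BITS_PER_PIXEL * TMDS_OVERHEAD     # bits/s
    print(f"{refresh} Hz: ~{link_rate / 1e9:.1f} Gbps on the link")

The 30 Hz case lands around 8.9 Gbps, which fits inside HDMI 1.4's roughly 10.2 Gbps limit (Mode 1). The 60 Hz case needs roughly 17.8 Gbps, which only an HDMI 2.0 link of about 18 Gbps (Mode 2) can carry at full 4:4:4 chroma. Since you only need 1080p, either mode has bandwidth to spare, so compatibility with the source devices is the deciding factor.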