macOS Sierra VM - 4k @ 60hz and other Q's



First of all, I want to express my appreciation to Gridrunner et al for all the knowledge and experience shared here in the forums and elsewhere.  I'm currently running two so-far rock-stable Sierra VMs -- one toiling away headless running Indigo (my home automation server) and various other "important" things, and this one, which I'm using as my daily driver.

 

My daily, however, needs a bit of tweaking.  I've got a pair of graphics cards in my server -- a Radeon R9 280 and an Nvidia GTX 760.  They're reasonably comparable in power, and both work well when passed through to the Sierra VM.  I do light gaming (mostly console emulation) on a Windows VM in the other room, and either card is fine for that purpose, so I've got either at my disposal for the Mac.

 

I'm running a 43" Insignia 4k television as a monitor, connected to the server via HDMI.  It supports 4k @ 60Hz with 4:4:4 chroma.  On either card, the Mac displays 4k just fine; however, I can't get a refresh rate higher than 30Hz on either card.  I've tried the HDMI ports on the cards, as well as an active DisplayPort to HDMI adapter (this one from Club 3D), which many people report success with, at least with genuine Apple machines.

 

I know this is an issue I'm likely going to need the help of the Hackintosh community for, but it has raised some questions regarding my Clover setup.  I can't get the VM to boot successfully with a system definition any newer than MacPro2,1 -- pretty old -- and I'm having to specify a Penryn processor in the VM's XML definition.  Nosing around in this thread makes me wonder if that's because of an issue with Clover that might since have been corrected.  Anyway, I'm a bit out of my depth here.  Does anyone have any insights into how to configure my system so my hardware appears as modern as it actually is?  I'd like to eliminate that as a possible reason why 4k@60Hz screen modes aren't available.
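For reference, here's the sort of fragment I mean from the libvirt domain XML -- a sketch only; the feature flags shown are illustrative, not a known-good set:

```xml
<!-- Sketch of the <cpu> element in the libvirt domain XML. The guest
     sees whatever model is declared here, which is why macOS reports
     a Penryn CPU. Feature flags vary by host CPU and Clover build. -->
<cpu mode='custom' match='exact'>
  <model fallback='allow'>Penryn</model>
  <feature policy='require' name='ssse3'/>
  <feature policy='require' name='sse4.2'/>
</cpu>
```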

 

Thanks!


An Nvidia GPU on Sierra should use an iMac14,1 or iMac14,2 system definition... at least that is what it takes on my GTX 760.
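If you're changing it, the system definition lives in the SMBIOS section of Clover's config.plist.  A minimal sketch (only the ProductName key shown; a full SMBIOS block also carries serial numbers and the like, omitted here):

```xml
<!-- Fragment of Clover's config.plist (property list XML).
     ProductName is the system definition macOS reports;
     other SMBIOS keys are omitted from this sketch. -->
<key>SMBIOS</key>
<dict>
    <key>ProductName</key>
    <string>iMac14,2</string>
</dict>
```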

 

Important to note about the 760 from nvidia's site: "3840x2160 at 30Hz or 4096x2160 at 24Hz supported over HDMI. 4096x2160 (including 3840x2160) at 60Hz supported over Displayport. Support for 4k tiled MST displays requires 326.19 driver or later."
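Those limits line up with the raw link bandwidth.  A quick back-of-the-envelope check, assuming 8-bit RGB and the standard CTA-861 4k timings (the pixel clocks and link ceilings are from the HDMI specs; the helper function is just for illustration):

```python
def tmds_rate_gbps(pixel_clock_mhz: float) -> float:
    """On-the-wire TMDS bit rate: 3 channels x 10 bits per pixel clock
    (8b/10b encoding), i.e. 30 bits per pixel for 8-bit RGB."""
    return pixel_clock_mhz * 1e6 * 3 * 10 / 1e9

# Standard CTA-861 pixel clocks for 3840x2160, blanking included:
rate_4k30 = tmds_rate_gbps(297.0)   # 8.91 Gbit/s
rate_4k60 = tmds_rate_gbps(594.0)   # 17.82 Gbit/s

HDMI_1_4_MAX = 10.2   # Gbit/s -- why 4k tops out at 30Hz over HDMI 1.4
HDMI_2_0_MAX = 18.0   # Gbit/s -- 4k60 barely fits

print(f"4k30: {rate_4k30:.2f} Gbit/s, fits HDMI 1.4: {rate_4k30 <= HDMI_1_4_MAX}")
print(f"4k60: {rate_4k60:.2f} Gbit/s, needs HDMI 2.0: {rate_4k60 > HDMI_1_4_MAX}")
```

Note how close 17.82 Gbit/s sits to the 18 Gbit/s ceiling -- one reason a long or marginal cable can negotiate 4k60 and then drop the signal.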

 

Might be the same for the radeon card too. You'll have to investigate. 

 

I've never used an adapter on my DP port to push 4k, so I can't tell you if that is the problem.  Or maybe you aren't using an HDMI 2.0-rated cable?  That could be the common problem between both cards.

 

 


Thanks an awful lot for the suggestions; I'm getting closer.  After some tweaking, I noticed that even though I was using the DisplayPort output, it was showing up as 'HDMI or DVI' in System Report.  A little tinkering with the stock Mac OS framebuffer definitions and I've made some progress: I can now get 4K at 60Hz... for a few seconds.  Then the screen blanks out and I get "snow".  Not sure if it's actual snow, because that makes no sense, or something the TV generates to simulate snow on signal loss.  I'm using a fairly long HDMI cable, so that may be the problem.

 

Another issue I'm seeing: if I use a "scaled" display, the option for 60Hz vanishes, except for 1920x1080.  I can get that in Retina at 60Hz, as well as unscaled 4k.  Is this normal behaviour?


I think that is related to display modes and what OS X says the monitor can handle.

 

There is a utility/app you can download which I think can force any resolution to be sent to any output, regardless of what was reported to OS X.  It doesn't mean you can make a "monitor" do something it can't, but if it isn't detected properly by OS X, then you can send the right signal.

 

I don't remember what it's called, but it was free last time I checked (about six months ago).  Maybe look into that.

  • 2 weeks later...

Just in case anyone else runs into the same problem in the future -- and since I'm guessing I'm not the only person who keeps their server tucked away in the quiet of the basement, it wouldn't surprise me -- I've got it sorted.  It was the HDMI cable all along.  I ended up picking up a "Luxe Series" active HDMI cable from Monoprice.  I'm currently going from Mini DisplayPort on the Radeon R9 280, through a passive adapter to full-sized DisplayPort, through the Club3D DisplayPort to HDMI adapter, up through the floor and wall to my wall-mounted television.  The cable is CL3 fire-rated for in-wall use.  So much for "all HDMI cables are the same"!  So far it seems solid: 4k@60Hz, and what appears to be 4:4:4.  I'm not 100% sure, because it turns out this is an RGBW panel, but even with the extra stupid subpixel this is so much better than the 1080p display I was using before!

 

Holding down "Option" while clicking "Scaled" in the Displays preference pane brings up all the scaled resolutions I could ever want, and some I don't, all at 60Hz.  Now if I could only get my sound to work.  Off to the next problem!  Thanks for your help, everyone.

