[Plugin] Nvidia-Driver


ich777

Recommended Posts

38 minutes ago, HellraiserOSU said:

Well I thought it was fixed, but now it's back to this.

This is really strange...

Have you already tried to boot into Legacy mode?

 

38 minutes ago, HellraiserOSU said:

Edit - did a server reboot and it appeared in the installed GPUs for a second, and then it went back to the RmInitAdapter failed errors.

How long did it work before?

What kind of power supply do you have?

 

Have you got a second PCIe socket where you can put the card in?

 

Seems like the card has fallen off the bus. Is a monitor attached to the card?

Link to comment
9 hours ago, ich777 said:

This is really strange...

Have you already tried to boot into Legacy mode?

 

How long did it work before?

What kind of power supply do you have?

 

Have you got a second PCIe socket where you can put the card in?

 

Seems like the card has fallen off the bus. Is a monitor attached to the card?

 

It is booting in Legacy mode.

It worked, I think, for a day before... I wasn't really monitoring it.

I own a 750W power supply.

Nothing in a second PCIe slot

No monitor attached... it's headless. When I do plug in a monitor, it goes into the motherboard HDMI port to use the Intel GPU.

 

Link to comment
5 hours ago, HellraiserOSU said:

No monitor attached... it's headless. When I do plug in a monitor, it goes into the motherboard HDMI port to use the Intel GPU.

Can you try attaching a monitor to the Nvidia card? Is the iGPU set as the primary graphics card?

Link to comment
11 hours ago, ich777 said:

Can you try attaching a monitor to the Nvidia card? Is the iGPU set as the primary graphics card?

OK, right now I have the monitor attached to the NVIDIA card... the iGPU is set as the primary graphics card.

I rebooted and so far it's OK... I have a feeling it isn't because of plugging into the HDMI, but I could be wrong.
I also removed the GPU Statistics plugin.
It seemed that after I booted up, if I went to the Dashboard or the NVIDIA plugin page, it would show that init fail message... so far so good, but we'll see :)

Link to comment
13 hours ago, HellraiserOSU said:

I have a feeling it isn't because of plugging into the HDMI, but I could be wrong

Just a guess, but could it be that you have one of the newer cards with a mining limiter, and you actually have to plug in a monitor since otherwise it believes it's being used for mining?

This is really just a wild guess, but it's the only thing that comes to mind.

 

13 hours ago, HellraiserOSU said:

It seemed that after I booted up, if I went to the Dashboard or the NVIDIA plugin page, it would show that init fail message... so far so good, but we'll see :)

Is it still working?

Link to comment
4 hours ago, ich777 said:

Just a guess, but could it be that you have one of the newer cards with a mining limiter, and you actually have to plug in a monitor since otherwise it believes it's being used for mining?

This is really just a wild guess, but it's the only thing that comes to mind.

 

Is it still working?

 

I didn't touch the UI until this morning.
I went through the logs and saw nothing out of the ordinary.
As soon as I clicked on NVIDIA Driver in the plugins:

No devices found.

RmInitAdapter failed!  appears in the logs.
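For anyone chasing the same message: the failure lines can be pulled straight out of the syslog. A minimal sketch, grepping a sample file; the NVRM log line below is a hypothetical stand-in, and on a live Unraid box you would grep /var/log/syslog instead:

```shell
# Write a sample (hypothetical) syslog line, then grep it the same way you
# would grep the real /var/log/syslog on the server.
sample_log="/tmp/sample_syslog.$$"
printf 'kernel: NVRM: RmInitAdapter failed!\n' > "$sample_log"

# Count how many init failures are recorded (prints 1 for the sample above).
grep -c "RmInitAdapter failed" "$sample_log"

rm -f "$sample_log"
```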


I'm about ready to give up :)

Link to comment
4 minutes ago, HellraiserOSU said:

No devices found.

RmInitAdapter failed!  appears in the logs.

Sorry, I really can't help with this since I think there is a setting somewhere that prevents the card from working correctly.

Have you already tried enabling persistence mode?

 

Are you now booted in Legacy or UEFI mode?

 

Have you checked whether hardware transcoding is working or not? Was it working before?

 

5 minutes ago, HellraiserOSU said:

I'm about ready to give up :)

Isn't this the wrong smiley?

Link to comment
23 minutes ago, LeGreatMaxiking said:

Is it possible to also use a VM while having this plugin installed? I can't split the IOMMU group while this plugin is installed, or can I? Any other way to pass my 1660 Super through to my Win 10 VM?

I don't understand...

What do you want to do? Do you want to use it in a VM or in a Docker container?

Link to comment

@HellraiserOSU I seem to be having the exact same issue, on a multi-GPU system however, and I have it on one of the GT710s I use for passthrough to VMs (I have 2x GT710 and 1x GTX 1050 Ti; the 1050 is used for transcoding in Dockers, and then we have AST onboard graphics to boot up Unraid).

 

Have you found anything useful?

I'm only getting the error when trying to open the Nvidia driver plugin while the assigned VM is not started.

When I start the specific VM, I get the following 'statement': vfio-pci 0000:42:00.0: Invalid PCI ROM header signature: expecting 0xaa55, got 0x3f00, but the VM will boot without issues and is able to use the card.

 

@ich777 maybe this information helps you? When I have the VM booted, the driver plugin loads fine and I can open it; however, it will show the 1050 Ti, not any of the other GPUs.

Link to comment
11 minutes ago, Caennanu said:

Have you found anything useful?

Bind the two cards that you use in the VMs to VFIO and the error should be gone.

 

11 minutes ago, Caennanu said:

however it will show the 1050 Ti, not any of the other GPUs

If you've started the other VMs, the plugin can't see those cards because they are in use by the VMs.

Do you want to use the GT710 for Docker containers? If yes, for what?

Link to comment

@ich777 okay, I can do that. Binding... never really read up on what it's actually for. And good to know. But maybe it's similar for Hellraiser?

 

No no, the GT710s are nothing more than offloading basic graphics for the VMs from the CPU, while having the option to hook up a monitoring display to them (a CCTV setup in a room near where the server is). The 1050 is for Docker containers. And well, I tagged you on that one the other day, so that works ;)

Link to comment
2 minutes ago, Caennanu said:

Binding... never really read up on what it's actually for.

You do that to "hide" them from unRAID so they can be used exclusively in VMs.

 

Go to your System Devices, bind the two cards, including the audio devices of the cards, to VFIO, and reboot so that the plugin can only "see" the 1050 Ti.
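To double-check that the binding took effect after the reboot, you can look at which kernel driver each GPU function is using. A minimal sketch, assuming the usual `lspci -nnk` output format; the excerpt below is a hardcoded sample (hypothetical address and IDs), and on the server you would pipe the real `lspci -nnk` output through the same grep:

```shell
# Sample (hypothetical) excerpt of `lspci -nnk` output for a bound card.
sample='42:00.0 VGA compatible controller [0300]: NVIDIA Corporation GK208B [GeForce GT 710] [10de:128b]
	Kernel driver in use: vfio-pci'

# A card bound to VFIO shows "vfio-pci" here; an unbound Nvidia card would
# normally show "nvidia" once the driver plugin has loaded.
printf '%s\n' "$sample" | grep "Kernel driver in use"
```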

Link to comment
1 minute ago, ich777 said:

You do that to "hide" them from unRAID so they can be used exclusively in VMs.

 

Go to your System Devices, bind the two cards, including the audio devices of the cards, to VFIO, and reboot so that the plugin can only "see" the 1050 Ti.

Ahh OK, gotcha.

Funny thing is, I knew how to, just not why to ;)

Link to comment
1 hour ago, jang430 said:

Is it possible to install this driver, and use it when needed?

Yes and no; I'd never recommend doing it that way.

 

1 hour ago, jang430 said:

I also have a GPU passed through to a VM. I will turn off the VM when I'm about to use Docker containers that use the GPU.

The main issue with such a configuration is that you will most likely end up with a server hard lock-up.

You can try it, but keep in mind that if your VM is started before a container that also uses the card, the container won't be able to start.

If you are transcoding something and you accidentally start the VM, this can result in a hard lock-up.

Link to comment

I see. I guess I can't take advantage of my current GPU. What kind of GPU can Handbrake benefit from? I currently have a 1050 Ti connected. I only have a single slot available, and it should be one slot wide only. Will a GT710 video card produce good results?

Link to comment
1 hour ago, jang430 said:

Will a GT710 video card produce good results?

From what I know, no, because it can't even transcode H.265.

 

I think a Quadro P200 or a P2000 (more expensive) is what can help you out if you've only got one free slot.

Link to comment
On 7/13/2021 at 7:20 AM, ich777 said:

Sorry, I really can't help with this since I think there is a setting somewhere that prevents the card from working correctly.

Have you already tried enabling persistence mode?

 

Are you now booted in Legacy or UEFI mode?

 

Have you checked whether hardware transcoding is working or not? Was it working before?

 

Isn't this the wrong smiley?

 

Persistence mode? What is that?

I am booting in Legacy.

Hardware transcode... it says in Plex that it's using hardware transcoding even though it's not finding the NVIDIA card.

I'm a developer and I do hardware... I'm used to things not working at times, so I just smile and move on :)

Link to comment
On 7/19/2021 at 2:57 AM, Caennanu said:

@HellraiserOSU I seem to be having the exact same issue, on a multi-GPU system however, and I have it on one of the GT710s I use for passthrough to VMs (I have 2x GT710 and 1x GTX 1050 Ti; the 1050 is used for transcoding in Dockers, and then we have AST onboard graphics to boot up Unraid).

 

Have you found anything useful?

I'm only getting the error when trying to open the Nvidia driver plugin while the assigned VM is not started.

When I start the specific VM, I get the following 'statement': vfio-pci 0000:42:00.0: Invalid PCI ROM header signature: expecting 0xaa55, got 0x3f00, but the VM will boot without issues and is able to use the card.

I have not, sorry. I'm not as worried about it now, since we don't go out as much, so I only use Plex when we are out of the house, and use Kodi when we are inside and just do direct play. All my TVs have NVIDIA Shields so they can play pretty much anything as-is.

 

I only have a single GPU in my setup, not counting the Intel GPU. I can try to put in a GTX 980. I'll have to see if I can find the extra power cable and try that... or the RTX 3080... but I'm guessing I'll run into the same thing.

Link to comment
5 hours ago, HellraiserOSU said:

Persistence mode? What is that?

Put these lines at the bottom of your /boot/config/go file and reboot to enable it:

# Enable persistence mode for Nvidia cards
nvidia-persistenced
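Once the daemon is running after a reboot, `nvidia-smi` can report whether persistence mode is actually on. The sketch below parses a stand-in value instead of calling the real command (which needs the driver loaded); the query shown in the comment is the one you would run on the server:

```shell
# Stand-in for:  nvidia-smi --query-gpu=persistence_mode --format=csv,noheader
# On a working setup that query prints "Enabled".
mode="Enabled"   # hypothetical sample value

if [ "$mode" = "Enabled" ]; then
  echo "persistence mode is active"
else
  echo "persistence mode is NOT active"
fi
```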

 

5 hours ago, HellraiserOSU said:

it says in Plex that it's using hardware transcoding even though it's not finding the NVIDIA card.

That's really strange, but maybe it's doing it via your iGPU (if you have one).

Link to comment
5 hours ago, Nanobug said:

Hello,

I have this issue again:

Without the diagnostics I can't do anything.

 

Something seems wrong on your system...

Can you try running 'ls -la /boot/config/nvidia-driver/packages'?

 

What did you do before it stopped working? Did you update to the newer driver version?

Link to comment
32 minutes ago, ich777 said:

Without the diagnostics I can't do anything.

 

Something seems wrong on your system...

Can you try running 'ls -la /boot/config/nvidia-driver/packages'?

 

What did you do before it stopped working? Did you update to the newer driver version?

It said a drive was missing, so I turned it off to have a look, and when I turned it on again, it couldn't start Plex, and I noticed the GPUID was missing again. 

 

Diagnostics are attached.

 

ls -la /boot/config/nvidia-driver/packages gave me this:

[screenshot of the packages directory listing attached]

 

nanostorage-diagnostics-20210721-2220.zip

Link to comment
