[Support] Linuxserver.io - NZBGet


Recommended Posts

 
For whatever reason, I've had to increase the number of connections in NZBGet to reach the same download speed as SAB. No clue why that is. I've got a 300 Mbps connection and I am able to saturate it.
 
You may also want to take a look at the performance tips on the NZBGet wiki. Though my server is pretty beefy, I reduced the logging level, as I just don't need it writing to my SSD constantly for every little thing.
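
(For reference, these are the kinds of nzbget.conf options that wiki page covers; the values below are illustrative only, not recommendations:)

ArticleCache=500     # assemble downloads in RAM instead of writing every article to disk
DirectWrite=yes      # write decoded data straight into the destination files
WriteBuffer=1024     # per-connection write buffer, in KB
WriteLog=rotate      # or "none" to stop writing nzbget.log altogether
DetailTarget=none    # keep the per-article "detail" messages out of the log
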
It's really weird... I'm using exactly the same config as I did on OMV. I've switched to SAB now, where the speed is maxed out. No idea what the reason for the problems with NZBGet might be...

Sent from my Redmi Note 3 using Tapatalk

Link to comment

I was so focused on the download path that I didn't realize it was my category paths that were screwing me up. Everything is finally working now. Thanks for the help, trurl! I've been working on this for days and couldn't figure out what I was doing wrong; you kept me from going insane, lol.

Link to comment
3 minutes ago, billium28 said:

Occasionally my cache drive fills up before the mover has time to move the files. Is it advantageous not to send NZBGet downloads to the cache? Wouldn't that mean my media share would miss out on the cache advantages?

The cache drive and caching writes to the array were introduced back when average write speeds directly to the array were much slower than they are nowadays.

 

Many people (including myself) only use the cache drive for applications (appdata and the downloads share). All writes to user shares go directly to the array. (That, and I find it impossible to justify to the "boss" why I need a larger cache drive when neither she nor I see any real improvement from caching writes to the media shares.)

 

You still use the cache drive for downloads, but post-processing / moving by Couch / Radarr / Sonarr will bypass the cache and go to the array.

 

Now, if your cache drive isn't big enough to actually handle the size of the downloads, then try setting the download share to Use cache: Prefer. When an article doesn't fit on the cache drive, it will fall over to the array.

Link to comment

Not sure if this is the right spot, so direct me if it isn't. I'm having some issues with the Unraid Plex Docker not playing nicely with NZBGet. I had this setup running before on Ubuntu with no issues. What is happening now is that whenever NZBGet is downloading, and especially when it is post-processing, the system crawls. Playback stutters every 15 seconds or so until I pause NZBGet, and then it all goes back to normal. I know the hardware is sufficient since it was all running well before Unraid. I grabbed logs from Plex and NZBGet while this issue was happening. The other thing I'm noticing is that post-processing takes SIGNIFICANTLY longer in Unraid NZBGet than it did on Ubuntu; a 1.8GB file is taking over half an hour, sometimes 45 minutes.


Hardware: https://pastebin.com/afb0NwPD
Plex: https://pastebin.com/F1TzetMy
NZBGet: https://pastebin.com/HyDfgeJz
UnRaid: https://pastebin.com/pLSiHGMc

Thank you so much!
-Andrew

Link to comment
6 hours ago, Squid said:

The cache drive and caching writes to the array were introduced back when average write speeds directly to the array were much slower than they are nowadays.

 

Many people (including myself) only use the cache drive for applications (appdata and the downloads share). All writes to user shares go directly to the array. (That, and I find it impossible to justify to the "boss" why I need a larger cache drive when neither she nor I see any real improvement from caching writes to the media shares.)

 

You still use the cache drive for downloads, but post-processing / moving by Couch / Radarr / Sonarr will bypass the cache and go to the array.

 

Now, if your cache drive isn't big enough to actually handle the size of the downloads, then try setting the download share to Use cache: Prefer. When an article doesn't fit on the cache drive, it will fall over to the array.

Thank you, that makes sense.

Link to comment
  • 1 month later...

Ehm, how do I get the extensions to work? I've set it up like you wrote on the first page:
 

/mnt/user/appdata/nzbget is mapped to /config, and I placed the script at

/mnt/user/appdata/nzbget/ppscripts/SafeRename.py

but it doesn't show up in the drop-down!?


Also, does anyone know a fast way to rename many files from /*/*.1 to /*/*.mp4? :D
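
(For reference, a minimal shell sketch of that kind of bulk rename, assuming the files sit one directory level deep under /mnt/user/downloads; the path and extensions are just placeholders:)

for f in /mnt/user/downloads/*/*.1; do
    mv -n -- "$f" "${f%.1}.mp4"    # swap the .1 suffix for .mp4; -n skips anything that already exists
done
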

 

Should I still change it to 000 (and why)? Because it seems to work just fine, and it's set to 1000 for me.

Edited by nuhll
Link to comment

Nonono... but you got me on the right track. Many, many, many thanks, it's working now.

 

I didn't see that I can adjust the "scripts" folder in Settings under PATHS. It was relative to the MainDir (which I use for /downloads/), so I had to put it in /downloads/scripts/. I was googling so hard for the path that I didn't see I could just adjust it...
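
(For anyone else tripping over this, the relevant entries live in the PATHS section of nzbget.conf / Settings > PATHS; the values below are just an example of how it resolves, not a recommendation:)

MainDir=/downloads
ScriptDir=${MainDir}/scripts    # relative to MainDir unless you point it elsewhere, e.g. /config/scripts to keep scripts in appdata
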


Sorry, I'm dumb af.


It's working now, thanks for your help... Is there a way to redo all downloads with this script? :D

Edited by nuhll
Link to comment
  • 3 weeks later...

Hey, I have another question.

I would like to limit downloading while a certain IP is reachable.

 

(I want to limit it when my PC, or another PC, is started.)

 

I know about the built-in scheduler, which works, let's say, okay.

 

But I guess someone with a little coding experience could write a quick script that pings 192.168... OR 192.168..., and if one of them is there, limits the download to X. It could easily be implemented via a five-minute schedule that sets or removes the limit.


I would pay for it.

 

Thanks for any help.

Edited by nuhll
Link to comment
9 hours ago, nuhll said:

Hey, I have another question.

I would like to limit downloading while a certain IP is reachable.

 

(I want to limit it when my PC, or another PC, is started.)

 

I know about the built-in scheduler, which works, let's say, okay.

 

But I guess someone with a little coding experience could write a quick script that pings 192.168... OR 192.168..., and if one of them is there, limits the download to X. It could easily be implemented via a five-minute schedule that sets or removes the limit.


I would pay for it.

 

Thanks for any help.

This sort of functionality is usually handled in the router by prioritizing different IPs or ports, a feature known as Quality of Service (QoS).
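
(For what it's worth, a rough sketch of that ping-then-throttle idea, runnable every five minutes from cron or the User Scripts plugin. The IPs, the speed limit, and the ControlUsername:ControlPassword are placeholders you would need to change; it uses NZBGet's JSON-RPC "rate" method, which takes the limit in KB/s, with 0 meaning unlimited:)

#!/bin/bash
PC_IPS="192.168.1.50 192.168.1.60"          # placeholder IPs of the PCs to watch
NZBGET="http://192.168.1.10:6789/jsonrpc"   # placeholder address of the NZBGet web UI
AUTH="nzbget:tegbzn6789"                    # ControlUsername:ControlPassword from nzbget.conf
LIMIT=5000                                  # KB/s while one of the PCs is online

rate=0                                      # 0 = no limit
for ip in $PC_IPS; do
    if ping -c 1 -W 1 "$ip" > /dev/null 2>&1; then
        rate=$LIMIT
        break
    fi
done

curl -s -u "$AUTH" -d "{\"method\": \"rate\", \"params\": [$rate]}" "$NZBGET" > /dev/null
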

Link to comment

I've got the docker set up and working great. It works well with Radarr and Sonarr, so, of course, it's time to start tinkering! :)

 

My usenet provider (newshosting.com) offers a free VPN service (as "documented" here and also here, and on several other pages that I haven't yet read all of). How do I configure NZBGet to access their VPN? Most of the things I've seen indicate that I need to download their software (which appears to be the OpenVPN client). I suppose I could download it, then figure out how to install it in the container and get NZBGet to run through it...

 

It looks like L2TP would be the best option for this, but I don't believe my Netgear WNR3500Lv2 supports L2TP pass-through.
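
(A common Docker pattern for this sort of thing, not specific to this image, is to run a separate VPN client container and have NZBGet share its network namespace. Purely a hypothetical sketch, with "vpn" standing in for whatever OpenVPN client container you set up:)

# Extra Parameters on the NZBGet container (Unraid template, advanced view):
--net=container:vpn
# The WebUI port (6789) then has to be published on the "vpn" container instead,
# and the VPN container must be running before NZBGet starts.
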

Link to comment
  • 4 weeks later...
I've installed the NZBGet docker on my Synology server and mapped the folder /DOWNLOAD with rw permissions in the Docker settings.
 
When starting the following errors occur:
nzbget.conf(55): Invalid value for option "TempDir" (/DOWNLOAD/tmp): could not read information for directory /DOWNLOAD/tmp: errno 13, Permission denied
nzbget.conf(52): Invalid value for option "QueueDir" (/DOWNLOAD/queue): could not read information for directory /DOWNLOAD/queue: errno 13, Permission denied
nzbget.conf(46): Invalid value for option "NzbDir" (/DOWNLOAD/nzb): could not read information for directory /DOWNLOAD/nzb: errno 13, Permission denied
/DOWNLOAD/nzbget-2018-01-31.log: Permission denied
/DOWNLOAD/nzbget-2018-01-31.log: Permission denied
/DOWNLOAD/nzbget-2018-01-31.log: Permission denied
/DOWNLOAD/nzbget-2018-01-31.log: Permission denied
/DOWNLOAD/nzbget-2018-01-31.log: Permission denied
[ERROR] nzbget.conf(55): Invalid value for option "TempDir" (/DOWNLOAD/tmp): could not read information for directory /DOWNLOAD/tmp: errno 13, Permission denied
[ERROR] nzbget.conf(52): Invalid value for option "QueueDir" (/DOWNLOAD/queue): could not read information for directory /DOWNLOAD/queue: errno 13, Permission denied
[ERROR] nzbget.conf(46): Invalid value for option "NzbDir" (/DOWNLOAD/nzb): could not read information for directory /DOWNLOAD/nzb: errno 13, Permission denied
[INFO] Pausing all activities due to errors in configuration
 
When I open a bash shell as the used UID and GID, it is root, and I can manually create/edit/delete folders within the /DOWNLOAD path inside this docker.
uid=0(root) gid=0(root) groups=0(root),1(bin),2(daemon),3(sys),4(adm),6(disk),10(wheel),11(floppy),20(dialout),26(tape),27(video)
 
What is the next step to troubleshoot this issue? For now I'm stuck.

This forum is for Unraid users; either post in the linuxserver.io forum or go to our Discord channel or IRC room for support on other platforms.

Sent from my LG-H815 using Tapatalk

Link to comment
1 hour ago, hari-bo said:

I'm sorry, I thought that this was for docker container support. I will search for another docker which is supported on other distros as well.

 

This docker is supported on other platforms, just not in the unRAID forum. Exactly as CHBMB said above:

 

1 hour ago, CHBMB said:

either post in the linuxserver.io forum or go to our Discord channel or IRC room for support on other platforms.

 

Link to comment
  • 2 weeks later...

Since updating to 6.4.1 I've experienced a slowdown in NZBGet. All of a sudden I'm not able to download at my full gigabit speed and seem to be stuck around 35 MB/s. On my Windows 10 machine I've tested my connection with Usenet and I'm downloading at full gigabit speed. I tried searching around to see if anyone else was experiencing the same thing but haven't found anything. Any ideas? Thanks

unraid-diagnostics-20180213-1507.zip

 

Edit: Never mind, I figured it out. I had Extra Parameters: --cpuset-cpus=0,4, but it never behaved this way (slowing download speed) prior to 6.4.1. Once I removed the parameter it downloaded at the full 100 MB/s.

Edited by djsnafu
Link to comment
  • 1 month later...

I got this up and running and most of the time it's fine. However, every now and then it'll download a file and not give it the correct permissions (I need access as the nobody user). I have umask set to 000 as suggested in the OP.

 

This only affects some downloaded files; the majority are fine. Any ideas?
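
(If it helps while troubleshooting, a one-off cleanup from the Unraid console looks roughly like this; /mnt/user/downloads is a placeholder for wherever the affected files land, and the New Permissions tool under Tools does much the same thing for whole shares:)

chown -R nobody:users /mnt/user/downloads        # hand the files back to the standard Unraid user
chmod -R u=rwX,g=rwX,o=rwX /mnt/user/downloads   # the equivalent of umask 000 for files that already exist
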

Edited by tyrindor
Link to comment

I don't know if this is a Docker or an Unassigned Devices thing. I receive permission denied errors for all sorts of files and directories. If I look at the directories and files from within the docker, user "abc" and group "abc" have permission. Outside the docker container it's "nobody:users". The docker container is running as "99:100" and all directories except "/config" are set to "Slave/RW". What else do I need to do?

 

Here's docker view vs. unRAID view:

root@Tower:/mnt/disks/UA01/nzbget# docker exec -it nzbget ls -lisa /nzbget/intermediate
total 12
4297828992  0 drwxrwxrwx 3 abc abc   73 Mar 26 10:38  .
        99  0 drwxrwxrwx 7 abc abc  124 Mar 27 07:59  ..
2147483745 12 drwxrwxrwx 2 abc abc 8192 Mar 26 11:03 'Test.#17489'

root@Tower:/mnt/disks/UA01/nzbget# ls -lisa /mnt/disks/UA01/nzbget/intermediate/
total 12
4297828992  0 drwxrwxrwx 3 nobody users   73 Mar 26 10:38 ./
        99  0 drwxrwxrwx 7 nobody users  124 Mar 27 07:59 ../
2147483745 12 drwxrwxrwx 2 nobody users 8192 Mar 26 11:03 Test.#17489/
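
(One quick sanity check, in case it's only the name mapping that differs: compare the numeric IDs on both sides. "abc" inside the container is expected to map to 99:100 when PUID/PGID are 99/100; the commands below are just illustrative:)

docker exec -it nzbget id abc                # should report uid=99(abc) gid=100(abc)
ls -n /mnt/disks/UA01/nzbget/intermediate    # numeric owner/group as unRAID sees them (99:100 = nobody:users)
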

 

Attached you'll find the important settings. If you need diagnostics I can attach them too.

 

Any help is highly appreciated.

 

nzbget01.jpg

nzbget02.jpg

Link to comment
