b0mb Posted October 24, 2017 Share Posted October 24, 2017 For whatever reason, I've had to increase the number of connections in NZBGet to get to the same download speed as SAB. No clue why that is. I've got a 300 Mbps connection and I am able to saturate my download. You may also want to take a look at the performance tips on the NZBGet wiki. Though my server is pretty beefy, I reduced the logging level as I just don't need it writing to my SSD constantly for every little thing. It's really weird... I'm using exactly the same config as I did on OMV. I've switched to SAB now, where the speed is maxed out. No idea what might be the reason for the problems with NZBGet.... Sent from my Redmi Note 3 using Tapatalk Quote Link to comment
Tonitram Posted October 24, 2017 Share Posted October 24, 2017 I was so focused on the download path that I didn't realize it was my category paths that were screwing me up. Everything is finally working now. Thanks for the help trurl! I've been working on this for days and couldn't figure out what I was doing wrong, you kept me from going insane lol. 1 Quote Link to comment
billium28 Posted October 26, 2017 Share Posted October 26, 2017 Occasionally my cache drive fills up before the mover has time to move the files. Is it advantageous to not send Nzbget downloads to the cache? Wouldn't that mean my media share would miss out on the cache advantages? Quote Link to comment
Squid Posted October 26, 2017 Share Posted October 26, 2017 3 minutes ago, billium28 said: Occasionally my cache drive fills up before the mover has time to move the files. Is it advantageous to not send Nzbget downloads to the cache? Wouldn't that mean my media share would miss out on the cache advantages? The cache drive and caching of writes to the array were introduced back when average write speeds directly to the array were much slower than they are nowadays. Many people (including myself) only use the cache drive for applications (appdata and the downloads share); all writes to user shares go directly to the array. (That, and I find it impossible to justify to the "boss" why I need a larger cache drive when neither she nor I see any real improvement from caching writes to the media shares.) You still use the cache drive for downloads, but post-processing / moving by Couch / Radarr / Sonarr will bypass the cache and go to the array. Now, if your cache drive isn't big enough to actually handle the size of the downloads, then try setting the download share to Use cache: Prefer. When a file doesn't fit on the cache drive, it will fall over to the array. 1 Quote Link to comment
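For reference, the cache setting Squid mentions is normally changed per share in the Unraid GUI (Shares, then the share's "Use cache disk" option), and it ends up in a small config file on the flash drive. The excerpt below is only an illustration; the exact file name and keys depend on your Unraid version, so treat it as an assumption rather than gospel:

# /boot/config/shares/downloads.cfg (excerpt, illustrative)
shareUseCache="prefer"   # "prefer" keeps new writes on the cache but lets them overflow to the array when the cache is full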
navickas12 Posted October 26, 2017 Share Posted October 26, 2017 Not sure if this is in the right spot, so direct me if it isn't. Having some issues with the Unraid Plex docker not playing nicely with NZBGet. I had this setup running before on Ubuntu with no issues. What is happening now is that whenever NZBGet is downloading, or especially when it is post-processing, the system crawls. Playback stutters every 15 seconds or so until I pause NZBGet, and then it all goes back to normal. I know the hardware is sufficient since it was all running well before Unraid. I grabbed logs from Plex and NZBGet while this issue was actually happening. The other thing I'm noticing is that post-processing takes SIGNIFICANTLY longer in Unraid NZBGet than it did on Ubuntu; a 1.8GB file is taking over half an hour, sometimes 45 minutes. Hardware: https://pastebin.com/afb0NwPD Plex: https://pastebin.com/F1TzetMy NZBGet: https://pastebin.com/HyDfgeJz Unraid: https://pastebin.com/pLSiHGMc Thank you so much! -Andrew Quote Link to comment
billium28 Posted October 27, 2017 Share Posted October 27, 2017 6 hours ago, Squid said: The cache drive and caching of writes to the array were introduced back when average write speeds directly to the array were much slower than they are nowadays. Many people (including myself) only use the cache drive for applications (appdata and the downloads share); all writes to user shares go directly to the array. (That, and I find it impossible to justify to the "boss" why I need a larger cache drive when neither she nor I see any real improvement from caching writes to the media shares.) You still use the cache drive for downloads, but post-processing / moving by Couch / Radarr / Sonarr will bypass the cache and go to the array. Now, if your cache drive isn't big enough to actually handle the size of the downloads, then try setting the download share to Use cache: Prefer. When a file doesn't fit on the cache drive, it will fall over to the array. Thank you, that makes sense Quote Link to comment
NewDisplayName Posted December 11, 2017 Share Posted December 11, 2017 (edited) Ehm, how do I get the extensions to work? I've set it up like you wrote on the first page: /mnt/user/appdata/nzbget is /config, and I placed SafeRename.py in /mnt/user/appdata/nzbget/ppscripts/, but it doesn't show up in the dropdown!? Also, does someone know a fast way to rename many files from /*/*.1 to /*/*.mp4? Should I still change it to 000 (and why)? Because it seems to work just fine; it's set to 1000 for me. Edited December 11, 2017 by nuhll Quote Link to comment
NewDisplayName Posted December 11, 2017 Share Posted December 11, 2017 This *.1 -> *.mp4 issue is really becoming a problem, because my cache drive is filling up: it can't find a movie file, so the directory won't get moved. Anyone have an idea? Quote Link to comment
CHBMB Posted December 11, 2017 Share Posted December 11, 2017 Try making the script executable and checking the permissions Quote Link to comment
NewDisplayName Posted December 12, 2017 Share Posted December 12, 2017 I don't know. I tried:
root@Unraid-Server:~# chmod +x /mnt/user/appdata/nzbget/ppscripts/SafeRename.py
root@Unraid-Server:~# chmod 777 /mnt/user/appdata/nzbget/ppscripts/SafeRename.py
Restarted the docker, no change.... Quote Link to comment
CHBMB Posted December 12, 2017 Share Posted December 12, 2017 Never used scripts before in my life, but worked for me...
mkdir -p /mnt/cache/appdata/nzbget/ppscripts
cd /mnt/cache/appdata/nzbget/ppscripts/
wget https://raw.githubusercontent.com/clinton-hall/GetScripts/master/SafeRename.py
chmod a+x SafeRename.py
Quote Link to comment
NewDisplayName Posted December 12, 2017 Share Posted December 12, 2017 (edited) No no no... but you got me on the right track, many many thanks, it's working now. I didn't see that I can adjust the "scripts" folder in Settings under Paths. It was relying on the MainDir (which I use for /downloads/), so I had to put it in /downloads/scripts/. I was googling so hard for the path that I didn't see I could just adjust it... Sorry, I'm dumb. It's working now, thanks for your help... Is there a way to redo all downloads with this script? Edited December 12, 2017 by nuhll Quote Link to comment
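For anyone else who trips over this: NZBGet resolves its script directory relative to MainDir unless you override it. The relevant lines in nzbget.conf look roughly like the excerpt below (the values are illustrative and the defaults shipped with the linuxserver image may differ):

# nzbget.conf (excerpt, illustrative)
MainDir=/downloads
ScriptDir=${MainDir}/scripts    # post-processing scripts must live here, or point this elsewhere (e.g. /config/ppscripts)

So either drop SafeRename.py into the folder ScriptDir points at, or change ScriptDir (Settings, Paths in the web UI) to the folder you actually use.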
NewDisplayName Posted December 29, 2017 Share Posted December 29, 2017 (edited) Hey, I have another question: I would like to limit downloading while a certain IP is reachable (i.e. throttle it while my PC, or another PC, is running). I know the built-in scheduler, which works, let's say, okay. But I guess someone with a little coding experience could write a quick script that pings 192.168... OR 192.168... and, if one of them responds, limits the download speed to X. It could easily be implemented as a schedule every 5 minutes that sets or removes the limit. I would pay for it. Thanks for any help. Edited December 29, 2017 by nuhll Quote Link to comment
trurl Posted December 29, 2017 Share Posted December 29, 2017 9 hours ago, nuhll said: Hey, I have another question: I would like to limit downloading while a certain IP is reachable (i.e. throttle it while my PC, or another PC, is running). I know the built-in scheduler, which works, let's say, okay. But I guess someone with a little coding experience could write a quick script that pings 192.168... OR 192.168... and, if one of them responds, limits the download speed to X. It could easily be implemented as a schedule every 5 minutes that sets or removes the limit. I would pay for it. Thanks for any help. This sort of functionality is usually handled in the router by prioritizing different IPs or ports, a feature known as Quality of Service (QoS). Quote Link to comment
NewDisplayName Posted December 29, 2017 Share Posted December 29, 2017 Thanks, I know that; my router actually even supports it. But I can't use it, because I don't have a fixed internet connection: I have two load-balanced LTE WANs. Quote Link to comment
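Since QoS is out, here is a rough sketch of the kind of script described a couple of posts up: run it every 5 minutes (e.g. with the User Scripts plugin) and it pings the PCs, then throttles NZBGet via its JSON-RPC "rate" method (limit in KB/s, 0 = unlimited). The IP addresses are placeholders, and it assumes the NZBGet web interface is reachable on port 6789 with the default nzbget/tegbzn6789 control credentials; adjust everything to your setup.

#!/bin/bash
# Throttle NZBGet while either PC answers a ping; lift the limit otherwise.
PCS="192.168.1.20 192.168.1.21"                                # hosts to check (placeholder addresses)
NZBGET="http://nzbget:tegbzn6789@192.168.1.10:6789/jsonrpc"    # NZBGet JSON-RPC endpoint (assumed defaults)
LIMIT=5000                                                     # KB/s while a PC is up

rate=0
for ip in $PCS; do
    if ping -c 1 -W 1 "$ip" > /dev/null 2>&1; then
        rate=$LIMIT
        break
    fi
done

# The "rate" method sets the global download speed limit; 0 removes it.
curl -s -d "{\"method\": \"rate\", \"params\": [$rate], \"id\": 1}" "$NZBGET" > /dev/null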
FreeMan Posted January 5, 2018 Share Posted January 5, 2018 I've got the docker setup and working great. It works well with radarr and sonarr, so, of course, it's time to start tinkering! My usenet provider (newshosting.com) offers a free VPN service (as "documented" here and also here and several other pages that I haven't yet read all of). How do I configure NZBGet to access their VPN sites? Most of the things I've seen indicate that I need to download their software (which appears to be the OpenVPN client). I suppose I could DL it, then figure out how to install it in the container and get NZBG to run through it... It looks like the L2TP would be the best option for this, but I don't believe my Netgear WNR3500Lv2 supports L2TP pass-through. Quote Link to comment
CHBMB Posted January 31, 2018 Share Posted January 31, 2018 I've installed the NZBGet docker on my Synology server and mapped the folder /DOWNLOAD with rw permissions within the docker settings. When starting, the following errors occur:
nzbget.conf(55): Invalid value for option "TempDir" (/DOWNLOAD/tmp): could not read information for directory /DOWNLOAD/tmp: errno 13, Permission denied
nzbget.conf(52): Invalid value for option "QueueDir" (/DOWNLOAD/queue): could not read information for directory /DOWNLOAD/queue: errno 13, Permission denied
nzbget.conf(46): Invalid value for option "NzbDir" (/DOWNLOAD/nzb): could not read information for directory /DOWNLOAD/nzb: errno 13, Permission denied
/DOWNLOAD/nzbget-2018-01-31.log: Permission denied
/DOWNLOAD/nzbget-2018-01-31.log: Permission denied
/DOWNLOAD/nzbget-2018-01-31.log: Permission denied
/DOWNLOAD/nzbget-2018-01-31.log: Permission denied
/DOWNLOAD/nzbget-2018-01-31.log: Permission denied
[ERROR] nzbget.conf(55): Invalid value for option "TempDir" (/DOWNLOAD/tmp): could not read information for directory /DOWNLOAD/tmp: errno 13, Permission denied
[ERROR] nzbget.conf(52): Invalid value for option "QueueDir" (/DOWNLOAD/queue): could not read information for directory /DOWNLOAD/queue: errno 13, Permission denied
[ERROR] nzbget.conf(46): Invalid value for option "NzbDir" (/DOWNLOAD/nzb): could not read information for directory /DOWNLOAD/nzb: errno 13, Permission denied
[INFO] Pausing all activities due to errors in configuration
When I open a bash shell, the UID and GID in use are root, and I can manually create/edit/delete folders within the /DOWNLOAD path inside this docker container:
uid=0(root) gid=0(root) groups=0(root),1(bin),2(daemon),3(sys),4(adm),6(disk),10(wheel),11(floppy),20(dialout),26(tape),27(video)
What is the next step to troubleshoot this issue? For now I'm stuck.
This forum is for Unraid users; either post in the linuxserver forum or go to our discord channel or IRC room for support on other platforms. Sent from my LG-H815 using Tapatalk Quote Link to comment
hari-bo Posted January 31, 2018 Share Posted January 31, 2018 I'm sorry, I thought this was for docker container support. I will search for another docker which is supported on other distros as well. Quote Link to comment
trurl Posted January 31, 2018 Share Posted January 31, 2018 1 hour ago, hari-bo said: I'm sorry, I thought this was for docker container support. I will search for another docker which is supported on other distros as well. This docker is supported on other platforms, just not in the unRAID forum. Exactly as CHBMB said above: 1 hour ago, CHBMB said: either post in the linuxserver forum or go to our discord channel or IRC room for support on other platforms. Quote Link to comment
djsnafu Posted February 14, 2018 Share Posted February 14, 2018 (edited) Since updating to 6.4.1 I've experienced a slowdown in NZBGet. All of a sudden I'm not able to download at my full gigabit speed and seem to be stuck around 35 MB/s. On my Windows 10 machine I've tested my connection with Usenet and I'm downloading at full gigabit speed. Tried searching around to see if anyone else was experiencing this same thing but haven't found anything. Any ideas? Thanks unraid-diagnostics-20180213-1507.zip Edit: Never mind, I figured it out. I had Extra Parameters: --cpuset-cpus=0,4. But this never behaved this way (slowing the download speed) prior to 6.4.1. Once I removed the parameter it downloaded at the full 100 MB/s. Edited February 16, 2018 by djsnafu Quote Link to comment
Earache Posted March 14, 2018 Share Posted March 14, 2018 I'm getting this now... Post-process-script nzbToMedia-nightly/nzbToNzbDrone.py ... failed (terminated with unknown status) All the settings are correct so I don't understand what is happening here... Quote Link to comment
tyrindor Posted March 19, 2018 Share Posted March 19, 2018 (edited) I got this up and running and most of the time it's fine. However, every now and then it'll download a file and not give it the correct permissions (they need to be accessible by the nobody user). I have umask set to 000 as suggested in the OP. This only affects some downloaded files; the majority are fine. Any ideas? Edited March 20, 2018 by tyrindor Quote Link to comment
hawihoney Posted March 27, 2018 Share Posted March 27, 2018 I don't know if this is a Docker or Unassigned Devices thing. I receive permission denied for all sorts of files and directories. If I look at directories and files from within the docker container, user "abc" and group "abc" have permission. Outside the docker container it's "nobody:users". The docker container is running as "99:100" and all directories but "/config" are set to "Slave/RW". What else do I need to do? Here's the docker view vs. the unRAID view:
root@Tower:/mnt/disks/UA01/nzbget# docker exec -it nzbget ls -lisa /nzbget/intermediate
total 12
4297828992  0 drwxrwxrwx 3 abc abc   73 Mar 26 10:38 .
        99  0 drwxrwxrwx 7 abc abc  124 Mar 27 07:59 ..
2147483745 12 drwxrwxrwx 2 abc abc 8192 Mar 26 11:03 'Test.#17489'
root@Tower:/mnt/disks/UA01/nzbget# ls -lisa /mnt/disks/UA01/nzbget/intermediate/
total 12
4297828992  0 drwxrwxrwx 3 nobody users   73 Mar 26 10:38 ./
        99  0 drwxrwxrwx 7 nobody users  124 Mar 27 07:59 ../
2147483745 12 drwxrwxrwx 2 nobody users 8192 Mar 26 11:03 Test.#17489/
Attached you'll find the important settings. If you need diagnostics I can attach them too. Any help is highly appreciated. Quote Link to comment
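Those two listings are actually the same numeric owner seen through two different name databases: inside the container UID 99 / GID 100 show up as "abc", while on the host they show up as "nobody:users". A quick way to confirm that (a sketch, assuming the container is named nzbget and really runs with PUID=99 / PGID=100 as the settings suggest):

docker exec nzbget id abc                                 # expect uid=99(abc) gid=100(abc)
id nobody                                                 # on the Unraid host: uid=99(nobody) gid=100(users)
stat -c '%u:%g %n' /mnt/disks/UA01/nzbget/intermediate    # numeric owner should read 99:100 from either side

If the numbers match, the differing names are only cosmetic.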
ICDeadPpl Posted April 5, 2018 Share Posted April 5, 2018 Could we get these python modules added: py2-requests-oauthlib, py2-markdown, py2-decorator? They are required to get the subtitles post-processing script from https://github.com/caronc/nzb-subliminal to work. For now I have a script that runs "apk add --update py2-requests-oauthlib py2-markdown py2-decorator" after each update of the docker as a workaround. Quote Link to comment
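Until those modules are baked into the image, the workaround above can be automated. A minimal sketch, assuming the container is named nzbget and that you trigger this from the Unraid host (e.g. via the User Scripts plugin) after each container update:

#!/bin/bash
# Re-install the extra Python 2 modules inside the running nzbget container.
# Anything installed with apk lives in the container layer, so it is lost whenever the image is updated.
docker exec nzbget apk add --update py2-requests-oauthlib py2-markdown py2-decorator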
taalas Posted April 22, 2018 Share Posted April 22, 2018 I am currently planning to migrate my gfjardim NZBget docker to this one. Is there anything I should be aware of? Can I just keep my existing configuration? Any hints on how to best do this? Thanks! Quote Link to comment