[Support] Linuxserver.io - NZBGet

Recommended Posts

Good afternoon!

I just got NZBGet set up and mistakenly added a bunch of downloads to the queue with no post-processing enabled.  A bunch finished downloading, a bunch have not, and NONE of them have post-processing enabled.  The ones that are done need to be renamed and moved; the ones still queued, I don't want downloading without the script enabled.  I just can't get it enabled without going through each one individually: clicking the post-process button, clicking Yes, then clicking Save.

Anyway, that leaves me with 2 groups to deal with.  The completed batch and the queued batch. 

For the ones that are done, I can go to the History tab, click each title, click the post-process button, set it to Yes, click Save, then check them all and click "Post-Process Again".  That works but is INCREDIBLY time-consuming, and I'm not sure what else I can do with them.  Is there a way to bulk-enable the post-processing script and have them all post-process again?  I dunno.  Maybe even just run the VideoSort script directly on the directories I need to?...if that's even possible?


For the ones that are queued, I've tried changing the category, setting the global category, everything I can think of, and it's not working.  If I click a title and click the post-process button, it still says "no", even though I changed the category to one with PP enabled.  I'm thinking the PP behavior is set when the download itself is imported, so the best way to fix the batch that hasn't downloaded yet may be to just clear it all out and re-add it with the proper category set.  But again, I dunno.
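For what it's worth, NZBGet has a JSON-RPC API, and in principle the queued batch could be bulk-edited through its `editqueue` method instead of clicking through each item. This is only a sketch: the host, the default credentials, and especially the `VideoSort.py:=yes` parameter syntax for per-download script activation are assumptions that should be checked against the NZBGet API reference for your version.

```python
import base64
import json
import urllib.request

# Hypothetical endpoint and the container's stock credentials -- adjust for your setup.
NZBGET_URL = "http://192.168.1.50:6789/jsonrpc"
NZBGET_USER = "nzbget"
NZBGET_PASS = "tegbzn6789"

def build_editqueue(command, param, nzb_ids):
    """Build a JSON-RPC payload for NZBGet's 'editqueue' method.
    (NZBGet v16+ uses editqueue(Command, Param, IDs); older versions
    take an extra Offset argument -- check your version's API docs.)"""
    return {"method": "editqueue", "params": [command, param, nzb_ids], "id": 1}

def call_nzbget(payload):
    """POST a JSON-RPC payload to NZBGet using HTTP basic auth."""
    req = urllib.request.Request(
        NZBGET_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    token = base64.b64encode(f"{NZBGET_USER}:{NZBGET_PASS}".encode()).decode()
    req.add_header("Authorization", f"Basic {token}")
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.loads(resp.read())

# NZBIDs would come from the 'listgroups' method; these are placeholders.
queued_ids = [101, 102, 103]
# 'VideoSort.py:=yes' is my guess at the per-download script-activation
# parameter -- verify the exact name and syntax in the API reference.
payload = build_editqueue("GroupSetParameter", "VideoSort.py:=yes", queued_ids)
# print(call_nzbget(payload))  # uncomment to actually send the request
```

If the syntax checks out, the same loop could be pointed at history items too (the API has history commands alongside the queue ones).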

Hope all that made sense...lol


I'm open to ideas.  All this has been a NIGHTMARE to get set up and I'm so close to being done...hahaha.

I appreciate it!

Link to comment

UPDATE:  Ended up using (the MIRACULOUS...GLORIOUS...GENIUS) FileBot to take care of the "completed batch".  So that's no longer an issue.


I will also take care of all the queued stuff after it downloads.  No big deal.  FileBot got everything sorted out in 2 seconds.

Then, going forward, I set up the FileBot Docker container to do all this automatically.  So, no scripts/categories/etc necessary.



  • 2 weeks later...

As I don't know whether this is a Radarr issue or an NZBGet issue, I will post my issue in this thread as well:


I am having a frustrating problem: Radarr can't seem to find the paths for movies downloaded from NZBGet.

I have seen others have this problem, and one proposed solution is to change the path name in NZBGet to /data, as in Radarr.  That did not work.

I tried running Safe New Permissions and New Permissions.  That did not work either.


This also started happening out of nowhere: I never used to have this problem until about a month ago.  Sometimes it fixes itself, but that never lasts long, and the same problem starts occurring again.

How do I stop this from happening?

26 minutes ago, Squid said:

Because you mentioned doing this, it's probably best to post your docker run commands for each of the apps as per https://forums.unraid.net/topic/57181-docker-faq/#comment-564345





I suppose the one thing this will show that's different is that I followed the tutorial that puts NZBGet on the same network as DelugeVPN, so my usenet downloads run through my VPN. I'm not sure why that would affect Radarr's folder permissions, though.

  • 3 weeks later...

I recently changed my subnet, and now I cannot forward NZBGet through my Privoxy VPN container anymore, as outlined in Spaceinvader's tutorial:




NZBGet is working fine, nothing wrong shows up in the log, and I can access it when the network type is set to bridge, but not when I forward it through a VPN container.  The extra parameters are 100% correct.


Is there a way to address this?

  • 1 month later...

I'm having the same issue as above: the linuxserver NZBGet and NZBHydra2 UIs are no longer reachable through the VPN by IP:PORT as per SpaceInvader's tutorial (it had been working for years). Linuxserver Radarr and Sonarr are fine and reachable through the bridge/Letsencrypt reverse proxy, and both say that NZBGet and Hydra2 are reachable. Changing the network type to bridge makes the UIs accessible again.


This morning both Radarr and Sonarr said they couldn't reach either NZBGet or Hydra2, which went away after an unrelated reboot. I see no errors in either the application or Unraid logs.


edit: figured it out. I use the binhex-rtorrentvpn docker for VPN access, and it recently changed how access is granted (I had to add a new ports variable and change the docker configs to use localhost rather than an IP).

Edited by tiphae

I have run into a problem where Sonarr cannot contact NZBGet.


I get the following message:

NzbDrone.Core.Download.Clients.DownloadClientException: Unable to connect to NzbGet. HTTP request failed: [503:ServiceUnavailable] [POST] at []


Docker allocations are all correct as well.


I should mention that NZBGet is working and is actually reachable on the IP and port number from a browser.
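Since the browser reaches it but Sonarr gets a 503, one way to narrow it down is to probe the same API endpoint Sonarr talks to from another machine or container. A small sketch follows; the host, port, and credentials are placeholders, and the URL form (`/{user}:{pass}/jsonrpc/version`, which NZBGet accepts for a simple GET health check) is worth confirming against the API docs.

```python
import json
import urllib.error
import urllib.request

def version_url(host, port, user, password):
    """NZBGet serves its API at /{user}:{pass}/jsonrpc; the 'version'
    method can be called with a plain GET as a quick health check."""
    return f"http://{host}:{port}/{user}:{password}/jsonrpc/version"

def probe(url, timeout=5):
    """Return (status, body) so an HTTP 503 (e.g. from a proxy in the
    middle) is distinguishable from a refused/unreachable connection."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status, json.loads(resp.read())
    except urllib.error.HTTPError as e:
        return e.code, e.read().decode(errors="replace")
    except OSError as e:
        return None, str(e)

# Example with placeholder address/credentials:
# print(probe(version_url("192.168.1.50", 6789, "nzbget", "tegbzn6789")))
```

If the probe from the Sonarr container's network returns 503 while the same URL works from the LAN, the problem is in the routing between the containers rather than in NZBGet itself.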


(Screenshots attached: "Skærmbillede 2021-03-01 kl. 08.21.27.png" and "Skærmbillede 2021-03-01 kl. 08.28.34.png")

Edited by elcapitano

I'm getting post-processing issues on most, but not all, files. This all started with the Python 3 issue a while back, which I resolved by updating the syntax in my scripts. That fix didn't last; I assume the scripts get replaced when the container is updated? I fixed the scripts again, but I'm still getting post-processing errors.

My quick fix was to turn off the bad scripts in the file's post-processing window, but that doesn't work anymore. The PP error says that the Password Detector failed, even though that script isn't enabled. The first file I changed processed OK, the rest did not. I restarted the container, but still no joy.


If I dig a little deeper, I see this error:


ERROR Mon Mar 08 2021 23:44:50  Could not post-process again MyFile.S01E08.1080p.BluRay.x264-CLOVER: destination directory /Video/tv/Series/MyFile.S01E08.1080p.BluRay.x264-CLOVER doesn't exist


Any bright ideas about what's going on and how to fix this permanently?
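A "destination directory doesn't exist" error often comes down to the path not being visible from inside the container (e.g. /Video/tv isn't mapped into the NZBGet container even if it exists on the host). A tiny helper like the one below, run inside the container via docker exec, could confirm what the script actually sees; the path used is just the one from the error above, and whether creating it is the right fix depends on your mappings.

```python
import os

def check_destination(path, create=False):
    """Report whether a post-processing destination exists as seen from
    inside the container; optionally create it (including parents)."""
    if os.path.isdir(path):
        return "ok"
    if create:
        os.makedirs(path, exist_ok=True)
        return "created"
    return "missing"

# e.g. inside the NZBGet container, against the path from the error:
# print(check_destination("/Video/tv/Series"))
```

If it reports "missing" inside the container but the directory exists on the host, the container's volume mappings need fixing rather than the script.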

  • 1 month later...

I've run into the same issue as elcapitano above. Everything was working well a few days ago, and now Sonarr and Radarr are unable to connect to nzbget.


[v3.0.6.1196] NzbDrone.Core.Download.Clients.DownloadClientUnavailableException: Unable to connect to NzbGet. Error getting response stream (ReadDoneAsync2): ReceiveFailure: '' ---> System.Net.WebException: Error getting response stream (ReadDoneAsync2): ReceiveFailure: '' ---> System.Net.WebException: Error getting response stream (ReadDoneAsync2): ReceiveFailure


I've been trying various fixes without any progress: reinstalling NZBGet, deleting appdata, changing the port, etc. The NZBGet download client connection tests fine in Sonarr and Radarr, but they're unable to add content for download. I can use the NZBGet web UI just fine, and the docker allocations all match. I can still add NZB downloads manually.


  • 2 weeks later...


I had the same issue!

Everything was working as it should, and then my server updated the container automatically and the issue appeared.

I solved it by rolling back to an older version (v21.0-ls70).


Here is how to use an older version:

- Save the configuration file from nzbget (settings->system->backup)

- Remove the container (also remove image)

- Install the container from CA, but make one adjustment in the template:


(Set the Repository to: "linuxserver/nzbget:amd64-v21.0-ls70")


If you need another architecture, here is the list


- BOOM! Hopefully it works. If the container does not start, try another architecture.

- Restore your settings (settings->system->restore)
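As a small illustration of the tag pattern from the steps above (only the amd64 tag is given in this thread; the other architecture prefixes are my assumption from linuxserver's usual naming, so verify the tag exists on Docker Hub before pasting it into the Repository field):

```python
def pinned_image(arch: str, version: str = "v21.0-ls70") -> str:
    """Compose a pinned linuxserver/nzbget image reference for the
    Repository field of the unraid template. Arch prefixes other than
    amd64 are assumed from linuxserver conventions -- confirm the tag
    actually exists on Docker Hub first."""
    known = {"amd64", "arm64v8", "arm32v7"}
    if arch not in known:
        raise ValueError(f"unexpected arch {arch!r}; expected one of {sorted(known)}")
    return f"linuxserver/nzbget:{arch}-{version}"

# pinned_image("amd64") matches the repository string quoted above:
# "linuxserver/nzbget:amd64-v21.0-ls70"
```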




Thanks! I ultimately ended up switching to SABnzbd after trying a few more avenues to no avail. I should have realized I could just roll back the container; the timing definitely coincided with the weekly automatic update.


Hopefully, with the recent news that NZBGet development has resumed, we'll see issues like this fixed!

  • 3 weeks later...
  • 4 months later...

My NZBGet is not downloading after I did a successful parity check and disk rebuild; I don't know if that's a coincidence.  It otherwise seems to be working: I can add NZBs to the queue and it says "downloading", but the speed is zero.  I checked that my internet was good and power-cycled unraid, but same problem.

I tried NZBGet "restore" with a backup that is a few months old, but same problem.


Now I want to try removing the container, then re-installing it using unraid's "previous apps".  If I do that, do I lose my 360-item queue that took days to build?  I'm also open to any other suggestions.  Thanks.

Edited by xrqp

Hey.  I did remove the NZBGet container, then re-installed it using unraid's "previous apps".  And IT WORKED!  I did not lose my download queue, and it is downloading at full speed.  What a relief!


Unraid and my dockers frequently give me a good scare, but usually I can fix it, with time and effort.  Is it fair to assume it will become more bulletproof as the years proceed?


Sometimes I wonder if running emby with a TV antenna, all the *arrs, disk rebuilds, parity checks, and preclears overloads it and causes problems?


20-hour-later update.  The NZBGet download rate slowed to a crawl, even though my internet tests fast.  I tried again: removed the NZBGet container, then re-installed it using unraid's "previous apps", and now my NZBGet download is fast.  I am guessing it will last 12 to 24 hours, then I may need to do it again.


At the same time I updated unraid from 6.9.1 to 6.9.2, and all my containers needed updates too, so I did those as well.  Maybe that will help keep the download rate high.

Edited by xrqp

Happened again.  It took only 12 hours for the download rate to drop from 5 MB/s to 0.5 MB/s.  I reinstalled again and it "fixed" it again, but now I'm pretty sure it will slowly degrade over 12 hours.  I guess at some point I may switch to SABnzbd, but I hate to recreate the queue.


I have 500 downloads queued.  Is that part of the problem? 

Is there any way to troubleshoot this with logs?

I also wonder if my ISP is doing this.
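On the logs question: NZBGet writes a plain-text log file (in the linuxserver container it typically lands under /config, though the exact location depends on the LogFile setting, so treat the path in the comment below as an assumption). Log lines start with a level word, as in the error quoted earlier in this thread, so a quick filter for errors and warnings around the slowdown could look like this sketch:

```python
def scan_log(lines, levels=("ERROR", "WARNING")):
    """Return log lines whose leading level field matches `levels`.
    NZBGet log lines begin with the level, e.g. 'ERROR  Mon Mar 08 ...'."""
    hits = []
    for line in lines:
        parts = line.split(maxsplit=1)
        if parts and parts[0] in levels:
            hits.append(line.rstrip("\n"))
    return hits

# Usage (the path is an assumption -- check the LogFile option in
# Settings > Logging for where your install actually writes it):
# with open("/config/nzbget.log") as f:
#     for hit in scan_log(f):
#         print(hit)
```

If nothing shows up at ERROR/WARNING while the rate is degraded, that would point away from NZBGet itself and toward the network path (VPN or ISP throttling).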

Edited by xrqp
