[Support] binhex - NZBGet


Recommended Posts

1 hour ago, binhex said:

@jonathanm I don't suppose you would be kind enough to see if you can try this out? It might lead me to finding out what the underlying issue is. It sounds a bit odd, but hey, worth a go, right?

I tried it, but it hung within an hour or so. I typically have at least 2 different machines with browsers open to the GUI, but tried it with one. It appears to me that it was a coincidence for the OP.

Link to comment
6 hours ago, jonathanm said:

I tried it, but it hung within an hour or so. I typically have at least 2 different machines with browsers open to the GUI, but tried it with one. It appears to me that it was a coincidence for the OP.

OK, thanks for trying, it was a long shot 🙂

Link to comment

I was hoping I wouldn't be the only one it works for. It's been unpacking successfully since my last post.
 

I've been tracking it since and noticed a few more things. First, I noticed that the page underneath the modal doesn't refresh or update while the modal is up. I know it's been working because the episodes are showing up in Plex. Once I click the close button, the History page updates with all the successful downloads it's had since I last opened it.

 

I opened up dev tools and noticed a few things in the network panel. 

The highlighted items loop continuously, once per second, until I click an item to open a modal (#1), at which point they stop.

A few minutes later I closed the modal, saw the episode downloads update, and the requests started looping again; then I clicked an item to open the modal again (#2).
The looping stopped, and a few minutes later I closed the modal again and it started looping once more.
I noticed that the responses changed from 1 in 4 being a 304 to 3 in 4 being a 304.

Let me know whether any of this is helpful; I've attached an image. And let me know if there's anything else I can test or help with.

nzbget_history_page.png

Link to comment
5 hours ago, Alkapwn said:

I was hoping I wouldn't be the only one it works for. It's been unpacking successfully since my last post.

Maybe it relies on only one connection to the GUI. I typically have at least 3 or 4 machines with that page open, so I really couldn't see how opening the pop up would change how the backend acts.

 

Also, all my connections go through an nginx reverse proxy. Is your connection direct?
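
If you want to rule the proxy out, one quick check is to poll the container's web UI directly, bypassing nginx, and watch whether the GUI still stops responding. A rough sketch, assuming NZBGet's default web UI port of 6789 and a placeholder server IP:

#!/bin/bash
# Sketch only: poll the NZBGet web UI directly (bypassing the nginx reverse
# proxy) once a minute and log the HTTP status. 192.168.1.10 is a placeholder
# and 6789 is NZBGet's default web UI port - adjust both to your setup.
while true; do
    printf '%s ' "$(date)"
    curl -s -o /dev/null -w 'HTTP %{http_code}\n' --max-time 10 http://192.168.1.10:6789/
    sleep 60
done

If the direct connection keeps responding while the proxied one hangs, that points at the proxy rather than at nzbget itself.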

Link to comment

Howdy, I'm a bit new to the nzbget scene. I've gotten the binhex docker installed, and files will download. Problem is, it's trying to download everything to the docker directory (/usr/local/bin/nzbget/downloads/completed/Movies/) instead of the path I set for it when I installed the docker (/mnt/usr/downloads).

I've tried everything I can think of, but I'm lost. I did download another version of the container and it worked perfectly. Problem is, sonarr/radarr couldn't access the completed folder, so they couldn't move the files to my plex folders. Anybody have any ideas as to what I could do to fix it?

Link to comment
2 minutes ago, blindy said:

Howdy, I'm a bit new to the nzbget scene. I've gotten the binhex docker installed, and files will download. Problem is, it's trying to download everything to the docker directory (/usr/local/bin/nzbget/downloads/completed/Movies/) instead of the path I set for it when I installed the docker (/mnt/usr/downloads).

I've tried everything I can think of, but I'm lost. I did download another version of the container and it worked perfectly. Problem is, sonarr/radarr couldn't access the completed folder, so they couldn't move the files to my plex folders. Anybody have any ideas as to what I could do to fix it?

Post your docker run command as explained in the very first link in the Docker FAQ:

https://forums.unraid.net/topic/57181-docker-faq

 

Probably worthwhile for you to read the Getting Started section also.

 

Here is another link in the Docker FAQ that might be useful to get your dockers to work together:

https://forums.unraid.net/topic/57181-docker-faq/page/2/#comment-566086
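
For what it's worth, path problems like this usually come down to the volume mappings in the docker run command versus the paths set inside NZBGet. A rough sketch of the kind of mapping to look for - the container name, host paths and port below are placeholders, not necessarily the template defaults:

docker run -d \
    --name=binhex-nzbget \
    -p 6789:6789 \
    -v /mnt/user/appdata/binhex-nzbget:/config \
    -v /mnt/user/downloads:/data \
    binhex/arch-nzbget

Inside NZBGet, MainDir/DestDir should then point at the container-side path (e.g. /data or a folder under it), not a host path. If DestDir points somewhere that isn't inside a mapped volume, the downloads end up inside the container's own filesystem, which matches the /usr/local/bin/nzbget/downloads/... symptom described above.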

 

Link to comment
  • 3 weeks later...

Hey guys, I ran into an issue I can't seem to figure out. It seems most of my downloads are getting stuck when unpacking. If I restart the docker, one or two will go through, then it gets stuck again and requires another restart of nzbget. I'm sure I've got something set wrong, but after searching and watching the videos everything seems correct. Anyone else have this issue, and how did you fix it?

  • Like 1
Link to comment
54 minutes ago, DocHodges said:

Hey guys, I ran into an issue I can't seem to figure out. It seems most of my downloads are getting stuck when unpacking. If I restart the docker, one or two will go through, then it gets stuck again and requires another restart of nzbget. I'm sure I've got something set wrong, but after searching and watching the videos everything seems correct. Anyone else have this issue, and how did you fix it?

I and other users are seeing the same issue. I've discovered a few problems that seem to be related. The first is that the par check/repair stage seems to fail randomly. Sometimes nzbget reports 'PAR Success', but no matter how many times I re-postprocess the download, the unpack fails or gets stuck. If I run QuickPAR from Windows using the same PAR set, it often finds one or two files that have all blocks present but need to be re-joined. Once QuickPAR has re-joined those blocks/files, nzbget can successfully unpack.

 

The other issue is that some PAR repairs leave the renamed damaged files in the source folder. I find this confuses nzbget's unpack processing, especially when the first file in the archive set has a renamed copy. For example, if nzbget's PAR does a repair/rejoin, it sometimes seems to create a file with one more leading '0' in the filename, e.g. xxxxxxxxxxxxxxxxx.7z.001 is repaired/rejoined but there is a copy of the bad file named xxxxxxxxxxxxxxxxx.7z.0001. The same can happen with rar archives: the filename might be xxxxxxxxxxxxxxxxx.part001.rar, and after the repair/rejoin there's a second file named xxxxxxxxxxxxxxxxx.part0001.rar.

 

If you look at the source folder (the 'intermediate' folder for most, depending on how you have nzbget configured), delete all the 'bad' files that have been renamed, and then re-postprocess, the unpack will usually succeed. The third kind of failure I've found is a complete halt of the extract/unpack process, which seems to be a bug in the way 7zip is called to process .7z archives. The logs show the unpack request calling 7zip, but the unpack hangs for a reason the logs don't identify.
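
As a rough sketch of that cleanup, something along these lines can be pointed at a stuck download's intermediate folder - the path is a placeholder and the patterns only cover the renames described above, so review the list before deleting anything:

#!/bin/bash
# Sketch only: list the leftover files par repair tends to leave behind,
# based on the patterns described above. INTER is a placeholder - point it
# at the affected job's intermediate folder.
INTER="/data/intermediate/Some.Download.Name"

# *.1 copies plus part files that gained an extra leading zero
find "$INTER" -maxdepth 1 -type f \
    \( -name '*.1' -o -name '*.7z.0[0-9][0-9][0-9]' -o -name '*.part0[0-9][0-9][0-9].rar' \) \
    -print

# Once everything listed really is a renamed 'bad' copy, swap -print for
# -delete and re-postprocess the download from the NZBGet history.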

 

Hope these findings might help others and maybe even help the nzbget team further refine their post-processing routines. Note that I've also found these same issues when using the Linuxserver.io build of the nzbget Docker container. This means the issues are likely inherent to the nzbget app and/or the par/unrar/7zip extensions.

 

Dale

  • Like 2
Link to comment

@agentxxl thanks a ton for the information. I deleted all the files out of the intermediate folder and re-downloaded the files. It seems to be working so far.

 

Edit: spoke too soon. It seems the issue occurs with .rar files. QuickPAR shows no errors. Deleting the file starts the next one, and that typically works, then it fails on the following one. Annoying for sure; hope they fix the issue.

Link to comment
3 minutes ago, DocHodges said:

@agentxxl thanks a ton for the information. I deleted all the files out of the intermediate folder and re-downloaded the files. It seems to be working so far.

No problem. Glad that's helping, but you don't have to delete all the files and re-download - just delete the bad/renamed files for each affected download and attempt a re-postprocess from nzbget. For example, delete all files that have been renamed with the extension '.1', or any files that have had the extra leading '0' added to the part identifier. Occasionally I'll have to ask nzbget to 'download remaining files' and let it attempt another repair before the unpack even tries to start.

 

For some older content that often has more missing articles, I wish I could find a way to tell nzbget to download ALL remaining files, as it sometimes stops downloading the next parity file and just marks the download as failed. Some older (and occasionally even new) content needs the full set of parity files for par to successfully repair the archive.

 

Note that for the failed 7zip extracts that hang, I will sometimes just stop the nzbget Docker container and then use my Windows box with 7zip installed to do the extract manually. This is rare, as most of the time I can clean up the intermediate download folder and nzbget will then successfully call 7zip and proceed with the extract.
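
For the hung 7zip case, another option is to try the extract by hand from inside the container rather than moving the files to a Windows box. A rough sketch - the container name, archive name and output folder are placeholders, and it assumes a 7z binary is present in the container (it may be called 7za, or not be installed at all):

docker exec -it binhex-nzbget 7z x \
    "/data/intermediate/Some.Download.Name/archive.7z.001" \
    -o"/data/intermediate/Some.Download.Name/extracted/"

If the manual extract also hangs, that points more at the archive or the 7zip build than at the way nzbget invokes it.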

 

Dale

Link to comment
31 minutes ago, AgentXXL said:

Note that I've also found these same issues when using the Linuxserver.io build of the nzbget Docker container. This means the issues are likely inherent to the nzbget app and/or the par/unrar/7zip extensions.

This is of great interest to me, as I wasn't aware of issues with other docker images. In that case it does indeed point to an issue somewhere in the way nzbget is post-processing, or in how it uses the tools you mentioned to do par repairs and/or unpacking.

 

What would be good is some sort of coordinated issue raised for this (including other docker developers), as I think the nzbget devs are under the impression that everything is hunky-dory with the post-processing functionality in nzbget. I can certainly attempt to re-open the existing issue on their board, but I currently have no idea what I can try in order to fix it, and if the issue is indeed buried away somewhere in the nzbget source then I have little chance of identifying it :-( It's frustrating, that's for sure!

 

Thanks for the detailed reply, by the way, @AgentXXL

Edited by binhex
  • Thanks 1
Link to comment

Quick update: I tried all the suggestions, and while reloading the NZB would allow one or two to complete, physically stopping the docker and starting it again seemed to resolve the issue for the remaining downloads. Not a huge issue if this solves the problem, even temporarily. Thank you for all the help figuring this out, as I had been driving myself mad thinking I was doing something wrong.

Link to comment
On 1/2/2020 at 3:00 PM, DocHodges said:

Hey guys, I ran into an issue I can't seem to figure out. It seems most of my downloads are getting stuck when unpacking. If I restart the docker, one or two will go through, then it gets stuck again and requires another restart of nzbget. I'm sure I've got something set wrong, but after searching and watching the videos everything seems correct. Anyone else have this issue, and how did you fix it?

Not sure if this is related, but I was finding NZBGet downloads falling over with complete RAR archives that it was failing to extract. I never had this problem under Windows.

 

I assumed it was some kind of issue with nested RAR files and added this NZBGet script:

https://forum.nzbget.net/viewtopic.php?t=1690&start=20

 

Whatever the issue was, I haven't had any problems since.

 

Note that I had to switch from the binhex to the linuxserver docker image, as I use a few NZBGet scripts that require Python 2.7 (binhex includes Python 3.x).

 

Link to comment
On 12/13/2019 at 10:20 PM, ConnectivIT said:

Just a heads up, I had to switch to linuxserver/nzbget because I couldn't get the VideoSort.py plugin working.

 

Fairly sure this is because binhex/arch-nzbget includes Python 3.7.x, whereas linuxserver/nzbget uses Python 2.7:

 

https://hub.docker.com/r/linuxserver/nzbget/dockerfile

(python2 \)
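
A quick way to confirm which interpreter a given image actually ships is to ask the container directly; a rough sketch, where the container name is whatever you called yours and the interpreter may be named python, python2 or python3 depending on the image:

# Report every Python interpreter available inside the container
for py in python python2 python3; do
    docker exec binhex-nzbget "$py" --version 2>&1
done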

 

See my post above - NZBGet plugins generally require Python 2.7, so I had to switch my nzbget docker container to linuxserver/nzbget because the binhex container includes Python 3.x.

 

As long as you correctly point linuxserver/nzbget to the same docker paths, this should "just work" (make a backup first!).

 

I'm guessing you have already checked the "Scripts" path in NZBGet (this is usually something like ${MainDir}/scripts). If your ${MainDir} is /data, then you'll need your scripts in /data/scripts. Your custom scripts should each go in their own folder, so from inside the docker you should have something like:

 

/data/scripts/VideoSort/VideoSort.py  (and a bunch of other files)
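
To sanity-check that layout from the host, something like the following can be used - the container name and the /data MainDir are assumptions carried over from the example above:

# List the scripts NZBGet should be able to see, then peek at VideoSort itself
docker exec nzbget ls -R /data/scripts
docker exec nzbget head -n 5 /data/scripts/VideoSort/VideoSort.py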

 

Link to comment
On 1/2/2020 at 4:45 PM, binhex said:

This is of great interest to me, as I wasn't aware of issues with other docker images. In that case it does indeed point to an issue somewhere in the way nzbget is post-processing, or in how it uses the tools you mentioned to do par repairs and/or unpacking.

 

What would be good is some sort of coordinated issue raised for this (including other docker developers), as I think the nzbget devs are under the impression that everything is hunky-dory with the post-processing functionality in nzbget. I can certainly attempt to re-open the existing issue on their board, but I currently have no idea what I can try in order to fix it, and if the issue is indeed buried away somewhere in the nzbget source then I have little chance of identifying it :-( It's frustrating, that's for sure!

 

Thanks for the detailed reply, by the way, @AgentXXL

 

Hey binhex,

 

Did you manage to re-open the issue for this on nzbget's board? I'm assuming it was on GitHub. I'm happy to contribute what I can, and I'm guessing others in this thread are too.

 

Thanks

Link to comment
  • 5 weeks later...

Hey All,

 

I have been struggling with NZBGet marking NZB files as hidden after they fail. I wanted to know if there is a way to disable this. I have seen many of the NZB files fail due to the unpack snagging. I want to be able to send them back for post-processing, but once they are marked hidden I can't do this.

 

Any help or guidance is greatly appreciated. 

 

Link to comment
7 hours ago, Aerodb said:

I have been struggling with NZBGet marking NZB files as hidden after they fail. I wanted to know if there is a way to disable this. I have seen many of the NZB files fail due to the unpack snagging. I want to be able to send them back for post-processing, but once they are marked hidden I can't do this.

Try fully deleting them in NZBGet. That should get it to skip the duplicate check when re-adding.
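
If you'd rather change the behaviour than delete and re-add each one, the duplicate-handling options live in nzbget.conf, so it's worth checking what's currently set. A rough sketch - the /config path is an assumption about this container's mapping, adjust if yours differs:

# Show any duplicate-handling options currently set in the config
docker exec binhex-nzbget grep -i 'dupe' /config/nzbget.conf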

Link to comment
On 9/11/2019 at 7:27 AM, jonathanm said:

If it helps at all, I was able to get into the container while there was a non-functional unpack, and ran htop. There was only one unrar process, sitting there taking up CPU time. I killed it, and it respawned, but did NOT progress. I killed it multiple times, just to be sure; each time the CPU counter started over, but it never actually did anything. Dunno if I'm reading that correctly, but it seems to me that unrar isn't the problem, it's the invocation somehow. If unrar were at fault, I would expect killing it to allow a clean restart of the unrar process that would continue. Since it doesn't, it feels like it's related to how nzbget is calling the unrar command.

 

Only restarting the container worked.

Same here.  
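
For anyone who wants to repeat that check, a rough sketch of the same steps run from the host - the container name is a placeholder, and ps/pkill may not be present in every image:

# Look for the unrar process inside the container
docker exec binhex-nzbget ps aux | grep -i unrar

# Kill it and watch whether the respawned process makes any progress
docker exec binhex-nzbget pkill -f unrar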

Link to comment
  • 1 month later...
4 minutes ago, xilo said:

I have to restart the NZBGet container frequently when there is a long queue for post-processing. I'm surprised this has been around for so long and hasn't received much attention. Is this only present in docker containers?

This problem is present in all the Docker containers I've tried. Watching the logs, it seems that the stalls during post-processing are often due to something failing in the PAR check/repair process. If you go into your temporary download location (assuming you have NZBGet set up that way, it's the 'intermediate' folder), look for any files with a .1 extension (i.e. bad files which have been replaced by repaired versions).

 

Delete all of these 'bad' files and then check whether the par repair might have also created repaired files with essentially the same name but with extra leading zeroes added. Par repair might rename a file to filename.partxxxx whereas the rest of the series is named filename.partxx. This confuses the extraction/unpacking and will usually show up as a download with a failed extract/unpack but a successful par check/repair.

 

I've also found that occasionally the par repair built into the NZBGet container will skip a damaged file no matter how many times you try to send it back to post-processing. If I run QuickPAR from a Windows box on the same files, it often will find one or more damaged files that the built-in par repair missed. Once I've deleted all the '.1 and renamed with extra zeroes' files, QuickPAR will successfully repair the set and then re-sending the set to post-processing is successful.

 

In summary, I suspect the command-line version of par used by the NZBGet dockers is a possible cause of the stalls.

 

Link to comment
3 hours ago, xilo said:

With so much intervention required, I'm thinking it will be easier to just switch to Sabnzbd for the time being.

Zero intervention required. I've had the following as a user scripts entry running hourly for probably a year.

#!/bin/bash
# Restart the NZBGet container to clear any stalled post-processing
# (run hourly via the User Scripts plugin)
docker restart binhex-nzbget

Never have to touch it.

  • Like 1
Link to comment
