[Support] binhex - NZBGet


Recommended Posts

Maybe a dumb question.  I am trying to install the VideoSort extension.  Where exactly is my pp-scripts directory?  I have been hunting for it for an hour now.

thank you

 

edit - right after writing this and closing the tab, boom, it was right there in the Paths section of the settings.  But I can't figure out where that is on my box; it says:  ${AppDir}/scripts

Edited by david_waters
  • 2 weeks later...

Hey guys...  this container has been rock solid since I installed it!  Thanks, Binhex...

 

I finally resolved my slow internet speed and went from 30 MBps to 320 MBps.  To celebrate, I decided to hit the "Search All Missing" button in Sonarr.  ;-)

 

Now I have about 600 items downloading.  When the queue was smaller, say 50 items or fewer, I could download with the speed limit set to unlimited at about 30 MBps.  Now that the queue is huge, it has slowed down to about 5 MBps.  Has anyone seen this issue?  Any tweaks to the container setup or NZBGet settings to alleviate the throttling?

On 7/31/2018 at 1:58 PM, david_waters said:

Maybe a dumb question.  I am trying to install the VideoSort extension.  Where exactly is my pp-scripts directory?  I have been hunting for it for an hour now.

thank you

 

edit - right after writing this and closing the tab, boom, it was right there in the Paths section of the settings.  But I can't figure out where that is on my box; it says:  ${AppDir}/scripts

 

I just tried looking for mine, and there is no scripts folder in my setup (I don't use scripts, though).

Maybe you need to manually create it?
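If it does need creating manually, something like this from the unRAID terminal should do it. This is only a sketch under assumptions: the host path below is the typical binhex appdata mapping (where /config inside the container is /mnt/user/appdata/binhex-nzbget on the host), so adjust it to whatever your Docker template actually uses.

```shell
# Assumption: host path that maps to /config inside the binhex-nzbget container
APPDATA=/mnt/user/appdata/binhex-nzbget

# Create the scripts directory so it persists across container updates
mkdir -p "$APPDATA/scripts"

# Copy the extension script in and make it executable
cp VideoSort.py "$APPDATA/scripts/"
chmod +x "$APPDATA/scripts/VideoSort.py"
```

Then point ScriptDir (Settings → Paths in the NZBGet web UI) at /config/scripts so the container sees it.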

14 hours ago, ajgriglak said:

Now I have about 600 items downloading.  When the queue was smaller, say 50 items or fewer, I could download with the speed limit set to unlimited at about 30 MBps.  Now that the queue is huge, it has slowed down to about 5 MBps.  Has anyone seen this issue?  Any tweaks to the container setup or NZBGet settings to alleviate the throttling?

 

Once something has downloaded, Sonarr will make a backup copy of it, copy that backup into the correct location, and then delete the originals.  This overhead *could* slow down your downloads - especially if you are downloading straight to the unRAID array (i.e., not onto a cache drive), as the downloads will be competing for use of the parity drive.

 

OR, your ISP might be throttling your connection.

 

Just a couple of thoughts.
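If it does turn out to be disk I/O, a few NZBGet buffering settings are worth a look (Settings → Download Queue). Treat these as a hedged starting point rather than tested values - tune them to your RAM and disks:

```
# Cache assembled articles in RAM (MB) instead of writing many small files
ArticleCache=200
# Write buffer per file (KB); larger buffers mean fewer, bigger writes
WriteBuffer=1024
# Write articles directly into the destination file, skipping temp files
DirectWrite=yes
```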

4 hours ago, Cessquill said:

 

Once something has downloaded, Sonarr will make a backup copy of it, copy that backup into the correct location, and then delete the originals.  This overhead *could* slow down your downloads - especially if you are downloading straight to the unRAID array (i.e., not onto a cache drive), as the downloads will be competing for use of the parity drive.

Thanks for your reply.  I download directly to a 1TB unassigned drive (not an SSD).  Your mention of parity got me thinking...  I was also in the middle of a parity check, so maybe that had something to do with it.  The parity check is now done, and it has been running overnight.  It's now fluctuating between 15 and 30 MBps, so definitely more acceptable.

 

I just thought maybe there were some tweaks that could be done when the queue is so large.

  • 3 weeks later...
On 7/30/2018 at 6:09 PM, rmilyard said:

 

Do you know if the bug where it unpacks files, deletes them, and then does it all again in an endless loop was fixed?

Hi - just realised I'm still on the old version of this.  Is it still stuck in a loop on unpacking?  If fixing it means setting and testing max limits, I might stay where I am for now and save that for a weekend job.

17 hours ago, Cessquill said:

Hi - just realised I'm still on the old version of this.  Is it still stuck in a loop on unpacking?  If fixing it means setting and testing max limits, I might stay where I am for now and save that for a weekend job.

 

Last I heard there were still issues around unpacking.  I have seen that a test release of NZBGet is out, and there was a fair bit of detail around par and CRC checking and such in the notes for the next release, so possibly the next release will fix this up; we shall see.

  • 3 weeks later...
  • 2 weeks later...

Hi, I've been having the unpacking/download queue problem for a few months and never got around to fixing it until now.  Looking at it, it seems best to roll back to v19.1-1-02.  I did so by following Q12 on the link provided, editing the repository entry to add the version tag at the end.

 

Now, when I try to access the web UI, I get "Error: 404 Not Found".  I thought that problem was fixed?  I went into the nzbget.conf file and looked for WebDir=/usr/share/nzbget/webui, but only see WebDir={AppDir}/webui.  Am I supposed to change that to WebDir=/usr/local/bin/nzbget/webui?  Thanks!

  • 2 weeks later...
On 10/6/2018 at 3:26 PM, puncho said:

Hi, I've been having the unpacking/download queue problem for a few months and never got around to fixing it until now.  Looking at it, it seems best to roll back to v19.1-1-02.  I did so by following Q12 on the link provided, editing the repository entry to add the version tag at the end.

 

Now, when I try to access the web UI, I get "Error: 404 Not Found".  I thought that problem was fixed?  I went into the nzbget.conf file and looked for WebDir=/usr/share/nzbget/webui, but only see WebDir={AppDir}/webui.  Am I supposed to change that to WebDir=/usr/local/bin/nzbget/webui?  Thanks!

NZBGet is working well, but I still can't get into the web UI... any clues? :)

On 10/6/2018 at 12:26 PM, puncho said:

Now, when I try to access the web UI, I get "Error: 404 Not Found".

I did the same thing, attempting the latest new build and then reverting to v19.x, only to find a 404 error where the web UI used to be.  I also tried editing the .conf file, but had no luck.  I ended up pulling a copy of the .conf file from the previous week's backup, and now all is well again on v19.  If it helps, line 64 of that file is currently:

WebDir=/usr/share/nzbget/webui
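If you don't have a backup handy, editing the line in place should get you to the same state.  A rough sketch - the appdata path here is an assumption (adjust to your own mapping), and stop the container before editing:

```shell
# Assumption: your nzbget.conf lives in the container's appdata share
CONF=/mnt/user/appdata/binhex-nzbget/nzbget.conf

# Point WebDir back at the v19 web UI location
sed -i 's|^WebDir=.*|WebDir=/usr/share/nzbget/webui|' "$CONF"

# Confirm the change took
grep '^WebDir=' "$CONF"
```

Then restart the container and the web UI should come back.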

On 10/17/2018 at 2:40 PM, KernelG said:

I did the same thing, attempting the latest new build and then reverting to v19.x, only to find a 404 error where the web UI used to be.  I also tried editing the .conf file, but had no luck.  I ended up pulling a copy of the .conf file from the previous week's backup, and now all is well again on v19.  If it helps, line 64 of that file is currently:

WebDir=/usr/share/nzbget/webui

 

Thanks.  Not a huge deal right now, I guess... I'll wait till it's fixed in a future release.

  • 3 weeks later...

So I'm having an issue that I can't quite figure out.  I previously used NZBGet on Windows and recently got an unRAID server going.  I installed the container, and I'm probably doing something stupid, but unpacking fails every time I try to download something.

 

I have my data path set to /mnt/user/Downloads/

 

In the NZBget UI, I have my paths set up as:

 

MainDir: /data

DestDir: ${MainDir}/Completed

InterDir: ${MainDir}/intermediate

NzbDir: ${MainDir}/nzb

QueueDir: ${MainDir}/queue

TempDir: ${MainDir}/tmp

WebDir: ${AppDir}/webui

 

Whenever I download a file from Usenet, two things happen: my main provider (Eweka) gets blocked for 10s for some reason, even though a test in Settings works and I'm able to log in just fine, and the download fails to unpack upon completion.

 

I get the error message "Unpack for XXXXXXX (whatever the file name is) failed".  It passes verification and says that no repair is needed, but then the unpack fails and the file is added to the registry.

 

Can anyone provide me any guidance?
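If anyone wants diagnostics, I'm happy to run things inside the container; I assume something like this would show whether the unpacker is present and what version it is (container name taken from the template, so adjust if yours differs):

```
# Check the unpacker binary inside the container
docker exec binhex-nzbget which unrar

# With no arguments, unrar prints its version banner and usage
docker exec binhex-nzbget unrar
```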

  • 1 month later...
13 hours ago, repomanz said:

While unpacking, it seems it does not honor CPU pinning.  I.e., even though I have CPUs 1 and 5 pinned, it's maxing out all of my CPUs.

If you are really seeing this issue then it's either a Docker engine issue or an unRAID issue, as pinning is done via the unRAID web UI and sent to the Docker engine; there is no additional processing within the container.
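For reference, the pinning set in the unRAID web UI just becomes a flag on the underlying docker run command; here is a rough sketch of the equivalent manual invocation (image name, ports and mappings are illustrative, not copied from any particular setup):

```
# unRAID translates CPU pinning into --cpuset-cpus on the docker engine
docker run -d --name binhex-nzbget \
  --cpuset-cpus="1,5" \
  -p 6789:6789 \
  -v /mnt/user/appdata/binhex-nzbget:/config \
  -v /mnt/user/Downloads:/data \
  binhex/arch-nzbget
```

So if unpack (which runs inside the container) is escaping those cores, the cgroup limit itself isn't being applied, which points at the engine or the UI rather than the image.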

 

13 hours ago, repomanz said:

Seeing the above posts, are there plans to fix the unpacking issues?

The fix is currently not known, so no fix can be put in place; there has been a lot of back and forth with the NZBGet devs and nothing concrete came out of it.  I'm hoping the next release will not have this issue, but I'm not holding my breath.

  • 3 weeks later...

Hello all,

 

I just switched to using NZBs recently, and after reading this forum I learned I should install v19 of NZBGet due to a possible unpacking problem with v20.  So I installed v19 as directed in the earlier posts and configured NZBGet using SpaceInvader One's video.  Unfortunately, NZBGet is hanging at the unpacking stage.

 

I downloaded a 1 GB file and it's been over 30 minutes now with no result.  I am running an FX-6300, and the /data folder is mapped to my cache drive, which is an SSD.

 

There has been no error message yet with the unpacking of the file; it's just stuck at the moment.

 

I was reading on another forum about DirectUnpack; this option is off right now.  Should I turn it on?  Does that affect how Sonarr post-processes?

 

Any other hints about this problem?
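In case it helps anyone answer, these are the unpack-related options I can see under Settings → Unpack (everything else is at defaults as far as I know):

```
# Direct unpack (unpacking while the download is still in progress) is off
DirectUnpack=no
# Whether the download queue pauses during unpack
UnpackPauseQueue=no
```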

