unRAID with SABnzbd



Try this one if you are using Google Chrome:

 

https://chrome.google.com/extensions/detail/jnjalpkoocinmppkagibfbhplifocfgp

 

And this is a great app if you have an iPhone/iPod Touch:

 

http://web.me.com/vijaymason/

 

 

 

Does the myNZB app work even when you aren't connected to your home wifi? For example, at work?

 

Even if it did (which it won't), you would be insane to browse anything usenet-related at work.


A better way to say it....

 

Don't allow the entire internet to connect to your SAB. Restrict it with IP filtering using a firewall, or even better, use a VPN.

 

Just because a program has a password or even built-in IP access controls doesn't mean it is secure. Assume by default that it is not, and life will be much safer.
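A quick way to sanity-check that your SAB isn't wide open, by the way: from a machine that shouldn't have access (or simply leaving the API key out), hit the SAB API and make sure you get an error back instead of your queue. A rough Python sketch; the address below is just a placeholder for your own setup, and this is only a spot check, not a replacement for the firewall/VPN advice above:

# Spot check: an API call without the API key should be rejected.
# Replace the address with your own SABnzbd host/port (placeholder here).
import urllib.request

SAB_URL = "http://192.168.1.100:8080/sabnzbd/api?mode=queue&output=json"

try:
    with urllib.request.urlopen(SAB_URL, timeout=5) as resp:
        body = resp.read().decode()
except OSError as exc:
    # From an outside host, failing to connect at all is the ideal outcome.
    print("Could not connect:", exc)
else:
    # With no apikey supplied, SABnzbd should answer with an error message
    # here, not your actual queue contents.
    print(body)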


 

 

Guys, can somebody please re-upload the dependencies package? Romir's link doesn't work. Thanks a lot in advance :)

 

try here: http://www.lockstockmods.net/2010/06/03/unraid-with-sabnzbd-and-sickbeard/

 

I'll be using this guide as soon as I get a cache drive installed.

 

The lockstockmods site has the older v1.3 dependencies and SABnzbd 0.5.2.

 


OK, I managed to get SABnzbd and Sick Beard working and am currently grabbing some new TV episodes.

 

Just a couple of questions, though:

 

Say I would like to add a brand new TV show that I want to watch. Do I have to create the folder on my array and then get Sick Beard to scan for this folder in order to set up automatic downloads?

 

Also, say I would like to download another NZB (not TV related). I grab the NZB file, but what do I do with it so SABnzbd on my unRAID server opens it and starts downloading it?

 

 

thanks


Say I would like to add a brand new TV show that I want to watch. Do I have to create the folder on my array and then get Sick Beard to scan for this folder in order to set up automatic downloads?

 

That's how I do it. Just adding a new show without the folder on the array does not work for me: Sick Beard will create the season folders but not the show folder.

 

Also, say I would like to download another NZB (not TV related). I grab the NZB file, but what do I do with it so SABnzbd on my unRAID server opens it and starts downloading it?

 

This has already been answered in this thread. You can either use a Firefox or Chrome plugin that sends the NZB directly to SAB, or you can create a black hole folder that SAB monitors periodically. My desktop computer runs Windows, and I use the SABnzbd gadget, which lets me dock my queue on the desktop, monitor the downloads, and drag and drop NZBs onto the gadget to send them to SAB.
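If you'd rather script it than install a browser plugin or set up a black hole folder, SABnzbd also has an HTTP API you can push NZB URLs to. A minimal Python sketch, assuming SAB is reachable on your LAN and you've copied the API key out of its settings; the host, port, key, and category below are placeholders, not anything specific to this thread:

# Minimal sketch: hand an NZB URL to SABnzbd over its API.
import urllib.parse
import urllib.request

SAB_HOST = "http://192.168.1.100:8080"    # your unRAID box (placeholder)
API_KEY = "your-api-key-here"             # from the SABnzbd web UI (placeholder)

def queue_nzb(nzb_url, category="tv"):
    """Ask SABnzbd to fetch the NZB at nzb_url and queue it."""
    params = urllib.parse.urlencode({
        "mode": "addurl",     # SAB downloads the NZB itself
        "name": nzb_url,      # URL of the NZB file
        "cat": category,      # category decides the completed-download folder
        "apikey": API_KEY,
        "output": "json",
    })
    with urllib.request.urlopen(SAB_HOST + "/sabnzbd/api?" + params) as resp:
        return resp.read().decode()

print(queue_nzb("http://example.com/some-download.nzb"))

The cat parameter is the same category idea that comes up a few posts further down: it controls which completed folder the finished job lands in.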


Many thanks, mate.

 

I've just grabbed the Firefox add-on, which sends the NZB to unRAID for download :)

 

My next question: when the file downloads and completes, it is saved on the cache drive but then automatically moved to the root of disk2.

 

Is there a way of using the Firefox tool to download the NZB and then, once it has been downloaded, par-checked, and extracted on the cache drive, have it moved automatically to its desired folder?

 

I.e. if it's a movie, have it download and complete on the cache drive, then automatically move into the movies folder on disk1?

 

Thanks again! I'm slowly getting it set up so it's perfect for what I want it to do.


You can set up SABnzbd to do this when it completes a download. It will place the file in a folder of your choice. You can then choose a share to move it to.

 

Many thanks, I think I have sussed this bit out.

 

I just needed to set the category on the download queue to movies, tv, etc., and now it puts everything in the right folder :)

 

It looks like it still uses the cache drive here, then moves it across in its own time.

 

stunning!
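For anyone curious what that category-to-share move amounts to, here is a rough sketch of doing it by hand: look in SAB's completed folders on the cache drive and push each finished job into the matching user share. The paths and category names are examples only, since SAB's category settings (and unRAID's mover) already handle this once configured:

# Rough sketch: move finished category folders from the cache drive to the
# array shares. Paths/categories are examples; SAB's own category settings
# already do this when configured.
import shutil
from pathlib import Path

COMPLETE = Path("/mnt/cache/downloads/complete")   # SAB's completed folder (example)
SHARES = {
    "movies": Path("/mnt/user/Movies"),            # example user share
    "tv": Path("/mnt/user/TV"),                    # example user share
}

for category, share in SHARES.items():
    src = COMPLETE / category
    if not src.is_dir():
        continue
    share.mkdir(parents=True, exist_ok=True)
    for job in src.iterdir():                      # each completed download
        print("moving", job.name, "->", share)
        shutil.move(str(job), str(share / job.name))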


Yeah, once you have it all configured, it's just a matter of sitting back and relaxing. Throw in a pepperoni pizza, and I'm a happy guy.


I have had SABnzbd running for a while now (currently at 0.5.3) and it's working great. Download speeds are as expected (12.5 MB/s on a 120 Mbit connection), but repairing and unrarring/unzipping can be dreadfully slow.

 

I have a Samsung F2 500GB (5400 rpm) set up as a cache disk, and all temporary files are written to it. When par-checking is finished, the archives are extracted on the same disk. At the moment, unpar + repair + unrar takes WAY longer than the download itself. Would I see a speed increase if I replaced the cache disk with a 7200 rpm version? And would it be faster to extract the finished download directly to the protected array instead of to the cache disk?

 

Thanks!


I am having a few problems with "failed" downloads... I'm using the latest 2.0 dependencies, and if I manually use the PAR2 and associated PARs I can repair them successfully.

 

I've started using SABnzbd and it has been working well for the most part. I do have a couple of downloads that FAILED (most have been great).

Since I'm so new at this, I haven't found instructions on how to run a PAR repair manually or how to handle the FAILED downloads.

 

Any tips would be appreciated (links, a quick tutorial, etc.).

 


Depending on your OS (not unRAID!), you have to use an unpar program (MacPar Deluxe for Mac) to check the archive for CRC errors. Then, when that's finished, you'll have to extract the archive with an unrar program like unrar (Mac) or WinRAR (Windows). Some unpar programs also unrar when properly configured, or by default.

 

Sometimes the extraction of archives fails in SABnzbd because they are password protected!
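If you'd rather do the repair on the unRAID box itself instead of pulling everything over to a desktop, the same par2 and unrar binaries SAB uses (from the dependencies package) can be driven by hand. A rough sketch, assuming those binaries are on the PATH; the file and folder names are examples:

# Rough sketch: repair and extract a failed job in place on the server.
# Assumes the par2 and unrar binaries from the dependencies package are on
# the PATH. Paths/filenames are examples.
import os
import subprocess

JOB_DIR = "/mnt/cache/downloads/incomplete/some.failed.job"    # example
PAR2_SET = JOB_DIR + "/archive.par2"                           # example
RAR_FILE = JOB_DIR + "/archive.rar"                            # example
DEST_DIR = "/mnt/cache/downloads/complete/some.failed.job"     # example

# Verify against the PAR2 set and repair if blocks are missing.
subprocess.run(["par2", "repair", PAR2_SET], check=True)

# Extract the (now repaired) archive; -o+ overwrites without prompting.
os.makedirs(DEST_DIR, exist_ok=True)
subprocess.run(["unrar", "x", "-o+", RAR_FILE, DEST_DIR], check=True)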


I have had SABnzbd running for a while now (currently at 0.5.3) and it's working great. Download speeds are as expected (12.5 MB/s on a 120 Mbit connection), but repairing and unrarring/unzipping can be dreadfully slow.

 

I have a Samsung F2 500GB (5400 rpm) set up as a cache disk, and all temporary files are written to it. When par-checking is finished, the archives are extracted on the same disk. At the moment, unpar + repair + unrar takes WAY longer than the download itself. Would I see a speed increase if I replaced the cache disk with a 7200 rpm version? And would it be faster to extract the finished download directly to the protected array instead of to the cache disk?

 

Those tasks (par verify, repair, and extract) are mostly bound by two major bottlenecks. The first is the CPU: if you're running an Intel Atom or an ancient or low-end Intel or AMD CPU (P4/Athlon/Sempron), it will take a long time. The second bottleneck is disk I/O. To maximize performance you want fast drives for reading (par verify and repair), and you also want to extract the files to a different drive than the one you're reading from. However, I'm not sure whether extracting directly to the protected array would be faster than extracting to the same cache drive and moving the files later.

 

Also, to be pedantic, you do not unpar. You par verify or par repair.
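A crude way to tell which of those two bottlenecks you're actually hitting: compare the CPU time a par2 verify consumes against the wall-clock time it takes. CPU time close to wall time means you're CPU-bound; much lower means you're mostly waiting on the disk. A Python sketch (the .par2 path is an example):

# Compare CPU time vs wall time for a par2 verify run to see whether the CPU
# or the disk is the limiting factor. The .par2 path is an example.
import resource
import subprocess
import time

PAR2_SET = "/mnt/cache/downloads/incomplete/job/archive.par2"  # example

start = time.perf_counter()
before = resource.getrusage(resource.RUSAGE_CHILDREN)
subprocess.run(["par2", "verify", PAR2_SET], check=True)
after = resource.getrusage(resource.RUSAGE_CHILDREN)

wall = time.perf_counter() - start
cpu = (after.ru_utime - before.ru_utime) + (after.ru_stime - before.ru_stime)
print("wall: %.1fs  cpu: %.1fs  (%.0f%% of one core busy)" % (wall, cpu, 100 * cpu / wall))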


Depending on your OS (not unRAID!), you have to use an unpar program (MacPar Deluxe for Mac) to check the archive for CRC errors. Then, when that's finished, you'll have to extract the archive with an unrar program like unrar (Mac) or WinRAR (Windows). Some unpar programs also unrar when properly configured, or by default.

 

Sometimes the extraction of archives fails in SABnzbd because they are password protected!

 

Thanks.

Found what I needed, or at least it worked: QuickPar (Windows) for the repair, and 7-Zip was able to handle the RAR.

 

 

The three FAILURES I had have all been repaired.

 

 


Those tasks (par verify, repair, and extract) are mostly bound by two major bottlenecks. The first is the CPU: if you're running an Intel Atom or an ancient or low-end Intel or AMD CPU (P4/Athlon/Sempron), it will take a long time. The second bottleneck is disk I/O. To maximize performance you want fast drives for reading (par verify and repair), and you also want to extract the files to a different drive than the one you're reading from. However, I'm not sure whether extracting directly to the protected array would be faster than extracting to the same cache drive and moving the files later.

 

Also, to be pedantic, you do not unpar. You par verify or par repair.

 

I have an Intel Pentium G6950 @ 2.8 GHz. It's an Arrandale-based CPU on an Intel H55 motherboard, so it's got plenty of power.

 

I also have two spare Samsung F1 640GB (7200 rpm) drives, but those "only" have 320GB platters, while the F2 has a 500GB platter. Since the unrarring and par verify/repair ;) are mainly sequential rather than random disk operations, I assume higher sequential throughput has a greater influence on performance than higher random I/O.

 

Or am I just plain wrong/stupid? :)
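If you want numbers instead of guesses for the 5400 vs 7200 rpm question, a crude sequential-read timer is enough to compare the drives. The file paths below are examples; use files larger than your RAM (or drop the page cache first) so you're measuring the disk rather than memory:

# Crude sequential-read benchmark for comparing two drives. Use files bigger
# than RAM (or drop the page cache) or you'll measure memory, not the disk.
import time

def read_mb_per_s(path, chunk_size=8 * 1024 * 1024):
    total = 0
    start = time.perf_counter()
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            total += len(chunk)
    elapsed = time.perf_counter() - start
    return total / elapsed / (1024 * 1024)

# Example files: one on the F2 cache disk, one on a spare F1 in the array.
print("cache disk: %.0f MB/s" % read_mb_per_s("/mnt/cache/big_test_file.bin"))
print("spare disk: %.0f MB/s" % read_mb_per_s("/mnt/disk3/big_test_file.bin"))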

