unRAID with SABnzbd



No, you're not plain wrong or stupid.

 

If you were only performing reads, or only writes, at any given time, then sequential access patterns would matter more than random access patterns.

 

Of the four steps, two involve both reads and writes: par repair and extraction. Par repair is hopefully a rare condition if you're using a quality source; I'd estimate it occurs less than 5% of the time, perhaps as little as 0.5%. The extraction step always occurs unless the posted content wasn't placed into a container; I'd estimate it occurs 99.99% of the time.

 

Now when you're extracting to the same drive, there are other conditions to consider. The drive's buffer isn't as effective, since it's split between reads and writes of different data. A larger portion of the time the drive is stuck seeking between two general locations, the sectors being read and the sectors being written, which nearly devolves into a random access pattern. It may also depend on the individual extraction program (unrar or 7z), the ratio of reads to writes, and how large an in-memory buffer the program uses before writing results. I don't imagine the items being downloaded will fit entirely within system memory twice (compressed data plus extracted data), so I/O is still a factor.
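If you want to see the effect for yourself, here's a minimal sketch using dd. The paths are assumptions (a scratch directory on the cache disk and one on an array disk), so adjust them to your setup; dropping the page cache requires root:

# create a 1 GiB test file on the cache disk (paths are assumptions)
dd if=/dev/zero of=/mnt/cache/tmp/src.bin bs=1M count=1024

# drop the page cache so the timings hit the spindle rather than RAM
sync; echo 3 > /proc/sys/vm/drop_caches

# read and write on the same disk: the head seeks between both regions
time dd if=/mnt/cache/tmp/src.bin of=/mnt/cache/tmp/copy.bin bs=1M

sync; echo 3 > /proc/sys/vm/drop_caches

# read from one disk, write to another: each side stays mostly sequential
time dd if=/mnt/cache/tmp/src.bin of=/mnt/disk1/tmp/copy.bin bs=1M

On a single spindle I'd expect the cross-disk copy to win by a noticeable margin, which is the seek thrash described above.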

 

The other item to consider: if you have multiple items queued up, then unless you pause downloading while processing completed items, you're again in a read-and-write pattern. The drive is busy seeking between three general locations: reading the data for item one, writing the extracted data for item one, and writing the data for item two as it downloads. This is another nearly random access usage pattern.

 

When you say spare drives, do you mean connected to the server but not in the array?

 

If I wanted absolute performance without going overboard, I'd have to run some general tests under various conditions (a rough timing sketch follows the list below):

 

A) whether pausing all downloading while extracting an item improves performance by more than the latency introduced by not immediately downloading the next item.

B) whether there is any improvement when extracting directly to the parity-protected array instead of extracting to the same cache disk and moving the contents later.

C) whether using two cache disks / spare disks, one as the working location and the other as the extraction location, gives any substantial improvement.

D) whether extracting to the protected array directly improves overall performance.

E) what combination of A, B, and C or D provides a significant performance improvement.
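For example, something like this would cover the core of B and C. The archive and target paths are assumptions; point them at your own cache disk, spare disk, and array disks:

# make sure the target directories exist (all paths are assumptions)
mkdir -p /mnt/cache/extract /mnt/cache2/extract /mnt/disk1/extract

# extract to the same cache disk the download lives on
time unrar x -idq /mnt/cache/downloads/job/archive.rar /mnt/cache/extract/

# extract to a second cache/spare disk
time unrar x -idq /mnt/cache/downloads/job/archive.rar /mnt/cache2/extract/

# extract directly to a parity-protected array disk
time unrar x -idq /mnt/cache/downloads/job/archive.rar /mnt/disk1/extract/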

 

Personally, I utilize a 5400 rpm laptop drive as both the sabnzbd scratch disk and my main Slackware root disk. I'm not looking to maximize the performance of sabnzbd, as it's processed in the background and I'm not anxiously awaiting the contents.


Hey guys, I just finished downloading an mkv file from Giganews using sabnzbd. However, I get an error:

 

[filename] Repair failed, not enough repair blocks (2144 short)

 

What could be causing this? I really hate to have to re-download the darn thing; it took me over 16 hrs to bring it down! The report for the nzb said complete.

 

Help?


Ok, it did happen again this morning with another download. I copied all of the failed files and ran QuickPar on them, and it repaired the two files perfectly. This leads me to believe that par2 is not working properly on unRAID. I have the latest sabnzbd dependencies that were supposed to fix the issue (2.0). Any ideas?



I have the same problem.... intermittently.

 

 


Question. What do you do (if there is anything to do) if, after PAR processing, you are still missing files?

 

Just chuck it or try to find another "nzb" to download from?

 

After running QuickPar and seeing which files are the offending ones (the ones that can't be fixed by QuickPar), I usually re-download just those files (as sketched below). That usually fixes it.

 

But my point is, shouldn't this be automated?
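If you'd rather find the offending files on the server itself instead of copying them over to a QuickPar machine, a verify pass does the same job. A minimal sketch, assuming a hypothetical job.par2 recovery file in the job's download directory:

# verify the download against its par2 set and list only the damaged targets
par2 v job.par2 | grep -i damaged

Each damaged line also reports how many data blocks were found, as in the repair log later in this thread.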



I see that when the first repair fails, sabnzbd fetches additional blocks to be used for a second repair. This doesn't always work, though. I still have something like 5% of downloads that fail yet are repairable using QuickPar.

 

There is something in the unRAID environment and par2cmdline that is not working 100%.

Maybe a different build of the par2 utility would be more reliable?

Here's a recent Slackware build: http://connie.slackware.com/~alien/slackbuilds/par2cmdline/

Discussion about that here: http://www.linuxquestions.org/questions/slackware-14/rar-and-par2-822882/

I don't know if it would work installed on top of the dependencies package, but for those having a lot of failed downloads it would be worth a try.
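If anyone wants to try it, here's a minimal sketch using Slackware's standard package tool, which unRAID inherits. The exact package filename is a placeholder; check the page above for the real one:

# download the package (the filename below is a placeholder, see the link above)
wget http://connie.slackware.com/~alien/slackbuilds/par2cmdline/par2cmdline-<version>.tgz

# install it with Slackware's package tool
installpkg par2cmdline-<version>.tgz

# confirm which par2 binary is now first on the PATH; running it with no
# arguments prints its banner and usage so you can verify the version
which par2
par2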



 

So I'm still a complete noob at this. 

 

How do you download just the offending files and what do you use to get just those files?


Usually it depends on the site you get the nzb from. If you look closely, you will see a listing of all the files that make up that one nzb. You usually have the option to individually select certain files inside that nzb and create a new nzb containing just them. When you add the resulting nzb to sabnzbd, it will download just the files you selected  ;)


I've seen the same behaviour with SABnzbd 5.4 and SABnzbdDependencies-2.0. Certain failed nzbs would, surprisingly, be recoverable by issuing the 'par2 r file' command in a terminal. Now I've unchecked the config option "Only perform post-processing on jobs that passed all PAR2 checks" and am seeing some new oddities.

 

For example, certain nzbs labeled _Failed_ would turn out to be completely unrarred and correct, and should not have been labeled failed at all. Something is odd indeed.


Guys, even on a perfect sab install some downloads will fail; lots of posts on usenet are crap. Without an example of a failing nzb to analyze, you might as well be saying "my car broke, how do I fix it".

 

Sabnzbd has pretty extensive logging. Please post a log.

 

"Please post a log" is a trademark of Joe L. Productions, Inc.


Ok, I found a problem with Sab, and decided to get the logs etc. to see if the issue can be fixed.

 

The resulting file can be repaired with QuickPar; the files appear to need rejoining, which Sab seems to fail to do.

 

I've attached the log file for this download only, along with a couple of screenshots.

 

Hope this helps.

 

Glen.

CC3.zip


The files repaired correctly by opening a telnet session and typing par2 r ctu-....vol00+1.par2

 

par2cmdline version 0.4, Copyright © 2003 Peter Brian Clements.
Modifications for concurrent processing, Unicode support, and hierarchial directory support are Copyright © 2007-2008 Vincent Tan.
Concurrent processing utilises Intel Thread Building Blocks 2.0, Copyright © 2007-2008 Intel Corp.
Executing using the 32-bit x86 (IA32) instruction set.

par2cmdline comes with ABSOLUTELY NO WARRANTY.

This is free software, and you are welcome to redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. See COPYING for details.

Processing verifications and repairs concurrently.
Loading "ctu-x264-cold.case.602.vol00+1.par2".
Loaded 51 new packets including 1 recovery blocks
Loading "ctu-x264-cold.case.602.vol01+2.par2".
Loading "ctu-x264-cold.case.602.vol15+3.par2".
Loaded 2 new packets including 2 recovery blocks
Loaded 3 new packets including 3 recovery blocks

There are 24 recoverable files and 0 other files.
The block size used was 3456000 bytes.
There are a total of 352 data blocks.
The total size of the data files is 1173335766 bytes.

Verifying source files:

Target: "ctu-x264-cold.case.602.r00" - found.
Target: "ctu-x264-cold.case.602.r03" - found.
Target: "ctu-x264-cold.case.602.r01" - damaged. Found 14 of 15 data blocks.
Target: "ctu-x264-cold.case.602.r02" - damaged. Found 14 of 15 data blocks.
Target: "ctu-x264-cold.case.602.r04" - found.
Target: "ctu-x264-cold.case.602.r05" - found.
Target: "ctu-x264-cold.case.602.r06" - found.
Target: "ctu-x264-cold.case.602.r07" - found.
Target: "ctu-x264-cold.case.602.r08" - found.
Target: "ctu-x264-cold.case.602.r09" - found.
Target: "ctu-x264-cold.case.602.r10" - found.
Target: "ctu-x264-cold.case.602.r11" - found.
Target: "ctu-x264-cold.case.602.r12" - found.
Target: "ctu-x264-cold.case.602.r13" - found.
Target: "ctu-x264-cold.case.602.r14" - found.
Target: "ctu-x264-cold.case.602.r15" - found.
Target: "ctu-x264-cold.case.602.r16" - found.
Target: "ctu-x264-cold.case.602.r17" - found.
Target: "ctu-x264-cold.case.602.r19" - found.
Target: "ctu-x264-cold.case.602.r18" - found.
Target: "ctu-x264-cold.case.602.r20" - found.
Target: "ctu-x264-cold.case.602.r21" - found.
Target: "ctu-x264-cold.case.602.r22" - found.
Target: "ctu-x264-cold.case.602.rar" - found.

Scanning extra files:

Repair is required.
2 file(s) exist but are damaged.
22 file(s) are ok.
You have 350 out of 352 data blocks available.
You have 6 recovery blocks available.
Repair is possible.
You have an excess of 4 recovery blocks.
2 recovery blocks will be used to repair.

Computing Reed Solomon matrix.
Constructing: done.
Solving: done.

Wrote 100000000 bytes to disk

Verifying repaired files:

Target: "ctu-x264-cold.case.602.r01" - found.
Target: "ctu-x264-cold.case.602.r02" - found.

Repair complete.

 

