BRiT Posted September 23, 2010

No, you're not plain wrong or stupid. If you were only performing reads or only writes at a single time, sequential read patterns would matter more than random access patterns. Of the four steps, the following involve both reads and writes: par repair and extraction. The par repair is hopefully a rare condition if you're using a quality source; I'd estimate it occurs in less than 5% of cases, perhaps as low as 0.5%. The extraction step always occurs unless the content posted wasn't placed into a container; I'd estimate it occurs 99.99% of the time.

Now, when you're extracting to the same drive, there are other conditions to consider. The drive's buffer isn't as effective, since it's split between reads and writes of different data. A larger portion of the time the drive is stuck seeking between two general locations, the read data sectors and the write data sectors; this nearly devolves into the random access usage pattern. It may depend on the individual extraction program (unrar or 7z), the ratio of reads to writes, and how large an in-memory buffer the program uses before writing results. I don't imagine the items being downloaded will fit entirely within system memory twice (compacted data and extracted data), so I/O is still a factor.

The other item to consider is whether you have multiple items queued up: unless you pause downloading while processing completed items, you're now in a read-and-write pattern. The drive is busy seeking between three general locations: reading the data for item one, writing the extracted data for item one, and writing the data for item two, which is still being downloaded. This is another nearly random access usage pattern.

When you say spare drives, do you mean connected to the server but not in the array? If I wanted absolute performance without going overboard, I'd have to do some general tests of various conditions.
A) If I pause all downloading while extracting an item, is the performance improvement greater than the latency introduced by not immediately downloading the next item?
B) Is there any improvement when extracting to the parity-protected array directly, instead of extracting to the same cache disk and moving the contents later?
C) Does using two cache disks / spare disks, with one as the working location and the other as the extraction location, give any substantial improvement?
D) Does extracting to the protected array directory improve overall performance?
E) What combination of A, B, and C or D provides a significant performance improvement?

Personally, I use a 5400 rpm laptop drive for the sabnzbd scratch disk and my main Slackware distro root disk. I'm not looking to maximize the performance of sabnzbd, as it's being processed in the background and I'm not anxiously awaiting the contents.
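Tests B and C above could be roughed out in software before committing to a layout. The sketch below is a toy harness, not a real benchmark: the two temporary directories it creates will usually land on the same filesystem, so meaningful numbers require pointing `src_dir` and `dst_dir` at actual mount points (e.g. the cache disk and a spare disk) and substituting a real extractor for the plain copy. The directory paths and the `time_copy` helper are illustrative, not part of any real tool.

```python
import os
import shutil
import tempfile
import time

def time_copy(src_dir, dst_dir, size_mb=4):
    """Write a test file in src_dir, then time copying it to dst_dir.

    The copy stands in for 'extraction': a sustained read from one
    location combined with a sustained write to another.
    """
    src = os.path.join(src_dir, "payload.bin")
    dst = os.path.join(dst_dir, "payload.out")
    with open(src, "wb") as f:
        f.write(os.urandom(size_mb * 1024 * 1024))
    start = time.perf_counter()
    shutil.copyfile(src, dst)
    elapsed = time.perf_counter() - start
    os.remove(src)
    os.remove(dst)
    return elapsed

with tempfile.TemporaryDirectory() as a, tempfile.TemporaryDirectory() as b:
    same = time_copy(a, a)   # read and write in the same location (case B-ish)
    split = time_copy(a, b)  # read one location, write another (case C-ish)
    print(f"same-location: {same:.3f}s, split-location: {split:.3f}s")
```

On real hardware you would repeat each case several times and look at the spread, since drive caching makes single runs noisy.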
chanders Posted September 25, 2010

Hey guys, I just finished downloading an mkv file from Giganews using sabnzbd. However, I get an error: [filename] Repair failed, not enough repair blocks (2144 short). What could be causing this? I really hate to have to re-download the darn thing again; it took me over 16 hrs to bring it down! The report for the nzb said complete. Help?
graywolf Posted September 25, 2010

Get a PAR utility; I use QuickPAR. Article about PAR: http://www.slyck.com/Newsgroups_Guide_PAR_PAR2_Files Use the PAR utility to try to repair the file. Once that happens, you'll need to extract the .rar file using a utility (I use 7-zip; WinRAR is another good one). Both utilities I use run from my Windows machine, accessing the unRAID user share.
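For intuition on what a PAR utility is doing during the "verify" phase: a PAR2 set stores a checksum for every fixed-size slice of each file, and verification re-hashes the downloaded data to see which slices survived. Here is a minimal sketch of that idea; it is not the real PAR2 format (which has its own packet layout and stores both an MD5 and a CRC32 per slice), and the tiny block size and sample bytes are made up for the demo.

```python
import hashlib

BLOCK = 4  # tiny block size for the demo; real PAR2 sets use large blocks

def block_hashes(data, block=BLOCK):
    """Hash each fixed-size block, the way PAR2 keeps a checksum per slice."""
    return [hashlib.md5(data[i:i + block]).hexdigest()
            for i in range(0, len(data), block)]

original = b"abcdefghijklmnop"     # 4 blocks of 4 bytes
expected = block_hashes(original)  # what the .par2 files would carry

damaged = bytearray(original)
damaged[6] = 0                     # corrupt one byte -> one bad block
found = [i for i, (e, g) in
         enumerate(zip(expected, block_hashes(bytes(damaged)))) if e == g]
print(f"Found {len(found)} of {len(expected)} data blocks")
```

This is why par2 can report per-file results like "damaged. Found 14 of 15 data blocks": only the slices whose checksums mismatch need replacing, not the whole file.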
chanders Posted September 26, 2010

Thanks. I just tried QuickPar and it said repair not necessary! It extracted with no problem using WinRAR. So I am confused now. Why did sabnzbd throw the error on unRAID?
NAS Posted September 26, 2010

There was probably more than one par set in the nzb. I wouldn't worry about the details unless it happens again.
chanders Posted September 26, 2010

OK, it did happen again this morning with another download. I copied all of the failed files and ran QuickPar on them, and it repaired the two files perfectly. This leads me to believe that par2 is not working properly on unRAID. I have the latest sabnzbd dependencies (2.0), which were supposed to fix the issue. Any ideas?
spants Posted September 26, 2010

OK, it did happen again this morning with another download. I copied all of the failed files and ran QuickPar on them, and it repaired the two files perfectly. This leads me to believe that par2 is not working properly on unRAID. I have the latest sabnzbd dependencies (2.0), which were supposed to fix the issue. Any ideas?

I have the same problem... intermittently.
blontic Posted September 27, 2010

I thought it was only me getting these issues. My SABnzbd fails on par2 for around 5% of my downloads. If I then run QuickPar and point it at the unRAID share, it fixes the file. It seems like par2 isn't working correctly on unRAID.
graywolf Posted September 28, 2010

Question: what do you do, if anything, if after PAR processing you are still missing files? Just chuck it, or try to find another nzb to download from?
chanders Posted September 28, 2010

Question: what do you do, if anything, if after PAR processing you are still missing files? Just chuck it, or try to find another nzb to download from?

After running QuickPar and seeing which files are the offending ones (the ones that can't be fixed by QuickPar), I usually re-download those files. That usually fixes it. But my point is, shouldn't this be automated?
NAS Posted September 28, 2010

If the logs don't show anything, what we need is an nzb that fails, isn't too large, and that we can try between us for debugging.
Orbi Posted September 28, 2010

Question: what do you do, if anything, if after PAR processing you are still missing files? Just chuck it, or try to find another nzb to download from?

After running QuickPar and seeing which files are the offending ones (the ones that can't be fixed by QuickPar), I usually re-download those files. That usually fixes it. But my point is, shouldn't this be automated?

I see that when the first repair fails, Sabnzbd fetches additional blocks to be used for a second repair. This doesn't always work, though. About 5% of my downloads fail but are repairable using QuickPar. There is something in the unRAID environment and par2cmdline that is not working 100%. Maybe a different build of the par2 utility would be more reliable? Here's a recent Slackware build: http://connie.slackware.com/~alien/slackbuilds/par2cmdline/ Discussion about that here: http://www.linuxquestions.org/questions/slackware-14/rar-and-par2-822882/ I don't know if it would work installed on top of the dependencies package, but for those having a lot of failed downloads, it would be worth a try.
chanders Posted September 28, 2010

^^ If anyone is successful, maybe we can get a working dependencies file going..
graywolf Posted September 30, 2010

Question: what do you do, if anything, if after PAR processing you are still missing files? Just chuck it, or try to find another nzb to download from?

After running QuickPar and seeing which files are the offending ones (the ones that can't be fixed by QuickPar), I usually re-download those files. That usually fixes it. But my point is, shouldn't this be automated?

So I'm still a complete noob at this. How do you download just the offending files, and what do you use to get just those files?
chanders Posted September 30, 2010

Usually it depends on the site you get the nzb from. If you look closely, you will see a listing of all the files that make up that one nzb file. You usually have the option to individually select certain files inside that nzb and create a new nzb for them. When you add the resulting nzb to sabnzbd, it will download just the files you selected.
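What those sites are doing is essentially filtering the nzb's XML: an nzb is just an XML document listing <file> entries along with their Usenet segments, so a partial nzb can be produced by dropping the unwanted entries. A rough sketch of that idea, using a made-up minimal sample (a real nzb carries <groups> and populated <segments> data that must be kept intact for the selected files):

```python
import xml.etree.ElementTree as ET

NS = "http://www.newzbin.com/DTD/2003/nzb"  # the nzb XML namespace

# A minimal, made-up nzb for illustration; real ones come from an index site.
SAMPLE = f"""<nzb xmlns="{NS}">
  <file subject="example.r01 (1/5)"><segments/></file>
  <file subject="example.r02 (1/5)"><segments/></file>
  <file subject="example.vol00+1.par2 (1/1)"><segments/></file>
</nzb>"""

def select_files(nzb_xml, wanted):
    """Keep only <file> entries whose subject mentions a wanted name."""
    root = ET.fromstring(nzb_xml)
    for f in list(root):  # iterate over a copy so removal is safe
        subject = f.get("subject", "")
        if not any(w in subject for w in wanted):
            root.remove(f)
    return ET.tostring(root, encoding="unicode")

partial = select_files(SAMPLE, ["example.r02"])
print(partial)
```

Feeding the resulting document back to sabnzbd as a new .nzb file would fetch only the selected entries, which is what the per-file checkboxes on index sites generate for you.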
joelones Posted September 30, 2010

I've seen the same behaviour with SABnzbd 5.4 and SABnzbdDependencies-2.0. Certain failed nzbs would, surprisingly, be recoverable by issuing the 'par2 r file' command in a terminal. Now I've unchecked the config option "Only perform post-processing on jobs that passed all PAR2 checks" and am seeing some new oddities. For example, certain nzbs labeled _Failed_ would be completely unrarred and correct, and should not have been labeled failed at all. Something is odd indeed.
NAS Posted October 1, 2010

If the logs don't show anything, what we need is an nzb that fails, isn't too large, and that we can try between us for debugging.

The debugging solution is still the same ^^^
chanders Posted October 1, 2010

Something is odd indeed.

Where is Romir when we need him..
JetShred Posted October 6, 2010

Several have failed for me. However, I can run par2 from the console to successfully repair the download. Is SABnzbd not seeing par2?
NAS Posted October 6, 2010

Guys, even on a perfect sab install some will fail; lots of posts on Usenet are crap. Without an nzb example that fails to analyze, you might as well be saying "my car broke, how do I fix it".
neilt0 Posted October 6, 2010

Guys, even on a perfect sab install some will fail; lots of posts on Usenet are crap. Without an nzb example that fails to analyze, you might as well be saying "my car broke, how do I fix it".

Sabnzbd has pretty extensive logging. Please post a log. ("Please post a log" is a trademark of Joe L. Productions, Inc.)
betgear Posted October 6, 2010

OK, I found a problem with Sab, and decided to get the logs etc. to see if the issue can be fixed. The resulting file can be repaired with QuickPar; the files appear to need rejoining, which Sab seems to fail to do. I've attached the log file for this download only, along with a couple of screenshots. Hope this helps. Glen.

CC3.zip
betgear Posted October 6, 2010

The attachment was too big to include the pictures, so here they are...
betgear Posted October 6, 2010

The files repaired correctly by opening a telnet session and typing: par2 r ctu-....vol00+1.par2

par2cmdline version 0.4, Copyright © 2003 Peter Brian Clements.
Modifications for concurrent processing, Unicode support, and hierarchial directory support are Copyright © 2007-2008 Vincent Tan.
Concurrent processing utilises Intel Thread Building Blocks 2.0, Copyright © 2007-2008 Intel Corp.
Executing using the 32-bit x86 (IA32) instruction set.

par2cmdline comes with ABSOLUTELY NO WARRANTY.

This is free software, and you are welcome to redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. See COPYING for details.

Processing verifications and repairs concurrently.
Loading "ctu-x264-cold.case.602.vol00+1.par2".
Loaded 51 new packets including 1 recovery blocks
Loading "ctu-x264-cold.case.602.vol01+2.par2".
Loading "ctu-x264-cold.case.602.vol15+3.par2".
Loaded 2 new packets including 2 recovery blocks
Loaded 3 new packets including 3 recovery blocks
There are 24 recoverable files and 0 other files.
The block size used was 3456000 bytes.
There are a total of 352 data blocks.
The total size of the data files is 1173335766 bytes.

Verifying source files:
Target: "ctu-x264-cold.case.602.r00" - found.
Target: "ctu-x264-cold.case.602.r03" - found.
Target: "ctu-x264-cold.case.602.r01" - damaged. Found 14 of 15 data blocks.
Target: "ctu-x264-cold.case.602.r02" - damaged. Found 14 of 15 data blocks.
Target: "ctu-x264-cold.case.602.r04" - found.
Target: "ctu-x264-cold.case.602.r05" - found.
Target: "ctu-x264-cold.case.602.r06" - found.
Target: "ctu-x264-cold.case.602.r07" - found.
Target: "ctu-x264-cold.case.602.r08" - found.
Target: "ctu-x264-cold.case.602.r09" - found.
Target: "ctu-x264-cold.case.602.r10" - found.
Target: "ctu-x264-cold.case.602.r11" - found.
Target: "ctu-x264-cold.case.602.r12" - found.
Target: "ctu-x264-cold.case.602.r13" - found.
Target: "ctu-x264-cold.case.602.r14" - found.
Target: "ctu-x264-cold.case.602.r15" - found.
Target: "ctu-x264-cold.case.602.r16" - found.
Target: "ctu-x264-cold.case.602.r17" - found.
Target: "ctu-x264-cold.case.602.r19" - found.
Target: "ctu-x264-cold.case.602.r18" - found.
Target: "ctu-x264-cold.case.602.r20" - found.
Target: "ctu-x264-cold.case.602.r21" - found.
Target: "ctu-x264-cold.case.602.r22" - found.
Target: "ctu-x264-cold.case.602.rar" - found.

Scanning extra files:
Repair is required.
2 file(s) exist but are damaged.
22 file(s) are ok.
You have 350 out of 352 data blocks available.
You have 6 recovery blocks available.
Repair is possible.
You have an excess of 4 recovery blocks.
2 recovery blocks will be used to repair.
Computing Reed Solomon matrix.
Constructing: done.
Solving: done.
Wrote 100000000 bytes to disk

Verifying repaired files:
Target: "ctu-x264-cold.case.602.r01" - found.
Target: "ctu-x264-cold.case.602.r02" - found.
Repair complete.
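For anyone curious what the "Computing Reed Solomon matrix" step in that output is doing: par2 builds recovery blocks with Reed-Solomon codes over 16-bit words, which in the special case of a single recovery block reduces to plain XOR parity. A toy illustration of that one-block case, using made-up data blocks (real par2 handles many recovery blocks and arbitrary loss patterns up to the recovery-block count):

```python
def xor_blocks(blocks):
    """XOR equal-length blocks together byte by byte."""
    out = bytearray(len(blocks[0]))
    for b in blocks:
        for i, byte in enumerate(b):
            out[i] ^= byte
    return bytes(out)

data = [b"ctu1", b"ctu2", b"ctu3"]  # three equal-size data blocks
recovery = xor_blocks(data)         # one recovery block, computed when posting

lost_index = 1                      # pretend block 1 was damaged in transit
survivors = [b for i, b in enumerate(data) if i != lost_index]
rebuilt = xor_blocks(survivors + [recovery])
print(rebuilt == data[lost_index])  # the lost block is recovered exactly
```

This is also why the log can say "You have an excess of 4 recovery blocks. 2 recovery blocks will be used to repair": each missing data block consumes exactly one recovery block, and any surplus goes unused.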