[Support] Linuxserver.io - Sonarr



Can someone help me out with my docker settings? I keep getting the "Import failed, path does not exist or is not accessible" error in Sonarr, even for items that do exist. Within the SAB settings it shows that my downloads are pointed to /config/Downloads/complete, so I tried switching the Sonarr container dir to that, with no luck. This is my current setting:

 

 

[screenshot: sab.jpg]

[screenshot: sonarr.jpg]

8 minutes ago, Squid said:

Can you post the full error message?

 

The mappings taken by themselves appear to be correct.

 

my error message is:

 

Import failed, path does not exist or is not accessible by Sonarr: /config/Downloads/complete/Vikings.S05E05.REPACK.720p.HDTV.x264-AVS-BUYMORE/
39 minutes ago, s.lamoureux said:

/config/Downloads/...

That's the problem

 

You've got SAB moving the completed downloads to a sub-folder of its appdata.

 

I switched to NZBGet a long time ago, but IIRC when you set where completed downloads get moved, SAB points out that relative folders will be placed within its config folder.

 

IE: You've got it moving to Downloads; you want it to move to /downloads. Note that everything is case-sensitive (downloads does not equal Downloads), and the leading slash tells SAB that it's not a relative folder.

 

EDIT:  If you've got a slew of downloads queued up, then odds are decent that they will still get placed into the wrong folder, but new ones will get the correct folder.

 

You'll find all of your previously completed downloads in your appdata share (sab folder / Downloads)
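The relative-vs-absolute distinction above is easy to demonstrate with plain shell. This is purely illustrative (throwaway temp paths, nothing SAB-specific); it just shows why a bare "Downloads" ends up under the base folder while "/downloads" does not:

```shell
#!/bin/sh
# Simulate SAB's base folder with a throwaway temp dir (stands in for /config).
# A bare "Downloads" resolves relative to that base folder, while "/downloads"
# is absolute and ignores it. Linux paths are also case-sensitive.
base=$(mktemp -d)
cd "$base" || exit 1
echo "relative: $(readlink -m Downloads)"    # resolves under $base
echo "absolute: $(readlink -m /downloads)"   # resolves to /downloads
[ "downloads" = "Downloads" ] || echo "case matters: downloads != Downloads"
```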

Edited by Squid
  • 3 weeks later...

Okay. I gotta add my 2 cents because I'm going rather insane. Years ago when I set my server up I was using binhex for things like sab and couch potato and whatever. A while back, I decided to switch to the linuxserver versions. It seemed to go well for a while...and yet I had trouble getting my movies from the downloads folder into my actual movie folder. Well, I decided I would try to fix that. I moved to Radarr and moved everything to linuxserver thinking that the mappings would be easier that way. Now neither my tv nor my movies will go from the completed downloads to the folders where they are supposed to go.  I've made it WORSE!!!!

And it's driving me nuts. I've been reading page after page here and I've tried quite a few things, but I cannot get it to work. I really don't want to have to nix my docker and start from scratch. There's just too much information there. But if I have to delete all the sonarr, radarr, sab and whatever else....I will. I'll even try nzbget. I just really want to fix this. It's a massive hassle to come to this thing every night and put everything in its place.

So thank you in advance whoever is willing to help.

 

The mappings are as follows: Sab, Sonarr, Radarr.

 

[screenshot: mappings.jpeg]

Default Base Folder: /config
Temporary Download Folder [Browse]: Location to store unprocessed downloads. Can only be changed when queue is empty.
Minimum Free Space for Temporary Download Folder: Auto-pause when free space is beneath this value. In bytes, optionally follow with K, M, G, T. For example: "800M" or "8G"
Completed Download Folder [Browse]
 
There are those settings, jj. I'm not sure how to run docker safe permissions, but I would be very happy to.
Squid, that's a really good question.  Here is the answer:

Import failed, path does not exist or is not accessible by Sonarr: /config/Downloads/complete/tv/Steven Universe S05E24 Legs From Here to Homeworld 1080p iT WEB-DL AAC2 0 H 264-iT00NZ-RakuvFIN-Obfuscated/

 

I swear I'm not insane. And yet, I'm pretty sure you are going to tell me to do something like "push 'enter'" and it will fix everything.


Thanks so much for both of you responding.


I've dug around for a good bit now trying to find a previous instance of my issue so, if my Google-Fu has just failed me, do please point me to previous solutions if I'm repeating an issue that has already been solved!

So, what I'm getting is this error:

Couldn't import episode /downloads/complete/sonarr/DefinitelyTheActualFilename.mkv: Disk full. Path 

With this tracelog:

System.IO.IOException: Disk full. Path 
  at System.IO.File.Move (System.String sourceFileName, System.String destFileName) [0x00116] in <b0e1ad7573a24fd5a9f2af9595e677e7>:0 
  at NzbDrone.Common.Disk.DiskProviderBase.MoveFileInternal (System.String source, System.String destination) [0x00000] in C:\BuildAgent\work\5d7581516c0ee5b3\src\NzbDrone.Common\Disk\DiskProviderBase.cs:232 
  at NzbDrone.Mono.Disk.DiskProvider.MoveFileInternal (System.String source, System.String destination) [0x00076] in C:\BuildAgent\work\5d7581516c0ee5b3\src\NzbDrone.Mono\Disk\DiskProvider.cs:170 
  at NzbDrone.Common.Disk.DiskProviderBase.MoveFile (System.String source, System.String destination, System.Boolean overwrite) [0x000e3] in C:\BuildAgent\work\5d7581516c0ee5b3\src\NzbDrone.Common\Disk\DiskProviderBase.cs:227 
  at NzbDrone.Common.Disk.DiskTransferService.TryMoveFileTransactional (System.String sourcePath, System.String targetPath, System.Int64 originalSize, NzbDrone.Common.Disk.DiskTransferVerificationMode verificationMode) [0x0008f] in C:\BuildAgent\work\5d7581516c0ee5b3\src\NzbDrone.Common\Disk\DiskTransferService.cs:490 
  at NzbDrone.Common.Disk.DiskTransferService.TransferFile (System.String sourcePath, System.String targetPath, NzbDrone.Common.Disk.TransferMode mode, System.Boolean overwrite, NzbDrone.Common.Disk.DiskTransferVerificationMode verificationMode) [0x003ce] in C:\BuildAgent\work\5d7581516c0ee5b3\src\NzbDrone.Common\Disk\DiskTransferService.cs:312 
  at NzbDrone.Common.Disk.DiskTransferService.TransferFile (System.String sourcePath, System.String targetPath, NzbDrone.Common.Disk.TransferMode mode, System.Boolean overwrite, System.Boolean verified) [0x0000e] in C:\BuildAgent\work\5d7581516c0ee5b3\src\NzbDrone.Common\Disk\DiskTransferService.cs:196 
  at NzbDrone.Core.MediaFiles.EpisodeFileMovingService.TransferFile (NzbDrone.Core.MediaFiles.EpisodeFile episodeFile, NzbDrone.Core.Tv.Series series, System.Collections.Generic.List`1[T] episodes, System.String destinationFilePath, NzbDrone.Common.Disk.TransferMode mode) [0x0012c] in C:\BuildAgent\work\5d7581516c0ee5b3\src\NzbDrone.Core\MediaFiles\EpisodeFileMovingService.cs:119 
  at NzbDrone.Core.MediaFiles.EpisodeFileMovingService.MoveEpisodeFile (NzbDrone.Core.MediaFiles.EpisodeFile episodeFile, NzbDrone.Core.Parser.Model.LocalEpisode localEpisode) [0x0005e] in C:\BuildAgent\work\5d7581516c0ee5b3\src\NzbDrone.Core\MediaFiles\EpisodeFileMovingService.cs:81 
  at NzbDrone.Core.MediaFiles.UpgradeMediaFileService.UpgradeEpisodeFile (NzbDrone.Core.MediaFiles.EpisodeFile episodeFile, NzbDrone.Core.Parser.Model.LocalEpisode localEpisode, System.Boolean copyOnly) [0x0017c] in C:\BuildAgent\work\5d7581516c0ee5b3\src\NzbDrone.Core\MediaFiles\UpgradeMediaFileService.cs:76 
  at NzbDrone.Core.MediaFiles.EpisodeImport.ImportApprovedEpisodes.Import (System.Collections.Generic.List`1[T] decisions, System.Boolean newDownload, NzbDrone.Core.Download.DownloadClientItem downloadClientItem, NzbDrone.Core.MediaFiles.EpisodeImport.ImportMode importMode) [0x00272] in C:\BuildAgent\work\5d7581516c0ee5b3\src\NzbDrone.Core\MediaFiles\EpisodeImport\ImportApprovedEpisodes.cs:107 

I had recently filled up my array so I added another 3TB Red into it.  It's got space for days, so the "disk" definitely isn't (literally) full.  At that, Sonarr can see (and was able to rename) an episode that I manually copied over to that new drive, so the drive is definitely part of the user share into which Sonarr imports files from SAB.  I am at a loss here.  I can't seem to find the "convince Sonarr that there's a couple terabytes waiting for it" button.  :(

Any ideas?


Hi all. I'm not on Unraid, sorry for posting here, but I can't find a solution.

 

I'm trying to use the sonar_cleanup.php script to clean up any files left over by an unrar script for Transmission.

 

But the script uses PHP, and there is no PHP installed in the container.

 

How can I point to the external /usr/bin/php from inside the container, in order to use the PHP installed on the host machine? Is there any way to deal with that?
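Pointing at the host's /usr/bin/php from inside the container generally won't work, since the binary's shared libraries aren't present there. One possible approach, sketched under the assumption that you're on a linuxserver.io image (which runs scripts dropped into /custom-cont-init.d at container start) and that a PHP package is available for the image's base distro, is to install PHP inside the container; the script path, name, and exact package names below are hypothetical:

```shell
#!/bin/sh
# Hypothetical /custom-cont-init.d/10-install-php.sh
# Runs at container start (linuxserver.io custom-init hook) and installs a
# PHP CLI inside the container so scripts like sonar_cleanup.php can run.
# The exact package name depends on the base image; adjust as needed.
if ! command -v php >/dev/null 2>&1; then
  if command -v apk >/dev/null 2>&1; then
    # Alpine-based image; package may be php7, php8, php81-cli, etc.
    apk add --no-cache php7 || apk add --no-cache php8
  elif command -v apt-get >/dev/null 2>&1; then
    # Ubuntu/Debian-based image
    apt-get update && apt-get install -y --no-install-recommends php-cli
  fi
fi
```

Note the install is repeated on every container recreation, since changes inside the container don't survive an image update.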

 

Many thanks. Jo

On 8/8/2018 at 3:14 PM, CHBMB said:

Are you sure the new disk is allowed for the relevant share? Look around Shares => whatever share you're using.

As best I can tell, yes, the user share in question can use the new disk.  Disks 1 through 8 have been a part of the share for a long time, Disk 9 is excluded, and then Disk 10 is the new drive.  In the settings for this user share, Included Disks is set to All and Excluded Disks is set to Disk 9.  If I click the compute link in the Size column on the Shares tab, it shows the space used by the one file I have manually copied over.

17 hours ago, MethodCall said:

As best I can tell, yes, the user share in question can use the new disk.  Disks 1 through 8 have been a part of the share for a long time, Disk 9 is excluded, and then Disk 10 is the new drive.  In the settings for this user share, Included Disks is set to All and Excluded Disks is set to Disk 9.  If I click the compute link in the Size column on the Shares tab, it shows the space used by the one file I have manually copied over.

Check Split Level and Most Free for the share. Turn on Help, then ask if you don't understand what those settings do.

On 8/6/2018 at 12:44 AM, seefilms said:

Import failed, path does not exist or is not accessible by Sonarr: /config/Downloads/complete/tv/Steven Universe S05E24 Legs From Here to Homeworld 1080p iT WEB-DL AAC2 0 H 264-iT00NZ-RakuvFIN-Obfuscated/

 

I swear I'm not insane. And yet, I'm pretty sure you are going to tell me to do something like "push 'enter'" and it will fix everything.

It won't be quite as simple as pushing enter, but it is a simple solution.

 

https://lime-technology.com/forums/topic/57181-real-docker-faq/?page=2#comment-566086

 

3 hours ago, trurl said:

Check Split Level and Most Free for the share. Turn on Help, then ask if you don't understand what those settings do.

Split level for that share is set to top two (the 3rd choice in the dropdown) and it is set to Most Free.  (It was set to this prior to adding the new drive, as well.)

4 hours ago, MethodCall said:

Split level for that share is set to top two (the 3rd choice in the dropdown) and it is set to Most Free.  (It was set to this prior to adding the new drive, as well.)

 

Sorry, my bad. I meant "Minimum Free", not "Most Free".

 

Note that Split Level takes precedence over Allocation Method and Minimum Free when unRAID decides which disk to write.

7 hours ago, trurl said:

 

Sorry, my bad. I meant "Minimum Free", not "Most Free".

 

Note that Split Level takes precedence over Allocation Method and Minimum Free when unRAID decides which disk to write.

I wouldn't expect the split level to be an issue.  I've been rocking this split level through the addition of several drives and it has always split correctly.  As for minimum free, it is currently set to 0KB, which is what it has been set to for the life of my server.  At that, on the share's own compute stats, it says there's over 2TB free on the new drive.

8 hours ago, MethodCall said:

I wouldn't expect the split level to be an issue.  I've been rocking this split level through the addition of several drives and it has always split correctly.  As for minimum free, it is currently set to 0KB, which is what it has been set to for the life of my server.  At that, on the share's own compute stats, it says there's over 2TB free on the new drive.

 

You would have to provide more details before we could analyze the specific situation and determine whether the things I mentioned are coming into play.

 

Split Level can force unRAID to put a file on a disk other than the new disk. And 0 Minimum Free is not recommended and is probably one of the main reasons people get unexpected out-of-space. unRAID has no way to know how large a file will become when it chooses a disk to write. If a disk has less than Minimum Free, it will choose a different disk. You should set Minimum Free to larger than the largest file you expect to write to the User Share. There is also a Minimum Free setting for cache in Global Share Settings.
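The allocation rule described above (pick the disk with the most free space, but skip any disk below Minimum Free) can be sketched as a toy shell function. This is illustrative only, not unRAID's actual code, and Split Level still overrides it; disk names and sizes are made up:

```shell
#!/bin/sh
# Toy sketch of "Most Free" allocation with a Minimum Free threshold.
# Input lines are "diskname free_kb"; disks below MINFREE are skipped,
# and the emptiest remaining disk wins. Split Level (not modeled here)
# takes precedence over this choice on a real unRAID system.
MINFREE=1000000   # ~1 GB; should exceed the largest file you expect to write
pick_disk() {
  best=""; bestfree=0
  while read -r disk free; do
    [ "$free" -lt "$MINFREE" ] && continue          # below Minimum Free: skip
    if [ "$free" -gt "$bestfree" ]; then
      best=$disk; bestfree=$free                    # emptiest disk so far
    fi
  done
  echo "$best"
}
printf 'disk1 500000\ndisk2 2000000\ndisk3 3000000\n' | pick_disk   # prints disk3
```

With Minimum Free at 0, nothing is ever skipped, so a nearly full disk can still be chosen and then run out of space mid-write.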

 

But as I mentioned, Split Level takes precedence, so if Split Level says a file should go on a certain disk to keep it together with other files, then that is the disk it is going to choose.

16 hours ago, trurl said:

But as I mentioned, Split Level takes precedence, so if Split Level says a file should go on a certain disk to keep it together with other files, then that is the disk it is going to choose.

 

Yup.  It was split level.  I *clearly* hadn't wrapped my head around the problem sufficiently.  I wasn't understanding that "Hey, Episode 22 can't copy to the new drive because Episodes 1-21 are already on one of the other drives in the array and the split level won't allow E22 to be copied to another damn drive."  Arrrgh.  Well spotted, trurl.  Well spotted, indeed.

 

Edit (for posterity): I verified this by manually copying Episodes 1-22 to the new drive (to join E22, which I had already manually copied).  Once I did that, Sonarr was able to import E23 without issue (via manual import, because the automatic import had already failed for that episode).
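For anyone hitting the same wall, the check is easy to script. This sketch simulates two disk mounts under a temp dir so it's self-contained; on a real unRAID box you would loop over /mnt/disk* instead, and the show/season names are placeholders:

```shell
#!/bin/sh
# Simulate two array disks; the season folder exists only on "disk1".
# Split Level routes new episodes for a season to whichever disk already
# holds that folder, regardless of free space on the other disks, so
# finding where the folder lives explains the "Disk full" error.
root=$(mktemp -d)
mkdir -p "$root/disk1/TV/Some Show/Season 05" "$root/disk2/TV"
for d in "$root"/disk*/; do
  if [ -d "${d}TV/Some Show/Season 05" ]; then
    echo "season folder found on: $d"
  fi
done
```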

Edited by MethodCall
  • 4 weeks later...

Hi, approximately 36 hours ago my Sonarr stopped connecting to my NZB indexer.  I have no idea why.  I verified with my indexer that everything is hunky-dory, and there are no API hits on their end (as seen in my account).  I have the following error in the log:

 

18-9-15 21:34:48.8|Warn|Newznab|Unable to connect to indexer

[v2.0.0.5228] System.Net.WebException: Failed to read complete http response ---> System.Net.WebException: Value cannot be null.
Parameter name: src ---> System.ArgumentNullException: Value cannot be null.
Parameter name: src

...

18-9-15 21:34:48.8|Warn|NzbDroneErrorPipeline|Invalid request Validation failed: 
 -- Unable to connect to indexer, check the log for more details

 

If anyone has this issue, double check that your indexer didn't just all of a sudden up and change their URL structure :)

 

In my case it went from https://www.example.com and they dropped the www to become https://example.com

Edited by fordfox
It vurks.