[Support] Linuxserver.io - Sonarr



Squid, I tried the steps you posted and everything was moved off the cache. Once I turned use cache drive back to no the next tv show I downloaded was put on the cache drive again.

 

CHBMB, I'm not sure what you mean by the docker mappings? When I initially setup the dockers I had to add /mnt/user to each of the dockers to get them to be able to share the Downloads, Movies, and TV Shows shares. How should I be setting up the mappings?

Link to comment


/downloads = /mnt/cache/downloads/

/tv = /mnt/user/tv

/share2 = /mnt/user/share2

 

would be the convention, but you've got both that sort of mapping (/tv = /mnt/user/tv shows/) and /mnt/user/ = /mnt/user/.

 

So in effect, within Sonarr you could tell it that TV shows are in /tv OR in /mnt/user/tv shows/, so you've effectively mapped the location twice. It's hard to say without knowing how you defined things within the container, i.e. did you tell Sonarr your shows were in /mnt/user/tv shows/ or in /tv/? Inconsistency in how paths are defined inside the container is the cause of most problems.
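To make that concrete, here is a minimal sketch (my own illustration, not from the thread) of why a single consistent volume mapping matters; the /tv and "/mnt/user/tv shows" values are just the example mapping discussed above:

```shell
# Sketch: with ONE volume mapping, every path Sonarr sees inside the
# container translates unambiguously back to a host path.
# Example mapping (from the post above): /tv -> /mnt/user/tv shows
container_prefix="/tv"
host_prefix="/mnt/user/tv shows"

to_host_path() {
    case "$1" in
        "$container_prefix"/*)
            # strip the container prefix, prepend the host prefix
            printf '%s%s\n' "$host_prefix" "${1#"$container_prefix"}" ;;
        *)
            # a path entered the "other" way (e.g. /mnt/user/tv shows/...)
            # is NOT covered by the mapping - this is the double-mapping trap
            printf 'unmapped: %s\n' "$1" ;;
    esac
}

to_host_path "/tv/Some Show/Season 01"
to_host_path "/mnt/user/tv shows/Some Show"
```

Running this prints the translated host path for the /tv form and flags the /mnt/user form as unmapped, which is exactly the ambiguity a double mapping like /mnt/user = /mnt/user papers over.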

 

 

Link to comment

Ok, that makes sense.

 

It looks like I had 2 things wrong.

 

The first is that I was adding a /mnt = /mnt or /mnt/user = /mnt/user mapping to every docker.

 

The second is that I was mapping everything using /mnt/user instead of /mnt/cache (for example, configs were set to /mnt/user/appdata/Sonarr instead of /mnt/cache/appdata/Sonarr).
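A quick way to sanity-check that kind of mapping (a sketch of my own, assuming standard Linux coreutils) is to ask df which filesystem actually backs a path; on unRAID, /mnt/cache/... should resolve to the cache device, while /mnt/user/... goes through the user-share (shfs) layer:

```shell
# Print the device/filesystem that backs a given path, e.g. (hypothetically):
#   backing_fs /mnt/cache/appdata/Sonarr   -> the cache device
#   backing_fs /mnt/user/appdata/Sonarr    -> shfs (user-share layer)
backing_fs() {
    df -P "$1" | awk 'NR==2 {print $1}'
}

backing_fs /tmp   # demo on a path that exists everywhere
```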

 

I've updated all my containers and will see if this worked.

Link to comment

Hey guys,

Had a wonderful time with Sonarr, but a few months back it derped out, potentially due to an unsafe shutdown, and now it says:

Couldn't import episode /downloads/complete/excellentcontent.mkv: Disk full. Path

System.IO.IOException: Disk full. Path 
  at System.IO.File.Move (System.String sourceFileName, System.String destFileName) <0x2ac0d7fca5c0 + 0x002f0> in <filename unknown>:0 
  at NzbDrone.Common.Disk.DiskProviderBase.MoveFile (System.String source, System.String destination, Boolean overwrite) [0x000e3] in M:\BuildAgent\work\6c3239faf2b92630\src\NzbDrone.Common\Disk\DiskProviderBase.cs:209 
  at NzbDrone.Common.Disk.DiskTransferService.TryMoveFileTransactional (System.String sourcePath, System.String targetPath, Int64 originalSize, DiskTransferVerificationMode verificationMode) [0x0008f] in M:\BuildAgent\work\6c3239faf2b92630\src\NzbDrone.Common\Disk\DiskTransferService.cs:479 
  at NzbDrone.Common.Disk.DiskTransferService.TransferFile (System.String sourcePath, System.String targetPath, TransferMode mode, Boolean overwrite, DiskTransferVerificationMode verificationMode) [0x003c2] in M:\BuildAgent\work\6c3239faf2b92630\src\NzbDrone.Common\Disk\DiskTransferService.cs:301 
  at NzbDrone.Common.Disk.DiskTransferService.TransferFile (System.String sourcePath, System.String targetPath, TransferMode mode, Boolean overwrite, Boolean verified) [0x0000e] in M:\BuildAgent\work\6c3239faf2b92630\src\NzbDrone.Common\Disk\DiskTransferService.cs:185 
  at NzbDrone.Core.MediaFiles.EpisodeFileMovingService.TransferFile (NzbDrone.Core.MediaFiles.EpisodeFile episodeFile, NzbDrone.Core.Tv.Series series, System.Collections.Generic.List`1 episodes, System.String destinationFilePath, TransferMode mode) [0x0012c] in M:\BuildAgent\work\6c3239faf2b92630\src\NzbDrone.Core\MediaFiles\EpisodeFileMovingService.cs:118 
  at NzbDrone.Core.MediaFiles.EpisodeFileMovingService.MoveEpisodeFile (NzbDrone.Core.MediaFiles.EpisodeFile episodeFile, NzbDrone.Core.Parser.Model.LocalEpisode localEpisode) [0x0005e] in M:\BuildAgent\work\6c3239faf2b92630\src\NzbDrone.Core\MediaFiles\EpisodeFileMovingService.cs:80 
  at NzbDrone.Core.MediaFiles.UpgradeMediaFileService.UpgradeEpisodeFile (NzbDrone.Core.MediaFiles.EpisodeFile episodeFile, NzbDrone.Core.Parser.Model.LocalEpisode localEpisode, Boolean copyOnly) [0x00113] in M:\BuildAgent\work\6c3239faf2b92630\src\NzbDrone.Core\MediaFiles\UpgradeMediaFileService.cs:64 
  at NzbDrone.Core.MediaFiles.EpisodeImport.ImportApprovedEpisodes.Import (System.Collections.Generic.List`1 decisions, Boolean newDownload, NzbDrone.Core.Download.DownloadClientItem downloadClientItem, ImportMode importMode) [0x00274] in M:\BuildAgent\work\6c3239faf2b92630\src\NzbDrone.Core\MediaFiles\EpisodeImport\ImportApprovedEpisodes.cs:107

 

Which would be a very handy error if my drives were full, but none of my drives are close to full.

Transmission still downloads the episodes it fetches; it's just that Sonarr doesn't move them around any more.

Any ideas??
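One thing worth noting: the "Disk full" in that stack trace refers to the filesystem the *container* sees at the target of the move, so the host drives being empty doesn't rule it out. A diagnostic sketch (the container name "Sonarr" and the checks themselves are my assumptions, not from the thread):

```shell
# Two hypothetical checks for a "Disk full" that the host denies:
#   docker exec Sonarr df -h /tv /downloads   # space as the container sees it
#   df -h /mnt/user                           # space on the host side
# The same idea in script form, demonstrated here on the root filesystem:
free_kb() {
    df -Pk "$1" | awk 'NR==2 {print $4}'   # available space in KiB
}
free_kb /
```

If the container-side number is near zero while the host side is fine, the usual culprit is a mapping that lands inside the (fixed-size) docker.img instead of on the array.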

Link to comment

NZBGet config

MainDir=/downloads

DestDir=${MainDir}/completed

InterDir=${MainDir}/intermediate

NzbDir=${MainDir}/nzb

QueueDir=/config/queue

TempDir=${MainDir}/tmp

WebDir=/opt/nzbget/webui

ScriptDir=/config/ppscripts

LockFile=${MainDir}/nzbget.lock

LogFile=/config/log/nzbget.log

ConfigTemplate=/opt/nzbget/webui/nzbget.conf.template

RequiredDir=



Server1.Active=yes

Server1.Name=

Server1.Level=0

Server1.Group=0

Server1.Host=my.newsserver.com

Server1.Port=119

Server1.Username=user

Server1.Password=pass

Server1.JoinGroup=no

Server1.Encryption=no

Server1.Cipher=

Server1.Connections=4

Server1.Retention=0




ControlIP=0.0.0.0

ControlPort=6789

ControlUsername=nzbget

ControlPassword=tegbzn6789

RestrictedUsername=

RestrictedPassword=

AddUsername=

AddPassword=

SecureControl=yes

SecurePort=6791

SecureCert=/config/ssl/nzbget.crt

SecureKey=/config/ssl/nzbget.key

AuthorizedIP=127.0.0.1

DaemonUsername=root

UMask=000




Category1.Name=Movies

Category1.DestDir=

Category1.Unpack=yes

Category1.PostScript=

Category1.Aliases=

Category2.Name=Series
Category3.Name=Music
Category4.Name=Software


AppendCategoryDir=yes

NzbDirInterval=5

NzbDirFileAge=60

DupeCheck=yes



SaveQueue=yes

FlushQueue=yes

ReloadQueue=yes

ContinuePartial=yes

PropagationDelay=0

Decode=yes

ArticleCache=0

DirectWrite=yes

WriteBuffer=0

CrcCheck=yes

Retries=3

RetryInterval=10

ArticleTimeout=60

UrlTimeout=60

TerminateTimeout=600

DownloadRate=0

AccurateRate=no

DiskSpace=250

DeleteCleanupDisk=yes

NzbCleanupDisk=yes

KeepHistory=30

FeedHistory=7

UrlConnections=4

UrlForce=yes



WriteLog=append

RotateLog=3

ErrorTarget=both

WarningTarget=both

InfoTarget=both

DetailTarget=log

DebugTarget=log

LogBufferSize=1000

NzbLog=yes

BrokenLog=yes

DumpCore=no

TimeCorrection=0




OutputMode=curses

CursesNzbName=yes

CursesGroup=no

CursesTime=no

UpdateInterval=200



ParCheck=auto

ParRename=yes

ParRepair=yes

ParScan=extended

ParQuick=yes

ParBuffer=16

ParThreads=0

ParIgnoreExt=.sfv, .nzb, .nfo

HealthCheck=delete

ParTimeLimit=0

ParPauseQueue=no

ParCleanupQueue=yes



Unpack=yes

UnpackPauseQueue=no

UnpackCleanupDisk=yes

UnrarCmd=${AppDir}/unrar

SevenZipCmd=${AppDir}/7za

ExtCleanupDisk=.par2, .sfv, _brokenlog.txt

UnpackPassFile=


PostScript=

ScanScript=

QueueScript=

FeedScript=

ScriptOrder=

ScriptPauseQueue=no

EventInterval=0

/downloads -> /mnt/user/downloads/

/config -> /mnt/user/appdata/NZBGet

 

root@localhost:# /usr/local/emhttp/plugins/dynamix.docker.manager/scripts/docker run -d --name="NZBGet" --net="bridge" -e TZ="Pacific/Auckland" -e HOST_OS="unRAID" -p 6789:6789/tcp -v "/mnt/user/downloads/":"/downloads":rw -v "/mnt/user/appdata/NZBGet":"/config":rw gfjardim/nzbget

 

sonarr config

<Config>
<LogLevel>Info</LogLevel>
<Port>8989</Port>
<UrlBase/>
<BindAddress>*</BindAddress>
<SslPort>9898</SslPort>
<EnableSsl>False</EnableSsl>
<ApiKey>~~~~~~~~~~~~~~~~~~~~~~~</ApiKey>
<AuthenticationMethod>None</AuthenticationMethod>
<Branch>master</Branch>
<LaunchBrowser>True</LaunchBrowser>
<SslCertHash/>
<UpdateMechanism>BuiltIn</UpdateMechanism>
</Config>

 

/tv -> /mnt/user/TV shows/

/downloads -> /mnt/user/downloads/

/config -> /mnt/user/appdata/Sonarr

/dev/rtc -> /dev/rtc

 

root@localhost:# /usr/local/emhttp/plugins/dynamix.docker.manager/scripts/docker run -d --name="Sonarr" --net="bridge" --privileged="true" -e TZ="Pacific/Auckland" -e HOST_OS="unRAID" -e "PUID"="99" -e "PGID"="100" -p 8989:8989/tcp -v "/dev/rtc":"/dev/rtc":ro -v "/mnt/user/TV shows/":"/tv":rw -v "/mnt/user/downloads/":"/downloads":rw -v "/mnt/user/appdata/Sonarr":"/config":rw linuxserver/sonarr

[screenshot attached]

 

Hopefully that's what you wanted and not too much.

Link to comment

Does Anyone Have a Decent Method for Subtitles?

 

I am deaf in one ear and tend to rely a bit on subtitles, as it's sometimes hard for me to understand the dialogue in shows. With SickRage, subtitles were downloaded automatically, and they tended to be synchronized, as I believe it searches for matches on the release name/group.

 

I know that Sonarr does not yet implement subs, and I keep reading about subliminal being an option. I use Kodi, and while it's easy to get subtitles through the Kodi interface, it tends to take a few attempts to find properly synchronized subs. It's not a horrible ordeal, but when I am watching with others, it can be frustrating for them as I search and try.

 

If you have a semi-reliable way of getting subs (that do a decent job with downloading properly synchronized files) please share here.

 

Many thanks,

 

H.

Link to comment

A little help please. It seems a guy wrote a script for getting subtitles, and people report success. It requires Subliminal to be installed in the docker. I installed it manually with 'apt-get install -y subliminal' and followed the instructions, but it does not work for me... I did 'chmod +x' on each of the script files.

 

I suspect it has something to do with the fact that it's operating in a Docker container... perhaps the paths are not translating properly between the script and the Docker paths. I think adding the script to the Sonarr Docker would be awesome.

 

https://github.com/ebergama/sonarr-sub-downloader

 

Many thanks,

 

H.

 

Link to comment

I am trying to add a second indexer to Sonarr, following the instructions from the indexing site on how to add it, yet after adding the correct info Sonarr times out, saying it is unable to communicate with the indexing site. I have tried several times with the same result. The indexing site says its API is up and available; if I put the API URL in a browser, it comes back that it's up but that my query is incorrect, which is normal as it's not meant to come from a browser. I can't imagine there is anything in Sonarr that would be preventing the communication, but I thought I would put it out here for you guys to see. I checked my firewall, and it's not blocking outbound communication to the indexing site's URL, which is using HTTPS. Oh, and I did stop and restart the Sonarr docker a couple of times to see if that would help; it did not. I have another indexing site in Sonarr which works fine, btw.

 

This is what I pulled out of the Sonarr logs:

 

[Warn] Newznab: Unable to connect to indexer

 

[v2.0.0.4427] System.Net.WebException: The request timed out

at System.Net.HttpWebRequest.EndGetResponse (IAsyncResult asyncResult) <0x41cb5740 + 0x00197> in <filename unknown>:0

at System.Net.HttpWebRequest.GetResponse () <0x41cb33b0 + 0x0005a> in <filename unknown>:0

 

one.Common.Http.Dispatchers.ManagedHttpDispatcher.GetResponse (NzbDrone.Common.Http.HttpRequest request, System.Net.CookieContainer cookies) [0x000fe] in M:\BuildAgent\work\6c3239faf2b92630\src\NzbDrone.Common\Http\Dispatchers\ManagedHttpDispatcher.cs:61

 

 

Thoughts?

Link to comment


Impossible to say really; might be worth posting on the Sonarr site, tbh.

Link to comment

Post your docker run command for this and sab/nzbget, and some more details, like how you have the folders set up in sab/nzbget and Sonarr. I've never known an error like this that wasn't down to a config issue.

 

hey man, have you had a chance to look over the stuff I posted??

 

Yeah, nothing springs out. How are your shares configured? The only other thing I can think of, though even that shouldn't be a problem as your array is relatively empty: it's worth checking your docker.img (Settings ==> Docker ==> Advanced View) and seeing how full that is.

Link to comment


btrfs filesystem show:

 

    Label: none  uuid: 63962c44-e805-41b3-bdbe-1d30c99765b3

    Total devices 1 FS bytes used 7.14GiB

    devid    1 size 20.00GiB used 15.38GiB path /dev/loop0

btrfs scrub status:

 

    scrub status for 63962c44-e805-41b3-bdbe-1d30c99765b3

    scrub started at Sat Dec 31 23:18:17 2016 and finished after 00:01:10

    total bytes scrubbed: 7.40GiB with 0 errors
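For what it's worth, the "devid 1" line above can be turned into a rough allocation figure (numbers hard-coded from that output; this is my own back-of-the-envelope check, not from the thread):

```shell
# devid 1 size 20.00GiB used 15.38GiB -> how much of the image is allocated?
awk 'BEGIN { printf "allocated: %.0f%%\n", 15.38 / 20.00 * 100 }'
```

Note that btrfs "used" at the device level means allocated chunks; actual data is the 7.14 GiB "FS bytes used" figure, so the 20 GiB docker.img is not critically full here.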

 

is this what you mean for the shares??

[screenshot of share settings attached]

 

 

Link to comment

I have no idea then.

 

Although this looks like some Windows path?!

 

M:\BuildAgent\work\6c3239faf2b92630\src\NzbDrone.Common\Disk\DiskProviderBase.cs

 

Sent from my LG-H815 using Tapatalk

 

Hey, thanks for trying. I thought I'd use some of my Windows logic and reset everything, so I deleted the docker image... not my best decision...

right now I have:

 

/tv/ is not writeable by user abc

abc doesn't exist as a user when I try to change ownership from the command line, so I think I've made it worse...

Link to comment


abc is just a pseudonym for nobody from the group users, which is the default user for unRAID...
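Following on from that: the docker run command earlier in the thread passes PUID=99 and PGID=100, so "abc" inside the container maps to host uid 99 (nobody) / gid 100 (users). A sketch of the usual fix (the chown line is the unRAID convention; it's demonstrated below on a scratch directory with chmod only, since the real share isn't available here):

```shell
# The usual host-side fix for "/tv/ is not writeable by user abc":
#   chown -R nobody:users "/mnt/user/TV shows/"
# Demonstrated on a scratch directory so it runs unprivileged:
dir=$(mktemp -d)
touch "$dir/episode.mkv"
chmod -R u+rwX,g+rwX "$dir"   # rw on files, traverse (x) on directories
[ -w "$dir/episode.mkv" ] && [ -x "$dir" ] && echo "permissions OK"
rm -rf "$dir"
```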

 

I can't think much more about it right now, I'm kinda drunk...  :P

Link to comment
