Sonarr -> Downloads -> re-encode: Optimum structuring to limit copies/fragmentation?


Nirin


Ok, so I know the TRaSH Guides setup uses a folder structure that puts the TV/Movies/Downloads folders all on a single share, so that hardlinks and atomic moves work. This seems mostly to benefit torrents (which I don't use much), but it's also the simplest setup method, so it's probably the one I'll go for (still considering it though, as having different shares for TV and Movies would allow for better split levels, but that's another topic).
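Just to illustrate what I mean by the single-share layout (paths are placeholders loosely based on the usual TRaSH-style /data share; my share names may end up different): a hardlink only works within one filesystem, which is why the downloads and media folders have to sit under the same share for Sonarr's import to be instant and copy-free. Roughly:

```python
import os

# Placeholder paths following a TRaSH-style single-share layout;
# adjust the share/folder names to whatever you actually use.
download = "/mnt/user/data/torrents/tv/Show.S01E01.mkv"
library = "/mnt/user/data/media/tv/Show/Season 01/Show.S01E01.mkv"

os.makedirs(os.path.dirname(library), exist_ok=True)
os.link(download, library)  # same filesystem, so no second copy of the data is made
```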

 

However, I use FileFlows to re-encode some files (depending on the source codec, existing file size, etc.). And I've realised that if Sonarr downloads and moves a TV show onto the array, and FileFlows (or Tdarr or whatever, I use FileFlows) then re-encodes the files, it creates new files and deletes the old ones. That moves the nicely contiguous files onto new parts of the disks, eventually scattering them all over the place (I'm talking about months/years of this happening with various folders of stuff, upgraded files and so on).

 

I know one solution is to have files download to one download folder, have Sonarr move them to -another- post-download folder, which FileFlows uses as a watch folder, and from there have them encoded and moved to the final destination. However, this means Sonarr loses track of the file (you can at least set it to unmonitor so it doesn't download it again), so Sonarr will no longer have an up-to-date catalogue of all the files.

 

Is there a better solution? I did think that most of the issue comes from the files being re-encoded after being put on the array; doing the encode while they're still on the cache would avoid this. However, I don't think there's any way I can force that to happen? Sure, most of the time the encode might happen within a couple of hours, but sometimes it may take longer (if there's a queue). And there might also be existing files on the array that need to be re-encoded to make them more efficient (though the only solution there is probably unBALANCE, or manual moves, I guess).
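What I'm picturing, very roughly, is a pre-flight check in the encode flow that only touches files still sitting on the cache pool. On Unraid a user-share path like /mnt/user/data/... is backed by /mnt/cache/data/... until the mover runs (assuming the pool is actually named "cache"), so a sketch like this would tell the flow whether the file has landed on the array yet - the paths and pool name are just assumptions for the example:

```python
import os

def still_on_cache(user_path: str, pool: str = "cache") -> bool:
    """Return True if a /mnt/user/... file is still backed by the cache pool."""
    if not user_path.startswith("/mnt/user/"):
        return False
    pool_path = user_path.replace("/mnt/user/", f"/mnt/{pool}/", 1)
    return os.path.exists(pool_path)

# Hypothetical example: only re-encode in place if the mover
# hasn't shifted the file onto the array yet.
if still_on_cache("/mnt/user/data/media/tv/Show/Season 01/Show.S01E01.mkv"):
    print("Still on cache - safe to re-encode before it lands on the array")
```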

 

Figured I'd ask the question and hope there's a simple solution!

 

Edit: I don't mean fragmentation in the old 'run defrag on the drives' sense, as I think Unraid deals with that anyway... I just don't know the right term for files ending up scattered around randomly.

Edited by Nirin