[Support] HaveAGitGat - Tdarr: Audio/Video Library Analytics & Transcode Automation


Recommended Posts

So I just downloaded this and am testing it out. I have 5 transcode plugins enabled, and I notice they run one by one: first the priority 1 plugin transcodes and copies the file over, then priority 2 transcodes and copies over again, then #3, and so on. So because I have 5 plugins, does each file get transcoded 5 times?

 

If that is the way it is meant to work, is there any way to combine them into one?

 

Maybe someone can help me out and create one that combines them into as few passes as possible.

 

Here are the ones I have and the order they are in. 

  1. Tdarr_Plugin_MC93_Migz3CleanAudio Migz-Clean audio streams ----- This plugin keeps only specified language audio tracks & can tag those that have an unknown language.
  2. Tdarr_Plugin_MC93_Migz4CleanSubs Migz-Clean subtitle streams ----- This plugin keeps only specified language subtitle tracks & can tag those that have an unknown language.
  3. Tdarr_Plugin_MC93_Migz2CleanTitle Migz-Clean title metadata ----- This plugin removes title metadata from video/audio/subtitles, if it exists. Video checking is mandatory, audio and subtitles are optional.
  4. Tdarr_Plugin_MC93_Migz6OrderStreams Migz-Order Streams ----- Orders streams into Video first, then Audio (2ch, 6ch, 8ch) and finally Subtitles.
  5. Tdarr_Plugin_MC93_Migz1FFMPEG Migz-Transcode Using Nvidia GPU & FFMPEG ----- Files not in H265 will be transcoded into H265 using an Nvidia GPU with ffmpeg. Settings are dependent on file bitrate, working on the logic that H265 can support the same amount of data at half the bitrate of H264. NVDEC & NVENC compatible GPU required.

 

 

Link to comment

Actually, if someone who knows how to create a plugin can help me, this is what I am after:

 

  1. Reorder streams to video first, then audio, then subtitles
  2. ENG audio only (all tracks except commentary, if possible)
  3. ENG subtitles only (with forced subtitles on foreign audio)
  4. Clear out all metadata such as the video title, so that Plex will read the file name and not bad metadata
  5. If larger than 1080p, downscale to 1080p and convert to H265; if 1080p or less, just convert to H265
Link to comment
11 hours ago, almulder said:

Actually, if someone who knows how to create a plugin can help me, this is what I am after:

 

  1. Reorder streams to video first, then audio, then subtitles
  2. ENG audio only (all tracks except commentary, if possible)
  3. ENG subtitles only (with forced subtitles on foreign audio)
  4. Clear out all metadata such as the video title, so that Plex will read the file name and not bad metadata
  5. If larger than 1080p, downscale to 1080p and convert to H265; if 1080p or less, just convert to H265

Using the plugin creator you can do this; you just need to look up the transcoder's help section for the arguments you need.
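For illustration, a single combined plugin could emit one ffmpeg preset covering all five steps in a single pass. This is only a sketch under assumptions: the response fields mirror the ones Tdarr plugins return in the snippets posted in this thread, and the mapping flags are standard ffmpeg options, but the exact preset would need testing against real files:

```javascript
// Hypothetical single-pass preset combining stream ordering, language
// filtering, metadata cleanup, 1080p capping, and H265 (NVENC) encoding.
// Tdarr presets use the form "<input args>,<output args>".
const response = {
  processFile: true,
  container: '.mkv',
  handBrakeMode: false,
  FFmpegMode: true,
  reQueueAfter: true,
  infoLog: 'Combined single-pass transcode \n',
  preset:
    // video first, then eng audio, then eng subtitles ("?" makes subs optional)
    ',-map 0:v -map 0:a:m:language:eng -map 0:s:m:language:eng? ' +
    // cap height at 1080 without upscaling; encode with NVENC H265
    "-vf scale=-2:'min(1080,ih)' -c:v hevc_nvenc -c:a copy -c:s copy " +
    // strip global metadata such as bad titles
    '-map_metadata -1',
};

console.log(response.preset);
```

Skipping commentary tracks and flagging forced subtitles would still need per-stream inspection, which is why the community plugins do that logic in JavaScript rather than in a single argument string.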

Link to comment

Has anyone had an issue with Tdarr not displaying any information (queue, history, stats, etc.)? I have been using the container for a while and have 8,000-odd transcodes, and as far as I can tell the app is still working and transcoding properly, but it just doesn't display anything. I just updated to 1.109 hoping that would fix the issue, but it is still present.

Link to comment
5 hours ago, GigaGrim said:

Has anyone had an issue with Tdarr not displaying any information (queue, history, stats, etc.)? I have been using the container for a while and have 8,000-odd transcodes, and as far as I can tell the app is still working and transcoding properly, but it just doesn't display anything. I just updated to 1.109 hoping that would fix the issue, but it is still present.

If you have been doing backups, the fix is simple:

Click Dev, then Clear DB (halfway down the page).

Restart the Docker container.

Then restore the backup.

Restart again.

That should fix it.

Link to comment
On 5/6/2020 at 1:19 AM, Rand said:

Moving the tdarrdb directory off the cache pool fixed it for me. If it sits directly on the array or on an unassigned drive, then it's only writing about 4 KB/sec. I have no idea why this fixed it.

Are you using btrfs on your unassigned drive?

 

I've reformatted my cache drive as xfs and it's now writing at 4 KB/s as you saw.

Link to comment
11 hours ago, nicksphone said:

Using the plugin creator you can do this; you just need to look up the transcoder's help section for the arguments you need.

I have done it using the HandBrake option, but how can I make it use my GPU? I don't see any option to enable that. I have also looked at the other plugins I was using, to see about combining them, but they left me really lost.

 

Link to comment
I have done it using the HandBrake option, but how can I make it use my GPU? I don't see any option to enable that. I have also looked at the other plugins I was using, to see about combining them, but they left me really lost.
 
First important question: are you using the tdarr_aio container? It will only work with that one.

Sent from my MI 8 using Tapatalk

Link to comment
11 hours ago, darkreeper said:

First important question: are you using the tdarr_aio container? It will only work with that one.

Sent from my MI 8 using Tapatalk
 

Not sure. How do I check and set that? I tried looking at 2 other plugins that use the GPU (that work) and see no mention of tdarr_aio.

 

Also, do you know how to handle forced subtitles, and how to add a filter to skip the file if it is already H265 and within the max width and height?

 

Here is what I have:

    var fs = require('fs');
    var path = require('path');
    if (fs.existsSync(path.join(process.cwd() , '/npm'))) {
    var rootModules = path.join(process.cwd() , '/npm/node_modules/')
    } else{
    var rootModules = ''
    }
   
    const importFresh = require(rootModules+'import-fresh');
    const library = importFresh('../methods/library.js')
      
    module.exports.details = function details() {

          return {
            id: "N7N25RIjg",
            Name: "Testing",
            Type: "Video",
            Operation: "Transcode",
            Description: "Testing",
            Version: "",
            Link: ""
          }
        }



    module.exports.plugin = function plugin(file) {
        
        
          //Must return this object at some point
        
          var response = {
        
             processFile : false,
             preset : '',
             container : '.mkv',
             handBrakeMode : false,
             FFmpegMode : true,
             reQueueAfter : true,
             infoLog : '',
        
          }

          response.infoLog += "" + library.filters.filterByResolution(file,"exclude","480p,576p,720p,1080p").note + library.filters.filterByCodec(file,"exclude","h265").note
        
          
           if ((library.filters.filterByResolution(file, "exclude", "480p,576p,720p,1080p").outcome === true && library.filters.filterByCodec(file, "exclude", "h265").outcome === true) || file.forceProcessing === true) {

             
              response.preset = '--encoder x265 --quality 24 --encoder-level "5.1" --encoder-preset 6 --maxWidth 1920 --maxHeight 1080 --aencoder copy --all-audio --audio-copy-mask aac,ac3,eac3,truehd,dts,dtshd,mp3,flac --audio-lang-list eng --all-subtitles --subtitle-lang-list eng'
              response.container = '.mkv'
              response.handBrakeMode = true
              response.FFmpegMode = false

              response.reQueueAfter = true;
              response.processFile = true
              response.infoLog +=  "File is being transcoded using custom arguments \n"
              return response
        
        
             }else{

              response.processFile = false;
              response.infoLog += "File does not meet the filter conditions, skipping \n"
              return response
        
        
             }
        }
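On the GPU question: the preset above uses `--encoder x265`, which is HandBrake's CPU encoder, so this plugin will never touch the GPU regardless of the container. A minimal sketch of the change, assuming HandBrakeCLI's standard `nvenc_h265` encoder name (note that `--encoder-preset` values differ for NVENC, so that flag may need adjusting too):

```javascript
// Swap HandBrake's CPU x265 encoder for the NVENC hardware encoder.
// Only the encoder name changes; the quality/audio/subtitle flags stay as-is.
const cpuPreset =
  '--encoder x265 --quality 24 --encoder-level "5.1" --encoder-preset 6';
const gpuPreset = cpuPreset.replace('--encoder x265', '--encoder nvenc_h265');

console.log(gpuPreset);
// --encoder nvenc_h265 --quality 24 --encoder-level "5.1" --encoder-preset 6
```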

      

 

Edited by almulder
Link to comment
2 hours ago, darkreeper said:

Tdarr_aio is not a plugin. It is the docker container itself.

Gesendet von meinem MI 8 mit Tapatalk
 

LOL, duh, I knew that. And yes, I am using that Docker container, but it only transcodes via the GPU with 2 plugins; all others are done on the CPU, and I can't figure out how to get it to use the GPU.

Link to comment
LOL, duh, I knew that. And yes, I am using that Docker container, but it only transcodes via the GPU with 2 plugins; all others are done on the CPU, and I can't figure out how to get it to use the GPU.

Have you checked that your GPU supports the codecs you want to use?

And by the way: what GPU are you using?

 

Sent from my MI 8 using Tapatalk

 

 

 

Link to comment
44 minutes ago, darkreeper said:

Have you checked your GPU supports the codecs you want to use?

And by the way: what GPU are you using?

 

Sent from my MI 8 using Tapatalk

 

 

 

It uses the GPU when using a community plugin, but not if I create one. (GTX 1050Ti)

 

Also, I have added a filter to exclude 480p,576p,720p,1080p, but it still grabs all my movie files. I don't understand why.

 

Here is my current custom local plugin:


    var fs = require('fs');
    var path = require('path');
    if (fs.existsSync(path.join(process.cwd() , '/npm'))) {
    var rootModules = path.join(process.cwd() , '/npm/node_modules/')
    } else{
    var rootModules = ''
    }
   
    const importFresh = require(rootModules+'import-fresh');
    const library = importFresh('../methods/library.js')
      
    module.exports.details = function details() {

          return {
            id: "lqc85voaT",
            Name: "Downsize >1080p",
            Type: "Video",
            Operation: "Transcode",
            Description: "",
            Version: "",
            Link: ""
          }
        }



    module.exports.plugin = function plugin(file) {
        
        
          //Must return this object at some point
        
          var response = {
        
             processFile : false,
             preset : '',
             container : '.mkv',
             handBrakeMode : false,
             FFmpegMode : true,
             reQueueAfter : true,
             infoLog : '',
        
          }

          response.infoLog += "" + library.filters.filterByResolution(file,"exclude","480p,576p,720p,1080p").note
        
          
          if ((library.filters.filterByResolution(file, "exclude", "480p,576p,720p,1080p").outcome === true) || file.forceProcessing === true) {

             
              response.preset = '--encoder x265 --quality 24 --encoder-level "5.1" --encoder-preset 6 --maxWidth 1920 --maxHeight 1080 --aencoder copy --all-audio --audio-copy-mask aac,ac3,eac3,truehd,dts,dtshd,mp3,flac --audio-lang-list eng --all-subtitles --subtitle-lang-list eng --subtitle-forced'
              response.container = '.mkv'
              response.handBrakeMode = true
              response.FFmpegMode = false

              response.reQueueAfter = true;
              response.processFile = true
              response.infoLog +=  "File is being transcoded using custom arguments \n"
              return response
        
        
             }else{

              response.processFile = false;
              response.infoLog += "File does not meet the filter conditions, skipping \n"
              return response
        
        
             }
        }
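If `filterByResolution` is matching everything, one hedged workaround is to check the probed video width directly instead of relying on the library's resolution labels. `file.ffProbeData.streams` is the ffprobe data Tdarr hands to plugins; treat the exact field names as assumptions to verify against your own files:

```javascript
// Manual guard: process only files whose video stream is wider than 1920 px.
function isLargerThan1080p(file) {
  const video = (file.ffProbeData.streams || []).find(
    (s) => s.codec_type === 'video'
  );
  return Boolean(video && video.width > 1920);
}

// Mocked examples: a 4K file and a 1080p file.
const uhdFile = { ffProbeData: { streams: [{ codec_type: 'video', width: 3840 }] } };
const hdFile = { ffProbeData: { streams: [{ codec_type: 'video', width: 1920 }] } };
console.log(isLargerThan1080p(uhdFile)); // prints true
console.log(isLargerThan1080p(hdFile)); // prints false
```

Inside the plugin, this check could replace the `filterByResolution` call in the `if` condition, which also makes the skip reason easy to log in `infoLog`.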

 

Edited by almulder
Link to comment
13 hours ago, almulder said:

It uses the GPU when using a community plugin, but not if I create one. (GTX 1050Ti)

 

Also, I have added a filter to exclude 480p,576p,720p,1080p, but it still grabs all my movie files. I don't understand why.

 

Here is my current custom local plugin:



    var fs = require('fs');
    var path = require('path');
    if (fs.existsSync(path.join(process.cwd() , '/npm'))) {
    var rootModules = path.join(process.cwd() , '/npm/node_modules/')
    } else{
    var rootModules = ''
    }
   
    const importFresh = require(rootModules+'import-fresh');
    const library = importFresh('../methods/library.js')
      
    module.exports.details = function details() {

          return {
            id: "lqc85voaT",
            Name: "Downsize >1080p",
            Type: "Video",
            Operation: "Transcode",
            Description: "",
            Version: "",
            Link: ""
          }
        }



    module.exports.plugin = function plugin(file) {
        
        
          //Must return this object at some point
        
          var response = {
        
             processFile : false,
             preset : '',
             container : '.mkv',
             handBrakeMode : false,
             FFmpegMode : true,
             reQueueAfter : true,
             infoLog : '',
        
          }

          response.infoLog += "" + library.filters.filterByResolution(file,"exclude","480p,576p,720p,1080p").note
        
          
          if ((library.filters.filterByResolution(file, "exclude", "480p,576p,720p,1080p").outcome === true) || file.forceProcessing === true) {

             
              response.preset = '--encoder x265 --quality 24 --encoder-level "5.1" --encoder-preset 6 --maxWidth 1920 --maxHeight 1080 --aencoder copy --all-audio --audio-copy-mask aac,ac3,eac3,truehd,dts,dtshd,mp3,flac --audio-lang-list eng --all-subtitles --subtitle-lang-list eng --subtitle-forced'
              response.container = '.mkv'
              response.handBrakeMode = true
              response.FFmpegMode = false

              response.reQueueAfter = true;
              response.processFile = true
              response.infoLog +=  "File is being transcoded using custom arguments \n"
              return response
        
        
             }else{

              response.processFile = false;
              response.infoLog += "File does not meet the filter conditions, skipping \n"
              return response
        
        
             }
        }

 

Have you added the GPU to the container?

Link to comment

Beta v1.1091 release [24th May 2020]:

 

All containers are now the same (tdarr, tdarr_aio, tdarr_aio:qsv) and are based on the tdarr_aio:qsv container which supports NVENC and QSV hardware transcoding.

 

tdarr_aio and tdarr_aio:qsv users, you can continue using those containers as normal and will receive updates. You don't need to do anything.

 

Users who were previously using the tdarr container will need to set up the container again and restore from a backup. There is now no need for a separate MongoDB container.

 

Please see the following for help:

http://tdarr.io/tools

https://github.com/HaveAGitGat/Tdarr/wiki/2---Installation

Link to comment

I used to love tdarr and even did some of the testing for it before it was released to CA on unRAID.

 

This past weekend, I updated my container and started to get Docker disk space warnings. My jaw dropped when I saw the size of Tdarr after the latest image pull:

 

[screenshot: Docker container list showing the Tdarr image size]

 

Tdarr is now over 5.5 GB!

 

This is the Alpine version, not even the Ubuntu OS base. Even so, we're now looking at an image that's larger than a full Ubuntu GUI desktop OS install.

Edit: Actually, it looks like the image uses an Ubuntu base by default, regardless of what Community Apps says.

 

When did we lose our minds?

 

-TorqueWrench

Edited by T0rqueWr3nch
Link to comment
19 hours ago, T0rqueWr3nch said:

I used to love tdarr and even did some of the testing for it before it was released to CA on unRAID.

 

This past weekend, I updated my container and started to get Docker disk space warnings. My jaw dropped when I saw the size of Tdarr after the latest image pull:

 

[screenshot: Docker container list showing the Tdarr image size]

 

Tdarr is now over 5.5 GB!

 

This is the Alpine version, not even the Ubuntu OS base. Even so, we're now looking at an image that's larger than a full Ubuntu GUI desktop OS install.

Edit: Actually, it looks like the image uses an Ubuntu base by default, regardless of what Community Apps says.

 

When did we lose our minds?

 

-TorqueWrench

This is just the hard drive space it's using; bump your Docker image size up a few gigs to stop the errors. Mine has gotten to over 9 gigs before and come back down. I have had simple Docker containers use 3 gigs for no reason before. I think it's more to do with how unRAID handles Docker than with the container itself.

Link to comment
On 5/26/2020 at 4:29 PM, T0rqueWr3nch said:

I used to love tdarr and even did some of the testing for it before it was released to CA on unRAID.

 

This past weekend, I updated my container and started to get Docker disk space warnings. My jaw dropped when I saw the size of Tdarr after the latest image pull:

 

[screenshot: Docker container list showing the Tdarr image size]

 

Tdarr is now over 5.5 GB!

 

This is the Alpine version, not even the Ubuntu OS base. Even so, we're now looking at an image that's larger than a full Ubuntu GUI desktop OS install.

Edit: Actually, it looks like the image uses an Ubuntu base by default, regardless of what Community Apps says.

 

When did we lose our minds?

 

-TorqueWrench

I had the same Docker warnings; this is by far the largest Docker container I have. Any chance this could be looked into?

Link to comment
1 hour ago, paul.barrett said:

I had the same Docker warnings; this is by far the largest Docker container I have. Any chance this could be looked into?

The Docker image only takes up the amount of disk space currently in use; this one, for instance, is using 5.2 GB, of which 315 MB is writable. You can set your Docker image size to 100 gigs and it will still only use the amount it needs at the time. Mine is set to 80 and using 21, so I've stopped getting those messages when a temp log file gets big.

Link to comment
4 hours ago, nicksphone said:

The Docker image only takes up the amount of disk space currently in use; this one, for instance, is using 5.2 GB, of which 315 MB is writable. You can set your Docker image size to 100 gigs and it will still only use the amount it needs at the time. Mine is set to 80 and using 21, so I've stopped getting those messages when a temp log file gets big.

Hey Nick,

 

That's not what we're complaining about. We're complaining that the size of the container itself is so large. 1 GB containers are unheard of. Over 5 gigs? That's astronomical.

Link to comment
1 hour ago, T0rqueWr3nch said:

Hey Nick,

 

That's not what we're complaining about. We're complaining that the size of the container itself is so large. 1 GB containers are unheard of. Over 5 gigs? That's astronomical.

I have 5 that are over a gig; binhex-krusader, a simple file manager, is 2.47 GB in my Docker image. Since this is still beta, he probably still has debug tools in the build and probably hasn't trimmed the size yet by skipping recommended-but-not-required dependencies, as this is nowhere near a final version. So let's give him time to get all the bugs out and nearly finish the beta before we give him aggro about size. A few gigs of bulk, IMO, is worth the 5 TB of space (and climbing) it has saved on my server.

Edited by nicksphone
typo
Link to comment
On 5/15/2020 at 7:58 PM, HaveAGitGat said:

Are you using HandBrake to transcode? HandBrake only encodes using the GPU, it doesn't decode with it, so perhaps that is why it is showing 40-50%, as the hardware is not being maxed out.

Any ideas on it, or is it just nvidia-smi not reporting correctly for ffmpeg? Either way, it's just something I noticed, nothing that has stopped me from saving a ton of space. Thanks for all the hard work; it has saved me a few pennies on drives.

Link to comment
  • 2 weeks later...

Tdarr seems to be constantly writing data to my cache even when it isn't actually transcoding anything. If Tdarr is running, I constantly see a 2 MB/s write going to my cache. It stops immediately after stopping the container. My libraries are set to only operate for 4 hours a day, yet this write activity seems to be constant. How can I get it to stop?

Link to comment
