[Support] HaveAGitGat - Tdarr: Audio/Video Library Analytics & Transcode Automation


Recommended Posts

Quick question: is there a way to scan video files (no writing, just reading) and report the audio stream sizes by language?

 

I'm using Unmanic to remove unwanted audio streams, and the issue is that it copies everything to cache and then back to the array regardless of whether any work actually needs to be done, even for files that have no audio tracks to remove. So if I let it rip through my library it will essentially copy-paste something like 50TB of data, which is obviously not desirable.

 

I haven't used Tdarr before, so I'm just wondering whether it can scan the library (without copy + write) and report back the audio stream sizes so I can then tell it which files to process.
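
(For reference, a purely read-only report of this kind can also be pulled with ffprobe outside of Tdarr. A rough sketch only: the library path is an assumption, and the per-stream byte counts rely on the mkvmerge/mkvpropedit statistics tags such as NUMBER_OF_BYTES-eng that show up in the scan logs later in this thread; files without those tags will only report codec and bit rate.)

#!/bin/bash
# Read-only: list audio streams with language, codec, bit rate and (if tagged) byte count.
find /mnt/user/media -type f \( -name '*.mkv' -o -name '*.mp4' \) -print0 |
while IFS= read -r -d '' f; do
  echo "== $f"
  ffprobe -v error -select_streams a \
    -show_entries stream=index,codec_name,bit_rate:stream_tags \
    -of default=noprint_wrappers=1 "$f" |
    grep -Ei 'index|codec_name|bit_rate|language|NUMBER_OF_BYTES'
done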

Link to comment
2 hours ago, g0nz0 said:

Yup, same here. Hopefully it will resolve itself soon.

 

Ah nice, just making sure. I was making some changes to my firewall when I noticed it was bombing, and I wasn't sure if I broke something on my end or what. I see the package was updated yesterday, so it's probably something from that.

Link to comment
6 hours ago, Civic1201 said:

For the "issue" docker not available try this:

 

 

Here's a one-liner from the above thread that fixes the docker update issue. It does the same thing as manually editing the file. Just ssh to your box / use the Web terminal and paste the command.

 

sed -i "s#distribution.manifest.v2+json'#distribution.manifest.v2+json,application/vnd.oci.image.index.v1+json'#" /usr/local/emhttp/plugins/dynamix.docker.manager/include/DockerClient.php

 

This will not persist when you reboot Unraid. The real fix will be in the next Unraid release.
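
(If you want the workaround to survive reboots until that release ships, one option is to re-apply the same sed at boot. A sketch only, assuming the stock flash-backed /boot/config/go file; remove it again once the fixed Unraid release is out.)

# Append the workaround to the go file so it is re-applied on every boot (assumption: stock Unraid go file).
cat >> /boot/config/go << 'EOF'
# Temporary Docker update-check workaround; remove after upgrading Unraid
sed -i "s#distribution.manifest.v2+json'#distribution.manifest.v2+json,application/vnd.oci.image.index.v1+json'#" /usr/local/emhttp/plugins/dynamix.docker.manager/include/DockerClient.php
EOF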

Link to comment

I have a general question related to the "Use cache" setting. My library and temp shares for Tdarr are currently set to use the cache pool.

So when I start transcoding, Tdarr will use the cache while transcoding, and I will have to use the Mover to transfer the transcoded files back to the share folder.

Should/can I disable the cache setting for the library shares so files are written directly to the share after transcoding?

Link to comment
  • 4 weeks later...

So I recently upgraded an early Ryzen build to a 12700K because I wanted to use QSV and eliminate the need for a graphics card. I re-set up HW decoding in Jellyfin & Plex and it works beautifully (verified with INTEL_GPU_TOP via the console).

 

However, Tdarr for some reason will not transcode anything using QSV. I've tried two different plugins: Boosh-Transcode using QSV GPU & FFMPEG, and FFMPEG VAAPI HEVC Transcode.

 

  • This is using the separate node Docker container.
  • Yes, I have added --device=/dev/dri to the node (and also to the server container, just in case).
  • It worked fine with NVENC on the old Ryzen build; I haven't tried that with the current build as I removed the graphics card.
  • Jellyfin and Plex work fine with --device=/dev/dri, and I validated it in the console using INTEL_GPU_TOP.
  • On the Boosh plugin it does "work", but it runs at something like 10,000 frames per second and never actually transcodes anything.

I'm out of ideas on this since it works in Jellyfin & Plex but not Tdarr. Help please? Logs below.
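
(A couple of hedged, read-only checks that are sometimes useful here. Assumptions: the node container is named tdarr_node, and the node image exposes ffmpeg on PATH as tdarr-ffmpeg, as the worker logs further down this thread suggest. Nothing below touches the library.)

# 1) Is the iGPU render node actually visible inside the node container?
docker exec tdarr_node ls -l /dev/dri

# 2) Does a plain VAAPI test encode work inside the container? (null output, nothing is written)
docker exec tdarr_node tdarr-ffmpeg -vaapi_device /dev/dri/renderD128 \
  -f lavfi -i testsrc=duration=5:size=1280x720:rate=30 \
  -vf format=nv12,hwupload -c:v hevc_vaapi -f null -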

 

2023-03-22T11:17:52.152Z 2xKFJ7DUZ:Node[girthquake]:Worker[OmVX8dHc5]:[Step W03] [C1] Analysing file - running plugins
2023-03-22T11:17:52.152Z 2xKFJ7DUZ:Node[girthquake]:Worker[OmVX8dHc5]:transcode task, scanning for extra file details before transcode
2023-03-22T11:17:52.153Z 2xKFJ7DUZ:Node[girthquake]:Worker[OmVX8dHc5]:Updating file properties using mkvpropedit
2023-03-22T11:17:52.153Z 2xKFJ7DUZ:Node[girthquake]:Worker[OmVX8dHc5]:Updated file properties using mkvpropedit
2023-03-22T11:17:53.154Z 2xKFJ7DUZ:Node[girthquake]:Worker[OmVX8dHc5]:Scan complete
2023-03-22T11:17:53.156Z 2xKFJ7DUZ:Node[girthquake]:Worker[OmVX8dHc5]:Updating Node relay: Processing
2023-03-22T11:17:53.158Z 2xKFJ7DUZ:Node[girthquake]:Worker[OmVX8dHc5]:[1/2] Checking file frame count
2023-03-22T11:17:53.159Z 2xKFJ7DUZ:Node[girthquake]:Worker[OmVX8dHc5]:Transcode task, determining transcode settings
2023-03-22T11:17:53.160Z 2xKFJ7DUZ:Node[girthquake]:Worker[OmVX8dHc5]:[2/2] Frame count 0
2023-03-22T11:17:53.161Z 2xKFJ7DUZ:Node[girthquake]:Worker[OmVX8dHc5]:Plugin stack selected
2023-03-22T11:17:53.161Z 2xKFJ7DUZ:Node[girthquake]:Worker[OmVX8dHc5]:[1/5] Reading plugin
2023-03-22T11:17:53.162Z 2xKFJ7DUZ:Node[girthquake]:Worker[OmVX8dHc5]:Plugin: Tdarr_Plugin_lmg1_Reorder_Streams
2023-03-22T11:17:53.162Z 2xKFJ7DUZ:Node[girthquake]:Worker[OmVX8dHc5]:[2/5] Plugin read
2023-03-22T11:17:53.162Z 2xKFJ7DUZ:Node[girthquake]:Worker[OmVX8dHc5]:[3/5] Installing dependencies
2023-03-22T11:17:53.163Z 2xKFJ7DUZ:Node[girthquake]:Worker[OmVX8dHc5]:[4/5] Running plugin
2023-03-22T11:17:53.163Z 2xKFJ7DUZ:Node[girthquake]:Worker[OmVX8dHc5]:[5/5] Running plugin finished
2023-03-22T11:17:53.164Z 2xKFJ7DUZ:Node[girthquake]:Worker[OmVX8dHc5]:Plugin: Tdarr_Plugin_Mthr_VaapiHEVCTranscode
2023-03-22T11:17:53.164Z 2xKFJ7DUZ:Node[girthquake]:Worker[OmVX8dHc5]:Worker log:
2023-03-22T11:17:53.164Z Pre-processing - Tdarr_Plugin_lmg1_Reorder_Streams
2023-03-22T11:17:53.164Z File has video in first stream
2023-03-22T11:17:53.164Z File meets conditions!
2023-03-22T11:17:53.164Z ☒Plugin error! TypeError: _0x868168[_0x1edbf5(...)] is not a function
2023-03-22T11:17:53.164Z
2023-03-22T11:17:53.164Z 2xKFJ7DUZ:Node[girthquake]:Worker[OmVX8dHc5]:[2/5] Plugin read
2023-03-22T11:17:53.165Z 2xKFJ7DUZ:Node[girthquake]:Worker[OmVX8dHc5]:[1/5] Reading plugin
2023-03-22T11:17:53.165Z 2xKFJ7DUZ:Node[girthquake]:Worker[OmVX8dHc5]:Worker config: {
2023-03-22T11:17:53.165Z "processFile": false,
2023-03-22T11:17:53.165Z "preset": "",
2023-03-22T11:17:53.165Z "container": "",
2023-03-22T11:17:53.165Z "handBrakeMode": false,
2023-03-22T11:17:53.165Z "FFmpegMode": true,
2023-03-22T11:17:53.165Z "reQueueAfter": false,
2023-03-22T11:17:53.165Z "infoLog": "File has video in first stream\n File meets conditions!\n",
2023-03-22T11:17:53.165Z "handbrakeMode": "",
2023-03-22T11:17:53.165Z "ffmpegMode": true,
2023-03-22T11:17:53.165Z "error": true
2023-03-22T11:17:53.165Z }
2023-03-22T11:17:53.165Z 2xKFJ7DUZ:Node[girthquake]:Worker[OmVX8dHc5]:Error TypeError: _0x868168[_0x1edbf5(...)] is not a function
2023-03-22T11:17:53.166Z 2xKFJ7DUZ:Node[girthquake]:Worker[OmVX8dHc5]:Worker config [-error-]:

 

2023-03-22T11:14:36.182Z XY6NuvfEkK:Node[girthquake]:Worker[ZYarUfcd_]:[Step W05] [C1] Launching subworker
2023-03-22T11:14:36.182Z XY6NuvfEkK:Node[girthquake]:Worker[ZYarUfcd_]:Preparing to launch subworker
2023-03-22T11:14:36.182Z XY6NuvfEkK:Node[girthquake]:Worker[ZYarUfcd_]:Subworker launched
2023-03-22T11:14:36.183Z XY6NuvfEkK:Node[girthquake]:Worker[ZYarUfcd_]:[1/3] Sending command to subworker
2023-03-22T11:14:36.183Z XY6NuvfEkK:Node[girthquake]:Worker[ZYarUfcd_]:[2/3] HandBrakeCLI -i /mnt/media/Convert/Romeo + Juliet (1996) Bluray-1080p Proper.mkv -o /tmp/Romeo + Juliet (1996) Bluray-1080p Proper-TdarrCacheFile-pbaqtSdEm1.mp4 --scan
2023-03-22T11:14:36.183Z XY6NuvfEkK:Node[girthquake]:Worker[ZYarUfcd_]:[3/3] Command sent
2023-03-22T11:14:36.184Z XY6NuvfEkK:Node[girthquake]:Worker[ZYarUfcd_]:To see live CLI output, enable 'Log full FFmpeg/HandBrake output' in the staging section on the Tdarr tab before the job starts. Note this could increase the job report size substantially.
2023-03-22T11:14:36.184Z XY6NuvfEkK:Node[girthquake]:Worker[ZYarUfcd_]:Subworker:Online
2023-03-22T11:14:36.184Z XY6NuvfEkK:Node[girthquake]:Worker[ZYarUfcd_]:Subworker:Receiving transcode settings
2023-03-22T11:14:36.184Z XY6NuvfEkK:Node[girthquake]:Worker[ZYarUfcd_]:Subworker:Running CLI
2023-03-22T11:14:37.192Z XY6NuvfEkK:Node[girthquake]:Worker[ZYarUfcd_]:Subworker:a.Thread closed, code: 0
2023-03-22T11:14:37.192Z XY6NuvfEkK:Node[girthquake]:Worker[ZYarUfcd_]:Subworker exit approved, killing subworker
2023-03-22T11:14:37.193Z XY6NuvfEkK:Node[girthquake]:Worker[ZYarUfcd_]:Subworker killed
2023-03-22T11:14:37.193Z XY6NuvfEkK:Node[girthquake]:Worker[ZYarUfcd_]:b.Thread closed, code: 0
2023-03-22T11:14:37.194Z XY6NuvfEkK:Node[girthquake]:Worker[ZYarUfcd_]:CLI code: 0
2023-03-22T11:14:37.195Z XY6NuvfEkK:Node[girthquake]:Worker[ZYarUfcd_]:Last 200 lines of CLI log:
2023-03-22T11:14:37.195Z XY6NuvfEkK:Node[girthquake]:Worker[ZYarUfcd_]:[11:14:33] Compile-time hardening features are enabled
2023-03-22T11:14:37.195Z
2023-03-22T11:14:37.195Z [11:14:33] hb_display_init: attempting VA driver 'iHD'
2023-03-22T11:14:37.195Z
2023-03-22T11:14:37.195Z libva info: VA-API version 1.16.0
2023-03-22T11:14:37.195Z libva info: User environment variable requested driver 'iHD'
2023-03-22T11:14:37.195Z libva info: Trying to open /usr/lib/x86_64-linux-gnu/dri/iHD_drv_video.so
2023-03-22T11:14:37.195Z
2023-03-22T11:14:37.195Z libva info: Found init function __vaDriverInit_1_16
2023-03-22T11:14:37.195Z
2023-03-22T11:14:37.195Z libva info: va_openDriver() returns 0
2023-03-22T11:14:37.195Z
2023-03-22T11:14:37.195Z [11:14:33] hb_display_init: using VA driver 'iHD'
2023-03-22T11:14:37.195Z libva info: VA-API version 1.16.0
2023-03-22T11:14:37.195Z libva info: User environment variable requested driver 'iHD'
2023-03-22T11:14:37.195Z libva info: Trying to open /usr/lib/x86_64-linux-gnu/dri/iHD_drv_video.so
2023-03-22T11:14:37.195Z
2023-03-22T11:14:37.195Z libva info: Found init function __vaDriverInit_1_16
2023-03-22T11:14:37.195Z
2023-03-22T11:14:37.195Z libva info: va_openDriver() returns 0
2023-03-22T11:14:37.195Z
2023-03-22T11:14:37.195Z [11:14:33] qsv: is available on this system
2023-03-22T11:14:37.195Z
2023-03-22T11:14:37.195Z Cannot load libnvidia-encode.so.1
2023-03-22T11:14:37.195Z
2023-03-22T11:14:37.195Z [11:14:33] hb_init: starting libhb thread
2023-03-22T11:14:37.195Z
2023-03-22T11:14:37.195Z [11:14:33] thread 14ebe51ff700 started ("libhb")
2023-03-22T11:14:37.195Z HandBrake 1.6.1 (2023021100) - Linux x86_64 - https://handbrake.fr
2023-03-22T11:14:37.195Z 20 CPUs detected
2023-03-22T11:14:37.195Z
2023-03-22T11:14:37.195Z Opening /mnt/media/Convert/Romeo + Juliet (1996) Bluray-1080p Proper.mkv...
2023-03-22T11:14:37.195Z
2023-03-22T11:14:37.195Z [11:14:33] CPU: 12th Gen Intel(R) Core(TM) i7-12700K
2023-03-22T11:14:37.195Z [11:14:33] - Intel microarchitecture Alder Lake performance hybrid architecture
2023-03-22T11:14:37.195Z
2023-03-22T11:14:37.195Z [11:14:33] - logical processor count: 20
2023-03-22T11:14:37.195Z [11:14:33] Intel Quick Sync Video support: yes
2023-03-22T11:14:37.195Z [11:14:33] Intel Quick Sync Video integrated adapter with index 0
2023-03-22T11:14:37.195Z [11:14:33] Impl mfxhw64 library path: /usr/lib/x86_64-linux-gnu/libmfxhw64.so.1.35
2023-03-22T11:14:37.195Z [11:14:33] - Intel Media SDK hardware: API 1.35 (minimum: 1.3)
2023-03-22T11:14:37.195Z [11:14:33] - Decode support: h264 hevc (8bit: yes, 10bit: yes) av1 (8bit: yes, 10bit: yes)
2023-03-22T11:14:37.195Z [11:14:33] - H.264 encoder: yes
2023-03-22T11:14:37.195Z [11:14:33] - preferred implementation: hardware (any) via ANY
2023-03-22T11:14:37.195Z [11:14:33] - capabilities (hardware): lowpower breftype vsinfo chromalocinfo opt1 opt2+mbbrc+extbrc+trellis+repeatpps+ib_adapt+nmpslice
2023-03-22T11:14:37.195Z [11:14:33] - H.265 encoder: yes (8bit: yes, 10bit: yes)
2023-03-22T11:14:37.195Z [11:14:33] - preferred implementation: hardware (any) via ANY
2023-03-22T11:14:37.195Z [11:14:33] - capabilities (hardware): lowpower bpyramid vsinfo masteringinfo cllinfo opt1
2023-03-22T11:14:37.195Z [11:14:33] - AV1 encoder: no
2023-03-22T11:14:37.195Z [11:14:33] hb_scan: path=/mnt/media/Convert/Romeo + Juliet (1996) Bluray-1080p Proper.mkv, title_index=1
2023-03-22T11:14:37.195Z
2023-03-22T11:14:37.195Z udfread ERROR: ECMA 167 Volume Recognition failed
2023-03-22T11:14:37.195Z
2023-03-22T11:14:37.195Z disc.c:333: failed opening UDF image /mnt/media/Convert/Romeo + Juliet (1996) Bluray-1080p Proper.mkv
2023-03-22T11:14:37.195Z
2023-03-22T11:14:37.195Z disc.c:437: error opening file BDMV/index.bdmv
2023-03-22T11:14:37.195Z
2023-03-22T11:14:37.195Z disc.c:437: error opening file BDMV/BACKUP/index.bdmv
2023-03-22T11:14:37.195Z
2023-03-22T11:14:37.195Z [11:14:33] bd: not a bd - trying as a stream/file instead
2023-03-22T11:14:37.195Z
2023-03-22T11:14:37.195Z libdvdread: DVDOpenFileUDF:UDFFindFile /VIDEO_TS/VIDEO_TS.IFO failed
2023-03-22T11:14:37.195Z libdvdnav: vm: vm: failed to read VIDEO_TS.IFO
2023-03-22T11:14:37.195Z
2023-03-22T11:14:37.195Z [11:14:33] dvd: not a dvd - trying as a stream/file instead
2023-03-22T11:14:37.195Z
2023-03-22T11:14:37.195Z Input #0, matroska,webm, from '/mnt/media/Convert/Romeo + Juliet (1996) Bluray-1080p Proper.mkv':
2023-03-22T11:14:37.195Z Metadata:
2023-03-22T11:14:37.195Z title : Romeo.+.Juliet.1996.REPACK.1080p.BluRay.x264-OFT
2023-03-22T11:14:37.195Z creation_time : 2022-10-05T02:40:33.000000Z
2023-03-22T11:14:37.195Z ENCODER : Lavf58.76.100
2023-03-22T11:14:37.195Z Duration: 02:00:10.66, start: 0.000000, bitrate: 6195 kb/s
2023-03-22T11:14:37.195Z Chapters:
2023-03-22T11:14:37.195Z Chapter #0:0: start 0.000000,
2023-03-22T11:14:37.195Z end 577.368000
2023-03-22T11:14:37.195Z Metadata:
2023-03-22T11:14:37.195Z title : The Prologue
2023-03-22T11:14:37.195Z Chapter #0:1: start 577.368000, end 817.150000
2023-03-22T11:14:37.195Z Metadata:
2023-03-22T11:14:37.195Z title : Sycamore Grove
2023-03-22T11:14:37.195Z Chapter #0:2: start 817.150000, end 868.242000
2023-03-22T11:14:37.195Z Metadata:
2023-03-22T11:14:37.195Z title : Bachelor of the Year
2023-03-22T11:14:37.195Z Chapter #0:3: start 868.242000, end 958.458000
2023-03-22T11:14:37.195Z Metadata:
2023-03-22T11:14:37.195Z title : The Pool Hall
2023-03-22T11:14:37.195Z Chapter #0:4: start 958.458000, end 1145.770000
2023-03-22T11:14:37.195Z Metadata:
2023-03-22T11:14:37.195Z title : The Capulet Mansion
2023-03-22T11:14:37.195Z Chapter #0:5: start 1145.770000, end 1446.070000
2023-03-22T11:14:37.195Z Metadata:
2023-03-22T11:14:37.195Z title : Mecutio at Sycamore Grove
2023-03-22T11:14:37.195Z Chapter #0:6: start 1446.070000, end 1531.947000
2023-03-22T11:14:37.195Z Metadata:
2023-03-22T11:14:37.195Z title : The Party Parties
2023-03-22T11:14:37.195Z Chapter #0:7: start 1531.947000, end 1677.551000
2023-03-22T11:14:37.195Z Metadata:
2023-03-22T11:14:37.195Z title : Romeo Spies Juliet
2023-03-22T11:14:37.195Z Chapter #0:8: start 1677.551000, end 1726.141000
2023-03-22T11:14:37.195Z Metadata:
2023-03-22T11:14:37.195Z title : Fulgencio Warn Tybalt
2023-03-22T11:14:37.195Z Chapter #0:9: start 1726.141000, end 1809.099000
2023-03-22T11:14:37.195Z Metadata:
2023-03-22T11:14:37.195Z title : Juliet, the Flirt
2023-03-22T11:14:37.195Z Chapter #0:10: start 1809.099000, end 1939.855000
2023-03-22T11:14:37.195Z Metadata:
2023-03-22T11:14:37.195Z title : First Kisses
2023-03-22T11:14:37.195Z Chapter #0:11: start 1939.855000, end 2001.082000
2023-03-22T11:14:37.195Z Metadata:
2023-03-22T11:14:37.195Z title : The Revelation
2023-03-22T11:14:37.195Z Chapter #0:12: start 2001.082000, end 2106.521000
2023-03-22T11:14:37.195Z Metadata:
2023-03-22T11:14:37.195Z title : The Party's Over
2023-03-22T11:14:37.195Z Chapter #0:13: start 2106.521000, end 2708.789000
2023-03-22T11:14:37.195Z Metadata:
2023-03-22T11:14:37.195Z title : Romeo Sneaks Back
2023-03-22T11:14:37.195Z Chapter #0:14: start 2708.789000, end 2986.734000
2023-03-22T11:14:37.195Z Metadata:
2023-03-22T11:14:37.195Z title : Arranging the Marriage
2023-03-22T11:14:37.195Z Chapter #0:15: start 2986.734000, end 3095.759000
2023-03-22T11:14:37.195Z Metadata:
2023-03-22T11:14:37.195Z title : Boys on the Beach
2023-03-22T11:14:37.195Z Chapter #0:16: start 3095.759000, end 3228.934000
2023-03-22T11:14:37.195Z Metadata:
2023-03-22T11:14:37.195Z title : Romeo Gets the Word From the Nurse
2023-03-22T11:14:37.195Z Chapter #0:17: start 3228.934000, end 3358.021000
2023-03-22T11:14:37.195Z Metadata:
2023-03-22T11:14:37.195Z title : The Nurse Teases Juliet
2023-03-22T11:14:37.195Z Chapter #0:18: start 3358.021000, end 3452.032000
2023-03-22T11:14:37.195Z Metadata:
2023-03-22T11:14:37.195Z title : The Wedding
2023-03-22T11:14:37.195Z Chapter #0:19: start 3452.032000, end 4071.401000
2023-03-22T11:14:37.195Z Metadata:
2023-03-22T11:14:37.195Z title : Death on a Summer's Day
2023-03-22T11:14:37.195Z Chapter #0:20: start 4071.401000, end 4526.772000
2023-03-22T11:14:37.195Z Metadata:
2023-03-22T11:14:37.195Z title : Retribution at Twilight
2023-03-22T11:14:37.195Z Chapter #0:21: start 4526.772000, end 4937.724000
2023-03-22T11:14:37.195Z Metadata:
2023-03-22T11:14:37.195Z title : A Wedding Night
2023-03-22T11:14:37.195Z Chapter #0:22: start 4937.724000, end 5369.364000
2023-03-22T11:14:37.195Z Metadata:
2023-03-22T11:14:37.195Z title : Juliet Learns Her Options
2023-03-22T11:14:37.195Z
2023-03-22T11:14:37.195Z Chapter #0:23: start 5369.364000, end 5395.974000
2023-03-22T11:14:37.195Z Metadata:
2023-03-22T11:14:37.195Z title : Romeo Misses a Message in Mantua
2023-03-22T11:14:37.195Z Chapter #0:24: start 5395.974000, end 5553.631000
2023-03-22T11:14:37.195Z Metadata:
2023-03-22T11:14:37.195Z title : Juliet Takes Her Medicine
2023-03-22T11:14:37.195Z Chapter #0:25: start 5553.631000, end 6051.170000
2023-03-22T11:14:37.195Z Metadata:
2023-03-22T11:14:37.195Z title : Romeo Learns of His Beloved' Fate
2023-03-22T11:14:37.195Z Chapter #0:26: start 6051.170000, end 6691.101000
2023-03-22T11:14:37.195Z Metadata:
2023-03-22T11:14:37.195Z title : Together at Last
2023-03-22T11:14:37.195Z Chapter #0:27: start 6691.101000, end 6797.624000
2023-03-22T11:14:37.195Z Metadata:
2023-03-22T11:14:37.195Z title : Epilogue
2023-03-22T11:14:37.195Z Chapter #0:28: start 6797.624000, end 7210.579000
2023-03-22T11:14:37.195Z Metadata:
2023-03-22T11:14:37.195Z title : End Titles
2023-03-22T11:14:37.195Z
2023-03-22T11:14:37.195Z Stream #0:0: Video: h264 (High), yuv420p(tv, bt709, progressive), 1920x1080 [SAR 1:1 DAR 16:9], 23.98 fps, 23.98 tbr, 1k tbn (default)
2023-03-22T11:14:37.195Z Metadata:
2023-03-22T11:14:37.195Z BPS-eng : 5745334
2023-03-22T11:14:37.195Z DURATION-eng : 02:00:10.619708333
2023-03-22T11:14:37.195Z NUMBER_OF_FRAMES-eng: 172882
2023-03-22T11:14:37.195Z NUMBER_OF_BYTES-eng: 5178427078
2023-03-22T11:14:37.195Z _STATISTICS_WRITING_APP-eng: mkvpropedit v45.0.0 ('Heaven in Pennies') 64-bit
2023-03-22T11:14:37.195Z _STATISTICS_WRITING_DATE_UTC-eng: 2023-03-22 16:14:03
2023-03-22T11:14:37.195Z _STATISTICS_TAGS-eng:
2023-03-22T11:14:37.195Z BPS DURATION NUMBER_OF_FRAMES NUMBER_OF_BYTES
2023-03-22T11:14:37.195Z Stream #0:1(eng): Audio: ac3, 48000 Hz, 5.1(side), fltp, 448 kb/s (default)
2023-03-22T11:14:37.195Z Metadata:
2023-03-22T11:14:37.195Z title : Surround AC3 5.1
2023-03-22T11:14:37.195Z BPS-eng : 448000
2023-03-22T11:14:37.195Z DURATION-eng : 02:00:10.656000000
2023-03-22T11:14:37.195Z
2023-03-22T11:14:37.195Z NUMBER_OF_FRAMES-eng: 225333
2023-03-22T11:14:37.195Z NUMBER_OF_BYTES-eng: 403796736
2023-03-22T11:14:37.195Z _STATISTICS_WRITING_APP-eng: mkvpropedit v45.0.0 ('Heaven in Pennies') 64-bit
2023-03-22T11:14:37.195Z _STATISTICS_WRITING_DATE_UTC-eng: 2023-03-22 16:14:03
2023-03-22T11:14:37.195Z _STATISTICS_TAGS-eng: BPS DURATION NUMBER_OF_FRAMES NUMBER_OF_BYTES
2023-03-22T11:14:37.195Z Stream #0:2(eng)
2023-03-22T11:14:37.195Z : Subtitle: subrip
2023-03-22T11:14:37.195Z Metadata:
2023-03-22T11:14:37.195Z title : English SDH
2023-03-22T11:14:37.195Z BPS-eng : 50
2023-03-22T11:14:37.195Z DURATION-eng : 01:59:36.419000000
2023-03-22T11:14:37.195Z NUMBER_OF_FRAMES-eng: 1335
2023-03-22T11:14:37.195Z NUMBER_OF_BYTES-eng: 45501
2023-03-22T11:14:37.195Z _STATISTICS_WRITING_APP-eng: mkvpropedit v45.0.0 ('Heaven in Pennies') 64-bit
2023-03-22T11:14:37.195Z _STATISTICS_WRITING_DATE_UTC-eng: 2023-03-22 16:14:03
2023-03-22T11:14:37.195Z _STATISTICS_TAGS-eng: BPS DURATION NUMBER_OF_FRAMES NUMBER_OF_BYTES
2023-03-22T11:14:37.195Z
2023-03-22T11:14:37.195Z [11:14:33] scan: decoding previews for title 1
2023-03-22T11:14:37.195Z
2023-03-22T11:14:37.195Z [11:14:33] scan: audio 0x1: ac3, rate=48000Hz, bitrate=448000 English (AC3) (5.1 ch) (448 kbps)
2023-03-22T11:14:37.195Z
2023-03-22T11:14:37.195Z Scanning title 1 of 1, preview 10, 100.00 %
2023-03-22T11:14:37.195Z [11:14:34] scan: 10 previews, 1920x1080, 23.976 fps, autocrop = 140/140/0/0, aspect 16:9, PAR 1:1, color profile: 1-1-1, chroma location: left
2023-03-22T11:14:37.195Z [11:14:34] scan: supported video decoders: avcodec qsv
2023-03-22T11:14:37.195Z
2023-03-22T11:14:37.195Z [11:14:34] libhb: scan thread found 1 valid title(s)
2023-03-22T11:14:37.195Z
2023-03-22T11:14:37.195Z + title 1:
2023-03-22T11:14:37.195Z + stream: /mnt/media/Convert/Romeo + Juliet (1996) Bluray-1080p Proper.mkv
2023-03-22T11:14:37.195Z + duration: 02:00:10
2023-03-22T11:14:37.195Z + size: 1920x1080, pixel aspect: 1/1, display aspect: 1.78, 23.976 fps
2023-03-22T11:14:37.195Z + autocrop: 140/140/0/0
2023-03-22T11:14:37.195Z + chapters:
2023-03-22T11:14:37.195Z + 1: duration 00:09:37
2023-03-22T11:14:37.195Z + 2: duration 00:04:00
2023-03-22T11:14:37.195Z + 3: duration 00:00:51
2023-03-22T11:14:37.195Z + 4: duration 00:01:30
2023-03-22T11:14:37.195Z + 5: duration 00:03:07
2023-03-22T11:14:37.195Z + 6: duration 00:05:00
2023-03-22T11:14:37.195Z + 7: duration 00:01:26
2023-03-22T11:14:37.195Z + 8: duration 00:02:26
2023-03-22T11:14:37.195Z + 9: duration 00:00:49
2023-03-22T11:14:37.195Z + 10: duration 00:01:23
2023-03-22T11:14:37.195Z + 11: duration 00:02:11
2023-03-22T11:14:37.195Z + 12: duration 00:01:01
2023-03-22T11:14:37.195Z + 13: duration 00:01:45
2023-03-22T11:14:37.195Z + 14: duration 00:10:02
2023-03-22T11:14:37.195Z + 15: duration 00:04:38
2023-03-22T11:14:37.195Z + 16: duration 00:01:49
2023-03-22T11:14:37.195Z + 17: duration 00:02:13
2023-03-22T11:14:37.195Z + 18: duration 00:02:09
2023-03-22T11:14:37.195Z + 19: duration 00:01:34
2023-03-22T11:14:37.195Z + 20: duration 00:10:19
2023-03-22T11:14:37.195Z + 21: duration 00:07:35
2023-03-22T11:14:37.195Z + 22: duration 00:06:51
2023-03-22T11:14:37.195Z + 23: duration 00:07:12
2023-03-22T11:14:37.195Z + 24: duration 00:00:27
2023-03-22T11:14:37.195Z + 25: duration 00:02:38
2023-03-22T11:14:37.195Z
2023-03-22T11:14:37.195Z + 26: duration 00:08:18
2023-03-22T11:14:37.195Z + 27: duration 00:10:40
2023-03-22T11:14:37.195Z + 28: duration 00:01:47
2023-03-22T11:14:37.195Z + 29: duration 00:06:53
2023-03-22T11:14:37.195Z + audio tracks:
2023-03-22T11:14:37.195Z + 1, English (AC3) (5.1 ch) (448 kbps) (iso639-2: eng), 48000Hz, 448000bps
2023-03-22T11:14:37.195Z + subtitle tracks:
2023-03-22T11:14:37.195Z + 1, English [UTF-8]
2023-03-22T11:14:37.195Z
2023-03-22T11:14:37.195Z HandBrake has exited.
2023-03-22T11:14:37.195Z
2023-03-22T11:14:37.195Z libdvdread: Encrypted DVD support unavailable.
2023-03-22T11:14:37.195Z libdvdread: Can't open file VIDEO_TS.IFO.
2023-03-22T11:14:37.195Z
2023-03-22T11:14:37.195Z
2023-03-22T11:14:37.195Z

 

Link to comment
  • 1 month later...

Anyone know why my Tdarr and Tdarr_node containers would be multiple gigs each?

 

Name                                              Container                Writable                 Log
tdarr                                             5.58 GB                  1.50 GB                  11.7 MB
tdarr_node                                        3.73 GB                  531 MB                   11.5 MB

 

This is from docker -> "Container Size"

I got a warning after installing and using Tdarr for a bit that my docker vdisk was almost full

 

*edit* I verified that my transcode cache for all of my libraries is properly mapped to a `transcode_cache` share on my cache drive, not in the image

Edited by veri745
additional info
Link to comment

I have a simple Tdarr use case: cycle through my movies and TV shows and convert everything possible to HEVC x265 to save me some disk space. I'm running the Tdarr server in Docker on my Unraid machine. I have three nodes I can use:

 

1. The Quadro P1000 in my Unraid server, using NVENC.
2. A remote Windows micro PC with a 9th-generation Intel iGPU, using QSV.
3. A remote M1 MacBook Air, using VideoToolbox.

 

All these nodes are configured on the options page with the GPU type and for the GPU to take CPU tasks.

I first set it up to only use the Windows and Mac machines with the Boosh QSV/VideoToolbox plugin, and it worked perfectly, if a little slowly, churning through my movies.

 

When I then tried to add the Quadro P1000 node and added the (standard?) Migz NVENC GPU plugin, I hit issues. Nothing would transcode on the Mac; when I looked in the logs, every transcode was failing when it came to the NVENC plugin and was passing the transcode job to the P1000 node. The QSV node was working fine, picking up transcodes and completing them successfully, and not displaying this behaviour.

 

I then tried putting the Boosh plugin before the NVENC plugin in the stack, and I ended up with files being transcoded twice: first successfully by the Mac, then by the P1000 once it hit the Migz plugin. Again the QSV node was working fine; I just couldn't get this configuration to work. Has anyone else? Was I configuring it wrong in some way?

 

I'm asking because I ended up paying for a month of Pro and currently have the P1000 churning through my Movies library and the PC/Mac working through the TV Shows library. It's working fine, and I should have all the movies done by the end of the week (hopefully), but then I would like to have all three nodes working on the TV Shows.

 

Any thoughts? Am I missing something here? I would have thought the plugin logic would skip the NVENC plugin when running on the Mac and complete the stack. Is it possible to add an IF statement to the stack in some way, so I can skip the NVENC plugin?

Link to comment
On 4/27/2023 at 9:36 AM, veri745 said:

Anyone know why my Tdarr and Tdarr_node containers would be multiple gigs each?

 

Name                                              Container                Writable                 Log
tdarr                                             5.58 GB                  1.50 GB                  11.7 MB
tdarr_node                                        3.73 GB                  531 MB                   11.5 MB

 

This is from docker -> "Container Size"

I got a warning after installing and using Tdarr for a bit that my docker vdisk was almost full

 

*edit* I verified that my transcode cache for all of my libraries is properly mapped to a `transcode_cache` share on my cache drive, not in the image

 

Hi, did you ever find a solution to tdarr_node doing all its work at a path inside the container (/app/tdarr_node)? It's completely ignoring the mapping I set to "/temp" on a spare SSD. So, if I let it run for a little while it completely fills the Unraid Docker image. Thanks in advance.

Edited by guy.davis
Link to comment
6 hours ago, guy.davis said:

 

Hi, did you ever find a solution to tdarr_node doing all its work at a path inside the container (/app/tdarr_node)? It's completely ignoring the mapping I set to "/temp" on a spare SSD. So, if I let it run for a little while it completely fills the Unraid Docker image. Thanks in advance.

Never had any issue.  I have a dedicated share that I mapped to /temp in the docker config and it works fine.

 

All the space in the image size for me appears to be actual assets, in /usr/lib as well as /app/Tdarr_Server and /app/Tdarr_Node.

 

I guess it's just big
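
(If anyone wants to see where the space actually goes in their own install, a read-only check along these lines works; the container names are taken from the table above and may differ on your system:)

# Rough breakdown of disk usage inside each container, largest directories last.
docker exec tdarr du -xh -d 1 / 2>/dev/null | sort -h | tail -n 15
docker exec tdarr_node du -xh -d 1 / 2>/dev/null | sort -h | tail -n 15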

Edited by veri745
more info
Link to comment
  • 2 weeks later...
On 7/18/2022 at 11:09 PM, FlyAg said:

Hello! I'm looking for some explanations on why my 1050Ti Node is always faster at transcoding than my 1660Ti node, which is running on the same host as the server. Has anyone experienced this before? The 1050Ti is on a windows node and the 1660Ti is on Unraid, with Tdarr server running on Unraid docker. If there are some optimal configs someone can share, I'd love that. I went through several guides on setting everything up. I have read that maybe enabling Bframes can be faster on the 1660Ti but that is not supported on the 1050Ti I hear so I can't check that box as the 1050Ti tasks will fail. Thanks in advance!

[Attached screenshots: Tdarr1.png to Tdarr5.png]

I would guess that this is because the 1050 Ti is in PCI bus 1, device 0 and is probably using 8 of the 16 PCIe lanes, whereas your 1660 Ti is in PCI bus 1, device 1 and is only using 4 of the 16 lanes. You will want to check the documentation for your mobo. Most AMD boards provide 16 lanes for one GPU and, if you add a second, split them 8/4.

Link to comment
7 minutes ago, CarpNCod said:

I would guess that this is because the 1050 Ti is in PCI bus 1, device 0 and is probably using 8 of the 16 PCIe lanes, whereas your 1660 Ti is in PCI bus 1, device 1 and is only using 4 of the 16 lanes. You will want to check the documentation for your mobo. Most AMD boards provide 16 lanes for one GPU and, if you add a second, split them 8/4.

Scratch that. I just re-read the initial post and saw these GPUs are on separate nodes. The 1660 Ti does indicate it is only getting 4 of 16 lanes though, which is why I assumed (incorrectly) they were on the same node.
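
(For anyone who wants to confirm the link width rather than guess, nvidia-smi can report it directly; a small sketch to run on each node:)

# Current vs. maximum PCIe link width for every NVIDIA GPU on this node.
nvidia-smi --query-gpu=name,pcie.link.width.current,pcie.link.width.max --format=csv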

Link to comment
  • 1 month later...

I'm having an issue where all transcodes are failing with an error "specified rc mode is deprecated". I saw a recent issue on GitHub which appeared to be related and suggested reverting to 2.00.20.1 but that didn't help for me.

 

Tdarr transcode log:

2023-07-20T19:36:41.570Z k5BIAOw9jC:Node[node_2]:Worker[elated-elk]:[Step W05] [C1] Launching subworker
2023-07-20T19:36:41.571Z k5BIAOw9jC:Node[node_2]:Worker[elated-elk]:Preparing to launch subworker
2023-07-20T19:36:41.574Z k5BIAOw9jC:Node[node_2]:Worker[elated-elk]:Subworker launched
2023-07-20T19:36:41.576Z k5BIAOw9jC:Node[node_2]:Worker[elated-elk]:[1/3] Sending command to subworker
2023-07-20T19:36:41.577Z k5BIAOw9jC:Node[node_2]:Worker[elated-elk]:[2/3] tdarr-ffmpeg -c:v h264_cuvid -i /mnt/media/XBMC/Torrents/All Quiet on the Western Front (2022)/All Quiet on the Western Front (2022).mkv -map 0 -c:v hevc_nvenc -rc:v vbr_hq -cq:v 19 -b:v 5862k -minrate 4103k -maxrate 7620k -bufsize 11724k -spatial_aq:v 1 -rc-lookahead:v 32 -c:a copy -c:s copy -max_muxing_queue_size 9999 -map -0:d -pix_fmt p010le -bf 5 /temp/All Quiet on the Western Front (2022)-TdarrCacheFile-4ZI7rByUE.mkv
2023-07-20T19:36:41.579Z k5BIAOw9jC:Node[node_2]:Worker[elated-elk]:[3/3] Command sent
2023-07-20T19:36:41.581Z k5BIAOw9jC:Node[node_2]:Worker[elated-elk]:To see live CLI output, enable 'Log full FFmpeg/HandBrake output' in the staging section on the Tdarr tab before the job starts. Note this could increase the job report size substantially.
2023-07-20T19:36:41.583Z k5BIAOw9jC:Node[node_2]:Worker[elated-elk]:Subworker:Online
2023-07-20T19:36:41.585Z k5BIAOw9jC:Node[node_2]:Worker[elated-elk]:Subworker:Receiving transcode settings
2023-07-20T19:36:41.586Z k5BIAOw9jC:Node[node_2]:Worker[elated-elk]:Subworker:Running CLI
2023-07-20T19:36:42.595Z k5BIAOw9jC:Node[node_2]:Worker[elated-elk]:Subworker:a.Thread closed, code: 1
2023-07-20T19:36:42.596Z k5BIAOw9jC:Node[node_2]:Worker[elated-elk]:Subworker exit approved, killing subworker
2023-07-20T19:36:42.597Z k5BIAOw9jC:Node[node_2]:Worker[elated-elk]:Subworker killed
2023-07-20T19:36:42.598Z k5BIAOw9jC:Node[node_2]:Worker[elated-elk]:b.Thread closed, code: 1
2023-07-20T19:36:42.599Z k5BIAOw9jC:Node[node_2]:Worker[elated-elk]:CLI code: 1
2023-07-20T19:36:42.600Z k5BIAOw9jC:Node[node_2]:Worker[elated-elk]:Last 200 lines of CLI log:
2023-07-20T19:36:42.601Z k5BIAOw9jC:Node[node_2]:Worker[elated-elk]:ffmpeg version 5.1.2-Jellyfin Copyright (c) 2000-2022 the FFmpeg developers
2023-07-20T19:36:42.601Z built with gcc 9 (Ubuntu 9.4.0-1ubuntu1~20.04.1)
2023-07-20T19:36:42.601Z configuration: --prefix=/usr/lib/jellyfin-ffmpeg --target-os=linux --extra-libs=-lfftw3f --extra-version=Jellyfin --disable-doc --disable-ffplay --disable-ptx-compression --disable-shared --disable-libxcb --disable-sdl2 --disable-xlib --enable-lto --enable-gpl --enable-version3 --enable-static --enable-gmp --enable-gnutls --enable-chromaprint --enable-libdrm --enable-libass --enable-libfreetype --enable-libfribidi --enable-libfontconfig --enable-libbluray --enable-libmp3lame --enable-libopus --enable-libtheora --enable-libvorbis --enable-libdav1d --enable-libwebp --enable-libvpx --enable-libx264 --enable-libx265 --enable-libzvbi --enable-libzimg --enable-libfdk-aac --arch=amd64 --enable-libsvtav1 --enable-libshaderc --enable-libplacebo --enable-vulkan --enable-opencl --enable-vaapi --enable-amf --enable-libmfx --enable-ffnvcodec --enable-cuda --enable-cuda-llvm --enable-cuvid --enable-nvdec --enable-nvenc
2023-07-20T19:36:42.601Z libavutil 57. 28.100 / 57. 28.100
2023-07-20T19:36:42.601Z libavcodec 59. 37.100 / 59. 37.100
2023-07-20T19:36:42.601Z libavformat 59. 27.100 / 59. 27.100
2023-07-20T19:36:42.601Z libavdevice 59. 7.100 / 59. 7.100
2023-07-20T19:36:42.601Z libavfilter 8. 44.100 / 8. 44.100
2023-07-20T19:36:42.601Z libswscale 6. 7.100 / 6. 7.100
2023-07-20T19:36:42.601Z libswresample 4. 7.100 / 4. 7.100
2023-07-20T19:36:42.601Z libpostproc 56. 6.100 / 56. 6.100
2023-07-20T19:36:42.601Z
2023-07-20T19:36:42.601Z Input #0, matroska,webm, from '/mnt/media/XBMC/Torrents/All Quiet on the Western Front (2022)/All Quiet on the Western Front (2022).mkv':
2023-07-20T19:36:42.601Z Metadata:
2023-07-20T19:36:42.601Z encoder : libebml v1.3.6 + libmatroska v1.4.9
2023-07-20T19:36:42.601Z creation_time : 2023-03-21T10:13:59.000000Z
2023-07-20T19:36:42.601Z Duration: 02:26:57.82, start: 0.000000, bitrate: 12294 kb/s
2023-07-20T19:36:42.601Z Chapters:
2023-07-20T19:36:42.601Z Chapter #0:0: start 0.000000,
2023-07-20T19:36:42.601Z end 705.208333
2023-07-20T19:36:42.601Z Metadata:
2023-07-20T19:36:42.601Z title : Chapter 01
2023-07-20T19:36:42.601Z Chapter #0:1: start 705.208333, end 1444.166667
2023-07-20T19:36:42.601Z Metadata:
2023-07-20T19:36:42.601Z title : Chapter 02
2023-07-20T19:36:42.601Z Chapter #0:2: start 1444.166667, end 2188.583333
2023-07-20T19:36:42.601Z Metadata:
2023-07-20T19:36:42.601Z title : Chapter 03
2023-07-20T19:36:42.601Z Chapter #0:3: start 2188.583333, end 2989.833333
2023-07-20T19:36:42.601Z Metadata:
2023-07-20T19:36:42.601Z title : Chapter 04
2023-07-20T19:36:42.601Z Chapter #0:4: start 2989.833333, end 3641.541667
2023-07-20T19:36:42.601Z Metadata:
2023-07-20T19:36:42.601Z title : Chapter 05
2023-07-20T19:36:42.601Z Chapter #0:5: start 3641.541667, end 4030.166667
2023-07-20T19:36:42.601Z Metadata:
2023-07-20T19:36:42.601Z title : Chapter 06
2023-07-20T19:36:42.601Z Chapter #0:6: start 4030.166667, end 4883.541667
2023-07-20T19:36:42.601Z Metadata:
2023-07-20T19:36:42.601Z title : Chapter 07
2023-07-20T19:36:42.601Z Chapter #0:7: start 4883.541667, end 5769.166667
2023-07-20T19:36:42.601Z Metadata:
2023-07-20T19:36:42.601Z title : Chapter 08
2023-07-20T19:36:42.601Z Chapter #0:8: start 5769.166667, end 6444.375000
2023-07-20T19:36:42.601Z Metadata:
2023-07-20T19:36:42.601Z title : Chapter 09
2023-07-20T19:36:42.601Z Chapter #0:9: start 6444.375000, end 7231.916667
2023-07-20T19:36:42.601Z Metadata:
2023-07-20T19:36:42.601Z title : Chapter 10
2023-07-20T19:36:42.601Z Chapter #0:10: start 7231.916667, end 8055.833333
2023-07-20T19:36:42.601Z Metadata:
2023-07-20T19:36:42.601Z title : Chapter 11
2023-07-20T19:36:42.601Z Chapter #0:11: start 8055.833333, end 8817.792000
2023-07-20T19:36:42.601Z Metadata:
2023-07-20T19:36:42.601Z title : Chapter 12
2023-07-20T19:36:42.601Z Stream #0:0: Video: h264 (High), yuv420p(tv, bt709, progressive), 1920x804, SAR 1:1 DAR 160:67, 23.81 fps, 23.81 tbr, 1k tbn (default)
2023-07-20T19:36:42.601Z Metadata:
2023-07-20T19:36:42.601Z BPS-eng : 10562858
2023-07-20T19:36:42.601Z DURATION-eng : 02:26:57.792000000
2023-07-20T19:36:42.601Z NUMBER_OF_FRAMES-eng: 211627
2023-07-20T19:36:42.601Z NUMBER_OF_BYTES-eng: 11642636284
2023-07-20T19:36:42.601Z _STATISTICS_WRITING_APP-eng: mkvmerge v25.0.0 ('Prog Noir') 64-bit
2023-07-20T19:36:42.601Z _STATISTICS_WRITING_DATE_UTC-eng: 2023-03-21 10:13:59
2023-07-20T19:36:42.601Z _STATISTICS_TAGS-eng: BPS DURATION NUMBER_OF_FRAMES NUMBER_OF_BYTES
2023-07-20T19:36:42.601Z Stream #0:1(ita): Audio: ac3, 48000 Hz, 5.1(side), fltp, 448 kb/s
2023-07-20T19:36:42.601Z (default) (forced)
2023-07-20T19:36:42.601Z Metadata:
2023-07-20T19:36:42.601Z title : AC3 iTA 5.1
2023-07-20T19:36:42.601Z BPS-eng : 448000
2023-07-20T19:36:42.601Z DURATION-eng : 02:26:57.792000000
2023-07-20T19:36:42.601Z NUMBER_OF_FRAMES-eng: 275556
2023-07-20T19:36:42.601Z NUMBER_OF_BYTES-eng: 493796352
2023-07-20T19:36:42.601Z _STATISTICS_WRITING_APP-eng: mkvmerge v25.0.0 ('Prog Noir') 64-bit
2023-07-20T19:36:42.601Z _STATISTICS_WRITING_DATE_UTC-eng: 2023-03-21 10:13:59
2023-07-20T19:36:42.601Z _STATISTICS_TAGS-eng: BPS DURATION NUMBER_OF_FRAMES NUMBER_OF_BYTES
2023-07-20T19:36:42.601Z Stream #0:2(ger): Audio: ac3, 48000 Hz, 5.1(side), fltp, 640 kb/s
2023-07-20T19:36:42.601Z Metadata:
2023-07-20T19:36:42.601Z title : AC3 GER
2023-07-20T19:36:42.601Z BPS-eng : 640000
2023-07-20T19:36:42.601Z DURATION-eng : 02:26:57.792000000
2023-07-20T19:36:42.601Z NUMBER_OF_FRAMES-eng: 275556
2023-07-20T19:36:42.601Z NUMBER_OF_BYTES-eng: 705423360
2023-07-20T19:36:42.601Z _STATISTICS_WRITING_APP-eng: mkvmerge v25.0.0 ('Prog Noir') 64-bit
2023-07-20T19:36:42.601Z _STATISTICS_WRITING_DATE_UTC-eng: 2023-03-21 10:13:59
2023-07-20T19:36:42.601Z _STATISTICS_TAGS-eng: BPS DURATION NUMBER_OF_FRAMES NUMBER_OF_BYTES
2023-07-20T19:36:42.601Z Stream #0:3(eng): Audio: ac3, 48000 Hz, 5.1(side), fltp, 640 kb/s
2023-07-20T19:36:42.601Z Metadata:
2023-07-20T19:36:42.601Z title : AC3 ENG
2023-07-20T19:36:42.601Z BPS-eng : 640000
2023-07-20T19:36:42.601Z
2023-07-20T19:36:42.601Z DURATION-eng : 02:26:57.824000000
2023-07-20T19:36:42.601Z NUMBER_OF_FRAMES-eng: 275557
2023-07-20T19:36:42.601Z NUMBER_OF_BYTES-eng: 705425920
2023-07-20T19:36:42.601Z _STATISTICS_WRITING_APP-eng: mkvmerge v25.0.0 ('Prog Noir') 64-bit
2023-07-20T19:36:42.601Z _STATISTICS_WRITING_DATE_UTC-eng: 2023-03-21 10:13:59
2023-07-20T19:36:42.601Z _STATISTICS_TAGS-eng: BPS DURATION NUMBER_OF_FRAMES NUMBER_OF_BYTES
2023-07-20T19:36:42.601Z Stream #0:4(ita): Subtitle: subrip (forced)
2023-07-20T19:36:42.601Z Metadata:
2023-07-20T19:36:42.601Z title : FORCED iTA
2023-07-20T19:36:42.601Z BPS-eng : 1
2023-07-20T19:36:42.601Z DURATION-eng : 02:21:53.125000000
2023-07-20T19:36:42.601Z NUMBER_OF_FRAMES-eng: 78
2023-07-20T19:36:42.601Z NUMBER_OF_BYTES-eng: 1910
2023-07-20T19:36:42.601Z _STATISTICS_WRITING_APP-eng: mkvmerge v25.0.0 ('Prog Noir') 64-bit
2023-07-20T19:36:42.601Z _STATISTICS_WRITING_DATE_UTC-eng: 2023-03-21 10:13:59
2023-07-20T19:36:42.601Z _STATISTICS_TAGS-eng: BPS DURATION NUMBER_OF_FRAMES NUMBER_OF_BYTES
2023-07-20T19:36:42.601Z Stream #0:5(ita): Subtitle: subrip
2023-07-20T19:36:42.601Z
2023-07-20T19:36:42.601Z Metadata:
2023-07-20T19:36:42.601Z title : REGULAR iTA
2023-07-20T19:36:42.601Z BPS-eng : 23
2023-07-20T19:36:42.601Z DURATION-eng : 02:23:53.417000000
2023-07-20T19:36:42.601Z NUMBER_OF_FRAMES-eng: 1033
2023-07-20T19:36:42.601Z NUMBER_OF_BYTES-eng: 25898
2023-07-20T19:36:42.601Z _STATISTICS_WRITING_APP-eng: mkvmerge v25.0.0 ('Prog Noir') 64-bit
2023-07-20T19:36:42.601Z _STATISTICS_WRITING_DATE_UTC-eng: 2023-03-21 10:13:59
2023-07-20T19:36:42.601Z _STATISTICS_TAGS-eng: BPS DURATION NUMBER_OF_FRAMES NUMBER_OF_BYTES
2023-07-20T19:36:42.601Z Stream #0:6(eng): Subtitle: subrip (forced)
2023-07-20T19:36:42.601Z Metadata:
2023-07-20T19:36:42.601Z title : FORCED ENG
2023-07-20T19:36:42.601Z BPS-eng : 2
2023-07-20T19:36:42.601Z DURATION-eng : 02:15:21.042000000
2023-07-20T19:36:42.601Z NUMBER_OF_FRAMES-eng: 92
2023-07-20T19:36:42.601Z NUMBER_OF_BYTES-eng: 2254
2023-07-20T19:36:42.601Z _STATISTICS_WRITING_APP-eng: mkvmerge v25.0.0 ('Prog Noir') 64-bit
2023-07-20T19:36:42.601Z _STATISTICS_WRITING_DATE_UTC-eng: 2023-03-21 10:13:59
2023-07-20T19:36:42.601Z
2023-07-20T19:36:42.601Z _STATISTICS_TAGS-eng: BPS DURATION NUMBER_OF_FRAMES NUMBER_OF_BYTES
2023-07-20T19:36:42.601Z Stream #0:7(eng): Subtitle: subrip
2023-07-20T19:36:42.601Z Metadata:
2023-07-20T19:36:42.601Z title : NON UDENTI ENG
2023-07-20T19:36:42.601Z BPS-eng : 38
2023-07-20T19:36:42.601Z DURATION-eng : 02:21:28.708000000
2023-07-20T19:36:42.601Z NUMBER_OF_FRAMES-eng: 1461
2023-07-20T19:36:42.601Z NUMBER_OF_BYTES-eng: 40787
2023-07-20T19:36:42.601Z _STATISTICS_WRITING_APP-eng: mkvmerge v25.0.0 ('Prog Noir') 64-bit
2023-07-20T19:36:42.601Z _STATISTICS_WRITING_DATE_UTC-eng: 2023-03-21 10:13:59
2023-07-20T19:36:42.601Z _STATISTICS_TAGS-eng: BPS DURATION NUMBER_OF_FRAMES NUMBER_OF_BYTES
2023-07-20T19:36:42.601Z
2023-07-20T19:36:42.601Z Stream mapping:
2023-07-20T19:36:42.601Z Stream #0:0 -> #0:0 (h264 (h264_cuvid) -> hevc (hevc_nvenc))
2023-07-20T19:36:42.601Z Stream #0:1 -> #0:1 (copy)
2023-07-20T19:36:42.601Z Stream #0:2 -> #0:2 (copy)
2023-07-20T19:36:42.601Z Stream #0:3 -> #0:3 (copy)
2023-07-20T19:36:42.601Z Stream #0:4 -> #0:4 (copy)
2023-07-20T19:36:42.601Z Stream #0:5 -> #0:5 (copy)
2023-07-20T19:36:42.601Z Stream #0:6 -> #0:6 (copy)
2023-07-20T19:36:42.601Z Stream #0:7 -> #0:7 (copy)
2023-07-20T19:36:42.601Z Press [q] to stop, [?] for help
2023-07-20T19:36:42.601Z
2023-07-20T19:36:42.601Z [hevc_nvenc @ 0x55c5cee72140] Specified rc mode is deprecated.
2023-07-20T19:36:42.601Z [hevc_nvenc @ 0x55c5cee72140] Use -rc constqp/cbr/vbr, -tune and -multipass instead.
2023-07-20T19:36:42.601Z
2023-07-20T19:36:42.601Z [hevc_nvenc @ 0x55c5cee72140] InitializeEncoder failed: invalid param (8): Preset P1 to P7 not supported with older 2 Pass RC Modes(CBR_HQ, VBR_HQ) and cbr lowdelayEnable NV_ENC_RC_PARAMS::multiPass flag for two pass encoding and set
2023-07-20T19:36:42.601Z
2023-07-20T19:36:42.601Z Error initializing output stream 0:0 -- Error while opening encoder for output stream #0:0 - maybe incorrect parameters such as bit_rate, rate, width or height
2023-07-20T19:36:42.601Z
2023-07-20T19:36:42.601Z Conversion failed!
2023-07-20T19:36:42.601Z
2023-07-20T19:36:42.601Z
2023-07-20T19:36:42.601Z k5BIAOw9jC:Node[node_2]:Worker[elated-elk]:[-error-]
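
(Not an answer from this thread, but going by the hint in the log itself, "Use -rc constqp/cbr/vbr, -tune and -multipass instead", the failing command would translate to something like the sketch below. Paths are shortened to placeholders, and the right place to change this is whichever plugin builds the command, not the command itself.)

# Hedged sketch: same arguments as the failing command, with the deprecated
# vbr_hq rate control replaced by -rc vbr plus -tune/-multipass. Untested here.
tdarr-ffmpeg -c:v h264_cuvid -i "input.mkv" -map 0 \
  -c:v hevc_nvenc -rc:v vbr -tune:v hq -multipass:v fullres -cq:v 19 \
  -b:v 5862k -minrate 4103k -maxrate 7620k -bufsize 11724k \
  -spatial_aq:v 1 -rc-lookahead:v 32 \
  -c:a copy -c:s copy -max_muxing_queue_size 9999 -map -0:d \
  -pix_fmt p010le -bf 5 "output.mkv"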

 

Link to comment
