Tdarr Errors on Larger Files


I'm running Tdarr in an Unraid Docker, after following the recent Spaceinvaderone guide.

I have a large library and am experiencing problems transcoding large files (file sizes ranging from 4 GB to 50 GB). I keep receiving errors such as:

[hevc_nvenc @ 0x55b3f455ef00] OpenEncodeSessionEx failed: out of memory (10)

[hevc_nvenc @ 0x55b3f455ef00] No NVENC capable devices found

Error initializing output stream 0:0 -- Error while opening encoder for output stream #0:0 - maybe incorrect parameters such as bit_rate, rate, width or height

Conversion failed!


I also receive an error stating that my /temp directory is out of space when trying to write the file. My /temp share is on a 1 TB SSD, I have 32 GB of memory, and according to the Unraid dashboard RAM usage never spikes above 70% at most. Tdarr is successfully transcoding files under 4 GB, but I can't find the bottleneck. I tested with only one worker as well as up to 5 running simultaneously; fairly consistently the smaller files work, while the larger ones get to 70-80% completion before going into the Error/Cancelled category. Some files fail instantly. Any help would be appreciated.
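One thing worth checking for that "out of memory (10)" NVENC error: consumer GeForce cards limit the number of concurrent NVENC encode sessions (typically 3 at the time of writing; Quadro/Tesla cards allow more), and exceeding the cap fails with exactly that message. A minimal sketch of the check, assuming 5 simultaneous GPU workers as described above (the variable names and cap value are illustrative):

```shell
# Assumed values: WORKERS matches your Tdarr GPU worker count,
# NVENC_SESSION_CAP is the typical GeForce driver limit.
WORKERS=5
NVENC_SESSION_CAP=3

if [ "$WORKERS" -gt "$NVENC_SESSION_CAP" ]; then
  MSG="reduce GPU workers to $NVENC_SESSION_CAP or fewer"
else
  MSG="worker count within session cap"
fi
echo "$MSG"
```

If the error only appears when several large (slow) transcodes overlap, dropping the GPU worker count below the cap is a quick way to rule this out.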

Screen Shot 2021-10-16 at 12.12.42 PM.png


It sounds like your Tdarr docker is writing to the Unraid cache/RAM instead of your SSD. I would check that the “Transcode Cache” path in the Tdarr server and node docker settings is the same, and that it points at a share that is set to use your cache (share set to ‘Yes’ or ‘Prefer’).


The ‘/temp’ setting for transcode cache in the Tdarr GUI will point to this path in the docker settings, so double check that it’s set to the SSD-backed cache path and not an array path on the server.
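For illustration, the mapping in the docker settings would look something like this (the host path here is an assumption; substitute whatever cache-backed share you actually use):

```
# Config sketch only -- host path is hypothetical, adjust to your setup.
# Map the container's /temp transcode cache to an SSD/cache-backed share:
docker run -d --name tdarr \
  -v /mnt/cache/tdarr_transcode:/temp \
  # ... other Tdarr options unchanged ...
```

The key point is that the host side of the `/temp` mapping must live on the SSD cache, not on an array share, or large intermediate files will fill the wrong disk.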

Link to comment

You were correct that my Transcode Cache path was set incorrectly, so thank you for that. After the change, I re-queued the failed transcodes, but unfortunately many of the same files are failing with the same error. I've attached a screenshot of the plugins I'm using, and I will note that the transcodes are going much faster than they were previously, which is of course great. Many of the failed transcodes are Remux Bluray 1080p files; is it possible one of these plugins is having difficulty parsing some of the data? I've attached a second screenshot with a sample of the information tab for a failed transcode.
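If the failures cluster on Blu-ray remuxes, it may help to inspect the video stream of a failing file before transcoding; remuxes sometimes carry properties a plugin or encoder preset doesn't expect (for example a 10-bit pixel format like yuv420p10le). A hedged diagnostic sketch using ffprobe (the file name `input.mkv` is just a placeholder):

```shell
# Hypothetical diagnostic: pass the path of a failing transcode as $1.
FILE="${1:-input.mkv}"

if [ ! -f "$FILE" ]; then
  echo "no such file: $FILE (pass a path to a failing transcode)"
elif command -v ffprobe >/dev/null 2>&1; then
  # Print the first video stream's codec, pixel format, and resolution.
  ffprobe -v error -select_streams v:0 \
    -show_entries stream=codec_name,pix_fmt,width,height \
    -of default=noprint_wrappers=1 "$FILE"
else
  echo "ffprobe not found; install ffmpeg to run this check"
fi
```

Comparing the output of a file that transcodes fine against one that fails could narrow down whether a specific stream property is tripping the plugin.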

Screen Shot 2021-10-16 at 12.33.15 PM.png


Screen Shot 2021-10-16 at 12.52.39 PM.png

Edited by jackfalveyiv
