Maticks Posted July 1, 2023

I have tried 6.11.5 and 6.12.2; both have the same issue. The Nvidia container runtime is flooding the Plex Server Docker log.json file until it crashes the system after 28+ hours. Setting parameters in the Plex Docker template didn't get rid of the message below, but I have put together a workaround for the time being until, I guess, an update comes out. If someone knows how to fix this properly, then even better.

The following pair of entries is written roughly every five seconds (only the first occurrence at 08:49:53 is shown; the repeats at 08:49:59, 08:50:04, and so on differ only in timestamp):

{"level":"info","msg":"Running with config:\n{\n \"AcceptEnvvarUnprivileged\": true,\n \"NVIDIAContainerCLIConfig\": {\n \"Root\": \"\"\n },\n \"NVIDIACTKConfig\": {\n \"Path\": \"nvidia-ctk\"\n },\n \"NVIDIAContainerRuntimeConfig\": {\n \"DebugFilePath\": \"/dev/null\",\n \"LogLevel\": \"info\",\n \"Runtimes\": [\n \"docker-runc\",\n \"runc\"\n ],\n \"Mode\": \"auto\",\n \"Modes\": {\n \"CSV\": {\n \"MountSpecPath\": \"/etc/nvidia-container-runtime/host-files-for-container.d\"\n },\n \"CDI\": {\n \"SpecDirs\": null,\n \"DefaultKind\": \"nvidia.com/gpu\",\n \"AnnotationPrefixes\": [\n \"cdi.k8s.io/\"\n ]\n }\n }\n },\n \"NVIDIAContainerRuntimeHookConfig\": {\n \"SkipModeDetection\": false\n }\n}","time":"2023-07-01T08:49:53+10:00"}
{"level":"info","msg":"Using low-level runtime /usr/bin/runc","time":"2023-07-01T08:49:53+10:00"}

A User Script running daily is keeping it under control for me.
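To confirm which container is producing the runtime spam before wiping anything, you can count the "Running with config" entries per log.json. A minimal sketch, demoed on a temporary directory so it is safe to run anywhere; on a live Unraid box you would point ROOT at /run/docker/containerd/ instead (that substitution is an assumption, matching the paths in the output below):

```shell
#!/bin/bash
# Sketch: count runtime config-dump entries per containerd log.json.
# Demoed on a temp directory with one sample entry; substitute
# ROOT=/run/docker/containerd/ on a real system (assumption).
ROOT=$(mktemp -d)
mkdir -p "$ROOT/moby/abc123"
printf '%s\n' '{"level":"info","msg":"Running with config: ..."}' > "$ROOT/moby/abc123/log.json"

# grep -Hc prints each file name with its match count; a large count
# identifies the container whose log is being flooded.
out=$(find "$ROOT" -name 'log.json' -exec grep -Hc 'Running with config' {} \;)
echo "$out"
rm -rf "$ROOT"
```

The count, rather than the file size alone, distinguishes this runtime spam from a container that is simply chatty for other reasons.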
#!/bin/bash
set -e  # Enable error checking

echo ""
echo "<font color='red'><b>Before:</b></font>"
echo "====================================================================================================================================================================================="
du -ah /run/docker/containerd/ | grep -v "/$" | sort -rh | head -60 | grep log.json
echo "====================================================================================================================================================================================="
echo "Cleaning Logs:"
logs=$(find /run/docker/containerd/ -name 'log.json')
for log in $logs; do
    if [[ -f "$log" ]]; then  # Check if the file exists
        echo "Cleaning $log"
        cat /dev/null > "$log"
        echo "Cleanup complete for $log"
    else
        echo "File not found: $log"
    fi
done
sleep 6
echo "...<font color='blue'>cleaning complete!</font>"
echo ""
echo "<font color='green'><b>After:</b></font>"
echo "====================================================================================================================================================================================="
du -ah /run/docker/containerd/ | grep -v "/$" | sort -rh | head -60 | grep log.json
echo ""

The script running:
Script location: /tmp/user.scripts/tmpScripts/Clean Docker Logs/script
Note that closing this window will abort the execution of this script

Before:
=====================================================================================================================================================================================
6.9M /run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/84a71d0217765ed3e4358ec4a4dfdeee3fa7a34e41a00186ebab7b84d94cc3bf/log.json
8.0K /run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/1158fb61ff223a7465fd7eca2d128f7cebb25a922f9e048dab8120e9788c4341/log.json
4.0K /run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/68ee5760f343a65f11b43434acf4ca724b9fb0115a4448ee1f213612b802a396/log.json
=====================================================================================================================================================================================
Cleaning Logs:
Cleaning /run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/84a71d0217765ed3e4358ec4a4dfdeee3fa7a34e41a00186ebab7b84d94cc3bf/log.json
Cleanup complete for /run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/84a71d0217765ed3e4358ec4a4dfdeee3fa7a34e41a00186ebab7b84d94cc3bf/log.json
Cleaning /run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/f439d9288ed4eeeff81e3fefea2e53c319db8e88587f477cbb7f00084ce53f78/log.json
Cleanup complete for /run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/f439d9288ed4eeeff81e3fefea2e53c319db8e88587f477cbb7f00084ce53f78/log.json
Cleaning /run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/e0723af74f141f744aa6002a2573efefe31a825727140675571000173b866174/log.json
Cleanup complete for /run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/e0723af74f141f744aa6002a2573efefe31a825727140675571000173b866174/log.json
Cleaning /run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/68ee5760f343a65f11b43434acf4ca724b9fb0115a4448ee1f213612b802a396/log.json
Cleanup complete for /run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/68ee5760f343a65f11b43434acf4ca724b9fb0115a4448ee1f213612b802a396/log.json
Cleaning /run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/4fda88ee8beeb6ace74584bbf196b5ac3e18c6b491a1fd06c2f23eecf1fc8f39/log.json
Cleanup complete for /run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/4fda88ee8beeb6ace74584bbf196b5ac3e18c6b491a1fd06c2f23eecf1fc8f39/log.json
Cleaning /run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/23093eff7956319820e36e53b29f1d053859320636e12be6f763ab5ace1ce50b/log.json
Cleanup complete for /run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/23093eff7956319820e36e53b29f1d053859320636e12be6f763ab5ace1ce50b/log.json
Cleaning /run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/1158fb61ff223a7465fd7eca2d128f7cebb25a922f9e048dab8120e9788c4341/log.json
Cleanup complete for /run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/1158fb61ff223a7465fd7eca2d128f7cebb25a922f9e048dab8120e9788c4341/log.json
Cleaning /run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/9235ee51028a3baa5407882cc7eee50f0f2dad73bca02dc0b053dc490d50549d/log.json
Cleanup complete for /run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/9235ee51028a3baa5407882cc7eee50f0f2dad73bca02dc0b053dc490d50549d/log.json
...cleaning complete!

After:
=====================================================================================================================================================================================
4.0K /run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/84a71d0217765ed3e4358ec4a4dfdeee3fa7a34e41a00186ebab7b84d94cc3bf/log.json
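The script empties each file with `cat /dev/null > "$log"` rather than deleting it. That matters: truncating in place keeps the same inode, so containerd's open file handle stays valid and new entries continue landing in the (now empty) file, whereas deleting the file would leave the daemon writing to an unlinked inode. A minimal sketch demonstrating this on a temp file (uses GNU `stat`, which Unraid ships):

```shell
#!/bin/bash
# Sketch: truncate-in-place keeps the same inode, so a process holding the
# file open keeps writing to it. Demo on a throwaway temp file.
f=$(mktemp)
echo "some log data" > "$f"
before=$(stat -c %i "$f")   # inode number before truncation (GNU stat)
cat /dev/null > "$f"        # same technique the cleanup script uses
after=$(stat -c %i "$f")    # inode number after truncation
size=$(stat -c %s "$f")     # file size after truncation
echo "inode unchanged: $([ "$before" = "$after" ] && echo yes || echo no), size: $size"
# prints: inode unchanged: yes, size: 0
```

`truncate -s 0 "$log"` would behave the same way; the redirection form just avoids depending on the `truncate` binary.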
Mikeo Posted July 12, 2023

I'm sure you've probably already fixed this, but in case not: it looks like ich777 released an update that changed the log level to warning. I believe this has fixed the issue for me.
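For anyone who wants to apply the same change by hand rather than wait for a plugin update: the log dump above shows the runtime running with `"LogLevel": "info"`, and on a stock nvidia-container-toolkit install that setting normally comes from `/etc/nvidia-container-runtime/config.toml` (path and key name are assumptions; verify against your installed version). A sketch of the edit, demoed on a temp copy so it can be inspected before touching the real file:

```shell
#!/bin/bash
# Sketch: lower the NVIDIA container runtime log level from "info" to
# "warning" so the config dumps stop flooding log.json. The real file is
# typically /etc/nvidia-container-runtime/config.toml (assumption);
# demoed here on a temp copy.
cfg=$(mktemp)
cat > "$cfg" <<'EOF'
[nvidia-container-runtime]
log-level = "info"
EOF
sed -i 's/^log-level = "info"$/log-level = "warning"/' "$cfg"
grep '^log-level' "$cfg"
# prints: log-level = "warning"
```

To apply for real, point `sed -i` at the actual config file and restart Docker so the runtime re-reads it.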