Posts posted by ultimz
Just moved to this Docker container from the unsupported one. Thanks, it seems to have worked without any issues.
I did change port 1900 to 1901 to match my old container's settings. I also had to take a backup on v7.3.83 and restore it on v7.5.187, which seemed to work fine as well.
-
-
Sorry, my bad... I was trying to access the GUI through NPM, which stopped when I stopped the array... I had to use the IP instead.
-
Hi,
I was about to upgrade to 6.12.6, but when trying to stop the array (running 6.12.4) it got stuck on "Stopping services"... now I can't get into the GUI, but I have managed to get into the shell and download the diagnostics, which are attached.
Is anyone able to help me figure out why it's stuck? I was going to reboot the server using powerdown -r, but I can wait.
-
Thanks again @yogy - that seems to have worked... I'll test and post back if I pick up any issues. I have just set WEBSOCKET_ENABLED to false and removed the port mapping for 3012.
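For anyone else making the same change, the relevant bits of the container config end up looking roughly like this (a sketch only - the host port and image tag are placeholders; recent Vaultwarden versions serve WebSocket traffic on the main web port, so the separate 3012 mapping is no longer needed):

```shell
# Sketch only: WEBSOCKET_ENABLED=false disables the old standalone
# WebSocket server, and no -p ...:3012 mapping is published any more.
docker run -d --name vaultwarden \
  -e WEBSOCKET_ENABLED=false \
  -p 8080:80 \
  vaultwarden/server:latest
```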
-
Perfect, thanks @yogy - that worked.
-
On 7/17/2023 at 10:31 AM, yogy said:
Here is a very quick guide on how to use an Argon2 hash for Vaultwarden. There are other ways to enable access to the admin page, but like I said, this is a very quick solution:
- Go to https://argon2.online/
- Enter a passphrase in Plain Text Input, click once on the Salt cogwheel, leave everything else as default, and click GENERATE HASH
- Go to Vaultwarden's Admin Page >> General Settings and replace your current plain-text admin token with the generated hash value ($argon2i$v=19$m=16,t=2,p=1$YnJvYm1vSD...........)
- Save and restart the Vaultwarden container
- To log in to the admin page you must use your plain-text value, not the hash
I hope you will find this very quick tutorial useful.
@yogy does the Docker variable (ADMIN_TOKEN) also need to be updated?
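As a side note on the quoted guide: if you'd rather not paste a passphrase into a website, the same PHC-format hash can be generated locally with the reference argon2 CLI (a sketch - it assumes the argon2 package is installed; the passphrase and salt are placeholders, and these parameters produce an $argon2id$ hash rather than the site's default $argon2i$):

```shell
# Generate an Argon2id hash in encoded (PHC) form locally.
# "MySecretAdminPass" and "randomsaltvalue" are placeholders.
if command -v argon2 >/dev/null 2>&1; then
  printf '%s' "MySecretAdminPass" | argon2 "randomsaltvalue" -id -t 3 -m 16 -p 4 -l 32 -e
else
  echo "argon2 CLI not installed"
fi
```

The encoded string it prints is what goes into the admin token field; you still log in with the plain-text passphrase.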
-
Hi,
I'm running the latest version of Vaultwarden and have enabled push notifications as well. I'm also using Nginx Proxy Manager to expose it (with just HTTPS and nothing on port 3012).
I can see from the logs that I am getting these errors:
[2023-11-20 12:42:12.277][rocket::server][ERROR] Upgraded websocket I/O handler failed: WebSocket protocol error: Sending after closing is not allowed
Is there something I need to configure on NPM or Vaultwarden?
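In case it helps others hitting the same error: for WebSocket connections to survive the proxy, nginx needs the upgrade headers set. In NPM that's the "Websockets Support" toggle on the proxy host, which is roughly equivalent to a snippet like this (a sketch - the upstream host/port is a placeholder for your Vaultwarden container):

```nginx
# Forward WebSocket upgrade handshakes for Vaultwarden's notification hub
location /notifications/hub {
    proxy_pass http://vaultwarden:80;   # placeholder upstream
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
}
```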
-
Hi all,
Hoping someone can help me with a weird issue.
I use Ombi with Jellyfin users who sign in with their Jellyfin accounts. Recently I have had an issue where Jellyfin users can't authenticate:
fail: Ombi.Api.Api[1000]
StatusCode: Unauthorized, Reason: Unauthorized, RequestUri: https://jf.(domain)/users/authenticatebyname
fail: Microsoft.AspNetCore.Identity.UserManager[0]
Jellyfin Login Failed
Newtonsoft.Json.JsonReaderException: Unexpected character encountered while parsing value: E. Path '', line 0, position 0.
at Newtonsoft.Json.JsonTextReader.ParseValue()
at Newtonsoft.Json.JsonReader.ReadForType(JsonContract contract, Boolean hasConverter)
at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.Deserialize(JsonReader reader, Type objectType, Boolean checkAdditionalContent)
at Newtonsoft.Json.JsonSerializer.DeserializeInternal(JsonReader reader, Type objectType)
at Newtonsoft.Json.JsonSerializer.Deserialize(JsonReader reader, Type objectType)
at Newtonsoft.Json.JsonConvert.DeserializeObject(String value, Type type, JsonSerializerSettings settings)
at Newtonsoft.Json.JsonConvert.DeserializeObject[T](String value, JsonSerializerSettings settings)
at Ombi.Api.Api.Request[T](Request request, CancellationToken cancellationToken)
at Ombi.Api.Jellyfin.JellyfinApi.LogIn(String username, String password, String apiKey, String baseUri) in /home/runner/work/Ombi/Ombi/src/Ombi.Api.Jellyfin/JellyfinApi.cs:line 74
at Ombi.Core.Authentication.OmbiUserManager.CheckJellyfinPasswordAsync(OmbiUser user, String password) in /home/runner/work/Ombi/Ombi/src/Ombi.Core/Authentication/OmbiUserManager.cs:line 232
warn: Ombi.Controllers.V1.TokenController[0]
Local users work fine. Any ideas what the issue could be? The username and password are definitely correct.
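For what it's worth, that JsonReaderException ("Unexpected character encountered while parsing value: E") means Jellyfin answered with something that isn't JSON - likely a plain-text error body starting with "E". One way to check is to call the same endpoint Ombi uses and look at the raw response (a sketch - the URL and credentials are placeholders):

```shell
# Show the raw body Jellyfin returns for the login call Ombi makes.
# jf.example.com and the credentials below are placeholders.
curl -sk -X POST "https://jf.example.com/Users/AuthenticateByName" \
  -H 'Content-Type: application/json' \
  -H 'X-Emby-Authorization: MediaBrowser Client="debug", Device="curl", DeviceId="1", Version="1.0"' \
  -d '{"Username":"demo-user","Pw":"demo-pass"}'
```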
-
@KluthR I just wanted to say thanks for your hard work supporting the old plugin for so long, and for this awesome new one. Appreciate it!
-
-
SOLVED - it seems tdarr_node is linked to the same logs folder.
Hi,
I'm picking up an error when backing up tdarr - does anyone have any idea what could be causing the file to get modified while the container is stopped? Here are the logs:
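Since tdarr_node writing into the same logs folder was the cause: the plugin's verification step is essentially GNU tar's --compare, which diffs the archive against the filesystem, so any write after the archive is created produces exactly these "Mod time differs"/"Size differs" messages. A minimal reproduction (the /tmp paths are just for the demo):

```shell
# Reproduce the plugin's verification failure with tar --compare.
mkdir -p /tmp/tarcheck/demo
echo "log line" > /tmp/tarcheck/demo/app.log
tar -cf /tmp/tarcheck/demo.tar -C /tmp/tarcheck demo
sleep 1                                        # make sure the mtime changes
echo "new line" >> /tmp/tarcheck/demo/app.log  # simulates tdarr_node still writing
tar --compare -f /tmp/tarcheck/demo.tar -C /tmp/tarcheck \
  || echo "verification failed as expected"
```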
[03.10.2023 02:40:49][ℹ️][tdarr] Stopping tdarr... done! (took 4 seconds)
[03.10.2023 02:40:53][ℹ️][tdarr] Should NOT backup external volumes, sanitizing them...
[03.10.2023 02:40:53][ℹ️][tdarr] Calculated volumes to back up: /mnt/user/appdata/tdarr/logs, /mnt/user/appdata/tdarr/server, /mnt/user/appdata/tdarr/configs
[03.10.2023 02:40:53][ℹ️][tdarr] Backing up tdarr...
[03.10.2023 02:41:22][ℹ️][tdarr] Backup created without issues
[03.10.2023 02:41:22][ℹ️][tdarr] Verifying backup...
[03.10.2023 02:41:58][❌][tdarr] tar verification failed! Tar said: tar: Removing leading `/' from member names; mnt/user/appdata/tdarr/logs/Tdarr_Node_Log.txt: Mod time differs; mnt/user/appdata/tdarr/logs/Tdarr_Node_Log.txt: Size differs
[03.10.2023 02:42:15][ℹ️][tdarr] Starting tdarr... (try #1) done!
-
Thanks @SimonF, that worked!
-
Hi @SimonF
Thanks for assisting with this.
I also passed a USB controller through to the VM... would the upgrade to 6.12.4 have impacted the bind?
I see this under System Devices, and it's not ticked to bind to VFIO at boot for my VM to use (but my GPU entries are ticked):
[1022:145f] 0b:00.3 USB controller: Advanced Micro Devices, Inc. [AMD] Zeppelin USB 3.0 xHCI Compliant Host Controller
Bus 005 Device 001 Port 5-0 ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 006 Device 001 Port 6-0 ID 1d6b:0003 Linux Foundation 3.0 root hub
Would ticking the entry so it's bound at boot fix my issue and get me back to where I was?
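For reference, ticking "Bind selected to VFIO at boot" just records the device in a small config file on the flash drive that Unraid binds early in boot; for the controller above it would look something like this (a sketch based on the IDs in the lspci line; the exact file contents may differ by Unraid version):

```
# /boot/config/vfio-pci.cfg
BIND=0000:0b:00.3|1022:145f
```

After a reboot the controller should then be reserved for the VM again - provided the PCI address hasn't shifted, which is worth re-checking after an OS upgrade.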
-
Hi,
I am getting the following error when I reboot my server:
Event: VM Autostart disabled
Subject: vfio-pci-errors
Description: VM Autostart disabled due to vfio-bind error
Importance: alert
Please review /var/log/vfio-pci-errors
The only log entry in that file is:
Error: Device 0000:0a:00.3 does not exist, unable to bind device
Both of my VMs start up fine manually (one has a GPU passed through to it)... could it be an issue with the GPU?
[10de:1b82] 03:00.0 VGA compatible controller: NVIDIA Corporation GP104 [GeForce GTX 1070 Ti] (rev a1)
[10de:10f0] 03:00.1 Audio device: NVIDIA Corporation GP104 High Definition Audio Controller (rev a1)
I have attached the diagnostic file. Thanks in advance for the assistance!
-
Upgraded from 6.11.5 to 6.12.4 - it was pretty seamless. I just had to reinstall two plugins (gpustat and NUT) and also change Docker to use ipvlan, as it had picked up a few call traces.
Thanks to all the devs for their hard work. Appreciated!
-
Sure, thanks - I don't mind sharing it then (I took a quick look at the file). I've filtered it down to just that time period.
-
Thanks - I found the logs. Is it safe for me to post them here? Not sure if they include any sensitive information...
-
OK, thanks guys, and yes, the syslog server was enabled before the reboot (a while back, actually)... how can I share those logs?
-
Hi,
This morning I noticed that I couldn't connect to my Sonarr container (nothing out of the ordinary in its logs), so I tried to stop it, but the "server error" warning came up and it just kept running. I tried the docker kill command from the console and it still wouldn't stop. I then killed the Sonarr process by finding its specific PID, and it stopped but wouldn't start up again.
I also noticed that the mover had been running for a long time without ever completing, so I stopped the mover process from the console and tried to restart it. That stopped too, and then I tried to stop the array to restart the server, but couldn't. It was just stuck at "Sync filesystems".
The only thing accessing /mnt/user was some syslog processes, so after a while of nothing happening I killed those. Still no luck stopping the array. powerdown -r also didn't help, as I then lost connection to the Unraid server's IP while the server was still on. I ended up doing a hard shutdown.
Not sure why this happened but it's doing a parity check now... but everything seems to be back up.
Can anyone tell me why this happened? Logs attached.
-
I made the jump from 7.1.68 to 7.3.83. The only issue I had was an error when starting the container... it complained about an invalid memory heap size. I had to set MEM_LIMIT and MEM_STARTUP back to their defaults and it worked again.
-
@KluthR yes I can start the container manually without any issues. Auto update is not enabled.
-
Hi @KluthR, I have installed the plugin again (over the old one), but I am still getting errors when trying to start a certain Docker container:
linuxserver/unifi-controller
[03.03.2023 02:00:01] Backup of appData starting. This may take awhile
[03.03.2023 02:00:01] Stopping AndroidDebugBridge... done! (took 10 seconds)
[03.03.2023 02:00:11] Stopping Apache-PHP... done! (took 1 seconds)
[03.03.2023 02:00:12] Stopping bazarr... done! (took 5 seconds)
[03.03.2023 02:00:17] Stopping binhex-delugevpn... done! (took 3 seconds)
[03.03.2023 02:00:20] Stopping binhex-lidarr... done! (took 2 seconds)
[03.03.2023 02:00:22] Stopping binhex-prowlarr... done! (took 1 seconds)
[03.03.2023 02:00:23] Stopping binhex-radarr... done! (took 2 seconds)
[03.03.2023 02:00:25] Stopping binhex-sabnzbdvpn... done! (took 2 seconds)
[03.03.2023 02:00:27] Stopping binhex-sonarr... done! (took 0 seconds)
[03.03.2023 02:00:27] Stopping bitwardenrs... done! (took 1 seconds)
[03.03.2023 02:00:28] Stopping ddns... done! (took 0 seconds)
[03.03.2023 02:00:28] Stopping deepstack... done! (took 10 seconds)
[03.03.2023 02:00:38] Stopping deepstack-ui... done! (took 2 seconds)
[03.03.2023 02:00:40] Stopping duplicati... done! (took 5 seconds)
[03.03.2023 02:00:45] Stopping Grafana... done! (took 0 seconds)
[03.03.2023 02:00:45] Stopping grocy... done! (took 5 seconds)
[03.03.2023 02:00:50] Stopping hassConfigurator... done! (took 10 seconds)
[03.03.2023 02:01:00] Stopping Home-Assistant-Core... done! (took 6 seconds)
[03.03.2023 02:01:06] Stopping Influxdb... done! (took 1 seconds)
[03.03.2023 02:01:07] Stopping Jellyfin... done! (took 0 seconds)
[03.03.2023 02:01:07] Stopping mariadb... done! (took 10 seconds)
[03.03.2023 02:01:17] Stopping MQTT... done! (took 1 seconds)
[03.03.2023 02:01:18] Stopping NginxProxyManager... done! (took 4 seconds)
[03.03.2023 02:01:22] Stopping NodeRed... done! (took 0 seconds)
[03.03.2023 02:01:22] Stopping ombi... done! (took 4 seconds)
[03.03.2023 02:01:26] Stopping OpenEats... done! (took 11 seconds)
[03.03.2023 02:01:37] Stopping phpmyadmin... done! (took 1 seconds)
[03.03.2023 02:01:38] Stopping pihole-template... done! (took 4 seconds)
[03.03.2023 02:01:42] Stopping tdarr... done! (took 5 seconds)
[03.03.2023 02:01:47] Stopping tdarr_node... done! (took 4 seconds)
[03.03.2023 02:01:51] Stopping unifi-controller... done! (took 7 seconds)
[03.03.2023 02:01:58] Stopping webtrees... done! (took 11 seconds)
[03.03.2023 02:02:09] Stopping zabbix-agent2... done! (took 0 seconds)
[03.03.2023 02:02:09] Stopping Zabbix-Server... done! (took 0 seconds)
[03.03.2023 02:02:09] Stopping Zabbix-Webinterface... done! (took 10 seconds)
[03.03.2023 02:02:19] Backing up USB Flash drive config folder to /mnt/user/Backup/Unraid/USB/
2023/03/03 02:02:19 [7083] building file list
2023/03/03 02:02:20 [7083] *deleting config/super.dat.CA_BACKUP
2023/03/03 02:02:20 [7083] .d...p..... ./
2023/03/03 02:02:20 [7083] .d..tp..... EFI/
2023/03/03 02:02:20 [7083] .d..tp..... EFI/boot/
2023/03/03 02:02:20 [7083] .d..tp..... System Volume Information/
2023/03/03 02:02:20 [7083] .d..tp..... config/
2023/03/03 02:02:20 [7083] >f+++++++++ config/super.dat
2023/03/03 02:02:20 [7083] .d...p..... config/modprobe.d/
2023/03/03 02:02:20 [7083] .d...p..... config/plugins-error/
2023/03/03 02:02:20 [7083] .d..tp..... config/plugins-removed/
2023/03/03 02:02:20 [7083] .d..tp..... config/plugins/
2023/03/03 02:02:20 [7083] >f.stp..... config/plugins/ca.update.applications.plg
2023/03/03 02:02:20 [7083] >f.stp..... config/plugins/community.applications.plg
2023/03/03 02:02:20 [7083] >f.stp..... config/plugins/dynamix.system.info.plg
2023/03/03 02:02:20 [7083] >f.stp..... config/plugins/dynamix.system.temp.plg
2023/03/03 02:02:20 [7083] >f.stp..... config/plugins/fix.common.problems.plg
2023/03/03 02:02:20 [7083] >f.stp..... config/plugins/nut.plg
2023/03/03 02:02:20 [7083] >f.stp..... config/plugins/rclone.plg
2023/03/03 02:02:20 [7083] .d..tp..... config/plugins/NerdPack/
2023/03/03 02:02:20 [7083] .d...p..... config/plugins/NerdPack/packages/
2023/03/03 02:02:20 [7083] .d...p..... config/plugins/NerdPack/packages/6.10/
2023/03/03 02:02:20 [7083] .d..tp..... config/plugins/NerdPack/packages/6.6/
2023/03/03 02:02:20 [7083] .d..tp..... config/plugins/NerdPack/packages/6.8/
2023/03/03 02:02:20 [7083] .d...p..... config/plugins/NerdPack/packages/6.9/
2023/03/03 02:02:20 [7083] .d..tp..... config/plugins/ca.backup2/
2023/03/03 02:02:20 [7083] .d...p..... config/plugins/ca.update.applications/
2023/03/03 02:02:20 [7083] .d...p..... config/plugins/ca.update.applications/scripts/
2023/03/03 02:02:20 [7083] .d...p..... config/plugins/ca.update.applications/scripts/starting/
2023/03/03 02:02:20 [7083] .d...p..... config/plugins/ca.update.applications/scripts/stopping/
2023/03/03 02:02:20 [7083] .d..tp..... config/plugins/community.applications/
2023/03/03 02:02:20 [7083] .d..tp..... config/plugins/community.applications/private/
2023/03/03 02:02:20 [7083] .d..tp..... config/plugins/community.applications/private/DockerHub/
2023/03/03 02:02:20 [7083] .d...p..... config/plugins/controlr/
2023/03/03 02:02:20 [7083] .d..tp..... config/plugins/dockerMan/
2023/03/03 02:02:20 [7083] .d..tp..... config/plugins/dockerMan/images/
2023/03/03 02:02:20 [7083] .d...p..... config/plugins/dockerMan/templates-user/
2023/03/03 02:02:20 [7083] .d..tp..... config/plugins/dockerMan/templates/
2023/03/03 02:02:20 [7083] .d..tp..... config/plugins/dockerMan/templates/limetech/
2023/03/03 02:02:20 [7083] .d..tp..... config/plugins/dynamix.apcupsd/
2023/03/03 02:02:20 [7083] .d...p..... config/plugins/dynamix.my.servers/
2023/03/03 02:02:20 [7083] .d..tp..... config/plugins/dynamix.system.info/
2023/03/03 02:02:20 [7083] .d...p..... config/plugins/dynamix.system.temp/
2023/03/03 02:02:20 [7083] .d..tp..... config/plugins/dynamix.wireguard/
2023/03/03 02:02:20 [7083] .d..tp..... config/plugins/dynamix/
2023/03/03 02:02:20 [7083] >f..tp..... config/plugins/dynamix/monitor.ini
2023/03/03 02:02:20 [7083] .d..tp..... config/plugins/dynamix/notifications/
2023/03/03 02:02:20 [7083] .d..tp..... config/plugins/dynamix/notifications/agents/
2023/03/03 02:02:20 [7083] .d...p..... config/plugins/dynamix/users/
2023/03/03 02:02:20 [7083] .d...p..... config/plugins/fix.common.problems/
2023/03/03 02:02:20 [7083] .d...p..... config/plugins/fix.common.problems/scripts/
2023/03/03 02:02:20 [7083] .d...p..... config/plugins/gpustat/
2023/03/03 02:02:20 [7083] .d..tp..... config/plugins/nut/
2023/03/03 02:02:20 [7083] .d..tp..... config/plugins/nvidia-driver/
2023/03/03 02:02:20 [7083] .d..tp..... config/plugins/nvidia-driver/packages/
2023/03/03 02:02:20 [7083] .d...p..... config/plugins/nvidia-driver/packages/5.19.17/
2023/03/03 02:02:20 [7083] .d..tp..... config/plugins/rclone/
2023/03/03 02:02:21 [7083] .d...p..... config/plugins/rclone/install/
2023/03/03 02:02:21 [7083] .d..tp..... config/plugins/rclone/install/scripts/
2023/03/03 02:02:21 [7083] .d..tp..... config/plugins/rclone/install/scripts/rclone_custom_plugin/
2023/03/03 02:02:21 [7083] .d..tp..... config/plugins/rclone/install/scripts/rclone_mount_plugin/
2023/03/03 02:02:21 [7083] .d..tp..... config/plugins/rclone/install/scripts/rclone_unmount_plugin/
2023/03/03 02:02:21 [7083] .d...p..... config/plugins/rclone/logs/
2023/03/03 02:02:21 [7083] .d..tp..... config/plugins/rclone/scripts/
2023/03/03 02:02:21 [7083] .d..tp..... config/plugins/rclone/scripts/rclone_custom_plugin/
2023/03/03 02:02:21 [7083] .d..tp..... config/plugins/rclone/scripts/rclone_mount_plugin/
2023/03/03 02:02:21 [7083] .d..tp..... config/plugins/rclone/scripts/rclone_unmount_plugin/
2023/03/03 02:02:21 [7083] .d..tp..... config/plugins/speedtest/
2023/03/03 02:02:21 [7083] .d..tp..... config/plugins/tips.and.tweaks/
2023/03/03 02:02:21 [7083] .d..tp..... config/plugins/unassigned.devices-plus/
2023/03/03 02:02:21 [7083] .d...p..... config/plugins/unassigned.devices-plus/packages/
2023/03/03 02:02:21 [7083] .d..tp..... config/plugins/unassigned.devices/
2023/03/03 02:02:21 [7083] .d..tp..... config/plugins/unassigned.devices/packages/
2023/03/03 02:02:21 [7083] .d..tp..... config/plugins/user.scripts/
2023/03/03 02:02:21 [7083] .d..tp..... config/plugins/user.scripts/scripts/
2023/03/03 02:02:21 [7083] .d..tp..... config/plugins/user.scripts/scripts/Backup_VMs/
2023/03/03 02:02:21 [7083] .d..tp..... config/plugins/user.scripts/scripts/NVidia Sleep/
2023/03/03 02:02:21 [7083] .d..tp..... config/plugins/user.scripts/scripts/Unlock NVidia/
2023/03/03 02:02:21 [7083] .d..tp..... config/plugins/user.scripts/scripts/delete.ds_store/
2023/03/03 02:02:21 [7083] .d..tp..... config/plugins/user.scripts/scripts/delete_dangling_images/
2023/03/03 02:02:21 [7083] .d..tp..... config/plugins/user.scripts/scripts/nested off/
2023/03/03 02:02:21 [7083] .d..tp..... config/plugins/user.scripts/scripts/nested on/
2023/03/03 02:02:21 [7083] .d..tp..... config/plugins/user.scripts/scripts/viewDockerLogSize/
2023/03/03 02:02:21 [7083] .d..tp..... config/plugins/vmbackup/
2023/03/03 02:02:21 [7083] .d..tp..... config/plugins/vmbackup/packages/
2023/03/03 02:02:21 [7083] .d...p..... config/pools/
2023/03/03 02:02:21 [7083] .d..tp..... config/shares/
2023/03/03 02:02:21 [7083] .d..tp..... config/ssh/
2023/03/03 02:02:21 [7083] .d...p..... config/ssh/root/
2023/03/03 02:02:21 [7083] .d..tp..... config/ssl/
2023/03/03 02:02:21 [7083] .d..tp..... config/ssl/certs/
2023/03/03 02:02:21 [7083] .d..tp..... config/wireguard/
2023/03/03 02:02:21 [7083] .d..tp..... config/wireguard/peers/
2023/03/03 02:02:21 [7083] .d..tp..... logs/
2023/03/03 02:02:21 [7083] .d..tp..... previous/
2023/03/03 02:02:21 [7083] .d..tp..... syslinux/
2023/03/03 02:02:21 [7083] sent 174,447 bytes received 1,824 bytes 70,508.40 bytes/sec
2023/03/03 02:02:21 [7083] total size is 1,086,435,004 speedup is 6,163.44
[03.03.2023 02:02:21] Backing up libvirt.img to /mnt/user/Backup/Unraid/Libvirt/
[03.03.2023 02:02:21] Using Command: /usr/bin/rsync -avXHq --delete --log-file="/var/lib/docker/unraid/ca.backup2.datastore/appdata_backup.log" "/mnt/user/system/libvirt/libvirt.img" "/mnt/user/Backup/Unraid/Libvirt/" > /dev/null 2>&1
2023/03/03 02:02:21 [7157] building file list
2023/03/03 02:02:21 [7157] sent 75 bytes received 19 bytes 188.00 bytes/sec
2023/03/03 02:02:21 [7157] total size is 1,073,741,824 speedup is 11,422,785.36
[03.03.2023 02:02:21] Backing Up appData from /mnt/user/appdata/ to /mnt/user/Backup/Unraid/Appdata/[email protected]
[03.03.2023 02:02:21] Separate archives disabled! Saving into one file.
[03.03.2023 02:02:21] Backing Up
[03.03.2023 02:20:07] Verifying Backup
[03.03.2023 02:27:50] done
[03.03.2023 02:27:50] Starting AndroidDebugBridge... (try #1) done!
[03.03.2023 02:27:53] Starting Apache-PHP... (try #1) done!
[03.03.2023 02:27:55] Starting Grafana... (try #1) done!
[03.03.2023 02:27:58] Starting Home-Assistant-Core... (try #1) done!
[03.03.2023 02:28:00] Starting Influxdb... (try #1) done!
[03.03.2023 02:28:02] Starting Jellyfin... (try #1) done!
[03.03.2023 02:28:05] Starting MQTT... (try #1) done!
[03.03.2023 02:28:08] Starting NginxProxyManager... (try #1) done!
[03.03.2023 02:28:10] Starting NodeRed... (try #1) done!
[03.03.2023 02:28:12] Starting OpenEats... (try #1) done!
[03.03.2023 02:28:15] Starting Zabbix-Server... (try #1) done!
[03.03.2023 02:28:18] Starting Zabbix-Webinterface... (try #1) done!
[03.03.2023 02:28:20] Starting bazarr... (try #1) done!
[03.03.2023 02:28:23] Starting binhex-prowlarr... (try #1) done!
[03.03.2023 02:28:25] Starting bitwardenrs... (try #1) done!
[03.03.2023 02:28:28] Starting deepstack... (try #1) done!
[03.03.2023 02:28:34] Starting deepstack-ui... (try #1) done!
[03.03.2023 02:28:38] Starting grocy... (try #1) done!
[03.03.2023 02:28:41] Starting hassConfigurator... (try #1) done!
[03.03.2023 02:28:46] Starting phpmyadmin... (try #1) done!
[03.03.2023 02:28:51] Starting pihole-template... (try #1) done!
[03.03.2023 02:28:54] Starting tdarr... (try #1) done!
[03.03.2023 02:28:58] Starting tdarr_node... (try #1) done!
[03.03.2023 02:29:01] Starting webtrees... (try #1) done!
[03.03.2023 02:29:03] Starting zabbix-agent2... (try #1) done!
[03.03.2023 02:29:06] Starting binhex-delugevpn... (try #1) done!
[03.03.2023 02:29:08] Starting binhex-lidarr... (try #1) done!
[03.03.2023 02:29:11] Starting binhex-radarr... (try #1) done!
[03.03.2023 02:29:13] Starting binhex-sabnzbdvpn... (try #1) done!
[03.03.2023 02:29:16] Starting binhex-sonarr... (try #1) done!
[03.03.2023 02:29:19] Starting mariadb... (try #1) done!
[03.03.2023 02:29:21] Starting ombi... (try #1) done!
[03.03.2023 02:29:24] Starting ddns... (try #1) done!
[03.03.2023 02:29:26] Starting duplicati... (try #1) done!
[03.03.2023 02:29:29] Starting unifi-controller... (try #1) Error while starting container! - Code: Server error
[03.03.2023 02:29:34] Starting unifi-controller... (try #2) Error while starting container! - Code: Server error
[03.03.2023 02:29:39] Starting unifi-controller... (try #3) Error while starting container! - Code: Server error
[03.03.2023 02:29:39] Container did not started after multiple tries, skipping.
[03.03.2023 02:29:47] A error occurred somewhere. Not deleting old backup sets of appdata
[03.03.2023 02:29:47] Backup / Restore Completed
Any idea what I could try to solve this? For now I have set the backup to not stop this container and have also excluded its data folder.
-
Thanks to @m33ts4k0z for finding a solution for this issue and to @Squid for the plugin! My docker containers are no longer showing as "not available".
-
I successfully upgraded from 6.10 to 6.11.5 and everything went smoothly.
Thanks to all the Unraid devs - you guys rock!
-
19 hours ago, KluthR said:
Yes!
remove the current one and install this plg: https://raw.githubusercontent.com/Squidly271/ca.backup2/master/plugins/ca.backup2.plg
(Don't mind the deprecated version warning)
If you feel ready, you can always switch forward again.
Thanks, I have installed the old version again. I'll keep an eye out here to see when the issues have been resolved and then install it again (but I will be able to test the new version if required - just let me know!)
Unraid OS version 6.12.9 available (in Announcements)
The update went through smoothly. Thanks to the Unraid team!