Lordbye

Members · Posts: 57

Everything posted by Lordbye

  1. Is there some command line I can use to fix the situation while waiting for a new key?
  2. Ouch... again? I have to change USB keys every 6-8 months. OK, I'll try to buy another USB key and reload. Thanks.
  3. I'm also attaching the diagnostics: lordbyenas-diagnostics-20240412-1043.zip
  4. Every two days I have this problem: the Docker service fails to start, and I have to restore a backup of the USB key. How can I solve this? Thanks. Here are the last rows from the syslog:

     Apr 12 09:42:02 LordbyeNas emhttpd: shcmd (996586): /usr/local/sbin/mount_image '/mnt/user/system/docker/docker.img' /var/lib/docker 50
     Apr 12 09:42:02 LordbyeNas kernel: loop4: detected capacity change from 0 to 104857600
     Apr 12 09:42:02 LordbyeNas root: mount: /var/lib/docker: mount(2) system call failed: File exists.
     Apr 12 09:42:02 LordbyeNas root: dmesg(1) may have more information after failed mount system call.
     Apr 12 09:42:02 LordbyeNas kernel: BTRFS warning: duplicate device /dev/loop4 devid 1 generation 262288 scanned by mount (20622)
     Apr 12 09:42:02 LordbyeNas root: mount error
     Apr 12 09:42:02 LordbyeNas emhttpd: shcmd (996586): exit status: 1
     Apr 12 09:42:03 LordbyeNas avahi-daemon[20484]: Service "LordbyeNas" (/services/ssh.service) successfully established.
     Apr 12 09:42:03 LordbyeNas avahi-daemon[20484]: Service "LordbyeNas" (/services/smb.service) successfully established.
     Apr 12 09:42:03 LordbyeNas avahi-daemon[20484]: Service "LordbyeNas" (/services/sftp-ssh.service) successfully established.
     Apr 12 09:42:24 LordbyeNas nmbd[20328]: [2024/04/12 09:42:24.433402, 0] ../../source3/nmbd/nmbd_become_lmb.c:398(become_local_master_stage2)
     Apr 12 09:42:24 LordbyeNas nmbd[20328]: Samba name server LORDBYENAS is now a local master browser for workgroup WORKGROUP on subnet 192.168.1.10
     Apr 12 09:43:17 LordbyeNas kernel: SQUASHFS error: xz decompression failed, data probably corrupt
     Apr 12 09:43:17 LordbyeNas kernel: SQUASHFS error: Failed to read block 0x28885fc: -5
     [previous two lines repeated six times in total]
     Apr 12 09:47:26 LordbyeNas emhttpd: spinning down /dev/sdc
     Apr 12 09:48:50 LordbyeNas emhttpd: cmd: /usr/local/emhttp/plugins/user.scripts/startScript.sh /tmp/user.scripts/tmpScripts/Riavvio docker/script
     Apr 12 09:48:51 LordbyeNas kernel: docker0: port 2(veth0e80741) entered disabled state
     [matching "entered disabled state" lines for ports 9, 8, 7, 6, 5, 4, 3, 1: vethe5a4021, vethccab7aa, veth4a016e4, veth507bce7, vethebe6137, vethf5ac7b1, veth6e4d6ea, vethebcaa70]
     Apr 12 09:48:51 LordbyeNas kernel: device veth0e80741 left promiscuous mode
     Apr 12 09:48:51 LordbyeNas kernel: docker0: port 2(veth0e80741) entered disabled state
     [matching "left promiscuous mode" / "entered disabled state" pairs for the other eight veth devices]
     Apr 12 09:49:07 LordbyeNas kernel: SQUASHFS error: xz decompression failed, data probably corrupt
     Apr 12 09:49:07 LordbyeNas kernel: SQUASHFS error: Failed to read block 0x28885fc: -5
     [previous two lines repeated four times in total; the same pair appears twice more at 09:49:20 and twice at 09:49:21]
     Apr 12 09:49:31 LordbyeNas flash_backup: adding task: /usr/local/emhttp/plugins/dynamix.my.servers/scripts/UpdateFlashBackup update
     Apr 12 09:52:12 LordbyeNas kernel: SQUASHFS error: xz decompression failed, data probably corrupt
     Apr 12 09:52:12 LordbyeNas kernel: SQUASHFS error: Failed to read block 0x28885fc: -5
     [previous two lines repeated five times in total]
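A hedged sketch of what could be tried from the console while waiting for a proper fix (this is an assumption on my part, not an official Unraid procedure): the BTRFS "duplicate device /dev/loop4" warning suggests a stale loop device is still attached to docker.img from a previous mount attempt, so detaching it before the service retries the mount may help. The snippet only extracts the device name from the log line and prints the cleanup commands for review, rather than executing them; run them by hand as root with the Docker service stopped in Settings -> Docker.

```shell
# Assumption: the stale device is the one named in the BTRFS warning line.
log='Apr 12 09:42:02 LordbyeNas kernel: BTRFS warning: duplicate device /dev/loop4 devid 1 generation 262288 scanned by mount (20622)'
dev=$(echo "$log" | grep -o '/dev/loop[0-9]*' | head -n1)
# Print (not run) the recovery commands: drop any stale mount, detach the loop device.
echo "umount /var/lib/docker 2>/dev/null; losetup -d $dev"
```

Restarting Docker afterwards lets emhttpd re-attach docker.img on a fresh loop device.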
  5. Hi, strange things have been happening for a few days now, after the server had been shut down for 15 days while I moved house. The latest: this morning I found some Docker containers deleted (and I don't know how to restore them), and if I go into Apps and reinstall them from Previous Apps, they give me an error. How can I repair this? Thanks.

     docker run -d --name='homeassistant' --net='host' --privileged=true
       -e TZ="Europe/Berlin" -e HOST_OS="Unraid" -e HOST_HOSTNAME="LordbyeNas"
       -e HOST_CONTAINERNAME="homeassistant" -e 'TCP_PORT_8123'='8123'
       -e 'PUID'='99' -e 'PGID'='100' -e 'UMASK'='022'
       -l net.unraid.docker.managed=dockerman
       -l net.unraid.docker.webui='http://[IP]:[PORT:8123]'
       -l net.unraid.docker.icon='https://raw.githubusercontent.com/linuxserver/docker-templates/master/linuxserver.io/img/homeassistant-logo.png'
       -v '/mnt/user/appdata/homeassistant':'/config':'rw'
       --device='/path/to/device' 'lscr.io/linuxserver/homeassistant'

     unexpected fault address 0x505b00
     fatal error: fault
     [signal SIGBUS: bus error code=0x2 addr=0x505b00 pc=0x505b00]

     goroutine 1 [running]:
     runtime.throw({0x201c49?, 0x4eaa53?})
       /usr/local/go/src/runtime/panic.go:1047 +0x5f fp=0xc00059e048 sp=0xc00059e018 pc=0x411d7f
     runtime.sigpanic()
       /usr/local/go/src/runtime/signal_unix.go:838 +0x125 fp=0xc00059e0a8 sp=0xc00059e048 pc=0x4283c5
     encoding/json.boolEncoder(0x10b2880, {0xc00021a8d0, 0xc0000e88e0, 0xc00059e130}, {0x58, 0xf5})
       /usr/local/go/src/encoding/json/encode.go:535 fp=0xc00059e0b0 sp=0xc00059e0a8 pc=0x505b00
     encoding/json.structEncoder.encode({{{0xc0000d8600?, 0xc00021a730?, 0xc000122020?}, 0xc000455500?}}, 0xc00013a100, {0xffcd60?, 0xc00045c078?, 0xc00059e208?}, {0x0, 0x1})
       /usr/local/go/src/encoding/json/encode.go:759 +0x1f4 fp=0xc00059e160 sp=0xc00059e0b0 pc=0x506d54
     encoding/json.structEncoder.encode-fm(0xffcd60?, {0xffcd60?, 0xc00045c078?, 0xc00013a100?}, {0x80?, 0x0?})
       <autogenerated>:1 +0x69 fp=0xc00059e1b8 sp=0xc00059e160 pc=0x5127e9
     encoding/json.(*encodeState).reflectValue(0xc00013a100?, {0xffcd60?, 0xc00045c078?, 0x3e7c47?}, {0x78?, 0x0?})
       /usr/local/go/src/encoding/json/encode.go:358 +0x78 fp=0xc00059e218 sp=0xc00059e1b8 pc=0x5047b8
     encoding/json.(*encodeState).marshal(0xc00059e2e8?, {0xffcd60?, 0xc00045c078?}, {0x6a?, 0x78?})
       /usr/local/go/src/encoding/json/encode.go:330 +0xfa fp=0xc00059e290 sp=0xc00059e218 pc=0x50433a
     encoding/json.(*Encoder).Encode(0xc00059e360, {0xffcd60, 0xc00045c078})
       /usr/local/go/src/encoding/json/stream.go:209 +0xf3 fp=0xc00059e340 sp=0xc00059e290 pc=0x510493
     github.com/docker/cli/vendor/github.com/docker/docker/client.encodeData({0xffcd60, 0xc00045c078})
       /go/src/github.com/docker/cli/vendor/github.com/docker/docker/client/request.go:262 +0x9b fp=0xc00059e3c0 sp=0xc00059e340 pc=0x7dbe1b
     github.com/docker/cli/vendor/github.com/docker/docker/client.encodeBody({0xffcd60?, 0xc00045c078?}, 0x0)
       /go/src/github.com/docker/cli/vendor/github.com/docker/docker/client/request.go:77 +0x35 fp=0xc00059e408 sp=0xc00059e3c0 pc=0x7da3b5
     github.com/docker/cli/vendor/github.com/docker/docker/client.(*Client).post(0xffcd60?, {0x10da7b8, 0xc0001160f0}, {0x21114f, 0x12}, 0x450d3e?, {0xffcd60?, 0xc00045c078?}, 0x450a25?)
       /go/src/github.com/docker/cli/vendor/github.com/docker/docker/client/request.go:41 +0x85 fp=0xc00059e4d0 sp=0xc00059e408 pc=0x7da005
     github.com/docker/cli/vendor/github.com/docker/docker/client.(*Client).ContainerCreate(0xc0000dd400, {0x10da7b8, 0xc0001160f0}, 0xc0000e88c0, 0xc000480d80, 0xc000122008, 0x0, {0x7ffde054cd17, 0xd})
       /go/src/github.com/docker/cli/vendor/github.com/docker/docker/client/container_create.go:63 +0x665 fp=0xc00059e818 sp=0xc00059e4d0 pc=0x7bf425
     github.com/docker/cli/cli/command/container.createContainer({0x10da7b8, 0xc0001160f0}, {0x10e3b38?, 0xc0004200f0?}, 0x0?, 0xc000192540)
       /go/src/github.com/docker/cli/cli/command/container/create.go:255 +0x7ea fp=0xc00059f780 sp=0xc00059e818 pc=0xae42ca
     github.com/docker/cli/cli/command/container.runContainer({0x10e3b38, 0xc0004200f0}, 0xc000192540, 0xc0000f2b00, 0xc00045c060)
       /go/src/github.com/docker/cli/cli/command/container/run.go:147 +0x2a6 fp=0xc00059f9e8 sp=0xc00059f780 pc=0xaf97e6
     github.com/docker/cli/cli/command/container.runRun({0x10e3b38, 0xc0004200f0}, 0x0?, 0xc000192540, 0xc0000f2b00)
       /go/src/github.com/docker/cli/cli/command/container/run.go:118 +0x76c fp=0xc00059fbe8 sp=0xc00059f9e8 pc=0xaf942c
     github.com/docker/cli/cli/command/container.NewRunCommand.func1(0xc000004600?, {0xc0000ee5a0?, 0x1?, 0x1e?})
       /go/src/github.com/docker/cli/cli/command/container/run.go:46 +0xef fp=0xc00059fc40 sp=0xc00059fbe8 pc=0xaf8c6f
     github.com/docker/cli/vendor/github.com/spf13/cobra.(*Command).execute(0xc000004600, {0xc00019b210, 0x1e, 0x1e})
       /go/src/github.com/docker/cli/vendor/github.com/spf13/cobra/command.go:940 +0x862 fp=0xc00059fd78 sp=0xc00059fc40 pc=0x9dbe42
     github.com/docker/cli/vendor/github.com/spf13/cobra.(*Command).ExecuteC(0xc000004300)
       /go/src/github.com/docker/cli/vendor/github.com/spf13/cobra/command.go:1068 +0x3bd fp=0xc00059fe30 sp=0xc00059fd78 pc=0x9dc6bd
     github.com/docker/cli/vendor/github.com/spf13/cobra.(*Command).Execute(...)
       /go/src/github.com/docker/cli/vendor/github.com/spf13/cobra/command.go:992
     main.runDocker(0x0?)
       /go/src/github.com/docker/cli/cmd/docker/docker.go:263 +0x4b7 fp=0xc00059ff08 sp=0xc00059fe30 pc=0xe731b7
     main.main()
       /go/src/github.com/docker/cli/cmd/docker/docker.go:274 +0x97 fp=0xc00059ff80 sp=0xc00059ff08 pc=0xe732d7
     runtime.main()
       /usr/local/go/src/runtime/proc.go:250 +0x212 fp=0xc00059ffe0 sp=0xc00059ff80 pc=0x4146b2
     runtime.goexit()
       /usr/local/go/src/runtime/asm_amd64.s:1598 +0x1 fp=0xc00059ffe8 sp=0xc00059ffe0 pc=0x445cc1

     [remaining goroutines omitted: all parked runtime housekeeping goroutines (force gc, GC sweep/scavenge wait, finalizer wait, ~28 idle GC workers) plus two net/http connection goroutines in IO wait / select]

     Command failed.
  6. Does anyone know if it is possible to transfer my Palworld progress from a server to my dedicated server?
  7. Thanks, it is OK now!! There is a problem with the game update, which doesn't recognize the server, but we only have to wait for an update.
  8. Hi, I need some help setting up a Palworld Docker server. I can use my Docker containers (from outside home) only with WireGuard, but is it possible to set up the Palworld server to be accessible without any VPN, so I can access it from everywhere (my friends too)? Thanks
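To the question above: yes, a dedicated game server can be reached without a VPN if the game port is published by the container and the same port is forwarded on the router; the trade-off is that the port is then exposed to the whole internet. A minimal sketch, assuming the default Palworld game port 8211/UDP; the image name "palworld-server" is a placeholder for whatever image the Docker template actually uses:

```shell
#!/bin/bash
# Sketch: publish the Palworld game port from the container to the host.
# "palworld-server" is a placeholder image name, not a real template.
start_palworld() {
    # -p publishes the default Palworld game port (8211/UDP) on the host
    docker run -d --name palworld -p 8211:8211/udp palworld-server
}
```

Friends would then connect to the server's public IP (or a DDNS hostname) on port 8211, with UDP 8211 forwarded on the router to the Unraid box's LAN IP.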
  9. I have solved it with a downgrade to 6.12.5 and a fresh update to 6.12.6; it's fine now.
  10. Hi, starting from today I have a problem. Until yesterday I could see motherboard and CPU temperatures and could control my fans via PWM. This morning nothing is working, or rather, I can see the coretemp CPU temperature, but there is no PWM control and no motherboard temperature. Until yesterday my sensor chip was NCT6775; today it is recognized as lm75... (here is my log from July: Jul 29 16:23:04 LordbyeNas autofan: autofan process ID 15733 started, To terminate it, type: autofan -q -c /sys/devices/platform/nct6775.2592/hwmon/hwmon1/pwm3 -f -q) How can I try to solve this, or where can I look for errors? Thanks

sensors-detect version 3.6.0
# System: JINGSHA AD12-B
# Kernel: 6.1.64-Unraid x86_64
# Processor: Intel(R) Xeon(R) CPU E5-2697 v3 @ 2.60GHz (6/63/2)

This program will help you determine which kernel modules you need to load to use lm_sensors most effectively. It is generally safe and recommended to accept the default answers to all questions, unless you know what you're doing.

Some south bridges, CPUs or memory controllers contain embedded sensors. Do you want to scan for them? This is totally safe. (YES/no):
Silicon Integrated Systems SIS5595... No
VIA VT82C686 Integrated Sensors... No
VIA VT8231 Integrated Sensors... No
AMD K8 thermal sensors... No
AMD Family 10h thermal sensors... No
AMD Family 11h thermal sensors... No
AMD Family 12h and 14h thermal sensors... No
AMD Family 15h thermal sensors... No
AMD Family 16h thermal sensors... No
AMD Family 17h thermal sensors... No
AMD Family 15h power sensors... No
AMD Family 16h power sensors... No
Hygon Family 18h thermal sensors... No
Intel digital thermal sensor... Success! (driver `coretemp')
Intel AMB FB-DIMM thermal sensor... No
Intel 5500/5520/X58 thermal sensor... No
VIA C7 thermal sensor... No
VIA Nano thermal sensor... No

Some Super I/O chips contain embedded sensors. We have to write to standard I/O ports to probe them. This is usually safe. Do you want to scan for Super I/O sensors? (YES/no):
Probing for Super-I/O at 0x2e/0x2f
Trying family `National Semiconductor/ITE'... Yes
Found unknown chip with ID 0xfdfd
Probing for Super-I/O at 0x4e/0x4f
Trying family `National Semiconductor/ITE'... Yes
Found unknown chip with ID 0xfdfd

Some systems (mainly servers) implement IPMI, a set of common interfaces through which system health data may be retrieved, amongst other things. We first try to get the information from SMBIOS. If we don't find it there, we have to read from arbitrary I/O ports to probe for such interfaces. This is normally safe. Do you want to scan for IPMI interfaces? (YES/no):
Probing for `IPMI BMC KCS' at 0xca0... Success! (confidence 4, driver `to-be-written')
Probing for `IPMI BMC SMIC' at 0xca8... Success! (confidence 4, driver `to-be-written')

Some hardware monitoring chips are accessible through the ISA I/O ports. We have to write to arbitrary I/O ports to probe them. This is usually safe though. Yes, you do have ISA I/O ports even if you do not have any ISA slots! Do you want to scan the ISA I/O ports? (YES/no):
Probing for `National Semiconductor LM78' at 0x290... No
Probing for `National Semiconductor LM79' at 0x290... No
Probing for `Winbond W83781D' at 0x290... No
Probing for `Winbond W83782D' at 0x290... No

Lastly, we can probe the I2C/SMBus adapters for connected hardware monitoring devices. This is the most risky part, and while it works reasonably well on most systems, it has been reported to cause trouble on some systems. Do you want to probe the I2C/SMBus adapters now? (YES/no):
Using driver `i2c-i801' for device 0000:00:1f.3: Wellsburg (PCH)
Module i2c-dev loaded successfully.

Next adapter: SMBus I801 adapter at 0580 (i2c-0)
Do you want to scan it? (YES/no/selectively):
Client found at address 0x4f
Probing for `National Semiconductor LM75'... No
Probing for `National Semiconductor LM75A'... No
Probing for `Dallas Semiconductor DS75'... Success! (confidence 3, driver `lm75')
Probing for `Maxim MAX6642'... No
Probing for `Texas Instruments TMP421'... No
Probing for `Texas Instruments TMP422'... No
Probing for `Texas Instruments TMP435'... No
Probing for `Texas Instruments TMP441'... No
Probing for `Maxim MAX6633/MAX6634/MAX6635'... No
Probing for `NXP/Philips SA56004'... No

Next adapter: NVIDIA i2c adapter 3 at 3:00.0 (i2c-1)
Do you want to scan it? (yes/NO/selectively):
Next adapter: NVIDIA i2c adapter 4 at 3:00.0 (i2c-2)
Do you want to scan it? (yes/NO/selectively):
Next adapter: NVIDIA i2c adapter 6 at 3:00.0 (i2c-3)
Do you want to scan it? (yes/NO/selectively):

Now follows a summary of the probes I have just done. Just press ENTER to continue:

Driver `coretemp':
  * Chip `Intel digital thermal sensor' (confidence: 9)

Driver `to-be-written':
  * ISA bus, address 0xca0
    Chip `IPMI BMC KCS' (confidence: 4)
  * ISA bus, address 0xca8
    Chip `IPMI BMC SMIC' (confidence: 4)

Driver `lm75':
  * Bus `SMBus I801 adapter at 0580'
    Busdriver `i2c_i801', I2C address 0x4f
    Chip `Dallas Semiconductor DS75' (confidence: 3)

Note: there is no driver for IPMI BMC KCS yet. Check https://hwmon.wiki.kernel.org/device_support_status for updates.

Do you want to generate /etc/sysconfig/lm_sensors? (yes/NO): n

To load everything that is needed, add this to one of the system initialization scripts (e.g. /etc/rc.d/rc.local):

#----cut here----
# Chip drivers
modprobe coretemp
modprobe lm75
/usr/bin/sensors -s
#----cut here----

You really should try these commands right now to make sure everything is working properly. Monitoring programs won't work until the needed modules are loaded.

Unloading i2c-dev... OK
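When sensors-detect stops finding the Super-I/O chip (the unknown ID 0xfdfd above) and only lm75 shows up, one low-risk first check is to list which hwmon drivers the kernel actually bound, then try loading nct6775 by hand. A sketch, read-only except for the commented modprobe:

```shell
#!/bin/bash
# List which hwmon drivers the kernel has bound, to see whether the
# Super-I/O chip came up as nct6775 or only lm75/coretemp are present.
for d in /sys/class/hwmon/hwmon*; do
    [ -e "$d" ] || continue               # no hwmon devices at all
    printf '%s: %s\n' "$d" "$(cat "$d/name")"
done

# If nct6775 is missing from the list, try loading it explicitly
# (needs root), then re-check the list and re-run `sensors`:
#   modprobe nct6775
```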
  11. Hi, I have problems with my Docker setup. Almost every time I shut down the server (with the shutdown command on the home page), I have to reconfigure my Plex library, and I lose access to Home Assistant. Home Assistant also has a strange problem: when I try to change the password with a console command, it says the user doesn't exist..., but the users do exist.
  12. Hi, it used to be enough to set it up to use Gmail as an SMTP server and enter the credentials for your Gmail account, but this will no longer work, as you now have to use App Passwords. From Google:

Create & use app passwords

Important: To create an app password, you need 2-Step Verification on your Google Account. If you use 2-Step Verification and get a "password incorrect" error when you sign in, you can try to use an app password.

1. Go to your Google Account.
2. Select Security.
3. Under "Signing in to Google," select 2-Step Verification.
4. At the bottom of the page, select App passwords.
5. Enter a name that helps you remember where you'll use the app password.
6. Select Generate.
7. To enter the app password, follow the instructions on your screen. The app password is the 16-character code that generates on your device.
8. Select Done.

If you've set up 2-Step Verification but can't find the option to add an app password, it might be because:
- Your Google Account has 2-Step Verification set up only for security keys.
- You're logged into a work, school, or another organization account.
- Your Google Account has Advanced Protection.

Tip: Usually, you'll need to enter an app password once per app or device.
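Once the 16-character app password exists, it can be sanity-checked outside Unraid's notification settings with a plain curl SMTP submission (assuming curl was built with SMTP support); the app password goes where the account password used to. The addresses and password below are placeholders:

```shell
#!/bin/bash
# Send a test mail through Gmail using an app password (NOT the normal
# account password). Every credential below is a placeholder.
send_test_mail() {
    local user="you@gmail.com"             # placeholder Gmail account
    local app_password="abcdabcdabcdabcd"  # placeholder 16-char app password
    # STARTTLS submission on port 587; curl uploads the message from stdin
    printf 'Subject: Unraid test\n\nIt works.\n' | \
    curl --ssl-reqd smtp://smtp.gmail.com:587 \
         --mail-from "$user" \
         --mail-rcpt "$user" \
         --user "$user:$app_password" \
         --upload-file -
}
# Call send_test_mail manually once real credentials are filled in.
```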
  13. I rebooted and now du and df show the same total... I'll wait and see.
  14. I tried:

root@LordbyeNas:~# du -sh /var/log
772K    /var/log
root@LordbyeNas:~# df -h /var/log
Filesystem      Size  Used Avail Use% Mounted on
tmpfs           384M   73M  312M  19% /var/log

I don't have any symlinks (or rather, none made manually by me...).
  15. Here is the diagnostic: lordbyenas-diagnostics-20230512-1523.zip
  16. I'll try to trim these files, thanks.

root@LordbyeNas:/var/log# ls -lh
total 244K
-rw------- 1 root root   768 May  8 09:34 btmp
-rw-r--r-- 1 root root     0 Apr 28  2021 cron
-rw-r--r-- 1 root root     0 Apr 28  2021 debug
-rw-r--r-- 1 root root   71K May  6 00:14 dmesg
-rw-r--r-- 1 root root   17K May 12 07:01 docker.log
-rw-r--r-- 1 root root  6.9K May  6 00:14 faillog
-rw-r--r-- 1 root root    56 May 12 15:20 gitcount
-rw-r--r-- 1 root root   14K May 12 15:20 gitflash
-rw-r--r-- 1 root root   63K May  6 00:14 lastlog
drwxr-xr-x 4 root root   140 May 11 03:00 libvirt/
-rw-r--r-- 1 root root   784 May  6 16:32 maillog
-rw-r--r-- 1 root root     0 May  6 00:14 mcelog
-rw-r--r-- 1 root root     0 Apr 28  2021 messages
drwxr-xr-x 2 root root    40 Aug 10  2022 nfsd/
drwxr-x--- 2 nobody root  60 May  6 00:15 nginx/
lrwxrwxrwx 1 root root    24 Nov 20 22:24 packages -> ../lib/pkgtools/packages/
drwxr-xr-x 5 root root   100 May  6 00:14 pkgtools/
drwxr-xr-x 2 root root   980 May 11 16:45 plugins/
drwxr-xr-x 2 root root    40 May 12 11:03 pwfail/
lrwxrwxrwx 1 root root    25 Nov 20 22:26 removed_packages -> pkgtools/removed_packages/
lrwxrwxrwx 1 root root    24 Nov 20 22:26 removed_scripts -> pkgtools/removed_scripts/
lrwxrwxrwx 1 root root    34 May  6 00:14 removed_uninstall_scripts -> pkgtools/removed_uninstall_scripts/
drwxr-xr-x 3 root root   340 May 12 15:17 samba/
-rw-r--r-- 1 root root    33 Dec 31  2019 scan
lrwxrwxrwx 1 root root    23 Nov 20 22:24 scripts -> ../lib/pkgtools/scripts/
-rw-r--r-- 1 root root     0 Apr 28  2021 secure
lrwxrwxrwx 1 root root    21 Nov 20 22:24 setup -> ../lib/pkgtools/setup/
-rw-r--r-- 1 root root     0 Apr 28  2021 spooler
drwxr-xr-x 3 root root    60 Sep 27  2022 swtpm/
-rw-r--r-- 1 root root  100K May  6 01:00 syslog
drwxr-xr-x 2 root root    80 May 10 21:00 unraid-api/
-rw-r--r-- 1 root root     0 May  6 00:14 vfio-pci
-rw-r--r-- 1 root root   587 May  6 00:14 wg-quick.log
-rw-rw-r-- 1 root utmp  6.8K May  6 00:15 wtmp
  17. Hi, I have a problem with these commands and the /var/log folder. If I use df -h I receive this output:

root@LordbyeNas:~# df -h
Filesystem      Size  Used Avail Use% Mounted on
rootfs           16G  2.4G   14G  15% /
tmpfs            32M  1.1M   31M   4% /run
/dev/sda1        15G  990M   14G   7% /boot
overlay          16G  2.4G   14G  15% /lib/firmware
overlay          16G  2.4G   14G  15% /lib/modules
devtmpfs        8.0M     0  8.0M   0% /dev
tmpfs            16G     0   16G   0% /dev/shm
cgroup_root     8.0M     0  8.0M   0% /sys/fs/cgroup
tmpfs           384M   71M  314M  19% /var/log
tmpfs           1.0M     0  1.0M   0% /mnt/disks
tmpfs           1.0M     0  1.0M   0% /mnt/remotes
tmpfs           1.0M     0  1.0M   0% /mnt/addons
tmpfs           1.0M     0  1.0M   0% /mnt/rootshare

Instead, with du:

root@LordbyeNas:~# du -h -c /var/log/
0       /var/log/pwfail
128K    /var/log/unraid-api
0       /var/log/swtpm/libvirt/qemu
0       /var/log/swtpm/libvirt
0       /var/log/swtpm
0       /var/log/samba/cores/rpcd_winreg
0       /var/log/samba/cores/rpcd_classic
0       /var/log/samba/cores/rpcd_lsad
0       /var/log/samba/cores/samba-dcerpcd
0       /var/log/samba/cores/winbindd
0       /var/log/samba/cores/nmbd
0       /var/log/samba/cores/smbd
0       /var/log/samba/cores
236K    /var/log/samba
0       /var/log/plugins
0       /var/log/pkgtools/removed_uninstall_scripts
4.0K    /var/log/pkgtools/removed_scripts
12K     /var/log/pkgtools/removed_packages
16K     /var/log/pkgtools
4.0K    /var/log/nginx
0       /var/log/nfsd
28K     /var/log/libvirt/qemu
0       /var/log/libvirt/ch
128K    /var/log/libvirt
752K    /var/log/
752K    total

Where does the difference come from? Thanks
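A common cause of du and df disagreeing on a filesystem like the /var/log tmpfs is a file that was deleted (or replaced, e.g. by a trim script's mv) while a process such as the syslog daemon still holds it open: df still counts the open inode, but du no longer sees a name for it. A sketch of how to check, plus a self-contained demonstration of the effect (`lsof +L1` lists open files whose on-disk link count is zero):

```shell
#!/bin/bash
# List open-but-deleted files still consuming space under /var/log
# (lsof +L1 shows open files with link count 0, i.e. deleted on disk).
lsof +L1 /var/log 2>/dev/null

# Self-contained demonstration: create a file, keep a file descriptor
# open on it, delete the name -- the space stays allocated until close.
tmp=$(mktemp)
exec 3>"$tmp"            # keep fd 3 open on the file
echo "still allocated" >&3
rm "$tmp"                # the name is gone, the inode is not
ls -l "/proc/$$/fd/3"    # the kernel marks the target "(deleted)"
exec 3>&-                # closing the fd finally frees the space
```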
  18. Sorry, I didn't include my script:

size=100k

function trimLog {
  file=$1
  temp="$file.$(date +%s%N).tmp"
  time=$(date --rfc-3339='seconds')
  before=$(du -sh "$file" | cut -f1)
  echo -n "$time: $file: $before=>"
  tail --bytes=$size "$file" > "$temp"
  chown $(stat -c '%U' "$file"):$(stat -c '%G' "$file") "$temp"
  chmod $(stat -c "%a" "$file") "$temp"
  mv "$temp" "$file"
  after=$(du -sh "$file" | cut -f1)
  echo "$after"
}

find /var/log -maxdepth 5 -type f -size +$size 2>/dev/null | sort -f |\
while read file; do trimLog "$file"; done

find "/boot/logs" "/var/log" "/var/lib/docker/containers" \( -name "*.log" -o -name "*log" \) -size +$size 2>/dev/null | sort -f |\
while read file; do trimLog "$file"; done
  19. Hi, I'm using your script, and for a little while it was OK; now I have a problem. The log is growing anyway, but I don't know where, or in which file, because:

root@MyNas:/var/log# df /var/log/
Filesystem     1K-blocks  Used Available Use% Mounted on
tmpfs             262144 18836    243308   8% /var/log

root@MyNas:/var/log# du -mh -c
0       ./pwfail
152K    ./unraid-api
0       ./swtpm/libvirt/qemu
0       ./swtpm/libvirt
0       ./swtpm
0       ./samba/cores/rpcd_lsad
0       ./samba/cores/samba-dcerpcd
0       ./samba/cores/winbindd
0       ./samba/cores/nmbd
0       ./samba/cores/smbd
0       ./samba/cores
212K    ./samba
0       ./plugins
0       ./pkgtools/removed_uninstall_scripts
4.0K    ./pkgtools/removed_scripts
12K     ./pkgtools/removed_packages
16K     ./pkgtools
4.0K    ./nginx
0       ./nfsd
40K     ./libvirt/qemu
0       ./libvirt/ch
76K     ./libvirt
796K    .
796K    total

root@LordbyeNas:/var/log# ls -lha
total 336K
drwxr-xr-x 11 root root   740 May  2 16:21 ./
drwxr-xr-x 15 root root   360 Dec 31  2019 ../
-rw-------  1 root root     0 Nov 20 22:25 btmp
-rw-r--r--  1 root root     0 Apr 28  2021 cron
-rw-r--r--  1 root root     0 Apr 28  2021 debug
-rw-r--r--  1 root root   71K May  1 19:42 dmesg
-rw-r--r--  1 root root   20K May  3 07:02 docker.log
-rw-r--r--  1 root root  6.9K May  1 19:43 faillog
-rw-r--r--  1 root root    67 May  3 08:24 gitcount
-rw-r--r--  1 root root   12K May  3 08:24 gitflash
-rw-r--r--  1 root root   63K May  1 19:43 lastlog
drwxr-xr-x  4 root root   140 May  1 19:44 libvirt/
-rw-r--r--  1 root root   358 May  2 00:07 maillog
-rw-r--r--  1 root root     0 May  1 19:42 mcelog
-rw-r--r--  1 root root     0 Apr 28  2021 messages
drwxr-xr-x  2 root root    40 Aug 10  2022 nfsd/
drwxr-x---  2 nobody root  60 May  1 19:43 nginx/
lrwxrwxrwx  1 root root    24 Nov 20 22:24 packages -> ../lib/pkgtools/packages/
drwxr-xr-x  5 root root   100 May  1 19:43 pkgtools/
drwxr-xr-x  2 root root   980 May  3 08:00 plugins/
drwxr-xr-x  2 root root    40 May  3 16:24 pwfail/
lrwxrwxrwx  1 root root    25 Nov 20 22:26 removed_packages -> pkgtools/removed_packages/
lrwxrwxrwx  1 root root    24 Nov 20 22:26 removed_scripts -> pkgtools/removed_scripts/
lrwxrwxrwx  1 root root    34 May  1 19:43 removed_uninstall_scripts -> pkgtools/removed_uninstall_scripts/
drwxr-xr-x  3 root root   340 May  3 16:24 samba/
-rw-r--r--  1 root root    33 Dec 31  2019 scan
lrwxrwxrwx  1 root root    23 Nov 20 22:24 scripts -> ../lib/pkgtools/scripts/
-rw-r--r--  1 root root     0 Apr 28  2021 secure
lrwxrwxrwx  1 root root    21 Nov 20 22:24 setup -> ../lib/pkgtools/setup/
-rw-r--r--  1 root root     0 Apr 28  2021 spooler
drwxr-xr-x  3 root root    60 Sep 27  2022 swtpm/
-rw-r--r--  1 root root  100K May  2 16:21 syslog
-rw-r--r--  1 root root  100K May  2 16:21 syslog.1
drwxr-xr-x  2 root root    80 May  2 16:21 unraid-api/
-rw-r--r--  1 root root     0 May  1 19:42 vfio-pci
-rw-r--r--  1 root root   587 May  1 19:42 wg-quick.log
-rw-rw-r--  1 root utmp  6.8K May  1 19:43 wtmp

I don't understand where the difference between 796K and 18836K comes from...
  20. Hi, yes, I removed it and installed it again, then generated two new NETDATA_CLAIM_TOKEN and NETDATA_CLAIM_ROOMS keys; now it is working.
  21. For the moment I have patched things over with this script found on the forum, which trims the syslog files down to 1 MB:

#!/bin/bash
size=1M

function trimLog {
  file=$1
  temp="$file.$(date +%s%N).tmp"
  time=$(date --rfc-3339='seconds')
  before=$(du -sh "$file" | cut -f1)
  echo -n "$time: $file: $before=>"
  tail --bytes=$size "$file" > "$temp"
  chown $(stat -c '%U' "$file"):$(stat -c '%G' "$file") "$temp"
  chmod $(stat -c "%a" "$file") "$temp"
  mv "$temp" "$file"
  after=$(du -sh "$file" | cut -f1)
  echo "$after"
}

find /var/log -maxdepth 1 -type f -size +$size 2>/dev/null | sort -f |\
while read file; do trimLog "$file"; done

find "/boot/logs" "/var/lib/docker/containers" -name "*.log" -size +$size 2>/dev/null | sort -f |\
while read file; do trimLog "$file"; done
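One caveat with this trim approach (an editorial observation, not part of the original post): `tail > temp; mv temp file` gives the log a new inode, so a daemon still writing to the old inode keeps the old, now-deleted space allocated until it reopens the file, which is exactly the kind of du/df mismatch reported in the earlier posts. A sketch of an alternative that trims in place and keeps the inode, so open writers are unaffected:

```shell
#!/bin/bash
# Trim a log in place instead of replacing it, so processes that hold
# the file open do not keep a deleted inode alive.
trim_in_place() {
    local file=$1 keep=${2:-1M}
    local tmp
    tmp=$(mktemp)
    tail --bytes="$keep" "$file" > "$tmp"   # save the tail we want to keep
    cat "$tmp" > "$file"                    # overwrite contents; inode unchanged
    rm -f "$tmp"
}

# Demonstration: the inode number survives the trim.
f=$(mktemp)
head -c 2M /dev/zero > "$f"
before=$(stat -c %i "$f")
trim_in_place "$f" 1M
after=$(stat -c %i "$f")
[ "$before" = "$after" ] && echo "inode preserved, size now $(stat -c %s "$f") bytes"
rm -f "$f"
```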
  22. Hi, in my system log I have a flood (every second) of one error:

emhttpd: errore: share_luks_status, 6173: operazione non supportata (95): getxattr: /mnt/user/GoogleDrive

(in English: "error: share_luks_status, 6173: operation not supported (95): getxattr: /mnt/user/GoogleDrive"). I was told that I have to move the folder onto a disk share rather than a user share, but at the moment, for reasons of time and functionality, I can't do it; on top of that, Unassigned Devices doesn't even show me the "disk" disk. I could try with a further script, but the idea of reconfiguring Plex makes me uneasy. The problem is that a partition (the log one) fills up and I have to reboot. I wanted to know: can the log/partition be cleared without rebooting? Or can the logging of that error be avoided? Thanks
  23. Is it possible to stop logging this specific error?
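To the question in the last two posts: system logging here is handled by rsyslog, which supports property-based filters that can discard messages matching a substring before they reach /var/log/syslog. A hedged sketch; the drop-in path and filename are assumptions (it only works if /etc/rsyslog.conf includes /etc/rsyslog.d/*.conf), and since Unraid's root filesystem lives in RAM, the file would have to be recreated on each boot (e.g. from the go file):

```
# /etc/rsyslog.d/01-drop-luks-status.conf   (hypothetical drop-in file)
# Discard any message containing the flooding string before it is written.
:msg, contains, "share_luks_status" stop
```

After adding the file, the syslog daemon needs a restart to pick it up (on Slackware-based Unraid this is typically an rc script under /etc/rc.d; the exact script name is an assumption to verify on your release).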