I've recently deployed the Docker container for this, and it scanned roughly 40% of the files in the share before reaching a point where the container keeps crashing.
On restart, it performs another scan, rechecking all previously scanned folders (skipping them, since it sees they have already been indexed in the DB). But once it starts adding new paths/files to the index, it indexes a handful of new files and then crashes shortly thereafter (error below):
2023/10/31 07:08:22 Scanning media: /photos/media-various/Video Rip/rhrn sewquence/IMG_0315.JPG
2023/10/31 07:08:22 Job finished
2023/10/31 07:08:22 Queue waiting for lock
2023/10/31 07:08:22 Queue running: in_progress: 39, max_tasks: 40, queue_len: 2651
2023/10/31 07:08:22 Queue starting job
2023/10/31 07:08:22 Queue waiting
2023/10/31 07:08:22 Starting job
2023/10/31 07:08:22 Job finished
2023/10/31 07:08:22 Queue waiting for lock
2023/10/31 07:08:22 Queue running: in_progress: 39, max_tasks: 40, queue_len: 2650
2023/10/31 07:08:22 Queue starting job
2023/10/31 07:08:22 Queue waiting
2023/10/31 07:08:22 Starting job
2023/10/31 07:08:22 Scanning media: /photos/2022/Photos/holiday/IMG_3222.JPG
panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x1 addr=0x1c8 pc=0xb63950]
goroutine 11530 [running]:
github.com/photoview/photoview/api/scanner/scanner_tasks/processing_tasks.ProcessVideoTask.ProcessMedia({{}}, {{0xf68cf8?, 0xc001843b00?}}, 0xc00188e8d0, {0xc000124a68, 0x12})
/app/scanner/scanner_tasks/processing_tasks/process_video_task.go:115 +0x16d0
github.com/photoview/photoview/api/scanner/scanner_tasks.scannerTasks.ProcessMedia({{}}, {{0xf68cf8?, 0xc001843b00?}}, 0xe3f640?, {0xc000124a68, 0x12})
/app/scanner/scanner_tasks/scanner_tasks.go:128 +0x15d
github.com/photoview/photoview/api/scanner.processMedia({{0xf68cf8?, 0xc001843b00?}}, 0xc00188e8d0)
/app/scanner/scanner_album.go:204 +0x9b
github.com/photoview/photoview/api/scanner.ScanAlbum.func1({{0xf68cf8?, 0xc001843b00?}})
/app/scanner/scanner_album.go:115 +0x55
github.com/photoview/photoview/api/scanner/scanner_task.TaskContext.DatabaseTransaction.func1(0xc000444f60?)
/app/scanner/scanner_task/scanner_task.go:73 +0x3a
gorm.io/gorm.(*DB).Transaction(0xc000444f60, 0xc000118760, {0x0, 0x0, 0x0})
/go/pkg/mod/gorm.io/[email protected]/finisher_api.go:606 +0x29e
github.com/photoview/photoview/api/scanner/scanner_task.TaskContext.DatabaseTransaction({{0xf68cf8?, 0xc000ea3710?}}, 0xc00188e900, {0x0, 0x0, 0x0})
/app/scanner/scanner_task/scanner_task.go:72 +0xed
github.com/photoview/photoview/api/scanner.ScanAlbum({{0xf68cf8?, 0xc000444ff0?}})
/app/scanner/scanner_album.go:114 +0x477
github.com/photoview/photoview/api/scanner/scanner_queue.(*ScannerJob).Run(...)
/app/scanner/scanner_queue/queue.go:35
github.com/photoview/photoview/api/scanner/scanner_queue.(*ScannerQueue).processQueue.func1()
/app/scanner/scanner_queue/queue.go:144 +0x72
created by github.com/photoview/photoview/api/scanner/scanner_queue.(*ScannerQueue).processQueue
/app/scanner/scanner_queue/queue.go:142 +0x16a
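For what it's worth, the trace shows `ProcessVideoTask.ProcessMedia` (process_video_task.go:115) panicking right after a .JPG was picked up, and a SIGSEGV panic in Go is almost always an unchecked nil dereference. A minimal sketch of that failure class, assuming (purely hypothetically; these names are not Photoview's actual code) that a probe step can return nil for a file routed down the video path:

```go
package main

import "fmt"

// VideoMetadata is a hypothetical stand-in for whatever probe result the
// scanner attaches to a media file; the names here are illustrative only.
type VideoMetadata struct {
	Width, Height int
	Codec         string
}

// probeVideo simulates a probe step that returns nil (with no error) when
// the file is not actually a video, e.g. a JPG routed down the video path.
func probeVideo(path string) *VideoMetadata {
	return nil
}

func main() {
	meta := probeVideo("/photos/2022/Photos/holiday/IMG_3222.JPG")

	// With a guard, the bad file is reported and skipped:
	if meta == nil {
		fmt.Println("no video metadata; skipping file")
		return
	}

	// Without the guard, this line reproduces the same class of crash:
	//   panic: runtime error: invalid memory address or nil pointer dereference
	//   [signal SIGSEGV: segmentation violation ...]
	fmt.Println(meta.Codec)
}
```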
Any pointers as to why this would happen, or how to resolve or troubleshoot it further?
I've tried this with a range of concurrent scanner worker counts, from 2 or 3, to 10, to now 40. Regardless of the number used, it always reaches a point where it fails.
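That matches my reading of the trace: since a panic in any goroutine takes down the whole Go process, one bad file will kill the scan no matter how many workers are running, which is why the count doesn't seem to matter. As a generic Go sketch (not Photoview's actual queue code), a per-job recover is the pattern that would contain this so a single bad file logs an error instead of crashing the container:

```go
package main

import "log"

// runJob is a generic Go pattern: convert a per-job panic into a logged
// error so that one bad file cannot take down the whole scanner process.
func runJob(path string, job func(string)) {
	defer func() {
		if r := recover(); r != nil {
			log.Printf("scan job for %s panicked: %v", path, r)
		}
	}()
	job(path)
}

func main() {
	runJob("/photos/some/bad/file.JPG", func(path string) {
		panic("simulated nil pointer dereference")
	})
	log.Println("queue keeps running after the failed job")
}
```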