Roxelchen

Members
Posts: 6
Joined · Last visited

Everything posted by Roxelchen

  1. Is anyone else having problems with "Add root share"? It used to work for me in the past, but now the share won't show up on my systems. I have deleted and re-created the root share and it is currently mounted, but I can't see it.
  2. Just updated to 20.0.10. Now I am getting: "There are some errors regarding your setup. PHP configuration option output_buffering must be disabled." Any clue how to do that?
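That warning points at a PHP setting, not at anything inside Nextcloud itself. A minimal sketch of the usual fix, assuming a standard PHP-FPM setup: output_buffering is a PHP_INI_PERDIR directive, so it can be disabled globally in php.ini or per directory via a .user.ini file in the Nextcloud web root (the /var/www/nextcloud path below is an assumption):

```ini
; Option 1: php.ini (global) – disable output buffering entirely
output_buffering = Off

; Option 2: .user.ini in the Nextcloud web root, e.g. /var/www/nextcloud/.user.ini
; (PHP-FPM re-reads .user.ini after user_ini.cache_ttl, 300 seconds by default)
output_buffering = 0
```

After changing either file, restart PHP-FPM (or the container) so the setting takes effect, then re-check the admin overview page.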
  3. My Authelia Docker container is not starting. Log output from today at 13:11:

     time="2021-03-29T13:11:44+02:00" level=warning msg="DEPRECATED: SMTP Notifier `disable_verify_cert` option has been replaced by `notifier.smtp.tls.skip_verify` (will be removed in 4.28.0)"
     time="2021-03-29T13:11:44+02:00" level=info msg="Logging severity set to debug"
     time="2021-03-29T13:11:44+02:00" level=debug msg="Storage schema is being checked to verify it is up to date"
     time="2021-03-29T13:11:44+02:00" level=debug msg="Storage schema is up to date"
     panic: Unable to parse database: yaml: line 4: did not find expected key

     goroutine 1 [running]:
     github.com/authelia/authelia/internal/authentication.NewFileUserProvider(0xc000334408, 0xd)
         github.com/authelia/authelia/internal/authentication/file_user_provider.go:54 +0x22c
     main.startServer()
         github.com/authelia/authelia/cmd/authelia/main.go:91 +0xa88
     main.main.func1(0xc000384280, 0xc00032c9e0, 0x0, 0x2)
         github.com/authelia/authelia/cmd/authelia/main.go:137 +0x25
     github.com/spf13/cobra.(*Command).execute(0xc000384280, 0xc0000a2160, 0x2, 0x2, 0xc000384280, 0xc0000a2160)
         github.com/spf13/[email protected]/command.go:856 +0x2c2
     github.com/spf13/cobra.(*Command).ExecuteC(0xc000384280, 0xc00019ff58, 0x4, 0x4)
         github.com/spf13/[email protected]/command.go:960 +0x375
     github.com/spf13/cobra.(*Command).Execute(...)
         github.com/spf13/[email protected]/command.go:897
     main.main()
         github.com/authelia/authelia/cmd/authelia/main.go:154 +0x185
     Container stopped

     Any clue what might be wrong in my configuration?
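For context on the panic: "yaml: line 4: did not find expected key" is raised while parsing the users database file, and it almost always means broken indentation near that line (a tab character, or a key nested one level too deep or too shallow). A minimal sketch of a users file that parses, with the user id and values purely illustrative; field names are assumed from Authelia's file-backend documentation:

```yaml
# users_database.yml – every level indented with spaces, never tabs;
# a tab or mis-indented key around line 4 produces exactly this panic
users:
  janedoe:                       # hypothetical user id
    displayname: "Jane Doe"
    password: "$argon2id$..."    # password hash placeholder, not a real hash
    email: jane@example.com
```

Comparing the indentation of your file around line 4 against this shape is usually enough to spot the offending key.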
  4. Having the same issue right now. I went into /usr/share/tessdata and downloaded deu.traineddata via:

     wget https://github.com/tesseract-ocr/tessdata/blob/master/deu.traineddata
     chmod +x deu.traineddata

     See:

     bash-5.0# cd /usr/share/tessdata/
     bash-5.0# ls -l
     total 35504
     drwxr-xr-x 1 root root      360 Mar  1 19:05 configs
     -rwxr-xr-x 1 root root    64820 Mar  7 18:14 deu.traineddata
     -rwxr-xr-x 1 root root 23466654 Jul  9  2019 eng.traineddata
     -rwxr-xr-x 1 root root  2251950 Jul  9  2019 equ.traineddata
     -rwxr-xr-x 1 root root 10562874 Jul  9  2019 osd.traineddata
     -rw-r--r-- 1 root root      572 Jul  9  2019 pdf.ttf
     drwxr-xr-x 1 root root       88 Mar  1 19:05 tessconfigs

     Still getting:

     pyocr.error.TesseractError: (1, b'Error opening data file /usr/share/tessdata/deu.traineddata\nPlease make sure the TESSDATA_PREFIX environment variable is set to your "tessdata" directory.\nFailed loading language \'deu\'\nTesseract couldn\'t load any languages!\nCould not initialize tesseract.\n')

     and the paperless_consumer container crashed. Everything worked for some weeks and now this is happening. Any idea why?
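One thing stands out in the listing above: deu.traineddata is only ~64 KB, while eng.traineddata is ~23 MB. A wget against a github.com/.../blob/... URL downloads GitHub's HTML viewer page for the file, not the file itself; the raw URL serves the actual binary. A sketch of the difference (the target path is taken from the post; the URL rewrite is just bash string substitution):

```shell
#!/usr/bin/env bash
# The blob URL serves GitHub's HTML page, not the traineddata binary.
blob_url="https://github.com/tesseract-ocr/tessdata/blob/master/deu.traineddata"

# Swapping "blob" for "raw" yields the direct-download URL.
raw_url="${blob_url/blob/raw}"
echo "$raw_url"
# → https://github.com/tesseract-ocr/tessdata/raw/master/deu.traineddata

# Then fetch the real file into the tessdata directory:
# wget -O /usr/share/tessdata/deu.traineddata "$raw_url"
```

A correctly downloaded deu.traineddata should be several megabytes, comparable to the other language files; the chmod +x is harmless but unnecessary, since traineddata files are data, not executables.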
  5. I have been running paperless for a few days and I am absolutely in love with it. The problem I ran into yesterday is bad performance when a PDF file has more than one page. I uploaded a 2 MB, 8-page file (not that much, actually...) and the OCR process took over 30 minutes while using 100% CPU on all 4 Xeon 1225-v3 cores. Maybe that has something to do with this issue: https://github.com/the-paperless-project/paperless/issues/438 ? Does anyone have an idea how to optimize that process?

     paperless-consumer docker log:

     Consuming /consume/03.2020.pdf
     ** Processing: /tmp/paperless/paperless-up38twsl/convert.png
     500x700 pixels, 3x16 bits/pixel, RGB
     Input IDAT size = 575331 bytes
     Input file size = 575592 bytes
     Trying:
       zc = 9  zm = 9  zs = 0  f = 0   IDAT size = 545251
       zc = 9  zm = 8  zs = 0  f = 0   IDAT size = 545208
     Selecting parameters:
       zc = 9  zm = 9  zs = 0  f = 1   IDAT size = 494809
     Output file: /tmp/paperless/paperless-up38twsl/optipng.png
     Output IDAT size = 494809 bytes (80522 bytes decrease)
     Output file size = 494866 bytes (80726 bytes = 14.02% decrease)
     Processing sheet #1: /tmp/paperless/paperless-up38twsl/convert-0002.pnm -> /tmp/paperless/paperless-up38twsl/convert-0002.unpaper.pnm
     Processing sheet #1: /tmp/paperless/paperless-up38twsl/convert-0000.pnm -> /tmp/paperless/paperless-up38twsl/convert-0000.unpaper.pnm
     Processing sheet #1: /tmp/paperless/paperless-up38twsl/convert-0001.pnm -> /tmp/paperless/paperless-up38twsl/convert-0001.unpaper.pnm
     Processing sheet #1: /tmp/paperless/paperless-up38twsl/convert-0003.pnm -> /tmp/paperless/paperless-up38twsl/convert-0003.unpaper.pnm
     [pgm_pipe @ 0x55698b596f80]
     [pgm_pipe @ 0x56315b5eaf80]
     [pgm_pipe @ 0x55b79cc53f80] Stream #0: not enough frames to estimate rate; consider increasing probesize
     Stream #0: not enough frames to estimate rate; consider increasing probesize
     [pgm_pipe @ 0x55d75f3f5f80] Stream #0: not enough frames to estimate rate; consider increasing probesize
     Stream #0: not enough frames to estimate rate; consider increasing probesize
     [image2 @ 0x55b79cc55600] Using AVStream.codec to pass codec parameters to muxers is deprecated, use AVStream.codecpar instead.
     [image2 @ 0x55b79cc55600] Encoder did not produce proper pts, making some up.
     Processing sheet #1: /tmp/paperless/paperless-up38twsl/convert-0004.pnm -> /tmp/paperless/paperless-up38twsl/convert-0004.unpaper.pnm
     [pgm_pipe @ 0x55a4ad8d8f80] Stream #0: not enough frames to estimate rate; consider increasing probesize
     [image2 @ 0x55698b598600] Using AVStream.codec to pass codec parameters to muxers is deprecated, use AVStream.codecpar instead.
     [image2 @ 0x55698b598600] Encoder did not produce proper pts, making some up.
     out of deviation range - NO ROTATING
     [image2 @ 0x55d75f3f7600] Using AVStream.codec to pass codec parameters to muxers is deprecated, use AVStream.codecpar instead.
     Processing sheet #1: /tmp/paperless/paperless-up38twsl/convert-0005.pnm -> /tmp/paperless/paperless-up38twsl/convert-0005.unpaper.pnm
     [image2 @ 0x55d75f3f7600] Encoder did not produce proper pts, making some up.
     [pgm_pipe @ 0x564bda956f80] Stream #0: not enough frames to estimate rate; consider increasing probesize
     Processing sheet #1: /tmp/paperless/paperless-up38twsl/convert-0006.pnm -> /tmp/paperless/paperless-up38twsl/convert-0006.unpaper.pnm
     [pgm_pipe @ 0x5610d26a6f80] Stream #0: not enough frames to estimate rate; consider increasing probesize
     [image2 @ 0x56315b5ec600] Using AVStream.codec to pass codec parameters to muxers is deprecated, use AVStream.codecpar instead.
     [image2 @ 0x56315b5ec600] Encoder did not produce proper pts, making some up.
     Processing sheet #1: /tmp/paperless/paperless-up38twsl/convert-0007.pnm -> /tmp/paperless/paperless-up38twsl/convert-0007.unpaper.pnm
     [pgm_pipe @ 0x56090cae1f80] Stream #0: not enough frames to estimate rate; consider increasing probesize
     [image2 @ 0x55a4ad8da600] Using AVStream.codec to pass codec parameters to muxers is deprecated, use AVStream.codecpar instead.
     [image2 @ 0x55a4ad8da600] Encoder did not produce proper pts, making some up.
     [image2 @ 0x564bda958600] Using AVStream.codec to pass codec parameters to muxers is deprecated, use AVStream.codecpar instead.
     [image2 @ 0x564bda958600] Encoder did not produce proper pts, making some up.
     [image2 @ 0x5610d26a8600] Using AVStream.codec to pass codec parameters to muxers is deprecated, use AVStream.codecpar instead.
     [image2 @ 0x5610d26a8600] Encoder did not produce proper pts, making some up.
     [image2 @ 0x56090cae3600] Using AVStream.codec to pass codec parameters to muxers is deprecated, use AVStream.codecpar instead.
     [image2 @ 0x56090cae3600] Encoder did not produce proper pts, making some up.
     OCRing the document
     Parsing for deu
     Parsing for deu
     Parsing for deu
     Detected document date 2014-01-20T00:00:00+01:00 based on string 20.01.2014 d
     Document 20140120000000: 03.2020 consumption finished
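For what it's worth, the log above shows the consumer converting, unpaper-ing, and OCRing each of the 8 pages, which is where the CPU time goes. The-paperless-project's example configuration describes environment variables for capping the OCR worker pool; a hedged docker-compose sketch, with the service layout hypothetical and the variable names assumed from that project's paperless.conf.example:

```yaml
# docker-compose fragment (hypothetical service layout)
services:
  consumer:
    image: thepaperlessproject/paperless
    environment:
      # assumption: caps parallel OCR workers so the host stays responsive
      - PAPERLESS_OCR_THREADS=2
      # assumption: limit language detection to German, as in the log above
      - PAPERLESS_OCR_LANGUAGES=deu
```

Fewer workers trades wall-clock time per document for a usable machine; checking the project's issue tracker (e.g. the linked #438) for page-count-specific slowdowns is still worthwhile.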