[Support] Linuxserver.io - Nextcloud


Recommended Posts

On 9/20/2023 at 4:19 PM, iXNyNe said:

You likely also need to update your nginx files. Specifically:

/config/nginx/site-confs/default.conf

This line: https://github.com/linuxserver/docker-nextcloud/blob/3accfcac32414f739ef312281d256a3fc9a47c6f/root/defaults/nginx/site-confs/default.conf.sample#L29C31-L29C31 would need to be adjusted to match your max upload setting in PHP. You may also need to update the timeout line right below it. After making the changes, restart the container.
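
For reference, a rough sketch of the relevant lines in /config/nginx/site-confs/default.conf (the values here are examples; raise them to match the upload_max_filesize/post_max_size you set in PHP):

    # set max upload size and increase upload timeout:
    client_max_body_size 512M;
    client_body_timeout 300s;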

I'm having the same issue. I've changed settings in various places, including SWAG, but I still have the below:

 

PHP

Version: 8.2.10

Memory limit: 512 MB

Max execution time: 3600

Upload max size: 512 MB

 

Edited by turnipisum
Link to comment
57 minutes ago, turnipisum said:

I'm having the same issue. I've changed settings in various places, including SWAG, but I still have the below:

 

PHP

Version: 8.2.10

Memory limit: 512 MB

Max execution time: 3600

Upload max size: 512 MB

Scrap that, I've fixed it!

 

I added the below to php-local.ini and now it's working.

 

memory_limit = 8G
post_max_size = 16G
upload_max_filesize = 16G
max_execution_time = 3600
max_input_time = 600
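
(For anyone copying this: the new values only take effect after the container is restarted, e.g. roughly:

docker restart nextcloud

where nextcloud is whatever your container is named.)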

 

PHP

Version: 8.2.10

Memory limit: 8 GB

Max execution time: 3600

Upload max size: 16 GB

Edited by turnipisum
Link to comment
On 10/10/2023 at 8:19 PM, alturismo said:

Then maybe just check your files and logs to see what is triggering it ...

 

You should know by now that it's "normal" ... maybe some NC plugin is doing something, maybe ... check the access logs ....

 

I can only recommend NOT putting the share on the array ...

I understand the recommendation but with limited space on the share this is not an option.

The logs mention nothing that points to a reason.

 

Further, there is no read/write activity. In unRAID, on the drive overview page, I can see that not a single byte is read from or written to the drive where the Nextcloud share is placed. But it still keeps the drives spun up: both the share drive in the array and the parity drive.

Link to comment

I'm having issues with my Nextcloud installation. I'm running LinuxServer's Nextcloud container. When I tried to log in to the webui, I received the following error message:
 

"Error Your data directory is readable by other users. Please change the permissions to 0770 so that the directory cannot be listed by other users."
 

I then ran chmod -R 0770 /mnt/user/Nextcloud.
After doing this, when I try to access it, I get the following error message:

"Error Your data directory is invalid. Ensure there is a file called ".ocdata" in the root of the data directory. Your data directory is not writable. Permissions can usually be fixed by giving the web server write access to the root directory. See https://docs.nextcloud.com/server/27/go.php?to=admin-dir_permissions." I have a .ocdata file under /mnt/user/Nextcloud.
 

The permissions on /mnt/user/Nextcloud are as follows:
drwxrwx--- 1 root root 113 Sep 26 07:55 NextCloud/

I can't seem to get this to work correctly. Does anyone have any tips? Do I need to set another owner/group?

Edited by TrondHjertager
Link to comment
8 minutes ago, TrondHjertager said:

I'm having issues with my Nextcloud installation. I'm running LinuxServer's Nextcloud container. When I tried to log in to the webui, I received the following error message:
 

"Error Your data directory is readable by other users. Please change the permissions to 0770 so that the directory cannot be listed by other users."
 

I then ran chmod -R 0770 /mnt/user/Nextcloud.
After doing this, when I try to access it, I get the following error message:

"Error Your data directory is invalid. Ensure there is a file called ".ocdata" in the root of the data directory. Your data directory is not writable. Permissions can usually be fixed by giving the web server write access to the root directory. See https://docs.nextcloud.com/server/27/go.php?to=admin-dir_permissions." I have a .ocdata file under /mnt/user/Nextcloud.
 

The permissions on /mnt/user/Nextcloud are as follows:
drwxrwx--- 1 root root 113 Sep 26 07:55 NextCloud/

I can't seem to get this to work correctly. Does anyone have any tips? Do I need to set another owner/group?

Looks like I had to use nobody:nobody instead of root:root. That seems to have done the trick for me.
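
Roughly, what that looks like from the unRAID terminal, assuming the defaults (data directory at /mnt/user/Nextcloud, container running with PUID=99/PGID=100, i.e. nobody:nobody on unRAID):

chown -R nobody:nobody /mnt/user/Nextcloud
chmod -R 0770 /mnt/user/Nextcloud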

Link to comment
On 10/11/2023 at 2:26 AM, turnipisum said:

Scrap that, I've fixed it!

 

I added the below to php-local.ini and now it's working.

 

memory_limit = 8G
post_max_size = 16G
upload_max_filesize = 16G
max_execution_time = 3600
max_input_time = 600

 

PHP

Version: 8.2.10

Memory limit: 8 GB

Max execution time: 3600

Upload max size: 16 GB

Is it working using WebDAV? I have no problem uploading files larger than 10GB from the web or even from my iPhone, but using WebDAV, no matter what I tried, I can't upload files larger than 500MB.

Link to comment

How can I fix this error permanently:

 

Quote

Your data directory is readable by other users.

Please change the permissions to 0770 so that the directory cannot be listed by other users.

 

I can run a script that fixes the permissions, but if I make a change to a share or something in unRAID it seems to reset, and I have to run the same script again.

 

Quote

chown 99:100 /mnt/user/NextCloud_Data
chmod  0770 /mnt/user/NextCloud_Data
 

 

Is there something in my unRAID setup, or in the Docker setup, that needs to change so this sticks? I'm confused why it keeps reverting.

Link to comment

Hi, I'm trying to install Collabora Online - Built-in CODE Server so that I can use Nextcloud Office. In the description it says

 

"The download is rather big so it is possible you experience a time-out when using the web interface. You can use the OCC command line tool to install the built-in server:

```sudo -u wwwrun php -d memory_limit=512M ./occ app:install richdocumentscode```

Where wwwrun is the user of your web server."

 

Sure enough, if I try to install it from the web UI, it fails. Actually, it will say that it's installed, but then I can't select it in the Nextcloud Office settings, and the next time I restart the container the logs say "Removing richdocumentscode". But I can't figure out where to run that command. If I go to the container in unRAID, select Console, and run

 

"sudo -u [My User in Nextcloud] -d memory_limit=512M ./occ app:install richdocumentscode" I get the error that the sudo user is unknown. Obviously this is operator error on my part, but I don't know how to install the CODE server from the terminal, or I should say I don't understand what I need to change from the command in order to install it in the terminal on UnRAID. Sorry for being a noob, but I would greatly appreciate it if someone could explain how to do this. Thanks. 

Link to comment
6 hours ago, RebelLion1519 said:

but I would greatly appreciate it if someone could explain how to do this. Thanks. 

Most answers can be found by using the search ;) either in this topic (upper right) or by just googling it ...

 

Here is an answer for reference, as there have been many, many questions about the built-in Office ... taken from a quick Google search:

 

https://discourse.linuxserver.io/t/nextcloud-collabora-code/2573
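
For what it's worth, a rough sketch of what usually works with this image (the container name nextcloud is just an example; the lsio image ships an occ wrapper, so no sudo or wwwrun user is needed). From the unRAID terminal on the host:

docker exec -it nextcloud occ app:install richdocumentscode

If the large download runs out of memory, you can call PHP directly with a raised limit (path as used by this image in this thread):

docker exec -it -u abc nextcloud php -d memory_limit=512M /config/www/nextcloud/occ app:install richdocumentscode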

Link to comment
On 10/20/2023 at 2:10 PM, Kilrah said:

Are you using ZFS on the drives where the shares related to NC are stored?

 

 

If this is aimed at my question: I do use ZFS, but the Nextcloud data share is not stored on a ZFS pool or a drive using ZFS. It's stored on the array, limited to one XFS disk.

Link to comment

Hello,

I'm having trouble using OnlyOffice in Nextcloud on my phone and away from home. It works fine on my desktop connected to the same LAN as my server. For example, when I try to open a spreadsheet in the Nextcloud app I get net::ERR_BLOCKED_BY_RESPONSE. I'm running the nextcloud and documentserver containers with the SWAG container as the reverse proxy. I'm guessing this is an error in the proxy configs, because I recently updated the proxy-conf files that the SWAG logs said were out of date, and then this problem started. I'm hoping someone can help me out, as I really have no idea what anything in the proxy conf files means. I'll include the three conf files that I recently changed; if anyone can help me understand what, if anything, is wrong, that would be much appreciated.

 

Nextcloud proxy-conf:

server {
    listen 443 ssl http2;
    listen [::]:443 ssl http2;

    server_name nextcloud.*;

    include /config/nginx/ssl.conf;

    client_max_body_size 0;

    location / {
        include /config/nginx/proxy.conf;
        include /config/nginx/resolver.conf;
        set $upstream_app nextcloud;
        set $upstream_port 443;
        set $upstream_proto https;
        proxy_pass $upstream_proto://$upstream_app:$upstream_port;

        # Hide proxy response headers from Nextcloud that conflict with ssl.conf
        # Uncomment the Optional additional headers in SWAG's ssl.conf to pass Nextcloud's security scan
        proxy_hide_header Referrer-Policy;
        proxy_hide_header X-Content-Type-Options;
        proxy_hide_header X-Frame-Options;
        proxy_hide_header X-XSS-Protection;

        # Disable proxy buffering
        proxy_buffering off;
    }
}

 

Documentserver proxy-conf:

server {
    listen 443 ssl http2;
    listen [::]:443 ssl http2;

    server_name documentserver.*;

    include /config/nginx/ssl.conf;

    client_max_body_size 0;

    # enable for ldap auth (requires ldap-location.conf in the location block)
    #include /config/nginx/ldap-server.conf;

    # enable for Authelia (requires authelia-location.conf in the location block)
    #include /config/nginx/authelia-server.conf;

    # enable for Authentik (requires authentik-location.conf in the location block)
    #include /config/nginx/authentik-server.conf;

    location / {
        # enable the next two lines for http auth
        #auth_basic "Restricted";
        #auth_basic_user_file /config/nginx/.htpasswd;

        # enable for ldap auth (requires ldap-server.conf in the server block)
        #include /config/nginx/ldap-location.conf;

        # enable for Authelia (requires authelia-server.conf in the server block)
        #include /config/nginx/authelia-location.conf;

        # enable for Authentik (requires authentik-server.conf in the server block)
        #include /config/nginx/authentik-location.conf;

        include /config/nginx/proxy.conf;
        include /config/nginx/resolver.conf;
        set $upstream_app documentserver;
        set $upstream_port 80;
        set $upstream_proto http;
        proxy_pass $upstream_proto://$upstream_app:$upstream_port;

    }
}

 

ssl conf:

## Version 2023/08/13 - Changelog: https://github.com/linuxserver/docker-baseimage-alpine-nginx/commits/master/root/defaults/nginx/ssl.conf.sample

### Mozilla Recommendations
# generated 2023-06-25, Mozilla Guideline v5.7, nginx 1.24.0, OpenSSL 3.1.1, intermediate configuration
# https://ssl-config.mozilla.org/#server=nginx&version=1.24.0&config=intermediate&openssl=3.1.1&guideline=5.7

ssl_certificate /config/keys/cert.crt;
ssl_certificate_key /config/keys/cert.key;
ssl_session_timeout 1d;
ssl_session_cache shared:MozSSL:10m; # about 40000 sessions
ssl_session_tickets off;

# curl *redacted seems like something I shouldn't give out?* > /path/to/dhparam
ssl_dhparam /config/nginx/dhparams.pem;

# intermediate configuration
ssl_protocols TLSv1.2 TLSv1.3;
ssl_ciphers *redacted seems like something I shouldn't give out?*
ssl_prefer_server_ciphers off;

# HSTS (ngx_http_headers_module is required) (63072000 seconds)
add_header Strict-Transport-Security "max-age=63072000" always;

# OCSP stapling
ssl_stapling on;
ssl_stapling_verify on;

# verify chain of trust of OCSP response using Root CA and Intermediate certs
ssl_trusted_certificate /config/keys/cert.crt;

# Optional additional headers
#add_header Cache-Control "no-transform" always;
#add_header Content-Security-Policy "upgrade-insecure-requests; frame-ancestors 'self'" always;
#add_header Permissions-Policy "interest-cohort=()" always;
add_header Referrer-Policy "same-origin" always;
add_header X-Content-Type-Options "nosniff" always;
add_header X-Frame-Options "SAMEORIGIN" always;
#add_header X-UA-Compatible "IE=Edge" always;
add_header X-XSS-Protection "1; mode=block" always;

 

Edited by Viper-694
Link to comment
On 4/7/2023 at 2:25 AM, Avsynthe said:

Hey all,

 

So I'm trying to set up Recognize. I'd like to run it without TensorFlow WASM mode, as I understand video tagging is not available with that setting.

 

Under Tensorflow WASM mode, it reads:

 

Could not check whether your machine supports native TensorFlow operation.

 

And with it disabled, under Node.js it reads:

 

Could not load libtensorflow in Node.js. You can try to manually install libtensorflow or run in WASM mode.

 

Under that is the note:

 

If the shipped Node.js binary doesn't work on your system for some reason you can set the path to a custom node.js binary. Currently supported is Node v14.17 and newer v14 releases.

 

I'd also like to run Tensorflow GPU mode, so I'd like to know if GPU passthrough works in the traditional sense with the --runtime=nvidia extra parameters and the NVIDIA_VISIBLE_DEVICES variable set.

 

Thanks in advance!

 

I would like to follow up with this question as I couldn't find any information here on how to pass through an NVIDIA GPU for this container. It would help with Memories and Recognize. Any response is appreciated.

Link to comment

Has anyone managed to get this working in a subfolder instead of the default? I've gone through the Nextcloud documentation but it doesn't seem to work. If I make the changes in config.php to add /nextcloud (so http://192.168.x.x:8443/nextcloud instead of just http://192.168.x.x:8443/), it doesn't work, which means I can't get it working via a reverse proxy either.
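
Roughly the kind of config.php change I mean, following the Nextcloud docs (a sketch; the IP/port and subfolder name are just my example, and the reverse proxy still needs a matching location block):

  'overwrite.cli.url' => 'http://192.168.x.x:8443/nextcloud',
  'overwritewebroot' => '/nextcloud',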

Link to comment

Recently my linuxserver.io Docker container for Nextcloud updated to v27.1.3.2 and broke the web interface, so when you went to the login page you got an 'Internal Server Error' message.

 

After checking nextcloud.log, I found the issue to be on line 295 of

 

/config/www/nextcloud/apps/user_status/lib/Service/StatusService.php

 

The line was set to if ($userStatus->getId() !== n}ll) {

 

The n}ll should read null; once I changed it and saved the file, web access started working again.

 

Thought I'd post this, just in case someone else comes across the same issue. 

Link to comment
On 10/30/2023 at 1:11 AM, mesencephalon said:

 

I would like to follow up with this question as I couldn't find any information here on how to pass through an NVIDIA GPU for this container. It would help with Memories and Recognize. Any response is appreciated.

 

From what I was told on the Discord, this will never be possible on this image, as there are neither Nvidia nor AMD GPU drivers available for musl libc on Alpine.

 

I'm just about to take a dive and see if I can successfully migrate everything to the official container without having to set it all up again, as I've done a lot of work. Any help on this would be awesome.

Link to comment
  • 2 weeks later...

One issue I had after downloading this container was a lot of errors in the Nextcloud log file about richdocumentscode.

I had to run occ app:install richdocumentscode in the Nextcloud terminal to fix it.

 

######################

Now my main issue, which has been really annoying me: I am unable to upload files larger than 1GB via the Chrome web browser. If I upload anything larger than 1GB, I just get "unknown error" in Nextcloud.

 

The only thing I have found is this
https://docs.nextcloud.com/server/latest/admin_manual/configuration_files/big_file_upload_configuration.html

I've tried all of that and still no luck. I created a share, nextcloud_tmp/, that all three containers have access to; I can see the files in all three consoles.

/mnt/user/appdata/nextcloud/php/php-local.ini


memory_limit=2048M
upload_max_filesize=70G
post_max_size=70G
max_input_time=6400
max_execution_time=6400
output_buffering=0
upload_tmp_dir = /nextcloud_tmp
file_uploads = On



/mnt/user/appdata/nextcloud/www/nextcloud/config/config.php

  'tempdirectory' => '/nextcloud_tmp',

 

 /mnt/user/appdata/swag/nginx/proxy-confs/nextcloud.subdomain.conf

add_header X-Accel-Buffering no;
proxy_buffering off;
client_max_body_size 0;
client_body_in_file_only on;
client_body_in_single_buffer on;
client_body_temp_path /nextcloud_tmp;
fastcgi_request_buffering off;
fastcgi_max_temp_file_size 0;


################# EDIT #######################

I think I finally figured it out. But maybe someone can explain it to me, because I do not understand it. I have been trying to modify /mnt/user/appdata/swag/nginx/proxy-confs/nextcloud.subdomain.conf to make this work, because SWAG is the proxy server.

 

however after I modified

/mnt/user/appdata/nextcloud/nginx/site-confs/default.conf

    # display real ip in nginx logs when connected through reverse proxy via docker network
    set_real_ip_from 172.18.0.0/12;
    real_ip_header X-Forwarded-For;

    # https://docs.nextcloud.com/server/latest/admin_manual/installation/nginx.html#nextcloud-in-the-webroot-of-nginx

    # set max upload size and increase upload timeout:
    client_max_body_size 0;
    client_body_temp_path /nextcloud_tmp;
    client_body_timeout 6400s;
    fastcgi_buffers 64 4K;
    proxy_buffering off;
    fastcgi_request_buffering off;
    fastcgi_max_temp_file_size 0;
    add_header X-Accel-Buffering no;


It started working as expected. How does modifying the nginx file within the Nextcloud container solve this? I assumed this was just an example file. Does this image run its own nginx server?

Edited by xokia
Link to comment

Error after upgrade to: 27.1.3.2

 

OK, new install, about a week old. Everything was working. I hit the upgrade button under Docker, then got an 'Internal Server Error' message. I tried a few of the fixes found via Google, but was unable to fix it.

 

I deleted the containers for MariaDB and Nextcloud, re-installed, and still got the same error.

 

I deleted the containers again, and this time also deleted both the MariaDB and Nextcloud shares/folders.

 

So I reinstalled everything. It worked. I created a couple of users. Five minutes later, a new error:

Doctrine exception: failed to connect to the database. (See attached picture.)


I stopped both containers and restarted them. Then it worked again. A minute later, the same error.

 

BTW: I don't need to save any data. I just want a fresh install that actually works.

 

 

 

Any thoughts ?

 

 

 

465B2F2E-B699-425A-B6DE-7434B01BD643.png

Edited by beachbum
Link to comment

Ya know how unRAID is NOT a backup!? Well, I've had corruption before, and loads of lost+found files. Total bummer. Anyhow, I'd like to back my stuff up to an external USB drive (Unassigned Devices). I use luckyBackup, which is apparently an rsync GUI. I got it set up, and it's mostly working, but I get loads of errors when running the backup on the Nextcloud share. Anyone have luck with some sort of additional backup docker, offsite, USB, etc.?

thanks

 

https://unraid.net/blog/unraid-server-backups-with-luckybackup

 

Screenshot 2023-11-13 185440.png

Link to comment
50 minutes ago, rutherford said:

But I get loads of errors when running the backup on the Nextcloud share. Anyone have luck with some sort of additional backup docker, offsite, USB, etc.?

While this question pretty surely doesn't belong here (it's not related to the lsio NC Docker) ... maybe a hint ...

 

Turn Nextcloud off while running your backup; that should solve it ... and it's also what's recommended ;)

Link to comment
45 minutes ago, alturismo said:

While this question pretty surely doesn't belong here (it's not related to the lsio NC Docker) ... maybe a hint ...

 

Turn Nextcloud off while running your backup; that should solve it ... and it's also what's recommended ;)

Yeah, you're probably right. How about any OTHER additional backup solution? A Nextcloud mirror, maybe?

Link to comment
6 minutes ago, rutherford said:

Yeah, you're probably right. How about any OTHER additional backup solution? A Nextcloud mirror, maybe?

just try & error whatever you like, but in the end ... almost any software backups ... where "live data" is running, you should consider to turn off the app while the backup is running.

 

in terms of NC its a 5 min break (worst case) after the initial backup as rsync is incremental ...

 

and even NC says ... (there are search funtions out there ... ;)) turn off ...

 

https://docs.nextcloud.com/server/latest/admin_manual/maintenance/backup.html
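
E.g. something like this around the rsync job, as the linked docs describe (a rough sketch; the container name and paths are just examples):

docker exec nextcloud occ maintenance:mode --on
rsync -a /mnt/user/Nextcloud/ /mnt/disks/usb_backup/nextcloud/
docker exec nextcloud occ maintenance:mode --off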

 

Link to comment
5 hours ago, rutherford said:

But I get loads of errors when running the backup on the Nextcloud share

The screenshot doesn't show the actual errors, if any; those would be above in the output. This looks like it's just attributes that didn't transfer, which may be normal if your backup drive's filesystem isn't the same as the one the original files are on.

Edited by Kilrah
Link to comment
