[Support] Linuxserver.io - Nextcloud


Recommended Posts

After updating to the latest 19.0.5, my Collabora server broke. I keep getting "Could not establish connection to the Collabora Online server". I can go to the actual domain and see the OK page. Does anyone know where I can look to see where it broke? I checked my DNS records and they're updated with the newest IP. Also, I am using Nginx Proxy Manager and it was working before. Thank you guys.
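Not from the thread, but one quick check that often narrows this down: confirm that the Nextcloud container itself can reach Collabora's discovery endpoint, since the "Could not establish connection" error is raised server-side, not by your browser. A sketch (office.example.com is a placeholder for your Collabora domain):

# Run from the Nextcloud container's console.
curl -sSf https://office.example.com/hosting/discovery | head -n 3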

Link to comment

I'm bringing up an old topic regarding Full Text Search in Nextcloud 20. I installed it along with the supporting apps, including Elasticsearch. I installed the Elasticsearch docker on my Unraid server, as well as the custom script as outlined on the docker install page. I also set up a reverse proxy for the Elasticsearch docker so it can be accessed in Nextcloud. Used the default user/pass of elastic/changeme. Still no go. Any ideas?
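A hedged sketch of how the wiring can be checked from the Nextcloud container's console (the URL and credentials below are placeholders, and elastic_host is the app-config key the fulltextsearch_elasticsearch app uses; key names may differ by version):

# Point the fulltextsearch backend at the Elasticsearch instance.
occ config:app:set fulltextsearch_elasticsearch elastic_host --value "http://elastic:changeme@192.168.1.10:9200"
# Validate the platform config, then build the index.
occ fulltextsearch:test
occ fulltextsearch:index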

Link to comment

I have the Nextcloud 3.0.3 desktop client running on Win10. I'm getting the 413 "request entity too large" error. I have the max client body size set to 100M in the various *.conf files. I'm running Unraid 6.9 beta 35. Those changes did nothing. I've uninstalled and reinstalled the client on Windows. Has anyone had this problem and fixed it on version 3?
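For reference, the 413 comes from nginx's client_max_body_size check, and with this stack the directive exists in two places: the Nextcloud container's own nginx and whatever reverse proxy sits in front of it. Both have to allow the size. A sketch, assuming the default Unraid appdata layout:

# In e.g. /mnt/user/appdata/nextcloud/nginx/nginx.conf (and again in the
# reverse proxy's conf), then restart both containers:
client_max_body_size 0;   # 0 disables the size check entirely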

Link to comment
On 11/4/2020 at 10:20 PM, skois said:

For anyone having the same problem anyway, the fix for me was this.
Just replace PHP_VER with whatever version you have in Settings -> System (bottom of the page):
 


PHP_VER="7.3.24" && \
BUILD_PACKAGES="wget build-base php7-dev" && \
apk add --no-cache --virtual .php-build-dependencies $BUILD_PACKAGES && \
apk add --no-cache --repository https://dl-3.alpinelinux.org/alpine/edge/testing/ gnu-libiconv-dev && \
(mv /usr/bin/gnu-iconv /usr/bin/iconv; mv /usr/include/gnu-libiconv/*.h /usr/include; rm -rf /usr/include/gnu-libiconv) && \
mkdir -p /opt && \
cd /opt && \
wget https://secure.php.net/distributions/php-$PHP_VER.tar.gz && \
tar xzf php-$PHP_VER.tar.gz && \
cd php-$PHP_VER/ext/iconv && \
phpize && \
./configure --with-iconv=/usr && \
make && \
make install && \
mkdir -p /etc/php7/conf.d

# Next command not needed in the LSIO docker:
# echo "extension=iconv.so" >> /etc/php7/conf.d/iconv.ini

apk del .php-build-dependencies && \
rm -rf /opt/*
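In case it helps anyone wondering where these go: judging by the apk commands, this is meant to be pasted into the Nextcloud container's own console (it's an Alpine image), e.g.:

# Assumes the container is named "nextcloud"; adjust to your container name.
docker exec -it nextcloud /bin/bash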

 

 

Can you elaborate? I'm quite .... bad at this. Where should I run these commands?

Link to comment

Hello everybody!

 

I've been dealing with a peculiar problem since I started using Linuxserver.io's Nextcloud Docker image.

My temp directory is set to '/tmp/nextcloud/'. Every time the server is restarted the directory is of course recreated, but the problem is that I'm not able to log in to Nextcloud until the directory's permissions are manually changed (I change them to 777 in this case). After that it works as expected, and the sess_ files etc. all appear as expected.

 

It isn't a big problem, but it can be a little puzzling if I don't remember to do it when the server needs to be restarted for maintenance.


Does anybody have any ideas regarding this?
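One possible workaround, sketched under the assumption that the LSIO image's custom-init hook is available (these images run any scripts placed in /custom-cont-init.d at container start, and the in-container user is abc):

#!/bin/bash
# e.g. /custom-cont-init.d/10-nextcloud-tmp.sh
# Pre-create the session/temp directory with usable ownership so logins
# work right after a restart, without a manual chmod.
mkdir -p /tmp/nextcloud
chown abc:abc /tmp/nextcloud
chmod 770 /tmp/nextcloud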

Link to comment

I'm trying to tune Preview Generator, and I found this: https://ownyourbits.com/2019/06/29/understanding-and-improving-nextcloud-previews/
which suggests running

occ config:app:set previewgenerator squareSizes --value="32 256"
occ config:app:set previewgenerator widthSizes  --value="256 384"
occ config:app:set previewgenerator heightSizes --value="256"

While heightSizes works fine, the other two give me:

root@24b78faef1b1:/# occ config:app:set previewgenerator widthSizes --value="256 384"

  Too many arguments, expected arguments "command" "app" "name".

config:app:set [--output [OUTPUT]] [--value VALUE] [--update-only] [--] <app> <name>

When I try with one value only, it works.
The same syntax is on the GitHub page of Preview Generator, and it doesn't work there either.

Any ideas?
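A hedged guess at what's happening: "Too many arguments" is what occ prints when the value arrives as two separate words, i.e. the quoting is lost somewhere between the shell and occ; a wrapper script that expands its arguments unquoted ($@ instead of "$@") would do exactly that. If so, calling the real occ script directly should get the quoted value through (path and user per the LSIO image, as an assumption):

# Bypass any occ wrapper and invoke the script directly as the web user.
sudo -u abc php /config/www/nextcloud/occ config:app:set previewgenerator squareSizes --value="32 256"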

Link to comment
On 11/22/2020 at 8:00 PM, Falcowe said:

Hello all,

I have been trying to have other people upload files to a folder I have created on Nextcloud, and I created a share link to allow uploading to the folder. However, it was reported to me that they were having trouble, and I was able to "recreate" the issue on my side.

The issue: when uploading files to the shared folder, all files, no matter how I upload them (via drag and drop or the upload dialogue), immediately show the error "An unknown error has occurred." On occasion I'll get an error along the lines of "Server connection lost", but that one has been far more inconsistent, and I think it is related to whatever problem is preventing any files from being uploaded.

Now with all of that being said, I can upload files if I am logged in as a user, and that is the workaround I have created for the other person in this case, but I would like to be able to simply share a link in the future and have other people be able to upload files to me. The other party in this case has been able to download files from the link without issue; however, uploading seems to be a problem. Which is strange. Why would downloading work but not uploading? Any bright ideas here?

 

Thanks for your help! 

So I wanted to update my issue if anyone is following along. I was able to get a bit of help from the Nextcloud support community and have discovered that it is actually a file size limitation somewhere. I have tried increasing my PHP and nginx max file sizes with no luck yet. But I suspect that I didn't change the right files, as the file locations are different in the docker version than in the Linux distros people are posting about on the Nextcloud forums.

 

So I can upload files, I just can't upload files that are "large" (i.e. in excess of 50 MB at least; I suspect smaller files fail too, but I don't have an efficient way to test). The files that have been successfully uploaded are in the tens of KBs range. So... does anyone have a suggestion where I might look for what/why this is being limited? And why is it only happening on uploads/file drops from an 'unknown'/'guest' user? If someone logs in, there isn't an issue uploading big files, which makes me think it's Nextcloud causing the issue and not something outside of Nextcloud (i.e. PHP setup or nginx limitations).

Link to comment
4 hours ago, Falcowe said:

So I wanted to update my issue if anyone is following along. I was able to get a bit of help from the Nextcloud support community and have discovered that it is actually a file size limitation somewhere. I have tried increasing my PHP and nginx max file sizes with no luck yet. But I suspect that I didn't change the right files, as the file locations are different in the docker version than in the Linux distros people are posting about on the Nextcloud forums.

 

So I can upload files, I just can't upload files that are "large" (i.e. in excess of 50 MB at least; I suspect smaller files fail too, but I don't have an efficient way to test). The files that have been successfully uploaded are in the tens of KBs range. So... does anyone have a suggestion where I might look for what/why this is being limited? And why is it only happening on uploads/file drops from an 'unknown'/'guest' user? If someone logs in, there isn't an issue uploading big files, which makes me think it's Nextcloud causing the issue and not something outside of Nextcloud (i.e. PHP setup or nginx limitations).

Please link the post on the NC community, I'll try to find said files!

Link to comment
11 hours ago, skois said:

Please link the post on the NC community, I'll try to find said files!

Ok, if the file size is the only problem, then let's start with simple "fixes":
go to path/to/appdata/nextcloud/php
and edit your php-local.ini.

The part you actually need is the first section, specifically post_max_size.
The rest is just optimizations I did; you are welcome to include them if you want.

; Edit this file to override php.ini directives and restart the container

date.timezone = Europe/Athens
upload_max_filesize=16G
memory_limit=8G
max_execution_time=7200
max_input_time=7200
post_max_size=16G
max_file_uploads = 200
default_socket_timeout = 7200


; Enable PHP OPcache https://docs.nextcloud.com/server/latest/admin_manual/installation/server_tuning.html#enable-php-opcache
opcache.enable=1
opcache.interned_strings_buffer=8
opcache.max_accelerated_files=10000
opcache.memory_consumption=128
opcache.save_comments=1
opcache.revalidate_freq=1
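To confirm the overrides took effect after restarting the container, something like this from the Nextcloud container's console should work (note that post_max_size must be at least as large as upload_max_filesize, or uploads will still be capped):

# CLI and FPM should both pick up php-local.ini in this image; the admin
# page's System section is the surer check.
php -i | grep -E 'post_max_size|upload_max_filesize|memory_limit'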

 

Link to comment

I just noticed on my server that letsencrypt is being deprecated and SWAG is the way to go. I know this may be a noob question, but is there an easy-to-follow set of instructions for the migration, hopefully without breaking anything? I hate to mess with something that is working well.

 

Thanks,

Lev

Link to comment
5 hours ago, levster said:

I just noticed on my server that letsencrypt is being deprecated and SWAG is the way to go. I know this may be a noob question, but is there an easy-to-follow set of instructions for the migration, hopefully without breaking anything? I hate to mess with something that is working well.

 

Thanks,

Lev
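For what it's worth, SWAG was published as a direct continuation of the letsencrypt image, so the commonly described migration is close to a rename. A rough sketch, not official instructions (container name and appdata path are the usual Unraid defaults):

# Stop the old container, then install SWAG from Community Applications,
# reusing the same appdata folder and the same variables (URL, SUBDOMAINS,
# VALIDATION, etc.) so certificates and site confs carry over.
docker stop letsencrypt
# After creating the SWAG container:
# - point /config at /mnt/user/appdata/letsencrypt (or rename the folder
#   to .../swag and point there instead)
# - update anything that still references the old container name
#   "letsencrypt", e.g. proxy confs or custom docker networks.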

 

Link to comment
On 11/20/2020 at 9:40 PM, Mystic said:

@skois - I saw your post.

 

I attempted to upload large files using the client and received the 504 error. I am going to try to move things over in small chunks, since Nextcloud doesn't like large chunks :)

 

I found a thread about using the terminal to move data. First I have to connect my QNAP to my Unraid server, then identify the location of the data and move it. I am sure they make it sound easier than it is, but it may be my only choice. I have this thread sending me emails now, so I can keep up with your progress.

 

And of course, once I figure out using the terminal to transfer data, I will let you know how it goes.

Today I did some further investigation.
When I uploaded a file and got the 504 error, I opened dev tools in Chrome and noticed the URL path included "dav".
So I mapped a network drive in Windows with WebDAV. I tried to upload a 1 GB file; it copied almost the whole file and then got stuck at 99% (it was then uploading it to the server). After waiting for about a minute, I got an error saying I should check my connection and try again. I thought it was a random error, because one minute to upload 1 GB at 20 Mbps is too fast. Tried again; about a minute in, error again. Tried 4 GB, it failed at the same time. Doing the math, 20 Mbps for one minute is a little more than 100 MB.
Then it hit me.
Cloudflare's free-tier proxy has a 100 MB limit on uploads (which is also a hint that WebDAV does not do any chunking).
So I set my Nextcloud CNAME on Cloudflare to "DNS only" instead of proxied.
BOOM, files started to upload correctly. Problem solved, right? Nope.
There is a new limit now: I can upload via the mapped WebDAV drive up to 2 GB (2 GB + 1 byte fails instantly). (This needs further investigation.)
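A hedged guess on that 2 GB ceiling (not something the thread settled): the Windows WebClient service caps WebDAV transfers via a registry value, which can be inspected, and raised up to roughly 4 GB, from an elevated command prompt:

:: FileSizeLimitInBytes is a DWORD; restart the WebClient service after
:: changing it.
reg query "HKLM\SYSTEM\CurrentControlSet\Services\WebClient\Parameters" /v FileSizeLimitInBytes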
So now it was time to test file uploads from the web GUI. 2 GB (exactly) uploaded successfully!
2.1 GB also succeeded! (Didn't expect that.) Now trying a 4 GB file.

*EDIT1*

4 GB returned the 504 error. I'll start looking at the reverse proxy's nginx timeout configs.
*EDIT2*

Changed some timeouts to 15 min in NPM, and now I get "Error when assembling chunks, status code 524" instead of 504.
I'll try again with some huge timeouts, like one day, and see what happens.

*EDIT3*

I use Nginx Proxy Manager. After adding my proxy host, I went to /mnt/user/appdata/NginxProxyManager/nginx/proxy_host/numberoftheNChost.conf,
copied the whole "location /" block, and then edited the proxy host again through the web UI (NPM web UI), Advanced tab.
I pasted the "location /" block and added the following lines:

proxy_connect_timeout 1d;
proxy_send_timeout 1d;
proxy_read_timeout 1d;
send_timeout 1d;

They can go anywhere in the block, it doesn't matter.
After that, a 5 GB file upload completed successfully.
When I had it at 15 min it didn't work, probably because the whole upload took almost an hour.
I don't usually upload files that large through the web GUI, but it's nice to know that if I ever need to, it will work OK.

Also, there is an open issue on the Nginx Proxy Manager GitHub asking for the ability to edit the timeouts from within the GUI, so we might see it there soon.

I think for now this is where my quest ends :)

*EDIT4* 

The above config also helps with the updater! It no longer times out! (Just updated to NC 21 Beta 1 on my test server. NC 21 feels a bit faster!)


 

BUT even when it succeeds, it does not make ANY sense.

Cloudflare shouldn't block uploads through the web GUI, because of the chunking; if I'm not mistaken, the default chunk size is 10 MB.
Actually, the upload was never blocked, it just failed at assembling the chunks (except it didn't really fail: if you wait a minute and refresh the page, the file is uploaded correctly and playable).
This might be a timeout setting.
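An aside, not from the thread: if Cloudflare's 100 MB cap were still in play, the web-upload chunk size is configurable through the files app; 10485760 bytes is the 10 MB default mentioned above.

# Run from the Nextcloud container's console; the value is in bytes.
occ config:app:set files max_chunk_size --value 10485760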

 

I'll edit the last part later when the file upload completes, if I have more findings.

Edited by skois
Link to comment
4 minutes ago, Wong said:

Updating to 20.0.3 wasn't really smooth, though. There were errors throughout the updater process, but I managed to install it. However, how do I solve the warning given by Nextcloud? Hopefully I am not the only one seeing this.

[Screenshot: Nextcloud setup warnings, the first being a suggestion to run occ db:add-missing-primary-keys]

The command is in the first line of the warning...
occ db:add-missing-primary-keys

 

Also, this has nothing to do with the errors you got in the updater.
In this version, NC made some changes to the database and needs to add some primary keys :)

Edited by skois
Link to comment
3 minutes ago, skois said:

The command is in the first line of the warning...
occ db:add-missing-primary-keys

 

Also, this has nothing to do with the errors you got in the updater.
In this version, NC made some changes to the database and needs to add some primary keys :)

Thanks for the quick reply, you guys are awesome. I am still quite bad at terminal commands. I just copied and pasted the command, and it said "command not found". Could you list all the commands I should execute? Thanks.

Link to comment
16 minutes ago, Wong said:

Thanks for the quick reply, you guys are awesome. I am still quite bad at terminal commands. I just copied and pasted the command, and it said "command not found". Could you list all the commands I should execute? Thanks.

You probably tried to execute the command in Unraid's terminal.
This command is to be run in the Nextcloud container. So,
click on the NC docker and click Console, then run the mentioned command there.
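For reference, the same thing can be done in one step from the Unraid terminal by exec'ing into the container (assuming it is named "nextcloud", as the template defaults to):

docker exec -it nextcloud occ db:add-missing-primary-keys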

Link to comment
6 minutes ago, skois said:

You probably tried to execute the command in Unraid's terminal.
This command is to be run in the Nextcloud container. So,
click on the NC docker and click Console, then run the mentioned command there.

Your feedback solved my problem. Yeah, I was running it in the Unraid terminal instead of the docker console. Thanks for the help!!!

Link to comment
