[Support] Linuxserver.io - Nextcloud



3 hours ago, NVrEnough said:

First, thank you for your fix above - it worked perfectly... then came the next container update... same error, but your fix now returns an error where there was none last time. Any idea how to fix this? Here is the entire interaction with your fix:

 


root@3e69a0cc162f:/# PHP_VER="7.3.25" && \
> BUILD_PACKAGES="wget build-base php7-dev" && \
> 
> apk add --no-cache --virtual .php-build-dependencies $BUILD_PACKAGES && \
> apk add --no-cache --repository https://dl-3.alpinelinux.org/alpine/edge/testing/ gnu-libiconv-dev && \
> (mv /usr/bin/gnu-iconv /usr/bin/iconv; mv /usr/include/gnu-libiconv/*.h /usr/include; rm -rf /usr/include/gnu-libiconv) && \
> mkdir -p /opt && \
> cd /opt && \
> wget https://secure.php.net/distributions/php-$PHP_VER.tar.gz && \
> tar xzf php-$PHP_VER.tar.gz && \
> cd php-$PHP_VER/ext/iconv && \
> phpize && \
> ./configure --with-iconv=/usr && \
> make && \
> make install && \
> mkdir -p /etc/php7/conf.d && \
> #next command not needed in LSIO Docker
> #echo "extension=iconv.so" >> /etc/php7/conf.d/iconv.ini && \
> 
> apk del .php-build-dependencies && \
> rm -rf /opt/*
fetch http://dl-cdn.alpinelinux.org/alpine/v3.12/main/x86_64/APKINDEX.tar.gz
fetch http://dl-cdn.alpinelinux.org/alpine/v3.12/community/x86_64/APKINDEX.tar.gz
(1/1) Upgrading .php-build-dependencies (20201205.073029 -> 20201205.074158)
OK: 558 MiB in 240 packages
fetch https://dl-3.alpinelinux.org/alpine/edge/testing/x86_64/APKINDEX.tar.gz
fetch http://dl-cdn.alpinelinux.org/alpine/v3.12/main/x86_64/APKINDEX.tar.gz
fetch http://dl-cdn.alpinelinux.org/alpine/v3.12/community/x86_64/APKINDEX.tar.gz
OK: 558 MiB in 240 packages
mv: cannot stat '/usr/bin/gnu-iconv': No such file or directory
mv: cannot stat '/usr/include/gnu-libiconv/*.h': No such file or directory
--2020-12-05 01:41:59--  https://secure.php.net/distributions/php-7.3.25.tar.gz
Resolving secure.php.net (secure.php.net)... 185.85.0.29
Connecting to secure.php.net (secure.php.net)|185.85.0.29|:443... connected.
HTTP request sent, awaiting response... 301 Moved Permanently
Location: https://www.php.net/distributions/php-7.3.25.tar.gz [following]
--2020-12-05 01:42:00--  https://www.php.net/distributions/php-7.3.25.tar.gz
Resolving www.php.net (www.php.net)... 185.85.0.29
Connecting to www.php.net (www.php.net)|185.85.0.29|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 19675266 (19M) [application/octet-stream]
Saving to: 'php-7.3.25.tar.gz.1'

php-7.3.25.tar.gz.1               100%[=============================================================>]  18.76M  9.79MB/s    in 1.9s    

2020-12-05 01:42:03 (9.79 MB/s) - 'php-7.3.25.tar.gz.1' saved [19675266/19675266]

Configuring for:
PHP Api Version:         20180731
Zend Module Api No:      20180731
Zend Extension Api No:   320180731
checking for grep that handles long lines and -e... /bin/grep
checking for egrep... /bin/grep -E
checking for a sed that does not truncate output... /bin/sed
checking for cc... cc
checking whether the C compiler works... yes
checking for C compiler default output file name... a.out
checking for suffix of executables... 
checking whether we are cross compiling... no
checking for suffix of object files... o
checking whether we are using the GNU C compiler... yes
checking whether cc accepts -g... yes
checking for cc option to accept ISO C89... none needed
checking how to run the C preprocessor... cc -E
checking for icc... no
checking for suncc... no
checking whether cc understands -c and -o together... yes
checking for system library directory... lib
checking if compiler supports -R... no
checking if compiler supports -Wl,-rpath,... yes
checking build system type... x86_64-pc-linux-musl
checking host system type... x86_64-pc-linux-musl
checking target system type... x86_64-pc-linux-musl
checking for PHP prefix... /usr
checking for PHP includes... -I/usr/include/php7 -I/usr/include/php7/main -I/usr/include/php7/TSRM -I/usr/include/php7/Zend -I/usr/include/php7/ext -I/usr/include/php7/ext/date/lib
checking for PHP extension directory... /usr/lib/php7/modules
checking for PHP installed headers prefix... /usr/include/php7
checking if debug is enabled... no
checking if zts is enabled... no
checking for re2c... no
configure: WARNING: You will need re2c 0.13.4 or later if you want to regenerate PHP parsers.
checking for gawk... no
checking for nawk... no
checking for awk... awk
checking if awk is broken... no
checking for iconv support... yes, shared
checking for libiconv in -liconv... yes
checking if iconv is glibc's... no
checking if using GNU libiconv... no
checking if iconv is Konstantin Chuguev's... no
checking if using IBM iconv... no
checking if iconv supports errno... yes
checking if iconv supports //IGNORE... no
checking if your cpp allows macro usage in include lines... yes
checking for ld used by cc... /usr/x86_64-alpine-linux-musl/bin/ld
checking if the linker (/usr/x86_64-alpine-linux-musl/bin/ld) is GNU ld... yes
checking for /usr/x86_64-alpine-linux-musl/bin/ld option to reload object files... -r
checking for BSD-compatible nm... /usr/bin/nm -B
checking whether ln -s works... yes
checking how to recognize dependent libraries... pass_all
checking for ANSI C header files... yes
checking for sys/types.h... yes
checking for sys/stat.h... yes
checking for stdlib.h... yes
checking for string.h... yes
checking for memory.h... yes
checking for strings.h... yes
checking for inttypes.h... yes
checking for stdint.h... yes
checking for unistd.h... yes
checking dlfcn.h usability... yes
checking dlfcn.h presence... yes
checking for dlfcn.h... yes
checking the maximum length of command line arguments... 98304
checking command to parse /usr/bin/nm -B output from cc object... ok
checking for objdir... .libs
checking for ar... ar
checking for ranlib... ranlib
checking for strip... strip
checking if cc supports -fno-rtti -fno-exceptions... no
checking for cc option to produce PIC... -fPIC
checking if cc PIC flag -fPIC works... yes
checking if cc static flag -static works... yes
checking if cc supports -c -o file.o... yes
checking whether the cc linker (/usr/x86_64-alpine-linux-musl/bin/ld -m elf_x86_64) supports shared libraries... yes
checking whether -lc should be explicitly linked in... no
checking dynamic linker characteristics... GNU/Linux ld.so
checking how to hardcode library paths into programs... immediate
checking whether stripping libraries is possible... yes
checking if libtool supports shared libraries... yes
checking whether to build shared libraries... yes
checking whether to build static libraries... no

creating libtool
appending configuration tag "CXX" to libtool
configure: creating ./config.status
config.status: creating config.h
config.status: config.h is unchanged
/bin/sh /opt/php-7.3.25/ext/iconv/libtool --mode=compile cc -I"/usr/include" -DZEND_ENABLE_STATIC_TSRMLS_CACHE=1 -I. -I/opt/php-7.3.25/ext/iconv -DPHP_ATOM_INC -I/opt/php-7.3.25/ext/iconv/include -I/opt/php-7.3.25/ext/iconv/main -I/opt/php-7.3.25/ext/iconv -I/usr/include/php7 -I/usr/include/php7/main -I/usr/include/php7/TSRM -I/usr/include/php7/Zend -I/usr/include/php7/ext -I/usr/include/php7/ext/date/lib  -DHAVE_CONFIG_H  -I/usr/include -g -O2   -c /opt/php-7.3.25/ext/iconv/iconv.c -o iconv.lo 
mkdir .libs
 cc -I/usr/include -DZEND_ENABLE_STATIC_TSRMLS_CACHE=1 -I. -I/opt/php-7.3.25/ext/iconv -DPHP_ATOM_INC -I/opt/php-7.3.25/ext/iconv/include -I/opt/php-7.3.25/ext/iconv/main -I/opt/php-7.3.25/ext/iconv -I/usr/include/php7 -I/usr/include/php7/main -I/usr/include/php7/TSRM -I/usr/include/php7/Zend -I/usr/include/php7/ext -I/usr/include/php7/ext/date/lib -DHAVE_CONFIG_H -I/usr/include -g -O2 -c /opt/php-7.3.25/ext/iconv/iconv.c  -fPIC -DPIC -o .libs/iconv.o
/opt/php-7.3.25/ext/iconv/iconv.c: In function 'zm_startup_miconv':
/opt/php-7.3.25/ext/iconv/iconv.c:287:4: error: '_libiconv_version' undeclared (first use in this function)
  287 |    _libiconv_version >> 8, _libiconv_version & 0xff);
      |    ^~~~~~~~~~~~~~~~~
/opt/php-7.3.25/ext/iconv/iconv.c:287:4: note: each undeclared identifier is reported only once for each function it appears in
/opt/php-7.3.25/ext/iconv/iconv.c: In function '_php_iconv_appendl':
/opt/php-7.3.25/ext/iconv/iconv.c:184:15: warning: implicit declaration of function 'libiconv'; did you mean 'iconv'? [-Wimplicit-function-declaration]
  184 | #define iconv libiconv
      |               ^~~~~~~~
/opt/php-7.3.25/ext/iconv/iconv.c:468:8: note: in expansion of macro 'iconv'
  468 |    if (iconv(cd, (char **)&in_p, &in_left, (char **) &out_p, &out_left) == (size_t)-1) {
      |        ^~~~~
make: *** [Makefile:194: iconv.lo] Error 1
root@3e69a0cc162f:/opt/php-7.3.25/ext/iconv# 

any feedback/help is greatly appreciated.

Can you try force-updating the docker again and using mail without my fix? (Maybe the new update fixed it and the fix is no longer needed.)
If it's not working, try applying the fix again and see if the error comes up again. I just checked for updates but it says up to date, and I don't have this error.

On 12/2/2020 at 11:16 AM, azzilla said:

I have also followed Spaceinvader One's guide and I'm having sort of the same issue, and I can't understand why. Copying within the docker console works as expected; a 5GB file copies in 8-10 seconds.

 

I am getting slow uploads only (10MB/s); downloads are fine (90-120MB/s). There are some topics concerning this issue over at Nextcloud's forum, but I am not getting anywhere with the solutions suggested over there.

 

https://help.nextcloud.com/t/slow-upload-speed-i-need-advice/60909/14

 

I can't remember this being an issue, looking back on almost 2 years of using Nextcloud. I had some trouble uploading large files and had to make some edits to the .conf file for SWAG to fix that.

Other from that, nothing has changed.

 

So, I did some testing, and it looks like the Nextcloud instance is "locked" at 20Mbps max.
I tried to upload a new 2GB file from the desktop client, and the max speed I could see in Task Manager was 20Mbps. At the same time I tried to upload another (different) 2GB file from the web; the speed bounced between 0-20Mbps on both streams, but the total was always 20Mbps.
Then I tried to upload from a different device (maybe the limit was per IP?). When the upload started, the speed was again split, but the total was never more than 20Mbps.

All these tests were done on the same LAN through the reverse proxy.

I don't have a solution yet, but this might help some people looking for one!
 

So, the main problem here is the reverse proxy.
When I tried uploading the same file from the web using the container IP instead of the domain name, it uploaded at 100-200Mbps, which is again slow considering the 1Gbps Ethernet link, but more reasonable.

Maybe this could be fixed for internal transfers with a NAT rule?

Edited by skois
12 hours ago, skois said:

Can you try force-updating the docker again and using mail without my fix? (Maybe the new update fixed it and the fix is no longer needed.)
If it's not working, try applying the fix again and see if the error comes up again. I just checked for updates but it says up to date, and I don't have this error.

Not sure what happened, but it works now. I left it broken last night and after reading your reply was getting ready to try again, but... no need. I ran your script twice last night with the same results, but I didn't force-update the docker. I appreciate your quick response and your knowledge. Thanks again.

Edited by NVrEnough

I don't get this. I tried plenty of settings and tweaks without trying to nag in this forum, but now I am at a loss for ideas.

The performance of uploading files through the web GUI is soooooo slow. Around 30MB/s, which I deem slow on a gigabit LAN and WAN.

I totally understand that NC isn't perfect out of the box, in the sense that it needs some tweaking to get the best performance, but I've tried enough of the solutions without significant results that I am close to giving up.

 

The share is cache-enabled, and according to "DiskSpeed" I have no issues with read/write. My "Downloads" share has similar settings, and speeds in NZBGet are around 100MB/s.

 

Has anyone managed to make any changes or tweaks that perform better than 30MB/s, or have any suggestions as to what I should focus on?

It's not as if I can't live with 30MB/s via the GUI, but if higher speeds are possible I would really like to know.

 

What I've tried:

Using PostgreSQL instead of MariaDB (marginally improved CPU load under heavy use)

Enabling OPcache (no measurable difference)

Tuning PHP-FPM (no measurable difference)

 

What I haven't tried:

Redis backend

 

NC Documentation links

Link 1

Link 2

 

Linuxbots

Link 1
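As a possible next step for the untried Redis backend, the config.php entries look roughly like this (a sketch, not a verified setup: the 'redis' host assumes a Redis container reachable under that name, and the APCu and phpredis PHP extensions must be present in the image):

```php
// In config/config.php (inside the Nextcloud appdata; exact path may vary):
'memcache.local' => '\OC\Memcache\APCu',      // fast per-process cache
'memcache.locking' => '\OC\Memcache\Redis',   // transactional file locking via Redis
'redis' => [
  'host' => 'redis',   // placeholder: your Redis container's hostname or IP
  'port' => 6379,
],
```

The locking cache is the part most often credited with helping upload/sync performance, since it replaces database-backed file locking.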

 

Link to comment
On 11/19/2020 at 11:29 PM, Mystic said:

@skois - I configured the client and it corrected the uploaded files in seconds.  I am going to attempt to upload as much as I can prior to maxing out my local drive.  I guess I can upload my top-level folders and subfolders one by one...  A long process, but maybe more reliable than the web.

 

I will let you know how things go. Keep me posted with your progress...

 

No worries on time...  a day, a week - I am just happy someone more clever than me is on the case :)

OK, I made some edits that slightly improved the "Error when assembling chunks" issue.
I edited appdata/nextcloud/php/www2.conf, adding the following under the [www] section:

; Tune PHP-FPM https://docs.nextcloud.com/server/latest/admin_manual/installation/server_tuning.html#tune-php-fpm

pm = dynamic
pm.max_children = 150
pm.start_servers = 15
pm.min_spare_servers = 10
pm.max_spare_servers = 30

Also on appdata/nextcloud/php/php-local.ini i have the following 

date.timezone = Europe/Athens
upload_max_filesize=16G
memory_limit=8G
max_execution_time=7200
max_input_time=7200
post_max_size=16G
max_file_uploads = 200
default_socket_timeout = 7200


; Enable PHP OPcache https://docs.nextcloud.com/server/latest/admin_manual/installation/server_tuning.html#enable-php-opcache
opcache.enable=1
opcache.interned_strings_buffer=8
opcache.max_accelerated_files=10000
opcache.memory_consumption=128
opcache.save_comments=1
opcache.revalidate_freq=1


Note that some of the above options are redundant, but I decided to group them together so I know what's going on.

I tried to upload a 1.5GB file and it completed fine (before, I was getting the error). Trying a 4GB file now; will edit soon to let you know how it goes.

 

Aaaaand I got the error again :D
 

Edited by skois

I need some help with networking. I have NC behind a reverse proxy (NginxProxyManager). When I access cloud.mydomain.com from the local network and transfer files, the speed is awful because the traffic has to go out to the public internet and come back.

How can I set up my pfSense to route cloud.mydomain.com to my local NC address? And if I do that, should I also change the config to NOT overwrite the host to cloud.mydomain.com?
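One common approach is split-horizon DNS rather than a NAT rule: LAN clients resolve the public hostname straight to the proxy's LAN address, skipping the WAN hop. pfSense's DNS Resolver is Unbound underneath, where such an override looks roughly like this (a sketch; the hostname and IP are placeholders, and in the pfSense GUI this is a "Host Override" under Services > DNS Resolver):

```conf
server:
    # LAN clients get the proxy's internal address for the cloud hostname
    local-zone: "cloud.mydomain.com." redirect
    local-data: "cloud.mydomain.com. IN A 192.168.1.50"
```

With this in place, the overwrite settings can stay pointed at cloud.mydomain.com, since the name (and certificate) clients see never changes.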

 


Hey guys!

 

Little question: I'm using the Passwords app in NC20 and it tells me that it needs a higher PHP version, above 7.4. How do I check the PHP version and upgrade?

 

Edit:

 

With today's update to the LSIO Docker image, the problem preventing the Collabora Online Code server from installing is suddenly gone. Now it says it cannot connect:

 

"Es konnte keine Verbindung zum Collabora Online-Server hergestellt werden. Dies könnte auf eine fehlende Konfiguration Deines Web-Servers zurückzuführen sein. Für weitere Informationen besuche bitte:Collabora Online mit einem Klick mit Nginx verbinden"

 

Which means in English:

 

It was not possible to establish a connection to the Collabora Online-Server. This could be due to a misconfiguration of your web-server. For further information visit: Connect Collabora Online with just one click to Nginx"

 

And additionally, there is some odd behaviour with my Collabora installation. I cannot install the integrated Code server. When I type

 

sudo -u abc php -d memory_limit=512M occ app:install richdocumentscode

 

nothing happens. The installation in the apps section isn't working either...

 

 

Thanks in advance friends!

Edited by hundsboog
17 hours ago, skois said:

OK, I made some edits that slightly improved the "Error when assembling chunks" issue.
I edited appdata/nextcloud/php/www2.conf, adding the following under the [www] section:


; Tune PHP-FPM https://docs.nextcloud.com/server/latest/admin_manual/installation/server_tuning.html#tune-php-fpm

pm = dynamic
pm.max_children = 150
pm.start_servers = 15
pm.min_spare_servers = 10
pm.max_spare_servers = 30

Also on appdata/nextcloud/php/php-local.ini i have the following 


date.timezone = Europe/Athens
upload_max_filesize=16G
memory_limit=8G
max_execution_time=7200
max_input_time=7200
post_max_size=16G
max_file_uploads = 200
default_socket_timeout = 7200


; Enable PHP OPcache https://docs.nextcloud.com/server/latest/admin_manual/installation/server_tuning.html#enable-php-opcache
opcache.enable=1
opcache.interned_strings_buffer=8
opcache.max_accelerated_files=10000
opcache.memory_consumption=128
opcache.save_comments=1
opcache.revalidate_freq=1


Note that some of the above options are redundant, but I decided to group them together so I know what's going on.

I tried to upload a 1.5GB file and it completed fine (before, I was getting the error). Trying a 4GB file now; will edit soon to let you know how it goes.

 

Aaaaand I got the error again :D
 

Have you tried disabling chunking completely? That made a huge difference for me.
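For reference, server-side chunking for uploads can be turned off with an app setting (a sketch based on the Nextcloud admin manual's big-file-upload notes; run it from a console inside the container):

```shell
# Inside the Nextcloud container; a value of 0 disables chunked uploads entirely
occ config:app:set files max_chunk_size --value 0
```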


After the update to the latest 19.0.5, my Collabora server broke. I keep getting "Could not establish connection to the Collabora Online server". I can go to the actual domain and see the OK page. Does anyone know where I can look to see where it broke? I checked my DNS records and it's updated with the newest IP. Also, I am using Nginx Proxy Manager and it was working before. Thank you guys.


I'm bringing up an old topic regarding Full Text Search in Nextcloud 20. I installed it along with the supporting apps, including ElasticSearch. I installed the Docker for ElasticSearch on my Unraid server, as well as the custom script as outlined in the Docker install page. I also set up a reverse proxy for the ElasticSearch docker so it can be accessed in Nextcloud. Used the default user/pass of elastic/changeme. Still no go. Any ideas?


I have the Nextcloud client 3.0.3 running on Win10. I'm getting the "413 Request Entity Too Large" error. I have the max client body size set to 100M in the various *.conf files. I'm running Unraid 6.9 beta 35. Those changes did nothing. I've uninstalled and reinstalled the client on Windows. Has anyone had this problem and fixed it on version 3?
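For what it's worth, the 413 is generated by nginx on the proxy side, so the directive only helps if it ends up in (or is inherited by) the server/location block that actually fronts Nextcloud. A minimal sketch (the file location is an assumption about a SWAG-style setup; 0 disables the body-size check entirely):

```nginx
# In the proxy's site config for Nextcloud (e.g. SWAG's nextcloud proxy conf):
client_max_body_size 0;
```

nginx needs a reload (or the proxy container a restart) before the change takes effect.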

On 11/4/2020 at 10:20 PM, skois said:

For anyone having the same problem, the fix for me was the following.
Just replace PHP_VER with whatever version you see in Settings -> System (bottom of the page):
 


PHP_VER="7.3.24" && \
BUILD_PACKAGES="wget build-base php7-dev" && \

apk add --no-cache --virtual .php-build-dependencies $BUILD_PACKAGES && \
apk add --no-cache --repository https://dl-3.alpinelinux.org/alpine/edge/testing/ gnu-libiconv-dev && \
(mv /usr/bin/gnu-iconv /usr/bin/iconv; mv /usr/include/gnu-libiconv/*.h /usr/include; rm -rf /usr/include/gnu-libiconv) && \
mkdir -p /opt && \
cd /opt && \
wget https://secure.php.net/distributions/php-$PHP_VER.tar.gz && \
tar xzf php-$PHP_VER.tar.gz && \
cd php-$PHP_VER/ext/iconv && \
phpize && \
./configure --with-iconv=/usr && \
make && \
make install && \
mkdir -p /etc/php7/conf.d && \
#next command not needed in LSIO Docker
#echo "extension=iconv.so" >> /etc/php7/conf.d/iconv.ini && \

apk del .php-build-dependencies && \
rm -rf /opt/*

 

 

Can you elaborate? I'm quite... bad at this. Where should I run these commands?


Hello everybody!

 

I've been dealing with a peculiar problem since I started using Linuxserver.io's Nextcloud Docker image.

My temp directory is set to '/tmp/nextcloud/'. Every time the server is restarted the directory is of course recreated, but I'm not able to log in to Nextcloud until its permissions are manually changed (I change them to 777 in this case); after that it works as expected, and the sess_ files etc. all appear as expected.

It isn't a big problem, but it can be a little puzzling if I don't remember to do it when the server needs to be restarted for maintenance.


Does anybody have any ideas regarding this?
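One way to avoid the manual step is a small startup script that recreates the directory with usable permissions on every container start. A sketch (the /custom-cont-init.d hook path and the 911:911 uid/gid of the image's "abc" user are assumptions about the LSIO base image):

```shell
#!/bin/sh
# Drop this in the container's /custom-cont-init.d/ so it runs before services
# start: recreate the temp dir and open up its permissions.
mkdir -p /tmp/nextcloud
chmod 770 /tmp/nextcloud
chown 911:911 /tmp/nextcloud 2>/dev/null || true  # skip silently when not root
```

770 owned by the app user is tighter than a blanket 777 while still letting PHP write its sess_ files there.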


I'm trying to tune the preview generator, and I found this: https://ownyourbits.com/2019/06/29/understanding-and-improving-nextcloud-previews/
which suggests running

occ config:app:set previewgenerator squareSizes --value="32 256"
occ config:app:set previewgenerator widthSizes  --value="256 384"
occ config:app:set previewgenerator heightSizes --value="256"

While heightSizes works fine, the other two give me:

root@24b78faef1b1:/# occ config:app:set previewgenerator widthSizes  --value="256 384"

  Too many arguments, expected arguments "command" "app" "name".

config:app:set [--output [OUTPUT]] [--value VALUE] [--update-only] [--] <app> <name>

When I try with one value only, it works.
The previewgenerator GitHub page shows the same syntax, and it doesn't work either.

Any ideas?
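One plausible cause (an assumption, not confirmed from the source): `occ` here is a convenience wrapper, and if such a wrapper expands its arguments unquoted, a value containing a space gets word-split a second time, which would produce exactly the "Too many arguments" error. A runnable demonstration with hypothetical wrapper functions:

```shell
#!/bin/sh
# bad_wrapper re-expands with unquoted $*, so "256 384" splits into two args;
# good_wrapper uses "$@", which preserves each argument intact.
bad_wrapper()  { printf '%s\n' $*; }
good_wrapper() { printf '%s\n' "$@"; }

bad_wrapper  --value="256 384" | wc -l   # prints 2: the value was split
good_wrapper --value="256 384" | wc -l   # prints 1: the value survived
```

If the wrapper is indeed the culprit, bypassing it and calling the PHP script directly, e.g. `sudo -u abc php /config/www/nextcloud/occ config:app:set previewgenerator widthSizes --value="256 384"` (path assumed for the LSIO image), should preserve the quoting.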

On 11/22/2020 at 8:00 PM, Falcowe said:

Hello all,

I have been trying to have other people upload files to a folder I created on Nextcloud, and I created a share link to allow uploading to the folder. However, it was reported to me that they were having trouble, and I was able to "recreate" the issue on my side.

The issue: when uploading files to the shared folder, all files, no matter how I upload them (via drag and drop or the upload dialogue), immediately show the error "An unknown error has occurred." On occasion I'll get an error along the lines of "Server connection lost", but that one has been far more inconsistent, and I think it is related to whatever problem is preventing any files from being uploaded.

Now, with all of that being said, I can upload files if I am logged in as a user, and that is the workaround I have created for the other person in this case. But I would like to be able to simply share a link in the future and have other people upload files to me. The other party has been able to download files from the link without issue; however, uploading seems to be a problem. Which is strange - why would downloading work but not uploading? Any bright ideas here?

 

Thanks for your help! 

So I wanted to update my issue if anyone is following along. I was able to get a bit of help from the Nextcloud support community and have discovered that it is actually a file size limitation somewhere. I have tried increasing my PHP and NGINX max file sizes with no luck yet. But I suspect that I didn't edit the right files, as the file locations are different in the Docker version than in the Linux distros people are posting about on the Nextcloud forums.

 

So I can upload files, I just can't upload files that are "large" (i.e. in excess of at least 50MB; I suspect the limit is smaller, but I don't have an efficient way to test). The files that have been successfully uploaded are in the tens of KBs. So... does anyone have a suggestion where I might look for what/why this is being limited? And why it's only happening on uploads/file drops from an 'unknown'/'guest' user? If someone logs in, there isn't an issue uploading big files, which makes me think it's Nextcloud causing the issue and not something outside of it (i.e. PHP setup or NGINX limitations).

4 hours ago, Falcowe said:

So I wanted to update my issue if anyone is following along. I was able to get a bit of help from the Nextcloud support community and have discovered that it is actually a file size limitation somewhere. I have tried increasing my PHP and NGINX max file sizes with no luck yet. But I suspect that I didn't edit the right files, as the file locations are different in the Docker version than in the Linux distros people are posting about on the Nextcloud forums.

 

So I can upload files, I just can't upload files that are "large" (i.e. in excess of at least 50MB; I suspect the limit is smaller, but I don't have an efficient way to test). The files that have been successfully uploaded are in the tens of KBs. So... does anyone have a suggestion where I might look for what/why this is being limited? And why it's only happening on uploads/file drops from an 'unknown'/'guest' user? If someone logs in, there isn't an issue uploading big files, which makes me think it's Nextcloud causing the issue and not something outside of it (i.e. PHP setup or NGINX limitations).

Please link the post on the NC community; I'll try to find the said files!

11 hours ago, skois said:

Please link the post on the NC community; I'll try to find the said files!

OK, if file size is the only problem, then let's start with simple "fixes":
go to path/to/appdata/nextcloud/php
and edit your php-local.ini.

The part you actually need is the first section, specifically

post_max_size.
The rest is just optimizations I did; if you want, you are welcome to include them.

; Edit this file to override php.ini directives and restart the container

date.timezone = Europe/Athens
upload_max_filesize=16G
memory_limit=8G
max_execution_time=7200
max_input_time=7200
post_max_size=16G
max_file_uploads = 200
default_socket_timeout = 7200


; Enable PHP OPcache https://docs.nextcloud.com/server/latest/admin_manual/installation/server_tuning.html#enable-php-opcache
opcache.enable=1
opcache.interned_strings_buffer=8
opcache.max_accelerated_files=10000
opcache.memory_consumption=128
opcache.save_comments=1
opcache.revalidate_freq=1

 


I just noticed on my server that letsencrypt is being deprecated and SWAG is the way to go. I know this may be a noob question, but is there an easy-to-follow set of instructions for the migration, hopefully without breaking anything? I hate to mess with something that is working well.

 

Thanks,

Lev

