Everything posted by Mihle

  1. Found a script someone posted here, but I am not quite sure how to use it yet. Do I need to do anything special for Nextcloud when part of it is on the cache?
  2. So I want something that can back up both plain shares and the data Nextcloud has (including MariaDB), whether it is stored on the array or on the cache, whenever I press a button OR when the drive gets connected to the NAS (an offline copy I would update every few months and maybe store offsite). I tried searching the forum and Google already, but didn't really find the questions I was looking for; if anyone has links, feel free to share. How would I go about doing that? rsync? Something else? If a script, is there one someone else has made that I can use or modify? I know nothing about rsync or scripts. A normal share is probably the easy part, but Nextcloud is its own thing with a database, so I am guessing you can't treat it the same? Would Nextcloud have to be stopped while the backup runs, and if so, how? Less important right now: how would I restore the Nextcloud part if I needed to? For reference, I cannot just copy a whole drive; the drive I want to back up to is smaller than my other disks, but I don't feel I need an offline copy of everything.
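     Roughly what I imagine the script would need to do, as an untested sketch. The container names, the backup drive mount point, the database name and the password variable are just guesses for my setup and would need adjusting:

     #!/bin/bash
     DEST=/mnt/disks/backupdrive      # where Unassigned Devices mounts the backup drive
     SHARES=(Photos Documents)        # the normal shares to copy

     # Put Nextcloud into maintenance mode so files and database stay consistent.
     # 'occ' works from the container console (item 18 below), so assuming docker exec reaches it too.
     docker exec nextcloud occ maintenance:mode --on

     # Dump the MariaDB database onto the backup drive
     docker exec mariadb sh -c 'mysqldump --single-transaction -u root -p"$MYSQL_ROOT_PASSWORD" nextcloud' \
       > "$DEST/nextcloud-db-$(date +%F).sql"

     # Copy the Nextcloud data share and the normal shares.
     # Reading through /mnt/user sees files on both the array and the cache.
     rsync -avh --delete /mnt/user/nextcloud/ "$DEST/nextcloud/"
     for s in "${SHARES[@]}"; do
       rsync -avh --delete "/mnt/user/$s/" "$DEST/$s/"
     done

     # Turn Nextcloud back on
     docker exec nextcloud occ maintenance:mode --off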
  3. I don't really have many power outages; for one two-month period it was much more than usual, at about once every two weeks. Usually it's at most about once a year, often less. I have never had more than one in a day.
  4. Nice! The 1000VA one would probably be overkill then; my NAS's average load, unless I am doing something that loads it right then, is less than 80W.
  5. Would the CyberPower Value Pro VP1000ELCD be a good UPS for my NAS, which doesn't draw that much (less than 200W)? The specs say the battery is replaceable by qualified personnel or something, so I am guessing I would manage that.
  6. I don't know how I didn't realize that is how you did it... Thank you!
  7. Tried it; it still compresses them to about 1.5 MB per image in the PDF (they are high resolution).
  8. So I have gotten my hands on a 3TB surveillance drive (free) and it seems to be in OK shape. So I got the idea of backing up the most important stuff (photos, documents and so on; not close to 3TB, more like 500 GB) onto it in encrypted form, storing it in another location, and every X months taking it home to update it and then putting it back into storage. As it is another location, I would prefer the data to be encrypted. Is there any way I can set it up so that I plug the drive in, press a button in Unraid, the backup/backup update happens automatically (for selectable shares), and then I can take it out and put it back? If so, how? I have found scripts that do something similar, but none of them encrypt the data; it is just a straight copy. I want it encrypted in some form.
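     The closest I have gotten to a plan is a rough, untested sketch like this, run from the User Scripts plugin after plugging the drive in. It assumes the drive was formatted once as LUKS (cryptsetup luksFormat) with a key file added, and the device name, key file path and share list are placeholders:

     #!/bin/bash
     DEV=/dev/sdX1              # the partition on the offsite drive
     KEY=/root/offsite.key      # key file used when the LUKS container was created
     MNT=/mnt/offsite
     SHARES=(Photos Documents)

     cryptsetup open --key-file "$KEY" "$DEV" offsite    # unlock the encrypted container
     mkdir -p "$MNT"
     mount /dev/mapper/offsite "$MNT"

     for s in "${SHARES[@]}"; do
       rsync -avh --delete "/mnt/user/$s/" "$MNT/$s/"    # update the offline copy
     done

     umount "$MNT"
     cryptsetup close offsite                             # lock it again before unplugging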
  9. Seems like this compresses my PDFs more than I want when it runs (I am a perfectionist). The documentation for OCRmyPDF itself says there are settings for this, under "Optimization": the option --optimize N (where N is a number depending on what you want). But I haven't figured out how to do that in the Docker container, so does anyone know how?
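     For reference, the plain command from the documentation would look like this; what I have not figured out is how to pass these options to the container (presumably an environment variable or a config file, depending on the image):

     # --optimize 0 leaves images alone (1 = lossless only, 2/3 = lossy, smaller files).
     # The PDF/A conversion step can apparently also recompress images; --output-type pdf skips it.
     ocrmypdf --optimize 0 --output-type pdf input.pdf output.pdf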
  10. Are there any scripts that back up selected shares (including, for example, Nextcloud) to a SATA unassigned device, with the backup being encrypted? (I want to back up to a drive 2-4 times a year and store it outside my own home.)
  11. So I have gotten my hands on a 3TB surveillance drive (free) and it seems to be in OK shape. So I got the idea of backing up the most important stuff (photos, documents and so on; not close to 3TB, more like 500 GB) onto it in encrypted form, storing it in another location, and every X months taking it home to update it and then putting it back into storage. As it is another location, I would prefer the data to be encrypted. Is there any way I can set it up so that I plug the drive in, press a button in Unraid, the backup/backup update happens automatically (for selectable shares), and then I can take it out and put it back? If so, how?
  12. I am getting the same error with youtube-dl, but I know 100% for sure that I have never removed the sample from it; I don't even know what it is. I only use SWAG with Nextcloud. Though I see that that config was last updated in summer 2020...
  13. Thanks, that seems to have worked; now I am getting this error instead though: And the speed issue that started after updating to Nextcloud 21 is still there, i.e. switching between menus of Nextcloud itself, like Dashboard, Files, or going to Settings, is so much slower than before I updated to 21 (from 20). I think it might be related to NC 21 hitting the CPU much harder for some reason? EDIT: After some googling and changing the min/max spare and start servers, the numbers are only getting larger?? And it's only me that accesses the server, and this only happens when I access Nextcloud via the web; it does not happen via the app. EDIT: I think I found part of the slowness issue... For some reason I tried testing in private browsing, and there it was as fast as it was in NC 20. So I thought that was weird, deleted the cookies related to Nextcloud and my site, and that changed nothing. Then I tried disabling browser plugins one by one, and that fixed it. You know what plugin caused it? Dark Reader. A plugin that is only supposed to make sites darker. Why would it cause that, both the slowness and the excess PHP children on my NC server? (Yes, the errors are gone.) EDIT: The PHP error still happens, but the site is much faster.
  14. My PHP error log reports this error: WARNING: [pool www] server reached pm.max_children setting (5), consider raising it. Can that be the cause of the slowness I am experiencing? If so, how do I change it?
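     For reference, these are the pool directives I assume are involved; they live in PHP-FPM's www.conf pool file, whose exact location inside the container I have not found yet (it depends on the image and PHP version):

     ; illustrative values - max_children is ultimately limited by available RAM
     pm = dynamic
     pm.max_children = 20
     pm.start_servers = 5
     pm.min_spare_servers = 3
     pm.max_spare_servers = 10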
  15. It's been a while and it has not become any faster at switching between the tabs like Dashboard and Files. It's still slower than Nextcloud 20 for me, and it hits the CPU harder.
  16. Thank you! That is what I did wrong. Seems like the name of the file doesn't matter for Nextcloud; I moved the old renamed file one step up in the file structure and it worked. Just curious, what does the "Strict-Transport-Security" HTTP header do? It doesn't seem like that one was even in my old file.
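     For context, an HSTS header in an nginx config generally looks something like this (illustrative values); it tells browsers to only connect to the domain over HTTPS for max-age seconds:

     add_header Strict-Transport-Security "max-age=63072000; includeSubDomains; preload" always;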
  17. Ah, OK, I will do that then. The problem was that I did not find it in that file... I don't know why. After I used the command, a line got added to the end of that config though (or I was blind); it's on the last line now. It's faster to just use that command anyway, if you do it straight away and don't do like me: first open the config, not find it, then find the command on the internet and then run it. Or you could probably just add the line yourself.
  18. The phone region thing: if you don't find it in the config, as happened to me, open the Nextcloud console from the Docker menu and use this command: occ config:system:set default_phone_region --value="NO" Replace NO with the code for the country you are in.
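     That command just writes this entry into config.php (with your own country code):

     'default_phone_region' => 'NO',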
  19. So I updated to 21.0.1, and it became slower when first opening the Files tab or the other main tabs like Dashboard, or when opening the settings page where you check for updates. Once the Files tab is open, it's as fast as normal though. When switching tabs, I think it also uses more CPU than it did before. Does anyone know the reason?
  20. Probably Nextcloud's fault; maybe it doesn't like files being put into its data folder without going through its GUI or its own code. The workaround of having a separate share just to store the backup and adding it as external storage worked; it then shows up in Nextcloud and syncs to other devices just fine. Not ideal, but it works. Thanks.
  21. Yes, I did not know that. Ah, OK, just adding the quotes seems to have fixed it! Well, it does not show up in Nextcloud as I ideally want, but that might be Nextcloud's doing. At least it doesn't make the folder it is moved to uneditable for Nextcloud, the way the appdata backup does.
  22. Tested again, but with the / removed from the end of the directory, and it says this: Then it removes some other things and continues like this, probably until the NAS runs out of memory and crashes? (I ran it manually.) Is there something I have done that causes this?
  23. So I ran this, and this happened: I think it somehow deleted more things than it should have. EDIT: A reboot fixed it, so I think it caused something like a memory leak that stopped Unraid itself from being able to run from memory as it usually does. I had to turn it off with the power button. (I do have a manual flash backup already, btw.) This is what the script looks like for me right now:

     #!/bin/bash
     #### SECTION 1 ####------------------------------------------------------------------------------------------------------
     #dir = WHATEVER FOLDER PATH YOU WANT TO SAVE TO
     dir=/mnt/user/NCloud/Mihle/files/Filer/Backup/NAS Flash Backup/
     echo 'Executing native unraid backup script'
     /usr/local/emhttp/webGui/scripts/flash_backup
     #### SECTION 2 ####------------------------------------------------------------------------------------------------------
     echo 'Remove symlink from emhttp'
     find /usr/local/emhttp/ -maxdepth 1 -name '*flash-backup-*.zip' -delete
     sleep 5
     #### SECTION 3 ####------------------------------------------------------------------------------------------------------
     echo 'Move Flash Zip Backup from Root to Backup Destination'
     mv /*-flash-backup-*.zip "$dir"
     sleep 5
     #### SECTION 4 ####------------------------------------------------------------------------------------------------------
     echo 'Deleting Old Backups'
     #ENTER NUMERIC VALUE OF DAYS AFTER "-MTIME +"
     find "$dir"* -mtime +90 -exec rm -rfv {} \;
     echo 'All Done'
     #### SECTION 5 ####------------------------------------------------------------------------------------------------------
     #UNCOMMENT THE NEXT LINE TO ENABLE GUI NOTIFICATION UPON COMPLETION
     /usr/local/emhttp/webGui/scripts/notify -e "Unraid Server Notice" -s "Flash Zip Backup" -d "A copy of the NAS unraid flash disk has been backed up" -i "normal"
     exit

     When I ran it, it kept saying it could not find directory " " over and over, I think? I don't quite remember. Did I write something into the script wrongly?
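     EDIT: The problem turned out to be the unquoted path (see item 21 above): it contains spaces, so bash ends the assignment at the first space and $dir is left empty for the rest of the script. That would also explain the deletions, since the cleanup step (find "$dir"* -mtime +90 ... rm) then runs against the working directory instead of the backup folder. Quoting the assignment fixed it:

     dir="/mnt/user/NCloud/Mihle/files/Filer/Backup/NAS Flash Backup/"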
  24. Ah, searched and found this one: EDIT: That way of doing the flash backup works better for how I want to use it than what this plugin does.