CrimsonTyphoon Posted August 22, 2017

5 hours ago, Phastor said: Where's a good place to map /tmp? Does it get large? Would I be able to map it to something like /user/appdata/duplicati/tmp without an issue?

I have mine mapped to /tmp, so the host and the container are both /tmp. If my understanding is correct, /tmp is in your RAM, so make sure you have enough.

Edited August 22, 2017 by CrimsonTyphoon
CrimsonTyphoon Posted August 23, 2017

(Yes, I quoted myself)

49 minutes ago, CrimsonTyphoon said: I have mine mapped to /tmp, so the host and the container are both /tmp. If my understanding is correct, /tmp is in your RAM, so make sure you have enough.

On second thought, don't map /tmp to /tmp until you hear from someone else. I am not an expert. I mapped my /tmp to a cache-only share. Can someone else chime in on whether it's okay to map /tmp to /tmp?
scytherbladez Posted August 23, 2017

I came here looking for alternatives after CrapPlan bailed on home consumers. The SpaceInvader One video on Duplicati is worth watching. In it he discovered that the default /tmp folder could potentially fill up your "docker" image file (and he details how to remap it). Since that video, LS have added a variable to make this mapping easier: you just tell it where you want tmp files stored. For the majority of users the cache should suffice, depending on the upload volume size.
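For anyone starting the container by hand rather than through the template, a rough sketch of the remap being described might look like this. The host path is an example, not a template default — point it at any cache-backed folder you've created (the command is echoed rather than executed so you can review it first):

```shell
# Example host folder for Duplicati's temp files (an assumption,
# not the LS template default) -- any cache-backed path works.
TMP_HOST_DIR="/mnt/cache/appdata/duplicati/tmp"

# Print the run command for review instead of executing it:
echo docker run -d --name=duplicati -p 8200:8200 \
  -v "${TMP_HOST_DIR}:/tmp" \
  -v /mnt/cache/appdata/duplicati:/config \
  linuxserver/duplicati
```

The important part is just the `-v …:/tmp` mapping; everything Duplicati stages before upload then lands on the cache instead of inside the docker image file.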
lovingHDTV Posted August 23, 2017

I decided to give this a try, now that I need to drop Crashplan before Oct 31. Went with Backblaze B2, as I can back up 4 devices for cheaper than Crashplan Business at $10 per device.

I followed the SpaceInvader One video and got it all set up, then did a couple of backup/restore tests to ensure it would work. I have paused/restarted the backup without issue. It will take a while to complete, as I only have 10mb/s upload.

I killed the docker, and upon restart the backup is not automatically resumed. It did report that it will start at the next scheduled time; I have the default of once per day. It did not clean up the unsent dup-* files found on the cache drive where I mapped /tmp. That is a bit disconcerting. Hopefully they will get cleaned up at some point. I do see it making new files after I started it again.

david
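If those leftovers never get cleaned up on their own, a hypothetical manual sweep of the mapped /tmp folder could look like the sketch below. The path and the one-day age threshold are assumptions; only the dup-* pattern comes from the filenames mentioned above:

```shell
# Hypothetical cleanup sketch: delete leftover dup-* temp files older
# than three days from the folder that /tmp is mapped to.
cleanup_dup_tmp() {
  find "$1" -maxdepth 1 -type f -name 'dup-*' -mtime +1 -delete
}

# Example call (path is an assumption):
#   cleanup_dup_tmp /mnt/cache/appdata/duplicati/tmp
```

Best run only while no backup is in progress, since Duplicati may still be using recent temp files.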
Phastor Posted August 23, 2017

I have a top-level share called "backups" that I keep copies of my game server saves in, as well as the Veeam archives for my workstations. That share is allocated to cache only, so based on what I heard here, I decided to put a tmp folder under that and map /tmp to it.

Next question: I did a test run, and the first thing I noticed were the tons upon tons of 50 MB zip files going onto my external backup drive. I did some research and learned about remote volumes. From what I understand, the default size of these volumes was set to 50 MB with the idea that smaller sizes would be easier to move up and down to cloud services. Given that I'll eventually fill my 8 TB and am backing it all up, that puts it in the ballpark of around 160,000 of these little guys sitting on that drive.

I don't intend to push my backups to the cloud, and instead plan to stick with a couple of USB drives that I rotate so that I can have one off-site. With this in mind, should I set the remote volume size to something larger? The majority of what I'm backing up is audio and video, if that's a factor.

And the same as a couple of other people have stated here, I'm here now because of Crashplan deciding to crap on their home users.

Edited August 23, 2017 by Phastor
lovingHDTV Posted August 23, 2017

I am backing up to the cloud and changed mine to 100MB without issue. It all has to do with how long it takes to manage a single file if it is huge. There were issues if the total backup size was > 4TB, but I think those have been fixed.
Phastor Posted August 23, 2017

I kept the volume size at 50MB and did another test. The backup seems to be going really slowly, as it did about 30GB in two hours. I set it to 1GB just for testing and got the same results.

I monitored the shares as the backup was happening, and it seems that the slowness is occurring during the generation of the zip files. It takes about 90 seconds to generate one of the 1GB volumes. Transfer to the USB drive seems fine, as it only takes about 10 seconds to copy the file to the drive after it's generated.

I disabled compression, thinking that might speed things up. Most of my files are audio and video, so I won't benefit much from compression. I'm not really noticing any difference even after doing that. Is there any way I can speed that up?

Edited August 23, 2017 by Phastor
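For reference, the two knobs being discussed in the last few posts correspond to Duplicati advanced options that can be set per backup job. The values below are just the ones tried in this thread, not recommendations:

```
--dblock-size=1GB          # remote volume size (default 50MB)
--zip-compression-level=0  # 0 = store only, no compression
```

Note that even with compression at 0, each volume is still hashed and (by default) encrypted before upload, so CPU can remain the bottleneck when generating volumes.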
lovingHDTV Posted August 24, 2017

Anyone have tips on getting this to send email notifications?

thanks
david
isvein Posted August 25, 2017

What would be best practice for using Duplicati to back up from a remote computer?
JonathanM Posted August 25, 2017

2 hours ago, isvein said: What would be best practice for using Duplicati to back up from a remote computer?

Site-to-site VPN
isvein Posted August 25, 2017

3 hours ago, jonathanm said: Site-to-site VPN

And separate SSH users for the clients? Can't have them all use root for the Duplicati connection.
Gog Posted August 25, 2017

5 hours ago, isvein said: And separate SSH users for the clients? Can't have them all use root for the Duplicati connection.

I also decided to try this to replace Crashplan. For my "crashplan friends", I just set up an FTP server on unRAID. The data is encrypted by the Duplicati client, so privacy is OK, if not hardcore. I created a bunch of FTP users and told them to use different root directories if they back up different PCs.
isvein Posted August 26, 2017

8 hours ago, Gog said: I also decided to try this to replace Crashplan. For my "crashplan friends", I just set up an FTP server on unRAID. The data is encrypted by the Duplicati client, so privacy is OK, if not hardcore. I created a bunch of FTP users and told them to use different root directories if they back up different PCs.

That works.

I found out that every file and every folder on an unRAID server (I guess I should have checked this before) has all access (777). So if an FTP user connects to the server with FTP, they have access to all files even if they should not have. (It looks like it's only the SMB/AFP permissions that block access.)

But I also have a separate unRAID server that is used only for backup, and since everything there is encrypted, it's OK for friends and family to have access.
scytherbladez Posted August 26, 2017

30 minutes ago, isvein said: I found out that every file and every folder on an unRAID server (I guess I should have checked this before) has all access (777). So if an FTP user connects to the server with FTP, they have access to all files even if they should not have.

I've never used unRAID's FTP server, but that is good (or should I say bad) to know.
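One hedged way to tighten those 777 permissions from the command line is sketched below. The path is an example, and whether chmod sticks on an unRAID user share (and survives tools like New Permissions) is an assumption worth verifying on a test folder first:

```shell
# Sketch: drop "other" access on a share exposed over FTP.
# Directories become 770 (traversable by owner/group only),
# files become 660 (no read/write for "other").
restrict_share() {
  share="$1"
  find "$share" -type d -exec chmod 770 {} +
  find "$share" -type f -exec chmod 660 {} +
}

# Example call (path is an assumption):
#   restrict_share /mnt/user/backups
```

The FTP users would then need to be in the owning group of the share to reach their own directories.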
isvein Posted August 26, 2017

19 hours ago, jonathanm said: Site-to-site VPN

Looks like a site-to-site VPN, with the client connecting to the backup share over SMB/AFP, is the most secure option.

Edited August 26, 2017 by isvein
allanp81 Posted August 26, 2017

This docker seems to create a public share named "HttpServer" whenever it starts, which the Fix Common Problems plugin then flags as a problem. Anyone know why this share is created and what its purpose is?
CHBMB Posted August 26, 2017

3 minutes ago, allanp81 said: This docker seems to create a public share named "HttpServer" whenever it starts, which the Fix Common Problems plugin then flags as a problem. Anyone know why this share is created and what its purpose is?

Not seen that myself. Perhaps if you attach your docker run command, some logs, and some config information, we can see what's happening.
allanp81 Posted August 26, 2017

This is the output from updating the docker:

root@localhost:# /usr/local/emhttp/plugins/dynamix.docker.manager/scripts/docker run -d --name="duplicati" --net="bridge" -e TZ="Europe/London" -e HOST_OS="unRAID" -e "PUID"="99" -e "PGID"="100" -p 8200:8200/tcp -v "/mnt/cache/":"/tmp":rw -v "/mnt/disks/BUFFALO_External_HDD/Backup/duplicati/":"/backups":rw,slave -v "/mnt/user/P/":"/source":ro,slave -v "/mnt/cache/appdata/duplicati":"/config":rw linuxserver/duplicati
a1b4a215c1f22c23e12a7d5675ab3d7f83eb0a9590e97008c63faeb2bbc3840b

I'm not sure if this is what you mean by run command. If I bring up the log from the row under docker it shows me:

[s6-init] making user provided files available at /var/run/s6/etc...exited 0.
[s6-init] ensuring user provided files have correct perms...exited 0.
[fix-attrs.d] applying ownership & permissions fixes...
[fix-attrs.d] done.
[cont-init.d] executing container initialization scripts...
[cont-init.d] 10-adduser: executing...
-------------------------------------
[linuxserver.io ASCII logo]
Brought to you by linuxserver.io
We gratefully accept donations at:
https://www.linuxserver.io/donations/
-------------------------------------
GID/UID
-------------------------------------
User uid: 99
User gid: 100
-------------------------------------
[cont-init.d] 10-adduser: exited 0.
[cont-init.d] 30-config: executing...
[cont-init.d] 30-config: exited 0.
[cont-init.d] done.
[services.d] starting services
[services.d] done.
Server has started and is listening on 0.0.0.0, port 8200

I have about 10 other dockers and don't seem to have this issue with any others.
CHBMB Posted August 26, 2017

It's because you're mounting /mnt/cache/ to /tmp.
allanp81 Posted August 26, 2017

Why would that create a share named HttpServer, if you don't mind me asking?
allanp81 Posted August 26, 2017

Ah, I see why: because that directory would otherwise be created under /tmp?
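For anyone hitting the same thing: the run command earlier in the thread maps the whole cache drive ("/mnt/cache/":"/tmp"), so anything Duplicati writes under /tmp — including its HttpServer folder — lands at the top of the cache, where unRAID treats it as a share. Pointing the mapping at a dedicated subfolder instead avoids that; the exact path below is just an example:

```
-v "/mnt/cache/appdata/duplicati/tmp":"/tmp":rw
```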
CHBMB Posted August 26, 2017

Yep
Gog Posted August 26, 2017

6 hours ago, isvein said: That works. I found out that every file and every folder on an unRAID server (I guess I should have checked this before) has all access (777). So if an FTP user connects to the server with FTP, they have access to all files even if they should not have. (It looks like it's only the SMB/AFP permissions that block access.) But I also have a separate unRAID server that is used only for backup, and since everything there is encrypted, it's OK for friends and family to have access.

Yeah, I figured that with proftpd restricting access to the backup directory it wasn't too bad, but it still leaves me a bit twitchy. I'm trying to switch to SFTP now, but I'm fighting with the public key authentication method. I'll post info if I figure it out.
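For anyone fighting the same thing, here is a rough sketch of one common SFTP-only, key-based setup — the username and paths are made up, and this is not necessarily what Gog ended up doing:

```
# On the client: generate a key pair and give the .pub to the server admin:
#   ssh-keygen -t ed25519 -f ~/.ssh/duplicati_backup -N ""
# On the server: append the .pub to the backup user's authorized_keys,
# then lock the user down in sshd_config:
Match User backupfriend
    ForceCommand internal-sftp
    ChrootDirectory /mnt/user/backups/backupfriend
    PasswordAuthentication no
    AllowTcpForwarding no
```

One sshd gotcha: the ChrootDirectory itself must be owned by root and not writable by the user, so Duplicati has to be pointed at a writable subfolder inside it.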
lovingHDTV Posted August 26, 2017

On 8/24/2017 at 4:55 PM, lovingHDTV said: Anyone have tips on getting this to send email notifications?

I asked and got an answer on the Duplicati forums, and verified that email works just fine in the docker: https://forum.duplicati.com/t/how-to-set-up-email-notification/233/12

Hopefully this helps someone,
david
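For the impatient, Duplicati's built-in mail reporting is configured through its send-mail advanced options on the backup job. The sketch below shows the general shape; server, credentials, and addresses are all placeholders, and the linked forum thread has the details:

```
--send-mail-url=smtps://smtp.example.com:465
--send-mail-username=me@example.com
--send-mail-password=secret
--send-mail-from=duplicati@example.com
--send-mail-to=me@example.com
--send-mail-level=Warning,Error,Fatal
```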
isvein Posted August 27, 2017

Maybe not in the scope of this thread, but does anyone know how I can change the TMP folder Duplicati on Windows uses? It uses the default tmp folder, and it fills up my SSD.

Edit: I changed the system location of TMP; no need for Windows to use the SSD as tmp storage.

Edit 2: How small/big should the upload volume size be?

Edited August 28, 2017 by isvein (no need to create a new post)
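As an alternative to moving the whole system TEMP, Duplicati also has a --tempdir advanced option that relocates only its own temp files, on Windows as well as in the docker. The path below is just an example:

```
--tempdir=D:\DuplicatiTmp
```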