Djoss (Author) Posted February 10, 2020
On 2/4/2020 at 3:35 PM, ezzys said: Can I use this to back up to a local NAS too?
Yes, if your NAS is accessible via a local directory path.
Larz Posted March 29, 2020
I have installed the CloudBerry Backup Docker container and it is working well. Thanks very much for setting this up for us. I have configured the system to store to IDrive Cloud using the CloudBerry-specific configuration details provided by IDrive. Good job, IDrive.
I first configured CloudBerry to use the S3-compatible API, but I could not get a list of my buckets in the Storage > Add Account > Bucket drop-down list. I tried several times with the S3 keys and endpoint specified by IDrive. Switching to the OpenStack API worked fine. Have others been successful using the S3-compatible storage API? Have others worked with IDrive Cloud? Thanks in advance for any feedback.
P.S. IDrive Cloud offers 2 TB of cloud storage for $69.50 US, discounted to $6.95 for the first year. They offer a 5 TB account for under $100.
Djoss (Author) Posted April 1, 2020
On 3/29/2020 at 10:56 AM, Larz said: I first configured CloudBerry to use the S3-compatible API, but I could not get a list of my buckets in the Storage > Add Account > Bucket drop-down list. Switching to the OpenStack API worked fine. Have others been successful using the S3-compatible storage API?
I never tried iDrive Cloud, but if they offer an S3-compatible API and it doesn't work, you may have better luck checking with them.
quinnjudge Posted May 4, 2020
Hello, hoping to get a little direction... I've been using this container for a few months to back up to Backblaze B2, and it has been terrific! Recently a couple of files have been giving me trouble, and I'm not sure where to start as far as troubleshooting. When I try to back up files generated by CA Appdata Backup / Restore v2, I get the following message in CBB: SSL_write() returned SYSCALL, errno = 32, and the backup job fails. When I remove these files from the backup job it runs successfully, but adding the files back into the job causes the error again. Any idea where the problem may lie, or who to start the right conversation with? Thanks in advance!
Djoss (Author) Posted May 4, 2020
3 hours ago, quinnjudge said: When I try to back up files generated by CA Appdata Backup / Restore v2, I get the following message in CBB: SSL_write() returned SYSCALL, errno = 32, and the backup job fails.
Did you set upload bandwidth limits?
quinnjudge Posted May 5, 2020
17 hours ago, Djoss said: Did you set upload bandwidth limits?
Yes; my thought was to ensure a backup does not interfere with web conferencing software (I'm still working full-time from my home office). I have the limit for cloud storage set to approximately 80% of my available upload bandwidth.
Djoss (Author) Posted May 6, 2020
On 5/5/2020 at 1:42 PM, quinnjudge said: Yes; I have the limit for cloud storage set to approximately 80% of my available upload bandwidth.
It seems that setting a bandwidth limit can cause SSL timeouts. Did you try removing it to see if that fixes the issue?
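For reference, errno 32 on Linux is EPIPE ("broken pipe"): the remote end closed the connection while the client was still writing, which is consistent with a throttled upload stalling until the server drops the TLS session. A quick, purely illustrative Python check of what that errno means:

```python
import errno
import os

# errno 32 on Linux is EPIPE ("broken pipe"): the peer closed the
# connection while we were still writing to it. A heavily throttled
# upload can stall long enough for the server to drop the TLS session,
# after which the next SSL_write() fails with exactly this errno.
code = 32
print(errno.errorcode[code])  # symbolic name for errno 32
print(os.strerror(code))      # human-readable description
```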
quinnjudge Posted May 7, 2020
1 hour ago, Djoss said: It seems that setting a bandwidth limit can cause SSL timeouts. Did you try removing it to see if that fixes the issue?
I removed the bandwidth limit in CBB, and the backup job was successful; no errors on those files like before. Thanks!!! So, is this an issue with CBB or with Backblaze B2?
Djoss (Author) Posted May 8, 2020
On 5/6/2020 at 9:17 PM, quinnjudge said: So, is this an issue with CBB or with Backblaze B2?
It seems to be an issue on CBB's side... you can always complain to their support team.
CorneliousJD Posted May 13, 2020
I'm going to try using this to get away from CrashPlan (I hit inotify limits and high resource usage with CP Pro daily, and my pricing finally went up to $10/mo a while back).
That being said, I'm trying to use it with Google Archive storage. I think I'd like to also supplement with a local backup disk some day, so I do not have to rely on Google Cloud for restores unless disaster strikes. My understanding of Google Archive storage is that pricing is only $0.0012/GB/mo, very cheap, but any file that touches the service is automatically billed for a full year of storage: 365 days. I have my retention policy set up like this; would this be accurate to get the best "bang for my buck" with their storage, since I'm paying for 365 days of storage per file anyway? I also have daily block-level backups with monthly forced full backups.
P.S. The /flash mapping to /boot no longer works, it seems; the flash folder is empty inside the container.
P.P.S. If Google Archive storage is a waste of time and I should just go for B2, that's fine; I'm just trying to keep costs as low as possible. With my ~800GB of data to back up, this is just less than $1/mo where B2 is $4/mo. Thanks in advance!
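For what it's worth, the arithmetic behind that comparison can be sketched as follows. The rates are assumptions based on published pricing at the time (Archive class at $0.0012/GB/month with a 365-day minimum storage duration, B2 at $0.005/GB/month with no minimum); check current pricing pages before relying on any of this:

```python
# Rough monthly-cost comparison for ~800 GB of backup data.
# Assumed rates (verify against current pricing pages):
#   Google Cloud Storage, Archive class: $0.0012/GB/month, 365-day minimum
#   Backblaze B2:                        $0.005/GB/month, no minimum duration
data_gb = 800

archive_monthly = data_gb * 0.0012  # about $0.96/month at rest
b2_monthly = data_gb * 0.005        # about $4.00/month at rest

# The Archive catch: each object is billed for a full 365 days, so a
# version purged after 30 days still incurs the remaining 335 days as an
# early-deletion charge. With daily block-level backups and monthly
# purges, that churn can easily exceed the at-rest savings.
days_remaining = 365 - 30
early_delete_per_gb = 0.0012 * days_remaining / 30  # charge per GB purged early

print(f"Archive at rest: ${archive_monthly:.2f}/mo")
print(f"B2 at rest:      ${b2_monthly:.2f}/mo")
print(f"Archive early-delete penalty: ~${early_delete_per_gb:.4f}/GB")
```

That early-deletion penalty is why a cheap-looking archive tier can cost $20 in two days when data churns, exactly as reported in the follow-up post below in the thread.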
CorneliousJD Posted May 15, 2020
On 5/13/2020 at 3:34 PM, CorneliousJD said: With my ~800GB of data to back up, this is just less than $1/mo where B2 is $4/mo.
Well, this ended up being $20 for 2 days' worth of backups... Looks like I'll migrate over to B2. What are everyone's retention and delete policies for B2? My initial thought would be to keep 30 days from the modification date, always keep the last version, keep 3 versions, and delete 30 days after deletion as well, so we aren't dealing with too much data retention. Thanks all.
CorneliousJD Posted May 27, 2020
Alright, I'm still jumping through some hoops getting this set up properly to give me minimal B2 usage while still being efficient enough to give me what I need. I have it set up like this currently, and I think it's right. The goal here is to:
- Back up CA Appdata backup files (generated weekly) and keep them for 30 days.
- Back up direct appdata/nextcloud/some other files as well.
- Only keep X days of versions.
- Keep the latest 3 versions at minimum (even if older than X days).
- If anything is explicitly deleted or unchecked from backup jobs, delete it from B2 after 30 days.
Does this look correct to you guys? If there's a better retention policy that you think I should be using, please let me know. I'm currently debating keeping 7, 14, or 30 days of versions.
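To sanity-check a policy like that, here is a minimal Python sketch (my own illustration, not CloudBerry's actual purge logic) of which versions such rules would retain, assuming the rules are "keep anything newer than X days, plus the latest 3 versions regardless of age":

```python
from datetime import datetime, timedelta

def versions_to_keep(version_dates, keep_days=30, min_versions=3, now=None):
    """Illustrative retention: keep every version newer than `keep_days`,
    plus the `min_versions` most recent versions even if they are older."""
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=keep_days)
    newest_first = sorted(version_dates, reverse=True)
    kept = set(newest_first[:min_versions])              # always keep latest N
    kept.update(d for d in newest_first if d >= cutoff)  # plus recent versions
    return sorted(kept, reverse=True)

# Five versions, only two within the last 30 days: the policy still keeps
# three of them (the two recent ones plus the newest of the older ones).
now = datetime(2020, 5, 27)
dates = [now - timedelta(days=d) for d in (1, 10, 45, 60, 90)]
kept = versions_to_keep(dates, now=now)
print([f"{(now - d).days}d old" for d in kept])
```

The "keep latest 3" floor means old versions are never purged down to nothing, while the day cutoff bounds how much history accumulates for frequently changing files.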
CorneliousJD Posted May 27, 2020
I keep running into an issue while trying to test this container too... I just set up a fresh install of it and a fresh B2 bucket, but now I'm seeing that purges from B2 are failing with "File not present". I saw this last week too, and CloudBerry support isn't being much help here, honestly; I'm not sure what's going on. I just want to back up my entire /appdata/ folder at midnight every night. CrashPlan is currently doing this without complaining (I'm also not ever purging anything there), but I'm really trying to get away from CrashPlan and thought CloudBerry would be leaps and bounds better. This hasn't even been up and running for a few hours yet and purges are already failing?
Djoss (Author) Posted May 28, 2020
19 hours ago, CorneliousJD said: I just set up a fresh install of it and a fresh B2 bucket, but now I'm seeing that purges from B2 are failing with "File not present".
I reported this issue a few years ago... so it looks like no fix has been done yet. I suspect that this happens to files that were present during the scan but got deleted from the host before being uploaded to the cloud.
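That theory is easy to illustrate: if the file list is snapshotted at scan time, anything deleted from the host before its upload (or before a later purge pass) will be reported as missing. A toy Python sketch of the race, purely hypothetical and not CloudBerry's actual implementation:

```python
# Toy illustration of the suspected scan-vs-upload race (hypothetical,
# not CloudBerry's code). The scanner snapshots the file list, a file
# disappears from the host mid-backup, and a later step that revisits
# the snapshot fails with a "file not present" style error.
host_files = {"app.cfg", "db.sqlite", "cache.tmp"}
scanned = set(host_files)        # snapshot taken at scan time

host_files.discard("cache.tmp")  # deleted on the host before upload

missing = [f for f in scanned if f not in host_files]
print(missing)  # the file the upload/purge step can no longer find
```

Short-lived files such as CA Appdata's temporary archives or application caches would fit this pattern, since they frequently appear and vanish between the scan and the upload.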
Smackover Posted July 11, 2020
With CloudBerry Labs now MSP360, is there any concern that this software will go away? I'm finally getting around to setting up cloud backup for my server, and am debating going this way or just using rclone. The endpoint will be Backblaze B2 regardless.
Djoss (Author) Posted July 20, 2020
On 7/11/2020 at 4:59 PM, Smackover said: With CloudBerry Labs now MSP360, is there any concern that this software will go away?
I'm personally not too concerned. The company has been renamed, but they still sell and develop the software.
jasonculp Posted July 31, 2020 (edited)
I am sure I am missing something very simple. I just cannot get this to work. I downloaded CloudBerry Backup from the apps section and used the defaults in the settings. It shows as running, but says "Server Disconnected (code: 1006)" when I go to [ip_address:7802]. Thank you!
Edit: This is solved. I tried deleting my cache in Chrome, but that didn't work. After quite a bit of trial and error, I deleted everything in Chrome, and now it works.
Edited July 31, 2020 by jasonculp: Problem solved
DivideBy0 Posted August 10, 2020
I have configured both CloudBerry and Duplicati for backups to the same server via SFTP. Of course, I ran and tested them one at a time, and here is my puzzle to solve: with CloudBerry my upload speed tops out at 4-6 MB/s, whereas with Duplicati it goes all the way up to 16-18 MB/s. They both use the same server and the same SFTP protocol. I have a 1G ISP connection, but my dilemma is why CloudBerry can't match at least the Duplicati upload speed. Am I doing something wrong? CloudBerry speed below. Any custom settings I should change?
Djoss (Author) Posted August 10, 2020
14 minutes ago, johnwhicker said: With CloudBerry my upload speed tops out at 4-6 MB/s, whereas with Duplicati it goes all the way up to 16-18 MB/s.
I assume you did not enable compression/encryption? It seems that there are a few things you can do to improve upload speed. See: https://kb.msp360.com/standalone-backup/general/how-to-increase-upload-speed
DivideBy0 Posted August 10, 2020
47 minutes ago, Djoss said: I assume you did not enable compression/encryption? It seems that there are a few things you can do to improve upload speed.
Thanks. No, encryption is not enabled at all. Actually, with Duplicati encryption is enabled and I still get 16 MB/s. I did try all those various settings and still see no improvement. There's got to be something I am missing.
DivideBy0 Posted August 10, 2020
Yeah, it's clearly something with CloudBerry. I get 18 MB/s with Duplicati with encryption on.
DivideBy0 Posted August 10, 2020
And with Duplicity I got almost 40-60 MB/s, so man, what a difference between the various clients.
uhf Posted October 9, 2020
Can CBB be updated from within the GUI, or do I need to wait for the Docker image to be updated? And do I need to worry about paying for annual maintenance?
Djoss (Author) Posted October 14, 2020
On 10/8/2020 at 10:00 PM, uhf said: Can CBB be updated from within the GUI, or do I need to wait for the Docker image to be updated? And do I need to worry about paying for annual maintenance?
Yes, you need to wait for the Docker image to be updated. As for the annual maintenance, it's up to you: you won't get support if you don't have it, but that has not been a problem for me. They also have forums where you can seek help.
domingothegamer Posted November 16, 2020 (edited)
Hello! I was wondering if anyone else is having issues with the current version of the Docker image. I'm having time zone issues using MinIO, which in my case is hosted on the TrueNAS Core S3 service feature. It sees the SSL certificate and accepts the connection, but gives a time zone error. I searched earlier in this topic and it seems to be something that has to be fixed on the build's end. Thanks for your time!
Edited November 16, 2020 by domingothegamer: wording