Djoss Posted August 23, 2017 (Author): It does not monitor each file for changes like CrashPlan does. However, you can configure the backup to run as often as every minute if you want.
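If you would rather drive the schedule from the host instead of the in-app scheduler, a cron entry could trigger a plan through the container. This is a hypothetical sketch only: the container name `CloudBerryBackup`, the `cbb` binary path, and the plan name "Nightly" are all assumptions you would need to verify against your own install.

```shell
# Hypothetical: run the CloudBerry plan "Nightly" every minute via cron on the host.
# Container name, cbb path, and plan name are assumptions; check your setup.
* * * * * docker exec CloudBerryBackup /opt/local/"CloudBerry Backup"/bin/cbb plan -r "Nightly" >> /var/log/cbb-cron.log 2>&1
```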
Djoss Posted August 23, 2017: 10 hours ago, Djoss said: "And if it's possible for you, can you try to remove the container and its folder under appdata, then re-install using the Community Apps plugin, keeping all default settings?" @daniel329, I've just tested with a new and fresh install using the Free version. I can restart the container without losing anything. Any developments on your side?
Evgeny Posted August 29, 2017: Awesome toy, Djoss! There is no limitation on storage support or on the product life cycle (details are at https://www.cloudberrylab.com/backup.aspx, right at the bottom).
Djoss Posted August 29, 2017: Make sure to look at the Linux version, which doesn't have the same features. There are multiple editions you can get. Look at the bottom of the page: https://www.cloudberrylab.com/backup/linux.aspx
toy4x4 Posted August 30, 2017: Thanks for the Docker container! I had picked up a free Linux Pro license in April. Working great!
ksignorini Posted August 30, 2017: @Djoss I see a ton of "/" locations in the file selection window. None of them have subdirectories. Is this normal?
Djoss Posted August 30, 2017: I guess not. Do you have a screenshot?
ksignorini Posted August 31, 2017: 9 hours ago, Djoss said: "I guess not. Do you have a screenshot?" One with all "/" entries closed. One with the top "/" open. One with the top "/" open and all the bottom "/" entries clicked once (on the disclosure triangles) to show that they're empty.
Djoss Posted August 31, 2017: I finally see the same thing as well... I will try to see if I can remove the last three "/" entries.
ksignorini Posted August 31, 2017: 21 minutes ago, Djoss said: "I finally see the same thing as well... I will try to see if I can remove the last three "/" entries." It gets weirder. If I map unRAID's /mnt (rather than /mnt/user) to /storage, or, as you can see in the screenshots below, map unRAID's /mnt into a different container directory, all the drives show up at the highest level of the list and not in the correct spot in the container hierarchy. It's weird, I tell ya. I haven't seen that happen with any other Docker containers.
Djoss Posted September 1, 2017: The problem seems to be with the way CloudBerry Backup handles mounts inside the container. I've opened a case with their support team. In the meantime, you can simply ignore the extra "/" locations.
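For reference, the volume mappings under discussion correspond to something like the following `docker run` invocation. This is a sketch based on the container's usual defaults; the image name, published port, and appdata path are assumptions to adjust for your own setup.

```shell
# Sketch of the mappings discussed above (image name, port, and paths are
# assumptions; adjust for your setup). Mapping /mnt/user (rather than /mnt)
# into /storage is the template default. The extra "/" entries come from
# mounts created inside the container itself, not from these mappings.
docker run -d --name=CloudBerryBackup \
  -p 7802:5800 \
  -v /mnt/user/appdata/CloudBerryBackup:/config:rw \
  -v /mnt/user:/storage:rw \
  jlesage/cloudberry-backup
```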
toy4x4 Posted September 1, 2017: Anyone else having an issue with backups to B2 stalling after about 10%? I have about 16 GB left to upload, down from the original 19 GB. It uploads some and then stalls; I can restart it, but I see some of my larger video files re-upload many times.
toy4x4 Posted September 1, 2017: I bumped up the advanced network settings and it seems to be better. I think the main change that helped was the memory-used setting under Application -> Advanced settings. I changed it to 4096 and it seems to have replaced that with 0. Seems to be working for now; I'll keep watching.
Djoss Posted September 1, 2017: Good to know, keep us posted. Did you have any errors in the log files (/config/logs)?
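If you want to scan those logs from the command line, something like the following works. The sketch below is self-contained (it fakes a log directory so it can run anywhere); on a real install you would point `LOGDIR` at the host path of the container's /config/logs, e.g. an appdata path like /mnt/user/appdata/CloudBerryBackup/logs (that exact path is an assumption).

```shell
# Self-contained sketch: scan CloudBerry log files for errors.
# On a real install, set LOGDIR to the host path of /config/logs.
LOGDIR=$(mktemp -d)
printf 'INFO: backup started\nERROR: upload failed (timeout)\n' > "$LOGDIR/engine.log"

# Case-insensitive recursive search for errors/failures, most recent lines last:
grep -riE 'error|fail' "$LOGDIR" | tail -n 20
```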
1812 Posted September 3, 2017: I'm seeing no storage limit on the free version when using the local file system for backup. Or at least, it hasn't stopped at 200 GB as advertised (I'm at 2 TB, backing up to a local server...). Can anyone else confirm?
Phastor Posted September 6, 2017: Started testing this out. It seems my choice lies between this and Duplicati. I completed my first test backup and it seems to have skipped a lot of files; many things are missing in the backup target. I'm running the backup again and it seems to be picking up what it missed, but now I'm wondering if it will still miss things on this pass. Has anyone else seen this? I'm hoping this is something that can be fixed, because I'm already impressed at how fast it runs to my USB 3 drive compared to Duplicati.
scytherbladez Posted September 6, 2017: 44 minutes ago, Phastor said: "Started testing this out. It seems my choice lies between this and Duplicati." Same. However, I'm thinking that neither of these is an adequate replacement for what CrashPlan can actually do (reliable, unlimited storage, unlimited versioning). In fact, it's made me realise how good CrashPlan actually is.
Djoss Posted September 6, 2017: 5 hours ago, Phastor said: "I completed my first test backup and it seems to have skipped a lot of files; many things are missing in the backup target. I'm running the backup again and it seems to be picking up what it missed, but now I'm wondering if it will still miss things on this pass. Has anyone else seen this?" I assume these files were not added while the backup was running? What is the size of the files it skipped? Big ones? Did you look at the logs (/config/logs) to see if there are any errors?
Phastor Posted September 6, 2017: 1 hour ago, Djoss said: "I assume these files were not added while the backup was running? What is the size of the files it skipped? Big ones? Did you look at the logs (/config/logs) to see if there are any errors?" Well, it just got weirder. During the second backup, I watched the target drive as the files were added. It actually does appear to back up everything selected, but then deletes files from the target after the backup completes, leaving only the files that were there after the first attempt. The files in this test backup range from a few KB (documents) to a couple of GB (a short video). The test backup as a whole is about 100 GB.
Djoss Posted September 6, 2017: Did you try to restore them to see if they are really missing?
Phastor Posted September 6, 2017: When going into restore within CloudBerry, the only files available to restore are those I see when manually looking through the target drive: just the ones it didn't delete after completing the backup.
Djoss Posted September 6, 2017: And what does the history (accessible from the left pane) say about these files?
Phastor Posted September 6, 2017: I filtered by "Purge" and found that they are indeed being purged after the backup. I have my retention set to keep backups going back a month. Shouldn't it only be doing this to file versions that are older than a month?
1812 Posted September 6, 2017: Unless you have it set to delete after local deletion.
Djoss Posted September 6, 2017: Also, is the option "Always keep the last version" selected? What is the time/date of the files not in the backup? Is it possible that the time of the first version is the same as that of the files?
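For what it's worth, the retention behaviour being debated can be sketched as follows. This is an illustration of the rule, not CloudBerry's actual code: versions older than the retention window become purge candidates, but an "always keep the last version" option protects the newest version of each file even when it falls outside the window.

```shell
# Illustration of a retention rule (not CloudBerry's actual implementation):
# purge versions older than 30 days, but always keep the newest version.
# Uses a temp dir with fake version files so the demo is self-contained.
DIR=$(mktemp -d)
touch -d '40 days ago' "$DIR/report.doc.v1"
touch -d '35 days ago' "$DIR/report.doc.v2"   # newest version, yet outside the window

# Purge candidates: versions modified more than 30 days ago...
old=$(find "$DIR" -type f -mtime +30 | sort)

# ...except the newest version of each file, which is always kept.
newest=$(ls -t "$DIR" | head -n 1)
for f in $old; do
  if [ "$(basename "$f")" = "$newest" ]; then continue; fi
  rm "$f"
done

ls "$DIR"   # only report.doc.v2 remains
```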