1812 Posted September 6, 2017

Speaking of time, how do you change it in the program? Mine is wrong.
Phastor Posted September 6, 2017

40 minutes ago, Djoss said: "Also, is the option "Always keep the last version" selected? What is the time/date of the files not being in the backup? Is it possible that the time of the first version is the same as that of the files?"

I just did another backup attempt with "Always keep the last version" selected. That seemed to fix it; it retained everything this time. I figured it would have hung onto everything without that selected, as long as the backups were not older than a month. Does it determine this by the file's last modified date rather than by the time the backup was actually taken?

I also have the same issue as 1812's comment above: the time is off.
Djoss Posted September 6, 2017

52 minutes ago, Phastor said: "Does it determine this by the file's last modified date rather than by the time the backup was actually taken?"

I don't know, but no matter which time is used, I think "Always keep the last version" should be selected if you don't want your files to be purged eventually.

57 minutes ago, Phastor said: "And I have the same issue as 1812's comment above. The time is off."

Fix is coming!
Phastor Posted September 6, 2017

Eew. Just changing a file's name (or that of its parent folder) or moving the file triggers a full duplicate instance of that file in the backup. Since most of my data is fairly large video files, that's a deal breaker for me. I understand that's a limitation of the software itself and not of the Docker container. Thanks anyway for your help on this!
Djoss Posted September 6, 2017

8 minutes ago, Phastor said: "Since most of my data is video, that's a deal breaker for me."

Because of the space taken by the backup? Files deleted locally can also be removed from the backup.
Djoss Posted September 6, 2017

By the way, a new container image with the timezone fix is available.
Phastor Posted September 6, 2017

Thanks for the quick timezone fix!

Partially for the space, partially for the backup duration, and mostly because my backup drive can't hold more than one copy of my video. I suppose I could create a separate backup plan for video that immediately removes files that were locally deleted. However, if I were to move a video, the software would still want to create a backup of that video in its new location before deleting the old one, right? Just renaming the top-level directory of my video files would trigger a full copy of every video, unless I'm misunderstanding this. At over a TB of video, that would be a long backup for a small change.
Djoss Posted September 6, 2017

Yeah, renaming stuff creates duplicates in the backup. However, this behavior would be the same for any backup application that doesn't support deduplication...
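For context, deduplication-aware backup tools typically identify data by a content hash rather than by path, so a renamed or moved file maps to data already in the store. A minimal sketch of the idea (the filenames and the trivial `backup` helper here are made up for illustration, not how CloudBerry works internally):

```shell
# Sketch: content-addressed storage keyed by SHA-256.
# A renamed file hashes to the same digest, so a dedup-aware
# backup stores its data only once.
workdir=$(mktemp -d)
store="$workdir/store" && mkdir "$store"

backup() {  # copy a file into the store under its content hash
  hash=$(sha256sum "$1" | cut -d' ' -f1)
  [ -e "$store/$hash" ] || cp "$1" "$store/$hash"
}

echo "pretend this is a large video" > "$workdir/movie.mkv"
backup "$workdir/movie.mkv"
mv "$workdir/movie.mkv" "$workdir/renamed.mkv"
backup "$workdir/renamed.mkv"

ls "$store" | wc -l   # one stored copy despite the rename
rm -rf "$workdir"
```

A path-based backup, by contrast, sees `renamed.mkv` as a brand-new file and uploads it again, which is exactly the behavior described above.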
Phastor Posted September 6, 2017

Yeah, I never looked into whether CloudBerry supported it or not until now. I guess it just doesn't fit my use case. It's a shame, since I really like it! I guess I'm stuck with the slowness of Duplicati.
scytherbladez Posted September 7, 2017

9 hours ago, Djoss said: "Yeah, renaming stuff creates duplicates in the backup."

Goodbye CloudBerry.
1812 Posted September 7, 2017

Something neat I just figured out: if you set up multiple backup plans for different folders (current, cold storage, work, etc.) and are backing up locally without encryption, you can run them all at the same time. If you're using encryption or compression, each subsequent plan waits for the previous one to finish. I thought it was fast before...
bertrandr Posted September 16, 2017

Anyone else seeing very high CPU utilization during backup? I have a decent quad-core E3-1220 v3 @ 3.10GHz in my server, and this CB docker is pushing all 4 cores to 75-100% while running backups. I have tried adjusting the Threads, Chunk size, and Memory settings, but they don't seem to make much of a difference. My current "test" backup is mostly JPG files, encryption on, compression off, backing up to Backblaze B2 buckets. Bandwidth throttle is set to 1024 KByte/s on a 15 Mbps upload internet connection.

Cheers, BR
Djoss Posted September 16, 2017

Do you see the same CPU usage with encryption disabled?
1812 Posted September 17, 2017

I spread my usage over 15 cores... so... no. I see 10-30% usage with encryption. You can set the CPU priority in the Docker settings and the thread count in the GUI if it's using too much.
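For anyone who wants to cap it from the host side as well, Docker itself can restrict a running container's CPU use; a sketch, assuming the container is named `CloudBerryBackup` (substitute whatever yours is actually called):

```shell
# Hard-cap the container to roughly 2 CPU cores' worth of time.
docker update --cpus="2" CloudBerryBackup

# Or lower its relative scheduling weight instead (default is 1024),
# so it only yields CPU when other containers are busy.
docker update --cpu-shares=512 CloudBerryBackup
```

The `--cpu-shares` approach is the softer option: the backup can still use idle cores, but loses out under contention.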
isvein Posted September 17, 2017

Hello. Each time I try to use local storage, or storage mounted from another local server, as the destination for a backup, I get "can't create dir in specified path", even though the share is set to r/w. Anyone know why?
1812 Posted September 17, 2017

1 hour ago, isvein said: "Each time I try to use local storage, or storage mounted from another local server, as the destination for a backup, I get "can't create dir in specified path", even though the share is set to r/w."

Use R/W Slave, and double-check your password/login when mounting the remote location in Unassigned Devices.
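For reference, the "R/W Slave" choice in the container template corresponds to a Docker volume mapping with slave mount propagation, which lets mounts the host creates after the container starts (e.g. via Unassigned Devices) show up inside the container. A hand-written sketch of the equivalent mapping; the paths, container name, and image placeholder are illustrative, not the exact values from the template:

```shell
# "rw,slave" = read-write bind mount with slave propagation:
# host-side mounts made under /mnt/disks after startup still
# propagate into the container at /storage/remote.
docker run -d --name CloudBerryBackup \
  -v /mnt/disks/remote_share:/storage/remote:rw,slave \
  <cloudberry-backup-image>
```

With a plain `rw` mapping, a share mounted on the host after the container started would appear as an empty directory inside the container, which matches the "can't create dir" symptom.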
isvein Posted September 17, 2017

I have tried that now and it's still the same. I tried to have it back up to a Minio server on the other unRAID server. It can take the backup fine, but restore won't work; it looks like it has only read permissions, not write.

Edit: after reading all of the posts here, I got it to write to the original share, but no success with the SMB share I mounted yet. Let's try NFS.
Djoss Posted September 18, 2017

Are you able to write to your share from unRAID itself? Double-check the ownership and permissions of your share...
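A quick way to check this from the unRAID console; the share path here is a hypothetical example, so substitute your own:

```shell
share=/mnt/user/backups            # hypothetical share path; use yours
ls -ld "$share"                    # inspect owner, group, and mode
if [ -w "$share" ]; then
  echo "writable"
else
  echo "not writable - fix ownership/permissions"
fi
# unRAID shares are typically owned by nobody:users; to reset,
# something like:
#   chown -R nobody:users "$share" && chmod -R ug+rw "$share"
```

If the share is writable here but not from inside the container, the problem is more likely the volume mapping or the mount options than the share permissions themselves.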
bertrandr Posted September 18, 2017

On 9/16/2017 at 4:50 PM, Djoss said: "Do you see the same CPU usage with encryption disabled?"

Yes I do; turning encryption on/off made no difference. I also experimented with enabling/disabling the bandwidth throttling feature, but it made no difference either. I restarted the Docker and the backup job after each change. Without throttling, encryption, or compression, I'm not sure why the app is so CPU hungry if all it is doing is copying JPEG files into Backblaze.

I have opened a ticket directly with CloudBerry, and apparently it is a "known issue" and a "fix is on the way". I'm open to suggestions and willing to try other ideas! I know I could pin the Docker to a single CPU thread, but that seems more like a band-aid; I would rather help find a fix.

Cheers, BR
bertrandr Posted September 18, 2017

Not sure if it helps you, but I just heard back from CloudBerry support: apparently the just-posted-today Linux version 2.1.0.81 has a fix.
Djoss Posted September 18, 2017

OK, I will update the container image with the latest version!
isvein Posted September 19, 2017

On 18/09/2017 at 11:47 AM, Djoss said: "Are you able to write to your share from unRAID itself? Double-check the ownership and permissions of your share..."

I got it to work both ways. But I don't see the update for it yet.
Djoss Posted September 20, 2017

A new container image is now available. It includes the latest CloudBerry Backup version, 2.1.0.81.
bertrandr Posted September 20, 2017

Forced an update to pull down the latest image without any issues. Still using the CloudBerry trial key and Backblaze B2 storage over my 15 Mbps (upload) internet connection (Shaw 150 unlimited). The test backup is a 33GB folder on my unRAID server with a couple thousand JPEGs.

The update looks VERY promising! CPU utilization is negligible (5-6% overall) even with encryption enabled, as it should be. Upload speed is maxing out my internet link at a steady 1950+ KB/s (1.95 MB/s). It's been running for 20 minutes with little fluctuation in CPU or upload speed. The next test will be to see if it stalls after an hour like the previous version did. I'll report back when the test finishes...

Thanks for the quick turnaround, Djoss! BR
isvein Posted September 20, 2017

I'm looking at B2 for cold storage too; the pricing isn't as crazy as Amazon S3/Glacier. The update crashed the Docker, so I had to reinstall the CloudBerry container, but it works fine now.

Edit: the backup stops after some time. Is that just a bug for me, or does it happen for others too?