
[Support] Djoss - CloudBerry Backup


Speaking of time, how do you change it in the program? Mine is wrong.

40 minutes ago, Djoss said:

Also, is the option "Always keep the last version" selected?

What is the time/date of the files that aren't in the backup? Is it possible that the time of the first version is the same as that of the files?

 

I just did another backup attempt with "Always keep the last version" selected. That seemed to fix it. It retained everything this time. I figured it would have hung onto everything without that selected, as long as the backups were not older than a month. Does it determine this by the files' last modified date rather than the time the backup was actually taken?

 

And I have the same issue as in 1812's comment above. The time is off.

Edited by Phastor

52 minutes ago, Phastor said:

Does it determine this by the files' last modified date rather than the time the backup was actually taken?

I don't know, but no matter which time is used, I think "Always keep the last version" should be selected if you don't want your files to be purged eventually.

 

57 minutes ago, Phastor said:

And I have the same issue as in 1812's comment above. The time is off.

Fix is coming!


Eew. Just changing a file's name (or that of its parent folder) or moving a file triggers a full duplicate instance of that file in the backup. Since most of my data consists of decently large video files, that's a deal breaker for me.

 

I understand that's a limitation of the software itself and not the docker container. Thanks anyway for your help on this!

Edited by Phastor

8 minutes ago, Phastor said:

Since most of my data is video, that's a deal breaker for me.

Because of the space taken by the backup?  Files deleted locally can also be removed from the backup.


By the way, a new container image with the timezone fix is available.


Thanks for the quick timezone fix!

 

Partially for the space, partially for the backup duration, and mostly because my backup drive can't hold more than one copy of my video. I suppose I could create a separate backup plan for video that immediately removes files that were locally deleted. However, if I were to move a video, the software would still want to create a backup of that video in its new location before deleting the old one, right? Just renaming the top-level directory of my video files would trigger a full copy of every video, unless I'm misunderstanding this. At over a TB of video, that would be a long backup for a small change.

Edited by Phastor


Yeah, renaming stuff creates duplicates in the backup. However, this behavior would be the same for any backup application that doesn't support deduplication...


Yeah, I never looked into whether CloudBerry supported it or not until now. I guess it just doesn't fit my use case. It's a shame since I really like it!

 

I guess I'm stuck with the slowness of Duplicati.

Edited by Phastor


Something neat I just figured out:

 

If you set up multiple backup plans for different folders (current, cold storage, work, etc.) and are backing up locally unencrypted, you can run them all at the same time. If you're using encryption or compression, the next plan has to wait for the first to finish. I thought it was fast before...


Anyone else seeing very high CPU utilization during backup? 

 

I have a decent quad-core E3-1220 v3 @ 3.10 GHz in my server, and this CB docker is pushing all 4 cores to 75-100% while running backups. I have tried adjusting the Threads, Chunk size, and Memory settings, but they don't seem to make much of a difference.

 

My current "test" backup is mostly JPG files, encryption on, compression off, backing up to Backblaze B2 buckets. Bandwidth throttle is set to 1024 KByte/s on a 15 Mbps upload internet connection.
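(For reference, 1024 KByte/s works out to roughly 1024 × 8 ≈ 8.2 Mbit/s, so the throttle is only using a little over half of the 15 Mbps uplink.)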

 

Cheers,

 

BR

 


Do you see the same CPU usage with encryption disabled?


I spread my usage over 15 cores... so... no :) 

 

10-30% usage on encryption.

 

You can set the CPU priority in the Docker settings and the thread count in the GUI if it's using too much.
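
For example, something along these lines in the container's Extra Parameters would rein it in (the values are just an illustration, not a recommendation):

  --cpu-shares=512 --cpuset-cpus=2,3

--cpu-shares lowers the container's relative CPU weight (the Docker default is 1024), and --cpuset-cpus pins it to specific cores, so the rest of the server keeps some headroom.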


hello :)

Each time I try to use local or mounted storage from another local server as storage for a backup I get "cant create dir in specificed path" even if the share is set to r/w.

Anyone know why?

1 hour ago, isvein said:

hello :)

Each time I try to use local or mounted storage from another local server as storage for a backup I get "cant create dir in specificed path" even if the share is set to r/w.

Anyone know why?

 

Use the R/W Slave access mode and double-check your password/login when mounting the remote location in Unassigned Devices.
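
If you're mapping the mounted share into the container by hand rather than through the template, the equivalent docker run option looks something like this (both paths here are just examples, use whatever your template actually maps):

  -v /mnt/disks/backup_remote:/storage:rw,slave

The rw,slave part is what the R/W Slave access mode adds; without slave propagation the container can lose sight of the share if it gets remounted after the container has started.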


I have tried that now and it's still the same :(

I tried to have it back up to a Minio server on the other Unraid server.

It can take the backup fine, but restore won't work.

Looks like it has only read permissions and not write.

 

edit: after reading all of the posts here I got it to be able to write to the original share, but no success with the SMB share I mounted yet.

Let's try NFS.

Edited by isvein


Are you able to write to your share from Unraid itself? Double-check the ownership and permissions of your share...
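
From an Unraid terminal, a quick sanity check looks something like this (the share path is just an example, and nobody:users is the usual Unraid owner):

  # show who owns the share and what the permissions are
  ls -ld /mnt/user/cloudberry-backup
  # confirm the share is actually writable
  touch /mnt/user/cloudberry-backup/.write_test && rm /mnt/user/cloudberry-backup/.write_test
  # if ownership looks wrong, reset it to the usual Unraid default
  chown -R nobody:users /mnt/user/cloudberry-backup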


On 9/16/2017 at 4:50 PM, Djoss said:

Do you see the same CPU usage with encryption disabled?

 

Yes, I do; turning encryption on/off made no difference. I also experimented with enabling/disabling the bandwidth throttling feature, but it made no difference either. I restarted the Docker container and the backup job after each change. Without throttling, encryption, or compression, I'm not sure why the app is so CPU-hungry if all it is doing is copying JPEG files to Backblaze.

 

I have opened a ticket directly with CloudBerry, and apparently it is a "known issue" and a "fix is on the way"?

 

I'm open to suggestions and willing to try other ideas!

I know I could pin the Docker container to a single CPU thread, but that seems more like a band-aid; I would rather help find a fix.

 

Cheers,

 

BR

 

2 minutes ago, bertrandr said:

 

I have opened a ticket directly with CloudBerry, and apparently it is a "known issue" and a "fix is on the way"?

Not sure if it helps you, but I just heard back from CloudBerry support - apparently the Linux version 2.1.0.81 (just posted today) has a fix?


Ok, I will update the container image with the latest version!

On 18/09/2017 at 11:47 AM, Djoss said:

Are you able to write to your share from unraid itself?  Double check the ownership and permissions of your share...

I got it to work both ways :)

 

But I don't see the update for it yet.

Edited by isvein


New container image now available. It includes the latest CloudBerry Backup version 2.1.0.81.
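
If the Docker page doesn't offer the update right away, forcing a check for updates (or pulling the image manually) should pick it up. From the command line that's roughly:

  # repository name assumed here - double-check it against the Repository field of your container template
  docker pull jlesage/cloudberry-backup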



Forced an update to pull down the latest image without any issues. 

Still using the CloudBerry trial key and Backblaze B2 storage over my 15 Mbps (upload) internet connection (Shaw 150 unlimited).

Test backup is a 33 GB folder on my Unraid server with a couple thousand JPEGs.

 

Update looks VERY promising!!!! 

  • CPU utilization is negligible (5-6% overall) even with encryption enabled, as it should be.
  • Upload speed is maxing out my internet link at a steady 1950+ KB/s (1.95 MB/s).
  • Running for 20 min with little fluctuation in CPU or upload speed.
  • Next test will be to see if it stalls after an hour like the previous version.
  • I'll report back when the test finishes...

8) Thanks for the quick turnaround Djoss! 

 

BR



I'm looking at B2 for cold storage too; the pricing isn't as crazy as Amazon S3/Glacier.

The update crashed the container, so I had to reinstall the CloudBerry docker, but it works fine now.

 

 

edit: the backup stops after some time; is that just a bug for me, or does it happen for others too?

Edited by isvein

