Djoss

[Support] Djoss - CloudBerry Backup

201 posts in this topic

Recommended Posts

20 hours ago, Djoss said:

You can just install with the default settings.  In most cases, there is no reason to change them.

I thank you for taking the time to reach out on this.

I feel more confused than ever today but in a good way.

Here's what I've been experiencing.

 

When you go to install the CloudBerry docker, there are three buttons at the bottom:

Apply, Done, and Save.

If I changed nothing and clicked on Done, everything went away and nothing installed.

If I changed nothing and clicked on Save, I got a screen of the XML file but nothing else happened.

The odd thing is that for the last several days, when I changed nothing and clicked on Apply, that same XML file displayed and nothing happened.

After you posted, I decided to give it another try. To start completely fresh, I powered down, booted back up, and tried again; this time, clicking Apply started the install process.

I haven't set up the backup yet but at least my docker is in now so I'm feeling relieved.

I don't know if I have (or had) an issue with my server, but if the choice is between my machine and your docker, it's more likely my machine, or we would have heard of others having trouble too.

Anyway, thanks again and have a great weekend.

On 1/17/2019 at 8:01 PM, Djoss said:

Thanks for reporting, I'm working on this!

Just curious if this update is coming soon for the Docker container? I see you added the 2.7.0.28 deb file to the GitHub project, but the Dockerfile still references 2.6.0.31.

 

10 minutes ago, sjoerger said:

Just curious if this update is coming soon for the Docker container? I see you added the 2.7.0.28 deb file to the GitHub project, but the Dockerfile still references 2.6.0.31.

 

During testing of this version, I had an issue where the UI would constantly disconnect from the engine.  I reported the issue to CloudBerry and they told me (on January 25th) that it's a known issue and that they are working on a fix.

 

So I'm waiting for this new version before pushing a new Docker image...

18 minutes ago, Djoss said:

During testing of this version, I had an issue where the UI would constantly disconnect from the engine.  I reported the issue to CloudBerry and they told me (on January 25th) that it's a known issue and that they are working on a fix.

 

So I'm waiting for this new version before pushing a new Docker image...

Excellent, thanks for the quick update!

 


Hi,

 

Backup from the unRaid cache drive to a NAS on my wired network gave about 35MB/s transfer speed. Is that the transfer speed to expect? When backing up from/to the same shares with Acronis True Image running in a VM, it was about 100MB/s.

 

// Frode

3 hours ago, frodr said:

Hi,

 

Backup from the unRaid cache drive to a NAS on my wired network gave about 35MB/s transfer speed. Is that the transfer speed to expect? When backing up from/to the same shares with Acronis True Image running in a VM, it was about 100MB/s.

 

// Frode

I never tried backing up to a device on my local network, but I guess the speed can be impacted by multiple features: compression, encryption, etc.

50 minutes ago, Djoss said:

I never tried backing up to a device on my local network, but I guess the speed can be impacted by multiple features: compression, encryption, etc.

This is true. With no compression/encryption, just a straight local file backup, the max I see is about 85-95MB/s on large files (like a movie).

 

One way to help is to increase the thread count and chunk size. I run 20MB chunks.

3 hours ago, 1812 said:

This is true. With no compression/encryption, just a straight local file backup, the max I see is about 85-95MB/s on large files (like a movie).

 

One way to help is to increase the thread count and chunk size. I run 20MB chunks.

80-85MB/s is fine, but 35MB/s when another product runs at 3x that...


Hi DJoss, 

 

Great docker.  I'm running into a little difficulty.  I can back up to a NAS on my network (via SMB and Unassigned Devices), but restores fail when trying to restore to the source unRAID folder.  The error message is useless, something like... "error on process some file".

 

The restored files themselves are fine, since I can restore to the root folder of the target NAS.   Not sure what the issue is.

16 minutes ago, eds said:

Hi DJoss, 

 

Great docker.  I'm running into a little difficulty.  I can back up to a NAS on my network (via SMB and Unassigned Devices), but restores fail when trying to restore to the source unRAID folder.  The error message is useless, something like... "error on process some file".

 

The restored files themselves are fine, since I can restore to the root folder of the target NAS.   Not sure what the issue is.

The source files from unRAID (data under /storage in the container) are read-only by default.  This is to make sure the container can't do anything to your data.  If you want to restore, you can change the permission of the /storage folder by editing the container's configuration: switch to the Advanced View (at the top right), edit the "Storage" setting, and change the Access Mode to Read/Write.
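For what it's worth, Unraid's "Access Mode" maps to the standard `:ro`/`:rw` suffix on a Docker bind mount, so in plain `docker run` terms the change looks like this (host path and image name below are illustrative assumptions, not taken from the post):

```shell
# Unraid's Access Mode corresponds to the :ro / :rw suffix on a bind mount.
# /mnt/user and the image name are illustrative; /storage is the container
# path mentioned in the post.
#
# Read Only (the default, protects your data):
#   docker run -v /mnt/user:/storage:ro ... jlesage/cloudberry-backup
#
# Read/Write (required to restore files to their original location):
#   docker run -v /mnt/user:/storage:rw ... jlesage/cloudberry-backup
```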

1 hour ago, Djoss said:

The source files from unRAID (data under /storage in the container) are read-only by default.  This is to make sure the container can't do anything to your data.  If you want to restore, you can change the permission of the /storage folder by editing the container's configuration: switch to the Advanced View (at the top right), edit the "Storage" setting, and change the Access Mode to Read/Write.

Of course.  Why didn't I think of that?

 

Thanks!


I am starting to get a very consistent error with one of my backup jobs to Backblaze where the CloudBerry process is crashing with the error

"Plan process stopped with system error. Send logs to support"

 

In the logs is:

2019-07-18 09:32:53,762451 [INFO ]: [ CBB ] [ 7 ] End Listing : Status:  QProcess::ExitStatus(CrashExit) code 11

2019-07-18 09:32:53,784091 [INFO ]: [ CBB ] [ 7 ] Update destination statistic, id: {fe9d330d-0d5f-4971-bdc4-31fcf9a5da66}

 

CloudBerry support says  "Usually that error occurs when open GUI or use web UI during the backup run. That is why as workaround I can only suggest not to open the GUI when the backup plan is running and let us know if that helps. "

 

Is there any way to close or prevent the GUI from running to work around this?

On 7/23/2019 at 11:32 AM, yippy3000 said:

I am starting to get a very consistent error with one of my backup jobs to Backblaze where the CloudBerry process is crashing with the error

"Plan process stopped with system error. Send logs to support"

 

In the logs is:

2019-07-18 09:32:53,762451 [INFO ]: [ CBB ] [ 7 ] End Listing : Status:  QProcess::ExitStatus(CrashExit) code 11

2019-07-18 09:32:53,784091 [INFO ]: [ CBB ] [ 7 ] Update destination statistic, id: {fe9d330d-0d5f-4971-bdc4-31fcf9a5da66}

 

CloudBerry support says  "Usually that error occurs when open GUI or use web UI during the backup run. That is why as workaround I can only suggest not to open the GUI when the backup plan is running and let us know if that helps. "

 

Is there any way to close or prevent the GUI from running to work around this?

 

This is not something that can be done automatically, but you can try the following workaround:

  • Login to the container: docker exec -ti CloudBerryBackup sh
  • Edit /startapp.sh
  • Add the following line (without the double quotes) before the command that starts the UI:  "tail -f /dev/null"
  • Save the file and exit.
  • Reboot the container.
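The steps above can also be scripted from the unRaid host. The container name and script path come from the post; the assumption that the UI is launched by the last command in /startapp.sh is mine, so check the script before editing it blindly:

```shell
# Sketch of the workaround above, run from the unRaid host.
# Assumption (mine): the UI is started by the LAST command in /startapp.sh,
# so inserting "tail -f /dev/null" before the last line blocks the UI from
# ever starting while keeping the container alive.
#
#   docker exec CloudBerryBackup sh -c "sed -i '\$i tail -f /dev/null' /startapp.sh"
#   docker restart CloudBerryBackup
#
# The sed idiom, demonstrated on a stand-in copy of the script:
printf '#!/bin/sh\nexec start-ui\n' > /tmp/startapp.sh
sed -i '$i tail -f /dev/null' /tmp/startapp.sh   # insert before the last line
cat /tmp/startapp.sh
```

Undoing it is just removing that line again (or re-creating the container).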

 

 


Now that I've finally got a decent Internet connection, it's been working great. However, the 2 largest files (~40 GB each) are just not backing up to AWS. It sits there forever with no progress at 0.00 B/s / 0%.

 

Everything else has backed up fine; these last 2 files just won't go. Any ideas?

 

Thanks!


Just wanted to say thanks for this docker.

 

I've been having issues with backups not working. After speaking with CloudBerry, they say these issues were fixed in 2.9.3.  I have tried to update the software to no avail; do we know when the switch from 2.8 to 2.9 will be done?

 

During the Local Daemon log investigation we have encountered the following records:

 

2019-08-13 15:11:54,893687 [INFO ]: [ CBB ] [ 7 ] QUuid({3181eadc-fde2-485b-993a-a587893c8213}) LAWSNAS : Listing Process:

2019-08-13 15:11:54,894260 [INFO ]: [ CBB ] [ 7 ] 

2019-08-13 15:11:54,894363 [INFO ]: [ CBB ] [ 7 ] End Listing : Status:  QProcess::ExitStatus(CrashExit) code 11

 

Thank you.

Edited by LawsReporting

On 8/15/2019 at 4:04 PM, LawsReporting said:

Just wanted to say thanks for this docker.

 

I've been having issues with backups not working. After speaking with CloudBerry, they say these issues were fixed in 2.9.3.  I have tried to update the software to no avail; do we know when the switch from 2.8 to 2.9 will be done?

 

During the Local Daemon log investigation we have encountered the following records:

 

2019-08-13 15:11:54,893687 [INFO ]: [ CBB ] [ 7 ] QUuid({3181eadc-fde2-485b-993a-a587893c8213}) LAWSNAS : Listing Process:

2019-08-13 15:11:54,894260 [INFO ]: [ CBB ] [ 7 ] 

2019-08-13 15:11:54,894363 [INFO ]: [ CBB ] [ 7 ] End Listing : Status:  QProcess::ExitStatus(CrashExit) code 11

 

Thank you.

You probably already saw it, but the docker image was updated a few days ago.

5 hours ago, Djoss said:

You probably already saw it, but the docker image was updated a few days ago.

I didn't see this until this morning, but after checking today the update did fix the issues I was having.

 

Thank you.


I am looking for online backup and am currently testing a personal license for CloudBerry Backup using this docker with Backblaze B2.  So far it has worked great.

 

I have a couple questions on settings I do not fully understand.

 

1.  What is a good retention policy if I just want to keep a backup online and am not too concerned with versions?  I am running the default options but wondered if there was a better option.

 

2.  I am confused about the scheduling on the Backup Plan.  It initially asks to set a schedule, and I selected "Daily" at "1:00 AM".  After I hit continue, it then asks to schedule a Full Backup (I previously selected block-level backup), and I am confused about whether I need both of these or if they make a difference.

 

3.  I have encryption enabled but when I log into B2 online I am able to see all of the files.  However, after downloading some I am unable to open any of them.  Are the file names not encrypted?  And if I can't open the downloaded file does that mean the file is encrypted (and therefore the encryption is working)?

 

4.  I was going to try Duplicati but this seems to be working great and is well worth the $30 once the trial is up.  Any info worth knowing about either using a paid plan or any reasons on giving Duplicati a try?

Edited by ur6969

23 hours ago, ur6969 said:

1.  What is a good retention policy if I just want to keep a backup online and am not too concerned with versions?  I am running the default options but wondered if there was a better option.

In this case I guess you can keep 1 version.  You can then configure how quickly you want the old versions/deleted files to be permanently removed from the backup.

23 hours ago, ur6969 said:

2.  I am confused about the scheduling on the Backup Plan.  It initially asks to set a schedule, and I selected "Daily" at "1:00 AM".  After I hit continue, it then asks to schedule a Full Backup (I previously selected block-level backup), and I am confused about whether I need both of these or if they make a difference.

You can get more details here: https://www.cloudberrylab.com/resources/blog/scheduling-full-backup/

On 9/24/2019 at 9:12 PM, ur6969 said:

3.  I have encryption enabled but when I log into B2 online I am able to see all of the files.  However, after downloading some I am unable to open any of them.  Are the file names not encrypted?  And if I can't open the downloaded file does that mean the file is encrypted (and therefore the encryption is working)?

Correct.  The filenames themselves are not encrypted, but the content is.

On 9/24/2019 at 9:12 PM, ur6969 said:

4.  I was going to try Duplicati but this seems to be working great and is well worth the $30 once the trial is up.  Any info worth knowing about either using a paid plan or any reasons on giving Duplicati a try?

I haven't personally tried Duplicati, so I can't comment on this...


Hello. Coming over from CrashPlan to test this out.

This is a GUI question. Since I'll have to pay per GB, is there a way to see how many GB are selected in the backup before it actually starts backing up? I'm trying to gauge how much it will cost me.

 

I did a test backup and it only told me the size after I started the backup...

 

Thanks,

 

Jim

On 10/23/2019 at 11:26 AM, jbuszkie said:

Hello. Coming over from CrashPlan to test this out.

This is a GUI question. Since I'll have to pay per GB, is there a way to see how many GB are selected in the backup before it actually starts backing up? I'm trying to gauge how much it will cost me.

 

I did a test backup and it only told me the size after I started the backup...

 

Thanks,

 

Jim

I'm not aware of such a feature.

But you can do it yourself with the command "du -sh <path to directory>".

2 hours ago, Djoss said:

I'm not aware of such a feature.

But you can do it yourself with the command "du -sh <path to directory>".

Yeah... the CloudBerry guys said it wasn't available either.

What I need is a way to total arbitrary directories. I can do it the hard way, but I was hoping for something easier...
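One easier option: `du` itself can total an arbitrary list of directories in a single call; the `-c` flag appends a grand-total line, so there's no adding up by hand. The directory names below are just placeholders:

```shell
# -s: one summary line per argument, -c: append a grand total, -h: human-readable.
# The directories here are placeholders for whatever you plan to back up.
mkdir -p /tmp/size_demo/movies /tmp/size_demo/photos
head -c 1048576 /dev/zero > /tmp/size_demo/movies/clip.bin   # 1 MiB test file
du -shc /tmp/size_demo/movies /tmp/size_demo/photos
# the last line printed is the combined total across all listed directories
```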


No longer supported?

 

Cloudberry now shows EOL on their website and I can no longer activate the application.

I can't get restores to work either, even though it can see my S3 buckets, etc. It may be that I am just doing this wrong, but I think they may have crippled the software?

 

 

23 hours ago, PigNib said:

No longer supported?

 

Cloudberry now shows EOL on their website and I can no longer activate the application.

I can't get restores to work either, even though it can see my S3 buckets, etc. It may be that I am just doing this wrong, but I think they may have crippled the software?

 

 

Where do you see the EOL?

 

As for the restore, are you trying to restore files to their original location?  If yes, you first need to go into the container's settings and change the permission for the Storage folder to read/write.  By default, it's read-only.

