bertrandr Posted September 21, 2017

On 9/20/2017 at 2:52 PM, isvein said:
I'm looking at B2 for cold storage too, not as crazy pricing as Amazon S3/Glacier. The update crashed the docker, so I had to reinstall the CloudBerry docker, but it works fine now. Edit: the backup stops after some time; is that just a bug for me or does it happen for others too?

Yeah, my backups to B2 seem to randomly stop as well. I was hoping this recent update would fix it, but my backup stalled again this morning. It seems to happen with very large files over 1 GB (home video archives). I have since been tweaking the options under the advanced settings. I read somewhere that Backblaze B2 stores data in 100 MB chunks or blocks, and the recommendation was to try to match the incoming "chunk size" from CloudBerry to that of Backblaze. But you also need to make sure that CloudBerry has enough RAM assigned to support the number of worker threads and the chunk size. The formula to calculate the minimum RAM allocation is <threadcount * 2 * chunksize>. Personally I am using 3 threads, a 100 MB chunk size, and 700 MB of RAM allocated.

Read this: https://www.cloudberrylab.com/blog/slow-backup-how-to-speed-up/

So far so good: I have been running a very large backup (87 GB) with many large files for over 3 hours without stalling. I'll report back tomorrow...

Cheers, BR
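Plugging the numbers from that post into the formula gives a quick sanity check (the values below are just the ones mentioned above; substitute your own):

```shell
# Minimum RAM (MB) = thread count * 2 * chunk size (MB)
threads=3
chunk_mb=100
min_ram_mb=$((threads * 2 * chunk_mb))
echo "Minimum RAM: ${min_ram_mb} MB"
```

That works out to 600 MB, so the 700 MB allocated in the post leaves a bit of headroom.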
isvein Posted September 22, 2017

I did not even know these settings were there. I changed it up to 100 MB chunks and 2 GB of RAM; let's see how that goes.
bertrandr Posted September 22, 2017

Bad news - changing the advanced settings makes little/no difference. Backups still "stop" randomly. I've opened another ticket with CloudBerry and they quickly responded:

"Hello Bertrandr, Thank you for reporting. That is a known issue. We are working on the fix and will let you know once it's ready. We apologize for the inconvenience caused."

I'm still on the trial version (which I like), but until this issue is fixed I'm not buying...

BR

Edited September 22, 2017 by bertrandr
isvein Posted September 22, 2017

1 hour ago, bertrandr said:
Bad news - changing the advanced settings makes little/no difference. Backups still "stop" randomly. [...] until this issue is fixed I'm not buying...

Thanks, then it is at least not a setting we have wrong. I'm on the trial too. Looks like this is a problem on the Linux version; the Windows version does not have this issue. Can't speak for the Mac version.
1812 Posted September 22, 2017

I don't think this is a "real" issue. I changed the port mappings for one of my 2 CloudBerry dockers so I could run them at the same time, and about a week or so later, Fix Common Problems complained about this:

Docker Application Backup_Remote, Container Port 43210 not found or changed on installed application. When changing ports on a docker container, you should only ever modify the HOST port, as the application in question will expect the container port to remain the same as what the template author dictated. Fix this here: Application Support Thread

Docker Application Backup_Remote, Container Port 43211 not found or changed on installed application. When changing ports on a docker container, you should only ever modify the HOST port, as the application in question will expect the container port to remain the same as what the template author dictated. Fix this here: Application Support Thread

But the ports I changed were for webport and vnc:

backup_remote 7803 7903
backup_local 7802 7902

so....????
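For anyone puzzled by the warning's wording, the host/container distinction looks like this in plain `docker run` form (image name and container ports below are illustrative, not the exact values from the unRAID template):

```shell
# The HOST port (left of ':') is yours to change freely; the CONTAINER
# port (right of ':') must stay whatever the template author set,
# because the application inside the container listens on that fixed port.
docker run -d --name backup_remote \
  -p 7803:5800 \
  -p 7903:5900 \
  some/cloudberry-backup-image
```

So running two instances side by side means giving each a different host port while leaving the container ports untouched.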
bertrandr Posted September 23, 2017

SMTP settings? Maybe I'm just blind, but where do we put the SMTP server for email notifications?

BR
Djoss (Author) Posted September 23, 2017

21 hours ago, 1812 said:
I don't think this is a "real" issue. I changed the port mappings for one of my 2 CloudBerry dockers so I could run them at the same time, and about a week or so later, Fix Common Problems complained about this: [...] but the ports I changed were for webport and vnc.

What are your port mappings, as displayed under the Docker page?
Djoss (Author) Posted September 23, 2017

2 hours ago, bertrandr said:
SMTP settings? Maybe I'm just blind, but where do we put the SMTP server for email notifications?

I think email notifications are sent by/from CloudBerry...
1812 Posted September 23, 2017

53 minutes ago, Djoss said:
What are your port mappings, as displayed under the Docker page?

The top 2 are both CloudBerry instances. The bottom 2 are Krusader instances.
Djoss (Author) Posted September 24, 2017

2 hours ago, 1812 said:
The top 2 are both CloudBerry instances. The bottom 2 are Krusader instances.

Did you remove the mappings to ports 43210 and 43211?
1812 Posted September 24, 2017

23 hours ago, Djoss said:
Did you remove the mappings to ports 43210 and 43211?

Nope, never had anything assigned there, which is why I'm confused.
Djoss (Author) Posted September 25, 2017

They are new ports added since the latest version. Normally, the template should automatically update itself and add them...
Djoss (Author) Posted October 4, 2017

CloudBerry is currently offering a 15% discount! http://go.cloudberrylab.com/crashplan-home/?source=Internal+newsletter
bertrandr Posted October 10, 2017

I noticed there is a new version - any specific updates?

FYI - I have started to use the web GUI - much easier than the VNC session.

BR
Djoss (Author) Posted October 10, 2017

It depends on which version you had... but using the Community Applications plugin, you can look at the whole change log: under the "Apps" tab, display your installed apps and click on the icon of CloudBerry Backup. You can also use the search box instead.
ritalin Posted November 22, 2017

Hey Djoss. I'm running into an issue with this docker and have a question about its usage.

First the problem. If I attempt to stop a backup that is in progress from inside the CloudBerry UI, the docker hangs. All attempts to stop or restart the docker from the unRAID UI fail. Restarting the unRAID server during this hang ends up hanging the unRAID UI. Not really sure what is going on with that. Oddly enough, if I delete a running backup instance inside the CloudBerry UI, it does not cause a hang.

As to the usage issues I am having, I just wanted to ask if it's possible to simply back up files alone. I am coming from an Rclone backup instance that went crazy and started creating duplicates all over the place. I was hoping to set up something similar to Rclone Sync in CloudBerry. I have the docker set to have access to /mnt/. I need to do this because I am backing up to Google Drive, which is mounted to my system under /mnt/disks/Google.

So the first problem is that when the backup runs, it puts everything on the Google drive under the following directory: My Drive/ServerMedia/CBB_Servername/Storage/user/media/... My Rclone backup was saving everything to My Drive/ServerMedia/ Is there no way to remove the CBB_Servername subfolder? Is there no way to avoid it backing up the entire path from /storage to /media? This is not a big issue; I can make it work by simply moving all of the existing files to this new subdirectory. It just seems unnecessary to have the entire path created.

The bigger issue is that for every file it backs up, it creates two additional subfolders. One is a folder with the filename, and the second under it is a date. Let's say the files I'm backing up are located in .../ABC/test.srt When it backs up, the actual file is then located in .../ABC/test.srt/11222017/test.srt Is there no way around this? Can I not simply back up the files in their directory as they are?

Edited November 22, 2017 by ritalin
Djoss (Author) Posted November 23, 2017

5 hours ago, ritalin said:
If I attempt to stop a backup that is in progress from inside the CloudBerry UI, the docker hangs.

How do you know that it hangs? The GUI becomes unresponsive?

5 hours ago, ritalin said:
All attempts to stop or restart the docker from the unRAID UI fail.

How is it failing exactly? Are you getting an error? The container is not stopped?

5 hours ago, ritalin said:
As to the usage issues I am having, I just wanted to ask if it's possible to simply back up files alone. [...] Can I not simply back up the files in their directory as they are?

Since it's backup software and not a synchronisation/cloning tool, I'm afraid it's not possible to only copy files without any metadata... But to be sure, I encourage you to contact CloudBerry Lab's support and ask them. They are usually responsive.
ritalin Posted November 23, 2017

3 hours ago, Djoss said:
How do you know that it hangs? The GUI becomes unresponsive? How is it failing exactly? Are you getting an error? The container is not stopped? [...]

Yes, the GUI becomes completely unresponsive. No error listed in the UI. After the GUI becomes unresponsive, if I try to restart or stop the container, it just prompts me that it failed to do so. No error other than that. I'll grab the logs next time I attempt to stop it and post those for you.

I might try the backup on the Windows client, as I've read there are some additional options that are not available in the Linux client yet. I really wanted to avoid pulling data from one system to another on the network, only to push it out to the internet from there. But if it works, then maybe I can look into just running a light Windows VM until such time as the Linux client reaches parity with the Windows client.
ffhelllskjdje Posted December 13, 2017

I can't seem to back up a certain share; it always fails. Anything I can check to make sure it gets backed up? This is a backup to my local NAS.

Edited December 13, 2017 by ffhelllskjdje
Djoss (Author) Posted December 13, 2017

What is the error you get? You can see failures under "History" on the left.
disruptorx Posted January 11, 2018

On 02/09/2017 at 12:13 AM, Djoss said:
The problem seems to be with the way CloudBerry Backup handles mounts inside the container. I've opened a case with their support team. In the meantime, you can just ignore the extra "/" locations.

@Djoss were you able to find the issue with it? I still can't see the exact shares within CloudBerry.
Djoss (Author) Posted January 11, 2018

51 minutes ago, disruptorx said:
@Djoss were you able to find the issue with it? I still can't see the exact shares within CloudBerry.

The CloudBerry support team told me that they passed the information to the development team. But since it's a cosmetic/minor issue, I guess it gets low priority and I don't know when they will fix it.
disruptorx Posted January 12, 2018

21 hours ago, Djoss said:
The CloudBerry support team told me that they passed the information to the development team. But since it's a cosmetic/minor issue, I guess it gets low priority and I don't know when they will fix it.

Thanks for getting back on this. How is this a minor issue when a backup program can't see the host's shares/source? I don't know how others did it, but I can't get anything to sync with Amazon S3 since it doesn't show /mnt/user/*anyofmyshares*. Even Duplicati does the same for me.
Djoss (Author) Posted January 12, 2018

3 minutes ago, disruptorx said:
I don't know how others did it, but I can't get anything to sync with Amazon S3 since it doesn't show /mnt/user/*anyofmyshares*

With the default settings, your files can be found under /storage inside the container, not under /mnt/user.
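In other words, the template bind-mounts the host share into the container, so anything under /mnt/user on the host shows up under /storage inside CloudBerry. A quick way to translate a host path into the path you should pick in CloudBerry's source tree (assuming that default /mnt/user -> /storage mapping; the media path is just an example):

```shell
# Rewrite a host path to the path CloudBerry sees inside the container,
# given the template's default mapping of /mnt/user -> /storage.
host_path="/mnt/user/media/video"
container_path="/storage${host_path#/mnt/user}"
echo "$container_path"
```

Here the output would be /storage/media/video, which is the folder to select as the backup source inside the container.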
disruptorx Posted January 12, 2018

Dammit... my bad... you're a genius!