[CONTAINER] CrashPlan & CrashPlan-Desktop



Throw up a small VM! I made one on Sunday and it has been working like a charm. No need for manual updates or anything. I just see the same issues come up over and over in this thread. Pretty soon unRAID will make their own backup cloud service connected to their OS. Would be nice, right?

 

Link to comment

I set up CrashPlan yesterday to back up my Mac's user folder to my unRAID server. I configured it to run only once a day so the array drives wouldn't be kept spun up. Unfortunately, every time I wake my Mac from sleep, CrashPlan spins up my backup and parity disks and edits some files in the backup folder. How can I prevent this? I don't want the drives to spin up every time a client connects to the server, only when an actual backup is made. Is anyone else seeing this behavior?

 

EDIT:

For anyone else who experiences this issue, I've got a workaround.

I set my backup share to cache-only and started a new backup so CrashPlan created its backup folder on the cache disk. Then I set the share to not use the cache and copied the one big file, which is obviously the actual backup data, to a corresponding folder on my array. All the small files that get accessed and written during a client connection stay on the cache disk, so my array does not spin up unless an actual backup is made.
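In case it's useful to anyone scripting this, the split boils down to something like the following (a sketch only; the share name, folder and file name are placeholders, not the real CrashPlan archive layout):

#!/bin/bash
# Move the one large backup data file from the cache disk to the array,
# leaving the small, frequently-touched files behind on the cache.
SRC=/mnt/cache/Backups/my-mac        # placeholder share/folder on the cache
DST=/mnt/disk1/Backups/my-mac        # corresponding folder on the array
BIGFILE=backup-data.archive          # whatever the large file is called on your system
mkdir -p "$DST"
cp "$SRC/$BIGFILE" "$DST/" && rm "$SRC/$BIGFILE"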

 

This is a great tip!  It reminds me of the techniques Jon used here:

  https://lime-technology.com/forum/index.php?topic=40777.msg385753

 

I was previously backing my desktop computer up to a "CrashplanTower" share that was set to use disk1 only.  This meant the Crashplan docker kept disk1 and the parity drive spun up the entire time my desktop computer was on.

 

So I changed the CrashplanTower share to be "cache only", and then moved all the standalone files and the most recent subfolder from disk1 to the cache drive. I left all of the older folders on disk1.

 

Now my array stays spun down even when my desktop computer is running a backup!

 

There are two downsides I can think of:

1) If I ever lose my cache drive, I'll probably lose these backups too.  But since my desktop is also backing up to the Crashplan cloud, that isn't really a big deal.

2) It looks like Crashplan will create a new directory on the cache drive every few weeks, so I'll need to remember to periodically move older directories to the array to keep my cache drive from filling up.
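If you'd rather not remember to do that by hand, a small script along these lines could be run from cron to sweep older directories off the cache (a sketch only; the share name and the 30-day cutoff are assumptions, so test it with echo first):

#!/bin/bash
# Move CrashplanTower directories untouched for 30+ days from cache to disk1.
SRC=/mnt/cache/CrashplanTower
DST=/mnt/disk1/CrashplanTower
mkdir -p "$DST"
find "$SRC" -mindepth 1 -maxdepth 1 -type d -mtime +30 | while read -r dir; do
    rsync -a --remove-source-files "$dir/" "$DST/$(basename "$dir")/"
    find "$dir" -type d -empty -delete    # remove the now-empty source tree
done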

 

Link to comment

 

I decided to ditch this great docker (and the second MATE one) and just set up a small Ubuntu VM with CrashPlan installed. I mounted all my user shares within the VM, and now I don't have to edit my Windows client files every time there is an update or the docker needs updating; that happens by itself on the VM now. No more waiting for manual updates from the maintainer. I just read in another thread that he's working on other projects and is a busy guy, so I'd imagine this is at the bottom of his list. The VM is working great so far and is backing up as I write this.
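(Side note for anyone wanting to replicate that setup: one simple way to get the unRAID user shares visible inside an Ubuntu guest is an NFS mount, assuming NFS export is enabled for the share on the server; "tower" and "Media" below are just examples, and the original poster may well have used SMB instead.)

# Inside the Ubuntu VM:
sudo apt-get install -y nfs-common
sudo mkdir -p /mnt/Media
sudo mount -t nfs tower:/mnt/user/Media /mnt/Media   # add to /etc/fstab to make it permanent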

 

In what way would auto-update help? Code42 has made some undocumented changes to how the headless server works. In this case it wasn't just about updating CP, but about releasing a 4.3 version of both CP and CP-Desktop.

 

Another example is NZBget; the author made some serious changes to how the program is installed, and it took some time to figure them out and modify the code accordingly.

 

With the exception of a few guys with the goodwill to submit changes to the GitHub repo or send debug data, most users just take things and never give anything in return. Sacretagent sent me some PMs, and a guy named Datapotomus submitted a PR (it wasn't just a version update, so I didn't merge it, but thanks!). Thanks a lot, guys; this kind of thing helps a lot!

 

For you, opentoe, good riddance with your VM solution; I'm sure it works just fine. Thanks a lot for your input.

 

Link to comment

Okay, I am a bit of a dumbass here. I solved the issue I was having, which should have been obvious to me.

 

Simply copy the .ui_info from your CrashPlan docker to the empty "id" directory of the CrashPlan-Desktop docker.
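That's a one-line copy from the unRAID console; the appdata paths below are typical defaults and may differ on your system:

cp /mnt/cache/appdata/crashplan/id/.ui_info /mnt/cache/appdata/crashplan-desktop/id/.ui_info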

 

Kryspy

 

Duh!! I had thought of that as well, but for some reason I could not see the id folder in the CrashPlan-Desktop docker, and I was trying to figure out what the MATE docker's equivalent of copying .ui_info to C:\ProgramData\CrashPlan on Windows would be. I could see the id folder in MC in a PuTTY session, copied the file over fine, and the MATE CrashPlan docker now works!!!  :D

 

After the latest CrashPlan-Desktop update I can now see the id folder through other methods as well, so it must have been a permissions issue. I still cannot copy to it via Windows, but MC did the trick.

 

Thanks for getting me pointed in the right direction. It has to be a pain for gfjardim to keep up with Code 42's undocumented tricks. They don't officially support headless installations, but they ought at least to have the decency not to break them (or to document the changes) for those who are trying to support them.

Link to comment

Got this to work.

 

On the CrashPlan desktop you also need to have your container volume set as shown below:

 

Container volume: /home/ubuntu/disks
Host path: /mnt/disks/

 

 

 

 

Extra Parameters

--volumes-from CrashPlan
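For anyone building the container by hand instead of through the unRAID template, the equivalent is roughly this (the image name and RDP port are assumptions based on this thread; the important part is the --volumes-from flag):

docker run -d --name=CrashPlan-Desktop \
  -p 3389:3389 \
  -v /mnt/disks/:/home/ubuntu/disks \
  --volumes-from CrashPlan \
  gfjardim/crashplan-desktop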

 

I don't think there is a one-size-fits-all configuration for the CrashPlan server or desktop. I guess that is what makes troubleshooting a bit different for everyone. Some combination of the solutions mentioned in this thread will likely help most people, but there does not seem to be one config to rule them all.

 

My config is working fine without the disks mapping but needed the .ui_info fix:

[Screenshot: CP_Desktop_Config.png]

Link to comment

 

I decided to ditch this great docker (and the second MATE one) and just set up a small Ubuntu VM with CrashPlan installed. I mounted all my user shares within the VM, and now I don't have to edit my Windows client files every time there is an update or the docker needs updating; that happens by itself on the VM now. No more waiting for manual updates from the maintainer. I just read in another thread that he's working on other projects and is a busy guy, so I'd imagine this is at the bottom of his list. The VM is working great so far and is backing up as I write this.

 

In what way would auto-update help? Code42 has made some undocumented changes to how the headless server works. In this case it wasn't just about updating CP, but about releasing a 4.3 version of both CP and CP-Desktop.

 

Another example is NZBget; the author made some serious changes to how the program is installed, and it took some time to figure them out and modify the code accordingly.

 

With the exception of a few guys with the goodwill to submit changes to the GitHub repo or send debug data, most users just take things and never give anything in return. Sacretagent sent me some PMs, and a guy named Datapotomus submitted a PR (it wasn't just a version update, so I didn't merge it, but thanks!). Thanks a lot, guys; this kind of thing helps a lot!

 

For you, opentoe, good riddance with your VM solution; I'm sure it works just fine. Thanks a lot for your input.

 

No need to get offended. I'm going to use what works best for me, and this (IN MY OPINION) is a good alternative. If you want any more input, just let me know. Happy to share.

 

 

Link to comment

 

I don't think there is a one-size-fits-all configuration for the CrashPlan server or desktop. I guess that is what makes troubleshooting a bit different for everyone. Some combination of the solutions mentioned in this thread will likely help most people, but there does not seem to be one config to rule them all.

 

My config is working fine without the disks mapping but needed the .ui_info fix:

 

Remove the volume mapping for /config -> /mnt/cache/appdata/crashplan-desktop. That conflicts with --volumes-from CrashPlan.

Link to comment

I wonder if everyone who is still having problems is making the mistake of mapping a /config folder for Crashplan Desktop.  That Docker should have no volume mappings of its own.  Everything it needs it gets from the "--volumes-from CrashPlan" param. 

Link to comment

I wonder if everyone who is still having problems is making the mistake of mapping a /config folder for Crashplan Desktop.  That Docker should have no volume mappings of its own.  Everything it needs it gets from the "--volumes-from CrashPlan" param.

 

Just habit, as all my dockers have /config mappings; however, CrashPlan-Desktop was working fine for me even with that mapping once Kryspy pointed out how to copy .ui_info over. It seems to work fine with or without the /config mapping, which I was unaware was not needed. I definitely did not need the /home/ubuntu/disks mapping either, as a backup is running right now with seemingly no problems.

Link to comment

Question about connecting the client... I've read back quite a few pages and also followed the reddit instructions. Nothing seems to want to work. Apologies if the information is there and I've missed it...

 

My Windows client is 4.3.0, and I am not using an SSH tunnel (just trying to connect a client directly).

 

 

I have the CrashPlan docker installed and working.

I have found the .ui_info file (contents == 4243,unRAID)

I have copied this to C:\Program Files\CrashPlan

I have edited C:\Program Files\CrashPlan\conf\ui.properties to:

  serviceHost=192.168.1.201

  servicePort=4243

The docker instance config my.service.xml:

  <serviceUIConfig>

    <serviceHost>0.0.0.0</serviceHost>

    <servicePort>4243</servicePort>

 

 

When I launch the GUI, it just sits at the splash screen and never gets past that. If I take out the serviceHost / Port config then it goes back to what it should be and connects locally. I can see the backup set and the four other computers backing up to my account (via restore to tab).

 

I've also installed CrashPlan-Desktop. I can RDP to the MATE session and it behaves in the same way, the splash screen is just sitting there and never connects (got bored after 10 mins). I've copied .ui_info into /usr/local/crashplan (and /usr/local/crashplan/conf).

 

I'm missing something, but cannot see what it is. Any clues?

 

Thanks in advance!

 

 

 

 

Link to comment

I've also installed CrashPlan-Desktop. I can RDP to the MATE session and it behaves in the same way, the splash screen is just sitting there and never connects (got bored after 10 mins). I've copied .ui_info into /usr/local/crashplan (and /usr/local/crashplan/conf).

 

Not sure about your Windows client, but for CrashPlan-Desktop you mentioned having to copy files... that means you have likely mapped a /config folder for the CrashPlan-Desktop docker. You should delete all volume mappings for this docker, as described a few posts back. The "--volumes-from CrashPlan" param takes care of everything for you.

Link to comment

Question about connecting the client... I've read back quite a few pages and also followed the reddit instructions. Nothing seems to want to work. Apologies if the information is there and I've missed it...

 

My Windows client is 4.3.0, and I am not using an SSH tunnel (just trying to connect a client directly).

 

 

I have the CrashPlan docker installed and working.

I have found the .ui_info file (contents == 4243,unRAID)

I have copied this to C:\Program Files\CrashPlan

I have edited C:\Program Files\CrashPlan\conf\ui.properties to:

  serviceHost=192.168.1.201

  servicePort=4243

The docker instance config my.service.xml:

  <serviceUIConfig>

    <serviceHost>0.0.0.0</serviceHost>

    <servicePort>4243</servicePort>

 

 

When I launch the GUI, it just sits at the splash screen and never gets past that. If I take out the serviceHost / Port config then it goes back to what it should be and connects locally. I can see the backup set and the four other computers backing up to my account (via restore to tab).

 

I've also installed CrashPlan-Desktop. I can RDP to the MATE session and it behaves in the same way, the splash screen is just sitting there and never connects (got bored after 10 mins). I've copied .ui_info into /usr/local/crashplan (and /usr/local/crashplan/conf).

 

I'm missing something, but cannot see what it is. Any clues?

 

Thanks in advance!

 

I don't know if my experience will be much help to you but, in summary, here is what I had to do to get both the Windows 4.3 client and the MATE desktop connected to the CP 4.3 engine on my unRAID server:

 

Windows Client

- I have never been able to get it to connect without the PuTTY/SSH tunnel. If I uncomment the serviceHost=[unRAID IP] line in ui.properties and set servicePort to 4243, the client just hangs.

- Because of the SSH tunnel, servicePort=4200 in my ui.properties.

- The .ui_info file must be copied to C:\ProgramData\CrashPlan, not C:\Program Files\CrashPlan as you said you had done. I initially made the same mistake.
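For reference, the tunnel itself is just a local port forward from 4200 to the engine's port 4243; from a command line it looks like the sketch below (the IP and user are examples, and plink.exe accepts the same -N/-L arguments if you prefer that to the PuTTY GUI):

ssh -N -L 4200:localhost:4243 root@192.168.1.201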

 

MATE Desktop

- copied .ui_info from CrashPlan docker "id" folder in my appdata share to CrashPlan-Desktop docker "id" folder (it will be empty)

- the .ui_info with the "4243,unRAID" content would not allow the client to connect. I had to copy over a .ui_info file I had kept (renamed) from a prior docker instance, which contained the proper key hash and looks something like "4243,c03bafd8-3f83-40ae-8856-836d6b9d5edf"

- This file copy may not be necessary unless you have a /config volume mapping, supposedly the --volumes-from CrashPlan setting in Extra Parameters takes care of this for you.

 

 

In both cases, client connects to engine with the following my.service.xml settings (serviceHost does not need to be 127.0.0.1 for my Windows client to connect in my case):

  <serviceUIConfig>

    <serviceHost>0.0.0.0</serviceHost>

    <servicePort>4243</servicePort>

 

I have recently added files to shares set to back up to CrashPlan Central and monitored backup progress through both the Windows client and the MATE desktop. So, until Code 42 messes with something again, it appears to be working, although, as I said, only through PuTTY/SSH on Windows in my case.

Link to comment

Thank you both so much.

 

I don't know if my experience will be much help to you but, in summary, here is what I had to do to get both the Windows 4.3 client and the MATE desktop connected to the CP 4.3 engine on my unRAID server:

 

Windows Client

- I have never been able to get it to connect without the PuTTY/SSH tunnel. If I uncomment the serviceHost=[unRAID IP] line in ui.properties and set servicePort to 4243, the client just hangs.

- Because of the SSH tunnel, servicePort=4200 in my ui.properties.

- The .ui_info file must be copied to C:\ProgramData\CrashPlan, not C:\Program Files\CrashPlan as you said you had done. I initially made the same mistake.

 

Thanks. Found the original .ui_info file in there (why it was missing was confusing).

I've retained the original and put this in the Windows file: 4243,unRAID

All works beautifully - thank you! No need for SSH.

 

I found the log files on the server and they contained the following, indicating the key mismatch:

[07.17.15 16:46:16.980 INFO    MQ-UI-1              com.backup42.service.ui.UIController    ] Received status request message with invalid token.
[07.17.15 16:46:16.981 INFO    Factory$Notifier-UI0 com.backup42.service.ui.UIController    ] UISession Ended after less than a minute - 698537468441025962

 

I guess a question pops up here: is there a way to reset the key in the docker to be the same as the Windows installation?? Would be nice to share a key across instances in this case!

 

 

MATE Desktop

- copied .ui_info from CrashPlan docker "id" folder in my appdata share to CrashPlan-Desktop docker "id" folder (it will be empty)

- the .ui_info with the "4243,unRAID" content would not allow the client to connect. I had to copy over a .ui_info file I had kept (renamed) from a prior docker instance, which contained the proper key hash and looks something like "4243,c03bafd8-3f83-40ae-8856-836d6b9d5edf"

- This file copy may not be necessary unless you have a /config volume mapping, supposedly the --volumes-from CrashPlan setting in Extra Parameters takes care of this for you.

 

 

In both cases, client connects to engine with the following my.service.xml settings (serviceHost does not need to be 127.0.0.1 for my Windows client to connect in my case):

  <serviceUIConfig>

    <serviceHost>0.0.0.0</serviceHost>

    <servicePort>4243</servicePort>

 

I have recently added files to shares set to back up to CrashPlan Central and monitored backup progress through both the Windows client and the MATE desktop. So, until Code 42 messes with something again, it appears to be working, although, as I said, only through PuTTY/SSH on Windows in my case.

Where do I find the proper key for the Docker instance? The data in my Windows client is now 4243,unRAID, which I assume is the key that the instance is expecting.

 

I have no folders mapped in the MATE desktop - I had found that post and double checked. All that is there is the default RDP port.

 

Link to comment

I have no folders mapped in the MATE desktop - I had found that post and double checked. All that is there is the default RDP port.

Sorry if you've already confirmed this, but do you have "--volumes-from CrashPlan" in the Extra Parameters area of the MATE desktop?  If so, I'm afraid I'm out of ideas.

Link to comment

No!!

 

I'd missed that sorry (didn't have advanced view on).

 

And suddenly it all springs to life.

 

Thank you so much.

 

I've since created batch files on my Windows desktop to toggle between configurations, but having the app served over RDP is much cleaner I think!

 

A few hundred GB (more) backing up now. I'll stop short of the full array; that would take years.

Link to comment

I fixed my problem: I had to edit my.service.xml on my unRAID box so that the <serviceHost> field reads 0.0.0.0. Hope this helps someone else.
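If you want to make that change from the unRAID console rather than a text editor, something like the following works (the appdata path and the existing 127.0.0.1 value are assumptions; back the file up and restart the container afterwards):

CONF=/mnt/cache/appdata/crashplan/conf/my.service.xml
cp "$CONF" "$CONF.bak"
sed -i 's#<serviceHost>127.0.0.1</serviceHost>#<serviceHost>0.0.0.0</serviceHost>#' "$CONF"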

 

Thanks, tron. This fixed the issue for me when the GUI said it was "unable to connect to backup engine".

Link to comment

I don't have any problems connecting to the backup engine or anything after reading through the past few pages of posts... however, I can no longer connect to any Friends outside my home network, and they can no longer back up to me. Did some networking changes happen with the 4.3.0 update? Does anyone know how to fix this?

 

EDIT: Weird, I had to forward port 4242 now to get it to work.

Link to comment

I just upgraded to unRAID 6.0.1 this evening. The main application I need to get running again is CrashPlan. I'm new to Docker here as well. I followed the directions from the first post on this thread:

 

docker run -d -h laffy --name=crashplan -v /mnt/user/crashplan:/config -v /mnt/user:/data -v /etc/localtime:/etc/localtime:ro -p 4242:4242 -p 4243:4243 gfjardim/crashplan

 

Watched it install the components successfully, but then got the following error:

 

root@laffy:~# docker logs crashplan

*** Running /etc/my_init.d/config.sh...

mv: cannot move '/etc/localtime.dpkg-new' to '/etc/localtime': Device or resource busy

*** Running /etc/rc.local...

*** Booting runit daemon...

*** Runit started as PID 57

 

So I guess it's running, but I'm not sure. My Mac client is unable to connect to it. When I fire it up after modifying ~/Library/Application Support/Crashplan/ui.properties (as I've done many times over the years), the client keeps going back to my local Mac installation. I have no idea why that's happening. Perhaps it defaults to that if it can't connect to the specified IP?
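For reference, these seem to be the two Mac-side files that matter, going by later posts in this thread; a quick, read-only check of what the client will actually use:

grep -E '^(serviceHost|servicePort)' "$HOME/Library/Application Support/CrashPlan/ui.properties"
cat "/Library/Application Support/CrashPlan/.ui_info"; echo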

 

I tried to review the settings in the container, but when I go to the Docker tab in the unRAID interface, I don't see any screens like the screenshots posted here. So I have no idea what is wrong, and there is very little information to go on in troubleshooting this.

 

UPDATE:

 

I removed the entire Docker configuration and started over again. This time I installed it from the Community Applications plugin and got that much working. At least it's installed properly now and the interface is working. I'm still unable to connect my Mac client, though, and that's baffling.

 

The first post of this thread should probably be updated to note that those instructions will not work for a new 6.0.1 installation.

Link to comment

Noob questions, I'm afraid, but I'm going round in circles.

 

I'm trying to set up a way to back up various directories in my array to an external HDD. CrashPlan has been recommended to me.

 

I installed the community CrashPlan docker and CrashPlan-Desktop docker and configured them as per page one of this thread.

 

crashplan log:

 

*** Running /etc/my_init.d/config.sh...

Current default time zone: 'Europe/London'
Local time is now: Mon Jul 20 17:28:07 BST 2015.
Universal Time is now: Mon Jul 20 16:28:07 UTC 2015.

*** Running /etc/rc.local...
*** Booting runit daemon...
*** Runit started as PID 64

 

Desktop log:

 

*** Running /etc/my_init.d/config.sh...

Current default time zone: 'Europe/London'
Local time is now: Mon Jul 20 17:40:06 BST 2015.
Universal Time is now: Mon Jul 20 16:40:06 UTC 2015.

cp: cannot stat ‘/root/wallpapers/*’: No such file or directory

 

Err, what do I do now? I don't seem to have any GUI. I was expecting a web-based GUI, but reading between the lines I need to remote desktop to port 4243 on my server. I tried VNC from my Mac, but no go. I don't have a Windows box. Am I on the right track?

 

Documentation for CrashPlan is unfortunately all over the place and quite obscure unless you know exactly what to ask!

 

Help appreciated :)

 

UPDATE:

I installed Microsoft Remote Desktop on my Mac and seem to be able to log in. Now to give it a go.

 

Leaving this post here because it just might help other noobs!

Link to comment

OK, I got things going now. It's still in the middle of synchronizing block information, and it will take a while before I start seeing real results. Here are some notes on where I got confused and how I got things running.

 

o Got Docker initialized fine, but it helps to read some of the unRAID information on how it works here.

o Make sure you put your Docker container in the right place.

o Do not use the instructions at the beginning of this thread—they're outdated as far as I can tell.

o You will need the Community Applications plugin to get access to Crashplan containers (server and desktop client).

o Verify you have good mappings to your devices (I had to map /boot to /boot to back up my flash drive; see the sketch after this list).

o Volume mappings will be different compared to your old CrashPlan configuration.

o Add the new path(s) to your backup sets and let CrashPlan sort out the new mappings without having to back everything up all over again (don't delete any of the old paths yet, even if they show as "missing").

o Once you have a complete backup you can remove the "missing" entries.

o Mac: To access the client you need CoRD, connecting to port 3389 (don't use Remote Desktop Connection on the Mac).
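As mentioned in the mappings note above, here is a rough command-line equivalent of what the Community Applications template ends up with, just so the volume mappings are visible in one place (paths are examples; normally you'd set these in the Docker tab rather than running this by hand):

docker run -d --name=crashplan \
  -v /mnt/cache/appdata/crashplan:/config \
  -v /mnt/user:/data \
  -v /boot:/boot \
  -p 4242:4242 -p 4243:4243 \
  gfjardim/crashplan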

 

Hope that helps some other new people.

Link to comment

So, by trawling through 20+ pages, I came across a reference to ".ui_info". Long story short, there's an additional step missing from the first post, which is:

 

• On your local client, update the .ui_info file:
  - Locate the file. On my Windows installation it's at c:\ProgramData\CrashPlan\.ui_info; on OS X it's at /Library/Application Support/CrashPlan/.ui_info
  - Replace the GUID (the long string of letters and numbers) with "unRAID", so that the entire line looks like this:
    4243,unRAID

 

And that's it. Long story short, apparently Code42 introduced a security feature where the client and server need to share a unique identifier, to prevent people from controlling other users' servers.
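If you'd rather make that edit from a terminal on OS X, something like this does it (back the file up first; on Windows it's easiest to edit the file in a text editor run as administrator):

F="/Library/Application Support/CrashPlan/.ui_info"
sudo cp "$F" "$F.bak"
printf '4243,unRAID' | sudo tee "$F" > /dev/null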

Link to comment

I am still having problems connecting to the CrashPlan docker. Everything seems to be backing up fine, but I cannot connect from either my Windows PC (using an SSH tunnel) or via the MATE desktop. All the .ui_info files are the same. I've reloaded the MATE desktop and played with the serviceHost and servicePort settings on both clients with no effect. The MATE desktop just times out and the Windows client ALWAYS connects locally.

 

Any thoughts?  It would be terrific if Limetech created/supported an official CP Docker (or something like it). It could be a pain to have to keep up with all the modifications CP seems to have been making recently.  Thanks

Link to comment
