Squid

[Plugin] CA Appdata Backup / Restore

376 posts in this topic


Interesting.  That popup is a built-in part of dynamix.  What browser are you using?


Interesting.  That popup is a built-in part of dynamix.  What browser are you using?

Google Chrome

Version 54.0.2840.99 m (64-bit)


 

Actually, it looks like the buttons on 6.2 are reversed compared to 6.3.


I'm slightly confused as to how I'm supposed to point the backup directory to an unassigned device.

 

I've attached two images to this post showing how I have the plugin configured right now. Capture1.png shows how I've entered the path for the unassigned device's drive, and Capture2.png shows the message that appears when I choose to back up. I find the error message somewhat confusing. Why is the destination drive written like this:

 

/mnt/disks/Temporary/Backups/ (/mnt/disks/Temporary)

 

Maybe I'm being dense, but I'm confused as to what the parentheses mean. This is a remote server, so I don't want to mess around with it and risk being unable to reconnect if that message means something is misconfigured.

 

Thanks in advance for any help!

[Attachment: Capture1.PNG]

[Attachment: Capture2.PNG]


I'm slightly confused as to how I'm supposed to point the backup directory to an unassigned device.

 

I've attached two images to this post showing how I have the plugin configured right now. Capture1.png shows how I've entered the path for the unassigned device's drive, and Capture2.png shows the message that appears when I choose to back up. I find the error message somewhat confusing. Why is the destination drive written like this:

 

/mnt/disks/Temporary/Backups/ (/mnt/disks/Temporary)

 

Maybe I'm being dense, but I'm confused as to what the parentheses mean. This is a remote server, so I don't want to mess around with it and risk being unable to reconnect if that message means something is misconfigured.

 

Thanks in advance for any help!

Sorry for the delay (busy with the season).  In the share section, you've typed in the whole path (/mnt/disks/...) when you should only be typing in the folder that the backups are going to, since you already have the "disk" set.

 

The confusing two-step process exists simply because 6.1.x doesn't support hardlinks on user shares.  In a couple of months I will be getting rid of the two-step process and making it a single option (as more people move off 6.1.x).


Ah, thank you so much Squid! I know it's a busy time of year. I just set up the application as you've described and it's working perfectly. It feels nice not having to worry about losing all of my VM configurations and Docker data. Thanks again!


Can someone help me change the rsync settings so that the backup doesn't tie up my server and Dockers for hours? I just want it to back up files that have changed since the last backup. When it runs, it completely overwrites everything, and I get texts about my Plex server being down, lol.

 

Please send help

 

Is there any way the rsync step can run similarly to how Unbalance does, where only files that have changed are copied?


If you have dated backups set up and set to delete after so many days, then instead of deleting the dated backup it will copy the changed files into the old folder.  The net result is that the backup will take minutes.  Or, barring that, you also have the option to keep Plex up and running.

 


 

 


If you have dated backups set up and set to delete after so many days, then instead of deleting the dated backup it will copy the changed files into the old folder.  The net result is that the backup will take minutes.  Or, barring that, you also have the option to keep Plex up and running.

 


 

Okay, I have turned on the option for dated backups and set them to delete after 3 days. I also deleted the old backups. From my understanding, the initial backup will take a while because I have 1.5 TB of data to back up, but if I understand you correctly, after this initial backup each dated backup will take less time? Please correct me if I am wrong.


If you have dated backups set up and set to delete after so many days, then instead of deleting the dated backup it will copy the changed files into the old folder.  The net result is that the backup will take minutes.  Or, barring that, you also have the option to keep Plex up and running.

 


 

Okay, I have turned on the option for dated backups and set them to delete after 3 days. I also deleted the old backups. From my understanding, the initial backup will take a while because I have 1.5 TB of data to back up, but if I understand you correctly, after this initial backup each dated backup will take less time? Please correct me if I am wrong.

No, this is wrong. Having dated backups makes it store a complete backup folder for each date instead of just storing the changes in the same folder.


If you have dated backups set up and set to delete after so many days, then instead of deleting the dated backup it will copy the changed files into the old folder.  The net result is that the backup will take minutes.  Or, barring that, you also have the option to keep Plex up and running.

 


 

Okay, I have turned on the option for dated backups and set them to delete after 3 days. I also deleted the old backups. From my understanding, the initial backup will take a while because I have 1.5 TB of data to back up, but if I understand you correctly, after this initial backup each dated backup will take less time? Please correct me if I am wrong.

No, this is wrong. Having dated backups makes it store a complete backup folder for each date instead of just storing the changes in the same folder.

Not when "attempt faster rsync" is also turned on.


When you click "Delete Incomplete Backup Sets" or "Delete Old Dated Backup Sets", is there some way to tell that the operation has completed?  Perhaps it just works really fast, but when I click them, nothing happens.  No message states that it completed successfully.  Am I missing something?


When you click "Delete Incomplete Backup Sets" or "Delete Old Dated Backup Sets", is there some way to tell that the operation has completed?  Perhaps it just works really fast, but when I click them, nothing happens.  No message states that it completed successfully.  Am I missing something?

lol.  TBH, I don't remember whether it sends a notification or not.  But if it's not showing as still running, then it should be complete.  You can always check what's showing up in your backup folder on the shares.

 

But one thing to consider is that it only deletes old / incomplete sets within the current backup folder.

 

That being said, during a migration from my main desktop to using a Windows VM all the time, some of the permissions on scripts somehow got reset, which I found affected CA.  I'll look at the backup module tonight.


But one thing to consider is that it only deletes old / incomplete sets within the current backup folder.

 

Just FYI, I've clicked the "Delete Backup Sets" several times and it does not seem to be deleting the ERROR folders.

 

[Screenshot: http://my.jetscreenshot.com/12412/20170106-g79s-279kb.jpg]

There's a difference in the buttons.

 

Delete Old Backups - Deletes everything in the current backup destination.

Delete Incomplete Backups - Deletes all backups labelled "error" in the current backup destination.

Delete Old Dated Backup Sets - Deletes all backups older than the number of days to keep in the current backup destination.

 

The backup status window should say what's going on.

 

It works for me on my system.  Show me a screenshot of the backup settings and I'll try and replicate.


Strike that...  The reason the scripts aren't working for you is the same reason the backup on your system is putting the log into the syslog.  Just testing the fixes right now.


Quick question, because I need to be sure of this: if I select "Disk 2" as the destination and use the default "CommunityApplicationsAppdataBackup" as the Destination Share, then this app will create a folder on Disk 2 called "CommunityApplicationsAppdataBackup" and the backups will go in that folder?  I just need to be sure that whatever I do, it's not going to erase everything on Disk 2.  :o


Quick question, because I need to be sure of this: if I select "Disk 2" as the destination and use the default "CommunityApplicationsAppdataBackup" as the Destination Share, then this app will create a folder on Disk 2 called "CommunityApplicationsAppdataBackup" and the backups will go in that folder?  I just need to be sure that whatever I do, it's not going to erase everything on Disk 2.  :o

Correct.  (Or you can set it to a user share so that the backup sets span the disk(s) according to the include/split-level settings for the share.)

 

But you might want to update first.  A bug fix that deals with logging just went into today's release (2017.01.09).


Is there a way to set this to compress as it backs up? Or is that not practical?

I'm also wondering about the "run in place of" command. Does that refer to the custom start/stop scripts? I'm trying to set it up to use rclone instead of rsync.


"Run in place of" will do the starts and stops of the selected containers but will not run rsync.  Your script would be expected to do the backup.
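Given that description, a custom "run in place of" script aimed at rclone might look like the sketch below. The remote name (gdrive-crypt), the paths, and the script location are all made-up examples, and the assumption that the plugin simply invokes the script between stopping and restarting the containers is inferred from the reply above, not confirmed:

```shell
# Write a hypothetical custom backup script that the plugin would invoke
# in place of its built-in rsync step. Remote name and paths are examples.
cat > /tmp/ca_custom_backup.sh <<'EOF'
#!/bin/bash
# Sync appdata to an encrypted cloud remote. rclone sync only transfers
# files that changed since the last run, so repeat runs stay fast.
rclone sync /mnt/user/appdata gdrive-crypt:appdata-backup --transfers 4
EOF
chmod +x /tmp/ca_custom_backup.sh
```

Since the containers are already stopped when the script runs, the application data should be in a consistent state for the upload.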

 


 

 


I would like to understand whether I have done something wrong with excluding folders. This is the specific line of BackupOptions.json:

"excluded": "/mnt/user/system/docker/appdata/emby/cache, /mnt/user/system/docker/appdata/emby/metadata",

 

However, all the Emby metadata as well as the Emby cache is being saved every day. Is there anything I need to correct in the definition above?

 

Thanks a lot.


A couple questions:

 

Would it be too much to request rclone support? I was going to just copy the logic and translate it for rclone but it looks more involved than I expected. Rclone is similar to rsync but works on cloud services and also has an encryption option. I thought about just initiating rclone to sync after rsync but that loses the "attempt faster sync" feature and will use a lot of bandwidth and time re-uploading files unnecessarily. I know there can be issues with integrating other plugins but I thought it was worth asking. Or maybe you can suggest an alternative way to do it with existing tools and features.

 

I see you can exclude folders but is there any way to exclude subfolders? I'm asking because Plex is really bloated and takes up a lot of space. I really only need it to backup the database and not tons and tons of cached image files and such. Alternatively, I can just exclude Plex altogether and use a custom script to copy just the DB files but I'd rather not complicate things if not necessary.


A couple questions:

 

Would it be too much to request rclone support? I was going to just copy the logic and translate it for rclone but it looks more involved than I expected. Rclone is similar to rsync but works on cloud services and also has an encryption option. I thought about just initiating rclone to sync after rsync but that loses the "attempt faster sync" feature and will use a lot of bandwidth and time re-uploading files unnecessarily. I know there can be issues with integrating other plugins but I thought it was worth asking. Or maybe you can suggest an alternative way to do it with existing tools and features.

 

I see you can exclude folders but is there any way to exclude subfolders? I'm asking because Plex is really bloated and takes up a lot of space. I really only need it to backup the database and not tons and tons of cached image files and such. Alternatively, I can just exclude Plex altogether and use a custom script to copy just the DB files but I'd rather not complicate things if not necessary.

Any mounted path is supported.  I believe that rclone does give you a mount point within /mnt/disks, but since I don't have any use for the plugin, I can't tell you much.  I wouldn't be surprised if you run into hardlink issues with it, though.  The best thing would be to try setting it as a destination and see what happens.

 

If you don't use the dated backup options, then every backup only copies the changed files, so your sync will also only reflect the changed files.

 

To exclude subfolders, manually type them in.


I would like to understand whether I have done something wrong with excluding folders. This is the specific line of BackupOptions.json:

"excluded": "/mnt/user/system/docker/appdata/emby/cache, /mnt/user/system/docker/appdata/emby/metadata",

 

However, all the Emby metadata as well as the Emby cache is being saved every day. Is there anything I need to correct in the definition above?

 

Thanks a lot.

Try a relative folder, i.e. only type in emby/cache and emby/metadata.
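If that works, the excluded line in BackupOptions.json would presumably end up looking something like this (a sketch based on the advice above; the rest of the file's layout is unknown and assumed unchanged):

```json
"excluded": "emby/cache, emby/metadata",
```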

