[Plugin] CA User Scripts


Recommended Posts

2 minutes ago, mgutt said:

@guilhem31

 

A) You could move all files with "mv":


#!/bin/bash
mv /mnt/user/backup/phonecamera/*.* /mnt/user/photos/phonecamera/

All files will then be moved to "/photos/phonecamera/", but if they already exist there, the script may fail. To force overwriting the destination, use the "-f" flag:


#!/bin/bash
mv -f /mnt/user/backup/phonecamera/*.* /mnt/user/photos/phonecamera/

 

B) If moving triggers a full new upload through your smartphone app, you should consider "rsync" to copy the files:


#!/bin/bash
# trailing slash on the source copies the folder's contents, not the folder itself
rsync -a /mnt/user/backup/phonecamera/ /mnt/user/photos/phonecamera/

 

Of course there are many more possibilities, such as copying files into a year/month subfolder structure, or copying only files with specific extensions like .jpg. I'm sure you will find other solutions with a little research.
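For example, the year/month idea could be sketched like this (a hedged sketch, not a tested solution; the paths in the comment follow the example above, `copy_by_month` is a made-up helper name, and the year/month folder is derived from each file's modification time):

```shell
#!/bin/bash
# Hedged sketch: copy only .jpg files into a year/month subfolder tree.
# Call it with the real shares, e.g.:
#   copy_by_month /mnt/user/backup/phonecamera /mnt/user/photos/phonecamera
copy_by_month() {
    local src="$1" dst="$2"
    find "$src" -type f -iname '*.jpg' | while read -r file; do
        # build the year/month folder from the file's modification time
        local sub
        sub=$(date -r "$file" +%Y/%m)
        mkdir -p "$dst/$sub"
        cp -n "$file" "$dst/$sub/"   # -n: never overwrite existing files
    done
}
```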

 

Thanks a lot for your answer!!

 

I need to copy the files, not move them, so I'll go for the rsync script.

But is it a "one-way" script? I just want to copy from one directory to another; the folders won't be the same size at the end. Sorry, my English is not good enough to say precisely what I have in mind...!

 

So I'll give more information:

The "photos" folder already contains A LOT of photos, from my Google Photos account.

I want to incrementally add new pics taken with my new phone to the "backup" folder (the Syncthing app backs up all my phone storage there), and then, once a week for example, I want the "backup" photo files to be copied into the "photos" folder.

That way the folder would always contain an up-to-date collection of all the photos I shoot.

 

I'm not sure that more information = more clarity :D

22 minutes ago, guilhem31 said:

I need to copy the files, not move them, so I'll go for the rsync script. But is it a "one-way" script? [...]

 

Yes, rsync is one-way. It also won't delete anything in the destination (as long as you do not set the "--delete" flag), so it's incremental, just as you need.
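A quick way to convince yourself of this behavior (a small demo with temp folders, assuming rsync is available on the box):

```shell
#!/bin/bash
# Demo: rsync -a without --delete never removes extra destination files
src=$(mktemp -d); dst=$(mktemp -d)
touch "$src/new.jpg"        # exists only in the source
touch "$dst/old.jpg"        # exists only in the destination
rsync -a "$src/" "$dst/"
ls "$dst"                   # both new.jpg and old.jpg are present
```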

 

P.S. If you do not pay for the Google cloud, it will massively reduce the quality of your photos. This is why I'm using the Amazon cloud instead (with a Prime membership).

8 minutes ago, mgutt said:

P.S. If you do not pay for the Google cloud, it will massively reduce the quality of your photos. This is why I'm using the Amazon cloud instead (with a Prime membership).

This is exactly why I need to store all my photos in original quality somewhere (at home!).

My old phone took 12 MP pics, so the quality wasn't degraded, but my new phone takes much better photos.

 

I'll try the rsync solution, thanks a lot again !

10 hours ago, mgutt said:

Yes, rsync is one-way. It also won't delete anything in the destination (as long as you do not set the "--delete" flag), so it's incremental, just as you need.

Just to let you know that I managed to do what you suggested, using a command like this:

 

rsync -a "/mnt/user/backups/myphonecamera/"* /mnt/user/photos

The quotes are needed because there are spaces in the folder name, and the * helped me copy only the folder's contents.
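For what it's worth, the same effect can be had without the glob: rsync treats a trailing slash on the source as "copy the contents, not the folder itself". A small demo with temp folders (the real command would be `rsync -a "/mnt/user/backups/myphonecamera/" /mnt/user/photos`, assuming rsync is available):

```shell
#!/bin/bash
# Demo of rsync's trailing-slash rule
src=$(mktemp -d); dst=$(mktemp -d)
mkdir "$src/inner"; touch "$src/inner/file.jpg"
rsync -a "$src/inner/" "$dst/"   # trailing slash: copy contents only
ls "$dst"                        # file.jpg lands directly in $dst
```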

 

Thanks again @mgutt


Hello there,

I found a really great script for my ZFS backup purposes.

Unluckily I have no clue about scripts & co. I am still a noob and would like to know if somebody can help me fix the errors/mistakes.

The script came from here: https://translate.google.com/translate?hl=de&sl=auto&tl=en&u=https%3A%2F%2Fesc-now.de%2F_%2Fzfs-offsite-backup-auf-eine-externe-festplatte%2F%3Flang%3Den

After several rounds of trial and error I had to give up and hope somebody can help me: line 21: dialog: command not found

 

Anyone?

@Dtrain

Around line 21 is the command "zpool":

[screenshot: script excerpt around line 21]

 

This command is part of the zfsutils package and is not available by default on an Unraid server. Maybe that is the reason? Test it yourself: open the terminal and type in "zpool":

[screenshot: terminal output of "zpool"]

 

If it returns "command not found", then you need to install the ZFS utilities first.

 

P.S. You can use rsync to create incremental backups, too:

#!/bin/bash
# settings
user_share="Music"
# create full backup
if [[ ! -d /mnt/user/Backup/Shares/${user_share}_$(date +%Y) ]]; then
    rsync -a /mnt/user/${user_share} /mnt/user/Backup/Shares/${user_share}_$(date +%Y)
# create incremental backup
else
    rsync -a --delete --link-dest=/mnt/user/Backup/Shares/${user_share}_$(date +%Y) /mnt/user/${user_share} /mnt/user/Backup/Shares/${user_share}_$(date +%Y%m%d_%H%M%S)
fi

 

This creates backups as follows:

[screenshot: resulting backup folders]

 

Every folder contains a full 1:1 backup of its day, but since "--link-dest" creates hardlinks for files that already exist, only changed/new files are physically copied. This means it saves roughly as much space as a ZFS snapshot does.
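The hardlink effect is easy to verify: a file present in two backup folders should report a link count above 1. A minimal demo of the idea with plain `ln` and temp folders (the file name is just an example):

```shell
#!/bin/bash
# Two directory entries, one physical copy on disk
full=$(mktemp -d); incr=$(mktemp -d)
echo "data" > "$full/track.mp3"
ln "$full/track.mp3" "$incr/track.mp3"   # hardlink, no extra space used
stat -c '%h' "$incr/track.mp3"           # prints 2 (the link count)
```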


Started using the auto VM backup script from danijo. Added a custom cron to start the script each Monday morning at 3 AM; prior to that I tested the custom cron during the day. What I miss (or at least fail to find) is a log of the script. When started in the foreground, the web UI displays the script output. Is this output stored somewhere, so I can monitor the progress, or check the log when it's done to see if everything went okay?

1 hour ago, sjoerd said:

Started to use the auto vm backup script from danijo. [...] Is this output stored somewhere so I can monitor the progress?

Maybe take a look at the User Scripts page; on the right-hand side, next to your cron settings, you can watch or download the log.

1 hour ago, alturismo said:

Maybe take a look at the User Scripts page; on the right-hand side, next to your cron settings, you can watch or download the log.

I must be blind. I saw those in screenshots, but they were from an older version of Unraid. It took a couple of seconds after I reopened the page.

 

Additional info:

So yeah, when I click either Show log or Download log, I only get a partial logfile. The last line says "Full logs for this script are available at /tmp/user.scripts/tmpScripts/unRaid_autovmbackup-0-4/log.txt". That's kind of inconvenient, since that directory isn't directly accessible. Isn't there a way to consolidate all these logfiles under Tools/Logfiles or something? Something like that might already exist, but I've only just started unravelling Unraid. I was merely a user using my VMs, but it seems I need to become a good Unraid administrator as well 😁
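In the meantime, the full log that the message points to can be read straight from the Unraid terminal, e.g. with `tail -f /tmp/user.scripts/tmpScripts/unRaid_autovmbackup-0-4/log.txt` (the folder name is the one from the post; substitute your own script's folder). The same idea, demonstrated on a temp file:

```shell
#!/bin/bash
# Reading the tail end of a log file from the terminal
log=$(mktemp)
printf 'backup started\nbackup finished\n' > "$log"
tail -n 1 "$log"   # prints the last line: backup finished
```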


It would be nice to have a user-friendly input for the custom cron schedule, or at least an explanation.

[screenshot: custom cron schedule input]

 

This answer helped me to understand the syntax.
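For anyone else puzzling over it: the box expects the classic five-field cron syntax, in the order minute, hour, day of month, month, day of week. A couple of illustrative entries:

```
0 3 * * 1      # every Monday at 03:00
*/15 * * * *   # every 15 minutes
0 2 1 * *      # at 02:00 on the 1st of every month
```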

 

EDIT: OK, now I found the "What is Cron" link at the bottom of the page. Maybe you should rename it, or show this popup when the "Custom Cron Schedule" field gets focus.

[screenshot: "What is Cron" popup]


Hi, I posted a question in General Questions, but I guess I need to post here for script/batch-file help. I don't know how to delete my message in General Discussion, so this is what I wrote:

"

Hi, I know the Avidemux forum isn't the place to ask for a script, and I'm not sure if this is the right subfolder for scripts.

I'm looking for help reading all my videos (avi, mov, flv, mp4, even mkv); I have more than 200k of them. I'd like a batch program to read all of them, remux them into mkv, and delete all meta tags.

Then, if there are any errors, the script would log which videos are out of sync.

I'd also want it to recreate everything in a different folder with the same subdirectories... I have tried MKVToolNix, but that's going to take me forever at 500 files at a time.

Does anyone have a script, or can someone point me in the right direction? I don't know how to write any of it."
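No ready-made script appeared in the thread, but for anyone landing here, a starting point might look like the hedged sketch below. It assumes ffmpeg is installed, the paths in the comment are placeholders, and `remux_all` is a made-up helper name; `-c copy` remuxes without re-encoding, and `-map_metadata -1` drops the global metadata tags:

```shell
#!/bin/bash
# Hedged sketch: remux common video types to .mkv, drop global metadata,
# mirror the folder tree, and log failures. Call it like:
#   remux_all /mnt/user/videos /mnt/user/videos_mkv
remux_all() {
    local src="$1" dst="$2" log="$2/remux-errors.log"
    find "$src" -type f \( -iname '*.avi' -o -iname '*.mov' -o -iname '*.flv' \
        -o -iname '*.mp4' -o -iname '*.mkv' \) | while read -r file; do
        local rel="${file#$src/}"            # path relative to $src
        local out="$dst/${rel%.*}.mkv"
        mkdir -p "$(dirname "$out")"
        # -c copy remuxes without re-encoding; -map_metadata -1 drops tags
        ffmpeg -nostdin -i "$file" -map 0 -c copy -map_metadata -1 "$out" \
            || echo "FAILED: $file" >> "$log"
    done
}
```

Sync errors would show up as ffmpeg failures in the log file rather than being detected explicitly; checking A/V sync properly needs more tooling.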

9 hours ago, mgutt said:

user-friendly input for the custom cron

Unfortunately, a cron generator is way outside the scope of the plugin, and a manual entry would still be needed anyway to handle the very weird entries that are possible.  But if I remember, next update I'll move the "what is cron" link to someplace more visible.

1 hour ago, Squid said:

But if I remember, next update I'll move the "what is cron" link to someplace more visible.

I wish I'd seen that before! I could never remember which "version" of cron was supported, and have generated valid but incompatible schedules several times.

2 hours ago, Squid said:

Unfortunately, a cron generator is way outside the scope of the plugin, and a manual entry would still be needed anyways to handle the very weird entries that are possible.  But if I remember, next update I'll move the "what is cron" link to someplace more visible.

If implemented in JS, e.g. through a pop-up, it could simply fill in the field; that way it would still be possible to edit it manually. Could you send me the source code of the list page? I'll try to implement that.


Quick one. 

 

Noticed in 6.8.3 with User Scripts 2020.05.11 that if I try to edit the NAME of a user script (by clicking the new gear icon), the editable line shows the markup for the gear icon itself, "<i class="fa fa-gear"></i>", rather than the expected title.

 

The title of a user script can no longer be edited. 

 

Text added after the icon markup does appear (in the larger icon font), but the old title also remains. Is this normal or a bug?


Hi, I just installed this but can see I am way out of my depth. I know absolutely nothing about writing scripts. Would someone be able to help me with a very minor one?

 

I would like to copy all files from /mnt/disks/DELUGETorrents/Seedbox_Downloads

to 

/mnt/disks/DELUGETorrents/Sonarr_Pickup/

 

For now I need a copy, but I will eventually change it to a move.

1 hour ago, DigitalDivide said:

Thanks!  Very much appreciated!

 

Just one other question, when I create the text file, does it require an extension or simply leave it blank?

 

Edit: nevermind, saw I can create it in the app.

Do you really want to execute this multiple times? It sounds like you only want to execute it once. In that case you could just use the terminal:

[photo: terminal with the one-off command]


Not sure what you mean? The folder is where my seedbox's Syncthing copies files to on my local server. I then want a job to run that will grab those files and move them to a different folder where Sonarr will pick them up. If it runs every hour, I would expect it to have nothing to move once the files have already been moved. Then, when Syncthing dumps a new file there, it will get moved again.

 

The problem I am having is that Sonarr tries to copy the files from the sync folder while they are still being synced, which gives an error. The only way around this that I know of is to move them to a different folder and have Sonarr grab them from there. I can't find a way to have Sonarr exclude the sync files. While Syncthing is syncing, it temporarily names the files .sync<filename>. If I could find a way for Sonarr to disregard files that start with .sync, I'd be more than happy.


Ok, if you need hourly scheduling, CA user scripts is the correct way.

 

Edit: instead of physically copying all the files, you could hardlink them in a second folder. That way the "copy process" would be much faster and would not waste storage space.

 

Edit2: This should work:

rsync -a --delete --link-dest=/mnt/disks/DELUGETorrents/Seedbox_Downloads/ /mnt/disks/DELUGETorrents/Seedbox_Downloads/ /mnt/disks/DELUGETorrents/Sonarr_Pickup/

 

"--link-dest" forces rsync to create hard links instead of copying the complete file.

 

"--delete" means it removes all files from Sonarr_Pickup/ if they do not exist in Seedbox_Downloads/ anymore.

 

Before executing this command you should empty the destination folder or use a different one.

 

Edit3: OK, "cp" supports that via the "-l" flag, too:

https://superuser.com/a/1360635/129262
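For completeness, the usual GNU coreutils idiom along those lines is `cp -al` (archive mode plus hardlinks); with the folders from above it would be `cp -al /mnt/disks/DELUGETorrents/Seedbox_Downloads/. /mnt/disks/DELUGETorrents/Sonarr_Pickup/`. A small demo with temp folders:

```shell
#!/bin/bash
# -a = archive (recursive, keeps attributes), -l = hardlink instead of copying data
src=$(mktemp -d); dst=$(mktemp -d)
echo "payload" > "$src/file.bin"
cp -al "$src/." "$dst/"
stat -c '%h' "$dst/file.bin"   # prints 2: source and copy share one inode
```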

