[Plugin] rclone


Waseh

Recommended Posts

On 10/11/2020 at 1:38 PM, Stupifier said:

You're a bit late to the game. People have been doing this for the better part of 3 years.....countless projects on GitHub and other places.

Wow, that's amazing. I have been reading about it all day; this is truly amazing. I was a bit hesitant to store this data in the cloud even with encryption... but everything I've read suggests it's not an issue as long as things are encrypted...

 

I did just see that Google Drive limits daily bandwidth to 2TB, which includes the 750GB upload. Any idea how this will be affected? Not sure how much bandwidth would be used. I have 100Mbit symmetrical. How many clients could stream a 1080p file within the remaining 1.25TB? Any ideas?
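
By my rough math, assuming a typical 1080p stream runs at about 8 Mbit/s (roughly 3.6 GB per hour): 2TB minus the 750GB upload leaves about 1.25TB per day, which works out to 1250 / 3.6 ≈ 347 stream-hours per day, or around 14 clients streaming around the clock. Does that sound right?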

 

@mgutt oh man, I was reading all these scripts with so many characters that I missed that. I will try it again, thank you so much!

Link to comment
39 minutes ago, maxse said:

Wow, that's amazing. I have been reading about it all day; this is truly amazing. I was a bit hesitant to store this data in the cloud even with encryption... but everything I've read suggests it's not an issue as long as things are encrypted...

 

I did just see that Google Drive limits daily bandwidth to 2TB, which includes the 750GB upload. Any idea how this will be affected? Not sure how much bandwidth would be used. I have 100Mbit symmetrical. How many clients could stream a 1080p file within the remaining 1.25TB? Any ideas?

 

@mgutt oh man, I was reading all these scripts with so many characters that I missed that. I will try it again, thank you so much!

Again....late to the game. So everyone on the nerd-planet has been using G Suite Google Drive unlimited storage......and therefore, Google has recently announced some big changes coming soon. A lot of speculation that the unlimited storage will be ending. You missed the wagon by about 3 years, dude.

Just giving you a heads up before you dive into this business 100%.......Google MIGHT be tightening the purse a bit (still unverified since nothing has changed RIGHT NOW, just announcements, nothing more).


And no, I won't explain further.

Link to comment
19 hours ago, Stupifier said:

Again....late to the game. So everyone on the nerd-planet has been using G Suite Google Drive unlimited storage......and therefore, Google has recently announced some big changes coming soon. A lot of speculation that the unlimited storage will be ending. You missed the wagon by about 3 years, dude.

Just giving you a heads up before you dive into this business 100%.......Google MIGHT be tightening the purse a bit (still unverified since nothing has changed RIGHT NOW, just announcements, nothing more).


And no, I won't explain further.

No need to be so abrupt with the dude.... he has not missed the wagon at all. He may have to pay more for enterprise, but the boat is very much not missed....

 

@maxse You can sign up for Google Workspace Enterprise Plus if you want the unlimited storage option now. If you got in earlier you MAY have saved some cash, as you MAY have been able to stay on the older/cheaper plans, but that's currently unknown. You still have an option of about $30 per month for unlimited storage, which is a steal really!

Link to comment
  • 4 weeks later...

Folks, I have a question.

I got this working backing up my unraid to a remote unraid running minio. 

I am backing up a share that's 50TB. However, after running rclone for about a week, the logs report the size of the share I am backing up, and hence its data, as 12TB, and then it slowly goes up to 12.212, 12.213, etc... It climbs rather slowly; is it ultimately going to reach 50TB before the upload is finished? I am concerned because my other, smaller shares of around 1-2TB had no issues when running the script, and during the process rclone would show the correct size of the share as its total size.... Is this something to be concerned about? Is the share just too big for rclone to know its size? According to what I'm reading, 50TB shouldn't be an issue. I would appreciate some help.

Link to comment

Guys, I need some help please.

 

I found another issue. I am running rclone crypt to a minio bucket on my unraid server. Both servers are local on my network and when the backup is done I will move the minio bucket server offsite.

 

I am having an issue where the file transfer just stops and minio on the backup server reports that the filename is too long. I don't understand why this happens, because the OS (Unraid) is the same on both servers. I was really hoping this would be a great solution for me, and it took me a very long time to land on this type of backup solution.

 

Could someone help me out please? This is the error I got on the backup server running minio in docker

"Error: file name too long (cmd.StorageErr):

Link to comment

I am having some trouble upgrading to the latest rclone beta. Anyone else?

 

##### Note that closing this window will abort the execution of this script #####
Updating rclone
Archive: /boot/config/plugins/rclone/install/rclone.zip
inflating: /boot/config/plugins/rclone/install/rclone
-------------------------------------------------------------------
Update failed - Please try again
-------------------------------------------------------------------
 

Link to comment
16 hours ago, jjslegacy said:

I am having some trouble upgrading to the latest rclone beta. Anyone else?

 

##### Note that closing this window will abort the execution of this script #####
Updating rclone
Archive: /boot/config/plugins/rclone/install/rclone.zip
inflating: /boot/config/plugins/rclone/install/rclone
-------------------------------------------------------------------
Update failed - Please try again
-------------------------------------------------------------------
 

As I suspected, something is wrong on rclone's part with the latest beta release.
There are no Linux binaries available, which is why it fails to fetch the new version.
Nothing to do but wait until it's resolved :)

Link to comment
On 10/13/2020 at 11:40 PM, ne10g said:

No need to be so abrupt with the dude.... he has not missed the wagon at all. He may have to pay more for enterprise, but the boat is very much not missed....

 

@maxse You can sign up for Google Workspace Enterprise Plus if you want the unlimited storage option now. If you got in earlier you MAY have saved some cash, as you MAY have been able to stay on the older/cheaper plans, but that's currently unknown. You still have an option of about $30 per month for unlimited storage, which is a steal really!

What is that option you are referring to that costs $30?

Link to comment

Guys, 

Since Google is moving us from G Suite to Workspace, which means we will lose the unlimited storage, what is the command I can use to re-download everything I have moved from my server to G Suite?

 

I moved almost 40TB to G Suite over the past year, and it is unfortunate that I am gonna have to download everything again just in case Google decides to pull the plug on the unlimited storage.

Link to comment
8 minutes ago, livingonline8 said:

What is that option you are referring to that costs $30?

He's talking about G Suite Enterprise. You can Google it. But just to be clear, nobody knows if that will maintain unlimited storage either. It's a total crapshoot, and Google themselves are well known for being very unreliable when it comes to accurate information on this specific topic.

5 minutes ago, livingonline8 said:

Guys, 

Since Google is moving us from G Suite to Workspace, which means we will lose the unlimited storage, what is the command I can use to re-download everything I have moved from my server to G Suite?

 

I moved almost 40TB to G Suite over the past year, and it is unfortunate that I am gonna have to download everything again just in case Google decides to pull the plug on the unlimited storage.

rclone copy "remote:Media_Directory" "/mnt/user/vodeo_share"

 

And whatever additional flags you want.

https://rclone.org/flags/

Link to comment
28 minutes ago, Stupifier said:

He's talking about G Suite Enterprise. You can Google it. But just to be clear, nobody knows if that will maintain unlimited storage either. It's a total crapshoot, and Google themselves are well known for being very unreliable when it comes to accurate information on this specific topic.

rclone copy "remote:Media_Directory" "/mnt/user/vodeo_share"

 

And whatever additional flags you want.

https://rclone.org/flags/

Thank you for your quick response 

 

I am facing a small problem 

 

I used this command to test the download speed:

 

rclone copy media_vfs:movies_plex/english/fifty shades of grey/ /mnt/user/test

 

But it is not working!! Am I supposed to use "" and, if so, where?

 

I would appreciate your help 

 

Also, are there any flags you recommend when downloading our entire G Suite content?

 

Thanks again

Edited by livingonline8
added a question
Link to comment
8 minutes ago, livingonline8 said:

Thank you for your quick response 

 

I am facing a small problem 

 

I used this command to test the download speed:

 

rclone copy media_vfs:movies_plex/english/fifty shades of grey/ /mnt/user/test

 

But it is not working!! Am I supposed to use "" and, if so, where?

 

I would appreciate your help 

 

Also, are there any flags you recommend when downloading our entire G Suite content?

 

Thanks again

Use the quotes like I outlined. Enclose your entire source path in quotes.....and then enclose your entire destination path in quotes.

rclone copy "source" "destination"

Link to comment
21 hours ago, Stupifier said:

Use the quotes like I outlined. Enclose your entire source path in quotes.....and then enclose your entire destination path in quotes.

rclone copy "source" "destination"

Thank you, 

 

I have one more question please... Rclone has been working well for more than a year, but today when I wanted to upload some stuff, mounting failed. I checked the script logs and I get the following message:

 

Script Starting Dec 02, 2020 14:00.01

Full logs for this script are available at /tmp/user.scripts/tmpScripts/rclone_unionfs_mount/log.txt

02.12.2020 14:00:01 INFO: mounting rclone vfs.
2020/12/02 14:00:02 Fatal error: Directory is not empty: /mnt/user/mount_rclone/google_vfs If you want to mount it anyway use: --allow-non-empty option
02.12.2020 14:00:06 CRITICAL: rclone google vfs mount failed - please check for problems.
Script Finished Dec 02, 2020 14:00.06

Full logs for this script are available at /tmp/user.scripts/tmpScripts/rclone_unionfs_mount/log.txt

So I added the --allow-non-empty flag, but it is still not mounting.

What is going on?

Link to comment
1 hour ago, livingonline8 said:

Thank you, 

 

I have one more question please... Rclone has been working well for more than a year, but today when I wanted to upload some stuff, mounting failed. I checked the script logs and I get the following message:

 

Script Starting Dec 02, 2020 14:00.01

Full logs for this script are available at /tmp/user.scripts/tmpScripts/rclone_unionfs_mount/log.txt

02.12.2020 14:00:01 INFO: mounting rclone vfs.
2020/12/02 14:00:02 Fatal error: Directory is not empty: /mnt/user/mount_rclone/google_vfs If you want to mount it anyway use: --allow-non-empty option
02.12.2020 14:00:06 CRITICAL: rclone google vfs mount failed - please check for problems.
Script Finished Dec 02, 2020 14:00.06

Full logs for this script are available at /tmp/user.scripts/tmpScripts/rclone_unionfs_mount/log.txt

So I added the --allow-non-empty flag, but it is still not mounting.

What is going on?

The log says the directory is NOT empty. Never mount to a directory which has stuff in it. So please figure out what is in "/mnt/user/mount_rclone/google_vfs".....if it is something important to you, then move it. If it is not important, then delete it and try your mount command again.

Also, for what it's worth.....I personally like to mount to locations which are NOT user shares. Anything "/mnt/user/_____" is an Unraid user share. Instead I would recommend you mount to something like "/mnt/disks/mount_rclone/google_vfs". Make sure that directory EXISTS and is EMPTY before using the mount command.
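
For example (just a sketch; "gdrive:" stands in for whatever your remote is actually called):

mkdir -p /mnt/disks/mount_rclone/google_vfs
ls -A /mnt/disks/mount_rclone/google_vfs   # no output means the directory is empty
rclone mount "gdrive:" /mnt/disks/mount_rclone/google_vfs --daemon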

Link to comment
1 hour ago, Stupifier said:

The log says the directory is NOT empty. Never mount to a directory which has stuff in it. So please figure out what is in "/mnt/user/mount_rclone/google_vfs".....if it is something important to you, then move it. If it is not important, then delete it and try your mount command again.

Also, for what it's worth.....I personally like to mount to locations which are NOT user shares. Anything "/mnt/user/_____" is an Unraid user share. Instead I would recommend you mount to something like "/mnt/disks/mount_rclone/google_vfs". Make sure that directory EXISTS and is EMPTY before using the mount command.

What!!

 

It used to work perfectly well in the past... "/mnt/user/mount_rclone/google_vfs" is where I download my stuff so it can be uploaded. So making sure it is empty actually defeats the purpose.

 

Also, what about adding "--allow-non-empty"? I tried it and it is still not working, so why is that?

Link to comment
2 hours ago, livingonline8 said:

What!!

 

It used to work perfectly well in the past... "/mnt/user/mount_rclone/google_vfs" is where I download my stuff so it can be uploaded. So making sure it is empty actually defeats the purpose.

 

Also, what about adding "--allow-non-empty"? I tried it and it is still not working, so why is that?

You are getting very confused.
-Your issue was that you were MOUNTING rclone to "/mnt/user/mount_rclone/google_vfs" and it was giving an error because that directory has files in it.
-Now you tell me that "/mnt/user/mount_rclone/google_vfs" is where you DOWNLOAD files to. You should NOT be setting up an rclone mount overlapping where you download your media to.
-an rclone mount location and the location where you download media files to should be TWO separate locations.

rclone mounts are designed to be empty directory locations where you MOUNT a remote: to. In other words, an rclone mount directory is a place where you READ from the cloud. It is NOT a place where you put files you download into.

So here is an example for you:
rclone mount location = "/mnt/disks/rclone_mount"
downloaded media location (to be uploaded to cloud) = "/mnt/user/media/"
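
A minimal sketch of that layout ("remote:" and the "media" folder are placeholders for your own names; add whatever vfs/cache flags you normally use):

rclone mount "remote:" "/mnt/disks/rclone_mount" --daemon
rclone move "/mnt/user/media/" "remote:media" -v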

Update:
Maybe you were using unionfs or mergerfs and you didn't know it........open question to you

Edited by Stupifier
Link to comment
On 12/2/2020 at 6:00 PM, Stupifier said:

You are getting very confused.
-Your issue was that you were MOUNTING rclone to "/mnt/user/mount_rclone/google_vfs" and it was giving an error because that directory has files in it.
-Now you tell me that "/mnt/user/mount_rclone/google_vfs" is where you DOWNLOAD files to. You should NOT be setting up an rclone mount overlapping where you download your media to.
-an rclone mount location and the location where you download media files to should be TWO separate locations.

rclone mounts are designed to be empty directory locations where you MOUNT a remote: to. In other words, an rclone mount directory is a place where you READ from the cloud. It is NOT a place where you put files you download into.

So here is an example for you:
rclone mount location = "/mnt/disks/rclone_mount"
downloaded media location (to be uploaded to cloud) = "/mnt/user/media/"

Update:
Maybe you were using unionfs or mergerfs and you didn't know it........open question to you

Yes, I have been using unionfs all these years... did anything change with that?

 

This is why I was saying everything was working perfectly fine. 

 

 

Btw, now that I am planning to download everything I have uploaded to G Suite back to my server:

 

Is there a download limit like the upload limit?

 

I have about 40TB of data on G Suite and I wanna download it back. Any recommended flags to limit the download speed and not hit a daily download limit? Also, I don't want the download to suck up all my bandwidth... my download speed is about 200 Mbit.

Link to comment
7 hours ago, livingonline8 said:

Yes, I have been using unionfs all these years... did anything change with that?

 

This is why I was saying everything was working perfectly fine. 

 

 

Btw, now that I am planning to download everything I have uploaded to G Suite back to my server:

 

Is there a download limit like the upload limit?

 

I have about 40TB of data on G Suite and I wanna download it back. Any recommended flags to limit the download speed and not hit a daily download limit? Also, I don't want the download to suck up all my bandwidth... my download speed is about 200 Mbit.

Ok. I did not know you were using unionfs. Provide your unionfs mount command. This matters in solving your issue.

Use this command to copy stuff from your rclone remote to your local array storage:

rclone copy "remote:some_directory" "/mnt/user/some_share/" --bwlimit 10M -v -P


That will limit the download speed to 10 MB/second. You can download about 10 terabytes per day, I think; the download limit is higher than the upload limit. You should also use the --dry-run and -v flags. The --dry-run flag is for testing your command (nothing actually transfers), the -v flag outputs more information in the terminal window so you can see what is happening, and the -P flag shows progress live as things happen.

Again, I HIGHLY recommend you look over all the rclone flags available at https://rclone.org/flags/
I found the --bwlimit flag on that website. Trying to help you learn instead of just asking questions over and over.
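
Putting that together, a safe pattern is a dry run first and then the real copy:

rclone copy "remote:some_directory" "/mnt/user/some_share/" --bwlimit 10M --dry-run -v
rclone copy "remote:some_directory" "/mnt/user/some_share/" --bwlimit 10M -v -P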

Link to comment

Hey guys, I am new to rclone and looked up some guides for syncing to cloud servers.

I added one server and I am able to sync, but for some reason the script section is missing within the rclone plugin on my end.

Any ideas what I have to do? Did something change lately?

 

https://prnt.sc/w0q90w

 

Oh nevermind, found the explanation here:

 

Edited by acidburn666
Link to comment

Hi all,

 

I am installing the plugin and it shows on the Settings tab, but nothing happens when I run the following command from the terminal.

 

rclone config

 

plugin: installing: https://raw.githubusercontent.com/Waseh/rclone-unraid/master/plugin/rclone.plg
plugin: downloading https://raw.githubusercontent.com/Waseh/rclone-unraid/master/plugin/rclone.plg
plugin: downloading: https://raw.githubusercontent.com/Waseh/rclone-unraid/master/plugin/rclone.plg ... done

+==============================================================================
| Installing new package /boot/config/plugins/rclone/install/rclone-2020.09.29-bundle.txz
+==============================================================================

Verifying package rclone-2020.09.29-bundle.txz.
Installing package rclone-2020.09.29-bundle.txz:
PACKAGE DESCRIPTION:
Package rclone-2020.09.29-bundle.txz installed.
No internet - Skipping download - Trying to install existing binary
Install failed - No binary found - Please try installing/updating the plugin again
plugin: run failed: /bin/bash retval: 1

Updating Support Links



Finished Installing. If the DONE button did not appear, then you will need to click the red X in the top right corner
 

Edited by Greygoose
Link to comment

@Greygoose
The error might be a clue ;) 

Quote

No internet - Skipping download - Trying to install existing binary
Install failed - No binary found - Please try installing/updating the plugin again

 

The plugin tries to ping 8.8.8.8, 1.1.1.1, and the rclone server, only pinging the next in line if no answer was received from the former.
So the installer can't reach the internet.

Do you get an answer if you ping the above addresses from the command line?
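
For example:

ping -c 4 8.8.8.8
ping -c 4 1.1.1.1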

Edited by Waseh
Link to comment
