Waseh

[Plugin] rclone

516 posts in this topic


Posted (edited)
On 1/5/2019 at 11:15 PM, Stupifier said:
On 1/3/2019 at 3:04 PM, oh-tomo said:
Just setting up rclone for the first time for Google Drive.  When I go to the auto config page at http://127.0.0.1:53682/auth I just see "This site can’t be reached 127.0.0.1 refused to connect."  Where did I go wrong?

Use the rclone-beta plugin (yes, the plugin), not the Docker container. Then open a shell and type "rclone config" to begin your setup.

 

I've just installed the beta but I still only see rclone (not beta) in my plugin list.   Should there be two rclones in the installed plugins tab -- rclone and rclone-beta?

 

Update: trying again after uninstalling rclone and installing rclone-beta.

 

Edited by oh-tomo
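For reference, the headless flow behind that "rclone config" advice usually looks something like this (a sketch only; the exact prompts vary by rclone version, and "drive" is the Google Drive backend name):

```shell
# On the headless Unraid server: run the interactive wizard
rclone config
# ...create a new Google Drive ("drive") remote, and when asked
# "Use auto config?" answer n, since this machine has no browser.
# rclone then waits for a token.

# On any machine that HAS a browser and rclone installed:
rclone authorize "drive"
# A browser window opens for Google sign-in; after you approve,
# rclone prints a token. Paste that token back into the
# server's prompt to finish setting up the remote.
```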


Can someone tell me why this keeps happening? It's random and inconsistent. I am making sure I am well under the 750GB per day, and if I hit retry 10 or 12 times it will work and move on for a few files before doing it again. It's making moving files to G Suite a grueling process.

 

 

Screen Shot 2019-04-21 at 10.08.04 PM.png

1 minute ago, tmoran000 said:

Can someone tell me why this keeps happening? It's random and inconsistent. I am making sure I am well under the 750GB per day, and if I hit retry 10 or 12 times it will work and move on for a few files before doing it again. It's making moving files to G Suite a grueling process.

 

 

Screen Shot 2019-04-21 at 10.08.04 PM.png

You are writing data to your Google Drive account all wrong, that's why! rclone mounts are HORRIBLE to write to and really should ONLY be used to read data from.

 

Instead, if you MUST write to your Google Drive, use rclone copy/sync commands via a terminal shell.

Posted (edited)

It does the same whether I write to the mount folder or to the union folder; I have tried both and both continue to error. Also, I do not know how to transfer through a terminal shell, so I'm limited to Krusader.

Edited by tmoran000

11 hours ago, tmoran000 said:

It does the same whether I write to the mount folder or to the union folder; I have tried both and both continue to error. Also, I do not know how to transfer through a terminal shell, so I'm limited to Krusader.

You need to learn how to use the Terminal Shell.

 

Again, writing to an rclone mount is NOT reliable. It does not matter if you write to the mount folder directly or via the union folder, it is NOT reliable. STOP EXPECTING IT TO BE.

 

Instead, if you MUST write to your Google Drive, use rclone copy/sync commands via a terminal shell.

 

You're only limited to krusader because you are unwilling to LEARN. Put in the time, and learn how to use rclone properly. If you are unwilling to learn how to use Terminal Shell rclone commands, then you should just stop using rclone and seek alternatives.

 

If you Google search, you can even find people who wrote scripts to monitor your union folder and automatically fire off terminal shell rclone uploads for you. Cloudplow is one of those tools. But you have to be willing to learn how to use it.
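To give a flavor of what those scripts do, here is a minimal sketch (the local staging path, the remote name "gdrive", and the log path are all placeholders; real tools like Cloudplow handle far more edge cases):

```shell
#!/bin/bash
# Sketch of an upload script: move files older than 15 minutes
# from a local staging share up to a Google Drive remote.

# Skip this run if an upload is already in progress
if pgrep -f "rclone move /mnt/user/local/gdrive" > /dev/null; then
    echo "rclone upload already running; exiting."
    exit 0
fi

rclone move "/mnt/user/local/gdrive" "gdrive:" \
    --min-age 15m \
    --transfers 4 \
    -v --log-file /mnt/user/appdata/rclone/upload.log
```

Scheduled every 15-30 minutes via cron or the User Scripts plugin, something like this keeps the local side of a union folder draining to the cloud automatically.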

2 hours ago, Stupifier said:

 

 

You're only limited to krusader because you are unwilling to LEARN. Put in the time, and learn how to use rclone properly. If you are unwilling to learn how to use Terminal Shell rclone commands, then you should just stop using rclone and seek alternatives.

Maybe you should stop coming off as a dick. Never once did I say I was unwilling to learn; I was simply stating what I have done and where I was getting the errors, based on the help I have received so far from some very good members of this thread. So from here on out I will not need your help, thank you.


Now that I am trying the terminal to move files instead of Krusader: I was reading the rclone.org site and I see that it provides the command "rclone copy source:sourcepath dest:destpath". So I need some help with the source:sourcepath part. My files are found in users/media/files/. I'm not sure what to put in as "source:"; the dest I am assuming is my Gdrive:encrypted:/. I guess I am looking for the proper way to form the source argument, or at least an example of what it should look like.

 

Thanks.

56 minutes ago, tmoran000 said:

Now that I am trying the terminal to move files instead of Krusader: I was reading the rclone.org site and I see that it provides the command "rclone copy source:sourcepath dest:destpath". So I need some help with the source:sourcepath part. My files are found in users/media/files/. I'm not sure what to put in as "source:"; the dest I am assuming is my Gdrive:encrypted:/. I guess I am looking for the proper way to form the source argument, or at least an example of what it should look like.

 

Thanks.

Source: and Destination: are remotes. Type "rclone config" in a terminal shell to see a list of your current remotes, edit them, and add new ones.

 

You're doing well if you are reading the rclone site. Learn about rclone copy, move, lsf, lsd, and more. Also look into the various flags (especially --ignore-existing, -v, and others).

Posted (edited)

I have read some more but still nothing here makes it clear how to write the source argument. Source:Source path... would it look like Unraid:/mnt/user/Media/TV Shows/? The destination would be Gsuite_Stream:Secure. I wish they would have put an example of what it should look like, not just what the criteria are.

Edited by tmoran000

Posted (edited)
7 hours ago, tmoran000 said:

I have read some more but still nothing here makes it clear how to write the source argument. Source:Source path... would it look like Unraid:/mnt/user/Media/TV Shows/? The destination would be Gsuite_Stream:Secure. I wish they would have put an example of what it should look like, not just what the criteria are.


Examples below. I just threw these together in a couple of minutes to show you; I don't use any of these. Sync will delete files on the destination if they do NOT match the source. If you change sync to copy, then files in the destination will NOT be deleted. This is all in the rclone documentation; I believe examples are there as well.


This will sync a local folder on Unraid (specifically, the share named "my_documents_share") to a remote named "gdrive" I set up using "rclone config". "gdrive" is a remote pointing to my Google Drive account. --ignore-existing just makes sure to skip over files that are already synced, and -v gives verbose output so you can follow along with what's going on. Also notice the remote: nomenclature is only required when the source or destination is an actual remote location (like cloud storage).

rclone sync "/mnt/user/my_documents_share/" "gdrive:backup/my_documents_share/" --ignore-existing -v

 

This will sync a folder in the remote "gdrive" to another remote, "gdrive2". "gdrive2" is just another remote set up in "rclone config". It could point to a specific folder inside your Google Drive account, or it could point to an entirely different Google Drive account altogether. Notice that here both source and destination are remotes. This means we are just syncing data across two different cloud storage locations; in other words, nothing is being synced locally at all. Also, notice --bwlimit 5M just limits how much bandwidth this sync will consume, so you don't saturate your connection. Here I set it to 5 MB/sec.

rclone sync "gdrive:backup/my_documents_share/" "gdrive2:backup/my_documents_share/" --ignore-existing -v --bwlimit 5M

 

This is a simple command to just see the directories listed in the remote named "gdrive". Just an easy way to see what directories you have in there. You can use a depth flag (can't remember the exact wording for it) if you want it to list folders deeper than the top level.

rclone lsd gdrive:
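The depth flag mentioned above is, as far as I can tell, --max-depth (check "rclone lsd --help" to confirm). Something like:

```shell
# List directories two levels deep instead of just the top level
rclone lsd gdrive: --max-depth 2
```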

 

Edited by Stupifier


I will primarily be using the copy command. I see you are starting the source at /mnt/; that is very helpful. Now, in the documentation on the site it showed the first argument in an rclone copy as source:sourcepath. I'm not sure what to replace "source" with. I understand the path now, but what would I change the word source to before the colon? So where the ** is, what would I put, based on the format source:sourcepath dest:destpath?

Quote

rclone copy "******:/mnt/user/media/tvshows/" "Gdrive_Stream:/Secure/"

 

42 minutes ago, tmoran000 said:

I will primarily be using the copy command. I see you are starting the source at /mnt/; that is very helpful. Now, in the documentation on the site it showed the first argument in an rclone copy as source:sourcepath. I'm not sure what to replace "source" with. I understand the path now, but what would I change the word source to before the colon? So where the ** is, what would I put, based on the format source:sourcepath dest:destpath?

 

You're complicating stuff too much; he gave you a working script. You can just use:

 

rclone copy /mnt/user/Media Gdrive_StreamEN:Media --ignore-existing -v

Just change the /mnt/user/Media folder to the folder you want to copy, and the same for Gdrive_StreamEN:Media as well. So if you store it in the folder Secure, it would be Gdrive_StreamEN:Secure.

54 minutes ago, tmoran000 said:

I will primarily be using the copy command. I see you are starting the source at /mnt/; that is very helpful. Now, in the documentation on the site it showed the first argument in an rclone copy as source:sourcepath. I'm not sure what to replace "source" with. I understand the path now, but what would I change the word source to before the colon? So where the ** is, what would I put, based on the format source:sourcepath dest:destpath?

 

If both the source and destination locations are LOCAL (not remotes/cloud locations), then the colon and the part before it are not needed in the path. Everything before the colon refers to the name of the remote (as you configured it in "rclone config"). With that said, re-read my previous post with examples. Sorry, but I can't explain much further on this. It's best to just run commands, observe the result, and learn that way.
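To make that concrete, a quick sketch ("gdrive" and the paths are placeholders for whatever you set up in "rclone config"):

```shell
# Local -> local: plain paths, no remote name or colon anywhere
rclone copy "/mnt/user/Media/TV Shows/" "/mnt/user/Backup/TV Shows/" -v

# Local -> remote: only the destination uses the "remote:" prefix
rclone copy "/mnt/user/Media/TV Shows/" "gdrive:TV Shows/" -v
```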


Why do I have this issue? All the tutorials say to leave it empty, but when I press Enter it comes up again.

 

root@Unraid:~# rclone config
No remotes found - make a new one
n) New remote
s) Set configuration password
q) Quit config
n/s/q> n
name> Amazon
Type of storage to configure.
Enter a string value. Press Enter for the default ("").
Choose a number from below, or type in your own value
 1 / A stackable unification remote, which can appear to merge the contents of several remotes
   \ "union"
 2 / Alias for a existing remote
   \ "alias"
 3 / Amazon Drive
   \ "amazon cloud drive"
 4 / Amazon S3 Compliant Storage Provider (AWS, Alibaba, Ceph, Digital Ocean, Dreamhost, IBM COS, Minio, etc)
   \ "s3"
 5 / Backblaze B2
   \ "b2"
 6 / Box
   \ "box"
 7 / Cache a remote
   \ "cache"
 8 / Dropbox
   \ "dropbox"
 9 / Encrypt/Decrypt a remote
   \ "crypt"
10 / FTP Connection
   \ "ftp"
11 / Google Cloud Storage (this is not Google Drive)
   \ "google cloud storage"
12 / Google Drive
   \ "drive"
13 / Hubic
   \ "hubic"
14 / JottaCloud
   \ "jottacloud"
15 / Koofr
   \ "koofr"
16 / Local Disk
   \ "local"
17 / Mega
   \ "mega"
18 / Microsoft Azure Blob Storage
   \ "azureblob"
19 / Microsoft OneDrive
   \ "onedrive"
20 / OpenDrive
   \ "opendrive"
21 / Openstack Swift (Rackspace Cloud Files, Memset Memstore, OVH)
   \ "swift"
22 / Pcloud
   \ "pcloud"
23 / QingCloud Object Storage
   \ "qingstor"
24 / SSH/SFTP Connection
   \ "sftp"
25 / Webdav
   \ "webdav"
26 / Yandex Disk
   \ "yandex"
27 / http Connection
   \ "http"
Storage> 3
** See help for amazon cloud drive backend at: https://rclone.org/amazonclouddrive/ **

Amazon Application Client ID.
Enter a string value. Press Enter for the default ("").
client_id> 
This value is required and it has no default.
Enter a string value. Press Enter for the default ("").
client_id> 
This value is required and it has no default.
Enter a string value. Press Enter for the default ("").
client_id> 
This value is required and it has no default.
Enter a string value. Press Enter for the default ("").
client_id> 
This value is required and it has no default.
Enter a string value. Press Enter for the default ("").
client_id> 

 


Is there a script or something I can type in the terminal to watch the status of my rclone uploads? The only way I can see the status is if I go into PuTTY, log in, and then type my "rclone copy /mnt/share...."

 

Can I type anything in to get the upload screen?

4 hours ago, bobvilla said:

Is there a script or something I can type in the terminal to watch the status of my rclone uploads? The only way I can see the status is if I go into PuTTY, log in, and then type my "rclone copy /mnt/share...."

 

Can I type anything in to get the upload screen?

Look into the log output file flag in the rclone documentation. Just log your output to a file and use the tail -f terminal command to see what's been happening anytime you want to check.
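Put together, that might look like the following (paths are placeholders; --log-file and -v are documented rclone flags):

```shell
# Start the upload, writing all verbose output to a log file
rclone copy "/mnt/user/Media/" "gdrive:Media/" -v \
    --log-file /mnt/user/appdata/rclone/upload.log

# Then, from any other session, follow the log live
tail -f /mnt/user/appdata/rclone/upload.log
```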


How would one go about creating a log file to monitor the upload process? I see this on rclone.org:

 

 --log-file string

Also, for some strange reason I do not believe the script started when I started my array, i.e. I don't see my Backblaze getting any larger. If I run a command to start it in the terminal, will it stop if I close the terminal window?




How would one go about creating a log file to monitor the upload process? I see this on rclone.org:
 
 --log-file string

Also, for some strange reason I do not believe the script started when I started my array, i.e. I don't see my Backblaze getting any larger. If I run a command to start it in the terminal, will it stop if I close the terminal window?



To get a log file, just include this in your rclone command:
--log-file /path/to/logfile/log_filename.txt

Just like the documentation says: "string" simply means the path and filename of wherever you want the log to output.

And don't forget to use -v. Also, when you use the log file flag, your command output will ONLY go to the log file and will not be displayed live in the terminal window.

If you run any command in terminal, the command will STOP if you close the terminal window.
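If you do want a terminal-launched transfer to survive closing the window, the usual workaround (standard Linux, not rclone-specific; the paths are placeholders) is nohup:

```shell
# nohup detaches the command from the terminal; the trailing &
# backgrounds it so you get your prompt back immediately.
nohup rclone copy "/mnt/user/Media/" "gdrive:Media/" -v \
    --log-file /mnt/user/appdata/rclone/upload.log &
```

Alternatives like screen, tmux, or the User Scripts plugin's background mode accomplish the same thing.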


Hello,

 

I followed SpaceInvader One's YouTube video tutorial step by step.

 

I want to sync my data from Unraid to an encrypted Google Drive/G Suite for Plex.

 

Everything seems fine; all the tests are good.

 

But when I try to copy a file via Krusader from Unraid to the encrypted G Drive folder, I get a write permission error. And no sync; the G Drive is empty...

 

Maybe I missed something, or I don't understand how to sync data from my server to G Drive.

 

Thanks in advance for your help

0179BB32-AC47-4ED4-A7D7-3F99FF95D233.jpeg

On 5/13/2019 at 6:14 PM, L0rdRaiden said:

Why do I have this issue? All the tutorials say to leave it empty, but when I press Enter it comes up again.

 


root@Unraid:~# rclone config
No remotes found - make a new one
n) New remote
s) Set configuration password
q) Quit config
n/s/q> n
name> Amazon
Type of storage to configure.
Enter a string value. Press Enter for the default ("").
Choose a number from below, or type in your own value
...
Storage> 3
** See help for amazon cloud drive backend at: https://rclone.org/amazonclouddrive/ **

Amazon Application Client ID.
Enter a string value. Press Enter for the default ("").
client_id> 
This value is required and it has no default.
Enter a string value. Press Enter for the default ("").
client_id> 
This value is required and it has no default.
Enter a string value. Press Enter for the default ("").
client_id> 
This value is required and it has no default.
Enter a string value. Press Enter for the default ("").
client_id> 
This value is required and it has no default.
Enter a string value. Press Enter for the default ("").
client_id> 

 

Any help here?

plugin: installing: https://raw.githubusercontent.com/Waseh/rclone-unraid/master/rclone.plg
plugin: downloading https://raw.githubusercontent.com/Waseh/rclone-unraid/master/rclone.plg
plugin: downloading: https://raw.githubusercontent.com/Waseh/rclone-unraid/master/rclone.plg ... done
plugin: downloading: http://downloads.rclone.org/rclone-v1.35-linux-amd64.zip ... failed (Invalid URL / Server error response)
plugin: wget: http://downloads.rclone.org/rclone-v1.35-linux-amd64.zip download failure (Invalid URL / Server error response)

I can't seem to install this plugin. Any ideas?


Is your plugin up to date? It seems to be trying to pull a pretty old version of rclone.


Can someone please help me stop the currently running rclone process so I can update? I've tried killing processes, but I'm not sure I'm doing it right and I don't want to screw anything up. Here's what I get when attempting to update the plugin.

 

plugin: updating: rclone.plg

+==============================================================================
| Skipping package rclone-2018.08.25-bundle (already installed)
+==============================================================================

Downloading rclone
Downloading certs
Archive: /boot/config/plugins/rclone-beta/install/rclone-beta-latest.zip
creating: /boot/config/plugins/rclone-beta/install/rclone-v1.48.0-011-g276f8ccc-beta-linux-amd64/
inflating: /boot/config/plugins/rclone-beta/install/rclone-v1.48.0-011-g276f8ccc-beta-linux-amd64/rclone.1 
inflating: /boot/config/plugins/rclone-beta/install/rclone-v1.48.0-011-g276f8ccc-beta-linux-amd64/README.txt 
inflating: /boot/config/plugins/rclone-beta/install/rclone-v1.48.0-011-g276f8ccc-beta-linux-amd64/git-log.txt 
inflating: /boot/config/plugins/rclone-beta/install/rclone-v1.48.0-011-g276f8ccc-beta-linux-amd64/README.html 
inflating: /boot/config/plugins/rclone-beta/install/rclone-v1.48.0-011-g276f8ccc-beta-linux-amd64/rclone 
Copy failed - is rclone running?
plugin: run failed: /bin/bash retval: 1
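For what it's worth, the usual sequence looks something like this (a sketch; the mount path is an assumption about your setup, so check what is actually running before killing anything):

```shell
# Find running rclone processes and note their PIDs
ps aux | grep "[r]clone"

# Unmount any active rclone mount first (path is a placeholder)
fusermount -u /mnt/disks/gdrive

# Then stop the remaining rclone process by its PID
kill <PID>
```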

