DZMM Posted November 6, 2018

3/1/20 UPDATE: TO MIGRATE FROM UNIONFS TO MERGERFS READ THIS POST. New users continue reading

13/3/20 Update: For a clean version of the 'How To' please use the github site https://github.com/BinsonBuzz/unraid_rclone_mount

17/11/21 Update: Poll to see how much people are storing

I've added a Paypal.me upon request if anyone wants to buy me a beer.

There have been a number of scattered discussions around the forum on how to use rclone to mount cloud media and play it locally via Plex, Emby etc. After discussions with @Kaizac, @slimshizn and a few others, we thought it'd be useful to start a thread where we can all share and improve our setups.

Why do this? Well, set up correctly, Plex can play cloud files regardless of size - e.g. I play 4K media with no issues, with start times of under 5 seconds, i.e. comparable to spinning up a local disk. With unlimited cloud space available for the cost of a domain name and around $10/pm, this becomes a very interesting proposition, as it reduces local storage requirements, noise etc. At the moment I have about 80% of my library in the cloud and I struggle to tell whether a file is local or in the cloud when playback starts.

To kick the thread off, I'll share my current setup using gdrive. I'll try to keep this first post updated.

Update: I've moved my scripts to github to make them easier to keep updated: https://github.com/BinsonBuzz/unraid_rclone_mount

Changelog

6/11/18 – Initial setup (updated to include rclone rc refresh)
7/11/18 - Updated mount script to fix rc issues
10/11/18 - Added creation of extra user directories (/mnt/user/appdata/other/rclone & /mnt/user/rclone_upload/google_vfs) to mount script. Also fixed typo for filepath
11/11/18 - Latest scripts added to https://github.com/BinsonBuzz/unraid_rclone_mount for easier editing
3/1/20 - Switched from unionfs to mergerfs
4/2/20 - Updated the scripts to make them easier to use and control. Thanks to @senpaibox for the inspiration

My Setup

Plugins needed:

Rclone – installs rclone and allows the creation of remotes and mounts. The new scripts require rclone v1.51+
User Scripts – controls how the mounts get created

How It Works

Rclone is used to access the files on your google drive and mount them in a folder on your server, e.g. mount a gdrive remote called gdrive_vfs: at /mnt/user/mount_rclone/gdrive_vfs
Mergerfs is used to merge the files in your rclone mount (/mnt/user/mount_rclone/gdrive_vfs) with local files that exist on your server and haven't been uploaded yet (e.g. /mnt/user/local/gdrive_vfs) into a new mount /mnt/user/mount_unionfs/gdrive_vfs
This mergerfs mount allows files to be played by dockers such as Plex, or added to by dockers like radarr etc, without the dockers even being aware that some files are local and some are remote. It just doesn't matter
The use of an rclone vfs remote allows fast playback, with files streaming within seconds
New files added to the mergerfs share are actually written to the local share, where they stay until the upload script processes them
An upload script uploads files in the background from the local folder to the remote. This activity is masked by mergerfs, i.e. to plex, radarr etc the files haven't 'moved'

Getting Started

Install the rclone plugin and, via the command line, run rclone config and create 2 remotes:

gdrive: - a drive remote that connects to your gdrive account. I recommend creating your own client_id to avoid API bans
gdrive_media_vfs: - a crypt remote that is mounted locally and decrypts the encrypted files uploaded to gdrive:

Mount Script - see https://github.com/BinsonBuzz/unraid_rclone_mount for the latest script

Create a new script using the user scripts plugin and paste in the rclone_mount script
Edit the config lines at the start of the script to choose your remote name, paths etc
Choose a suitable cron job. I run this script on a 10 min (*/10 * * * *) schedule so that it automatically remounts if there's a problem

The script:

Checks if an instance is already running, and remounts automatically (if a cron job is set) should the mount drop
Mounts your rclone gdrive remote
Installs mergerfs and creates a mergerfs mount
Starts dockers that need the mergerfs mount, e.g. plex, radarr

Upload Script - see https://github.com/BinsonBuzz/unraid_rclone_mount for the latest script

Create a new script using the user scripts plugin and paste in the rclone_upload script
Edit the config lines at the start of the script to choose your remote name, paths etc - USE THE SAME PATHS
Choose a suitable cron job, e.g. hourly

Features:

Checks that rclone is installed correctly
Sets bwlimits. Google caps uploads at 750GB/day, so I have added bandwidth scheduling to the script - you can e.g. set an overnight job to upload the daily quota at 30MB/s, have it trickle up over the day at a constant 10MB/s, or set variable speeds over the day
The script now stops once the 750GB/day limit is hit (rclone 1.51+ required), so there is more flexibility over upload strategies
I've also added --min-age 10m to stop any premature uploads, and exclusions to stop partial files etc getting uploaded

Cleanup Script - see https://github.com/BinsonBuzz/unraid_rclone_mount for the latest script

Create a new script using the user scripts plugin and set it to run at array start (recommended) or array stop

In the next post I'll explain my rclone mount command in a bit more detail, to hopefully get the discussion going!
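PS for anyone who wants to see the moving parts before opening github: the mount script boils down to something like the below. This is a minimal sketch only - the paths and remote name are the examples from above, and the mergerfs options are a trimmed-down set, so use the github script for the real thing:

# create the local, cloud and merged folders (example paths)
mkdir -p /mnt/user/local/gdrive_vfs /mnt/user/mount_rclone/gdrive_vfs /mnt/user/mount_unionfs/gdrive_vfs

# mount the crypt remote created during rclone config
rclone mount --allow-other --dir-cache-time 720h --vfs-read-chunk-size 128M \
  gdrive_media_vfs: /mnt/user/mount_rclone/gdrive_vfs &

# merge the two branches - new writes land on the local branch (category.create=ff),
# reads come from whichever branch holds the file
mergerfs /mnt/user/local/gdrive_vfs:/mnt/user/mount_rclone/gdrive_vfs \
  /mnt/user/mount_unionfs/gdrive_vfs \
  -o rw,allow_other,func.getattr=newest,category.create=ff

Dockers then point at /mnt/user/mount_unionfs/gdrive_vfs and never need to know which side a file actually lives on.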
DZMM Posted November 6, 2018

Key elements of my rclone mount command:

rclone mount \
--allow-other \
--buffer-size 256M \
--dir-cache-time 720h \
--drive-chunk-size 512M \
--log-level INFO \
--vfs-read-chunk-size 128M \
--vfs-read-chunk-size-limit off \
--vfs-cache-mode writes \
--bind=$RCloneMountIP $RcloneRemoteName: \
$RcloneMountLocation &

--buffer-size: determines the amount of memory that will be used to buffer data in advance. I think this is per stream

--dir-cache-time: sets how long a directory should be considered up to date and not refreshed from the backend. Changes made locally in the mount may appear immediately or invalidate the cache, so if you upload via rclone you can set this to a very high number. If you make changes directly on the remote they won't be picked up until the cache expires

--drive-chunk-size: for files uploaded via the mount, NOT via the upload script - i.e. if you add files directly to /mnt/user/mount_rclone/yourremote. I rarely do this and it's not a great idea

--vfs-read-chunk-size: this is the key variable. It controls how much data is requested in the first chunk of playback - too big and your start times will be too slow, too small and you might get stuttering at the start of playback. 128M seems to work for most, but try 64M and 32M

--vfs-read-chunk-size-limit: each successive vfs-read-chunk-size request doubles in size until this limit is hit, e.g. for me 128M, 256M, 512M, 1G etc. I've set the limit to off so requests aren't capped and rclone downloads the biggest chunks it can for my connection

Read more on vfs-read-chunk-size: https://forum.rclone.org/t/new-feature-vfs-read-chunk-size/5683
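Note: earlier versions of this command also had rclone's remote control API turned on, which is what the --rc discussion in the replies below is about. If you want the rc API (used further down the thread to pre-load the directory cache), the idea is to add these two flags to the mount command - the address here is just my server's IP, so substitute your own or drop the --rc-addr part entirely:

--rc \
--rc-addr=172.30.12.2:5572 \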
DZMM Posted November 6, 2018

Unionfs works 'ok', but it's a bit clunky as per the scripts above. Rclone are working on their own union remote, which would hopefully include hardlink support, unlike unionfs. It might also remove the need for a separate rclone move script by automating transfers from the local drive to the cloud: https://forum.rclone.org/t/advantage-of-new-union-remote/7049/1
slimshizn Posted November 6, 2018

9 hours ago, DZMM said:

--rc --rc-addr=172.30.12.2:5572

I see you added this along with fast list. What is the IP? Is that Plex? Going to try it out myself.

Edit: Found your reasoning in the rclone forums.
DZMM Posted November 6, 2018

41 minutes ago, slimshizn said:

I see you added this along with fast list. What is the IP? Is that Plex? Going to try it out myself.

It's my unRAID IP address
slimshizn Posted November 6, 2018

Got it, testing now.
slimshizn Posted November 6, 2018

Yup, got a connection refused issue. Hrm.
DZMM Posted November 6, 2018

Have you tried it without the address bit and just --rc? My firewall setup is a bit complicated, so I'm not sure whether other users need to add the address
slimshizn Posted November 6, 2018

Seems it worked, here's what I got:

Quote

2018/11/06 17:56:55 Failed to rc: connection failed: Post http://localhost:5572/vfs/refresh: dial tcp 127.0.0.1:5572: connect: connection refused
2018/11/06 17:56:55 NOTICE: Serving remote control on http://127.0.0.1:5572/

Any need to be able to access that?

Edit: I use the cloud storage purely for backup of everything. I just tested a copy both locally and in the cloud - zero difference.
DZMM Posted November 6, 2018

15 minutes ago, slimshizn said:

Seems it worked, here's what I got. Any need to be able to access that?

Yep, it's working. Eventually it will say:

{ "result": { "": "OK" } }

What it's doing is loading/pre-populating your local directory cache with all your cloud library folders, i.e. you'll get a better browsing experience once it's finished, e.g. plex will do library scans faster.

I haven't used rclone rc much yet or looked into it - I think it allows commands that can't be done via the command line.
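For reference, the refresh the mount script kicks off is along these lines (a sketch - the --url part is only needed if you set an rc address on the mount, and the IP is just my server's):

# walk the whole remote and pre-load the VFS directory cache
rclone rc vfs/refresh recursive=true

# or, if the mount was started with --rc-addr, point the client at it and run async
rclone rc --url http://172.30.12.2:5572 vfs/refresh recursive=true _async=true

On a big library the recursive walk can take a while, which is why the "OK" doesn't come back immediately.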
slimshizn Posted November 7, 2018

It never ended up saying result OK, but it seems to be working fine and viewing the share seems quicker than usual, which is nice.
DZMM Posted November 7, 2018

7 hours ago, slimshizn said:

It never ended up saying result OK, but it seems to be working fine and viewing the share seems quicker than usual, which is nice.

I just updated the mount script - give it another whirl, as it ran quite fast for me. Try a full Plex scan and you'll see the speed difference
NewDisplayName Posted November 7, 2018

Unbelievably cool stuff. If this works, we don't even really need large local storage anymore. What provider do you use? When I look at gdrive it says €45 (3 user minimum) for unlimited storage...
DZMM Posted November 7, 2018

40 minutes ago, nuhll said:

Unbelievably cool stuff. If this works, we don't even really need large local storage anymore. What provider do you use? When I look at gdrive it says €45 (3 user minimum) for unlimited storage...

Sorry, it's $10/pm full price - ignore the 3/5 user minimum for unlimited, as they don't enforce it. I have one account:

root@Highlander:~# rclone about gdrive:
Used: 79.437T
Trashed: 1.158T
Other: 2.756G

There are usually 20% etc coupons for the first year if you shop around.
NewDisplayName Posted November 7, 2018

13 minutes ago, DZMM said:

Sorry, it's $10/pm full price - ignore the 3/5 user minimum for unlimited, as they don't enforce it.

WTF, that's crazy. But what happens when they enforce it some time... So I could get unlimited for €15/month at the moment - how come you only pay $10? For €10 it says 3TB max.
DZMM Posted November 7, 2018

11 minutes ago, nuhll said:

WTF, that's crazy. But what happens when they enforce it some time... So I could get unlimited for €15/month at the moment - how come you only pay $10? For €10 it says 3TB max.

Not sure where you're based, but this is the UK price - and I got 20% off for the first 12 months: https://gsuite.google.co.uk/intl/en_uk/pricing.html

A lot of people have been going for a while on this, so the limit enforcement doesn't seem an immediate threat. If they do start enforcing the 5 user minimum, I guess users could combine accounts in blocks of five and pay for one seat each - people do this already just in case.
NewDisplayName Posted November 7, 2018

Good idea. BTW, lol... https://gsuite.google.co.uk/intl/de_de/pricing.html

How much RAM does rclone use for you?
DZMM Posted November 7, 2018

One potential drawback is that for each stream you have to be able to support the bitrate, e.g. 20Mbps, so if you don't have decent bandwidth this isn't a goer - although if you don't, it wouldn't be anyway, as you need to be able to upload all your content! I have 300/300, which is good enough for about 4-5x 4K or about 20-30x 1080p streams, although my usage is nowhere near this.
NewDisplayName Posted November 7, 2018

I've got around 50 or 16Mbit depending on which line it goes over... does it support multithreading? Anyway, Plex should regulate quality depending on line speed, correct?
DZMM Posted November 7, 2018

1 hour ago, nuhll said:

How much RAM does rclone use for you?

Up to 1GB per stream: --buffer-size 1G. Use less if you don't have the RAM - some users use as little as 100MB.

59 minutes ago, nuhll said:

I've got around 50 or 16Mbit depending on which line it goes over... does it support multithreading? Anyway, Plex should regulate quality depending on line speed, correct?

Is that up or down? Because you need to transfer the whole file from gdrive to your local plex over the duration of the show or movie (playback starts streaming straight away), you need to be able to manage the average bitrate. In its simplest form: a 60min 10GB file has an average bitrate of around 22Mbps, so you need that much bandwidth on average to play it (the film won't be a constant 22Mbps - some bits will be higher, some lower). With a fast connection, rclone will grab it quicker depending on your chunk settings - so you'll see high usage for a few minutes, then bursty traffic afterwards.

Remote access works the same way from your plex server to wherever - after you've got the file from gdrive. If you don't have enough bandwidth downstream, some users have paid for cheap dedicated servers/VPS with big pipes to host Plex there, so they can support lots of users without hitting their local connection. I think @slimshizn does this
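If you want to sanity-check your own files, the sum is just size over runtime - e.g. with hypothetical numbers:

# average bitrate needed to stream a file in real time
size_gb=10; runtime_min=60
awk -v s="$size_gb" -v m="$runtime_min" 'BEGIN { printf "%.1f Mbps\n", (s*8*1000)/(m*60) }'
# -> 22.2 Mbps for a 10GB, 60-minute file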
NewDisplayName Posted November 7, 2018

That's download. But I'm getting an upgrade within 1-2 months at the latest - probably at least 100Mbit. Upload is slow though, only 10Mbit x 2.

Let's say I download a movie: while it's uploading to gdrive, it's still accessible locally and only gets deleted when the upload is finished? And when I start a movie, I can watch before it's completely downloaded, correct? I only need to support 1-2 users max... ^^
NewDisplayName Posted November 7, 2018

The only thing I'm missing is remote encryption of the files - that would be a huge thing.
DZMM Posted November 7, 2018

36 minutes ago, nuhll said:

Let's say I download a movie: while it's uploading to gdrive, it's still accessible locally and only gets deleted when the upload is finished?

Yes

36 minutes ago, nuhll said:

Upload is slow though, only 10Mbit x 2.

You're no worse off with this setup than before for Plex remote access. Uploading to gdrive will be slow, though, but the files stay local until they have uploaded completely

37 minutes ago, nuhll said:

When I start a movie, I can watch before it's completely downloaded, correct?

Yes, it streams the movie while rclone downloads it in chunks in the background. With my --vfs-read-chunk-size 128M, it downloads a 128M chunk first and then starts playing - that's how it starts in seconds. Then it keeps doubling the next chunk it requests in the background - 256M, 512M, 1G etc

16 minutes ago, nuhll said:

The only thing I'm missing is remote encryption of the files - that would be a huge thing.

It is encrypted - you are mounting the remote gdrive_media_vfs, which encrypts the files when they are actually stored on gdrive. When you set up the gdrive_media_vfs remote, choose crypt. A good how-to here: https://hoarding.me/rclone/. Where it mentions two encrypted remotes, I just use one, gdrive_media_vfs, and then create sub-folders inside the mount for my different types of media:

Quote

We're going to encrypt everything before we upload it, so this adds another layer to the process. How this works is you create remotes to your cloud storage, then we create an encrypted remote on top of the normal remote. These encrypted remotes, one for TV and one for movies, are the ones we'll be using for uploading. We'll then be creating two more remotes afterwards to decrypt the plexdrive mounts. So 5 in total. To run it, rclone config. Select N for a new remote, and just name it 'gd', then select 7 for GD. This is the underlying remote we'll use for our crypts. Follow this link to create a client ID and secret, and use them for the next two prompts in the rclone config. After this, select N, then copy the link provided and use it in your browser. Verify your google account and paste the code returned, then Y for 'yes this is ok', and you have your first remote! Next we're going to set up two encrypted remotes. Login to GD and create two folders, tv-gd and m-gd. Run rclone config again, N for new remote, then set the name as tv-gd, and 5 for a crypt. Next enter gd:/tv-gd, and 2 for standard filenames. Create or generate a password and an optional salt - make sure you keep these somewhere safe, as they're required to access the decrypted data. Select Y for 'yes this is ok'. Then you can do the same for the second one, using the name m-gd and the remote gd:/m-gd. There's our two encrypted remotes setup
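If you'd rather not walk through the interactive prompts, remotes can also be created in one go. A rough sketch - the names, the gdrive:crypt folder and all the credential values are placeholders, and the drive remote will still prompt you through the OAuth step to authorise your account:

# underlying drive remote - use your own client_id/client_secret
rclone config create gdrive drive client_id YOUR_CLIENT_ID client_secret YOUR_CLIENT_SECRET scope drive

# crypt remote layered on top - rclone stores obscured (not plain) passwords in its config
rclone config create gdrive_media_vfs crypt remote gdrive:crypt \
  filename_encryption standard directory_name_encryption true \
  password "$(rclone obscure YOUR_PASSWORD)" password2 "$(rclone obscure YOUR_SALT)"

As with the interactive route, keep the password and salt somewhere safe - without them the data on gdrive is unreadable.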
NewDisplayName Posted November 7, 2018

26 minutes ago, DZMM said:

It is encrypted - you are mounting the remote gdrive_media_vfs, which encrypts the files when they are actually stored on gdrive.

So you only use one "remote" - good idea, I guess. That's really awesome; if I find some time, I'll implement it. I guess since I have slower internet I'll lower the chunk size so it starts faster. The only drawback would be reduced speed (but I have slower internet anyway) and maybe more CPU usage, which should be no problem for 1, max 2 users.

Also, I would change your script so it only uploads files older than e.g. 1 year, so I don't waste time uploading "bad movies". I wonder if it would be possible to upload files to gdrive only when two (local) IPs are not reachable, so it doesn't interfere with other network activity.
DZMM Posted November 7, 2018

7 minutes ago, nuhll said:

I guess since I have slower internet I'll lower the chunk size so it starts faster. The only drawback would be reduced speed (but I have slower internet anyway) and maybe more CPU usage, which should be no problem for 1, max 2 users.

Experiment - too low and you'll get buffering/stuttering at the start. If 128M is too big, try 64M and maybe then even 32M. I don't think you'll need to go lower than 64M. It's light on CPU, as it's hardly any different to playing a file off your local drive
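A quick way to test different values (a sketch, assuming the example paths from the first post): lazily unmount, then remount with a smaller first chunk:

# lazy-unmount the existing rclone mount
fusermount -uz /mnt/user/mount_rclone/gdrive_vfs

# remount with a 64M first chunk, still letting chunks double without a cap
rclone mount --allow-other --vfs-read-chunk-size 64M --vfs-read-chunk-size-limit off \
  gdrive_media_vfs: /mnt/user/mount_rclone/gdrive_vfs &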