yendi Posted August 16, 2019
1 minute ago, Bolagnaise said: Thank you so much, I only have 10GB of RAM currently and was seeing rclone crashes in unraid and out-of-memory issues. I'm upgrading to 16GB tomorrow.
Cool to know that my issue is helping others! I can confirm that since I upgraded my RAM (added 8GB) and lowered the buffer to 128MB, I have not faced any issues. For now I have noticed no difference between 128 and 256 buffer size on a 1000/300 fiber line. Cheers!
yendi Posted August 16, 2019
@DZMM Quick question: if I set --min-age 15d to use it as a somewhat local cache, will it interfere with the directory caching time or any other setting? My idea is to keep new files local for a few days; for example, if a new episode of a show is out, many people with access to my server will watch it in the few days after release. Would that work? Thanks
DZMM (Author) Posted August 16, 2019
4 hours ago, yendi said: For now I have noticed no difference between 128 and 256 buffer size on a 1000/300 fiber. Cheers!
That's interesting - are you playing any high bitrate movies or 4K?
yendi Posted August 16, 2019
Just now, DZMM said: That's interesting - are you playing any high bitrate movies or 4K?
Playing only 1080 Remux & some 4K. Why? Do you find a noticeable difference?
DZMM (Author) Posted August 16, 2019
25 minutes ago, yendi said: @DZMM Quick question, if I put a --min-age 15d to use it as a somewhat local cache will it interfere with the directory caching time or any setting? My idea is to leave a new file a few days locally, as for example if a new episode of a show is out, many people that have access to my server will watch it a few days after. Would it work? Thanks
Nope - rclone move just won't upload anything local that's younger than 15d. I've set mine to 15min (I think) because I don't want anything local.
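As a rough sketch of how that flag slots into an upload script (the paths and remote name here are illustrative placeholders, not anyone's actual config), rclone's `--min-age` filter makes `rclone move` skip anything modified more recently than the given age:

```shell
#!/bin/bash
# Hypothetical paths/remote for illustration only - substitute your own.
# --min-age 15d: files modified within the last 15 days stay local,
# acting as a rolling 15-day local cache; everything older is moved
# to the cloud remote.
rclone move /mnt/user/local/google_vfs gdrive_media_vfs: \
    --min-age 15d \
    --transfers 4 \
    --delete-empty-src-dirs
```

With `--min-age 15min` instead, as DZMM describes, almost everything uploads on the next script run and nothing meaningful stays local.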
yendi Posted August 16, 2019
Just now, DZMM said: Nope - rclone move just won't upload anything local that's younger than 15d. I've set mine to 15min (I think) because I don't want anything local.
OK, thanks! So I will use 15d so I have a local 15-day cache, to ease thumbnail creation and prevent huge bandwidth usage when many people are playing recent content. Do you have any Plex settings that you deactivate because they are incompatible with rclone? I have left everything on (even the partial scans etc.) and did not see any issues.
DZMM (Author) Posted August 16, 2019
Just now, yendi said: Playing only 1080 Remux & some 4K. Why? Do you find a noticeable difference?
I was just wondering / wanted more info, as it might help other people low on RAM. I haven't messed with my settings for around a year - I just erred on the side of caution when I posted my scripts. I was close to tweaking my settings this weekend as I've had some recent buffering, but that turned out to be bad changes I'd made to my UniFi APs, which I realised last night, and all is OK again.
DZMM (Author) Posted August 16, 2019
3 minutes ago, yendi said: OK, thanks! So I will use 15d so I have a local 15-day cache, to ease thumbnail creation and prevent huge bandwidth usage when many people are playing recent content. Do you have any Plex settings that you deactivate because they are incompatible with rclone? I have left everything on (even the partial scans etc.) and did not see any issues.
Just thumbnail creation, for the reasons you've listed above ;-). I had a big pre-existing library that didn't have thumbnails - it would also have taken forever for my server to create them, and I'm not sure they are needed. I don't miss them. Other guides recommend disabling the media analysis due to potential API bans, but I've never had a problem, even when adding new content to Plex/Google continuously at up to 1Gbps for multiple days at a time.
yendi Posted August 16, 2019
1 minute ago, DZMM said: Just thumbnail creation, for the reasons you've listed above ;-). Other guides recommend disabling the media analysis due to potential API bans, but I've never had a problem, even when adding new content to Plex/Google continuously at up to 1Gbps for multiple days at a time.
Thanks for all this info, man! You've really saved me thousands in HDDs. Have a nice weekend.
DZMM (Author) Posted August 16, 2019
6 hours ago, yendi said: Thanks for all this info, man! You've really saved me thousands in HDDs. Have a nice weekend.
Same here. I've just finished selling 7 HDDs on eBay, including my parity drive, as I have a less real-time backup strategy now (to a separate teamdrive on a cron job). The noise and heat reduction is really noticeable, and of my remaining 16TB I'm only using around 20% - with nearly 350TB in the cloud, which would cost me over £7.5k just for the drives!
JonathanM Posted August 16, 2019
28 minutes ago, DZMM said: with nearly 350TB in the cloud which would cost me over £7.5k just for the drives!
Quick question. How feasible would it be to set up another machine at a different IP with read only access to that same 350TB?
DZMM (Author) Posted August 16, 2019 (edited)
10 minutes ago, JonathanM said: Quick question. How feasible would it be to set up another machine at a different IP with read only access to that same 350TB?
Very feasible - there are lots of ways to do it:
1. Mount the same remote with the same decryption passwords: there are no issues with reading from multiple mounts. Writing will work too, but there may be a lag before files appear on the other mounts, until the directory cache expires. If more than one mount tries to overwrite the same file, only one will win, but there's no file corruption.
2. Create a tdrive, share it with a 2nd gdrive account, and then mount it with the same decryption passwords: same health warning as above.
3. Easiest way - share Plex servers! Although this means one machine doing all the lifting, i.e. it would be inefficient bandwidth-wise.
Edited August 16, 2019 by DZMM
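For option 1, a minimal sketch of what the read-only machine might run (the remote name and mount path are placeholders; the second machine needs an rclone.conf with the same remote and decryption passwords). rclone's `--read-only` flag stops that machine from ever writing to the shared remote, which sidesteps the overwrite caveat entirely:

```shell
# Second machine: mount the same crypt remote, but read-only.
# Remote name and paths are illustrative - substitute your own.
rclone mount gdrive_media_vfs: /mnt/user/mount_rclone/google_vfs \
    --read-only \
    --dir-cache-time 72h \
    --buffer-size 128M \
    --daemon
```

Note the directory-cache lag DZMM mentions still applies: files uploaded from the first machine only appear here once the dir cache refreshes (or after `--dir-cache-time` expires).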
Harro Posted August 17, 2019 (edited)
Just finished reading every post in this thread, and a big thumbs up to @DZMM for creating these scripts. Before I start on this, a few questions:
1) Since Plex is running on your server, and you might share your content with family members, and the path in Plex points to your Google Drive, does that mean your family members now get the content from the cloud share directly, or does your server download it first and then send it to them?
2) If the cloud drive serves the family members directly, couldn't they set up their own Plex if you gave them access to the shared folder?
3) Would transcoding be a thing of the past, since everything would now come from the cloud, dictated only by the family members' download speeds?
I myself might not use this, but it is very interesting for combining media across multiple Plex servers.
Edited August 17, 2019 by Harro
NewDisplayName Posted August 17, 2019 (edited)
1. cloud -> server -> family
2. see 1, but you can "link" unlimited Plex servers to the same drive (which would be pretty dumb if you ask me)
3. transcode or not depends on the client, like a normal Plex installation.
Just remember: the whole thing appears as a normal directory to your server/software - that's the whole deal.
Edited August 17, 2019 by nuhll
DZMM (Author) Posted August 17, 2019
6 minutes ago, nuhll said: 1. cloud -> server -> family 2. see 1 3. transcode or not depends on the client, like a normal Plex installation.
Unless they are going to create their own mounts, the simplest solution is to share your Plex server.
Harro Posted August 17, 2019
Thanks for the replies. I use Kodi for my front end and would like to keep my files, but with this set up I imagine I would have less buffering on my server once all the files are in the cloud and family members start to access those files instead of the ones on my server. I'm skipping over the setup steps here and going straight to my concerns. In order to preserve my files, I could just copy them to the upload folder, correct? Then point Plex to the cloud drive. This would then take the load off my server and place it on the clients. Correct?
Harro Posted August 17, 2019
One other thing: will a Plex Pass be needed for setting this up?
NewDisplayName Posted August 17, 2019 (edited)
I don't understand your questions - why should a Plex Pass be needed? Do you need a Plex Pass for your current Plex server? Oo
"In order to preserve my files I could just copy them to the upload folder correct? Then point Plex to the cloud drive. This would then take the load off of my server and place it on the clients. Correct on this?"
I don't know what you mean. Files in the upload folder get uploaded and then removed locally. Again: if you follow this tutorial, the files in the cloud appear like a normal directory to your server/software; you can/can't do the same things. This setup is not meant as a backup, if that's what you're asking. It replaces local files with cloud files. For clients, the only bottleneck is your upload speed. The deal here is that you get unlimited storage for a buck...
Edited August 17, 2019 by nuhll
Harro Posted August 17, 2019
5 minutes ago, nuhll said: "Then point Plex to the cloud drive. This would then take the load off of my server and place it on the clients. Correct on this?" I don't know what you mean.
I mean that with my Plex server running now, I transcode the files locally; with the cloud, the clients would be transcoding or direct streaming those same files, not my local server. Why would my upload speed affect the clients? They would be downloading from the cloud. My files would be on that cloud - what would I be uploading?
NewDisplayName Posted August 17, 2019 (edited)
35 minutes ago, Harro said: I mean my Plex server running now, I transcode the files locally; on the cloud the clients would be transcoding or direct streaming those same files, not my server. Why would my upload speed affect the clients? They would be downloading from the cloud. My files would be on that cloud - what would I be uploading?
Again: they download from your server, and your server downloads from the cloud. For Plex it's just a normal folder. When something accesses it, it gets streamed from the cloud.
Edited August 17, 2019 by nuhll
Harro Posted August 18, 2019
Sorry. You know, when you focus so hard on your own mindset, you forget to see what is right in front of you.
sol Posted August 18, 2019
I posted this in [Plugin] rclone, but am crossposting here because I got most of my setup from this guide. Everything has been working great with rclone since I set it up about a month ago. This weekend, though, I've lost the unionfs mount. I've shut down unraid and rebooted, and it doesn't seem to want to come back. Manually running my rclone_unmount script (in background) and then running my rclone_mount script (in background) always yields the same error in the log:
18.08.2019 08:50:01 INFO: Check rclone vfs already mounted.
fuse: mountpoint is not empty
fuse: if you are sure this is safe, use the 'nonempty' mount option
18.08.2019 08:50:01 CRITICAL: unionfs Remount failed.
Script Finished Sun, 18 Aug 2019 08:50:01 -0500
My mount_unionfs mountpoint isn't empty, as I have a movies directory there, though the movies directory itself is empty. Should I just add the nonempty mount option, or is there a different best practice, or is something else going on? Any help is appreciated.
DZMM (Author) Posted August 18, 2019
@sol there's something in the unionfs mountpoint that shouldn't be there. How are you checking? Have you tried mc?
sol Posted August 18, 2019
1 hour ago, DZMM said: @sol there's something in the unionfs mountpoint that shouldn't be there. How are you checking? Have you tried mc?
Terminal. There is nothing there but the movies directory, and it's empty.
DZMM (Author) Posted August 18, 2019
59 minutes ago, sol said: Terminal. There is nothing there but the movies directory, and it's empty.
Maybe the error is for the rclone mount path - is it empty?
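One way to check a mountpoint properly before mounting (the directory names below are throwaway examples, not sol's actual paths): `find -mindepth 1` reports anything inside a directory, including dotfiles and subdirectories that a bare `ls` can miss, which is exactly what fuse objects to:

```shell
#!/bin/bash
# Returns non-zero if anything at all is inside the directory,
# including hidden files that `ls` without -a would not show.
check_empty() {
    if [ -n "$(find "$1" -mindepth 1 2>/dev/null)" ]; then
        echo "NOT EMPTY: $1"
        return 1
    fi
    echo "empty: $1"
}

# Demonstration with throwaway directories standing in for real mountpoints:
demo=$(mktemp -d)
mkdir -p "$demo/mount_rclone" "$demo/mount_unionfs/movies"

check_empty "$demo/mount_rclone"     # prints "empty: ..."
check_empty "$demo/mount_unionfs"    # prints "NOT EMPTY: ..." - even an
                                     # empty subdirectory counts as contents
rm -rf "$demo"
```

Note the demo's second case mirrors sol's situation: an empty movies subdirectory still makes the mountpoint non-empty as far as fuse is concerned.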