Roudy

Members · 105 posts
Everything posted by Roudy

  1. If you have everything else working and only the Admin page fails to load, you need to add the Extra Argument variable mentioned in the Bitnami documentation with the --proxy=edge value. Here is a screenshot of it for reference.
  2. Did you ever get this working, m1rc0? I may have a fix for you if you haven't.
  3. Yeah, I was running it without but we’ve been having a lot of rolling electrical outages where I’m at so I kept getting input/output errors. Had it nearly complete before leaving. I appreciate you knocking it out for everyone. If it works well, add it to GitHub and request a merge. I’m sure there will be more people who’ll need to mass download their data soon. I’ve been syncing for the time being since I haven’t received my notice yet. Sent from my iPhone using Tapatalk
  4. I was working on it but left for vacation. The upload script has a few extra measures that prevent it from running twice, by creating a file to reference before running. I’ll try to finish it up in the next few days if no one beats me to it. Then you can set it to run every 10 minutes or whatever to ensure it keeps running all day.
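The run-once guard described above (a file created before running, checked on the next run) can be sketched like this; the file path and the commented-out rclone step are my own illustrative examples, not the actual script's:

```shell
#!/bin/bash
# Sketch of a lockfile guard so overlapping runs bail out early.
LOCKFILE="/tmp/upload_script.lock"

# If a previous run is still in progress, exit without doing anything.
if [ -e "$LOCKFILE" ]; then
  echo "Already running; exiting."
  exit 0
fi

# Create the lock and remove it when the script exits, even on error.
touch "$LOCKFILE"
trap 'rm -f "$LOCKFILE"' EXIT

# The actual upload step would go here, e.g. (commented out, example remote):
# rclone move /mnt/user/local/gdrive gdrive_media_vfs: --min-age 15m

echo "Upload pass complete."
```

With the guard in place, a cron entry like `*/10 * * * * /path/to/upload_script.sh` (or the unRAID User Scripts plugin on a 10-minute schedule) keeps retrying all day without stacking runs.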
  5. I actually do this with a friend of mine at the moment. It makes things so easy to have a shared cloud storage for both of us to access since we are geographically separated. We mainly use Overseerr for requests and both have access to the Starr apps to fix anything. I have one box that does all the media management and has the R/W service accounts, and he has a few RO service accounts so it doesn't get out of sync. It's been working well for years.
  6. I personally store everything on a team drive, but I don't have as much data on there as most (around 30TB); everything is still working for me and I haven't received any emails about it. I've added some additional drives to my setup and have been converting video files to H.265 and moving them locally, but I'm still using the mount to stream. I'll continue to use it as long as I can. Once it's gone, I have a friend who is interested in the unlimited Dropbox to share the cost with me. Just need one more haha!
  7. We appreciate all the hard work and effort you put into your containers!
  8. Uh oh..... It's begun.... https://forum.rclone.org/t/new-limit-unlocked-on-google-drive/36136
  9. Replace all of the IPs with the DNS servers you want to use. I currently have 1.1.1.1,1.0.0.1 in mine.
  10. You change them in the Docker settings under the NAME_SERVERS variable.
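For reference, here is roughly where that variable lands if the container were started by hand instead of through the unRAID template; the container and image names below are placeholders, not the actual ones:

```shell
# Illustrative only: "vpn-app" and "example/vpn-image" are placeholders.
# In the unRAID UI you would edit the NAME_SERVERS field on the container's
# settings page instead of running this yourself.
docker run -d \
  --name vpn-app \
  -e NAME_SERVERS="1.1.1.1,1.0.0.1" \
  example/vpn-image
```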
  11. Ever find a solution? Trying to play with my son after taking a long break and getting the same error. Just curious if you had found a fix.
  12. Do you have MergerFS set up? You should be able to see the rclone mount there and all the files unencrypted.
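If mergerfs isn't set up yet, a typical pool for this kind of setup looks something like the following; the branch paths, mount point, and option set are common examples from unRAID rclone setups, not necessarily what this user's script uses:

```shell
# Sketch: pool a local directory with the rclone mount into one merged view.
# Writes land on the first branch (category.create=ff); reads see both.
mergerfs /mnt/user/local/gdrive:/mnt/user/mount_rclone/gdrive \
  /mnt/user/mount_mergerfs/gdrive \
  -o rw,use_ino,allow_other,func.getattr=newest,category.action=all,category.create=ff
```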
  13. First, please delete the file attached to your message. Your username and password are throughout the file. I was having issues as well, and updating the DNS fixed it for me. Try updating your nameservers to Cloudflare (1.1.1.1,1.0.0.1) and see if you still have the issue.
  14. You don't have to do the sample part of that website, just enable the API. You should be able to continue on.
  15. Trying to circle back on what I've missed. Have you gotten it to work? You might try running the "Docker Safe New Perms" located under the "Tools" tab and see if that helps at all. You might want to give it a restart as well. If that still doesn't work, we can try to look at your SMB settings.
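For context, "Docker Safe New Perms" essentially reapplies unRAID's standard share permissions to Docker-related shares. A rough manual equivalent for a single share would look something like this; the path is an example, and on a live system the Tools-tab button is the safer route:

```shell
# Roughly what unRAID's newperms pass does for one share (illustrative path):
# owner nobody:users, then mirror the user bits onto group/other.
SHARE="/mnt/user/mount_mergerfs"
chown -R nobody:users "$SHARE"
chmod -R u-x,go-rwx,go+u,ugo+X "$SHARE"
```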
  16. Sorry for the late reply, I've been away. Have you gotten it to work yet? I noticed you're on 96.9.2 instead of 96.9.3, which fixed some permission issues with the umask. Update your script to the latest version and let us know if you're still having issues.
  17. Can you post your mount script and file permissions of your directories?
  18. I haven't been having issues with that. It's been running pretty well actually. Maybe add a log file and change the log level to debug to see if there is any indication of what's going on when it happens? It could also be some issue with mergerfs. Try adding these flags to the mount: --log-level DEBUG --log-file=/var/log/rclone
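Spliced into a mount command, those two flags would sit alongside the existing options like this; the remote name and mount path are examples, not the poster's actual script:

```shell
# Example rclone mount with debug logging added (remote/paths are examples).
# --log-level DEBUG is verbose; drop back to INFO once the issue is found.
rclone mount \
  --allow-other \
  --log-level DEBUG \
  --log-file=/var/log/rclone \
  gdrive_media_vfs: /mnt/user/mount_rclone/gdrive_media_vfs
```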
  19. Can you temporarily make it public and see if you're able to edit the files? Want to make sure it is actually inheriting those permissions.
  20. Are you accessing your content through the "mount_rclone" folder? Is it shown under the unRAID shares tab? If so, what are the share settings for the folder?
  21. So far, I haven’t had any issues. I actually thought it was loading some content faster, but that’s just my opinion with what I’ve tested. No proof in that statement.
  22. Wow! That's taking advantage of the service haha! I hope the unlimited sticks. If they switch to 5TB for every user, it's going to kind of suck. May have to look at establishing a library and sharing the encryption keys amongst people who pay to have access. I have something like that set up for a remote server at the moment.
  23. It also says to contact Google Support every 90 days to request more storage.
  24. Did you notice they updated the services summary to 5TB for 5 or more users instead of unlimited? I feel like restrictions will be coming in the future...
  25. How long is the loading time for your movie? Is the file over 30GB? Could you post your mount script?