
Stupifier


Posts posted by Stupifier

  1. On 5/28/2023 at 10:26 PM, Matthew_K said:

    Finally, I believe it would be more user-friendly if the Docker Compose interface opened in a new window or tab. Currently, when opened, it expands downward, requiring users to scroll down the page to see what was deployed.

    This! It would be excellent to have the Docker Compose interface be more easily accessible. I hate going to the Docker page, waiting forever for all my various non-compose containers to load, and then scrolling all the way to the bottom to do Compose stuff.

    Something, anything more streamlined would be appreciated.

    Good work on this plugin overall though, I love it.

  2. 1 minute ago, tmchow said:

    Super basic question that I couldn't find an answer to… since this is installed via a plugin, is there any issue with rebooting the system and losing config? Or does the rclone config we add survive reboots?

    The config will survive a reboot.

  3. How do I run any CA User Scripts script from within a terminal session?

     I know copies of scripts are held and run from "/tmp/user.scripts/tmpScripts"... but when I go there via terminal, I don't see ALL of my scripts listed (only a subset). It appears the only scripts listed there are ones which have been previously executed through the User Scripts GUI.

     I know scripts are also located in "/boot/config/plugins/user.scripts/scripts", but those are all buttoned up with permissions restricting execution (probably for good reason too, since it's the flash drive).
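     For what it's worth, the copies on the flash drive can still be executed if you invoke them through bash, since bash only needs read access to the file. A minimal sketch of a helper (my own, not part of the plugin; it assumes the plugin's usual one-folder-per-script layout with the body stored in a file named "script"):

```shell
# Hypothetical helper, not part of the user.scripts plugin: run a script
# by its folder name straight from the flash-drive copy. Invoking it
# through bash sidesteps the missing execute bit on /boot.
SCRIPTS_DIR="${SCRIPTS_DIR:-/boot/config/plugins/user.scripts/scripts}"

run_user_script() {
    local name="$1"
    local script="$SCRIPTS_DIR/$name/script"   # plugin stores each script as <name>/script
    if [ ! -f "$script" ]; then
        echo "no such user script: $name" >&2
        return 1
    fi
    bash "$script"   # bash reads the file, so flash-drive permissions don't block it
}
```

     Then `run_user_script my_backup` (hypothetical name) would run /boot/config/plugins/user.scripts/scripts/my_backup/script.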

  4. On 3/28/2021 at 6:02 AM, RinxKninks said:

    Don't know if your question has been answered, but https://rdiff-backup.net/ states:

    ..."In August 2019 Eric Lavarde with the support of Otto Kekäläinen from Seravo and Patrik Dufresne from Minarca took over, completed the Python 3 rewrite and finally released rdiff-backup 2.0 in March 2020."

    Ok... not sure how that helps me install rdiff-backup on Unraid, though. Are you suggesting a different command?

  5. 35 minutes ago, learningunraid said:

    If only I knew how to do that! Because Sonarr/Radarr run automatically.

    Unraid GUI --> Docker tab --> uncheck Autostart next to Sonarr/Radarr. Now they do not run automatically.

  6. FWIW,

     I'm running version 2020.09.29 of the rclone plugin with absolutely no issues on 6.9-RC2, and have been for months. I don't know if this is the most recent version of the plugin, but it doesn't really matter: you can update rclone from within the plugin, so this is great!

     Also worth mentioning: I don't recall exactly if this is what I did, but I believe I had this version of the rclone plugin installed prior to updating to any 6.9 release, then upgraded to 6.9. Again, no issues.

  7. 1 hour ago, CS01-HS said:

    For what it's worth I've been running 6.9-RC2 since it was released and although the latest version of the plugin prevents me from installing it, the version prior has worked consistently in my weekly runs.

    Same.

  8. 1 hour ago, JNCK said:


    I’m trying to copy files from my Unraid server to Google Drive using a VPN without having to route all the traffic of my Unraid box through a VPN.


    Sent from my iPhone using Tapatalk

    No idea how to do that, but seriously: just upload the files to Google Drive without the VPN. Google truly does NOT care. This isn't like seeding torrents; you don't need a VPN in this instance.

  9. 9 hours ago, questionbot said:

    Hi... this thread is 32 pages long, so I'm not sure if this has been discussed (I'm sure it has), but I couldn't find it.

     I watched SpaceInvaderOne's rclone tutorial. In it he tells us to boot into GUI mode, but my server is completely headless and I am not able to put a monitor and keyboard on it without a ton of effort.

     I was hoping there is a way to install rclone without GUI mode. He does mention briefly that there is, but does not actually show how to do it.

     TL;DR: How do I install and set up the plugin without GUI mode turned on?

     

    Type "rclone config" in a terminal session and follow the prompts. When you reach a prompt asking if you are headless, say YES and keep following along. It'll ask you to go to a link, authorize your account, and paste a code back into the terminal. That's all.

     

    And FWIW, this is the exact same procedure for any headless device. Unraid is not special, so you can Google any setup tutorial for help with this. I'm not exactly sure why SpaceInvaderOne had all the GUI-mode discussion in his video, but that video is very old.
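    From memory, the headless part of the config dialog looks roughly like this (prompts paraphrased, not verbatim; exact wording varies by rclone version):

```text
$ rclone config
n) New remote
name> gdrive
Storage> drive
...
Use auto config?  (answer No on a headless box)
y/n> n
(rclone prints a URL: open it on any machine that has a browser,
 sign in to your Google account, and copy the code it gives you)
Enter verification code> <paste the code here>
```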

  10. 18 minutes ago, willm said:

    Thanks for the suggestion Stupifier. I didn't realize the download limit was 10TB -- I had been thinking about the service account approach, but actually I think with the 10TB limit, it's not needed. I could fully saturate my gigabit down connection and not hit 10TB a day.

     

    I've started a typical rclone sync from my decrypted mount to a new directory. It seems to be running fine ATM, weeks to go :)

    50 TB of total data. I'd do your transfer in chunks, and I'd also do it in a tmux/screen terminal session to avoid disconnects.

  11. 11 minutes ago, willm said:

    Hey,

     

    I've been using rclone for a while on Google Drive (Encrypted), and have now decided to take it all locally -- ~50TB in total.

     

    What's the right way to think about copying all of this locally? I know the 750GB/day limit will make this take forever, so thinking about the correct way to set up rclone to copy data from a remote to local.

     

    Presumably the plan would need to handle internet downtime / happen automatically, so I guess that I'd want the reverse of rclone/mergerfs.

     

     

    So the setup seems like it'd be:

     

    1. stop uploading data to google drive

    2. long-running rclone ? script to copy to a local directory

     

    My config looks like

     

    [gdrive]
    type = drive
    client_id = x
    client_secret = x
    scope = drive
    root_folder_id =
    service_account_file =
    token = x

    [gcrypt]
    type = crypt
    remote = gdrive:/encrypt
    filename_encryption = standard
    directory_name_encryption = true
    password = x
    password2 = x

     

    and my unraid script is https://pastebin.com/uKWECVSx

    1. 750 GB/day is the limit for UPLOADS to Google. It's about 10 TB/day for DOWNLOADS from Google.
    2. You can work around these limits with Google Service Accounts (no, I won't explain how to set that up).
    3. There are scripts on GitHub, such as sasync, to help you download to local. These scripts can perform service-account cycling so you can download/upload a lot per day. Search GitHub to find them.
  12. 13 minutes ago, Hoopster said:

    Why wait for testdasi to integrate UUD json? You can download the json and import it yourself as a dashboard into Grafana in GUS.  That's how we all did it before testdasi integrated UUD in as a dashboard option.

    Thought there was more to it than that... isn't there? I mean, shit, there is even an additional container (Varken). I'm sure it's more complicated than simply dumping a JSON.

  13. On 12/7/2020 at 2:27 PM, Stupifier said:

    How do I run any CA User Scripts script from within a terminal session?

    1. I know copies of scripts are held and run from "/tmp/user.scripts/tmpScripts"... but when I go there via terminal, I don't see ALL of my scripts listed (only a subset). It appears the only scripts listed there are ones which have been previously executed through the User Scripts GUI.
    2. I know scripts are also located in "/boot/config/plugins/user.scripts/scripts", but those are all buttoned up with permissions restricting execution (probably for good reason too, since it's the flash drive).

    Any help at all regarding this? Sometimes it's just plain easier for me to run stuff from a terminal instead of the Unraid GUI, and I'd rather not clone all of my user scripts to another location just so I can run them via terminal. But if that's how it's gotta be... ok.

  14. 1 hour ago, Archemedees said:

    Hey everyone, I am looking to set up something similar to what I have seen used for Plex servers etc., however I only want to offload data to my Google Drive. That said, I don't even need access to the data on my server once it's uploaded, nor do I want it encrypted. Does anyone have a guide or can point me in the right direction?

     I.e., a share on my server called "gdrive". Whenever I drop a file in said share, it uploads it to my Google Drive with no encryption, then deletes it from the "gdrive" share.

    Research a GitHub project called cloudplow. It does exactly what you want. I will NOT walk you through setting it up; review the readme carefully.

     

    You MAY have to use a Docker version of cloudplow if Unraid has issues installing it the standard way.

  15. 7 hours ago, livingonline8 said:

    Yes, I have been using unionfs all those years... did anything change with that?

     

    This is why I was saying everything was working perfectly fine. 

     

     

    Btw, Now that I am planning to download everything I have uploaded to g suite back to my server. 

     

    Is there a download limit like the upload limit?

     

    I have about 40 TB of data on G Suite and I wanna download it back. Any recommended flags to limit the download speed and not hit a daily download limit? Also, I don't want the download to suck up all my bandwidth; my download speed is about 200 M.

    Ok. I did not know you were using unionfs. Provide your unionfs mount command; this matters in solving your issue.

    Use this command to copy stuff from an rclone remote to your local array storage:

    rclone copy "remote:some_directory" "/mnt/user/some_share/" --bwlimit 10M -v -P


    That will limit the download speed to 10 MB/second. You can download about 10 terabytes per day, I think; the download limit is higher than the upload limit. You should also use the --dry-run and -v flags. The --dry-run flag is for testing your command (nothing actually transfers), the -v flag outputs more information in the terminal window so you can see what is happening, and the -P flag shows live progress as things happen.

    Again, I HIGHLY recommend you look over all the rclone flags available at https://rclone.org/flags/
    I found the --bwlimit flag on that website. Trying to help you learn instead of just answering the same questions over and over.
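    As a sketch of that test-first habit (my own wrapper, not a standard tool; the remote and share paths are the hypothetical ones from the command above), you can make --dry-run the default so a real transfer is always an explicit choice:

```shell
# Sketch of a safety wrapper: dry run by default, real copy only when
# explicitly asked. RCLONE can be overridden (e.g. to echo, for testing).
RCLONE="${RCLONE:-rclone}"

gdrive_pull() {
    local src="$1" dst="$2" mode="${3:-dry}"
    local flags=(--bwlimit 10M -v -P)
    if [ "$mode" != "real" ]; then
        flags+=(--dry-run)   # nothing actually transfers
    fi
    "$RCLONE" copy "$src" "$dst" "${flags[@]}"
}

# gdrive_pull "remote:some_directory" "/mnt/user/some_share/"        # dry run first
# gdrive_pull "remote:some_directory" "/mnt/user/some_share/" real   # then the real copy
```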

  16. 8 hours ago, causefx said:

    I will try and help where I can but we're not entirely sure where the issue really is.  

     

    I'm not running unraid ATM but my ubuntu server is on kernel 4.4.0-190-generic - I'm running 6 instances of Organizr thru docker fine currently.  

     

    Currently the only difference we know of is the kernel... I'm open to anything I can do on my end.

    Thank you so much, causefx. I know you always help as best you are able. Really appreciate it.

    Please let me know if I can provide ANYTHING to help you at all.
