Leaderboard

Popular Content

Showing content with the highest reputation on 12/15/18 in all areas

  1. I just made a very useful change to my scripts that has solved my problem with the limit of only being able to upload 750GB/day, which was creating bottlenecks on my local server because I couldn't upload fast enough to keep up with new pending content. I've added a Team Drive remote to my setup, which lets me upload another 750GB/day on top of the 750GB/day to my existing remote. This works because the 750GB/day limit is per account: by sharing the Team Drive created by my Google Apps account with a second Google account, I can upload more. In theory I could repeat this for n extra accounts (each would need a separate token/team drive remote), but one extra is enough for me. Steps (illustrative sketches of the key pieces follow at the end of this list):
     1. Create a new Team Drive with the main Google Apps account.
     2. Share it with the 2nd Google account.
     3. Create the new Team Drive remotes (see first post) - remember to get the token from the account in #2, not the account in #1, otherwise you won't get the 2nd upload quota.
     4. Amend the mount script (see first post) to mount the new tdrive, and change the unionfs mount from a 2-way union to a 3-way union that includes the tdrive.
     5. Create a new upload script that uploads to the tdrive - my first upload script moves files from the array, and the 2nd moves them from the cache. Another way to 'load-balance' the uploads could be to run one script against disks 1-3 and the other against disks 4-x.
     6. Add a tdrive line to the cleanup script.
     7. Add a tdrive line to the unmount script.
     8. Optional: repeat if you need more upload capacity, e.g. change the 3-way union to a 4-way union.
    2 points
  2. OK - I checked that the 3rd user is working properly by creating a new remote using this user's token and then mounting it - it decrypted the existing folders and files correctly, so the password/encryption sync worked 🙂 🙂 (see the last sketch after this list)
    1 point
  3. Those settings will work. You can always enable "Log when not moving due to rules" to see what it's doing every 4 hours.
    1 point
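
A few illustrative sketches for the steps in item 1 above. First, a minimal sketch of what the extra remotes from step 3 could look like in rclone.conf, assuming the existing setup is a Google Drive remote with a crypt remote layered on top. The names tdrive/tdrive_crypt, the crypt folder path, and the placeholders are illustrative, not the exact config from the first post:

```
# Drive remote pointing at the shared Team Drive. The token must come
# from the 2nd account, otherwise you don't get the 2nd upload quota.
[tdrive]
type = drive
scope = drive
team_drive = <ID of the shared Team Drive>
token = <OAuth token JSON obtained while logged in as the 2nd account>

# Crypt remote layered on top. password/password2 must match the
# existing crypt remote so both remotes decrypt the same files.
[tdrive_crypt]
type = crypt
remote = tdrive:crypt
filename_encryption = standard
directory_name_encryption = true
password = <same obscured password as the existing crypt remote>
password2 = <same obscured salt as the existing crypt remote>
```

Running `rclone config` in a session where the browser is logged in as the 2nd account is the easiest way to make sure the token belongs to the right user.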
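
For step 4, a sketch of the mount-script change, assuming unionfs-fuse and illustrative unRAID-style paths (the real paths and flags are in the first post):

```
#!/bin/bash
# Mount both cloud remotes, then union them with the local upload area.
mkdir -p /mnt/user/mount_rclone/gdrive /mnt/user/mount_rclone/tdrive /mnt/user/mount_unionfs

rclone mount --allow-other gdrive_crypt: /mnt/user/mount_rclone/gdrive &
rclone mount --allow-other tdrive_crypt: /mnt/user/mount_rclone/tdrive &
sleep 5   # give the mounts a moment to come up

# The old 2-way union becomes 3-way: local stays the only writable
# branch, and both cloud remotes are read-only branches.
unionfs -o cow,allow_other \
    /mnt/user/rclone_upload=RW:/mnt/user/mount_rclone/gdrive=RO:/mnt/user/mount_rclone/tdrive=RO \
    /mnt/user/mount_unionfs
```

The matching unmount-script line for step 7 would be a lazy unmount of the new mount point, e.g. `fusermount -uz /mnt/user/mount_rclone/tdrive`.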
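
For step 5, a sketch of the second upload script, assuming the first script already does the same thing against the existing remote; the source path, remote name, and log location are illustrative:

```
#!/bin/bash
# Move pending files from the cache to the Team Drive remote.
# --min-age skips files that may still be being written.
rclone move /mnt/cache/rclone_upload tdrive_crypt: \
    --min-age 15m \
    --delete-empty-src-dirs \
    -v --log-file /mnt/user/appdata/rclone/tdrive_upload.log
```

To 'load-balance' the other way described in the post, the two scripts would simply point at different source paths - one at the disk 1-3 shares, the other at disks 4 and up.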
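
And for the check in item 2, a quick sketch of verifying that a remote built from another user's token decrypts the existing data; the remote name tdrive_crypt3 and the mount point are hypothetical:

```
#!/bin/bash
# Mount the crypt remote that uses the 3rd user's token, read-only,
# then list it: readable folder/file names mean password/password2
# match the original crypt remote.
mkdir -p /mnt/check
rclone mount --read-only --allow-other tdrive_crypt3: /mnt/check &
sleep 5
ls /mnt/check
```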