axeman Posted December 21, 2020
Strangely - my array went down after an upload finished. Tried to run the upload script again and am getting this now:
touch: cannot touch '/mnt/user/appdata/other/rclone/remotes/cloud_Videos/upload_running_daily_upload': Transport endpoint is not connected
DZMM Posted December 21, 2020 (Author)
22 minutes ago, axeman said: Strangely - my array went down after an upload finished. Tried to run the upload script again and am getting this now:
Probably the same weird problem.
axeman Posted December 21, 2020
4 minutes ago, DZMM said: Probably the same weird problem.
Hmm, the only thing I did before it happened was move some files around. My local share had a /videos/TV_Classics and a /videos/cloud_Videos/TV_Classics. I moved the entire folder (using a Windows machine) from /local/Videos/ to /local/cloud_Videos. That was basically the last thing I did (re-parent the directory). A quick reboot fixed it - just curious if the reparenting did it.
francrouge Posted December 23, 2020
Hi all. Quick question: does the upload script stop when it reaches the 750GB limit and wait 24h? thx
BRiT Posted December 23, 2020
1 minute ago, francrouge said: Hi all. Quick question: does the upload script stop when it reaches the 750GB limit and wait 24h? thx
FWIU, you should be using Service Accounts so you have no upload quota.
francrouge Posted December 23, 2020
12 minutes ago, BRiT said: FWIU, you should be using Service Accounts so you have no upload quota.
What does that mean? I've got a Drive API key, but that's it. Thx
privateer Posted December 25, 2020
On 12/23/2020 at 10:53 AM, francrouge said: What does that mean? I've got a Drive API key, but that's it. Thx
Are you using the team drives with service accounts?
francrouge Posted December 25, 2020
Are you using the team drives with service accounts?
No team drive, just the Drive API. Thx
privateer Posted December 26, 2020
Post a copy of your script.
francrouge Posted December 26, 2020
12 hours ago, privateer said: Post a copy of your script.

#!/bin/bash

######################
### Upload Script ####
######################
### Version 0.95.5 ###
######################

####### EDIT ONLY THESE SETTINGS #######

# INSTRUCTIONS
# 1. Edit the settings below to match your setup
# 2. NOTE: enter RcloneRemoteName WITHOUT ':'
# 3. Optional: Add additional commands or filters
# 4. Optional: Use bind mount settings for potential traffic shaping/monitoring
# 5. Optional: Use service accounts in your upload remote
# 6. Optional: Use backup directory for rclone sync jobs

# REQUIRED SETTINGS
RcloneCommand="move" # choose your rclone command e.g. move, copy, sync
RcloneRemoteName="gdrive_media_vfs" # Name of rclone remote mount WITHOUT ':'.
RcloneUploadRemoteName="gdrive_media_vfs" # If you have a second remote created for uploads put it here. Otherwise use the same remote as RcloneRemoteName.
LocalFilesShare="/mnt/user/mount_rclone_upload" # location of the local files, without trailing slash, that you want rclone to use
RcloneMountShare="/mnt/user/mount_rclone" # where your rclone mount is located without trailing slash e.g. /mnt/user/mount_rclone
MinimumAge="15m" # sync files suffix ms|s|m|h|d|w|M|y
ModSort="ascending" # "ascending" oldest files first, "descending" newest files first

# Note: Again - remember to NOT use ':' in your remote name above

# Bandwidth limits: specify the desired bandwidth in kBytes/s, or use a suffix b|k|M|G. Or 'off' or '0' for unlimited. The script uses --drive-stop-on-upload-limit which stops the script if the 750GB/day limit is reached, so you no longer have to slowly 'trickle' your files all day if you don't want to, e.g. you could just do an unlimited job overnight.
BWLimit1Time="01:00"
BWLimit1="off"
BWLimit2Time="08:00"
BWLimit2="15M"
BWLimit3Time="16:00"
BWLimit3="12M"

# OPTIONAL SETTINGS

# Add name to upload job
JobName="_daily_upload" # Adds custom string to end of checker file. Useful if you're running multiple jobs against the same remote.

# Add extra commands or filters
Command1="--exclude downloads/**"
Command2=""
Command3=""
Command4=""
Command5=""
Command6=""
Command7=""
Command8=""

# Bind the mount to an IP address
CreateBindMount="N" # Y/N. Choose whether or not to bind traffic to a network adapter.
RCloneMountIP="192.168.1.253" # Choose IP to bind upload to.
NetworkAdapter="eth0" # choose your network adapter. eth0 recommended.
VirtualIPNumber="1" # creates eth0:x e.g. eth0:1.

# Use Service Accounts. Instructions: https://github.com/xyou365/AutoRclone
UseServiceAccountUpload="N" # Y/N. Choose whether to use Service Accounts.
ServiceAccountDirectory="/mnt/user/appdata/other/rclone/service_accounts" # Path to your Service Account's .json files.
ServiceAccountFile="sa_gdrive_upload" # Enter characters before counter in your json files e.g. for sa_gdrive_upload1.json --> sa_gdrive_upload100.json, enter "sa_gdrive_upload".
CountServiceAccounts="15" # Integer number of service accounts to use.

# Is this a backup job
BackupJob="N" # Y/N. Syncs or copies files from LocalFilesLocation to BackupRemoteLocation, rather than moving from LocalFilesLocation/RcloneRemoteName
BackupRemoteLocation="backup" # choose location on mount for backup files
BackupRemoteDeletedLocation="backup_deleted" # choose location on mount for deleted sync files
BackupRetention="90d" # How long to keep deleted sync files suffix ms|s|m|h|d|w|M|y

####### END SETTINGS #######

###############################################################################
#####   DO NOT EDIT BELOW THIS LINE UNLESS YOU KNOW WHAT YOU ARE DOING    #####
###############################################################################

####### Preparing mount location variables #######
if [[ $BackupJob == 'Y' ]]; then
	LocalFilesLocation="$LocalFilesShare"
	echo "$(date "+%d.%m.%Y %T") INFO: *** Backup selected. Files will be copied or synced from ${LocalFilesLocation} for ${RcloneUploadRemoteName} ***"
else
	LocalFilesLocation="$LocalFilesShare/$RcloneRemoteName"
	echo "$(date "+%d.%m.%Y %T") INFO: *** Rclone move selected. Files will be moved from ${LocalFilesLocation} for ${RcloneUploadRemoteName} ***"
fi

RcloneMountLocation="$RcloneMountShare/$RcloneRemoteName" # Location of rclone mount

####### create directory for script files #######
mkdir -p /mnt/user/appdata/other/rclone/remotes/$RcloneUploadRemoteName # for script files

####### Check if script already running ##########
echo "$(date "+%d.%m.%Y %T") INFO: *** Starting rclone_upload script for ${RcloneUploadRemoteName} ***"
if [[ -f "/mnt/user/appdata/other/rclone/remotes/$RcloneUploadRemoteName/upload_running$JobName" ]]; then
	echo "$(date "+%d.%m.%Y %T") INFO: Exiting as script already running."
	exit
else
	echo "$(date "+%d.%m.%Y %T") INFO: Script not running - proceeding."
	touch /mnt/user/appdata/other/rclone/remotes/$RcloneUploadRemoteName/upload_running$JobName
fi

####### check if rclone installed ##########
echo "$(date "+%d.%m.%Y %T") INFO: Checking if rclone installed successfully."
if [[ -f "$RcloneMountLocation/mountcheck" ]]; then
	echo "$(date "+%d.%m.%Y %T") INFO: rclone installed successfully - proceeding with upload."
else
	echo "$(date "+%d.%m.%Y %T") INFO: rclone not installed - will try again later."
	rm /mnt/user/appdata/other/rclone/remotes/$RcloneUploadRemoteName/upload_running$JobName
	exit
fi

####### Rotating serviceaccount.json file if using Service Accounts #######
if [[ $UseServiceAccountUpload == 'Y' ]]; then
	cd /mnt/user/appdata/other/rclone/remotes/$RcloneUploadRemoteName/
	CounterNumber=$(find -name 'counter*' | cut -c 11,12)
	CounterCheck="1"
	if [[ "$CounterNumber" -ge "$CounterCheck" ]]; then
		echo "$(date "+%d.%m.%Y %T") INFO: Counter file found for ${RcloneUploadRemoteName}."
	else
		echo "$(date "+%d.%m.%Y %T") INFO: No counter file found for ${RcloneUploadRemoteName}. Creating counter_1."
		touch /mnt/user/appdata/other/rclone/remotes/$RcloneUploadRemoteName/counter_1
		CounterNumber="1"
	fi
	ServiceAccount="--drive-service-account-file=$ServiceAccountDirectory/$ServiceAccountFile$CounterNumber.json"
	echo "$(date "+%d.%m.%Y %T") INFO: Adjusted service_account_file for upload remote ${RcloneUploadRemoteName} to ${ServiceAccountFile}${CounterNumber}.json based on counter ${CounterNumber}."
else
	echo "$(date "+%d.%m.%Y %T") INFO: Uploading using upload remote ${RcloneUploadRemoteName}"
	ServiceAccount=""
fi

####### Upload files ##########

# Check bind option
if [[ $CreateBindMount == 'Y' ]]; then
	echo "$(date "+%d.%m.%Y %T") INFO: *** Checking if IP address ${RCloneMountIP} already created for upload to remote ${RcloneUploadRemoteName}"
	ping -q -c2 $RCloneMountIP > /dev/null # -q quiet, -c number of pings to perform
	if [ $? -eq 0 ]; then # ping returns exit status 0 if successful
		echo "$(date "+%d.%m.%Y %T") INFO: *** IP address ${RCloneMountIP} already created for upload to remote ${RcloneUploadRemoteName}"
	else
		echo "$(date "+%d.%m.%Y %T") INFO: *** Creating IP address ${RCloneMountIP} for upload to remote ${RcloneUploadRemoteName}"
		ip addr add $RCloneMountIP/24 dev $NetworkAdapter label $NetworkAdapter:$VirtualIPNumber
	fi
else
	RCloneMountIP=""
fi

# Remove --delete-empty-src-dirs if rclone sync or copy
if [[ $RcloneCommand == 'move' ]]; then
	echo "$(date "+%d.%m.%Y %T") INFO: *** Using rclone move - will add --delete-empty-src-dirs to upload."
	DeleteEmpty="--delete-empty-src-dirs "
else
	echo "$(date "+%d.%m.%Y %T") INFO: *** Not using rclone move - will remove --delete-empty-src-dirs from upload."
	DeleteEmpty=""
fi

# Check --backup-directory
if [[ $BackupJob == 'Y' ]]; then
	echo "$(date "+%d.%m.%Y %T") INFO: *** Will backup to ${BackupRemoteLocation} and use ${BackupRemoteDeletedLocation} as --backup-directory with ${BackupRetention} retention for ${RcloneUploadRemoteName}."
	LocalFilesLocation="$LocalFilesShare"
	BackupDir="--backup-dir $RcloneUploadRemoteName:$BackupRemoteDeletedLocation"
else
	BackupRemoteLocation=""
	BackupRemoteDeletedLocation=""
	BackupRetention=""
	BackupDir=""
fi

# process files
rclone $RcloneCommand $LocalFilesLocation $RcloneUploadRemoteName:$BackupRemoteLocation $ServiceAccount $BackupDir \
	--user-agent="$RcloneUploadRemoteName" \
	-vv \
	--buffer-size 512M \
	--drive-chunk-size 512M \
	--tpslimit 8 \
	--checkers 8 \
	--transfers 4 \
	--order-by modtime,$ModSort \
	--min-age $MinimumAge \
	$Command1 $Command2 $Command3 $Command4 $Command5 $Command6 $Command7 $Command8 \
	--exclude *fuse_hidden* \
	--exclude *_HIDDEN \
	--exclude .recycle** \
	--exclude .Recycle.Bin/** \
	--exclude *.backup~* \
	--exclude *.partial~* \
	--drive-stop-on-upload-limit \
	--bwlimit "${BWLimit1Time},${BWLimit1} ${BWLimit2Time},${BWLimit2} ${BWLimit3Time},${BWLimit3}" \
	--bind=$RCloneMountIP $DeleteEmpty

# Delete old files from mount
if [[ $BackupJob == 'Y' ]]; then
	echo "$(date "+%d.%m.%Y %T") INFO: *** Removing files older than ${BackupRetention} from $BackupRemoteLocation for ${RcloneUploadRemoteName}."
	rclone delete --min-age $BackupRetention $RcloneUploadRemoteName:$BackupRemoteDeletedLocation
fi

####### Remove Control Files ##########

# update counter and remove other control files
if [[ $UseServiceAccountUpload == 'Y' ]]; then
	if [[ "$CounterNumber" == "$CountServiceAccounts" ]]; then
		rm /mnt/user/appdata/other/rclone/remotes/$RcloneUploadRemoteName/counter_*
		touch /mnt/user/appdata/other/rclone/remotes/$RcloneUploadRemoteName/counter_1
		echo "$(date "+%d.%m.%Y %T") INFO: Final counter used - resetting loop and created counter_1."
	else
		rm /mnt/user/appdata/other/rclone/remotes/$RcloneUploadRemoteName/counter_*
		CounterNumber=$((CounterNumber+1))
		touch /mnt/user/appdata/other/rclone/remotes/$RcloneUploadRemoteName/counter_$CounterNumber
		echo "$(date "+%d.%m.%Y %T") INFO: Created counter_${CounterNumber} for next upload run."
	fi
else
	echo "$(date "+%d.%m.%Y %T") INFO: Not utilising service accounts."
fi

# remove dummy file
rm /mnt/user/appdata/other/rclone/remotes/$RcloneUploadRemoteName/upload_running$JobName
echo "$(date "+%d.%m.%Y %T") INFO: Script complete"
exit

thx
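As an aside for anyone reading the script above: the service-account rotation is just a counter stored in a file name. Here is a stripped-down, runnable sketch of that logic - the scratch directory and the account count of 3 are hypothetical, stand-ins for the real /mnt/user/appdata path and CountServiceAccounts setting:

```shell
#!/bin/bash
# Minimal, self-contained sketch of the script's counter rotation.
# Uses a throwaway scratch directory instead of the real
# /mnt/user/appdata/other/rclone/remotes/... path (hypothetical for this demo).
set -e
WorkDir=$(mktemp -d)
CountServiceAccounts=3            # pretend 3 service-account .json files exist

touch "$WorkDir/counter_1"        # state left behind by the previous run

# Recover the current counter from the file name, e.g. counter_2 -> 2
CounterNumber=$(basename "$WorkDir"/counter_* | cut -d'_' -f2)

# Rotate: wrap back to 1 after the final account, otherwise advance by one
rm "$WorkDir"/counter_*
if [[ "$CounterNumber" == "$CountServiceAccounts" ]]; then
    CounterNumber=1
else
    CounterNumber=$((CounterNumber + 1))
fi
touch "$WorkDir/counter_$CounterNumber"

ls "$WorkDir"                     # prints: counter_2
```

On each run the real script picks sa_gdrive_uploadN.json based on the counter it finds, then bumps the counter for the next run, so every upload job authenticates as a different service account and each account stays under its own 750GB/day quota.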
privateer Posted December 26, 2020
7 hours ago, francrouge said:
# Bandwidth limits: specify the desired bandwidth in kBytes/s, or use a suffix b|k|M|G. Or 'off' or '0' for unlimited. The script uses --drive-stop-on-upload-limit which stops the script if the 750GB/day limit is achieved, so you no longer have to slow 'trickle' your files all day if you don't want to e.g. could just do an unlimited job overnight.
BWLimit1Time="01:00"
BWLimit1="off"
BWLimit2Time="08:00"
BWLimit2="15M"
BWLimit3Time="16:00"
BWLimit3="12M"
This script throttles your upload at 15MB/s from 0800 to 1600 and 12MB/s from 1600 to 0100. Other than that it is unlimited bandwidth. I don't see anything in your script to cut it off at 750GB.
EDIT: There's a line I missed that stops it at 750GB already in there. Your script is good - no need to change the BWlimits.
francrouge Posted December 26, 2020
This script throttles your upload at 15MB/s from 0800 to 1600 and 12MB/s from 1600 to 0100. Other than that it is unlimited bandwidth. I don't see anything in your script to cut it off at 750GB.
Cool, but since I don't have Team Drives, do I need a service account? Thx
privateer Posted December 26, 2020
1 minute ago, francrouge said: Cool, but since I don't have Team Drives, do I need a service account? Thx
What are you trying to accomplish? Just shutting your script off at 750GB/day?
francrouge Posted December 26, 2020
What are you trying to accomplish? Just shutting your script off at 750GB/day?
For the moment, yes. Thx
privateer Posted December 26, 2020
5 hours ago, francrouge said: For the moment, yes. Thx
One way to do this is to change your BWlimits from unlimited, 12M, and 15M to 9500K for all 3. This will keep your upload running all day but will limit the total bandwidth to keep your upload under 9500K. You can also adjust the timing with higher and lower speeds, but remember that if the script is still running when it crosses from one BWlimit to another, it doesn't recheck until the script runs again. This means that shifting across times like this can take you past 750GB even with different limits, if you have a lot of data to upload.
EDIT: There's a line I missed that stops it at 750GB already in there. Your script is good - no need to change the BWlimits.
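As a rough sanity check on the flat-cap idea (my own arithmetic, not taken from the posts above): a constant rate that moves exactly 750 GB in 24 hours works out to just over 9100 KB/s.

```shell
#!/bin/bash
# Flat bandwidth cap (in KB/s) that transfers exactly 750 GB in 24 hours.
QuotaKB=$((750 * 1024 * 1024))   # 750 GB expressed in KB (binary units)
SecondsPerDay=$((24 * 60 * 60))  # 86400 seconds
Cap=$((QuotaKB / SecondsPerDay)) # integer division
echo "$Cap"                      # prints: 9102
```

By that arithmetic a 9500K cap would land slightly over 750 GB if it genuinely ran flat-out for the full day; in practice the script's --drive-stop-on-upload-limit flag catches any overrun anyway, so a flat cap near 9100K is a belt-and-braces choice rather than a hard requirement.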
francrouge Posted December 26, 2020
One way to do this is to change your BWlimits from unlimited, 12M, and 15M to 9500K for all 3.
Great, thx a lot.
DZMM Posted December 26, 2020 (Author)
2 hours ago, francrouge said: --drive-stop-on-upload-limit
This stops the script when Google says the account can't upload any more, i.e. at 750GB. It resets every day. If you want to upload more than 750GB/day, you need to use Service Accounts.
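For anyone skimming the thread: the flag sits on the rclone upload command itself. A stripped-down example follows - the remote name and paths are placeholders for illustration, not values from the script above:

```shell
# Move local files to a Google Drive remote, aborting cleanly when Google
# reports the 750GB/day upload quota has been reached.
# "/mnt/user/local" and "gdrive:media" are example names only.
rclone move /mnt/user/local gdrive:media \
    --min-age 15m \
    --drive-stop-on-upload-limit \
    -vv
```

Without the flag, rclone would keep retrying every transfer against the quota error until the job's retry limits were exhausted; with it, the run exits promptly so a later scheduled run (or a different account) can pick up where it left off.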
francrouge Posted December 27, 2020
9 hours ago, privateer said: One way to do this is to change your BWlimits from unlimited, 12M, and 15M to 9500K for all 3. This will keep your upload running all day but will limit the total bandwidth to keep your upload under 9500K.
Hi again. It's weird because I often get API-banned when uploading. Do I need to remove the # from the section? thx
DZMM Posted December 27, 2020 (Author)
1 hour ago, francrouge said: Hi again. It's weird because I often get API-banned when uploading. Do I need to remove the # from the section? thx
I'm not sure what you are trying to achieve. You can only upload 750GB/day - if you hit this, that account/ID gets blocked for 24 hours. If you want to upload more, you have to use Service Accounts - there's no other way around it.
francrouge Posted December 27, 2020
I'm not sure what you are trying to achieve. You can only upload 750GB/day - if you hit this, that account/ID gets blocked for 24 hours. If you want to upload more, you have to use Service Accounts - there's no other way around it.
I would like to upload just 750GB, or a bit less, so I don't get blocked. Thx
DZMM Posted December 27, 2020 (Author)
50 minutes ago, francrouge said: I would like to upload just 750GB, or a bit less, so I don't get blocked. Thx
But it makes no difference - your upload doesn't get blocked if you upload less than 750GB; you're just lowering the cap.
francrouge Posted December 27, 2020
But it makes no difference - your upload doesn't get blocked if you upload less than 750GB; you're just lowering the cap.
Yes, but I still get API-banned, so it's not stopping, I guess.
Stinkpickle Posted December 27, 2020
I want to mount my mergerfs share on an Ubuntu Desktop machine - what is the best way to achieve this? I tried to export it as NFS and mount it that way, but it does not seem to be working. Thanks.
Bolagnaise Posted December 27, 2020
4 hours ago, francrouge said: Yes, but I still get API-banned, so it's not stopping, I guess.
If you're getting API bans then you have not created a custom client ID, as per this guide: https://rclone.org/drive/#making-your-own-client-id
DZMM Posted December 27, 2020 (Author)
5 hours ago, francrouge said: Yes, but I still get API-banned, so it's not stopping, I guess.
The script doesn't stop before 750GB - it stops when Google says it can't upload any more, rather than continuing to run until the API ban is lifted. This allows you to upload a different way, i.e. with service accounts.