JustinAiken Posted August 20, 2011
I'd like to run through my movies directory recursively and save every file that's under, say, 2MB to the cloud. That way, if I ever lose my entire server, I could rebuild my media library quickly by just re-ripping/downloading the ISOs and MKVs, and all the metadata I've so painstakingly added would be safe. Is there a way to do this? Some kind of script?
cyrnel Posted August 20, 2011
What cloud do you mean? It's easy to write something that filters by file size, but where are the files going to end up? Do you want unRAID to copy them somewhere?
JustinAiken (Author) Posted August 20, 2011
Either my Dropbox, or some other site depending on how big it ends up being... I can move it there myself, if I can just get a copy without the large files onto my local machine...
Falcon Posted August 20, 2011
CrashPlan has support for filters and works well on unRAID.
JustinAiken (Author) Posted August 20, 2011
Hmm, that looks easy enough. I'll try it when I get home from work to see if it works... Probably just run it on my desktop, treating the unRAID share as another drive, instead of trying to get it to compile ON unRAID...
cyrnel Posted August 20, 2011
It's Java, and really easy. The one slightly weird part is tunneling the control application from your desktop to the server, but it isn't hard. Once you get it going it becomes difficult to give up. Sooo nice to offload that function to an already-running unRAID with almost endless storage.
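The tunnel itself is just one ssh line... something like this (assuming the CrashPlan engine on the server listens on its default port 4243, that your server answers as "tower", and that no local engine is already using that port - adjust to your setup):

ssh -L 4243:localhost:4243 root@tower

With that up, the desktop client connects to localhost:4243 and manages the engine on the server as if it were local.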
JustinAiken (Author) Posted August 21, 2011
Hmm, looks like that's not exactly what I'm looking for - I don't see the size filters anywhere, and I don't want a constantly running process - I'd rather do it manually.
Joe L. Posted August 21, 2011
> Hmm, looks like that's not exactly what I'm looking for - I don't see the size filters anywhere, and I don't want a constantly running process - I'd rather do it manually.

find /mnt/user -type f -size -10k -exec ls -l {} \;

The "-size -NNN" test matches only files under the specified size. Google "man find linux" to read more about it. Replace the "ls -l {}" with "cp {} /mnt/user/backup" or something similar... (I don't think "cp" will create sub-folders on its own.)
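If you do want the sub-folders preserved, GNU cp's --parents flag should handle it... an untested sketch, assuming a backup share at /mnt/user/backup already exists; run it from the source root so the relative paths get rebuilt under the destination:

cd /mnt/user/Movies
find . -type f -size -2097152c -exec cp --parents {} /mnt/user/backup \;

The "-size -2097152c" is "under 2MB" expressed in bytes - find's "-2M" rounds sizes up to whole megabytes, so it would actually only match files of 1MB and under.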
cyrnel Posted August 21, 2011
Yea, I hadn't tried anything but name filters. Doesn't look like they've ever implemented size filters. I will say it's excellent software and requires minimal overhead on the server. For what you're doing I'd try something like Joe's approach: copy or list the files you want and back that up. Something like:

rsync -a --max-size=2m --prune-empty-dirs <source> <dest>

That'll put the files you want in <dest>, where you can back them up however you like. Repeat before each backup and it will only copy changes. A faster alternative might be "cp -r -s ..." so the copies are really just soft links, though you're likely to end up with loads of empty directories, if that matters.
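To make the rsync version concrete (paths are just examples - note the trailing slash on the source makes rsync copy the contents of Movies rather than the Movies directory itself):

rsync -a --max-size=2m --prune-empty-dirs /mnt/user/Movies/ /mnt/user/small-files/

Run it with -n (--dry-run) first and it'll list what would be copied without touching anything.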
JustinAiken (Author) Posted August 21, 2011
> rsync -a --max-size=2m --prune-empty-dirs <source> <dest>

Thanks, this worked perfectly, and very fast! One little terminal command just backed up hundreds of hours of careful metadata editing with EMM.