ilovejedd Posted March 10, 2011
Anyone here have any suggestions? I'd prefer free software or freeware, but I'm not averse to paying for a good commercial tool. Whenever I change computers or an HDD starts showing signs of failure, I usually copy the contents to the unRAID server. I'm pretty sure I've got duplicates of files, particularly some massive ones (mkv, vhd, iso, etc.), so I'm hoping to do some clean-up to gauge actual usage before I start buying a ton of HDDs. Who knows, I might even clear up enough space that I wouldn't need to buy HDDs for now. Thanks!
jamerson9 Posted March 10, 2011
Auslogics Duplicate File Finder. It's free.
kizer Posted March 10, 2011
I could have sworn there was a tool in unMENU for finding dupe files. Not a plugin, but an actual part of unMENU. I'm not at my machine right now, so I can't point it out.
Joe L. Posted March 10, 2011
> I could have sworn there was a tool in unMENU for finding dupe files. Not a plugin, but an actual part of unMENU.
Not the same thing at all. It lets you find multiple copies of identically named files sitting in parallel directories on multiple disks. The user shares can only make visible the copy on the lowest-numbered disk. Joe L.
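(In case it helps to picture what that tool checks, here is a rough shell sketch of the same-name test. It assumes the standard unRAID /mnt/disk1, /mnt/disk2, ... mount points and GNU find; it is an illustration, not the unMENU code.)

# List relative paths that exist on more than one data disk.
# %P prints each file's path relative to its starting point, so the
# same name in the same directory on two disks yields two identical
# lines, which uniq -d then reports.
find /mnt/disk* -type f -printf '%P\n' | sort | uniq -d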
kizer Posted March 10, 2011
Thanks for clarifying that, Joe L. I've never used it myself, but saw it as an option.
Joe L. Posted March 10, 2011
Try the script attached to this thread: http://lime-technology.com/forum/index.php?topic=7018.msg68073;topicseen#msg68073
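(The attached script isn't reproduced here, but the usual technique for tools like it is to checksum files and report matching sums. A generic sketch of that approach using GNU coreutils, not Joe L.'s actual script:)

# Checksum every file under /mnt/user (substitute another path as
# needed) and print groups of files sharing an MD5 sum, with groups
# separated by blank lines. -w32 compares only the first 32
# characters of each line, i.e. the md5 hex digest.
find /mnt/user -type f -exec md5sum {} + | sort | uniq -w32 --all-repeated=separate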
ilovejedd (Author) Posted March 15, 2011
> Try the script attached to this thread: http://lime-technology.com/forum/index.php?topic=7018.msg68073;topicseen#msg68073
Ooh, thanks for this! My Linux shell-fu is pretty non-existent. Is there a way to limit the script to only comparing files greater than a certain size, say 10MB? Thanks!
prostuff1 Posted March 15, 2011
> Is there a way to limit the script to only comparing files greater than a certain size, say 10MB?
I believe so, though I don't have the time to try it out right now. The find command has a -size test that could be added and might do what you need. Have a look here for an example of what I am talking about.
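(For reference, the -size test mentioned above looks like this. A sketch assuming GNU find and the /mnt/user share path; adjust the path to wherever the script actually looks:)

# Restrict the search to files larger than 10 MiB.
find /mnt/user -type f -size +10M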
ilovejedd (Author) Posted March 15, 2011
> In the find command there is a -size parameter that could be added that might do what you need.
Thanks for the link. Looks like this is going to cut down quite a bit on the initial processing, too. I can run through all my large files first and then take care of the smaller files later. Hmm, is there a way to set an upper and a lower bound? Say I want to divide the comparison into stages:
- greater than 1GB
- 100MB to 1GB
- 10MB to 100MB
- less than 10MB
Thanks again!
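(Two -size tests combined on one command line give exactly such a band, since find ANDs its tests by default. A sketch, again assuming GNU find. Note that find rounds file sizes up to whole units, so -size -1G matches only empty files; byte counts with the c suffix avoid that trap:)

# The 100MB-to-1GB stage: 104857600 = 100*1024*1024 bytes and
# 1073741824 = 1024^3 bytes. Both tests must match.
find /mnt/user -type f -size +104857600c -size -1073741824c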