Kilrah Posted February 19, 2023

TL;DR: I wanted a file integrity checking tool that was multiplatform and didn't pollute filesystems with .sfv or .md5 files. I couldn't find one, so I made one: https://github.com/kilrah/hashcheck

It's a command line tool written in Python. It hashes files with SHA-256 and stores the absolute file path, the hash and some currently unused details (size, creation/modification date) in an easily portable/editable SQLite database file.

I have a 32TB array on my main Windows PC, and my unRAID server takes a backup of it twice a day using rsnapshot, but I didn't have a good way of doing periodic checks on both the source and the destination that didn't involve running a file comparison (e.g. with FreeFileSync) over my slow Gbit network. This now allows it: I hash all the files on my main PC, copy the db file to the server, and using the path remapping / conversion feature I can run the check locally on the server against the backup.

Figured it would likely be useful to others, so here goes.

I have some ideas for extra features, like keeping track of when each file was last checked, so a scheduled run could verify at most X GB/TB of the files with the oldest check dates in one go and do periodic verification in a mostly "set and forget" way. If you find this useful, let me know if you have ideas or see issues...
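For anyone curious about the general idea rather than the tool itself, here's a minimal sketch of the approach described above: walk a tree, hash each file with SHA-256, and record path/hash/size/mtime in SQLite, then re-verify later against a remapped path prefix. This is an illustration only, not hashcheck's actual code, schema or command line interface; the table name, column names and the `src_prefix`/`dst_prefix` remapping are my own placeholders.

```python
#!/usr/bin/env python3
"""Illustrative sketch: SHA-256 hash files into a SQLite db, then re-verify
them against a backup at a different path prefix. Not the hashcheck code."""

import hashlib
import os
import sqlite3
import sys

CHUNK = 1024 * 1024  # read in 1 MiB chunks so large files don't fill RAM


def sha256_of(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(CHUNK), b""):
            h.update(chunk)
    return h.hexdigest()


def index_tree(root, db_path="hashes.db"):
    """Hash every file under root and store path, hash, size and mtime."""
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS files "
        "(path TEXT PRIMARY KEY, hash TEXT, size INTEGER, mtime REAL)"
    )
    for dirpath, _dirs, names in os.walk(root):
        for name in names:
            full = os.path.abspath(os.path.join(dirpath, name))
            st = os.stat(full)
            con.execute(
                "INSERT OR REPLACE INTO files VALUES (?, ?, ?, ?)",
                (full, sha256_of(full), st.st_size, st.st_mtime),
            )
    con.commit()
    con.close()


def verify_tree(db_path, src_prefix, dst_prefix):
    """Re-hash files, mapping stored source paths onto the backup location
    (e.g. a Windows path onto '/mnt/user/backup/...') -- the same idea as
    the path remapping feature, not its actual implementation."""
    con = sqlite3.connect(db_path)
    for path, stored in con.execute("SELECT path, hash FROM files"):
        local = path.replace(src_prefix, dst_prefix, 1).replace("\\", "/")
        if not os.path.exists(local):
            print(f"MISSING  {local}")
        elif sha256_of(local) != stored:
            print(f"MISMATCH {local}")
    con.close()


if __name__ == "__main__":
    index_tree(sys.argv[1] if len(sys.argv) > 1 else ".")
```

The key point is that everything lives in one portable db file instead of scattered .sfv/.md5 files, so the same database can be copied to the backup machine and checked there locally without touching the network.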