bonienl Posted June 21, 2019 (Author)
On 6/11/2019 at 8:13 PM, itimpi said: I seem to get a hash key mismatch on every file which is a hard link (at the User Share level) to another file in the same User Share. Is this expected behaviour? Is there anything I can do to avoid getting this error? The links are in a sub-folder of a user share - do I have to use the Custom folder option to exclude the sub-folders containing these hard links? If so, can I have a folder name of the form FILMS/!3D?
I have never tested with hard links. This utility writes information into the extended attributes of a file, and I am not sure how these are handled for hard links. When excluding folders, you can enter just the unique folder name (e.g. !3D); the full path is automatically generated by the tool.
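For reference, a quick sketch of why hard links interact with this: extended attributes live on the inode, and a hard link is just a second name for the same inode, so both names see the same user.* attributes. The paths below are scratch examples, not plugin paths.

```shell
# Hard links share an inode, so per-inode metadata -- including the
# user.* extended attributes this plugin writes -- is shared too.
dir=$(mktemp -d)
echo demo > "$dir/original.bin"
ln "$dir/original.bin" "$dir/hardlink.bin"   # second name, same inode
stat -c '%i' "$dir/original.bin"             # both commands print the
stat -c '%i' "$dir/hardlink.bin"             # same inode number
# With the attr package installed, a shared attribute is visible
# through either name:
#   setfattr -n user.sha256 -v deadbeef "$dir/original.bin"
#   getfattr -n user.sha256 --only-values "$dir/hardlink.bin"
```

So a hash written through one name is automatically attached to the other name as well; the plugin cannot store a separate hash per link.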
itimpi Posted June 21, 2019
1 hour ago, bonienl said: I have never tested with hard links. This utility writes information into the extended attributes of a file, and I am not sure how these are handled for hard links. When excluding folders, you can enter just the unique folder name (e.g. !3D); the full path is automatically generated by the tool.
I have the exclude working fine, and in this case that is probably what I wanted anyway, as the files have already had their integrity checked at the original location and I do not intend to delete the original files. I thought it was worth mentioning, though, as it is quite possible that you do not want the plugin to check a hard link (assuming it can detect them in the first place). Having said that, I am surprised that an error is actually reported. I would have expected the extended attribute information either to be present on the hard link entry as well (in which case, why the mismatch?) or the plugin to generate a new set of extended attribute information. I will have to look at this information on my system to see if I can spot what is generating the mismatch.
coolspot Posted June 24, 2019
I have a question - why does "Build Up-To-Date" turn green while the plug-in is still processing a drive? Basically it's only 20% of the way through processing a disk, but "Build Up-To-Date" is already flagged as green. Also, the number of files processed seems off. For example, the system says 97.9% processed, but the calculation doesn't seem to make sense:
Currently processing file 10284 of 14494
Is this correct? Lastly, why do renamed files create a new hash - shouldn't the old hash be maintained if a file is renamed? Thanks.
Edited June 25, 2019 by coolspot
Mihle Posted June 30, 2019
So, the plugin complains about file corruption in the preview folder of Nextcloud. As I have been told I should probably just exclude the Nextcloud appdata folder, I have tried to do that. I have added it to the custom folder field like this: /mnt/user/NCloud/appdata_[Some Weird Letter and number combination], because that is what the Nextcloud appdata location is called. I have then tried clicking both "remove and export" and "remove and build", but it still complains about the same corruption inside the appdata preview folder. What do I have to do differently to get it to work?
Mihle Posted July 4, 2019
On 6/30/2019 at 9:02 AM, Mihle said: So, the plugin complains about file corruption in the preview folder of Nextcloud. As I have been told I should probably just exclude the Nextcloud appdata folder, I have tried to do that. I have added it to the custom folder field like this: /mnt/user/NCloud/appdata_[Some Weird Letter and number combination], because that is what the Nextcloud appdata location is called. I have then tried clicking both "remove and export" and "remove and build", but it still complains about the same corruption inside the appdata preview folder. What do I have to do differently to get it to work?
I thought this place was more active than this, but OK.
trurl Posted July 5, 2019
On 6/30/2019 at 3:02 AM, Mihle said: So, the plugin complains about file corruption in the preview folder of Nextcloud. As I have been told I should probably just exclude the Nextcloud appdata folder, I have tried to do that. I have added it to the custom folder field like this: /mnt/user/NCloud/appdata_[Some Weird Letter and number combination], because that is what the Nextcloud appdata location is called. I have then tried clicking both "remove and export" and "remove and build", but it still complains about the same corruption inside the appdata preview folder. What do I have to do differently to get it to work?
Are you absolutely sure you have that path correct? That is a nonstandard path for docker appdata. Usually it would be something like /mnt/user/appdata/dockername
Mihle Posted July 6, 2019
19 hours ago, trurl said: Are you absolutely sure you have that path correct? That is a nonstandard path for docker appdata. Usually it would be something like /mnt/user/appdata/dockername
Yes, Nextcloud seems to have two appdata folders. The other one is already excluded via excluding the appdata folder itself, but putting in my own location does not seem to work, as I mentioned, so I may be doing something wrong?
BennTech Posted July 24, 2019
On 6/10/2019 at 10:42 AM, shiarua said: Well, looking through the source, I guess I could just make some wrapper scripts around the bunker script for exporting hash files and running the verification checks. Seems like a feasible workaround.
Do you have exporting to a custom location working? If so, would you post the scripts you made? I can't have filenames exported to the unencrypted USB drive because the actual filenames might include sensitive information.
scorcho99 Posted July 29, 2019
Believe it or not, I read this whole thread. I think I even retained some of the information I read. Someone asked for manual/scheduled hash creation only, rather than using inotify to create the hashes when files are added. Was that feature ever added? I'd rather just sweep through nightly and do them then, rather than during the day when the server is busy doing other things. Plus, inotify-tools has issues where it can miss adding watches or even create erroneous watches when files are moved. I think that might explain the issues some people reported earlier in this thread.
itimpi Posted August 8, 2019
I was wondering if it is worth changing the text on the FIND button to read DUPLICATES instead? Although I have had the plugin installed for ages, it was only recently that I realized this button was for detecting duplicate files on the server. Since it works off the hash files generated by the plugin, it is also very fast once you have the hash files generated. Knowing this capability exists might encourage more users to make use of the plugin.
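As a rough illustration of why duplicate detection is cheap once the hash files exist: exported hash files follow b2sum's `<hash>  <path>` line format, so duplicates are just repeated hash values. The scratch files and export name below are made up.

```shell
# Three files, two with identical content, hashed into a stand-in export file.
dir=$(mktemp -d)
printf 'same\n'  > "$dir/a.mkv"
printf 'same\n'  > "$dir/b.mkv"          # duplicate content of a.mkv
printf 'other\n' > "$dir/c.mkv"
b2sum "$dir/a.mkv" "$dir/b.mkv" "$dir/c.mkv" > "$dir/export.hash"
# Hashes appearing more than once mark identical file content:
awk '{print $1}' "$dir/export.hash" | sort | uniq -d
```

No file is re-read during this scan; everything comes from the already-exported text, which is why it finishes quickly.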
coolspot Posted August 8, 2019
Do you need to export the hashes to run a check? If not, does it just read each file and compare it to the extended metadata checksum?
DDock Posted August 12, 2019
I'm having an issue since I reinstalled the plugin 1-2 months ago. I had used this plugin a long time ago and decided to install it again. It has consistently shown the same 5 files as corrupted ever since I reinstalled it. I'm not sure if this is because some files were left behind from the previous install of the plugin. 4 of them are just NFOs and one is an MKV. Is there a way to acknowledge the error so it stops reporting it, or what is the best way to handle it?
Vr2Io Posted August 12, 2019
8 hours ago, DDock said: I'm having an issue since I reinstalled the plugin 1-2 months ago. I had used this plugin a long time ago and decided to install it again. It has consistently shown the same 5 files as corrupted ever since I reinstalled it. I'm not sure if this is because some files were left behind from the previous install of the plugin. 4 of them are just NFOs and one is an MKV. Is there a way to acknowledge the error so it stops reporting it, or what is the best way to handle it?
That means the hash result does not match the one stored in the extended attributes. You can simply delete it from the attributes and then build and export again.
DDock Posted August 13, 2019
16 hours ago, Benson said: That means the hash result does not match the one stored in the extended attributes. You can simply delete it from the attributes and then build and export again.
Just to make sure I'm running the proper command: am I deleting only the user.sha256 attribute, and not the other user.xxx attributes? Thank you for sending me down the correct path.
Vr2Io Posted August 13, 2019
1 hour ago, DDock said: Just to make sure I'm running the proper command: am I deleting only the user.sha256 attribute, and not the other user.xxx attributes? Thank you for sending me down the correct path.
Deleting only some of them may be OK, but it is simplest to delete them all, i.e.:
user.filedate="1547312879"
user.filesize="85302"
user.md5="a7184f925c3bc9eb2f8a917f7f26f9c8"
user.scandate="1565683848"
Edited August 13, 2019 by Benson
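A minimal sketch of that delete-then-rebuild step, assuming the attribute names listed above plus the hash attributes used elsewhere in this thread (user.sha256, user.blake2 are assumptions); the scratch file stands in for the flagged file, and it requires the attr package (setfattr/getfattr) on a filesystem with user xattr support.

```shell
# Create a scratch file carrying two of the plugin-style attributes.
dir=$(mktemp -d -p .)          # scratch dir on the current filesystem
f="$dir/demo.mkv"
echo data > "$f"
setfattr -n user.md5 -v a7184f925c3bc9eb2f8a917f7f26f9c8 "$f" 2>/dev/null
setfattr -n user.filesize -v 85302 "$f" 2>/dev/null
# Strip every plugin attribute so the next build writes a fresh set:
for attr in user.filedate user.filesize user.md5 user.sha256 user.blake2 user.scandate; do
    setfattr -x "$attr" "$f" 2>/dev/null || true   # ignore attributes not present
done
getfattr -d "$f" 2>/dev/null   # no user.* attributes remain
```

After the attributes are gone, running the plugin's build (or the bunker script) recomputes and stores a fresh hash rather than reporting a mismatch against the stale one.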
bobokun Posted August 20, 2019
I've been trying to set up file integrity, but I'm not sure if I'm doing this correctly. I keep getting BLAKE2 key mismatches on my libvirt.img and a lot of my Nextcloud files. See example:
BLAKE2 hash key mismatch (updated), /mnt/disk1/nextcloud/.htaccess was modified
BLAKE2 hash key mismatch (updated), /mnt/disk1/nextcloud/appdata_ocdoy1vwt49l/appstore/apps.json was modified
BLAKE2 hash key mismatch (updated), /mnt/disk1/nextcloud/appdata_ocdoy1vwt49l/appstore/categories.json was modified
BLAKE2 hash key mismatch (updated), /mnt/disk1/nextcloud/appdata_ocdoy1vwt49l/appstore/future-apps.json was modified
BLAKE2 hash key mismatch (updated), /mnt/disk5/vm backup/libvirt.img was modified
Are these true errors? I have these excluded and keep clearing them, but every time the integrity check runs it finds these files mismatched again. Here are my settings:
JonathanM Posted August 20, 2019
35 minutes ago, bobokun said: I've been trying to set up file integrity, but I'm not sure if I'm doing this correctly. I keep getting BLAKE2 key mismatches on my libvirt.img and a lot of my Nextcloud files
Yeah, don't do that. The whole point of this plugin is to keep tabs on seldom-used, archival-type files - files that should never change. Active files should not even be monitored; it's a waste of resources.
bobokun Posted August 20, 2019
19 minutes ago, jonathanm said: Yeah, don't do that. The whole point of this plugin is to keep tabs on seldom-used, archival-type files - files that should never change. Active files should not even be monitored; it's a waste of resources.
I understand, and that's why I've added them to my exclude folders (as displayed in the screenshot), but they are still being monitored? I must be configuring something wrong, but I'm not sure how to fix it.
Vr2Io Posted August 20, 2019
1 hour ago, bobokun said: Are these true errors? I have these excluded and keep clearing them, but every time the integrity check runs it finds these files mismatched again.
You need to do the clearing again and then export, so the change takes effect.
Koshy Posted September 4, 2019
BLAKE2 hash key mismatch, (720p_30fps_H264-192kbit_AAC).mp4 is corrupted
I got this error on Disk 3; is there any way I can find out where on disk 3 this file is?
Vr2Io Posted September 4, 2019
37 minutes ago, Koshy said: BLAKE2 hash key mismatch, (720p_30fps_H264-192kbit_AAC).mp4 is corrupted I got this error on Disk 3; is there any way I can find out where on disk 3 this file is?
This is probably a Unicode file name display issue. You can check the disk3 hash file stored in the export directory: grep it to find which file(s) have a similar name, then confirm by running b2sum <file> and comparing the hash value.
Edited September 4, 2019 by Benson
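A sketch of that lookup, with a scratch file standing in for the disk3 export (the `<hash>  <path>` line layout is b2sum's standard output format, assumed here to match the exported hash file):

```shell
# Build a stand-in export file, then search and verify against it.
dir=$(mktemp -d)
printf 'example data\n' > "$dir/video.mp4"
b2sum "$dir/video.mp4" > "$dir/disk3.export.hash"   # stand-in for the exported hash file
grep 'video' "$dir/disk3.export.hash"               # locate candidates by partial name
b2sum -c "$dir/disk3.export.hash"                   # recompute and compare each listed file
```

If the recomputed hash matches, `b2sum -c` reports the file as OK; a genuinely corrupted file shows up as FAILED.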
Koshy Posted September 5, 2019
15 hours ago, Benson said: This is probably a Unicode file name display issue. You can check the disk3 hash file stored in the export directory: grep it to find which file(s) have a similar name, then confirm by running b2sum <file> and comparing the hash value.
Thank you, I found the file.
EdgarWallace Posted September 18, 2019
On every run the server reports an incorrect error message: BLAKE2 hash key mismatch, /mnt/disk5/iTunes/Music/Wolfgang Haffner/Kind of Cool/02 So What.aif is corrupted
However, the file is perfectly fine. Is there any way to update the hash key manually?
bonienl Posted September 18, 2019 (Author)
34 minutes ago, EdgarWallace said: On every run the server reports an incorrect error message: BLAKE2 hash key mismatch, /mnt/disk5/iTunes/Music/Wolfgang Haffner/Kind of Cool/02 So What.aif is corrupted However, the file is perfectly fine. Is there any way to update the hash key manually?
You can run the CLI command (this updates the files in the specified folder): /usr/local/emhttp/plugins/dynamix.file.integrity/scripts/bunker -b2 -a "/mnt/disk5/iTunes/Music/Wolfgang Haffner/Kind of Cool"
EdgarWallace Posted September 18, 2019
You can run the CLI command (this updates the files in the specified folder): /usr/local/emhttp/plugins/dynamix.file.integrity/scripts/bunker -b2 -a "/mnt/disk5/iTunes/Music/Wolfgang Haffner/Kind of Cool"
Oh great @bonienl. Thanks a lot.
Sent from my iPad using Tapatalk