bonienl

Dynamix File Integrity plugin

803 posts in this topic


On 6/11/2019 at 8:13 PM, itimpi said:

I seem to get a hash key mismatch on every file which is a hard link (at the User Share level) to another file in the same User Share.   Is this expected behaviour?    Is there anything I can do to avoid getting this error?  

 

The links are in a sub-folder of a user share - do I have to use the Custom folder option to exclude the sub-folders containing these hard links? If so, can I have a folder name of the form FILMS/!3D?

I have never tested with hard links. This utility writes information into the extended attributes of a file, and I am not sure how these are handled for hard links.

 

When excluding folders, you can enter just the unique folder name (e.g. !3D); the full path is automatically generated by the tool.
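For anyone wanting to see what gets stored, the extended attributes can be inspected from the command line. This is only a sketch: getfattr comes from the attr package, and the file path is a made-up example.

```shell
# Hypothetical example path -- substitute one of your own files.
FILE="/mnt/user/FILMS/example.mkv"

# Dump all user-namespace extended attributes on the file, which is
# where this kind of utility stores its hash and scan metadata.
getfattr -d "$FILE"
```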

1 hour ago, bonienl said:

I have never tested with hard links. This utility writes information into the extended attributes of a file, and I am not sure how these are handled for hard links.

 

When excluding folders, you can enter just the unique folder name (e.g. !3D); the full path is automatically generated by the tool.

I have the exclude working fine, and in this case that is probably what I wanted anyway as the files have already had their integrity checked at the original location and I do not intend to delete the original files.

 

I thought it was worth mentioning, though, as it is quite possible that you do not want the plugin to check a hard link (assuming it can detect them in the first place). Having said that, I am surprised that an error is actually reported. I would have expected the extended attribute information either to be present on the hard link entry as well (in which case, why the mismatch?) or that the plugin would generate a new set of extended attribute information. I will have to look at this information on my system to see if I can spot what is generating the mismatch.
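One quick command-line check (a sketch with made-up paths) is to compare inode numbers: hard links share a single inode, and therefore a single set of extended attributes.

```shell
# Hypothetical paths for illustration -- substitute your own.
ORIGINAL="/mnt/user/FILMS/example.mkv"
HARDLINK="/mnt/user/FILMS/!3D/example.mkv"

# If the two inode numbers match, both directory entries refer to the
# same file data and the same extended attributes.
stat -c '%i' "$ORIGINAL" "$HARDLINK"
```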

Posted (edited)

I have a question - why does "Build Up-To-Date" turn green while the plug-in is still processing a drive? It is only 20% of the way through a disk, but "Build Up-To-Date" is already flagged as green.

 

Also, the number of files processed seems off. For example, the system says 97.9% processed, but the calculation doesn't seem to make sense:

 

Currently processing file 10284 of 14494

Is this correct? That works out to roughly 71%, not 97.9%.

 

Lastly, why do renamed files create a new hash - shouldn't the old hash be maintained if a file is renamed?
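As a point of reference, a plain rename on the same filesystem keeps the inode, so extended attributes survive it; this can be verified with a sketch like the following (the paths and the user.test attribute are made up):

```shell
# Hypothetical file -- set a marker attribute, rename, and confirm it survives.
setfattr -n user.test -v kept "/mnt/user/Media/old-name.mkv"
mv "/mnt/user/Media/old-name.mkv" "/mnt/user/Media/new-name.mkv"
getfattr -n user.test "/mnt/user/Media/new-name.mkv"
```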

 

Thanks.

Edited by coolspot


So, the plugin complains about file corruption in the preview folder of Nextcloud. As I was told I should probably just exclude the Nextcloud appdata folder, I have tried to do that. I added it to the custom folder field like this: /mnt/user/NCloud/appdata_[Some Weird Letter and number combination], because that is what the Nextcloud appdata location is called. I then tried clicking both "remove and export" and "remove and build", but it still complains about the same corruption inside the appdata preview folder.

What do I have to do different to get it to work?

On 6/30/2019 at 9:02 AM, Mihle said:

So, the plugin complains about file corruption in the preview folder of Nextcloud. As I was told I should probably just exclude the Nextcloud appdata folder, I have tried to do that. I added it to the custom folder field like this: /mnt/user/NCloud/appdata_[Some Weird Letter and number combination], because that is what the Nextcloud appdata location is called. I then tried clicking both "remove and export" and "remove and build", but it still complains about the same corruption inside the appdata preview folder.

What do I have to do different to get it to work?

I thought this place was more active than this, but ok.

On 6/30/2019 at 3:02 AM, Mihle said:

So, the plugin complains about file corruption in the preview folder of Nextcloud. As I was told I should probably just exclude the Nextcloud appdata folder, I have tried to do that. I added it to the custom folder field like this: /mnt/user/NCloud/appdata_[Some Weird Letter and number combination], because that is what the Nextcloud appdata location is called. I then tried clicking both "remove and export" and "remove and build", but it still complains about the same corruption inside the appdata preview folder.

What do I have to do different to get it to work?

Are you absolutely sure you have that path correct? That is a nonstandard path for docker appdata. Usually it would be something like /mnt/user/appdata/dockername

19 hours ago, trurl said:

Are you absolutely sure you have that path correct? That is a nonstandard path for docker appdata. Usually it would be something like /mnt/user/appdata/dockername

Yes, Nextcloud seems to have two appdata folders. The other one is already excluded via the general appdata folder exclusion, but putting in my own location does not seem to work, as I mentioned, so I may be doing something wrong?

On 6/10/2019 at 10:42 AM, shiarua said:

Well, looking through the source, I guess I could just make some wrapper scripts around the bunker script for exporting hash files and running the verification checks. Seems like a feasible workaround. 

Do you have exporting to a custom location working? If so, would you post the scripts you made?

 

I can't have filenames exported to the unencrypted USB drive because the actual filenames might include sensitive information.


Believe it or not, I read this whole thread. I think I even retained some of the information I read.

 

Someone asked for manual/scheduled hash creation only, rather than using inotify to hash files as they are created. Was that feature ever added? I'd rather just sweep through nightly and do them then, instead of during the day when the server is busy doing other things. Plus, inotify-tools has issues where it can miss adding watches, or even apply erroneous watches, when files are moved. I think that might explain some of the issues people described earlier in this thread.
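For context, the real-time hashing relies on the kernel's inotify mechanism; a standalone sketch of that idea using inotify-tools (not the plugin's actual invocation, and the share path is made up) looks like:

```shell
# Watch a share recursively and print each file as it finishes being
# written; close_write is the point at which a hash could be computed.
# Recursive monitoring (-m -r) is also where the race lives: a watch on
# a newly moved-in directory is only added after its move event arrives.
inotifywait -m -r -e close_write --format '%w%f' /mnt/user/Media
```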


I was wondering if it is worth changing the text on the FIND button to read DUPLICATES instead? Although I have had the plugin installed for ages, it was only recently that I realized this button is for detecting duplicate files on the server.

 

Since it works off the hash files generated by the plugin, it is also very fast once those files have been generated. Knowing this capability exists might encourage more users to make use of the plugin :)


Do you need to export the hashes to run a check? If not, does it just read each file and compare it to the checksum stored in the extended attributes?


I'm having an issue since I reinstalled the plugin about 1-2 months ago. I had used this plugin a long time ago and decided to install it again. It has consistently shown the same 5 files as corrupted ever since the reinstall. I'm not sure if this is because some files were left over from the previous install of the plugin. 4 of them are just NFOs and one is an MKV. Is there a way to acknowledge the error so it stops being reported, or what is the best way to handle it?

8 hours ago, DDock said:

I'm having an issue since I reinstalled the plugin about 1-2 months ago. I had used this plugin a long time ago and decided to install it again. It has consistently shown the same 5 files as corrupted ever since the reinstall. I'm not sure if this is because some files were left over from the previous install of the plugin. 4 of them are just NFOs and one is an MKV. Is there a way to acknowledge the error so it stops being reported, or what is the best way to handle it?

 

That means the hash result does not match the one stored in the extended attributes. You can simply delete the attributes and then build and export again.

16 hours ago, Benson said:

 

That means the hash result does not match the one stored in the extended attributes. You can simply delete the attributes and then build and export again.

Just to make sure I'm running the proper command: am I deleting just the attribute user.sha256, and not the other user.xxx attributes?

 

Thank you for sending me down the correct path.

Posted (edited)
1 hour ago, DDock said:

Just to make sure I'm running the proper command: am I deleting just the attribute user.sha256, and not the other user.xxx attributes?

 

Thank you for sending me down the correct path.

 

Deleting some may be OK, but it's simplest to just delete them all.

 

i.e.

user.filedate="1547312879"
user.filesize="85302"
user.md5="a7184f925c3bc9eb2f8a917f7f26f9c8"
user.scandate="1565683848"
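Assuming the attribute names above (they may differ depending on the hashing method selected), the whole set can be stripped from a file with a sketch like this:

```shell
# Hypothetical file path -- remove each of the plugin's attributes in turn.
FILE="/mnt/user/Media/example.nfo"
for attr in user.filedate user.filesize user.md5 user.scandate; do
    setfattr -x "$attr" "$FILE"
done
```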

Edited by Benson


I've been trying to set up File Integrity but I'm not sure if I'm doing it correctly. I keep getting BLAKE2 key mismatches on my libvirt.img and a lot of my Nextcloud files. See example:

BLAKE2 hash key mismatch (updated), /mnt/disk1/nextcloud/.htaccess was modified
BLAKE2 hash key mismatch (updated), /mnt/disk1/nextcloud/appdata_ocdoy1vwt49l/appstore/apps.json was modified
BLAKE2 hash key mismatch (updated), /mnt/disk1/nextcloud/appdata_ocdoy1vwt49l/appstore/categories.json was modified
BLAKE2 hash key mismatch (updated), /mnt/disk1/nextcloud/appdata_ocdoy1vwt49l/appstore/future-apps.json was modified
BLAKE2 hash key mismatch (updated), /mnt/disk5/vm backup/libvirt.img was modified

Are these true errors? I have these files excluded and keep clearing them, but every time the integrity check runs it finds the same files mismatched.


 

Here are my settings:

[screenshot: my File Integrity settings]

35 minutes ago, bobokun said:

I've been trying to set up File Integrity but I'm not sure if I'm doing it correctly. I keep getting BLAKE2 key mismatches on my libvirt.img and a lot of my Nextcloud files

Yeah, don't do that. The whole point of this plugin is to keep tabs on seldom-used, archival-type files - files that should never change. Active files should not be monitored at all; it's a waste of resources.

19 minutes ago, jonathanm said:

Yeah, don't do that. The whole point of this plugin is to keep tabs on seldom-used, archival-type files - files that should never change. Active files should not be monitored at all; it's a waste of resources.

I understand, and that's why I've added them to my excluded folders (as displayed in the screenshot), but they are still being monitored? I must be configuring something wrong, but I'm not sure how to fix it.

1 hour ago, bobokun said:

Are these true errors?? I have these excluded and clearing it but every time integrity check runs it always finds these files mismatched.

You need to do the clearing again and then export, so that the change is reflected.

BLAKE2 hash key mismatch, (720p_30fps_H264-192kbit_AAC).mp4 is corrupted

I got this error on Disk 3. Is there any way I can find out where on disk 3 this file is?

37 minutes ago, Koshy said:

BLAKE2 hash key mismatch, (720p_30fps_H264-192kbit_AAC).mp4 is corrupted

I got this error on Disk 3. Is there any way I can find out where on disk 3 this file is?

That is probably a Unicode file-name display issue. You can check the disk3 hash file stored in the export directory: grep it to find which file(s) have a similar name, then confirm by running b2sum <file> and comparing the hash value.
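Those steps might look like this on the console; it is only a sketch, and the export-file path shown is a guess -- use whatever export directory is configured in your plugin settings.

```shell
# Hypothetical export file -- check your own export directory for the
# real name of the disk3 hash file.
HASHFILE="/mnt/user/export/disk3.hash"

# Find entries whose names resemble the reported (garbled) file name.
grep '720p_30fps_H264-192kbit_AAC' "$HASHFILE"

# Recompute the BLAKE2 hash of a candidate and compare it with the
# value stored in the export file.
b2sum '/mnt/disk3/path/to/candidate.mp4'
```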

Edited by Benson

15 hours ago, Benson said:

That is probably a Unicode file-name display issue. You can check the disk3 hash file stored in the export directory: grep it to find which file(s) have a similar name, then confirm by running b2sum <file> and comparing the hash value.

Thank you, I found the file.


On every run the server reports an incorrect error message:

BLAKE2 hash key mismatch, /mnt/disk5/iTunes/Music/Wolfgang Haffner/Kind of Cool/02 So What.aif is corrupted

However, the file is perfectly OK. Is there any way to update the hash key manually?

34 minutes ago, EdgarWallace said:

On every run the server reports an incorrect error message:


BLAKE2 hash key mismatch, /mnt/disk5/iTunes/Music/Wolfgang Haffner/Kind of Cool/02 So What.aif is corrupted

However, the file is perfectly OK. Is there any way to update the hash key manually?

You can run this CLI command (it updates the hashes for all files in the given folder):

/usr/local/emhttp/plugins/dynamix.file.integrity/scripts/bunker -b2 -a "/mnt/disk5/iTunes/Music/Wolfgang Haffner/Kind of Cool"

 

