Alex R. Berg

Alex R. Berg last won the day on October 30 2018


  1. Hmm, it's just an annoying warning, putting a scare into users. Unfortunately some GitHub merge stuff went badly between my development fork and the Community Apps version by Bergware; hence the warning, I think. I've asked Bergware if he'll fix it, or whether we should switch Community Apps to this repo. It works fine on unRaid 6.7.1.
  2. @bonienl There's a new cache-dirs plugin in my forked repo from 2018-12 that hasn't been merged into the main one. We had a merge accident, so a merge now looks like a conflict. I haven't merged main's changes into my fork, because I didn't want to deal with that. Maybe it's easiest to start over on a new cache-dirs-only branch? Or change the Dynamix applications to expect my forked GitHub repo as the source.
  3. It's working for me on unRaid 6.7.0. I tried accessing a folder on my disk2 both from Windows and from unRaid itself, in both cases via both the user share and the disk share. The disk didn't spin up. I'm booked up at the moment, so I probably won't investigate much, but if you want to supply logs, the logs from cache_dirs are needed for them to be useful; cache_dirs has a command to collect them. It would be awesome if that could somehow be collected automatically by unRaid, for instance via some kind of plugin event. I assume that is not possible, but if it is, let me know. If not, it might make sense to ask Tom @ Limetech whether that is a reasonable feature to add. Maybe you feel like doing that, interwebtech, or tell me where to post? Best Alex
  4. Wow, nice machine. It's low-level Linux code that does the actual reading of the directory structure; cache_dirs itself just spawns many find processes. I think I mentioned in previous messages how many files I have and what memory I use, so if you skim above you might find something. I can't think of anything helpful for you except to experiment. You can check whether cache_dirs is currently scanning disks and look at the cache_dirs debug flags; the script contains some statements that might help you debug, if you are not already a Linux expert.
  5. I'm sorry, I don't have the time for a detailed problem-solving session. It's probably uphill for you if you are not familiar with scripts, but here are some hints. cache_dirs should be on your path; otherwise use `/usr/local/emhttp/plugins/dynamix.cache.dirs/scripts/cache_dirs`.
```
sudo /usr/local/emhttp/plugins/dynamix.cache.dirs/scripts/cache_dirs -h
```
gives you the commands it runs. It spawns find subprocesses, which do the actual reading of dirs, so yes, that indicates cache_dirs is not your problem, though of course it may be caching too little when you see those folders being read. Maybe you have too little RAM, but that's just guesswork. cache_dirs only caches /mnt/disk* and /mnt/cache, not other root folders, but those root folders should be mounted in memory on unRaid, unless you've done some manual mounts, which I doubt you have. Best Alex
  6. I have no idea. If you run the cache_dirs script manually, maybe it's easier for you to figure out what is going on. There is also a switch to disable running in the background, which may also help your debugging. You have the command at the top of what you posted, or at least the first half of it. Best Alex
  7. @bonienl A new change is ready. Unfortunately you may find it to be a pain-in-the-arse, because the last one was not merged via git but applied manually. Anyway, I haven't created a new change request, as any change I push automatically gets added to the existing change request. The new version is cache-dirs 2.2.7 and includes fixes to the -a option, which allows users to exclude dirs via arbitrary find exclude commands; stuff from Joe that I broke long ago. bonienl, I suspect this is the commit you merged manually last: 8bef2e6dff9cb76553965fd898ef8692b0a29625. I would merge everything up to the point you merged last, so git knows about it, and then merge normally. I'll point you all to bonienl's version when this next merge is done. I don't remember if I already pointed everybody back to main in Nov/Dec, but I'll do it again.
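The two-step merge described above can be sketched with git's "ours" merge strategy. This is a toy-repo demonstration, not the actual dynamix repositories: branch and file names are made up, and only the strategy (record the manually applied commit as merged, then merge the rest normally) mirrors the real situation.

```shell
#!/bin/sh
# Toy demonstration of recovering from a manual (non-git) merge: first record
# the already-applied commits with an "ours" merge, then merge the remaining
# commits normally. Repo, branch and file names are made up.
set -e
dir=$(mktemp -d)
cd "$dir"
git init -q
git config user.email demo@example.com
git config user.name demo

echo base > base.txt
git add base.txt
git commit -qm "base"
main=$(git symbolic-ref --short HEAD)

# The fork adds two changes; the first was already applied upstream by hand.
git checkout -qb fork
echo one > applied-by-hand.txt
git add applied-by-hand.txt
git commit -qm "fork change 1 (applied manually upstream)"
echo two > new-change.txt
git add new-change.txt
git commit -qm "fork change 2 (not yet merged)"
last_manual=$(git rev-parse fork~1)

# Upstream already contains the manually applied change:
git checkout -q "$main"
echo one > applied-by-hand.txt
git add applied-by-hand.txt
git commit -qm "manually applied fork change 1"

# Record history up to the manually applied commit without changing the tree,
# so git knows those commits are merged:
git merge -q -s ours "$last_manual" -m "record manually applied changes"
# A normal merge now only brings in the commits after that point:
git merge -q fork -m "merge remaining fork changes"
ls
```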
  8. I noticed a few days ago that cache_dirs was scanning my dirs at depth 5, each scan took 45 s and touched the disks, and it had been going on for a long time. A parity scan had been executed the day before; I don't know if it was related. The interesting part is that it quickly recovered to full-depth scans with idle disks after I wrote 7 GB to the /tmp drive and deleted it (with my test_free_memory.sh script). Perhaps the writing to disk caused Linux to move some memory cache around. I'm using a cache pressure of 1. This looks a bit similar to what was happening to you, @wgstarks, if I'm not mistaken: if there's not enough memory free for whatever reason, cache_dirs spams the disks.
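The write-and-delete trick can be sketched in a few lines. This is a hedged sketch, not the actual test_free_memory.sh; the size and path are parameters, and the default is kept small here (the post used ~7 GB).

```shell
#!/bin/sh
# Sketch of the write-and-delete trick: write a large file to /tmp and remove
# it, nudging the kernel to reshuffle its page cache. Not the actual
# test_free_memory.sh; size and target path are assumptions.
SIZE_MB=${1:-64}                       # post used ~7000; small default here
TARGET=${2:-/tmp/free_memory_test.bin}

dd if=/dev/zero of="$TARGET" bs=1M count="$SIZE_MB" 2>/dev/null
sync
rm -f "$TARGET"
```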
  9. I've released a new version on my 'beta' fork: https://raw.githubusercontent.com/arberg/dynamix/master/unRAIDv6/dynamix.cache.dirs.plg It fixes the -a option and adds help information to the plugin page on how to filter dirs. Example: -a '-noleaf -name .Recycle.Bin -prune -o -name log -prune -o -name temp -prune -o -name .sync -prune -o -print' Avoid the () from Joe's example. Unfortunately * and " do not work, so we cannot filter for "*Old". The plugin messes up the double quotes, and the cache_dirs script does not respond correctly even when it receives a properly quoted -name "*Old" -prune. I'll push to Dynamix, so it'll probably be live in the main version in a few days.
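For anyone unsure what that -a string does, here is a self-contained demonstration of the underlying find prune pattern; the directory layout is made up for illustration (cache_dirs passes these arguments through to its internal find calls).

```shell
#!/bin/sh
# Demonstrate the find exclude pattern used by the -a option: each unwanted
# name is pruned, and everything else is printed. Directory names are made up.
set -e
root=$(mktemp -d)
mkdir -p "$root/Movies" "$root/.sync" "$root/temp" "$root/log"

# Same shape as the -a example above:
find "$root" -noleaf -name .sync -prune -o -name log -prune \
    -o -name temp -prune -o -print
```

Only the root and Movies directories are printed; .sync, log, and temp are pruned (not descended into and not printed).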
  10. Wow, that's pretty cool, I didn't realize we could just filter folders with that trick. The -a option is broken; with your \' you just avoided it crashing spectacularly. You can see the cache_dirs command that is executed in the syslog (unRaid log). The cache_dirs log also contains the 'find $args' command, which should list your -a arguments; right now I think it's just empty for you, as if no -a option had been given. I'll release a new version maybe tomorrow and send a pull request. I cannot get the '*' in Joe's name filter to work, which is a shame, but maybe that's just me. More on that later.
  11. I just gave the Slackware package collection one more attempt. I found the packages that are now required for subversion:
<!ENTITY icu4c "icu4c-63.1-x86_64-1.txz">
<!ENTITY subversion "subversion-1.11.0-x86_64-1.txz">
<!ENTITY neon "neon-0.30.2-x86_64-1.txz">
<!ENTITY apr "apr-1.6.5-x86_64-1.txz">
<!ENTITY aprutil "apr-util-1.6.1-x86_64-7.txz">
<!ENTITY serf "serf-1.3.9-x86_64-3.txz">
<!ENTITY sqlite "sqlite-3.25.3-x86_64-1.txz"> <!-- ap/ -->
<!ENTITY expat "expat-2.2.6-x86_64-1.txz"> <!-- l/ -->
http://slackware.cs.utah.edu/pub/slackware/slackware64-current/slackware64/l/utf8proc-2.2.0-x86_64-1.txz
http://slackware.cs.utah.edu/pub/slackware/slackware64-current/slackware64/ap/mariadb-10.3.11-x86_64-1.txz
It's possible the sqlite and expat packages were not necessary; I didn't try without them. They were included in the Ubuntu dependencies. The others are necessary. I just kept adding packages until 'svn' would run again.
  12. Yes, certainly that is easy to do with 'docker -v'. I just don't need the svn client, so I haven't done it. If I had lots of working dirs scattered in lots of places, it would be a hassle. I also frequently get confused and make mistakes with the virtual volumes in docker; they add an indirection, so it's one more possible source of error, not to mention user-id trouble. If I needed an SVN client on unRaid, I would prefer it to be on the system natively, I think, but I must admit I absolutely love the docker services that just work, no matter what package versions unRaid runs. PS: I need the svn server with Cyrus SASL authentication.
  13. I have now tried updating my old subversion plugin to base it on current Slackware 14.2. I kept running into missing packages, and then I gave up on that approach. I then tried the SCM-Manager docker. It worked great, super easy to set up and get working, but I didn't succeed in setting it up securely for remote access. It's probably easy, but I left it. I then tried building my own Dockerfile, because it sucks being dependent on managing our own dependencies from Slackware, and it sucks that it breaks when unRaid is updated. I got it working. It was a bit of a pain, but it turned out my Cyrus SASL password with a '#' character didn't work in the new Cyrus SASL. Anyway, it works now. The Docker Hub image name is 'arberg/subversion'. The GitHub Dockerfile repository is arberg/docker-subversion. dmacias: If you want to get subversion working again, I captured the list of packages Ubuntu installs when it installs subversion. All the dependencies I noticed were missing, such as serf, are in that list.
Here's the list of packages I tried installing from slackware-current; it wasn't enough:
<!ENTITY icu4c "icu4c-63.1-x86_64-1.txz">
<!ENTITY subversion "subversion-1.11.0-x86_64-1.txz">
<!ENTITY neon "neon-0.30.2-x86_64-1.txz">
<!ENTITY apr "apr-1.6.5-x86_64-1.txz">
<!ENTITY aprutil "apr-util-1.6.1-x86_64-7.txz">
<!ENTITY serf "serf-1.3.9-x86_64-3.txz">
Ahh, here's the Alpine OS package list; it's a lot shorter and looks like Slackware packages:
(1/15) Installing db (5.3.28-r0)
(2/15) Installing krb5-conf (1.0-r1)
(3/15) Installing libcom_err (1.44.2-r0)
(4/15) Installing sqlite-libs (3.24.0-r0)
(5/15) Installing heimdal-libs (7.5.0-r1)
(6/15) Installing libsasl (2.1.26-r13)
(7/15) Installing cyrus-sasl (2.1.26-r13)
(8/15) Installing libuuid (2.32-r0)
(9/15) Installing apr (1.6.3-r1)
(10/15) Installing expat (2.2.5-r0)
(11/15) Installing apr-util (1.6.1-r2)
(12/15) Installing lz4-libs (1.8.2-r0)
(13/15) Installing serf (1.3.9-r4)
(14/15) Installing subversion-libs (1.10.0-r0)
(15/15) Installing subversion (1.10.0-r0)
My Docker doesn't give me an svn command on the unRaid system itself, which kind of sucks. It should be easy enough to add an svn script like "docker exec subversion svn $*", but that won't have access to local files, so it's not really an option.
  14. I suspect there is currently a version mismatch among the packages needed to run subversion. It seems my unRaid 6.6.5 now updates packages from slackware-current, or something else on my system does (maybe NerdPack itself). svn says: `/usr/bin/svnserve: error while loading shared libraries: libicui18n.so.56: cannot open shared object file: No such file or directory` If I install http://slackware.cs.utah.edu/pub/slackware/slackware64-14.2/slackware64/l/icu4c-56.1-x86_64-2.txz with 'installpkg' or `/sbin/upgradepkg --install-new`, then it works. Except it doesn't fully work, because my Cyrus SASL authentication is invalid. I see Cyrus SASL 2.1.27 is installed, which is the version from slackware-current. If I roll back to 2.1.26, my authentication works. If I then upgrade to the new icu4c package from slackware-current, I get the same svn error as above. So my guess is there are version mismatches, and probably icu4c was already installed. Maybe I'll be happy switching to the docker https://hub.docker.com/r/sdorra/scm-manager/ It looks great, but I think it requires some work to make it secure for remote access.
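Errors like the libicui18n.so.56 one above can be diagnosed with ldd, which lists a binary's shared-library dependencies and marks unresolved ones as "not found". A small sketch; the default svnserve path is from the post, and the binary is parameterized so it works on any system.

```shell
#!/bin/sh
# List unresolved shared-library dependencies of a binary with ldd.
# Default path is the svnserve from the post; pass another binary to check it.
BIN=${1:-/usr/bin/svnserve}
ldd "$BIN" | grep "not found" || echo "all shared libraries of $BIN resolved"
```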
  15. Interesting. It should definitely be built in something other than bash, but actually it's a very simple, fundamental idea, so it could relatively easily be built in another language. If I were to do it, I would do it in Scala. We would need a database to put the scan durations in, which in my eyes complicates the setup quite a bit. I wonder if that's over-engineering it a bit. I doubt many would access it, unless there's also a web page to display the timings in some aggregate format. I doubt I'll play more with it. I don't think it's worth it, because in my experience tinkering with cache_dirs, it can never be really good: Linux will discard the cache when it wants, and we don't know whether it has discarded the cache or the system is otherwise busy. I think there are many other projects much more valuable to put my effort into. Actually, I write to a CSV file, which can be opened in Excel. I've just added a 2.2.6 version that puts most or all of the data in a CSV file. I think that might be useful to some. Best Alex