54tgedrg45

Members
  • Content Count: 15
  • Joined
  • Last visited

Community Reputation

0 Neutral

About 54tgedrg45

  • Rank: Newbie

Converted

  • Personal Text
    Missing ZFS, iSCSI, NFS4, Podman and enjoying these;
    rsync: symlink "/.../lib/python3.6/os.py" -> "/usr/lib/python3.6/os.py" failed: Operation not supported (95)

  1. Okay, thank you for that, I now feel fine rebuilding the parity without disk1. * https://wiki.unraid.net/index.php/FAQ_remove_drive : the two v6 links at the top seem dead here; did one of them refer to https://wiki.unraid.net/Shrink_array ?
  2. I want to remove disk1 from the array and rebuild the parity without disk1. I've moved all data from disk1 to disk2, however there is still 8.8GB of data on disk1 (hidden?). When I remove disk1 and start the array, three shares show "SOME FILES ARE UNPROTECTED". This is probably because the files were at first terribly scattered across all disks; each share is now assigned to its own disk and the data has been moved/arranged onto that disk (I moved the data so no share shows the "data outside assigned disk" message). All shares that live on the cache only don't show the unprotected message. All shares are a
  3. I was searching for the path of the logs since there is no download button that grabs them. I'm now thinking of putting all output together with the rsync log file at the backup destination (see the rsync logging sketch after this list).
  4. Eum, where are the script output logs stored? After execution (of all scripts) I get the trashcan icon/prompt to delete the log of the task name, but it never tells me the path of the log. How do I view them? No idea where they would be; flash /logs shows none. Edit: OK, they can be found at /tmp/user.scripts/tmpScripts/ # logfile log.txt (see the shell sketch after this list).
  5. This happened to me after stopping the array, changing the DNS, and restarting the array. ty sturmstar
  6. Just started using VMs with Unraid. Made a template Debian VM, copied that into a new folder to set up a new one, but I get confronted with the UEFI shell on boot. The solution for me
  7. With that approach I end up in the UEFI shell at power-on for a copy of a Debian vdisk1.img. Removing the Unraid share makes no difference (v6.8.3). Edit: It seems to be so; I found that when I run the following in the presented UEFI shell for my copied Debian image: fs0: cd efi/debian grubx64.efi then it will boot, but it's not persistent... The fix: https://wiki.debian.org/GrubEFIReinstall (rough sketch after this list) # Reinstalling grub-efi on your hard drive # Check that the computer booted in EFI mode: [ -d /sys/firmware/efi ] && echo "EFI boot on
  8. I wonder how Synology is able to wake up when receiving things like an SMB request; I can only think of basic packet detection done with some BMC interface. For my situation I assigned an old rpi to send magic packets to the Unraid box when known clients return ping status 0, as the operating times vary. At the moment I have this cron job for testing (fleshed-out sketch after this list):
     #!/bin/bash
     # 20200607
     # sudo crontab -e
     # sudo apt-get install etherwake
     # sudo apt-get install fping
     # Config
     MACADDR[0]="AA:BB:CC:DD:EE:FF"
     #MACADDR[1]="AA:BB:CC:DD:EE:FF"
     IPHOSTS[0]="x.x.x.x"
     #IPHOSTS[1]="x.x.x.x"
     IPCLIENS[0]="x.x.x.x"
     IPCLIEN
  9. I have been running this plugin since 28 Feb 2020 on an Intel server board, and it only slept well for a few days straight after client activity; beyond that it was quite random/very rare. The SSD cache move runs daily but takes only a few minutes. The parity check consumes whole Mondays every week. Is there something about btrfs-partitioned drives in the pool? I have one drive (WD2000F9YZ (SE) HDD) with btrfs (only 17.3MB/2TB in use); the other drives/parity are HGST Ultrastar He10/WD2002FAEX with xfs. It keeps Unraid 6.8.3 awake according to the log, while all disks are spun down according to Unraid U
  10. If so, is it somehow possible to just capture the tail of the log for the plugin to display within browser limits?
  11. Is anyone else experiencing an instant Firefox tab hang/crash after clicking the script log of a script running in the background? This happens when running rsync with quite a lot of stdout. Currently running UR 6.8.3 / US 2020.02.27 / FF 73.0.1
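
For items 3, 10 and 11: one way to keep the browser-facing script log small is to let rsync write its own log at the backup destination. --log-file is a standard rsync option; the paths and file name below are placeholders, not anything from the original posts, so treat this as a minimal sketch:

     #!/bin/bash
     # Sketch: rsync logs per-file details to a file at the backup destination,
     # so the User Scripts on-screen log only gets a one-line summary.
     DEST="/mnt/user/backup"                     # assumed backup share
     LOG="$DEST/rsync-$(date +%Y%m%d).log"       # assumed log name

     rsync -a --log-file="$LOG" /mnt/source/ "$DEST/data/"
     echo "rsync exit code: $? (per-file details are in $LOG)"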
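
For item 4: a quick way to poke at those logs from a shell, assuming the /tmp/user.scripts/tmpScripts/ location mentioned in the post. The per-script subfolder layout is an assumption; take the exact path from the ls output:

     # List whatever the User Scripts plugin has left under its temp directory
     ls -lR /tmp/user.scripts/tmpScripts/

     # View or follow a specific script's log.txt (sub-path taken from the listing above)
     less /tmp/user.scripts/tmpScripts/<script name>/log.txt
     tail -f /tmp/user.scripts/tmpScripts/<script name>/log.txt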
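
For items 6 and 7: the UEFI shell commands from item 7, plus the general shape of a grub-efi reinstall, written out as a sketch. This is not a copy of the Debian wiki page; in particular, grub-install --removable (which writes the fallback EFI/BOOT loader) is my guess at why the copied image stops finding grub, so verify against https://wiki.debian.org/GrubEFIReinstall before relying on it:

     # One-time boot of the copied image from the UEFI shell (item 7's commands):
     #   fs0:
     #   cd efi/debian
     #   grubx64.efi

     # Once Debian is up, confirm EFI mode (the check quoted in item 7) and
     # reinstall grub so the fix survives reboots:
     [ -d /sys/firmware/efi ] && echo "EFI boot"
     sudo apt-get install --reinstall grub-efi-amd64
     sudo grub-install --removable   # fallback EFI path; whether it is needed here is an assumption
     sudo update-grub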
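
For item 8: a fleshed-out sketch of that rpi cron job, reusing the variable names visible in the truncated snippet (MACADDR, IPHOSTS, IPCLIENS). Everything after the cut-off, in particular the fping/etherwake loop, is my reconstruction of the described "wake when a known client answers ping" behaviour, not the original script:

     #!/bin/bash
     # Sketch of a wake-on-LAN cron job for an rpi (installed via sudo crontab -e).
     # Prereqs, as in the post: sudo apt-get install etherwake fping

     # Config (placeholders, as in the post)
     MACADDR[0]="AA:BB:CC:DD:EE:FF"   # MAC of the Unraid NIC to wake
     IPHOSTS[0]="x.x.x.x"             # Unraid IP, so we skip waking it if already up
     IPCLIENS[0]="x.x.x.x"            # client IPs whose presence should trigger a wake

     # If any known client answers ping while the server does not, send a magic packet.
     for client in "${IPCLIENS[@]}"; do
         if fping -c1 -t500 "$client" >/dev/null 2>&1; then
             if ! fping -c1 -t500 "${IPHOSTS[0]}" >/dev/null 2>&1; then
                 etherwake "${MACADDR[0]}"
             fi
             break
         fi
     done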