RAINMAN

Members
  • Posts: 179
  • Joined
  • Last visited

Everything posted by RAINMAN

  1. Thanks for the quick reply. Guess I am going to have to give the RC a try to test it out.
  2. I have a separate VLAN for my IoT devices to keep them off my regular trusted network, but I want to control some of those devices using the Node-RED docker. What's the best or recommended way to give only the Node-RED docker access to the IoT network? I was thinking about two options, but there may be something even better that I missed. 1. If there is a way to assign an IP to the docker, I can create a firewall rule bridging that IP to the IoT network. 2. I can install another network card in unRAID, but I am not sure there is a way to force only that docker to use the card. I don't want all of my unRAID box accessible on the IoT network.
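A third option worth noting: newer Docker releases support macvlan networks, which attach a single container to a VLAN interface so only that container gets an address on the IoT subnet. A rough sketch, assuming the IoT network is VLAN 20 on eth0 with subnet 192.168.20.0/24 (the interface, addresses, and image name here are examples, not the poster's actual setup):

```shell
# Create a macvlan network bound to the IoT VLAN interface (eth0.20).
# Subnet, gateway, and parent interface are examples - substitute your own.
docker network create -d macvlan \
  --subnet=192.168.20.0/24 \
  --gateway=192.168.20.1 \
  -o parent=eth0.20 iot_vlan

# Run Node-RED attached only to that network, with a fixed IP on the IoT subnet.
docker run -d --name nodered \
  --network iot_vlan --ip 192.168.20.50 \
  nodered/node-red
```

One quirk to be aware of: containers on a macvlan network can't reach the host itself by default, which in this case is arguably a feature.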
  3. Did that help? I have a different motherboard/processor, but since 6.3, lots of kernel panics. I replaced the motherboard, CPU, and RAM; it's starting to feel fruitless lol.
  4. I'm not sure how home insurance works there, but here they would be responsible for a new server of equivalent specs, not a refurbished one. I'd hit them up with what that server cost new at the time and go from there. Could work out well for you, actually.
  5. As of now, the pricing would be 120 per year. In two years' time it's not certain, though.
  6. I just reviewed the install procedure for CrashPlan Small Business and it looks like the existing docker can be modified relatively easily, as the install looks basically the same. The issue is that the download link is only available from the console in your account. I'm not sure how this would work for the docker. Maybe it can be set as a variable?
  7. I am currently just using SyncToy on a schedule to back up specific folders. I had used Syncovery in the past, as well as EaseUS Todo Backup, but I found full system images took too much space for little value. I really only need my user directory and a few others.
  8. True, but you can set up your other PCs to back up to unRAID, then back up the unRAID array to CrashPlan Pro. Generally you don't need a full system image backed up from each PC; I just set up each one to back up all the user folders and some others that are important. If you wanted full images, you could use other software to do system backups to unRAID and then use CrashPlan Pro, so it's really only one machine you need to have a licence for. That's an option, at least.
  9. Annoying; I just switched from Amazon Cloud Drive, I was a few TB into this, and now it's done. I guess the plus side is I have until Sept 2018, plus 75% off for another year, which isn't bad either, as long as this or another "pro" docker works with the Small Business plan. It's pretty much two years to re-evaluate. I was already backing up my other PCs (just relevant folders) with SyncToy, so I'm not missing anything there. I guess there isn't a way to mount the shares directly into a Windows VM and run Backblaze, is there?
  10. This one just packed it in for me, so I was browsing what others recommended as a good card. I purchased it in Feb, and now all the drives attached to it are giving me errors. Debating what to replace it with.
  11. Is it just me, or are the webGUI drive status spin up/down balls not accurate at all? If I do hdparm -C /dev/sdc I get "drive state is: active/idle", but the ball is grey. If I click the spin-up arrow, the ball turns green and the command still reads active. If I then spin it down, it changes to "drive state is: standby", as I would expect. But the question is: why isn't the webGUI accurately depicting the status of the drives unless I do a manual force?
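For anyone wanting to check this themselves, the GUI state can be compared against what the drives actually report by polling hdparm for each device. A minimal sketch (the device names are examples, and hdparm needs root):

```shell
#!/bin/sh
# Pull the state string ("active/idle" or "standby") out of `hdparm -C` output.
parse_state() {
  sed -n 's/.*drive state is: *//p'
}

# Check a few example devices and print what the drive itself reports.
for dev in /dev/sdb /dev/sdc /dev/sdd; do
  state=$(hdparm -C "$dev" 2>/dev/null | parse_state)
  echo "$dev: ${state:-unknown}"
done
```

Comparing that output against the GUI balls over a few minutes should show whether the page is simply caching stale state.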
  12. A bit complicated, but if you can SSH to the device, you can set up and run the scripts directly from the device. On my DD-WRT router I am running them directly and pushing the data to InfluxDB. You should be able to query SNMP data too, I would think, but I haven't looked into this.
  13. I have a shell script running on my router, run each minute by a cron job, writing to my InfluxDB. You could look into doing this. Here is the topic with some example scripts:
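The shape of such a script is simple: each minute, read a counter and POST an InfluxDB line-protocol point. A hedged sketch — the measurement name, interface, host tag, and InfluxDB address are placeholders, not the actual script from that topic:

```shell
#!/bin/sh
# Format one InfluxDB line-protocol point: measurement,tags fields timestamp_ns
make_point() {
  # $1=measurement  $2=host tag  $3=rx bytes  $4=tx bytes  $5=timestamp (ns)
  echo "$1,host=$2 rx=${3}i,tx=${4}i $5"
}

# On the router (run from cron every minute); guarded so it is a no-op on
# machines without the interface, and a failed POST is tolerated.
if [ -r /sys/class/net/eth0/statistics/rx_bytes ]; then
  RX=$(cat /sys/class/net/eth0/statistics/rx_bytes)
  TX=$(cat /sys/class/net/eth0/statistics/tx_bytes)
  curl -s -XPOST 'http://192.168.1.10:8086/write?db=router' \
    --data-binary "$(make_point wan ddwrt "$RX" "$TX" "$(date +%s%N)")" || true
fi
```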
  14. I honestly don't remember the details; it was a lot of mucking about. If you look up how to use Linux as a domain controller, some tutorials should pop up. Honestly, it was a huge PITA and I don't recommend it. If unRAID 6.4 allows starting VMs before the array, that would be a far better option.
  15. I had this same issue. I got a Raspberry Pi Zero and used that as my secondary domain controller. Then, once I connect to the domain and the VM starts, the primary comes up and it seems OK. Rebooting unRAID, it's a toss-up whether it remembers the domain connection or not. PITA, but it's all I could do to get it working. Maybe there are some better ideas out there.
  16. I'm having almost the exact same issue as you, but I don't think I am running rsnapshot. I need to double check. I wish I had seen this thread before I RMA'd my motherboard and CPU.
  17. I had a similar issue with Nextcloud. I talked with the Nextcloud developer and it was determined it was the docker. I posted somewhere on this forum asking for a change to the docker, as Alpine was the issue, and was basically denied, so I gave up on LDAP. Edit: Here is the post: Glad to see it isn't only me having the issue. I'd love to see it solved so I can use LDAP again.
  18. How are you guys even updating to v12? I updated the docker and ran the updater, but it says I am up to date:
  19. This looks very interesting. I didn't have a lot of luck getting the plugin working how I wanted so I hope this is more successful. Thanks for the hard work on this.
  20. So after playing with this for a bit, I figured out how to sync it to Amazon. If I run a command like this: rclone sync --transfers=10 --bwlimit 5M '/mnt/user/Console/Atari.2600/' encrypted:'Console/Atari.2600' it successfully writes the files to the encrypted drive, but I don't see any of the files in the local mount. If I put the files in the local mount, they get uploaded and I can see them. If I do a copy from /mnt/user/Console/Atari.2600/ to /mnt/disks/Console/Atari.2600/ it copies the files, but I get a lot of file system errors. I definitely think rclone is best to sync; I just don't understand why I can't see the files in my local mount.
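One thing that can trip people up here: a running rclone mount caches directory listings (see the --dir-cache-time flag), so files synced to the remote from another path won't show up in the mount until the cache expires or the mount is restarted. A couple of hedged checks, assuming the remote is named encrypted as in the command above and the mount point is /mnt/disks/Console (the exact paths may differ):

```shell
# Confirm the synced files really are on the remote.
rclone ls encrypted:'Console/Atari.2600'

# Unmount and remount so the mount picks up the remote's current contents.
fusermount -u /mnt/disks/Console
rclone mount encrypted: /mnt/disks/Console &
```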
  21. So I managed to get an encrypted folder. Still playing around with it. So everything I put in my shared folder is only stored in the cloud and not locally? Just trying to understand the best way to utilize this with my existing shares. Any best practices?
  22. Oops, it was, but I misread. For virtual machines you are a bit more limited. You can try: virsh domstats CPU usage is measured in ticks or ns, so you would have to do some math on the calculation. Looks a bit complicated. I think the memory stat just records what is available to the VM, not what is actually used, but I could be wrong. I measure the stats within each of my VMs (actually just one at the moment). But it would be nice to get the total usage of the entire CPU from outside the guest.
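To sketch the math: virsh domstats reports cpu.time as cumulative nanoseconds of CPU consumed, so percent usage means sampling it twice and dividing the delta by the wall-clock interval. A rough, hedged example (the domain name and 10-second interval in the usage comment are placeholders):

```shell
#!/bin/sh
# Extract cumulative cpu.time (nanoseconds) from `virsh domstats` output.
cpu_time_ns() {
  sed -n 's/.*cpu\.time=//p'
}

# Percent of one core used between two samples taken $3 seconds apart.
cpu_pct() {
  # $1 = first sample (ns), $2 = second sample (ns), $3 = interval (s)
  awk -v a="$1" -v b="$2" -v s="$3" 'BEGIN { printf "%.1f\n", (b - a) / (s * 1e9) * 100 }'
}

# On a real host, usage would be roughly:
#   t1=$(virsh domstats --cpu-total win10 | cpu_time_ns); sleep 10
#   t2=$(virsh domstats --cpu-total win10 | cpu_time_ns)
#   cpu_pct "$t1" "$t2" 10
```

Note the result can exceed 100 on multi-vCPU guests, since cpu.time sums time across all vCPUs.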
  23. Haha, OK, so it piqued my interest. Here you go; it's almost done, just needs parsing into a script: docker stats --no-stream $(docker ps --format={{.Names}}) Edit: Even better: docker stats --no-stream $(docker ps --format={{.Names}}) | sed 's/%//g' | grep -v "CONTAINER" | awk '{print $1,$2,$3}' Just throw that into a while read and send to influx. Table: Name, CPU, Memory
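The "throw that into a while read" step might look something like this — a sketch only, with a placeholder InfluxDB address, and noting that the stats column layout (and whether --format exists at all) varies by Docker version:

```shell
#!/bin/sh
# Keep "name cpu mem" per container: drop the % signs and the header row.
parse_stats() {
  sed 's/%//g' | grep -v "CONTAINER\|NAME" | awk '{print $1, $2, $3}'
}

# Live use, guarded so the sketch is harmless when no Docker daemon is present.
# Note: the memory column may carry units (e.g. MiB) that Influx won't accept
# as a number, so it may need stripping depending on your Docker version.
if docker info >/dev/null 2>&1; then
  docker stats --no-stream $(docker ps --format '{{.Names}}') | parse_stats |
  while read -r name cpu mem; do
    curl -s -XPOST 'http://localhost:8086/write?db=docker' \
      --data-binary "containers,name=$name cpu=$cpu,mem=$mem" || true
  done
fi
```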
  24. You can access the docker stats using: docker stats --no-stream Right now this outputs the containers' IDs. There is a --format option that wasn't implemented until version 1.8 of Docker, and I have 1.10 as part of unRAID 6.2.4. With --format you can specify to use the name instead, which would probably be easier long term, as the ID changes when you rebuild or change a container. However, I don't see this option as existing for some reason. Not sure. What you can do instead is run docker ps | grep ContainerID and grab the container name from there. Something like that. I haven't put together a script, but it's on my list, so if someone else gets one going, please share.
  25. Cheers. Didn't even know there were settings for it. Close this please.