Everything posted by spxlabs

  1. @thymon Did you ever get this working? Just a guess but perhaps the keys in /boot/config/.ssh did not get copied over during boot up on your remote system. Double check the "go" script.
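For reference, a minimal sketch of what the key-copying lines in /boot/config/go might look like. The filenames under /boot/config/.ssh are an assumption about the setup, not taken from the original post:

```shell
# Hypothetical /boot/config/go fragment (this script runs at boot).
# Key filenames under /boot/config/.ssh are placeholders for whatever
# was originally generated for the remote system.
mkdir -p /root/.ssh
cp /boot/config/.ssh/* /root/.ssh/
chmod 700 /root/.ssh
chmod 600 /root/.ssh/*
```

If those lines are missing, the keys never make it into /root/.ssh after a reboot, which would explain the remote side failing.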
  2. I have not experienced this. The filenames generated by the camera are always unique. I'm not sure what the best approach would be here but I think my first stab at the problem would be to have a script that appends the date to the filename that camera generates. Or stick everything in a unique folder first.
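That append-the-date idea could look something like this minimal shell sketch; the helper name and paths are made up for illustration:

```shell
# Sketch: copy a camera file into a destination folder with a timestamp
# prefix so repeated filenames from the camera never collide.
# timestamp_copy is a made-up helper name; paths are placeholders.
timestamp_copy() {
  src="$1"
  dest_dir="$2"
  mkdir -p "$dest_dir"
  stamp=$(date +%Y%m%d-%H%M%S)
  cp -p "$src" "$dest_dir/${stamp}_$(basename "$src")"
}
```

Pointing the camera's uploads at a staging folder and running something like this on a schedule (e.g. via the User Scripts plugin) would keep every file unique before it lands in the share.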
  3. @ich777 I'm so glad you are keeping up here. I only logged in today because you messaged me lol. I've missed so many posts.
  4. This was so incredibly nice and unexpected, thank you!
  5. It’s just a meme I saw once. Glad you are answering all the questions because uh, idk anything.
  6. That is very interesting, I haven't seen that yet. I've had this setup for my Other for about a year and she has yet to have any issues like that when streaming. I'm not 100% sure which versions of software she is running because my production stuff is different from my test stuff. I'll definitely be keeping a lookout for that in the future and maybe pitch in on some forums when we experience some issues. Thanks for the heads up; this is the first I have heard of it.
  7. I can't give you a warm fuzzy answer that is directly a yes. However, in theory it should work. Just copy and backup your config files and give it a try? Then let us know?
  8. I'm fine enough I suppose. It got posted here. https://unraid.net/blog/unraid-capture-encoding-and-streaming-server
  9. Ahh okay. I believe LUKS encrypts the entire drive, so on each boot you have to enter a password to unlock the drive before any shares or data become visible. So I don't think that is what you want. Well, maybe you do, but LUKS isn't what you are looking for.
I'm no expert and am not sure what the best procedure for what you want is, BUUUUTTTTTTT... I think what I would do is something like this. From Windows 10, access the share with the subfolders/files that you want to encrypt, then right-click the folder/file, click Properties, and you should be presented with an option for encryption. Encrypt the folder or file, then use rsync to copy the encrypted folder/file over to the other server. I would not do this to a shared folder, only to a subfolder or file. I'm sure Linux/macOS have similar methods to Windows for encrypting a specific folder/file.
There are probably plugins that will allow for encrypting and decrypting mounted shares "on the fly", but unfortunately that isn't an area of familiarity for me. Hopefully, at a minimum, I've given you enough bread crumbs to figure something out.
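The rsync step suggested above might look something like this; the hostname and share paths are placeholders, not details from the original post:

```shell
# Hypothetical rsync invocation: push the already-encrypted subfolder
# to the second server. Hostname and paths are placeholders.
rsync -avh --progress \
  /mnt/user/share/encrypted-folder/ \
  root@backupserver:/mnt/user/backup/encrypted-folder/
```

The trailing slash on the source copies the folder's contents rather than nesting the folder itself.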
  10. Are you thinking of using something different from LUKS? LUKS is how Unraid naturally provides disk encryption.
  11. Hey Spencer, just dropping in to say thank you, for the 10th or 11th time. Have fun unpacking!
  12. Lite-On, some "enterprise" NVMe drives.
  13. @johnnie.black I guess my problem is my failure to understand why I haven't experienced any negative side effects. If the SSDs are unable to zero out, then why is the performance still so amazing after roughly 42TB of data written to the array?
  14. So, I have been working on a project using all SSDs in an array and I fear that my testing methodology is giving me invalid beliefs and results.
The Array
- Single parity, 1TB SSD
- 2 data drives, 1TB each
- Same brand and model SSD for all 3
- No cache
The Test
- 1 run = transferring a 48GB video 42 times
- After each run I do a parity check
- I randomly select a video to view and see if it plays
- After each run I delete all the data and start again
- My array is 2TB in free capacity
- My SSDs do support TRIM; I am not using a trim plugin (on purpose)
I am trying to determine when (or even if) write degradation happens in an all-SSD array. I have no idea how to properly test for such a thing, so any advice would be impactful.
Run #21
I just completed my 21st successful run and the transfer speeds have not significantly varied yet. I have not averaged the data from each run and each parity check yet, but a cursory glance has not yielded any noticeable change in writes and reads. As you can imagine, after completing my 21st successful run, transferring an approximate total of 42,336GB (about 42TB) of data over the course of one and a half weeks-ish, I figured I would have perhaps noticed something. There has been little deviation in parity check times as well, at least none that my eye has perceived.
Crossroads
I am still compiling my data at this time and am not truly ready to reveal what I have recorded, but I have reached a point where I am either incorrectly testing or I am misunderstanding what is actually going on.
- What am I doing wrong?
- What should I be testing for instead?
- Is there perhaps a better test than doing network file transfers?
- In what form will I see problems?
Yes, I am aware this isn't officially supported. I am just a curious individual and was wondering what the threshold is for performance degradation and/or parity issues with an SSD array. Furthermore, I am aware that if I wanted the ultimate performance I could use a RAID1 cache or some other setup.
That is not the point of this project/test. Thank you though.
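One way to make the per-run numbers less eyeball-based would be a timed sequential write directly against the array, logged after each run. This is a rough sketch under assumptions (dd with an fsync as the workload; function name, target path, and size are placeholders), not a rigorous benchmark:

```shell
# Sketch: time a sequential write to the array and print MB/s, so each
# run's number can be logged and compared later instead of eyeballed.
# measure_write is a made-up helper; target and size are placeholders.
measure_write() {
  target="$1"       # e.g. /mnt/disk1/speedtest.bin
  size_mb="$2"      # e.g. 4096
  start=$(date +%s)
  dd if=/dev/zero of="$target" bs=1M count="$size_mb" conv=fsync 2>/dev/null
  end=$(date +%s)
  elapsed=$((end - start))
  [ "$elapsed" -eq 0 ] && elapsed=1   # avoid divide-by-zero on fast runs
  echo "$((size_mb / elapsed)) MB/s"
  rm -f "$target"
}
```

Running this against each data disk after every run would take network variability out of the picture and make any write degradation show up as a trend in the logged numbers.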
  15. @Ford Prefect Ahh okay. I'll try and figure out the commit part after playing around with other containers and this one some more. It's starting to look like a no-go, at least for my skill set. I really didn't want to use a VM but it's starting to look like that may be the best way to go.
  16. Okay well, I got it working. Unfortunately none of the settings get retained if the server reboots or if I make changes to the docker and it restarts. Super frustrating. However, I managed to update the Ubuntu, obs, and install other tools for some troubleshooting. After a few hours, I could not figure out why the NDI plugin does not pick up any NDI streams. I can ping the docker, it can ping my gaming rig, but it cannot see any NDI stream. Thoughts or experience?
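One guess worth testing: NDI discovery relies on mDNS, which by default does not cross Docker's bridge network, so a container on the bridge can ping hosts but never sees their NDI announcements. Running the container with host networking is a common workaround. This is an assumption about the cause, and the image name below is a placeholder:

```shell
# Hypothetical: put the container on the host's network stack so mDNS
# (which NDI uses for discovery) reaches it. Image name is a placeholder.
docker run -d --name obs --network host some/obs-image
```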
  17. @Ford Prefect Thanks! I'll give it a go and see what happens. Mind if I pester you if things don't work out?
  18. @Nanobug Well if I remember I can just let you know here. Also, I started working late last night. I have no idea how long it will take me to complete. But if you don't hear from me in like a month, I either died, gave up, or forgot. Sooooooooooo oops.
  19. @Nanobug The server does not do any encoding/transcoding/decoding work. It simply takes a single RTMP stream and distributes that stream to multiple platforms of your choosing. Fear not. I am actively working on an OBS-Server container to take a stream from your gaming computer and encode/transcode/decode the stream, then send it off to Twitch. But it is proving rather difficult due to my lack of understanding of Docker and the poor documentation. Stay tuned.
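For anyone curious, a restream setup like the one described is commonly built on the nginx RTMP module. This is a generic sketch with placeholder stream keys, not the exact config from the post:

```nginx
# Generic nginx-rtmp sketch: accept one incoming RTMP stream and push
# copies to multiple platforms. Stream keys are placeholders.
rtmp {
    server {
        listen 1935;
        application live {
            live on;
            record off;
            push rtmp://live.twitch.tv/app/YOUR_TWITCH_KEY;
            push rtmp://a.rtmp.youtube.com/live2/YOUR_YOUTUBE_KEY;
        }
    }
}
```

Because push just relays the incoming bytes, the server never re-encodes anything, which matches the "no transcoding" point above.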
  20. Hey there person. I tried setting this up last night and could not get it to launch. I set the following:
- Port: for VNC as well as the http://[IP]:[PORT] address
- Path: /root/.config/obs-studio mapped to /mnt/cache/appdate/obs-studio
Not sure what else needs to be created or set. It would be really cool if you have more insight on how to get it to work inside of Unraid. For people like me, the build instructions, how to run/start, etc. are not helpful. I haven't found any container that lists which config files, if any, the container needs to run. I'm not a Docker guru, so I'm not entirely sure when I need to use a Variable/Path/Port/etc. I'm just a total scrub. Anyway, if you have more information on what needs to be done in Unraid, that would be very helpful.
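Expressed as a docker run command, the mappings described above would look roughly like this. The image name and host port are assumptions, and the host path is spelled appdata here on the guess that the standard Unraid location was intended:

```shell
# Hypothetical docker run equivalent of the Unraid template settings.
# Image name and host port are placeholders; the -v mapping is what
# persists OBS settings across container restarts.
docker run -d \
  --name obs-studio \
  -p 8080:8080 \
  -v /mnt/cache/appdata/obs-studio:/root/.config/obs-studio \
  some/obs-studio-image
```

If the container path is mapped to a host folder like this and settings still vanish on restart, the container is probably writing its config somewhere other than /root/.config/obs-studio.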
  21. Yeah that's what I mean, super bizarre. lol
- Probably
- To be clear, this docker does not do any transcoding; your gaming rig will do all the transcoding work. This setup simply allows you to stream to multiple places at once.
- When you port forwarded, did your brother use your public IP:port to reach your server?
- He could VPN into your network as an alternative to exposing your system to the internet.