spxlabs

Everything posted by spxlabs

  1. This was so incredibly nice and unexpected, thank you!
  2. It’s just a meme I saw once. Glad you are answering all the questions because uh, idk anything.
  3. That is very interesting, I haven't seen that yet. I've had this setup for my Other for about a year and she has yet to have any issues like that when streaming. I'm not 100% sure which versions of software she is running because my production stuff is different from my test stuff. I'll definitely be keeping a lookout for that in the future and maybe pitch in on some forums when we experience some issues. Thanks for the heads up, this is the first I have heard of it.
  4. I can't give you a warm fuzzy answer that is directly a yes. However, in theory it should work. Just copy and back up your config files and give it a try? Then let us know?
  5. I'm fine enough I suppose. It got posted here. https://unraid.net/blog/unraid-capture-encoding-and-streaming-server
  6. Ahh okay. I believe LUKS encrypts the entire drive, so on each boot you have to enter a password to unlock the drive before any shares or data become visible. So I don't think that is what you want. Well, maybe you do, but LUKS isn't what you are looking for. I'm no expert and am not sure what the best procedure for what you want is, BUUUUTTTTTTT..... I think what I would do is something like this: from Windows 10, access the share with the subfolders/files that you want to encrypt, right click the folder/file, open Properties, and you should be presented with an option for encryption. Encrypt the folder or file, then use rsync to copy the encrypted folder/file over to the other server (rough sketch below). I would not do this to a shared folder, only to a subfolder or file. I'm sure Linux/macOS have similar methods to Windows for encrypting a specific folder/file. There are probably plugins that will allow for encrypting and decrypting mounted shares "on the fly", but unfortunately that isn't an area of familiarity for me. Hopefully, at a minimum, I've given you enough bread crumbs to figure something out.
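      A rough sketch of that rsync step, just to make it concrete (the folder path and the destination host below are made-up placeholders, not anything from a real setup):

          # Copy an already-encrypted subfolder over to the other server with rsync.
          # -a preserves permissions/timestamps, -v is verbose, -z compresses in transit.
          import subprocess

          SOURCE = "/mnt/user/documents/encrypted-folder/"   # placeholder: encrypted subfolder on this share
          DEST = "user@other-server:/mnt/user/backups/"      # placeholder: the other Unraid box

          subprocess.run(["rsync", "-avz", SOURCE, DEST], check=True)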
  7. Are you thinking of using something different from LUKS? LUKS is how Unraid natively provides disk encryption.
  8. Hey Spencer, just dropping in to say thank you, for the 10th or 11th time. Have fun unpacking!
  9. Lite-On, some "enterprise" NVME drives.
  10. @johnnie.black I guess my problem is my failure to understand why I haven't experienced any negative side effects. If the SSDs are unable to zero out, then why is the performance still so amazing after roughly 42TB of data written to the array?
  11. So, I have been working on a project using all SSDs in an array and I fear that my testing methodology is giving me invalid beliefs and results.

      The Array
      - Single parity, 1TB SSD
      - 2 data drives, 1TB each
      - Same brand and model SSD for all 3
      - No cache

      The Test
      - 1 run = transferring a 48GB video 42 times
      - after each run I do a parity check
      - I randomly select a video to view and see if it plays
      - after each run I delete all the data and start again
      - my array has 2TB of free capacity
      - my SSDs do support trim; I am not using a trim plugin (on purpose)

      I am trying to determine when (or even if) write degradation happens in an all-SSD array. I have no idea how to properly test for such a thing, so any advice would be impactful. (A rough sketch of one run is below.)

      Run #21
      I just completed my 21st successful run and the transfer speeds have not significantly varied yet. I have not averaged the data from each run and each parity check yet, but a cursory glance has not shown any noticeable change in writes and reads. As you can imagine, after completing my 21st successful run, or transferring an approximate total of 42,336GB (roughly 42TB) of data over the course of one and a half weeks-ish, I figured I would have perhaps noticed something. There has been little deviation in parity check times as well, at least none that my eye has perceived.

      Crossroads
      I am still compiling my data at this time and am not truly ready to reveal what I have recorded, but I have reached a point where I am either testing incorrectly or misunderstanding what is actually going on.
      - What am I doing wrong?
      - What should I be testing for instead?
      - Is there perhaps a better test than doing network file transfers?
      - In what form will I see problems?

      Yes, I am aware this isn't officially supported. I am just a curious individual and was wondering what the threshold is for performance degradation and/or parity issues with an SSD array. Furthermore, I am aware that if I wanted the ultimate performance I could use a RAID1 cache or some other setup. That is not the point of this project/test. Thank you though.
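      For anyone curious what one "run" looks like in script form, here is a rough Python sketch (it copies locally rather than over the network like my actual test, and the video path and share path are just placeholders):

          # One run: copy a ~48GB video to the array 42 times, timing each copy
          # so the average write speed per run can be compared across runs.
          import os
          import shutil
          import time

          SOURCE_VIDEO = "/mnt/user/isos/test-video.mkv"   # placeholder ~48GB test file
          DEST_DIR = "/mnt/user/ssd-array-test/"           # placeholder all-SSD array share
          COPIES_PER_RUN = 42

          size_mb = os.path.getsize(SOURCE_VIDEO) / 1024**2
          speeds = []

          for i in range(COPIES_PER_RUN):
              dest = os.path.join(DEST_DIR, f"copy_{i:02d}.mkv")
              start = time.time()
              shutil.copyfile(SOURCE_VIDEO, dest)
              speeds.append(size_mb / (time.time() - start))   # MB/s for this copy
              print(f"copy {i + 1}/{COPIES_PER_RUN}: {speeds[-1]:.0f} MB/s")

          print(f"run average: {sum(speeds) / len(speeds):.0f} MB/s")
          # The parity check, the random playback test, and deleting everything in
          # DEST_DIR happen after this (from the Unraid GUI) before the next run.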
  12. @Ford Prefect Ahh okay. I'll try and figure out the commit part. After playing around with other containers and this one some more, it's starting to look like a no-go, at least for my skill set. I really didn't want to use a VM, but that may be the best way to go.
  13. Okay well, I got it working. Unfortunately, none of the settings get retained if the server reboots or if I make changes to the docker and it restarts. Super frustrating. However, I managed to update Ubuntu and OBS, and install other tools for some troubleshooting. After a few hours, I could not figure out why the NDI plugin does not pick up any NDI streams. I can ping the docker, it can ping my gaming rig, but it cannot see any NDI stream. Thoughts or experience?
  14. @Ford Prefect Thanks! I'll give it a go and see what happens. Mind if I pester you if things don't work out?
  15. @Nanobug Well if I remember I can just let you know here. Also, I started working late last night. I have no idea how long it will take me to complete. But if you don't hear from me in like a month, I either died, gave up, or forgot. Sooooooooooo oops.
  16. @Nanobug The server does not do any encoding/transcoding/decoding work. It simply takes a single RTMP stream and distributes that stream to multiple platforms of your choosing. Fear not. I am actively working on an OBS-Server container to take a stream from your gaming computer, encode/transcode/decode the stream, then send it off to Twitch. But it is proving rather difficult due to my lack of understanding of Docker and poor documentation. Stay tuned.
  17. Hey there person. I tried setting this up last night and could not get it to launch. I set the following: a port for VNC as well as the http://[IP]:[PORT] web port, and a path mapping of /root/.config/obs-studio to /mnt/cache/appdate/obs-studio. Not sure what else needs to be created or set. It would be really cool if you have more insight on how to get it to work inside of Unraid. For people like me, all the build instructions, how-to-run/start notes, etc. are not helpful. I haven't found any container that lists which config files, if any, it needs in order to run. I'm not a Docker guru so I'm not entirely sure when I need to use a Variable/Path/Port/etc. I'm just a total scrub. Anyway, if you have more information on what needs to be done in Unraid, that would be very helpful.
  18. Yeah that's what I mean, super bizarre. lol
      - Probably
      - To be clear, this docker does not do any transcoding; your gaming rig will do all the transcoding work. This setup simply allows you to stream to multiple places at once.
      - When you port forwarded, did your brother use your public IP:port to reach your server?
      - He could VPN into your network as an alternative to exposing your system to the internet.
  19. @itimpi is right, the # is there to comment that line out and it shouldn't have been read by the container. If you copied and pasted, there could have been a hidden character that was breaking things. But if you added that line in and it worked, then that makes things super bizarre. Anyway, glad you got it working.
  20. You could duplicate the container, but you would most likely have to change the ports. I have a feeling that if both clients stream to the same container, things are not going to work. So something like:
      Client 1 and Container 1: port 1935
      Client 2 and Container 2: port 1936
      That should work in my mind, but this is not something I have tested (rough sketch below). Something I am planning to work on is figuring out how to get two computers to stream to the server and then to Twitch, with both computers basically in split screen. However, it seems the best way to do this is to use OBS with NDI on the server, then within the canvas just make sure both computers have equal space. I'd like to do this with a container instead of a VM, but it is proving rather difficult.
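      Roughly what I mean, sketched with the Docker SDK for Python (the image name and container name are placeholders, since I haven't actually tested running two instances side by side):

          # Start a second copy of the RTMP container on a different host port, so
          # Client 2 streams to host port 1936 while Client 1 keeps using 1935.
          import docker

          client = docker.from_env()

          client.containers.run(
              "some/rtmp-restream-image",   # placeholder image name
              name="rtmp-restream-2",       # placeholder container name
              detach=True,
              ports={"1935/tcp": 1936},     # container port 1935 exposed on host port 1936
          )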
  21. Yup, audio worked for me. Only thing I can think of is to double-check your audio capture settings, which I'm sure you have done.
  22. Looks like we need more questions, I'll hop aboard. What would you consider is the luckiest thing to ever happen to you, of all time?