spxlabs

Everything posted by spxlabs

  1. That's awesome! There aren't many fans out there but the ones that exist I cherish the most.
  2. I, for one, welcome our new VP of Global Support Overlord! I'd like to remind them that as a trusted YouTube personality, I can be helpful in rounding up others to toil in their underground homelab caves.
  3. You will need to enable SSH on both servers. SCP will not create the directory for you. I'm not 100% sure, but I think on your backup server you would instead run something like scp /luckybackup/.ssh/id_rsa.pub <backup server IP>:/root/.ssh/
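A rough sketch of what I mean (not tested; the <backup server IP> placeholder and the /luckybackup path are just from my setup, adjust for yours):

      # the destination directory has to exist already, scp will not create it
      ssh root@<backup server IP> "mkdir -p /root/.ssh"
      # then copy the public key over
      scp /luckybackup/.ssh/id_rsa.pub root@<backup server IP>:/root/.ssh/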
  4. I'm not aware of any updated instructions over the last two years. However, as far as streaming to different platforms goes, in the nginx.conf you would add a second push URL for Facebook/YouTube/Twitch. So something like this:

      server {
          listen [::]:1935;
          chunk_size 4096;
          application live {
              live on;
              record off;
              push rtmp://live.twitch.tv/app/YOUR_TWITCH_KEY;
              push rtmp://a.rtmp.youtube.com/live2/YOUR_YOUTUBE_KEY;
          }
      }

Pretty sure anyway. I haven't looked at this in two years....
  5. This video almost didn't make it. I was extremely frustrated with the setup and container. Especially since I discovered some bugs with it. I'm glad it was highly successful in the end. Without @ich777 the video definitely would not have existed. So kudos to him! Command Line is so much easier though lol. The GUI stuff is extra steps for sure.
  6. That is the most visited page on my website of all time, nothing else is even remotely close. I'm very glad that it is helpful for so many people. It's just too bad Dell has taken away that function from us.
  7. If I remember correctly, the CPU tax was still there but near half the amount of just using the CPU alone. I'm not sure what caused it back then. When I've done this more recently between Windows and Linux (not Unraid), the CPU usage is extremely minimal. I feel like they may have made software improvements since these posts were originally created.
  8. @Caennanu Oops, sorry, I did gloss over the CPU usage, lol, my bad. It is definitely not ideal to use that much CPU, especially if your system is pulling double or triple duty with other services, yikes. There should definitely be a caveat in the blog post, or a blurb or something, mentioning that YMMV and recommending GPU encoding whenever possible, or Quick Sync at the very least.

So, you bring up something I never really considered, and that is how a setup like this would affect others on the network. You got me thinking about a much more casual crowd of folks that may not have similar "pro-sumer/business/enterprise" equipment like I do at home, so the worry of over-taxing never crossed my mind. While I can't update this blog post on Unraid's website, I can update it on mine, and in the future for similar projects give a warning for side effects that may occur. lol, I've definitely annoyed the wife pretty often with all the tinkering I've done on our home network, so this really rings true to me.

Man, you really gave some good feedback here. I've been so reluctant to learn anything "advanced" with networking, and recently many people have been pushing me to really learn and understand the network aspects of a homelab. UGH, I really don't want to though, lol. It's clearly too important not to at least acknowledge, and I think that is the underlying argument I've either willfully ignored or wasn't really able to extract from short commentary. Either way, I think you got something moving in my head hahaha.

Oh, and I don't know if it's worth anything, but this also exists, https://unraid.net/es/blog/unraid-6-9-capture-encoding-and-streaming-server, using NVIDIA of course. Same problems about networking though.
  9. @Caennanu All great questions and points! I'll take a stab at it, lol.

- The only question i really have is, why would i want to do this for just streaming?

Part of the reason I wrote this was to give people ideas of other ways to utilize their servers that they would like to "do something else with" besides a media server or file share. I feel posts like this are important because they get people's gears turning to come up with new ideas and try new things (I hope). Maybe a blog post like this will help someone get more comfortable with Unraid, Docker, and/or services.

- With Nvenc encoding (only applicable for nvidia gpu's) the load on the gaming system is near to none (compared to 60% cpu utilization of your server with 16 cores / 32 threads?).

That is true with newer NVIDIA GPUs, but NVENC isn't always great. I will agree that this is not really the most ideal configuration, but I guess it really depends on "your" use case. I think most of the "homelab" community tends to purchase used enterprise equipment that often has high core and thread counts. Ideally, what you would do instead is use a GPU on the other end instead of a CPU, in my honest opinion, but hey, we can only work with the hardware we have. On the network side of the house, that's not something I can really speak to because I'm no expert. QoS could definitely be an issue, but from my somewhat limited experience most equipment doesn't have QoS enabled by default. The network stuff can be a real rabbit hole because it is super hardware dependent, so forgive me for not really touching this portion. My blanket stance is: ehhh, it will be fine, I'm not worried about it, you shouldn't worry about it, I'm not worried at all.

- Next to this, since everything is virtualized, you (generally) do not have direct acces to the virtual machine. (docker is a virtual machine afterall). If your gaming system crashes, you have no more control over the stream since you have no more access. But the stream still goes on till and if you regain control?

From my understanding, full-time streamers prefer to use a second system in case their gaming rig crashes, so their stream doesn't go down. They would often rather the stream continue while they get things back up. Now, ideally, you would be using a capture card and not using NDI/the network. However, I believe an aspiring streamer who is extremely tight on cash would love a method similar to this. Presumably they would outgrow this use case and move on to a different configuration, like a capture card + Quick Sync on a second system.

- There are a couple more reasons i can think off why i don't think this is a good idea. But i'd love to hear some pro's (aside of offloading recording of the footage).

For me, another big reason is noise and heat! Especially heat. Being able to utilize the network instead of a capture device often means you can place the second system in a different room, further away, and Ethernet is generally less expensive. So now you only have one system generating heat in your office/studio/bedroom/place of gaming sessions, instead of two. Plus all your body heat! Yikes. Ethernet cables also allow for more distance and tend to be cheaper than HDMI cables. HDMI cables also have length constraints that need to be overcome with "active cables" that may or may not be good quality, due to the manufacturing process and the loose "standards" that define what is HDMI 1.5, 2.0, or whatever version. Finally, and more importantly than anything, it's all about options!

It's information for everyone to know that there are multiple ways to tackle a problem and find a solution that "you" can implement with the hardware, money, and use case that best fits. After all, not everything is one size fits all.
  10. A bit late to this topic, but I daily drive a MBP and have been using Unraid for maybe 5 or 6 years now (same time with macOS), and I can get 1GBps transfer speeds to my Unraid server. But I am using an NVMe cache when making those file transfers. Now, if I send data directly to the array, then yes, the transfer speeds are much slower, limited by the slowest drive in your array. Occasionally I also have to edit video off of my Unraid server's NVMe cache, and that works great. I almost never write any data directly to the array. Even in Windows/Linux I have found performance to be lackluster when writing to the array. Personally, I believe the array is more for long-term storage than regular access. Some things I have done in the past were to create SSD arrays and NVMe arrays, and the transfer speeds were much better between macOS and Unraid. But the cost to have 20TB of SSD storage is way too high for me. So I bought 4 used 1TB NVMe drives off eBay forever ago and use them as a RAID 10 cache. Ever since then I have never even thought about slow transfer speeds. Of course, there are some problems with this:
1. Data in your cache isn't safe until it gets moved to the array at night
2. It can be expensive
3. You will probably want a UPS to protect your Unraid server from random power outages while your data is in the cache
If you have any specific questions about macOS and Unraid I will do my best to answer them.
  11. @thymon Did you ever get this working? Just a guess, but perhaps the keys in /boot/config/.ssh did not get copied over during boot on your remote system. Double check the "go" script.
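If it helps, this is roughly the kind of thing I'd expect to see in /boot/config/go (a sketch only, assuming the keys really do live in /boot/config/.ssh as above):

      # copy persisted SSH keys back into place on every boot
      mkdir -p /root/.ssh
      cp /boot/config/.ssh/* /root/.ssh/
      chmod 700 /root/.ssh
      chmod 600 /root/.ssh/id_rsa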
  12. I have not experienced this. The filenames generated by the camera are always unique. I'm not sure what the best approach would be here, but I think my first stab at the problem would be a script that appends the date to the filename the camera generates. Or stick everything in a unique folder first.
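Something like this quick sketch is what I have in mind (the folder paths are made up, adjust them to your share):

      #!/bin/bash
      # prepend a timestamp to each incoming file so repeated camera
      # filenames can never overwrite an earlier recording
      SRC=/mnt/user/camera/incoming
      DEST=/mnt/user/camera/archive
      for f in "$SRC"/*; do
          [ -f "$f" ] || continue
          mv "$f" "$DEST/$(date +%Y%m%d-%H%M%S)_$(basename "$f")"
      done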
  13. @ich777 I'm so glad you are keeping up here. I only logged in today because you messaged me lol. I've missed so many posts.
  14. This was so incredibly nice and unexpected, thank you!
  15. It’s just a meme I saw once. Glad you are answering all the questions because uh, idk anything.
  16. That is very interesting, I haven't seen that yet. I've had this setup for my Other for about a year, and she has yet to have any issues like that when streaming. I'm not 100% sure which versions of software she is running because my production stuff is different from my test stuff. I'll definitely keep a lookout for that in the future and maybe pitch in on some forums when we experience some issues. Thanks for the heads up, this is the first I have heard of it.
  17. I can't give you a warm fuzzy answer that is directly a yes. However, in theory it should work. Just copy and back up your config files and give it a try? Then let us know?
  18. I'm fine enough I suppose. It got posted here. https://unraid.net/blog/unraid-capture-encoding-and-streaming-server
  19. Ahh okay. I believe LUKS encrypts the entire drive, so on each boot you have to enter a password to unlock the drive before any shares or data become visible. So I don't think that is what you want. Well, maybe you do, but LUKS isn't what you are looking for. I'm no expert and am not sure of the best way to do what you want, BUUUUTTTTTTT..... I think what I would do is something like this. From Windows 10, access the share with the subfolders/files that you want to encrypt ---> right click, click Properties on the folder/file, and you should be presented with an option for encryption. Encrypt the folder or file, then use rsync to copy the encrypted folder/file over to the other server. I would not do this to a shared folder, only to a subfolder or file. I'm sure Linux/macOS have methods similar to Windows for encrypting a specific folder/file. There are probably plugins that allow encrypting and decrypting mounted shares "on the fly", but unfortunately that isn't an area of familiarity for me. Hopefully, at a minimum, I've given you enough bread crumbs to figure something out.
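For the rsync step, a bare-bones example would look something like this (the share path and hostname are just placeholders):

      # copy the already-encrypted subfolder to the other server, preserving attributes
      rsync -avh /mnt/user/share/encrypted-folder/ root@otherserver:/mnt/user/backup/encrypted-folder/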
  20. Are you thinking of using something different from LUKS? LUKS is how Unraid natively provides disk encryption.
  21. Hey Spencer, just dropping in to say thank you, for the 10th or 11th time. Have fun unpacking!
  22. Lite-On, some "enterprise" NVME drives.