spxlabs


Posts posted by spxlabs

  1. On 10/25/2023 at 4:41 PM, Adam-M said:

    I am a long time fan of SPXLabs!  Thank you for your continued patronage of Unraid and all the wonderful content you provide to the community!

    That's awesome! There aren't many fans out there but the ones that exist I cherish the most.

  2. On 9/24/2023 at 6:43 AM, gus1234 said:

    Hi!

    I'm trying to set up a backup from my main server to my backup server using the blog above but I'm having a bit of trouble. When I run the line:
     

    scp /root/.ssh/id_rsa.pub 192.168.1.9:/root/.ssh/


    I get an error saying port 22: Connection refused. Am I correct in saying that in Management settings within the webGUI I need to set "Use SSH" to yes to fix that issue, or is there something I'm missing?

    I also had another question. Another user mentioned saving the key under /luckybackup/.ssh/. In the next step of the blog, when he copies the key to the backup server, would I enter the following line?

    scp /luckybackup/.ssh/id_rsa.pub <backup server IP>:/luckybackup/.ssh/

    As I don't have luckybackup on my backup server? Or would it just create the directory?

    You will need to enable SSH on both servers.

     

    SCP will not create the directory.

     

    I'm not 100% sure, but I feel like on your backup server you would instead run scp /luckybackup/.ssh/id_rsa.pub <backup server IP>:/root/.ssh/
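    A minimal sketch of that last step, assuming root logins and that the public key has already been copied to the backup server; the helper name and example paths here are just for illustration, not from the blog:

    ```shell
    #!/bin/sh
    # install_pubkey: append a copied public key to authorized_keys with the
    # permissions sshd expects. scp will not create the .ssh directory for
    # you, so the mkdir step matters.
    install_pubkey() {
        keyfile="$1"   # e.g. /tmp/id_rsa.pub, copied over with scp
        sshdir="$2"    # e.g. /root/.ssh on the backup server
        mkdir -p "$sshdir"
        cat "$keyfile" >> "$sshdir/authorized_keys"
        chmod 700 "$sshdir"
        chmod 600 "$sshdir/authorized_keys"
    }

    # On the backup server you might run:
    #   install_pubkey /tmp/id_rsa.pub /root/.ssh
    ```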

     

     

  3. On 4/18/2023 at 7:12 AM, HostCentricUK said:

    Hi Guys/Gals,

     

    I am new here and I am looking to see if there is an updated version of the setup instructions, plus information on how to set the server up to send to the different platforms (Facebook/YouTube/Twitch).

    I'm not aware of any updated instructions from the last two years. However, as far as streaming to different platforms goes: in the nginx.conf you would add a push URL for each of Facebook/YouTube/Twitch.

     

    So something like this.

    rtmp {
        server {
            listen [::]:1935;
            chunk_size 4096;

            application live {
                live on;
                record off;
                push rtmp://live.twitch.tv/app/YOUR_TWITCH_KEY;
                push rtmp://a.rtmp.youtube.com/live2/YOUR_YOUTUBE_KEY;
            }
        }
    }

    Pretty sure anyway.  I haven't looked at this in two years....

  4. On 10/19/2022 at 11:00 AM, Hoopster said:

    My favorite SPX Labs tutorial/video is backing up one unRAID server to another via LuckyBackup.

     

    Even though I don't use this method (already had it set up via scripts), I have seen a lot of new unRAID users asking for something like this (especially with some sort of GUI), so I imagine this is a very useful tutorial.

     

    This video almost didn't make it.  I was extremely frustrated with the setup and container, especially since I discovered some bugs with it.  I'm glad it was highly successful in the end.  Without @ich777 the video definitely would not have existed.  So kudos to him!

     

    The command line is so much easier though, lol.  The GUI stuff is extra steps for sure.
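    For anyone curious, LuckyBackup is a GUI front end for rsync, so the command-line equivalent is basically one rsync invocation. A sketch, with the share path and backup IP being made-up examples:

    ```shell
    # Hypothetical one-liner: mirror a share to a second server over SSH.
    # -a preserves permissions/times and recurses; --delete makes the
    # destination an exact mirror (use with care -- it removes files on
    # the destination that no longer exist on the source).
    #
    #   rsync -av --delete /mnt/user/photos/ root@192.168.1.9:/mnt/user/backup/photos/
    #
    # The same invocation works between local paths, which is an easy way
    # to test your flags before pointing it at a remote machine:
    backup() { rsync -a --delete "$1" "$2"; }
    ```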

  5. On 10/18/2022 at 2:29 AM, jollymonsa said:

    SPX Labs is without a doubt one of the top organic search sites when I am looking for server know-how. I have referred to his work what feels like a hundred times when someone gets a Dell 12th-generation server and then says "it's loud" — yeah, he has the fix for that. https://www.spxlabs.com/blog/2019/3/16/silence-your-dell-poweredge-server < in case YOU need that fix. I didn't know he had a YouTube channel until fairly recently but have subbed and enjoy it a lot. Keep up the good work man!

     

    That is the most visited page on my website of all time; nothing else is even remotely close.  I'm very glad that it is helpful for so many people.  It's just too bad Dell has taken that function away from us.

  6. @Caennanu

     

    Oops, sorry, I did gloss over the CPU usage, lol, my bad.  It is definitely not ideal to use that much CPU, especially if your system is pulling double or triple duty with other services, yikes.  There should definitely be a caveat in the blog post, or a blurb or something, mentioning that YMMV and recommending GPU encoding whenever possible, or Quick Sync at the very least.

     

    So, you bring up something I never really considered, and that is how a setup like this would affect others on the network.  You got me thinking about a much more casual crowd of folks that may not have "pro-sumer/business/enterprise" equipment like I do at home, so the worry of overtaxing the network never crossed my mind.  While I can't update this blog post on Unraid's website, I can update it on mine, and for similar projects in the future I can give a warning about side effects that may occur.  lol, I've definitely annoyed the wife pretty often with all the tinkering I've done on our home network, so this really rings true to me.

     

    Man, you really gave some good feedback here.  I've been so reluctant to learn anything "advanced" with networking, and recently many people have been pushing me to really learn and understand the network aspects of a homelab.  UGH, I really don't want to though, lol.  It's clearly too important not to at least acknowledge, and I think that is the underlying argument I've either willfully ignored or wasn't really able to extract from short commentary.  Either way, I think you got something moving in my head, hahaha.

     

    Oh, and I don't know if it's worth anything, but this also exists: https://unraid.net/es/blog/unraid-6-9-capture-encoding-and-streaming-server, using NVIDIA of course.  Same networking concerns, though.

  7. @Caennanu  All great questions and points!  I'll take a stab at it lol

     

    - The only question I really have is, why would I want to do this for just streaming?

    Part of the reason I wrote this was to give people ideas of other ways to utilize servers that they would like to "do something else with" besides a media server or file share.  I feel posts like this are important because they get people's gears turning to come up with new ideas and try new things (I hope).  Maybe a blog post like this will help someone get more comfortable with Unraid, Docker, and/or services.

     

    - With NVENC encoding (only applicable to NVIDIA GPUs), the load on the gaming system is near to none (compared to 60% CPU utilization of your server with 16 cores / 32 threads?).  That is true with newer NVIDIA GPUs, but NVENC isn't always great.

     

    I will agree that this is not really the most ideal configuration but I guess it really depends on "your" use case.  I think most of the "homelab" community tends to purchase used enterprise equipment that often has high core counts and threads.  Ideally, what you would do instead is use a GPU on the other end instead of a CPU, in my honest opinion, but hey, we can only work with the hardware we have.

     

    On the network side of the house, that's not something I can really speak to because I'm no expert.  QoS could definitely be an issue, but from my somewhat limited experience most equipment doesn't have QoS enabled by default.  The network stuff can be a real rabbit hole because it is super hardware dependent, so forgive me for not really touching this portion.  My blanket stance is: ehhh, it will be fine, I'm not worried about it, and you shouldn't be either.

     

    - Next to this, since everything is virtualized, you (generally) do not have direct access to the virtual machine (Docker is a virtual machine after all).  If your gaming system crashes, you have no more control over the stream since you have no more access.  But the stream still goes on until, and if, you regain control?

     

    From my understanding, full-time streamers prefer to use a second system in case their gaming rig crashes, so their stream doesn't go down.  They often would rather the stream continue on while they get things back up.  Now, ideally, you would be using a capture card and not NDI/the network.  However, I believe the aspiring streamer who is extremely tight on cash would love a method similar to this.  Presumably they would outgrow this use case and move on to a different configuration, like a capture card + Quick Sync on a second system.

     

    - There are a couple more reasons I can think of why I don't think this is a good idea.  But I'd love to hear some pros (aside from offloading recording of the footage).

     

    For me, another big reason is noise and heat!  Especially heat.  Being able to use the network instead of a capture device often means you can place the second system in a different room, further away.  So now you only have one system generating heat in your office/studio/bedroom/place of gaming sessions, instead of two.  Plus all your body heat!  Yikes.  Ethernet cables also allow for more distance and tend to be cheaper than HDMI cables.  HDMI also has length constraints that need to be overcome with "active cables," which may or may not be good quality given the manufacturing process and the loose "standards" that define what is HDMI 1.5, 2.0, or whatever version.

     

    Finally, and more importantly than anything, it's all about options!  It's information for everyone to know that there are multiple ways to tackle a problem, and to find a solution that "you" can implement with the hardware, money, and use case that fit best.  After all, not everything is one size fits all.

    A bit late to this topic, but I daily drive a MBP and have been using Unraid for maybe 5 or 6 years now (the same amount of time with macOS), and I can get 1GBps transfer speeds to my Unraid server.  But I am using an NVMe cache when making those file transfers.  Now, if I send data directly to the array, then yes, the transfer speeds are much slower, limited by the slowest drive in your array.  Occasionally I also have to edit video off of my Unraid server's NVMe cache, and that works great.

     

    I almost never write any data directly to the array.  Even in Windows/Linux I have found performance to be lackluster when writing to the array.  Personally, I believe the array is more for long-term storage than regular access.

     

    Some things I have done in the past were to create SSD arrays and NVMe arrays, and the transfer speeds were much better between macOS and Unraid.  But the cost to have 20TB of SSD storage is way too high for me.  So I bought 4 used 1TB NVMe drives off eBay forever ago and use them as a RAID 10 cache.  Ever since then, I haven't even thought about slow transfer speeds.

     

    Of course there are some problems with this.

    1. Data in your cache isn't safe until it gets moved to the array at night

    2. It can be expensive

    3. You will probably want a UPS to protect your Unraid server from random power outages while your data is in the cache

     

    If you have any specific questions about MacOS and Unraid I will do my best to answer them.

  9. On 3/4/2021 at 2:51 AM, thymon said:

    Hello there

    I have a local server and backup server in another site.

    I follow this tutorial

    worked well so far.

    Last week, my backup server was off for several days (power supply problem).

    The backup server is now on. And the sync no longer works with this error in the script logs :

    rsync: connection unexpectedly closed (0 bytes received so far) [sender]
    rsync error: unexplained error (code 255) at io.c(228) [sender=3.2.3]

     

    Thanks in advance

    @thymon Did you ever get this working?

     

    Just a guess, but perhaps the keys in /boot/config/.ssh did not get copied over during boot on your remote system.  Double-check the "go" script.
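    A common community pattern is to re-copy the keys from the flash drive in the go script on every boot. Sketched here as a function so the paths are explicit; the function name is made up, and the example paths assume you stashed the keys in /boot/config/.ssh:

    ```shell
    #!/bin/sh
    # Hypothetical snippet for /boot/config/go: restore SSH keys from the
    # flash drive on each boot, since /root does not persist across reboots.
    restore_keys() {
        src="$1"   # e.g. /boot/config/.ssh (on the flash drive)
        dst="$2"   # e.g. /root/.ssh
        [ -d "$src" ] || return 0   # nothing stashed, nothing to do
        mkdir -p "$dst"
        cp "$src"/* "$dst"/
        chmod 700 "$dst"
        chmod 600 "$dst"/*
    }

    # In the go script you would call:
    #   restore_keys /boot/config/.ssh /root/.ssh
    ```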

  10. On 10/10/2021 at 1:54 PM, BoKKeR said:

    Does your current setup support renaming files? I have a setup where I rsync my camera files from the Sony camera's SD card to a backup folder, but the Sony camera chooses to name the video files 0001-0002 etc. If I wipe the card and record new footage, rsync would just overwrite the existing backup. Have you faced this problem?

    I have not experienced this.  The filenames generated by my camera are always unique.  I'm not sure what the best approach would be here, but I think my first stab at the problem would be a script that appends the date to the filename the camera generates.  Or stick everything in a unique folder first.
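    A sketch of that first idea: prefix each clip with its modification timestamp so a wiped card's recycled names can never collide.  The function name and the .MP4 extension are assumptions; `date -r` here is the GNU form that reads a file's mtime:

    ```shell
    #!/bin/sh
    # datestamp_files: rename 0001.MP4-style clips to include the file's
    # modification time (e.g. 20231025-164100_0001.MP4), so reused camera
    # filenames cannot overwrite older backups. Run this on the backup
    # folder before (or instead of) letting rsync overwrite in place.
    datestamp_files() {
        dir="$1"
        for f in "$dir"/*.MP4; do
            [ -e "$f" ] || continue                  # no matches in dir
            stamp=$(date -r "$f" +%Y%m%d-%H%M%S)     # file's mtime (GNU date)
            mv "$f" "$dir/${stamp}_$(basename "$f")"
        done
    }
    ```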

  11. On 5/10/2021 at 2:09 PM, jonp said:

     

     So while we may eventually bring in a backup solution, that isn't a promise and in the meantime, you can probably find a variety of ways to backup your system with some basic Googling, but here's a nice write-up from our friend over at @spxlabs:  https://www.spxlabs.com/blog/2020/10/2/unraid-to-remote-unraid-backup-server-with-wireguard-and-rsync

    This was so incredibly nice and unexpected, thank you!

  12. On 4/11/2021 at 1:26 PM, svenvv said:

    As someone who's used obs-ndi on a daily basis for a long time, I do suggest running audio outside of obs-ndi. Either by just running a separate NDI monitor and capturing the audio output, or by using a separate solution like SonoBus (open source).

     

    The reason is a pair of bugs that have not been fixed as of version 4.9.1:

    - Losing audio output completely. This is something that can randomly happen after weeks of use. Your audio output will break, but all the vu-meters and even monitoring still works. You won't notice it until your audience starts screaming at you.

     

    - Desync. After a while there is a chance audio will drift out of sync with the video. Not something you'll notice in a test, but when running long streams this can get problematic.

     

    - When using screen capture HX, instead of the normal screen capture, there's a chance all audio sources go 'robot mode'. There are reports that this may have been fixed by using the NDI 4.6 runtime over the 4.5 runtime that gets installed by default, but I'm just not taking any risks myself.

    That is very interesting; I haven't seen that yet.  I've had this setup going for my Other for about a year, and she has yet to have any issues like that when streaming.  I'm not 100% sure which software versions she is running because my production stuff is different from my test stuff.  I'll definitely keep a lookout for that in the future and maybe pitch in on some forums when we run into issues.  Thanks for the heads up; this is the first I have heard of it.

  13. On 3/20/2021 at 12:56 AM, PzrrL said:

    If I already have a container from linuxserver/swag (with nginx in it), can I just copy the config to swag, instead of creating another container?

    I can't give you a warm fuzzy answer that is a direct yes.  However, in theory it should work.  Just back up your config files, copy them over, and give it a try?  Then let us know?

  14. On 11/1/2020 at 9:52 AM, 172pilot said:

    Interesting - I wasn't aware of LUKS.  I'll have to do more reading on it, but my impression so far is that it is for security on the volume, but once mounted, it's available to anyone (root).  Since I'm talking about streaming backup files to a remote Unraid that I don't own, I'd want to encrypt the data before it leaves, rather than rely on encryption that is managed by the remote system.  Maybe LUKS has a mode to do this; I haven't read much yet, and will do so.  Thanks for pointing out this functionality I didn't know about!

     

    Ahh okay.

     

    I believe LUKS encrypts the entire drive, so on each boot you have to enter a password to unlock the drive before any shares or data become visible.  So I don't think that is what you want.  Well, maybe you do, but LUKS isn't what you are looking for.

     

    I'm no expert and am not sure what the best procedure or way to do what you want is BUUUUTTTTTTT..... I think what I would do is something like this.

     

    From Windows 10, access the share with the subfolders/files that you want to encrypt ---> right-click, click Properties on the folder/file, and you should be presented with an option for encryption.  Encrypt the folder or file, then use rsync to copy the encrypted folder/file over to the other server.  I would not do this to a shared folder, only to a subfolder or file.

     

    I'm sure Linux/macOS have methods similar to Windows for encrypting a specific folder/file.  There are probably plugins that allow encrypting and decrypting mounted shares "on the fly," but unfortunately that isn't an area of familiarity for me.  Hopefully, at a minimum, I've given you enough bread crumbs to figure something out.

  15. On 10/11/2020 at 11:04 AM, 172pilot said:

    One thing I think is missing - unless I missed it while reading - is...  I would LOVE a way to have the data encrypted at the far end automatically.  In other words, I've got a friend with an Unraid server, and we'd both like to be the offsite backup for each other, but I think we'd want on-disk encryption at each other's remote sites.  Is there something on Unraid that would let this happen yet?  I've never seen this option.

    Are you thinking of using something different from LUKS?  LUKS is how Unraid naturally provides disk encryption.