
falconexe

Community Developer

Posts posted by falconexe

  1. On 9/9/2020 at 9:00 PM, falconexe said:

    My UNRAID server went from being amazingly useful to LUDICROUS MODE in less than a week, and I’ve been running UNRAID since 2014...

     

    Well, Nextcloud was the best thing since sliced bread, until THIS!!! What's after LUDICROUS MODE? Comb the Desert Mode? Anyone? Anyone? 😂

     

     

    Next Level UNRAID...

  2. I just set up Nextcloud last night with a reverse proxy (subdomain, Let's Encrypt) based on @SpaceInvaderOne's 2 videos, and it works like a charm. I've been offloading my iOS photos all day, dang near instantaneously, even with the secure encryption. Minor tweaking from his vids in 2020, but not too bad. Definitely some things you need to know though.
     

    I can even use the UNRAID shares as if they are native Nextcloud folders via the External Storage plugin. I can now ingest/output data from any client/device/Unraid as a single secure bucket, internally and externally.

     

    Setting up upload request folders and/or sharing folders with links is going to be amazing for friends and family, especially during the holidays. I feel like I just consolidated all my social circles in 1 spot, and I own it. Not social media companies. Finally, friends and family can now even use my massive server as an offsite backup repository, and that then gets backed up into the cloud (Duplicacy - See Below).

     

    I shared 1.5 GB of photos off my brother's old phone from 6 years ago (he lost the photos and didn't even know I had them) and he downloaded the entire folder in minutes. Safer than WeTransfer because it is my server and my domain, and everything is secure. I'm super impressed! Needless to say, Dropbox is getting dropped.

     

    On the native UNRAID backup front, I ended up going with Duplicacy (Duplicati was unstable) and unlimited Google Drive for Business. I am running both my backups and Nextcloud off my personal domain, registered on the cheap with GoDaddy, plus Duck DNS.

     

    My UNRAID server went from being amazingly useful to LUDICROUS MODE in less than a week, and I’ve been running UNRAID since 2014...

     

    Let me know if you need any help. I’ve been busy on these topics recently...

     

    As always, I wanted to personally thank Ed @SpaceInvaderOne for his extremely helpful advice and videos!

  3. On 9/1/2020 at 6:46 PM, CS01-HS said:

    I use Duplicacy for local backups (network share and USB). I recently restored two ~60GB VMs as a test, one from each source. Worked fine, although you can't navigate away from the restore screen or it'll stop. Other than that and the hassle of the initial setup, it's worked fairly well.

     

    Add the password as a container variable, or automated backups after a container update will fail pending manual log-in.

    Thanks for this tip. I've set it up like this now.

    • Like 1
  4. On 9/30/2018 at 12:32 AM, Chrysen said:

    In the Nextcloud Docker container settings, add your path, e.g.:

    /mnt/user/xxx/Music (Unraid)     /Music (Docker container)

    Then install the External Storage app in Nextcloud; there you can set up your /Music folder (and choose the sharing options).

     

    @Chrysen Thanks for this info! This was already installed in the latest version of NextCloud. I just had to enable it.

     


    • Like 1
  5. 8 minutes ago, mgutt said:

    I'm using only one PC while testing. And I have new test results: transferring to \\tower\cache\music (again bypassing shfs) is slow while the backup is running, but transferring to \\192.168.178.9\cache\music with the same user credentials and same ethernet ports is fast?!

     

     

    Maybe you can test this scenario as well? For me it looks like Unraid is not good at multi-threading one smb user.

     

    Hmm, that is VERY ODD. Almost sounds like a DNS issue within your network. When I get some time, I'll run side-by-side comparisons as follows and report back.

     

    • \\HostName\Cache\Share
    • \\IPAddress\Cache\Share

     

     

    Also, did you edit your HOSTS file in Windows 10? This is how I resolve both the 10GbE peer-to-peer connection alongside the 1GbE standard network connection. Both are resolvable by my UNRAID host name (or IP).

     

    https://www.groovypost.com/howto/edit-hosts-file-windows-10/

  6. 2 hours ago, mgutt said:

    So it's the same as @falconexe has proven. Writing to a share causes an extreme slowdown.

    Thanks for confirming @mgutt. I am still having this issue without my custom workaround. I appreciate all of the testing you did. Interesting that 2 computers give different results with regards to load/processes.

     

    I am going to build a new Windows 10 rig next month. Once it is up, I'll try again and see if I get any different results. I am also almost finished building a new house with CAT7 wired throughout and a dedicated server room. I am curious if the new network environment will afford me higher speeds. Currently, I lose 66% of my throughput due to this SHFS overhead.

     

    I'll subscribe to this thread and continue to watch. Cheers!

  7. So, Duplicati is terrible. I couldn't even get my initial 7TB backup set to complete due to all sorts of database/block issues. Basically, if you interrupt the initial set in any way (like gracefully pausing, or stopping after the current file finishes ***BROKEN***), the entire thing fails. I tried 7 times to get it done, and each time some kind of fatal exception occurred. At one point I uploaded 2TB, only to have it fail once I stopped it (gracefully) per the documentation, in the hopes that I could resume later. After reading many posts in their own forums, many others have had issues completing very large initial backup sets. I was hopeful, but now just depressed ha ha.

     

    Now I'm on to the paid Duplicacy. So far, not really impressed, but I'm trying the same 7TB backup set now. It does appear to be able to start/stop the initial set via a native partial backup snapshot function, so I am hopeful this product will succeed where the other free one failed. However, I am trying to restore a 30GB backup I did last night, and it is stuck on "Starting" for like 30 minutes, so.....not looking great.

     

    I'll report back my findings. At this point, my only other option if this fails is to try RClone, or go back to my Windows 10/Backblaze solution.

     

    @johnwhicker, I didn't want to believe that ALL of these options are nonviable, but you appear to be correct. Which completely SUCKS!

  8. Hi, I've been holding off on replying to this topic because I've been in the same boat, trying to figure out a viable long-term solution, and I recently came across one that is working well for me so far.

     

    Once Crashplan was discontinued, I had given up on a native UNRAID cloud backup solution a few months back and went with BackBlaze. I was using a Windows 10 machine to sync my unRAID shares with software called "SyncFolders" and a fleet of USB3 portable drives outside of UNRAID. Needless to say, it was a bit much, but it worked.

     

    I recently came across both Rclone (file syncing) and Duplicati (true block-based backups). I tried both, and I ended up using Duplicati with an encrypted cloud-based solution, specifically GSuite (Google Drive for Business). They currently offer unlimited storage for $12/month, and they do not enforce the 5-user minimum. However, you do have to own your own domain to use this solution. I already had one, so no big deal.

     

    The only thing with Google Drive is that you can only upload 750GB/day. I hit exactly 725.4GB before Duplicati started throwing server-side errors. I have since throttled my uploads to 8 MB/s to keep it under this ceiling. (The math works out to 691.2 GB/day [8 MB/s * 86,400 s / 1000 = 691.2 GB/day]; 9 MB/s puts it over, and the parameter has to be a whole number.) This should keep Duplicati happy and support uninterrupted backups during my initial upload set. This would never be a problem once all files are initially backed up, but it is an interesting facet of this solution's workflow.

     

    Other than that, I have had no issues. Restoring works in my testing with various file types and sizes. I'll be testing a full 8TB restore soon, once the initial backup set is completed. Hopefully I won't run into the issues you did. I am interested to hear what specifically your issues were with Duplicati and restores.

     

    It looks like you have not tried Rclone yet, so it may be worth a shot. Here are some great tutorials by @SpaceInvaderOne.

     

     

    Great tutorial if you just want a straightforward encrypted cloud sync via Rclone:

     

     

     

    Here is another way of performing block-based backups (true backups, not syncing) with Duplicati. You can use the same cloud services, or even UNRAID to UNRAID. This is what I am currently using.

     

     

     

     

    Hopefully this helps you and/or others who will be going down this path. I also intend to use Duplicati to back up UNRAID to UNRAID. Let me know if you have any questions. Good luck!

     

  9. Sorry you are having so many issues. I also have a very expensive rig... I have seen some other posts about specific settings required for AMD Threadripper CPUs; without those settings, you can get all kinds of freezing and intermittent lock-ups. Definitely search the forum, there is help out there. (EDIT: Looks like you did that, "Ryzen Enhancements Enabled".)

     

    No need to throw away unRAID. I've been using it in a very technical/professional environment since 2014, and there has never been an issue I couldn't solve. Good Luck!

  10. 1 hour ago, plxmediasvr said:

    I need help majorly bad.

    Two words: unBalance. I've moved literally hundreds of TB with it and have never had an issue. My largest combined transfer was re-dispersing 100TB, and it worked like a champ. It is just a GUI over the CLI, but it is really great.

     

    You may have some kind of hardware issue, but if you are getting stalls after 25GB, that is NOT normal. Try installing the unBalance plugin, head over to its webpage (YourIP:6238), and give it a shot. If you do, I am very curious whether you still have the same issue.

  11. 7 hours ago, falconexe said:

    I just setup GSuite tonight and have my ENTIRE server syncing to the cloud and uploading at a staggering 250Mbit/s (Peaks at 35 Megabytes/s!). I have fiber with 1Gig Up though...So far, so good!

     

    Welp, there are drawbacks to having fiber internet. I already hit the limit of GSuite (Google Drive) in just a few hours at those sustained upload speeds. So apparently, you can only upload 750GB in a single 24-hour time-frame (shared across all GSuite products).

     

    I hit exactly 725.4GB before Duplicati started throwing server-side errors. I have since throttled my uploads to 8 MB/s to keep it under this ceiling. (The math works out to 691.2 GB/day [8 MB/s * 86,400 s / 1000 = 691.2 GB/day]; 9 MB/s puts it over, and the parameter has to be a whole number.) This should keep Duplicati happy and support uninterrupted backups during my initial upload set. This would never be a problem once all files are initially backed up, but it is an interesting facet of this solution's workflow.

     

     


  12. 9 hours ago, BRiT said:

     

    You should check out this thread on how Google Drives and Rclone can provide unlimited storage for around $12 a month. They don't actually require 5 users and they don't limit storage to 1TB when less than 5 users. Some users have well over 200 TBs.

     

    I was using BackBlaze via a fleet of external USB3 drives and a Windows 10 client with SyncFolders. Needless to say, it was a bit much ha ha. I've been looking for something quick, secure, and native to UNRAID for a LONG time. I just set up GSuite tonight and have my ENTIRE server syncing to the cloud and uploading at a staggering 250Mbit/s (peaks at 35 Megabytes/s!). I have fiber with 1Gig up though... One note: in order to set up GSuite, YOU WILL NEED A DOMAIN. That was a bit surprising, but I already had one, so I just hooked it up through GoDaddy authentication pass-through (GSuite prompts for this).

     

    Thanks for the suggestion. Instead of using RClone, I ended up using the Duplicati docker with encryption. So far, so good!

    • Thanks 1
  13. 20 hours ago, tucansam said:

     

    Most of my drives have UDMA CRC errors, some have thousands, some have dozens, some have none.

     

    A few have "Reported Uncorrected," value number 187, whatever that means.

     

    Of 16 drives, those two errors are the only ones that generate an orange line in the SMART value tables.

     

     

    Whenever I see anything light up in color, I immediately run a short and extended S.M.A.R.T. test. If results are verified, I then remove and replace that drive with a new pre-cleared drive.

     

    These are the Usual Suspects in Order by Severity (My Opinion):

    • Current pending sector > 0

    • Reallocated sector count > 0

    • Offline uncorrectable > 0

    If I see anything other than a fat ZERO here, I'm personally done with that drive. Others may say you can get away with it for a bit longer if you just have a low number of these and the metrics remain static, primarily reallocated sectors, but once there is 1, there will usually be more. Why take the risk? Especially when we are talking about SMR drives. I look for these opportunities to rid myself of these "mistake" drives. I got greedy a few years back with finding deals. For NAS type solutions, I would only get NAS type drives moving forward.

     

    I'm sure others with vastly more experience in specific drive diagnostics will and should chime in here. I'm a veteran IT professional, and I have learned a lot from this community since I joined in 2014. Everyone here is great, and they will always take care of you if you listen, have an open mind, and remain calm when crap hits the fan.

     

    Hopefully this helps. If I were you, I would already have new drives shipping to my house and/or pre-clearing as we speak. And it goes without saying, I really hope you have good backups (onsite and cloud), and that you are already saving off this data. Good luck!

    • Like 1