cyberspectre Posted September 21, 2019

There have been a number of threads about this, but the latest ones date back a year or more, so I'm wondering what solutions people are employing at this moment in 2019.

Some background: my UnRaid machine is not a 16-disk, data-hoarding closet server. It's a multi-role computer used for work, daily browsing, gaming, and everything in between, in addition to being a file server. The array has 2 HDDs, so there is some redundancy. But as there are only 2 HDD slots in the tower, I also pay for offsite backup for peace of mind.

Until recently, I used SpiderOak ONE in a docker container. But lately, I've been experiencing issues with the command-line client that make the service no longer usable. The docker container also frequently segfaults for some reason. Suffice it to say, it's time to look at other options. I experimented with rclone to very little avail, and I don't enjoy the idea of sinking more time into that. This method looks promising and may be my next route, but I'd like to hear what others are doing first. A simple docker container running a first-party, native Linux client from a reputable service would be ideal. Anybody have something like that?
BRiT Posted September 21, 2019

Some are using Rclone and Google Drive, even for direct storage:
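For anyone curious what the rclone approach looks like in practice, a minimal wrapper is sketched below. The remote name `gdrive-crypt` and the paths are made-up examples; the remote itself would be created beforehand with `rclone config`, and the flags shown are standard rclone options.

```shell
# Sketch of a nightly rclone push to a Google Drive remote.
# "gdrive-crypt:backup" is a hypothetical encrypted remote name.
backup_sync() {
  src="${1:-/mnt/user/critical}"
  remote="${2:-gdrive-crypt:backup}"
  cmd="rclone sync $src $remote --transfers 4 --fast-list --log-level INFO"
  if [ "${DRY_RUN:-0}" = "1" ]; then
    # Print the command instead of running it (handy for cron debugging).
    printf '%s\n' "$cmd"
  else
    $cmd
  fi
}
```

Dropping a call to `backup_sync` into a User Scripts cron entry is the usual way this gets scheduled on UnRaid.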
1812 Posted September 21, 2019

I use CloudBerry, but I back up to a local server and my own remote server.
rbh00723 Posted September 22, 2019

So I was wondering if it might be possible to initially fill a 10 TB external drive locally (currently I have about 3 TB), then take said external drive and connect it to a Windows desktop at a friend's house, or maybe to the USB port on his router. Would I be able to make incremental backups to the drive? I just want an offsite backup in case of fire or flood at my home. Am I crazy for wanting to do this? I just feel like this would be the cheapest option compared to paying for TBs of cloud storage...
cyberspectre (Author) Posted September 22, 2019

2 hours ago, rbh00723 said:
So I was wondering if it might be possible to initially fill a 10 TB external drive locally, then take said external drive and connect it to a Windows desktop at a friend's house, or maybe to the USB port on his router. Would I be able to make incremental backups to the drive? I just want an offsite backup in case of fire or flood at my home. Am I crazy for wanting to do this? I just feel like this would be the cheapest option compared to paying for TBs of cloud storage...

😂 "Hey best buddy, I don't want to pay a company to do my backup. Instead, I'd like you to pay for the electricity and bandwidth needed to do it."
rbh00723 Posted September 23, 2019

5 hours ago, cyberspectre said:
😂 "Hey best buddy, I don't want to pay a company to do my backup. Instead, I'd like you to pay for the electricity and bandwidth needed to do it."

Okay, okay, I see what you're saying. Firstly, it doesn't necessarily need to "cost" him any bandwidth, and I would be willing to pay him, say, $20 a year... I think you might be overestimating what it "costs" to run a piece of hardware. I mean, honestly, I know it's not going to consume 100 watts, so... yeah, it's more like 5-10 watts for at most 2 hours per day, assuming only incremental backups.
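For what it's worth, the worst-case numbers in that post pencil out to almost nothing. A quick back-of-the-envelope check (the $0.13/kWh electricity rate is my own assumption, not from the thread):

```python
# Rough annual electricity cost for the friend's-house drive,
# using the 10 W / 2 hours-per-day figures from the post above.
# The $0.13/kWh rate is an assumed average price, not from the thread.

watts = 10
hours_per_day = 2
rate_usd_per_kwh = 0.13

kwh_per_year = watts * hours_per_day * 365 / 1000   # watt-hours -> kWh
cost_per_year = kwh_per_year * rate_usd_per_kwh

print(f"{kwh_per_year:.1f} kWh/year, about ${cost_per_year:.2f}/year")
# prints: 7.3 kWh/year, about $0.95/year
```

So the $20/year offer would cover the electricity roughly twenty times over.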
cyberspectre (Author) Posted September 23, 2019

15 hours ago, rbh00723 said:
Okay, okay, I see what you're saying. Firstly, it doesn't necessarily need to "cost" him any bandwidth, and I would be willing to pay him, say, $20 a year... I think you might be overestimating what it "costs" to run a piece of hardware.

Maybe so, but it's the principle, if you ask me. Regardless, you might be able to use OpenWrt to accomplish something like that if you're determined. I figured out that the issue I was having with SpiderOak was due to one specific file. Deleted the file, and now it's all good again. Easy, clean, headless backup.
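If anyone does want to try the friend's-house setup, the standard tool for exactly this is rsync over SSH with `--link-dest`, which gives dated incremental snapshots where unchanged files are hard-linked rather than re-transferred. A sketch of the command construction (the host, user, and paths are made-up examples; it assumes SSH key auth and rsync available on both ends):

```shell
# Build the rsync invocation for one dated incremental snapshot.
# Unchanged files are hard-linked against the previous snapshot
# via --link-dest, so only changed data crosses the wire.
# All paths/hosts here are hypothetical.
snapshot_cmd() {
  src="$1"    # e.g. /mnt/user/important
  dest="$2"   # e.g. friend@friends-box:/mnt/usb/backups
  day="$3"    # e.g. 2019-09-23
  printf 'rsync -a --delete --link-dest=../latest %s/ %s/%s/\n' \
    "$src" "$dest" "$day"
}

# A nightly cron job would run the printed command, then re-point
# the "latest" symlink on the remote end, e.g.:
#   ssh friend@friends-box "ln -sfn 2019-09-23 /mnt/usb/backups/latest"
```

The initial 3 TB seed would still be done locally before the drive leaves the house, exactly as proposed above.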
detz Posted September 26, 2019

I've been thinking a lot about this as my array grows. Most of the stuff I care about (photos, documents, wiki, etc.) is backed up to the cloud, and for the most part it's small, < 1 TB total. The rest of my array (movies, music, etc.) is not backed up. I always told myself that it was okay to lose it, but the more I collect, the more I'm starting to not think that way. A simple flood from a broken pipe could wipe out my entire 10+ year collection.

Since uploading 40+ TB of data is no small task, I'm looking at alternative solutions that will scale and give me the best chance to recover; I'm not sure a backup service (Backblaze) or cloud provider gives me that. So I've actually started looking at using Usenet as a possible offsite backup provider. Basically, each piece of media could be its own post, encrypted and uploaded. Most modern providers keep binaries for over a year, so if I set a rotation of uploading everything yearly, I should be able to recover anything I need, and I could do it on a per-file basis if need be. My simple math says I would need to upload ~100 GB/day to push my entire array in a year, which seems very attainable, assuming my ISP doesn't care about me uploading ~3.5 TB a month. Thoughts?
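The arithmetic in that post does check out. A quick sanity check of the daily and monthly budget, using only the figures from the post (TB taken as 1000 GB, and ignoring any parity/encryption overhead):

```python
# Upload budget to push a 40 TB array through Usenet in one year,
# using the figures from the post above (1 TB taken as 1000 GB).
# Overhead from encryption/par2 is deliberately ignored here.

array_tb = 40
gb_per_day = array_tb * 1000 / 365        # ~110 GB/day
tb_per_month = gb_per_day * 30.4 / 1000   # ~3.3 TB/month

print(f"~{gb_per_day:.0f} GB/day, ~{tb_per_month:.1f} TB/month")
# prints: ~110 GB/day, ~3.3 TB/month
```

So ~100 GB/day is slightly optimistic but in the right ballpark, and the ~3.5 TB/month figure leaves a little headroom for repair blocks.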
ijuarez Posted September 26, 2019

I use another server that gets powered up once a month; I dump my important data to it, then turn it off.
detz Posted September 26, 2019

18 minutes ago, ijuarez said:
I use another server that gets powered up once a month; I dump my important data to it, then turn it off.

That's over $600 just in drives, not counting server hardware. To do it well, that server should also be located somewhere else. 😞
ijuarez Posted September 26, 2019

11 minutes ago, detz said:
That's over $600 just in drives, not counting server hardware. To do it well, that server should also be located somewhere else. 😞

Wow, sorry, I didn't realize you want to back up 40 TB; I just saw the 1 TB. There's another user here with about 35 TB who was using a seedbox provider, but I'm unsure how much he pays for that service.
JonathanM Posted September 26, 2019

9 hours ago, detz said:
So, I've actually started looking at using usenet as a possible offsite backup provider.

https://www.goodreads.com/quotes/574706-only-wimps-use-tape-backup-real-men-just-upload-their
BRiT Posted September 27, 2019

As I linked earlier, the cost is around $10-$12 per month for unlimited storage, as Google doesn't enforce the 5-user minimum that's supposed to be required. It was around $8 a year ago. The person who started that thread, and who is helping a lot of us out, has over 400 TB stored there.
cyberspectre (Author) Posted October 1, 2019

It isn't the cheapest service, but SpiderOak is once again working perfectly for me, and I can safely suggest it. The docker container runs a native Linux client that watches for changes. Any time I modify a file or add a new one to my array, it's backed up instantly.
spants Posted October 17, 2019

On 9/27/2019 at 3:47 AM, BRiT said:
As I linked earlier, the cost is around $10-$12 per month for unlimited storage, as Google doesn't enforce the 5-user minimum that's supposed to be required.

@BRiT Google Drive does enforce the 1 TB limit; I just hit it (using a legacy Google Apps account).
BRiT Posted October 17, 2019

Read the link and follow the thread instructions. It doesn't enforce it when you are on the middle or higher tiers.