Kaizac
Members
  • Content Count: 186
  • Joined
  • Days Won: 1

Everything posted by Kaizac

  1. Then it's a Syncthing issue as far as I can tell. You should test with another docker.
  2. Have you tried another docker like Radarr to see if you can write files there? Syncthing doesn't support browsing through the UI, so you have to enter exact paths. In your case you'd start with /google.
  3. So how is your rclone set up? Can you post your rclone config? Just remove the tokens and such.
  4. Which docker are you using? And what happens when you restart the docker? Is it working then?
  5. How do you give the docker access to the share? Can you share a screenshot of that?
  6. Thanks, you're right. My backup share just holds a lot of files. How do you handle this yourself? All the pictures and small files you want to keep safe don't amount to much in size, but they do in number of files. Some way to auto-zip them would be best I think, just like CA Backup does, but then for your own shares.
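     A rough sketch of what I mean (the share name and staging folder below are just examples, adjust them to your own setup): pack each share into one archive before the backup upload so the remote only sees a few large files.

        # Pack a share into a single dated archive so the backup upload sees
        # one large file instead of hundreds of thousands of small ones
        SHARE="Pictures"                        # example share name
        DEST="/mnt/user/backup_staging"         # example staging folder
        mkdir -p "$DEST"
        tar -czf "$DEST/${SHARE}_$(date +%Y%m%d).tar.gz" -C /mnt/user "$SHARE"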
  7. How are you transferring the books to your devices when Calibre is running on its own IP? I can use the content server, but that will download the epub file and not give me the option to create a library within my e-reader.
  8. @DZMM how would your cleanup script work for a mount that's only connected to mount_rclone (a backup mount, for example, which isn't used in mount_unionfs)? I can't alter your script as I'm not 100% sure which lines are necessary.
  9. No. The Team Share is shared storage which multiple users have access to, so you get 750GB per day per user connected to that Team Share. It's not just extra size added to a specific account.
  10. Yeah I do the same, but thought backing up my whole cache was a nice addition. I was wrong ;). On another note, I haven't had any memory problems for a few months now. Maybe rclone changed something, but I'm never running out of memory. Hope it's better for others as well.
  11. I'm currently on a backup of 2 TB with almost 400k files (370k). I thought backing up my cache drive would be a good idea, forgetting that Plex appdata consists of a huge number of small files. Currently also getting the limit exceeded error. So I'm pretty sure rclone doesn't count folders as objects, but Gdrive does.
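     If you want to check how many files a share actually holds before uploading (the path is just an example), a quick count from the console does it:

        # Count files (not folders) under a share to compare against the
        # Gdrive object limits
        find /mnt/user/backup -type f | wc -l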
  12. How do you make sure the unmount script is run before the mount script?
  13. Just create 2 scripts: one for the first start of your rclone mount and one for your continuous rclone mount. At the beginning of the first-start script, delete the check files that need to be removed for the script to run properly.
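     A minimal sketch of that first-start script (the check-file locations and remote name are assumptions, match them to whatever your own scripts use):

        # First-start script: clear any leftover check files from an unclean
        # shutdown, then start the mount as usual
        rm -f /mnt/user/appdata/other/rclone/rclone_mount_running
        rm -f /mnt/user/appdata/other/rclone/rclone_upload_running
        mkdir -p /mnt/user/mount_rclone/gdrive
        rclone mount --allow-other gdrive: /mnt/user/mount_rclone/gdrive &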
  14. You could use mount_rclone as your RW folder and your downloads will go directly to your Gdrive. However, this will be limited by your upload speed, and writing directly to the mount will probably also cause problems. rclone copy/move/etc. are intended to avoid this by doing file checks.
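     For example, instead of writing into the mount you'd write to a local folder and let a scheduled move push it up (the folder and remote names below are just placeholders):

        # Move completed files from a local folder to the remote; rclone
        # checks each transfer, which direct writes to the mount don't get
        rclone move /mnt/user/local/Media gdrive:Media \
          --delete-empty-src-dirs --bwlimit 8M -v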
  15. Not sure if I understand you properly. You point Sonarr to your unionfs folder so it doesn't matter where you store your file.
  16. Thanks! Kerio is indeed paid and quite expensive as far as I can tell. Using local clients on my desktop and then backing those up is possible, but then I'm wasting local storage as well, which is a waste of expensive space when I have plenty of it on my Unraid box. So far, running a mail client like Thunderbird in Docker seems most likely.
  17. Thanks, unfortunately network drives are not recognized. And when I create a symlink or directory link, it still sees that it's a network drive.
  18. I'm looking for a way to create backups of my online e-mail accounts (Outlook/Hotmail/Gmail/etc.) on my Unraid box. I found MailStore Home (free) and MailStore Server (300 USD). The Home version can only run on a Windows box while storing locally. I could run this in a Windows VM, but I find that quite a waste of resources. Are there any other ways you've found to create these backups? Running Thunderbird as a docker seems possible, but that's also not really the clean solution I'm looking for.
  19. I also have all my media stored on my Gdrive, with nothing local. The only things I keep local are the small files, like subtitles and .nfo files, because of the Gdrive/Tdrive file limit. I also keep a backup of my local files on Gdrive. Recently I had a problem where I couldn't play files, and it turned out my API was temporarily banned. I suspect that both Emby and Plex were analyzing files too heavily to get subtitles. So I switched to another API and it worked again. Something that has been an ongoing issue is memory creep. I've had the server give an out-of-memory error multiple times. I think it's because the scripts running simultaneously (upload/upload backup/moving small files/cleanup) take up too much RAM. I will experiment a bit more with lowering the buffer sizes to reduce RAM usage. But with 28GB of RAM I didn't expect to run into problems, to be honest.
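     For reference, the buffer is set per mount; something like this (the remote name and size are just examples) caps how much RAM each open file can take:

        # Lower --buffer-size so each open file buffers less in RAM;
        # many simultaneous streams multiply this value
        rclone mount --allow-other --buffer-size 64M \
          gdrive: /mnt/user/mount_rclone/gdrive &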
  20. No. Did you put the Shield on wifi? If so, that's probably the issue. Or are you playing 4K movies? The Shield doesn't need transcoding, so it will play at full quality, which can be taxing on your bandwidth depending on your file size.
  21. Are you hard rebooting/force rebooting your server? If your server doesn't have enough time to shut down the array, it won't run the unmount script. That's why I added the unmount script before my mount script at boot. Otherwise the "check" files won't be removed and the scripts think they're still running.
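     The boot-time unmount script is basically just cleanup; a sketch (paths and check-file names are assumptions, adjust to your own scripts):

        # Run at array start, before the mount script: drop any stale mount
        # and remove the leftover check files
        fusermount -uz /mnt/user/mount_rclone/gdrive 2>/dev/null
        rm -f /mnt/user/appdata/other/rclone/rclone_mount_running
        rm -f /mnt/user/appdata/other/rclone/rclone_upload_running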
  22. Did you set up your own client ID before (per what DZMM linked)? If so, log in to your Google Admin page and see whether those credentials are still valid. If they are, rerun the config for your mount(s) and enter the client ID and secret again. Don't alter anything else. When you get to the end of the config it will ask if you want to renew your token; just say yes and run the authorization again. That should do it.
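     On newer rclone versions there's also a shortcut that only redoes the authorization for an existing remote (the remote name here is an example), instead of walking through the whole config again:

        # Re-run just the OAuth authorization for one remote
        rclone config reconnect gdrive: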
  23. You're complicating things too much, he gave you a working script. You can just use:

        rclone copy /mnt/user/Media Gdrive_StreamEN:Media --ignore-existing -v

     You just need to change the /mnt/user/Media folder to the folder you want to copy, and the same for Gdrive_StreamEN:Media. So if you store it in the folder Secure, it would be Gdrive_StreamEN:Secure.
  24. Make sure you restart Krusader before you test. Krusader often only sees the old situation and not the new one, so you won't see the mount working. When you go to your mount folder (mount_rclone from the tutorial) you should see a size of 1 PiB, which tells you it worked. If you don't see that, your rclone config or mount went wrong. To help you with that we need more info about your config and mount script.
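     A quick way to check without Krusader (the mount path is just an example) is from the Unraid console:

        # A working rclone mount reports 1 PiB total size
        df -h /mnt/user/mount_rclone/gdrive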
  25. Use User Scripts and run it in the background. Through the log you can see what's happening if you want, but you don't have to keep a window open.