Tomahawk51

Everything posted by Tomahawk51

  1. I just configured the "icloudpd" docker to pull down my Apple iCloud Photos content to an Unraid folder - very cool, and it liberates my MacBook from having to facilitate this. Question: is there any similar docker or workaround for Apple Music (formerly iTunes) that would allow a download from iCloud to my array? What I currently do: I store my Music Library locally on my Mac, sync it one-way to a folder with a SyncThing docker (mapped to Plex), and back it up with Duplicacy. I'd like to get the Mac out of the equation and get a solution set up to run within Unraid. I'm open to VM-based solutions if needed as well, but it seems to me that running a macOS VM is not reliable.
  2. <Unpackerr> My goal is to use Unpackerr as a watch folder for occasional unpacking needs; I have NO need for *Arr unpacking. I plan to manually drop files in on occasion (typically really big files, like those from backups). My setup: in the docker config, I mapped the "watch" folder I'd like to use to a "/data" container path; in the config file in the appdata folder, I un-commented the [[folder]] config and pointed it at "/data". The logs seem to show the folder config is OK, and it regularly sweeps, but it never finds or acts on those files. I put a few .zip and other compressed files in this location - nothing happens. I copied new ones in, and more nothing. I tried permutations of making new container paths (/downloads, /downloads_test), with no change. Is what I'm trying to do feasible? Do others have this working?
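     For context, my [[folder]] block looks roughly like this - a minimal sketch with example paths, key names taken from the sample unpackerr.conf, so worth double-checking against your version. One thing I plan to test: I believe the folder watcher reacts to files that appear while Unpackerr is running, so files already sitting there at startup may be skipped.

     ```toml
     # Example paths - "path" must be the *container-side* mapping
     # of the watch folder (e.g. /data), not the host path.
     [[folder]]
       path = "/data"
       extract_path = "/data/extracted"
       delete_original = false
       delete_after = "10m"
       move_back = false
     ```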
  3. LOL - I know, that's a good way to answer my question. Thanks. I was thinking the reads might be asymmetric; I get that the data will be written symmetrically. Anyway, I set it up and it's running great. I am very happy with the ZFS pool approach for cache on 6.12. The ability to remove drives gracefully seems much better than the BTRFS approach.
  4. I now have a ZFS mirror cache pool with two 2TB SSDs. One is a WD Red, one a WD Blue. I understand the Red is more NAS-oriented and I assume has better durability. Is there a difference, in terms of the wear each drive will endure, between setting up one drive first in the pool and then adding the second? I ask because I currently have the Blue as primary and will add the Red (following a Crucial MX500 failure - one of many for me). I wonder if I should re-initialize the pool to put the Red first vs. just add it as #2. Hope that made sense. Also, I'm wondering if I'm overthinking this since there is a mirror in place. Thanks!
  5. I imagine this is an easy answer (and I didn't find it in searching): when I choose to compress the image (vdisk.img), the original file remains in addition to the compressed one (vdisk.img, vdisk.img.zst). I'm getting about a 50 GB savings from compression, and I'm OK with the headaches mentioned in other posts. Is there a way for me to exclude/remove that original .img file after the compression (or prevent it in the first place)? To restate: I'd like to keep a compressed VM image, not both the original and a compressed one.
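     In the meantime, the manual workaround I'm considering: zstd's `--rm` flag deletes the source file on successful compression, and `zstd -t` can verify an archive before removing an original by hand. The paths below are examples only - adjust to wherever your vdisk actually lives:

     ```shell
     # 1) Verify an already-compressed .zst is intact, then drop the original:
     zstd -t /mnt/user/domains/myvm/vdisk.img.zst && rm /mnt/user/domains/myvm/vdisk.img

     # 2) Or compress manually and let zstd delete the source on success:
     zstd --rm /mnt/user/domains/myvm/vdisk.img
     ```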
  6. Hi, I'm still on my journey to get this (or any VPN) docker enabled. After getting a new OVPN config file, I've resolved one error message and now get this: Is this something I can overcome? Is there a config file somewhere associated with the docker where I can update "--data-ciphers"? I couldn't find it. Thanks for any input.
  7. What do I have misconfigured? My docker config is blowing up my Docker image file (see Writable below). Is this a common/known issue I can fix? I've attached a pic of my docker config. Thank you for any input!
  8. I'd love some input on my thinking, and advice on approach, for backup: I am thinking of buying 2 big HDDs and swapping them in rotation. I'm considering those 18 TB ones + enclosures, despite concerns. Context: my wimpy ISP connection has low download and much lower upload. I imagine it would be impractical to restore via a cloud backup solution, and it would take months/years to back up my TBs (amount to be decided). I have some experience with this from Backblaze and CrashPlan usage years ago. I have family nearby that visit, so I figure I can swap backup HDDs on a regular basis when they come over. Setting up a backup server at their home is not practical, and we both have poor ISP bandwidth anyway. I would plan to leave one HDD plugged into my server by USB, and would pull it out when they visit to swap for the other one. Let's assume monthly at worst. Ideally, I could hot-swap these USB drives and have the backup process automated somehow. Question: Is this doable? If so, what is the best way to automate backups, including this scheme of having 2 HDDs in rotation? I've read that the UD script is an option, I have used dockers like Duplicati, and I also have a Win VM available. Any suggestions? Thanks!
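     To make the question concrete, here's the kind of Unassigned Devices script I have in mind - a rough sketch only. The `$ACTION`/`$MOUNTPOINT` variable names follow UD's sample script (worth verifying against your UD version), and the share paths are hypothetical:

     ```shell
     #!/bin/bash
     # Sketch of a UD device script: when a backup drive is mounted,
     # mirror the listed shares onto it, then log completion.
     # SOURCES paths are examples - substitute your own shares.
     SOURCES="/mnt/user/photos /mnt/user/documents"

     case "$ACTION" in
       'ADD')
         for src in $SOURCES; do
           # --delete keeps the backup an exact mirror of the source
           rsync -a --delete "$src" "$MOUNTPOINT/"
         done
         sync
         echo "Backup to $MOUNTPOINT finished: $(date)" >> "$MOUNTPOINT/backup.log"
         ;;
     esac
     ```

     Since both drives would get the same script, the rotation part should take care of itself: whichever disk is plugged in receives a fresh mirror.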
  9. I am considering buying these for an external backup option: is it known if Unraid has any impact? I am guessing no, and that it doesn't look at WDDA and instead only at SMART.
  10. Hi, hoping to get some VPN help. I use OctaneVPN, which works on the deprecated rTorrentVPN docker, and I'm trying to move to either DelugeVPN or QBittorrentVPN. My VPN provider's OVPN file wasn't working, so I asked them to help and they gave me an update that includes tls-cipher "DEFAULT:@SECLEVEL=0". I get this in the logs though: Is there something I can do to edit "--data-ciphers" to get things going? Again, this works on the old rTorrent docker...just trying to migrate. Thanks!
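      For anyone in the same boat, the workaround I'm looking at is appending a data-ciphers line to the provider's .ovpn file (in the binhex containers that file lives under /config/openvpn/). These directives need OpenVPN 2.5+, and the cipher names below are examples only - they'd need to match whatever cipher the log says the server offers:

      ```
      # Example only - match these cipher names to what the server offers.
      data-ciphers AES-256-GCM:AES-128-GCM:AES-256-CBC
      data-ciphers-fallback AES-256-CBC
      ```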
  11. That was it - I changed the ports back to defaults and all is well now. Thanks so much! I only changed the ports as I had other services set up on 3000, but I'll shift those instead. Thanks again for all the support and for Mealie v1 :)
  12. Sorry for the delay - yes, it loads with my local IP and the 9925 port. I tried various permutations of using the BASE_URL: field in the docker config, but to no avail. It's not the end of the world if I can't resolve this, but I sure would love to. Is there any guidance on how I should use that field?
  13. Thx for this docker - loving the progress in the app! I'm having trouble connecting with a reverse proxy, and could use help. I am using duckdns subdomains, and the setup works with a few other apps as well as with the "old" version of Mealie. ex: xxxMealiev1xxx.duckdns.org. Does the field above relate to my situation, or is it only for those that own a top-level domain? Not sure if helpful, but here's my Swag config data as well. Thanks if anyone has any input to help!
  14. Thanks for the reply. Yes, I have started using an Inbox tag as well, and I understand how this can help in the workflow. I still don't see how I can do what I ideally want, in the context of going through a substantial backlog of docs that might already be organized. As a workaround, I think I'll try: cleaning out all my inbox items to get to a baseline; creating subfolders aligned to tags outside of the import folder; scanning directly to these staging folders; then manually dragging each subfolder's contents in one by one and mass-tagging them in the UI. I think this will add some efficiency for me. I'll also add a feature request for what I really want: >1 import folder that can be assigned to tags/metadata, or an alternative solution.
  15. Basic question, couldn't find answers on the project page: is it possible to import and assign tags at the same time? This is instead of the 2-step process of 1) import, 2) go into the UI to click and assign tags. For instance, if I scan a whole big pile of docs that should have the same tag, it seems inefficient to go and manually tag them vs. assigning them more automatically at time of import. ex: set up >1 import folder, one for each tag? Is this possible, or are there other approaches I should use?
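      If the app here is paperless-ngx (an assumption on my part), there appears to be a built-in option for exactly this: consuming subfolders recursively and turning the subfolder names into tags. A sketch of the docker environment variables, worth verifying against your version's documentation:

      ```
      # Assumes paperless-ngx; check these names against your version's docs.
      PAPERLESS_CONSUMER_RECURSIVE=true
      PAPERLESS_CONSUMER_SUBDIRS_AS_TAGS=true
      ```

      With those set, a file dropped into consume/taxes/ should be imported with a "taxes" tag.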
  16. I'm looking for some practical advice, and am at the point where I realize I need to ask for better-informed advice. I built my server many years ago, and it's doing everything I need...but... Goal I'm seeking: I want to stream older games to a few Steam clients (Shield TVs, AtGames Arcade, maybe my laptop as well). Primary interest is in games like StarCraft/2 and Command & Conquer/Generals. Sure, it would be nice to also be able to run more advanced games or emulators, but I'm not looking to play with a monitor attached, and I'm not anticipating top-notch graphics and performance. Note: I've run some basic games over Steam and they did work...but the experience wasn't great. I'm assuming I need a GPU - I have none now. Initial questions: Is this worth pursuing? Can I expect reasonable performance over Steam Connect, or am I wrong in my assumption that this could work satisfactorily with the addition of a GPU? I have been out of the HW game for so long that I wonder: is my HW (below) still reasonable, or should I be upgrading instead? It works for everything else besides gaming (many dockers, VMs, etc.). Notes: I'm using a Win11 VM. I tried passing through the mobo's integrated GPU (Matrox G200e). Windows turns off the driver, citing issues. I assume I should move on from this idea. I had an old Nvidia 7900 GS, so I tried it, kind of. It wouldn't fit in the PCIe x16 slot since it hits the RAM modules. I read that I could move it to one of the other open x8 slots, so I did. These slots have no space constraints. I didn't realize this was doable, as there are pins just hanging off...but Unraid shows it and the VM dropdowns show it. I have hit a wall on getting it to pass through to the VM after many days of trial and error (IOMMU, vBIOS, etc.). I also may have damaged a capacitor when learning it was going to hit the RAM, though I did spot-solder the connection after (shaky confidence).
Secondary questions: I'm assuming that if I buy a middle/value "mini" GPU, it would fit in the x16 slot vs. using one of the open-ended x8 slots. I have not yet gotten out a ruler to check, but I will before buying. What should I plan on - a regular card in an open x8, or a mini card for the x16? Any input on which cards are easiest to pass through to the VM, and perhaps what series/vintage would support my goal (and isn't top of the line and price)? Thank you in advance for reading all this! Specs: Motherboard Slots:
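For reference, the piece of the VM's XML (Unraid's XML view) I've been fiddling with during the passthrough attempts is the hostdev stanza - a sketch below, with example PCI addresses that would need to be replaced by the GPU's actual address from Tools > System Devices; a discrete GPU usually also exposes an audio function (e.g. function 0x1) that should be passed through alongside it:

```xml
<!-- Example address only: substitute your GPU's bus/slot from
     Tools > System Devices (e.g. 03:00.0 -> bus 0x03, slot 0x00). -->
<hostdev mode='subsystem' type='pci' managed='yes'>
  <source>
    <address domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
  </source>
</hostdev>
```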
  17. My request is for PaperCut Mobility Print. https://www.papercut.com/products/free-software/mobility-print/ This is a replacement for Google Cloud Print, and it has been very useful running in a VM, but having a docker would be easier to manage. I have no experience making docker templates for Unraid, but if the entries on Docker Hub imply this should be easy, please just let me know so that I can start researching. I did not find any "Mobility Print" offerings, but am not sure if the larger products overlap. https://hub.docker.com/search?q=papercut&sort=updated_at&order=desc
  18. I'm having the same experience. It's on my list to check the VPN Endpoints with a regular client. Will report back if that is OK (indicating maybe something else is going on?). UPDATE: The problem was with my VPN. Setting to a new endpoint in the config file resolved my issue.
  19. Any help on this error? I ran into it on the Binhex docker, and I'm seeing it similarly when setting up this one. Things worked well for years, then it broke. Like many Unraid things - maybe I broke it somehow! If I do a clean install, it works for a while...but then it eventually fails and presents this error:
  20. Reporting back, to eat my hat and learn the lesson. My issues related to...a change in my VPN Provider's certificate. Sorry binhex, and also, thanks for this docker. I'm going to give rTorrent a shot next, and may transition, after reading some of your past comments on "what's the best" de facto client to use on Unraid.
  21. Chiming in, in case there is something going on with the docker more systemically: my DelugeVPN setup stopped working very recently (today/yesterday). I'm going through the arduous process of trying to diagnose whether it's my VPN, though that is working fine in the client app. I'll keep checking here to see if others are having issues (and find a solution).
  22. Thank you for this suggestion (albeit some time ago and me just coming upon it via search). Recently I've come across QDirStat, DupeGuru, and DriveSpeed + Unassigned Devices, UnBalance as excellent tools to help in cleaning up files and consolidating drives.
  23. Interesting, but prob not advantageous for me as I share the recordings on my plex server as well. Thanks for the comprehensive thinking.
  24. Ahh ok. I do also have a redundant cache (2 disks). I was thinking that by having a ‘PVR’ pool, I could more easily ensure my shares didn’t use this 1 disk vs the others (leverage abstraction). Now I have to ensure every other share doesn’t use that disk. good point though, it is working and I now understand pools aren’t for my use case I guess. Thanks!
  25. Just upgraded from 6.8 to 6.9, and it does seem snappier! Confusion: I have a VM running SageTV PVR and I've created a dedicated share for the TV recordings, using the "include/exclude disk" feature in the share setup. In other words, I have 5 data disks + parity and I use Disk 5 for recordings and it has parity protection. I'm not using Unassigned Devices for this. It *seems* to me that w/ 6.9 I might be able to create a new pool just for the recordings, and include just disk 5 to this. That is, I'd have 2 pools: 1) all stuff (4 disks), 2) just PVR recordings (1 disk). -Is this the correct understanding of what's possible or advisable? -will both pools be able to use the same parity drive for protection? -If I try to set this up, will I have to wipe my current disk 5 (pvr), or can I move it to a new pool without losing recordings? Note - I realize, and assume, I don't have a grasp on this!