syrys

Everything posted by syrys

  1. Thanks. Yeah, I figured out where to make the change, but I asked here to double-check why. I was curious what the difference is, and until you told me, I thought it didn't make any difference, hence the post. Either way, thanks for letting me know; I've made the change to each of my Dockers.
  2. Since a couple of weeks ago, I have been having a strange problem. I'm not really sure why, what's causing it, or even how to diagnose it.

My setup: Unraid 6.2.4. Parity: 1x 8TB. Data: 1x 6TB, 3x 8TB, 1x 4TB (most of the array drives are 95%+ full; the bulk of the free space is on the latest 8TB drive, 5.5TB/34TB free). Cache: 2x 480GB SSD. Unassigned Devices: 2x 4TB. I have several Dockers, including Sonarr/CouchPotato to automate media, Deluge/SABnzbd for downloading, and Plex for media delivery.

Issue: Every few days, I notice that my Unraid instance freezes for a few minutes. The machine is still running, but I'm unable to load the Unraid web UI (when I refresh, my browser keeps loading and says something like "waiting for socket"). The same thing happens with every Docker, so each Docker's web UI becomes unresponsive in exactly the same way. When I say this happens every few days, I should specify that I only NOTICE it every few days (it's only obvious when it happens while I'm watching media on Plex); I have no idea how often it actually happens.

The only fix I have found so far is to wait. It stays unresponsive for roughly 10 minutes, then comes right back up as if nothing was ever wrong. The last time it happened (last night), I checked the logs (the Logs button at the top right of the web UI) as soon as it became responsive again, but there were no entries for the last hour or two according to the timestamps (it was only unresponsive for about 10 minutes).

Does anyone know what could cause this? Could it be a drive? A Docker? Something else? Heat? Any help greatly appreciated. Cheers
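Since the syslog shows nothing for the freeze window, something like the sketch below could at least leave a trail for next time. This is not an existing Unraid feature, just a hypothetical logger (the log path and script name are examples); run it from cron every minute and inspect the gap after the next freeze:

```shell
#!/bin/sh
# Hypothetical health logger -- appends basic vitals each run so an
# unresponsive window can be inspected afterwards. LOG path is an example.
LOG=${LOG:-/boot/logs/health.log}

log_health() {
    mkdir -p "$(dirname "$LOG")"
    {
        date
        uptime                              # load-average spikes show up here
        free -m | head -n 2                 # memory pressure
        df -h /var/lib/docker | tail -n 1   # docker image filling up?
    } >> "$LOG" 2>/dev/null
}

# log_health   # call this from cron, e.g. every minute
```

A gap in the timestamps marks the freeze itself, and the last readings before the gap should hint at whether it's load, memory, or disk related.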
  3. Hey all, I just recently realised that the "Fix Common Problems" plugin is displaying the following error for each of my Dockers. Could someone explain whether this actually is a problem, and if so, why? Error: As per the error, I have a few Dockers installed, and the Docker data directory (for example, the download folder for the torrent client or SABnzbd) is located on a drive that is attached using the "Unassigned Devices" plugin. It has been working like this for months; the error only showed up in "Fix Common Problems" recently. Can someone please let me know if this is actually an issue I should be fixing? Cheers. PS: running Unraid 6.2.4.
  4. Ah, that's not really the behaviour I want; that would be too much activity on some of the folders I'm trying to back up. I would rather have something running on a schedule, like a full backup once a day at 1am and incremental backups every 3 hours, for example. I'll try it out tomorrow; there might be an option to allow for this. Does unix not have other alternatives to Duplicati (if so, surely they have Dockers)? There were a few really good apps on Windows that do this (some paid, some free).
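For reference, the schedule I'm describing is simple to express with cron plus a small script if I end up rolling my own. This is only a hedged sketch with plain coreutils; SRC, DEST, and the script path are made-up examples:

```shell
#!/bin/sh
# Hypothetical full/incremental backup sketch -- SRC and DEST are examples.
SRC=${SRC:-/mnt/user/Documents}
DEST=${DEST:-/mnt/disks/backupdrive/backups}

full_backup() {
    stamp=$(date +%Y%m%d-%H%M%S)
    mkdir -p "$DEST/full-$stamp"
    cp -a "$SRC/." "$DEST/full-$stamp/"
    touch "$DEST/.last-run"     # incrementals copy anything newer than this
}

incremental_backup() {
    stamp=$(date +%Y%m%d-%H%M%S)
    mkdir -p "$DEST/incr-$stamp"
    # Copy only files changed since the last run, preserving relative paths.
    ( cd "$SRC" && find . -type f -newer "$DEST/.last-run" \
        -exec cp -a --parents {} "$DEST/incr-$stamp/" \; )
    touch "$DEST/.last-run"
}

case "${1:-}" in
    full) full_backup ;;
    incr) incremental_backup ;;
esac

# Cron: full at 1am daily, incremental every 3 hours, e.g.:
#   0 1 * * *    /boot/scripts/backup.sh full
#   0 */3 * * *  /boot/scripts/backup.sh incr
```

A UI-driven Docker would still be nicer; this is just the fallback.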
  5. Backing up to a user share on the array or an Unassigned Devices drive; backing up from a user share or another mounted Unassigned Devices drive. What I used to do when I ran this on a Windows VM was daily/hourly backups to the Unassigned Devices drive (I'd rather not have a huge amount of activity on the array) and fuller backups (weekly? monthly?) to the array. I would rather not use the command line if possible; it would be ideal if there were a Docker designed for backups that I can manage. If all else fails, I will check out rclone. Thanks for the Duplicati suggestion; I didn't realise it had a Docker. I did use Duplicati on Windows before and had a horrible experience (backups that didn't work, a lot of data lost, and backups constantly stopping due to an error, so nothing actually got backed up). I don't know how I feel about it now, but I can still give it a go, I suppose. I have never used CrashPlan; does it let you do multiple backup jobs with different frequencies to multiple destinations? I'll test it out.
  6. Haha, awesome. Thanks, guys. One more thing: I'm going to assume things like Dockers would seamlessly reappear as if nothing ever happened? (I'm assuming the settings are stored on the boot flash drive and it will pick up the Docker configuration on the array/cache automatically.) Sent from my Nexus 6P using Tapatalk
  7. I had a similar issue a little while back (in my case it was a power outage). The guys here suggested a few things: https://lime-technology.com/forum/index.php?topic=50147 In the end, although the copy command threw out many errors, the final vdisk image file I ended up with (of my W10 VM) was actually perfectly usable. I plugged in an extra HDD, mounted it using the Unassigned Devices plugin, then used the copy command mentioned in that thread to copy the vdisk image to the new drive. Then I re-pointed the VM's vdisk location to the new path and it worked (for me at least; not sure how useful that is for you). Also, if you copy the vdisk image to another Windows machine, you can try using a zip client to extract it. Even with errors, you might still have some luck salvaging config/settings files that are critical (for me, it was my Plex database files and the Sonarr/CouchPotato databases).
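I don't have the exact command in front of me (it's in the linked thread), but the general shape was a sparse-aware copy; here is a hedged sketch, with example paths only:

```shell
# Hedged sketch -- not the exact command from the linked thread.
# A sparse-aware copy keeps a mostly-empty vdisk from ballooning to its
# full allocated size on the rescue drive.
copy_vdisk() {
    src=$1; dest=$2
    cp -v --sparse=always "$src" "$dest"
}

# Example (hypothetical VM name and Unassigned Devices mount point):
# copy_vdisk /mnt/user/domains/W10/vdisk1.img /mnt/disks/rescue/vdisk1.img
```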
  8. Hey guys, got a quick question. I was planning on upgrading my Unraid box's hardware (mainly the CPU). If no motherboard change is required, is it as simple as pulling out the old one and fitting the new one, and nothing else? Or do I need to do any sort of reconfiguration? At the same time, I was thinking about upgrading to a different CPU generation, which would require a motherboard change as well (and thus maybe a RAM upgrade from DDR3 to DDR4). If that were the case, how different is the process? Just to add: my current setup has about 6 HDDs (single parity) and 2 SSDs, some running off a Dell H310 RAID card and some through the motherboard's SATA ports. I'm running Unraid 6.2.x (whichever the current stable version is; it auto-updates).
  9. Hey guys, I'm looking for a Docker I can create scheduled backups with. I need to back up several directories (stored in different locations), so it would be ideal to have a UI that can handle/manage this. As for the type of backups, I'd want something like a weekly full backup and daily incremental/differential backups with a month or two of retention (frequency and retention would vary depending on what I'm backing up; this is just an example). Any suggestions? Cheers.
  10. Thanks so much. This should really be added to the Docker's documentation or somewhere much more obvious; if it weren't for you, I would have spent days on this. On a slightly different note, how does one add/replace/edit Docker config files? Rather, what is the right way to do it? For example, my Docker app data (configs) is all in a user share I created called "DockerAppData". If I browse to this share from my other Windows machine, I cannot edit any files (permissions). However, if I use FileZilla to log in as root and connect to Unraid over SFTP, I can browse and edit those files (obviously, since I'm root). But I can see all the existing file permissions are set to "nobody users", so if I add/edit files as root, the permissions might end up wrong. Any suggestions on the correct way to copy files there? One of the reasons I want to do this is to import some old settings/database files from my old CouchPotato/Sonarr setup, which requires overwriting the database file with the old one. TY in advance.
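In case it helps anyone searching later: shares expect nobody:users ownership, so after writing files in as root, something like this hedged sketch (or Unraid's built-in New Permissions tool under Tools) should put things back. The function name and share path are examples:

```shell
# Hedged sketch -- restore the nobody:users ownership and open permissions
# that shares expect after writing files as root over SFTP.
fix_perms() {
    dir=$1
    chown -R nobody:users "$dir" 2>/dev/null || true   # needs root
    find "$dir" -type d -exec chmod u+rwx,g+rwx,o+rx {} +
    find "$dir" -type f -exec chmod u+rw,g+rw,o+r {} +
}

# Example (hypothetical share path):
# fix_perms /mnt/user/DockerAppData
```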
  11. If you or anyone else is using Deluge + Sonarr/CouchPotato, can I get a couple of screenshots of the Docker settings page for Deluge and of Sonarr/CP's download client (Deluge) settings screen? I can't seem to get them to talk to each other.
  12. @deuxcolors: I tried installing CouchPotato as a Docker and connecting it to Deluge, which was also set up as a Docker, but it fails to connect every time. Can you tell me what settings you used to get them connected? I tried the two different ports, IP/localhost and all that, with Deluge both as host and as bridge. CouchPotato doesn't even produce any logs for me to follow, either.
  13. @deuxcolors and @CHBMB, no need to apologise for the slightly off-topic discussion. In my case it's actually relevant and gave me some good info. After CHBMB's first question about host/bridge, I figured out the difference by trial and error (it's pretty intuitive once you mess around with it a bit). That said, Deluge can be run in bridge mode; I have it running like that now. You simply edit the Docker, set it to bridge mode, then at the bottom click "Add another Path, Port", and use that popup to map port 8112 on the container to 8112 on the host (or a different port if you have changed it or need something else). Then save and try to access the web UI. (If it fails, you can always switch back to host mode and you'll be back to how it was before.)

@deuxcolors: I actually figured out the AutoRemovePlus plugin soon after my last post, haha. Thanks; it does what I want, though the documentation was a bit hard to come by. I do have another problem with Deluge, though. My Deluge download folder is on a shared drive (Unassigned Devices plugin; the drive is mounted and shared on the network) on Unraid. When I access this share from my Windows PC, I cannot edit/delete any files in the Deluge download directory (not enough permissions). If I use an FTP client (FileZilla) and SFTP into Unraid as root, however, I can delete any files. Have you come across this problem before? I haven't tried it yet, but if I connect Sonarr up to this, I believe it might also hit file permission issues.

@CHBMB, thanks for the advice/warning. Don't worry, my intention was never to follow anyone blindly or copy their setup entirely. I mainly want to know which software talks correctly to which; sadly this is such a rare combination with torrent clients and Sonarr/CouchPotato (I did a review of all the torrent clients supported by Sonarr on Windows in the Sonarr forums; tl;dr: only one was usable, and even that had issues). So I'm mainly after which torrent client to use (and roughly how it was set up) to work correctly with Sonarr + CouchPotato + Plex.
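For anyone who prefers the command line, the GUI steps above boil down to publishing the WebUI port in bridge mode. Roughly like this (the image, container name, and host paths are examples; linuxserver's Deluge image in this case):

```shell
# Rough command-line equivalent of the "Add another Path, Port" GUI step.
# Paths and image are examples -- adjust to your own shares.
docker run -d --name deluge \
  --net=bridge \
  -p 8112:8112 \
  -v /mnt/user/DockerAppData/deluge:/config \
  -v /mnt/disks/downloads:/downloads \
  linuxserver/deluge
```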
  14. @deuxcolors, thanks for posting your setup. I really wish rTorrent would work. Due to some network restrictions, I cannot seed torrents until they meet a seed ratio, but I can easily seed for a set period of time (24 hours, which will reach a huge percentage given my connection speed). rTorrent has a neat feature that does exactly this: at the end of the set time period, it removes the torrent from seeding and deletes the data entirely (which, hopefully, Sonarr has already copied to its library). Sadly, it seems to fail to integrate with Sonarr due to a magnet link bug they seem to be having. And Deluge only has the option to remove the item from the download list without actually deleting the files; wouldn't that completely clog up your downloads folder, since nothing gets removed? Also, does Deluge have any setting to stop seeding after a time period (as opposed to a seed ratio)? I'm sort of stuck right now trying to find a download client that works properly. Also, is there a Docker for Jackett that handles public trackers? (The normal Jackett now only supports private providers.) Besides those points, I will be trying to match your setup, unless anyone advises against it.
  15. Hey all, sorry if this is the wrong subforum for this; please move it if so. I'm sure a decent percentage of you use Unraid as a media server, likely with media automation (download, sort, serve). I have been trying to set up Sonarr on Unraid for a while now and have been having a lot of issues. Do any of you use Sonarr with Unraid to automate torrents? If so, can you please describe your setup? I'm mainly looking for things like:

- Are you using Docker apps (if so, which), or a VM?
- Are you using torrents or ... which clients?
- How are your download directories set up so Sonarr can pick up the downloads?
- What about Plex (or an alternative): Docker or VM?

In my case, I have been using a Windows VM to run all the apps (Sonarr, torrent clients, Plex, etc.). It has been running OK, but Windows doesn't seem all that stable for this (it crashes from time to time). On top of that, all the torrent clients on Windows have various issues connecting to Sonarr. I have tried rTorrent as a Docker, but then Sonarr cannot seem to find the files correctly (first-world problems). Any advice appreciated; I'm just looking for someone with a working setup that I can try to copy.
  16. Well, it's not that simple. The thing is, once you swap your Plex installation to a different environment, you need to figure out a way to migrate your libraries/settings (or else you lose all your settings/progress). The same goes the other way too, so you can't simply decide to delete the Docker image; you first need to figure out how to export/import your settings/progress to the new/old environment you are moving to. That being said, I'll give it a go when I have a chance.
  17. OK, I'll give it a go. But I'm holding you responsible. Sent from my Nexus 6P using Tapatalk
  18. Thanks, I'll try those suggestions. One of the reasons I created the thread was to make sure that I wouldn't break or explode something by assigning more cores than I have (i.e., double-booking cores). Now that people have clarified that it won't break anything, I can experiment a little. As for the Plex Docker suggestion, I have heard some bad stories about it, so I will stay away from it. The same goes for other Dockers for things like CouchPotato/SickRage/Sonarr. These apps themselves seem to be pretty badly managed and break very often (they commit broken code to the stable/release branch, which then gets picked up by the auto-updater, which breaks the app entirely, and you have to do something manual to fix it). I can manage those manual fixes on Windows (as I'm used to it now), but the effort of doing that in a Docker is more than I'm willing to put in at this stage. Thus, I'm staying away from Dockers for the above-mentioned apps.
  19. Thanks for the answers, guys. I'll just clarify what my use cases are. As mentioned, I have an Unraid system with an i7-4770 (8 cores in Unraid).

1st use case (current setup): I want to run a W10 VM and maximise the performance within it. The VM will be running things like Plex, download clients, media automation software, backup software, and some other random Windows apps. I just want them to run without too much trouble, especially Plex. I currently have 6 cores assigned to this VM and was wondering if assigning all 8 cores to it would have any benefit/harm.

2nd use case (future setup, potentially): given the system/setup of the first use case, I want to try adding a 2nd W10 VM with GPU passthrough and use it as an everyday computer. Not really heavy gaming: mainly media consumption, browsing, programming, and maybe some light gaming (arcade/Steam games, perhaps).

Any suggestions for the above use cases?
  20. Bump. Anyone? Sent from my Nexus 6P using Tapatalk
  21. So, when creating a VM, you get an option to select the number of CPUs (CPU cores?) to assign to the VM. For example, I have a SINGLE i7-4770 CPU in my Unraid server, and when adding a VM I get 8 CPU checkboxes, of which I can check as many as I want (at least 1) to assign to the VM. So, my questions are: 1. What happens if you create a VM and check all the CPU checkboxes? Would you get any errors or issues with the Unraid UI or any plugins and/or Docker apps that are running? How does it all work? 2. What if you create 2 VMs, one using 6 CPUs (out of 8) and the second also using 6 (out of 8)? Would the 2 VMs have issues? Would any plugins, the Unraid UI, or Dockers have issues? 3. Unless it's already answered by 1 and 2, is there any specific reason why you shouldn't check all the CPU checkboxes when creating a VM? Cheers.
  22. Fair enough. So, saving credentials aside (even if it's possible at the browser level, I think I already explained the security problems with saving in the browser, and it's technically impossible at the browser-plugin level), I still stand by my feature request, as it is something very good to have (not critical, but you know, features...). @itimpi, thanks for the clarification on saving credentials in the browser; I'll dig into that further and make sure the domain isn't blocked from doing so (that's probably the reason, though as mentioned, I probably won't actually use that, for security reasons).
  23. Unless you are using a different version or something (I'm on 6.1.x), I'm actually quite confused. The dialog I'm talking about is the automatic authentication dialog the browser shows when it receives a 401. It looks something like this: https://sahipro.com/docs/assets/images/using-sahi/401/401-chrome.png (not an actual screenshot; I just googled to find one). If your login dialog does not look like this, then we are talking about different things. If it does look like this and it remembers your login details, please share a screenshot, as I would love to get to the bottom of this if it can be fixed with some setting.

Let me get a bit more technical. As mentioned, this popup comes from the browser when it sees a 401 on the request. Since it is browser-invoked (not page-invoked), it sits outside the scope of JavaScript and the page, so browser plugins and code within the page have no access to the popup to fill in the fields. By browser plugins, I'm referring to password managers such as LastPass/KeePass or whatever people are using (an exception, I suppose, would be the password manager that ships with OS X, whatever it's called). The browser itself may or may not be able to remember the credentials, but in my case (latest Chrome on Windows 10), my browser neither prompts for nor remembers my logins, and even then, saving login details in the browser is a VERY BAD IDEA.

So, what I'm suggesting is: instead of returning a 401, create a login page and redirect the user to it, where the login fields are a simple HTML form (which the browser and browser plugins can freely interact with). This also allows for additional functionality such as "remember me" (you would need to implement this on the backend too, obviously).
  24. Goals: 1. I want to back up my VM(s) at a scheduled time (say, weekly). 2. Have this automated, so I don't have to do it manually. 3. Make the backup easy to restore/recover from.

Problems: 1. Taking a copy of the vdisk requires the VM to be off. 2. Nothing out of the box?

There has been some discussion and a solution here (https://lime-technology.com/forum/index.php?topic=39061.msg363625#msg363625) if anyone cares to read it, but that solution is just a single backup and requires the VM to be off. Could we possibly create a script that checks whether a VM is on or off; if on, turns it off (and waits until it's off); then runs the command mentioned in the above link (or simply takes a copy of the specified vdisk image); and finally brings the VM back to the state it was in before (turns it on if it was on)? And have this script execute on a set schedule somehow? Of course, I don't know how hard it would be to actually turn this into a plugin with a small UI where you can set some settings (frequency/time/destination, etc.) and enable/disable backups; if someone could build that, it would be amazing. Otherwise, if I can get some info on how to go about making such a script, and how to make a plugin, I could even try making one myself.
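To get the ball rolling, here is a hedged sketch of the flow described above, using virsh (which Unraid's KVM stack ships with). The VM name and paths are examples, and I haven't battle-tested this:

```shell
#!/bin/sh
# Hedged sketch: check VM state -> shut down if running -> copy the vdisk
# -> restore the previous state. Names and paths are examples.
backup_vm() {
    vm=$1; vdisk=$2; dest=$3
    was_running=0

    if virsh domstate "$vm" | grep -q running; then
        was_running=1
        virsh shutdown "$vm"            # ask the guest to shut down cleanly
        while virsh domstate "$vm" | grep -q running; do
            sleep 5                     # wait until the vdisk is safe to copy
        done
    fi

    mkdir -p "$dest"
    cp -v --sparse=always "$vdisk" "$dest/$(date +%Y%m%d)-$(basename "$vdisk")"

    if [ "$was_running" -eq 1 ]; then
        virsh start "$vm"               # put the VM back how we found it
    fi
}

# Example (hypothetical VM name, vdisk, and destination):
# backup_vm W10 /mnt/user/domains/W10/vdisk1.img /mnt/disks/backupdrive/vm-backups
```

Scheduling would then just be a cron entry (or the User Scripts plugin) invoking this weekly; the plugin UI on top is the part I'd need help with.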