

Posts posted by peteknot

  1. 29 minutes ago, Squid said:

    If it's an annoyance / problem just for you, then stop the docker service (Settings), switch to advanced view and enable template authoring mode.  Restart the service and edit the app, switch to advanced view and remove the entry TemplateURL

    It is definitely just an annoyance to me; I don't want the ports to show back up, since then the containers could be accessed from outside the reverse proxy. So I will try this method. Thanks!

  2. Running UnRaid 6.8.3. I have docker containers that I don't want accessible outside of a reverse proxy, so I edited each container and removed the ports it specified. Everything works fine until I go in and "Check for Updates" followed by "Apply Update"; then the ports I removed are re-added to the container's spec. I thought this was just checking the docker repository for a new image. Is it doing something more, like checking the templates in Community Applications? Thanks for the help!

  3. 12 minutes ago, RGauld said:

    Recent update... just tried it, and after SABnzbd finished downloading the file, it won't move it to the NAS. When I check the Activity tab in Sonarr, I see this: (Sonarr error.jpg). The weird thing is, I have Radarr set up and running as well, and it also has to send files to the NAS, which it does with no problems...

     

     

    Sonarr error.jpg

    I would also expect to see an error under 'System -> Logs'.

  4. 38 minutes ago, RGauld said:

    so, I don't need to set up different paths in the one running copy of SABnzbd? (see SAB setup.jpg)... My question is about the "Folder/Path" section: do I just leave it blank, or do I have to input paths? Also note that, tying all this together, the tv-current shows are on an Asustor NAS, while the tv-ended shows, as well as all the apps, are on the UnRaid server. So the tv-current copy of Sonarr has its media mapped to the NAS... I assume that means Sonarr will do all the file moving, etc.?

     

    SAB setup.jpg

     

    So for the first part: no, you don't need different paths. This is because Sonarr will only be talking to SABnzbd about the category it cares about. So if 'tv-current' had a bunch of items ready to be moved, the Sonarr instance with 'tv-ended' as its category won't even see them in the results when it queries SABnzbd.

     

    As for the second part, regarding the different machine paths: SABnzbd is going to tell Sonarr where the file is located, but it will report the path relative to its own setup. That becomes a problem when the paths don't line up because the apps are on different machines. If you turn on advanced settings in Sonarr and go to the download client settings, there is a section for remote path mappings, so you may be able to configure it through that, but it would require reading up on the setup. I have it all on the same machine, so all I needed to do was line up the download paths among the Sonarr/Radarr instances and SABnzbd.
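    As a purely hypothetical example (the host address and paths here are made up), a remote path mapping entry in Sonarr would look something like:

```
Host:        192.168.1.50          # the machine running SABnzbd, as configured in Sonarr's download client
Remote Path: /downloads/           # where SABnzbd says the finished file is
Local Path:  /mnt/nas/downloads/   # the same folder from Sonarr's point of view
```

    With a mapping like that in place, Sonarr rewrites the path SABnzbd reports before trying to import the file.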

     

  5.   

    19 hours ago, RGauld said:

    I have a question: Is it possible to run 2 copies of this app? The reason is hard to explain; suffice to say I have 2 copies of Sonarr running to monitor 2 sets of TV shows, and I need to know if I can run 2 copies of SABnzbdvpn... The TV shows I have split between current and ended, and one copy of Sonarr monitors the current TV shows while the other copy monitors the ended TV shows... I'm trying to set up the Sonarr that monitors the current TV shows, and as it has different paths, etc., I can't use the same SABnzbdvpn that is running with the other Sonarr that monitors the ended TV shows... I'm hoping this makes sense to someone, and that I can get some help with this...

     

    I have a similar setup: two copies of Sonarr/Radarr, one for 1080p and one for 4K. The thing is, they don't really care about each other, and don't even know the other is there. You just need to configure each Sonarr to use a different category, and then you only need one copy of SABnzbd. So you would configure one Sonarr instance with a "tv-current" category and the other with "tv-ended", for instance. As long as the "/downloads" path is common among the containers, the path where the downloads get put (the destination path) can be different per instance of Sonarr.

     

    Now, if you don't want to just reuse one container: yes, you can easily run multiple copies of this container. Just make sure to set each one up with its own configuration path. Though consider that this means additional load on the computer, and an additional VPN connection if you have a limit set by your provider.
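    If you do spin up a second copy, here's a sketch of how the second instance's template might differ from the first (the name, path, and ports here are made up for illustration):

```
Name:        sabnzbdvpn-2                        # any name not already in use
Config path: /mnt/user/appdata/sabnzbdvpn-2      # must NOT be shared with instance one
WebUI port:  8081 (host) -> 8080 (container)     # host port must differ from instance one
```

    The key point is that the two instances share nothing: separate config folders, separate host ports, and (if applicable) separate VPN credentials.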

  6. 3 hours ago, subagon said:

    I've been using pihole running on a Raspberry Pi for a few months with no problems. I migrated over to this docker but have a problem: my Sonarr docker is no longer able to resolve DNS lookups. The Sonarr docker is running on 192.168.1.2 (same as the unraid server) and pihole is on 192.168.1.10 (the same IP I was using when running on the Raspberry Pi). No other docker is using port 53 or 67. If I shut down the pihole docker and restart the Raspberry Pi, Sonarr starts resolving DNS again.

     

    I feel like I'm missing something basic... Ideas?

    I believe this is due to the default segregation of docker containers from the host. I have the same issue. If you give Sonarr a dedicated IP, then the communication between the two should be allowed. I can't find the post, but I think it was release 6.4 that added the restriction preventing dockers from communicating with the host.
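    As a sketch, giving a container its own LAN address in the Unraid container settings looks something like this (the bridge name and address are assumptions for illustration; yours may differ):

```
Network Type:     Custom : br0
Fixed IP address: 192.168.1.21    # any free address on your LAN
```

    With its own address on the bridge, the container's traffic to the pihole container no longer crosses the docker-to-host boundary.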

  7. 5 hours ago, CowboyRedBeard said:

    Does it just take a few days for the docker image to catch up for updates? When I go to the admin page it shows there is an update available, but when I "check for updates" in the docker tab of unraid it doesn't show an available update for the docker????

     

    I do have: 

    pihole/pihole:latest

    https://hub.docker.com/r/pihole/pihole/

    Yes, you need to look at the Tags tab on the Docker Hub page. You can see that latest hasn't been updated for 9 days right now. The updates you're seeing on the admin page are PiHole updates. You can apply them inside the container, but they will only live as long as the container does. It looks like they've started dev containers with the latest updates, so they're probably testing it now.

  8. @phorse Here's what I did to get the kindlegen binary working.

     

    1. Downloaded tarball from https://www.amazon.com/gp/feature.html?docId=1000765211 to the docker config folder.

    2. Untarred it and removed everything from it except `kindlegen`

    3. `chmod 777 kindlegen`

    4. `chown nobody:users kindlegen`

    5. Changed the external binary's path to /config/kindlegen

    (screenshot: kindlegen.PNG)

    6. Confirmed tool showing up in info page

    (screenshot: info.PNG)

     

    Hopefully that gets it working for you. I tested the convert and send feature to my email and it worked fine. Good luck.
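    For anyone who prefers the console, the steps above can be sketched roughly like this. The paths and tarball are assumptions: on Unraid you'd work in the container's /config share (e.g. /mnt/user/appdata/<app>), and the real tarball comes from Amazon's kindlegen page; this demo uses a scratch directory and a stand-in tarball so the commands themselves can be seen end to end.

```shell
set -e
# Scratch directory standing in for the container's /config share.
mkdir -p /tmp/kindlegen-demo
cd /tmp/kindlegen-demo
# Stand-in for the tarball downloaded from Amazon (empty file for the demo).
touch kindlegen
tar -czf kindlegen.tar.gz kindlegen
rm kindlegen
# Extract only the kindlegen binary, discarding everything else in the archive.
tar -xzf kindlegen.tar.gz kindlegen
# Make it readable and executable by the container's user.
chmod 777 kindlegen
# Match Unraid's default docker user; needs root, so failure is ignored elsewhere.
chown nobody:users kindlegen 2>/dev/null || true
ls -l kindlegen
```

    After that, pointing the app's external binary setting at `/config/kindlegen` (step 5 above) is done in the web UI.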

  9. 23 hours ago, kizer said:

     

    Yep. I, however, do not have my unRAID using my pihole for its DNS. Just in case you are.

     

    I think it should be pointed out that this is because, if you're assigning IPs to your dockers, then UnRaid can't talk to the dockers and would not have DNS, as talked about earlier in this thread.

    If you want Plex to talk to PiHole, you can set the DNS in the docker config. Under extra parameters, you can put in

    --dns={PiHole_Address}

    And that would adjust the DNS settings for Plex.
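    As a concrete example (the address here is hypothetical), the Extra Parameters field for the Plex container would contain:

```
--dns=192.168.1.10
```

    Note this only changes the resolver inside the Plex container; other containers keep their existing DNS settings.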

  10. Hey guys,

     

    I had a lot of trouble getting this up and running, and found out it was due to some network firewall rules I have in place. I saw from the logs that OpenVPN was connecting fine and that the application had started, but I could not connect. After playing around for a few hours, I found that it is my firewall rule that blocks my VLANs from communicating with each other. The rule is explained in greater detail here: https://help.ubnt.com/hc/en-us/articles/115010254227-UniFi-How-to-Disable-InterVLAN-Routing-on-the-UniFi-USG#option 2. I gave this docker (as all my other dockers) its own IP address and specified the correct LAN_NETWORK. But with this rule enabled, I'm not able to access the docker; e.g. services such as Radarr cannot use the API to access SABnzbd.

     

    What I'm confused by, and I'm guessing it has to do with the IP rules set up in the container, is why the IP address I'm talking to, contained within the LAN_NETWORK, is being changed. I can't see anything obvious in the browser's network logs about the IP changing, and I'm a little stumped about what's going on here. I can't completely remove the rule for security reasons, but I may be able to craft some rules before it that allow this traffic if I understood better what was going on. Thanks for the help guys!

  11. 17 hours ago, WannabeMKII said:

    Ah ha, 'DNS  resolution not currently available' resolved thanks to your link, appreciated!

     

    As for the second section, that's gone straight over my head O.o Is this something I can do manually, or is this something that can be added to the docker?

     

    I don't know the answer to that either. Preferably they would be baked into the docker container. I don't see why you couldn't apply them manually, but don't expect the changes to survive an upgrade.

  12. On 2/4/2018 at 6:45 AM, WannabeMKII said:

    I've tried using Pi-Hole three times: once on 6.3.5, once on 6.4, and again on 6.4 but with its own IP.

     

    The problem I'm having is that 75% of the time everything is fine, but every now and again pages just don't load. They hang and hang, and refreshing the page 2 or 3 times then causes it to load?

     

    Any ideas?

     

    The only other thing I've noticed is that I get the following on startup and when updating the lists;

     

    
      [✗] DNS resolution is currently unavailable

    I'm not sure why?

     

    See:

    https://lime-technology.com/forums/topic/48744-support-pihole-for-unraid-spants-repo/?do=findComment&comment=616611

     

    As for the loading time: I wonder if it's related to one of the latest blog posts by Pi-Hole.

     

    https://pi-hole.net/2018/02/02/why-some-pages-load-slow-when-using-pi-hole-and-how-to-fix-it/

     

    I don't know if the changes mentioned in that blog post will be incorporated into the docker container.

  13. So I think there's some confusion, for me too. But here's my understanding from looking through the spread-out information.

     

    So I do believe that @J.Nerdy is correct that CP for Small Business does offer UNLIMITED backup (not 5 TB), based on an excerpt shown when you go to CrashPlan for Home now:
     

    Quote

    Simply click below to learn about our unlimited, automatic, secure cloud backup solution at an affordable monthly price of just $10 per month, per device. No hidden fees to set up or restore.

     

    And now to address the 5 TB thing that people have mentioned. It seems that when you click the Learn More link to CrashPlan for Small Business, you can MIGRATE cloud backups that are 5 TB and smaller. So I read this as: you have to start your backups over if they're past the 5 TB limit. I haven't gone through the migration, but that's how I'm reading the information.

    Another reason I don't want to migrate, even just to confirm, is that I don't know whether this docker supports Small Business accounts, as others have mentioned. I'd like to know if someone has it working with a Small Business account.


  14. I have this process running and consuming 10% of my CPU at all times. I found it to be from the CrashPlan-Desktop docker.

     

    Is this expected and is anyone else seeing this as well?

     

    2671 root      20  0 23104 3672 3064 S  16  0.0  0:01.48 /bin/bash /etc/my_init.d/config.sh

    I haven't been seeing it. But I start up the docker when I need it, then stop it when I'm done. Is there a reason you need to keep it running?

  15. Someone kindly help out with this  :-[

    So I'm having the same problem. I checked to make sure the IP/port settings are correct. I even redeployed the docker several times, and on redeploy I'm able to connect to CrashPlan for a while. Then I'm not able to reconnect: the launcher on Windows just stays open, but does not say "unable to connect to server". It will actually stay open for a while and closes when I shut down the docker, so I know there is some sort of connection being formed. The only thing I've come across is: http://support.code42.com/CrashPlan/Latest/Troubleshooting/Unable_To_Connect_To_The_Backup_Engine#Linux_Solution

     

    And I noticed in the docker file that one language is not en_US like the rest. I don't know if this is related or not, as I haven't yet figured out how to manually edit the docker file.
