Everything posted by lococola

  1. I just updated PrivoxyVPN to the latest version, but now my Soulseek container that I'm running through Privoxy can't connect any longer. EDIT: Solved. There was an issue with certificates on AirVPN's side. I fixed it by renewing the cert from the AirVPN client area, then generating a new ovpn file and placing it in PrivoxyVPN's config folder, replacing the old one. Everything is working again.
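     In case it helps anyone, a rough sketch of that last step from the command line. The container name, appdata path and profile filename below are placeholders, so swap in whatever your PrivoxyVPN container actually uses:

        # move the old profile out of the way, drop in the freshly generated one,
        # then restart the VPN container so it picks up the renewed cert
        mv /mnt/user/appdata/binhex-privoxyvpn/openvpn/AirVPN.ovpn /mnt/user/appdata/binhex-privoxyvpn/openvpn/AirVPN.ovpn.bak
        cp /path/to/new/AirVPN.ovpn /mnt/user/appdata/binhex-privoxyvpn/openvpn/
        docker restart binhex-privoxyvpn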
  2. I'm still on 6.8.3. I would like to upgrade to 6.9.2. How can I do this? From Unraid I can only select 6.10.
  3. You're right. Adding the set_real_ip lines to the advanced config fixed the issue for me. Strange that it somehow managed to work for several months without it. Still, working again now. Thanks for the help!
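     For reference, this is roughly what that advanced config block looks like. The real_ip directives are standard NGINX (ngx_http_realip_module); the two ranges below are only examples of Cloudflare's published proxy ranges, so pull the full, current list from cloudflare.com/ips instead of copying these:

        # trust Cloudflare's proxies and restore the visitor's real IP,
        # otherwise the NPM access list only ever sees Cloudflare addresses
        real_ip_header CF-Connecting-IP;
        set_real_ip_from 173.245.48.0/20;
        set_real_ip_from 103.21.244.0/22;
        # ...one set_real_ip_from line per range from cloudflare.com/ips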
  4. Like so: [screenshot] This is what I have in my access list: [screenshot] If I set the Proxy Host to not use this access list, i.e. setting it to Publicly Accessible, then I can reach my vault just fine through the domain, and also log in to it without issues. So the problem is somewhere in the access list, but I can't figure out where.
  5. I have Vaultwarden running through a reverse proxy that I set up with NGINX Proxy Manager. This used to work fine, but suddenly, as of today, I'm getting 403 errors from the reverse proxy. In NGINX I have an access list configured that allows only my external IP and my internal LAN and denies everything else. This used to work fine, but now I am getting 403 errors when this access list is used. If I set the proxy host to Publicly Accessible, everything is fine. I don't know what could cause the access list to suddenly stop working. I am using Cloudflare and I did set the custom config real_ip_header CF-Connecting-IP; in the advanced settings of the proxy host. HTTP/2 support is disabled. Any ideas what the issue could be? I updated NGINX to the latest version. Is there some log that I can check?
  6. I found I couldn't log in anymore, no idea why. So I reset my password according to this procedure from GitHub. But now I've lost my entire configuration! Everywhere I go it says "owner is null". Do I have to set it up again from scratch, or can this be fixed somehow? edit: and now I can't log in anymore... fantastic
  7. Is it possible to enable the recycle bin for just a single share or folder? Right now it's enabled for every share by default, but I have a lot of shares and there appears to be a limit on the number of exclusions you can add.
  8. I also do not receive actual mails. However, after sending the invitation I was able to create an account for that e-mail address by simply going to the login page and clicking the Create Account button. The process worked for the e-mail address I had sent the invite to, so just try it out.
  9. I set up NGINX Proxy Manager combined with Cloudflare and a domain for the Bitwarden docker. It all works, but I had to forward ports 80 and 443 in my router. Now, it seems anyone can access the Bitwarden login page of my docker from the internet. I did disable admin page access using the tip from this thread. But the fact that you can just type my bitwarden.mydomain.com address and get to the login screen worries me. I really only need this to be accessible from within my LAN. Is it possible to somehow hide this page from any WAN access? How do you guys do it? Just accept that the login page is visible to the world?

     EDIT: Looks like I was able to make it a little bit more secure using NGINX Proxy Manager access lists. At first I couldn't get the ACL to work: I added the external IP of my router to the ACL, but I kept seeing 403 errors, and yes, I did save the proxy host config each time. The thing that finally fixed it for me was adding the following code to the proxy host advanced config: real_ip_header CF-Connecting-IP; Now the login page is only visible from my own IP address and gives a 403 error from any other IP. Makes me feel a little bit more secure.
  10. Thanks. I've read through that post and I learned some new things about the behavior of Windows regarding authentication and caching. But in my case, I don't want my main Windows user account to have access to the private share. What I want is for a login prompt to always appear when trying to access that particular share, so that you can only get in with the correct username/password. This was working nicely (albeit with a workaround) before, by accessing the share using the IP address instead of the hostname. But after the KB5003173 update that stopped working.
  11. So, I have a hidden private share on my Unraid server that I am able to access on my Windows 10 machine by manually typing the IP address and path, and then entering the username/password that I set up for that share. Yesterday Windows update KB5003173 was installed on my Windows 10 machine, and now the Unraid share can no longer be accessed from that computer. Other, public shares on the Unraid server are still accessible, but not the private share. I figured it might be an SMB 1.0 thing. In the Unraid settings I changed the SMB option "Enable NetBIOS" to No, which should disable SMB 1.0, but the private share is still not accessible. I then reverted the KB5003173 update on my Windows 10 computer, and after this the private share can be accessed again. Anyone else run into this? Is there a solution other than not installing the Windows security update?
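      For context, my understanding (not verified against what Unraid actually writes to smb.conf) is that the "Enable NetBIOS = No" toggle boils down to a couple of plain Samba directives. If you want to force them explicitly, Unraid's Settings > SMB > SMB Extras accepts raw smb.conf lines along these lines:

         [global]
             # refuse NetBIOS/SMB1-era connections and require SMB2 or newer;
             # assumption: this mirrors what the GUI toggle already does
             disable netbios = yes
             server min protocol = SMB2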
  12. Drive: WD80EZAZ-11TDBA0
      Power-on hours: 7385 (10m, 2d, 17h)
      Load Cycle Count: 9311
      Unraid is set to never spin down the drive, and in the drive properties spin-down is set to default (so, never). Shouldn't the LCC be a factor of 10 lower than it currently is?
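      The rough arithmetic behind my concern (approximate, based only on the SMART values above): 9311 load cycles / 7385 power-on hours ≈ 1.26 cycles per hour, i.e. the heads are parking roughly every 48 minutes even though the drive never spins down.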
  13. I appreciate your help. But it's getting really confusing. I think a reverse proxy is not the solution for me if I don't want to open any ports in my router. So I will start from scratch again and see if I can get Bitwarden to work with some form of self signed certificate. I saw mention of Caddy, perhaps that works better for me. Thanks again though.
  14. Interesting. How would I go about setting that up?
  15. It's confusing. I mean, is port 443 needed for Letsencrypt, or for Bitwarden? I verify Letsencrypt over DNS and I don't have to have any ports open in my router for that. If it is Bitwarden that needs to go online over port 443, then what for? What surprises me is that once you have set up Bitwarden behind the reverse proxy, anyone can access the login page if they know the URL. Sure, they'd have to crack the password and then circumvent the 2FA, so it's safe enough. But still, I really don't like the thought of having my password manager exposed to the internet like that. I'm frankly surprised nobody else seems to have an issue with this. I will check whether I can get Bitwarden to work with a self-signed cert. Since the app does not need to be accessed over WAN it should be secure enough. *If* it works...
  16. So I installed Bitwarden RS on my Unraid server. I only want to access it from within the LAN, and I also do not want to open any ports in my router. Unfortunately, it appears this is not possible: it turns out I can't create a user account if Bitwarden is not on HTTPS. Fine. So I went through all the work of setting up a reverse proxy, getting DuckDNS, and buying a domain (only a few bucks anyway). I spent all day setting it up and getting it all connected. I also had to get Letsencrypt to verify over DNS since my ISP is blocking port 80 (it took me half a day before I figured out this was a thing).

      In the end, it all works. I can access Bitwarden through my domain, I can create an account, etc. However, it only works with port 443 forwarded in my router to Unraid's port 1443. If I disable the port forwarding in the router, Bitwarden is not accessible.

      Now, to me it seems kind of ridiculous that I have to go over the WAN in order to access a locally hosted server. I mean, I am on the local network, and Bitwarden is local. There must be some more efficient way to connect than via the internet... I am guessing I am probably missing some obvious setting or feature? Is there anyone who can shed some light on this? A reverse proxy may be nice to have for future apps, but in this situation all I wanna do is access Bitwarden locally (and securely). I have extensively searched this forum (and the rest of the internet) but information on this particular thing is scarce. It appears everybody really likes to access Bitwarden from the other side of the world for some reason...
  17. Well, I got it running with the proper PUID and PGID now. Unfortunately I still have the issue of Soulseek not writing the client data, so whenever I restart the docker I lose all my settings. I have set the option to 1 minute, but to no avail. Could it be a permissions issue? This is how I set up my paths in the Docker settings: [screenshot] Which user:group should I set the /mnt/cache/appdata/soulseek/ folder and subfolders to? root:root, or nobody:users? I tried both, but it won't save the client data...
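      For anyone checking the same thing, this is roughly how I'd inspect and reset ownership from the Unraid terminal (on Unraid, nobody:users is UID 99 / GID 100; the path is mine, adjust to yours, and note this did not solve it in my case):

         # see which numeric UID/GID currently own the client data
         ls -ln /mnt/cache/appdata/soulseek/

         # re-own everything to nobody:users (99:100) and make it writable
         chown -R nobody:users /mnt/cache/appdata/soulseek/
         chmod -R u+rwX,g+rwX /mnt/cache/appdata/soulseek/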
  18. I want to run this on my Unraid server. Do I need to run this through a VPN? Or a reverse proxy? Or both?
  19. I feel like a total nitwit, but I can't get this to work. I installed MariaDB with default settings. Then I installed WordPress with default settings. But when I open the WP webgui, I get "Error establishing a database connection". What do I need to do to get this working? I know how to install WordPress itself, I'm just not seeing the initial WP setup screen. Never mind, fixed it: I needed to point WordPress at MariaDB with the right host and port in the WP docker settings. Working now!
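      For reference, a minimal sketch of what that setting boils down to if you were wiring up the official WordPress image by hand; the IP, port, credentials and container name here are placeholders for whatever your MariaDB container actually uses:

         # WORDPRESS_DB_HOST takes host:port, i.e. the Unraid box's LAN IP plus the
         # port the MariaDB container publishes; the user/password/database must
         # already exist in MariaDB
         docker run -d --name wordpress \
           -e WORDPRESS_DB_HOST=192.168.1.10:3306 \
           -e WORDPRESS_DB_USER=wordpress \
           -e WORDPRESS_DB_PASSWORD=changeme \
           -e WORDPRESS_DB_NAME=wordpress \
           -p 8080:80 \
           wordpress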
  20. Very cool plugin! I've been playing around with it. One question: when you hover over a menu item, a little colored underline appears. How can you change the color of this underline? I tried to find the element using the Chrome Inspector, but I haven't been successful. Very strange!
  21. I want to use a script to run an rsync command, but I also want to run it inside a screen session, so I can check the progress at any time with screen -r. I have several backup scripts, one for each share that I want to back up. I'm having some issues getting screen to work. This is how the script looks now:

         #!/bin/bash
         export SCREENDIR=/root/.screen
         screen -dmS rsyncdms
         echo "Job '[job name]' started"
         rsync [with parameters etc]
         /usr/local/emhttp/webGui/scripts/notify -s "Job '[job name]' finished at $(date '+%Y-%m-%d %H:%M')"

      So it needs to do 3 things (echo, rsync and notify). When I run the script from the User Scripts page with the Run Script button, it works perfectly. But when I run it in the background and then try to view the status with 'screen -r' in a terminal, I just get a blank terminal with a prompt. I read it's because you need to put something immediately behind the 'screen -dmS rsyncdms' command, but then it still doesn't work. How can I get these 3 things executed in order, all within their own screen session? I saw another post where someone made a new user script and called the .sh file from within it, but I want to integrate the screen function into the backup script itself, if possible.
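      A sketch of how I'd restructure it based on that hint (my own attempt, with the rsync parameters and job name left as placeholders): put the three steps in a function and hand that function to screen as its command, so the detached session actually has something to run and show:

         #!/bin/bash
         export SCREENDIR=/root/.screen

         # everything the job should do, in order
         run_backup() {
             echo "Job '[job name]' started"
             rsync [with parameters etc]
             /usr/local/emhttp/webGui/scripts/notify -s "Job '[job name]' finished at $(date '+%Y-%m-%d %H:%M')"
         }
         export -f run_backup

         # 'screen -dmS rsyncdms' on its own just starts an empty detached shell;
         # giving it a command is what makes 'screen -r rsyncdms' show the job.
         # The session closes by itself once run_backup finishes.
         screen -dmS rsyncdms bash -c 'run_backup'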
  22. Yep, that was it. I ran rsync with -v and then I could see it didn't even attempt my custom-named key, only the id_rsa ones. Naming the key accordingly fixed it.
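      For anyone who'd rather keep a custom-named key than rename it, the standard alternative (not what I ended up doing) is to point rsync's ssh transport at the key explicitly; the paths and host below are placeholders:

         # tell the underlying ssh which identity file to use
         rsync -av -e "ssh -i /root/.ssh/my_custom_key" /mnt/user/share/ user@backupserver:/backups/share/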