lococola

Members

  • Posts: 21

  1. I also do not receive the actual e-mails. However, after sending the invitation I was able to create an account for that e-mail address by simply going to the login page and clicking the Create Account button: the process actually works for an e-mail address that was sent an invite. So just try it out.
  2. I set up Nginx Proxy Manager combined with Cloudflare and a domain for the Bitwarden docker. It all works, but I had to forward ports 80 and 443 in my router. Now it seems anyone on the internet can reach the Bitwarden login page of my docker. I did disable admin page access using the tip from this thread, but the fact that you can just type bitwarden.mydomain.com and get to the login screen worries me. I really only need this to be accessible from within my LAN. Is it possible to somehow hide this page from any WAN access? How do you guys do it? Just accept that the login page is visible to the world? EDIT: Looks like I was able to make it a bit more secure using Nginx Proxy Manager Access Lists. At first I couldn't get the ACL to work: I added the external IP of my router to the ACL, but I kept seeing 403 errors, even though I saved the proxy host config each time. What finally fixed it was adding the following line to the proxy host advanced config: real_ip_header CF-Connecting-IP; Now the login page is only visible from my own IP address and returns a 403 error from any other IP. Makes me feel a little more secure.
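For reference, the relevant part of the proxy host's "Advanced" config might look like this sketch. The IP address is a placeholder for your own public IP, and behind Cloudflare you may also need `set_real_ip_from` entries for Cloudflare's ranges before `real_ip_header` takes effect:

```nginx
# Nginx Proxy Manager proxy host "Advanced" tab (sketch; 203.0.113.4 is a
# placeholder for your own public IP):
real_ip_header CF-Connecting-IP;          # restore the visitor's IP behind Cloudflare
# set_real_ip_from <Cloudflare IP range>;  # may also be required, one line per range
allow 203.0.113.4;                        # your own IP
deny all;                                 # everyone else gets 403
```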
  3. Thanks. I've read through that post and learned some new things about how Windows handles authentication and caching. But in my case I don't want my main Windows user account to have access to the private share. What I want is for a login prompt to always appear when accessing that particular share; only with the correct username/password can you get in. This was working nicely (albeit with a workaround) by approaching the share using the IP address instead of the hostname, but after the KB5003173 update that stopped working.
  4. So, I have a hidden private share on my Unraid server that I am able to access from my Windows 10 machine by manually typing the IP address and path, and then entering the username/password that I set up for that share. Yesterday Windows update KB5003173 was installed on my Windows 10 machine, and the Unraid share can no longer be accessed from that computer. Other, public shares on the Unraid server are still accessible, but not the private share. I figured it might be an SMB 1.0 thing, so in the Unraid settings I changed the SMB option "Enable Netbios" to No, which should disable SMB 1.0. But the private share is still not accessible. Then I reverted the KB5003173 update on my Windows 10 computer, and after this the private share can be accessed again. Anyone else run into this? Is there a solution other than not installing the Windows security update?
  5. Drive: WD80EZAZ-11TDBA0. Power-on hours: 7385 (10m, 2d, 17h). Load Cycle Count: 9311. Unraid is set to never spin down the drive, and in the drive properties spin-down is set to default (so, never). Shouldn't the LCC be a factor of 10 lower than it currently is?
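As a back-of-the-envelope check, the reported numbers work out to more than one load cycle per hour even though the drive never spins down, which may point at the drive's own head-parking (idle) timer rather than anything Unraid is doing:

```shell
# Rough sanity check on the head-parking rate, using the SMART numbers above.
poh=7385    # Power_On_Hours
lcc=9311    # Load_Cycle_Count
awk -v l="$lcc" -v h="$poh" 'BEGIN { printf "load cycles per hour: %.2f\n", l/h }'
# prints: load cycles per hour: 1.26
```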
  6. I appreciate your help. But it's getting really confusing. I think a reverse proxy is not the solution for me if I don't want to open any ports in my router. So I will start from scratch again and see if I can get Bitwarden to work with some form of self signed certificate. I saw mention of Caddy, perhaps that works better for me. Thanks again though.
  7. Interesting. How would I go about setting that up?
  8. It's confusing. I mean, is port 443 needed for Let's Encrypt, or for Bitwarden? I verify Let's Encrypt over DNS and don't need any ports open in my router for that. If it is Bitwarden that needs to go online over port 443, then what for? What surprises me is that once you have set up Bitwarden behind the reverse proxy, anyone can access the login page if they know the URL. Sure, they would have to crack the password and then circumvent the 2FA, so it's safe enough. But still, I really don't like the thought of having my password manager available over the internet like that. I'm frankly surprised nobody else seems to have an issue with this. I will check whether I can get Bitwarden to work with a self-signed cert. Since the app does not need to be accessed over WAN it should be secure enough. *If* it works...
  9. So I installed Bitwarden RS on my Unraid server. I only want to access it from within the LAN, and I also do not want to open any ports in my router. Unfortunately it appears this is not possible: it turns out I can't create a user account if Bitwarden is not on HTTPS. Fine. So I go through the whole work of setting up a reverse proxy, getting DuckDNS, and buying a domain (only a few bucks anyway). I spent all day setting it up and getting it all connected. I also had to get Let's Encrypt to verify over DNS since my ISP blocks port 80 (it took me half a day to figure out this was a thing). In the end, it all works: I can access Bitwarden through my domain, I can create an account, etc. However, it only works with port 443 forwarded in my router to Unraid's port 1443. If I disable the port forwarding in the router, Bitwarden is not accessible. Now, to me it seems kind of ridiculous that I have to go over the WAN in order to access a locally hosted server. I mean, I am local, and Bitwarden is local. There must be a more efficient way to connect than via the internet... I am guessing I am probably missing some obvious setting or feature? Is there anyone who can shed some light on this? A reverse proxy may be nice to have for future apps, but in this situation all I wanna do is access Bitwarden locally (and securely). I have extensively searched this forum (and the rest of the internet) but information on this particular thing is scarce. It appears everybody really likes to access Bitwarden from the other side of the world for some reason...
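For purely local access, one common approach is split DNS: have the LAN's resolver answer the Bitwarden hostname with the reverse proxy's LAN address, so traffic never leaves the network while the Let's Encrypt certificate still matches the domain. A sketch, assuming the proxy listens on 192.168.1.10 (a placeholder for your Unraid/proxy IP):

```
# dnsmasq / Pi-hole style local DNS record:
address=/bitwarden.mydomain.com/192.168.1.10

# or the equivalent per-client hosts file entry:
192.168.1.10  bitwarden.mydomain.com
```

With this in place, port forwarding in the router can stay disabled and the login page is unreachable from the WAN.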
  10. Well, I got it running with the proper PUID and PGID now. Unfortunately I still have the issue of Soulseek not writing the client data, so whenever I restart the docker I lose all my settings. I have set the option to 1 minute, but to no avail. Could it be a permissions issue? This is how I set up my paths in the Docker settings: [screenshot]. Which user:group should I set the /mnt/cache/appdata/soulseek/ folder and subfolders to? root:root, or nobody:users? I tried both, but it won't save the client data...
  11. I want to run this on my Unraid server. Do I need to run this through a VPN? Or a reverse proxy? Or both?
  12. I feel like a total nitwit, but I can't get this to work. I installed MariaDB with default settings, then installed WordPress with default settings. When I open the WP webgui, I get "Error establishing a database connection". What do I need to do to get this working? I know how to install WordPress itself; I'm just not seeing the initial WP setup screen. Never mind, fixed it: I needed to set the path to MariaDB, with the right port, in the WP docker settings. Working now!
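For anyone hitting the same error: the fix above boils down to pointing WordPress at the MariaDB container's address and mapped port instead of localhost. In wp-config.php terms it would look something like this (the IP and port are placeholders for your own MariaDB container's settings):

```php
// wp-config.php excerpt — the WP docker's DB settings map to these defines.
// '192.168.1.10:3306' is a placeholder: use your MariaDB container's IP:port.
define( 'DB_HOST', '192.168.1.10:3306' );
```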
  13. Very cool plugin! Been playing around with it. One question: when you hover over a menu item, a little colored underline appears. How can you change the color of this underline? I tried to find the element using the Chrome Inspector, but I haven't been successful. Very strange!
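In case it helps anyone with the same question: such hover underlines are typically drawn with a border-bottom (or a pseudo-element) on the hovered menu item, so a custom CSS override along these lines could work. The selector here is a guess; the real class name depends on the theme and needs to be found with the inspector:

```css
/* Hypothetical selector — inspect the theme to find the actual class */
.menu-item:hover {
  border-bottom-color: #e22828;
}
```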
  14. I want to use a script to run an rsync command, but I also want to run it inside a screen session so I can check the progress at any time with screen -r. I have several backup scripts, one for each share that I want to back up. I'm having some issues getting screen to work. This is how the script looks now:

      #!/bin/bash
      export SCREENDIR=/root/.screen
      screen -dmS rsyncdms
      echo "Job '[job name]' started"
      rsync [with parameters etc]
      /usr/local/emhttp/webGui/scripts/notify -s "Job '[job name]' finished at $(date '+%Y-%m-%d %H:%M')"

      So it needs to do 3 things (echo, rsync and notify). When I run the script from the User Scripts page with the Run Script button, it works perfectly. But when I run it in the background and then try to view the status with 'screen -r' in a terminal, I just get a blank terminal with a prompt. I read that this is because you need to put something immediately after the 'screen -dmS rsyncdms' command, but even then it still doesn't work. How can I get these 3 things executed in order, all within their own screen session? I saw another post where someone made a new user script that called the .sh file, but I want to integrate the screen function into the backup script itself, if possible.
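One way to make this work (a sketch, not tested against Unraid's User Scripts specifically) is to hand the whole command sequence to screen, so all three commands run inside the detached session instead of in the calling shell. Bare `screen -dmS rsyncdms` starts an empty session, which is why `screen -r` showed only a blank prompt. The `[job name]` and rsync placeholders are kept from the post:

```shell
#!/bin/bash
export SCREENDIR=/root/.screen

# Put the whole job in a function so the sequence stays readable,
# then export it so the bash started by screen can see it.
do_backup() {
  echo "Job '[job name]' started"
  rsync [with parameters etc]
  /usr/local/emhttp/webGui/scripts/notify -s "Job '[job name]' finished at $(date '+%Y-%m-%d %H:%M')"
}
export -f do_backup

# Run the function inside the detached session; attach later with: screen -r rsyncdms
screen -dmS rsyncdms bash -c do_backup
```

This keeps the screen logic inside the backup script itself, as wanted, and `screen -r rsyncdms` will show the rsync output while the job is running.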