Passwordless SSH login



1) If necessary, generate an SSH key pair on each Mac or Linux machine that needs access, using ssh-keygen.

 

2) Create an authorized_keys file for the unRAID server by concatenating the id_rsa.pub files from all the machines that require access.

 

3) Copy this file to your server's /root/.ssh/ folder.
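
For example, here is a minimal sketch of steps 1) to 3) as run from a client machine; the host name "tower" and the RSA key type are only placeholders, adjust them to your setup:

# on each client that needs access ("tower" stands in for your unRAID server)
ssh-keygen -t rsa                              # accept the defaults; creates ~/.ssh/id_rsa and ~/.ssh/id_rsa.pub
cat ~/.ssh/id_rsa.pub >> authorized_keys       # collect every client's id_rsa.pub into one file
scp authorized_keys root@tower:/root/.ssh/     # step 3: copy the file to the server (the folder must already exist)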

 

 

This will only work until the next reboot. To make the setup persistent:

 

1) Copy the authorized_keys file to /boot/config/ssh/.

 

2) Add this to the end of your /boot/config/go file, using your preferred editor:

 

# restore SSH authorized_keys from the flash drive at every boot
mkdir -p /root/.ssh
chmod 700 /root/.ssh
cp /boot/config/ssh/authorized_keys /root/.ssh/
chmod 600 /root/.ssh/authorized_keys


1. On Linux or Mac, use ssh-keygen to generate a key pair. Two files will be generated: a private key file (e.g. id_rsa) and a public key file (e.g. id_rsa.pub).

 

2. On your unRAID server, go to the folder /root/.ssh (create it if it doesn't exist), then edit the file /root/.ssh/authorized_keys (create it if it doesn't exist) and paste in the contents of the public key file (id_rsa.pub). If you have multiple Linux/Mac computers, you can put multiple public keys into authorized_keys, one per line.

 

3. Change the permissions of authorized_keys to 600:

 

    chmod 600 /root/.ssh/authorized_keys

 

Now you can SSH into unRAID without entering a password.
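
To check that it is really the key (and not a cached or typed password) doing the work, you can tell ssh to refuse password authentication for a single test; this is only a test command, not part of the guide, and "tower" is a placeholder for your server:

ssh -o PasswordAuthentication=no root@tower uptime   # prints the server's uptime if key auth works, errors out instead of prompting otherwise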

 

However, after a reboot you'll lose everything in the /root folder, so you need a way to restore these settings automatically. Here's how I did it:

 

1. Create a script /boot/config/ssh/setup_ssh_client.sh:

 

   

#!/bin/bash
# Recreate /root/.ssh and restore the authorized_keys backup from the flash drive

SSH_DIR=/root/.ssh

mkdir -p ${SSH_DIR}
chmod 755 ${SSH_DIR}
cp /boot/config/ssh/authorized_keys ${SSH_DIR}/authorized_keys
chmod 600 ${SSH_DIR}/authorized_keys

 

    make it executable:

 

    chmod 755 /boot/config/ssh/setup_ssh_client.sh

 

2. Copy the previously created authorized_keys into the /boot/config/ssh folder:

 

    cp /root/.ssh/authorized_keys /boot/config/ssh/

 

3. Edit /boot/config/go and add the following line at the end, so that the setup script is run at boot:

 

    /boot/config/ssh/setup_ssh_client.sh

 

All done. Now your SSH settings will be set up automatically at each reboot.


A slightly better way to maintain the keys across reboots is to

* copy the authorized_keys file to /boot/config/ssh/root.pubkeys

* copy /etc/ssh/sshd_config to /boot/config/ssh

* modify /boot/config/sshd_config to set the following line

AuthorizedKeysFile      /etc/ssh/%u.pubkeys
 

 

This will allow you to keep the keys on the flash always and let the ssh startup scripts do all the copying.
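
As a minimal sketch, those three steps come down to something like this on the server (assuming the authorized_keys file from the earlier posts already exists):

mkdir -p /boot/config/ssh
cp /root/.ssh/authorized_keys /boot/config/ssh/root.pubkeys
cp /etc/ssh/sshd_config /boot/config/ssh/
# then edit /boot/config/ssh/sshd_config with your preferred editor so it contains:
#   AuthorizedKeysFile      /etc/ssh/%u.pubkeys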


A slightly better way to maintain the keys across reboots is to

* copy the public key to /boot/config/ssh/root.pubkeys

* copy /etc/ssh/sshd_config to /boot/config/ssh

* modify /boot/config/sshd_config to set the following line

AuthorizedKeysFile      /etc/ssh/%u.pubkeys

 

This will allow you to keep the keys on the flash always and let the ssh startup scripts do all the copying.

 

Good idea! Considering we always log in to the unRAID server as root, there's no need to provide multiple authorized-key files.


By using this SSH Plugin and reading the hidden README, which you can view with this command:

cat /boot/config/plugins/ssh/read_me.txt

or here:

Quote

 

Create a file called "authorized_keys" in this directory and paste into it the contents of a public key from a key pair.

To do this, you must have created a public and private key-pair.  Use the following steps to do this:

From command line (telnet / putty)

1. Type "ssh-keygen -t rsa -f /boot/config/plugins/ssh/<USERNAME>/.ssh/id_rsa"
   NB.  replace "<USERNAME>" with the name of the user.
   
2. When prompted, type a passphrase if you wish additional security for the private key, or press Enter for no passphrase.

3. Copy the public key into the same location and call it "authorized_keys".
   eg.  cp /boot/config/plugins/ssh/<USERNAME>/.ssh/id_rsa.pub /boot/config/plugins/ssh/<USERNAME>/.ssh/authorized_keys
   
Verify everything has been created correctly.

Upon restarting SSH, the plug-in will look for (and find) authorized_keys and copy this file to the user's home directory.  eg.  /home/someuser/.ssh/authorized_keys

----------------------------------------------------------

The private part of the key is "id_rsa".  You must take this to the system you intend to connect *from*.  If you intend to use Putty to connect, then you *MUST* first convert the private key from standard OpenSSH format to Putty compatible format.

A copy of PUTTYGEN for UnRAID has been included.  To convert the private key, follow these steps:

From command line (telnet / putty):

1. Type "/usr/bin/puttygen /boot/config/plugins/ssh/<USERNAME>/.ssh/id_rsa -o /boot/config/plugins/ssh/<USERNAME>/.ssh/id_rsa.ppk
2. In Putty, create an entry to your UnRAID server and in "Connection -> SSH -> Auth" section of Putty, browse for the file you created (id_rsa.ppk).

 

 

 

Todd Pike explains how to do it:

 

In my case I already had an authorized_keys file populated with a few ed25519 keys,

so I simply copied it into /boot/config/plugins/ssh/<USERNAME>/.ssh/authorized_keys.

 

Thanks Todd :)

On 10/19/2016 at 4:12 AM, ken-ji said:

A slightly better way to maintain the keys across reboots is to

* copy the authorized_keys file to /boot/config/ssh/root.pubkeys

* copy /etc/ssh/sshd_config to /boot/config/ssh

* modify /boot/config/sshd_config to set the following line


AuthorizedKeysFile      /etc/ssh/%u.pubkeys
 

 

This will allow you to keep the keys on the flash always and let the ssh startup scripts do all the copying.

 

Wanted to thank @ken-ji, this method works nicely.

 

I'm sure anyone wanting to follow your instructions will know this already, but I thought I'd just point out that:

 

"* modify /boot/config/sshd_config to set the following line" should read "* modify /boot/config/ssh/sshd_config to set the following line", i think?

 

On 10/19/2016 at 4:14 AM, georgez said:

1. On Linux or Mac, use ssh-keygen to generate a key pair [...]

Hey georgez, great guide. But I understand that ken-ji had a better-practice idea below; could you maybe edit this guide to reflect that? For us noobs that need step-by-step guides :D

On 8/8/2018 at 3:10 PM, MyKroFt said:

I know this is old, but ...

 

I am trying this and getting "server refused our key". I am trying to use ExtraPuTTY.

  

suggestions?

There's a bug on the forums that introduces invisible characters into posts, so if you copied text from a post into a script on your unRAID machine, the script may not work. I realized recently that this was the reason passwordless SSH was not working for me.

 

Example from georgez's post: [attached screenshot showing the stray invisible characters]

 

You may need to manually type it or use a text editor to catch these errors.
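
One generic way to spot stray non-printing characters is to make them visible, for example:

cat -A /boot/config/ssh/setup_ssh_client.sh   # shows tabs as ^I, line ends as $, and other hidden bytes as ^/M- sequences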

 puttygen /boot/config/plugins/ssh/mykroft/.ssh/id_rsa -o /boot/config/plugins/ssh/mykroft/.ssh/id_rsa.ppk


 

getting error:

puttygen: error loading `/boot/config/plugins/ssh/mykroft/.ssh/id_rsa': unrecognised key type
 

I did use

ssh-keygen -t rsa -f /boot/config/plugins/ssh/<USERNAME>/.ssh/id_rsa

to generate the key

 

any suggestions?
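
I'm not certain this is the cause, but newer ssh-keygen releases default to the OpenSSH private key format, which some puttygen builds do not recognise; if so, re-exporting the key in the older PEM format might help (untested assumption):

ssh-keygen -p -m PEM -f /boot/config/plugins/ssh/<USERNAME>/.ssh/id_rsa   # rewrites the private key in PEM format; press Enter to keep no passphrase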

On 10/19/2016 at 10:14 AM, georgez said:

1. On Linux or Mac, use ssh-keygen to generate a key pair [...]

Thank you sir, your reply perfectly solved my problem!

On 12/19/2019 at 10:27 AM, trurl said:

The post you quoted is over 3 years old.

 

Unraid 6.8 and above does not allow executing scripts from the flash (/boot) drive. Use the User Scripts plugin instead of the go file.

Will unRAID ever support SSH keys? I know there is a plugin, but for me that plugin simply will not install, and I have done so much research trying to find a solution for this, but there are so many.

Secondly, do you have an Unraid-recommended solution for this?

23 hours ago, Ustrombase said:

Secondly, do you have an Unraid-recommended solution for this?

I basically did what the example shell script has in it, but I put it all in my go script, and it works just fine in 6.8.3 since it doesn't execute anything directly (just copies and changes permissions). 6.8.0 tightened the security of the go script to not execute things directly from the boot flash drive.

 

My go script contains this:

mkdir /root/.ssh
chmod 700 /root/.ssh
cp /boot/config/ssh/authorized_keys /root/.ssh/
chmod 600 /root/.ssh/authorized_keys

And I just make sure that /boot/config/ssh/authorized_keys contains the public key I want to use.
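
For example, one way to get the key there from a client (the host name "tower" is a placeholder) is to append it over ssh:

cat ~/.ssh/id_rsa.pub | ssh root@tower 'mkdir -p /boot/config/ssh; cat >> /boot/config/ssh/authorized_keys'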

 

Alternatively, you could use the go script to copy the example shell script to /tmp, set the permissions, then execute it.

 

I've not tested it but it would be something like this in the go script (with the setup_ssh_client.sh unaltered and in the recommended location):

cp /boot/config/ssh/setup_ssh_client.sh /tmp/setup_ssh_client.sh
chmod +x /tmp/setup_ssh_client.sh
bash /tmp/setup_ssh_client.sh

 


I'm at a loss after 3 days of research and many reboots on this.

 

I am trying to create a "safe" connection between my unRAID server and a Raspberry Pi which will ultimately be remote.

I am using Raspbian Lite 10 on the Raspberry Pi. I have already installed WireGuard on it and it connects to unRAID successfully at each reboot. So that covers the "safe" tunnel between the two.

 

Unraid (10.253.0.1) can ping and SSH into the Raspberry Pi (10.253.0.3) using the IP addresses in brackets, and the other way around. So SSH works wonderfully... but only by entering the password.

 

I created the SSH public keys [using ssh-keygen] and added them to the authorized_keys file on each endpoint [cat raspberrypi.pub >> authorized_keys], but to no avail. Everything works fine, but the password is always required. I have of course tried disabling password login on unRAID [in the settings section once the SSH plugin is installed] to see whether it would force the use of the stored public key, but no, the connection then fails.

 

I want to script some rsync jobs later, so I absolutely need passwordless login working. I have tried many tutorials, and I think I understand the mechanism quite well. As far as I can tell I have done everything necessary, but I don't understand why the password is still required for every SSH attempt I make (in either direction).

 

I have also tried all the different chmod values on the various files and folders, but no combination seems to work.

 

Worth noting: I am using the root account on each side, even on the Raspberry Pi (so not the pi account; I have enabled root, but even other new users created with superuser privileges show the same behaviour and the password is requested).

 

Any help would be greatly appreciated


OK... I have found a solution, and I'm writing it here as it may help others troubleshoot passwordless SSH connections. I knew I had done everything right and couldn't work out why it worked with the password entered but not without.

 

The whole key is the "-v" (verbose) option of the SSH command, which allowed me to see all the steps and find out what was wrong.

 

As a reminder, the host is unraid and the remote guest is a Raspberry Pi running Debian 10 Lite (without desktop).

Quote

root@raspberrypi:~/.ssh# ssh -v root@unraid
OpenSSH_7.9p1 Raspbian-10+deb10u2, OpenSSL 1.1.1d  10 Sep 2019
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: /etc/ssh/ssh_config line 19: Applying options for *
debug1: Connecting to unraid [10.253.0.1] port 22.
debug1: Connection established.
debug1: identity file /root/.ssh/id_rsa type -1
debug1: identity file /root/.ssh/id_rsa-cert type -1
debug1: identity file /root/.ssh/id_dsa type -1
debug1: identity file /root/.ssh/id_dsa-cert type -1
debug1: identity file /root/.ssh/id_ecdsa type -1
debug1: identity file /root/.ssh/id_ecdsa-cert type -1
debug1: identity file /root/.ssh/id_ed25519 type -1
debug1: identity file /root/.ssh/id_ed25519-cert type -1
debug1: identity file /root/.ssh/id_xmss type -1
debug1: identity file /root/.ssh/id_xmss-cert type -1
debug1: Local version string SSH-2.0-OpenSSH_7.9p1 Raspbian-10+deb10u2
debug1: Remote protocol version 2.0, remote software version OpenSSH_8.1
debug1: match: OpenSSH_8.1 pat OpenSSH* compat 0x04000000
debug1: Authenticating to unraid:22 as 'root'
debug1: SSH2_MSG_KEXINIT sent
debug1: SSH2_MSG_KEXINIT received
debug1: kex: algorithm: curve25519-sha256
debug1: kex: host key algorithm: ecdsa-sha2-nistp256
debug1: kex: server->client cipher: [email protected] MAC: <implicit> compression: none
debug1: kex: client->server cipher: [email protected] MAC: <implicit> compression: none
debug1: expecting SSH2_MSG_KEX_ECDH_REPLY
debug1: Server host key: ecdsa-sha2-####################################################################
debug1: Host 'unraid' is known and matches the ECDSA host key.
debug1: Found key in /root/.ssh/known_hosts:1
debug1: rekey after 134217728 blocks
debug1: SSH2_MSG_NEWKEYS sent
debug1: expecting SSH2_MSG_NEWKEYS
debug1: SSH2_MSG_NEWKEYS received
debug1: rekey after 134217728 blocks
debug1: Will attempt key: /root/.ssh/id_rsa
debug1: Will attempt key: /root/.ssh/id_dsa
debug1: Will attempt key: /root/.ssh/id_ecdsa
debug1: Will attempt key: /root/.ssh/id_ed25519
debug1: Will attempt key: /root/.ssh/id_xmss
debug1: SSH2_MSG_EXT_INFO received
debug1: kex_input_ext_info: server-sig-algs=<ssh-ed25519,ssh-rsa,rsa-sha2-256,rsa-sha2-512,ssh-dss,ecdsa-sha2-nistp256,ecdsa-sha2-nistp384,ecdsa-sha2-nistp521>
debug1: SSH2_MSG_SERVICE_ACCEPT received
debug1: Authentications that can continue: publickey,password,keyboard-interactive
debug1: Next authentication method: publickey
debug1: Trying private key: /root/.ssh/id_rsa
debug1: Trying private key: /root/.ssh/id_dsa
debug1: Trying private key: /root/.ssh/id_ecdsa
debug1: Trying private key: /root/.ssh/id_ed25519
debug1: Trying private key: /root/.ssh/id_xmss
debug1: Next authentication method: keyboard-interactive
debug1: Authentications that can continue: publickey,password,keyboard-interactive
debug1: Next authentication method: password
root@unraid's password: <I entered the password there>
root@raspberrypi:~/.ssh# v
total 12
-rw-r--r-- 1 root root  444 Jun 12 19:29 known_hosts
-rw------- 1 root root 1823 Jun 12 19:28 rasp
-rw-r--r-- 1 root root  398 Jun 12 19:28 rasp.pub

root@raspberrypi:~/.ssh# mv rasp id_rsa
root@raspberrypi:~/.ssh# mv rasp.pub id_rsa.pub
root@raspberrypi:~/.ssh# v
total 12
-rw------- 1 root root 1823 Jun 12 19:28 id_rsa
-rw-r--r-- 1 root root  398 Jun 12 19:28 id_rsa.pub
-rw-r--r-- 1 root root  444 Jun 12 19:29 known_hosts
root@raspberrypi:~/.ssh# ssh root@unraid
Last login: Fri Jun 12 20:31:58 2020 from 10.253.0.3
Linux 4.19.107-Unraid.

root@unraid:~# exit
logout
Connection to unraid closed.

root@raspberrypi:~/.ssh# ^C

 

As you can see, all it took was renaming the key pair to the default names (id_rsa / id_rsa.pub) so that ssh would actually try the key, with the matching public key already uploaded to the host.

 

To do that "properly", without having to set permissions by hand, you can use the ssh-copy-id -i <path+key_name> user@host command.

 

So from now on, if you are having issues with your keys or passwordless SSH connections, you know how to find out what's missing or not working!
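
As an alternative to renaming the files, a client-side ~/.ssh/config entry can point ssh at a key with a non-default name; here is a sketch using the names from this post (the alias "unraid" can be anything you like):

# ~/.ssh/config on the Raspberry Pi
Host unraid
    HostName 10.253.0.1
    User root
    IdentityFile ~/.ssh/rasp

With that in place, a plain "ssh unraid" will offer the rasp key automatically.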


For those of you who followed these instructions but still get prompted for a password, I found that I had to change the permissions in the /root/.ssh folder to make the ssh server happy. It doesn't complain loudly so I had to look at /var/log/syslog to find the error. Run this on the UnRAID server to fix your file permissions. It will do the "right" thing:

 

chmod -R -v g-rwx,o-rwx /root/.ssh

 

Also, if you need to restart the ssh server, do this:

 

/etc/rc.d/rc.sshd restart
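
To see what sshd is objecting to before and after the fix, something like this on the UnRAID console should show the permission complaints:

grep sshd /var/log/syslog | tail -n 20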

 

On 10/19/2016 at 5:12 AM, ken-ji said:

A slightly better way to maintain the keys across reboots is to

* copy the authorized_keys file to /boot/config/ssh/root.pubkeys

* copy /etc/ssh/sshd_config to /boot/config/ssh

* modify /boot/config/sshd_config to set the following line


AuthorizedKeysFile      /etc/ssh/%u.pubkeys
 

 

This will allow you to keep the keys on the flash always and let the ssh startup scripts do all the copying.

This solution looks very neat; is it still valid for Unraid 6.8.3? I followed the instructions, but unfortunately I still need to enter a password when connecting via SSH, even after an SSH restart with the following command:

sudo /etc/rc.d/rc.sshd restart

FYI I use a Mac to generate the keys and to connect to Unraid.

 

Is it important how the keys are generated? Would this be okay or should the email field be left out?

ssh-keygen -t rsa -C "[email protected]"

I have also modified the permissions on authorized_keys with the "chmod 600" command, as well as on the /root/.ssh folder, as frakman1 advised in the previous post.

Should the authorized_keys contents still be placed in /boot/config/ssh/root.pubkeys even though the /boot/config/ssh/sshd_config file is modified in this way?

AuthorizedKeysFile      /etc/ssh/%u.pubkeys

It looks like it is looking for a file in /etc/ssh, but the file is placed in /boot/config/ssh/root.pubkeys.

 

Thank you in advance for any reply.


It's still valid.

When you run

/etc/rc.d/rc.sshd restart

it restarts the sshd service by first copying all the files from /boot/config/ssh to /etc/ssh, regenerating the server keys if needed, then backing up the newly generated server keys.

If you did what my post suggested, the authorized keys file will now be /etc/ssh/root.pubkeys by default, which you will need to back up to the /boot/config/ssh directory at your discretion.

If you are being asked for a password only after you restarted sshd, you must have added the public keys to the wrong place (/etc/ssh/root.pubkeys) and then restarted sshd. Add them to /boot/config/ssh/root.pubkeys, then restart sshd.
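
In other words, the server-side workflow looks roughly like this (the public key path is a placeholder for wherever your client's key ended up):

cat /tmp/id_rsa.pub >> /boot/config/ssh/root.pubkeys
/etc/rc.d/rc.sshd restart   # as described above, this copies /boot/config/ssh/* to /etc/ssh and restarts sshd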

 

 
