How To Configure SSH for a RAC Installation
GOAL
This document explains how to configure SSH, which is required for a RAC installation. The instructions in the installation guide are also correct, but occasionally they do not work, and the reason for this is not always clear. After some investigation, the steps below have been found to work as well.
Starting with 11gR2, the Oracle Universal Installer can set up SSH automatically using the ‘SSH Connectivity’ button.
SOLUTION
To configure SSH you need to perform the following steps on each node in the cluster.
$ cd $HOME
$ mkdir .ssh
$ chmod 700 .ssh
$ cd .ssh
$ ssh-keygen -t rsa
At the prompts, accept the default location for the key file (press Enter).
Then enter and confirm a passphrase (or press Enter twice for no passphrase).
$ ssh-keygen -t dsa
At the prompts, accept the default location for the key file (press Enter).
Then press “Enter” twice to accept no passphrase.
Please note: SSH with passphrase is not supported for Oracle Clusterware 11g release 2 and later releases.
If you provide a passphrase on a pre-11.2 release, you need to perform two additional steps.
$ exec /usr/bin/ssh-agent $SHELL
$ /usr/bin/ssh-add
These commands start an ssh agent for the current shell and add your keys to it.
$ cat *.pub >> authorized_keys.<nodeX> (use the node name for <nodeX> so that the files can be distinguished later)
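For example, on a node whose hostname is node1 (the node names node1 through node4 are used here purely for illustration):
$ cat *.pub >> authorized_keys.node1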
Now do the same steps on the other nodes in the cluster.
When these steps have been completed on all nodes, copy each authorized_keys.<nodeX> file to every node, into $HOME/.ssh/.
For example, in a 4-node cluster, after the copy each node's .ssh directory will contain 4 files named authorized_keys.<nodeX>.
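As an illustration of the copy step (a sketch assuming 4 nodes named node1 through node4, run from node1; scp will still prompt for a password at this point because passwordless SSH is not yet configured):
$ scp $HOME/.ssh/authorized_keys.node1 node2:$HOME/.ssh/
$ scp $HOME/.ssh/authorized_keys.node1 node3:$HOME/.ssh/
$ scp $HOME/.ssh/authorized_keys.node1 node4:$HOME/.ssh/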
Then on EACH node continue the configuration of SSH by doing the following:
$ cd $HOME/.ssh
$ cat *.node* >> authorized_keys
$ chmod 600 authorized_keys
NOTE: ALL public keys must appear in ALL authorized_keys files, INCLUDING the LOCAL public key for each node.
To test that everything is working correctly, execute the following command:
$ ssh <hostnameX> date
For example, in a 4-node environment:
$ ssh node1 date
$ ssh node2 date
$ ssh node3 date
$ ssh node4 date
Run these 4 commands on each node, including the ssh back to the node itself. Here nodeX is the hostname of the node.
The first time, you will be asked to add the node to a file called ‘known_hosts’; this is expected, so answer the question with ‘yes’. After that, when SSH is configured correctly, the date is returned and you are not prompted for a password.
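If you prefer, the 4 tests can be run in a single loop (a sketch assuming the node names node1 through node4 from the example above):
$ for h in node1 node2 node3 node4; do ssh $h date; done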
Please note that ssh will not allow passwordless access if the permissions on the home directory of the account you are using allow write access for everyone.
You will also see a ‘permission denied’ error when the permissions on $HOME are 777 or 775; $HOME should have permissions 700 or 755.
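To check the current permissions and, if necessary, correct them (a sketch; 755 is used here as the example mode):
$ ls -ld $HOME
$ chmod 755 $HOME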
Disable the banner (/etc/banner) on all cluster nodes (see the example after this list) when you:
- run clusterverify (cluvfy, runcluvfy)
- install software
- patch the system
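One way to disable the banner temporarily (a sketch; run as root and restore the file when the installation or patching is finished) is to move it aside:
# mv /etc/banner /etc/banner.orig
When you are done:
# mv /etc/banner.orig /etc/banner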
Please work with your System Administrator or contact your Operating System support if you still have problems setting up SSH.