To sync the SSH keys on your Containership account to an attached Kubernetes cluster, you must run a few commands on each of your nodes to add a containership user:

NOTE: Syncing SSH keys is not supported on hosts running Container-Optimized OS from Google. However, if you are using GKE, you can switch the Node image setting to Ubuntu in the Google Cloud Console (this will terminate your existing nodes and launch new ones to replace them).
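
If you prefer the command line, the node image can also be switched with gcloud. This is a minimal sketch: the cluster name, node pool name, and zone are placeholders you must replace with your own values:

# Switch an existing GKE node pool's image type to Ubuntu.
# CLUSTER_NAME, POOL_NAME, and ZONE are placeholders for your own values.
gcloud container clusters upgrade CLUSTER_NAME \
  --image-type UBUNTU \
  --node-pool POOL_NAME \
  --zone ZONE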

1.) SSH into your host:

ssh -i ~/PATH/TO/KEY.pem ubuntu@NODE_IP

2.) Create directories:

ubuntu:~$ sudo mkdir -p /etc/containership/home

NOTE: This directory may already exist. In that case, the command will not fail, and the existing directory structure will not be overwritten.

3.) Create the containership user and assign its home directory:

ubuntu:~$ sudo useradd -d /etc/containership/home containership
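
NOTE: As an optional sanity check (not part of the original steps), you can confirm the user was created with the expected home directory:

# Confirm the user exists and its home directory is /etc/containership/home
ubuntu:~$ getent passwd containership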

4.) Recursively change ownership of the /etc/containership/ directory to the containership user:

sudo chown -R containership:containership /etc/containership/
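
NOTE: You can verify the ownership change before moving on (an optional check):

# Both entries should now list containership as the owner and group
ubuntu:~$ ls -ld /etc/containership /etc/containership/home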

5.) Create a sudoers file for the containership user in /etc/sudoers.d:

sudo visudo -f /etc/sudoers.d/containership

6.) Grant passwordless sudo access to the containership user by adding the following line and saving the file:

containership ALL=(ALL) NOPASSWD:ALL
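
NOTE: visudo validates the file when you save it, but if you want an explicit check (optional), you can re-run it in check-only mode:

# -c checks syntax only; -f points it at the file created in step 5
ubuntu:~$ sudo visudo -c -f /etc/sudoers.d/containership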

7.) Verify that the containership user is able to assume root without providing a password:

ubuntu:~$ sudo su - containership
$ sudo su -
root:~#

8.) Repeat the steps above on all of your nodes. You should now be able to SSH into your nodes by running the following command:

ssh -i ~/PATH/TO/CONTAINERSHIP_KEY.pem containership@NODE_IP

This should drop you into a shell corresponding to your Containership user account. 
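
If you have many nodes, steps 2 through 6 can be scripted from your workstation. This is a rough sketch, not part of the official instructions: it writes the sudoers entry with tee rather than the interactive visudo session from step 5, and the node IPs, key path, and ubuntu login are placeholders to replace with your own values:

#!/bin/bash
# Apply steps 2-6 on each node over SSH (placeholder IPs and key path).
for NODE in 10.0.0.1 10.0.0.2 10.0.0.3; do
  ssh -i ~/PATH/TO/KEY.pem ubuntu@"$NODE" '
    sudo mkdir -p /etc/containership/home &&
    sudo useradd -d /etc/containership/home containership &&
    sudo chown -R containership:containership /etc/containership/ &&
    echo "containership ALL=(ALL) NOPASSWD:ALL" | sudo tee /etc/sudoers.d/containership &&
    sudo chmod 0440 /etc/sudoers.d/containership
  '
done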
