Introduction
Connecting to our instances over SSH can be time-consuming since we have to look up the right IP every time.
This isn’t an issue in environments where public IPs don’t change: we can configure our SSH Config File once and use aliases to connect to our instances.
In cases where the public IP of an instance changes often though (e.g. in AWS, a stopped and restarted instance gets a new public IP), we need to constantly update this file to keep it current, which can be a tiring procedure.
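For context, this is the kind of alias entry we would normally maintain by hand in ~/.ssh/config (the alias, IP, and key name below are made-up examples):
# a hand-maintained alias entry in ~/.ssh/config:
#
#   host my-instance
#       Hostname 3.15.x.x
#       IdentityFile ~/.ssh/my-key.pem
#       user ubuntu
#
# with that entry in place, the long "ssh -i ~/.ssh/my-key.pem ubuntu@3.15.x.x" becomes just:
ssh my-instance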
To address this I set up a dockerized solution that fetches the information for your AWS EC2 instances and generates an SSH Config File ready to be used, all by executing just one command!
Solution
Our solution is composed of:
- a simple Python script that fetches information about the EC2 instances in a region and generates the SSH Config File.
- a Dockerfile
Alright, let’s get to it.
To auto-magically generate your SSH Config File you’ll need:
- an SSH key file, used to connect to your EC2 instances, located in /Users/$USER/.ssh/
- the awscli installed (an install hint follows this list)
- an AWS account with at least one EC2 instance.
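If you don’t have the awscli yet, it is typically just a pip install away (assuming a working Python environment; your OS package manager may also provide it):
pip install awscli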
First, in your terminal execute:
aws configure
or just export as environment variables your AWS credentials:
export AWS_ACCESS_KEY_ID=#############
export AWS_SECRET_ACCESS_KEY=###############
And you are ready to generate the ssh config file!
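If you want to verify that the credentials are actually picked up before going further, a quick sanity check with the awscli:
# should print the account ID and ARN of the configured credentials
aws sts get-caller-identity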
Before we do that let’s have a quick look at the python script and the Dockerfile.
aws_ssh_get_info.py
#!/usr/bin/python
"""Script to generate an ssh config file for AWS EC2 instances in a region."""
import os

import boto3

# The generated config file
path_to_config = '/root/.ssh/aws_ec2.config'
# The path to the SSH key we use to connect to those instances
path_to_ssh_key = os.environ['SSH_KEY_PATH']
# The username to use for ssh
instance_username = os.environ['INSTANCE_USERNAME']


def main():
    """Describe the EC2 instances and generate the ssh config file."""
    try:
        aws_client = boto3.client('ec2')
        paginator = aws_client.get_paginator('describe_instances')
        response_iterator = paginator.paginate(
            DryRun=False,
            PaginationConfig={
                'MaxItems': 100,
                'PageSize': 10
            }
        )
        with open(path_to_config, 'w') as ssh_config_file:
            ssh_config_file.write("##########################\n")
            ssh_config_file.write("##### AWS SSH CONFIG #####\n")
            ssh_config_file.write("##########################\n\n")
            # We iterate the results and read the tags of each instance.
            # The Name tag becomes the host alias; if the instance has no
            # Name tag we fall back to its PublicDnsName. For every instance
            # with a public IP we append an entry of the form:
            #
            #   host <Tag Name or PublicDnsName>
            #       Hostname <ec2-public-ip>
            #       IdentityFile <path_to_ssh_key>
            #       user <instance_username>
            for page in response_iterator:
                for reservation in page['Reservations']:
                    for instance in reservation['Instances']:
                        if 'PublicIpAddress' not in instance:
                            continue
                        public_ip = instance['PublicIpAddress']
                        # Default to the PublicDnsName, override it with the Name tag if present
                        name = instance['PublicDnsName']
                        for tag in instance.get('Tags', []):
                            if tag['Key'] == "Name":
                                name = tag['Value']
                        host_line = "##########################\n"
                        host_line += "host {}\n".format(name)
                        host_line += "    Hostname {}\n".format(public_ip)
                        host_line += "    IdentityFile {}\n".format(path_to_ssh_key)
                        host_line += "    user {}\n".format(instance_username)
                        host_line += "##########################\n\n"
                        ssh_config_file.write(host_line)
        print("File updated: " + path_to_config)
    except Exception as e:
        print(e)


if __name__ == '__main__':
    main()
This script uses the boto3 library to fetch information about your EC2 instances and then generates the structure of an SSH Config File.
It expects as inputs the path of the SSH key used to connect to the instances (SSH_KEY_PATH), the username used in the ssh command (INSTANCE_USERNAME, which the Dockerfile defaults to ubuntu), and the AWS region (AWS_DEFAULT_REGION).
To set the alias that will be used for easier and quicker SSH connections, we take the Name tag of each instance and use it as the host parameter in the SSH Config File. If an instance has no Name tag, we use its PublicDnsName as the host.
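If you’d like to try the script outside Docker first, something along these lines should work. Note that path_to_config points at /root/.ssh/, so outside the container you would either adjust that constant or run it with the necessary permissions; the key path below is just an example.
# boto3 is the script's only pip dependency
pip install boto3

# credentials are assumed to be exported or configured via aws configure
SSH_KEY_PATH=~/.ssh/ssh_example_name.pem \
INSTANCE_USERNAME=ubuntu \
AWS_DEFAULT_REGION=us-east-1 \
python aws_ssh_get_info.py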
Let’s see how we created our Docker image.
Dockerfile
FROM python:3.7-alpine as base
ENV INSTANCE_USERNAME=ubuntu
ENV AWS_DEFAULT_REGION=us-east-2
FROM base as builder
RUN mkdir /install
WORKDIR /install
# copy the dependencies file to the working directory
COPY requirements.txt /requirements.txt
# install dependencies
RUN pip install --prefix=/install -r /requirements.txt
FROM base
COPY --from=builder /install /usr/local
# set the working directory in the container
WORKDIR /root
RUN mkdir .ssh
COPY aws_ssh_get_info.py ./
ENTRYPOINT [ "python" ]
CMD ["./aws_ssh_get_info.py" ]
Quite straightforward: we set default environment variables, install the dependencies, and execute our Python script.
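For completeness: the requirements.txt copied into the image only needs to list boto3, the script’s single pip dependency, and if you prefer building the image yourself instead of pulling the published one, a sketch (the local tag name is arbitrary):
# requirements.txt contains just the boto3 dependency
echo "boto3" > requirements.txt

# build the image locally; the prebuilt one used below is moustakis/aws-ec2-get-ssh-config:1.0
docker build -t aws-ec2-get-ssh-config:local .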
Generate your SSH Config File
Finally, we are ready to fetch our config file.
The last thing to note is that we need to mount the path where our .ssh directory is located. Mount it as a volume: -v ~/.ssh/:/root/.ssh/
Set the env vars AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_SESSION_TOKEN (if needed), and SSH_KEY_PATH at runtime.
docker run -v ~/.ssh/:/root/.ssh/ \
-e "AWS_ACCESS_KEY_ID=$AWS_ACCESS_KEY_ID" \
-e "AWS_SECRET_ACCESS_KEY=$AWS_SECRET_ACCESS_KEY" \
-e "AWS_SESSION_TOKEN=$AWS_SESSION_TOKEN" \
-e "SSH_KEY_PATH=~/.ssh/ssh_example_name.pem" \
-e "AWS_DEFAULT_REGION=us-east-1" \
moustakis/aws-ec2-get-ssh-config:1.0
On successful execution you should see the message File updated: /root/.ssh/aws_ec2.config. The /root location refers to the path inside the container; locally the file is generated under /Users/$USER/.ssh/. Go to this directory, rename the file to config, and you are good to go!
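The rename is just a move on the host side; it will replace any existing ~/.ssh/config, so if you already maintain one you may prefer to merge the entries or pull the generated file in with OpenSSH’s Include directive instead:
# back up any existing ~/.ssh/config first -- this overwrites it
mv ~/.ssh/aws_ec2.config ~/.ssh/config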
For example, if I have 3 EC2 instances in us-east-1 named instance-1, instance-2, and instance-3, and I execute the above command, a file that looks like this will be generated:
aws_ec2.config
##########################
##### AWS SSH CONFIG #####
##########################

##########################
host instance-1
    Hostname x.x.x.x
    IdentityFile ~/.ssh/ssh_example_name.pem
    user ubuntu
##########################

##########################
host instance-2
    Hostname x.x.x.x
    IdentityFile ~/.ssh/ssh_example_name.pem
    user ubuntu
##########################

##########################
host instance-3
    Hostname x.x.x.x
    IdentityFile ~/.ssh/ssh_example_name.pem
    user ubuntu
##########################
Where x.x.x.x is the public IP of each instance.
To connect to one of these instances we just have to use the EC2 name of the instance, for example:
ssh instance-1
Even if the public IP of this instance changes, the only thing we have to do is run our command again to update the SSH Config File, and we can still connect with the same ssh instance-1 command.
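To keep this a true one-liner, you could wrap the docker run in a small shell function in your profile (the function name and key path here are just examples):
# add to ~/.bashrc or ~/.zshrc; function name and key path are illustrative
refresh-aws-ssh-config() {
  docker run -v ~/.ssh/:/root/.ssh/ \
    -e "AWS_ACCESS_KEY_ID=$AWS_ACCESS_KEY_ID" \
    -e "AWS_SECRET_ACCESS_KEY=$AWS_SECRET_ACCESS_KEY" \
    -e "AWS_SESSION_TOKEN=$AWS_SESSION_TOKEN" \
    -e "SSH_KEY_PATH=~/.ssh/ssh_example_name.pem" \
    -e "AWS_DEFAULT_REGION=us-east-1" \
    moustakis/aws-ec2-get-ssh-config:1.0
}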
You can find all the related files in this repo: https://github.com/Imoustak/aws-ec2-get-ssh-config
That’s all folks, hope you enjoyed this. We saw how we can keep our SSH Config File up to date with a single command and avoid spending time looking for public IPs of EC2 instances.