AI-Tools Week 1: Mission Multi Image

Goal

My main goal was to set up an Ansible script that can deploy all the Docker images onto any remote infrastructure. The aim was to understand whether pulling images from a package registry like GitHub Packages or Docker Hub is faster than building the images from source and running them.

Technologies Needed

  • Ansible

  • Hashicorp Vault

  • Docker Swarm

  • GitHub Personal Access Token

What was I able to do?

The first thing I tried to do was understand the technologies individually. I had used Docker Swarm and a GitHub PAT before, but Ansible and HashiCorp Vault were new to me. Although I had heard of them and had an idea of what they did, I had never used them.

Target Machine that acts as a Docker Swarm Manager

I created a Docker Swarm to orchestrate my Docker images. One thing I learnt is that you need to SSH into the target machine from your local machine, and for that you need an SSH key pair. The reason this is better than a simple password-based connection is that passwords are susceptible to man-in-the-middle and brute-force attacks.
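For reference, once you can reach the target machine, turning it into a Swarm manager is a one-liner. The commands below are a generic sketch with a placeholder IP, not the exact ones from my setup:

```bash
# On the target machine: initialise a new swarm and make this node the manager.
# --advertise-addr is the address other nodes would use to join (placeholder here).
docker swarm init --advertise-addr <target-machine-ip>

# Verify the node is now listed as a manager
docker node ls
```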

SSH key generation involves creating a private and a public key. You share the public key with the host you want to SSH into. During the login process, the target machine sends a challenge that your client signs with the private key; the server verifies that signature against the public key you shared, so your identity is proven without the private key ever leaving your machine.
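Here is a minimal sketch of that setup, assuming the target is reachable as ubuntu@<target-ip> (user, host, and key names are placeholders):

```bash
# On your local machine: generate an ed25519 key pair
# (private key stays in ~/.ssh/id_ed25519, public key in ~/.ssh/id_ed25519.pub)
ssh-keygen -t ed25519 -C "ansible-deploy" -f ~/.ssh/id_ed25519

# Copy the PUBLIC key into the target machine's ~/.ssh/authorized_keys
ssh-copy-id -i ~/.ssh/id_ed25519.pub ubuntu@<target-ip>

# From now on, you (and Ansible) can log in without a password
ssh -i ~/.ssh/id_ed25519 ubuntu@<target-ip>
```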

Setting up HashiCorp Vault

I struggled a lot here and learnt many things the hard way.

Even accessing a Vault instance you created yourself is painstaking. I dreaded trying to debug this error: "server returned HTTP response to HTTPS client".
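In my case the error usually meant the client and server disagreed on the scheme: the Vault CLI was pointed at an https:// address while the server was listening on plain HTTP with TLS disabled. A rough sketch of the two ways out, with placeholder addresses:

```bash
# Option A: TLS is disabled on the server, so point the client at plain HTTP
export VAULT_ADDR='http://127.0.0.1:8200'
vault status

# Option B: keep https:// but actually enable TLS on the server
# (set tls_cert_file / tls_key_file in the listener block of Vault's config)
export VAULT_ADDR='https://vault.example.com:8200'
vault status
```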

Vault always had some problems when I ran it on my local system :(. Eventually, I decided to switch to an EC2 machine and ran Vault as a Docker container.
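The containerised route is straightforward. The sketch below runs the official image in dev mode (in-memory storage, no TLS) on port 8200; the root token is a placeholder and this mode is not suitable for production:

```bash
# Run the official Vault image in dev mode
docker run -d --name vault \
  --cap-add=IPC_LOCK \
  -e 'VAULT_DEV_ROOT_TOKEN_ID=dev-only-token' \
  -e 'VAULT_DEV_LISTEN_ADDRESS=0.0.0.0:8200' \
  -p 8200:8200 \
  hashicorp/vault

# Point the CLI at it and check that it is reachable and unsealed
export VAULT_ADDR='http://<ec2-public-ip>:8200'
export VAULT_TOKEN='dev-only-token'
vault status
```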

I learnt a few things while I was attempting to debug:

TLS (Transport Layer Security) Certificate: a digital certificate that is used to establish secure and encrypted connections between a client (such as a web browser) and a server.

There are two steps to using a TLS Certificate:

  • Creating one: This is similar to creating an SSH key pair. Here you generate a private key (typically a .pem file); the certificate you end up with is issued for the public half of that key.

  • Signing it: This step is crucial. The signature from a trusted authority acts like a digital stamp that tells others the certificate is valid. Signing is again a two-step process (see the openssl sketch below):

    • You need to generate a CSR (Certificate Signing Request). You will be providing details about your organisation and yourself.

    • Once you have the CSR, you need to submit it to a CA (Certificate Authority) for them to verify and sign it. CAs are the ones that validate your TLS Certificate.

Fun Fact: You can sign a certificate yourself (self-signing). But unfortunately, many services don't accept certificates from untrusted CAs :(.
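For completeness, this is roughly what both steps look like with openssl. The file names and subject fields are placeholders, and the self-signing option is only useful for testing:

```bash
# Step 1: generate a private key (2048-bit RSA assumed here; ECDSA works too)
openssl genrsa -out vault-key.pem 2048

# Step 2a: create a Certificate Signing Request with your organisation details
openssl req -new -key vault-key.pem -out vault.csr \
  -subj "/C=IN/O=ExampleOrg/CN=vault.example.com"

# Step 2b: either submit vault.csr to a real CA, or self-sign for testing
# (clients will not trust the self-signed result by default)
openssl x509 -req -in vault.csr -signkey vault-key.pem \
  -days 365 -out vault-cert.pem
```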

Ansible Playbook

This was the simplest yet most essential step of all. I used Ansible to automate the deployment process end to end. It goes as follows (a shell-level sketch of what steps 2-4 boil down to appears after the list):

  1. Set up Vault with Vault Credentials

  2. Retrieve secrets from Vault: the Docker environment variables and the GitHub credentials.

  3. Login into GitHub Registry (ghcr.io) using a GitHub Personal Access Token (PAT) to pull the Docker images.

  4. Deploy the Docker Images onto a Docker Swarm.
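The sketch below is only an approximation of what the playbook automates, written as plain shell commands; the secret path, field names, username, and stack file name are placeholders, not the values from the actual PR:

```bash
# Step 2: pull the GitHub PAT (and other variables) out of Vault
GHCR_PAT=$(vault kv get -field=ghcr_pat secret/ai-tools)

# Step 3: log in to the GitHub Container Registry with the PAT
echo "$GHCR_PAT" | docker login ghcr.io -u <github-username> --password-stdin

# Step 4: deploy the pre-built images as a stack on the Swarm manager
docker stack deploy -c docker-compose.yml ai-tools
```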

Here is the PR: https://github.com/Samagra-Development/ai-tools/pull/191

Thoughts

I realised that taking the extra minute to look at the code you write can take you kilometres. While working with Vault, I always had trouble figuring out the path of the secret I had uploaded myself. Looking at the log output after each command will eliminate a lot of the problems you might otherwise face.
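For anyone hitting the same secret-path confusion, the easiest sanity check I know is to list and read the path right after writing it. Paths and keys below are placeholders:

```bash
# Write a secret (the KV v2 engine is mounted at "secret/" by default in dev mode)
vault kv put secret/ai-tools ghcr_pat='<your-pat>' db_password='<password>'

# List what actually landed under the mount
vault kv list secret/

# Read it back to confirm the exact path and field names
vault kv get secret/ai-tools
```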