Navigating DevOps: Setting Up Your Local Environment
Chapter 1: Introduction to DevOps
What is DevOps, and Why Should You Care?
When searching for a clear definition of DevOps, you'll find plenty of vague descriptions that may leave you more confused than when you started. To simplify: in this tutorial, DevOps refers to the process where a change pushed to your Git repository triggers a sequence of automated actions. Those actions run your tests, and if they pass, a Docker image is built and deployed to a server so users can access the latest version of your application. My goal is to deploy to AWS (Amazon Web Services), as I'm eager to enhance my resume and no longer want to tell recruiters about the static site I deployed to AWS three years ago.
Understanding Docker Containers
So, what exactly is a Docker container? Think of it as a small, self-contained box that provides an isolated computer environment (almost always Linux-based) designed specifically to run your website. It houses all the files and dependencies your site needs, so it behaves the same wherever it runs.
The beauty of using containers is that the images they're built from can be pushed to a registry online, allowing others to download and run your application on their own machines. More often than not, though, your site is deployed to a cloud service like AWS or Azure, where it runs on their servers. While this is typically geared toward larger organizations, I'm exploring these concepts primarily for educational purposes.
Using containers also provides benefits such as error management; if a container fails, it can be replaced with a new instance without much hassle. However, this tutorial will focus on other aspects, so let’s set that aside for now.
What Are CI Pipelines?
A Continuous Integration (CI) pipeline is essentially a series of scripts executed in response to events in your repository, such as a push or a pull request. They run your tests, build your code, and automatically deploy it to a server for users or testers to access (the deployment half is technically Continuous Deployment, which is why you'll often see the two abbreviated together as CI/CD). This automation eliminates manual deployment, streamlining the process and removing the delays caused by waiting for someone with the right skills and permissions to make the changes.
Our Pipeline Objectives
I haven’t encountered a tutorial that aligns perfectly with what I want to achieve, so I decided to document my journey as I create it. This means I might not achieve my goals perfectly, but it allows me to track what I learn and any challenges I face along the way. There’s a thrill in possibly failing spectacularly in front of my readers, although it also comes with a hint of anxiety.
Let’s outline the tasks I want my pipeline to execute:
- Run tests
- Build the Docker image
- Push the image to ECR (Elastic Container Registry)
- Launch an ECS (Elastic Container Service) task to make the site live
While this seems straightforward, numerous configurations must be established before these scripts can function, so let's dive in.
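As a rough preview, the steps above could look something like this as a GitHub Actions workflow. Treat this as a placeholder sketch — the branch and image names are assumptions, and the AWS push/deploy steps are deliberately left as comments because they depend on credentials and infrastructure we haven't set up yet:

```yaml
# Hypothetical sketch of the pipeline; names are placeholders, not final config.
name: ci-pipeline
on:
  push:
    branches: [main]
jobs:
  build-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run tests
        run: |
          npm install
          npm test -- --watchAll=false
      - name: Build Docker image
        run: docker build -t my-app .
      # Pushing to ECR and launching an ECS task come later:
      # they require AWS credentials and resources we haven't created yet.
```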
Local Environment Setup
Before we start: if you're on Windows and haven't enabled WSL 2 (Windows Subsystem for Linux), do that now. Docker Desktop won't run without it, and the installation can take some time.
Next, download and install Docker Desktop. Additionally, if you don’t have an AWS account yet, sign up for one. This part isn't strictly local, but it's a good idea to get it done now.
Project Setup
If you have a React project ready, feel free to use it. I require a project with tests to run through my CI pipeline. Luckily, the CodeSandbox I created for learning React includes tests (I was also familiarizing myself with Jest), so I exported it to GitHub.
You can fork it if you'd like, but that repository already contains all the finished scripts, so for an easier follow-along you might prefer to copy the sandbox to a fresh repository using the GitHub logo in the CodeSandbox sidebar.
Once you've copied it, clone it to your local machine as you typically would. While the specific code isn't crucial, having tests in your project will enable you to practice running them within your pipeline.
Getting Your Project Running Locally
If you're using your own code, you likely know how to run it locally. If you copied my project, once it's on your machine, run npm install in a terminal in the project directory to install the dependencies, then confirm everything works with npm start. If it doesn't, well, it works on my machine 🤷‍♀️.
Next, create a file named Dockerfile in the project root. I’ll admit I borrowed the Dockerfile from another tutorial, which was immensely helpful, though it didn’t cover everything I wanted to learn.
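For reference, here's a minimal sketch of the common two-stage pattern for containerizing a React app (not necessarily the exact file I borrowed): build the static files with Node, then serve them with nginx on port 80. The image tags are assumptions — match them to your project.

```dockerfile
# Stage 1: build the React app (node:18 is an assumption; use your version)
FROM node:18 AS build
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
RUN npm run build

# Stage 2: serve the static build output with nginx on port 80
FROM nginx:alpine
COPY --from=build /app/build /usr/share/nginx/html
EXPOSE 80
```

Serving on port 80 inside the container is what makes the 3000:80 port mapping in the docker run command below line up.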
Consider adding a .dockerignore file to prevent unnecessary files from bloating your container.
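As a sketch, a .dockerignore for a React project typically excludes dependencies and build output, since the Dockerfile regenerates those inside the container anyway:

```
node_modules
build
.git
.dockerignore
Dockerfile
npm-debug.log
```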
Now, let’s build the Docker image:
docker build -t your_username_here/ag-grid-example .
And run it to test:
docker run -p 3000:80 -d your_username_here/ag-grid-example:latest
Open a new tab in your browser and go to localhost:3000 to see your site (or mine, depending on your setup).
Congratulations! You’ve successfully built a Docker container using React code and have it running. Next, we’ll set up all the components needed on AWS for the scripts to function.
Chapter 2: Video Insights
In this video titled "DevOps as a Service with Benjamin Johnson," the speaker discusses the fundamentals of DevOps and how it operates in a cloud environment.
Amy Smith from LinkedIn shares valuable insights on fostering successful enterprise relationships, emphasizing the importance of collaboration in tech projects.