Streamline Your Software Pipeline: CI/CD Essentials with GitHub, Jenkins, Docker & Docker Hub
Why?
Streamlining workflows with tools like Jenkins, Docker, Git, GitHub, Docker Hub, and embracing DevOps principles is paramount for modern software teams. These tools automate processes, enhance collaboration, and ensure consistency from development to deployment. By integrating CI/CD pipelines, version control, containerization, and centralized repositories, organizations can achieve rapid delivery, improved quality, and greater agility in the ever-evolving landscape of software development.
Introduction
In this hands-on blog post, we'll delve into creating a simple CI/CD pipeline using the power of GitHub, Jenkins, Docker, and Docker Hub.
We'll walk you through the step-by-step process of setting up each tool and configuring them to work together seamlessly. You'll learn how to:
Leverage GitHub as your version control system to track code changes.
Utilize Jenkins as a continuous integration server to automate builds and testing.
Employ Docker to containerize your application for consistent deployment across environments.
Utilize Docker Hub as a central repository for storing and sharing your Docker images.
By the end of this post, you'll have a practical understanding of how to implement a CI/CD pipeline for your own mini projects, paving the way for faster development cycles and more efficient deployments!
Let's get started!
- Lay the Foundation: Setting Up Your Tools
GitHub: Sign up for a free account and create a new repository for your project. This will serve as your central code hub.
Jenkins: Download and install Jenkins on your server. We'll configure it to become the conductor of your CI/CD orchestra.
Docker: Install Docker on your system. Docker will package your application into a self-contained unit.
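To confirm the Docker installation before going further, a quick sanity check like the following should work (the exact Compose command depends on whether you have the Compose V2 plugin or the older standalone docker-compose binary):

```bash
# Verify the Docker engine is installed and the daemon is reachable
docker --version
docker info

# Verify Compose (V2 plugin syntax; use `docker-compose --version` for the standalone binary)
docker compose version
```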
You will need a Dockerfile and a docker-compose file (depending on the repository you use). Here are the Dockerfile and docker-compose.yml for my GitHub repository.
Dockerfile
```dockerfile
FROM node:20

# Work in the same directory that docker-compose mounts the source into
WORKDIR /usr/src/app

# Install dependencies first so this layer is cached between builds
COPY package*.json ./
RUN npm install

# Copy the rest of the application source
COPY . .

ENV MONGO_URL=mongodb://mongo:27017/dummy

EXPOSE 3000
CMD [ "node", "app.js" ]
```
docker-compose.yml
```yaml
version: '3.8'

services:
  app:
    build: .
    ports:
      - "3000:3000"
    depends_on:
      - mongo
    environment:
      - MONGO_URL=mongodb://mongo:27017/dummy
    volumes:
      - .:/usr/src/app
    networks:
      - app-network

  mongo:
    image: mongo:5.0
    ports:
      - "27017:27017"
    networks:
      - app-network

networks:
  app-network:
    driver: bridge
```
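Before wiring anything into Jenkins, it's worth smoke-testing the Dockerfile and Compose file locally. A minimal check, assuming your app exposes an HTTP endpoint on port 3000 as configured above, might look like this:

```bash
# Build the image and start the app + MongoDB containers in the background
docker compose up --build -d

# Check that both containers are running
docker compose ps

# Hit the app on the published port
curl http://localhost:3000

# Tear everything down when done
docker compose down
```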
Docker Hub: Create a free Docker Hub account. This will be your image storage facility, accessible from anywhere.
Set up a Docker Hub repository after you sign up on Docker Hub.
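If you want to verify that the repository and your credentials work before automating the push from Jenkins, a manual tag-and-push is a quick test. This is just a sketch; substitute your own Docker Hub username and repository name for the placeholders:

```bash
# Log in to Docker Hub (you will be prompted for your password or access token)
docker login -u <your_dockerhub_username>

# Build and tag the image against your Docker Hub repository
docker build -t <your_dockerhub_username>/<your_repo>:latest .

# Push the tagged image to Docker Hub
docker push <your_dockerhub_username>/<your_repo>:latest
```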
- Connecting the Dots: Configuring Jenkins
Access Jenkins: Open your web browser and navigate to the URL where Jenkins is installed on your server. It typically follows the format http://<your_server_ip>:<port_number>, where <your_server_ip> is the IP address of your server and <port_number> is the port Jenkins is running on (8080 by default). The initial admin password for the first login can be found on the Jenkins server at "/var/jenkins_home/secrets/initialAdminPassword" (the exact path depends on how Jenkins was installed).
Install Plugins: Equip Jenkins with the necessary plugins, like "Pipeline" and "Docker." These plugins will allow Jenkins to interact with your code and Docker.
I recommend using Blue Ocean. Blue Ocean is a new user experience for Jenkins based on a personalizable, modern design that allows users to graphically create, visualize, and diagnose Continuous Delivery (CD) pipelines.
If you're hesitant to install and configure Jenkins directly on your system, fear not! The official Jenkins project offers a convenient Docker image that gets you up and running quickly. This image, which also bundles Blue Ocean (a popular Jenkins plugin for UI customization), streamlines the setup process and eliminates manual configuration steps.
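For reference, here is a minimal sketch of running Jenkins with Docker, assuming you use the jenkinsci/blueocean image and want the pipeline to build Docker images on the same host (hence the Docker socket mount):

```bash
# Start Jenkins (with Blue Ocean) in a container, persisting data in a named volume
docker run -d --name jenkins \
  -p 8080:8080 -p 50000:50000 \
  -v jenkins_home:/var/jenkins_home \
  -v /var/run/docker.sock:/var/run/docker.sock \
  jenkinsci/blueocean

# Print the initial admin password needed for the first login
docker exec jenkins cat /var/jenkins_home/secrets/initialAdminPassword
```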
Create a Job: Click on New Item > Pipeline.
Configure the Pipeline: Here's where the magic happens! We'll craft a Jenkinsfile script that outlines the automated steps for your pipeline. This script might involve fetching code from GitHub, building the Docker image, and pushing it to Docker Hub. But before that, we will add a description and select the GitHub project (I have used my own repository, which is public). Note that because the repository is public, no credentials are needed; if it is private, we need to add credentials.
Now that your Jenkins project is prepped, it's time to decide when the CI/CD pipeline should spring into action. Here, Jenkins offers two primary triggers: GitSCM and PollSCM. Let's explore which one suits your needs:
GitSCM: This trigger continuously monitors a specific Git repository (like one on GitHub) for changes. Whenever you push code updates to the repository, Jenkins automatically triggers the pipeline, initiating the build and deployment process. This is ideal for scenarios where you want a rapid response to code changes. Webhooks must be configured if GitSCM is used; they are available under the repository settings on GitHub.
PollSCM: This trigger works on a predefined schedule. You configure Jenkins to check the Git repository at specific intervals (e.g., every minute, every hour), and if any changes are detected, the pipeline is triggered. This option is suitable when you have less frequent code updates or prefer a more controlled execution of the pipeline. It is typically used when the Jenkins server is behind a firewall and GitSCM webhooks cannot reach it.
Here, I have used PollSCM with a schedule that polls every 5 minutes (a cron-style expression such as H/5 * * * * in the Poll SCM schedule field).
After choosing a trigger (GitSCM for automatic execution on code pushes or PollSCM for scheduled checks), configure the pipeline definition. You can either write the pipeline steps directly in Jenkins or store them as a Jenkinsfile script in your version control system for better maintainability. This script will be the heart of your pipeline, dictating actions like fetching code, building your application, and potentially creating and deploying Docker images.
Now we will add the pipeline code, written in Groovy:
```groovy
pipeline {
    agent any

    environment {
        DOCKER_IMAGE = 'usernamefordocker/rento'
        DOCKER_TAG   = 'latest'
        GIT_REPO     = 'https://github.com/Ayush-n25/Rento.git'
    }

    stages {
        stage('Checkout') {
            steps {
                // Start from a clean workspace, then fetch the source
                sh "rm -rf *"
                sh "git clone ${GIT_REPO}"
            }
        }

        stage('Build Docker Image') {
            steps {
                script {
                    sh "cd Rento && docker build -t ${DOCKER_IMAGE}:${DOCKER_TAG} ."
                }
            }
        }

        stage('Push Docker Image') {
            steps {
                script {
                    withCredentials([usernamePassword(credentialsId: 'docker',
                                                      usernameVariable: 'DOCKER_USERNAME',
                                                      passwordVariable: 'DOCKER_PASSWORD')]) {
                        sh "docker login -u ${DOCKER_USERNAME} -p ${DOCKER_PASSWORD}"
                        sh "docker push ${DOCKER_IMAGE}:${DOCKER_TAG}"
                    }
                }
            }
        }
    }

    post {
        always {
            // Remove the local image and clean the workspace after every run
            sh 'docker rmi ${DOCKER_IMAGE}:${DOCKER_TAG}'
            sh "rm -rf *"
        }
        success {
            echo 'Pipeline completed successfully.'
        }
        failure {
            echo 'Pipeline failed.'
        }
    }
}
```
Replace usernamefordocker with your own Docker Hub username in this Jenkinsfile.
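A small hardening note: passing the password with -p makes the Docker CLI print an "insecure" warning and can leak the value into logs. If you prefer, the login step in the push stage can be swapped for the --password-stdin variant (using the same credentials binding as above); a sketch of that sh step:

```bash
# Inside the 'Push Docker Image' stage: read the password from stdin instead of passing it as an argument
echo "$DOCKER_PASSWORD" | docker login -u "$DOCKER_USERNAME" --password-stdin
```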
Before saving the pipeline we will add our docker credentials to jenkins.
For that go to Manage Jenkins > Credentials > System > Global credentials.
Add the username (your Docker Hub username), the Docker Hub password, and an ID (this will be the reference for using the credentials, i.e. the credentialsId in the Jenkinsfile).
Save and Build: After crafting your Jenkinsfile script, save your Jenkins project configuration. This triggers Jenkins to parse the Jenkinsfile and prepare the pipeline for execution.
Trigger the Pipeline: Depending on your chosen trigger (GitSCM or PollSCM), the pipeline might automatically start based on code pushes or your defined schedule. Alternatively, you can manually trigger the pipeline execution from the Jenkins interface for initial testing purposes.
Now you can check your Docker Hub repository; the newly pushed image should appear there.
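To confirm the pipeline delivered the image end to end, you can pull it onto any machine and run it. This uses the image name from the Jenkinsfile above; adjust the username and tag to your own, and note that <mongo_host> is a placeholder for wherever your MongoDB is reachable:

```bash
# Pull the freshly pushed image from Docker Hub
docker pull usernamefordocker/rento:latest

# Run it, publishing the app port and pointing it at a reachable MongoDB instance
docker run -d -p 3000:3000 \
  -e MONGO_URL=mongodb://<mongo_host>:27017/dummy \
  usernamefordocker/rento:latest
```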
Conclusion:
The world of software development thrives on efficiency. By implementing a CI/CD pipeline, you've embraced automation, streamlining your development process and paving the way for faster deployments. This hands-on guide equipped you with the knowledge to build a mini CI/CD pipeline using industry-standard tools. You learned how to leverage GitHub for version control, utilize Jenkins for automated builds and testing, harness Docker for containerized deployments, and benefit from Docker Hub for image storage.
Remember, this is just the first chapter in your CI/CD journey. As you explore further, consider integrating additional tools for testing and deployment specific to your project's needs. With continuous learning and experimentation, you'll transform your CI/CD pipeline into a powerful engine, propelling your development team towards greater efficiency and agility.
Stay tuned for future posts where we'll delve deeper into advanced CI/CD concepts and explore various deployment strategies!
About Creator
Hello! I'm Ayush Naik, an IT student. I have a keen interest in cloud computing, DevOps, and full-stack development. I'm passionate about Continuous Integration and Continuous Deployment (CI/CD) practices and excited to share insights and tips on streamlining your development workflow. Join me as I explore the latest trends and technologies in CI/CD!