CI/CD
Popular tools include Jenkins, GitHub Actions, GitLab CI, and Tekton.
Jenkins
● What is it?
Open-source tool for automating software development tasks.
● Strengths:
○ Supports many plugins for customization.
○ Works with various version control systems (e.g., Git).
○ Great for complex workflows.
● Use Cases:
Building apps, running tests, deploying code, and managing large-scale
pipelines.
GitHub Actions
● What is it?
Built into GitHub to automate tasks like testing and deploying code.
● Strengths:
○ Easy to set up with GitHub repositories.
○ Supports cross-platform testing.
● Use Cases:
Automating workflows for projects hosted on GitHub.
GitLab CI
● What is it?
A CI/CD tool within GitLab for automating builds, tests, and deployments.
● Strengths:
○ Works seamlessly with GitLab repositories.
○ Built-in Docker support and debugging tools.
● Use Cases:
Managing pipelines, testing code, and deploying apps from GitLab.
Tekton
● What is it?
A Kubernetes-native CI/CD tool for cloud-native workflows.
● Strengths:
○ Designed for containerized builds.
○ Works well with Kubernetes and cloud tools.
● Use Cases:
Automating Kubernetes pipelines and scaling builds in the cloud.
Each tool caters to specific needs. Pick the one that matches your workflow and
infrastructure.
Comparison
Comparison of CI/CD Tools: Jenkins, GitHub Actions, GitLab CI, and Tekton
Jenkins
● Overview: Open-source automation server for building highly customizable CI/CD pipelines.
● Features:
○ Configured using a Jenkinsfile.
○ Large plugin ecosystem; integrates with many version control systems.
● Strengths:
○ Extremely flexible; handles complex, large-scale workflows.
● Best For: Organizations that need broad integrations or support for legacy systems.
GitHub Actions
● Overview: Event-driven CI/CD built directly into GitHub.
● Features:
○ Configured using YAML workflow files in .github/workflows.
○ Supports matrix builds and cross-platform testing.
● Strengths:
○ Minimal setup for GitHub repositories; marketplace of reusable actions.
● Best For: Teams whose projects are hosted on GitHub.
● Overview: Fully integrated CI/CD tool within GitLab for complete DevOps
lifecycle management.
● Features:
○ Configured using .gitlab-ci.yml.
○ Supports Docker and Kubernetes natively.
○ Provides built-in security and Auto DevOps pipelines.
● Strengths:
○ Complete DevOps platform in one tool.
○ Strong container and cloud support.
● Best For: Teams using GitLab for version control and end-to-end DevOps.
Tekton
● Overview: Kubernetes-native CI/CD framework for cloud-native workflows.
● Features:
○ Pipelines defined as Kubernetes custom resources (Pipeline, Task, PipelineRun).
○ Containerized, reusable tasks.
● Strengths:
○ Scales with Kubernetes; well suited to containerized builds.
● Best For: Teams running CI/CD directly on Kubernetes clusters.
Comparative Summary
Parallel/Matrix Builds:
● Jenkins: Yes (via parallel pipeline steps)
● GitHub Actions: Yes (matrix strategy)
● GitLab CI: Yes (via parallel jobs)
● Tekton: Yes (via parallel tasks)
Conclusion:
● Jenkins is best suited for highly customizable pipelines, particularly for
organizations that need a wide range of integrations or have legacy systems.
● GitHub Actions is ideal for teams already using GitHub, as it offers an
easy-to-use, event-driven CI/CD solution integrated directly into the GitHub
ecosystem.
● GitLab CI provides a complete DevOps platform and is perfect for teams
that are heavily integrated with GitLab and want a full end-to-end solution.
● Tekton is designed for cloud-native, Kubernetes-based environments,
offering flexibility and scalability for teams using modern containerized
applications.
Example 1:
groovy
pipeline {
agent any
environment {
MY_ENV_VAR = 'SomeValue'
}
stages {
stage('Checkout') {
steps {
checkout scm
}
}
stage('Build') {
steps {
script {
echo "Building the project..."
sh 'mvn clean install'
}
}
}
stage('Test') {
steps {
script {
echo "Running tests..."
sh 'mvn test'
}
}
}
stage('Deploy') {
steps {
script {
echo "Deploying to production..."
sh 'scp target/*.jar user@prod-server:/opt/myapp/'
}
}
}
}
post {
always {
echo 'Cleaning up resources'
}
success {
echo 'Build and Deployment Succeeded!'
}
failure {
echo 'Build or Deployment Failed'
}
}
}
Explanation:
● Stages: Checkout, Build, Test, and Deploy run in sequence, using Maven for the build and tests and scp to copy the artifact to the production server.
● Environment: MY_ENV_VAR is defined once and is available to every stage.
● Post Actions: The post block always prints a cleanup message and then reports success or failure.
Example 2
pipeline {
agent {
docker { image 'node:14' }
}
environment {
DOCKER_REGISTRY = "docker.io"
IMAGE_NAME = "my-node-app"
}
stages {
stage('Checkout Code') {
steps {
checkout scm
}
}
stage('Install Dependencies') {
steps {
sh 'npm install'
}
}
stage('Build Docker Image') {
steps {
script {
def image = docker.build("${DOCKER_REGISTRY}/${IMAGE_NAME}:${env.BUILD_ID}")
image.push()
}
}
}
stage('Deploy') {
steps {
script {
// Sample deploy command (double quotes enable Groovy interpolation)
sh "docker run -d -p 3000:3000 ${DOCKER_REGISTRY}/${IMAGE_NAME}:${env.BUILD_ID}"
}
}
}
}
post {
always {
cleanWs()
}
success {
echo 'Deployment Successful'
}
failure {
echo 'Deployment Failed'
}
}
}
● Explanation:
○ Docker Agent: The pipeline runs inside the node:14 Docker image, so Node.js tooling is available without installing it on the agent.
○ Stages: Checkout code, install dependencies, build Docker image, and
deploy.
○ Push to Docker Registry: The image is tagged with the build ID and
pushed to Docker Hub.
Example 3
groovy
pipeline {
agent any
environment {
SLACK_CHANNEL = '#ci-cd-notifications'
SLACK_TOKEN = credentials('slack-webhook-token')
}
stages {
stage('Checkout') {
steps {
checkout scm
}
}
stage('Build') {
steps {
script {
echo "Building the project..."
sh 'mvn clean install'
}
}
}
stage('Test') {
steps {
script {
echo "Running tests..."
sh 'mvn test'
}
}
}
stage('Deploy') {
steps {
script {
echo "Deploying to production..."
sh 'scp target/*.jar user@prod-server:/opt/myapp/'
}
}
}
}
post {
success {
script {
slackSend(channel: SLACK_CHANNEL, message: "Deployment Successful! :white_check_mark:")
}
}
failure {
script {
slackSend(channel: SLACK_CHANNEL, message: "Deployment Failed! :x:")
}
}
}
}
● Explanation:
○ Slack Notifications: Sends a success or failure notification to a Slack
channel using a webhook.
○ Stages: Checkout, build, test, and deploy.
Example 4
Jenkins Pipeline with Slack Notifications and Docker Build:
groovy
pipeline {
agent any
environment {
DOCKER_IMAGE = "myapp"
SLACK_CHANNEL = "#build-notifications"
SLACK_CREDENTIALS = credentials('slack-webhook')
}
stages {
stage('Checkout') {
steps {
checkout scm
}
}
stage('Build Docker Image') {
steps {
script {
echo "Building Docker Image..."
sh "docker build -t ${DOCKER_IMAGE}:${BUILD_ID} ."
}
}
}
stage('Test') {
steps {
script {
echo "Running Tests..."
sh "docker run ${DOCKER_IMAGE}:${BUILD_ID} test"
}
}
}
stage('Push Docker Image') {
steps {
script {
echo "Pushing Docker Image..."
withCredentials([usernamePassword(credentialsId: 'docker-hub', usernameVariable: 'DOCKER_USER', passwordVariable: 'DOCKER_PASS')]) {
// Single quotes so the shell expands the credentials (avoids leaking them via Groovy interpolation)
sh 'echo $DOCKER_PASS | docker login -u $DOCKER_USER --password-stdin'
sh "docker push ${DOCKER_IMAGE}:${BUILD_ID}"
}
}
}
}
}
post {
success {
slackSend(channel: SLACK_CHANNEL, message: "Build ${BUILD_ID} succeeded!")
}
failure {
slackSend(channel: SLACK_CHANNEL, message: "Build ${BUILD_ID} failed!")
}
}
always {
cleanWs()
}
}
}
● Explanation:
○ Slack Notifications: Sends a Slack message on success or failure of
the build process.
○ Docker Build: Builds a Docker image and pushes it to Docker Hub
after testing.
Example 5
In this Jenkins Pipeline example, the user has created a pipeline that integrates with
GitHub to automate the build and deployment process of a project.
groovy
pipeline {
agent any
environment {
GIT_REPO_URL = "https://github.com/my-org/my-repo.git"
}
stages {
stage('Checkout') {
steps {
git url: "${GIT_REPO_URL}", branch: 'main'
}
}
stage('Build') {
steps {
script {
echo "Building project..."
sh 'mvn clean install'
}
}
}
stage('Test') {
steps {
script {
echo "Running unit tests..."
sh 'mvn test'
}
}
}
stage('Deploy') {
steps {
script {
echo "Deploying to server..."
sh 'scp target/*.jar user@prod-server:/opt/myapp/'
}
}
}
}
post {
success {
echo "Build successful!"
}
failure {
echo "Build failed!"
}
}
}
1. GitHub Integration: The pipeline pulls the project code from a GitHub
repository (specified by the GIT_REPO_URL environment variable). The
repository is checked out in the Checkout stage using the git command.
2. Build: In the Build stage, Maven is used to clean and build the project
with the mvn clean install command, compiling the necessary files.
3. Test: The pipeline runs unit tests in the Test stage using the mvn test
command to ensure that the code functions as expected.
4. Deploy: Once the build and tests pass, the Deploy stage transfers the build
artifact (a .jar file) to a production server via scp.
5. Post Actions: After the pipeline finishes, the post block notifies the user
about the build status, either success or failure, to provide feedback on the
pipeline's outcome.
This example demonstrates how Jenkins can automate the process of pulling code
from GitHub, building, testing, and deploying applications, along with notifying
the user of the build's success or failure.
Example 6
This Jenkins pipeline integrates SonarQube for static code analysis in a Java
project. It includes stages for checking out the code, building the project, running
SonarQube analysis, and deploying the application. The pipeline ensures that code
quality is assessed through SonarQube before deployment.
groovy
pipeline {
agent any
environment {
SONARQUBE = 'SonarQube' // Name of the SonarQube server configured in Jenkins
SONAR_PROJECT_KEY = 'my-project-key'
SONAR_PROJECT_NAME = 'My Project'
SONAR_PROJECT_VERSION = '1.0'
}
stages {
stage('Checkout') {
steps {
checkout scm
}
}
stage('Build') {
steps {
script {
echo "Building project..."
sh 'mvn clean install'
}
}
}
stage('SonarQube Analysis') {
steps {
script {
echo "Running SonarQube Analysis..."
sh """
mvn sonar:sonar \
-Dsonar.projectKey=${SONAR_PROJECT_KEY} \
-Dsonar.projectName=${SONAR_PROJECT_NAME} \
-Dsonar.projectVersion=${SONAR_PROJECT_VERSION} \
-Dsonar.host.url=http://localhost:9000
"""
}
}
}
stage('Deploy') {
steps {
script {
echo "Deploying to server..."
// Add your deployment steps here
}
}
}
}
post {
success {
echo "Build and SonarQube analysis succeeded!"
}
failure {
echo "Build or SonarQube analysis failed!"
}
}
}
● Explanation:
○ The SonarQube Analysis stage runs the SonarQube analysis using
Maven’s sonar:sonar goal.
○ You need to configure SonarQube in Jenkins and provide the correct
SonarQube URL and Project Key.
○ The analysis will be triggered after the build process and will give
feedback on the quality of the code.
Example 7
groovy
pipeline {
agent any
stages {
stage('Checkout') {
steps {
git 'https://github.com/my-repo.git'
}
}
stage('Build & Test') {
parallel {
stage('Build') {
steps {
script {
echo 'Building the project...'
sh './build.sh'
}
}
}
stage('Unit Tests') {
steps {
script {
echo 'Running unit tests...'
sh './run_tests.sh'
}
}
}
}
}
stage('Deploy') {
steps {
script {
echo 'Deploying to the server...'
sh './deploy.sh'
}
}
}
}
post {
success {
echo 'Pipeline completed successfully!'
}
failure {
echo 'Pipeline failed!'
}
}
}
● Explanation: This Jenkins pipeline uses parallel execution in the Build &
Test stage to run the build and unit test jobs simultaneously. After these jobs
are complete, the deployment will proceed.
Example 8
groovy
pipeline {
agent any
environment {
SLACK_CHANNEL = '#ci-channel'
SLACK_TOKEN = credentials('slack-token')
}
stages {
stage('Checkout') {
steps {
git 'https://github.com/my-repo.git'
}
}
stage('Deploy') {
steps {
script {
echo 'Deploying the project...'
sh './deploy.sh'
}
}
}
}
post {
success {
slackSend(channel: SLACK_CHANNEL, message: 'Pipeline succeeded!', token: SLACK_TOKEN)
}
failure {
slackSend(channel: SLACK_CHANNEL, message: 'Pipeline failed.', token: SLACK_TOKEN)
}
}
}
● Explanation: This Jenkinsfile checks out the repository, runs a deployment script, and sends a Slack notification reporting whether the pipeline succeeded or failed.
Example 9
This example demonstrates a Jenkins pipeline that performs parallel test execution
for unit tests and integration tests, optimizing the testing phase and reducing
overall pipeline runtime.
groovy
pipeline {
agent any
stages {
stage('Checkout') {
steps {
checkout scm
}
}
stage('Build') {
steps {
script {
echo "Building the project..."
sh 'mvn clean install'
}
}
}
stage('Test') {
parallel {
stage('Unit Tests') {
steps {
script {
echo "Running unit tests..."
sh 'mvn test -Dtest=UnitTests'
}
}
}
stage('Integration Tests') {
steps {
script {
echo "Running integration tests..."
sh 'mvn test -Dtest=IntegrationTests'
}
}
}
}
}
stage('Deploy') {
steps {
script {
echo "Deploying to production..."
sh 'scp target/*.jar user@prod-server:/opt/myapp/'
}
}
}
}
}
● Explanation:
○ Parallel Execution: The test stage runs both unit tests and integration
tests in parallel, reducing overall pipeline runtime.
○ Maven Build: A Maven build and deployment process is included.
Example 10
The provided Jenkins Pipeline integrates Trivy for container scanning. Trivy is a
vulnerability scanner for Docker images, and this pipeline performs a scan on a
Docker image after building it. The pipeline includes stages for checking out code,
building the Docker image, running the Trivy scan, and deploying the application.
The scan checks the Docker image for vulnerabilities, and the results are reported
in the post-build steps, with success or failure messages depending on the outcome.
The Docker socket is mounted to allow Trivy to interact with Docker for scanning.
groovy
pipeline {
agent any
environment {
TRIVY_IMAGE = 'aquasec/trivy:latest' // Docker image for Trivy
DOCKER_IMAGE = 'my-docker-image:latest' // Your Docker image to scan
}
stages {
stage('Checkout') {
steps {
checkout scm
}
}
stage('Trivy Scan') {
steps {
script {
echo "Running Trivy scan on Docker image..."
sh """
docker run --rm -v /var/run/docker.sock:/var/run/docker.sock \
${TRIVY_IMAGE} image ${DOCKER_IMAGE}
"""
}
}
}
stage('Deploy') {
steps {
script {
echo "Deploying application..."
// Add your deployment steps here
}
}
}
}
post {
success {
echo "Build and Trivy scan succeeded!"
}
failure {
echo "Build or Trivy scan failed!"
}
}
}
● Explanation:
○ Trivy Scan: The docker run command pulls the Trivy Docker image
and scans the built Docker image for vulnerabilities.
○ Docker Socket: The -v /var/run/docker.sock:/var/run/docker.sock
allows Trivy to interact with Docker to scan the image.
Example 11
groovy
pipeline {
agent any
environment {
SELENIUM_IMAGE = 'selenium/standalone-chrome:latest' // Selenium
Docker image with Chrome
CHROME_DRIVER = '/usr/local/bin/chromedriver' // Path for ChromeDriver
}
stages {
stage('Checkout') {
steps {
checkout scm
}
}
stage('Build') {
steps {
script {
echo "Building the application..."
// Add your build steps here, e.g., `mvn clean install`
}
}
}
stage('Selenium Tests') {
steps {
script {
echo "Running Selenium tests..."
// Mount the tests and build directories into the Selenium container
sh "docker run --rm -v ${env.WORKSPACE}/tests:/tests -v ${env.WORKSPACE}/build:/build ${SELENIUM_IMAGE} python3 /tests/run_selenium_tests.py"
}
}
}
}
post {
success {
echo "Build and tests succeeded!"
}
failure {
echo "Build or tests failed!"
}
}
}
● Explanation:
○ Selenium Docker Image: Uses the official
selenium/standalone-chrome image for running Selenium tests in a
Chrome browser.
○ Test Execution: The Selenium test script run_selenium_tests.py is run
inside the Docker container, and it mounts the tests and build
directories.
GitHub Actions
Example 1
name: Node.js CI
on:
  push:
    branches:
      - main
  pull_request:
    branches:
      - main
jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        node-version: [12.x, 14.x, 16.x]
    steps:
      - name: Checkout repository
        uses: actions/checkout@v3
      - name: Set up Node.js
        uses: actions/setup-node@v3
        with:
          node-version: ${{ matrix.node-version }}
      - name: Install dependencies
        run: npm install
      - name: Run tests
        run: npm test
      - name: Deploy
        run: |
          echo "Deploying to production..."
          # Add your deployment command here
● Explanation:
○ Trigger: Runs on push to main and pull requests.
○ Matrix Strategy: Runs jobs across multiple Node.js versions.
○ Steps: Checkout, setup Node.js, install dependencies, run tests, and
deploy.
Example 2
on:
  push:
    branches:
      - main
  pull_request:
    branches:
      - main
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v3
      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.10'
      - name: Cache pip dependencies
        uses: actions/cache@v3
        with:
          path: ~/.cache/pip
          key: ${{ runner.os }}-pip-${{ hashFiles('requirements.txt') }}
      - name: Install dependencies
        run: pip install -r requirements.txt
      - name: Run tests
        run: pytest
      - name: Deploy
        run: echo "Deploying..." # Add your deployment command here
● Explanation:
○ Cache: Uses caching to store pip dependencies, speeding up builds.
○ Jobs: The pipeline includes checking out code, setting up Python,
caching dependencies, running tests, and deploying.
Example 3
This demonstrates the integration of GitHub Actions for automating the CI/CD
pipeline of a Node.js application. The workflow is structured to automatically
build, test, and deploy the application based on branch-specific conditions. It runs
on pushes to the main and develop branches, ensuring that every code change is
tested and built. The deployment step is conditionally triggered only for changes
pushed to the main branch, ensuring that only production-ready code is deployed.
This setup streamlines the process of managing code quality and deployment,
making it more efficient and reliable for continuous integration and delivery.
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-node@v3
        with: { node-version: '16' }
      - run: npm install && npm test
  deploy:
    runs-on: ubuntu-latest
    needs: build
    if: github.ref == 'refs/heads/main'
    steps:
      - name: Checkout code
        uses: actions/checkout@v3
      - name: Deploy
        run: echo "Deploying to production..." # Add your deployment command here
● Explanation:
○ Conditional Deployment: The deployment job only runs if the push
is to the main branch, enabling conditional deployments based on
branches.
○ Node.js Setup and Testing: Sets up Node.js, installs dependencies,
and runs tests before deployment.
Example 4
This automates the process of building and pushing Docker images using GitHub
Actions. It defines a CI/CD workflow that triggers on pushes to the main branch.
The workflow checks out the code, sets up Docker Buildx for multi-platform
builds, creates a Docker image tagged with the commit SHA, and pushes it to
Docker Hub. Finally, it logs out from Docker Hub after the process is complete.
This setup streamlines the Docker build and deployment process, ensuring
consistency and efficiency.
on:
  push:
    branches:
      - main
jobs:
  docker:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v3
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v2
      - name: Log in to Docker Hub
        uses: docker/login-action@v2
        with:
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}
      - name: Build and push
        uses: docker/build-push-action@v4
        with:
          context: .
          push: true
          tags: mydockerhub/myapp:${{ github.sha }}
      - name: Log out from Docker Hub
        run: docker logout
● Explanation:
○ Docker Build and Push: The workflow builds a Docker image and
pushes it to Docker Hub with a tag based on the commit SHA.
○ Docker Buildx: Uses Docker Buildx to enable building images for
different platforms.
Example 5
on:
  push:
    branches:
      - main
jobs:
  build-deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v3
      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v2
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1
      - name: Log in to Amazon ECR
        id: ecr
        uses: aws-actions/amazon-ecr-login@v1
      - name: Build and push image
        run: |
          docker build -t ${{ steps.ecr.outputs.registry }}/myapp:${{ github.sha }} .
          docker push ${{ steps.ecr.outputs.registry }}/myapp:${{ github.sha }}
      - name: Deploy to ECS
        run: aws ecs update-service --cluster my-cluster --service my-service --force-new-deployment
● Explanation:
○ AWS ECS Deployment: This example deploys a Docker image to
AWS ECS after building it.
○ AWS CLI Setup: Configures AWS CLI to authenticate and deploy to
AWS.
○ Docker to ECR: Pushes the Docker image to Amazon Elastic
Container Registry (ECR).
Example 6
on:
  push:
    branches:
      - main
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v3
      - name: Cache node modules
        uses: actions/cache@v3
        with:
          path: node_modules
          key: ${{ runner.os }}-node-${{ hashFiles('package-lock.json') }}
      - name: Install dependencies
        run: npm install
      - name: Run tests
        run: npm test
● Explanation:
○ Caching: Uses GitHub Actions cache to store Node.js dependencies
and speeds up builds.
○ Optimized Build Process: Caches node_modules to avoid reinstalling
dependencies on each run.
Example 7
This example demonstrates using GitHub Actions to automate the build and code
quality analysis process with SonarQube. The workflow triggers on push events to
the main branch, sets up JDK 11, caches Maven dependencies, and performs a
Maven build followed by a SonarQube analysis. The integration ensures that each
commit is analyzed for code quality using SonarQube, with a token stored as a
GitHub secret for authentication.
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v3
      - name: Set up JDK 11 (with Maven dependency caching)
        uses: actions/setup-java@v3
        with: { distribution: 'temurin', java-version: '11', cache: 'maven' }
      - name: Build and analyze with Maven
        run: mvn clean verify sonar:sonar -Dsonar.login=${{ secrets.SONARQUBE_TOKEN }}
● Explanation:
○ SonarQube Token: You will need to create a SonarQube Token and
store it as a GitHub secret (SONARQUBE_TOKEN) to authenticate
the analysis.
○ The Build and Analyze with Maven step runs the SonarQube
analysis after the Maven build.
○ It caches the Maven dependencies to speed up the process in future
runs.
Example 8
GitHub Actions with Trivy for Docker Scanning:
This example demonstrates how to integrate GitHub Actions with Trivy for Docker
image scanning. It triggers the workflow on pushes to the main branch, builds a
Docker image, and then uses the Trivy action to scan the image for vulnerabilities.
The integration simplifies security checks in your CI pipeline by automatically
scanning Docker images for issues using Trivy.
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v3
      - name: Build Docker image
        run: docker build -t myapp:${{ github.sha }} .
      - name: Run Trivy vulnerability scanner
        uses: aquasec/trivy-action@master
        with:
          image-ref: myapp:${{ github.sha }}
Example 9
This example demonstrates how to set up GitHub Actions for Selenium test
automation. It triggers the workflow on a push to the main branch, running tests
on an ubuntu-latest runner. The workflow includes setting up a Selenium
service using the selenium/standalone-chrome Docker container,
configuring Python 3.8, installing dependencies, and running the Selenium tests
using a Python script.
on:
  push:
    branches:
      - main
jobs:
  selenium-tests:
    runs-on: ubuntu-latest
    services:
      selenium:
        image: selenium/standalone-chrome
        options: --shm-size 2g
    steps:
      - name: Checkout code
        uses: actions/checkout@v3
      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.8'
      - name: Install dependencies
        run: pip install -r requirements.txt
      - name: Run Selenium tests
        run: python run_selenium_tests.py
● Explanation:
○ Selenium Service: The selenium/standalone-chrome Docker container
is used as a service in the GitHub Actions workflow.
○ Python Setup: The workflow sets up Python and installs the
necessary dependencies from requirements.txt before running the
Selenium tests.
Example 10
name: CI Workflow
on:
  push:
    branches:
      - main
  pull_request:
    branches:
      - main
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v3
      - name: Set up Node.js
        uses: actions/setup-node@v3
        with:
          node-version: '16'
      - name: Install dependencies
        run: npm install
      - name: Run tests
        run: npm test
      - name: Deploy
        run: ./deploy.sh
● Explanation: This GitHub Actions workflow triggers on pushes and pull
requests to the main branch. It sets up a Node.js environment, installs
dependencies, runs tests, and deploys the application using a custom shell
script.
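The deploy.sh script itself is project-specific and not shown in the workflow; a minimal placeholder sketch (the app name and scp target are hypothetical) might look like this:

```shell
#!/bin/sh
# deploy.sh - minimal placeholder deployment script
set -e  # stop on the first failing command
APP_NAME="myapp"
echo "Deploying $APP_NAME..."
# Copy build artifacts to the server here, e.g.:
# scp -r dist/ user@prod-server:/opt/$APP_NAME/
echo "Done."
```

Remember to make the script executable (chmod +x deploy.sh) so the workflow can invoke it as ./deploy.sh.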
Example 11
This GitHub Actions workflow defines a CI/CD pipeline with multiple jobs,
specifically for building and deploying a Dockerized application.
Workflow Overview:
● Trigger: The pipeline runs when changes are pushed to the main branch or
a pull request is made to it.
● Jobs:
○ Build Job:
■ Sets up a Node.js environment and installs dependencies.
■ Runs tests and builds a Docker image.
■ Tags the image with the GitHub SHA and pushes it to Docker
Hub.
○ Deploy Job:
■ After the build job completes, it pulls the Docker image from
Docker Hub and deploys it to a production server using SSH.
This setup automates the process of building, testing, and deploying a Dockerized
application, ensuring that each deployment uses the latest successful build.
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout Code
        uses: actions/checkout@v3
      - name: Set up Node.js
        uses: actions/setup-node@v3
        with:
          node-version: '16'
      - name: Install Dependencies
        run: npm install
      - name: Run Tests
        run: npm test
      - name: Build Docker Image
        run: |
          docker build -t myapp:${{ github.sha }} .
          docker tag myapp:${{ github.sha }} mydockerhub/myapp:${{ github.sha }}
          docker push mydockerhub/myapp:${{ github.sha }}
  deploy:
    runs-on: ubuntu-latest
    needs: build
    steps:
      - name: Checkout Code
        uses: actions/checkout@v3
      - name: Deploy to Production
        run: |
          ssh user@prod-server 'docker pull mydockerhub/myapp:${{ github.sha }} && docker run -d mydockerhub/myapp:${{ github.sha }}'
● Explanation:
○ Jobs: build and deploy jobs.
○ Build Job: This job installs Node.js, runs tests, builds a Docker
image, and pushes it to Docker Hub.
○ Deploy Job: After the build job completes, the deploy job pulls the
Docker image from Docker Hub and deploys it to a production server.
Example 12
● Event Triggers: The workflow is triggered by pushes or pull requests to the main
branch.
● Jobs:
1. The build job runs on the ubuntu-latest environment.
2. Matrix Strategy: It tests three Node.js versions (12, 14, and 16) in parallel,
reducing the time taken to test across different versions.
● Steps:
1. Checkout Code: Fetches the code from the repository.
2. Set up Node.js: Configures Node.js for each version in the matrix.
3. Install Dependencies: Runs npm install to set up project dependencies.
4. Run Tests: Executes npm test to ensure code correctness.
5. Build Project: Compiles the project using npm run build.
6. Deploy: Runs a deployment script (deploy.sh).
In summary, this workflow helps ensure that the project works across different versions of
Node.js by running tests in parallel, making it more efficient.
name: Node.js CI
on:
  push:
    branches:
      - main
  pull_request:
    branches:
      - main
jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        node-version: [12, 14, 16] # Runs tests on multiple Node.js versions
    steps:
      - name: Checkout code
        uses: actions/checkout@v3
      - name: Set up Node.js
        uses: actions/setup-node@v3
        with:
          node-version: ${{ matrix.node-version }}
      - name: Install dependencies
        run: npm install
      - name: Run tests
        run: npm test
      - name: Build project
        run: npm run build
      - name: Deploy
        run: ./deploy.sh
● Explanation: This GitHub Actions workflow runs the same jobs across
multiple Node.js versions using a matrix strategy. It will test Node.js
versions 12, 14, and 16 in parallel.
Example 13
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v3
  deploy:
    runs-on: ubuntu-latest
    needs: build
    steps:
      - name: Checkout code
        uses: actions/checkout@v3
GitLab CI
This example demonstrates a simple GitLab CI pipeline that automates the process
of building, testing, and deploying a Flask application using Docker. The pipeline
consists of three stages: build, test, and deploy. In the build stage, a Docker image
is built and pushed to a Docker registry. The test stage runs tests using pytest inside
the Docker container. Finally, the deploy stage deploys the Docker image to a
production server via SSH.
yaml
stages:
  - build
  - test
  - deploy

variables:
  DOCKER_REGISTRY: "docker.io"
  IMAGE_NAME: "my-flask-app"

before_script:
  # Registry credentials are assumed to be set as CI/CD variables
  - docker login -u $DOCKER_USER -p $DOCKER_PASS $DOCKER_REGISTRY

build:
  stage: build
  script:
    - docker build -t $DOCKER_REGISTRY/$CI_PROJECT_NAMESPACE/$IMAGE_NAME:$CI_COMMIT_REF_NAME .
    - docker push $DOCKER_REGISTRY/$CI_PROJECT_NAMESPACE/$IMAGE_NAME:$CI_COMMIT_REF_NAME

test:
  stage: test
  script:
    - docker run --rm $DOCKER_REGISTRY/$CI_PROJECT_NAMESPACE/$IMAGE_NAME:$CI_COMMIT_REF_NAME pytest

deploy:
  stage: deploy
  script:
    - ssh user@prod-server "docker pull $DOCKER_REGISTRY/$CI_PROJECT_NAMESPACE/$IMAGE_NAME:$CI_COMMIT_REF_NAME && docker run -d $DOCKER_REGISTRY/$CI_PROJECT_NAMESPACE/$IMAGE_NAME:$CI_COMMIT_REF_NAME"
Explanation:
● Build: Builds the Flask Docker image and pushes it to the registry, tagged with the branch name.
● Test: Runs pytest inside the freshly built container.
● Deploy: Pulls and runs the image on the production server over SSH.
yaml
stages:
  - build
  - test
  - deploy-staging
  - deploy-production

variables:
  IMAGE_NAME: "myapp"
  DOCKER_REGISTRY: "docker.io"

before_script:
  # Registry credentials are assumed to be set as CI/CD variables
  - docker login -u $DOCKER_USER -p $DOCKER_PASS $DOCKER_REGISTRY

build:
  stage: build
  script:
    - docker build -t $DOCKER_REGISTRY/$CI_PROJECT_NAMESPACE/$IMAGE_NAME:$CI_COMMIT_REF_NAME .
    - docker push $DOCKER_REGISTRY/$CI_PROJECT_NAMESPACE/$IMAGE_NAME:$CI_COMMIT_REF_NAME

test:
  stage: test
  script:
    - echo "Running tests..." # Replace with your test command

deploy-staging:
  stage: deploy-staging
  script:
    - echo "Deploying to staging..." # Replace with your staging deploy command
  only:
    - develop

deploy-production:
  stage: deploy-production
  script:
    - echo "Deploying to production..." # Replace with your production deploy command
  only:
    - main
Explanation:
● Branch-Based Deployments: deploy-staging runs only on the develop branch, while deploy-production runs only on main.
● Stages: The image is built and tested once, then deployed to the environment that matches the branch.
yaml
stages:
  - build
  - test
  - deploy

variables:
  IMAGE_NAME: "myapp"
  DOCKER_REGISTRY: "docker.io"

before_script:
  # Registry credentials are assumed to be set as CI/CD variables
  - docker login -u $DOCKER_USER -p $DOCKER_PASS $DOCKER_REGISTRY

build:
  stage: build
  script:
    - docker build -t $DOCKER_REGISTRY/$CI_PROJECT_NAMESPACE/$IMAGE_NAME:$CI_COMMIT_REF_NAME .
    - docker push $DOCKER_REGISTRY/$CI_PROJECT_NAMESPACE/$IMAGE_NAME:$CI_COMMIT_REF_NAME

test:
  stage: test
  parallel: 3
  script:
    - echo "Running test shard $CI_NODE_INDEX of $CI_NODE_TOTAL..." # Replace with your test command

deploy:
  stage: deploy
  script:
    - ssh user@prod-server "docker pull $DOCKER_REGISTRY/$CI_PROJECT_NAMESPACE/$IMAGE_NAME:$CI_COMMIT_REF_NAME && docker run -d $DOCKER_REGISTRY/$CI_PROJECT_NAMESPACE/$IMAGE_NAME:$CI_COMMIT_REF_NAME"
Explanation:
● Parallel Jobs: The test stage is set to run in parallel with 3 jobs, reducing
testing time.
● Stages: Build, test, and deploy stages for Docker images.
This GitLab CI pipeline defines a multi-stage pipeline that builds, tests, and
deploys a Docker-based application. It also demonstrates how to store build
artifacts during the build stage, which can be used in subsequent stages.
yaml
stages:
  - build
  - test
  - deploy

variables:
  IMAGE_NAME: "myapp"
  DOCKER_REGISTRY: "docker.io"

before_script:
  # Registry credentials are assumed to be set as CI/CD variables
  - docker login -u $DOCKER_USER -p $DOCKER_PASS $DOCKER_REGISTRY

build:
  stage: build
  script:
    - docker build -t $DOCKER_REGISTRY/$CI_PROJECT_NAMESPACE/$IMAGE_NAME:$CI_COMMIT_REF_NAME .
    - docker push $DOCKER_REGISTRY/$CI_PROJECT_NAMESPACE/$IMAGE_NAME:$CI_COMMIT_REF_NAME
  artifacts:
    paths:
      - build/

test:
  stage: test
  script:
    - echo "Running tests against the build/ artifacts..." # Replace with your test command

deploy:
  stage: deploy
  script:
    - echo "Deploying..." # Replace with your deployment command
Explanation:
● Artifacts: The build job stores build artifacts, which can be used in
subsequent stages (e.g., testing).
● Multi-Stage Pipeline: Divides the pipeline into build, test, and deploy stages,
each performing its task in sequence.
yaml
stages:
  - build
  - test
  - sonarqube

variables:
  SONAR_PROJECT_KEY: "my-project-key"
  SONAR_PROJECT_NAME: "My Project"
  SONAR_PROJECT_VERSION: "1.0"
  SONAR_HOST_URL: "http://localhost:9000"
  SONAR_LOGIN: $SONARQUBE_TOKEN

before_script:
  - apt-get update -y

build:
  stage: build
  script:
    - mvn clean install

sonarqube_analysis:
  stage: sonarqube
  script:
    - mvn sonar:sonar
      -Dsonar.projectKey=$SONAR_PROJECT_KEY
      -Dsonar.projectName=$SONAR_PROJECT_NAME
      -Dsonar.projectVersion=$SONAR_PROJECT_VERSION
      -Dsonar.host.url=$SONAR_HOST_URL
      -Dsonar.login=$SONAR_LOGIN
  only:
    - main
Explanation:
● SonarQube Analysis: The sonarqube_analysis job runs Maven's sonar:sonar goal against the configured SonarQube server, authenticating with the SONARQUBE_TOKEN CI/CD variable, and runs only on the main branch.
This pipeline includes a security scan of the Docker image using Trivy, a
vulnerability scanner. The scan stage runs Trivy against the built Docker image to
identify potential security issues.
yaml
stages:
  - build
  - scan
  - deploy

variables:
  DOCKER_IMAGE: "my-docker-image:latest"
  TRIVY_IMAGE: "aquasec/trivy:latest"

build:
  stage: build
  script:
    - docker build -t $DOCKER_IMAGE .

scan:
  stage: scan
  script:
    - docker run --rm -v /var/run/docker.sock:/var/run/docker.sock $TRIVY_IMAGE image $DOCKER_IMAGE

deploy:
  stage: deploy
  script:
    - echo "Deploying application..." # Replace with your deployment command
Explanation:
● Docker Scanning: Uses Trivy to scan the Docker image for vulnerabilities.
● Security Integration: Adds security checks as part of the CI pipeline.
Tekton
yaml
apiVersion: tekton.dev/v1beta1
kind: Pipeline
metadata:
  name: example-pipeline
spec:
  tasks:
    - name: build
      taskRef:
        name: build-task
    - name: test
      runAfter: ["build"]
      taskRef:
        name: test-task
    - name: deploy
      runAfter: ["test"]
      taskRef:
        name: deploy-task
---
apiVersion: tekton.dev/v1beta1
kind: PipelineRun
metadata:
  name: example-pipelinerun
spec:
  pipelineRef:
    name: example-pipeline
  resources:
    - name: git-repo
      resourceRef:
        name: example-repo
Explanation:
This pipeline defines three sequential tasks: build, test, and deploy. Each
task references a predefined Tekton Task. The PipelineRun triggers the execution.
yaml
apiVersion: tekton.dev/v1beta1
kind: Task
metadata:
  name: build-task
spec:
  steps:
    - name: build
      image: node:16
      script: |
        #!/bin/bash
        npm install
        npm run build
Explanation:
This task uses a Node.js image to install dependencies and build a project.
yaml
apiVersion: tekton.dev/v1beta1
kind: Pipeline
metadata:
  name: example-pipeline
spec:
  resources:
    - name: git-repo
      type: git
  tasks:
    - name: clone-repository
      taskRef:
        name: git-clone
      resources:
        inputs:
          - name: repository
            resource: git-repo
    - name: build
      runAfter: ["clone-repository"]
      taskRef:
        name: build-task
    - name: test
      runAfter: ["build"]
      taskRef:
        name: test-task
    - name: deploy
      runAfter: ["test"]
      taskRef:
        name: deploy-task
Explanation:
This pipeline includes a Git resource for cloning a repository, followed by build,
test, and deploy tasks.
yaml
apiVersion: tekton.dev/v1beta1
kind: Task
metadata:
  name: git-clone
spec:
  resources:
    inputs:
      - name: repository
        type: git
  steps:
    - name: clone
      image: alpine:latest
      script: |
        #!/bin/sh
        # The git input resource is cloned automatically into the workspace
        ls /workspace/repository
Explanation:
This task clones a GitHub repository into a workspace using a Git resource.
yaml
apiVersion: tekton.dev/v1beta1
kind: Pipeline
metadata:
  name: sonar-pipeline
spec:
  params:
    - name: sonar-token
      type: string
  resources:
    - name: git-repo
      type: git
  tasks:
    - name: build
      taskRef:
        name: maven-build
      resources:
        inputs:
          - name: source
            resource: git-repo
    - name: sonar-analysis
      runAfter: ["build"]
      taskRef:
        name: sonar-scanner
      params:
        - name: SONAR_PROJECT_KEY
          value: my-project-key
        - name: SONAR_PROJECT_NAME
          value: My Project
        - name: SONAR_HOST_URL
          value: http://localhost:9000
        - name: SONAR_LOGIN
          value: $(params.sonar-token)
---
apiVersion: tekton.dev/v1beta1
kind: PipelineRun
metadata:
  name: sonar-pipelinerun
spec:
  pipelineRef:
    name: sonar-pipeline
  params:
    - name: sonar-token
      value: $(secrets.SONARQUBE_TOKEN)
Explanation:
This pipeline integrates SonarQube for static code analysis. A token is passed
securely for authentication.
yaml
apiVersion: tekton.dev/v1beta1
kind: Task
metadata:
  name: selenium-tests
spec:
  steps:
    - name: run-selenium-tests
      image: selenium/standalone-chrome
      script: |
        #!/bin/bash
        python3 /tests/run_selenium_tests.py
Explanation:
A Selenium test task uses a Chrome container to execute Python-based Selenium
tests.
yaml
apiVersion: tekton.dev/v1beta1
kind: Task
metadata:
  name: trivy-scan
spec:
  params:
    - name: IMAGE
      type: string
  steps:
    - name: trivy-scan
      image: aquasec/trivy:latest
      script: |
        #!/bin/sh
        trivy image $(params.IMAGE)
Explanation:
The task scans a Docker image for vulnerabilities using Trivy.
● Jenkins: Jenkinsfile defines the pipeline using stages like Checkout, Build,
Test, and Deploy. You can use shell scripts to execute build, test, and
deployment tasks.
● GitHub Actions: A workflow YAML file (in .github/workflows/) defines jobs and steps that execute on events like push or pull_request. You can set up environments (e.g., Node.js) and use actions for common tasks (like checking out code).
● GitLab CI: .gitlab-ci.yml file defines stages and jobs with commands that
execute on pushes to a repository. Jobs can be configured to run on specific
branches.
● Tekton: Pipeline and Task resources are defined in YAML manifests. Tekton Pipelines are Kubernetes-native, and tasks are reusable units of work that are executed as part of a pipeline.
3. What is a Jenkinsfile?
○ Answer: A Jenkinsfile is a text file that contains the definition of a
Jenkins pipeline. It is usually stored within the project repository and
contains the code that defines the pipeline structure, stages, steps, and
the logic that controls the execution of the pipeline.
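As a sketch, a minimal declarative Jenkinsfile might look like this (the stage names and Maven commands are illustrative):

```groovy
pipeline {
    agent any // run on any available agent
    stages {
        stage('Build') {
            steps {
                sh 'mvn clean install' // build command for a Maven project
            }
        }
        stage('Test') {
            steps {
                sh 'mvn test'
            }
        }
    }
}
```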
4. How do you integrate Jenkins with version control systems like Git?
○ Answer: Jenkins can be integrated with Git by using the Git plugin.
This allows Jenkins to pull code from a Git repository to trigger jobs.
Typically, Git webhooks are configured to notify Jenkins whenever
changes are pushed to the repository, automatically triggering the
corresponding pipeline or job.
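When webhooks are not available, polling can serve as a fallback; a Jenkinsfile can declare an SCM polling trigger like this (the schedule is illustrative):

```groovy
pipeline {
    agent any
    triggers {
        // Poll Git roughly every 5 minutes; with a webhook configured,
        // this can be removed so builds start immediately on push
        pollSCM('H/5 * * * *')
    }
    stages {
        stage('Checkout') {
            steps {
                checkout scm // pull the code from the configured Git repository
            }
        }
    }
}
```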
5. What are GitLab runners?
○ Answer: GitLab runners are agents that execute the jobs defined in the .gitlab-ci.yml file. They can be shared runners provided by GitLab or specific runners registered for a project or instance, and they can run jobs in different environments such as Docker containers, Kubernetes, or virtual machines.
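In .gitlab-ci.yml, a job can be routed to a particular runner using tags (the tag name below is hypothetical):

```yaml
test:
  stage: test
  tags:
    - docker-runner # only runners registered with this tag pick up the job
  script:
    - echo "Running on a tagged runner..."
```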
1. What is Tekton, and how does it fit into the CI/CD process?
○ Answer: Tekton is an open-source Kubernetes-native CI/CD
framework that allows you to create and manage continuous
integration and delivery pipelines. Tekton provides Kubernetes
custom resources like Pipeline, PipelineRun, and Task to define, run,
and manage CI/CD pipelines on Kubernetes clusters.
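A minimal Task illustrating these custom resources might look like this (the task name is illustrative):

```yaml
apiVersion: tekton.dev/v1beta1
kind: Task
metadata:
  name: hello-task
spec:
  steps:
    - name: hello
      image: alpine:latest
      script: |
        #!/bin/sh
        echo "Hello from Tekton"
```

After registering it with kubectl apply -f hello-task.yaml, it can be started with the Tekton CLI: tkn task start hello-task.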