DevOps
Introduction to DevOps
Objectives
After learning this chapter you will be able to:
• Understand the essential characteristics of DevOps, including building a culture of shared responsibility, transparency, and embracing failure.
• Study concepts such as the importance of Continuous Integration and Continuous Delivery, Infrastructure as Code, Test-Driven Development, and Behaviour-Driven Development.
• Study essential DevOps concepts and their use.
• Understand the organizational impact of DevOps tools, and their use in the Linux operating system.
1.1 INTRODUCTION
Many times we hear from software developers that they "do DevOps" or use DevOps tools. The term combines two words: development and operations. DevOps culture is different from traditional corporate culture; it typically requires a change in mindset, processes, and tools. It is closely linked with Continuous Integration (CI) and Continuous Delivery (CD) practices and with Infrastructure as Code (IaC).
The term DevOps (Development and Operations) refers to a collection of tools and technologies combined to carry out various business processes. The purpose is to bridge the gap between the development department and the operations department, which are two of the most important departments in any IT organization.
1.1.1 Define DevOps
Definition: DevOps (a combination of the words "development" and "operations") is the combination of practices and tools designed to increase an organization's ability to deliver applications and services faster than traditional software development processes.
1.2 WHAT IS DevOps?
"DevOps" is the contraction of "Dev" and "Ops", referring to Development and Operations. The idea is to replace siloed teams with multidisciplinary teams that now work together using shared and efficient practices and tools. Essential DevOps practices include agile planning, continuous integration, continuous delivery, and the monitoring of applications. DevOps is a constant journey.
The DevOps philosophy focuses on breaking away from traditionally siloed software development to adopt a more collaborative approach. Under the DevOps model, development and operations teams work together throughout the project lifecycle, from development to deployment.

Fig. 1.1: DevOps Model (Development, Operations and Quality Assurance (QA) intersecting in DevOps)

1.3 SDLC MODELS: LEAN, ITIL, AGILE
The focus of the IT industry has changed to a continuous integration and deployment approach. There is a constant need for upgrading and integrating the solution with the existing software to meet market demands. Therefore, owing to a continuous delivery approach, the communication between the stakeholders involved in developing a software solution and the end-users has increased more than ever, due to which there is a constant need for feedback and its implementation.
SDLC:
SDLC began with the Waterfall model but is gradually shifting to other models and is now approaching the Agile, DevOps and Lean methodologies. The Software Development Life Cycle (SDLC) is a traditional methodology for software development. The rigid nature of the development process in the Waterfall SDLC makes it impossible to revert to the previous stage of development, and the model has become outdated as a result of the ever-evolving nature of software development, where the implementation must be continually updated in accordance with user feedback in order to meet the evolving requirements of the IT sector.
Let's see some of the modern software development models that are widely used in the industry today.
1. Lean Model:
The concept of lean software development derived its presence in the IT industry from Toyota manufacturing. The Toyota production system was the first to introduce the lean development process, in the mid-20th century, to improve car production and reduce wastage of time and resources. This approach, followed by many manufacturing sectors across various industries, was first executed in software development in 2003.
Why is the Lean model used in software?
There are various reasons for the popularity of the Lean methodology in the IT industry, such as:
o It allows frequent product changes and software releases.
o Shorter development lifecycle.
o Continuous exchange of preliminary development steps.
o Simultaneous improvement in development quality, time, and so on.
These are some of the notable factors that make the lean development model essential for organizations that want to keep up with the current pace of software development.
2. Agile Model:
Agile is a continuous integration and deployment approach which is iterative in nature. It derives its principles from the Lean methodology. Some of the agile approaches can be attributed to the following concepts:
o Frequent analysis and implementation of changes.
o Team-oriented leadership or, specifically, ownership of tasks by each member of the team.
o The team is usually very self-organized and responsible for its deliverables.
o Agile perfectly supports organizational and customer expectations.
Agile Process: Following are the various processes that are used to implement the Agile Process:
(a) Scrum: This process of agile development focuses on a team-driven development environment. The team is usually composed of 7 to 9 members, with major roles and responsibilities classified as Scrum Master, Product Owner, and Scrum Team. The three roles can be explained further as follows:
o Scrum Master: This member is responsible for organizing the team and eliminating communication gaps, or any other gap, about the task being delivered.
o Product Owner: This member is responsible for creating a product backlog, prioritizing the backlog, and ensuring that the designated tasks are completed by the end of each iteration.
o Scrum Team: The team is responsible for completing the allocated tasks within a sprint, by a self-organized and collaborative approach.
Some of the Scrum Practices are listed as follows:
o Sprint Planning: In this type of planning, the team discusses the product backlog, the initial plan of action, and the tasks to be completed during this sprint.
o Daily Scrum Meets: Daily scrum meetings refer to the daily morning meetings, usually time-boxed to 15 minutes, where team members discuss their plan of action for the day.
o Sprint Review Meeting: This type of review meeting refers to a meeting where completion of the planned course of action is discussed and monitored, to determine the future course of actions needed to overcome any bottlenecks.
o Sprint Retrospective Meets: This is the last phase and the last scrum meeting. In this phase, the overall development strategies are discussed regarding the scope of solution implementation, identification of bottlenecks, the success of planned courses, and any other scope of improvement which could be adopted in future projects.
(b) Crystal Methodologies: This methodology values the interaction between people more than processes and tools. This method also includes the approaches accepted by the team according to the scope of a project.
(c) Dynamic Software Development Method (DSDM): This methodology is also referred to as a Rapid Action Development Model. In this, the users are involved actively in the development of the project, and teams are empowered with decision-making capabilities.
(d) Feature-driven Development (FDD): This is a feature-driven methodology, where each phase involves completing a small feature within the given time. It consists of design walk-throughs, code inspections, and so on.
(e) Lean Software Development: Lean software development includes 'just-in-time' production techniques. This methodology targets eliminating waste and reducing cost, in that way increasing the efficiency of the entire software development process.
(f) Extreme Programming (XP): The extreme programming methodology is very useful in situations where there are frequent release cycles, shorter development phases, and uncertainties relating to the functionality to be developed.
3. ITIL (Information Technology Infrastructure Library):
ITIL provides the framework and organized processes, while Lean reminds team members to reduce waste (for IT, this is in the form of time and non-utilized talent) and Agile helps team members to work more quickly and adapt to change.
For example, think of an IT service desk handling ticket pileups due to Covid. They are likely using incident management through ITIL V3/2011 or ITIL 4, which keeps teams working through tickets in the same way. But when they add in Lean, it might look like the team reducing downtime by re-prioritizing tickets. Add in a bit of Agile, and the tickets might be routed to people outside of the IT service desk who are equipped to handle specific types of issues, or it might look like a group tackling problems and implementing feedback loops.
Individually, these methodologies work to update support and service delivery, but combined they can boost it to the next level. It should be noted that you don't necessarily need all three to be successful; you can find success with any combination. Together, Lean, ITIL, and Agile offer:
o Faster resolution of issues.
o Improved productivity for agents.
o A better overall customer experience.
o Reduction in wasted time and, ultimately, money.

1.4 WHY DevOps?
DevOps is used to increase an organization's speed in delivering applications and services. Many companies, including Amazon and Netflix, have successfully implemented DevOps to enhance their user experience.
Facebook's mobile app is updated every two weeks, which effectively tells users: you can have what you want, and you can have it now. It is the DevOps philosophy that helps Facebook ensure that apps are not outdated and that users get the best experience on Facebook. Facebook achieves this through a true code ownership model that makes its developers responsible for each kernel of code, including testing and supporting it through production and delivery. Facebook has developed a DevOps culture and has successfully enhanced its development lifecycle.
Industries have started to prepare for digital transformation by shifting their delivery timelines to weeks and months instead of years, while maintaining high quality as a result. DevOps is the solution for all this.
DevOps is different from traditional IT because traditional IT has thousands of lines of code created by different teams with different standards, but under DevOps the code is created by one team with intimate knowledge of the product. Traditional IT is complex to understand, while DevOps is easily understandable.
DevOps Lifecycle:
• The DevOps Lifecycle is the methodology by which professional development teams come together to bring products to market more efficiently and quickly. The DevOps lifecycle consists of various phases such as Plan, Code, Build, Test, Release, Deploy, Operate, and Monitor.

Fig. 1.2: DevOps Lifecycle (Plan → Code → Build → Test → Release → Deploy → Operate → Monitor)

o Plan: The first step of planning is determining the commercial needs and gathering the opinions of end-users by professionals.
o Code: The code is developed, and in order to simplify the design, the team of developers uses tools and extensions that take care of security problems.
o Build: After the coding part, programmers use various tools for the submission of the code to the common code source.
o Test: This level is very important, as software integrity should be assured at this level. At this phase, various types of tests such as User Acceptability Testing, Safety Testing, Speed Testing and many more will be done.
o Release: At this level, everything is ready to be deployed in the operational environment.
o Deploy: At this level, Infrastructure-as-Code assists in creating the operational infrastructure and subsequently publishes the build using various DevOps lifecycle tools.
o Operate: At this level, the available version is ready for users to use. Here, the operations team looks after the server configuration and deployment.
o Monitor: The observation done at this level depends on the data gathered from consumer behaviour, the efficiency of applications, and various other sources.
Best practices to follow when using DevOps:
• Implement an automated dashboard which gives run-time information about the development of product stages.
• Keep the entire team together, which is good for collaboration and communication.
• Allow DevOps to be a cultural change within the organization.
• Be patient with the developers when using DevOps.
• Maintain a centralized unit for storage.
• Build a flexible infrastructure, as it can be accessed at any time, anywhere, with the help of the internet.
Advantages:
1. Faster Delivery: It enables organizations to release new products and updates faster and more frequently, which can lead to a competitive advantage.
2. Improved Collaboration: DevOps promotes collaboration between development and operations teams, resulting in better communication, increased efficiency, and reduced friction.
3. Improved Quality: DevOps emphasizes automated testing and continuous integration, which helps to catch bugs early in the development process and improve the overall quality of software.
4. Increased Automation: DevOps enables organizations to automate many manual processes, freeing up time for more strategic work and reducing the risk of human error.
5. Better Scalability: DevOps enables organizations to quickly and efficiently scale their infrastructure to meet changing demands, improving the ability to respond to business needs.
6. Increased Customer Satisfaction: DevOps helps organizations to deliver new features and updates more quickly. This can result in increased customer satisfaction and loyalty.
7. Improved Security: DevOps promotes security best practices, such as continuous testing and monitoring, which can help to reduce the risk of security breaches and improve the overall security of an organization's systems.
8. Better Resource Utilization: DevOps enables organizations to improve their use of resources, including hardware, software, and personnel, which can result in cost savings and improved efficiency.
problem. This is where the deployment of the database happens again:
o When we deploy, the old version of the data is erased.
o The new database structure, instances, and schemas are created.
o Finally, the data is loaded into the database.
5. Incremental Change:
• Incremental change is another effective technique to manage DevOps data. It ensures an application keeps working even after we make changes to it, which is an important pre-requisite of Continuous Integration (CI). On the other hand, continuous delivery demands the successful deployment of every software release, including the changes to the database, into production. This means we must update the entire operational database while retaining the valuable data held in it. So, we need an efficient rollback strategy so we can easily take back control of things if anything goes wrong.
For this, we must follow the following data migration strategies:
(a) Database Versioning: It is one of the most efficient mechanisms for data migration in an automated fashion. All we need is to create a table in the database which contains its version number. Now, every time we make a change to the database, we will have to create two scripts:
o A roll-forward script that takes the database from version x to version x+1.
o A roll-backward script that takes the database from version x+1 to version x.
Another thing we will need is an application configuration setting which specifies the version of the database with which it is designed to work. Then, during the deployment, we can use a tool which looks at the current version of the database and the database version required by the application version being deployed. This tool will use the roll-forward or roll-backward scripts to align the application and the database versions correctly.
(b) Managing Orchestrated Changes: This is another common practice for data migration. We are not in favor of it, because it would be better if applications could communicate directly, not through the database. Still, many companies are following this practice and integrating all applications through a single database. Be cautious when doing the same, because even a small change to the database can have a negative impact on how other applications work. We should test such changes in an orchestrated environment before implementing them in the production environment.
Another effective data migration strategy is to perform the database migration process independently from the application deployment process. This will also make sure data migration is done without data loss or any change in the application behaviour.
6. Configuration Management:
Configuration management is another crucial step in DevOps infrastructure management, in which we ensure that all the files and software we are expecting on the machine are available, configured correctly, and working as intended.
Managing configuration manually is simple for a single machine. However, when we are handling five or ten servers to which 100-200 computers are connected, configuration management becomes a nightmare. That's why we need a better way to manage things:
(a) Version Control: Version control is responsible for recording changes to a file or a set of files over time, so we can easily recall specific versions later on. It is a good practice because if we know the previous versions of files, we can easily roll back to the earlier versions of the project. Version control can also help us recover in case we make mistakes and break things.
Best practices for Version Control:
o Use version control for everything (source code, tests, database scripts, build and deployment scripts, documentation, libraries, and configuration files).
o Check in regularly to see if all the versions are working properly.
o Use detailed, multi-paragraph commit messages during check-in. This can save hours of debugging in case any error occurs later.
(b) Managing Components and Dependencies:
1. Managing External Libraries: Since external libraries come in binary form, managing them can be a difficult task. Here are two ways we can get this done:
o Check the external libraries into version control.
o Declare the external libraries and use a tool like Maven or Ivy to download them from the Internet repositories to our artifact repository.
2. Managing Components: The best way is to divide the application into smaller components. This will limit the scope of changes to the application, reduce regression bugs, encourage reuse, and enable a much more efficient development process on large projects.
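The roll-forward/roll-backward idea behind the Database Versioning strategy above can be sketched in a few lines of shell. This is only a simulation: a plain file stands in for the version table, and the "scripts" just bump a number where a real migration would run SQL.

```shell
#!/bin/sh
# Sketch of roll-forward / roll-backward database versioning.
# A real setup stores the version in a database table and applies SQL
# migration scripts; here a plain file stands in for the version record.
DB_VERSION_FILE=$(mktemp)
echo 1 > "$DB_VERSION_FILE"        # the database starts at version 1

current() { cat "$DB_VERSION_FILE"; }

roll_forward() {                   # take the database from version x to x+1
    v=$(current)
    echo $((v + 1)) > "$DB_VERSION_FILE"
}

roll_backward() {                  # take the database from version x+1 to x
    v=$(current)
    echo $((v - 1)) > "$DB_VERSION_FILE"
}

APP_REQUIRES=3                     # version the application being deployed expects

# Deployment tool: compare versions and apply scripts in the right direction.
while [ "$(current)" -lt "$APP_REQUIRES" ]; do roll_forward; done
while [ "$(current)" -gt "$APP_REQUIRES" ]; do roll_backward; done

echo "database now at version $(current)"
```

The same loop aligns the database in either direction, which is exactly why keeping both scripts for every change makes rollback safe.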
(c) Managing Software Configuration: Software configuration should be subject to proper management and testing, and we should consider a few important software configuration principles, such as:
o Keep all the available application configuration options in the same repository as its source code.
o Manage the values of configurations separately.
o Perform configurations using an automated process, with the help of values taken from the configuration repository.
o Use clear naming conventions to avoid confusion.
o Do not repeat any information.
o Keep the configuration information as simple as possible.
o Do not over-engineer or over-optimize the configuration system.
o Run all necessary configuration tests and keep a record of each.
That is how we establish a DevOps infrastructure management and software deployment environment. The process requires a lot of patience and guidance, because there are many chances things could go wrong.

1.10 DevOps AND AGILE
Agility is the key. Additionally, through DevOps, scalability can be achieved quickly and easily even for big organizations, with a stable and reliable operating environment pushing one's business to stay ahead of the competition.
Agile helps in bridging the gap between Business and Development teams, and DevOps helps in doing this for Development and Operations teams.
Agile refers to an iterative approach which focuses on collaboration, customer feedback, and small, rapid releases. DevOps is the practice of bringing development and operations teams together. DevOps' central concept is to manage end-to-end engineering processes (Concept to Cash).
Difference between DevOps and Agile:
DevOps and Agile are two software development strategies having similar aims for product development and delivery, end to end.

Table 1.2: Key differences between DevOps and Agile

Parameter | DevOps | Agile
Definition | DevOps is a practice of bringing development and operations teams together. | Agile refers to the continuous iterative approach, which focuses on collaboration, customer feedback, and small, rapid releases.
Purpose | The DevOps purpose is to manage end-to-end engineering processes. | The Agile purpose is to manage complex projects.
Task | It focuses on constant testing and delivery. | It focuses on constant changes.
Team Size | It has a large team size, as it involves all the stakeholders. | It has a small team size. The smaller the team, the fewer people work on it, so that they can move faster.
Team Skillset | DevOps divides and spreads the skill set between the development and the operations team. | Agile development emphasizes training all team members to have a wide variety of similar and equal skills.
Implementation | DevOps is focused on collaboration, so it does not have any commonly accepted framework. | Agile can be implemented within a range of tactical frameworks such as SAFe, Scrum, and Sprint.
Duration | The ideal goal is to deliver the code to production daily, or every few hours. | Agile development is managed in units of sprints. This time is much less than a month for each sprint.
Target Areas | End-to-end business solution and fast delivery. | Software development.
Feedback | Feedback comes from the internal team. | In Agile, feedback comes from the customer.
Shift Left | It supports both variations, left and right. | It supports only shift left.
Focus | DevOps focuses on operational and business readiness. | Agile focuses on functional and non-functional readiness.
Importance | In DevOps, developing, testing, and implementation are all equally important. | Developing software is inherent to Agile.
Quality | DevOps contributes to creating better quality with automation and early bug removal. Developers need to follow coding and best architectural practices to maintain quality standards. | Agile produces better application suites with the desired requirements. It can adapt quickly according to the changes made on time during the project life.
contd. ...
Tools | Puppet, Chef, AWS, Ansible, OpenStack and TeamCity are popular DevOps tools. | Bugzilla, Kanboard, and JIRA are some popular Agile tools.
Automation | Automation is the primary goal of DevOps. It works on the principle of maximizing efficiency when deploying software. | Agile does not emphasize automation.
Communication | DevOps communication involves specs and design documents. It is essential for the operational team to fully understand the software release and its network implications for adequately running the deployment process. | Scrum is the most common method of implementing Agile software development. The Scrum meeting is carried out daily.
Documentation | In DevOps, process documentation is prime, because it will send the software to an operational team for deployment. Automation minimizes the impact of insufficient documentation. However, in the development of sophisticated software, it is difficult to transfer all the knowledge required. | The agile method gives priority to the working system over complete documentation. It is ideal when you are flexible and responsive. However, it can harm when you are trying to turn things over to another team for deployment.

1.11 DevOps TOOLS
• Puppet, Chef, Ansible and SaltStack are some of the most popular tools.

Fig. 1.3: DevOps Tools (Puppet, Chef, JUnit, Maven, Gradle, Sensu, Splunk, SaltStack, JIRA, Ansible, Eclipse, Bamboo)

1. Puppet:
Puppet is the most widely used DevOps tool. It allows the delivery and release of technology changes quickly and frequently. It has features of versioning, automated testing, and continuous delivery. It enables managing the entire infrastructure as code without expanding the size of the team.
Features:
o Real-time context-aware reporting.
o Model and manage the entire environment.
o Defined and continually enforced infrastructure.
o Desired state conflict detection and remediation.
o It inspects and reports on packages running across the infrastructure.
o It eliminates manual work from the software delivery process.
o It helps the developer to deliver great software quickly.
2. Ansible:
Ansible is a leading DevOps tool. Ansible is an open-source IT engine that automates application deployment, cloud provisioning, intra-service orchestration, and other IT tools. It makes it easier for DevOps teams to scale automation and speed up productivity.
Ansible is easy to deploy because it does not use any agents or custom security infrastructure on the client side; it works by pushing modules to the clients. These modules are executed locally on the client side, and the output is pushed back to the Ansible server.
Features:
o It is an easy-to-use open-source tool to deploy applications.
o It helps in avoiding complexity in the software development process.
o It eliminates repetitive tasks.
o It manages complex deployments and speeds up the development process.
3. Docker:
Docker is a high-end DevOps tool that allows building, shipping, and running distributed applications on multiple systems. It also helps to assemble the apps quickly from their components, and it is typically suitable for container management.
Features:
o It configures the system more comfortably and faster.
o It increases productivity.
o It provides containers that are used to run the application in an isolated environment.
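A common thread in the Puppet and Ansible features above is "desired state": you declare what the machine should look like, and the tool makes only the changes needed to get there, so running it twice is safe. The sketch below imitates that idempotency in plain shell; real tools express this in their own declarative languages, and the config file and key used here are invented for illustration.

```shell
#!/bin/sh
# Sketch of the "desired state" / idempotency idea behind Puppet and Ansible:
# running the same step twice leaves the system unchanged the second time.
# (Plain-shell stand-in; the file and setting below are made-up examples.)
CONF=$(mktemp)

ensure_line() {                 # add a config line only if it is missing
    line=$1
    file=$2
    grep -qxF "$line" "$file" || echo "$line" >> "$file"
}

ensure_line "max_connections=100" "$CONF"
ensure_line "max_connections=100" "$CONF"   # second run: no duplicate added

wc -l < "$CONF"                 # still exactly one line in the file
```

Declaring the end state rather than scripting raw edits is what lets these tools detect and remediate drift across hundreds of machines.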
Red Hat Enterprise Linux (RHEL): RHEL is a popular Linux distribution that is widely used for applications such as microservers, cloud computing, application development, storage solutions, and many more.
3. Fedora: Fedora is another option for RHEL-centred developers. It differs from CentOS in two very important ways. For starters, Fedora is not an RHEL clone like CentOS. It is officially adopted by the RHEL team, since Red Hat uses Fedora as a sort of proving or testing ground for upcoming RHEL technologies. Because of this, Fedora is fully integrated with RHEL.

12. grep: Search for a pattern in a file or output. Syntax: grep [options] pattern file
13. find: Search for files or directories by name. Syntax: find [path] [expression]
14. tar: Archive files and directories into a tarball. Syntax: tar [options] archive file(s)_to_archive
contd. ...
15. gzip: Compress files. Syntax: gzip [options] file
16. gunzip: Uncompress files. Syntax: gunzip [options] file.gz
17. ps: Display information about running processes. Syntax: ps [options]
18. top: Display system resource usage and process information. Syntax: top
19. ssh: Connect to a remote system using SSH. Syntax: ssh [user@]hostname

1.17 LINUX ADMINISTRATION
The job of a Linux systems administrator is to manage the operations of a computer system, like maintaining and enhancing it, creating user accounts/reports, and taking backups using Linux tools and command-line interface tools. Most computing devices are powered by Linux because of its high stability, high security, and open-source environment.
Linux DevOps is the practice of using Linux-based systems and tools to build, deploy, and manage applications in a Continuous Integration and Continuous Deployment environment. This approach allows for the automation of many common tasks associated with software development, including source control, build automation, infrastructure orchestration, deployment, monitoring, and logging.
A Linux DevOps Manager should:
1. Learn DevOps Tools: A Linux DevOps manager should learn DevOps tools like Docker, Ansible, Jenkins, Kubernetes, etc. These DevOps tools should be learned to become a DevOps Manager in the Linux operating system. To learn Linux DevOps tools, one should know the Linux command line and its various commands. This includes learning how to navigate the file system; how to create, remove and manage files; and how to install and configure software. It is also important to learn scripting languages such as Python, Bash, and Ruby, as these are commonly used in DevOps automation. Once a person is comfortable with the Linux command line and scripting languages, they can move on to learning about DevOps tools such as Ansible, Puppet, Chef, and Jenkins.
2. Get Familiar with Infrastructure as Code (IaC): IaC is the process of managing and provisioning infrastructure through machine-readable definition files.
3. Automate Processes: DevOps practitioners should be comfortable with automating processes. This means scripting out manual processes, setting up Continuous Integration/Continuous Delivery (CI/CD) pipelines, and streamlining task execution. For this purpose, knowledge of scripting languages is important.
4. Monitor and Optimize: Monitoring and optimizing systems and processes are key components of the DevOps role. Monitoring helps identify problems, while optimization helps ensure processes run efficiently. Systems and process monitoring are essential aspects of the Linux DevOps role. DevOps engineers must be able to analyze system and process performance metrics, detect any potential issues, and take proactive steps to prevent outages or other negative impacts on system performance. They must also be able to identify opportunities for improvement and develop strategies for optimizing system performance.
5. Collaborate: DevOps is all about collaboration between teams, and practitioners should be comfortable communicating with stakeholders and other teams. Collaboration is essential when transitioning to a Linux DevOps role. Working closely with other team members can help build the skills and knowledge needed for a successful transition.

1.18 ENVIRONMENT VARIABLES
Environment variables are pairs of keys and values that can be used to customize the build process and store sensitive data such as access details to deployment servers.
Linux is a multi-user operating system. Multi-user means that each user has their own dedicated operating environment after logging in to the system. This environment is defined by a set of variables, which are called environment variables. Users can modify their own environment variables to meet the requirements of the environment.
1. Use the command env or printenv to display the currently defined environment variables, for example:
$ env (or printenv)
XDG_SESSION_ID=3092
HOSTNAME=XXXX
NVM_CD_FLAGS=
TERM=xterm-256color
SHELL=/bin/bash
HISTSIZE=1000
SSH_CLIENT=XXXX 49967 22
SSH_TTY=/dev/pts/0
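Deployment automation often depends on variables like these being present. As a small sketch (the DEPLOY_* names are invented for illustration), a script can verify its required environment variables before doing any work:

```shell
#!/bin/sh
# Sketch: a deployment script that refuses to run unless the environment
# variables it depends on are set. The DEPLOY_* names are illustrative.
require_env() {
    for name in "$@"; do
        # ${!name} indirection is bash-only, so use eval for plain sh
        eval "value=\${$name}"
        if [ -z "$value" ]; then
            echo "missing required environment variable: $name" >&2
            return 1
        fi
    done
}

DEPLOY_TARGET=staging          # illustrative values; normally set by the CI system
DEPLOY_USER=ci-bot
export DEPLOY_TARGET DEPLOY_USER

if require_env HOME PATH DEPLOY_TARGET DEPLOY_USER; then
    echo "environment ok, deploying to $DEPLOY_TARGET"
fi
```

Failing fast on missing configuration is far cheaper than discovering it halfway through a deployment.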
Set/Unset New Env Variables:
3. export command: To set a new environment variable:
$ export VERSION=1.0.0
$ echo $VERSION
1.0.0
4. unset command: To delete/remove an existing environment variable:
$ echo $VERSION
1.0.0
$ unset VERSION
$ echo $VERSION

You can set/unset multiple variables as well:
$ export VERSION=1.0.0 VERSION2=2.0.0
$ unset VERSION VERSION2
Set Persistent ENV Variables:
1. For All Users: To make ENV variables persistent for all users, you can leverage the /etc/profile file. This file is used to set system-wide environment variables on users' shells. The variables are sometimes the same ones that are in .bash_profile; however, this file is used to set an initial PATH or PS1 for all shell users. You need to run source /etc/profile for changes to take effect immediately; otherwise, they will only take effect when you re-login as the user next time.
2. For a Single User: To set specific ENV variables for a single user, you can modify the .bash_profile file in the user's home directory, which is a hidden file that can be viewed with ll -a:
$ cd
$ ll -a .bash_profile
-rw-r--r-- 1 tony tony 193 Sep 22 2021 .bash_profile
Common ENV Variables:
1. PATH: The paths, separated by colons, are a list of directories where executable programs are searched for.
$ export PATH=$PATH:$PWD
$ echo $PATH
/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/root/bin:/root/docker/httpd
2. HOME: The user's main working directory; it is the default directory when the user logs in to the Linux system.
$ whoami
tony
$ echo $HOME
/home/modern
3. HISTSIZE: The number of historical commands saved. The commands we input will be saved by the system, and this environment variable records the number of commands to be kept. Generally 1000.
$ echo $HISTSIZE
1000
$ HISTSIZE=1001
$ echo $HISTSIZE
1001
4. LOGNAME: Current user login name.
$ echo $LOGNAME
cloud-dev.modern.com
6. SHELL: The type of shell used by the current user.
$ echo $SHELL
/bin/bash

1.19 NETWORKING
Linux Networking and Troubleshooting Commands:
1. hostname: The hostname command is used to view the hostname of the machine and to set the hostname.
Example: sudo hostname modern.com (the hostname set using the "hostname" command, when you restart the
If you set the
change to the name specified in the hostname file
programs can be found. machine, the hostname will
$ echo $PATH e.g. /etc/hostname).
/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/root/bin
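As a quick check, the transient hostname reported by the kernel can be compared with the persistent name stored on disk; a small sketch (the /etc/hostname path applies to Debian/Ubuntu-style systems):

```shell
# Transient hostname currently known to the kernel
hostname

# Persistent hostname stored on disk (Debian/Ubuntu-style systems);
# quietly skipped if the file does not exist on this machine
cat /etc/hostname 2>/dev/null || true
```

If the two values differ, a change made with the hostname command has not yet been written to the hostname file.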
DevOps : MCA [Management - Sem. IV]    1.38    Introduction to DevOps
So if you want to change the hostname permanently, you can use the /etc/hostname file or the relevant hostname file present on the server.
o For Ubuntu machines, you can change it in the /etc/hostname file.
o For RHEL, CentOS and Fedora, you can change it in the /etc/sysconfig/network file.
2. host: The host command is for the reverse lookup of an IP or a DNS name.
Example:
If you want to find the DNS name attached with an IP, you can use the host command as follows:
host 8.8.8.8
You can also do the reverse to find the IP address associated with the domain name. For example,
host devopscube.com
3. ping: The ping networking utility is used to check if the remote server is reachable or not. It is primarily used for checking the connectivity and troubleshooting the network. It provides the following details:
(i) Bytes sent and received
(ii) Packets sent, received, and lost
(iii) Approximate round-trip time (in milliseconds)
Syntax: ping <IP or DNS>
Example: ping devopscube.com
To ping an IP address: ping 8.8.8.8
If you want to limit the ping output without using ctrl + c, then you can use the "-c" flag with a number as shown below:
ping -c 1 devopscube.com
4. curl: The curl utility is primarily used to transfer data from or to a server. However, you can use it for network troubleshooting. For network troubleshooting, curl supports protocols such as DICT, FILE, FTP, FTPS, GOPHER, HTTP, HTTPS, IMAP, IMAPS, LDAP, LDAPS, MQTT, POP3, POP3S, RTMP, RTMPS, RTSP, SCP, SFTP, SMB, SMBS, SMTP, SMTPS, TELNET and TFTP.
For example, curl can check connectivity on port 22 using telnet:
curl -v telnet://192.168.33.10:22
To check the FTP connectivity using curl:
curl ftp://ftptest.net
You can troubleshoot web server connectivity as well:
curl http://devopscube.com -I
5. wget: The wget command is primarily used to fetch web pages. You can use wget to troubleshoot network issues as well.
For example, you can troubleshoot proxy server connections using wget:
wget -e use_proxy=yes http_proxy=<proxy_host:port> http://externalsite.com
You can check if a website is up by fetching the file:
wget www.google.com
6. ip (ifconfig): The ip command is used to display and manipulate routes and network interfaces. The ip command is the newer version of ifconfig. ifconfig works in all the systems, but it is better to use the ip command instead of ifconfig.
Display network devices and configuration:
ip addr
This command can be used with pipes and grep to get more granular output, like the IP address of the eth0 interface:
ip a | grep eth0 | grep "inet" | awk '{print $2}'
Get details of a specific interface:
ip a show eth0
Routing tables can be listed by the following commands:
ip route
ip route list
7. arp: ARP (Address Resolution Protocol) shows the cache table of local networks' IP addresses and MAC addresses that the system interacted with.
arp
8. ss (netstat): The ss command is a replacement for netstat. You can still use the netstat command on all systems. Using the ss command, you can get more information than the netstat command. The ss command is fast because it gets all the information from the kernel userspace.
(a) Listing all connections: The "ss" command will list all the TCP, UDP, and Unix socket connections on your machine.
(b) Filtering out TCP, UDP and UNIX sockets: If you want to filter out TCP, UDP or UNIX socket details, use the "-t", "-u" and "-x" flags with the "ss" command. It will show all the established connections to the specific ports. If you want to list both connected and listening ports, use "-a" with the specific flag as shown below:
ss -ta
ss -ua
ss -xa
(c) List all listening ports: To list all the listening ports, use the "-l" flag with the ss command. To list specific TCP, UDP or UNIX sockets, use the "-t", "-u" and "-x" flags with "-l" as shown below:
redhat@devopscube:~$ ss -lt
State    Recv-Q   Send-Q   Local Address:Port   Peer Address:Port
LISTEN   0        128      *:ssh                *:*
LISTEN   0        50       :::http-alt          :::*
LISTEN   0        50       :::55857             :::*
LISTEN   0        128      :::ssh               :::*
LISTEN   0        50       :::53285             :::*
redhat@devopscube:~$
(d) List all established: To list all the established ports, use the state established flag as shown below:
ss -t -r state established
One of the common errors faced by developers and DevOps engineers is "Bind failed error: Address already in use". You can find the process ID associated with a port using the following command. Then you can kill the process to free up the port.
lsof -i:8080
14. route: This command is used to get the details of the route table for your system and to manipulate it.
Examples:
For listing all routes:
Execute the "route" command without any arguments to list all the existing routes in your system or server.
redhat@devopscube:~$ route
Kernel IP routing table
Destination   Gateway          Genmask        Flags  Metric  Ref  Use  Iface
default       ip-172-31-16-1.  0.0.0.0        UG     0       0    0    eth0
172.17.0.0    *                255.255.0.0    U      0       0    0    docker0
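The "Address already in use" investigation described above can be done with ss alone; port 8080 here is just an example:

```shell
# List all listening TCP sockets with numeric ports
ss -ltn

# Narrow the listing to a single port; empty output means the port is free,
# and adding -p (run as root) also shows the owning process
ss -ltn 'sport = :8080'
```

Once the owning process is identified (via ss -p or lsof), killing it frees the port.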
Requirements:
For this installation, you will need a copy of the Ubuntu Linux Server 16.04 installation media. You can obtain the latest version as a DVD image (ISO) which can be used for this installation at www.ubuntu.com/download/server. Be sure to select the Ubuntu Linux Server edition, as the desktop version uses a graphical installer. We will be installing the x86_64 DVD ISO image on a virtual machine.
Installer Keyboard Notes:
The Ubuntu text installer utilizes keyboard keys for menu selections. The following is a list of the primary keys you will use:
o Tab key or Arrow keys: Navigate from one selection to another
o Space Bar: Toggle selections on or off
o Enter key: Accept the current selections and proceed to the next step (Some keyboards may have a Return key rather than an Enter key)
Installation:
The following steps will guide you through a basic installation of Ubuntu Linux Server 16.04. The installation process will take some time to complete and some steps will take longer than others.
[Important: Kindly take backup before installation as this installation may erase the data]
1. Select the language you would like to use and press Enter. By default it is English.
Fig. 1.6 (i): Language Selection Screen
2. Next you will be asked to select an action. The boot menu offers options such as Install Ubuntu Server, Install MAAS Region Controller, Install MAAS Rack Controller, Check disc for defects, Test memory, Boot from first hard disk, and Rescue a broken system.
Fig. 1.6 (ii): Select an Action
Since we are installing Linux, we will choose the default Install Ubuntu Server by pressing the Enter key.
3. Now that we have begun the installation process, the installer will ask for the language that you would like the system to use during installation and operation.
Fig. 1.6 (iii): Select a Language for the System
We will be using the default, English.
4. Once the system language has been selected, you will be asked to select the location the system will use. This setting is used for configuring the locality of the system: the selected location will be used to set your time zone and also, for example, to help select the system locale. Normally this should be the country where you live.
Fig. 1.6 (iv): Select Location
We will accept the default, United States, by pressing the Enter key.
5. The installer will now ask whether or not it should try to detect your keyboard layout. You can select <Yes> to allow the system to detect the keyboard layout. If it is successful you will automatically skip to step 8. If the detection is not successful, you will need to complete the manual selection process in steps 6 and 7 as if you had selected <No>.
Here, we will select <No> to allow us to manually select our keyboard layout. Pressing the Tab key will allow you to move between selections.
Fig. 1.6 (vii): Select Specific Keyboard Layout
We will again choose English (US), which is the default.
8. Your system will now try to automatically configure your network options. If it is unable to do this, you will be presented with failure messages.
You may choose to not configure the network at this time. If you choose to skip configuring the network, you will need to manually configure your network settings after installation completes before your system will be able to communicate with other servers on your network.
For our example, we will select Configure network manually.
Fig. 1.6 (ix): Configure Network Manually
After you have made your selection, press the Tab key until you get to <Continue> and then press the Enter key to continue.
10. Now that we have selected Configure network manually, we will be asked to enter the Internet Protocol (IP) address for our system. If you do not know your IP address, please consult your network administrator for the information. In this example, we will use the IP address 1.2.3.4.
In the provided field, enter the IP address of 1.2.3.4. When done, press the Tab key until you get to <Continue> and then press the Enter key.
11. You will next be prompted to enter the network mask for your network. Again, if you do not know your network mask, please consult your network administrator for the information. In our example, we will use the default 255.255.255.0 as our network mask.
Fig. 1.6 (xi): Network Mask
Since we are accepting the default network mask of 255.255.255.0, simply press the Tab key until you get to <Continue> and then press the Enter key.
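On Ubuntu 16.04 the installer records these answers in /etc/network/interfaces; the address and netmask entered so far would correspond to a stanza roughly like the following (a sketch only — the interface name eth0 is an assumption, and the exact file contents may differ):

```
# /etc/network/interfaces (fragment) -- static IPv4 configuration
auto eth0
iface eth0 inet static
    address 1.2.3.4
    netmask 255.255.255.0
```

Knowing this file makes it easy to revisit or correct the network settings after installation.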
12. In this step, we will enter the IP address of the network gateway for our network. Your network administrator can provide this information to you if you do not know what it is. In our example, we will use the address 1.2.3.1 as our gateway address.
Fig. 1.6 (xii): IP Address of the Network Gateway
In the provided field, enter the IP address of 1.2.3.1 for the gateway address. When done, press the Tab key until you get to <Continue> and then press the Enter key.
13. In this step, we will enter the IP address of the primary name server for our network. Your network administrator can provide this information to you if you do not know what it is. Here, we will use the address 8.8.8.8 as our name server address.
In the field provided, enter the IP address of 8.8.8.8 for the name server address. When done, press the Tab key until you get to <Continue> and then press the Enter key.
14. Next you will be asked to enter the name that this host will be known as. This name can be a single word (no spaces) and should not contain special characters such as "%". It is common, however, for system administrators to utilize a dash "-" in their host names (such as web-server-1).
Fig. 1.6 (xiv): Hostname
When selecting your host's name, it is important to select a meaningful name to prevent confusion with other hosts on your network. Since the host we are building will not appear on a network, we will leave the host name as its default value of ubuntu. Press the Tab key until you get to <Continue> and then press the Enter key.
15. In this step, you will be asked to enter the full name of the primary user of the system. (Note: On Ubuntu systems, this user is NOT the root superuser but will have administrative capabilities.)
A user account will be created for you to use instead of the root account for non-administrative activities. This information will be used, for instance, as the default origin for emails sent by this user, as well as by any program which displays the user's name; your full name is a reasonable choice. In the field provided, we will enter TechOnTheNet.
17. On this screen, we will need to enter the password you would like to use for the techonthenet user account.
It is important to choose a strong password that cannot be easily guessed and that you'll remember! (Typically, strong passwords are more than 8 characters long, contain upper/lower case characters, and have numbers or special characters such as a "$".)
Fig. 1.6 (xvii): Set up a Password
After entering the password, press the Tab key until you get to <Continue> and then press the Enter key.
18. In this step, you will be asked to re-enter the password you used in the previous step. This is to ensure that the passwords match.
Fig. 1.6 (xix): Confirmation for the Password
Since our machine will only be used for this tutorial, we will accept the risk and continue. If you choose <No> you will have to repeat the password selection (Steps 11 and 12). If you choose <Yes> you will proceed to the next step.
20. Ubuntu allows you to encrypt your home directories for security. This is useful for situations where users require security on items they keep in their home directories. The system would then seamlessly mount your encrypted home directory each time you log in and automatically unmount it when you log out of all active sessions. Here, we will select <No> and not encrypt the home directories.
Fig. 1.6 (xx): Confirmation to Encrypt the Home Directories
21. In this step, you will configure the clock and choose the time zone your computer will use. The system time services will use this setting to display the correct local time.
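The zone chosen here becomes the system-wide default, but any individual process can override it through the TZ environment variable; a small sketch:

```shell
# The system-wide time zone set by the installer is the default for every
# process, but TZ overrides it per invocation:
TZ=UTC date +%Z                  # prints: UTC
TZ=America/New_York date +%Z     # typically prints EST or EDT, depending on the date
```

This is handy for checking how timestamps will render in another zone without changing the system setting.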
If the desired time zone is not listed, then choose <Go Back> to the step "Choose language" and select a country that uses the desired time zone (the country you live in or are located in).
23. Next we will need to select the hard disk that we will apply the hard drive partitions to. Since our computer contains only one hard drive, we can accept the defaults and continue. If your system has more than one hard disk, you will need to select the appropriate hard drive your system is set to boot from.
The installer can guide you through partitioning a disk (using different standard schemes) or, if you prefer, you can do it manually. With guided partitioning you will still have a chance later to review and customise the results.
Here, we will choose the second option, which is Guided - use entire disk and set up LVM.
24. If you are ready to apply your selected partition layout to the hard disk, tab to <Yes> and then press the Enter key.
Fig. 1.6 (xxiv): Save Changes to Disks
25. In this step, you will be entering the amount of hard disk space you would like the installer to use for Ubuntu Linux Server. The installer will by default fill in the full size of your hard disk, but you can lower this value too.
Fig. 1.6 (xxv): Amount of Volume
We will leave the value unchanged, which will tell the installer to utilize all available hard disk space. Press the Tab key until you get to <Continue> and then press the Enter key.
26. The installer will confirm that you are ready to write the partition information to the hard disk. If you continue, the changes listed (the partition tables and the partitions to be formatted, including the LVM root and swap volumes) will be written to the disks; otherwise you will be able to make further changes manually.
27. We will not provide any HTTP proxy information, so we press the Tab key until we get to <Continue> and then press the Enter key.
Fig. 1.6 (xxvii): HTTP Proxy Information
28. Applying updates on a frequent basis is an important part of keeping your system secure. By default, updates need to be applied manually using package management tools. Alternatively, you can choose to have this system automatically download and install security updates, or you can choose to manage this system over the web as part of a group of systems using Canonical's Landscape service.
Fig. 1.6 (xxviii): Options to Manage Upgrades on the System
It is always a good idea to ensure security updates are applied in a timely manner, so we will select the Install security updates automatically option. (TIP: If you are concerned that an automatic update might break other software, you may want to select No automatic updates and manage upgrades on the system yourself.)
29. Select the additional software/services to install on the host.
Fig. 1.6 (xxix): Select the Additional Software/Service to Install on the Host
Since we would like to be able to log into the host using secure shell (SSH) from another host on the network, we will also select OpenSSH server.
You can select a menu option by pressing the Space Bar. Moving between menu items can be accomplished by using your keyboard's Arrow keys.
When you are finished selecting the software, select <Continue>.
30. The system will now ask to install the GRUB boot loader onto the master boot record of your hard disk. GRUB is used during the boot up process to enable Ubuntu Linux Server to load.
Fig. 1.6 (xxx): Install the GRUB Boot Loader
We will select <Yes> to install the GRUB boot loader.
31. Now the installation is complete. The installer will now prompt you to reboot the computer. Make sure to remove the installation media (CD-ROM or ISO image), so that you boot into the new system rather than restarting the installation.
Fig. 1.6 (xxxi): Finish the Installation
Select <Continue> to reboot into Ubuntu Linux Server.
32. If all goes well, in a few minutes you will see a login prompt similar to the following screenshot:
Ubuntu 16.04 LTS ubuntu tty1
ubuntu login:
Fig. 1.6 (xxxii): Log-in Screen
You have successfully installed Ubuntu Server Linux. Now you can log in using the user name and password you configured during Steps 10, 11 and 12.
1.21 RPM AND YUM INSTALLATION
YUM (Yellowdog Updater Modified) is an open-source and free command-line package management utility for systems executing the Linux OS with the help of the RPM package manager. Many other tools offer a GUI to YUM functionality because YUM itself provides a command-line interface.
YUM permits automatic updates and package dependency management over RPM-based distros. YUM implements software repositories (sets of packages) that can be used locally or over a network connection, similar to the Advanced Package Tool from Debian.
Installing YUM in Ubuntu:
Step 1: Update the System
We need to execute the update command for getting the latest package information and updating package repositories:
$ sudo apt update
Fig. 1.7 (a)
Step 2: Install YUM
We need to execute the install command for quickly installing the packages and their dependencies:
$ sudo apt-get install yum
ubuntu@ubuntu-VirtualBox:~$ sudo apt-get install yum
[sudo] password for ubuntu:
Reading package lists... Done
Building dependency tree
Reading state information... Done
Fig. 1.7 (b)
Summary
This chapter gives an idea about DevOps and its operations.
DevOps (a combination of two words, "development" and "operations") is the combination of practices and tools designed to increase an organization's ability to deliver applications and services faster than traditional software development processes.
DevOps is used to increase an organization's speed at the time of delivering applications and services. Many companies have successfully implemented DevOps to enhance their user experience, including Amazon, Netflix, etc.
There are various terminologies used in DevOps such as Container, Commit, Agent, Deployment etc.
DevOps is the grouping of cultural philosophies, practices, and tools that increases an organization's skill to deliver applications and services at high velocity: evolving and improving products at a faster pace than organizations using traditional software development and infrastructure management processes.
There are various tools like Puppet and Ansible used for management of DevOps.
DevOps configuration is the evolution and automation of the systems administration role, bringing automation to infrastructure management and deployment.
This chapter also shows how to install the Linux OS and use its basic commands.
Check Your Understanding
1. Identify the method that least impacts the establishment of the DevOps methodology.
(a) Waterfall Software Delivery (b) Lean Manufacturing
(c) Continuous Software Delivery (d) Agile Software Delivery
2. What is DevOps?
(a) A small team of people that own everything related to a particular service
(b) Developers performing operations
(c) Developers and Operations team members working together
(d) None of the above
3. Which of the following is not a feature of continuous delivery?
(a) Automate Everything (b) Continuous Improvement
(c) Bug fixes and experiments (d) Gathering Requirement
4. What is the use of Git?
(a) Version Control System tool (b) Continuous Integration tool
(c) Containerization tool (d) Continuous Monitoring tool
5. What type of mindset is the core of a DevOps culture?
(a) Service Mindset (b) Skill Mindset
(c) People Mindset (d) Process Mindset
6. Which statement does NOT define DevOps?
(a) DevOps is a framework and job title that focuses on structured processes to organize flow between the Development and Operations teams.
(b) DevOps is a movement or practice that emphasizes collaboration and communication of both software developers and other Information Technology (IT) professionals.
(c) DevOps is about experiences, ideas, and culture.
(d) DevOps is an activity of optimizing the development to operations value stream by creating an increasingly smooth, fast flow of application changes from development into operations.
7. What is the difference between Continuous Delivery and Continuous Deployment?
(a) Continuous delivery is a manual task, while continuous deployment is an automated task.
(b) Continuous delivery has a manual release to production decision, while continuous deployment has releases automatically pushed to production.
(c) Continuous delivery includes all steps of the software development life cycle; continuous deployment may skip a few steps such as validation and testing.
(d) Continuous delivery means complete delivery of the application to the customer; continuous deployment includes only deployment of the application in the customer environment.
8. ______ creates an extra merge commit every time you need to incorporate changes.
(a) Git Merge (b) Git Push
(c) Git Fork (d) Git Fetch
9. How will you find all the changes that were made in a certain commit?
(a) git diff-tree -a <commit id> (b) git diff-tree -d <commit id>
(c) git diff-tree -r <commit id> (d) git diff-tree -s <commit id>
Answers
1. (a)  2. (a)  3. (d)  4. (a)  5. (a)  6. (d)  7. (b)  8. (a)  9. (c)
Practice Questions
Q.I Answer the following questions in short.
1. What is DevOps?
2. How is the Lean Model better than SDLC?
3. State various processes used to implement Agile.
4. What is ITIL?
5. What are the various phases of the DevOps lifecycle?
6. Who are DevOps stakeholders?
7. State any two goals of DevOps.
8. What is BDD?
9. What is Continuous Delivery?
10. What is MTBF?
11. What is MTTR?
12. What is Database Versioning?
13. State the best practices of version control for DevOps.
14. What is a kernel?
15. What are environment variables?
16. What is YUM?
Q.II Answer the following questions.
1. What are the various processes used to implement Agile?
2. Explain the various phases of the DevOps life cycle.
3. What are the best practices to follow when using DevOps?
4. What are the various advantages and disadvantages of DevOps?
5. What are the various goals of DevOps?
6. Explain the DevOps perspective of Infrastructure and Environment.
7. What are the key differences between DevOps and Agile?
8. State some features of any one DevOps tool you like.
9. What is Configuration Management?
10. Write a short note on Configuration Management.
11. What are the three approaches of continuous integration and deployment?
12. Explain the architecture of the Linux OS.
13. Explain any four basic commands in Linux.
14. What are the roles and responsibilities of a Linux DevOps Manager?
15. Explain any four Linux networking and troubleshooting commands.
Q.III Write short notes on:
1. Continuous Integration and Continuous Delivery (CI/CD).
2. DevOps Life Cycle.
3. Architecture of Linux.
4. Linux Networking Commands.
5. Environment Variables.

2
Version Control - GIT
Objectives...
After learning this chapter you will be able to:
O Understand the concept of version control systems, GIT and its types.
O Understand the basics of GIT and its installation.
O Learn various commands of GIT.
O Understand repository creation using GIT with the branching concept.
2.1 INTRODUCTION TO GIT
Version Control Systems (VCS) are used by all software developers during all phases of project development. These version control systems help them to manage software code, and keep the history of all versions of all software code, projects, and objects. In software engineering, all the software developers must communicate with each other to get a better outcome. Version control systems support versioning in a collaborative framework so that all software engineers work efficiently. Without the use of a VCS, collaboration becomes challenging. The aim of this chapter is to give an idea about version control systems with Git: its history, installation and branching concept.
Before version control systems, software developers did not have an efficient way to collaborate on their code. Software developers had a hectic time while trying to work on the same code at the same time. They improvised by mailing each other code for sharing, they stored their code on USB sticks and floppy disks as backups, and they made sure to work in small teams and work on different parts of a system. It was manageable for small projects, but people needed large systems that could suit their needs. These challenges led to the need for a version control system where developers could effectively collaborate on code and keep backups of various versions of a project.
But before moving into GIT, we should have a basic understanding about Version Control Systems (VCS) and Git.
(2.1)
2.2 WHAT IS GIT?
• Git is a free and open-source Distributed Version Control System (DVCS) designed to handle everything from small to very large projects with speed and efficiency (Global Information Tracker).
• GIT relies on the basis of distributed development of software, where more than one developer may have access to the source code of a specific application and can make changes to it that may be seen by other developers.
• GIT has the functionality, performance, security, and flexibility that most teams and individual developers need.
• It was initially designed and developed by Linus Torvalds for Linux kernel development in 2005.
• Every GIT working directory is a full-fledged repository with complete history and full version tracking capabilities, independent of network access or a central server.
• GIT allows a team of people to work together, all using the same files. And it helps the team cope with the confusion that tends to happen when multiple people are editing the same files.

2.3 ABOUT VERSION CONTROL SYSTEMS AND TYPES
Version control is a system that records changes to a file or set of files over time so that you can recall specific versions later. For the examples in this chapter, you will use software source code as the files being version controlled, though in reality you can do this with nearly any type of file on a computer.
If you are a graphic or web designer and want to keep every version of an image or layout (which you would most certainly want to), a Version Control System (VCS) is a very wise thing to use. It allows you to revert selected files back to a previous state, revert the entire project back to a previous state, compare changes over time, see who last modified something that might be causing a problem, who introduced an issue and when, and more. Using a VCS also generally means that if you screw things up or lose files, you can easily recover. In addition, you get all this for very little overhead.
Version control systems are a type of software tool that helps in recording changes made to files by keeping track of modifications made to the code. When we develop a software product, most of the time we collaborate as a group. The groups of software developers might be located at different locations, and each one of them contributes some specific kind of functionality or feature. To contribute to the product, they make modifications to the source code. The version control system helps developers to efficiently communicate and manage (track) all the changes that have been made to the source code, along with information like who made which change and what change has been made.
Local Version Control Systems:
Many people's version-control method of choice is to copy files into another directory (perhaps a time-stamped directory, if they're clever). This approach is very common because it is so simple, but it is also incredibly error prone. It is easy to forget which directory you're in and accidentally write to the wrong file or copy over files you don't mean to.
To deal with this issue, programmers long ago developed local VCSs that had a simple database that kept all the changes to files under revision control.
Fig. 2.1: Local Version Control Systems
One of the most popular VCS tools was a system called RCS (Revision Control System), which manages multiple revisions of files and is still distributed with many computers today. RCS works by keeping patch sets (that is, the differences between files) in a special format on disk; it can then re-create what any file looked like at any point in time by adding up all the patches.
Version Control Systems can be divided into 2 categories:
1. Centralized Version Control System (CVCS)
2. Distributed Version Control System (DVCS)
1. Centralized Version Control System (CVCS):
The major issue with RCS that people encounter is that they need to collaborate with developers on other systems. To deal with this problem, Centralized Version Control Systems (CVCSs) were developed. These systems (such as CVS, Subversion, and Perforce) have a single server that contains all the versioned files, and a number of clients that check out files from that central place. For many years, this has been the standard for version control.
Centralized version control systems contain just one repository, and each user gets their own working copy. You need to commit to reflect your changes in the repository. It is possible for others to see your changes by updating. In CVCS, two things are required to make your changes visible to others: commit and update.
Even though it seems pretty good to maintain a single repository, it has some major drawbacks:
• It is not locally available: you always need to be connected to a network to work with it.
• High probability of losing data: since everything is centralized, if the central server crashes, it may result in losing the entire data of the project.
Fig. 2.2: Centralized Version Control System
2. Distributed Version Control System (DVCS):
Distributed version control systems contain multiple repositories. Each developer has their own repository and working copy. That means even if you make some changes by committing, other developers have no access to your changes. This is because the commit reflects those changes only in your local repository, and you need to push them in order to make them visible on the central repository. Similarly, when you update, you do not get others' changes unless you have first pulled those changes into your repository. In DVCS, there are 4 main operations: commit, push, pull, and update.
Fig. 2.3: Distributed Version Control System
If you use DVCS, you can have the following advantages over CVCS:
• All operations (except push and pull) are very fast because they only need to access the hard drive, not a remote server. Hence, you do not always need an internet connection.
• Committing new change-sets can be done locally without manipulating the data on the main repository.
• Since every contributor has a full copy of the project repository, they can share changes with one another whenever they want.
• If the central server crashes at any point of time, the lost data can be easily recovered from any one of the contributors' local repositories.
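The offline nature of a DVCS described above can be sketched with Git itself. Everything below runs against the local disk only; the repository and file names are invented for illustration:

```shell
# Every Git working directory is a full repository, so committing and
# browsing history need no network connection at all.
mkdir demo-repo && cd demo-repo
git init -q                               # create a new local repository
echo "hello" > notes.txt
git add notes.txt                         # stage the file
git -c user.name="Dev" -c user.email="dev@example.com" \
    commit -q -m "First commit"           # recorded entirely on local disk
git log --oneline                         # full history, available offline
cd ..
```

The `-c user.name/-c user.email` flags supply an identity for this one command; normally you would set these once with `git config --global`.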
2.4 DIFFERENCE BETWEEN CVCS AND DVCS
Following are the important differences between Centralized Version Control and Distributed Version Control.
Table 2.1
Sr. No. | Key | Centralized Version Control | Distributed Version Control
1 | Working | In CVCS, a client needs to get a local copy of the source from the server, do the changes and commit those changes to the central source on the server. | In DVCS, each client can have a local branch as well and have a complete history on it. The client needs to push the changes to the branch, which will then be pushed to the server repository.
2 | Learning Curve | CVCS systems are easy to learn and set up. | DVCS systems are difficult for beginners. Multiple commands need to be remembered.
3 | Branches | Working on branches is difficult in CVCS. The developer often faces merge conflicts. | Working on branches is easier in DVCS. The developer faces fewer conflicts.
4 | Offline Access | CVCS systems do not provide offline access. | DVCS systems are workable offline, as a client copies the entire repository to their local machine.
5 | Speed | CVCS is slower, as every command needs to communicate with the server. | DVCS is faster, as the user mostly deals with a local copy without hitting the server every time.
6 | Backup | If the CVCS server is down, developers cannot work. | If the DVCS server is down, developers can work using their local copies.

2.5 A SHORT HISTORY OF GIT
The Linux kernel is an open-source software project of large scope. During the early years of the Linux kernel maintenance (1991-2002), changes to the software were passed around as patches and archived files. In 2002, the Linux kernel project began using a proprietary DVCS called BitKeeper.
In 2005, the relationship between the community that developed the Linux kernel and the commercial company that developed BitKeeper broke down, and the tool's free-of-charge status was revoked. This prompted the Linux development community (and in particular Linus Torvalds, the creator of Linux) to develop their own tool based on some of the lessons they learned while using BitKeeper. Some of the goals of the new system were as follows:
• Speed
• Simple design
• Strong support for non-linear development (thousands of parallel branches)
• Fully distributed
• Able to handle large projects like the Linux kernel efficiently (speed and data size)
Since 2005, Git has remained easy to use while retaining these initial qualities. It is very useful for large projects, and its incredible branching system is very useful for non-linear development.
In 2005, for the development of the Linux kernel, Linus Torvalds needed a new version control system. So he went offline for a week, wrote a revolutionary new system from scratch, and called it Git. Fifteen years later, the platform is the undisputed leader in a crowded field.
Worldwide, many start-ups, collectives and multinationals, including Google and Microsoft, use Git to maintain the source code of their software projects. Some host their own Git projects; others use Git via commercial hosting companies such as GitHub (founded in 2007), Bitbucket (founded in 2010) and GitLab (founded in 2011). The largest of these, GitHub, has 40 million registered developers and was acquired by Microsoft for a whopping $7.5 billion in 2018.
Git (and its competitors) is sometimes categorized as a version control system (VCS), sometimes as a source code management system (SCM), and sometimes as a revision control system (RCS). The best indication of Git's market dominance is a survey of developers by Stack Overflow.
Traditionally, version control was client-server, so the code sits in a single repository (or repo) on a central server. Concurrent Versions System (CVS), Subversion and Team Foundation Version Control (TFVC) are all examples of client-server systems.
A client-server VCS works fine in a corporate environment, where development is undertaken by an in-house development team with good network connections. It does not work so well if you have a collaboration involving hundreds or thousands of developers working independently and voluntarily, remotely, all eager to try out new things with the code, which is all typical of Open Source Software (OSS) projects such as Linux.
2.6 GIT BASICS
Git Life Cycle:
Following are the stages of the Git life cycle:
1. Local working directory: The first stage of a Git project life cycle is the local working directory where our project resides, which may or may not be tracked.
2. Initialization: To initialize a repository, we give the command git init. With this command, we make Git aware of the project files in our repository.
3. Staging Area: Now that our source code files, data files, and configuration files are being tracked by Git, we add the files that we want to commit to the staging area with the git add command. This process can also be called indexing. The index consists of the files added to the staging area.
4. Commit: Now, we commit our files using the git commit -m "our message" command.
Fig. 2.4: Git Life Cycle
Git is a Distributed Version Control System. Git helps you keep track of the changes you make to your code. It is basically the history tab for your code editor (with no incognito mode!). If at any point while coding you hit a fatal error and do not know what is causing it, you can always revert back to the stable state. So it is very helpful for debugging. Or you can simply see what changes you made to your code over time.
Fig. 2.5: A simple example of the Version History of a File
In the above example, all three cards represent different versions of the same file. We can select which version of the file we want to use at any point of time, so we can jump to and from any version of the file in the Git time continuum.
Git also helps you synchronize code between multiple people. So, imagine you and your friend are collaborating on a project. You both are working on the same project files. Now Git takes those changes you and your friend made independently and merges them into a single "master" repository. So, by using Git you can ensure you both are working on the most recent version of the repository, and you don't have to worry about mailing your files to each other and working with a number of copies of the original file.
Git Workflow:
Before we start working with Git commands, it is necessary that you understand what each element of the workflow represents.
What is a Repository?
A repository, a.k.a. repo, is nothing but a collection of source code.
There are four fundamental elements in the Git workflow: Working Directory, Staging Area, Local Repository and Remote Repository.
Fig. 2.6: Diagram of a simple Git Workflow
If you consider a file in your Working Directory, it can be in three possible states:
1. It can be staged: which means the files with the updated changes are marked to be committed to the local repository, but not yet committed.
2. It can be modified: which means the files with the updated changes are not yet stored in the local repository.
3. It can be committed: which means that the changes you made to your file are safely stored in the local repository.
Git Commands:
git add is a command used to add a file that is in the working directory to the staging area.
git commit is a command used to add all files that are staged to the local repository.
git push is a command used to add all committed files in the local repository to the remote repository. So in the remote repository, all files and changes will be visible to anyone with access to the remote repository.
git fetch is a command used to get files from the remote repository to the local repository, but not into the working directory.
git merge is a command used to get the files from the local repository into the working directory.
git pull is a command used to get files from the remote repository directly into the working directory. It is equivalent to a git fetch followed by a git merge.
Process to place a file in Git:
1. Make a GitHub account.
First, let's create your user account. (You can learn more about this in Create a repo - GitHub Docs.)
2. Create a new repository on GitHub.
3. Tell Git who you are.
Introduce yourself. Mention your Git username and email address, since every Git commit will use this information to identify you as the author.
$ git config --global user.name "YOUR_USERNAME"
$ git config --global user.email "im_satoshi@musk.com"
$ git config --global --list # To check the info you just provided
4. Generate/check your machine for existing SSH keys.
Using the SSH protocol, you can connect and authenticate to remote servers and services. With SSH keys, you can connect to GitHub without supplying your username or password at each visit.
If you did set up SSH, every Git command that has a link can use the SSH form instead:
Instead of: https://github.com/username/reponame
You use: git@github.com:username/reponame.git
[Note: You can use both ways alternatively.]
5. Let us use Git.
Now, locate the folder you want to place under Git in your terminal:
$ cd Desktop/folder_name
And to place it under Git, enter:
$ touch README.md # to create a README file for the repository
$ git init # initiates an empty git repository
Once you start making changes to your files and you save them, the files won't match the last version that was committed to Git. To see the changes you just made:
$ git diff # To show the file changes not yet staged
Revert back to the last committed version in the Git repo:
Now you can choose to revert back to the last committed version by entering:
$ git checkout .
OR for a specific file:
$ git checkout <filename>
View Commit History:
You can use the git log command to see the history of the commits you made to your files:
$ git log
Each time you make changes that you want to be reflected on GitHub, the following is the most common flow of commands:
$ git add .
$ git status # Lists all new or modified files to be committed
$ git commit -m "Second commit"
$ git push -u origin master
To propose your changes on GitHub, you can create a branch and open a pull request:
Fig. 2.8: Creating a branch on GitHub (Create branch: my-patch-1 from 'main'; View all branches)
3. Above the list of files, in the yellow banner, click Compare & pull request to create a pull request for the associated branch.
4. Use the base branch drop-down menu to select the branch you'd like to merge your changes into, and then use the compare branch drop-down menu to choose the topic branch you made your changes in.
5. Type a title and description for your pull request.
6. To create a pull request that is ready for review, click Create Pull Request. To create a draft pull request, use the drop-down and select Create Draft Pull Request, then click Draft Pull Request.
Collaborating:
So imagine you and your friend are collaborating on a project. You both are working on the same project files. Each time you make some changes and push them into the master repo, your friend has to pull the changes that you pushed into the Git repo. Meaning, to make sure you're working on the latest version of the Git repo each time you start working, a git pull command is the way to go.

2.7 GIT COMMAND LINE
To add a new file from the command line:
1. Open a terminal (or shell) window.
2. Use the "change directory" (cd) command to go to your GitLab project's folder. Run the cd DESTINATION command, changing DESTINATION to the location of your folder.
3. Choose a Git branch to work in. You can either:
• Create a new branch to add your file into. Don't submit changes directly to the default branch of your repository unless your project is very small and you're the only person working on it.
• Switch to an existing branch.
4. Copy the file into the appropriate directory in your project. Use your standard tool for copying files, such as Finder in macOS, or File Explorer in Windows.
5. In your terminal window, confirm that your file is present in the directory:
• Windows: Use the dir command.
• All other operating systems: Use the ls command. You should see the name of the file in the list shown.
6. Check the status of your file with the git status command. Your file's name should be red. Files listed in red are in your file system, but Git isn't tracking them yet.
7. Tell Git to track this file with the git add FILENAME command, replacing FILENAME with the name of your file.
8. Check the status of your file again with the git status command. Your file's name should be green. Files listed in green are tracked locally by Git, but still need to be committed and pushed.
9. Commit (save) your file to your local copy of your project's Git repository:
git commit -m "DESCRIBE COMMIT IN A FEW WORDS"
10. Push (send) your changes up to GitLab. In this command, origin refers to the copy of the repository stored at GitLab. Replace BRANCHNAME with the name of your branch:
git push origin BRANCHNAME
11. Git prepares, compresses, and sends the data. Lines from the remote repository (here, GitLab) are prefixed with remote: like this:
Enumerating objects: 9, done.
Counting objects: 100% (9/9), done.
Delta compression using up to 10 threads
Compressing objects: 100% (5/5), done.
Writing objects: 100% (5/5), 1.84 KiB | 1.84 MiB/s, done.
Total 5 (delta 3), reused 0 (delta 0), pack-reused 0
remote:
remote: To create a merge request for BRANCHNAME, visit:
remote: https://gitlab.com/gitlab-org/gitlab/-/merge_requests/new?merge_request%5Bsource_branch%5D=BRANCHNAME
remote: To https://gitlab.com/gitlab-org/gitlab.git
* [new branch] BRANCHNAME -> BRANCHNAME
branch 'BRANCHNAME' set up to track 'origin/BRANCHNAME'.
Your file is now copied from your local copy of the repository up to the remote repository at GitLab. To create a merge request, copy the link sent back from the remote repository and paste it into a browser window.

2.8 INSTALLING GIT
Git has a very light footprint for its installation. For most platforms, you can simply copy the binaries to a folder that is on the executable search $PATH.
Git is primarily written in C, which means there is a unique distribution for each supported platform. The Git installers can be found on a subpage of the official Git site (https://git-scm.com/downloads). There are several installers available there for those who don't want to go through the hassle of doing the installation manually.

2.9 INSTALLING ON LINUX
Download for Linux and Unix:
It is easiest to install Git on Linux using the preferred package manager of your Linux distribution.
Debian/Ubuntu:
For the latest stable version for your release of Debian/Ubuntu:
$ apt-get install git
For the latest version, add the Git PPA and install from it:
$ add-apt-repository ppa:git-core/ppa
$ apt update; apt install git
Fedora:
$ yum install git (up to Fedora 21)
$ dnf install git (Fedora 22 and later)

2.10 INSTALLING ON WINDOWS
1. Browse to the official Git website: https://git-scm.com/downloads
2. Click the download link for Windows and allow the download to complete.
3. Browse to the download location (or use the download shortcut in your browser). Double-click the file to extract and launch the installer.
Fig. 2.9: Download for Windows page of the official Git website (showing version 2.40.0)
Download for macOS:
There are several options for installing Git on macOS. Note that any non-source distributions are provided by third parties, and may not be up to date with the latest source release.
Install Homebrew if you don't already have it, then:
$ brew install git
Obtaining a Source Release:
If you prefer to download the Git code from source, or if you want the latest version of Git, visit Git's master repository. As of this writing, the master repository for Git sources is http://git.kernel.org in the pub/software/scm directory.
You can find a list of all the available versions at:
http://kernel.org/pub/software/scm/git
To begin the build, download the source code for version 1.6.0 (or later) and unpack it:
$ wget http://kernel.org/pub/software/scm/git/git-1.6.0.tar.gz
$ tar xzf git-1.6.0.tar.gz
$ cd git-1.6.0
Verify Git Installation:
Type the below command in a terminal:
$ git --version
git version 1.8.3.2

2.11 INITIAL SETUP
How to launch Git in Windows?
Git has two modes of use:
1. A bash scripting shell (or command line).
2. Graphical User Interface (GUI).
Launch Git Bash Shell:
To launch Git Bash, open the Windows Start menu, type git bash and press Enter.

2.12 GIT ESSENTIALS
Working with Git Commands
The Git Command Line:
Git is simple to use. Just type git. Without any arguments, Git lists its options and the most common subcommands:
Fig. 2.10: Git Commands
Viewing your commits:
The full list of changes since the beginning of time is available, even when disconnected from all networks:
$ git log
$ git log --since=yesterday
$ git log --since=2weeks
Stashing:
Git offers a useful feature for those times when your changes are in an incomplete state, you aren't ready to commit them, and you need to temporarily return to the last committed state (e.g. a fresh checkout). This feature is named stash; it pushes all your uncommitted changes aside so the working tree is clean again, and lets you restore them later.
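A minimal stash round-trip can look like the following sketch (repository, file names and messages are invented): the half-finished change disappears from the working tree and comes back on demand.

```shell
mkdir stash-demo && cd stash-demo
git init -q
echo "stable line" > main.txt
git add main.txt
git -c user.name="Dev" -c user.email="dev@example.com" \
    commit -q -m "Last good state"
echo "work in progress" >> main.txt        # an incomplete change
git -c user.name="Dev" -c user.email="dev@example.com" stash -q
cat main.txt                               # prints only "stable line" again
git stash pop -q                           # restore the unfinished change
cat main.txt                               # both lines are back
cd ..
```

Stash entries are stored as commits internally, which is why an author identity is supplied to git stash as well.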
Use the checkout command to check out the newly downloaded remote branch:
git checkout coworkers/feature_branch
[Note: checking out 'coworkers/feature_branch'.]
You are in 'detached HEAD' state. You can look around, make experimental changes and commit them, and you can discard any commits you make in this state without impacting any branches by performing another checkout.
If you want to create a new branch to retain the commits you create, you may do so (now or later) by using -b with the checkout command again.
Example: git checkout -b <new-branch-name>
The output from this checkout operation indicates that we are in a detached HEAD state. This is expected, and means that our HEAD ref is pointing to a ref that is not in sequence with our local history. Being that HEAD is pointed at the coworkers/feature_branch ref, we can create a new local branch from that ref. The 'detached HEAD' output shows us how to do this using the git checkout command:
git checkout -b local_feature_branch
Here we have created a new local branch named local_feature_branch. This updates HEAD to point at the latest remote content, and we can continue development on it from this point.
Synchronize origin with git fetch:
The following example walks through the typical workflow for synchronizing your local repository with the central repository's main branch.
git fetch origin
This will display the branches that were downloaded:
a1e8fb5..45e65a4 main -> origin/main
a1e8fb5..9e5ab1c develop -> origin/develop
* [new branch] some-feature -> origin/some-feature
The commits from these new remote branches are shown as squares instead of circles in the diagram below. As you can see, git fetch gives you access to the entire branch structure of another repository.
Fig. 2.14: Synchronize origin with git fetch (Origin/Main, Origin/Develop and Origin/Some-Feature alongside the local Main branch)
To see what has been added upstream, run a log comparing the local branch with the fetched one:
git log --oneline main..origin/main
To approve the changes and merge them into your local main branch, use the following commands:
git checkout main
git log origin/main
Then we can use git merge origin/main:
git merge origin/main
The origin/main and main branches now point to the same commit, and you are synchronized with the upstream developments.
2.15.2 git PULL
The git pull command is used to fetch and download content from a remote repository and immediately update the local repository to match that content. Merging remote upstream changes into your local repository is a common task in Git-based collaboration workflows.
The git pull command is actually a combination of two other commands, git fetch followed by git merge. In the first stage of operation, git pull will execute a git fetch scoped to the local branch that HEAD is pointed at. Once the content is downloaded, git pull will enter a merge workflow. A new merge commit will be created and HEAD updated to point at the new commit.
How does git pull work?
The git pull command first runs git fetch, which downloads content from the specified remote repository. Then a git merge is executed to merge the remote content refs and heads into a new local merge commit. Let us consider the following example to better demonstrate the pull and merging process. Assume we have a repository with a main branch and a remote origin.
Fig. 2.15: A repository with a Main branch and a Remote Origin/Main
In this situation, git pull will download all the changes from the point where the local and main branch separated. In this example, that point is E. The git pull
This command will fetch the diverged remote commits, which are A-B-C. The pull process will then create a new local merge commit containing the content of the diverged remote commits. You can think of git pull as Git's version of svn update. It is an easy way to synchronize your local repository with upstream changes. The following diagrams explain each step of the pulling process.

Fig. 2.16: Remote origin/main and the local main after a pull

In the above diagram, we can see the new commit H. This commit is a new merge commit that contains the contents of the remote A-B-C commits and has a combined log message. This example is one of a few git pull merging strategies. A --rebase option can be passed to git pull to use a rebase merging strategy instead of a merge commit. The next example will show how a rebase pull works. Assume that we are at the starting point of our first diagram, and we have executed git pull --rebase.

Fig. 2.17: A rebase pull

In this diagram, we can now see that a rebase pull does not create the new H commit. Instead, the rebase has copied the remote commits A-B-C and rewritten the local commits E-F-G to appear after them in the local origin/main commit history.

Fig. 2.18: Pulling Process

You start out thinking your repository is synchronized, but then git fetch reveals that origin's version of main has progressed since you last checked it. Then git merge immediately integrates the remote main into the local one.

Git pull and syncing:
git pull is one of many commands that claim the responsibility of 'syncing' remote content. The git remote command is used to specify what remote endpoints the syncing commands will operate on. The git push command is used to upload content to a remote repository.

The git fetch command can be confused with git pull. These two commands are both used to download remote content, but an important safety distinction can be made between them. git fetch can be considered the "safe" option, while git pull can be considered unsafe: git fetch will download the remote content and not alter the state of the local repository. Alternatively, git pull will download remote content and immediately attempt to change the local state to match that content. This may unintentionally cause the local repository to get into a conflicted state.

Common Options:
git pull <remote>
Fetches the specified remote's copy of the current branch and immediately merges it into the local copy. This is the same as git fetch <remote> followed by git merge origin/<current-branch>.
git pull --no-commit <remote>
Similar to the default invocation, fetches the remote content but does not create a new merge commit.
git pull --rebase <remote>
Same as the previous pull, but instead of using git merge to integrate the remote branch with the local one, uses git rebase.
git pull --verbose
Gives verbose output during a pull, displaying the content being downloaded and the merge details.

Pulling via Rebase:
The --rebase option can be used to ensure a linear history by preventing unnecessary merge commits. Many developers prefer rebasing over merging. In this sense, using git pull with the --rebase flag is even more like svn update than a plain git pull.
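The safety distinction above can be seen end-to-end in a small throwaway sandbox. This is a sketch: all paths, branch handling, and identities here are invented for the demonstration.

```shell
# A throwaway sandbox showing that `git fetch` downloads remote commits
# without touching the local branch, while `git pull` also integrates them.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q --bare origin.git          # a shared "remote"
git clone -q origin.git alice          # first developer
git clone -q origin.git bob            # second developer
cd alice
git config user.email alice@example.com
git config user.name Alice
echo one > file.txt
git add file.txt
git commit -qm "C1"
br=$(git symbolic-ref --short HEAD)    # default branch name (main or master)
git push -q origin "$br"
cd ../bob
git config user.email bob@example.com
git config user.name Bob
git fetch -q origin
# After fetch: the remote-tracking ref exists, but the local branch is untouched.
git rev-parse --verify -q "origin/$br" > /dev/null && echo "fetched"
git rev-parse --verify -q HEAD > /dev/null || echo "local branch untouched"
git pull -q origin "$br"
# After pull: C1 has been integrated into the local branch.
git rev-parse --verify -q HEAD > /dev/null && echo "pulled"
```

Running the script prints "fetched", then "local branch untouched", then "pulled": the fetch alone never moved Bob's branch.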
In fact, pulling with --rebase is such a common workflow that there is a dedicated configuration option for it:
git config --global branch.autosetuprebase always
After running that command, all git pull commands will integrate via git rebase instead of git merge.

When you merge two branches, you can sometimes get a conflict. For example, you and another developer unknowingly work on the same part of a file. The other developer pushes their changes to the remote repository. When you then pull them to your local repository, you get a merge conflict. But Git has a way to handle these conflicts, so you can see both sets of changes and decide which you want to keep.

Examples of Git Pull:
The following examples demonstrate how to use the git pull command in common situations:
1. Update your local repository's main branch using a rebase:
$ git checkout main
$ git pull --rebase origin
This simply moves your local changes onto the top of what everybody else has already contributed.
2. Create a new local branch:
$ git branch name
3. List all local branches:
$ git branch
4. Switch to a given local branch:
$ git checkout branchname
5. Merge changes from a branch into the local master:
$ git checkout master
$ git merge branchname

Integration with remote Repo:
Push your local changes to the remote repo, or pull from the remote repo to get the most recent changes.
To fetch the most recent updates from the remote repo into your local repo, and put them into your working directory:
$ git pull origin master
To put your changes from your local repo in the remote repo:
$ git push origin master

Git Configuration File:
Git's configuration files are all simple text files.
.git/config contains repository-specific configuration settings.
~/.gitconfig contains user-specific configuration settings.

2.16 BRANCHING
A branch is essentially a unique set of code changes with a unique name. Each repository can have one or more branches. The main branch, where all the changes eventually get merged into, is known as the master (main). Let's see how useful Git branches are with an example.
Assume that you need to work on a new feature for an application. To start work, you have to first create a unique branch.
While working, you get a request to make a quick change that needs to go live on the application today. But you haven't finished your new feature yet. So, what you can do is switch back to the master branch, make the change, and push it live.
Then you can switch back to your new feature branch and finish your work. When you're done, you merge the new feature branch into the master branch, and both the new feature and that quick change are kept.
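The quick-fix workflow described above can be sketched as a throwaway script. The file names and commit messages are invented for the sketch.

```shell
# Branch for a feature, hop back to master/main for an urgent fix, then
# finish the feature and merge so both changes are kept.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git config user.email dev@example.com
git config user.name Dev
echo base > app.txt
git add app.txt
git commit -qm "initial"
main=$(git symbolic-ref --short HEAD)   # master or main, depending on git config
git checkout -q -b new-feature          # create and switch to the feature branch
echo feature > feature.txt
git add feature.txt
git commit -qm "WIP feature"
git checkout -q "$main"                 # urgent request: back to the main branch
echo hotfix > hotfix.txt
git add hotfix.txt
git commit -qm "quick fix"              # in a real project, push this live now
git checkout -q new-feature             # resume and finish the feature
echo done >> feature.txt
git commit -qam "finish feature"
git checkout -q "$main"
git merge -q --no-edit new-feature      # both the quick fix and the feature are kept
ls
```

After the merge, the working tree contains app.txt, feature.txt, and hotfix.txt: nothing from either line of work was lost.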
2.18 THE BRANCHES
In Git, branches are a part of your everyday development process. A branch is a version of the repository that diverges from the main working project. It is a feature available in most modern version control systems.
A Git project can have more than one branch. These branches are a pointer to a snapshot of your changes. When you want to add a new feature or fix a bug, you spawn a new branch to summarize your changes. Refer to the following image for a clear idea of branching.

Fig. 2.20: Branching

This makes it harder for unstable code to get merged into the main code base, and it gives you the chance to clean up your future's history before merging it into the main branch.

Fig. 2.21: A repository with two isolated lines of development

The above diagram visualizes a repository with two isolated lines of development, one for a little feature, and one for a longer-running big feature. By developing them in branches, it is not only possible to work on both of them in parallel, but it also keeps the main branch free from questionable code.
The implementation behind Git branches is much more lightweight than other version control system models. Instead of copying files from directory to directory, Git stores a branch as a reference to a commit. In this sense, a branch represents the tip of a series of commits; it is not a container for commits. The history for a branch is extrapolated through the commit relationships.
Git branches are not like SVN branches. Git branches are an integral part of your everyday workflow, whereas SVN branches are only used to capture the occasional large-scale development effort.

Creating Branches:
It is important to understand that branches are just pointers to commits. When you create a branch, all Git needs to do is create a new pointer; it does not change the repository in any other way. If you start with a repository that looks like this:

Fig. 2.22

Then, you create a branch using the following command:
git branch crazy-experiment
The repository history remains unchanged. All you get is a new pointer to the current commit:

Fig. 2.23

Note that this only creates the new branch. To start adding commits to it, you need to select it with git checkout, and then use the standard git add and git commit commands.

Creating Remote Branches:
The git branch command also works on remote branches. In order to operate on remote branches, a remote repo must first be configured and added to the local repo config.
$ git remote add new-remote-repo https://bitbucket.com/user/repo.git
# Add remote repo to local repo config
$ git push <new-remote-repo> crazy-experiment
# Pushes the crazy-experiment branch to new-remote-repo
This command will push a copy of the local branch crazy-experiment to the remote repo <remote>.
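That a branch is nothing more than a new pointer can be checked directly in a scratch repository. A throwaway sketch:

```shell
# `git branch` only creates a new pointer: the commit history is unchanged
# and both refs name the same commit.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git config user.email dev@example.com
git config user.name Dev
echo x > f.txt
git add f.txt
git commit -qm "C0"
before=$(git rev-parse HEAD)
git branch crazy-experiment             # new pointer; nothing else changes
[ "$(git rev-parse crazy-experiment)" = "$before" ] && echo "same commit"
echo "$(git rev-list --count HEAD) commit(s) in history"
```

The script prints "same commit" followed by "1 commit(s) in history": creating the branch added a ref, not a commit.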
Deleting Branches:
Once you have finished working on a branch and have merged it into the main code base, you're free to delete the branch without losing any history:
git branch -d crazy-experiment
However, if the branch has not been merged, the above command will output an error message:
error: The branch 'crazy-experiment' is not fully merged. If you are sure you want to delete it, run 'git branch -D crazy-experiment'.
This protects you from losing access to that entire line of development. If you really want to delete the branch (e.g. it's a failed experiment), you can use the capital -D flag:
git branch -D crazy-experiment

git config --global user.email "[valid-email]"
Set an email address that will be associated with each history marker.
git config --global color.ui auto
Set automatic command line coloring for Git for easy reviewing.

SETUP and INIT:
Configuring user information, initializing and cloning repositories:
git init
Initialize an existing directory as a Git repository.
git clone [url]
Retrieve an entire repository from a hosted location via URL.

STAGE and SNAPSHOT:
Working with snapshots and the git staging area:
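The -d versus -D behaviour from the Deleting Branches section above can be reproduced in a scratch repository. A sketch with invented file and branch contents:

```shell
# `git branch -d` refuses to delete an unmerged branch; `-D` forces it.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git config user.email dev@example.com
git config user.name Dev
echo x > f.txt
git add f.txt
git commit -qm "C0"
main=$(git symbolic-ref --short HEAD)
git checkout -q -b crazy-experiment
echo y >> f.txt
git commit -qam "unmerged work"
git checkout -q "$main"
# -d refuses, because crazy-experiment has a commit the main branch lacks:
git branch -d crazy-experiment 2> /dev/null || echo "refused: not fully merged"
# -D force-deletes the unmerged branch anyway:
git branch -D crazy-experiment > /dev/null && echo "force-deleted"
```

The first deletion attempt fails with the error shown in the text; the forced deletion removes the branch and its unmerged commit becomes unreachable.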
Practice Questions
Q.I Answer the following questions in short.
1. What is Git?
2. State the types of version control systems.
3. What are three possible states of file in directory in git workflow?
4. What is the meaning of commands: push, commit?
5. Which command is used to check whether git is installed or not in Operating
system?
6. How to clone a remote repo to your current directory?
7. How to create and switch a local branch in Git?
8. Explain any two commands for tracking change in git.
Chef for Configuration Management

Objectives...
After learning this chapter you will be able to:
D Understand the creation of server, workstation, client and repository for chef configuration management.
D Understand actual working and handling of various chef commands for nodes and data bags creation.

Need of Chef:
3. If you want to move to a new office and get the same hardware and software setup at the new office, then system management will do all the work, but runtime errors may happen, so the chef tool can be used. Chef transfers infrastructure into code as shown in the following Fig. 3.1.

Fig. 3.1: Need of Chef (pipeline stages: BVT, Staging, AT, Continuous Deployment, Production)
3.1 OVERVIEW OF CHEF; COMMON CHEF TERMINOLOGY (SERVER, WORKSTATION, CLIENT, REPOSITORY etc.), SERVERS AND NODES, CHEF CONFIGURATION CONCEPTS
Chef is a declarative configuration management and automation platform used to translate infrastructure into code. This enables a development and deployment process with better testing, efficient and predictable deployments, centralized versioning, and reproducible environments across all servers.
Chef has a client-server architecture and it supports multiple platforms like Windows, Ubuntu, CentOS, Solaris etc. It can also be integrated with cloud platforms like AWS, Google Cloud Platform, OpenStack etc.
Before getting into Chef deeply, let us understand Configuration Management.
For example, a system engineer in an organization wants to deploy or update software or an operating system on more than hundreds of systems in the organization in one day. He can do it manually, but with so many errors: some software may crash while in progress, and sometimes it is impossible to revert back to the previous version. Such issues can be solved with Configuration Management.
Configuration Management keeps all software and hardware-related information of an organization and also repairs, deploys, and updates the whole application. With the help of Configuration Management, it is possible to do such tasks with multiple system administrators and developers who manage many servers and their applications. Chef is an important tool for configuration management in DevOps.

Benefits of Chef:
1. Speed up Software Delivery: When your infrastructure is automated, all the software requirements like testing, creating new environments for software deployments etc. become faster.
2. Risk Management: Chef lowers risk and improves compliance at all stages of deployment. It reduces conflicts between the development and production environments.
3. Increased Service Resiliency: By making the infrastructure automated, it monitors for bugs and errors before they occur. It can also recover from errors more quickly.
4. Cloud Acceptance: Chef can be easily adapted to a cloud environment, and the servers and infrastructure can be easily configured, installed and managed automatically by Chef.
5. Managing Data Centers and Cloud Environments: Chef can run on different platforms. Under Chef you can manage all cloud and on-premise platforms, including servers.
6. Efficient IT Operation and Workflow: Chef provides a pipeline for continuous deployment, starting from building to testing and all the way through delivery, monitoring, and troubleshooting.

Common Chef Terminology:
1. Chef Server: It contains all data related to cookbooks and recipes, and it stores the configuration metadata which describes each node in the Chef-client.
3.4.2 Check Node Details using Knife
Use the knife node subcommand to manage the nodes that exist on a Chef Infra Server. The following examples show how to use this knife subcommand:
1. List All Nodes: To list all nodes associated with your Chef Server:
knife node list
2. Show Node Details: To view detailed information about a specific node:
knife node show NODE_NAME
Replace NODE_NAME with the name of the node you want to view.
3. Search Nodes: You can use the knife search subcommand to search for nodes based on specific criteria. For example, to search for nodes with a specific role:
knife search node 'roles:ROLE_NAME'
Replace ROLE_NAME with the name of the role you are searching for.
4. Filter Nodes: You can filter the nodes displayed based on specific criteria. For example, to list nodes in a specific environment:
knife node list 'chef_environment:ENVIRONMENT_NAME'
Replace ENVIRONMENT_NAME with the name of the environment.
5. Node Attribute Values: Using the knife exec command, you can query and display specific attribute values of nodes. For example, to display the value of a specific attribute for all nodes:
knife exec -E "nodes.all { |n| puts n['attribute_name'] }"
Replace attribute_name with the attribute you want to display.
Remember to replace placeholders like NODE_NAME, ROLE_NAME, and ENVIRONMENT_NAME with your actual node, role, and environment names.
The knife command can be quite versatile. This command allows you to perform a wide range of actions related to Chef Server management. To explore more options and subcommands, you can refer to the official Chef documentation or use the --help flag with any knife subcommand to get detailed usage information:
knife SUBCOMMAND --help
For example,
knife node show --help

3.5 NODE OBJECTS AND SEARCH
In Chef, nodes are representations of individual servers or systems that are managed using the Chef Configuration Management tool. Nodes store information about the state of a server: its attributes, run lists, environment, and other relevant details. You interact with nodes to define how they should be configured and maintained. Here's a brief overview of node objects and how node search works in Chef.

Node Objects:
A node object in Chef is a JSON document that contains various pieces of information about a server, including:
o Node Name: A unique identifier for the node.
o Environment: The environment the node belongs to (e.g., development, production).
o Run List: A list of recipes and roles to be applied to the node.
o Attributes: Custom data that describes the node's desired configuration.
o Automatic Attributes: Information collected by the Chef client, like platform details, IP addresses, etc.
o Policy: Information about the policy applied to the node.
o Ohai Data: System information collected by the Ohai tool.
o Tags: User-defined tags to help organize nodes.

Node Search:
Node search in Chef is a powerful feature that allows you to find nodes based on specific criteria. You can use node search to identify nodes with certain attributes, roles, or any other data stored in the node object. The search results can then be used to manage and manipulate nodes more efficiently.
The knife search command is used to perform node searches. Here's how you can use it:
o To search for nodes with a specific role: "roles:my_role"
o To search for nodes in a specific environment: "chef_environment:my_environment"
o To search for nodes with a specific attribute: "my_attribute:attribute_value"
o To search for nodes with a combination of criteria: "role:web_server AND chef_environment:production"
You can also use regular expressions, range queries, and more advanced search techniques to narrow down your results.
Node search can be particularly helpful when managing a large infrastructure with many nodes, as it allows you to dynamically identify nodes and apply configurations based on their attributes and roles.

3.6 ENVIRONMENTS: HOW TO CREATE ENVIRONMENTS, ADD SERVERS TO ENVIRONMENTS?
It is always a good idea to have a separate environment for development, testing, and production. Chef enables grouping nodes into separate environments to support an ordered development flow.
For example, one environment may be called "testing" and another may be called "production". Since you don't want any code that is still in testing on your production machines, each machine can only be in one environment. You can then have one configuration for machines in your testing environment, and a completely different configuration for computers in production.
Additional environments can be created to reflect each organization's patterns and workflow, for example, creating Production, Staging, Testing, and Development environments. Generally, an environment is also associated with one (or more) cookbook versions.
Default Environment:
By default, an environment called "_default" is created. Each node will be placed into this environment unless another environment is specified. Environments can be created to tag a server as part of a process group.
Create Environments:
An environment can be created in four different ways:
o Create a Ruby file in the environments sub-directory of the chef-repo and then push it to the Chef Infra Server.
o Create a JSON file directly in the chef-repo and then push it to the Chef Infra Server.
o Using knife.
o Using the Chef Infra Server REST API.
Once an environment exists on the Chef Infra Server, a node can be associated with that environment using the chef_environment method.

3.7 ROLES: CREATE ROLES, ADD ROLES TO ORGANIZATION
A role defines specific patterns and processes across nodes in an organization as belonging to a single job function.
Each role has zero or more attributes. Each node has zero or more roles.
When a role runs on a node, its configuration details are compared to the attributes of that role. Then, the contents of the run-list of that role are applied to the node's configuration details.
When running Chef Infra Clients, each client combines its own attributes with the run-lists contained in each assigned role.
Role data is stored in two formats: as a Ruby file that contains a domain-specific language, or as JSON data.
How to use Roles in Chef?
1. Create a role and add the cookbooks into it.
2. Assign the role to each node, or bootstrap new nodes using roles.
3. Then run the list.

3.7.1 Create Roles
Method 1: In the Chef Server directly.
> knife role create client1
Add the run list, e.g. "recipe[nginx]", under "run_list", then save and exit. The role will be created in the Chef Server.
Example:
name "web_servers"
description "This role contains nodes, which act as web servers"
run_list "recipe[webserver]"
default_attributes 'ntp' => {
  'ntpdate' => {
    'disable' => true
  }
}
Let us download the role from the Chef server so we have it locally in the Chef repository.
> knife role show client1 -d -Fjson > roles/client1.json
Now, let us bootstrap the node using knife with roles.
> knife bootstrap --run-list "role[webserver]" --sudo hostname
Edit the roles in the Chef Server using the following command.
> knife role edit client1
Method 2: In the local repo under the chef-repo folder.
> vi webserver.rb
Example:
name "web_servers"
description "This role contains nodes, which act as web servers"
run_list "recipe[webserver]"
default_attributes 'ntp' => {
  'ntpdate' => {
    'disable' => true
  }
}
Then upload it to the Chef server using the following commands:
$ knife role from file path/to/role/file
$ knife role from file web_servers.rb
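As noted above, role data can equally be stored as JSON. The following is a sketch of the same web_servers role expressed in JSON form, built with plain Ruby; the chef_type and json_class markers are the conventional fields in Chef role JSON, but treat the exact field set as an assumption here rather than a definitive template.

```ruby
# Build the web_servers role from the example above as Chef-style role JSON.
require 'json'

role = {
  "name"               => "web_servers",
  "description"        => "This role contains nodes, which act as web servers",
  "chef_type"          => "role",
  "json_class"         => "Chef::Role",
  "run_list"           => ["recipe[webserver]"],
  "default_attributes" => { "ntp" => { "ntpdate" => { "disable" => true } } }
}

puts JSON.pretty_generate(role)
```

The printed JSON could then be saved as roles/web_servers.json and uploaded with knife role from file, mirroring the Ruby DSL version above.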
Assigning Roles to Nodes:
> knife node list
$ knife node edit node_name
OR
# Assign the role to a node called server:
$ knife node run_list add server 'role[web_servers]'
This will bring up the node's definition file, which will allow us to add a role to its run list:
{ "name": "client1", "chef_environment": "_default", "normal": { "tags": [] }, "run_list": [ "recipe[nginx]" ] }
For instance, we can replace our recipe with our role in this file:
{ "name": "client1", "chef_environment": "_default", "normal": { "tags": [] }, "run_list": [ "role[web_server]" ] }

Method 3: Using the Chef Automate UI.
Step 1: Create a Role
Step 3: Edit a Node and Roles
Fig. 3.3 (c)
Step 4: Run the knife command from the workstation.
$ knife ssh "role:webserver" "sudo chef-client"
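The run-list edit shown above, swapping recipe[nginx] for role[web_server] in client1's definition, can be sketched with plain Ruby and its JSON library, no Chef server required:

```ruby
# Load client1's JSON definition, replace the recipe with the role in the
# run_list, and print the edited document.
require 'json'

node = JSON.parse(<<~DOC)
  { "name": "client1", "chef_environment": "_default",
    "normal": { "tags": [] }, "run_list": [ "recipe[nginx]" ] }
DOC

node["run_list"] = ["role[web_server]"]   # replace the recipe with the role
puts JSON.generate(node)
```

This is exactly the change knife node edit performs when you overwrite the run_list entry in the editor.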
1. Default: A default attribute is an attribute that does not have a value set on the node. If a default attribute is not set in the default attribute file, the Chef-client will use a nil value for the attribute. You can override the default attributes just like any other attribute, which can also be set in the default attribute file.
2. Automatic: An automatic attribute is set by the Chef-client node itself during the Chef-client run. These server attributes are typically set based on information gathered from the node, such as the operating system type or platform. Automatic attributes can be overridden like any other attribute, but they cannot be set in the default attribute file.
3. Normal: This is the most common type of attribute list and is typically used when you want to set a specific value for an attribute on a node. The value for a normal attribute can be set in the default attribute file, or it can be overridden on a per-node basis.
4. Force_default: The value for this attribute list is always taken from the default attribute file. If the force_default attribute is set on a node, any other values set for that node are ignored. This can be useful if you want to ensure that all nodes in your environment have the same value for an attribute.
5. Override: An override attribute will take precedence over any other values that have been set for an attribute, including the default value. This type of attribute is often used when you need to quickly change the value of an attribute on a node run without having to edit the default attribute file.
6. Force_override: A force_override attribute list overrides any other attribute values, whether they are default values or override values. This type of attribute should be used sparingly, as it can make it difficult to track down the source of an attribute value.

3.8.2 Creating Custom Attributes
Users can create custom attributes for servers, device groups, customers, facilities, OS Build Plans, and software policies. Custom attribute values are string values.
To add, delete, or modify the value of a custom attribute for a server:
1. In the SA Client navigation pane, select the Devices tab.
2. Select the All Managed Servers node.
3. Select a server.
4. To view the custom attributes defined for the server, select Custom Attributes from the View drop-down selector. This displays all the custom attributes defined for the server.
5. Select Actions, or right click the server and select Open. This displays information about the server.
6. Select the Information tab in the navigation pane.
7. Select Custom Attributes in the navigation pane. This displays all the custom attributes defined for the server.
8. To add a new custom attribute, select the "+" icon and enter the name of the custom attribute.
9. To delete a custom attribute, select the custom attribute and select the "-" icon.
10. To change the value of a custom attribute, double click the value column in the appropriate row and enter the new value.
11. Select File, then Revert to discard all your changes.
12. Select File, then Save to save your changes.

3.8.3 Defining Attributes in Cookbooks
In order to create an attribute file, you will first need to create a new file with a ".rb" extension. You can do this using any text editor. Once you have created the file, you will need to add the following code:
default["cookbook_name"]["attribute_name"] = "attribute_value"
Replace "cookbook_name" with the name of the cookbook that contains the recipe you wish to override, and "attribute_name" with the variable name you wish to override. The "attribute_value" will be used to set the value of the variable. Once you have added the desired code to the file, save it and then upload it to your Chef server.

3.9 DATA BAGS: UNDERSTANDING THE DATA BAGS, CREATING AND MANAGING THE DATA BAGS, CREATING THE DATA BAGS USING CLI AND CHEF CONSOLE, SAMPLE DATA BAGS FOR CREATING USERS

3.9.1 Understanding the Data Bags
Data bags are a way to store and manage global data that can be used across nodes. They are essentially JSON data containers (optionally encrypted) used to store sensitive information, configuration settings, or any data that needs to be shared across nodes but shouldn't be exposed in plain text.
Data bags are commonly used to store items such as database connection strings, API keys, passwords, and other configuration data.

Fig. 3.4: Data bags contain shared, global data
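The precedence among the attribute types listed earlier (default through force_override) can be illustrated with a toy Ruby model. This is a sketch, not Chef's real implementation: it follows Chef's documented order default < force_default < normal < override < force_override, and it omits the automatic level, which always wins.

```ruby
# Toy model of attribute precedence: the highest-precedence level that
# actually sets a value wins.
PRECEDENCE = [:default, :force_default, :normal, :override, :force_override]

def effective_value(levels)
  PRECEDENCE.reverse_each { |lvl| return levels[lvl] if levels.key?(lvl) }
  nil   # the Chef-client uses nil when no level sets the attribute
end

puts effective_value(default: "pool.ntp.org").inspect
puts effective_value(default: "a", normal: "b").inspect
puts effective_value(default: "a", normal: "b", override: "c").inspect
```

With only a default set, the default wins; once a normal or override value is also set, the later levels take precedence, matching the descriptions of Override and Force_override above.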
3.9.2 Creating and Managing the Data Bags
The following commands show how data bags work in Chef:
1. Creating Data Bags using CLI: To create a new data bag, you use the knife command line tool or the Chef APIs. Data bags are typically organized by name, similar to directories in a file system.
knife data bag create BAG_NAME
2. Creating Data Bag Items: Inside a data bag, you store individual items. Each item is a JSON object that contains the data you want to store. For instance, if you are creating a data bag for database connection strings, each item might represent a different database.
knife data bag create BAG_NAME ITEM_NAME
3. Editing Data Bag Items: Once created, you can edit the data bag items using a text editor or directly through the command line using the knife tool.
knife data bag edit BAG_NAME ITEM_NAME
4. Uploading Data Bags: After creating and editing data bags and their items, you upload them to the Chef Server using the knife command.
knife data bag from file BAG_NAME ITEM_NAME.json
5. Accessing Data Bags in Recipes: In Chef recipes, you can access data bag items and their content. These items can be used to configure resources within your cookbooks.
# Load a data bag item
my_data = data_bag_item('BAG_NAME', 'ITEM_NAME')
# Access attributes within the data bag item
db_host = my_data['database']['host']
db_user = my_data['database']['username']
Data bags are especially useful for separating sensitive data from cookbooks and configurations, providing better security and separation of concerns. However, it is important to note that data bags are not inherently encrypted. They can be optionally encrypted to enhance security. When encrypted, data bag items can only be decrypted by nodes that have the decryption keys.
Data bags are treated as global variables stored as JSON data. They are indexed for searching and can be accessed during a search. For example, a data bag can store global variables such as an app's source URL, the instance's hostname, and the associated stack's VPC identifier.

3.9.3 Creating the Data Bags using Chef Console
1. Log in to the Chef Web console.
2. Navigate to "Policy" and then "Data Bags".
3. Click the "Create New Data Bag" button.
4. Enter the name of the data bag and click "Create Data Bag".
5. Inside the created data bag, click the "Create New Item" button.
6. Enter the item's name and provide the necessary data in JSON format.
7. Click "Create Item" to save the data bag item.

3.9.4 Sample Data Bags for Creating Users
The following example shows how you can create data bags to manage user accounts using Chef, including their usernames, UIDs, and SSH keys.
1. Create the Data Bag: Assuming you have the Chef Workstation set up, here is how you can create a data bag for user accounts using the knife command-line tool:
# Create the data bag
knife data bag create users
2. Create Data Bag Items: For each user, you will create a data bag item containing their information. Here is how you can create data bag items for two users, Alice and Bob:
# Create Alice's data bag item
knife data bag create users alice
knife data bag from file users alice.json
# Create Bob's data bag item
knife data bag create users bob
knife data bag from file users bob.json
3. Populate Data Bag Item JSON Files: Here is an example of JSON files for the alice.json and bob.json data bag items. These files contain information about the users, including their usernames, UIDs, and SSH keys.
alice.json:
{
  "id": "alice",
  "username": "alice",
  "uid": "1001",
  "ssh_keys": [
    "ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQ...",
    "ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQ..."
  ]
}
bob.json:
{
  "id": "bob",
  "username": "bob",
  "uid": "1002",
  "ssh_keys": [
    "ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQ...",
    "ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQ..."
  ]
}
In these JSON files, the ssh_keys field contains the SSH public keys for the users. You can replace the example keys with the actual public keys you want to use.
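Before uploading, a data bag item file like alice.json can be sanity-checked with plain Ruby. This is a sketch with a shortened key string; the required field list simply mirrors the fields the sample items above use.

```ruby
# Check that a data bag item has the fields the user-creation recipe expects.
require 'json'

item = JSON.parse(<<~DOC)
  {
    "id": "alice",
    "username": "alice",
    "uid": "1001",
    "ssh_keys": ["ssh-rsa AAAA... alice@laptop"]
  }
DOC

%w[id username uid ssh_keys].each do |field|
  raise "missing field: #{field}" unless item.key?(field)
end
puts "ok: #{item['username']} with #{item['ssh_keys'].length} key(s)"
```

A check like this catches a malformed item locally, before knife data bag from file pushes it to the server.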
Using Data Bag Items in Recipes:
You can use the data bag items in your Chef recipes to create user accounts and set up their SSH keys. Here is a simplified example of how you might apply the data bag items:
# In a recipe
users = data_bag('users')
users.each do |user_id|
  user = data_bag_item('users', user_id)
  username = user['username']
  uid      = user['uid']
  ssh_keys = user['ssh_keys']

  user username do
    uid uid
    home "/home/#{username}"
    shell "/bin/bash"
    manage_home true
  end

  directory "/home/#{username}/.ssh" do
    owner username
    group username
    mode '0700'
  end

  file "/home/#{username}/.ssh/authorized_keys" do
    content ssh_keys.join("\n")
    owner username
    group username
    mode '0600'
  end
end
The above example shows how you can create user accounts and set up their SSH keys based on the data bag items you created.

Summary
In the context of Configuration Management in IT and software development, "Chef" refers to a popular open-source tool called Chef.
Chef is used to automate the deployment, management, and configuration of software and infrastructure in a consistent and scalable manner.
It helps IT teams define and maintain the desired state of servers, applications, and systems, ensuring that they are configured correctly and consistently over time.
Chef uses code-based "recipes" and "cookbooks" to define how resources should be configured, and it can handle tasks such as installing software, managing user accounts, and configuring network settings. This automation helps streamline the process of managing complex IT environments, reducing manual errors and enhancing efficiency.

Check Your Understanding
1. What is Chef?
(a) A software recipe book (b) A configuration management tool
(c) A programming language (d) A cloud computing platform
2. Which component of Chef is responsible for storing and managing configuration data?
(a) Chef Workstation (b) Chef Node
(c) Chef Server (d) Chef Client
3. What is the primary purpose of a Chef recipe?
(a) To define server hardware specifications.
(b) To install software packages on nodes.
(c) To manage user authentication.
(d) To create virtual machine instances.
4. In Chef, what is a "Cookbook"?
(a) A collection of recipes, templates, and resources.
(b) A directory for storing SSH keys.
(c) A configuration file for the Chef Server.
(d) A tool for managing databases.
5. Which Chef component runs on nodes and interacts with the Chef Server?
(a) Chef Workstation (b) Chef Server
(c) Chef Node (d) Chef Client
6. What does "bootstrapping" refer to in the context of Chef?
(a) Grilling recipes over an open flame.
(b) Initializing a new server and connecting it to the Chef Server.
(c) Deploying virtual machines.
(d) Encrypting sensitive data.
DevOps : MCA [Management - Sem. IV]    Chef for Configuration Management
7. What does the term "role" represent in Chef?
   (a) A user's job title.
   (b) A type of cookbook.
   (c) A specific attribute of a node.
   (d) A way to define a server's function and configuration.
8. How does Chef use the concept of "idempotence" in its recipes?
   (a) To create complex data structures.
   (b) To ensure that resources are only configured if necessary.
   (c) To manage database schemas.
   (d) To handle user authentication.
9. What is the purpose of Chef's "attributes"?
   (a) To store user credentials.
   (b) To define the physical location of servers.
   (c) To configure the behavior of recipes and cookbooks.
   (d) To manage virtual machine instances.
10. What is a "data bag" in Chef?
   (a) A container for storing encrypted data.
   (b) A type of cookbook.
   (c) A configuration file for Chef Server.
   (d) A tool for managing databases.
Answers
1. (b)  2. (c)  3. (b)  4. (a)  5. (d)  6. (b)  7. (d)  8. (b)  9. (c)  10. (a)
Practice Questions
Q.I Answer the following questions in short.
1. What is Chef?
2. What is a recipe in Chef?
3. What is bootstrapping in Chef?
4. What is an attribute in Chef?
5. What is a role in Chef?
6. How does Chef ensure idempotence?
7. What are data bags in Chef?
8. How does Chef handle dependencies between cookbooks?
9. What is the role of Chef Server in the configuration process?
Q.II Answer the following questions.
1. What is Chef and how does it work?
2. What are cookbooks and recipes in Chef?
3. How does bootstrapping work in Chef?
4. Explain the role of Chef Server in configuration management.
5. How does the Chef Client work on nodes?
6. What are roles and how do they function in Chef?
7. Explain the use of attributes in Chef.
8. How does a data bag work in Chef and when should it be used?
9. Explain the process of searching for nodes in Chef.
10. What is the difference between a cookbook and a recipe in Chef?
11. How does Chef handle idempotence?
12. How do you manage dependencies between cookbooks in Chef?
13. Explain the concept of Environments in Chef.
14. How does Chef ensure security in managing sensitive data?
15. How do you integrate Chef with Version Control Systems?
Q.III Write short notes on:
1. Need of Chef
2. Benefits of Chef
3. Cookbooks
4. Data bags
5. Organization set up
6. knife command
Docker - Containers

Objectives...
After learning this chapter you will be able to:
O Understand how Docker helps build, test, and deploy applications quickly.
O Understand how to create Docker images.
O Understand Docker networking.
O Learn how to use volumes for persistent storage in Docker.
O Understand how to tag images.
O Understand the working of Docker Hub.

4.1 INTRODUCTION
When we are looking for a containerization solution that provides maximum compatibility in each environment with little or no configuration changes, Docker is a good solution: it enables us to create a snapshot of our application and all its dependencies, and then deploy this same snapshot in development, testing, and production. In this chapter, we are going to learn Docker basics along with networking concepts.

4.1.1 What is Docker?
In this chapter, we will provide an introduction to Docker and cover some of the key concepts and terminology used in the Docker ecosystem.
Docker is a tool designed to make it easier to create, deploy, and run applications by using containers. Containers allow a developer to package up an application with all of the parts it needs, such as libraries and other dependencies, and ship it all out as one package.
With Docker, developers can create containers that include all the necessary dependencies, libraries, and configuration needed to run an application, and then easily deploy and scale those containers across different environments, from development to production.
Fig. 4.2: Container (Docker Containers 1, 2, and 3, each with their Bins/Libs, running on the Docker Engine over the Host OS and Infrastructure)
C:\>docker version
Fig. 4.5 (a): Output of "docker version" on Windows. Client: Version 20.10.17, API version 1.41, Go version go1.17.11, Git commit 100c701, Built Mon Jun 6 23:09:02 2022, OS/Arch windows/amd64, Context default, Experimental true, cloud integration v1.0.24. Server: Docker Desktop 4.10.1 (82475), Engine Version 20.10.17, API version 1.41 (minimum version 1.12), Go version go1.17.11, Git commit a89b842, Built Mon Jun 6 23:01:23 2022, OS/Arch linux/amd64, Experimental false; containerd Version 1.6.6; runc Version 1.1.2; docker-init Version 0.19.0.
Fig. 4.5 (c): Output of "docker login" (Username: sheetalbhalgat, Password, Login Succeeded). The warning in the output notes that logging in with a password grants your terminal complete access to your account, and recommends logging in with a limited-privilege personal access token instead; see https://docs.docker.com/go/access-tokens/
4. Docker Logout: Log out from a registry. If no server is specified, the default registry (Docker Hub) is used.
   $ docker logout
5. Docker Search: Search Docker Hub for images.
   $ docker search [options] TERM
   The below example searches for images with a name containing 'ubuntu':
   $ docker search ubuntu
Fig. 4.5 (d): Output of "docker search ubuntu"
6. Docker Pull: Pull an image or a repository from a registry.
   $ docker pull [options] NAME[:TAG]
   Pull an image from Docker Hub: To download a particular image, or set of images (i.e., a repository), use Docker Pull. If no tag is provided, Docker Engine uses the :latest tag as a default. This command pulls the ubuntu:latest image:
   $ docker pull ubuntu
Fig. 4.5 (e): Output of "docker pull ubuntu" (Using default tag: latest; latest: Pulling from library/ubuntu; Pull complete; Digest: sha256:…; Status: Downloaded newer image for ubuntu:latest; docker.io/library/ubuntu:latest)
7. Docker Push: Push an image or a repository to a registry.
   $ docker push [OPTIONS] NAME[:TAG]
   The output lists each layer as Pushed, followed by the digest (sha256:…) and size of the pushed image.
Tag an image referenced by Name and Tag: To tag the local image "ubuntu" with tag "latest" into the "sheetalbhalgat" repository with tag "latest":
   $ docker tag ubuntu:latest sheetalbhalgat/ubuntu:latest
How to do "hello world" in Docker?
   $ docker run -it hello-world
The "Hello from Docker!" message confirms that the installation appears to be working correctly, and explains the steps Docker took to generate it: the Docker client contacted the Docker daemon; the daemon pulled the hello-world image from Docker Hub; the daemon created a new container from that image, which runs the executable that produces the output; and the daemon streamed that output to the Docker client, which sent it to your terminal.
Step 4: List and check the image.
   $ docker images
The output lists REPOSITORY, TAG, IMAGE ID, CREATED, and SIZE for each image.
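The Docker Pull entry above notes that, when no tag is given, Docker Engine falls back to :latest. That defaulting rule can be sketched as a small helper; the function name is ours for illustration (it is not a Docker command), and it only handles simple image names without a registry host or port:

```shell
# Append ":latest" when an image reference has no explicit tag,
# mirroring Docker Engine's default-tag behaviour for simple names.
with_default_tag() {
  case "$1" in
    *:*) printf '%s' "$1" ;;          # a tag is already present
    *)   printf '%s:latest' "$1" ;;   # default to :latest
  esac
}

with_default_tag ubuntu          # prints ubuntu:latest
with_default_tag ubuntu:20.04    # prints ubuntu:20.04
```

This is why docker pull ubuntu and docker pull ubuntu:latest fetch the same image.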
Step 9: Check the image in a web browser.
1. Go to https://registry.sb.upf.edu/
2. Login with your cluster credentials.
Fig. 4.8: Docker Registry Frontend, showing the details for the repository info/centos
Step 10: Convert the Docker image to a Singularity image.
Finally, if we have uploaded a Docker image, it will be necessary to convert the file format (from OCI to SIF). You can do it directly from the compute nodes:
   $ cd /home/user/
   $ singularity pull --docker-login docker://registry.sb.upf.edu/<research group>/<image>

4.4.2 Uploading Docker Image in AWS ECR
The Amazon ECR repository must exist before you push the image. Amazon ECR also provides a way to replicate your images to other repositories, across Regions in your own registry and across different accounts, by specifying a replication configuration in your private registry settings.
Step 1: Authenticate your Docker client to the Amazon ECR registry to which you intend to push your image. Authentication tokens must be obtained for each registry used, and tokens are valid for 12 hours. To authenticate Docker to an Amazon ECR registry, run the aws ecr get-login-password command. When passing the authentication token to the docker login command, use the value AWS for the username and specify the Amazon ECR registry URI you want to authenticate to. If authenticating to multiple registries, you must repeat the command for each registry.
   aws ecr get-login-password --region region | docker login --username AWS --password-stdin aws_account_id.dkr.ecr.region.amazonaws.com
Step 2: If your image repository doesn't exist in the registry you intend to push to yet, create it.
Step 3: Identify the local image to push. Run the docker images command to list the container images on your system. You can identify an image with the repository:tag value or the image ID in the resulting command output.
   docker images
Step 4: Tag your image with the Amazon ECR registry, repository, and optional image tag name combination to use. The registry format is aws_account_id.dkr.ecr.us-west-2.amazonaws.com. The repository name should match the repository that you created for your image. If you omit the image tag, we assume that the tag is latest.
Example: The following example tags a local image with the ID e9ae3c220b23 as aws_account_id.dkr.ecr.us-west-2.amazonaws.com/my-repository:tag.
   docker tag e9ae3c220b23 aws_account_id.dkr.ecr.us-west-2.amazonaws.com/my-repository:tag
Step 5: Push the image using the docker push command:
   docker push aws_account_id.dkr.ecr.us-west-2.amazonaws.com/my-repository:tag
Step 6 (Optional): Apply any additional tags to your image and push those tags to Amazon ECR by repeating Step 4 and Step 5.

4.4.3 Understanding the Containers
A container is a sandboxed process running on a host machine that is isolated from all other processes running on that host. That isolation leverages kernel namespaces and cgroups, features that have been in Linux for a long time. Docker makes these capabilities approachable and easy to use. A container is a runnable instance of an image. You can create, start, stop, move, or delete a container using the Docker API or CLI. You can connect a container to one or more networks, attach storage to it, or even create a new image based on its current state.
By default, a container is relatively well isolated from other containers and its host machine. You can control how isolated a container's network, storage, or other underlying subsystems are from other containers or from the host machine. A container is defined by its image as well as any configuration options you provide to it when you create or start it. When a container is removed, any changes to its state that aren't stored in persistent storage disappear.

4.4.4 Running Commands in Container
How does the docker run command work?
• The following command runs an ubuntu container, attaches interactively to your local command-line session, and runs /bin/bash.
   $ docker run -i -t ubuntu /bin/bash
When you run this command, the following happens (assuming you are using the default registry configuration):
1. If you do not have the ubuntu image locally, Docker pulls it from your configured registry, as though you had run docker pull ubuntu manually.
2. Docker creates a new container, as though you had run a docker container create command manually.
3. Docker allocates a read-write filesystem to the container, as its final layer. This allows a running container to create or modify files and directories in its local filesystem.
4. Docker creates a network interface to connect the container to the default network, since you didn't specify any networking options. This includes assigning an IP address to the container. By default, containers can connect to external networks using the host machine's network connection.
5. Docker starts the container and executes /bin/bash. Because the container is running interactively and attached to your terminal (due to the -i and -t flags), you can provide input using your keyboard while Docker logs the output to your terminal.
6. When you run exit to terminate the /bin/bash command, the container stops but is not removed. You can start it again or remove it.

4.4.5 Running Multiple Containers
With Docker compose, you can configure and start multiple containers with a single yaml file. For example, assume that you are working on a project that uses a MySQL database, Python for AI/ML, Node.js for real-time processing, and .NET for serving APIs. It would be cumbersome to set up such an environment for each team member. Docker can do this with compose.
How does Docker Compose work?
• Docker compose is a yaml file in which we can configure different types of services.
• Then, with a single command, all containers will be built and fired up.
There are three main steps involved in using compose:
1. Generate a Dockerfile for each project.
2. Set up services in the docker-compose.yml file.
3. Fire up the containers.

4.5 CUSTOM IMAGES
Dockerfile, Images, and Containers:
Dockerfile, Docker Images, and Docker Containers are three important terms that you need to understand while using Docker.
Fig. 4.9: Dockerfile, Images, and Containers (a Dockerfile is built into a Docker Image, which is run to become a Docker Container)
As you can see in the above diagram, when the Dockerfile is built, it becomes a Docker image, and when we run the Docker image, it finally becomes a Docker container.
(a) Dockerfile: A Dockerfile is a text document that contains all the commands that a user can call on the command line to assemble an image. So, Docker can build images automatically by reading the instructions from a Dockerfile. You can use docker build to create an automated build that executes several command-line instructions in succession.
(b) Docker Image: In layman's terms, a Docker image can be compared to a template that is used to create Docker containers. So, these read-only templates are the building blocks of a Docker container. You can use the docker run command to run the image and create a container. Docker images are stored in the Docker Registry. It can be either a user's local repository or a public repository like Docker Hub, which allows multiple users to collaborate in building an application.
(c) Docker Container: A Docker container is a running instance of a Docker image, as it holds the entire package needed to run the application. So, these are basically the ready applications created from Docker images, which is the ultimate utility of Docker. A customized docker image is created to run your application in a docker container. This customized docker image includes instructions that install specific packages and copy the code into the docker container.
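The three compose steps listed in section 4.4.5 can be sketched as a minimal docker-compose.yml. The service names, images, and settings below are illustrative assumptions, not part of the original example; a web service built from the project's own Dockerfile is paired with the MySQL database mentioned earlier:

```yaml
version: "3.7"
services:
  web:                       # step 1: this project provides its own Dockerfile
    build: .
    ports:
      - "3000:3000"
    depends_on:
      - db
  db:                        # a MySQL service, as in the team example above
    image: mysql:8
    environment:
      MYSQL_ROOT_PASSWORD: example
```

With this file in place (step 2), step 3 is a single command such as docker compose up, which builds and starts both containers together.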
Requirements to create Custom Images are as follows:
• Webdock cloud Ubuntu instance (18.04 or later)
• Shell (SSH) access to your server
• Docker installed on the Ubuntu instance
• Clone of a demo node.js project

4.5.1 Creating a Custom Image
Step 1: Writing a Dockerfile for the custom docker image.
Docker builds the docker image by reading the instructions from a text file. By default, Docker looks for a file named Dockerfile to build the docker image. A Dockerfile consists of instructions that are used to customize the docker image.
We will write a Dockerfile for a node.js application. For this, first go to the root directory of the node.js project.
   $ cd node-app
Create a Dockerfile using the following command in the terminal.
   $ touch Dockerfile
Open the Dockerfile in your favorite editor.
   $ nano Dockerfile
Dockerfile Instructions:
Format of a Dockerfile instruction:
   INSTRUCTION arguments
The instructions in a Dockerfile are not case sensitive, but it is convention to use UPPERCASE letters for instructions. Dockerfile builds the docker image by running the instructions in the order they are specified in the Dockerfile.
1. FROM instruction: A Dockerfile always starts with a FROM instruction, which specifies which base image will be used to create the custom docker image. For example, if you want to create a custom docker image for a node.js application, then the node base image will be used as follows:
   FROM node:14
   If you do not specify a version tag, by default it will use the node image with the latest tag. The base docker image will be pulled from DockerHub if not available locally.
2. WORKDIR instruction: The Dockerfile provides the WORKDIR instruction to set the working directory.
   WORKDIR /app
   The above instruction will set the working directory to /app inside the container. All the remaining instructions will be executed in this directory.
3. ADD or COPY instruction: The COPY and ADD instructions are used to copy data into the docker container. The COPY instruction is only used to copy data from the docker host to the docker container, while the ADD instruction can copy data from the docker host and from the web as well. Copy the source code into the docker container using the COPY instruction as follows:
   COPY . .
   This instruction will copy all the data from the working directory of the docker host to the working directory of the docker container.
4. RUN instruction: The RUN instruction is used to install new packages or run shell commands in the base docker image. For example, in order to install npm packages, the RUN instruction will be used as follows:
   RUN npm install
   The RUN instruction will run the command in the shell of the docker container.
5. EXPOSE instruction: The EXPOSE instruction is used to expose a port of a container. The port on which the application runs can be exposed using the EXPOSE instruction as follows:
   EXPOSE 3000
   Now the application running on port 3000 of the docker container will be accessible from the docker host when a container is launched using this docker image.
6. CMD and ENTRYPOINT instructions: The CMD and ENTRYPOINT instructions are used to execute shell commands inside the docker container when the container starts. The ENTRYPOINT instruction is used to provide the shell command that runs when the container starts; the default ENTRYPOINT for docker is /bin/sh -c. The CMD instruction is used to define the arguments passed to that shell command. In order to run /bin/sh -c node index.js inside the docker container at runtime, use the following CMD instruction:
   CMD ["node", "index.js"]
The following table contains the important Dockerfile instructions and their explanation.
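Putting the six instructions above together, a complete Dockerfile for the node-app example might look like the following sketch; the port and the start file are assumptions that should match your own project:

```dockerfile
# Base image with Node.js 14
FROM node:14
# Working directory inside the container
WORKDIR /app
# Copy the project source from the docker host into the container
COPY . .
# Install the npm dependencies
RUN npm install
# The port the application listens on
EXPOSE 3000
# Arguments passed to the default entrypoint when the container starts
CMD ["node", "index.js"]
```

Building it with docker build would then produce a custom image that starts the application whenever a container is launched from it.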
Example:
Assign a name and allocate a pseudo-TTY (--name, -it):
This example runs a container named ubuntu-container using the ubuntu:latest image. The -it means interactive terminal, and --name specifies the name of the container. It instructs Docker to allocate a pseudo-TTY connected to the container's stdin. It creates an interactive bash shell in the container.
   $ docker run --name ubuntu-container -it ubuntu
Fig. 4.10: C:\temp>docker run --name ubuntu-container -it ubuntu, giving a root shell inside the container

4.5.3 Publishing the Custom Image
• To publish a custom image, user will need to create an account on the Docker Hub signup webpage. Here user will provide a name, password, and email address for his account.
• Once user has created the account, he can push the image that he has previously created, to make it available for others to use. To do so, user will need the ID and the TAG of the "my-docker-whale" image.
• Run again the "docker images" command and note the ID and the TAG of his Docker image, e.g. a69f3f5e1a31.
• Now, with the following command, user has to prepare our Docker Image for its journey to the outside world (the accountname part of the command is the user's account name on the Docker Hub profile page):
   docker tag a69f3f5e1a31 accountname/my-docker-whale:latest
• Run the "docker images" command and verify the newly tagged image.
• Next, use the "docker login" command to log into the Docker Hub from the command line. The format for the login command is:
   docker login --username=yourhubusername --email=youremail@provider
• When prompted, enter user password and press the enter key. Now user can push the image to the newly created repository:
   docker push accountname/my-docker-whale
(a) Show both running and stopped containers: The docker ps command only shows running containers by default. To see all containers, use the -a option.
   $ docker ps -a
Fig. 4.11: C:\temp>docker ps -a, listing the CONTAINER ID, IMAGE, and CREATED columns for the ubuntu container
(b) Stop Docker Container: Stop one or more running containers. The main process inside the container will receive SIGTERM, and after a grace period, SIGKILL.
   $ docker stop [OPTIONS] CONTAINER [CONTAINER...]
   Stop a running container by using the following command:
   $ docker stop ubuntu-container
(c) Start Docker Container: Start one or more stopped containers.
   $ docker start [OPTIONS] CONTAINER [CONTAINER...]
   Start a stopped container by using the following command:
   $ docker start ubuntu-container
(d) Restart Docker Container: Restart one or more containers.
   $ docker restart [OPTIONS] CONTAINER [CONTAINER...]
   Restart a Docker container by using the following command:
   $ docker restart ubuntu-container
(e) Remove Docker Images: docker rmi: Remove one or more images.
   $ docker rmi [OPTIONS] IMAGE [IMAGE...]
   Remove a Docker image by id: The below example shows how to remove a Docker image by id.
   $ docker rmi b4f2cd35d4d4
Fig. 4.12: C:\temp>docker rmi b4f2cd35d4d4, showing the Untagged and Deleted digests
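Both the publishing workflow in 4.5.3 and docker rmi need an image's ID or tag, which the text reads off the docker images listing by hand. The same lookup can be scripted; the sample listing and the helper name below are our own illustration (the IDs are the ones used in the examples above, not output from a real machine):

```shell
# Sample `docker images` listing (illustrative, not real output)
sample='REPOSITORY        TAG      IMAGE ID       CREATED       SIZE
my-docker-whale   latest   a69f3f5e1a31   2 days ago    275MB
ubuntu            latest   b4f2cd35d4d4   3 weeks ago   69MB'

# Print the IMAGE ID column for a given repository name
image_id_for() {
  printf '%s\n' "$sample" | awk -v repo="$1" '$1 == repo { print $3 }'
}

image_id_for my-docker-whale   # prints a69f3f5e1a31
```

In a real session the listing would come from `docker images` itself rather than a saved string.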
4.6 DOCKER NETWORKING
4.6.1 Introduction
First, we should understand the workflow of Docker.
Fig. 4.13: The Workflow of Docker (project code and a Dockerfile produce a Docker image; images are pushed to Docker Hub; containers run from those images on staging and production servers or virtual machines)
As you can see in the above diagram, a developer writes code that stipulates application requirements or dependencies in an easy-to-write Docker File, and this Docker File produces Docker Images. So, whatever dependencies are required for a particular application are present in this image. Now, Docker Containers are nothing but the runtime instances of Docker Images. These images are uploaded onto the Docker Hub (a Git-like repository for Docker Images), which contains public/private repositories. From public repositories, you can pull your image as well, and can upload your own images onto the Docker Hub. Then, from Docker Hub, various teams such as Quality Assurance or Production teams will pull that image and prepare their own containers. These individual containers communicate with each other through a network to perform the required actions, and this is nothing but Docker Networking.
So, you can define Docker Networking as a communication passage through which the isolated containers communicate with each other in various situations to perform the required actions.
4.6.1.1 Goals of Docker Networking
Fig. 4.14: Goals of Docker Networking (Flexibility, Cross Platform, Scalability, Decentralized, User-Friendly, Support)
(i) Flexibility: Docker provides flexibility by enabling any number of applications on various platforms to communicate with each other.
(ii) Cross-platform: Docker can be easily used cross-platform, working across various servers with the help of Docker Swarm Clusters.
(iii) Scalability: Docker is a fully distributed network, which enables applications to grow and scale individually while ensuring performance.
(iv) Decentralized: Docker uses a decentralized network, which enables the capability to have the applications spread and highly available. In the event that a container or a host is suddenly missing from your pool of resources, you can either bring up an additional resource or pass over to services that are still available.
(v) User-friendly: Docker makes it easy to automate the deployment of services, making them easy to use in day-to-day life.
(vi) Support: Docker offers out-of-the-box support. The ability to use Docker Enterprise Edition and get all of the functionality easily and straightforwardly makes the Docker platform very easy to use.
4.6.1.2 Types of Docker Networks
There are various kinds of Docker networks, such as:
• Bridge Network
• Host Network
• None Network
• MACVLAN and IPVLAN Networks
• Overlay Network
Default Bridge Network:
After a fresh Docker installation, you can find a default bridge network up and running. We can see it by typing $ docker network ls:
Fig. 4.15 (a): Output of "docker network ls", listing the bridge, gitops, host, minikube, and none networks with their NETWORK ID, DRIVER (bridge or null), and SCOPE (local)
All Docker installations have this default bridge network. If you run a container, say, Nginx, it will be attached by default to the bridge network:
   $ docker run -dit --name nginx nginx:latest
By using the "inspect" command, you can check the containers running inside the network:
   $ docker network inspect bridge
Fig. 4.15 (b): Excerpt of "docker network inspect bridge", showing the nginx container's endpoint with MacAddress 02:42:ac:11:00:04 and IPv4Address 172.17.0.4/16
We can understand that containers running on the same bridge network are able to see each other using their IP addresses. On the other hand, the default bridge network does not support automatic service discovery.
User-defined Bridge Networks:
Using the Docker CLI, it is possible to create other networks. You can create a second bridge network using:
   docker network create my_bridge --driver bridge
Now, attach "busybox1" and "busybox2" to the same network:
   docker network connect my_bridge busybox1
   docker network connect my_bridge busybox2
Retry pinging "busybox1" using its name:
   docker exec -it busybox2 ping busybox1
What would happen if we used the containers' names instead of the IPs? On a user-defined bridge network the ping succeeds, because user-defined networks do provide automatic service discovery by container name.
4.6.3 Linking Containers
This image does not have any layers. All the expose instructions added to this image are metadata; there are no actual layers.
You can also get the number of available layers using the following command:
   docker image inspect -f '{{len .RootFS.Layers}}' debdutdeb/expose-demo:v1
You should see an output like this:
   $ docker image inspect -f '{{len .RootFS.Layers}}' debdutdeb/expose-demo:v1
   0
Fig. 4.17: Exposing Container Port in Docker (the container's port 80 mapped to port 80 of the host system)
In the above figure, you will see how the container's port 80 is mapped to port 80 of the host system. This way, the container is able to communicate with the outside world using the public IP address of the host system. On the other hand, the exposed ports cannot be accessed directly from outside the container world.
Remember the following points:
• Exposed ports are used for internal container communication, within the container world.
• Published ports are used for communicating with systems outside the container world.
Method 2: Via CLI or docker compose:
Sometimes application developers do not want to add an extra EXPOSE instruction in their Dockerfile. In such cases, to make sure other containers (through the Docker API) can detect the ports in use easily, you can expose multiple ports post-build, as part of the deployment process. You can either select the imperative method, i.e. the CLI, or the declarative method, i.e. the compose files.
(a) CLI Method: In this method, while creating a container, all you have to do is use the --expose option (as many times as needed) with the port number and, optionally, the protocol separated by a /, and Docker adds it to the container.
Example:
   docker container run \
     --expose 80 \
     --expose 90 \
     --expose 70/udp \
     -d --name port-expose busybox:latest sleep 1d
Here, by default, the busybox image doesn't expose any ports.
(b) Compose file Method: If you are using a compose file, you can add an array expose in the service definition. You can convert the previous deployment to a compose file like so:
   version: "3.7"
   services:
     PortExpose:
       image: busybox
       command: sleep 1d
       container_name: port-expose
       expose:
         - 80
         - 90
         - 70/udp
Once you have the container running, just like before, you can inspect it to know which ports are being exposed. The command looks similar:
   docker container inspect -f \
     '{{range $exposed, $_ := .NetworkSettings.Ports}}{{printf "%s\n" $exposed}}{{end}}' \
     port-expose
Sample Output:
   $ docker container inspect -f '{{range $exposed, $_ := .NetworkSettings.Ports}}{{printf "%s\n" $exposed}}{{end}}' port-expose
   70/udp
   80/tcp
   90/tcp

4.6.5 Container Routing
There are three steps for Container Routing:
1. A custom Docker network, named such that Docker adds it to the container first, making it the default route.
2. An iptables rule to mark packets coming out of that Docker network.
3. Policy-based routing on the host to route marked packets through the non-default interface.
Example of code:
   # create a new Docker-managed, bridged connection
   # 'avpn' because docker chooses the default route alphabetically
   DOCKER_SUBNET="172.57.0.0/16"
   docker network create --subnet=$DOCKER_SUBNET -d bridge \
     -o com.docker.network.bridge.name=docker_vpn avpn
   # mark packets from the docker_vpn interface during prerouting, to destine
   # them for non-default routing decisions
   # 0x25 is arbitrary, any hex (int mask) should work
   firewall-cmd --permanent --direct --add-rule ipv4 mangle PREROUTING 0 \
     -i docker_vpn ! -d $DOCKER_SUBNET -j MARK --set-mark 0x25
   # alternatively, for regular iptables:
   # iptables -t mangle -I PREROUTING -i docker_vpn ! -d $DOCKER_SUBNET -j MARK --set-mark 0x25
   # create new routing table; 100 is arbitrary, any integer 1-252
   echo "100 vpn" >> /etc/iproute2/rt_tables
   # configure rules for when to route packets using the new table
   ip rule add from all fwmark 0x25 lookup vpn
   # set up a different default route on the new routing table
   # this route can differ from the normal routing table's default route
   ip route add default via 10.17.0.1 dev tun0
   # connect the docker_vpn network to my container
   docker network connect docker_vpn my_container

Summary
• A common challenge for DevOps teams is managing an application's dependencies and technology stack across various cloud and development environments. As part of their routine tasks, they must keep the application operational and stable regardless of the underlying platform that it runs on. On the other hand, Development teams focus on releasing new features and updates. Unfortunately, these often compromise the application's stability by deploying code that introduces environment-dependent bugs.
• To avoid this inefficiency, organizations are increasingly adopting a containerized framework that allows designing a stable framework without adding:
   o Complexities
   o Security vulnerabilities
   o Operational loose ends
• Containerization is the process of packaging an application's code, with the dependencies, libraries, and configuration files that the application needs to launch and operate efficiently, into a standalone executable unit.
• Initially, containers didn't gain much prominence, mostly due to usability issues. However, since Docker entered the scene by addressing these challenges, containers have become practically mainstream.
Check Your Understanding
1. Who introduced Docker?
   (a) Linus Torvalds, Mark Zuckerberg and Brendan Eich
   (b) Kamel Founadi, Solomon Hykes, and Sebastien Pahl
   (c) Brendan Eich, Sebastien Pahl and Greg Duffy
   (d) Mike Cagney, Suhail Doshi and Chris Wanstrath
2. Which Markup Language is used to write Docker configuration files?
   (a) XML  (b) YAML
   (c) DHTML  (d) JSON
3. Which programming language is used to write Docker?
   (a) GO  (b) .NET
   (c) C++  (d) C
4. ____ are instances of Docker images that can be run using the Docker run command.
   (a) File  (b) Hub
   (c) Container  (d) Cloud
5. With which command can you see all the commands that were run with an image via a container?
9. Which of the following is not a container-based alternative to Docker?
   (a) Kubernetes  (b) CoreOS' rkt
   (c) Canonical's LXD  (d) Windows Server Containers
10. The Docker logo is
   (a) a butler  (b) a sailboat
   (c) an octocat  (d) a whale
Answers
1. (b)  2. (b)  3. (a)  4. (c)  5. (a)  6. (c)  7. (b)  8. (d)  9. (a)  10. (d)
Practice Questions
Q.I Answer the following questions in short.
1. What is docker?
2. What is the use of docker?
3. State the various components of docker architecture.
4. What is an image in docker?
5. What are containers?
6. What is the docker hub?
7. What is the docker container?
8. Which command is used to see the docker version?
9. State the command used to search the docker hub for images.
10. Which command is used to pull an image or repository from a registry?
11. What is the docker tag?
12. Which command is used to restart the docker container?
Q.II Answer the following questions.
1. How does docker work?
2. Write down use cases of docker.
3. Differentiate between Docker vs Virtual Machines.
4. Explain docker architecture with diagram.
container?
run command.
(a) history
(b)
D.
Explain the working of the docker
ref used in the docker tag.
(c) -a (d) hist O. Explain various commands
4. What is the primary advantage I. How to create custom images in the docker?
of using Docker?
(a) Improved application docker file instruction?
security (b) Improved application Which are various arøuments of the
.
O
5.1 INTRODUCTION
Maven is a popular open-source build tool developed by the Apache Group to build, publish, and deploy several projects at once for better project management. Maven is a project management and comprehension tool that provides developers a complete build lifecycle framework.
Project requirements can be built automatically by Maven. Maven can also be helpful in a collaborative work environment. Developer life becomes easy when Maven is used for project automation, as it helps in the report creation, checks, and testing phases.
Functions of Maven:
Maven provides ways for developers to manage the following:
o Builds
o Documentation
o Reporting
o Dependencies
o SCMs
o Releases
o Distribution
o Mailing list
Maven handles compilation, distribution, documentation, team collaboration and other build-related tasks seamlessly. Maven increases reusability and takes care of most of the build-related tasks.
5.2 MAVEN INSTALLATION
5.2.1 Installing Maven on Windows:
[Fig. 5.1 (a)]
2. Add JAVA_HOME and MAVEN_HOME in environment variables:
   Right click on MyComputer -> Properties -> Advanced System Settings -> Environment Variables -> click the New button.
   Now add MAVEN_HOME as the variable name and the path of Maven as the variable value. It must be the home directory of Maven, i.e. the outer directory of bin. For example, E:\apache-maven-3.1.1, as displayed below:
   [Fig. 5.1 (b): System Properties -> Environment Variables dialog]
3. Add the Maven path in the environment variable:
   o If the path is not set, click on the New tab, then set the path of Maven. If it is set, edit the path and append the path of Maven.
   o Here, we have installed JDK and its path is set by default, so we are going to append the path of Maven.
   o The path of Maven should be %MAVEN_HOME%\bin. For example, E:\apache-maven-3.1.1\bin.
5.2.2 Installing Maven on Ubuntu:
In the terminal, we run apt-cache search maven to get all the available Maven packages:
$ apt-cache search maven
libxmlbeans-maven-plugin-java-doc : Documentation for Maven XMLBeans Plugin.
maven : Java software project management and comprehension tool.
maven-debian-helper : Helper tools for building Debian packages with Maven.
maven2 : Java software project management and comprehension tool.
The maven package always comes with the latest Apache Maven.
We run the command sudo apt-get install maven to install the latest Maven:
$ sudo apt-get install maven
This will take a few minutes to download. Once downloaded, we can run mvn -version to verify our installation.
5.3 MAVEN BUILD REQUIREMENTS
Following steps will give an idea about the steps required to check requirements
before installing Maven.
mn7otere
Step 1 :
Verify Java installation in your machine.
First of all, open the console and execute a Java command based on the
operating system you are working on.
Task Command
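The check in Step 1 can also be scripted. Below is a minimal Python sketch — illustrative only, not from the book — that reports whether the java and mvn executables are available on the PATH:

```python
import shutil

def check_build_requirements(tools=("java", "mvn")):
    """Return a dict mapping each required tool to whether it is on PATH."""
    return {tool: shutil.which(tool) is not None for tool in tools}

for tool, found in check_build_requirements().items():
    print(f"{tool}: {'found' if found else 'MISSING'}")
```

The same idea works for any prerequisite list; pass a different tuple of tool names to check other build dependencies.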
5.4 MAVEN POM
When executing a task or goal, Maven looks for the POM in the current directory. It reads the POM, gets the needed configuration information, and then executes the goal.
Some of the configuration that can be specified in the POM are the following:
o project dependencies
o plugins
o goals
o build profiles
o project version
o developers
o mailing list
Before creating a POM, we should first decide the project group (groupId), its name (artifactId) and its version, as these attributes help in uniquely identifying the project in the repository.
o groupId: This is an id of the project's group, generally unique amongst an organization or a project. For example, a banking group com.company.bank can have all bank-related projects.
o artifactId: This is an id of the project, generally the name of the project. For example, consumer-banking. Along with the groupId, the artifactId defines the artifact's location within the repository.
o version: This is the version of the project. Along with the groupId, it is used within an artifact's repository to separate versions from each other. For example, com.company.bank:consumer-banking:1.0 and com.company.bank:consumer-banking:1.1.
POM Example:
<project xmlns = "http://maven.apache.org/POM/4.0.0"
   xmlns:xsi = "http://www.w3.org/2001/XMLSchema-instance"
   xsi:schemaLocation = "http://maven.apache.org/POM/4.0.0
   http://maven.apache.org/xsd/maven-4.0.0.xsd">
   <modelVersion>4.0.0</modelVersion>
   <groupId>com.companyname.project-group</groupId>
   <artifactId>project</artifactId>
   <version>1.0</version>
</project>
It should be noted that there should be a single POM file for each project.
o All POM files require the project element and three mandatory fields: groupId, artifactId, version.
o The project's notation in the repository is groupId:artifactId:version.
Create a pom.xml in any directory on your computer, using the content of the above mentioned example POM, and run the following command in that directory:
Command: mvn help:effective-pom
5.5 MAVEN BUILD LIFE CYCLE
A Build Lifecycle is a well-defined sequence of phases, which define the order in which the goals are to be executed. Here a phase represents a stage in the life cycle. As an example, a typical Maven Build Lifecycle consists of the following sequence of phases.
Table 5.2: Phases in Maven Build Life Cycle
Phase | Handles | Description
Prepare Resources | Resource copying | Resource copying can be customized in this phase.
Validate | Validating the information | Validates if the project is correct and if all necessary information is available.
Compile | Compilation | Source code compilation is done in this phase.
Test | Testing | Tests the compiled source code suitable for the testing framework.
Package | Packaging | This phase creates the JAR/WAR package as mentioned in the packaging in POM.xml.
Install | Installation | This phase installs the package in the local/remote maven repository.
Deploy | Deploying | Copies the final package to the remote repository.
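The phases above are cumulative: invoking a phase runs every phase before it in order. This ordering rule can be modeled in a few lines of Python — an illustrative sketch only, not Maven's actual implementation, using a simplified phase list:

```python
# Simplified model of Maven's build lifecycle ordering (illustrative only):
# running `mvn <phase>` executes every phase up to and including <phase>.
PHASES = ["validate", "compile", "test", "package", "install", "deploy"]

def phases_to_run(target):
    """Phases executed, in order, for `mvn <target>`."""
    if target not in PHASES:
        raise ValueError(f"unknown phase: {target}")
    return PHASES[: PHASES.index(target) + 1]

print(phases_to_run("package"))   # ['validate', 'compile', 'test', 'package']
```

Note that a goal such as dependency:copy-dependencies is not a phase; it runs only where it is invoked or bound, as discussed in the next section.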
DevOps : MCA [Management - Sem. IV]    Build Tool - Maven

There are always pre and post phases to register goals, which must run prior to, or after, a particular phase.
When Maven starts building a project, it steps through a defined sequence of phases and executes goals, which are registered with each phase, to complete the build process.
Maven has the following three standard lifecycles:
1. clean
2. default (or build)
3. site
A goal represents a specific task which contributes to the building and managing of a project. It may be bound to zero or more build phases. A goal not bound to any build phase could be executed outside of the build lifecycle by direct invocation.
The order of execution depends on the order in which the goal(s) and the build phase(s) are invoked. For example, consider the command below. The clean and package arguments are build phases, while dependency:copy-dependencies is a goal.
mvn clean dependency:copy-dependencies package
Here the clean phase will be executed first, followed by the dependency:copy-dependencies goal, and finally the package phase will be executed.
Clean Lifecycle:
When we execute the mvn post-clean command, Maven invokes the clean lifecycle consisting of the following phases:
o pre-clean
o clean
o post-clean
The Maven clean goal (clean:clean) is bound to the clean phase in the clean lifecycle. Its clean:clean goal deletes the output of a build by deleting the build directory. Thus, when the mvn clean command executes, Maven deletes the build directory.
We can customize this behavior by mentioning goals in any of the above phases of the clean life cycle. For example, attaching the maven-antrun-plugin:run goal to the pre-clean, clean, and post-clean phases allows us to echo text messages displaying the phases of the clean lifecycle. You can try running the mvn clean command, which will display pre-clean and clean. Nothing will be executed for the post-clean phase.
Default (or Build) Lifecycle:
This is the primary lifecycle of Maven and is used to build the application. It has the following 21 phases:
Table 5.3: Lifecycle Phases
Sr. No. | Lifecycle Phase and Description
1. validate — Validates whether the project is correct and all necessary information is available to complete the build process.
2. initialize — Initializes build state, for example set properties.
3. generate-sources — Generate any source code to be included in the compilation phase.
4. process-sources — Process the source code. For example, filter any value.
5. generate-resources — Generate resources to be included in the package.
6. process-resources — Copy and process the resources into the destination directory, ready for the packaging phase.
7. compile — Compile the source code of the project.
8. process-classes — Post-process the generated files from compilation. For example, to do bytecode enhancement/optimization on Java classes.
9. generate-test-sources — Generate any test source code to be included in the compilation phase.
10. process-test-sources — Process the test source code. For example, filter any values.
11. test-compile — Compile the test source code into the test destination directory.
12. process-test-classes — Process the generated files from test code compilation.
13. test — Run tests using a suitable unit testing framework (JUnit is one).
14. prepare-package — Perform any operations necessary to prepare a package before the actual packaging.
15. package — Take the compiled code and package it in its distributable format, such as a JAR, WAR, or EAR file.
16. pre-integration-test — Perform actions required before integration tests are executed. For example, setting up the required environment.
17. integration-test — Process and deploy the package if necessary into an environment where integration tests can be run.
18. post-integration-test — Perform actions required after integration tests have been executed. For example, cleaning up the environment.
19. verify — Run any checks to verify the package is valid and meets quality criteria.
20. install — Install the package into the local repository, which can be used as a dependency in other projects locally.
21. deploy — Copies the final package to the remote repository for sharing with other developers and projects.
5.6 MAVEN LOCAL REPOSITORY (.m2)
In Maven terminology, a repository is a directory where all the project jars, library jars, plugins or any other project-specific artifacts are stored and can be used by Maven easily.
Maven repositories are of three types. The following diagram gives an idea regarding these three types.
1. Local
2. Central
3. Remote
[Fig. 5.2 (a): Maven Repositories — the local repository on the developer machine, remote repositories (an organization's internal libraries), and the central repository on the Internet]
1. Local Repository:
• The Maven local repository is a folder location on your machine. It gets created when you run any maven command for the first time.
• The Maven local repository keeps all of your project's dependencies (library jars, plugin jars etc.). When you run a Maven build, Maven automatically downloads all the dependency jars into the local repository. It helps to avoid references to dependencies stored on a remote machine every time a project is built.
• The local repository by default gets created by Maven in the %USER_HOME% directory. To override the default location, mention another path in the Maven settings.xml file available at the %M2_HOME%\conf directory.
<settings xmlns = "http://maven.apache.org/SETTINGS/1.0.0"
   xmlns:xsi = "http://www.w3.org/2001/XMLSchema-instance"
   xsi:schemaLocation = "http://maven.apache.org/SETTINGS/1.0.0
   http://maven.apache.org/xsd/settings-1.0.0.xsd">
   <localRepository>C:/MyLocalRepository</localRepository>
</settings>
• When you run a Maven command, Maven will download dependencies to your custom path.
• By default, the maven local repository is the %USER_HOME%/.m2 directory. For example, C:\Users\SSS IT\.m2.
[Fig. 5.2 (b)]
Update/Location of Local Repository:
We can change the location of the maven local repository by changing the settings.xml file. It is located in MAVEN_HOME/conf/settings.xml, for example: E:\apache-maven-3.1.1\conf\settings.xml.
Let's see the default code of the settings.xml file.
settings.xml
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0"
   xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
   xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0
   http://maven.apache.org/xsd/settings-1.0.0.xsd">
   <!-- localRepository
    | The path to the local repository maven will use to store artifacts.
    | Default: ${user.home}/.m2/repository
   <localRepository>/path/to/local/repo</localRepository>
   -->
</settings>
Now change the path of the local repository. After changing the path of the local repository, it will look like this:
settings.xml
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0"
   xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
   xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0
   http://maven.apache.org/xsd/settings-1.0.0.xsd">
   <localRepository>e:/mavenlocalrepository</localRepository>
</settings>
• Now the path of the local repository is e:/mavenlocalrepository.
2. Central Repository:
The Maven central repository is the repository provided by the Maven community. It contains a large number of commonly used libraries.
When Maven does not find any dependency in the local repository, it starts searching in the central repository using the URL https://repo1.maven.org/maven2/.
Main concepts of the Central repository are as follows:
o This repository is managed by the Maven community.
o It is not required to be configured.
o It requires internet access to be searched.
To browse the content of the central maven repository, the maven community has provided the URL https://search.maven.org/#browse. Using it, a developer can search all the available libraries in the central repository.
5.7 MAVEN GLOBAL REPOSITORY
Sometimes, Maven does not find a mentioned dependency in the central repository either. It then stops the build process and outputs an error message to the console. To prevent such a situation, Maven provides the concept of a Remote Repository, which is the developer's own custom repository containing the required libraries or other project jars.
The Maven remote repository is located on the web. Some libraries can be missing from the central repository, such as the JBoss library etc., so we need to define a remote repository in the pom.xml file.
Following is the code to add the jUnit library in the pom.xml file.
pom.xml
<project xmlns="http://maven.apache.org/POM/4.0.0"
   xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
   xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
   http://maven.apache.org/xsd/maven-4.0.0.xsd">
   <modelVersion>4.0.0</modelVersion>
   <groupId>com.javatpoint.application1</groupId>
   <artifactId>my-application1</artifactId>
   <version>1.0</version>
   <packaging>jar</packaging>
   <name>Maven Quick Start Archetype</name>
   <url>http://maven.apache.org</url>
   <dependencies>
      <dependency>
         <groupId>junit</groupId>
         <artifactId>junit</artifactId>
         <version>4.8.2</version>
         <scope>test</scope>
      </dependency>
   </dependencies>
</project>
5.8 GROUP ID, ARTIFACT ID, SNAPSHOT
The groupId is a parameter indicating the group or individual that created a project, which is often a reversed company domain name.
The artifactId is the base package name used in the project, and we use the standard archetype.
Example: In order to build a simple Java project, let's run the following command:
mvn archetype:generate \
  -DgroupId=com.baeldung \
  -DartifactId=baeldung \
  -DarchetypeArtifactId=maven-archetype-quickstart \
  -DarchetypeVersion=1.4 \
  -DinteractiveMode=false
The maven groupId is used to uniquely identify our project from all other projects. The groupId follows the rules of Java package naming, so we can say that it should start with a reversed domain name that the organization controls. Maven does not enforce this rule; there are multiple legacy projects which did not follow the convention and instead use single-word group ids. It is very difficult to get a single-word group id approved for inclusion in the central repository of maven.
While using the maven group id, we can create multiple subgroups as per requirement. The best way to determine the granularity of the group id is to use the structure of the project. The project configuration is done by using the project object model, which is represented by a file named pom.xml. The pom describes the dependencies managed by the project, and it is also used in plugin configuration for building the software. The pom XML file also defines the relationships between multi-module projects.
Key Points:
• The pom.xml is maven's default XML; every pom inherits from the default or parent pom. This pom is nothing but the default pom which is inherited by default.
• Maven groupid uses the default pom for executing the relevant goal which was defined in a maven groupid.
Maven GroupId Naming:
At the time of working with the maven groupid, the important thing about the class files is that we don't need to pick their names ourselves; Maven takes their names automatically from a 1:1 mapping with the Java files. Maven asks us to pick only two names, so it is very simple for us. For defining the maven groupid naming, we need to follow the steps below:
1. In this step, we are creating the template of the project in the Spring Initializr. The figure below shows the template of the maven groupid naming project as follows:
   Group name - com.groupid
   Artifact - maven groupid
   Name - maven groupid
   Packaging - jar
   Java version - 8
   [Fig. 5.3 (a)]
2. In this step, we are extracting the file of the project and opening the same in the Spring tool suite.
   [Fig. 5.3 (b)]
3. In this step, while opening the project in Maven, we are checking the groupid in the pom.xml file.
   [screenshot: pom.xml opened in the IDE]
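The reverse-domain convention for group ids discussed above can be checked mechanically. The heuristic below is our own simplification for illustration — as noted, Maven itself does not enforce this rule:

```python
import re

# A package-style segment: lowercase letter followed by letters/digits/underscores.
SEGMENT = re.compile(r"^[a-z][a-z0-9_]*$")

def follows_reverse_domain(group_id):
    """True if the group id looks like a reversed domain name:
    at least two dot-separated, package-style segments."""
    parts = group_id.split(".")
    return len(parts) >= 2 and all(SEGMENT.match(p) for p in parts)

print(follows_reverse_domain("com.company.bank"))  # True
print(follows_reverse_domain("maven"))             # False (legacy single-word id)
```

Such a check could, for instance, be wired into a project scaffolding script to warn about single-word group ids before they reach a repository.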
Answers
1. (a)  2. (b)  3. (c)  4. (c)  5. (a)  6. (a)  7. (d)  8. (d)  9. (a)  10. (a)
Practice Questions
Q.I Answer the following questions in short.
1. What is Maven?
2. How to confirm whether Maven is installed or not?
3. What is POM?
4. State the configuration that can be specified in the POM.
5. What is GroupId?
Bibliography
Web Reference:
Chapter 1:
https://www.simplilearn.com
https://www.browserStack.com/guide
https://www.gavstech.com
https://www.atlassian.com
https://www.testsigma.com
https://www.netapp.com/devops-solutions/
https://www.pluralsight.com
https://docs.gitLab.com
https://www.techonthenet.com/linux
https://www.javatpoint.com
https://learn.microsoft.com/en-us/training/modules/introduction-to-devops/
https://www.spiceworks.com/tech/devops/articles/what-is-devops/
https://softobiz.com
Chapter 2:
https://git-scm.com
education.github.com
www.freecodecamp.org
https://www.atlassian.com/git/tutorials/syncing/git-pull
Chapter 3:
What is Chef? DevOps Tool For Configuration Management (intellipaat.com)
Chef Tutorials: Chef roles Tutorials and Example - DevopsSchool.com
Chapter 4:
https://docs.docker.com
https://docs.aws.amazon.com
Chapter 5:
https://www.tutorialspoint.com/maven