DevOps: Core Concepts & Practices

The document provides an overview of DevOps practices including: 1. DevOps aims to improve collaboration between development and operations teams through practices like continuous integration, delivery, and deployment. 2. Core values include culture, automation, measurement, and sharing to break down silos and improve feedback. 3. Continuous integration, delivery, and deployment are implemented through pipelines that automatically build, test, and deploy code changes.


01 Devops - Welcome

DevOps - helps developers and system administrators work together

02 Devops Basics:

The practice of ops and development engineers working together across the entire service lifecycle, from
design through the development process to production support. It replaces the model where separate
teams hand work off in sequence (1 code, 2 test, 3 deploy, 4 operate).

Five levels:

1 Values; 2 Principles; 3 Methods; 4 Practices; 5 Tools

Why would you care?

1 - High-performing IT organizations deploy more frequently, fail less, and recover faster.

2 - Lean management (reducing times within the production system as well as response time from
supply to customer) and continuous delivery practices help deliver value faster.

3 - High performance is achievable whether your apps are greenfield (early stage, not yet built),
brownfield (existing applications), or legacy (outdated/obsolete).

Meaning: Dev (code) - Ops (systems)

03 Core Values

CAMS Model (John Willis, Damon Edwards) - Culture, Automation, Measurement, Sharing

1) Culture - It's driven by behavior - mutual understanding of people and where they're coming from

IT was split in two: Development - Creating features

Operations: Maintaining stability


2) Automation - Not just automated tooling (even if it's a critical part of the journey) - people over
process over tools. But you have to understand the culture first

3) Measurement - MTTR (Mean Time to Repair: the average time required to repair a failed component),
cycle time (the actual time spent working to produce an item), costs, revenue, employee satisfaction.
Measurement helps engage the team around the overall goal; metrics show whether we've actually
improved anything

4) Sharing - Collaboration by sharing ideas and problems - Kaizen (discrete, continuous improvement) -
a feedback loop that continuously improves
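The MTTR metric above is simple enough to compute directly; a minimal sketch with hypothetical incident data (the timestamps are made up for illustration):

```python
from datetime import datetime, timedelta

def mean_time_to_repair(incidents):
    """MTTR: average time from failure detection to recovery."""
    repair_times = [recovered - detected for detected, recovered in incidents]
    return sum(repair_times, timedelta()) / len(repair_times)

# Hypothetical incident log: (detected, recovered) pairs.
incidents = [
    (datetime(2024, 1, 1, 10, 0), datetime(2024, 1, 1, 10, 30)),   # 30 min outage
    (datetime(2024, 1, 5, 14, 0), datetime(2024, 1, 5, 15, 30)),   # 90 min outage
]
print(mean_time_to_repair(incidents))  # average of 30 and 90 minutes -> 1:00:00
```

Tracking this number over time is what makes it useful: it shows whether process changes actually improve recovery.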

04 Principles - The Three Ways (Gene Kim): a) Systems Thinking; b) Amplifying Feedback Loops; c) A
Culture of Continuous Experimentation and Learning

a)Systems Thinking - Make sure to focus on the overall outcome of the entire Pipeline or Value Chain
(e.g improving performance in one area can cause the bottleneck to move to another area); more
application servers can overwhelm a database server

This overall flow runs from concept to cash: if you write the best software in the world but can't deliver
it to the customer so they can use it, you lose.

b) Amplifying Feedback Loops - Creating, shortening, and amplifying feedback loops between the parts
of the organization that are in the flow of this value chain

- Takes its own output into consideration when deciding what to do next

The later a bug is found, the more it costs: a bug found by the developer wastes the least time; one
found by the QA team goes through a ticketing system but still gets resolved; one found after customer
release wastes time and money for the same or worse outcome.

c) The culture of continuous experimentation and learning - actively trying out what works and what
doesn't - the focus is on learning by repetition and practice, and on picking up and trying new skills

05 Continuous Integration (CI) and Continuous Delivery & Deployment (CD)

CI - The practice of automatically building and unit testing the entire application frequently, ideally on
every source code check-in

Continuous Delivery - The practice of deploying every change to a production-like environment and
performing automated integration and acceptance testing (Code -> Build -> Unit Tests -> Code
Validation -> Packaging -> Artifact)

Continuous Deployment - The practice of automatically deploying every build to production after it
passes the automated tests
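The staged flow above can be sketched as a sequence of steps where any failure stops the pipeline. This is a toy model of the idea, not a real CI tool; the stage names and lambdas are illustrative:

```python
def run_pipeline(stages):
    """Run stages in order; stop at the first failure (fail fast)."""
    results = []
    for name, step in stages:
        ok = step()
        results.append((name, ok))
        if not ok:
            break  # never deploy on top of a broken build
    return results

# Hypothetical stages; each callable returns True on success.
stages = [
    ("build", lambda: True),
    ("unit-tests", lambda: True),
    ("package", lambda: True),
    ("deploy", lambda: True),
]
print(run_pipeline(stages))
```

The fail-fast `break` is the essential property: a failed unit test means packaging and deployment never run at all.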

06 Benefits of Continuous Delivery

5 Benefits:

1) Empowering Teams - Pipeline is a self-service system that makes the process of software delivery
transparent and understandable to the entire team

2) Lowered Cycle Times - The time from code commit to running in production (with older methods,
measured in weeks/months; now hours/minutes)

3) Better Security

4) Rhythm of Practice - Removes stress from Dev/Ops - release dates are no longer a big event

5) More Time to be Productive - Deliver more value faster (adding new features, feature development) -
less time reworking features

07 Building Pipelines in Practice (CD)

Tool-wise: you start with a source code repository, whose job is to keep code safe and version it so
people don't overwrite each other's changes. You then have a CI system (Jenkins, Bamboo, TeamCity,
CircleCI) that watches the repository and triggers whatever build is required when it changes, invoking a
build tool (Gulp, Packer, Make/Rake, Maven). Its job is to run unit tests, build the code, and provide
feedback and visibility into the build process. The build tool is basically the compiler wrapper for the
language you're using; it lets developers build on their local machines and keeps as much build logic as
possible out of the CI console configuration.

Unit tests (JUnit for Java; code hygiene linters/formatters such as Golint/Gofmt, or RuboCop for Ruby)
are basic tests that verify the code is doing its job - the first line of defence that the build is working as
expected. The next step is packaging into an artifact (for Java code, a .jar file). Once the code is
packaged, it goes into the artifact repository (a storage service such as Amazon S3). A deployment tool
(Deployinator, Rundeck, UrbanCode, Thoughtworks) deploys the build into both the test and production
environments. First you run it in the test (CI) environment and deploy integration tests (Robot,
Protractor, Cucumber) designed to exercise a real running service in a real environment - an end-to-end
stage that may include a manual testing component. Then you deploy the same artifact that passed
testing, with the same deployment tool, to the production environment.

08 Small + fast is better

Continuous Integration - The practice of automatically building and unit testing the entire app
frequently, on every code check-in

Continuous Delivery - The additional practice of deploying every change to a prod-like environment and
performing automated integration and acceptance testing (Code -> Build -> Unit Tests -> Code
Validation -> Packaging -> Artifact)

Continuous Deployment - Every change goes through fully automated testing and is deployed
automatically to production (Facebook, Etsy)

Benefits: 1) Time to market goes down: rapid experimentation and market testing of ideas

2) Quality increases, not decreases: high-performing IT organizations see a 3x lower change failure rate

3) Continuous delivery limits your work in progress - evaluate and deploy changes one by one, testing
every commit and making sure the software stays in a runnable state

4) Shortens lead times for changes - shrinking the unintegrated changes that are the software
equivalent of WIP

5) Improves mean time to recover - less than one hour (versus 1 to 6 months for competitors)

09 Continuous Integration Practices (6 practices)

(Code -> Build -> Unit Tests -> Code Validation -> Packaging -> Artifact)

1) Build should pass the coffee test (<5 minutes) - less time than it takes to get a coffee

2)Commit really small bits - Smaller changes are easier to reason about and isolate failures

3) Don't leave the build broken - It blocks delivery - delay meetings until the build is fixed

4) Use a trunk-based development flow - rather than branch-based: work off trunk (master) and commit
back to trunk, using feature flags instead of long-lived branches. This reduces the amount of WIP and
ensures the code is integrated frequently

5) Don't allow flaky tests - fix them

6) The build should return a status (pass/fail), log (all results of the run) and artifact (tagged with the
build number)
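Practice 6 - every build returns a status, a log, and an artifact tagged with the build number - can be sketched as a simple result structure (the names and the .jar convention here are illustrative, not from any specific CI tool):

```python
from dataclasses import dataclass, field

@dataclass
class BuildResult:
    build_number: int
    passed: bool                              # status: pass/fail
    log: list = field(default_factory=list)   # all results of the run

    @property
    def artifact(self):
        # Artifact tagged with the build number, so any deployed binary
        # can be traced back to the exact build (and commit) that produced it.
        return f"myapp-{self.build_number}.jar"

result = BuildResult(build_number=42, passed=True, log=["unit tests: ok"])
print(result.artifact)  # myapp-42.jar
```

The build-number tag is what lets the later pipeline stages promote one specific artifact rather than rebuilding.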
10 Continuous Delivery Pipeline (CD)

Continuous Delivery - Deploy every change to a production-like environment and perform automated
integration and acceptance testing along the way

1) Only build artifacts once - then use the same artifact in all environments. This ensures the testing
steps are valid, since every environment runs the same artifact

2) Artifacts should be immutable - one way to enforce this is to make the repository write-once and give
the deployer only read access to the artifact

This builds trust between the teams - all have confidence that the underlying bits didn't change
between the different stages.

How it works: code is checked into the version control system; that commit triggers a build in the CI
system. Next comes the deployment workflow - builds go onto the shelf (the artifact repository), then
the deploy goes to stage. Smoke testing, integration testing, and acceptance testing all happen there.
Once it passes all those tests, the artifact is released and deployed to the production environment

3) Deployment should go to a copy of production - should match all the data

4) Stop deploys if a previous step fails - don't continue unless the system is in the expected state.

5) Deployment should be idempotent - Redeployment should leave the system in the same state
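Idempotency means redeploying the same version leaves the system unchanged. A minimal sketch of the idea, assuming a simple version-keyed state (the app/version names are made up):

```python
def deploy(state, app, version):
    """Idempotent deploy: only act if the desired version isn't already running."""
    if state.get(app) == version:
        return state, "no-op"      # redeployment leaves the system in the same state
    new_state = {**state, app: version}
    return new_state, "deployed"

state = {}
state, action1 = deploy(state, "web", "1.4.2")
state, action2 = deploy(state, "web", "1.4.2")  # second run changes nothing
print(action1, action2)  # deployed no-op
```

Checking desired state against current state before acting (rather than blindly re-running install steps) is what makes repeated runs safe.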

11 The role of QA in Devops

Types of testing:

1) Unit Testing - at the lowest level of the language/framework it supports; validates the base
functionality of code on the dev's machine

2) Code Hygiene - linting (static code analysis used to flag programming errors, bugs, stylistic errors,
and suspicious constructs), code formatting, banned-function checks

3) Integration Testing - similar to unit testing, but performed with all the app's components and
dependencies operating in the test environment

4) Security Testing - Simulated attack test

5) TDD (test-driven development) - start writing tests before writing code. Start with the outcome as a
test, write code to pass the test, repeat. The tests are built along with the app.

BDD (behavior-driven development) - works with the business stakeholders to describe the business
functionality of the app; tests are based on natural-language descriptions.

ATDD (acceptance-test-driven development) - testing from an outside perspective. Builds on TDD/BDD;
involves defining scenarios from the end user's point of view, building automated tests on those, and
testing continuously during development.
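The TDD loop described above (write a failing test, write just enough code to pass it, repeat) looks like this in miniature, using Python's built-in unittest; the slugify function is a made-up example, not from the source:

```python
import unittest

# Step 2: the minimal code written to make the tests pass.
def slugify(title):
    return title.strip().lower().replace(" ", "-")

# Step 1: the tests are written first, starting from the desired outcome.
class TestSlugify(unittest.TestCase):
    def test_spaces_become_hyphens(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_whitespace_is_trimmed(self):
        self.assertEqual(slugify("  Hello  "), "hello")

# Step 3: run the tests; a failure sends you back to step 2.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestSlugify)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Each new requirement (say, collapsing repeated spaces) starts as another failing test case, which is what keeps the test suite growing along with the app.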

Techniques for Dealing with Slow Tests - Use nonblocking tests in your pipeline

Use time-scheduling testing (nightly test suite)

Use monitoring to accomplish testing goals

6) Infrastructure Testing - starting up a host, running the config mgmt code and running the tasks and
turning it all off.

7) Performance testing - load test (response time under the designed load on a system), stress test,
soak test (stability and response time under sustained load over a long period), spike test (sudden and
extreme increase or decrease in load)
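A load test in its simplest form just applies the designed load and records response times. A toy sketch of that loop, timing a stand-in local function rather than a real service:

```python
import time

def load_test(request_fn, n_requests):
    """Apply n requests and record each response time in milliseconds."""
    timings = []
    for _ in range(n_requests):
        start = time.perf_counter()
        request_fn()
        timings.append((time.perf_counter() - start) * 1000)
    return timings

# Hypothetical "service": a function standing in for an HTTP call.
def fake_request():
    sum(range(1000))

timings = load_test(fake_request, 100)
print(f"max response: {max(timings):.3f} ms")
```

A stress test is the same loop with the load pushed past the designed capacity; a spike test changes the request rate suddenly instead of holding it steady.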

12 DevOps - Your CI toolchain (Tools)

1 - Version Control; 2 CI Systems ; 3 Build, 4- Test, 5 Artifact Repository ; 6 - Deployment

1) Version Control - where we commit code changes - e.g Github or Bitbucket

2) Continuous Integration Systems - Jenkins, GoCD, Bamboo, TeamCity - TravisCI, CircleCI

3) Build - Make/Rake (Ruby); Maven (workflow approach from the dev desktop); Testing on Frontend
(Gulp, Packer for infrastructure)

4) Unit Testing - JUnit for Java; Code Hygiene Linters/Formatters - Golint/Gofmt, RuboCop for Ruby

4) Integration Testing - using frameworks or inhouse scripts - Robot, Protractor, Cucumber - can hook up
into Selenium for UI testing

4) Performance Testing - ApacheBench, JMeter

4) Security Testing - Brakeman, Veracode

5) Artifact Repository - Artifactory, Nexus - manage diff artifacts formats, Docker Hub, AWS S3

6) Deployment - Rundeck, UrbanCode, ThoughtWorks, Deployinator (Etsy)


------------------------------------------------------------------

CORE 01 - Software Testing

I Introduction to Software Development and Testing - What is Software Testing and why do we need it?

1. Check expectations of how the software should work - User Want/Needs

2. Investigate unknowns and the different types of risks that affect the experience and value

Example:

Comment on a website - No blank comment/allow numbers/allow letters

Edge Cases - Emojis/Symbols/WhiteSpaces

First part of testing - checking expectations of how the comments should work based on requirements

Second Part - Investigate other kinds of risks and uncovering information about potential problems

Why Do we Test?

1. Discover Problems

2. Discover Risks - Security Problems

3. To assess quality - whether the software meets expectations, and the goodness of the
experience of using the software

Things can go wrong with Software:

- Errors in Code / Lack of understanding of User's Needs

e.g. a Facebook bug exposed photos users had not posted; a BBC News software glitch meant a
broadcast couldn't go live and they used prerecorded videos

1. Testing is easy - misconception

2. Testing is just checking that the software works as expected - misconception

3. Testing assures quality - misconception; testing doesn't ensure quality but can improve it

II How do we test software?

1. Executing Test Cases - What happens?

- predefined steps including actions and outcomes

- outcomes are based on business rules


- execute a step, observe the system and compare against outcomes

Who is involved?

-testers and developers for automation testing

2. Exploratory Testing

What happens?

-The person testing has freedom to test at will

-generate a test idea, execute it, observe what happens and then repeat

-charters can help guide what testing is carried out

Who is involved?

- Testers & Developers

3. Ad hoc testing

What happens?

- unstructured testing with no plan

- ad hoc is done without any prior planning

- carried out when a lot of change is happening

Who is involved?

- Developer

-maybe testers to dive in and learn something about the product

4. Monitoring

What happens?

- released code is regularly checked by tools

- is it still up? any errors? how are users behaving?

- how you monitor depends on what you have built (e.g. a website is easier to monitor than a
medical device)

Who is involved?

- Ops/Infrastructure

-Developer

- Tester - influences ideas on what to monitor and uses data to plan future testing

5. UAT - User Acceptance Testing

What happens?

- Does the product match what the end user or business wants?

- Has it met specific requirements? Design? User Expectations?

Who is involved?

- Product owner

- Business Analyst

-Tester

-Designer

6. Automation

What happens?

-The process of building tools and using tools to support testing

- Automation can be used in many activities to support teams

- Automated test cases, tools for exploratory testing, and monitoring tools

- supports testing, doesn't replace it

Who is involved?

- Developer

- Tester

- Ops/Infrastructure

REPORTING Testing

1. Raising Bugs

What happens?

- reports details on problems that occur in a system

- details such as how to reproduce, what happened, logs, videos, screenshots

- It's important to add lots of useful information

Who is involved?

- testers
- developers

- product owner

- business analyst

- project manager

2. Reporting Metrics

What happens?

- Counting bugs, Test Cases, ET sessions to learn how testing is going

- Not the most accurate way of reporting, and misleading at times (e.g. 30 minor bugs may
matter less than 1 issue that breaks the whole app)

-Can be useful to get a general idea of testing

Who is involved?

- Project Manager

- Tester

a) Exploratory testing debriefs

What happens?

-One to one interviews after an ET session is run

-Useful in reviewing how the session went

-Can be used to raise issues and review bugs

Who is involved?

- Tester - is debriefed

- debriefing to Product Owner/Business Analyst/Developers

b) Formal Reporting

What happens?

-Required for regulatory reasons or company policy

- Teams may be required to provide detailed evidence of their testing

- These reports might have to follow a standardised report template

Who is involved?

- Tester
- Project Manager

TESTING REQUIREMENTS OR IDEAS

a) Discussing and questioning requirements

- Requirements come in many forms

- We ask questions to help understand requirements better

- We can ask questions, share examples, create sample diagrams, or more

- Who is involved? Developer, Tester, Product Owner, Business Analyst, Designer

"Three Amigos" - where Product Owner/BA, Developer, Tester is discussing testing requirements or
ideas

b) Reviewing designs or diagrams

- The format of requirements may vary (style guides, visual designs)

- These can be explored or questioned

- Who is involved? Developer, Product Owner, Business Analyst, Designer, Tester

OTHER TESTING ACTIVITIES

*NOT NECESSARILY FOCUSED ON TESTING*

a) Collaboration - Team members work together on the same activity - similar roles or diverse group of
roles through:

- Pairing - working with someone else to share ideas, solve problems, and agree on what to do.
Clarify ideas and assumptions being made, and support one another

- Mobbing - similar to pairing but in larger groups with a mix of roles - more formal, where folks
usually rotate duties, sharing ideas and implementing them. Identify, build, and test at the same time

-Demos - Agile ceremony where teams show their work to the business or end users. It enables
the users to see how things are progressing and give vital feedback

b) Improving ways of working

- Retrospectives - regular team meetings where they reflect on their work and progress.
Celebrate good work and identify issues and bottlenecks

- Workshops - learn new skills and collaborate to solve the larger problems the team is facing,
e.g. a workshop on identifying new requirements. Self-organized by the team or by a coach who helps
facilitate productive conversations

- Coaching sessions - supporting a team or individuals in improving their way of working. Agile
coaching helps teams adopt agile processes

III What makes a good Tester?

a) Curiosity - explore different routes and different processes by thinking like other users. What can go wrong?

b) Adaptability - come across different issues and adapt to them. In waterfall, testers made no
contribution at all until the product was completely finished; with an agile approach (Scrum), they
provide value straight away by questioning across departments

c) Pragmatism - understand the environments they work in, the AUTs (applications under test)

d) Collaboration

What skills do testers have?

a) Communication - discuss ideas, problems, requirements, blockers - strong communication
skills mean being an excellent listener and asking good questions

b) Questioning - reveal new information by questioning ideas/requirements. What have we
tested and what haven't we tested?

c) Critical and lateral thinking - asking unusual and diverse questions that explore the system
and ideas in various ways.

d) Technical Knowledge - a strong place to start is having the same technical knowledge as a
user will have

------------------------------------------------------------------------------------------------------------------

CORE 02 - Software Testing

I Modern Testing - Introduction

Devs can be good testers

Helping and coaching the whole team to test and make better software is a necessary role

Eliminating/Abandoning Test Teams without a plan is a poor business decision

Modern Testing is the antidote to traditional testing methods

Differences:

Traditional Testing - separate and often siloed test team which is concerned with consistency and
predictability

Agile - Test Specialist on Feature Team - works primarily on testing tasks needed for feature/product
quality

Modern Testing - not necessarily a role, but often the test specialist - data-driven and customer-
obsessed - passionate about productivity - the antidote to traditional testing methods

Modern Testing Principles

1. Our Priority is improving the business

Using traditional testing: bug ping-pong / developers seen as adversaries rather than collaborators / over-testing

Some traits of Modern Testing:

- be proactive

-how can we do this better

-collaborate and help

-adapt

-listen to customers

-try new things

-focus on problem solving

-always test

2. We accelerate the team, and use models like Lean Thinking and the Theory of Constraints to help
identify, prioritize and mitigate bottlenecks from the system.

Theory of Constraints: Identify the constraint -> exploit the constraint -> sync to the constraint ->
fix/mitigate the constraint -> repeat the process

Lean Software Dev: 1. Eliminate Waste; 2. Build Quality In; 3. Create Knowledge; 4. Defer Commitment;
5. Deliver Fast; 6. Respect People; 7. Optimize the Whole

3. We are a force for continuous improvement, helping the team adapt and optimize in order to
succeed, rather than providing a safety net to catch failures.

Continuous Improvement - continually seek out things we can do to help and improve the team -
people, process, and tools - e.g. CI to prevent bugs, or helping the team communicate better

Kaizen - Good Change - Process improvement philosophy

Have good retrospectives -> talk to people -> try things -> try more things -> reflect

Scientific Method: 1. Observe -> we had a typo

2. Question -> why don't we catch typos ourselves?

3. Hypothesis -> a spell-check tool on the text we're changing would help

4. Prediction -> add the tool, no more typos

5. Measure -> what was our rate of typos after the change?

6. Iterate -> problem solved?

4. We care deeply about the quality culture of our team and we coach, lead and nurture the team
towards a more mature quality culture.

What's a quality culture? A shared mindset that delivering high-quality software to the customer is our
top priority, and that all of our practices support this effort. It is a lot about putting the customer first
rather than focusing on short-term objectives

Four levels: Chaos, growing, competent and optimizing

1. "Chaos" - testers own quality - bugs may be blamed on testers and testing activities that QA Does.
Testers focus mostly or entirely on verification of new codes (features and bug fixes)

2. Growing -> Feature devs own the majority of functional testing including unit tests and many/most
integration tests and write frameworks to assist in this effort.

3. Competent -> every member of the team writes tests and values quality. Feature devs write all or
nearly all functional tests and contribute to non-functional tests, but there are still holes in quality ownership

4. Optimizing -> every team member values quality and customer value above all else, and contributes
both ideas and implementation of functional and non-functional tests. Quality is built in from the
start on all features. The test specialist's focus is on coaching the team and assisting quality efforts;
they are seen as a valuable resource.

5. We believe that the customer is the only one capable of judging the quality of our product

YOU ARE NOT THE CUSTOMER

When we say the customer evaluates quality, remember that quality is value - so we need to find out if
our software has value

- We're not asking customers to test - or to find bugs

- Build a product to solve a customer's problem. What question would you ask about the problem, what
is the customer hypothesis?

How to get customer feedback - Talk to them directly/meet with them / forums, newsgroups, mail
lists/support/data

We want customers to tell us if we're building the right product to solve their problems

6. We use data extensively to deeply understand customer usage and then close the gaps between
product hypotheses and business impact

We use it to - understand the customer, understand market fit, gain insight, make informed decision,
learn how customers use the product, make predictions about your customer

Data Growth Model


Data Oblivious - intuition and customer feedback define the actions taken

Data Affirmed - data is only believed if it affirms intuition

Data Driven - intuition validated by data defines the actions taken

Data Centric - data analysis is core to all decisions

Collecting Data -> two schools -> collect everything / only collect what you know you'll use - NEITHER of
these is right on its own.

Collecting everything can be a tough way to start: you may end up ignoring the data, or not know what
to do with it.

Approaches: - Success factor based, scenario based, hypothesis based

Success factor based: what do I need to know to understand customers using this feature? What defines success?

- Google Search - Search is performant, collect latency, load times

- Search is relevant - collect data on link clicks, refined searches.

ASK: time to return a search result; which and how many links in the search results the user selects
(relevance); how many times the user re-visits the search results (relevance)

7. We expand testing abilities and know-how across the team, understanding that this may reduce or
eliminate the need for a dedicated testing specialist
