OBJECT-ORIENTED
SOFTWARE ENGINEERING
UNIT 07 : Software Testing Techniques
© 2018, PRAMOD PARAJULI
Disclaimer
These slides are part of the teaching materials for Object Oriented
Software Engineering (OOSE). They do not cover all aspects of
learning OOSE, nor should they be taken as a primary source of
information. As the core textbooks and reference books for the
subject have already been specified and provided to the students,
students are encouraged to learn from the original sources.
Contents in these slides are copyrighted to the instructor and the
authors of the original texts where applicable.
OOSE UNIT 07 – Software Testing Techniques 2
SOFTWARE TESTING
Process of executing a program with the intent of
finding an error
Good test case has high probability of finding an as-
yet undiscovered error
A successful test is one that uncovers an as-yet
undiscovered error
TESTING PRINCIPLES
All tests should be traceable to customer requirements
Tests should be planned long before testing begins
The Pareto principle applies to software testing
Testing should begin ‘in the small’ and progress
towards testing ‘in the large’
Exhaustive testing is not possible
TESTING TYPES
Unit/component/defect testing
Mutation testing
Integration testing
System testing
● Alpha testing
● Beta testing
● Acceptance testing
[Figure: component testing is performed by the software developer; system testing is performed by an independent testing team]
TESTING PRIORITIES
Only exhaustive testing can show a program is free from
defects; however, exhaustive testing is impossible
Tests should exercise a system's capabilities rather than its
components
Testing old capabilities is more important than testing new
capabilities
Testing typical situations is more important than testing
boundary value cases
TEST DATA AND TEST CASES
Test data – inputs which have been devised to test the system
Test cases – inputs to test the system and the predicted
outputs from these inputs if the system operates according to
its specification
DEFECT (UNIT) TESTING PROCESS
Design test cases → Prepare test data → Run program with test data → Compare results to test cases
(artefacts produced at each step: test cases, test data, test results, test reports)
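The four-step process above can be sketched as runnable code. This is a minimal illustration, assuming a hypothetical unit under test, `is_leap_year`; the test cases pair each input with the output predicted from the specification.

```python
def is_leap_year(year):
    """Unit under test: Gregorian leap-year rule."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# Step 1: design test cases (input + predicted output from the spec).
test_cases = [
    (2000, True),   # divisible by 400
    (1900, False),  # divisible by 100 but not by 400
    (2024, True),   # divisible by 4
    (2023, False),  # not divisible by 4
]

# Steps 2-4: prepare the test data, run the program on it,
# and compare actual results against the test cases.
for test_input, expected in test_cases:
    actual = is_leap_year(test_input)
    status = "PASS" if actual == expected else "FAIL"
    print(f"{status}: is_leap_year({test_input}) = {actual}, expected {expected}")
```

The comparison step produces the test report: one PASS/FAIL line per test case.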
BLACK-BOX TESTING
An approach to testing where the program is considered as a
‘black-box’
The program test cases are based on the system specification
Test planning can begin early in the software process
1. Equivalence class partitioning
2. Boundary value analysis
EQUIVALENCE CLASS PARTITIONING
Input data and output results often fall into different classes
where all members of a class are related
Each of these classes is an equivalence partition where the
program behaves in an equivalent way for each class member
Test cases should be chosen from each partition
[Figure: valid and invalid input partitions feeding the system, which produces partitioned outputs]
EQUIVALENCE CLASS PARTITIONING
Partition system inputs and outputs into ‘equivalence sets’
If the input is a 5-digit integer between 10,000 and 99,999,
the three equivalence partitions are < 10,000, 10,000–99,999
and > 99,999
Choose test cases at the boundaries of these sets:
9999, 10000, 50000, 99999, 100000
[Figure: two partition examples — number of input values partitioned into ‘less than 4’, ‘between 4 and 10’ and ‘more than 10’; input values partitioned into ‘less than 10000’, ‘between 10000 and 99999’ and ‘more than 99999’]
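The 5-digit-integer example above can be sketched directly: a classifier for the three partitions, exercised with the boundary values just outside, on, and inside each boundary, plus a typical mid-range value.

```python
# Equivalence partitions for a 5-digit integer input (valid range
# 10,000-99,999), as in the slide's example.
LOW, HIGH = 10_000, 99_999

def partition(value):
    """Classify an input value into its equivalence partition."""
    if value < LOW:
        return "invalid: less than 10000"
    if value > HIGH:
        return "invalid: more than 99999"
    return "valid: between 10000 and 99999"

# Boundary test values from the slide: one either side of each
# boundary, the boundaries themselves, and a mid-range value.
for v in [9_999, 10_000, 50_000, 99_999, 100_000]:
    print(v, "->", partition(v))
```

Each printed line corresponds to one chosen test case, so every partition and both boundaries are covered.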
EXAMPLE: SEARCH ROUTINE
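The original slide's search-routine figure is not reproduced here. A common black-box design for a search routine partitions the inputs as: single-element vs multi-element arrays, and key at the first, middle, or last position, or absent. The sketch below is an assumption along those lines, with a minimal linear search and one test case per partition.

```python
def search(key, elements):
    """Return (found, index) for key in the list elements."""
    for index, element in enumerate(elements):
        if element == key:
            return True, index
    return False, -1

# One test case per equivalence partition of the inputs.
test_cases = [
    (17, [17],             (True, 0)),    # single element, key present
    (0,  [17],             (False, -1)),  # single element, key absent
    (17, [17, 29, 21, 23], (True, 0)),    # key is the first element
    (21, [17, 29, 21, 23], (True, 2)),    # key is a middle element
    (23, [17, 29, 21, 23], (True, 3)),    # key is the last element
    (25, [17, 29, 21, 23], (False, -1)),  # key absent
]

for key, elements, expected in test_cases:
    assert search(key, elements) == expected
print("all search partitions pass")
```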
STRUCTURAL TESTING
Sometimes called white-box testing (or glass-box testing)
Derivation of test cases according to program structure.
Knowledge of the program is used to identify additional test
cases
Objective is to exercise all program statements (not all path
combinations)
Coverage criteria:
1) statement coverage
2) branch coverage
3) condition coverage
4) path coverage
PATH TESTING
The objective of path testing is to ensure that the set of test
cases is such that each path through the program is executed at
least once
[Flow graph of a binary search routine: decision nodes include `while bottom <= top`, `if (elemArray[mid] == key)` and `if (elemArray[mid] < key)`, with the exit path taken when bottom > top]
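The binary-search flow graph above can be sketched as runnable code, with test inputs chosen so that each decision outcome (and hence each path segment) is taken at least once.

```python
def binary_search(key, elements):
    """Return the index of key in the sorted list elements, or -1."""
    bottom, top = 0, len(elements) - 1
    while bottom <= top:              # decision: loop test
        mid = (bottom + top) // 2
        if elements[mid] == key:      # decision: element found
            return mid
        if elements[mid] < key:       # decision: search upper half
            bottom = mid + 1
        else:                         # otherwise: search lower half
            top = mid - 1
    return -1                         # exit path: bottom > top

data = [17, 21, 23, 29]
assert binary_search(21, data) == 1   # found on the first probe
assert binary_search(29, data) == 3   # exercises the upper-half arc
assert binary_search(17, data) == 0   # exercises the lower-half arc
assert binary_search(25, data) == -1  # loop exits with bottom > top
print("every decision outcome exercised at least once")
```

Note that covering every path segment is weaker than covering every *combination* of paths, which is infeasible in general once loops are involved.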
INTEGRATION TESTING
Tests complete systems or subsystems
composed of integrated components
Integration testing should be black-box testing
with tests derived from the specification
Main difficulty is localising errors
Incremental integration testing reduces this
problem
INTEGRATION TESTING
[Figure: incremental integration — test sequence 1 exercises components A and B with tests T1–T3; test sequence 2 adds component C and test T4; test sequence 3 adds component D and test T5, re-running the earlier tests at each stage]
APPROACHES TO INTEGRATION TESTING
Big-bang integration testing
Top-down testing
●
Start with high-level system and integrate from the top-
down replacing individual components by stubs where
appropriate
Bottom-up testing
●
Integrate individual components in levels until the
complete system is created
In practice, most integration involves a combination
of these strategies
TOP-DOWN APPROACH
[Figure: top-down testing — the testing sequence starts with the level-1 component and integrates the level-2 components beneath it, using level-2 and level-3 stubs in place of components not yet integrated]
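A stub in the top-down approach can be sketched as follows. The classes here are hypothetical: a level-1 `ReportGenerator` depends on a `WeatherData` component that has not yet been integrated, so a stub returning fixed, known data stands in for it.

```python
class WeatherDataStub:
    """Stub standing in for the real level-2 WeatherData component.

    Returns a fixed, known summary so the level-1 logic can be
    tested before the real component exists.
    """
    def summarise(self):
        return {"max_temp": 21, "min_temp": 9}

class ReportGenerator:
    """Level-1 component under test."""
    def __init__(self, weather_data):
        self.weather_data = weather_data   # stub or real component

    def report(self):
        summary = self.weather_data.summarise()
        return f"max {summary['max_temp']}C, min {summary['min_temp']}C"

# Test the top-level component against the stub's known output.
generator = ReportGenerator(WeatherDataStub())
assert generator.report() == "max 21C, min 9C"
print("level-1 logic verified against stub")
```

When the real component is integrated later, it replaces the stub without any change to `ReportGenerator`.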
BOTTOM-UP APPROACH
[Figure: bottom-up testing — the level N components are tested first using test drivers; the testing sequence then integrates the level N–1 components, with the drivers replaced as higher levels are added]
TESTING APPROACHES
Architectural validation
●
Top-down integration testing is better at discovering errors
in the system architecture
System demonstration
●
Top-down integration testing allows a limited
demonstration at an early stage in the development
Test implementation
●
Often easier with bottom-up integration testing
Test observation
●
Problems with both approaches. Extra code may be
required to observe tests
SYSTEM TESTING
Alpha testing – testing carried out by a test team within the
organization
Beta testing – testing performed by a selected group
of friendly customers
Acceptance testing – performed by the customer to
determine whether or not to accept delivery of
the system
Stress testing – tests the system beyond its maximum
design load
TESTING WORKBENCHES
[Figure: a testing workbench — a test data generator derives test data from the specification; a test manager runs the program being tested on that data, monitored by a dynamic analyser (which produces an execution report) and, where needed, a simulator; a file comparator checks the test results against oracle-generated test predictions; a report generator produces the test results report]
OBJECT-ORIENTED TESTING
The components to be tested are object classes that are
instantiated as objects
Larger grain than individual functions so approaches
to white-box testing have to be extended
No obvious ‘top’ to the system for top-down
integration and testing
OOT LEVELS
Testing operations associated with objects
Testing object classes
Testing clusters of cooperating objects
Testing the complete OO system
OBJECT CLASS TESTING
Complete test coverage of a class involves
● Testing all operations associated with an object
● Setting and interrogating all object attributes
● Exercising the object in all possible states
Inheritance makes it more difficult to design
object class tests as the information to be
tested is not localised
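The three coverage points above can be sketched for a hypothetical `Counter` class (an assumption for illustration, not from the original slides): every operation is exercised, every attribute is set and interrogated, and the object is driven through each of its states.

```python
class Counter:
    """Bounded counter with states: at-zero, counting, at-limit."""
    def __init__(self, limit):
        self.limit = limit
        self.value = 0

    def increment(self):
        if self.value < self.limit:
            self.value += 1

    def reset(self):
        self.value = 0

    def state(self):
        if self.value == 0:
            return "at-zero"
        return "at-limit" if self.value == self.limit else "counting"

counter = Counter(limit=2)
assert counter.state() == "at-zero"      # initial state
counter.increment()
assert counter.state() == "counting"     # intermediate state
counter.increment()
counter.increment()                      # attempt to go past the limit
assert counter.value == 2                # attribute interrogated
assert counter.state() == "at-limit"     # boundary state
counter.reset()                          # remaining operation
assert counter.state() == "at-zero"
print("all operations, attributes and states covered")
```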
INTERFACE TESTING
Takes place when modules or sub-systems are
integrated to create larger systems
Objectives are to detect faults due to interface
errors or invalid assumptions about interfaces
Particularly important for object-oriented
development as objects are defined by their
interfaces
[Figure: test cases applied to sub-systems A and B, both of which use the interface of sub-system C]
INTERFACE TYPES
Parameter interfaces
●
Data passed from one procedure to another
Shared memory interfaces
●
Block of memory is shared between procedures
Procedural interfaces
●
Sub-system encapsulates a set of procedures to be called by
other sub-systems
Message passing interfaces
●
Sub-systems request services from other sub-systems
INTERFACE ERRORS
Interface misuse
●
A calling component calls another component and makes an
error in its use of its interface e.g. parameters in the wrong
order
Interface misunderstanding
●
A calling component embeds assumptions about the behaviour of
the called component which are incorrect
Timing errors
●
The called and the calling component operate at different speeds
and out-of-date information is accessed
INTERFACE TESTING GUIDELINES
Design tests so that parameters to a called procedure are at
the extreme ends of their ranges
Always test pointer parameters with null pointers
Design tests which cause the component to fail
Use stress testing in message passing systems
In shared memory systems, vary the order in which
components are activated
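Two of the guidelines above can be sketched concretely: testing parameters at the extreme ends of their ranges, and always testing 'pointer' parameters with null (`None` in Python). The `average` interface below is a hypothetical example.

```python
def average(values):
    """Interface under test: mean of a non-empty list of numbers."""
    if values is None:
        raise ValueError("values must not be None")
    if len(values) == 0:
        raise ValueError("values must not be empty")
    return sum(values) / len(values)

# Guideline: always test pointer parameters with null pointers.
try:
    average(None)
    raise AssertionError("None was not rejected")
except ValueError:
    pass

# Guideline: parameters at the extreme ends of their ranges.
try:
    average([])                      # minimum length: empty list
    raise AssertionError("empty list was not rejected")
except ValueError:
    pass
assert average([5]) == 5             # smallest valid length
print("interface checks pass")
```

These are also tests designed to cause the component to fail: the interesting outcome is the error raised, not the happy path.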
WEATHER STATION INTERFACE
Use a state model to identify state transitions for testing
[Class diagram: WeatherStation — attribute identifier; operations reportWeather(), calibrate(instruments), test(), startup(instruments), shutdown(instruments)]
Examples of testing sequences
● Shutdown -> Waiting -> Shutdown
● Waiting -> Calibrating -> Testing -> Transmitting -> Waiting
● Waiting -> Collecting -> Waiting -> Summarising -> Transmitting -> Waiting
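State-model-based testing can be sketched by driving a transition table through the testing sequences. The table and event names below are assumptions inferred from the sequences on the slide, not the full system model.

```python
# Hypothetical transition table: (current state, event) -> next state.
TRANSITIONS = {
    ("Shutdown", "startup"): "Waiting",
    ("Waiting", "shutdown"): "Shutdown",
    ("Waiting", "calibrate"): "Calibrating",
    ("Calibrating", "done"): "Testing",
    ("Testing", "done"): "Transmitting",
    ("Transmitting", "done"): "Waiting",
    ("Waiting", "collect"): "Collecting",
    ("Collecting", "done"): "Waiting",
    ("Waiting", "summarise"): "Summarising",
    ("Summarising", "done"): "Transmitting",
}

def run_sequence(start, events):
    """Drive the state model through a sequence of events and
    return the list of states visited."""
    state = start
    visited = [state]
    for event in events:
        state = TRANSITIONS[(state, event)]
        visited.append(state)
    return visited

# The slide's second testing sequence, checked end to end.
assert run_sequence("Waiting", ["calibrate", "done", "done", "done"]) == \
    ["Waiting", "Calibrating", "Testing", "Transmitting", "Waiting"]
print("testing sequence covers the expected transitions")
```

A `KeyError` from the table during a run would expose an illegal transition, which is exactly the kind of defect state-based testing looks for.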
OBJECT INTEGRATION
Levels of integration are less distinct in object-
oriented systems
Cluster testing is concerned with integrating
and testing clusters of cooperating objects
Identify clusters using knowledge of the
operation of objects and the system features
that are implemented by these clusters
APPROACHES TO CLUSTER TESTING
Use-case or scenario testing
●
Testing is based on user interactions with the system
●
Has the advantage that it tests system features as
experienced by users
Thread testing
●
Tests the system's response to events as processing threads
through the system
Object interaction testing
●
Tests sequences of object interactions that stop when an
object operation does not call on services from another
object
SCENARIO-BASED TESTING
Identify scenarios from use-cases and
supplement these with interaction diagrams
that show the objects involved in the scenario
Consider the scenario in the weather station
system where a report is generated
SCENARIO-BASED TESTING - EXAMPLE
[Sequence diagram: :CommsController sends request(report) to :WeatherStation, which returns acknowledge(); :WeatherStation invokes report(), :WeatherData performs summarise() and send(report); :WeatherStation returns reply(report) to :CommsController, which finishes with acknowledge()]
SCENARIO-BASED TESTING - EXAMPLE
Thread of methods executed
●
CommsController:request -> WeatherStation:report ->
WeatherData:summarise
Inputs and outputs
●
Input of report request with associated acknowledge and a
final output of a report
●
Can be tested by creating raw data and ensuring that it is
summarised properly
●
Use the same raw data to test the WeatherData object
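The thread of methods above can be sketched as a scenario test. The classes below are minimal assumptions, not the real system: raw data is created, the whole CommsController:request -> WeatherStation:report -> WeatherData:summarise thread is exercised, and the same raw data is then used to test the WeatherData object in isolation.

```python
class WeatherData:
    def __init__(self):
        self.readings = []

    def add(self, temperature):
        self.readings.append(temperature)     # raw data collected

    def summarise(self):
        return {"max": max(self.readings), "min": min(self.readings)}

class WeatherStation:
    def __init__(self, weather_data):
        self.weather_data = weather_data

    def report(self):
        return self.weather_data.summarise()

class CommsController:
    def __init__(self, station):
        self.station = station

    def request(self):
        return self.station.report()          # the thread starts here

# Create raw data and check the whole thread summarises it properly.
data = WeatherData()
for temperature in [9, 17, 21, 12]:
    data.add(temperature)
controller = CommsController(WeatherStation(data))
assert controller.request() == {"max": 21, "min": 9}

# The same raw data tests the WeatherData object on its own.
assert data.summarise() == {"max": 21, "min": 9}
print("report scenario thread verified")
```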
End of Unit 07 : Software Testing Techniques