CCS366 Software Testing and Automation Laboratory

The document outlines a comprehensive test plan for evaluating the functionality and usability of the e-commerce application www.amazon.in. It includes steps for identifying testing scope, objectives, environment, deliverables, strategy, scheduling, risk analysis, resource planning, test case design, data setup, execution, and defect reporting. Additionally, it details the design of test cases to ensure thorough testing of various application features and functionalities.


Exp No. 1: Develop the Test Plan for Testing an E-commerce Web/Mobile Application (www.amazon.in)
Date:

Aim:
The aim of this experiment is to develop a comprehensive test plan for testing the functionality and usability of the e-commerce web/mobile application www.amazon.in.

Algorithm:
1. Identify the Scope: Determine the scope of testing, including the features and functionalities that need to be tested.

2. Define Test Objectives: Specify the primary objectives of testing, such as functional testing, usability testing, performance testing, security testing, etc.

3. Identify Test Environment: Define the platforms, browsers, devices, and operating systems on which the application will be tested.

4. Determine Test Deliverables: Decide on the documents and artifacts that will be generated during the testing process, such as test cases, test reports, and defect logs.

5. Create Test Strategy: Develop an overall approach for testing, including the testing techniques, entry and exit criteria, and the roles and responsibilities of the testing team.

6. Define Test Scope and Schedule: Specify the timeline for each testing phase and the scope of testing for each phase.

7. Risk Analysis: Identify potential risks and their impact on the testing process, and devise risk mitigation strategies.

8. Resource Planning: Allocate the necessary resources, including the testing team, hardware, and software required for testing.

9. Test Case Design: Prepare detailed test cases based on the requirements and functionalities of the e-commerce application.

10. Test Data Setup: Arrange the test data required for executing the test cases effectively.

11. Test Execution: Execute the test cases and record the test results.

12. Defect Reporting: Document any defects encountered during testing and track their resolution.
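Steps 11 and 12 above (recording results and tracking defect resolution) can be sketched as a small in-memory defect log. This is an illustrative sketch only; all class, field, and method names are assumptions, not part of any real defect-tracking tool.

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of a defect log such as the one produced in steps 11-12.
// Names and severity values here are hypothetical examples.
public class DefectLog {
    record Defect(String id, String summary, String severity, String status) {}

    private final List<Defect> defects = new ArrayList<>();

    // Step 12: document a defect encountered during testing.
    public void report(String id, String summary, String severity) {
        defects.add(new Defect(id, summary, severity, "Open"));
    }

    // Count defects still awaiting resolution.
    public long openCount() {
        return defects.stream().filter(d -> d.status().equals("Open")).count();
    }

    public static void main(String[] args) {
        DefectLog log = new DefectLog();
        log.report("D-001", "Cart total not updated after removing item", "High");
        log.report("D-002", "Typo on checkout page", "Low");
        System.out.println("Open defects: " + log.openCount());
    }
}
```

In practice a tracker such as a spreadsheet or an issue-tracking system plays this role; the sketch only shows the kind of record each defect entry carries.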

Test Plan:
The test plan should cover the following sections:
1. Introduction: Briefly describe the purpose of the test plan and provide an overview of the e-commerce application to be tested.

2. Test Objectives: List the primary objectives of testing the application.

3. Test Scope: Specify the features and functionalities to be tested and any limitations on testing.

4. Test Environment: Describe the hardware, software, browsers, and devices to be used for testing.

5. Test Strategy: Explain the overall approach to be followed during testing.

6. Test Schedule: Provide a detailed timeline for each testing phase.

7. Risk Analysis: Identify potential risks and the strategies to mitigate them.

8. Resource Planning: Specify the resources required for testing.

9. Test Case Design: Include a summary of the test cases developed for the application.

10. Test Data Setup: Describe the process of arranging test data for testing.

11. Defect Reporting: Explain the procedure for reporting and tracking defects.

Test Case Table (Process: Test Plan):

| No. | Test Case | Steps | Description | Status | Expected Result | Actual Result | Comment |
|---|---|---|---|---|---|---|---|
| TC001 | Scope of Testing | 1. Review the test plan document. | Verify the scope of testing. | Done | The test plan includes all features. | | |
| TC002 | Test Objectives | 1. Review the test plan document. | Verify the test objectives. | Done | The test objectives are well-defined. | | |
| TC003 | Test Environment | 1. Review the test plan document. | Check the specified environments. | Done | Test environments are mentioned. | | |
| TC004 | Test Deliverables | 1. Review the test plan document. | Ensure all deliverables are listed. | Done | The test plan includes all deliverables. | | |
| TC005 | Test Strategy | 1. Review the test plan document. | Verify the overall approach. | Done | The test strategy is clearly stated. | | |
| TC006 | Test Scope and Schedule | 1. Review the test plan document. | Check the schedule and scope. | Done | The schedule and scope are defined. | | |
| TC007 | Risk Analysis | 1. Review the test plan document. | Ensure potential risks are identified. | Done | Risks and mitigation strategies are mentioned. | | |
| TC008 | Resource Planning | 1. Review the test plan document. | Check the required resources. | Done | Resources needed for testing are listed. | | |
| TC009 | Test Case Design | 1. Review and execute the test cases. | Validate the prepared test cases. | Done | Test cases are accurate and functional. | | |
| TC010 | Test Data Setup | 1. Review the test data setup process. | Verify the availability of test data. | Done | Test data is available for testing. | | |
| TC011 | Test Execution | 1. Run the test cases and document the outcomes. | Execute the test cases. | In Progress | Test results are recorded and documented. | | |
| TC012 | Defect Reporting | 1. Log defects with detailed information. | Ensure defects are reported correctly. | Not Started | Defects are reported with sufficient details. | | |
| TC013 | Defect Tracking | 1. Monitor defect status and updates. | Verify the tracking of defects. | Not Started | Defects are tracked until resolution. | | |

Explanation:
The test plan is a crucial document that outlines the entire testing process. It ensures that all aspects of the e-commerce application are thoroughly tested and that the results are systematically documented.

Result:
Upon completion of the experiment, you will have a well-structured test plan that provides a clear roadmap for testing the e-commerce web/mobile application www.amazon.in.

Exp No. 2: Design the Test Cases for Testing the E-commerce Application
Date:

Aim:
The aim of this experiment is to design a set of comprehensive and effective test cases for testing the e-commerce application www.amazon.in.

Algorithm:
1. Understand Requirements: Familiarize yourself with the functional and non-functional requirements of the e-commerce application.

2. Identify Test Scenarios: Based on the requirements, identify different test scenarios that cover all aspects of the application.

3. Write Test Cases: Develop test cases for each identified scenario, including preconditions, steps to be executed, and expected outcomes.

4. Cover Edge Cases: Ensure that the test cases cover edge cases and boundary conditions to verify the robustness of the application.

5. Prioritize Test Cases: Prioritize the test cases based on their criticality and relevance to the application.

6. Review Test Cases: Conduct a peer review of the test cases to ensure their accuracy and completeness.

7. Optimize Test Cases: Optimize the test cases for reusability and maintainability.

Test Case Design:
The test case design should include the following components for each test case:

1. Test Case ID: A unique identifier for each test case.

2. Test Scenario: Description of the scenario being tested.

3. Test Case Description: Detailed steps to execute the test.

4. Precondition: The necessary conditions that must be satisfied before executing the test case.

5. Test Steps: The sequence of actions to be performed during the test.

6. Expected Result: The outcome that is expected from the test.
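The six components above can be modelled as a plain data record, which is one way to keep designed test cases machine-readable. This is a minimal sketch; the record name, field names, and example values are illustrative assumptions.

```java
import java.util.List;

// Sketch of the test-case components listed above as a Java record.
// The example data mirrors the user-login scenario from the table below.
public class TestCaseDesign {
    record TestCase(String id, String scenario, String description,
                    String precondition, List<String> steps, String expectedResult) {}

    public static void main(String[] args) {
        TestCase login = new TestCase(
            "TC002", "User Login", "Verify the user login process",
            "A registered account exists",
            List.of("Navigate to the login page",
                    "Enter valid credentials",
                    "Click the Login button"),
            "User can successfully log in");
        System.out.println(login.id() + ": " + login.steps().size() + " steps");
    }
}
```

A spreadsheet row serves the same purpose; the point is only that every test case carries all six components explicitly.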

Test Case Table (Process: Test Case Design):

| No. | Test Case | Steps | Description | Status | Expected Result | Actual Result | Comment |
|---|---|---|---|---|---|---|---|
| TC001 | User Registration | 1. Navigate to the registration page. | Verify the user registration process. | Done | User can successfully register. | | |
| TC002 | User Login | 1. Navigate to the login page. | Verify the user login process. | Done | User can successfully log in. | | |
| TC003 | Search Functionality | 1. Enter a keyword in the search bar. | Verify search functionality. | Done | Search results are relevant to the keyword. | | |
| TC004 | Add to Cart | 1. Browse the product catalog. | Verify adding products to the cart. | Done | Product is added to the shopping cart. | | |
| TC005 | Shopping Cart Validation | 1. Click on the shopping cart icon. | Verify the shopping cart contents. | Done | Items in the shopping cart are displayed. | | |
| TC006 | Checkout Process | 1. Click on the "Checkout" button. | Verify the checkout process. | Not Started | Checkout process proceeds as expected. | | |

Explanation:
Test cases are designed to validate the functionality and behaviour of the e-commerce application. They ensure that the application performs as intended and meets the specified requirements.

Result:
Upon completion of the experiment, you will have a set of well-defined test cases ready for testing the e-commerce application www.amazon.in.

Exp No. 3: Test the E-commerce Application and Report the Defects in It
Date:

Aim:
The aim of this experiment is to execute the designed test cases and identify defects or issues in the e-commerce application www.amazon.in.

Algorithm:
1. Test Environment Setup: Set up the testing environment with the required hardware, software, and test data.

2. Test Case Execution: Execute the test cases designed in Experiment 2, following the specified steps.

3. Defect Identification: During test execution, record any discrepancies or issues encountered.

4. Defect Reporting: Log the identified defects with detailed information, including steps to reproduce, severity, and priority.

5. Defect Tracking: Track the progress of defect resolution and verify fixes as they are implemented.

6. Retesting: After defect fixes, retest the affected areas to ensure the issues are resolved.

7. Regression Testing: Conduct regression testing to ensure new changes do not introduce new defects.
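The defect lifecycle implied by steps 4 through 6 (report, fix, retest, close, or reopen on a failed retest) can be sketched as a small state machine. The state names and the transition rule are assumptions chosen for illustration; real trackers use their own workflows.

```java
// Sketch of a defect lifecycle: reported -> fixed -> retested -> closed,
// with a failed retest sending the defect back for rework.
public class DefectLifecycle {
    enum Status { REPORTED, FIXED, RETESTED, CLOSED, REOPENED }

    // Advance a defect to its next state; retestPassed matters only
    // when deciding whether a retested defect closes or reopens.
    static Status next(Status current, boolean retestPassed) {
        return switch (current) {
            case REPORTED -> Status.FIXED;
            case FIXED -> Status.RETESTED;
            case RETESTED -> retestPassed ? Status.CLOSED : Status.REOPENED;
            case REOPENED -> Status.FIXED;
            case CLOSED -> Status.CLOSED;
        };
    }

    public static void main(String[] args) {
        Status s = Status.REPORTED;
        s = next(s, true);   // defect fixed
        s = next(s, true);   // fix retested
        s = next(s, false);  // retest failed, so the defect reopens
        System.out.println(s);
    }
}
```

Tracking defects this way makes step 5 (verifying fixes as they are implemented) an explicit transition rather than an informal note.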

Test Case Table (Process: Test Case Design):

| No. | Test Case | Steps | Description | Status | Expected Result | Actual Result | Comment |
|---|---|---|---|---|---|---|---|
| TC001 | User Registration | 1. Navigate to the registration page. | Verify the user registration process. | Done | User can successfully register. | | |
| TC002 | User Login | 1. Navigate to the login page. | Verify the user login process. | Done | User can successfully log in. | | |
| TC003 | Search Functionality | 1. Enter a keyword in the search bar. | Verify search functionality. | Done | Search results are relevant to the keyword. | | |
| TC004 | Add to Cart | 1. Browse the product catalog. | Verify adding products to the cart. | Done | Product is added to the shopping cart. | | |
| TC005 | Shopping Cart Validation | 1. Click on the shopping cart icon. | Verify the shopping cart contents. | Done | Items in the shopping cart are displayed. | | |
| TC006 | Checkout Process | 1. Click on the "Checkout" button. | Verify the checkout process. | Not Started | Checkout process proceeds as expected. | | |

Explanation:
Testing the e-commerce application aims to validate its functionality and usability. By identifying and reporting defects, you ensure the application's quality and reliability.

Result:
Upon completion of the experiment, you will have a list of identified defects and their status
after resolution.

Exp No. 4: Develop the Test Plan and Design Test Cases for an Inventory Control System
Date:

Aim:
The aim of this experiment is to create a comprehensive test plan and design test cases for an Inventory Control System.

Algorithm:
Follow the same algorithm as described in Experiment 1 for developing the test plan for an inventory control system.

Follow the same algorithm as described in Experiment 2 for designing test cases for an inventory control system.

Test Plan (Process: Test Plan):

| No. | Test Case | Steps | Description | Status | Expected Result | Actual Result | Comment |
|---|---|---|---|---|---|---|---|
| TC001 | Scope of Testing | 1. Review the requirements and project documentation. 2. Identify the modules to be tested. 3. Determine the out-of-scope items. | Verify the scope of testing. | Done | The test plan includes all essential features. | | |
| TC002 | Test Objectives | 1. Review the requirements and project documentation. 2. Discuss with stakeholders to understand expectations. | Verify the test objectives. | Done | The test objectives are clearly defined. | | |
| TC003 | Test Environment | 1. Identify the hardware and software requirements. 2. Set up the required hardware and software. | Verify the required environments. | Not Started | The test environment is defined. | | |
| TC004 | Test Deliverables | 1. Determine the documents and artifacts to be produced. 2. Create templates for test reports, defect logs, etc. | Verify the required deliverables. | Not Started | All necessary documents are listed. | | |
| TC005 | Test Strategy | 1. Decide on the testing approach and techniques. 2. Determine the entry and exit criteria. | Verify the overall approach for testing. | Not Started | The test strategy is defined. | | |
| TC006 | Test Scope and Schedule | 1. Define the timeline for each testing phase. 2. Determine the scope of testing for each phase. | Verify the schedule for testing. | Not Started | The schedule is established. | | |
| TC007 | Risk Analysis | 1. Identify potential risks in the testing process. 2. Discuss risk mitigation strategies with the team. | Verify risk analysis and mitigation strategies. | Not Started | Potential risks are identified with mitigation plans. | | |
| TC008 | Resource Planning | 1. Allocate the required resources for testing. 2. Determine the roles and responsibilities of the team. | Verify the availability of resources. | Not Started | Resources needed for testing are allocated. | | |

Test Case Design (Process: Test Case Design):

| No. | Test Case | Steps | Description | Status | Expected Result | Actual Result | Comment |
|---|---|---|---|---|---|---|---|
| TC001 | Module A – Functionality Test | 1. Review the requirements related to Module A. 2. Identify test scenarios for Module A. 3. Develop detailed test cases for Module A. | Verify the functionality of Module A. | Not Started | All functionalities of Module A are tested. | | |
| TC002 | Module B – Integration Test | 1. Review the requirements related to Module B. 2. Identify integration points with other modules. 3. Design test cases for testing integration scenarios. | Verify the integration of Module B with other modules. | Not Started | Module B is successfully integrated. | | |
| TC003 | Module C – Performance Test | 1. Review the performance requirements for Module C. 2. Determine performance metrics to be measured. 3. Develop performance test cases for Module C. | Verify the performance of Module C. | Not Started | Module C performs optimally under load. | | |
| TC004 | Module D – Usability Test | 1. Review the usability requirements for Module D. 2. Identify usability aspects to be tested. 3. Create test cases for evaluating Module D's usability. | Verify the usability of Module D. | Not Started | Module D is user-friendly and intuitive. | | |
| TC005 | Module E – Security Test | 1. Review the security requirements for Module E. 2. Identify potential security vulnerabilities. 3. Design security test cases to assess Module E. | Verify the security of Module E. | Not Started | Module E is protected against security threats. | | |

Explanation:
An inventory control system is critical for managing stock and supplies. Proper testing ensures the system functions accurately and efficiently.

Result:
Upon completion of the experiment, you will have a well-structured test plan and a set of
test cases ready for testing the Inventory Control System.

Exp No. 5: Execute the Test Cases against a Client-Server or Desktop Application and Identify the Defects
Date:

Aim:
The aim of this experiment is to execute the test cases against a client-server or desktop application and identify defects.

Algorithm:
1. Test Environment Setup: Set up the testing environment, including the client-server or desktop application, required hardware, and test data.

2. Test Case Execution: Execute the test cases designed in Experiment 2 against the application.

3. Defect Identification: During test execution, record any discrepancies or issues encountered.

4. Defect Reporting: Log the identified defects with detailed information, including steps to reproduce, severity, and priority.

5. Defect Tracking: Track the progress of defect resolution and verify fixes as they are implemented.

6. Retesting: After defect fixes, retest the affected areas to ensure the issues are resolved.

7. Regression Testing: Conduct regression testing to ensure new changes do not introduce new defects.
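One of the checks executed in this experiment is data validation (TC002 in the table that follows): invalid form input should produce an appropriate error message. The sketch below shows the kind of validation logic such a test exercises; the validation rules, messages, and method name are illustrative assumptions, not the application's actual code.

```java
// Sketch of the form-input validation exercised by a data-validation
// test case: invalid input yields an error message, valid input yields null.
public class FormValidator {
    // Returns an error message for invalid input, or null when the field is valid.
    static String validateEmail(String email) {
        if (email == null || email.isBlank()) return "Email is required";
        if (!email.matches("[^@\\s]+@[^@\\s]+\\.[^@\\s]+")) return "Email format is invalid";
        return null;
    }

    public static void main(String[] args) {
        System.out.println(validateEmail(""));
        System.out.println(validateEmail("not-an-email"));
        System.out.println(validateEmail("user@example.com"));
    }
}
```

A tester executing TC002 submits inputs like the first two and checks that the application's error messages match the expected results column.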

Test Case Table (Process: Test Case Execution):

| No. | Test Case | Steps | Description | Status | Expected Result | Actual Result | Comment |
|---|---|---|---|---|---|---|---|
| TC001 | User Login | 1. Launch the application. 2. Enter valid login credentials. 3. Click on the "Login" button. | Verify the user login process. | Not Started | User can successfully log in. | | |
| TC002 | Data Validation | 1. Access a data input form. 2. Enter invalid data in the form fields. 3. Submit the form. | Verify data validation on the form. | Not Started | Invalid data shows appropriate error messages. | | |
| TC003 | File Upload | 1. Access the file upload feature. 2. Select a file from the system. 3. Click on the "Upload" button. | Verify file upload functionality. | Not Started | File is uploaded successfully. | | |
| TC004 | Network Connectivity | 1. Disconnect the network. 2. Attempt to perform an action requiring network access. | Verify the application's response. | Not Started | Application gracefully handles disconnection. | | |
| TC005 | Concurrent Users | 1. Simulate concurrent user sessions. 2. Perform actions simultaneously. | Verify application performance. | Not Started | Application performs well under load. | | |
| TC006 | Compatibility | 1. Test the application on different platforms. 2. Execute tests on various browsers. | Verify cross-platform compatibility. | Not Started | Application works on all specified platforms. | | |
| TC007 | Client-Server Communication | 1. Monitor network traffic between client and server. | Verify communication integrity. | Not Started | Data is correctly transmitted and received. | | |

Explanation:
Testing a client-server or desktop application ensures its functionality across different
platforms and environments.

Result:
Upon completion of the experiment, you will have a list of identified defects and their status after resolution for the client-server or desktop application.

Exp No. 6: Test the Performance of the E-commerce Application
Date:

Aim:
The aim of this experiment is to test the performance of the e-commerce application www.amazon.in.

Algorithm:
1. Identify Performance Metrics: Determine the performance metrics to be measured, such as response time, throughput, and resource utilization.

2. Define Test Scenarios: Create test scenarios that simulate various user interactions and loads on the application.

3. Performance Test Setup: Set up the performance testing environment with appropriate hardware and software.

4. Execute Performance Tests: Run the performance tests using the defined scenarios and collect performance data.

5. Analyze Performance Data: Analyze the collected data to identify any performance bottlenecks or issues.

6. Performance Tuning: Implement necessary optimizations to improve the application's performance.
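The metrics named in step 1 reduce to simple arithmetic over collected timing data: average response time over a set of samples, and throughput as completed requests divided by the test window. The sketch below illustrates those two calculations; the sample figures and method names are examples, not measurements from www.amazon.in.

```java
import java.util.Arrays;

// Sketch of the step-5 analysis: turning raw timing samples collected by a
// performance testing tool into the metrics identified in step 1.
public class PerfMetrics {
    // Average response time in milliseconds across recorded samples.
    static double averageMs(long[] samples) {
        return Arrays.stream(samples).average().orElse(0.0);
    }

    // Throughput: completed requests per second over a test window.
    static double throughput(int requests, double windowSeconds) {
        return requests / windowSeconds;
    }

    public static void main(String[] args) {
        long[] responseTimesMs = {120, 180, 150, 300, 90};  // example samples
        System.out.printf("avg response: %.1f ms%n", averageMs(responseTimesMs));
        System.out.printf("throughput: %.1f req/s%n", throughput(500, 60.0));
    }
}
```

Load-testing tools report these same figures (along with percentiles, which are usually more informative than the average); the sketch only shows what the numbers mean.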

Performance Table (Process: Performance Testing):

| No. | Test Case | Steps | Description | Status | Expected Result | Actual Result | Comment |
|---|---|---|---|---|---|---|---|
| TC001 | Response Time for Home Page | 1. Access the home page of the e-commerce application. 2. Use a performance testing tool to record the time. 3. Analyze the recorded data to determine response time. | Measure the response time. | Not Started | The home page loads within the specified response time threshold. | | |
| TC002 | Throughput during Peak Hours | 1. Simulate peak-hour traffic on the application. 2. Execute performance tests during peak hours. 3. Analyze the data to determine the throughput. | Measure the throughput. | Not Started | The application can handle peak-hour traffic without significant delays. | | |
| TC003 | Resource Utilization | 1. Monitor CPU, memory, and network usage during testing. 2. Execute performance tests while monitoring resources. 3. Analyze the data to assess resource utilization. | Measure resource utilization. | Not Started | Resource utilization remains within acceptable limits. | | |
| TC004 | Concurrent Users | 1. Simulate multiple concurrent users accessing the app. 2. Increase the number of concurrent users gradually. 3. Record the application's behavior with increased load. | Measure app performance under load. | Not Started | The application remains stable and responsive under load. | | |
| TC005 | Stress Testing | 1. Apply maximum load to test the system's breaking point. 2. Apply the maximum user load the application can handle. 3. Observe the application's response under stress. | Measure system behavior under extreme load. | Not Started | The system recovers gracefully after stress is removed. | | |
| TC006 | Performance Tuning | 1. Identify performance bottlenecks and areas of improvement. 2. Analyze the performance test results. 3. Implement necessary optimizations. | Improve application performance. | Not Started | Performance bottlenecks are addressed and the application performs better. | | |

Explanation:
Performance testing helps to identify bottlenecks in the e-commerce application, ensuring it
can handle real-world user loads effectively.

Result:
Upon completion of the experiment, you will have performance test results and any
optimizations made to improve the application's performance.

Exp No. 7: Automate the Testing of E-commerce Applications Using Selenium
Date:

Aim:
The aim of this task is to automate the testing of an e-commerce web application (www.amazon.in) using Selenium WebDriver, which will help improve testing efficiency and reliability.

Algorithm:
1. Set up the environment:
- Install the Java Development Kit (JDK) and configure the Java environment variables.
- Install an Integrated Development Environment (IDE) like Eclipse or IntelliJ.
- Download Selenium WebDriver and the required web drivers for the browsers you intend to test (e.g., ChromeDriver, GeckoDriver for Firefox).

2. Create a new Java project in the IDE:
- Set up a new Java project in the IDE and include the Selenium WebDriver library.

3. Develop test cases:
- Identify the key functionalities and scenarios to test in the e-commerce application.
- Design test cases covering various aspects like login, search, product details, add to cart, checkout, etc.

4. Implement Selenium automation scripts:
- Write Java code using Selenium WebDriver to automate the identified test cases.
- Utilize different Selenium commands to interact with the web elements, navigate through pages, and perform various actions.

5. Execute the automated test cases:
- Run the automated test scripts against the e-commerce application.
- Observe the test execution and identify any failures or defects.

6. Analyze the test results:
- Review the test execution results to identify any failed test cases.
- Debug and fix any issues with the automation scripts if necessary.

7. Report defects:
- Document any defects found during the automated testing process.
- Provide detailed information about each defect, including steps to reproduce and expected results.
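Steps 5 through 7 amount to collecting pass/fail outcomes from a run and listing the failures that may indicate defects. The sketch below models that bookkeeping without Selenium so it stays self-contained; the class and test names are hypothetical.

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Sketch of a run report for steps 5-7: record each automated test's
// outcome, then list the failures to investigate as potential defects.
public class RunReport {
    private final Map<String, Boolean> results = new LinkedHashMap<>();

    void record(String testName, boolean passed) {
        results.put(testName, passed);
    }

    // Names of test cases that failed in this run.
    List<String> failures() {
        return results.entrySet().stream()
                .filter(e -> !e.getValue())
                .map(Map.Entry::getKey)
                .toList();
    }

    public static void main(String[] args) {
        RunReport report = new RunReport();
        report.record("loginTest", true);
        report.record("checkoutTest", false);
        System.out.println("Failed: " + report.failures());
    }
}
```

In the next experiment, TestNG generates this kind of report automatically, which is one of the motivations for integrating it.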

Program:

package program;

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

public class Selenium {
    public static void main(String[] args) {
        System.setProperty("webdriver.chrome.driver",
                "C:\\Users\\Admin\\Downloads\\chromedriver-win64\\chromedriver-win64\\chromedriver.exe");
        WebDriver d = new ChromeDriver();
        d.get("https://www.amazon.in");
        // Open the sign-in page, then submit the e-mail and password.
        d.findElement(By.xpath("//*[@id=\"nav-link-accountList\"]/span/span")).click();
        d.findElement(By.id("ap_email")).sendKeys("youremail@gmail.com");
        d.findElement(By.xpath("//*[@id=\"continue\"]")).click();
        d.findElement(By.id("ap_password")).sendKeys("your password");
        d.findElement(By.xpath("//*[@id=\"signInSubmit\"]")).click();
        // A successful login redirects back to the home page.
        String u = d.getCurrentUrl();
        if (u.equals("https://www.amazon.in/?ref_=navya_signin")) {
            System.out.println("Test Case Passed");
        } else {
            System.out.println("Test Case Failed");
        }
        d.close();
    }
}

Automation Process:

Console output:

Result:
The successful completion of this task will yield:
- Automated test scripts for the e-commerce application using Selenium WebDriver.
- Identification of defects, if any, in the application.

Exp No. 8: Integrate TestNG with the Above Test Automation
Date:

Aim:
The aim of this task is to integrate TestNG with the existing Selenium automation scripts for the e-commerce application, enhancing test management, parallel execution, and reporting capabilities.

Algorithm:
1. Set up TestNG in the project:
- Add the TestNG library to the existing Java project.

2. Organize test cases using TestNG annotations:
- Add TestNG annotations (@Test, @BeforeTest, @AfterTest, etc.) to the existing test cases.
- Group similar test cases using TestNG's grouping mechanism.

3. Implement data-driven testing (optional):
- Utilize TestNG's data providers to implement data-driven testing if required.

4. Configure the TestNG test suite:
- Create an XML configuration file for TestNG to define test suites, test groups, and other configurations.

5. Execute the automated test cases using TestNG:
- Run the automated test suite using TestNG.
- Observe the test execution and identify any failures or defects.

6. Analyze the test results:
- Review the TestNG-generated test reports to identify any failed test cases.
- Utilize TestNG's reporting capabilities to understand the test execution status.

7. Report defects (if any):
- Document any defects found during the automated testing process.
- Provide detailed information about each defect, including steps to reproduce and expected results.

Program Code (Program1.java):

package mytest;

import java.time.Duration;

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.testng.Assert;
import org.testng.annotations.AfterMethod;
import org.testng.annotations.BeforeMethod;
import org.testng.annotations.Test;

public class Program1 {

    WebDriver driver;

    @BeforeMethod
    public void setUp() {
        System.setProperty("webdriver.chrome.driver",
                "C:\\selenium\\chromedriver_win32\\chromedriver.exe");
        driver = new ChromeDriver();
        driver.get("https://amazon.in");
        driver.manage().window().maximize();
        driver.manage().timeouts().implicitlyWait(Duration.ofSeconds(5));
    }

    @Test
    public void verifyTitle() {
        String actualTitle = driver.getTitle();
        String expectedTitle = "Online Shopping site in India: Shop Online for Mobiles, Books, Watches, Shoes and More - Amazon.in";
        Assert.assertEquals(actualTitle, expectedTitle);
    }

    @Test
    public void verifyLogo() {
        boolean flag = driver.findElement(By.xpath("//a[@id='nav-logo-sprites']")).isDisplayed();
        Assert.assertTrue(flag);
    }

    @AfterMethod
    public void tearDown() {
        driver.quit();
    }
}

Program Code (pom.xml):

<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>MiniProject2</groupId>
  <artifactId>MiniProject2</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <dependencies>
    <!-- https://mvnrepository.com/artifact/org.seleniumhq.selenium/selenium-java -->
    <dependency>
      <groupId>org.seleniumhq.selenium</groupId>
      <artifactId>selenium-java</artifactId>
      <version>4.3.0</version>
    </dependency>
    <!-- TestNG is required for this experiment; the version shown is an example. -->
    <dependency>
      <groupId>org.testng</groupId>
      <artifactId>testng</artifactId>
      <version>7.4.0</version>
    </dependency>
  </dependencies>
  <build>
    <sourceDirectory>src</sourceDirectory>
    <plugins>
      <plugin>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.8.1</version>
        <configuration>
          <release>16</release>
        </configuration>
      </plugin>
    </plugins>
  </build>
</project>

Program Code (testng.xml):

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">
<suite name="Suite">
  <test name="Test">
    <classes>
      <class name="mytest.Program1"/>
    </classes>
  </test> <!-- Test -->
</suite> <!-- Suite -->

Output:

Result:
The successful completion of this task will yield:
- Integration of TestNG with the existing Selenium automation scripts.
- Enhanced test management and reporting capabilities.
- Identification of defects, if any, in the application, and improved efficiency in handling test scenarios.