1. INTRODUCTION
1.1. Overview of System X
The aim of this phase of the project is to implement a new System X platform that will enable:
- Removal of legacy office systems
- Introduction of ABC
- Processing of Special Transactions
- No constraint on the location of capture
- Capture of transactions for other processing systems
- A new Reconciliation process
- Positioning for the European ECU currency and future initiatives

This programme will result in significant changes to the current departmental and inter-office processes. The functionality will be delivered on a phased basis. Phase 1 will incorporate the following facilities:
- Replacement of the legacy System A
- New Reconciliation System
- Outsourcing system for departments in different European countries
- New/revised Audit Trail & Query facilities
[Detailed inclusions are listed later in this document.]

1.2. Purpose of this Document


This document serves as the Draft Test Approach for the Business Systems Development Project. Preparation for this test consists of three major stages:
- The Test Approach sets the scope of system testing, the overall strategy to be adopted, the activities to be completed, the general resources required, and the methods and processes to be used to test the release.
- Test Planning details the activities, dependencies and effort required to conduct the System Test.
- Test Conditions/Cases documents the tests to be applied, the data to be processed, the automated testing coverage and the expected results.

1.3. Formal Reviewing


There will be several formal review points before and during system test. This is a vital element in achieving a quality product.

1.3.1. Formal Review Points
1. Design Documentation
2. Testing Approach
3. Unit Test Plans
4. Unit Test Conditions & Results
5. System Test Conditions
6. System Test Progress
7. Post System Test Review

1.4. Objectives of System Test


At a high level, this System Test is intended to prove that:
- The functionality delivered by the development team is as specified by the business in the Business Design Specification Document and the Requirements Documentation.
- The software is of high quality: it will replace/support the intended business functions and achieves the standards required by the company for the development of new systems.
- The software delivered interfaces correctly with existing systems, including Windows 98.
[Detailed objectives are listed later in this document.]

1.4.1. Software Quality Assurance involvement

The above V Model shows the optimum testing process, where test preparation commences as soon as the Requirements Catalogue is produced. System Test planning commenced at an early stage, and for this reason the System Test will benefit from quality initiatives throughout the project lifecycle. The responsibility for testing between the Project and Software Quality Assurance (SQA) is as follows:
- Unit Test is the responsibility of the Development Team.
- System Testing is the responsibility of SQA.
- User Acceptance Testing is the responsibility of the User Representatives Team.
- Technology Compliance Testing is the responsibility of the Systems Installation & Support Group.

2. SCOPE AND OBJECTIVES
- 2.1. Scope of Test Approach - System Functions
  - 2.1.1. Inclusions
  - 2.1.2. Exclusions
- 2.2. Testing Process
- 2.3. Testing Scope
  - 2.3.1. Functional Testing
  - 2.3.2. Integration Testing
  - 2.3.3. Business (User) Acceptance Test
  - 2.3.4. Performance Testing
  - 2.3.5. Regression Testing
  - 2.3.6. Bash & Multi-User Testing
  - 2.3.7. Technical Testing
  - 2.3.8. Operations Acceptance Testing (OAT)
- 2.4. System Test Entrance/Exit Criteria
  - Entrance Criteria
  - Exit Criteria

2. SCOPE AND OBJECTIVES


2.1. Scope of Test Approach - System Functions
2.1.1. INCLUSIONS

The contents of this release are as follows:

Phase 1 Deliverables
- New and revised Transaction Processing with automated support
- New Customer Query processes and systems
- Revised Inter-Office Audit process
- Relocation of Exceptions to Head Office
- New centralised Agency Management system
- Revised Query Management process
- Revised Retrievals process
- New International Reconciliation process
- New Account Reconciliation process

2.1.2. EXCLUSIONS
When the scope of each Phase has been agreed and signed off, no further items will be considered for inclusion in this release, except:
(1) where there is the express permission and agreement of the Business Analyst and the System Test Controller;
(2) where the changes/inclusions will not require significant effort on behalf of the test team (i.e. requiring extra preparation, such as new test conditions) and will not adversely affect the test schedule. [See Section 9.1.]

2.1.3. SPECIFIC EXCLUSIONS
- Cash management is not included in this phase.
- Sign On/Sign Off functions are excluded; these will be addressed by existing processes.
- The existing Special Order facility will not be replaced.
- Foreign Currency Transactions
- International Data Exchanges
- Accounting or reporting of Euro transactions

Reference & Source Documentation:

1. Business Processes Design Document - Document Ref: BPD-1011
2. Transaction Requirements for Phase 1 - Document Ref: TR_PHASE1-4032
3. Project Issues & Risks Database - T:\Data\Project\PROJECT.MDB
4. The System Development Standards - Document Ref: DEVSTD-1098-2
5. System Development Lifecycle - Document Ref: SDLC-301

2.2. Testing Process

The diagram above outlines the Test Process approach that will be followed.

a. Organise Project involves creating a System Test Plan, Schedule & Test Approach, and requesting/assigning resources.
b. Design/Build System Test involves identifying Test Cycles, Test Cases, Entrance & Exit Criteria, Expected Results, etc. In general, test conditions/expected results will be identified by the Test Team in conjunction with the Project Business Analyst or Business Expert. The Test Team will then identify the Test Cases and the data required. The test conditions are derived from the Business Design and the Transaction Requirements documents.
c. Design/Build Test Procedures includes setting up procedures such as Error Management systems and status reporting, and setting up the data tables for the Automated Testing Tool.
d. Build Test Environment includes requesting/building hardware, software and data set-ups.
e. Execute Project Integration Test - see Section 3, Test Phases & Cycles.
f. Execute Operations Acceptance Test - see Section 3, Test Phases & Cycles.
g. Signoff - signoff happens when all pre-defined exit criteria have been achieved. See Section 2.4.
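As an illustration of step (b), the sketch below shows one way a test case and its expected result could be recorded before being loaded into the data tables of the automated testing tool. This is a minimal sketch only; the `TestCase` structure, field names and sample values are assumptions for illustration and are not the project's actual tool format.

```python
from dataclasses import dataclass

@dataclass
class TestCase:
    """One entry in a hypothetical system-test data table (illustrative only)."""
    case_id: str             # e.g. "ST-REC-001"
    condition: str           # the test condition being exercised
    test_data: dict          # the data to be processed
    expected_result: str     # expected outcome, agreed with the Business Analyst
    cycle: int = 1           # test cycle in which the case is scheduled
    automated: bool = False  # True if covered by the automated testing tool

# Illustrative test case for the new Reconciliation process.
cases: list[TestCase] = [
    TestCase(
        case_id="ST-REC-001",
        condition="A matched inter-office transaction is reconciled",
        test_data={"office": "Paris", "amount": 1250.00},
        expected_result="Transaction marked as reconciled; no exception raised",
        cycle=1,
        automated=True,
    ),
]
print(f"{len(cases)} test case(s) defined for cycle 1")
```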

2.2.1. Exclusions

SQA will not deal directly with business design regarding any design or functional issues or queries.

The development team is the supplier to SQA; if design or functional issues arise, they should be resolved by the development team and its suppliers.

2.3. Testing Scope


Outlined below are the main test types that will be performed for this release. All system test plans and conditions will be developed from the functional specification and the requirements catalogue.

2.3.1. Functional Testing


The objective of this test is to ensure that each element of the application meets the functional requirements of the business as outlined in the:
- Requirements Catalogue
- Business Design Specification
- Year 2000 Development Standards
- Other functional documents produced during the course of the project, i.e. resolutions to issues/change requests/feedback.

This stage will also include Validation Testing, which is intensive testing of the new front-end fields and screens: compliance with Windows GUI standards; valid, invalid and limit data input; screen and field look and appearance; and overall consistency with the rest of the application. The third stage includes Specific Functional Testing: low-level tests which aim to exercise the individual processes and data flows.
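To make the valid, invalid and limit data input checks concrete, here is a minimal sketch of a boundary-value check against a hypothetical field validation rule. The `validate_amount_field` routine and its limits are assumptions for illustration; the real field rules come from the Business Design Specification.

```python
# Hypothetical validation rule for a transaction amount field (illustrative limits).
def validate_amount_field(value: str) -> bool:
    """Accept amounts from 0.01 up to 999999.99 with at most two decimal places."""
    try:
        amount = float(value)
    except ValueError:
        return False                      # non-numeric input is rejected
    if round(amount, 2) != amount:
        return False                      # more than two decimal places
    return 0.01 <= amount <= 999999.99

# Valid, invalid and limit cases in the spirit of Validation Testing.
cases = {
    "0.01": True,         # lower limit
    "999999.99": True,    # upper limit
    "0.00": False,        # just below the lower limit
    "1000000.00": False,  # just above the upper limit
    "12.345": False,      # too many decimal places
    "abc": False,         # non-numeric input
}

for value, expected in cases.items():
    actual = validate_amount_field(value)
    status = "PASS" if actual == expected else "FAIL"
    print(f"{status}: validate_amount_field({value!r}) -> {actual} (expected {expected})")
```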

2.3.2. Integration Testing


This test proves that all areas of the system interface with each other correctly and that there are no gaps in the data flow. The Final Integration Test proves that the system works as an integrated unit once all fixes are complete.

2.3.3. Business (User) Acceptance Test


This test, which is planned and executed by the Business Representative(s), ensures that the system operates in the manner expected and that any supporting material, such as procedures and forms, is accurate and suitable for its intended purpose. It is high-level testing, ensuring that there are no gaps in functionality.

2.3.4. Performance Testing


These tests ensure that the system provides acceptable response times (which should not exceed 4 seconds).
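As a minimal sketch of how the 4-second target could be checked, the example below times a single operation and compares the elapsed time with the target. The `submit_transaction` call is a placeholder assumption, not the project's actual interface.

```python
import time

MAX_RESPONSE_SECONDS = 4.0   # acceptable response time stated in this test approach

def submit_transaction(payload: dict) -> None:
    """Placeholder for the operation under test (e.g. posting a transaction)."""
    time.sleep(0.5)          # simulated processing delay for the sketch

def timed_check(payload: dict) -> tuple[float, bool]:
    """Return the elapsed time and whether it met the performance target."""
    start = time.perf_counter()
    submit_transaction(payload)
    elapsed = time.perf_counter() - start
    return elapsed, elapsed <= MAX_RESPONSE_SECONDS

elapsed, passed = timed_check({"office": "London", "amount": 100.00})
print(f"Response time {elapsed:.2f}s - {'within' if passed else 'exceeds'} the 4 second target")
```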

2.3.5. Regression Testing


A regression test will be performed after the release of each Phase to ensure that:
- there is no impact on previously released software, and
- the functionality and stability of the software have increased.

The regression testing will be automated using the automated testing tool.
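The sketch below illustrates, under stated assumptions, how an automated regression run could replay baselined cases and flag any deviation from the expected results. The CSV layout and the `run_case` stub are assumptions for illustration; the actual automated testing tool's format is not described in this document.

```python
import csv

def run_case(case_id: str, inputs: dict) -> str:
    """Placeholder for executing one regression case through the application."""
    return "OK"   # the real tool would return the observed result here

def run_regression(baseline_csv: str) -> bool:
    """Re-run every baselined case and report any deviation from expected results."""
    failures = []
    with open(baseline_csv, newline="") as f:
        for row in csv.DictReader(f):     # columns assumed: case_id, inputs, expected
            actual = run_case(row["case_id"], {"raw": row["inputs"]})
            if actual != row["expected"]:
                failures.append((row["case_id"], row["expected"], actual))
    for case_id, expected, actual in failures:
        print(f"REGRESSION: {case_id} expected {expected!r} but got {actual!r}")
    return not failures

# Example: run_regression("phase1_baseline.csv") returns True only if all cases still pass.
```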

2.3.6. Bash & Multi-User Testing


Multi-user testing will attempt to prove that an acceptable number of users can work with the system at the same time. The object of bash testing is an ad hoc attempt to break the system.
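As a rough sketch of multi-user testing, the example below uses threads to simulate several users exercising the system at the same time. The user count of 10 and the `perform_user_session` body are illustrative assumptions; the acceptable number of concurrent users is defined by the business.

```python
import threading

results = []
results_lock = threading.Lock()

def perform_user_session(user_id: int) -> None:
    """Placeholder session: each simulated user performs one operation."""
    outcome = f"user {user_id}: completed"   # the real test would drive the application here
    with results_lock:
        results.append(outcome)

# Simulate an assumed 10 concurrent users.
threads = [threading.Thread(target=perform_user_session, args=(i,)) for i in range(10)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(f"{len(results)} of {len(threads)} simulated sessions completed")
```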

2.3.7. Technical Testing


Technical Testing will be the responsibility of the Development Team.

2.3.8. Operations Acceptance Testing (OAT)


This phase of testing is to be performed by the Systems Installation and Support group, prior to implementing the system in a live site. The SIS team will define their own testing criteria, and carry out the tests.

2.4. System Test Entrance/Exit Criteria


2.4.1. Entrance Criteria
The Entrance Criteria specified by the System Test Controller should be fulfilled before System Test can commence. In the event that any criterion has not been achieved, the System Test may commence if the Business Team and the Test Controller are in full agreement that the risk is manageable.
- All developed code must be unit tested.
- Unit and Link Testing must be completed and signed off by the development team.
- System Test plans must be signed off by the Business Analyst and the Test Controller.
- All human resources must be assigned and in place.
- All test hardware and environments must be in place and free for System Test use.
- The Acceptance Tests must be completed, with a pass rate of not less than 80%.

Acceptance Tests: 25 test cases will be performed for the acceptance tests. To achieve the acceptance criteria, 20 of the 25 cases must be completed successfully, i.e. a pass rate of 80% must be achieved before the software will be accepted for System Test proper to start. This means that any errors found during acceptance testing should not prevent the completion of 80% of the acceptance test cases. Note: these tests are not intended to perform in-depth testing of the software. [For details of the acceptance tests to be performed see X:\Testing\Phase_1\Testcond\Criteria.doc]
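A minimal sketch of the 80% acceptance gate described above: it counts passed cases against the total and compares the rate with the agreed threshold. The sample results are illustrative.

```python
def acceptance_gate(results: list[bool], required_rate: float = 0.80) -> bool:
    """Return True if the pass rate meets the entrance criterion (80% by default)."""
    passed = sum(results)
    rate = passed / len(results)
    print(f"{passed}/{len(results)} acceptance cases passed ({rate:.0%})")
    return rate >= required_rate

# Illustrative run: 21 of 25 cases pass, so the 80% entrance criterion is met.
example_results = [True] * 21 + [False] * 4
print("System Test may start:", acceptance_gate(example_results))
```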

Resumption Criteria

In the event that system testing is suspended, resumption criteria will be specified, and testing will not recommence until the software meets those criteria.

2.4.2. Exit Criteria


The Exit Criteria detailed below must be achieved before the Phase 1 software can be recommended for promotion to Operations Acceptance status. Furthermore, I recommend a minimum of 2 days' effort of Final Integration testing AFTER the final fix/change has been retested. [See Section 9.3.]
- All high-priority errors from System Test must be fixed and tested.
- If any medium- or low-priority errors are outstanding, the implementation risk must be signed off as acceptable by the Business Analyst and the Business Expert.
- Project Integration Test must be signed off by the Test Controller and the Business Analyst.
- Business Acceptance Test must be signed off by the Business Expert.

Chapter 4 - Test Schedule

4. System Test Schedule


These are screenshots of several high-level views of the project schedule. These schedules are intended as examples only and may not correspond exactly with the rest of the test plan.

Chapter 5 - Resources

5. RESOURCES
- 5.1. Human
- 5.2. Hardware
  - Hardware components required
- 5.3. Software
  - Test Host environments
  - Test Branch Software
  - Error Measurement System

5. RESOURCES
5.1. Human
Resource Type           | Resource Title                          | No. | Date Req'd | Who        | Status
------------------------|-----------------------------------------|-----|------------|------------|---------------
Project Mgmt/Functional | Business Analyst                        | 1   |            | A.N. Other | Assigned
Testing                 | Test Controller                         | 1   |            | A. Smith   | Assigned
Testing                 | Testers                                 | 4   | 1st May    |            | To Be Assigned
Test Support Team       | Support Programmers                     | 4   | 15th May   |            | To Be Assigned
Test Support Team       | Technical Support                       | 1   | 1st May    |            | To Be Assigned
Test Support Team       | WAN Support                             | 1   | 25th May   |            | To Be Assigned
Test Support Team       | CIS Support                             | 1   | 25th May   |            | To Be Assigned
Test Support Team       | Bookkeeping Support                     | 1   | 15th May   |            | To Be Assigned
Technical - External    | External Liaison Support                | 1   | 25th May   | C. Jones   | Assigned
Business                | Business Expert/Business Representative | 1   | 1st May    |            | To Be Assigned

5.2. Hardware
One separate, controlled system will be required for the initial phase of testing, set up as per one standard, complete office environment. In order to maintain the integrity of the test environment, this network will not be accessible to anybody outside this project. The printers are also exclusively for use by the test network.

Hardware components required


- 1 Network Controller
- 6 Networked PCs (see below)
- 1 DAP Workstation
- 1 Motorola 6520
- 1 Alpha AXP Server
- 1 Batch Waste Printer
- 1 HP LaserJet 4v Printer

PC Specifications

The 6 PCs required for the test environment will include the following:
- 1 x P100, 1Gb HD, 16Mb RAM [current minimum specification]
- 3 x P166, 1.5Gb HD, 32Mb RAM [current standard specification]
- 1 x P333, 2.5Gb HD, 64Mb RAM [current maximum specification]

These are the various specifications currently in use in different branches. 1 x Pentium running Windows NT is also required as the test center for controlling and executing the automated testing.

5.3. Software
Test IMS environments
Test IMS region X will be required for System Testing. Additional or amended data will be populated where required.

Test Environment Software


System Test will be run on the following software versions:
- Custom Desktop Vers. 97.0.1
- Windows 95 Operating System
- Visual Basic 5 Runtime Files
- MS Office 97
- Novell Netware

Error Measurement System


This system test will use a bespoke Error Management system built as an MS Access database. A new database will be implemented for the sole use of this project.
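Purely as an illustration of the kind of record such a database might hold, the sketch below creates a minimal error-report table using SQLite as a stand-in (the real system is a bespoke MS Access database, and the column names shown are assumptions).

```python
import sqlite3

# SQLite used here only as a stand-in for the bespoke MS Access database.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE error_report (
        error_id     INTEGER PRIMARY KEY,
        raised_on    TEXT NOT NULL,   -- date the Error Report form was completed
        raised_by    TEXT NOT NULL,   -- tester who raised the error
        description  TEXT NOT NULL,
        status       TEXT NOT NULL,   -- e.g. 'Error Raised' or 'Query Raised'
        category     TEXT             -- A, B or C, assigned by the Error Review Team
    )
""")
conn.execute(
    "INSERT INTO error_report (raised_on, raised_by, description, status, category) "
    "VALUES (?, ?, ?, ?, ?)",
    ("1998-05-04", "Tester 1", "Reconciliation screen rejects valid amount", "Error Raised", None),
)
for row in conn.execute("SELECT error_id, status, description FROM error_report"):
    print(row)
```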

Chapter 6 - Roles & Responsibilities

6. ROLES AND RESPONSIBILITIES
- 6.1. Management Team
- 6.2. Testing Team
- 6.3. Business Team
- 6.4. Testing Support Team
- 6.5. External Support Team

6. ROLES AND RESPONSIBILITIES


6.1. Management Team
Project Leader - B. Ruthlenn
- Ensure Phase 1 is delivered to schedule, budget and quality.
- Ensure Exit Criteria are achieved prior to System Test signoff.
- Regularly review testing progress with the Test Controller.
- Liaise with external groups, e.g. New Systems.
- Raise and manage issues/risks relating to the project or outside the Test Team's control.
- Review and sign off the test approach, plans and schedule.

SQA Project Leader - C. Nicely
- Ensure Phase 1 is delivered to schedule, budget and quality.
- Regularly review testing progress.
- Manage issues/risks relating to the System Test Team.
- Provide the resources necessary for completing system test.

6.2. Testing Team


Test Planner / Controller - D. Everyman
- Ensure Phase 1 is delivered to schedule, budget and quality.
- Produce high-level and detailed test conditions.
- Produce expected results.
- Report progress at regular status reporting meetings.
- Co-ordinate review and signoff of test conditions.
- Manage individual test cycles and resolve tester queries/problems.
- Ensure test system outages/problems are reported immediately and followed up.
- Ensure Entrance Criteria are achieved prior to System Test start.
- Ensure Exit Criteria are achieved prior to System Test signoff.

Testers
- Identify test data.
- Execute test conditions and mark off results.
- Raise Software Error Reports.
- Administer the Error Measurement System.

6.3. Business Team


Business Analyst - E. Showman
- Review high-level/detailed test plans for System Test.
- Define procedures.
- Resolve design issues.
- Resolve business issues.
- Take part in daily test Error Review Team meetings.

Business Representative - ?? (To be Assigned)
- Execute User Acceptance Testing.
- Define test conditions/expected results for the Business Acceptance Test.
- Resolve user issues.
- Resolve design issues.

6.4. Testing Support Team


Support Programmers
- Take part in daily Error Review Team meetings.
- Co-ordinate/provide support for system test.
- Resolve errors.
- Re-release test software after amendments.
- Support system testers.

6.5. External Support Team


CIS Support
- Provide CIS support, if required.
- Resolve CIS queries, if required.

IMS Support
- Provide System Test support.
- Support IMS regions.
- Resolve spooling issues (if necessary).
- Bookkeeping integration and compliance (if necessary).
- Resolve queries arising from remote backup.

Bookkeeping Support
- Provide Bookkeeping technical support, if required.
- Resolve queries, if required.

Technical Support
- Provide support for the hardware environment.
- Provide support for the test software.
- Promote software to the system test environment.

Access Support
- Provide and support test databases.

Chapter 7 - Error Management & Configuration Management

7. Error Management & Configuration Management

During System Test, errors will be recorded as they are detected on Error Report forms. These forms will be entered into the Error Management System each evening with a status of "Error Raised" or "Query Raised". The Error Review Team will meet each morning (10am, Conference Room) to review and prioritise the DNs raised the previous day, and assign them or drop them as appropriate. This team will consist of the following representatives:

- A. Boring - Development Team Leader
- B. Curie - Business Analyst
- C. Durine - Test Controller
- D. Ewards - Business Representative

Errors that are agreed as valid will be categorised as follows by the Error Review Team:

- Category A - Serious errors that prevent System Test of a particular function continuing, or serious data-type errors.
- Category B - Serious or missing data-related errors that will not prevent implementation.
- Category C - Minor errors that do not prevent or hinder functionality.

Category A errors should be turned around by the Bug Fix Team within 48 hours (measured from the time the error is raised at the Error Review Team meeting to the time the fix is released to the System Test environment). In the event of a Category A error that prevents System Test continuing, the turnaround should be within 4 hours. Category B errors should be turned around within 1 day, while Category C errors should be turned around within 3 days. However, the release of newer versions of the software will be co-ordinated with the Test Controller; new versions should only be released when agreed, and where there is a definite benefit (i.e. the version contains fixes for X or more bugs).
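As a small sketch of these turnaround targets, the mapping below expresses the agreed times in hours and flags any error that has been open longer than its target. The ages shown are illustrative.

```python
# Turnaround targets from the Error Review Team agreement, expressed in hours.
TURNAROUND_HOURS = {
    "A (blocking)": 4,    # Category A error that prevents System Test continuing
    "A": 48,
    "B": 24,
    "C": 72,
}

def overdue(category: str, hours_open: float) -> bool:
    """True if an error has been open longer than its agreed turnaround target."""
    return hours_open > TURNAROUND_HOURS[category]

# Illustrative check of three open errors.
for cat_label, age in [("A (blocking)", 6), ("B", 20), ("C", 80)]:
    state = "OVERDUE" if overdue(cat_label, age) else "within target"
    print(f"Category {cat_label}: open {age}h - {state}")
```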

8. STATUS REPORTING
8.1. Status Reporting
Test preparation and testing progress will be formally reported at a weekly Status Meeting. The attendees at this meeting are:

- Byron Ruthlenn - Project Manager
- Dion Ryan - Business Design Team
- Pat Smith - Development Team Leader

A status report will be prepared by the Test Controller to facilitate this meeting. This report will contain the following information:
1. Current status v. plan (ahead/behind/on schedule)
2. Progress of tasks planned for the previous week
3. Tasks planned for the next week, including tasks carried over from the previous week
4. Error statistics from the Error Measurement System
5. Issues/Risks
6. AOB
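The sketch below shows one possible in-memory representation of such a weekly report, mirroring the six items above; the field names and sample values are assumptions for illustration only.

```python
from dataclasses import dataclass, field

@dataclass
class WeeklyStatusReport:
    """Illustrative structure mirroring the six report items listed above."""
    status_vs_plan: str                # "Ahead", "Behind" or "On Schedule"
    progress_last_week: list[str]
    tasks_next_week: list[str]
    error_statistics: dict             # counts taken from the Error Measurement System
    issues_risks: list[str]
    aob: list[str] = field(default_factory=list)

report = WeeklyStatusReport(
    status_vs_plan="On Schedule",
    progress_last_week=["Cycle 1 test conditions executed"],
    tasks_next_week=["Start Cycle 2", "Retest Category A fixes"],
    error_statistics={"A": 1, "B": 4, "C": 7},
    issues_risks=["WAN Support resource not yet assigned"],
)
print(f"Status: {report.status_vs_plan}; open errors: {sum(report.error_statistics.values())}")
```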

Chapter 9 - Issues, Risks and Assumptions

9. ISSUES/RISKS & ASSUMPTIONS
- 9.1. Issues/Risks
- 9.2. Assumptions

9. Issues, Risks and Assumptions


9.1. Issues/Risks
1. No further changes or inclusions will be considered for inclusion in this release except (1) where there is the express permission and agreement of the Business Analyst and the System Test Controller, or (2) where the changes/inclusions will not require significant effort on behalf of the test team and will not adversely affect the test schedule. This is a potentially serious issue, as any major change to the design will entail additional time to replan testing and to create or amend test conditions. Resp: Byron Ruthlenn. Final list of inclusions to be signed off.
2. The design of the software must be final, and the design documentation must be complete, informative and signed off by all parties before System Test proper commences. Resp: D.A. Stone.
3. A weakness in the phased-delivery approach is that the high degree of interdependency in the code means that the smallest change can have serious effects on areas of the application which apparently have not been changed. The assumption of the test team is that previously delivered and tested functionality will only require regression testing to verify that it still works, i.e. this testing is not intended to discover new errors. Because of this, I recommend a minimum of 2 days of regression testing AFTER the final fix/change has been retested. This, however, imposes a fixed time constraint on the completion of system testing, which requires the agreement of the Project Leader. Resp: Byron Ruthlenn.
4. Automated Testing: the majority of the regression testing will be performed using the automated test tool. However, due to the workload required to implement (and debug) the test tool fully, it is likely that the return will only be maximised after the third run of the regression test suite for each release. The other major uses of the test tool are (1) load testing, (2) multi-user testing, and (3) repetitive data entry. Resp: Test Controller.

9.2. Assumptions

- Software will be delivered on time.
- Software is of the required quality.
- The software will not be impacted by impending Y2K compliance changes to the external software infrastructure, i.e. any external software changes will have to be compatible with this application.
- All "show-stopper" bugs receive immediate attention from the development team.
- All bugs found in a version of the software will be fixed and unit tested by the development team before the next version is released.
- Functionality is delivered to schedule.
- Required resources are available.
- All service agreements will be met.
- The automated test tool will function and interface correctly with the software.
- All documentation will be up to date and delivered to the system test team.
- Functional and technical specifications will be signed off by the business.
- The Intranet will be fully functional prior to project commencement.
