
Test Plan

For

Doyen Horse
Rating System
V2.1
Date: 30/10/2010

Authors:
Ranisha Fernando
Rachel Phillips
Other Contributors:
Ben Hartley
Anusha Raveendran


Contents

1.0 Introduction
    1.1 Purpose
    1.2 Testing Strategy
    1.3 Scope
    1.4 Intended Audience
    1.5 Constraints and Assumptions
        1.5.1 Assumptions
        1.5.2 Dependencies
    1.6 Reference Material
    1.7 Definitions and Acronyms
2.0 Resource Requirements for Testing
    2.1 Hardware Requirements
    2.2 Software Requirements
    2.3 Testing Tools
        2.3.1 Bugzilla
        2.3.2 Dreamweaver
        2.3.3 W3C Schools Validators
        2.3.4 PHP Validator
3.0 Staffing
    3.1 Roles and Responsibilities
        3.1.1 Test Manager
        3.1.2 Quality Assurance Manager
4.0 Testing Methodology
5.0 Types of Testing
    5.1 Non-execution testing
        5.1.1 User Documentation Testing
        5.1.2 Static Content Testing
            5.1.2.1 Appearance Testing
        5.1.3 Non-executable Unit Testing
    5.2 Execution Testing
        5.2.1 Unit Testing
        5.2.2 Integration Testing
    5.3 System Testing
6.0 Testing Boundaries
    6.1 Features to be tested
    6.2 Features not to be tested
7.0 Test Deliverables
8.0 Item pass/fail criteria
    8.1 Suspension Criteria
    8.2 Resumption Requirements
    8.3 Approval Criteria
9.0 Testing traceability to requirements
10.0 Test Cases
    10.1 Log in function
    10.2 Internal links
    10.3 External Links
    10.4 User Data Entry forms
    10.5 Calculation of Ratings
    10.6 Display of Results
    10.7 Data storage, retrieval and deletion
    10.8 System test
11.0 Testing Schedule
12.0 Risks and contingencies
13.0 Approvals
14.0 Appendix


1.0 Introduction
1.1 Purpose
The purpose of the testing plan is to ensure the software meets its
requirements and standards.
Testing is one of the most important parts of the Doyen horse rating system development process. It ensures that all of the functional requirements of the project perform as intended before implementation of the system. Testing is also part of quality assurance for the project.
Some of the key points of this document are:
- Test strategy
- Testing methodology
- Resource requirements for testing
- Roles and responsibilities in the testing process
- Test cases
- Risk identification and management


1.2 Testing Strategy


Sufficient resources and skills must be allocated to complete the testing. Team members have been allocated to testing, allowing for the risk of tests failing and the need for redevelopment. A member of the development team is always available if there is a need for recoding, so that testing of other requirements can continue in parallel.
Most of the functional requirements can be tested, or their testing needs estimated, at this stage of the development process. Functional requirements are tested according to the proposed test cases. Results are compared, and defects are referred back to the development team to be fixed accordingly.
Test cases are created to test the usability, correctness, completeness, consistency, maintainability and security aspects of the software.

1.3 Scope
The test plan has been developed to ensure the software meets the quality and standards listed in the Software Requirements Specification (SRS) document.
It is not intended to be a software development document. It is simply a guide to the testing required and the methods to be employed to complete that testing, both during development and at completion of the software.
Once all the tests have been completed, the software should meet the standard and quality expected by the stakeholders of the project.


1.4 Intended Audience


This document will be used primarily by the Techno Solutions team during the development of the software. It should be used in conjunction with the Software Requirements Specification (SRS) and the Software Detailed Design (SDD) documents.

1.4.1 Supervisor
Name: Richard Dazeley
Contact Number: +61 3 5327 9769
Fax Number: +61 3 5327 9289
Email Address: r.dazeley@ballarat.edu.au
Institute: University of Ballarat

1.4.2 Techno Solutions Team

Name              | Contact Number | Email Address
Rachel Phillips   | 0448 291 496   | rachelphillips@students.ballarat.edu.au
Ben Hartley       | 0423 667 008   | bjhart@ncable.net.au
Anusha Raveendran | 0423 280 625   | anushkuti@gmail.com
Ranisha Fernando  | 0400 935 234   | rani3fer@yahoo.com


1.5 Constraints and Assumptions


Due to unavoidable circumstances, the project manager has been away from the testing process and may also be away during the implementation. Other team members have also been ill throughout the development process.
This risk has so far been well managed by the team. While a system should be ready to present on the 25th of October as scheduled, it will not have been through all of the testing required before final implementation. Final implementation is, at this stage, expected to be delayed by approximately 2-3 weeks, depending on circumstances in the upcoming weeks.

1.5.1 Assumptions
- Software with manually entered form guide data will be delivered on time.
- Software with automatically scraped form guide data will be delivered approximately 2-3 weeks behind schedule.
- Software will be up to the required standard.
- All bugs found in testing will be fixed before final implementation of the software.
- All the required resources are available for testing.
- All the documentation is accurate and up to date, so the testing team can perform their tasks.
- The client has provided all the mathematical formulas for the calculations, to ensure the system remains accurate.
- The client provides the online resources from which he gets the input information; it is assumed this data will be compatible with our code. If it is not, there are many other sources available with the data needed.
- If the URL in the source code is not found, the client will be able to manually enter the URL of the information source or the form guide data needed.
- It is assumed that the client will make every effort to attend client meetings, to ensure his needs are met.
- The team has the required skills to perform the project, or will be able to outsource them.

1.5.2 Dependencies
- The horse form information is taken from a website and will need to be compatible with the program.
- The application is assumed to run in Mozilla Firefox and to use WAMP, as the deployment environment is not finalised yet.


1.6 Reference Material


External
Baker, H. Test Plan V2.0. Retrieved October 5, 2010, from http://people.cis.ksu.edu/~hadassa/documents_pdf/phase3/TestPlan2.pdf
Microsoft. (2005). Testing Methodologies. Retrieved October 5, 2010, from http://msdn.microsoft.com/en-us/library/ff649520.aspx
SourceForge. Testing Tools. Retrieved September 25, 2010, from http://sourceforge.net/
Software Testing. (2009). GUI Test Drivers. Retrieved September 25, 2010, from http://www.testingfaqs.org/t-gui.html
South Australian Government. (2002). Risk Assessment Matrix. Retrieved September 25, 2010, from www.safework.sa.gov.au/contentPages/docs/swiY1A6T2RiskAssessmentMatrix.pdf
Team E. Inkjet & Toner Information and Ordering Website: Test Plan. Retrieved October 2, 2010, from http://uobcommunity.ballarat.edu.au/units/itmsprojects/exdocs/teame_2006/documentation.htm

Internal
Software Requirements Specification, April 2010.
Software Detailed Design Document.
Software Project Management Plan.


1.7 Definitions and Acronyms


Bugzilla: Testing tool which assists in tracking down and fixing bugs
DB: Database
GUI: Graphical User Interface
HTML: Hyper Text Markup Language
SDD: Software Detailed Design
SPMP: Software Project Management Plan
Spiral Testing Methodology: Testing methodology involving an iterative approach which facilitates rapid changes in a project
SRS: Software Requirements Specification
TP: Test Plan
UB: University of Ballarat
WAMP: Windows, Apache, MySQL and PHP package which acts as a local server for a system when a dedicated server is not available


2.0 Resource Requirements for Testing


2.1 Hardware Requirements
The Doyen horse rating system is expected to run only on the client's machine with a local server. Therefore, hardware requirements are minimal and include:
- 256 MB RAM
- 512 MHz processor
- 800 x 600 screen resolution
- Windows operating system

2.2 Software Requirements

The system requires a browser and a server to run. Mozilla Firefox is the client's preferred browser, and the application will be developed accordingly. The application will be run and tested in a Windows operating environment; therefore, the software will not be transferable to Mac or Linux environments. The client is also required to download WAMP Server to run the application locally on his machine. The database used is MySQL based and is managed through phpMyAdmin, the database administration tool bundled with WAMP Server.
Software requirements for the system are needed during the testing stage. They are minimal and include:
- Mozilla Firefox 7 or higher
- Internet Explorer 6
- WAMP Server
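Before any testing begins, the environment itself can be smoke-tested. The sketch below is illustrative only: it assumes WAMP's default root account with an empty password, and the database name 'doyen' is a placeholder rather than the real schema name.

    <?php
    // Environment sanity check: confirm PHP executes and the MySQL server
    // bundled with WAMP accepts the default local connection.
    $db = new mysqli('localhost', 'root', '', 'doyen'); // placeholder database name
    if ($db->connect_error) {
        echo 'FAIL: MySQL not reachable - ' . $db->connect_error . "\n";
    } else {
        echo "OK: WAMP MySQL server is running\n";
    }
    ?>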


2.3 Testing Tools


2.3.1 Bugzilla
Bugzilla is an open source defect tracking application. A defect tracking application helps developers and testers keep track of defects in the application, manage code changes, communicate with team members, submit patches and manage software quality.

2.3.2 Dreamweaver
As the application code will be written in Dreamweaver, validation of the code, in particular for syntax errors, will be performed continually during development. It is hoped that all syntax errors will be picked up at this stage.

2.3.3 W3C Schools Validators


All HTML and CSS files will be validated using the W3Schools validation tools, which are based on the World Wide Web Consortium's validators. The tools available from http://www.w3schools.com/site/site_validate.asp may be employed for validating files.

2.3.4 PHP Validator


In order to validate PHP code, an open source tool will be used. It is free to download and use, and was written by Nilesh Gamit. The software is available at http://sourceforge.net/projects/php-validator/.


3.0 Staffing
3.1 Roles and Responsibilities
3.1.1 Test Manager
The Test Manager is in charge of the whole testing process and manages resources throughout the testing program.

3.1.2 Quality Assurance Manager


The Quality Assurance Manager works as part of the testing team to ensure the software meets its standards.

Name              | Role                                           | Responsibilities
Ranisha Fernando  | Test Manager                                   | In charge of the whole testing process
Ben Hartley       | Primary Tester                                 | In charge of most testing
Anusha Raveendran | Quality Assurance Manager and Secondary Tester | Involved in testing and checking the quality of testing
Rachel Phillips   | Back-up Tester                                 | Involved only in major testing aspects


4.0 Testing Methodology


A Spiral testing methodology has been adopted as a component of the overall development methodology used for the Doyen horse rating system. Spiral testing follows an iterative testing cycle, which suits our application and agile development. As our development follows an iterative approach, it is logical to test within each sprint as well as at the completion of development. This enables us to test the software as development occurs.
There have been some changes to the system during development, due both to client preferences changing and to new functions being added to the calculation of ratings. The iterative development and testing process has been ideally suited to coping with these changes, as it is adaptive and flexible in nature.
Figure 1: Testing sprint within the Agile development methodology
[Diagram: a testing and evaluation cycle - test (including all previous tests), solve any errors, refactor; serious errors feed into risk and time management and the forming of programming focus groups.]

Figure 2: Lifecycle of an Agile methodology sprint
[Diagram: requirements for the main program (Doyen rating system, version xx.xx) are prioritised with reference to the overall system UML diagrams. Each requirement is developed (additional UML if required, code the requirement), documented (the SRS is updated with final requirement details), then tested and evaluated (test including all previous tests, solve any errors, refactor; serious errors feed into risk and time management and programming focus groups). Working software is implemented into the main program, providing concrete feedback and optional research for the next requirement, after which work may begin on the next function.]


5.0 Types of Testing


5.1 Non-execution testing
Non-execution testing involves testing elements of the system that are non-executable, such as:
- User documentation testing
- Static content testing
- Appearance testing
Performance and results of these tests should be documented using the appropriate checklists and spreadsheets.

5.1.1 User Documentation Testing


As part of the project, a user manual will be provided to the user with all the information needed to run the application. The user manual will be proofread by a team member who was not part of creating the document. The manual will also be tested by an external person who is not part of the team, to get the perspective of a real user. The client will also be given the document, to check whether it is understandable to him.


5.1.2 Static Content Testing

All pages have static content such as links. These should be tested to ensure that they take the user to the intended page and are not confusing to use.

5.1.2.1 Appearance Testing


Appearance of basic page elements is static. Each page should have static
page elements such as the header, navigation and footer sections. Each page
should be tested to ensure these static elements display as intended.
The content section of each page should be checked to make sure it is
readable and user friendly and displays the information intended for the page.


5.1.3 Non-executable Unit Testing

Code testing that does not involve actual execution of the application will be performed in two forms: code inspections and walkthroughs. Both will focus on redundant code, uncommented code, code grammar or syntax, and the logic of the code.
A final comprehensive code walkthrough and inspection should be performed before moving on to executable unit testing.

Code Walkthroughs
Code walkthroughs will focus on all the coding done for web scraping, calculations and the GUI, and check for accuracy.

Code Inspections
As part of the code inspection, two programmers will work in pairs and go through all the code together. This is done halfway through the development process.


5.2 Execution Testing

Execution testing involves performing test cases and comparing the actual outcome with the expected outcome.
A final code walkthrough and inspection should be completed before any execution testing is performed. Unit tests should be documented using the appropriate checklist. Examples of these checklists are located in the appendix of this document.
Execution testing covers:
- Unit Testing
- Integration Testing
- System Testing
- Usability Testing
- Acceptance Testing

[Diagram: execution testing flow - final code inspections and walkthroughs, then unit testing, integration testing, system testing, usability testing and acceptance testing.]


5.2.1 Unit Testing


Unit testing involves checking small modules and other parts of the software to ensure that all functions are working as intended.
All the calculations of the Doyen horse rating system will be tested under unit testing. Each race calculation is treated as a unit and is tested separately. Races that have been previously calculated using the manual Doyen system will be used to test the accuracy of the automated calculations.
Web scraping will also be subject to unit testing. There are three different pages, and which pages are accessed is decided by the user; therefore it is very important that as many pages as possible are tested. This area will be subject to many small unit tests.
Storing data to the database and retrieving data from the database are also treated as two different units and tested individually. Testing will ensure that the intended data is retrieved by the system. Dummy data will be fed into the database for retrieval testing. Results from successful unit tests of rating calculations will be fed to the database to test the data storing functionality.
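To illustrate the shape of such a unit test, the sketch below compares one automated rating against a hand-calculated benchmark. The function name calculateRating(), its input array and the figures are all placeholders; the real signatures and formulas live in the SDD.

    <?php
    // Hypothetical unit test: compare the automated rating with the known
    // result of the manual Doyen calculation for the same horse.
    require_once 'calculateRatings.php'; // assumed location of the rating functions

    // Form guide figures for one horse from a race already rated by hand
    // (the values are placeholders, not real data).
    $horse = array('weight' => 56.5, 'lastStart' => 2, 'class' => 70);

    $expected = 78.0;                  // rating produced by the manual system
    $actual = calculateRating($horse); // hypothetical function under test

    if (abs($actual - $expected) < 0.01) {
        echo "PASS: rating matches manual calculation\n";
    } else {
        echo "FAIL: expected $expected, got $actual\n";
    }
    ?>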

5.2.2 Integration Testing

Integration testing checks how different modules, pages or classes function when combined.
An integration test will be performed on all parts of the system that need to interact with each other. These tests involve executing a command that sends a request or requires a response.
This testing will often involve combining two tests into one, such as automatic retrieval of form guide data with calculation of race ratings, or storing results in the database and then retrieving them. This will ensure that parts of the system that interact with one another perform together as intended.
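A minimal sketch of one such chained test, assuming hypothetical scrapeFormGuide() and calculateRating() helpers and a placeholder page URL:

    <?php
    // Hypothetical integration test: feed scraped form guide data straight
    // into the rating calculation and confirm a numeric rating comes back.
    require_once 'scrapeFormGuide.php';  // assumed scraping functions
    require_once 'calculateRatings.php'; // assumed calculation functions

    $raceUrl = 'http://www.cyberhorse.com.au/'; // placeholder for the real race page
    $horses = scrapeFormGuide($raceUrl);        // hypothetical: one array per horse

    foreach ($horses as $horse) {
        $rating = calculateRating($horse);      // hypothetical function under test
        if (!is_numeric($rating)) {
            echo "FAIL: no rating produced for {$horse['name']}\n";
        }
    }
    echo "Integration run complete\n";
    ?>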

5.3 System Testing

System testing is the overall testing of the software. At this stage, all modules of the application are treated as one system and tested accordingly, from the first stage through to the final stage.
This testing will involve the entry of at least 6 real races that have been previously calculated using the manual system. The testing will exercise the system from front to back:
- Logging in to the system
- Automatic retrieval of data (if employed)
- User entry of data
- Calculation of race ratings based on that data
- Saving of completed results into the database
- Retrieval of data from the system


6.0 Testing Boundaries


6.1 Features to be tested
Functionality of the system in the following areas will be tested:
- Web scraping: checking the web scraping of the application and comparing the results with the actual site (see the sketch after this list).
- Assigning values to variables to be used in the calculations: checking the data sorting of the application.
- Calculation functions: checking all the calculating functions to make sure that they produce accurate results.
- Calculated results: checking the expected results against the results produced by the functions.
- Storing in the database: checking that calculated results and some of the race information are stored properly.
- Retrieving information from the database: checking that information is retrieved properly and the system is able to print results for the user.
- GUI: checking that the interface is user-friendly and works as designed.
A Testing Traceability to Functional Requirements matrix can be found in section 9 of this document.
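As a sketch of the web scraping comparison, assuming the venue page URL and its table markup (the real selectors depend on cyberhorse.com.au's actual HTML):

    <?php
    // Hypothetical scraping check: fetch the venue page and list the text of
    // each table row so it can be compared by eye with the live site.
    $url = 'http://www.cyberhorse.com.au/'; // placeholder venue page URL
    $html = @file_get_contents($url);
    if ($html === false) {
        die("FAIL: could not fetch $url\n");
    }

    $doc = new DOMDocument();
    @$doc->loadHTML($html); // suppress warnings from imperfect real-world HTML

    foreach ($doc->getElementsByTagName('tr') as $row) {
        echo trim($row->textContent), "\n"; // compare against the actual site
    }
    ?>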

6.2 Features not to be tested

The complete removal of all errors during testing is impossible for any system built to date. We can only anticipate what errors may occur and test how the system behaves when they do.
In a system where external data is used, some errors may also occur outside the system that we cannot resolve. As a result, we need to identify the areas where testing will not take place and which are not a critical part of the system we are required to develop.


Areas of quality, reliability or accuracy not to be tested are:
- Data accuracy: this will not be tested, as all the information is taken from a client-recommended site. Data accuracy is not a requirement of the system.
- Manually entered data: this will not be tested or validated, as the client is responsible for entering data.


7.0 Test Deliverables


The Test Plan is the major deliverable of the testing process. It contains the tests that have been identified so far and describes how these tests should be implemented and documented. When new tests are identified and implemented, the Test Plan should be updated.
Test cases should be produced when the need for a test is identified; a need may be identified during the development process. The test case is the first step in the test development process.
Individual test logs: different tests have different logs, containing headings related to the test being performed. When performing any test, the test log identified in the test case should be used to record results.
The testing log should also be updated whenever a test has been performed, whether it passed or failed.
Test reports should be produced after a test has passed; for many tests this cannot happen until the integration testing component has been completed as well. A report should detail the tests undertaken and the dates the tests were passed, and then be submitted to the project manager for approval. Once the test report has been approved, a test is considered complete.


8.0 Item pass/fail criteria


8.1 Suspension Criteria
Testing will be suspended if the actual output is not equal to the expected outcome. Suspension applies to each module which has failed the test; however, major tests such as integration and system tests will not be suspended.

8.2 Resumption Requirements

Suspended tests will be resumed once the recoding to fix the error has occurred. All details of the failed test and the steps taken to correct the error should be documented.

8.3 Approval Criteria

All tests should have passed before the approval process can commence. Any tests that are still failing should be discussed with the client, together with a risk report containing a plan, if possible, to fix the bug. If a bug cannot be fixed, the error and the associated actions to assist the user should the error occur should be included in the user documentation.


9.0 Testing traceability to requirements


Test No. | REQ No. | Description | Related use case | Pre use case | Post use case | Status
T01 | REQ01 | Allow user to log into the system with a user name and password | UC01 | None | UC02 | Complete
T01 | REQ02 | System must validate user information | UC01 | UC01 | UC02 | -
T02 | REQ03 | PHP function to scrape data from the race venue page of cyberhorse.com.au | UC02 | None | UC03 | See research report
T02 | REQ03.1 | Links to external form guide information needed by the user for choosing a race | UC02 | None | UC03 | -
- | REQ04 | Data that has been scraped is formatted and echoed to the screen for the user to view | UC02 | UC02 | UC03 | Under review; see research report
- | REQ04.1 | External links to race information should open in a new tab or window so the user remains within the site | UC02 | None | UC03 | -
- | REQ05 | PHP function to capture the user's venue selection | UC03 | UC02 | UC04 | Under review; see research report
- | REQ05.1 | Venue selection by user from external site. Met by REQ03.1 | UC03 | UC02 | UC04 | -
- | REQ06 | PHP function to scrape data about races at the chosen venue page of cyberhorse.com.au | UC04 | UC03 | UC05 | Under review; see research report
- | REQ06.1 | Race selection by user from external site. Met by REQ03.1 | - | - | - | -
T04 | REQ07 | PHP function to capture the user's race selection | UC04 | UC04 | UC05 | Under review; see research report
T04 | REQ07.1 | User inputs via an HTML form to allow the user to enter information about the race to calculate ratings for | - | - | - | -
T04 | REQ08 | PHP function to scrape the form guide of the user's selected race from cyberhorse.com.au | UC05 | UC04 | UC06 | Under review; see research report (for one horse)
T04 | REQ08.1 | User inputs via an HTML form to allow the user to enter information about individual horses in the race to calculate ratings for | UC06 | UC04 | UC06 | Under review; see research report
- | REQ09 | Data that has been scraped is formatted and echoed to the screen for the user to view | UC06 | UC06 | UC07 | Under review; see research report
- | REQ10 | Data from the form guide is parsed into classes by the system | UC07 | UC06 | UC08 | Under review; see research report
- | REQ11 | Variables passed as parameters to functions | UC07 | UC07 | UC07 | Current
T05 | REQ11.1 | Form is submitted to a PHP file for processing to calculate ratings | UC07 | UC07 | UC07 | Current
T05 | REQ13 | Converting the client's pseudocode formulas into mathematical statements | None | None | None | Current
T06 | REQ14 | Function to send results to a formatted table for the user to view | UC09, UC15 | UC07 | None | -
- | REQ15 | Structured Query Language | - | - | - | -
T07 | REQ16 | Function to send results to the database for storing | UC10, UC15 | UC07 | None | -
T07 | REQ17 | Function to retrieve previous results stored in the system | UC11, UC15 | UC07 | UC08-10 | -
T07 | REQ18 | Function to delete previous results from the system | - | - | - | -

10.0 Test Cases

10.1 Log in function

Test ID: T01
Related REQs: REQ01 - Allow user to log into the system with a user name and password; REQ02 - System must validate user information
Test type/s: Unit - first testing will have a username and password embedded in the HTML; Integration - when interacting with user names and passwords stored in the database
Aim of test: To prove that the system allows entry when correct details are entered, and refuses entry when incorrect details are entered or the user does not exist
Test Description: Enter the system and attempt to log in with a variety of user names and passwords. Combinations of all valid, all invalid, and one valid with one invalid should be used.
Required documentation: loginTestLog.doc; updating of testLog.doc (after each test); loginTestReport.doc (on completion of testing)
Passing criteria: The log in page is the first page visible when entering the system; entry is allowed only when correct user details are entered; entry is refused when incorrect user details are entered
Example test log columns: Date | Tester name | Log in name | Valid user? | Log in password | Valid password? | Log in successful? | Test passed?
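A sketch of the first-round unit test, with the credentials embedded as the test type above describes; checkLogin() and the credential values are placeholders:

    <?php
    // Hypothetical first-round login check: credentials are embedded in the
    // script, matching the stage before database integration.
    function checkLogin($name, $password) {
        $validName = 'doyen';      // placeholder embedded username
        $validPassword = 'secret'; // placeholder embedded password
        return $name === $validName && $password === $validPassword;
    }

    // Combinations from the test description: all valid, all invalid, one of each.
    $cases = array(
        array('doyen', 'secret', true),
        array('wrong', 'wrong', false),
        array('doyen', 'wrong', false),
        array('wrong', 'secret', false),
    );
    foreach ($cases as $c) {
        list($name, $pass, $expected) = $c;
        echo checkLogin($name, $pass) === $expected ? "PASS\n" : "FAIL\n";
    }
    ?>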

10.2 Internal links

Test ID: T02
Related REQs: None - general usability of the system
Test type/s: Unit and Static content
Aim of test: To ensure that all internal links are working correctly and directing to the correct page
Test Description: Unit test - during development of pages, internal links should be checked with the Dreamweaver link checker before static content testing begins. Static content test - after page development is completed and unit tests are done, all links should be followed on all pages.
Required documentation: linksTestLog.doc; updating of testLog.doc (after each test); internalLinksTestReport.doc (on completion of testing)
Passing criteria: Each link takes the user to the intended location
Example test log columns: Date | Tester name | Link name | Link's intended redirection | Link's actual redirection | If external, opens in new tab or window? | Test passed?

10.3 External Links

Test ID: T03
Related REQs: REQ03.1 - Links to external form guide information needed by the user for choosing a race; REQ04.1 - External links to race information should open in a new tab or window so the user remains within the site
Test type/s: Unit and Static content
Aim of test: To ensure that all external links are working correctly and directing to the correct page
Test Description: Unit test - during development of pages, external links should be checked with the Dreamweaver link checker before static content testing begins. Code should also be checked to ensure that the link target is set to _blank. Static content test - after page development is completed and unit tests are done, all links should be followed on all pages.
Required documentation: linksTestLog.doc; updating of testLog.doc (after each test); linksTestReport.doc (on completion of testing)
Passing criteria: Each link takes the user to the intended location; the location opens in a new tab or window
Example test log columns: Date | Tester name | Link name | Link's intended redirection | Link's actual redirection | Test passed?
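The _blank check above amounts to verifying that the generated markup looks like the following; the URL is a placeholder:

    <?php
    // External links should carry target="_blank" so the Doyen site stays open.
    $url = 'http://www.cyberhorse.com.au/'; // placeholder external form guide link
    echo '<a href="' . htmlspecialchars($url) . '" target="_blank">Form guide</a>';
    ?>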

10.4 User Data Entry forms

Test ID: T04
Related REQs: REQ07.1 - User inputs via an HTML form to allow the user to enter information about the race to calculate ratings for; REQ08.1 - User inputs via an HTML form to allow the user to enter information about individual horses in the race to calculate ratings for
Test type/s: Unit, Integration and Usability
Aim of test: To identify any issues with usability of the input form for the user; to ensure the form is operating as intended; to ensure the form is compatible with the requirements of calculateRatings.php
Test Description:
Unit test (non-executable) - code walkthroughs and inspections should ensure that all HTML input fields have been assigned a name, and that each name corresponds with the variables assigned in the SDD.
Unit test (executable) - the HTML page should be loaded and the form filled out with data to ensure fields are working appropriately.
Usability - the form should be inspected by several different people. Testers should try to identify any issues with the form, such as labels that may be misunderstood, text that is hard to read, formatting issues, spelling errors, etc.
Integration - once the form is complete and can be submitted to calculateRatings.php, the form should be filled out and submitted. For the first rounds of testing, calculateRatings.php should simply display a message to indicate submission has been successful.
Required documentation: formTestLog.doc; usabilityTestLog.doc; updating of testLog.doc (after each test); formTestReport.doc (on completion of testing)
Passing criteria: All details needed by the system appear on the form; any fields requiring numerical values are set to 0 when the page loads; no or minimal-impact usability issues remain; the form can be successfully submitted to calculateRatings.php for processing
Example test log columns: Date | Tester | Are the fields for all required data present? | Are all inputs working as expected? | Does the form submit correctly? | Do all other buttons work correctly? | Test pass/fail
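As an illustration of the first-round integration check, a cut-down form and stub handler; the field names here are placeholders standing in for the variable names actually assigned in the SDD:

    <!-- Hypothetical race entry form posting to calculateRatings.php -->
    <form action="calculateRatings.php" method="post">
        Race distance: <input type="text" name="distance" value="0">
        Track condition: <input type="text" name="condition">
        <input type="submit" value="Calculate ratings">
    </form>

    <?php
    // calculateRatings.php, first-round stub: confirm submission only,
    // as the test description requires, before real processing is wired in.
    if ($_SERVER['REQUEST_METHOD'] === 'POST') {
        echo 'Submission successful: received ' . count($_POST) . ' fields';
    }
    ?>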

10.5 Calculation of Ratings

Test ID: T05
Related REQs: REQ11.1 - Form is submitted to a PHP file for processing to calculate ratings
Test type/s: Unit, Integration
Aim of test: To determine the accuracy of the calculation of ratings in comparison to the manual system
Test Description:
Unit - test data should be entered for each function as it is developed, and the result compared to a manual calculation before moving on to the next function.
Integration - once all the functions have been developed, data for an entire race and its horses should be entered and compared with a manual calculation.
Required documentation: calculateRatingsTestLog.doc
Passing criteria: All functions return the same result as the manual system
Example test log columns: Date | Tester | Source of data used | Horse name | Matches manual calculation? | Recommended action to correct | Test pass/fail
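Each unit here amounts to one converted formula checked against a hand-worked value. The sketch below uses an invented placeholder formula, since the real Doyen formulas come from the client and are not reproduced in this plan:

    <?php
    // Placeholder formula only - the real Doyen rating formulas are the
    // client's and live in the SDD. The shape of the check is what matters.
    function weightAdjustment($baseRating, $weight) {
        return $baseRating - ($weight - 54.0) * 1.5; // invented example formula
    }

    // Hand-worked expectation for the same inputs: 80 - (56.5 - 54) * 1.5 = 76.25
    echo abs(weightAdjustment(80.0, 56.5) - 76.25) < 1e-9 ? "PASS\n" : "FAIL\n";
    ?>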

10.6 Display of Results

Test ID: T06
Related REQs: REQ14 - Function to send results to a formatted table for the user to view
Test type/s: Unit, Appearance
Aim of test: To ensure results are displayed properly
Test Description: Examine the results table generated by calculateRatings.php. Does it represent what the system has calculated? Does it show a result in every column? Is it user friendly? Is it clear which horse has the highest rating? For other pages: do common page elements all appear the same across all pages? Are fonts, layout and colours the same on all pages? Does the static navigation appear on all pages?
Required documentation: GUITestLog.doc
Passing criteria: The table displays the results of the system correctly
Example test log columns: Date | Tester | Page name | Are all common page elements displaying? | Are fonts, layout and colours correct? | Does content display as intended? | Is the GUI user friendly?
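One possible shape for the formatted results table named in REQ14, with placeholder horses and ratings:

    <?php
    // Hypothetical display step: echo calculated ratings as an HTML table,
    // highest-rated horse first so the top pick is obvious to the user.
    $ratings = array('Placeholder Horse A' => 78.0, 'Placeholder Horse B' => 71.5);
    arsort($ratings); // highest rating first

    echo '<table border="1"><tr><th>Horse</th><th>Doyen rating</th></tr>';
    foreach ($ratings as $horse => $rating) {
        echo '<tr><td>' . htmlspecialchars($horse) . "</td><td>$rating</td></tr>";
    }
    echo '</table>';
    ?>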

10.7 Data storage, retrieval and deletion

Test ID: T07
Related REQs: REQ16 - Function to send results to the database for storing; REQ17 - Function to retrieve previous results stored in the system; REQ18 - Function to delete previous results from the system
Test type/s: Integration
Aim of test: To ensure that results are able to be stored in, retrieved from and deleted from the database
Test Description: Generate a result in the system by entering race and horse data. Click on the save result button. Click on the previous results button and see if the generated result appears in the list; click on the generated result to view it. Click on delete previous results and see if the generated result appears in the list. Click on the delete button, then click on the previous results button and see if the generated result has been removed from the list.
Required documentation: dataStorageTestLog.doc
Passing criteria: A result can be stored, viewed and deleted in the system
Example test log columns: Date | Tester | Expected result id no. | Could the result be saved? | Can the result be viewed via previous results? | Can the result be deleted? | Data storage test pass/fail
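A sketch of the store/retrieve/delete round trip; the table name, column names and default WAMP credentials are assumptions, not the real schema from the SDD:

    <?php
    // Hypothetical round trip: store a result, read it back, then delete it.
    $db = new mysqli('localhost', 'root', '', 'doyen'); // placeholder credentials/schema

    // Store (REQ16)
    $db->query("INSERT INTO results (race_name, rating) VALUES ('Placeholder Race', 78.0)");
    $id = $db->insert_id;

    // Retrieve (REQ17)
    $row = $db->query("SELECT race_name, rating FROM results WHERE id = $id")->fetch_assoc();
    echo $row ? "Stored and retrieved: {$row['race_name']}\n" : "FAIL: result not found\n";

    // Delete (REQ18) and confirm removal
    $db->query("DELETE FROM results WHERE id = $id");
    $check = $db->query("SELECT id FROM results WHERE id = $id");
    echo $check->num_rows === 0 ? "Deleted successfully\n" : "FAIL: result still present\n";
    ?>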

10.8 System test

Test ID: T08
Related REQs: All
Test type/s: System
Aim of test: To ensure the system operates according to the functional requirements
Test Description:
1. Log in to the system
2. Enter race data
3. Enter data for at least 5 horses
4. Click on the calculate Doyen ratings button
5. Click on the save result button
6. Click on the previous results button and see if the generated result appears in the list
7. Click on the generated result to view it
8. Click on delete previous results and see if the generated result appears in the list
9. Click on the delete button
10. Click on the previous results button and see if the generated result has been removed from the list
Required documentation: systemTestLog.doc
Passing criteria: All elements of the system operate as expected
Example system test log columns: Date | Tester | Data source | Race details | Is the log in facility working? | Are the user input forms working? | Did the system calculate ratings for all horses? | Could you save the result? | Could you view the result via the previous results link? | Could you print the result? | Could you delete the result? | Test pass/fail

11.0 Testing Schedule

All testing is done collaboratively by the team members. The following chart gives brief details of the members responsible for each testing method.

Order                       | Type of Testing                                | Member Responsible
1 (each development sprint) | Code walkthroughs and inspections              | Ranisha Fernando, Ben Hartley, Rachel Phillips
2 (each development sprint) | Unit Testing                                   | Anusha Raveendran, Ben Hartley, Rachel Phillips
3                           | Static content Testing and Integration Testing | Ben Hartley
3                           | Appearance testing                             | Anusha Raveendran
3                           | System Testing                                 | Ranisha Fernando
3                           | User documentation Testing                     | Rachel Phillips, John Murphy and a peer

12.0 Risks and contingencies


During the testing process, if a team member encounters a risk, he or she is required to contact the project manager immediately and fill out a risk report (Appendix B).
Things to be included in the report:
- Risk: a title that describes the risk.
- Description: a description of the risk, explaining how it is a risk to the project.
- Probability (High/Medium/Low): an estimate of the chance of this risk occurring.
- Damage (High/Medium/Low): an estimate of the overall damage to the project if the risk occurs.
- Plan: a plan to minimise the impact caused by the risk, or to prevent the risk from occurring.
The management plan for the risk, and any changes associated with it, must be approved by the project manager, and the SPMP updated to reflect the changes.
Risks that have already been identified, with plans made in case they happen, are listed below.

Figure 3: Risk Assessment Matrix (South Australian Government, 2002)
[Matrix rating each risk by likelihood - very likely (will almost certainly happen), likely (will probably happen at some time), unlikely (but could happen at some time) and very unlikely (might happen only rarely) - against consequence (major, serious, minor, insignificant), giving an overall rating of Extreme, High, Medium or Low.]

Identified Risks

Risk: Team member sickness
Proposed management strategy: If the team member is unable to work from home on set tasks, the tasks will be assigned to other team members. If this is not possible, it may be necessary to negotiate for extra time with the supervisor and/or client.

Risk: Serious program error
Proposed management strategy: If the error is serious enough to affect the start time of the next sprint, the team will be split into two. One half will work on correcting the error; the other will comment out the offending code and begin the next sprint.

Risk: Loss of files
Proposed management strategy: All files related to the project are kept online in two separate places, on a laptop and on a USB stick. If files are lost, the effect should be at best a couple of hours, at worst one week.

Risk: Not meeting deadlines
Proposed management strategy: If deadlines are in danger of being missed, it is important that the risk is identified as soon as possible. It may be possible in some circumstances to obtain extra time, but as much notice as possible is needed to implement the time change and minimise the effect on the overall project timeline.

Please refer to the Risk Management Plan for a more detailed explanation of risk management.

13.0 Approvals
The project manager is required to sign off the final testing results. The quality assurance manager needs to approve the testing plan and check the quality of testing. The supervisor's recommendation of the testing is also important.

14.0 Appendix
Appendix A: Example of loginTestLog.doc
Columns: Date | Tester name | Log in name | Valid user? | Log in password | Valid password? | Log in successful? | Test passed?

Appendix B: Example Risk Report

Risk identified: (title of the risk)
Description: (how it is a risk to the project)
Identified by: (name/s)
Probability: chance of the risk occurring (circle/highlight): Very Low | Low | Moderate | High | Very High
Impact: damage rating (circle/highlight): Low | Medium | High, with a description of the damage caused
Mitigation: suggested course of action to prevent this issue occurring, or to reduce the amount of damage caused should this issue occur
Approved course of action (filled in after consultation):
Approved by:
