
MICT Park Yangon 22 January 2010

Leonard Aye @ MSN

len.aye@nandawon.com | leonardaye@gmail.com | leonardaye@hotmail.com
Leonard Aye | lenaye

Do we need to test software?
Definitions of good quality software
Categories of software
Definition of testing
Development models & testing processes
Case studies:
  Waterfall
  V-Model
  Agile (SCRUM)
Building a test team
Q&A



Date: June 4th, 1996
Place: European Space Agency launch site
Location: French Guiana, South America

Rocket: Ariane 5

Why did it blow up?

There was an error in the software that calculates the rocket's height and direction.
The error occurred when a 64-bit floating-point number was converted to a signed 16-bit integer, causing the resulting integer to overflow.
There was no error handling for this type of overflow error because the condition was never discovered during testing.
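The failure mode is easy to reproduce in a few lines. The sketch below (Python; the value 65535.0 is an arbitrary stand-in for the sensor reading, not the actual Ariane telemetry, and the helper names are invented) shows how an unchecked 64-bit float to signed 16-bit integer conversion silently wraps instead of failing:

```python
def to_int16_unchecked(value: float) -> int:
    # Mimics a raw conversion with no range check: keep only the low
    # 16 bits, so values outside [-32768, 32767] silently wrap around.
    n = int(value) & 0xFFFF
    return n - 0x10000 if n >= 0x8000 else n

def to_int16_checked(value: float) -> int:
    # The guard the flight software lacked for this variable:
    # reject values that cannot be represented in 16 bits.
    if not -32768 <= value <= 32767:
        raise OverflowError(f"{value} does not fit in a signed 16-bit integer")
    return int(value)

reading = 65535.0  # hypothetical out-of-range sensor value
print(to_int16_unchecked(reading))  # wraps to -1: garbage, not an error
```

The unchecked version returns a plausible-looking but wrong number; the checked version fails loudly, which is exactly the behaviour testing should have forced into the flight software.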

What's the cost of this failure?

Development time: 11 years


Total cost: US $11 Billion

What's the conclusion?

There should have been more testing!

What is testing?
What types of testing are there?
What do we test exactly?
How much testing is enough?
How much time do we need for testing?
How many testers do you need?
What makes a good tester?

What is BAD quality software?

It's a bad piece of software if it is full of bugs (errors)!
It's a bad piece of software even if it's 100% bug-free, but it doesn't do what the user wants.

What is BAD quality software?

It's a bad piece of software if it is full of bugs (errors)!
It's a bad piece of software even if it's 100% bug-free, but it doesn't do what the user wants.
It's a bad piece of software even if it's 100% bug-free and it does exactly what the user wants, but not in the way the user wants it.

What is BAD quality software?

It's a bad piece of software even if it's 100% bug-free, does exactly what the user wants, and works in the way the user wants it, but it works sloooowly.

It works without serious errors
Easy to install, configure and use
Has the features the users want
Works in the way the users expect it to work
Responsive and high-performance
Good aesthetics, e.g. a clean, uncluttered interface and easy navigation
Good, extensive documentation
Good technical support

Main types of software

Off-the-shelf / packaged software
Bespoke software
Customised software

Off-the-Shelf / Packaged software
Ready-made software which you can pick up off the shelf in a supermarket.

Some examples
Desktop applications:
  Windows Vista, Windows 7, Mac OS X
  MS Office
  Photoshop
  Chrome browser
Online sites & services:
  Google
  Facebook
  YouTube

Bespoke software
Software specifically written for one customer or one organisation only.
2 types of bespoke software:
  A new computerised system for a business
  Replacing an existing computerised system with a new (faster/better) system

Customised software
A generic application which is configured for each customer/client.
Some commercial off-the-shelf examples:
  Accounting packages, e.g. Sage, MS Money
  Payroll systems, e.g. Zpay, Intuit
  E-commerce packages, e.g. E-Volved, Actinic, EROL

                      Off-the-Shelf                              Bespoke/Customised
User requirements     From within your own organisation          From customer
Project duration      No limits, except for external pressure    Defined by customer
Deployment            Via websites or shops                      Defined with customer
Testing process       Defined by development process             Defined by development process
Customer feedback     Via email or website                       Direct from customer
Customer interaction  Little or none                             High

A definition

"Testing is the process of comparing the invisible to the ambiguous, so as to avoid the unthinkable happening to the anonymous."
James Bach

A few well-known definitions

Testing is a process used to identify the correctness and completeness of computer software.
Testing is verifying software features against the system requirements.
Testing is a process of verifying that a program functions properly.

Preferred definition

"Testing is a process of finding bugs in the software."
Kaner, Falk & Nguyen, Testing Computer Software, 1999

Testing is a process of trying to break the software during development so that it can be fixed, thereby saving the company time and money after the product is released.

Integration Testing
System Testing
  Functional Testing
  Non-Functional Testing
User Acceptance Testing
End-to-End Testing
Regression Testing
Exploratory Testing

Part 1
Do we need to test software?
Definitions of good quality software
Categories of software
Definition of testing
Development models & test processes
Case studies:
  Waterfall
  V-Model
  Agile (SCRUM)
Building a test team
Q&A

A software development model defines a set of rules on how software is to be built. There are many and varied development models:
  Agile
  DSDM (Dynamic Systems Development Method)
  FDD (Feature Driven Development)
  Iterative
  RAD (Rapid Application Development)
  RUP (Rational Unified Process)
  Scrum
  TDD (Test Driven Development)
  V-Model
  Waterfall
  XP (eXtreme Programming)

Traditional: Waterfall, V-Model
Iterative/Incremental: FDD (Feature Driven Development), RAD (Rapid Application Development), RUP (Rational Unified Process)
Agile: DSDM (Dynamic Systems Development Method), Scrum, XP (eXtreme Programming), TDD (Test Driven Development)

The Waterfall model is a software development process in which progress is seen as flowing steadily downwards (like a waterfall) through the phases of Requirements, Design, Construction, Testing, and Maintenance.

Requirements → Analysis → Design → Construction → Testing → Maintenance

Small to medium scale projects, i.e. < 10 people and/or a timescale of ~6-9 months
Requirements are clearly defined and unlikely to change throughout the life of the project

msn shopping
Price comparison site
Online merchants send in their product catalogue daily/weekly
Products are categorised and published on the site
Business model:
  Online merchants get greater exposure for their products from users on msn
  User clicks are recorded and merchants are charged per click-through

Environment

[Diagram: Dev, Test and Staging environments, each with SQL Server; production behind a Load Balancer with web servers Pr 1 to Pr 6 and SQL Servers; Merchants feeding data in]

Requirements
Website is non-business critical
  Overnight switchover (big bang)
Website must display consistently on all existing web browsers
Website must have high availability (24x7)
Website must be able to support high user traffic
Merchant products upload in < 4 hrs

How do we test this system?
System Testing
  Functional Testing
  Non-Functional Testing
End-to-End Testing

What do we need to test this?
Test servers (application and DB servers) with all necessary apps installed, i.e. IIS & SQL Server
User PCs (Win 98, Win ME, Win 2K, Win NT) with IE 3, IE 4, IE 4.01 SP1, IE 4.02 SP2, etc.
Test data (manually created dummy data)

System Testing: ensuring the system meets the user/design requirements

Functional Testing
Aim is to find errors (bugs). Does it work? Does it do what it's supposed to do?
e.g. functional testing of a car would be:
  Check that the engine starts when you turn the ignition key
  Check that the brake lights come on when you press the brake pedal
  Check that the radio comes on when you turn it on

System Testing

UI test cases:
  Layout of products
  Product description too long
  Display of special characters
  Missing images (at site and from merchant)
  Missing information, e.g. detailed description
  Missing mandatory information, i.e. price
  Special offer (normal price, sale price, and sale indicator)
  Incorrect or dead links back to merchant site
Functional test cases:
  Correct categorisation of products
  Product filtering by type, brand, store, price range (pre-defined or manual)
  Product sorting
  Product searching via search box
  Product listing per page
Non-functional test cases:
  Web server breakdown
  SQL server breakdown
  Load/stress testing
    Average page load time: < 10 ms
    Min concurrent users: 400
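Functional test cases like "product filtering by price range" reduce to small executable checks. A minimal sketch (Python; the product data and the `filter_by_price` helper are invented for illustration, not from the msn shopping code):

```python
def filter_by_price(products, low, high):
    """Return the products whose price falls within [low, high]."""
    return [p for p in products if low <= p["price"] <= high]

catalogue = [
    {"name": "Kettle", "price": 15.0},
    {"name": "Toaster", "price": 25.0},
    {"name": "Microwave", "price": 80.0},
]

# Functional test: a pre-defined 10-30 price range keeps only two items.
result = filter_by_price(catalogue, 10, 30)
assert [p["name"] for p in result] == ["Kettle", "Toaster"]

# Boundary test: a range matching nothing returns an empty list.
assert filter_by_price(catalogue, 100, 200) == []
print("filter tests passed")
```

Each UI or functional bullet in the list above can be turned into an assertion of this shape, which is what makes the suite repeatable across releases.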

Non-Functional Testing
Aim is NOT to find faults but to measure performance against the required targets. How does it work? Fast, slow, etc.
  Is it easy to use? (Usability testing)
  Can users on all the different versions of OSs and browsers access the site? (Compatibility testing)
  Will the system be available in case of server failures? (Reliability and availability testing)
  Can the system cope with future growth in data volume or user volume? (Load testing)
e.g. non-functional testing of a car would be:
  Check that it accelerates from 0-60 mph in under 10 secs
  Check that the electric door opens/closes 100,000 times without failure

Usability testing
Invited people from outside the office and asked them to use the website as normal users would, i.e.
  Search for products
  Find a particular product
  Filter products by price
  Filter products by brand, etc.
This was followed by questionnaires for their feedback, and the layout and design of the site were amended based on it.

Compatibility testing
Testing that the application/website works on as many supported browsers and OSs as possible
  Set up a test laboratory with several PCs running different OSs and web browsers
  Check how the site displays on each type of browser
  Use VMware to set up virtual machines

Reliability testing
Testing that the system is still available in the event of failure in any part of the system
  Manually turn off each web server and check that user transactions are re-routed correctly
  Manually turn off each DB server and check that user data requests are served by a different DB server

[Diagram: Load Balancer in front of web servers WS 1 to WS 6, backed by SQL Servers]

Availability testing
Testing that the system is still available in the event of total system and site failure
  Set up a duplicate/backup site at a different location
  Manually turn off the main site and check that user transactions are re-routed to the backup site
  Check for data synchronisation between the main DBs and backup DBs

[Diagram: main site in London (Load Balancer, WS 1 to WS 6, SQL Servers) synchronised with a backup site in Derby]

Load testing

[Diagram: Load Controller driving Load Injectors through the Load Balancer into the web and SQL servers]
[Chart: load testing analysis, average response time (ms) against concurrent users, from 50 to 450 users]
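A minimal version of this kind of load test can be scripted directly. The sketch below (Python; the local test server and the 10-worker, 50-request workload are illustrative stand-ins for the real site and the 400-concurrent-user target) fires concurrent requests and reports the average response time:

```python
import threading
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

class Handler(BaseHTTPRequestHandler):
    # Trivial stand-in for the site under test.
    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):  # keep the console quiet
        pass

def timed_get(url: str) -> float:
    """Fetch one page and return its response time in seconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as resp:
        resp.read()
    return time.perf_counter() - start

def load_test(url: str, concurrency: int, requests: int) -> float:
    """Fire `requests` GETs with `concurrency` workers; return mean latency."""
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        latencies = list(pool.map(lambda _: timed_get(url), range(requests)))
    return sum(latencies) / len(latencies)

if __name__ == "__main__":
    server = ThreadingHTTPServer(("127.0.0.1", 0), Handler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    url = f"http://127.0.0.1:{server.server_port}/"
    avg = load_test(url, concurrency=10, requests=50)
    print(f"average response time: {avg * 1000:.1f} ms")
    server.shutdown()
```

Re-running this while stepping up the concurrency is what produces the response-time-vs-users curve in the chart above; commercial load injectors do the same thing at much larger scale.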

Summary:
Duration: 10 months from concept to production
Written entirely in ASP and SQL
Team:
  1 PM
  2 Merchant Liaisons
  4 Devs
  2 DBAs
  4 Testers (London, Dublin, outsourced company)
Big-bang deployment:
  Old system turned off on Friday night; new system brought up on Monday morning

The model works well if:
  The system and team are relatively small
  The project requirements are clearly defined and stable
Testing methods used:
  System Testing
    User interface
    Functional
    Non-functional: usability, compatibility, reliability, availability, load testing

Part 1
Do we need to test software?
Definitions of good quality software
Categories of software
Definition of testing
Development models & test processes
Case studies:
  Waterfall (examples of non-functional testing)
  V-Model
  Agile (SCRUM)
Building a test team
Q&A

[V-Model diagram:
  User Requirements ↔ User Acceptance Testing (via User Acceptance Test Design)
  System Design ↔ System Testing (via System Test Design)
  Module Design ↔ Integration Testing (via Integration Test Design)
  Construction at the base]

Accepts that testing takes roughly 50% of project time
Activities like test planning and test case design take place at the beginning of the project, well before code is written, i.e.
  User Acceptance tests are written once Requirements are approved
  System Acceptance tests are written once System Designs are completed
  Integration tests are written once Module Designs are completed

LCH.Clearnet
Clearing house for the London Stock Exchange
Also acts as counter-party for trades in bonds, swaps, futures, options
Volume of trades cleared per year: 1.7 billion
Value of transactions per year: 600 trillion (2008)

Project: GCS (Generic Clearing System)
To combine separate individual clearing systems into a single generic clearing system using new technology and tools, i.e.
  HP-UX servers with Oracle 9
  BEA WebLogic for business logic and processing
  IBM MQ Series for messaging
  J2EE, XML for application development
  CASE tools from the Rational Suite: Rose for requirements & UML models, RequisitePro for designing use-cases, TestManager for test case management, ClearQuest for defect management

Development Process

The system consists of 4 main components
Development teams are structured to match the system components
Using the V-Model, requirements for each component were captured, system models were designed, interface specifications were created and coding started
The total dev team is around 120 people:
  ~50 Devs, 15 Testers, 4 test-automation Devs
  Plus BAs, System Analysts, DBAs, various managers

[Diagram: System Interface → Trade Validation → Trade Processing Engine → Risk Management → Settlement & Clearing → System Interface]

Implications for testing

Each system component is to be built with all features fully implemented
  Each component requires its own component testing
  Without interfaces to the other components, testing each component requires specially created test harnesses
Once each component is completed and tested, the components are integrated one at a time
  Integration testing is required between each component
Once all the components are fully integrated, the entire system is tested
  System testing is required for the complete system
  Additionally, end-to-end plus performance and stress testing are required

[Diagram: System Interface → Trade Validation → Trade Processing Engine → Risk Management → Settlement & Clearing → System Interface]

Test Harnesses

[Diagram: Component A ↔ Component B]

Q: How do we test A without B, and vice-versa?
A: Drivers to simulate the up-stream interface
   Stubs to simulate the down-stream interface

Testing Component A in isolation

1. Driver: usually a piece of code, or an application, which can send all possible input data ranges and data types
2. Stubs: intercept output data from Component A and, if required, send confirmation back to Component A

[Diagram: Driver → Component A → Stubs]
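A driver and a stub can be very small. The sketch below (Python; the component names, the `quantity` field, and the trade-validation logic are invented for illustration, not from the GCS code) shows a component under test wired to a stub that records what the component sends downstream, while the driver feeds in a range of inputs:

```python
class DownstreamStub:
    """Stands in for the missing downstream component: records every
    message it receives and returns a canned confirmation."""
    def __init__(self):
        self.received = []

    def send(self, message):
        self.received.append(message)
        return {"status": "ACK"}

class ComponentA:
    """The component under test: validates a trade and, if valid,
    forwards it downstream."""
    def __init__(self, downstream):
        self.downstream = downstream

    def process(self, trade):
        if trade.get("quantity", 0) <= 0:
            return "REJECTED"
        reply = self.downstream.send(trade)
        return "FORWARDED" if reply["status"] == "ACK" else "FAILED"

def driver(component):
    """The driver: pushes a range of inputs, valid and invalid,
    through the component and collects the results."""
    cases = [{"quantity": 100}, {"quantity": 0}, {"quantity": -5}]
    return [component.process(t) for t in cases]

stub = DownstreamStub()
results = driver(ComponentA(stub))
print(results)             # ['FORWARDED', 'REJECTED', 'REJECTED']
print(len(stub.received))  # 1: only the valid trade reached downstream
```

The stub doubles as a verification point: inspecting what it received tells the tester exactly what the component emitted, without the real downstream component existing yet.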

Integration Testing
To check that when multiple system components are integrated, the combined components work as a single unit
Understanding the interface specifications between components and generating the test data required for integration testing
Entering data using specially written drivers and verifying results via stubs or other means, e.g. direct SQL calls to DB tables

User Acceptance Testing

To verify with the users that the system meets their requirements
Test cases are usually prepared by Testers with the involvement of end users
Tests are run to show positive system behaviour as well as negative system behaviour, e.g. handling of error conditions
Tests are run by users with Testers present

Outcome of the GCS project

The project took too long to implement
  After 2.5 years we were still at the integration testing stage
Requirements constantly changed due to government and industry regulations
  Changing requirements meant that each system had to be redesigned (data models, interface specs, functional specs), recoded and retested
After 4 years the company decided to cancel the project, after spending US $75 million

Summary
Similar to the Waterfall model in that both models cannot cope with changing requirements once coding has started
Many large-scale projects in the past 20 years have failed, delivered too late with too little functionality, or delivered with features and processes which were out of date

So... what's the solution?

Traditional: Waterfall, V-Model
Iterative/Incremental: FDD (Feature Driven Development), RAD (Rapid Application Development), RUP (Rational Unified Process)
Agile: DSDM (Dynamic Systems Development Method), Scrum, XP (eXtreme Programming), TDD (Test Driven Development)

Part 1
Do we need to test software?
Definitions of good quality software
Categories of software
Definition of testing
Development models & test processes
Case studies:
  Waterfall (functional & non-functional testing)
  V-Model (integration and UAT)
  Agile/Iterative (SCRUM)
Building a test team
Q&A

Marshall Wace Asset Management

Hedge fund investment company
Buys/sells stocks, currencies and futures
Total assets managed: ~US $15 billion
Regularly delivers a return of > 10%

Project summary
To build a new trading system to replace the existing one using new technologies
  C#, .NET 3, WCF, WGF, XOML
The new system must be flexible enough to react to changing trading laws and regulations
The system is critical to the business
  It must guarantee 100% reliability
Minimum disruption to the business when the new system is deployed
  The new system must be introduced slowly and co-exist with the current system

Existing system

[Diagram: Trade idea → Validation → Order Execution → Trade Book → Settlement, with Restrictions applied]

SCRUM


SCRUM Summary
Each iteration is called a Sprint, lasting 2-4 weeks
Requirements are added to the Product Backlog
At the start of the sprint, requirements are picked from the Backlog to be developed: usually 5-6, but no more than 10, requirements per sprint
The sprint team is no bigger than 5-6 people, consisting of developers and 1 or 2 testers
Daily meetings of 15 mins:
  What you did yesterday
  What you're going to do today
  Any problems stopping you from doing your job

Key principles

Deliver working software incrementally
  Fast ROI (Return on Investment)
Managing changing requirements is part of the process
Each increment is time-boxed
Within each increment, all steps of the software lifecycle take place, i.e. analysis, design, dev & test (mini-waterfall), several times

Testing in SCRUM
As features/functions are added incrementally over many Sprints, testing has to ensure that:
  1. All new features work correctly
  2. All existing features do not break

Sprint 1 → Sprint 2 → Sprint 3 → ...

How?
  Functional tests: check new features for bugs
  Regression tests: check existing features continue to work
  Exploratory tests: uncover unexpected behaviour

How does SCRUM cope with change?

New requirements are added
Existing requirements are removed or reprioritised
Each new Sprint only deals with the most up-to-date requirements
Urgent requirements are brought into the current Sprint by removing planned requirement(s)

Testing in SCRUM

Within the Sprint:
  Specifications are usually not written down
  Communication with Devs and Users is key
  Functional testing of features as soon as they are implemented
  User Acceptance testing before delivery to the user

Testing in SCRUM

Across several Sprints:
  Regression tests are run at the beginning of each new Sprint
  In simplest terms, regression means re-running existing tests from previous Sprints, to make sure all existing features continue to work after new code has been added
But...
  Sprint 1 = Functional tests 1 (Ft1)
  Regression in Sprint 2 = Ft1
  Regression in Sprint 3 = Ft2 + Ft1
The number of test cases grows very quickly and can become unmanageable
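The growth is easy to quantify. A rough sketch (Python; the assumption of a constant 20 new functional tests per sprint is invented purely for illustration):

```python
def regression_suite_sizes(new_tests_per_sprint, sprints):
    """Regression load at the start of each sprint = every functional
    test written in all previous sprints."""
    sizes = []
    total = 0
    for _ in range(sprints):
        sizes.append(total)   # regression tests to re-run this sprint
        total += new_tests_per_sprint
    return sizes

print(regression_suite_sizes(20, 10))
# [0, 20, 40, 60, 80, 100, 120, 140, 160, 180]
```

Even at a modest, constant rate of new tests the regression suite grows linearly without bound, which is why the selection and automation strategies on the next slide matter.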

Regression testing

[Chart: cumulative number of regression tests growing sprint by sprint, from 0 to ~180 tests]

Managing regression tests

Selectively re-run only those tests which are relevant, i.e. if modules A, D and E are affected in the current sprint, re-run the tests for modules A, D and E from previous sprints
Automated testing:
  Custom tools
  Commercial tools (Robot, QTP)
  Open source tools (Selenium, Ruby, Watir)
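Test selection like this is straightforward to script once each test case is tagged with the module it covers. A minimal sketch (Python; the test names, module letters and registry are invented for illustration):

```python
# Registry mapping each test case to the module it covers,
# accumulated across previous sprints (illustrative data).
TEST_REGISTRY = {
    "test_login":       "A",
    "test_logout":      "A",
    "test_checkout":    "B",
    "test_reports":     "C",
    "test_export":      "D",
    "test_audit_trail": "E",
}

def select_regression_tests(affected_modules):
    """Re-run only the tests covering modules touched in this sprint."""
    return sorted(t for t, mod in TEST_REGISTRY.items()
                  if mod in affected_modules)

print(select_regression_tests({"A", "D", "E"}))
# ['test_audit_trail', 'test_export', 'test_login', 'test_logout']
```

The trade-off is the usual one for selective regression: the suite stays manageable, but the mapping from tests to modules has to be kept accurate or relevant tests get skipped.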

Selected low-volume, low-risk trades
Each trade is replicated in both the new and the old system and the outputs compared

[Diagram: Trade idea feeding both pipelines, old and new, each running Trade Validation → Order Execution → Trade Book → Settlement → Restrictions]

Project Summary

The project timescale was 2 years, but users were able to start using the system within 3 months
Deployment was seamless to users
We were able to change the system when the Credit Crunch happened, i.e.
  Disabling short-selling of stocks
  Removing Bear Stearns-related processes
  Implementing new changes in hedge fund trading law

Part 1
Do we need to test software?
Definitions of good quality software
Categories of software
Definition of testing
Development models & test processes
Case studies:
  Waterfall (functional & non-functional testing)
  V-Model (integration and UAT)
  Agile/Iterative (SCRUM)
Building a test team
Q&A

How many testers do you need?
Difference between a Developer & a Tester

How many testers do you need?

Dev:Test ratios
  Technology consultancies (Accenture, IBM, Logica) = 6-7:1
  Global software companies, e.g. Microsoft = 1:1
  NASA Space Shuttle project = 1:3
  NASA Space Shuttle life-support systems = 1:7
  Most IT firms = 4:1

Developer speaks:
  This code hasn't been tested yet. I don't know if it has any bugs.
  I only changed one line of code.
  Anything is possible, given enough time.
  Let me fix this one final bug, and we can ship tomorrow.
  It's an "undocumented feature".
  I like to build things.

The way a Tester thinks:
  This code hasn't been tested yet. I don't know if it's going to work.
  The whole system has to be retested.
  Everything has flaws, and given enough time I can prove it.
  If you fix this bug, you'll add 2 new bugs.
  It's a bug.
  I like to break things.

Is the glass half full or half empty?

Developer (an optimist): The glass is half full.
Tester (a pessimist): The glass is half empty.
Really good Tester (a realist): The glass is twice as big as it needs to be.

Books worth reading
  Testing Computer Software, Kaner, Falk & Nguyen, 1999, ISBN: 0471358460
  Managing the Testing Process, Black, 2009, ISBN: 0470404159
  How to Break Web Software: Functional and Security Testing of Web Applications and Web Services, Andrews & Whitaker, 2006, ISBN: 0321369440

Professional associations
  BCS SIGIST (British Computer Society Special Interest Group in Software Testing): http://www.bcs.org/server.php?show=nav.9262
  Software Testing Institute: http://www.softwaretestinginstitute.com/
  International Institute for Software Testing (IIST): http://www.testinginstitute.com/

Obtaining professional qualifications
  BCS offers 3 types of ISEB certification examinations in software testing: Foundation, Intermediate and Practitioner. http://www.bcs.org/server.php?show=nav.10920
  IIST offers 2 types of certification in software testing: Certified Software Test Professional and Certified Test Manager. http://www.testinginstitute.com/certification.php

Online resources
  Agile manifesto: http://agilemanifesto.org/
  Scrum Alliance: http://www.scrumalliance.org/
  James Bach (proponent of Agile testing, Exploratory testing): http://www.satisfice.com
  Open source test tools: http://www.opensourcetesting.org
  Most popular test management tool, TestLink: http://www.teamst.org/
  Most popular online bug reporting tool, Bugzilla: https://bugzilla.mozilla.org/
  Selenium (automated test tool for testing websites on Firefox): http://seleniumhq.org/
  Free Tester magazine (available every 3 months) from SIGIST: http://www.bcs.org/server.php?show=nav.9265