web test tools - to check that links are valid, HTML code usage is correct, client-side and server-side programs work, and a web site's interactions are secure
other tools - for test case management, documentation management, bug reporting, and configuration management
What makes a good Software Test engineer?
A good test engineer has a 'test to break' attitude, an ability to take the point of view of the customer, a strong desire for quality, and an attention to detail. Tact and diplomacy are useful in maintaining a cooperative relationship with developers, and an ability to communicate with both technical (developers) and non-technical (customers, management) people is useful. Previous software development experience can be helpful as it provides a deeper understanding of the software development process, gives the tester an appreciation for the developers' point of view, and reduces the learning curve in automated test tool programming. Judgment skills are needed to assess high-risk areas of an application on which to focus testing efforts when time is limited.
QA and Testing FAQs  Page 9 of 19
LiveTech
What makes a good Software QA engineer?
The same qualities a good tester has are useful for a QA engineer. Additionally, they must be able to understand the entire software development process and how it can fit into the business approach and goals of the organization. Communication skills and the ability to understand various sides of issues are important. In organizations in the early stages of implementing QA processes, patience and diplomacy are especially needed. An ability to find problems as well as to see 'what's missing' is important for inspections and reviews.
What makes a good QA or Test manager?
A good QA, test, or QA/Test (combined) manager should:
be familiar with the software development process
be able to maintain enthusiasm of their team and promote a positive atmosphere, despite what is a somewhat 'negative' process (e.g., looking for or preventing problems)
be able to promote teamwork to increase productivity
be able to promote cooperation between software, test, and QA engineers
have the diplomatic skills needed to promote improvements in QA processes
have the ability to withstand pressures and say 'no' to other managers when quality is insufficient or QA processes are not being adhered to
have people judgment skills for hiring and keeping skilled personnel
be able to communicate with technical and non-technical people, engineers, managers, and customers
be able to run meetings and keep them focused
What's the role of documentation in QA?
Critical. (Note that documentation can be electronic, not necessarily paper, may be embedded in code comments, etc.) QA practices should be documented such that they are repeatable. Specifications, designs, business rules, inspection reports, configurations, code changes, test plans, test cases, bug reports, user manuals, etc. should all be documented in some form. There should ideally be a system for easily finding and obtaining information and determining what documentation will have a particular piece of information. Change management for documentation should be used if possible.
What's the big deal about 'requirements'?
One of the most reliable methods of ensuring problems, or failure, in a large, complex software project is to have poorly documented requirements specifications. Requirements are the details describing an application's externally-perceived functionality and properties. Requirements should be clear, complete, reasonably detailed, cohesive, attainable, and testable. A non-testable requirement would be, for example, 'user-friendly' (too subjective). A testable requirement would be something like 'the user must enter their previously-assigned password to access the application'. Determining and organizing requirements details in a useful and efficient way can be a difficult effort; different methods are available depending on the particular project. Many books are available that describe various approaches to this task. (See the Bookstore section's 'Software Requirements Engineering' category for books on Software Requirements.)
Care should be taken to involve ALL of a project's significant 'customers' in the requirements process. 'Customers' could be in-house personnel or out, and could include end-users, customer acceptance testers, customer contract officers, customer management, future software maintenance engineers, salespeople, etc. Anyone who could later derail the project if their expectations aren't met should be included if possible.
Organizations vary considerably in their handling of requirements specifications. Ideally, the requirements are spelled out in a document with statements such as 'The product shall...'. 'Design' specifications should not be confused with 'requirements'; design specifications should be traceable back to the requirements.
In some organizations requirements may end up in high level project plans, functional specification documents, in design documents, or in other documents at various levels of detail. No matter what they are called, some type of documentation with detailed requirements will be needed by testers in order to properly plan and execute tests. Without such documentation, there will be no clear-cut way to determine if a software application is performing correctly.
'Agile' methods such as XP require close interaction and cooperation between programmers and customers/end-users to iteratively develop requirements. The programmer uses 'test first' development to first create automated unit testing code, which essentially embodies the requirements.
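As a sketch of how test code can embody a requirement, take the testable password requirement quoted earlier. Everything here is a hypothetical illustration - the `authenticate` function, the sample data, and the test names are invented, not from any particular system:

```python
import unittest

# Hypothetical system under test. Under 'test first', the tests below would
# be written (and fail) before this function exists.
ASSIGNED_PASSWORDS = {"alice": "s3cret"}  # assumed sample data

def authenticate(user, password):
    """Return True only if the password matches the one assigned to the user."""
    return ASSIGNED_PASSWORDS.get(user) == password

class TestPasswordRequirement(unittest.TestCase):
    # These tests encode the requirement 'the user must enter their
    # previously-assigned password to access the application'.
    def test_correct_password_grants_access(self):
        self.assertTrue(authenticate("alice", "s3cret"))

    def test_wrong_password_denies_access(self):
        self.assertFalse(authenticate("alice", "guess"))

    def test_unknown_user_denies_access(self):
        self.assertFalse(authenticate("mallory", "s3cret"))
```

Running `python -m unittest` against this module executes all three checks; the point is that the test class, not a prose document, states what 'correct' means.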
What steps are needed to develop and run software tests?
The following are some of the steps to consider:
Obtain requirements, functional design, and internal design specifications and other necessary documents
Obtain budget and schedule requirements
Determine project-related personnel and their responsibilities, reporting requirements, required standards and processes (such as release processes, change processes, etc.)
Determine project context, relative to the existing quality culture of the organization and business, and how it might impact testing scope, approaches, and methods
Identify application's higher-risk aspects, set priorities, and determine scope and limitations of tests
Determine test approaches and methods - unit, integration, functional, system, load, usability tests, etc.
Determine test environment requirements (hardware, software, communications, etc.)
Determine testware requirements (record/playback tools, coverage analyzers, test tracking, problem/bug tracking, etc.)
Determine test input data requirements
Identify tasks, those responsible for tasks, and labor requirements
Set schedule estimates, timelines, milestones
Determine input equivalence classes, boundary value analyses, error classes
Prepare test plan document and have needed reviews/approvals
Write test cases
Have needed reviews/inspections/approvals of test cases
Prepare test environment and testware, obtain needed user manuals/reference documents/configuration guides/installation guides, set up test tracking processes, set up logging and archiving processes, set up or obtain test input data
Obtain and install software releases
Perform tests
Evaluate and report results
Track problems/bugs and fixes
Retest as needed
Maintain and update test plans, test cases, test environment, and testware through life cycle
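The 'input equivalence classes, boundary value analyses' step above can be sketched in a few lines. The numeric field and its valid range are invented for illustration; real class partitioning depends on the application's documented rules:

```python
def boundary_values(lo, hi):
    """Classic boundary value picks for a valid integer range [lo, hi]:
    just below, at, and just above each boundary."""
    return [lo - 1, lo, lo + 1, hi - 1, hi, hi + 1]

def equivalence_classes(lo, hi):
    """One representative input per class: below-range (invalid),
    in-range (valid), above-range (invalid)."""
    return {
        "invalid_low": lo - 10,
        "valid": (lo + hi) // 2,
        "invalid_high": hi + 10,
    }

# Hypothetical example: an 'age' field documented as valid from 18 to 65.
print(boundary_values(18, 65))      # edge inputs to try
print(equivalence_classes(18, 65))  # one probe per class
```

Each generated value becomes the input of one test case, with the expected result (accept or reject) taken from the requirement.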
What's a 'test plan'?
A software project test plan is a document that describes the objectives, scope, approach, and focus of a software testing effort. The process of preparing a test plan is a useful way to think through the efforts needed to validate the acceptability of a software product. The completed document will help people outside the test group understand the 'why' and 'how' of product validation. It should be thorough enough to be useful but not so thorough that no one outside the test group will read it. The following are some of the items that might be included in a test plan, depending on the particular project:
Title
Identification of software including version/release numbers
Revision history of document including authors, dates, approvals
Table of Contents
Purpose of document, intended audience
Objective of testing effort
Software product overview
Relevant related document list, such as requirements, design documents, other test plans, etc.
Relevant standards or legal requirements
Traceability requirements
Relevant naming conventions and identifier conventions
Overall software project organization and personnel/contact-info/responsibilities
Test organization and personnel/contact-info/responsibilities
Assumptions and dependencies
Project risk analysis
Testing priorities and focus
Scope and limitations of testing
Test outline - a decomposition of the test approach by test type, feature, functionality, process, system, module, etc. as applicable
Outline of data input equivalence classes, boundary value analysis, error classes
Test environment - hardware, operating systems, other required software, data configurations, interfaces to other systems
Test environment validity analysis - differences between the test and production systems and their impact on test validity
Test environment setup and configuration issues
Software migration processes
Software CM processes
Test data setup requirements
Database setup requirements
Outline of system-logging/error-logging/other capabilities, and tools such as screen capture software, that will be used to help describe and report bugs
Discussion of any specialized software or hardware tools that will be used by testers to help track the cause or source of bugs
Test automation - justification and overview
Test tools to be used, including versions, patches, etc.
Test script/test code maintenance processes and version control
Problem tracking and resolution - tools and processes
Project test metrics to be used
Reporting requirements and testing deliverables
Software entrance and exit criteria
Initial sanity testing period and criteria
Test suspension and restart criteria
Personnel allocation
Personnel pre-training needs
Test site/location
Outside test organizations to be utilized and their purpose, responsibilities, deliverables, contact persons, and coordination issues
Relevant proprietary, classified, security, and licensing issues
Open issues
Appendix - glossary, acronyms, etc.
What's a 'test case'?
A test case is a document that describes an input, action, or event and an expected response, to determine if a feature of an application is working correctly. A test case should contain particulars such as test case identifier, test case name, objective, test conditions/setup, input data requirements, steps, and expected results.
Note that the process of developing test cases can help find problems in the requirements or design of an application, since it requires completely thinking through the operation of the application. For this reason, it's useful to prepare test cases early in the development cycle if possible.
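The particulars listed above can be captured in a simple record structure. The field names mirror the list; the sample values are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class TestCase:
    """Minimal test case record holding the particulars named above."""
    identifier: str
    name: str
    objective: str
    setup: str
    input_data: str
    steps: list
    expected_result: str

# Hypothetical example case.
tc = TestCase(
    identifier="TC-042",
    name="Login with assigned password",
    objective="Verify access requires the previously-assigned password",
    setup="User 'alice' exists with an assigned password",
    input_data="username=alice, password=<assigned>",
    steps=["Open login screen", "Enter credentials", "Submit"],
    expected_result="Application main screen is displayed",
)
print(tc.identifier, "-", tc.name)
```

A spreadsheet row or test-management-tool entry would serve equally well; the point is that every case records the same particulars in the same places.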
What should be done after a bug is found?
The bug needs to be communicated and assigned to developers that can fix it. After the problem is resolved, fixes should be re-tested, and determinations made regarding requirements for regression testing to check that fixes didn't create problems elsewhere. If a problem-tracking system is in place, it should encapsulate these processes. A variety of commercial problem-tracking/management software tools are available (see the 'Tools' section for web resources with listings of such tools). The following are items to consider in the tracking process:
Complete information such that developers can understand the bug, get an idea of its severity, and reproduce it if necessary
Bug identifier (number, ID, etc.)
Current bug status (e.g., 'Released for Retest', 'New', etc.)
The application name or identifier and version
The function, module, feature, object, screen, etc. where the bug occurred
Environment specifics, system, platform, relevant hardware specifics
Test case name/number/identifier
One-line bug description
Full bug description
Description of steps needed to reproduce the bug if not covered by a test case or if the developer doesn't have easy access to the test case/test script/test tool
Names and/or descriptions of file/data/messages/etc. used in test
File excerpts/error messages/log file excerpts/screen shots/test tool logs that would be helpful in finding the cause of the problem
Severity estimate (a 5-level range such as 1-5 or 'critical'-to-'low' is common)
Was the bug reproducible?
Tester name
Test date
Bug reporting date
Name of developer/group/organization the problem is assigned to
Description of problem cause
Description of fix
Code section/file/module/class/method that was fixed
Date of fix
Application version that contains the fix
Tester responsible for retest
Retest date
Retest results
Regression testing requirements
Tester responsible for regression tests
Regression testing results
A reporting or tracking process should enable notification of appropriate personnel at various stages. For instance, testers need to know when retesting is needed, developers need to know when bugs are found and how to get the needed information, and reporting/summary capabilities are needed for managers.
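The notification idea can be sketched as a status-to-recipient mapping. The status names and roles are illustrative assumptions, not from any particular tracking tool:

```python
# Who should be notified when a bug enters a given status.
# Statuses and roles here are invented for illustration.
NOTIFY = {
    "New": ["assigned_developer"],       # developer needs the bug details
    "Fixed": ["tester"],                 # tester needs to know retesting is needed
    "Released for Retest": ["tester"],
    "Closed": ["manager"],               # managers want summary/closure info
}

def recipients(status):
    """Return the roles to notify for a bug status change."""
    return NOTIFY.get(status, ["manager"])  # unknown status: escalate to manager

print(recipients("Fixed"))
print(recipients("some unexpected status"))
```

A real tracking system would attach actual names and delivery channels to each role, but the routing logic is essentially this table lookup.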
What is 'configuration management'?
Configuration management covers the processes used to control, coordinate, and track: code, requirements, documentation, problems, change requests, designs, tools/compilers/libraries/patches, changes made to them, and who makes the changes.
What if the software is so buggy it can't really be tested at all?
The best bet in this situation is for the testers to go through the process of reporting whatever bugs or blocking-type problems initially show up, with the focus being on critical bugs. Since this type of problem can severely affect schedules, and indicates deeper problems in the software development process (such as insufficient unit testing or insufficient integration testing, poor design, improper build or release procedures, etc.), managers should be notified and provided with some documentation as evidence of the problem.
How can it be known when to stop testing?
This can be difficult to determine. Many modern software applications are so complex, and run in such an interdependent environment, that complete testing can never be done. Common factors in deciding when to stop are:
Deadlines (release deadlines, testing deadlines, etc.)
Test cases completed with certain percentage passed
Test budget depleted
Coverage of code/functionality/requirements reaches a specified point
Bug rate falls below a certain level
Beta or alpha testing period ends
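Several of these factors can be combined into one stop/continue decision. The thresholds below are arbitrary assumptions a real project would set for itself:

```python
def should_stop_testing(pct_cases_passed, pct_coverage, bugs_per_day,
                        budget_left, deadline_reached):
    """Stop when the deadline or budget forces it, or when pass rate,
    coverage, and bug-find rate all look acceptable (thresholds assumed)."""
    if deadline_reached or budget_left <= 0:
        return True
    return (pct_cases_passed >= 95      # assumed pass-rate threshold
            and pct_coverage >= 80      # assumed coverage threshold
            and bugs_per_day < 1.0)     # assumed acceptable bug-find rate

print(should_stop_testing(97, 85, 0.5, budget_left=1000, deadline_reached=False))  # True
print(should_stop_testing(90, 85, 0.5, budget_left=1000, deadline_reached=False))  # False
```

In practice such criteria belong in the test plan (as 'exit criteria'), agreed on before testing starts rather than decided under deadline pressure.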
What if there isn't enough time for thorough testing?
Use risk analysis to determine where testing should be focused.
Since it's rarely possible to test every possible aspect of an application, every possible combination of events, every dependency, or everything that could go wrong, risk analysis is appropriate to most software development projects. This requires judgment skills, common sense, and experience. (If warranted, formal methods are also available.) Considerations can include:
Which functionality is most important to the project's intended purpose?
Which functionality is most visible to the user?
Which functionality has the largest safety impact?
Which functionality has the largest financial impact on users?
Which aspects of the application are most important to the customer?
Which aspects of the application can be tested early in the development cycle?
Which parts of the code are most complex, and thus most subject to errors?
Which parts of the application were developed in rush or panic mode?
Which aspects of similar/related previous projects caused problems?
Which aspects of similar/related previous projects had large maintenance expenses?
Which parts of the requirements and design are unclear or poorly thought out?
What do the developers think are the highest-risk aspects of the application?
What kinds of problems would cause the worst publicity?
What kinds of problems would cause the most customer service complaints?
What kinds of tests could easily cover multiple functionalities?
Which tests will have the best high-risk-coverage to time-required ratio?
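These considerations can feed a simple risk-priority score, for example impact times likelihood, so that limited test time goes to the highest scores first. The feature names and ratings below are invented for illustration:

```python
# Rate each area 1-5 for impact (safety/financial/visibility) and for
# likelihood of defects (complexity, rushed development, past problems).
areas = [
    ("payment processing", 5, 4),   # hypothetical ratings: (name, impact, likelihood)
    ("report formatting",  2, 2),
    ("user login",         4, 3),
]

def prioritize(areas):
    """Sort areas by risk score (impact * likelihood), highest first."""
    return sorted(areas, key=lambda a: a[1] * a[2], reverse=True)

for name, impact, likelihood in prioritize(areas):
    print(f"{impact * likelihood:>2}  {name}")
```

The scores themselves are crude; the value is in forcing an explicit, comparable ranking instead of testing whatever happens to be in front of the tester.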
What if the #ro-ect isn't !ig enough to -ustify e$tensive testing?
2onsier the impact of project errors9 not the si)e of the project: 'owe%er9 if e(tensi%e testing is still not
justifie9 ris- analysis is again neee an the same consierations as escri!e pre%iously in 'What if there
isn't enough time for thorough testing?' apply: The tester might then o a hoc testing9 or write up a limite
test plan !ase on the ris- analysis:
What can be done if requirements are changing continuously?
A common problem and a major headache.
Work with the project's stakeholders early on to understand how requirements might change so that alternate test plans and strategies can be worked out in advance, if possible.
It's helpful if the application's initial design allows for some adaptability so that later changes do not require redoing the application from scratch.
If the code is well-commented and well-documented this makes changes easier for the developers.
Use rapid prototyping whenever possible to help customers feel sure of their requirements and minimize changes.
The project's initial schedule should allow for some extra time commensurate with the possibility of changes.
Try to move new requirements to a 'Phase 2' version of an application, while using the original requirements for the 'Phase 1' version.
Negotiate to allow only easily-implemented new requirements into the project, while moving more difficult new requirements into future versions of the application.
Be sure that customers and management understand the scheduling impacts, inherent risks, and costs of significant requirements changes. Then let management or the customers (not the developers or testers) decide if the changes are warranted - after all, that's their job.
Balance the effort put into setting up automated testing with the expected effort required to re-do it to deal with changes.
Try to design some flexibility into automated test scripts.
Focus initial automated testing on application aspects that are most likely to remain unchanged.
Devote appropriate effort to risk analysis of changes to minimize regression testing needs.
Design some flexibility into test cases (this is not easily done; the best bet might be to minimize the detail in the test cases, or set up only higher-level generic-type test plans).
Focus less on detailed test plans and test cases and more on ad hoc testing (with an understanding of the added risk that this entails).
What if the application has functionality that wasn't in the requirements?
It may take serious effort to determine if an application has significant unexpected or hidden functionality, and it would indicate deeper problems in the software development process. If the functionality isn't necessary to the purpose of the application, it should be removed, as it may have unknown impacts or dependencies that were not taken into account by the designer or the customer. If not removed, design information will be needed to determine added testing needs or regression testing needs. Management should be made aware of any significant added risks as a result of the unexpected functionality. If the functionality only affects areas such as minor improvements in the user interface, for example, it may not be a significant risk.
How can Software QA processes be implemented without stifling productivity?
By implementing QA processes slowly over time, using consensus to reach agreement on processes, and adjusting and experimenting as an organization grows and matures, productivity will be improved instead of stifled. Problem prevention will lessen the need for problem detection, panics and burn-out will decrease, and there will be improved focus and less wasted effort. At the same time, attempts should be made to keep processes simple and efficient, minimize paperwork, promote computer-based processes and automated tracking and reporting, minimize time required in meetings, and promote training as part of the QA process. However, no one - especially talented technical types - likes rules or bureaucracy, and in the short run things may slow down a bit. A typical scenario would be that more days of planning and development will be needed, but less time will be required for late-night bug-fixing and calming of irate customers.
What if an organization is growing so fast that fixed QA processes are impossible?
This is a common problem in the software industry, especially in new technology areas. There is no easy solution in this situation, other than:
Hire good people
Management should 'ruthlessly prioritize' quality issues and maintain focus on the customer
Everyone in the organization should be clear on what 'quality' means to the customer
How does a client/server environment affect testing?
Client/server applications can be quite complex due to the multiple dependencies among clients, data communications, hardware, and servers. Thus testing requirements can be extensive. When time is limited (as it usually is) the focus should be on integration and system testing. Additionally, load/stress/performance testing may be useful in determining client/server application limitations and capabilities. There are commercial tools to assist with such testing. (See the 'Tools' section for web resources with listings that include these kinds of test tools.)
How can World Wide Web sites be tested?
Web sites are essentially client/server applications - with web servers and 'browser' clients. Consideration should be given to the interactions between HTML pages, TCP/IP communications, Internet connections, firewalls, applications that run in web pages (such as applets, JavaScript, plug-in applications), and applications that run on the server side (such as CGI scripts, database interfaces, logging applications, dynamic page generators, ASP, etc.). Additionally, there are a wide variety of servers and browsers, various versions of each, small but sometimes significant differences between them, variations in connection speeds, rapidly changing technologies, and multiple standards and protocols. The end result is that testing for web sites can become a major ongoing effort. Other considerations might include:
What are the expected loads on the server (e.g., number of hits per unit time), and what kind of performance is required under such loads (such as web server response time, database query response times)? What kinds of tools will be needed for performance testing (such as web load testing tools, other tools already in house that can be adapted, web robot downloading tools, etc.)?
Who is the target audience? What kind of browsers will they be using? What kind of connection speeds will they be using? Are they intra-organization (thus with likely high connection speeds and similar browsers) or Internet-wide (thus with a wide variety of connection speeds and browser types)?
What kind of performance is expected on the client side (e.g., how fast should pages appear, how fast should animations, applets, etc. load and run)?
Will down time for server and content maintenance/upgrades be allowed? How much?
What kinds of security (firewalls, encryption, passwords, etc.) will be required and what is it expected to do? How can it be tested?
How reliable are the site's Internet connections required to be? And how does that affect backup system or redundant connection requirements and testing?
What processes will be required to manage updates to the web site's content, and what are the requirements for maintaining, tracking, and controlling page content, graphics, links, etc.?
Which HTML specification will be adhered to? How strictly? What variations will be allowed for targeted browsers?
Will there be any standards or requirements for page appearance and/or graphics throughout a site or parts of a site?
How will internal and external links be validated and updated? How often?
Can testing be done on the production system, or will a separate test system be required? How are browser caching, variations in browser option settings, dial-up connection variabilities, and real-world internet 'traffic congestion' problems to be accounted for in testing?
How extensive or customized are the server logging and reporting requirements; are they considered an integral part of the system and do they require testing?
How are CGI programs, applets, JavaScript, ActiveX components, etc. to be maintained, tracked, controlled, and tested?
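The link-validation idea that keeps coming up can be sketched with only the Python standard library. This is an illustrative fragment, not a substitute for a real web test tool, which would also handle redirects, retries, robots rules, and script-generated links:

```python
from html.parser import HTMLParser
from urllib.request import Request, urlopen
from urllib.error import URLError, HTTPError

class LinkExtractor(HTMLParser):
    """Collect the href targets of <a> tags from an HTML page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def check_link(url, timeout=5):
    """Return (url, status) where status is an HTTP code or an error string."""
    try:
        with urlopen(Request(url, method="HEAD"), timeout=timeout) as resp:
            return url, resp.status
    except HTTPError as e:
        return url, e.code
    except URLError as e:
        return url, str(e.reason)

# Extract links from a page fragment (no network needed for this part).
parser = LinkExtractor()
parser.feed('<p><a href="https://example.com/">home</a> <a href="/about">about</a></p>')
print(parser.links)
```

Feeding each extracted absolute URL to `check_link` and flagging anything that doesn't return a 2xx or 3xx status gives a minimal broken-link report.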
Some sources of site security information include the Usenet newsgroup 'comp.security.announce' and links concerning web site security in the 'Other Resources' section.
Some usability guidelines to consider - these are subjective and may or may not apply to a given situation (Note: more information on usability testing issues can be found in articles about web site usability in the 'Other Resources' section):
Pages should be 3-5 screens max unless content is tightly focused on a single topic. If larger, provide internal links within the page.
The page layouts and design elements should be consistent throughout a site, so that it's clear to the user that they're still within a site.
Pages should be as browser-independent as possible, or pages should be provided or generated based on the browser type.
All pages should have links external to the page; there should be no dead-end pages.
The page owner, revision date, and a link to a contact person or organization should be included on each page.
Many new web site test tools have appeared in recent years and more than 290 of them are listed in the 'Web Test Tools' section.
How is testing affected by object-oriented designs?
Well-engineered object-oriented design can make it easier to trace from code to internal design to functional design to requirements. While there will be little effect on black box testing (where an understanding of the internal design of the application is unnecessary), white-box testing can be oriented to the application's objects. If the application was well-designed this can simplify test design.
What is Extreme Programming and what's it got to do with testing?
Extreme Programming (XP) is a software development approach for small teams on risk-prone projects with unstable requirements. It was created by Kent Beck, who described the approach in his book 'Extreme Programming Explained' (see the Softwareqatest.com Books page). Testing ('extreme testing') is a core aspect of Extreme Programming. Programmers are expected to write unit and functional test code first - before the application is developed. Test code is under source control along with the rest of the code. Customers are expected to be an integral part of the project team and to help develop scenarios for acceptance/black box testing. Acceptance tests are preferably automated, and are modified and rerun for each of the frequent development iterations. QA and test personnel are also required to be an integral part of the project team. Detailed requirements documentation is not used, and frequent re-scheduling, re-estimating, and re-prioritizing is expected. For more info on XP and other 'agile' software development approaches (Scrum, Crystal, etc.) see resource listings in the Softwareqatest.com 'Other Resources' section.