Testing: -------Testing is defined as the process in which defects are identified, isolated and subjected to rectification, and the product is ensured to be defect free, in order to provide quality in the product and hence customer satisfaction.
Identification of defects: One has to identify the areas in which the requirements are not satisfied properly; these are to be considered as defects. In this way, several defects are encountered in the system.
Isolation of defects: All the defects encountered in the previous step are listed out in a separate document so that they can be dealt with.
Subjected to rectification: The documented list of defects is sent to the development team for the process of rectification.
What is verification: A QA person uses verification methods to ensure the system complies with organization standards and processes, relying on reviews or other non-executable methods (applied to software, hardware, documentation and personnel). Verification asks: are we building the product right?
What is validation: ------------------Validation physically ensures that the system operates according to plan by executing the system functions through a series of tests that can be observed or evaluated. Validation asks: are we building the right product?
What is quality assurance: -------------------------A planned and systematic pattern for all actions necessary to provide adequate confidence that the item or product conforms to established technical requirements. In this process, QA people verify the completeness and correctness of the roles and responsibilities of departments and employees, i.e. whether they are working according to standards or not.
What is quality control: -----------------------Quality Control is defined as a set of activities or techniques whose purpose is to ensure that all quality requirements are being met. In order to achieve this purpose, processes are monitored and performance problems are solved.
Bug: A fault in a program which causes the program to perform in an unintended or unanticipated manner.
Sign In: -------It is the process in which the customer and the development team come to the agreement that the project has to be completed within the deadline and within the budget.
-----------------------------------------------------------
SOFTWARE DEVELOPMENT LIFE CYCLE: -------------------------------A phased engineering approach to software development:
Requirements Analysis & Specification
Design (Architecture, System or High Level; Detailed or Low Level)
Coding / Construction
Testing
Maintenance
Design: ------Responsibility: To prepare the 1. HLD 2. LLDs. Role: Designer. Documents: TDD.
Contents of TDD: ---------------Flow charts, Data flow diagrams, Data tables, Pseudo code
Testing: -------Responsibility: To validate an application to ensure that all the requirements of the customer are met. Role: Test Engineer. Documents: Test deliverables.
-----------------------------------------------------------
The waterfall model describes a process of stepwise refinement, widely used in the defense & aerospace industries.
But software is different: there is no fabrication step; program code is just another design level. Hence there is no commit step - software can always be changed! And there is no body of experience for design analysis (yet).
The waterfall model takes a static view of requirements and ignores changing needs. There is a lack of user involvement once the specification is written, and it doesn't accommodate prototyping, reuse, etc.
Prototyping is used for: - understanding the requirements for the user interface - examining feasibility of a proposed design approach - exploring system performance issues
Disadvantages: 1. Users treat the prototype as the solution 2. A prototype is only a partial specification
v-model: --------Maps each development stage on the left-hand side to a corresponding testing stage on the right-hand side.
-----------------------------------------------------------
LEVELS OF TESTING: -->Unit Testing -->Integration Testing -->System Testing -->User Acceptance Testing
1)Unit Testing: --------------The testing done to show whether a unit (the smallest piece of software that can be independently compiled or assembled, loaded, and tested) satisfies its functional specification, or whether its implemented structure matches the intended design structure.
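As a minimal sketch (assuming Python and its standard `unittest` module; `apply_discount` is a hypothetical unit invented for illustration), a unit test checks one unit against its functional specification:

```python
import unittest

def apply_discount(price, percent):
    """Unit under test: a hypothetical discount calculator."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class ApplyDiscountTest(unittest.TestCase):
    """Checks the unit against its functional specification."""

    def test_typical_discount(self):
        self.assertEqual(apply_discount(200.0, 10), 180.0)

    def test_zero_discount_returns_price(self):
        self.assertEqual(apply_discount(99.99, 0), 99.99)

    def test_invalid_percent_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)

# Run the unit tests programmatically.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(ApplyDiscountTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Each test method exercises the unit in isolation, without any other module involved.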
2)Integration Testing: ---------------------Integration Testing refers to the testing in which software units of an application are combined and tested for evaluating the interaction between them.
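A sketch of the idea (the two units below, a hypothetical parser and a hypothetical totaller, are invented for illustration): each unit may pass its own unit tests, and integration testing then checks the interaction between them through their shared interface.

```python
# Two separately unit-tested modules, combined for integration testing.

def parse_order(line):
    """Unit 1: parse 'item,qty,unit_price' into a tuple."""
    item, qty, price = line.split(",")
    return item.strip(), int(qty), float(price)

def order_total(orders):
    """Unit 2: sum qty * unit_price over parsed orders."""
    return round(sum(qty * price for _, qty, price in orders), 2)

def total_from_lines(lines):
    """Integration point: the output of parse_order feeds order_total."""
    return order_total([parse_order(line) for line in lines])

# Integration check: the interaction between the two units.
lines = ["pen, 3, 1.50", "book, 2, 12.00"]
assert total_from_lines(lines) == 28.50  # 3*1.50 + 2*12.00
```

A defect here (e.g. Unit 1 emitting a string where Unit 2 expects a number) would escape both unit tests but be caught at the integration level.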
-----------------------------------------------------------
---------------------------After completion of the integration of all possible modules as a system, the separate testing team in an organization validates that build through a set of black box testing techniques. These techniques are classified into four divisions.
----------------------------------------------------------1.Usability Testing: In general, system testing starts with usability. This is done in the early days of the job. At this level, the testing team follows two testing techniques: a) user interface testing and b) manual support testing.
b)Manual support testing: In this technique, the context sensitiveness (with respect to the work or task) of the user manuals is validated. This is done in the end days of the job. Eg: Help documents of user manuals.
Receive build from developers -> User interface testing -> Remaining functional & system tests -> Manual support testing
-----------------------------------------------------------
2.Functional Testing: A mandatory part of black box testing is functional testing. During these tests, the testing team concentrates on "meeting customer requirements". Functional testing is classified into the below sub tests.
a)Functionality testing: It is also known as requirements testing and validates the correctness of every functionality. Functionality testing consists of the below coverages:
* Behavioral coverage: The properties of objects change with respect to the task. For example, whenever we type a user-id & password, the Submit button is automatically enabled.
* Error-handling coverage: In this testing, we have to prevent negative navigations. For example, in the login process, if we give a wrong password the application has to give a warning message; it should not terminate.
* Input domain coverage: In this we have to check whether the size and type of every input object are correct or not.
* Backend coverage: In this we have to see that the impact of front-end operations is stored in the back-end database. It means that when operations are performed on the front-end, the change has to take place in the back-end tables also.
* Service levels coverage: In this we have to see that the functionalities are placed in the right order. For example, there are some functionalities listed below.
Here there is one wrong placement, i.e. forgot password. If you forgot your password, how can you log in to reach that option?
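Backend coverage from the list above can be sketched in code (assuming Python's built-in sqlite3 module; `register_user` is a hypothetical front-end operation invented for illustration): perform the front-end operation, then query the back-end table to confirm the change really landed.

```python
import sqlite3

def register_user(conn, user_id, name):
    """Hypothetical front-end operation that should write to the backend."""
    conn.execute("INSERT INTO users (user_id, name) VALUES (?, ?)",
                 (user_id, name))
    conn.commit()

# Set up an in-memory backend table for the test.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (user_id TEXT PRIMARY KEY, name TEXT)")

# Front-end operation ...
register_user(conn, "u101", "Asha")

# ... backend coverage: verify the row actually reached the table.
row = conn.execute("SELECT name FROM users WHERE user_id = ?",
                   ("u101",)).fetchone()
assert row == ("Asha",)
```

In a real build, the same idea applies with the application's actual database instead of an in-memory stand-in.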
-----------------------------------------------------------
b)Input domain testing: It is a part of functionality testing, but test engineers give special treatment to the input domains of objects, using 'Boundary Value Analysis' (BVA) and 'Equivalence Class Partitioning' (ECP).
BVA defines the range and size of an object. For example, take age with range 18-60: here the range is taken into consideration, not the size. ECP defines what type of data the object accepts, partitioned into valid and invalid classes.
Example 1: A login process allows a user-id and password for authorized users. From the design documents, the user-id allows alphanumeric characters 4-16 characters long, and the password allows lower case 4-8 characters long. Prepare BVA and ECP for the user-id and password.
Example 2: A textbox allows 12-digit numbers. In this number '*' is mandatory and '-' is optional. Give BVA and ECP for this textbox.
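A sketch of how the BVA and ECP answer for Example 1 could be checked in code (assuming the 4-16 alphanumeric user-id and 4-8 lower-case password rules stated above; the validator functions themselves are invented for illustration):

```python
def valid_user_id(s):
    """User-id rule from the design docs: alphanumeric, 4-16 characters."""
    return 4 <= len(s) <= 16 and s.isalnum()

def valid_password(s):
    """Password rule: lower-case letters only, 4-8 characters."""
    return 4 <= len(s) <= 8 and s.isalpha() and s.islower()

# Boundary Value Analysis on user-id length: min-1, min, min+1, max-1, max, max+1
bva_user_id = {n: valid_user_id("a" * n) for n in (3, 4, 5, 15, 16, 17)}
assert bva_user_id == {3: False, 4: True, 5: True,
                       15: True, 16: True, 17: False}

# Equivalence Class Partitioning on user-id type: valid vs invalid classes
assert valid_user_id("abc123")      # valid class: alphanumeric
assert not valid_user_id("ab c12")  # invalid class: contains a space
assert not valid_user_id("ab@12!")  # invalid class: special characters

# The same idea applied to the password field
assert valid_password("abcd") and valid_password("abcdefgh")
assert not valid_password("abc")    # below the minimum length
assert not valid_password("ABCD")   # invalid class: upper case
```

BVA supplies the length boundaries; ECP supplies one representative per valid and invalid class.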
c)Sanitation Testing: --------------------It is also known as garbage testing. During this test, test engineers find extra functionality in the build with respect to the SRS.
d)Installation Testing: ---------------------The ease of the interface is checked during installation, and the occupied disk space is checked after installation.
e)Parallel Testing: ------------------It is also known as comparative testing. During this test, engineers try to find the competitiveness of our product through comparison with other products. This test is done only for a software product, not for application software. -----------------------------------------------------------
3.Performance Testing: ---------------------Testing conducted to evaluate the compliance of a system or component with specified performance requirements. Often this is performed using an automated test tool to simulate a large number of users. Also known as "Load Testing".
a)Recovery Testing: ------------------It is also known as reliability testing. During this test, test engineers validate whether our application build changes from an abnormal state back to a normal state or not.
b)Compatibility Testing: -----------------------It is also known as portability testing. During this test, test engineers validate whether our application build runs on the customer-expected platforms or not. During this test, engineers face two types of compatibility problems, i.e. forward compatibility and backward compatibility.
c)Storage Testing: -----------------The execution of our application build under a huge amount of resources to estimate the storage limit is called storage testing.
d)Load Testing: ------------The execution of our application build under the customer-expected configuration and the customer-expected load to estimate performance is called load testing.
e)Stress Testing: -------------The execution of our application build under the customer-expected configuration and loads beyond the expected level (at irregular intervals) to estimate performance is called stress testing.
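The load-testing idea of simulating many users can be sketched with plain threads (a toy model, assuming Python; `handle_request` is a hypothetical stand-in transaction, and a real load test would drive the live build with a dedicated tool):

```python
import threading
import time

def handle_request(results, user_id):
    """Hypothetical transaction under test; records its response time."""
    start = time.perf_counter()
    total = sum(range(10_000))          # stand-in for server-side work
    results[user_id] = time.perf_counter() - start
    return total

# Simulate the customer-expected load: N virtual users in parallel.
N_USERS = 20
results = {}
threads = [threading.Thread(target=handle_request, args=(results, u))
           for u in range(N_USERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Load-test style metrics: every user served, response times collected.
assert len(results) == N_USERS
avg_response = sum(results.values()) / N_USERS
print(f"avg response time: {avg_response * 1000:.2f} ms")
```

Stress testing follows the same pattern but pushes N_USERS and the request rate beyond the expected limits until the build degrades or fails.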
-----------------------------------------------------------
4.Security Testing: ------------------This testing technique is complex to apply. During this test, the testing team concentrates on "privacy of user operations" in our application build. This technique is classified into the below sub tests (authorization, access control, encryption/decryption).
b)Access Control: ----------------Whether a valid user has permission to use a specific service or not?
c)Encryption/Decryption: -----------------------Here the sender, i.e. the client, performs encryption and the receiver, i.e. the server, performs decryption.
Note: In small-scale organizations, authorization and access control are covered by the test engineers. Developers perform the encryption and decryption procedures.
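The encrypt-on-the-client, decrypt-on-the-server idea can be illustrated with a toy cipher (a simple XOR scheme invented purely for illustration; it is NOT real cryptography, and a real build would use TLS or a vetted crypto library):

```python
def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: XOR each byte with the repeating key.
    XOR is its own inverse, so the same function encrypts and decrypts."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = b"shared-secret"
plaintext = b"password=tiger123"

# Client side: encrypt before sending over the wire.
ciphertext = xor_cipher(plaintext, key)
assert ciphertext != plaintext

# Server side: decrypt with the same shared key.
assert xor_cipher(ciphertext, key) == plaintext
```

The test engineer's concern here is behavioral: what leaves the client must be unreadable, and the server must recover the original.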
----------------------------------------------------------VI)User Acceptance Testing: --------------------------After completion of all possible functional and system testing, our project management concentrates on user acceptance testing to collect feedback from customer-site people.
Alpha Testing: -------------It is the type of UAT conducted on an application within the development organization, with customer-site people invited to the development environment.
-------------------------------------------------------
1.Monkey Testing or Chimpanzee Testing: --------------------------------------The coverage of only the "main activities" in our application build during testing is called monkey testing. Due to lack of time, the testing team follows this type of testing.
For example, if Mail open, Mail compose, Mail reply and Mail forward are there, we conduct only Mail open and Mail compose, because Mail reply and Mail forward are similar to Mail compose and time is short.
2.Exploratory Testing: ---------------------The coverage of all activities, level by level, during testing is called exploratory testing. Due to lack of knowledge of the application, test engineers follow this style of testing.
3.Ad-hoc Testing: ----------------A tester conducting a test on an application build depending on predetermined ideas is called ad-hoc testing. Based on past experience, the tester tests the build of the project.
4.Sanity Testing: ----------------Checks whether a build released by the development team is stable enough for complete testing or not. For example, saying that a watch is not good without giving a reason - just "the watch is not working".
5.Smoke Testing: ---------------It is an extra shake-up over sanity testing. At this level, the testing team rejects a build with a reason when that build is not fit for complete testing. For example, saying that the watch is not working due to the key rod, i.e. with a reason.
6.Retesting: -----------The re-execution of a test with multiple test data on the same application build is called retesting.
Test Data: ---------
Input1: Min  Min  Max  Max
Input2: Min  Max  Min  Max
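The min/max combinations above can be driven as multiple test data over the same build. A sketch (assuming Python; `transfer` is a hypothetical unit with two bounded inputs, invented for illustration):

```python
from itertools import product

def transfer(amount, fee):
    """Hypothetical unit with two bounded inputs: amount 1-1000, fee 0-50."""
    if not (1 <= amount <= 1000) or not (0 <= fee <= 50):
        raise ValueError("input out of range")
    return amount - fee

# Retesting: re-execute the same test with every min/max combination.
AMOUNT_MIN, AMOUNT_MAX = 1, 1000
FEE_MIN, FEE_MAX = 0, 50
for amount, fee in product((AMOUNT_MIN, AMOUNT_MAX), (FEE_MIN, FEE_MAX)):
    result = transfer(amount, fee)
    assert result == amount - fee   # same expectation, multiple data sets
```

The same test logic runs four times, once per row of the min/max table.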
7.Regression Testing: ---------------------The re-execution of tests on a modified build, to ensure the "bug fix work" is correct and to check for possible side effects, is called regression testing.
Note: From the definitions of retesting and regression testing, test repetition is a mandatory task in a test engineer's job. For this reason, test engineers turn to test automation.
TESTING LIFE CYCLE: ------------------An Effective testing lifecycle must be imposed over the development lifecycle before testing improvements can be made.
The goal is to make the testing flow naturally from the development work.
The testing work should not disrupt or lengthen the development work.
Company level:
Test policy <- Quality Control (QC)
Test strategy <- Quality Analysts (QA)
Project level:
Test methodology <- Quality Analysts (QA) / Project Manager
Test plan <- Test Lead
Test cases, Test procedures, Test scripts <- Test Engineers
Test execution, Test log, Defect report <- Test Engineers
-----------------------------------------------------------
1)Test policy: It is a company-level document and is developed by quality control people (QC - almost management).
LOC -> Lines of Code; FP -> Function Points (i.e. number of screens, inputs, outputs, queries, forms, reports); QAM -> Quality Assessment Measurement; TMM -> Test Management Measurement; PCM -> Process Capability Measurement
-----------------------------------------------------------
Testing process
Testing Measurements:QAM,TMM,PCM
The above test policy defines the "testing objective". To meet that objective, quality analyst people define the test strategy.
2)Test Strategy: ---------------It is a company-level document and is developed by quality analyst (QA) people. The test strategy defines a common testing approach to be followed.
This testing approach is expressed as a matrix called the Test Responsibility Matrix (TRM) / Test Matrix (TM). It is mainly based on the development stages and test factors. On the development-stages side there are five stages, and on the test-factors side there are fifteen factors. These are shown in the following figure:
[Figure: TRM - test factors (rows) vs development stages (columns)]
-----------------------------------------------------------
4)Roles and Responsibilities: Names of the jobs in the testing team and their responsibilities during testing.
6)Communication and Status Reporting: Required negotiation between every two consecutive jobs in the testing team.
7)Defect Reporting & Tracking: Required negotiation between the testing team and the development team to track defects.
9)Risks & Mitigations: List of expected failures and possible solutions to overcome them during testing.
10)Training Plan: Required training sessions for the testing team to understand the business requirements.
11)Change & Configuration Management: How to handle change requests from the customer during testing and maintenance.
TEST FACTORS: -----------To define quality software, quality analyst people use fifteen test factors.
2)Access Control: Whether a valid user has permission to use specific services or not?
3)Audit Trail: Whether our application maintains metadata about user operations or not?
4)Continuity of Processing: The integration of internal modules for control and data transmission (integration testing).
6)Coupling: Co-existence with other existing software (inter-system testing) to share common resources.
9)File Integrity: Creation of a backup during execution of our application (for recovery).
14)Maintainable: Whether our application build is serviceable for a long time at the customer site or not?
15)Methodology: Whether test engineers are following standards or not during testing.
TEST FACTORS Vs TESTING TECHNIQUES:
2) Access control
3) Audit trail -> ... testing
4) Continuity of processing -> (Integration testing)
5) Correctness -> Functionality / Requirements testing
6) Coupling
7) Ease of use
8) Ease of operate -> Installation testing
9) File integrity -> ... testing
10) Reliability
11) Portability
12) Performance -> ... testing
14) Maintainable -> Compliance testing
-----------------------------------------------------------
Quality
-> Test factors
-> Testing techniques (by the test lead)
-> Test cases (by the test engineers)
TEST METHODOLOGY: It is a project-level document and is developed by the quality analyst or the corresponding project manager (PM). The test methodology is a refined form of the test strategy with respect to the corresponding project. To develop a test methodology from the corresponding test strategy, the QA/PM follows the below approach.
----------------------------------------------------------[Table: project types, e.g. 1) traditional project (all five development stages apply: + + + + +) and outsourcing projects, vs the development stages applicable to each]
-----------------------------------------------------------
NOTE: Depending on the project type, the quality analyst (QA) or project manager (PM) decreases the number of columns in the TRM (Test Responsibility Matrix), i.e. in the development stages.
NOTE: Depending on the current project version requirements, the QA or PM decreases the number of rows in the TRM; this is done in the test factors.
NOTE: Depending on expected future enhancements, the QA or PM can add back some of the previously removed test factors into the TRM.
NOTE: Depending on analyzed risks, the QA or PM decreases some of the selected rows in the TRM.
It is also a refined form of the v-model. This model defines the mapping between the development process and the testing process.
[Flow of the testing process against development:]
Design (HLD & LLDs) and coding -> unit & integration testing -> initial build
Test initiation: study the S/WRS and design documents -> test planning
Create test batches / test suites -> select a batch and execute it
If test engineers find any mismatch -> defect report to the developers -> bug fixing -> modified build -> re-execute the selected batches
Otherwise -> test closure -> Level-3 (final regression / pre-acceptance test / post-mortem) -> UAT (user acceptance test) -> sign off
4)TEST PLANNING: After completion of test methodology creation and finalization of the required testing process, test-lead category people concentrate on test planning to define "what to test?", "how to test?", "when to test?" and "who is to test?".
2)Introduction: About the project.
8)Suspension Criteria: The technological problems which, when raised during execution of the above features' testing, suspend testing.
9)Feature Pass or Fail Criteria: When a feature passes and when a feature fails.
11)Test Deliverables: The testing documents to be prepared by the test engineers during the above modules' testing.
12)Staff & Training Needs: The names of the selected test engineers and the training sessions required for them to understand the business logic (i.e. customer requirements).
15)Risks & Mitigations: Non-technical problems raised during testing and their solutions to overcome them.
To develop a test plan document like the above, the test lead follows the below work bench (approach):
Inputs: S/WRS, design documents, finalized TRM
Process: 1) testing team formation 2) identify tactical risks 3) prepare test plan 4) review test plan
Output: test plan
1)Testing team formation: In general, the test plan process starts with testing team formation, which depends on the below factors.
Team Size: The team size is based on the number of developers and is expressed as a ratio, i.e. Developers : Testers = 3 : 1.
2)Identify Tactical Risks: After completion of testing team formation, the test lead concentrates on risk analysis, or root-cause analysis.
Examples:
Risk 1: Lack of domain knowledge among the test engineers (training sessions are required for the test engineers).
Risk 4: Lack of test data (improper documents; the mitigation is ad-hoc testing, i.e. based on past experience).
Risk 5: Delays in delivery (in terms of job completion; the mitigation is working overtime).
3)Prepare test plan: After completion of testing team formation and risk analysis, the test lead prepares the test plan document in IEEE format.
4)Review test plan: After completion of test plan document preparation, the test lead conducts reviews on that document for completeness and correctness. In this review, the test lead applies coverage analysis.
4) Test suite-id: The name of the test batch in which this case is a member.
7) Test Effort (person-hours): The time to execute this test case (e.g. the average time to execute a test case is 20 minutes).
8) Test Duration: The date and time to execute this test case after receiving the build from the developers.
9) Test Setup: Necessary tasks to do before starting this test case execution.
10) Test Procedure: A step-by-step process to execute this test case.
----------------------------------------------------------[Test procedure template table: the columns filled during test design (step no., action, required input, expected result) and the columns filled during test execution (actual result, test result, comments)]
-----------------------------------------------------------
11) Test case pass/fail criteria: When this case passes and when this case fails.
NOTE: In general, test engineers create the test case document with the step-by-step procedure only. They try to remember the remaining fields for further test execution. ----------------------------------------------------------Case study 1: prepare a test case document for "Successful file save" in Notepad.
Step | Action                       | Required input   | Expected
-----------------------------------------------------------
1    | Open Notepad                 | ------           | Empty editor opened and Save option disabled
2    | Fill with text               | Valid text       | Save option enabled
3    | Click Save                   | ------           | Save window appears with default file name
4    | Enter file name & click Save | Unique file name | Saved file name appears in title bar of Notepad
-----------------------------------------------------------
Case study 2: Prepare test case document for "Successful Mail Reply".
3) Test procedure :
Step | Action                       | Required input      | Expected
-----------------------------------------------------------
1    | Login to site                | Valid user-id & pwd | Mail page must be opened
3    | Select received mail subject | ------              | Mail message window appears (mail opened)
4    | Click Reply                  | ------              | Compose window appears with To: received mail-id; Sub: Re:(received mail sub); CC: off; BCC: off; Message: received mail msg with comment
5    | Enter new msg & click Send   | Valid text          | Acknowledgement from the server
Test Execution: --------------After completion of test design and the reviews, the testing team concentrates on the build release from the development team.
[Figure: the development team releases a stable build to the testing environment]
Level-0 -> all P0 test cases
Level-1 -> all P0, P1 & P2 test cases as batches
Level-2 -> selected P0, P1 & P2 test cases with respect to modifications
Level-3 -> selected P0, P1 & P2 test cases with respect to high bug density modules
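The level-wise selection above amounts to filtering the test case list by priority. A sketch (assuming Python; the test case IDs are illustrative):

```python
# Each test case carries a priority: P0 (critical), P1, or P2.
test_cases = [
    ("TC-01", "P0"), ("TC-02", "P1"), ("TC-03", "P2"),
    ("TC-04", "P0"), ("TC-05", "P1"),
]

def select_batch(cases, priorities):
    """Pick the test cases whose priority is in the wanted set."""
    return [tc for tc, p in cases if p in priorities]

level_0 = select_batch(test_cases, {"P0"})              # on the initial build
level_1 = select_batch(test_cases, {"P0", "P1", "P2"})  # comprehensive batches

assert level_0 == ["TC-01", "TC-04"]
assert len(level_1) == 5
```

Levels 2 and 3 would apply the same filter but restricted to the modified or high-bug-density modules.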
3)Test Harness:
Test Harness = Test Environment + Test Bed (the test environment is the hardware and software required for testing, whereas the test bed means the testing documents).
Test Reporting: --------------During test execution, test engineers report mismatches to the developers through an IEEE-format defect report.
IEEE Format:
6)Reproducible: Yes/No. Yes -> if the defect appears every time during test repetition. No -> if the defect does not appear every time (i.e. it appears rarely) during test execution.
Medium -> mandatory to solve, but the remaining testing can continue before that defect is solved.
11)Status:New/reopen:
14)Assigned To: The name of the responsible person on the development side who receives that defect.
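The defect report fields above can be represented as a simple record (a sketch assuming Python dataclasses; the field subset and the sample values are illustrative, not the full IEEE format):

```python
from dataclasses import dataclass

@dataclass
class DefectReport:
    """Subset of the IEEE-style defect report fields described above."""
    defect_id: str
    description: str
    reproducible: bool      # does it appear on every repetition?
    severity: str           # e.g. High / Medium / Low
    status: str = "New"     # New / Reopen
    assigned_to: str = ""   # responsible person on the development side

report = DefectReport(
    defect_id="DR-042",
    description="Save option stays disabled after text is entered",
    reproducible=True,
    severity="Medium",
    assigned_to="dev.lead",
)
assert report.status == "New"
```

A bug tracking tool stores essentially this record per defect, which is what makes the tool-oriented reporting process described later possible.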
Use Case: --------In general, test engineers prepare test cases depending on the use cases in the SRS. Every use case in the SRS describes how to use a functionality. These use cases are also known as functional specifications (FS).
Types Of defects(bugs):
----------------------1)User Interface bugs (Low severity)
Eg1: Spelling mistake (High priority, based on customer requirements)
Eg2: Improper right alignment (Low priority)
2)Input domain bugs (Medium severity)
Eg1: Does not allow a valid type (High priority)
Eg2: Allows an invalid type also (Low priority)
3)Error handling bugs (Medium severity)
Eg1: Does not return an error message (High priority)
Eg2: Incomplete meaning of the error message (Low priority)
4)Calculation bugs (High severity)
Eg1: Dependent outputs are wrong (High priority)
Eg2: Final output is wrong (Low priority)
5)Race condition bugs (High severity)
Eg1: Deadlock (High priority)
Eg2: Does not run on expected platforms (Low priority)
----------------------------------------------------------Hold: ----If the information is not enough to decide whether it is a bug or not, the defect is assigned the "Hold" status.
As per Design: -------------In case the defect is raised on functionality which has been modified as per the requirements, the defect will not be accepted by the developer, who assigns it the status "As per design".
Tester's Error: -------------In case the test engineer misunderstands the functionality, he will perform wrong testing and raise wrong defects which are not accepted by the development team; such defects are assigned the status "Tester's error".
-----------------------------------------------------------
Reporting: ----------How a test engineer sends defect profiles to the development team.
Bug Reporting Process: ---------------------It is the process in which the test engineer sends the bug report document to the developer.
Basically, the bug reporting process is the process in which the testing team sends information to the development team for the process of rectification. The three types of bug reporting processes are given below.
Classical Bug Reporting: ------------------------In this process the test engineers prepare their respective defect profile documents, which are in turn sent to the quality lead. The quality lead consolidates these documents into one document, which is sent to the PM through email as an attachment. The PM then assigns these documents to the developers so that they can begin the process of defect rectification.
-->No provision for the development team to view the defect information or to know the status of the functionality while testing is going on.
Repository-Oriented Bug Reporting: -----------------------------------------In this process, once the test engineers have developed the individual defect profile documents, these are sent to the quality lead for consolidation. The quality lead keeps the consolidated document in a common repository (like a database). Once the quality lead has placed the documents there, he intimates the same to the PM through mail.
Bug-Tracking-Tool-Oriented Bug Reporting: In this process a bug tracking tool is introduced on top of the common repository, so that the defect information placed in the repository is available to all the members of the team. When testing happens, the test engineers do not prepare individual defect profiles; instead, defects are entered into the defect profile template provided by the tool.