
Lec 2: Software Quality Program: Concepts, Practices, and Measurement

SEN 653 & CEN/CSC/CSE-458: Software Quality Assurance, Testing and Reliability

By: Nujhat Nahar
Department of Computer Science & Engineering (CSE)
Software Quality Program Concepts

The software quality program is the overall approach used to influence and determine the level of quality achieved in a software product. It consists of the activities necessary to:
– Establish requirements for the quality of a software product
– Establish, implement, and enforce methodologies, processes, and procedures to develop, operate, and maintain the software
– Establish and implement methodologies, processes, and procedures to evaluate the quality of a software product and to evaluate the associated documentation, processes, and activities that impact the quality of the product.

W. Edwards Deming’s Circle
• A significant step can be taken when senior software management uses the Deming Circle (Plan-Do-Check-Act) in conjunction with the software development cycle, so that each development phase is subject to the P-D-C-A approach. This method focuses attention as development proceeds and so allows time to "act" when required.

W. Edwards Deming’s Fourteen Points for Software Managers

1. Create constancy of purpose for the improvement of systems and service, with the aim to become excellent, satisfy customers, and provide jobs.
2. Adopt the new philosophy.
3. Cease dependence on mass inspection (especially testing) to achieve quality.
4. End the practice of awarding business on price alone. Minimize total cost.
5. Constantly and forever improve the system development process, to improve quality and productivity and thus constantly decrease the time and cost of systems.

W. Edwards Deming’s Fourteen Points for Software Managers…

6. Institute training on the job.
7. Institute leadership.
8. Drive out fear, so that everyone may work effectively.
9. Break down barriers between areas.
10. Eliminate slogans and targets that ask for zero defects.
11. Eliminate numerical quotas and goals. Substitute leadership.
12. Remove barriers to pride of workmanship.
13. Institute a vigorous program of education and self-improvement for everyone.
14. Put everyone to work to accomplish the transformation.

Six Sigma for Software Engineering:
• The Six Sigma methodology defines three core steps:
– Define customer requirements and project goals via well-defined methods of customer communication.
– Measure the existing process and its output to determine current quality performance (collect defect metrics).
– Analyze defect metrics and determine the vital few causes (see the sketch below).
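
As a rough illustration of the "measure" and "analyze" steps, the sketch below (a minimal Python example, not part of the original slides) turns collected defect counts into defects per million opportunities (DPMO) and an approximate sigma level using the conventional 1.5-sigma shift; the defect counts and opportunity model shown are hypothetical.

```python
# Minimal sketch: converting collected defect metrics into a sigma level.
# The defect counts and opportunity model below are hypothetical examples.
from statistics import NormalDist

def dpmo(defects: int, units: int, opportunities_per_unit: int) -> float:
    """Defects per million opportunities."""
    return defects / (units * opportunities_per_unit) * 1_000_000

def sigma_level(dpmo_value: float, shift: float = 1.5) -> float:
    """Approximate sigma level using the conventional 1.5-sigma shift."""
    yield_fraction = 1.0 - dpmo_value / 1_000_000
    return NormalDist().inv_cdf(yield_fraction) + shift

if __name__ == "__main__":
    # Example: 38 defects found in 5,000 lines, treating each line as one opportunity.
    d = dpmo(defects=38, units=5_000, opportunities_per_unit=1)
    print(f"DPMO = {d:,.0f}, sigma level = {sigma_level(d):.2f}")
```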

Six Sigma for Software Engineering…:
• If an existing software process is in place but improvement is required, Six Sigma suggests two additional steps:
– Improve the process by eliminating the root causes of defects.
– Control the process to ensure that future work does not reintroduce the causes of defects (see the control-chart sketch below).
• Together, these five steps form the DMAIC approach: define, measure, analyze, improve, and control.
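
One common way to support the "control" step is a simple control chart on defect counts. The sketch below (an illustrative Python example, not from the slides) computes the center line and 3-sigma limits of a c-chart over per-release defect counts; the sample data are hypothetical.

```python
# Illustrative c-chart limits for the "control" step of DMAIC.
# The per-release defect counts below are hypothetical sample data.
from math import sqrt

def c_chart_limits(defect_counts: list[int]) -> tuple[float, float, float]:
    """Return (lower control limit, center line, upper control limit)."""
    c_bar = sum(defect_counts) / len(defect_counts)   # average defects per release
    ucl = c_bar + 3 * sqrt(c_bar)
    lcl = max(0.0, c_bar - 3 * sqrt(c_bar))           # counts cannot go below zero
    return lcl, c_bar, ucl

if __name__ == "__main__":
    counts = [12, 9, 15, 11, 8, 14, 10]               # defects per release (hypothetical)
    lcl, center, ucl = c_chart_limits(counts)
    print(f"LCL={lcl:.1f}  center={center:.1f}  UCL={ucl:.1f}")
    for i, c in enumerate(counts, start=1):
        if c > ucl or c < lcl:
            print(f"Release {i}: {c} defects is outside the control limits")
```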

Six Sigma for Software Engineering…:

• If an organization is developing a software process (rather than improving an existing process), the core steps are augmented as follows:
– Design the process to (1) avoid the root causes of defects and (2) meet customer requirements.
– Verify that the process model will, in fact, avoid defects and meet customer requirements.

Review or Inspection: A tool/method for Quality management

• Primary purpose:
• The inspection has only one primary purpose: to remove defects as early as possible in the development process. The purpose of the inspection preparation and meeting is to:
• Identify potential defects during preparation and validate them at the meeting
• Validate that the identified items are actual defects
• Record the existence of the defects
• Provide the record to the developer to use in making fixes (a minimal record sketch follows below)
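
As a small illustration of the "record the defect and provide the record to the developer" workflow, here is a minimal Python sketch; it is not from the slides, and the field names are assumptions.

```python
# Minimal sketch of an inspection finding record; field names are illustrative.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class InspectionFinding:
    artifact: str            # module or document inspected
    location: str            # e.g., file/line or section reference
    description: str         # what the inspector observed
    found_in_preparation: bool = True
    validated_at_meeting: bool = False   # confirmed as an actual defect
    date_recorded: date = field(default_factory=date.today)

    def record_for_developer(self) -> dict:
        """Produce the record handed to the developer for rework."""
        return {
            "artifact": self.artifact,
            "location": self.location,
            "description": self.description,
            "is_defect": self.validated_at_meeting,
        }
```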

Review or Inspection: A tool for Quality management…
• Secondary purposes:
• To provide traceability of requirements to design
• To provide a technically correct base for the next phase of development
• To increase programming quality
• To increase the quality of the delivered product
• To achieve lower life-cycle cost
• To increase the effectiveness of the test activity
• To provide a first indication of program maintainability
• To encourage entry/exit criteria for software management.

Review Guidelines
• Review the product, not the producer.
• Set an agenda and maintain it.
• Limit debate and rebuttal.
• Enunciate problem areas, but don't attempt to solve every problem noted.
• Take written notes.
• Limit the number of participants and insist on advance preparation.
• Develop a checklist for each product that is likely to be reviewed.
• Allocate resources and schedule time for formal technical reviews (FTRs).
• Conduct meaningful training for all reviewers.
• Review your early reviews.

Inspection Phases
The moderator of an inspection is responsible for the
entire inspection process for the software product.
There are six distinct inspection phases:
– Planning
– Overview
– Preparation
– Inspection meeting
– Rework
– Follow-up
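
As a minimal illustration of these six phases, the sketch below (Python, not from the slides) models them as an ordered enumeration that a moderator could use to track where an inspection currently stands.

```python
# Minimal sketch: the six inspection phases as an ordered enumeration.
from enum import IntEnum

class InspectionPhase(IntEnum):
    PLANNING = 1
    OVERVIEW = 2
    PREPARATION = 3
    INSPECTION_MEETING = 4
    REWORK = 5
    FOLLOW_UP = 6

def next_phase(current: InspectionPhase) -> InspectionPhase | None:
    """Return the phase that follows `current`, or None after follow-up."""
    return InspectionPhase(current + 1) if current < InspectionPhase.FOLLOW_UP else None

# Example: advancing an inspection from preparation to the meeting.
assert next_phase(InspectionPhase.PREPARATION) is InspectionPhase.INSPECTION_MEETING
```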

• Inspection types:
• High-level design inspection (I0): to ensure that the functional design at the task level is a correct expansion of the software requirements.
• Low-level design inspection (I1): the objective of I1 is to stepwise refine the I0 design to an intermediate level before translation to the target language code is authorized (page 224).
• Code inspection (I2): the purpose of code inspection is to ensure that coding follows all conventions (page 226).

Types of defects and their definitions:

• Design defect: the function description does not meet the requirements specification.
• Logic defect: logic is missing, wrong, or extra.
• Syntax defect: does not adhere to the grammar of the design/code language defined.
• Standards defect: does not meet the software standards requirements. This includes in-house standards, project standards, and military standards invoked in the contract.
• Data defect: missing, extra, or wrong data definition or usage.

• Interface defect: incompatible definition/format of information exchanged between two modules.
• Return code/message defect: incorrect or missing values/messages sent.
• Comment defect: the explanation accompanying the design/code language is incorrect, inexplicit, or missing.
• Requirements change defect: a change in the requirements specification that is the direct and proximate reason for the required change in the design or code.
• Performance improvement defect: the code will not perform in the amount of time/space/CPU allocated.
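
For completeness, here is a minimal Python sketch (not from the slides) showing how these defect categories might be captured as an enumeration for tagging inspection findings.

```python
# Minimal sketch: the defect categories above as an enumeration for tagging findings.
from enum import Enum

class DefectType(Enum):
    DESIGN = "design"
    LOGIC = "logic"
    SYNTAX = "syntax"
    STANDARDS = "standards"
    DATA = "data"
    INTERFACE = "interface"
    RETURN_CODE_MESSAGE = "return code/message"
    COMMENT = "comment"
    REQUIREMENTS_CHANGE = "requirements change"
    PERFORMANCE_IMPROVEMENT = "performance improvement"

# Example: tagging a finding with its defect type.
finding_type = DefectType.INTERFACE
print(f"Defect category: {finding_type.value}")
```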

Inspection Prerequisites:

• The requirements for the proper conduct of an inspection are:
– A team of technically competent, trained inspectors
– A trained moderator
– Proper planning and distribution of materials
– A good professional attitude
– Full preparation prior to the inspection meeting
– Completed design or cleanly compiled code
– Updated resource requirements.

Inspection Metrics:
Goal: Plan
• Questions: How much does the inspection process cost? How much calendar time does the inspection process take?
• Metric: Average effort per KLOC (KLOC = thousand lines of code)

Goal: Monitor & Control
• Questions: What is the quality of the inspected software? To what degree did the staff conform to the procedures? What is the status of the inspection process?
• Metrics: Average faults per KLOC; average inspection rate; average preparation rate; average lines of code inspected; percentage of reinspections; total KLOC inspected.

Inspection Metrics:
Goal: Improve
• Question: How effective is the inspection process?
  Metrics: Defect removal efficiency; average faults detected per KLOC; average inspection rate; average preparation rate; average lines of code inspected
• Question: What is the productivity of the inspection process?
  Metrics: Average effort per fault detected; average inspection rate; average preparation rate; average lines of code inspected.

Nine metrics of QM for code inspection:

1. Total non-commented lines of code inspected, in thousands (KLOC):

$$\text{Total KLOC inspected} = \frac{\sum_{i=1}^{N} \text{LOC inspected}_i}{1{,}000}$$

where N is the total number of inspections.

Nine metrics of QM for code inspection…:

2. Average lines of code inspected:

$$\text{Average LOC inspected} = \frac{\text{Total KLOC inspected} \times 1{,}000}{N}$$

where N is the total number of inspections.

Nine metrics of QM for code inspection…:

3. Average preparation rate:

$$\text{Average preparation rate} = \frac{\text{Total KLOC inspected} \times 1{,}000}{\sum_{i=1}^{N} \dfrac{\text{preparation time}_i}{\text{number of inspectors}_i}}$$

where N is the total number of inspections. To compute the preparation rate for a single inspection, use this computation with N = 1. An unweighted average of preparation rates was rejected because it does not account for differences in the sizes of individual inspections.

4. Average inspection rate:

$$\text{Average inspection rate} = \frac{\text{Total KLOC inspected} \times 1{,}000}{\sum_{i=1}^{N} \text{inspection duration}_i}$$

where N is the total number of inspections. To compute the inspection rate for a single inspection, use this computation with N = 1. An unweighted average of inspection rates was rejected because it does not account for slow inspection rates on small inspections.
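
The four size and rate metrics above can be computed directly from per-inspection records. The sketch below is an illustrative Python implementation of metrics 1 through 4; the record fields (loc, prep_hours, inspectors, duration_hours) are assumed names, not from the slides.

```python
# Illustrative computation of inspection metrics 1-4 from per-inspection records.
# The record fields are assumed names for illustration.
from dataclasses import dataclass

@dataclass
class InspectionRecord:
    loc: int               # non-commented lines of code inspected
    prep_hours: float      # total preparation time for this inspection
    inspectors: int        # number of inspectors who prepared
    duration_hours: float  # length of the inspection meeting

def total_kloc(records: list[InspectionRecord]) -> float:
    return sum(r.loc for r in records) / 1_000          # metric 1

def average_loc(records: list[InspectionRecord]) -> float:
    return total_kloc(records) * 1_000 / len(records)   # metric 2

def average_preparation_rate(records: list[InspectionRecord]) -> float:
    # metric 3: LOC per preparation hour per inspector
    return total_kloc(records) * 1_000 / sum(r.prep_hours / r.inspectors for r in records)

def average_inspection_rate(records: list[InspectionRecord]) -> float:
    # metric 4: LOC inspected per meeting hour
    return total_kloc(records) * 1_000 / sum(r.duration_hours for r in records)

if __name__ == "__main__":
    records = [InspectionRecord(450, 6.0, 3, 2.0), InspectionRecord(300, 4.5, 3, 1.5)]
    print(total_kloc(records), average_loc(records),
          average_preparation_rate(records), average_inspection_rate(records))
```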

5. Average effort per KLOC:

$$\text{Average effort per KLOC} = \frac{\sum_{i=1}^{N} \text{inspection effort}_i}{\text{Total KLOC inspected}}$$

where N is the total number of inspections and

$$\text{inspection effort}_i = \text{preparation time}_i + \left(\text{number of participants}_i \times \text{inspection duration}_i\right) + \text{rework time}_i$$

This metric does not include the effort for the inspection's planning and follow-up phases because experience has shown that their effort is small and does not warrant the cost of collecting the data.

6. Average effort per fault detected:

$$\text{Average effort per fault detected} = \frac{\sum_{i=1}^{N} \text{inspection effort}_i}{\sum_{i=1}^{N} \text{total faults detected}_i}$$

where N is the total number of inspections. As with the average effort per KLOC, this effort computation includes only time spent by the inspection team preparing for meetings, holding the meetings, and correcting the detected faults.
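
Continuing the sketch above, the effort-based metrics 5 and 6 could be computed as follows; again an illustrative Python example, with the added fields rework_hours, participants, and faults being assumed names.

```python
# Illustrative computation of inspection metrics 5 and 6 (effort-based metrics).
# Extends the InspectionRecord sketch above with assumed fields.
from dataclasses import dataclass

@dataclass
class InspectionEffortRecord:
    loc: int                # non-commented lines of code inspected
    prep_hours: float       # total preparation time
    participants: int       # participants in the inspection meeting
    duration_hours: float   # inspection meeting duration
    rework_hours: float     # time spent correcting the detected faults
    faults: int             # total faults detected

def inspection_effort(r: InspectionEffortRecord) -> float:
    # effort_i = preparation + (participants x meeting duration) + rework
    return r.prep_hours + r.participants * r.duration_hours + r.rework_hours

def average_effort_per_kloc(records: list[InspectionEffortRecord]) -> float:
    total_kloc = sum(r.loc for r in records) / 1_000
    return sum(inspection_effort(r) for r in records) / total_kloc                       # metric 5

def average_effort_per_fault(records: list[InspectionEffortRecord]) -> float:
    return sum(inspection_effort(r) for r in records) / sum(r.faults for r in records)   # metric 6
```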

Large Satellite Communication Systems

This project took place in 1993. The environment consisted of 330,000 SLOC on five target platforms with many COTS products. The inspection process was handled as follows:
– Developers deliver code to the subcontractor moderator team
– Subcontractor schedules the inspection meeting
– Developers and subcontractors review the code
– The inspection meeting is held
– Subcontractor delivers the inspection meeting minutes to the developers
– Developers correct errors
– Subcontractor delivers the final report

• Results included 73 inspection meetings on 33,000 SLOC (source lines of code) during a four-month period. The total number of issues recorded was 2,760 (1,180 major and 1,580 minor).
• Labor hours were 4,150 for subcontractors, including start-up, and 480 for developers.
• Cost savings = cost to find and fix during test - cost to find and fix using inspection prior to test (see the illustrative calculation below).
• Cost savings = $1 million
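
To make the cost-savings formula concrete, here is a small hypothetical calculation in Python. The hourly rate and per-defect test cost below are invented for illustration only and are not the figures behind the $1 million result reported above.

```python
# Hypothetical illustration of the cost-savings formula above.
# The rates used here are invented and are NOT the project's actual figures.
INSPECTION_HOURS = 4_150 + 480        # subcontractor + developer hours (from the case study)
HOURLY_RATE = 100.0                   # hypothetical fully loaded labor rate ($/hour)
MAJOR_DEFECTS_FOUND = 1_180           # major issues recorded (from the case study)
COST_TO_FIX_IN_TEST = 1_500.0         # hypothetical cost to find and fix one defect in test ($)

cost_with_inspection = INSPECTION_HOURS * HOURLY_RATE
cost_if_found_in_test = MAJOR_DEFECTS_FOUND * COST_TO_FIX_IN_TEST
savings = cost_if_found_in_test - cost_with_inspection
print(f"Hypothetical savings: ${savings:,.0f}")
```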

Software Quality Indicators
• The quality indicators address management
concerns, take advantage of data that is already
being collected, are independent of the software
development methodology being used, are specific
to phases in the development cycle, and provide
information on the status of a project.

Some recommended quality indicators include:
1. Progress: Measures the amount of work accomplished by the developer in each phase. This measure flows through the development life cycle, with the number of requirements defined and baselined, then the amount of preliminary and detailed design completed, then the amount of code completed, and the various levels of tests completed.
2. Stability: Assesses whether the products of each phase are sufficiently stable to allow the next phase to proceed. This measures the number of changes to requirements, design, and implementation.
3. Process compliance: Measures the developer's compliance with the development procedures approved at the beginning of the project. Captures the number of procedures identified for use on the project versus those complied with on the project.
4. Quality evaluation effort: Measures the percentage of the developer's effort that is being spent on internal quality evaluation activities, i.e., the percent of time developers are required to deal with quality evaluations and related corrective actions.

Some recommended quality indicators include (continued):
5. Test coverage: Measures the amount of the software system covered by the developer's testing process. For module testing, this counts the number of basis paths executed/covered, and for system testing it measures the percentage of functions tested.
6. Defect detection efficiency: Measures how many of the defects detectable in a phase were actually discovered during that phase. Starts at 100% and is reduced as defects are uncovered at a later development phase (illustrated in the sketch after this list).
7. Defect removal rate: Measures the number of defects detected and resolved over time, i.e., the number of opened and closed system problem reports (SPRs) reported through the development phases.
8. Defect age profile: Measures the number of defects that have remained unresolved for a long period of time, reported by month as SPRs remaining open longer than one month.
9. Defect density: Detects defect-prone components of the system. Provides a measure of SPRs per Computer Software Component (CSC) to determine which is the most defect-prone CSC.
10. Complexity: Measures the complexity of the code. Collects basis path counts of code modules to determine how complex each module is.
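
As an illustration of indicators 6 and 9, the following Python sketch (not from the slides; the data structures and example values are assumptions) computes defect detection efficiency per phase and defect density per CSC.

```python
# Illustrative computation of defect detection efficiency (indicator 6)
# and defect density per CSC (indicator 9). Data structures are assumed.

def detection_efficiency(found_in_phase: int, escaped_to_later_phases: int) -> float:
    """Percentage of the defects detectable in a phase that were found in that phase."""
    detectable = found_in_phase + escaped_to_later_phases
    return 100.0 * found_in_phase / detectable if detectable else 100.0

def defect_density(sprs_per_csc: dict[str, int], kloc_per_csc: dict[str, float]) -> dict[str, float]:
    """SPRs per KLOC for each Computer Software Component (CSC)."""
    return {csc: sprs_per_csc[csc] / kloc_per_csc[csc] for csc in sprs_per_csc}

if __name__ == "__main__":
    # Hypothetical example values.
    print(detection_efficiency(found_in_phase=45, escaped_to_later_phases=5))   # 90.0
    density = defect_density({"gui": 24, "db": 10}, {"gui": 8.0, "db": 10.0})
    worst = max(density, key=density.get)
    print(f"Most defect-prone CSC: {worst} ({density[worst]:.1f} SPRs/KLOC)")
```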

Software Product Quality Factors:

Thank You

