PRX-QPM-03 v1.0 Baseline Parametric Model Calibration Process (Expert Mode) 01/10/06
By modifying the default effort multiplier (e.g., 2.94) and schedule multiplier (e.g., 3.67), the algorithm
can be calibrated to an individual project. However, the need is for a calibration that is representative of
projects performing under the same life cycle strategy within a common business domain. In short, one
must compare and calibrate apples to apples. A specific example would be calibrating an algorithm for all
satellite communication projects using an incremental life cycle model. Statistically, it is desirable to
have data points from at least six projects in the common business domain. The Excel spreadsheet in
Table 1 provides an example of a tool that will generate a calibrated effort multiplier following the
guidance of COCOMO II, reference (c), using data from eight projects. Alternatively, one could calibrate
each project individually, through trial and error, and then use the arithmetic mean of the resulting effort
multiplier constants. It should be noted that the Effort Adjustment Factors (EAF) and Scale Factors (SF)
are set at their default values when initiating the COCOMO applications. The assumption is that projects
within a specific business domain share common variables as represented by EAF and SF. To that end,
those variables are built into the calibrated effort and schedule multiplier constants. The approach is that
EAF and SF variables would be applied to quantify variances between baseline upgrades or individual
projects within the business domain.
For the schedule multiplier constant, a spreadsheet solution such as the one illustrated for the effort
multiplier could be developed, or the arithmetic mean of the constants developed through trial and error
for each project could be applied.
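The spreadsheet calibration described above can be sketched in Python. The project data and the nominal exponent are illustrative assumptions, not Center data; with EAF and SF held at defaults, the calibration reduces to solving for the constant in log space, per the COCOMO II guidance of reference (c):

```python
import math

# Hypothetical actuals from eight projects: (KSLOC, actual person-months).
# These values are illustrative only.
projects = [(12.0, 55.0), (25.0, 130.0), (8.0, 34.0), (40.0, 220.0),
            (15.0, 70.0), (30.0, 160.0), (20.0, 100.0), (10.0, 45.0)]

E = 1.10  # assumed nominal COCOMO II exponent with Scale Factors at defaults

# Effort model: PM = A * KSLOC^E, so ln(A_i) = ln(PM_i) - E * ln(KSLOC_i).
ln_A = [math.log(pm) - E * math.log(ksloc) for ksloc, pm in projects]

# Averaging in log space yields the geometric mean of the per-project constants.
A_cal = math.exp(sum(ln_A) / len(ln_A))
print(f"Calibrated effort multiplier: {A_cal:.2f}")
```

The same pattern applies to the schedule multiplier constant by substituting actual durations and the COCOMO II schedule equation.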
Additional calibrations can involve modeling the distribution of effort and schedule results to the phases
of the governing life cycle model. This involves tracking the actual effort and schedule allocations to the
respective phases of the referenced life cycle model. For simplicity, only the software subset of a system
is used and basic life cycle phases are used as listed below:
a. Requirements
b. Product design
c. Detailed design
d. Code and unit test
e. Integration and test
Two spreadsheets can be developed, one addressing percent effort distribution by phase and the other
percent schedule distribution by phase. Each spreadsheet would have project data by row and the
respective phases as columns. In this manner, an arithmetic mean can be calculated for the sample
projects’ distribution of effort and schedule by phase. This data can then be used to partition the results
derived from the calibrated effort and schedule equations. Such data can be invaluable in future planning
estimates. In like manner, the gross effort and schedule data can be further partitioned into activities
such as project office support, QA, CM, documentation, etc. Naturally, these calculations are
dependent on the collection and availability of archived project data. Currently, data collection
granularity is not sufficient to achieve this level of calibration.
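The two spreadsheets described above reduce to a column-wise arithmetic mean followed by a partitioning of the calibrated total. A minimal sketch, using the five phases listed and hypothetical percent-effort rows (the row data and 120 person-month total are assumptions for illustration):

```python
phases = ["Requirements", "Product design", "Detailed design",
          "Code and unit test", "Integration and test"]

# Hypothetical percent-effort-by-phase data, one row per archived project.
effort_pct = [
    [8, 18, 25, 30, 19],
    [10, 16, 24, 32, 18],
    [7, 20, 26, 28, 19],
]

# Arithmetic mean of each column gives the domain's typical distribution.
mean_pct = [sum(col) / len(effort_pct) for col in zip(*effort_pct)]

# Partition a calibrated total-effort estimate (person-months) by phase.
total_pm = 120.0  # assumed output of the calibrated effort equation
by_phase = {ph: total_pm * p / 100 for ph, p in zip(phases, mean_pct)}
for ph, pm in by_phase.items():
    print(f"{ph}: {pm:.1f} PM")
```

An identical table with percent-schedule rows partitions the calibrated schedule result in the same way.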
Calibration can be simplified by the use of parametric cost estimating models and their associated tools.
Costar is an example of a COCOMO II estimation tool that has an associated calibration tool. The tool,
Calibrate COCOMO (CALICO) allows the user to build models that drive the Costar tool. These models
could contain calibrated equations, effort and schedule distribution tables (see Figure 2), life cycle phase
titles such as requirements and product design, milestone titles, etc., as needed to accurately model an
organization’s means of production.
5. Validate results
Once the calibrated effort and schedule multiplier constants have been developed, the new algorithm
should be applied to each project within the sample to determine the Mean Absolute Deviation (MAD)
error. Analysis of the minimum and maximum values will determine the need for adjustment to the
constants. The objective of the adjustments is to establish a normal distribution of error (+/-) for the
projects in the sample. If one project is the cause of a significant divergence, it should be analyzed for
special causes associated with the deviation. For example, the special cause may be an unplanned work
stoppage.
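The MAD check above can be sketched as follows. The estimate/actual pairs are hypothetical; in practice they would come from applying the calibrated algorithm to each sample project:

```python
# Hypothetical (estimated PM from calibrated model, actual PM) pairs.
pairs = [(52.0, 55.0), (135.0, 130.0), (31.0, 34.0), (226.0, 220.0),
         (68.0, 70.0), (165.0, 160.0), (96.0, 100.0), (47.0, 45.0)]

# Signed errors; a roughly balanced +/- spread suggests no systematic bias.
errors = [est - act for est, act in pairs]

# Mean Absolute Deviation across the sample.
mad = sum(abs(e) for e in errors) / len(errors)

# Min/max signed errors flag projects that may warrant special-cause analysis.
print(f"MAD = {mad:.2f} PM, min = {min(errors):+.1f}, max = {max(errors):+.1f}")
```

A single project with an error far outside the others' range is the candidate for special-cause analysis described above.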
6. Communicate results
The results of the analysis of the data are published in the Baseline Data Analysis (BDA) report. The
results of the parametric calibration are published in the BDA both to support projects using parametric
modeling to achieve more accurate estimates for future work and to provide the PI initiative visibility into
overall process improvement. For example, if over time the effort multiplier constant is reduced in value
for a specific business domain then it can be concluded that productivity is improving. The BDA report is
considered sensitive information as it contains information on Center productivity and defect
containment. Consequently, following QA analysis, review and approval, the report is made available
only from the SPI Agents Infosite.