
Chapter 31

Quality Loss Function


Chapter Outline
31.1 What is Quality Loss?
31.2 Precision vs. Accuracy
31.3 History of the Development of the Concept of the Loss Function
31.4 Taguchi Philosophy
31.5 Quality Loss Function
31.6 Off-Line Quality Control Rule for Manufacturing
31.7 Design of Experiments
31.7.1 Outer Arrays
31.7.2 Management of Interactions
31.8 Robustification
31.9 Noise Variables
31.10 Case Study
31.11 Conclusion
Further Reading

31.1  WHAT IS QUALITY LOSS?


The quality loss function, as defined by Taguchi, is the loss imparted to society by a product from the time the product is designed to the time it is shipped to the customer. In fact, he defined quality as conformity around a target value with a low standard deviation in the outputs. The loss function is a graphical representation of the variety of non-perfect parts that can each lead to losses, and these losses can be measured in rupee value. These losses basically originate from:
● Cost to produce,
● Failure to function,
● Maintenance and repair cost,
● Loss of brand name leading to customer dissatisfaction,
● Cost of redesign and rework.
Wikipedia defines the Taguchi loss function as a graphical depiction of loss, used to describe a phenomenon affecting the value of products produced by a company. It emphasizes the need to incorporate quality and reliability at the design stage, prior to production. It is particularly significant for products not involving close tolerances but with wider than usual tolerances; in other words, where the design specifies larger variations between the upper and lower specification limits to suit the manufacturing facilities, rather than the tighter limits required for the matching between two mating components in an assembly.

Total Quality Management: Key Concepts and Case Studies. http://dx.doi.org/10.1016/B978-0-12-811035-5.00031-3


Copyright © 2017 BSP Books Pvt. Ltd. Published by Elsevier Inc. All rights reserved. 439

The pre-1950 concept of the cost of quality, as discussed in Chapter 8, is based on general non-statistical aspects of quality costs, providing a basic understanding of how product costs are built up from the materials and labor points of view. On the other hand, the cost due to poor quality as emphasized by Taguchi's loss function (post-1950), and discussed in this chapter, takes a different approach and treatment, involving statistical analysis and graphical depiction. It emphasizes precision more than accuracy within the production standards.

31.2  PRECISION VS. ACCURACY


The two terms precision and accuracy are more or less synonymous in a general sense, but in scientific measurement, especially as related to quality management, they are significantly different. Accuracy of a measurement or a system is the degree of closeness of measurements of a quantity to that quantity's true value. Precision, on the other hand, is the degree to which repeated measurements under unchanged conditions show the same results. These are illustrated in Fig. 31.1. Interestingly, measurements can be accurate but not precise, as in Fig. 31.1A, or precise but not accurate, as in Fig. 31.1C. The ideal situation, and the one emphasized by Taguchi, is when the measurements are both accurate and precise, as in Fig. 31.1E. If the measurements vary too much and are also far from the actual values, they are neither accurate nor precise.

FIG. 31.1  Illustrations of precision vs. accuracy.
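The distinction can be sketched numerically, treating the bias of the mean as a lack of accuracy and the spread of repeated readings as a lack of precision. A minimal illustration with invented measurement data, not taken from this chapter:

```python
import statistics

def accuracy_and_precision(measurements, true_value):
    """Accuracy = closeness of the mean to the true value (small bias);
    precision = tightness of repeated measurements (small spread)."""
    mean = statistics.mean(measurements)
    bias = abs(mean - true_value)            # lower bias -> more accurate
    spread = statistics.stdev(measurements)  # lower spread -> more precise
    return bias, spread

# Hypothetical readings of a part whose true dimension is 10.0 mm
accurate_not_precise = [9.2, 10.8, 9.5, 10.5, 10.0]    # mean on target, wide scatter
precise_not_accurate = [10.6, 10.7, 10.6, 10.7, 10.6]  # tight scatter, biased high

print(accuracy_and_precision(accurate_not_precise, 10.0))
print(accuracy_and_precision(precise_not_accurate, 10.0))
```

The first data set corresponds to Fig. 31.1A (accurate but not precise) and the second to Fig. 31.1C (precise but not accurate); Taguchi's ideal, Fig. 31.1E, would score low on both measures.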



31.3  HISTORY OF THE DEVELOPMENT OF THE CONCEPT OF THE LOSS FUNCTION
During his work at the Electrical Communication Laboratory of Japan's Nippon Telegraph and Telephone in the 1950s and 1960s, Dr. Genichi Taguchi observed that, in developing new products, a great deal of the engineers' time and energy was spent on experimentation and testing. This, in fact, affected the final quality of the output to a large extent.
Taguchi argued that the performance requirements of a system are generally underspecified, that is, given tolerances that are too loose in order to allow for process variations, resulting in the quality loss described earlier.
He also deduced that 85% of poor quality can be attributed to manufacturing process defects and only 15% to the operative. This led him to believe in the philosophy that, instead of attributing poor quality to the operative, the process and the product should be designed well by building quality into the design. This should start from the very beginning, that is, the product conception stage, and continue through the design stage, the process development stage, and the production stage, when care should be taken to eliminate variation.
The traditional method of calculating losses is based on the number of parts rejected and reworked in a production facility. This method does not distinguish between two samples that are both within the specifications but have different nominal values within those limits, as explained in Fig. 31.1. It should thus be understood that any item manufactured away from nominal results in some loss to the customer, or to the wider community, through early wear-out, difficulties in interfacing with other parts, etc., even though it may be within the acceptable limits. These losses may be minimal, or, as Edwards Deming put it, unknown and unknowable. But Taguchi argued that such losses would inevitably find their way back to the originating corporation, as in the saying, "Little drops of water make a mighty ocean."
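The contrast between the two views can be sketched in code: traditional "goalpost" accounting charges nothing for any part inside the specification limits, while Taguchi's quadratic loss charges for every departure from nominal. The part dimensions, scrap cost, and loss coefficient below are assumed purely for illustration:

```python
def goalpost_loss(y, lower, upper, scrap_cost):
    """Traditional view: zero loss inside the spec limits, full scrap cost outside."""
    return 0.0 if lower <= y <= upper else scrap_cost

def taguchi_loss(y, target, k):
    """Taguchi view: loss grows quadratically with any deviation from the target."""
    return k * (y - target) ** 2

# Hypothetical part: target 10.0 mm, spec 10.0 +/- 0.5 mm, $50 scrap cost;
# k is chosen so the quadratic loss equals the scrap cost at the spec limit.
target, tol, scrap = 10.0, 0.5, 50.0
k = scrap / tol ** 2  # 200 $/mm^2

for y in (10.0, 10.4, 10.49, 10.51):
    print(y, goalpost_loss(y, target - tol, target + tol, scrap),
          round(taguchi_loss(y, target, k), 2))
```

A part measuring 10.49 mm is "free" under goalpost accounting but costs almost the full scrap value under the quadratic model, which is exactly the distinction drawn in the paragraph above.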
Taguchi, in his series of lectures, emphasized two related ideas:
● By statistical methods concerned with the analysis of variance, experiments can be designed to identify the important design factors responsible for degrading product performance.
● By meticulously and effectively judging the effectiveness of designs, the degree of degradation, or loss, as a function of the deviation of any design parameter from its target value can be assessed.

31.4  TAGUCHI PHILOSOPHY


Taguchi's philosophy basically consists of three components, which are described in the following sections:
● A specific loss function;
● The philosophy of off-line quality control; and
● Innovations in the design of experiments.

31.5  QUALITY LOSS FUNCTION


The Taguchi loss function, or quality loss function, is a graphical depiction of the losses accrued through variations within the production standards, which affect the value of the products produced by a company, even though the products themselves are within tolerance limits and hence of acceptable quality standards.
Deming, the quality guru and Taguchi's contemporary, stated in his book Out of the Crisis that the Taguchi loss function shows "a minimal loss at the nominal value, and an ever-increasing loss with departure either way from the nominal value."
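The ever-increasing departure that Deming describes is conventionally written as the quadratic loss L(y) = k(y − m)^2, where m is the nominal (target) value and k = A/Δ^2, with A the loss incurred when the characteristic sits at the tolerance limit Δ from target. A minimal sketch, with assumed voltage and cost figures:

```python
def quality_loss(y, target, cost_at_limit, half_tolerance):
    """Taguchi quadratic loss L(y) = k * (y - target)^2, with
    k = A / d^2, where A is the loss when the characteristic
    sits a distance d (the half-tolerance) from the target."""
    k = cost_at_limit / half_tolerance ** 2
    return k * (y - target) ** 2

# Hypothetical power supply: 10 V target, +/- 0.4 V tolerance, $25 repair at the limit
print(round(quality_loss(10.0, 10.0, 25.0, 0.4), 2))  # on target: 0.0
print(round(quality_loss(10.2, 10.0, 25.0, 0.4), 2))  # halfway out: 6.25
print(round(quality_loss(10.4, 10.0, 25.0, 0.4), 2))  # at the limit: 25.0
```

Note how halving the deviation quarters the loss: the penalty is gentle near the nominal and steep near the limits, which is the shape Deming's quote describes.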
The Taguchi loss function is largely credited with the increased focus on continuous improvement throughout the business world. It has also been instrumental in the Six Sigma movement and the concept of variation management.
This significance can be understood from Figs. 31.2 and 31.3.

FIG. 31.2  (A) Normal precision and (B) higher precision.

FIG. 31.3  Loss incurred by not being precise, though accurate.



31.6  OFF-LINE QUALITY CONTROL RULE FOR MANUFACTURING
According to Taguchi, variation originates during the design stage more than in the manufacturing stages. This is called the Taguchi philosophy, which later evolved into Design for Quality and Design for Six Sigma, both described in detail in Chapter 32.
His strategy for quality engineering has three stages of design, viz:
● System design
● Parameter design
● Tolerance design

System design is the conceptual and non-statistical stage, covering engineering, marketing, and the gathering of customer knowledge; it involves creativity and innovation, with the adaptation of value engineering, as illustrated in Chapter 33.
Parameter design is concerned with the detail design phase of conventional engineering, in which the nominal values of the various dimensions and design parameters need to be set. It involves analyzing how the product should perform against defined parameters, and finding an optimal, cost-effective manufacturing solution that performs well irrespective of the operating conditions.
Tolerance design: with a successfully completed parameter design, and an understanding of the effect that the various parameters have on performance, resources can be focused on reducing and controlling variation in the critical few dimensions.

31.7  DESIGN OF EXPERIMENTS


While R. A. Fisher is credited with developing the concept of the design of experiments as early as the 1920s and 1930s, Taguchi developed a number of innovations in designing these experiments.

31.7.1  Outer Arrays


Because the conventional sampling method is inadequate here, as there is no way of obtaining a random sample of future conditions, Taguchi proposed judgmental sampling, extending each experiment with an outer array that simulates the random environment in which the product would function.
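A crossed design of this kind can be sketched as follows. The control and noise factors and their levels below are hypothetical, chosen only to show how each inner-array (control) run is repeated over the whole outer array of noise conditions:

```python
from itertools import product

# Inner array: candidate settings of the control factors (hypothetical)
inner_array = [  # (oven temperature deg C, line speed m/min)
    (150, 1.0), (150, 1.5), (180, 1.0), (180, 1.5),
]
# Outer array: noise conditions the product will meet in the field (hypothetical)
outer_array = [  # (ambient humidity, raw-material batch)
    ("low", "A"), ("low", "B"), ("high", "A"), ("high", "B"),
]

# Cross them: every control run is tested under every noise condition,
# simulating the random environment the outer array stands in for.
crossed_design = [{"control": c, "noise": n}
                  for c, n in product(inner_array, outer_array)]

print(len(crossed_design))  # 4 control runs x 4 noise conditions = 16 trials
```

The cost of this simulation of future conditions is that the experiment grows multiplicatively: every added noise column multiplies the number of trials.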

31.7.2  Management of Interactions


The impact of uncontrolled variations plays a significant role in creating noise and affects the assessment of the end result of an experiment. Hence, Taguchi emphasized robustification and noise variables, as explained in the following sections.

31.8  ROBUSTIFICATION
Robustification, as per Wiktionary, means making designs more robust, that is, more tolerant of unexpected events. Robustification, also called robust parameter design, is the process of determining the settings of the control factors that minimize response variation caused by uncontrollable factors. This can be done by understanding and controlling the interaction between these variables. While such variables may easily be controlled in an experimental setting, outside the experimental world they are very hard to control, and are called noise variables.
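A minimal sketch of this selection step: each candidate control setting is run across every noise condition of an outer array, and the setting whose response varies least is chosen. The settings and response data below are invented for illustration:

```python
import statistics

# Hypothetical responses: for each control-factor setting, the output
# measured under the four noise conditions of an outer array.
responses = {
    "setting_A": [10.1, 9.8, 10.3, 9.9],   # near target, moderate scatter
    "setting_B": [10.0, 10.0, 10.1, 9.9],  # stays tight across the noise
    "setting_C": [10.5, 9.4, 10.8, 9.2],   # blown about by the noise
}

def noise_sensitivity(ys):
    """Spread of the response across noise conditions; robust parameter
    design picks the control setting that minimizes this spread."""
    return statistics.stdev(ys)

robust_setting = min(responses, key=lambda s: noise_sensitivity(responses[s]))
print(robust_setting)  # -> setting_B
```

Taguchi would typically express the same comparison through a signal-to-noise ratio rather than a raw standard deviation, but the selection logic is the same: prefer the setting least disturbed by the noise variables.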

31.9  NOISE VARIABLES


Noise variables are those that are difficult or impossible to control at the design and production levels, but can be controlled at the analysis or experimental stages; examples are variation in the loading pattern or material variation.
While the controlled variables in electronic signal transmission perform perfectly, the uncontrolled variables cause audible noise, which is bothersome to the signal receivers. In a similar manner, in TQM, such unwanted and uncontrolled variables are also called "noise," even though they are not audible.

31.10  CASE STUDY


The following illustration explains how specifying loose tolerances can create a loss to a company in the long run.
A company that manufactures parts requiring a large amount of machining grew tired of the high cost of tooling. To avoid premature replacement of these expensive tools, the manager suggested that operators set the machine to run at the high end of the specification limits. As the tool wore down, the products would end up measuring at the low end of the specification limits. So the machine would start by producing parts at the high end and, after a period of time, would produce parts that fell just inside the specs. The variation of parts produced on this machine was much greater than it should have been, since the strategy was to use the entire spec width allowed rather than to produce the highest quality part possible. The products may fall within spec, but will not be produced close to the nominal. Several of these "good parts" may not assemble well, may require a recall, or may come back under warranty. The Taguchi loss would be very high.

This case study makes us consider whether the savings in tool life were worth the cost of poor product quality.
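The case study can be sketched numerically with the quadratic loss. All figures below (target, tolerance, loss at the limit, number of parts) are assumed; the point is only the relative magnitude of the two strategies' losses:

```python
# Hypothetical: target 25.0 mm, spec +/- 0.1 mm, $40 loss at the spec limit.
target, tol, cost_at_limit = 25.0, 0.1, 40.0
k = cost_at_limit / tol ** 2  # quadratic loss coefficient

n = 11
# Tool-wear strategy: parts drift from the high end (25.1) to the low end (24.9)
tool_wear_parts = [target + tol - 2 * tol * i / (n - 1) for i in range(n)]
# Centered strategy: parts hug the nominal, alternating +/- 0.01 mm
centered_parts = [target + 0.01 * (-1) ** i for i in range(n)]

def total_loss(parts):
    """Sum of Taguchi losses L = k * (y - target)^2 over all parts."""
    return sum(k * (y - target) ** 2 for y in parts)

print(round(total_loss(tool_wear_parts), 1))  # uses the whole spec width
print(round(total_loss(centered_parts), 1))   # stays near nominal
```

Both strategies produce parts entirely within spec, yet under these assumed figures the whole-spec-width strategy accrues roughly forty times the Taguchi loss of the centered one, which is the sense in which the "good parts" above are not free.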

31.11  CONCLUSION
For those who have been thinking that specifying loose tolerances at the design stage would lower the rejection losses of the manufactured goods, the Taguchi philosophy and its explanation are an eye-opener, making them realize that, in the long run, the total losses, especially the losses due to customer dissatisfaction, would be higher. Hence, precision plays a significant role in reducing the total losses, as explained in this chapter.

On the Lighter Side


The words "robust" and "lean" are more or less antonyms in English. Nevertheless, in TQM they are more or less synonymous, in that both represent the concept of ensuring that no single factor or variable is neglected at the conceptual and design stages, while at the same time eliminating all wasteful elements in the operations or functions at the post-design stages.

FURTHER READING
The following websites may be referred to for further information, in addition to the textbooks indicated in the bibliography.
[1] en.wikipedia.org/wiki/Taguchi_methods.
[2] http://www.businessdictionary.com/definition/Taguchi-loss-function.html#ixzz37PnlNpIf.
[3] http://www.scribd.com/doc/73104345/Taguchi-Loss-Function.
[4] www.terninko.com/loss.htm.
[5] http://elsmar.com/Taguchi.html (provides an animated graph showing how the loss reduces as the kurtosis increases).
[6] www.annauniversity.info/NetLearn/TQM_TAGUCHI.ppt.
[7] https://www.kellogg.northwestern.edu/faculty/dranove/htm/dranove/coursepages/Mgmt%20469/noisy-variables.pdf.
