
Six Sigma Yellow Belt

Dr. M. Kamran Zaman


Measure
Key activities in the Measure phase:
Define unit, defect and defect opportunity
Set specification limits
Develop the data collection plan
Collect data
[DMAIC roadmap graphic: Define - Measure - Analyze - Improve - Control, showing the Measure steps "Define Performance Standards" and "Establish Data Collection Plan"]
Questions answered in this phase:
What is the specific CTQ characteristic you will improve?
How will you measure the process performance?
What data do you need, and what is the data collection plan?
Is the data valid and accurate?
[Process diagram: input factors (Process Xs) X1, X2, X3 and X4 feed the PROCESS, which produces the outputs Y1, Y2 and Y3]
Measure
...measure what you care about; know your measure is good...
DMAIC: Measure
Measure is a logical follow-up to Define and is a bridge to the next step, Analyze.
Overview of Brainstorming Techniques
A commonly used tool to solicit ideas, using categories to stimulate cause-and-effect thinking about a problem. It relies on verbal inputs in a team environment.
Cause and Effect Diagram
[Fishbone diagram: the Xs (causes), grouped into the categories People, Machine, Method, Material, Measurement and Environment, feed into the problem or condition - the Y]
The Vital Few
A Six Sigma Belt does not simply stumble upon the Xs that are important in a process (the vital few); they are identified systematically.
The team considers all possible Xs that can contribute to or cause the problem observed.
The team uses three primary sources of X identification:
Process Mapping
Fishbone Analysis
Basic Data Analysis (Graphical and Statistical)
A list of Xs is established and compiled.
The team then prioritizes which Xs it will explore first and eliminates the obvious low-impact Xs from further consideration.

The Focus of Six Sigma
Y = f(X)
Y: dependent, the output, the effect, the symptom - something we monitor.
X1 ... Xn: independent, the inputs and process variables, the causes, the problems - the things we control.
Would you control the shooter or the target to win the Gold Medal at the Olympics?
Y = f(X)
The OUTPUT SIGNAL (Y) is determined by the IN-PROCESS PARAMETERS (Xs); f is the relationship or equation that explains Y in terms of X.
Distance traveled is determined by car speed and traveling time.
Money to spend is determined by income, commitments and credit rating.
OUTPUT (Y) IS DETERMINED BY THE VALUES OF THE IN-PROCESS PARAMETERS (Xs)
Controlling the Output


We use a variety of Six Sigma tools to help separate the vital few variables affecting our Y from the trivial many.
Some processes contain many, many variables. However, our Y is not affected equally by all of them.
By focusing on the vital few, we instantly gain leverage.
Archimedes said: "Give me a lever long enough and a fulcrum on which to place it, and I shall move the world."
[Lever diagram: the many Xs (X1 through X10) sit along a lever; Y = f(X)]
Overview of Process Mapping
In order to correctly manage a process, you must be able to describe it
in a way that can be easily understood.
The preferred method for describing a process is to identify it
with a generic name, show the workflow with a Process Map
and describe its purpose with an operational description.
The first activity of the Measure Phase is to adequately
describe the process under investigation.
[Process Map: Start -> Step A -> Step B -> Step C -> Step D -> Finish]
Information from Process Mapping
By mapping processes we can identify many important characteristics and develop information for other analytical tools:
1. Process inputs (Xs)
2. Supplier requirements
3. Process outputs (Ys)
4. Actual customer needs
5. All value-added and non-value-added process tasks and steps
6. Data collection points (cycle times, defects, inventory levels, cost of poor quality, etc.)
7. Decision points
8. Problems that have immediate fixes
9. Process control needs
Process Mapping
There are usually three views of a process:
1. What you THINK it is
2. What it ACTUALLY is
3. What it SHOULD be
Define Performance Standards
Performance Standard
A Performance Standard defines:
What the customer wants
Clearly whether a process is performing well or not
e.g. loan approval within 24 hours, first-call resolution
A Performance Standard translates the Voice of the Customer into a measurable metric.
The Basic Six Sigma Metrics
In any process improvement endeavor, the ultimate objective is to make the process:
Better: DPU, DPMO, RTY (there are others, but they derive from these basic three)
Faster: Cycle Time
Cheaper: COPQ
If you make the process better by eliminating defects, you will make it faster.
If you choose to make the process faster, you will have to eliminate defects to be as fast as you can be.
If you make the process better or faster, you will necessarily make it cheaper.
The metrics for all Six Sigma projects fall into one of these three categories.
Nomenclature
Number of operation steps = m
Defects = D
Unit = U
Opportunities for a defect = O
Yield = Y
Basic Relationships
Total Opportunities: TOP = U x O
Defects per Unit: DPU = D / U
Defects per Unit Opportunity: DPO = DPU / O = D / (U x O)
Defects per Million Opportunities: DPMO = DPO x 10^6
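A minimal Python sketch of these relationships (the function name is illustrative); the example numbers are taken from the Type A row of the DPMO Calculations table later in this section:

```python
def six_sigma_metrics(defects, units, opportunities_per_unit):
    """Compute the basic relationships listed above."""
    top = units * opportunities_per_unit   # Total Opportunities (TOP = U x O)
    dpu = defects / units                  # Defects per Unit (DPU = D / U)
    dpo = defects / top                    # Defects per Opportunity (DPO = D / (U x O))
    dpmo = dpo * 1_000_000                 # Defects per Million Opportunities
    return top, dpu, dpo, dpmo

# Type A from the DPMO Calculations table below: D = 21, U = 327, O = 92
print(six_sigma_metrics(21, 327, 92))
# (30084, 0.0642..., 0.000698..., 698.05...)
```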


The Basic Six Sigma Metrics
Unit: the event/transaction produced or processed.
Defect: any event/transaction that does not meet the customer requirement.
Opportunity: any measurable input to an event/transaction that provides a chance of not meeting a customer requirement.
Defective: a unit with one or more defects.
Specification Limits: tolerance limits beyond which the customer would be dissatisfied (VOC).
Defective and Defect
A nonconforming unit is a defective unit.
A defect is a nonconformance on one of the many possible quality characteristics of a unit that causes customer dissatisfaction.
A defect does not necessarily make the unit defective.
Example:
A scratch on a water bottle is a defect. (However, if the customer wants a scratch-free bottle, then the scratched bottle is defective.)
Defect Opportunity
A circumstance in which a CTQ can fail to be met.
The number of defect opportunities relates to the complexity of the unit: complex units have more defect opportunities than simple units.
Example:
A unit has 5 parts, and each part has 3 opportunities for a defect, so the total defect opportunities are 5 x 3 = 15.
DPO (Defects Per Opportunity)
The number of defects divided by the number of defect opportunities.
Example:
Continuing the previous case (15 defect opportunities per unit), suppose 10 units contain 2 defects.
Defects per unit (DPU) = 2 / 10 = 0.2
DPO = 2 / (15 x 10) = 0.0133
Yield
Yield is the probability of a part being made within specifications; the proportion of acceptable units is called the yield.
Yield = e^(-DPU), where DPU = defects per unit.
Example
Five defects are observed in 467 units produced, so DPU = 5/467. What is the yield of the process?
Yield = e^(-5/467) = 0.98935, or 98.94%.
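A quick Python check of this calculation (a sketch; the variable names are illustrative):

```python
import math

dpu = 5 / 467                   # five defects observed in 467 units
process_yield = math.exp(-dpu)  # Yield = e^(-DPU)
print(round(process_yield, 5))  # 0.98935, i.e. about 98.94%
```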

First Time Yield
FTY is the traditional quality metric for yield.
Unfortunately, it does not account for any necessary rework.
FTY = Total Units Passed / Total Units Tested
[Figure: a golf-club line with Process A (Grips), Process B (Shafts) and Process C (Club Heads) feeding the Final Product (Set of Irons). Each process takes 100 units in and sends 100 units out, repairing 40, 30 and 20 defects respectively; at final test, Units Tested = 50 and Units Passed = 50, so FTY = 100%.]
Traditional metrics, when chosen poorly, can lead the team in a direction that is not consistent with the focus of the business. One metric we must be concerned about is FTY - First Time Yield.
It is very possible to have 100% FTY and still spend tremendous amounts on excess repairs and rework.
Rolled Throughput Yield
RTY is a more appropriate metric for problem solving because it accounts for losses due to rework steps.
RTY = X1 * X2 * X3
[Figure: the same golf-club line. Process A (Grips): 100 units in, 60 units through without rework, yield 0.6 (40 defects repaired). Process B (Shafts): 100 in, 70 without rework, yield 0.7 (30 defects repaired). Process C (Club Heads): 100 in, 80 without rework, yield 0.8 (20 defects repaired). Final Product (Set of Irons): Units Tested = 100, Units Passed = 34, RTY = 33.6%.]
Instead of relying on FTY (First Time Yield), a more efficient metric to use is RTY (Rolled Throughput Yield). RTY has a direct correlation to the Cost of Poor Quality.
In the few organizations where the data is readily available, RTY can be calculated using actual defect data. The data provided by this calculation would follow a binomial distribution, since the lowest possible yield is zero.
As depicted here, RTY is the multiplied yield of each subsequent operation throughout a process (X1 * X2 * X3).
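A short Python sketch contrasting FTY and RTY for the golf-club example above (numbers taken from the figures; names are illustrative):

```python
from math import prod

# FTY view: every tested unit passes after rework
fty = 50 / 50                          # FTY = Total Units Passed / Total Units Tested = 100%

# Per-process yields without rework: Grips, Shafts, Club Heads
yields_without_rework = [60 / 100, 70 / 100, 80 / 100]
rty = prod(yields_without_rework)      # RTY = X1 * X2 * X3

print(fty, rty)                        # 1.0 versus 0.336 (about 33.6%)
```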
DPMO Calculations

Characteristic | Defects (D) | Units (U) | Opportunities (O) | Total Opportunities (U x O) | Defects per Unit (D/U) | Defects per Total Opportunities (D/(U x O)) | Defects per Million Opportunities (DPO x 10^6)
Type A | 21 | 327 | 92 | 30084 | 0.06422 | 0.000698045 | 698.04547
Type B | 10 | 350 | 85 | 29750 | 0.02857 | 0.000336134 | 336.13445
Type C | 8 | 37 | 43 | 1591 | 0.21621 | 0.005028284 | 5028.28410
Type D | 68 | 743 | 50 | 37150 | 0.09152 | 0.001830417 | 1830.41723
Type E | 74 | 80 | 60 | 4800 | 0.92500 | 0.015416667 | 15416.6667
Type F | 20 | 928 | 28 | 25984 | 0.02515 | 0.000769704 | 769.704433
Quality Level Calculation
Sigma Quality Level = 0.8406 + sqrt(29.37 - 2.221 x ln(DPMO))
Example
Let's take the example of a coffee shop.
What are the things that make a good hot coffee?
Temperature
Aroma
Crockery
Froth
Ambience
Coffee Beans
Service
Availability
Price
Blend
Example: Defects & Opportunity
What happens when the coffee is not hot and the service is poor?
Out of the 10 parameters, only 8 are fulfilled: 10 opportunities and 2 defects.
Defects = 2
Opportunities = 10
Units = 1
DPO = D / (U x O) = 2 / (1 x 10) = 0.2
DPMO = 0.2 x 1,000,000 = 200,000
That is only about 2.34 Sigma.
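A small Python sketch tying the coffee-shop numbers to the Sigma Quality Level formula given earlier (function and variable names are illustrative):

```python
from math import log, sqrt

def sigma_level(dpmo):
    """Approximate Sigma Quality Level from DPMO using the formula above."""
    return 0.8406 + sqrt(29.37 - 2.221 * log(dpmo))   # log() is the natural log

dpmo = (2 / (1 * 10)) * 1_000_000    # 2 defects, 1 unit, 10 opportunities -> 200,000 DPMO
print(round(sigma_level(dpmo), 2))   # about 2.34
```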



Sigma DPMO YIELD Sigma DPMO YIELD
6 3.4 99.99966% 2.9 81,000 91.9%
5.9 5.4 99.99946% 2.8 97,000 90.3%
5.8 8.5 99.99915% 2.7 120,000 88.0%
5.7 13 99.99866% 2.6 140,000 86.0%
5.6 21 99.9979% 2.5 160,000 84.0%
5.5 32 99.9968% 2.4 180,000 82.0%
5.4 48 99.9952% 2.3 210,000 79.0%
5.3 72 99.9928% 2.2 240,000 76.0%
5.2 108 99.9892% 2.1 270,000 73.0%
5.1 159 99.984% 2 310,000 69.0%
5 233 99.977% 1.9 340,000 66.0%
4.9 337 99.966% 1.8 380,000 62.0%
4.8 483 99.952% 1.7 420,000 58.0%
4.7 687 99.931% 1.6 460,000 54.0%
4.6 968 99.90% 1.5 500,000 50.0%
4.5 1,300 99.87% 1.4 540,000 46.0%
4.4 1,900 99.81% 1.3 580,000 42.0%
4.3 2,600 99.74% 1.2 620,000 38.0%
4.2 3,500 99.65% 1.1 660,000 34.0%
4.1 4,700 99.53% 1 690,000 31.0%
4 6,200 99.38% 0.9 730,000 27.0%
3.9 8,200 99.18% 0.8 760,000 24.0%
3.8 11,000 98.9% 0.7 790,000 21.0%
3.7 14,000 98.6% 0.6 820,000 18.0%
3.6 18,000 98.2% 0.5 840,000 16.0%
3.5 23,000 97.7% 0.4 860,000 14.0%
3.4 29,000 97.1% 0.3 880,000 12.0%
3.3 36,000 96.4% 0.2 900,000 10.0%
3.2 45,000 95.5% 0.1 920,000 8.0%
3.1 55,000 94.5%
3 67,000 93.3%
Establish the Data Collection Plan
Data Collection
1. What is the data source or location? Answers to this question help to clearly identify the point at which the raw data is collected. The most common mistake made in answering this question is to identify where reports come from rather than where the raw data is actually collected. Examples of raw data collection include: on tags, in log books, entered into databases, scribbled on surveys, interpreted from phone conversations or automatically tallied by machinery.

2. Who is the data collector? The answer to this question is typically a front
line employee: operator, clerk, waiter or other. If the data is scanned or
automatically tallied by machinery or computer, then a simple entry of the
method employed is adequate.

3. What is the sampling plan? This question is often confused with reporting.
The question is meant to apply to raw data. How often is data collected?
Examples include: continuously, once per minute, each setup, each shift, each
customer contact, every fifth call.

Data Collection
Six Sigma project leaders should develop a sound data
collection plan to gather reliable and statistically valid
data in the DMAIC measurement phase.

Incorporating these steps into a data collection plan will
improve the likelihood that the data and measurements
can be used to support the ensuing analysis.
Data Collection Plan

Y - Measure Data Source & Location Sample Size Who When How
X data that should also
be collected





Step 1: Define Goals And Objectives

A good data collection plan should include
A brief description of the project
The specific data that is needed
The rationale for collecting the data
What insight the data might provide (to a process being studied)
and how it will help the improvement team
What will be done with the data once it has been collected
Being clear on these elements will facilitate the accurate and
efficient collection of data.

Step 2: Define Operational Definitions and
Methodology

The improvement team should clearly define what data is to be collected and how. It should decide what is to be evaluated and determine how a numerical value will be assigned, so as to facilitate measurement. The plan should also specify:

How many observations are needed
What time interval should be part of the study
Whether past, present, and future data will be collected
The methodologies that will be employed to record all the data

It is best to obtain complete understanding of and agreement on all the
applicable definitions, procedures and guidelines that will be used in the
collection of data. Overlooking this step can yield misleading results if
members of the improvement team are interpreting loosely defined terms
differently when collecting data. Serious problems can arise for the
organization when business decisions are made based on this potentially
unreliable data.

Step 3: Ensuring Repeatability,
Reproducibility, Accuracy and Stability

The data being collected (and measured) will be repeatable if the
same operator is able to reach essentially the same outcome
multiple times on one particular item with the same equipment.

The data will be reproducible if all the operators who are
measuring the same items with the same equipment are reaching
essentially the same outcomes. In addition, the degree to which
the measurement system is accurate will generally be the
difference between an observed average measurement and the
associated known standard value.


Step 4: The Data Collection Process

Once the data collection process has been planned and defined, it
is best to follow through with the process from start to finish,
ensuring that the plan is being executed consistently and
accurately. Assuming the project lead has communicated to all the
data collectors and participants what is to be collected and the
rationale behind it, he or she might need to do additional
preparation by reviewing with the team all the applicable
definitions, procedures, and guidelines, etc., and checking for
universal agreement. This could be followed up with some form
of training or demonstration that will further enhance a common
understanding of the data collection process as defined in the
plan.

Step 5: After The Data Collection Process

The project lead should check to see that the results (data and
measurements) are reasonable and that they meet the criteria. If
the results are not meeting the criteria, then the project lead
should determine where any breakdowns exist and what to do
with any data and/or measurements that are suspect.

Reviewing the operational definitions and methodology with the
participants should help to clear up any misunderstandings or
misinterpretations that may have caused the breakdowns.

Collect Visual Data to See the Problem
Where possible, use a digital or video camera to capture the defect or process problem. A picture is worth a thousand words in understanding and communicating the origin and nature of problems.
Data Segmentation
Segmentation involves dividing data into logical categories for analysis. For instance, while recording the errors made by a data entry process, the project manager may choose to capture the step at which each error occurred, the operator who made it, and so on.
Sampling
Sample size: determining the sample size is a very important issue because samples that are too large may waste time, resources and money, while samples that are too small may lead to inaccurate results.
Sampling
There are normally two types of studies: population and
process. With a population study, the analyst is interested in
estimating or describing some characteristic of the population
(inferential statistics).

With a process study, the analyst is interested in predicting a
process characteristic or change over time. It is important to
make the distinction for proper selection of a sampling
strategy.
Sampling Strategies
Random sampling

Stratified random sampling

Systematic sampling

Rational sub-grouping

Random Sampling
Random samples are used in population
sampling situations when reviewing historical or
batch data. The key to random sampling is that
each unit in the population has an equal
probability of being selected in the sample
Stratified Random Sampling
Stratified random samples are used in population
sampling situations when reviewing historical or
batch data. Stratified random sampling is used
when the population has different groups (strata)
and the analyst needs to ensure that those groups
are fairly represented in the sample. In stratified
random sampling, independent samples are
drawn from each group. The size of each sample
is proportional to the relative size of the group.
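A minimal Python sketch of that proportional allocation across strata (the strata names and counts here are hypothetical):

```python
def stratified_sample_sizes(strata_sizes, total_sample):
    """Allocate a sample across strata in proportion to each stratum's share of the population."""
    population = sum(strata_sizes.values())
    return {name: round(total_sample * size / population)
            for name, size in strata_sizes.items()}

# Hypothetical strata: three branches of different sizes, 60 samples in total
print(stratified_sample_sizes({"North": 500, "Central": 300, "South": 200}, total_sample=60))
# {'North': 30, 'Central': 18, 'South': 12}
```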
Systematic Sampling
Systematic sampling is typically used in process
sampling situations when data is collected in real time
during process operation.

Systematic sampling involves taking samples
according to some systematic rule e.g., every fourth
unit, the first five units every hour, etc.
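A minimal Python sketch of a systematic rule such as "every fourth unit" (the data here is hypothetical):

```python
def systematic_sample(units, k, start=0):
    """Take every k-th unit from a sequence of process output, beginning at index 'start'."""
    return units[start::k]

# Hypothetical stream of 20 consecutive unit IDs, sampled every fourth unit
print(systematic_sample(list(range(1, 21)), k=4))   # [1, 5, 9, 13, 17]
```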
Rational Sub-grouping
Rational sub-grouping is the process of putting
measurements into meaningful groups to better
understand the important sources of variation. Rational
sub-grouping is typically used in process sampling
situations when data is collected in real time during
process operations. It involves grouping measurements
produced under similar conditions, sometimes called
short-term variation. This type of grouping assists in
understanding the sources of variation between
subgroups, sometimes called long-term variation.
Sampling Strategy Tips
Use the following pointers when choosing a sampling strategy for a given process:

It is always better to collect small samples spread over a longer time period than one large sample over a shorter time period.
Sample more frequently for an unstable process and less frequently for a stable process.
Sample more frequently for a process with a short cycle time and less frequently for a process with a long cycle time.
Sampling Strategy Tips
The sampling frequency is driven by the objective of the data collection; to choose it, one must first understand that objective.

For example, if the data is collected to monitor the process, one might sample daily. However, if the objective is to collect data on the same process to study its capability, one might collect data over a few months by sampling a few data points each week or month.
Analyze
Key activities in the Analyze phase:
Sub-process mapping
Cause & Effect Diagram
Identify variation sources
Establish and calculate process capability
[DMAIC roadmap graphic: Define - Measure - Analyze - Improve - Control, with Analyze highlighted]
Process Capability
There are two popular measures for
quantitatively determining if a process is
capable:

Process capability ratio (Cp)

Process capability index (Cpk)

Process Capability Ratio
For a process to be capable, its values must fall within the upper and lower specifications.
The natural spread of the process is within +/- 3 standard deviations of the process mean (a width of 6 sigma).

Cp = (upper specification - lower specification) / 6 sigma
Cp = (USL - LSL) / 6σ

A capable process has a Cp of at least 1.0.
When the process is centered, Cp = 1.0 means that 99.73% of outputs fall within the specifications, which indicates a capable process.
The higher the process capability ratio, the greater the likelihood that the process will be within the design specifications.

Process Capability Ratio
A comparison between the specification limits and the process limits.
[Figure: when the process limits fall inside the specification limits, Cp is good (> 1); when the process limits fall outside the specification limits, Cp is bad (< 1).]
Process Capability
When the process is in statistical control, the process capability (its natural spread) is equal to 6σ.
Cp
Cp does not measure process performance in terms of the nominal or target value.


WHAT IF YOUR PROCESS CENTRE IS
SHIFTED FROM YOUR SPECIFICATION
CENTRE ?

Shift In The Process Mean
Process Capability Index (Cpk)
Cpk measures the difference between the desired and actual dimensions of the goods or services produced.

Cpk = min[ (USL - X̄) / 3σ , (X̄ - LSL) / 3σ ]
where X̄ is the process mean and σ is the process standard deviation.

Cpk gives additional information about the centering of the process; therefore, it is also called the process performance index.
An increasing value of Cpk means that the process is becoming increasingly capable.
Cpk > 1: a capable process
Cpk < 1: a not capable process
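A minimal Python sketch of the Cp and Cpk formulas above (the specification and process values here are hypothetical):

```python
def cp(usl, lsl, sigma):
    """Process capability ratio: Cp = (USL - LSL) / (6 * sigma)."""
    return (usl - lsl) / (6 * sigma)

def cpk(usl, lsl, mean, sigma):
    """Process capability index: Cpk = min((USL - mean) / (3*sigma), (mean - LSL) / (3*sigma))."""
    return min((usl - mean) / (3 * sigma), (mean - lsl) / (3 * sigma))

# Hypothetical process: specifications 10 +/- 0.6, observed mean 10.1, standard deviation 0.15
print(round(cp(10.6, 9.4, 0.15), 2))          # 1.33
print(round(cpk(10.6, 9.4, 10.1, 0.15), 2))   # 1.11 -- lower than Cp because the mean is off-center
```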
Process Capability Study
A process capability study is used for:
Evaluating a newly established process
Evaluating the performance of new machinery
Reviewing specifications based on the inherent variability of the process
Process studies
Studying the effect of adjustments made to the process
Data Analysis
To understand the current level of the process, analyze the collected data with the help of statistical tools.
Baseline Process
Study the data for stability and shape.
Statistically indicate the nature of the problem.
Calculate the baseline capability of the process.

Describe the process by its:
Descriptive statistics
Nature of distribution
Understand the specification limits and the centering or target values.
Calculate the probability of a defect and the process capability.
Setting a Goal for the Y Metric
The following are approaches one could adopt to define the goal for the Y metric:
Benchmarking: one can target the best in the industry.
Arbitrary defect reduction: frequently used with a discrete metric, e.g. reduce DPMO by 50%.
Other sources for setting a goal
The following are a few more options that drive the goal for the process metric:
Corporate mandate
Compliance/legal requirement
Voice Of Customer
Industry Standards (e.g. ISO)
Analyze - Summary
In the Analyze phase, you work with all the information gathered in the Measure phase to determine potential causes and to prepare for making key changes that positively alter each scenario.
Thank You
