
Notes

The Nature of Design


Artefact Something that results from purposeful human activity.
Design Arrangement, structure, form, materials and the human activity in which the characteristics of an artefact are determined. The word design may be used as a noun to describe models or drawings, or as a verb to describe the act of realising an artefact.
A design can be a product such as a phone, a system such as an air traffic control protocol, or a service; increasingly, combinations of these are offered as product-service systems, such as a launderette.
Engineering designers define in detail every part of an artefact.
Industrial designers may be responsible for aesthetics and aspects of user interaction and
ergonomics.
In his book 'The Nature of Technology', Brian Arthur puts forward three fundamental propositions:
1. All technologies are combinations of components or assemblies or subsystems at hand.
2. Each component of a technology is itself (in miniature) a technology.
3. All technologies harness and exploit some natural phenomenon, usually several (for
example flow of electrons in a semi-conductor or enthalpy in high pressure steam).
Based on Arthur's view of technology, there are a number of different types of activity that a designer can undertake, including:
1. Searching for new combinations of technology to address a new need, or an existing need in
a novel way (for example a smart phone combines a phone, music player, web browser, GPS
etc.).
2. Taking a natural phenomenon and embedding it in a useful device, e.g. making nuclear fission exploitable in an electrical generation plant.
3. Structural Deepening Adding systems and sub-assemblies to an artefact to help it work properly, handle exceptions, extend its range of application and provide redundancy in the event of failure.
4. Internal Replacement Replacing components or materials to improve the performance of
an artefact.
Standard Engineering Where new artefacts are created by putting together methods and
devices under design principles that are known and accepted e.g. a new bridge. In this work,
the engineer's task is to ensure that the design principles are applied so that the new artefact
meets the specification required.
Pahl and Beitz' Modes of Design are another view on the different categories of design
presented in their book on engineering design.
1. Original Design Original or new tasks are solved using new solution principles. This is
represented by categories 1 and 2 in Arthur's classification.
2. Adaptive Design Where known solution principles are used, but they may be adapted to
new circumstances, and their embodiment is varied. Original design may take place at sub-system level. This equates broadly to categories 3 and 4 and to Arthur's standard engineering.
3. Variant Design Also referred to as fixed principle design, in which the sizes and
arrangement of parts is varied within closely prescribed rules or limits. This is a special case
of standard engineering.
Dominant Design Coined by Abernathy and Utterback, referring to the phenomenon where particular design arrangements come to dominate a market, displacing alternatives, and become de facto standards.
The advantages of dominant designs are that they help to achieve economies of scale through standardisation, and that by concentrating effort on a dominant design an industry can achieve very high levels of performance and reliability.

Improvement of dominant designs occurs through structural deepening and internal replacement over the lifetime of the technology, but there eventually comes a time when these do not do much to improve the performance of the technology.
The disadvantage of dominant designs is that when the limit of the technology is reached, a novel principle is needed, but the industry is already locked in to the current dominant design; the new technology may not be available, or it may be very economically risky to develop, so progress slows down.
Adaptive Stretch the adaptation of a locked-in established principle. At some point the old
principle becomes too difficult to stretch and the way is left open for a novel principle.
Paradigm Shift A fundamental change of approach to a technology, for example we are
currently having to move away from technology based around fossil fuel due to
environmental considerations.
New Product Introduction (NPI) The process by which a company identifies and exploits
an opportunity to bring a new product to market and comprises the early stages of the
Product Life Cycle.
Studies have estimated that the majority of all costs and problems of quality are created in
product design and development.
Typically 75% of faults originate in the development and planning stages but around 80% of
these faults remain undetected until final test or use.
Costs 'fixed' at the planning and design stages are typically between 60 to 85%, but the costs
actually incurred may only be 5% of the total committed for the project.
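The gap between cost committed and cost incurred can be illustrated with a toy calculation (all figures below are invented for illustration, not taken from any study):

```python
# Hypothetical illustration: by the end of design a project may have committed
# ~70% of its lifetime cost (within the 60-85% range cited) while having
# actually spent only ~5% of it.
total_lifecycle_cost = 1_000_000  # assumed total cost over the product's life

committed_fraction_after_design = 0.70
incurred_fraction_after_design = 0.05

committed = total_lifecycle_cost * committed_fraction_after_design
incurred = total_lifecycle_cost * incurred_fraction_after_design

print(f"Committed by end of design: {committed:,.0f}")  # 700,000
print(f"Actually spent so far:      {incurred:,.0f}")   # 50,000
# Design decisions lock in far more cost than has been spent at that point.
print(f"Leverage ratio: {committed / incurred:.0f}x")   # 14x
```

This asymmetry is why errors caught at the design stage are so much cheaper to fix than errors caught in production or use.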
As well as quality, cost, delivery and performance, designers must consider many other
properties known as the 'ilities.'
Product Development Models and Standards
Many of the models listed in this section are not scientific laws but descriptive models of the
processes found in industry. In other cases prescriptive models are used based on empirical
evidence which if adhered to are likely to result in a high quality design outcome.
The product life cycle stages:
1. Feasibility Study A check that a proposed product is technically and commercially feasible
in order to form a design brief.
2. Design Takes the idea and turns it into a set of instructions for manufacture in the form of
drawings, computer models, production plans and so on.
3. Development Where prototypes may be built and then tested in order to be improved upon.
4. Production The product is manufactured and assembled.
5. Distribution The product is delivered to the customer base.
6. Use The product is operated by the customer.
7. Disposal The final stage where the used product is dealt with in various ways including remanufacturing, recycling or disposal to landfill.
The product life cycle can be summarised into four key stages: motivation and need, creation, use and disposal. The first two stages may be termed new product development. This is a critical stage, and its importance is recognised by the existence of several product development models to guide the design process of NPI.
Sequential Models The product life cycle is described by a sequential series of phases and
stages. The most widely used NPI and design process models adopt this pattern. The stages
represent the order in which information has to be generated in the NPD process.
Double Diamond Design Process Model A very simple sequential design process model
which divides the process into alternating divergent and convergent stages. In divergent
stages, the designer tries to expand the range of information, concepts and ideas and at the
convergent stages structured approaches are used to home in on a decision. The four stages
are:
1. Discover: Divergent stage in which users and markets are explored with the aim of expanding understanding of issues and important competitive factors.
2. Define: Convergent stage in which market and user information is aligned with business objectives to define design goals, brief, project team and management and so on.
3. Develop: Divergent stage in which design solutions are developed and iterated, with the design team exploring a wide range of solutions and expanding its understanding of the options for delivery of a suitable design.
4. Deliver: Final convergent quarter in which the product is completed, tested and launched.
The double diamond is especially used in more artistic design fields such as industrial design.
Pahl and Beitz Design Process Model Sequential multi-stage process engineering design
model. This model divides the design process into four high-level phases including: task
clarification, conceptual design, embodiment design and detail design. Important points to
note about this model are as follows
The output from the task clarification phase is the specification or product design
specification.
The conceptual design phase explores high-level solution principles (design concepts) that
might meet the PDS and selects the most suitable.
The selected concept is then fleshed out in the embodiment phase (for example to define the shape and size of components needed to implement a particular solution principle), at the completion of which the product structure (bill of materials) and the form and dimensions of all parts should be defined.
The detail design phase adds detail such as tolerances, surface condition and treatment and
so on.
Pahl and Beitz is a very good basis for bespoke production machines and is suited for
systems engineering and problem-solving.
Stage-gate Models In order to decide whether iteration is needed within a previous step of
a design model, there need to be decision points within the process and these points are
called gateways.
The stage-gate model, developed and presented as a trademarked product by Robert Cooper, has five phases: scoping, building the business case, development, validation and launch (with discovery and post-launch review as optional stages).
The gateways in stage-gate models are there to make a go/no go decision based on the
information generated in the preceding stage. The gateways are an important part of the risk
management process.
Stage-gate models are useful for large, expensive projects with distributed development as it
allows for tight monitoring and control over the design process.
Agile and Cyclic Models NPI models which emphasise adaptation, iteration, continual
feedback and rapid and flexible response to change, especially to feedback from the users.
It is argued that stage-gate models tend to serialise and compartmentalise activities, whereas validation should be done continually throughout the process. It is also argued that stage-gating depends upon being able to incrementally adapt previous designs so that you know what to expect at each gateway, which stifles innovation. Agile and cyclic models came about as a response to these criticisms and are particularly useful for software engineering.
Agile methods repeatedly cycle through requirements formulation, analysis, design,
implementation, testing and evaluation using fast prototyping techniques to get quick
feedback on ideas.
The multi-stage design process models reflect the need for design activities to be done in
sequence for example to choose the overall solution principle before developing the design
in detail.
V-Model Graphical model that describes the actions that have to be performed and the outcomes that have to be achieved at each stage of the design process. Coming down the left leg of the V involves identification of system requirements, creation of system
leg of the V involves identification of system requirements, creation of system
specifications, design of system architecture and then specification of the requirements for
all the components. At the bottom of the V the component engineering is done. Coming up
the right side of the V represents integration of parts and their validation through testing,
where one first tests parts, then sub-systems, checking against specifications. The geometry
of the V means that the integration and test activities align horizontally with the design
activities to which they apply.
Concurrent Models Design models in which overlapping of process steps occurs. Also
known as simultaneous engineering.
The advantages of concurrent design models are that the overall NPI process is expedited and that, by starting later stages of development before earlier stages have finished, it may be possible to identify problems early and correct them, as special expertise is applied earlier in the NPI process.
However, the inter-group communication and increased coordination required for concurrent
engineering increases the complexity of the design process.
Concurrent models are most effective when the artefact being designed is an adaptation of a
previous artefact and thus the required engineering is well known.
Quality Assurance
As well as providing guidance in the management of design, standards organisations provide
extensive guidance in how organisations can define, establish and maintain a quality
assurance system for their products and services.
The key tenets of the ISO 9000 quality management system are:
1. The quality policy is a formal statement from management, linked to business and customer
needs.
2. The quality policy is understood and followed by all employees. Each employee works
toward measurable objectives.
3. Records show how and where materials and products were processed to allow problems to
be traced to the source.
4. When developing new products, the business plans the stages of development, with
appropriate testing at each stage. It tests and documents whether the product meets design
requirements, regulatory requirements, and user needs.
5. The business has documented procedures for dealing with actual and potential non-conformances.
Total Quality Management developed from an examination of organisation-wide
approaches to quality control adopted by Japanese firms. Its basic management principle is
to efficiently harness the skills of the entire workforce with the aim of:
1. Adopting a prevention rather than inspection philosophy.
2. Reducing costs.
3. Improving customer satisfaction.
4. Achieving 'right first time'.
5. Continuous quality improvement.
TQM affects three key areas of the product development process: strategic (external and internal management of processes), cultural (empowerment and teamwork) and technical (thought of as a toolbox of techniques used to facilitate TQM).
Lean Manufacturing Often abbreviated to 'lean', a production philosophy that considers expenditure of resources on anything except the direct creation of value for the customer to be wasteful, where value is anything that the customer is willing to pay for.
Lean concentrates on the original Toyota 'seven wastes': transportation (movement of products, with associated risk of damage), inventory (raw materials, work in progress and finished goods stocks), motion (unnecessary movement of people or equipment), waiting, over-processing (including using parts that are more precise or of higher quality than necessary), over-production and defects.


Six Sigma A set of quality management methods and tools, including statistical methods to
improve manufacturing quality by identifying and removing the causes of defects.
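One representative statistical tool from this toolbox is the process capability index, which compares the spread of a process to its tolerance band; the measurements and specification limits below are invented for illustration:

```python
import statistics

# Invented sample of a machined dimension (mm) and its specification limits.
measurements = [10.02, 9.98, 10.01, 10.03, 9.97, 10.00, 10.02, 9.99]
lsl, usl = 9.90, 10.10  # lower / upper specification limits

mu = statistics.mean(measurements)
sigma = statistics.stdev(measurements)  # sample standard deviation

# Cp: potential capability (tolerance width vs process spread, ignoring centring).
cp = (usl - lsl) / (6 * sigma)
# Cpk: actual capability, penalising an off-centre mean.
cpk = min(usl - mu, mu - lsl) / (3 * sigma)

print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
# A common rule of thumb treats Cp, Cpk >= 1.33 as capable; a centred
# "six sigma" process corresponds to Cp = 2.0.
```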
Product Strategy and Architecture
According to Porter, the two basic types of competitive advantage a firm can possess are
low cost or differentiation. Low cost might be achieved by economies of scale, proprietary
technology, preferential access to energy or raw materials or other factors.
The sources of competitive advantage form the first dimension of Porter's generic strategies.
The second dimension is that of market scope i.e. should the firm seek to achieve cost or
differentiation over a broad or narrow range of products.
Other considerations are whether a firm should offer a product, service or combination of
both which are known as product service systems (PSS).
Firms must also consider whether their activities should be vertically or horizontally
integrated. Vertical integration is where a single firm undertakes many of the stages of the
production of a product and horizontal integration is where a firm tries to achieve economies
of scale by covering different market segments in the same industry.
Also to be considered is whether a firm should be a first-to-market or fast-follower. First to
market has the advantage of enabling premium prices due to uniqueness while fast follower
allows a firm to learn from the mistakes of the companies first to market.
Closely related to the above point is the question of high-risk versus low-risk products, and especially the balance thereof.
When should a product be retired from the market? The issue here is that introducing a new
product can cannibalise the sales of the previous product during the transition period.
There are also many order fulfilment options for a product:
1. Engineer-to-Order (ETO) The product is designed and built to customer specifications, for example the construction of skyscrapers and other buildings.
2. Build-to-Order (BTO) or Make-to-Order (MTO) The artefact is based on a standard design
but component production and manufacture/assembly of the delivered artefact is linked to
the customer order and specifications e.g. high end cars.
3. Assemble-to-Order (ATO) The product is built to customer specifications from a stock of
existing components for example a PC.
4. Make-to-Stock (MTS) product is built against a sales forecast and sold to the customer
from finished goods stock for example low end cars.
5. Ship-to-Stock (STS) As MTS but where the stock is held by a retailer, for example fast-moving consumer goods (FMCGs) and consumer electronics.
Typically volume increases as you move from ETO to STS.
A strategic imperative for companies is to try to move the stock holding decoupling points
away from end users i.e. move toward the ETO end of the spectrum as it reduces reliance on
forecasts which may be inaccurate.
A big factor in enabling this strategic objective is the architecture of the product and
especially its modularity.
Product Architecture The way in which an artefact is organised into elements (sub-systems or sub-assemblies) and component parts, and the definition of the interfaces between those elements.
Module A high-level element of an artefact architecture.
Modularity makes the complexity of a system manageable by dividing an artefact into
manageable parts and makes gradual evolution of knowledge about the system possible.
Secondly, modularity organises and enables parallel work. Finally, modularity in a design
allows modules to be changed and improved over time without modifying the system as a
whole.
Ulrich defined product architecture as involving: the scheme by which the function of the product is allocated to physical components; the arrangement of functional elements; the mapping from functional elements to physical components; and the specification of interfaces among interacting physical components.
Functions are what the product does as opposed to its physical characteristics.
The original understanding of modularity was that modules could be interchanged with one
another to customise products or make them perform different functions.
Modularity in manufacturing is still very important but not necessarily on the basis of
functional division, for example an artefact may be split into several modules to speed up
manufacturing time with concurrent manufacturing.
Modularity is useful in upgrading an artefact through life as more advanced technology may
be compatible with older interfaces allowing new technology to be easily plugged in.
Modularity is very useful for repair as in many situations artefacts need to be returned to
service as quickly as possible and elements may be removed and replaced into artefacts very
easily.
Modularity at end of life is important to allow for artefacts to be re-manufactured, or their
constituent parts recycled or reused at the end of their useful life. Interfaces are also very
important in this regard as modules should be able to be removed easily and without
damage.
Modularity is not always an unequivocal advantage as an integral architecture may have
performance advantages and interfaces may introduce potential failure points into a system,
especially in electrical/electronic components and systems.
Approaches to modular design are distinguished by Ulrich as:
1. Integral Architecture Where an artefact is built as a single piece.
2. Slot Architecture Where each interface is of a different type to the others, so that interchange between modules is not possible.
3. Bus Architecture A common bus is provided (often as electrical connectors) to which components interface with the same type of connection, e.g. track lighting, shelving systems, expansion boards in a computer.
4. Sectional Architecture An assembly which is built up by connecting interchangeable components via identical interfaces.
Ulrich also suggests there are multiple ways of mapping functions to modules of an artefact.
1. One-to-one Mapping Where a module has a single assigned function, e.g. an engine.
2. Function Driven Integrity Where multiple modules work together to perform a single function, e.g. an air conditioner.
3. Component Integrity Where a single module performs multiple functions, e.g. a car body.
4. Combined Integrity A combination of function driven and component integrity.
Modules may be assigned on the basis of technology, spatial arrangement, the need to process the technology at end of life, bottom-up clustering of elements, division by engineering domain (e.g. electrical or mechanical), the need to separately assemble and test, or division according to a work split.
Due to the importance of interfaces, robust approaches for their design are vital.
Standardisation Standards and codes of practice for the design of module and component interfaces are essential in design.
Joining Elements Particularly important are technological elements for connecting
different parts of a system or product e.g. screws.
Methods for the design of assemblies are of paramount importance and include design for
assembly and kinematic design which comprises design rules which ensure that assemblies
are not over or under constrained.
Modularity is of particular importance when used in product families and platforms.
Product Family A set of products generally offered by the same firm that share
technologies and address related market applications.

Product families are advantageous because they allow the reuse not only of parts and sub-systems but also of design solutions and engineering processes; they allow organisational learning (what a firm learns in the NPI for one product can be applied to the rest of the family); they allow economies of scale; and they allow brand reputation to be exploited.
Product Platform Characterised by a modular product architecture and comprise the
collection of assets (components, processes, knowledge, people) that are shared by a product
family.
The use of product platforms allows firms to produce product families that offer
considerable variety to their customers while controlling the complexity of the engineering
involved.
In computing and electronics, platforms refer to operating systems such as Windows, which comprise the shared set of assets upon which families of products are based; in this case the platform is the core technological ecosystem upon which new products are developed.
Mass customisation is made possible by product platforms, using approaches already mentioned such as modularity, together with other methods such as multi-function assemblies (where a single component may perform different functions in different products), the use of software to customise products, and late configuration.
Designing products for late and local configuration is called design for postponement.
Design Tools and Techniques
Characteristics An artefact's design, shape, structure and so on. Represented by proposals
or models during the design phase. The characteristics define a product's structure and shape
and can be directly determined by the designer.
Properties Based on the characteristics of an artefact you can evaluate how you think it
will perform i.e. how strong, how reliable and how efficient etc. which are the properties of
the artefact. Properties define the behaviour of an artefact and cannot be directly determined
by the designer.
Many design techniques come under the general heading of Design for X, where X stands for the desired property of the artefact. The desired property is likely to be one of the 'ilities'.
Design for X techniques may be split into categories including:
1. Those that help estimate how well the design will perform with respect to property x.
2. Those that give guidelines or instructions on the characteristics a design should have in
order to achieve a particular property.
3. Those that can be applied across a range of properties for example to handle the effect of
uncertainty in parameters describing characteristics.
Design for Manufacture and Assembly Design for manufacturability and assemblability of
the artefact (both type 1 and type 2 techniques).
Poka Yoke Guidelines for producing designs or products and processes which are
inherently reliable and resistant to operator mistakes (type 2 techniques).
Life Cycle Assessment (LCA) Techniques for predicting the environmental impact of an
artefact or process (type 1 techniques).
Design for End-of-Life Techniques that allow the remanufacturability and recyclability of
the artefact to be estimated (mainly type 2 techniques).
Taguchi's Robust Design Techniques for predicting the sensitivity of an artefact to
variations in its production or use (type 3 techniques).
Six-Sigma and Process Capability Techniques to assist in reducing the impact of material
and process uncertainty on artefact performance (type 1 and 3 techniques).
Techniques that help a design team collect and organise its understanding of an artefact and
its performance include:
1. Design of Experiments (DOE) Techniques to allow the relationship between properties and
characteristics or environmental attributes to be experimentally explored with a minimum
number of tests.

2. Quality Function Deployment (QFD) Used to understand and quantify the importance of
customer needs and requirements and to support the definition of product and process
requirements.
3. Failure Modes and Effects Analysis (FMEA) Used to help understand how the artefact
might fail and the steps that can be used to reduce the likelihood and impact of failures.
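The prioritisation step of FMEA is commonly done with a risk priority number (RPN), scoring each failure mode 1-10 for severity, occurrence and detection difficulty; the failure modes and scores below are invented examples:

```python
# Hypothetical failure modes, each scored on 1-10 scales.
failure_modes = [
    # (description, severity, occurrence, detection difficulty)
    ("Fastener vibrates loose", 7, 4, 3),
    ("Seal degrades in UV light", 5, 6, 7),
    ("Connector mis-mated at assembly", 8, 3, 2),
]

# RPN = severity x occurrence x detection; higher RPN means higher priority.
ranked = sorted(
    ((s * o * d, name) for name, s, o, d in failure_modes),
    reverse=True,
)

for rpn, name in ranked:
    print(f"RPN {rpn:4d}  {name}")
# The seal failure ranks highest (RPN 210), so mitigation effort goes there first.
```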
Design for Manufacture and Assembly
DFM/DFA are important approaches to assist in the control of manufacturing costs.
A number of significant DFM/DFA methods include:
1. Boothroyd-Dewhurst's Design for Manufacture and Assembly (DFMA).
2. Computer Sciences Corporation's (CSC) DFA/Manufacturing Analysis (MA).
3. Hitachi's Assembly Evaluation Method (AEM).
In practice, the first is the most widely used and all three concentrate on DFA.
DFA can be applied, without first carrying out an evaluation step, through guidelines for part and assembly design. Examples of these guidelines include:
1. Reduce part count and types by consolidation and integration.
2. Reduce number of fasteners to a minimum and avoid threaded fasteners.
3. Use common, efficient fastenings systems (when they must be used).
4. Modularise the design.
5. Design for an optimum assembly sequence.
6. Provide a base for assembly to act as a fixture or work carrier.
7. Design the assembly process in a layered fashion (ideally with parts assembled from above).
8. Keep the centre of gravity of the assembly low.
9. Use gravity to aid assembly operations.
10. Ensure that product weight allows easy handling.
11. Design parts for multi-functional uses where possible.
12. Eliminate unnecessary joining processes.
13. Strive to eliminate adjustments (especially blind adjustments/shimming).
14. Ensure adequate access and unrestricted vision.
15. Use standard components where possible.
16. Maximise part symmetry.
17. Design parts that cannot be installed incorrectly (use Poka Yoke principles).
18. Minimise handling and reorientation of parts.
19. Design parts for ease of handling from bulk (avoid nesting, tangling).
20. Design parts to be stiff and rigid, not brittle or fragile.
21. Design parts to be self-aligning and self-locating (tapers, chamfers, radii).
22. Use good detail design for assembly.
23. Avoid burrs and flash on component parts.
DFM similarly provides guidelines aimed at the component rather than assembly level for
product design that eases manufacturing and lowers cost. These rules include:
1. Identify critical characteristics (tolerances, surface finishes).
2. Identify factors that influence manufacture of the critical characteristics.
3. Estimate manufacturing costs.
4. Minimise component cost.
5. Establish maximum tolerances for each characteristic.
6. Determine process capability to achieve characteristics early in the design process.
7. Avoid tight tolerances.
8. Design the part to be easily inspectable.
9. Minimise the number of machined surfaces.
10. Minimise the number of reorientations during manufacture (especially during machining).
11. Use standard manufacturing processes where possible.
12. Use generous radii/fillets on castings, mouldings and machined parts.
13. Avoid secondary processes.

14. Design parts for ease of tooling/jigging/fixturing using standard systems.


15. Utilise the special characteristics of processes (e.g. moulded inserts, colours).
16. Use good detail design for manufacture and conform to drafting standards.
The CSC Method for DFA comprises four stages.
1. Functional Analysis Each part is classified as essential or non-essential and given this
awareness, redesign can be evolved around the essential components from which reduced
part count normally results.
2. Manufacturing Analysis Component costs are calculated considering materials, manufacturing processes and aspects such as complexity, volume and tolerance. This allows part-count reduction ideas to be tested, since combining components can lead to complexity and changes to manufacturing processes.
3. Handling Analysis Components must be correctly orientated before assembly can take
place. This analysis helps to determine the difficulty of this and Poka Yoke can be used to
ensure zero defects.
4. Fitting Analysis An assembly sequence plan must be constructed and the difficulty of
assembling each part rated.
It is good practice to have at least 60% of parts be essential.
Handling and fitting ratios should be under 2.5.
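These CSC-style metrics can be sketched as a simple scorecard; the parts list and ratings below are invented, while the 60% and 2.5 thresholds come from the notes:

```python
# Invented parts list: (name, essential?, handling rating, fitting rating).
parts = [
    ("base plate", True, 1.2, 1.5),
    ("motor", True, 1.8, 2.0),
    ("cover screw", False, 2.6, 3.1),
    ("cover", True, 1.5, 2.2),
    ("label", False, 2.0, 1.0),
]

essential_fraction = sum(p[1] for p in parts) / len(parts)
avg_handling = sum(p[2] for p in parts) / len(parts)
avg_fitting = sum(p[3] for p in parts) / len(parts)

print(f"Essential parts: {essential_fraction:.0%} (target >= 60%)")
print(f"Handling ratio:  {avg_handling:.2f} (target < 2.5)")
print(f"Fitting ratio:   {avg_fitting:.2f} (target < 2.5)")
# Non-essential parts such as the screw and label become redesign candidates.
```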
The potential benefits of DFM/DFA are: reduced part count; systematic component costing and process selection; lower component and assembly costs; standardised components, assembly sequences and methods across product families, leading to improved reproducibility; faster product development and reduced time to market; and a lower level of engineering changes, modifications and concessions.
DFM/DFA can be applied relatively easily but there are important issues in its application:
1. It should be management led.
2. Training is required before use.
3. Resources consumed in product development can be significant.
4. A team-based application and systematic approach is practically essential.
5. The approach contains many subjective analysis processes.
6. Manufacturing and technical feasibility of new design solutions need to be validated.
7. Early life failures, caused by latent defects may not be detected.
Poka Yoke
Poka Yoke is the drive toward zero defects or mistake/fool-proofing across the product
development process.
It is based on two primary principles: designing a process so that a defect cannot be made, and designing a product or process so that if a defect is made it is immediately obvious and can therefore be immediately corrected.
The process of Poka Yoke involves:
Self-Checks on the basis that the best person to detect mistakes is the operator.
Successive Checks In an assembly line a check should be carried out by the next operator.
Source Inspections The best time to detect a mistake is immediately as it happens and
therefore a system should be designed to highlight or prevent mistakes.
There are three general categories of Poka Yoke devices:
1. Contact Type The use of shape, dimensions or other physical properties to detect the
contact or non-contact of a particular feature.
2. Constant Number Type Detects errors if a fixed number of parts have not been made.
3. Performance Sequence Type Detects errors if the fixed steps in a sequence have not been
performed or in the wrong order.
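A constant-number type check can be sketched in code (the kit definition and part names below are invented for illustration):

```python
# Constant-number poka yoke: an assembly kit must contain exactly the required
# count of each part before the operation is allowed to proceed.
REQUIRED = {"screw": 4, "washer": 4, "clip": 2}  # invented kit definition

def kit_ok(picked: dict) -> bool:
    """Shut-out style check: refuse to proceed unless counts match exactly."""
    return picked == REQUIRED

print(kit_ok({"screw": 4, "washer": 4, "clip": 2}))  # True
print(kit_ok({"screw": 3, "washer": 4, "clip": 2}))  # False: a screw is missing
```

In shut-out mode this check would physically or logically block the next step; in attention mode it would merely raise an alarm.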
The two recognised ways in which Poka Yoke may be activated include:
1. Shut-Out Activation Prevents incorrect action from taking place.
2. Attention Activation Brings attention to an incorrect action but does not prevent its
execution.
Shut-out activation is preferred where possible because the opportunity for human error is
eliminated entirely.
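The contrast between the device categories and the two activation styles can be sketched in code. This is an illustrative example only: the gauge limits, function names and the idea of checking a length feature are invented for the sketch, not taken from the notes.

```python
# Hypothetical contact-type Poka Yoke check on a part dimension, showing
# shut-out activation (the incorrect action cannot proceed) versus
# attention activation (the mistake is flagged but execution continues).
# All names and limits below are illustrative.

def contact_check(length_mm, lower=49.8, upper=50.2):
    """Contact-type device: detect whether the length feature is in gauge."""
    return lower <= length_mm <= upper

def shut_out(length_mm):
    """Shut-out activation: prevent the incorrect action from taking place."""
    if not contact_check(length_mm):
        raise ValueError(f"Part rejected: length {length_mm} mm out of gauge")
    return "part accepted"

def attention(length_mm):
    """Attention activation: highlight the mistake but do not block it."""
    if not contact_check(length_mm):
        print(f"WARNING: length {length_mm} mm out of gauge - correct immediately")
    return "part processed"

shut_out(50.1)    # in gauge, proceeds silently
attention(50.6)   # out of gauge, prints a warning but still continues
```

The shut-out version raises an exception, which is the software analogue of a fixture that physically refuses a mis-oriented part.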
Life Cycle Assessment
Life Cycle Assessment (LCA) A design tool which attempts to evaluate the environmental
impact of a given product or process throughout its life, including raw material extraction,
manufacture, transport, use and disposal.
LCA is covered by international standards including ISO 14040:2006 and these standards
describe a four stage process including:
1. Goal and Scope Definition Involves establishing the principal choices of the study
including methodological choices, assumptions and limitations, particularly with regard to
system boundaries and impacts to be considered.
2. Inventory Analysis A life cycle inventory includes information on all of the environmental
inputs and outputs associated with a product or service.
3. Impact Assessment The inventory can be difficult to interpret so further analysis helps to
classify the inventory and comprises four steps:
Classification of substances into classes according to their environmental impacts. A 'big 6'
of environmental impact categories includes: acidification potential, eutrophication
potential, global warming potential, ozone depletion potential, photochemical ozone creation
potential and primary energy use.
Characterisation of impacts by multiplying by a factor which reflects their relative
contribution.
Normalisation of impacts by comparison to a reference value, for example the average
impact of a European citizen in one year.
Weighting of impact categories to generate a single score.
4. Interpretation Checks that the conclusions are adequately supported by the data and
procedures used in the study, including uncertainty analysis, sensitivity analysis and
contribution analysis.
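The characterisation, normalisation and weighting steps of the impact assessment stage can be sketched as a small calculation. Every number below (inventory quantities, impact factors, reference values and weights) is invented for illustration; real factors come from LCA databases and the chosen methodology.

```python
# Illustrative sketch of LCA impact assessment sub-steps.
# All factors, reference values and weights are hypothetical placeholders.

inventory = {"CO2": 120.0, "CH4": 2.0, "SO2": 0.5}   # kg emitted (invented)

# Characterisation: multiply each substance by a factor reflecting its
# relative contribution to an impact category.
gwp_factors = {"CO2": 1.0, "CH4": 25.0}              # kg CO2-eq per kg (placeholder)
gwp = sum(inventory[s] * f for s, f in gwp_factors.items())  # 120 + 50 = 170
ap = inventory["SO2"] * 1.0                          # kg SO2-eq (placeholder factor)

# Normalisation: compare each impact to a reference value, e.g. the average
# annual impact of one citizen (reference values invented here).
normalised = {"GWP": gwp / 9000.0, "AP": ap / 50.0}

# Weighting: combine the normalised categories into a single score.
weights = {"GWP": 0.7, "AP": 0.3}
single_score = sum(normalised[c] * w for c, w in weights.items())
print(single_score)
```

The single score is convenient for comparing design alternatives, but the weighting step is inherently subjective, which is one reason the interpretation stage includes sensitivity analysis.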
There are variants in the scope of a LCA according to the aspect of the life cycle being
examined including:
Cradle-to-Grave Full LCA from resource extraction to disposal.
Cradle-to-Gate Partial LCA from resource extraction to the factory gate before
transportation to the end user.
Cradle-to-Cradle As cradle-to-grave but the disposal phase involves a recycling process.
Gate-to-Gate Partial LCA examining only one process in a production chain.
Well-to-Wheel A specific LCA used for transport fuels and vehicles.
LCAs can be very complex, and a range of software tools is available to aid in the task.
LCA can be time-consuming and costly and the results may be difficult to interpret. Another
difficulty is that the required information to carry out an LCA may not be available in the
early stages of a design project.
Design for Sustainability
A set of design principles to ensure that a product is likely to be sustainable.
The key challenges of sustainable product design include:
1. Eliminating the use of non-renewable natural resources.
2. Eliminating disposal of synthetic and inorganic materials that do not decay quickly.
3. Eliminating creation of toxic wastes that are not part of natural life cycles.
The internal drivers for sustainable design include public image, operational safety,
employee motivation, ethical responsibility and influencing of consumer behaviour.
The external drivers for sustainable design include environmental legislation, market
demand, competition, trade organisations, suppliers and social pressures.
The United Nations Environment Programme sums up approaches to sustainable
development as six 'Re's:

1. Re-think the product and its functions. For example the product may be used more
efficiently.
2. Re-place harmful substances with safer alternatives.
3. Re-pair Make the product easy to repair.
4. Re-use Design the product for disassembly so parts may be reused.
5. Re-duce energy, material consumption and socio-economic impacts throughout a product's
life cycle.
6. Re-cycle Select materials that can be recycled.
Some of these principles may be in conflict with other design principles such as design for
assembly.
Robust Design
Robust design creates performance characteristics that are very insensitive to variations in
the manufacturing process, and other variations relating to the environment and time.
Robust design is the design of a product or process that results in functionally acceptable
products within economic tolerances.
Robust design improves product quality by reducing the effects of variability.
When design is not robust, we see overly specified tolerances, uncontrollable variance in
production, slow ramp-up times, high scrap rates and reduced quality in the product.
There are several approaches to robust design but we consider Taguchi's Tolerance Design.
Taguchi's approach relates what he calls quality loss to allocated tolerances in a product with
the goal of minimising the total cost, which comprises the cost of quality loss due to
variation and the cost to control the tolerances.
Taguchi's methodology is based on the precept that the lowest cost to society represents the
product with the highest quality, which is achieved by reducing variation in product
characteristics. This approach is expressed by the loss function.
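For a nominal-the-best characteristic, Taguchi's loss function is commonly written as L(y) = k(y - m)^2, where m is the target value and k is derived from the cost incurred when the characteristic reaches the tolerance limit. The sketch below uses this common form with invented cost and tolerance figures; treat the numbers as placeholders.

```python
# Sketch of Taguchi's quadratic quality loss function, L(y) = k * (y - m)^2.
# A0 is the cost of a failure at the tolerance limit, D0 the tolerance
# half-width, so k = A0 / D0^2. All figures below are illustrative.

def quality_loss(y, target, k):
    """Monetary loss attributed to a deviation of y from the target value."""
    return k * (y - target) ** 2

A0 = 50.0          # cost when the characteristic sits at the tolerance edge
D0 = 0.5           # tolerance half-width
k = A0 / D0 ** 2   # 50 / 0.25 = 200

print(quality_loss(10.0, 10.0, k))   # on target: loss is 0.0
print(quality_loss(10.5, 10.0, k))   # at the limit: loss equals A0 = 50.0
```

Unlike a pass/fail tolerance view, the loss is non-zero for any deviation from target, which is what drives the emphasis on reducing variation rather than merely staying inside specification limits.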
Design of Experiments
The structured planning of experiments so that the maximum information about factor
effects is obtained for the minimum cost and time.
Six Sigma Process Capability
Developed by Motorola in the 1980s as a set of quality management methods and tools,
including statistical methods, that aim to improve manufacturing quality by identifying
and removing the causes of defects and minimising variability in manufacturing and
business processes.
Capability Study A statistical tool that measures the variations within a manufacturing
process. Samples of the product are taken, measured and the variation is compared with a
tolerance or specification limit.
Process capability is attributable to a combination of the variability in all of the inputs.
Machine capability is computed when the rest of the inputs are fixed.
A capability study can be carried out on any of the inputs by fixing all of the others.
There are five occasions when capability studies should be carried out:
1. Before the machine/process is bought (to see if it is capable of operating within your
specification).
2. When it is installed.
3. At regular intervals to check that the process is giving the performance required.
4. If the operating conditions change (e.g. materials, lubrication).
5. As part of a process capability improvement.
There are two main kinds of variability:
Common Cause or Inherent Variability due to the set of factors that are inherent in a
machine/process by virtue of its design, construction and the nature of its operation e.g.
positional repeatability, machine rigidity, which cannot be removed without undue expense
or process redesign.
Assignable-Cause or Special-Cause Variability due to identifiable sources that can be
systematically identified and eliminated.
When only common cause is present the process is operating at its best and this variability
must be within the set tolerance limits.
The general factors influencing variability include tool and functional accuracy, operator,
set-up errors and deformation due to mechanical and thermal effects, measurement errors,
material impurities, specifications, equipment, method or job instructions and environment.
Cp is a means of quantifying a process in order to verify that the characteristics are within
specification, where the process distribution is symmetrical about the target value.
Cpk uses the mean value to calculate where the process is centred as well as its distribution.
For both indices, capability values between 1.33 and 2.5 indicate a capable process, with
anything lower being unacceptable.
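The two indices can be computed directly from sample measurements: Cp compares the tolerance width to the process spread (six standard deviations), while Cpk also accounts for how well the process is centred. The measurement data and tolerance below are invented for illustration.

```python
import statistics

# Sketch of process capability indices from a sample of measurements.
# Cp = (USL - LSL) / (6 * sigma); Cpk takes centring into account.

def cp(samples, lsl, usl):
    sigma = statistics.stdev(samples)
    return (usl - lsl) / (6 * sigma)

def cpk(samples, lsl, usl):
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    return min((usl - mu) / (3 * sigma), (mu - lsl) / (3 * sigma))

# Hypothetical measurements of a 10.0 +/- 0.3 mm feature.
data = [10.02, 9.98, 10.05, 9.95, 10.01, 9.99, 10.03, 9.97]
print(cp(data, 9.7, 10.3))
print(cpk(data, 9.7, 10.3))
```

Because the sample here happens to be centred on the target, Cp and Cpk coincide; a process that drifts off-centre keeps the same Cp but sees its Cpk fall.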
Quality Function Deployment
QFD is used to understand and quantify the importance of customer needs and requirements,
and to support the definition of product and process requirements.
It organises requirements into a series of charts known as the House of Quality.
The objectives of QFD are to:
1. Prioritise customer requirements and relate them to all stages of product development.
2. Focus resources on the aspects of the product that are important for customer satisfaction.
3. Provide a structured, team-driven product development process.
These are achieved by cascading the customer requirements (also known as 'the voice of the
customer') down through the product development process in four separate phases:
1. Product Planning Customer needs from market research information and the product
specification are ranked in a matrix against the important design requirements, yielding a
numerical quantification of the relationship between the important customer requirements
and product design issues. Combined with the customer's importance rating for each
requirement, a points rating for each design requirement is determined.
2. Design Deployment The critical design requirements from phase 1 are ranked with design
characteristics where the relationship between them is again quantified, yielding important
interface issues between design and manufacture.
3. Process Planning The important design characteristics from phase 2 are ranked with key
process operations, where quantification yields actions to improve the understanding of the
processes involved and gain necessary expertise early on.
4. Production Planning The critical process operations from phase 3 are ranked with
production requirement issues, ultimately translating the important customer requirements to
production planning and establishing important actions to be taken.
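The Phase 1 scoring described above is a weighted matrix calculation: each design requirement's points rating is the sum, over customer requirements, of customer importance multiplied by relationship strength. The 0/1/3/9 relationship scale is a common QFD convention, but all of the requirements and scores below are invented for illustration.

```python
# Hypothetical sketch of QFD Phase 1 (product planning) scoring.
# points(design req) = sum over customer reqs of importance x relationship.

customer_importance = {"easy to hold": 5, "long battery life": 4, "low cost": 3}

# relationship[customer requirement][design requirement], 0/1/3/9 scale
relationship = {
    "easy to hold":      {"case geometry": 9, "battery capacity": 0, "part count": 1},
    "long battery life": {"case geometry": 1, "battery capacity": 9, "part count": 0},
    "low cost":          {"case geometry": 3, "battery capacity": 3, "part count": 9},
}

design_reqs = ["case geometry", "battery capacity", "part count"]
ratings = {
    d: sum(customer_importance[c] * relationship[c][d] for c in customer_importance)
    for d in design_reqs
}
print(ratings)  # the highest-scoring design requirements get priority
```

The same importance-times-relationship cascade is repeated in phases 2 to 4, each time carrying the highest-scoring items forward as the inputs to the next matrix.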
The potential benefits of QFD include:
Improved product quality.
Reduced design changes and associated costs.
Reduced lead times.
Improved documentation and traceability.
Promotes team-working.
Provides a structured approach to product development.
Improves customer/supplier relationship.
Key issues that have to be considered are:
QFD can be used on products, services or software.
Its application should be management led and there should be an overall strategy for
implementation and application.
Training is required to use the analysis initially.
Multi-disciplinary teams based application is the best option.
Use of the approach can be subjective and tedious.
Organisations usually do not extend the use of QFD past the first phase.
Involvement of customers and suppliers is very important.
Chart development should be reviewed at regular intervals with customers and suppliers.
Identification of customer needs is sometimes difficult.
QFD should be supported with existing data wherever possible.
QFD can help to build up a knowledge base for product families, although:
The approach is applied too late in many cases.
In summary, QFD helps to identify where an organisation has a competitive advantage,
where it can gain a competitive advantage, and where it lags and must improve.
Failure Modes and Effects Analysis
FMEA is a systematic element by element assessment to highlight the effects of a
component, product or system failure to meet all requirements of a customer specification,
including safety.
It helps to indicate by high scores those elements that require priority to prevent failure.
The following factors are assessed in FMEA:
1. Potential Failure Mode How could the component, product, process or system element fail
to meet each aspect of the specification?
2. Potential Effects of Failure What would be the consequence of the failure?
3. Potential Causes of Failure What would make a failure occur in the way suggested by the
failure mode?
4. Current Controls What is done at present to reduce the chance of this failure occurring?
5. Occurrence (O) Probability that a failure will take place, given that there is a fault.
6. Severity (S) The effect the failure has on the user/environment, if the failure happens.
7. Detectability (D) Probability that the fault will go undetected before failure takes place.
The risk priority number (RPN) = O x S x D.
