Description

Introduction

Concept Evaluation implies both comparison and decision making.

The Goal: To expend the least amount of resources on deciding which concepts have the highest potential for becoming a quality product.

The Difficulty: To choose the best concept with very limited knowledge and data on which to base the selection.

Design is learning, and resources are limited.

The greater the knowledge about a concept, the fewer the surprises.

Introduction (cont.)

Two Types of Comparisons

Absolute: An alternative concept is compared directly with a target set by a criterion.

Relative: Alternatives are compared with each other using measures defined by the criteria. This is possible only when there is more than one option.

For comparisons, the alternatives and criteria must be:

– In the same language (e.g., a quantitative measure in meters vs. a qualitative judgment such as "long")

– At the same level of abstraction

Evaluation based on Feasibility Judgment

Three immediate reactions a designer has as a concept is generated, based on the designer's "gut feel":

– It is not feasible.

– It might work if something else happens.

– It is worth considering.

This is a comparison based on experience and knowledge.

Evaluation based on Feasibility Judgment

Implications of Each of these Reactions:

– It is not feasible.

• Before discarding an idea, ask “Why is it not feasible?”

- Technologically infeasible

- Not meeting customer’s requirements

- The concept is different from established designs

- NIH ("not invented here")

• Make sure not to discard an idea merely because:

– the concept is similar to ones that are already established, or

– the concept was not invented here (and is therefore less ego-satisfying).

Evaluation based on Feasibility Judgment

– It is conditional.

• The concept is judged workable only if something else happens.

• Such factors include the readiness of a technology, the possibility of obtaining currently unavailable information, or the development of some other part of the product.

Evaluation based on Feasibility Judgment

– It is worth considering.

• The hardest concept to evaluate is one that is not obviously a good idea or a bad one, but looks worth considering.

• Evaluating such a concept requires engineering knowledge and experience. If sufficient knowledge is not immediately available, it must be developed using models or prototypes that can be evaluated easily.

Evaluation based on GO/NO-GO Screening

Measures for deciding to go or no-go:

1– Criteria defined by the customer requirements:

• Absolute evaluation by comparing each alternative concept with the customer requirements.

• A concept with only a few no-go responses may be worth modifying rather than eliminating.

• This type of evaluation not only weeds out designs that should not be considered further, but also helps generate new ideas.
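As a sketch of this screening step, the loop below classifies each concept from its go/no-go answers against the customer requirements. The requirements, concepts, and the one-no-go modification threshold are all hypothetical, chosen only to illustrate the mechanics:

```python
# Hypothetical go/no-go screening sketch: each concept gets a go/no-go
# answer for every customer requirement; a concept with only a few
# no-go responses is flagged for modification rather than elimination.

requirements = ["fits in hand", "costs < $10", "battery life > 8 h"]

# Hypothetical screening answers: True = go, False = no-go.
concepts = {
    "A": [True, True, True],
    "B": [True, False, True],    # one no-go: candidate for modification
    "C": [False, False, False],  # fails everything: eliminate
}

def screen(answers, modify_threshold=1):
    """Classify a concept from its go/no-go answers."""
    no_gos = answers.count(False)
    if no_gos == 0:
        return "go"
    if no_gos <= modify_threshold:
        return "modify"
    return "no-go"

for name, answers in concepts.items():
    print(name, screen(answers))
```

The "modify" category is what keeps this screening generative: a near-miss concept prompts the question of how to remove its single no-go.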

Evaluation based on GO/NO-GO Screening

2– Readiness of the technologies used:

• This technique refines the evaluation by forcing an absolute comparison with state-of-the-art capabilities.

• The technology must be mature enough that its use is a design issue, not a research issue.

• There is a high incentive to include new technologies in products.

Evaluation based on GO/NO-GO Screening

• Six measures of a technology's maturity:

– Are the critical parameters that control the function identified?

– Are the safe operating latitude and sensitivity of the parameters known?

– Have the failure modes been identified?

– Can the technology be manufactured with known processes?

– Does hardware exist that demonstrates positive answers to the preceding four questions?

– Is the technology controllable through the product’s life cycle?

• If any of these questions cannot be answered in the affirmative, add a consultant or vendor to the team.

Evaluation based on a Basic Decision Matrix

Decision-Matrix Method (or Pugh's Method):

Select decision criteria

Formulate the decision matrix

Clarify the design concepts being evaluated

Choose a "datum," the best initial concept

Compare the other concepts to the datum on a +, -, S scale

Evaluate the ratings: it is important to discuss each concept's strengths and weaknesses. Good discussion can lead to new, combined, better solution concepts

Select a new datum concept and rerun the analysis

Plan further work; new needs for information and concepts often come from the first meeting

Consider combining strengths of various concepts and rerunning with the new concepts

Hold a second working session to repeat the above and select a concept
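The scoring step of Pugh's method can be sketched as a small script. The criteria, concept names, and ratings below are illustrative, not from the source; each concept is rated "+" (better than the datum), "-" (worse), or "S" (same) per criterion:

```python
# Minimal sketch of Pugh decision-matrix scoring with hypothetical data.

criteria = ["cost", "ease of use", "durability"]

# Ratings relative to the datum; the datum itself scores "S" everywhere.
ratings = {
    "datum":     ["S", "S", "S"],
    "concept 1": ["+", "S", "-"],
    "concept 2": ["+", "+", "S"],
}

def totals(scores):
    """Return (number of +, number of -, net score) for one concept."""
    plus = scores.count("+")
    minus = scores.count("-")
    return plus, minus, plus - minus

for name, scores in ratings.items():
    plus, minus, net = totals(scores)
    print(f"{name}: +{plus} / -{minus}, net {net}")
```

The totals only rank concepts; as the steps above stress, the discussion of why a concept earned each "+" or "-" is where combined, better concepts come from.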

Evaluation based on a Weighted Decision Matrix

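The deck's weighted-matrix tables are not reproduced in this export; a minimal computational sketch of the method follows, with hypothetical weights and 1-5 ratings. Each rating is multiplied by its criterion's importance weighting, and the weighted sums are compared:

```python
# Sketch of a weighted decision matrix (hypothetical weights and ratings).

weights = {"cost": 0.5, "ease of use": 0.3, "durability": 0.2}

# Ratings on a 1-5 scale (illustrative values).
ratings = {
    "concept 1": {"cost": 4, "ease of use": 3, "durability": 2},
    "concept 2": {"cost": 3, "ease of use": 5, "durability": 4},
}

def weighted_score(concept_ratings, weights):
    """Sum of rating x importance weighting over all criteria."""
    return sum(weights[c] * r for c, r in concept_ratings.items())

scores = {name: weighted_score(r, weights) for name, r in ratings.items()}
best = max(scores, key=scores.get)
print(scores, "->", best)
```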

Robust Decision Making

Robust decision making refers to making decisions that are as insensitive as possible to the uncertainty, incompleteness, and evolution of the information on which they are based.

For robust decision making, we need to improve the method used to evaluate the alternatives (step 4 in the decision-matrix method).

Word Equations used for Robust Decision Making

– Satisfaction = belief that an alternative meets the criteria

– Belief = knowledge + confidence

• Belief is the confidence placed on an alternative’s ability to meet a target set by a criterion, requirement, or specification, based on current knowledge.

• Belief (virtual sum of knowledge and confidence) can be expressed on a “Belief map.”

Belief Map

Belief Map (cont.)

Belief Map (cont.): the corners of the map correspond to belief = 1 (high knowledge, high confidence), belief = 0.5 (low knowledge, at either confidence level), and belief = 0 (high knowledge, low confidence).

Evaluation based on Advanced Decision Matrix

Steps 1 through 3: same as the Decision Matrix Method

Step 4: Evaluate Alternatives

– Use a belief map for comparison

– If little is known or the evaluation result is that the alternative possibly meets the criterion, then belief = 0.5

Step 5: Compute Satisfaction

– Satisfaction = Σ (belief × importance weighting)

• Max satisfaction = 100 (the evaluator is 100% satisfied).
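Step 5 can be sketched as follows. The criteria, the weights (chosen to sum to 100 so that maximum satisfaction is 100), and the belief values are illustrative assumptions, not from the source:

```python
# Sketch of step 5 of the advanced decision-matrix method:
# satisfaction = sum over criteria of (belief x importance weighting).
# With weights summing to 100, the maximum satisfaction is 100.

weights = {"meets load spec": 40, "low cost": 35, "manufacturable": 25}

# Belief that the alternative meets each criterion, read off a belief
# map; belief = 0.5 when little is known about the alternative.
beliefs = {"meets load spec": 0.8, "low cost": 0.5, "manufacturable": 1.0}

def satisfaction(beliefs, weights):
    """Sum of belief x importance weighting over all criteria."""
    return sum(beliefs[c] * w for c, w in weights.items())

print(satisfaction(beliefs, weights))
```

Setting every belief to 1.0 recovers the maximum satisfaction of 100, matching the note above.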


Evaluation based on Analytic Hierarchy Process

Use Saaty’s fundamental scale for pairwise comparison

Determine weighting factors on criteria

Determine ratings for each concept relative to each criterion, either by fractional quantitative or qualitative ranking, or by pairwise comparison between concepts for each criterion.

Create decision matrix

The concept with the highest weighted sum is selected.

Software: Expert Choice
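As a sketch of the weighting step, the code below derives criterion weights from a hypothetical pairwise-comparison matrix on Saaty's 1-9 scale, using the geometric-mean approximation to the principal eigenvector (a common stand-in for the full eigenvector computation that tools like Expert Choice perform):

```python
# Sketch of AHP criterion weighting from a pairwise-comparison matrix.
import math

# a[i][j] = how much more important criterion i is than criterion j
# (illustrative values on Saaty's 1-9 scale for cost, ease of use,
# durability). The matrix must be reciprocal: a[j][i] = 1 / a[i][j].
a = [
    [1,     3,   5],
    [1 / 3, 1,   2],
    [1 / 5, 1 / 2, 1],
]

def ahp_weights(matrix):
    """Normalized geometric means of the rows (approximate eigenvector)."""
    gm = [math.prod(row) ** (1 / len(row)) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

w = ahp_weights(a)
print([round(x, 3) for x in w])
```

The same ratio-scale machinery is then applied to rate the concepts against each criterion before forming the weighted sums.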

Evaluation based on Analytic Hierarchy Process


Decision Management Method Selection Logic

Information Presentation in Concept Evaluation

There are two ways to present the information in concept evaluation:

Design-build-test cycle: building physical models or prototypes.

– For new technology or complex known technology.

Design-test-build cycle: developing analytical models and simulating (i.e., testing) the concept before anything is built.

– For systems that are understood and can be modeled mathematically.

Concept Generation and Selection - MTU
