Concept Evaluation And Selection


Product Design & Development

1. Concept Evaluation and Selection

   Prepared by: We’am Obaidat
   Supervised by: Dr. Abdullah Dwairi

2. Introduction

   - Concept evaluation implies both comparison and decision making.
   - The goal: to expend the least amount of resources on deciding which concepts have the highest potential for becoming a quality product.
   - The difficulty: to choose the best concept with very limited knowledge and data on which to base the selection.
   - Design is learning, and resources are limited.
   - The greater the knowledge about a concept, the fewer the surprises.

3. Introduction (cont.)

   - Two types of comparisons:
     - Absolute: each alternative concept is compared directly with a target set by a criterion.
     - Relative: alternatives are compared with each other using measures defined by the criteria. Possible only when there is more than one option.
   - For comparisons, the alternatives and criteria must be:
     - In the same language (e.g., "2 meters" vs. "long")
     - At the same level of abstraction

4. Concept Evaluation Techniques

   There are many techniques used to evaluate concepts, such as:
   - Feasibility judgment
   - GO/NO-GO screening
   - Basic decision matrix
   - Weighted decision matrix
   - Advanced decision matrix
   - Analytic Hierarchy Process (AHP)

5. Concept Evaluation Techniques (overview figure not reproduced)

6. Evaluation Based on Feasibility Judgment

   A designer has three immediate reactions, based on "gut feel," as a concept is generated:
   - It is not feasible.
   - It might work if something else happens.
   - It is worth considering.

   This is a comparison based on experience and knowledge.

7. Evaluation Based on Feasibility Judgment (cont.)

   Implications of each reaction:
   - It is not feasible:
     - Before discarding an idea, ask "Why is it not feasible?" Possible reasons:
       - It is technologically infeasible.
       - It does not meet the customer's requirements.
       - It is simply different from established concepts.
       - It was not invented here (NIH).
     - Make sure not to discard an idea merely because:
       - the concept is different from ones that are already established, or
       - the concept was not invented here (and is therefore less ego-satisfying).

8. Evaluation Based on Feasibility Judgment (cont.)

   - It is conditional:
     - The concept is judged workable if something else happens.
     - Conditioning factors include the readiness of a technology, the possibility of obtaining currently unavailable information, or the development of some other part of the product.

9. Evaluation Based on Feasibility Judgment (cont.)

   - It is worth considering:
     - The hardest concept to evaluate is one that is neither obviously good nor obviously bad, but looks worth considering.
     - Evaluating such a concept requires engineering knowledge and experience. If sufficient knowledge is not immediately available, it must be developed using models or prototypes that are easily evaluated.

10. Evaluation Based on GO/NO-GO Screening

   Measures for deciding go or no-go:

   1. Criteria defined by the customer requirements:
      - Absolute evaluation: compare each alternative concept with the customer requirements.
      - A concept with only a few no-go responses may be worth modifying rather than eliminating.
      - This type of evaluation not only weeds out designs that should not be considered further, but also helps generate new ideas.

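The customer-requirements screen above can be sketched as a simple keep/modify/eliminate filter. The requirement names, concept answers, and the threshold for "worth modifying" below are all hypothetical illustration choices, not part of the method's definition.

```python
# GO/NO-GO screening sketch: each concept gets a go (True) / no-go (False)
# answer per customer requirement. A concept with only a few no-gos is
# flagged for modification rather than elimination, as the slide suggests.
# Requirement names, answers, and the threshold are hypothetical.

def screen(concepts, modify_threshold=2):
    """concepts: {name: {requirement: True (go) or False (no-go)}}"""
    verdicts = {}
    for name, answers in concepts.items():
        no_gos = [req for req, ok in answers.items() if not ok]
        if not no_gos:
            verdicts[name] = ("keep", no_gos)
        elif len(no_gos) <= modify_threshold:
            verdicts[name] = ("modify", no_gos)   # worth reworking, not discarding
        else:
            verdicts[name] = ("eliminate", no_gos)
    return verdicts

concepts = {
    "Concept A": {"easy to open": True, "low cost": True, "stackable": True},
    "Concept B": {"easy to open": False, "low cost": True, "stackable": True},
    "Concept C": {"easy to open": False, "low cost": False, "stackable": False},
}
for name, (verdict, failed) in screen(concepts).items():
    print(name, verdict, failed)
```

Listing the failed requirements alongside each verdict supports the slide's point that screening also generates ideas: the no-go list tells you exactly what a modified concept must fix.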
11. Evaluation Based on GO/NO-GO Screening (cont.)

   2. Readiness of the technologies used:
      - This measure refines the evaluation by forcing an absolute comparison with state-of-the-art capabilities.
      - The technology must be mature enough that its use is a design issue, not a research issue.
      - There is a high incentive to include new technologies in products, which makes this check important.

12. Evaluation Based on GO/NO-GO Screening (cont.)

   Six measures of a technology's maturity:
   - Are the critical parameters that control the function identified?
   - Are the safe operating latitude and the sensitivity of the parameters known?
   - Have the failure modes been identified?
   - Can the technology be manufactured with known processes?
   - Does hardware exist that demonstrates positive answers to the preceding four questions?
   - Is the technology controllable throughout the product's life cycle?

   If these questions cannot all be answered in the positive, a consultant or vendor should be added to the team.

13. Evaluation Based on a Basic Decision Matrix

   Decision-matrix method (Pugh's method):
   1. Select the decision criteria.
   2. Formulate the decision matrix.
   3. Clarify the design concepts being evaluated.
   4. Choose a "datum," the best initial concept.
   5. Compare the other concepts to the datum on a +, S, - scale (better, same, worse).
   6. Evaluate the ratings; it is important to discuss each concept's strengths and weaknesses. Good discussion can lead to new, combined, better solution concepts.
   7. Select a new datum concept and rerun the analysis.
   8. Plan further work; new needs for information and new concepts often come from the first meeting.
   9. Hold a second working session to repeat the steps above and select a concept.

14. Example: Basic Decision Matrix

   Each concept is rated against the datum: + (better), S (same), - (worse).

   | Criterion                 | Importance | Con. 1 | Con. 2 | Con. 3 | Con. 4 | Con. 5 |
   |---------------------------|------------|--------|--------|--------|--------|--------|
   | Manufacturing cost        | 25         | S      | -      | S      | -      | S      |
   | Easier opening            | 15         | +      | S      | S      | S      | +      |
   | Easier to remove leaflet  | 9          | S      | S      | -      | +      | -      |
   | Easier to remove CD       | 15         | S      | +      | +      | +      | S      |
   | Hinge doesn't come apart  | 10         | +      | S      | S      | S      | +      |
   | Stacking stability        | 9          | S      | S      | S      | S      | +      |
   | More secure locking       | 10         | +      | S      | +      | +      | +      |
   | Fits hand better          | 7          | S      | +      | +      | S      | +      |
   | Total +                   |            | 3      | 2      | 3      | 3      | 5      |
   | Total -                   |            | 0      | 1      | 1      | 1      | 1      |
   | Overall total             |            | 3      | 1      | 2      | 2      | 4      |
   | Weighted total            |            | 35     | -3     | 23     | 9      | 42     |

15. Evaluation Based on a Basic Decision Matrix (cont.)

   Notes on the example above:
   - Value of S = 0, + = +1, - = -1.
   - Overall total for concept 3 = (number of +) - (number of -) = 3 - 1 = 2.
   - Weighted total for concept 2 = 25*(-1) + 15*0 + 9*0 + 15*1 + 10*0 + 9*0 + 10*0 + 7*1 = -3.
   - From the table, concept 5 is the best (highest overall and weighted totals).

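The totals in the notes above can be reproduced mechanically. The sketch below scores +, S, - as +1, 0, -1 and computes the four totals for concept 2 of the CD-case example; the criteria, importances, and ratings are taken from the slides.

```python
# Basic (Pugh) decision matrix: rate each concept against the datum with
# '+' (better, +1), 'S' (same, 0), '-' (worse, -1), then total the scores.

SCORE = {"+": 1, "S": 0, "-": -1}

def pugh_totals(criteria, ratings):
    """criteria: {criterion: importance}; ratings: {concept: {criterion: mark}}"""
    results = {}
    for concept, marks in ratings.items():
        plus = sum(1 for m in marks.values() if m == "+")
        minus = sum(1 for m in marks.values() if m == "-")
        weighted = sum(criteria[c] * SCORE[m] for c, m in marks.items())
        results[concept] = {"total+": plus, "total-": minus,
                            "overall": plus - minus, "weighted": weighted}
    return results

criteria = {"Manufacturing cost": 25, "Easier opening": 15,
            "Easier to remove leaflet": 9, "Easier to remove CD": 15,
            "Hinge doesn't come apart": 10, "Stacking stability": 9,
            "More secure locking": 10, "Fits hand better": 7}

ratings = {"Con. 2": {"Manufacturing cost": "-", "Easier opening": "S",
                      "Easier to remove leaflet": "S", "Easier to remove CD": "+",
                      "Hinge doesn't come apart": "S", "Stacking stability": "S",
                      "More secure locking": "S", "Fits hand better": "+"}}

print(pugh_totals(criteria, ratings)["Con. 2"]["weighted"])  # -3, as in the notes
```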
16. Evaluation Based on a Weighted Decision Matrix

   1. Develop a criteria weighting matrix.
   2. Select an interval scale for evaluation scoring.
   3. Create the weighted decision matrix and sum the weighted evaluations.
   4. Select the concept with the highest value.
   5. Consider combining the strengths of various concepts and rerunning the analysis with the new concepts.

17-19. Evaluation Based on a Weighted Decision Matrix (worked example; figures not reproduced)

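The scoring step of the weighted decision matrix (steps 2 to 4 above) can be sketched as follows. The criteria, weights, and scores are hypothetical illustration values, with the weights normalized to sum to 1.

```python
# Weighted decision matrix sketch: score each concept on an interval scale
# per criterion, multiply by the criterion weight, and sum.
# Criteria, weights (summing to 1), and 0-10 scores are hypothetical.

def weighted_scores(weights, scores):
    """weights: {criterion: weight}; scores: {concept: {criterion: score}}"""
    return {concept: sum(weights[c] * s for c, s in per_crit.items())
            for concept, per_crit in scores.items()}

weights = {"cost": 0.4, "ease of use": 0.35, "durability": 0.25}
scores = {
    "Concept A": {"cost": 7, "ease of use": 5, "durability": 9},
    "Concept B": {"cost": 5, "ease of use": 9, "durability": 6},
}
totals = weighted_scores(weights, scores)      # A: 6.8, B: 6.65
best = max(totals, key=totals.get)             # step 4: highest value wins
```

Unlike the basic matrix, which only ranks concepts relative to a datum, the interval scores here let the totals express how much better one concept is than another.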
20. Robust Decision Making

   - Robust decision making means making decisions that are as insensitive as possible to the uncertainty, incompleteness, and evolution of the information they are based on.
   - For robust decision making, we need to improve the method used to evaluate the alternatives (step 4 in the decision-matrix method).
   - Word equations used for robust decision making:
     - Satisfaction = belief that an alternative meets the criteria
     - Belief = knowledge + confidence
   - Belief is the confidence placed in an alternative's ability to meet a target set by a criterion, requirement, or specification, based on current knowledge.
   - Belief (the virtual sum of knowledge and confidence) can be expressed on a "belief map."

21. Belief Map (figure not reproduced)
22. Belief Map (cont.; figure not reproduced)
23. Belief Map (cont.)

   Corner values of the belief map: with little knowledge, belief = 0.5 regardless of confidence; with high knowledge, belief = 1 when confidence that the criterion is met is high, and belief = 0 when it is low.

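One way to make the belief map computable is to interpolate linearly between its corner values: with no knowledge, belief stays at the neutral 0.5; with full knowledge, belief simply follows confidence. This linear form is an assumed approximation for illustration, not a formula given on the slides.

```python
# Belief-map sketch: belief as a blend of knowledge (0..1) and confidence
# that the criterion is met (0..1). Linear interpolation of the corner
# values (an assumed approximation):
#   knowledge = 0  ->  belief = 0.5 regardless of confidence
#   knowledge = 1  ->  belief = confidence
def belief(knowledge, confidence):
    return 0.5 + knowledge * (confidence - 0.5)

assert belief(0.0, 1.0) == 0.5   # no knowledge: neutral belief
assert belief(1.0, 1.0) == 1.0   # full knowledge, fully confident it is met
assert belief(1.0, 0.0) == 0.0   # full knowledge, confident it is not met
```

This matches the advanced-matrix rule on the next slide: when little is known, the evaluation defaults to belief = 0.5.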
24. Evaluation Based on an Advanced Decision Matrix

   - Steps 1 through 3: same as the decision-matrix method.
   - Step 4: Evaluate the alternatives.
     - Use a belief map for the comparison.
     - If little is known, or the evaluation result is that the alternative only possibly meets the criterion, then belief = 0.5.
   - Step 5: Compute satisfaction.
     - Satisfaction = Σ (belief × importance weighting)
     - Maximum satisfaction = 100 (the evaluator is 100% satisfied).

25. Evaluation Based on an Advanced Decision Matrix (worked example; figure not reproduced)

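Step 5's word equation, Satisfaction = Σ (belief × importance), translates directly into code. The criteria, weights, and beliefs below are hypothetical, with the weights summing to 100 so that maximum satisfaction is 100, as stated above.

```python
# Advanced decision matrix sketch: satisfaction is the sum of belief
# (0..1, read from a belief map) times the importance weighting.
# With weights summing to 100, a concept whose every belief is 1 scores
# the maximum satisfaction of 100. Weights and beliefs are hypothetical.

def satisfaction(weights, beliefs):
    """weights: {criterion: importance}; beliefs: {criterion: belief in [0, 1]}"""
    return sum(weights[c] * beliefs[c] for c in weights)

weights = {"cost": 40, "reliability": 35, "ease of use": 25}   # sums to 100
beliefs = {"cost": 0.9, "reliability": 0.5, "ease of use": 0.75}
print(satisfaction(weights, beliefs))   # 36 + 17.5 + 18.75 = 72.25
```

Because an unknown criterion contributes belief = 0.5 rather than 0, the score degrades gracefully as information is missing, which is what makes the resulting decision robust.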
26. Evaluation Based on the Analytic Hierarchy Process (AHP)

   1. Use Saaty's fundamental scale for pairwise comparison.
   2. Determine the weighting factors for the criteria.
   3. Determine ratings for each concept relative to each criterion, by fractional quantitative or qualitative ranking, or by pairwise comparison between the concepts for each criterion.
   4. Create the decision matrix.
   5. Select the concept with the highest weighted sum.

   Software: Expert Choice

27-31. Evaluation Based on the Analytic Hierarchy Process (worked examples; figures not reproduced)

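Step 2 above, turning a Saaty pairwise-comparison matrix into criterion weights, is commonly approximated with the row geometric-mean method rather than a full eigenvector computation. The sketch below uses that approximation; the three criteria and the comparison values (from Saaty's 1-9 scale) are hypothetical.

```python
# AHP weighting sketch: approximate the priority vector of a Saaty
# pairwise-comparison matrix with the row geometric-mean method.
# a[i][j] = how much more important criterion i is than criterion j,
# so the matrix is reciprocal: a[j][i] = 1 / a[i][j].
# The 3x3 matrix below is a hypothetical example.
import math

def ahp_weights(matrix):
    n = len(matrix)
    gms = [math.prod(row) ** (1.0 / n) for row in matrix]   # row geometric means
    total = sum(gms)
    return [g / total for g in gms]                         # normalized weights

comparisons = [
    [1.0, 3.0, 5.0],     # cost       vs. cost, reliability, ease of use
    [1/3, 1.0, 3.0],     # reliability
    [1/5, 1/3, 1.0],     # ease of use
]
weights = ahp_weights(comparisons)   # weights sum to 1; "cost" ranks highest
```

The same routine applies in step 3 to rate the concepts against each criterion; the final weighted sum in step 5 then combines the two sets of priorities.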
32. Decision Management: Method Selection Logic (figure not reproduced)

33. Information Presentation in Concept Evaluation

   There are two ways to present the information in concept evaluation:
   - Design-build-test cycle: build physical models or prototypes. Used for new technology, or for complex applications of known technology.
   - Design-test-build cycle: develop analytical models and simulate (i.e., test) the concept before anything is built. Used for systems that are well understood and can be modeled mathematically.

34. Information Presentation in Concept Evaluation (figure comparing the design-build-test and design-test-build cycles)