Innovation Day 2013 2.4 Frederik Mortier (Verhaert) - Test management

Notes per slide:
  • A feasibility study is not only meant for confirmation
    Positioning within NPD

    Certain risks are attached to it (fits within the risk management framework of the innovation day)

    Discuss the risks and then propose an efficient approach
  • Positioning of the presentation

    Concept level → the presentation deals with innovation projects (context of the innovation day / Verhaert) → the focus is on technical feasibility

    Assessing technical feasibility by means of testing (obviously not the only option)
  • Make the link to the other presentations and the general risk management approach clear
  • Feasibility is not only about assessing feasibility (monitoring), but also about making the concept feasible (treatment)

    Snapshot from the 'risk training' presentation
  • Innovative project → technical feasibility
    Explain HQMS (aerotecs project)
    Last item → usability

    Field demonstrator → user centered
  • Validation → demonstrator in the application (feasibility & usability & desirability check)
    Verification → breadboard or proof of concept (feasibility check)

    Over-specification will have an influence throughout the entire project!

    Example:
    Fusionics (a wrong specification leads to an expensive instrument)
    Airbus: 5 µm accuracy is possible, but the customer has no need for such precision
  • Early-stage cost reduction by changing the concept has a significant influence on the overall project costs

    Reversing the learning curve by learning as much as possible, as fast as possible → an overall advantage!
  • You test what you really want to know
    Simulations are possible for partially determined cases with many variable changes
    Human interpretation also plays a role in testing, but without the translation from the theoretical case to the practical one

    Blister package: determine the break force of the blister in the Bullnett feasibility study → NASA already has very detailed information regarding human capabilities!
  • Renting vs. buying: FLIR camera for 1 day, calibrated measurement instruments

  • Wrong basic requirement:
    Fusionics (specified too heavily)
    Hot/cold treatment:
    Temperature target of 2 °C is very demanding and not useful for the treatment
    New application based on ice packs => good and bad characteristics copied
    Airbus (too good and therefore too cumbersome)
    Borehole measurement
    5 µm target is very demanding; set by the quality department and not useful for the end user (production)
  • Proof of concept => breadboarding
    Works like real => application demonstrator
    Looks/works/tastes like real => product prototype
  • Writing + executing the test plan => an iterative process

    Test plan:
    Banksys:
    - does it go wrong if… ?
    - when does it go wrong?
    → a different angle
  • Quick desk search → get to know the parameters/properties/difficulties



  • Low threshold → still high in practice:
    - corporate culture (a lot of administration)
    - availability of instrumentation/tools
    - often a downstream test approach → postponing testing (only for validation/verification)
    - not daring to get one's hands dirty?

    Photos of the setup below:


  • Link to the sentence at the beginning → bad idea/concept → show an example.

  • Learning as fast as possible through Q&D testing, followed by a test plan based on priorities/risks, will limit the overall risk of a feasibility study.
  • Transcript: Innovation Day 2013 2.4 Frederik Mortier (Verhaert) - Test management

    1. Test management: How to organize your research and test campaign during the feasibility study. Frederik Mortier, Consultant Applied Physics & Systems, frederik.mortier@verhaert.com. THEME 2: RISK MANAGEMENT IN INNOVATION
    2. Frederik Mortier, Consultant Applied Physics & Systems (2012-now); Researcher @ Ghent University (2011-2012); MSc. in Engineering – Electro-Mechanical (2009-2011); MSc. in Industrial Sciences – Electro-Mechanical (2005-2009). frederik.mortier@verhaert.com
    3. "A feasibility study is not only meant to confirm a good idea…"
    4. "…, it could also detect a bad idea!"
    5. Outline
    6. Outline: Context within NPD • Problems & Risks • Efficient Test Approach • Conclusion
    7. Context within NPD
    8. Context within NPD (overview diagram): Concept – Design & Development – Production; Visualization, Testing, Simulation, Review; Quick & Dirty – Detailed – Conclusion; Utility, Allowability, Desirability, Feasibility, Usability; When – What – How
    9. Context within NPD – Risk reduction methods (options): Cut; Tangible (Visualize, Simulate, Test, Review, Roadshow); Risk focus (Criticalities, Added value, 360°); Early (Rapid prototyping, First time right)
    10. Context within NPD – Risk Management (diagram): Risk assessment (Risk identification, Risk analysis, Risk evaluation), Risk treatment (Risk mitigation), Risk monitoring; alongside Requirement mgmt, Technology management, Development, Tools & methods, Verification, Validation
    11. Context within NPD – Why a feasibility study? (Risk mitigation.) Investigate the feasibility of the idea in the application: confirm / deny the added value, determine the required costs. By determining: strengths vs. weaknesses, opportunities vs. threats, needed resources.
    12. Context within NPD – Added value: what do we want to find out? Feasibility: is the core concept feasible? Proof-of-technology: does my core technology work in the lab? Field demonstrator: will my design function in the application?
    13. Context within NPD – Added value: validation vs. verification (diagram: needs, requirements, design, test setup, implementation, documentation). Validation: check if the developed product fulfills its purpose (intended use) → are we building the right product? Field test: full system test in real life; demonstrator in the application. Example: test in production whether the operator can measure a hole Ø and can detect faults. Validation = application owner (customer). Verification: check if the developed product is compliant with the specifications → are we building the product right? Lab test / analysis / simulation; breadboards / proof-of-concept in the lab. Example: test on a sample plate whether we can reach e.g. the accuracy, w.r.t. a reference technology. Verification = development partner (engineer).
    14. Context within NPD – Why wait for the risk to come true? There is more freedom to adjust in early project phases → early detection of risk allows for measures at lower cost and with less impact on the schedule. (Chart: required cost vs. time across system specification, conceptual design, detailed design, tooling / purchase, integration and test; normal vs. reversed learning curve.) → Fail fast, fail cheap!
    15. Context within NPD – Why testing? Different aspects of the technical part of a feasibility study (visualization, testing, simulation, review): better guarantee on the final results; simulations are often too complex / time consuming; sometimes the fastest method; new technologies need testing (no previous experience). → Feasibility testing is done to learn! → Avoid unnecessary testing. Example: blister packaging & NASA.
    16. NASA example. Source: http://msis.jsc.nasa.gov/sections/section04.htm
    17. Context within NPD – Test management. Managing the test activities: preparing and managing the test plan; preparation of tests; choice of the test approach + testing; interpretation of test results; documentation / reporting of results. → An iterative process during the feasibility study.
    18. Problems & Risks
    19. Problems & Risks – Risks during the feasibility phase. The risks dealt with are the same as in the overall risk management: budget, planning, performance. Keep the 3 axes in balance!
    20. Problems & Risks – Budget risks: uncontrolled budgets. Timing: long test times, documentation. Test setups / tools: renting vs. buying, too advanced setups too soon, rework due to unforeseen events.
    21. Problems & Risks – Planning risks. Long duration of the test campaign: test conclusions too late, high-criticality items too late. Tests vs. resources: equipment availability, personnel availability. Lead times: equipment/tools, software, test stands.
    22. Problems & Risks – Performance risks. Wrong result: wrong measurement method, wrong measurement tool, inaccurate results, wrong test order. Wrong basic requirement: wrong test criteria → "risk based methods in NPD".
    23. Efficient Test Approach
    24. Efficient test approach – Test management. Objective of testing: evaluate performance / behavior to remove / reduce project risk by testing on "representative hardware". Ways of testing: proof of concept → breadboarding; works like real → application demonstrator; looks/works/tastes like real → product prototype.
    25. Efficient test approach – Test flow: Quick & Dirty Testing → Test Plan Development → Detailed Testing → Conclusion.
    26. Efficient test approach – Test flow: Quick & Dirty Testing → Test Plan Development → Detailed Testing → Conclusion.
    27. Efficient test approach – Quick & Dirty. What? Basic testing of a hypothesis; basic materials/tools/budgets; wide window; unexpected phenomena. When? Prior to full test plan development; sometimes during the FFE. Why? Upstream testing → reversed learning curve! Variables and operating windows; significance of these variables → risk based testing; reduced cost → no expensive tools / instrumentation; reduced timing → short tests, short tooling lead times. → Increased test output with reduced risk.
    28. Efficient test approach – Quick & Dirty examples. Basic testing:
    29. Efficient test approach – Quick & Dirty. So why don't they? Corporate culture → administration during testing; availability of tools & instrumentation; downstream test approach; used to testing known things or with known methods; afraid to get their hands dirty..? Hot potato issue! Foresee time & budget for Quick & Dirty testing; ensure basic equipment is at hand. → A low threshold for testing should be encouraged.
    30. Efficient test approach – Test flow: Quick & Dirty Testing → Test Plan Development → Detailed Testing → Conclusion.
    31. Efficient test approach – Test plan development. 1. Gathering inputs: desk search, quick & dirty test results, specification document, risk assessment, .. 2. Define test actions → list items according to function / module / .. 3. Define acceptance criteria → based on the gathered information. 4. Assign priorities to each test based upon risk/criticality → risk based testing → assign risks thoughtfully.
    32. Efficient test approach – Test plan development. 5. Define test methods based on: variables, measurement method, automatic vs. manual. 6. Define setup & tools: measurement method, cost (rent vs. buy), location (EMC, climate chamber). 7. Estimate timing: lead time of setup & tools, estimated testing time. → The test plan will give a priority / budget / timing overview! (A minimal sketch of such a risk-prioritized test plan is given after this transcript.)
    33. Efficient test approach – Test flow: Quick & Dirty Testing → Test Plan Development → Detailed Testing → Conclusion.
    34. Efficient test approach – Detailed testing. Instrumentation: calibration & sanity check! Set up the instruments correctly. Test stand: designed to be versatile & adaptable → adaptation on the fly for faster results. Testing: test order based on priorities (high to low); reassess priorities/tasks as you go; keep your eyes open for undefined phenomena & assess their influence → test plan iteration.
    35. Efficient test approach – Detailed testing. Documentation: on the fly → most efficient, less hassle for the test engineer → use templates! Take pictures → store them for later use, small effort. A computer at the test stand + internet is needed. (A sketch of such a priority-driven test loop with on-the-fly logging is also given after this transcript.)
    36. Efficient test approach – Breadboard examples. More advanced testing: use of data acquisition, use of rapid prototyping, use of automation devices.
    37. Efficient test approach – Test flow: Quick & Dirty Testing → Test Plan Development → Detailed Testing → Conclusion.
    38. Efficient test approach – Test conclusion. Test action acceptance: fill in the test plan acceptance (fail/pass); comment on possible fails; identify show stoppers as soon as possible. Feasibility study: strengths vs. weaknesses, opportunities vs. threats, needed resources.
    39. Conclusion
    40. Efficient test approach – Conclusion. Reduce the barrier for Quick & Dirty testing → lower the threshold to start testing + get a feel for the matter → define the test window + parameters for more detailed & accurate testing. Plan your testing based on risks → start with the highest priorities/risks first → reassess the test plan priorities as you go. The test plan helps you control: priorities / risks, cost of tooling/setup, timings. Keep the 3 axes (performance, planning, budget) in balance! Maximize output, reverse the learning curve. Reduce the overall risk. Risk based testing.
    42. VERHAERT MASTERS IN INNOVATION®. Headquarters: Hogenakkerhoekstraat 21, 9150 Kruibeke (B), tel +32 (0)3 250 19 00, fax +32 (0)3 254 10 08, ezine@verhaert.com. More at www.verhaert.com. Netherlands: ESIC European Space Innovation Centre, Kapteynstraat 1, 2201 BB Noordwijk (NL), tel +31 (0)618 12 19 19, derk.schneemann@verhaert.com. MASTERS IN INNOVATION® is a platform set up by VERHAERT to train, stimulate and incubate you as an innovator. We provide an extensive training program with different tracks, covering critical areas of new products and business innovation. Furthermore we manage the VERHAERT venturing program and organize our Innovation Day, an annual conference on best practices and insights on new products & business innovation.
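
    A minimal illustrative sketch (not part of the deck): the risk-prioritized test plan described on slides 31-32, captured as a small data structure in which each test action carries an acceptance criterion and a likelihood x impact score, and the campaign is ordered so the highest-risk items are tested first. All class names, fields and numbers below are hypothetical.

        # Hypothetical sketch of a risk-based feasibility test plan (not from the presentation).
        from dataclasses import dataclass

        @dataclass
        class TestAction:
            name: str                  # item to test, listed per function / module
            acceptance_criterion: str  # pass/fail criterion based on the gathered inputs
            likelihood: int            # 1 (unlikely) .. 5 (almost certain)
            impact: int                # 1 (minor) .. 5 (show stopper)
            lead_time_days: int = 0    # lead time for setup & tools
            test_time_days: int = 1    # estimated testing time

            @property
            def risk(self) -> int:
                # Simple score used for risk-based ordering of the campaign.
                return self.likelihood * self.impact

        def prioritize(plan):
            # Highest-risk test actions come first.
            return sorted(plan, key=lambda a: a.risk, reverse=True)

        if __name__ == "__main__":
            plan = [
                TestAction("Blister break force", "opens within the documented hand-force range", 4, 5, lead_time_days=2),
                TestAction("Sensor accuracy on sample plate", "within spec w.r.t. the reference technology", 3, 4, lead_time_days=5, test_time_days=3),
                TestAction("Enclosure cosmetics", "no visible defects", 1, 1),
            ]
            for a in prioritize(plan):
                print(f"risk={a.risk:2d}  {a.name}  [{a.acceptance_criterion}]")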
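
    A second hedged sketch under the same assumptions: the detailed-testing loop of slides 34-35, which runs test actions from highest to lowest risk, documents every result on the fly, flags show stoppers immediately and reassesses the remaining plan after each test. The execute() stub, the log file name and the example actions are hypothetical.

        # Hypothetical sketch of a priority-driven test loop with on-the-fly logging.
        import csv
        import datetime as dt

        def execute(action):
            # Stand-in for the actual measurement on the test stand.
            return True, "nominal"

        def run_campaign(actions, logfile="test_log.csv"):
            # Risk-based order: highest likelihood x impact first.
            remaining = sorted(actions, key=lambda a: a["likelihood"] * a["impact"], reverse=True)
            with open(logfile, "w", newline="") as fh:
                log = csv.writer(fh)
                log.writerow(["timestamp", "test", "risk", "result", "notes"])  # documented on the fly
                while remaining:
                    action = remaining.pop(0)               # highest remaining risk first
                    passed, notes = execute(action)
                    risk = action["likelihood"] * action["impact"]
                    log.writerow([dt.datetime.now().isoformat(timespec="seconds"),
                                  action["name"], risk, "pass" if passed else "fail", notes])
                    if not passed and action["impact"] >= 5:
                        print("Show stopper:", action["name"])   # identify show stoppers as soon as possible
                    if not passed:
                        # Test plan iteration: drop follow-up items that no longer make sense after a failure.
                        remaining = [a for a in remaining if a.get("depends_on") != action["name"]]

        run_campaign([
            {"name": "Blister break force", "likelihood": 4, "impact": 5},
            {"name": "Blister opening ergonomics", "likelihood": 3, "impact": 3, "depends_on": "Blister break force"},
            {"name": "Enclosure cosmetics", "likelihood": 1, "impact": 1},
        ])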
