Simulation and Modelling 2016 Conference
Tuesday 13 and Wednesday 14 September 2016
Thinktank, Birmingham Science Museum
Millennium Point, Birmingham, West Midlands B4 7XG
VALIDATION: GAINING CONFIDENCE
IN SIMULATION
A Program Office perspective
A distinction must be made between the verification and validation of the key components that constitute a simulation:
• Verification – Does the code implement the physics of the problem correctly, and does the generated solution compare favourably to exact analytical results?
• Validation – Does the actual simulation agree with physical reality? Is the level of uncertainty and error acceptable?
From a Program Office perspective, validation assessment generates credibility for decision makers.
Simulation verification may include
analysis of:
• Discretization strategy
• Application of boundary conditions
• Grid convergence criteria
• Handling non-linearity
• Iterative convergence
• Numerical stability (relaxation factors, etc.)
D.Odeleye CEng MIMechE 4
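Several of these checks can be made quantitative. As an illustrative sketch (not part of the original deck), the grid-convergence check can use Richardson extrapolation to estimate the observed order of convergence and a grid-independent value from three systematically refined grids; the function names are hypothetical.

```python
import math

def observed_order(f1, f2, f3, r):
    """Observed order of convergence from solutions on fine (f1),
    medium (f2) and coarse (f3) grids with constant refinement ratio r."""
    return math.log((f3 - f2) / (f2 - f1)) / math.log(r)

def richardson_extrapolate(f1, f2, r, p):
    """Estimate of the grid-independent value from the two finest grids."""
    return f1 + (f1 - f2) / (r ** p - 1)

# Example: a monitored quantity behaving like F + C*h^2 on grids h = 0.1, 0.2, 0.4
f1, f2, f3 = 1.005, 1.020, 1.080
p = observed_order(f1, f2, f3, r=2)                    # ≈ 2, i.e. second order
f_estimate = richardson_extrapolate(f1, f2, r=2, p=p)  # ≈ 1.0
```

If the observed order matches the scheme's formal order and the extrapolated value changes little between grid levels, grid convergence can be argued with numbers rather than by inspection.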
– Validation is the process of determining the degree to which a model is an accurate representation of the real world from the perspective of the intended use of the model.
– High-confidence simulations offer the promise of developing higher-quality products with fewer resources in less time.
• Generating the right data, not a lot of data
• Understanding what is appropriate and required for validating models
• Assessing the influencing factors that need to be considered and tested
• What does the Program need to know, and when?
– This defines the problem and which CAE methods can be used
• Lean product creation concepts require:
– Compatibility with Quality, Cost & Delivery before design maturity
– Front-loading using all design options before commitment
– No late changes
• Generating the right data, not a lot of data
• Understanding what is appropriate and required for validating models
• Assessing the influencing factors that need to be considered and tested
Correlate and compare the components of simulation in order to validate models, i.e.:
– Invest upfront resources to generate the most accurate inputs for simulation
– Compare results from different simulation tools
– Compare real-world measurements to simulated results
– Compare "controlled" test environment (e.g. wind tunnel) results to simulation results
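Comparisons between measurements and simulated results need agreed error metrics; a minimal sketch (illustrative only — the metric choice and any acceptance thresholds are program-specific assumptions):

```python
import math

def validation_metrics(measured, simulated):
    """Root-mean-square error and mean relative error between paired
    test measurements and simulation predictions."""
    n = len(measured)
    rmse = math.sqrt(sum((m - s) ** 2 for m, s in zip(measured, simulated)) / n)
    mre = sum(abs(m - s) / abs(m) for m, s in zip(measured, simulated)) / n
    return rmse, mre

# Example: two wind-tunnel measurements vs. the corresponding simulated values
rmse, mre = validation_metrics([100.0, 200.0], [98.0, 205.0])
```

Reporting the same metrics for every tool and every test campaign makes the cross-comparisons above directly commensurable.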
Extend legacy/baseline physical tests to generate comprehensive model input parameters:
– Typical durability testing focuses on "has it broken yet?", providing little information on optimisation and distance from failure modes
– Testing to failure provides quantitative data (time to failure, cycles to failure, etc.); the damage mode gives insight into distance from the failure mode, and performance degradation data supports better predictive simulations
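Cycles-to-failure data from test-to-failure programmes is commonly summarised with a fitted life distribution. The sketch below is an assumption for illustration (not the method used in the deck): a 2-parameter Weibull fit by median-rank regression, standard library only.

```python
import math

def weibull_fit(cycles_to_failure):
    """Fit a 2-parameter Weibull distribution (shape beta, scale eta) to
    cycles-to-failure data by median-rank regression."""
    t = sorted(cycles_to_failure)
    n = len(t)
    xs, ys = [], []
    for i, ti in enumerate(t, start=1):
        f = (i - 0.3) / (n + 0.4)                 # Bernard's median-rank estimate
        xs.append(math.log(ti))                   # ln(cycles)
        ys.append(math.log(-math.log(1.0 - f)))   # linearised Weibull CDF
    mx, my = sum(xs) / n, sum(ys) / n
    beta = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    eta = math.exp(mx - my / beta)
    return beta, eta

# Example: five hypothetical cycles-to-failure observations
beta, eta = weibull_fit([2.1e5, 3.4e5, 4.2e5, 5.0e5, 6.3e5])
```

A shape parameter beta > 1 indicates wear-out behaviour and eta is the characteristic life; both can then feed predictive durability simulations as validated inputs.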
Trade-off between different test schemes and their Delivery, Cost and Quality implications:
– Testing to failure: long test duration impacts engineering sign-off timing; high cost; comprehensive data
– Bogey testing: shortest test duration, most likely to meet program timing; lowest relative cost; test results cannot be used to predict performance with sufficient confidence
– Degradation testing
Bogey test vs. test to failure
Chart: predicted life of crankshaft #1 vs. the revised crankshaft #2, at 125 PS and 150 PS
The original design is the purple part; the green part is the revised part with the width of web 6 increased.
• Based on sampling at the extremes of either product strength or the level of induced key stresses, i.e. selection of the weakest parts, worst clearances, etc.
• If input data is based on the weakest-part test data, all stronger parts would also pass the test (for a given test cycle).
• Generating the right data, not a lot of data
• Understanding what is appropriate and required for validating models
• Assessing the influencing factors that need to be considered and tested
Conduct studies to identify sources of variation in the required simulation output:
– Which input parameters have what effect on the required results?
– Conduct sensitivity analysis of input parameters
– Classify influencing factors and identify main effects
• Validate input data from experimental tests: does it make engineering sense?
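A simple starting point for classifying influencing factors is a one-at-a-time (OAT) sweep around a baseline parameter set. This is an illustrative sketch (the model, parameter names and the ±5% perturbation are assumptions); a designed experiment or variance-based method would normally follow.

```python
def oat_sensitivity(model, baseline, delta=0.05):
    """One-at-a-time sensitivity: perturb each input by +/-delta (fractional)
    and report the effect normalised by the baseline output."""
    base_out = model(baseline)
    effects = {}
    for name, value in baseline.items():
        hi = model(dict(baseline, **{name: value * (1 + delta)}))
        lo = model(dict(baseline, **{name: value * (1 - delta)}))
        effects[name] = (hi - lo) / (2 * delta * base_out)
    return effects

# Hypothetical response surface: output twice as sensitive to 'a' as to 'b'
effects = oat_sensitivity(lambda p: 2 * p["a"] + p["b"], {"a": 1.0, "b": 1.0})
```

Ranking the normalised effects identifies the main effects to carry into validation testing; interactions between parameters need a fuller designed experiment.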
Simulation components:
– Inputs: CAD; mesh generation; space (FE volume elements) and time (stability constraints) discretization; parameters; boundary conditions
– Solver: settings, i.e. mesh refinement vs. solution convergence time; different simulation tools; iterative solvers
– Output: simulation solutions; experimental data (controlled noise environment); real-world data
Compare the components, and conduct sensitivity analysis and parametric studies, to validate simulations.
Confidence in simulation is gained by demonstrating value added to:
– Quality – credible data necessary to make decisions at each stage of product development; fidelity of predictions over time
– Cost – return on investment versus, e.g., the cost of testing prototypes
– Delivery – are simulations conducted and results available exactly when needed?
Further considerations and opportunities for increasing confidence:
• Faster post-processing
• Use of virtual reality & augmented reality
• Real-time simulation (running simulations on the fly)
• Cloud computing
• Artificial intelligence (using techniques such as evolutionary computation, artificial neural networks, fuzzy systems, and general machine learning and data mining methods)
• Simulating electric vehicles and EV powertrains
• Simulating autonomous vehicle performance
• Electric/hybrid vehicles present the next opportunities for the use
of simulation to further compress product development
timescales.
DISCUSS
The better the validation, the better the prediction, and consequently the more confidence customers have in ongoing and future simulations.