Common Simulation Pitfalls
We shall understand:
 General and programming mistakes
 Simulation inaccuracies
 Misleading results
When to simulate!
• Analytical model is not feasible (system too complex)
• Analytical model is too simplistic to be representative
• Simulation is needed to verify the analysis
When not to simulate!
• Analytical model gives a good enough representation
• Simulation would take months
• Simulation is expensive
• Simulation is non-scalable
• Otherwise, simulation is unnecessary
General mistakes
• Inappropriate level of detail
• Improper selection of programming language
• Improper initial conditions
General mistakes (continued…)
• Short run times
• Poor random number generators
• Inadequate time estimates
• No achievable goals
• Incomplete mix of essential skills
• Inadequate level of user participation
• Inability to manage the simulation project
Simulation inaccuracies
• Over-reliance on link-budget methods for abstraction
• Overly simplistic modeling of radio layers
General mistakes
1. Improper programming language
• Scope & type of simulation determine the best choice
• Object-oriented vs. procedural
- Types/diversity of simulation parameters
• Interpreted vs. compiled
- Machine dependence
- Speed
2. Unverified models
• Programming is non-trivial
• Semantic mistakes produce
- Wrong results
- Misleading results
• Modular verification is a must
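A practical form of modular verification is to run a component against a case with a known closed-form answer before trusting it inside the full simulation. A minimal sketch (the M/M/1 queue, its parameters, and the Lindley-recursion simulator are illustrative assumptions, not from the slides):

```python
import random

def mm1_mean_wait(lam, mu, n_customers, seed, warmup=1_000):
    """Estimate the mean M/M/1 queueing delay via Lindley's recursion."""
    rng = random.Random(seed)
    w, total, count = 0.0, 0.0, 0
    for i in range(n_customers):
        s = rng.expovariate(mu)    # service time of this customer
        a = rng.expovariate(lam)   # inter-arrival time to the next customer
        w = max(0.0, w + s - a)    # Lindley: waiting time of the next customer
        if i >= warmup:            # drop the transient before averaging
            total += w
            count += 1
    return total / count

lam, mu = 0.5, 1.0                       # utilization rho = 0.5 (assumed values)
analytic = (lam / mu) / (mu - lam)       # known M/M/1 result: Wq = rho/(mu - lam)
estimate = mm1_mean_wait(lam, mu, 200_000, seed=42)
# the module passes verification if the estimate matches theory within noise
```

If the estimate disagrees with the closed-form value by more than statistical noise, the module has a semantic bug, caught before it contaminates the full model.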
3. Improper initial conditions
• Initial condition is not the steady state
• Often a late realization
• Surprisingly wrong results
• May never converge
4. Short run times
• Strong dependence on initial conditions
• Simulation never reaches true steady state
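Pitfalls 3 and 4 can be illustrated together with a toy M/M/1 queue (all parameters here are assumed for illustration): starting far from steady state, a short run reports a heavily biased mean, while a long run that discards a warm-up period converges to the true value.

```python
import random

def sim_wait(lam, mu, n, seed, w0=0.0, warmup=0):
    """Mean M/M/1 queueing delay (Lindley recursion), starting from wait w0."""
    rng = random.Random(seed)
    w, total, count = w0, 0.0, 0
    for i in range(n):
        w = max(0.0, w + rng.expovariate(mu) - rng.expovariate(lam))
        if i >= warmup:
            total += w
            count += 1
    return total / count

lam, mu = 0.5, 1.0     # analytic steady-state mean wait: rho/(mu - lam) = 1.0
# Start 50 time units away from steady state: a short run averages mostly transient
biased = sim_wait(lam, mu, n=500, seed=7, w0=50.0)
# A long run that discards a warm-up period converges despite the bad start
settled = sim_wait(lam, mu, n=100_000, seed=7, w0=50.0, warmup=5_000)
```

Here `biased` overshoots the true mean severalfold, while `settled` lands close to it, which is exactly the "late realization" the slide warns about.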
5. Poor random number generators
• A weak pseudo-random sequence leads to predictability
• Wrong choice of seed value can cause inadvertent correlation between processes
- Use well-established RNGs
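A minimal illustration of the seeding pitfall, using Python's standard `random` module (the process names are hypothetical):

```python
import random

# Pitfall: seeding two "independent" processes with the same value makes their
# streams identical, silently correlating arrivals with failures.
arrivals = random.Random(1234)
failures = random.Random(1234)
identical = ([arrivals.random() for _ in range(5)]
             == [failures.random() for _ in range(5)])
# identical is True: the two streams are the same sequence, not independent

# Safer sketch: derive a distinct seed per process from one master generator,
# so the whole experiment stays reproducible from a single master seed
master = random.Random(2024)
streams = [random.Random(master.getrandbits(64)) for _ in range(3)]
```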
6. No achievable goals
• Goals not defined
- Tangible output analysis
- Logs and trace files
• Goals are unrealistic
- Affects simulation complexity and implementation
7. Incomplete mix of essential skills
• Domain knowledge
• Statistics
• Programming
• Project management
• Past experience
8. Inadequate level of user participation
• From modelling to implementation
• UI design
• Output analysis
9. Inability to manage the simulation project
• Simulations are not monolithic
• Need software engineering tools
- Multivariate design
- Code management
- Change tracking
Simulation inaccuracies
1. Over-reliance on link-budget methods for abstraction
• Link-budget losses are overly static
- Fair enough for steady-state analysis
- Dynamic analysis not possible
• Results are misleading
2. Overly simplistic modeling of radio layers
• Lowest layer often ignored
- No bit-level BER & delay
• Often the Achilles' heel
• Wrong results in highly dynamic use cases
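One way to see why a static link budget misleads dynamic analysis is to overlay fading on it. A sketch assuming a log-distance path-loss model and Rayleigh fading (all parameter values are illustrative):

```python
import math
import random

def rx_power_dbm(tx_dbm, d_m, exponent=3.0, pl0_db=40.0):
    """Static link budget: log-distance path loss (assumed 40 dB loss at 1 m)."""
    return tx_dbm - (pl0_db + 10 * exponent * math.log10(d_m))

static = rx_power_dbm(20.0, 100.0)    # one fixed number: -80 dBm at 100 m
rng = random.Random(3)
# Rayleigh fading: the instantaneous power gain is exponentially distributed,
# so received power swings around the static budget by tens of dB
faded = [static + 10 * math.log10(rng.expovariate(1.0)) for _ in range(10_000)]
outage = sum(p < static - 10 for p in faded) / len(faded)
# a static budget predicts zero outage; with fading, roughly 10% of samples
# fall more than 10 dB below it (P[gain < 0.1] = 1 - exp(-0.1) ~ 0.095)
```

The static number is fine for a steady-state margin check, but any metric driven by short-term variation (outage, retransmissions, handover) needs the fading process modeled explicitly.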
Solve the pitfalls (summary)
• Inappropriate level of detail
• Improper selection of programming language
• No clear goals
• Inadequate time estimate
• Poor RNG
• Unverified models
• Lacking essential skills
• Link/radio layers
• No SE tools
• Short run times
• Improper initial conditions
• No user participation
Development of System Simulation
We shall understand:
 The process of developing system simulations
 The development life cycle
Development Process
1. Problem formulation
• Identify controllable and uncontrollable inputs
2. Data collection & analysis
• What to collect
• How much to collect
• Cost vs. accuracy trade-off
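A common way to answer "how much to collect" is a pilot run: estimate the output variability from a few cheap replications, then size the experiment for the desired confidence-interval width. A sketch (the pilot distribution and precision target are assumed, and the formula assumes roughly i.i.d. replications):

```python
import math
import random
import statistics

def required_replications(pilot, half_width, z=1.96):
    """Replications needed for a ~95% CI of the given half-width: n = (z*s/e)^2."""
    s = statistics.stdev(pilot)
    return math.ceil((z * s / half_width) ** 2)

rng = random.Random(11)
pilot = [rng.gauss(5.0, 2.0) for _ in range(30)]   # 30 cheap pilot replications
n = required_replications(pilot, half_width=0.1)
# roughly (1.96 * 2 / 0.1)^2, i.e. on the order of 1,500 replications
```

This makes the cost/accuracy trade-off explicit: halving the target half-width quadruples the number of replications.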
3. Simulation development
• Codify, codify, codify!
4. Model validation, verification, & calibration
• Validation
• Is it the right system?
• Does it emulate the real phenomenon?
5. “What-if” analysis
• Performance measures with different inputs
6. Model validation, verification, & calibration
• Verification
• Are we building the system right?
• The implementation must correspond to the model
7. Model validation, verification, & calibration
• Calibration
• Parameter estimation
• Tweaking/tuning to ensure that simulated data follows real data
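Calibration can be as simple as searching a parameter grid for the value whose simulated output best matches a measured quantity. A hypothetical sketch using an M/M/1 wait-time simulator (the observed value, candidate grid, and all rates are assumed):

```python
import random

def simulated_mean_wait(mu, lam=0.5, n=20_000, seed=5):
    """Mean M/M/1 wait (Lindley recursion) under candidate service rate mu."""
    rng = random.Random(seed)
    w, total = 0.0, 0.0
    for _ in range(n):
        w = max(0.0, w + rng.expovariate(mu) - rng.expovariate(lam))
        total += w
    return total / n

observed_wait = 1.0                      # mean wait measured on the real system
candidates = [0.8, 0.9, 1.0, 1.1, 1.2]   # service rates to try
best = min(candidates,
           key=lambda mu: (simulated_mean_wait(mu) - observed_wait) ** 2)
# 'best' is the calibrated rate whose simulated output tracks the real data
```

Reusing one seed across candidates (common random numbers) keeps the comparison fair; in practice a proper estimator or optimizer would replace the crude grid.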
8. Sensitivity analysis
• Relative importance of different parameters with respect to the output
• Even with respect to each other
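A basic one-at-a-time sensitivity sweep perturbs each parameter by a fixed fraction and compares the relative change in the output. A sketch on a hypothetical toy model (names and values are illustrative):

```python
import random

def delivered_load(offered_load, loss_rate, seed=9):
    """Toy model (hypothetical): offered traffic thinned by random packet loss."""
    rng = random.Random(seed)
    kept = sum(1 for _ in range(1_000) if rng.random() > loss_rate)
    return offered_load * kept / 1_000

base = {"offered_load": 10.0, "loss_rate": 0.1}
sensitivity = {}
for name in base:
    bumped = dict(base)
    bumped[name] *= 1.1                  # one-at-a-time +10% perturbation
    sensitivity[name] = ((delivered_load(**bumped) - delivered_load(**base))
                         / delivered_load(**base))
# relative output change per parameter: here offered_load dominates loss_rate
```

Ranking the entries of `sensitivity` shows which parameters deserve careful modeling and which can be abstracted away.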
Life Cycle of Simulation Development
1. Descriptive analysis: build, validate, and verify the basic model
2. Prescriptive analysis: sensitivity and what-if analysis
3. Post-prescriptive analysis: sensitivity and what-if analysis

Lecture 3 of simulation and modulation.pptx
