Brokerage 2007: Performance Evaluation

  1. Performance Evaluation
     Chris Blondia, Nick Vercammen
  2. Performance Evaluation Cycle
     [cycle diagram: System under Evaluation → System Model (environment, input such as traces, system model) → Evaluation Model (analytical, simulation, experimental) → derive Performance Measures (compute, simulate, measure) → relate the results to the system under evaluation → use the evaluation results in the system]
  3. Example: Influence of network congestion on video quality
     • Streaming video transmitted through a congested network
     • Packets of the video are lost due to buffer overflow
     • What is the influence of the packet loss on the video quality?
  4. Performance Evaluation Cycle: System under Evaluation
     [cycle diagram from slide 2, highlighting the "System under Evaluation" step]
  5. System under Evaluation / Performance Measures
     • Network: packet loss, number of consecutive packets lost, number of admitted streams (see the sketch below)
     • Content: type of packet lost, synchronization
     • User perception
     [diagram: video server → congested network (buffer overflow) → end user]
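As an illustration of the network-level measures listed above, here is a minimal sketch (not from the original slides) that derives the packet loss ratio and consecutive-loss statistics from a received-packet trace; the trace format (number of packets sent plus the sequence numbers that arrived) is an assumption:

```python
# Sketch: derive packet loss ratio and consecutive-loss statistics
# from a list of sequence numbers that actually arrived.

def loss_statistics(num_sent, received_seqnos):
    received = set(received_seqnos)
    num_lost = num_sent - len(received)

    # Group lost packets into bursts of consecutive losses.
    bursts = []
    current = 0
    for s in range(num_sent):
        if s in received:
            if current:
                bursts.append(current)
            current = 0
        else:
            current += 1
    if current:
        bursts.append(current)

    return {
        "loss_ratio": num_lost / num_sent if num_sent else 0.0,
        "max_consecutive_losses": max(bursts) if bursts else 0,
        "mean_burst_length": sum(bursts) / len(bursts) if bursts else 0.0,
    }

if __name__ == "__main__":
    # Example: packets 3, 4 and 9 of 12 were lost.
    print(loss_statistics(12, [0, 1, 2, 5, 6, 7, 8, 10, 11]))
```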
  6. Performance Evaluation Cycle: System Model
     [cycle diagram from slide 2, highlighting the "System Model" step]
  7. System Model
     Congestion can be modelled by:
     • Background traffic
     • Variable service time in the router
     [diagram: tagged video source and background traffic feed a finite buffer; overflow causes packet loss]
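A minimal slot-based simulation sketch of this kind of system model: a tagged video source plus Bernoulli background traffic share one finite buffer, and the loss ratio of the tagged stream is measured. The Bernoulli background model and all parameter values are illustrative assumptions, not the model used in the study:

```python
import random

def simulate(slots=100_000, buffer_size=100, service_per_slot=10,
             tagged_per_slot=4, bg_sources=8, bg_prob=0.8, seed=1):
    """Slot-based finite buffer fed by a tagged video source and background traffic."""
    random.seed(seed)
    queue = 0
    tagged_sent = 0
    tagged_lost = 0
    for _ in range(slots):
        # Background arrivals: each background source emits one packet per slot
        # with probability bg_prob (a deliberate simplification).
        bg = sum(1 for _ in range(bg_sources) if random.random() < bg_prob)
        # The tagged video source emits a fixed number of packets per slot.
        arrivals = ["bg"] * bg + ["tagged"] * tagged_per_slot
        random.shuffle(arrivals)  # interleave tagged and background packets
        tagged_sent += tagged_per_slot
        for kind in arrivals:
            if queue < buffer_size:
                queue += 1          # packet accepted into the buffer
            elif kind == "tagged":
                tagged_lost += 1    # buffer overflow: tagged packet dropped
        # The router serves a fixed number of packets per slot.
        queue = max(0, queue - service_per_slot)
    return tagged_lost / tagged_sent

if __name__ == "__main__":
    print(f"tagged-stream loss ratio: {simulate():.4f}")
```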
  8. Performance Evaluation Cycle: Evaluation Model and Deriving Performance Measures
     [cycle diagram from slide 2, highlighting the "Evaluation Model" and "Performance Measures" steps]
  9. Analytical Approach
     [figure: frame size trace of the "bond" video (frame size in bits versus frame number), with I-, P- and B-frames following the GOP pattern I B B P B B P B B P B B]
     [figure: an L-state Markov model of a single video source; the superposition of X such sources feeding a finite buffer (B < ∞) is used to approximate the packet loss probability]
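The Markov-chain analysis of superposed I/P/B video sources shown on this slide is not reproduced here. As a hedged illustration of the analytical step only (computing a loss probability for a finite buffer in closed form rather than by simulation), the sketch below uses the much simpler M/M/1/B queue; its rates and buffer size are purely illustrative:

```python
def mm1b_loss_probability(arrival_rate, service_rate, buffer_size):
    """Loss (blocking) probability of an M/M/1/B queue with at most B packets in the system.

    Textbook formula, used here only to illustrate the analytical step; the slides
    use a Markov model of superposed I/P/B-frame video sources instead.
    """
    rho = arrival_rate / service_rate
    b = buffer_size
    if abs(rho - 1.0) < 1e-12:
        return 1.0 / (b + 1)
    return (1 - rho) * rho**b / (1 - rho**(b + 1))

if __name__ == "__main__":
    for n_sources in (10, 20, 30, 40):
        # Hypothetical per-source rate; the link serves 100 packets per time unit.
        loss = mm1b_loss_probability(arrival_rate=2.4 * n_sources,
                                     service_rate=100, buffer_size=100)
        print(f"{n_sources:2d} sources -> approx. loss probability {loss:.2e}")
```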
  10. Use of results: Admission Control
      [figure: admission boundary for a buffer of 100 packets, plotting the number of asterix sources against the number of bond sources, with theoretical and experimental results]
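A sketch of how an admission boundary like the one plotted on this slide could be derived from an analytical loss model. The per-source rates for the bond and asterix traces, the link rate, the buffer size and the loss target are all assumed values, and the M/M/1/B loss formula is again only a stand-in for the actual model of superposed video sources:

```python
def loss_probability(load, buffer_size=100):
    """Loss probability of a finite-buffer queue at a given normalised load (M/M/1/B stand-in)."""
    if abs(load - 1.0) < 1e-12:
        return 1.0 / (buffer_size + 1)
    return (1 - load) * load**buffer_size / (1 - load**(buffer_size + 1))

def admission_boundary(max_bond=45, loss_target=1e-4,
                       bond_rate=2.0, asterix_rate=1.2, link_rate=100.0):
    """For each number of bond sources, the largest number of asterix sources
    that keeps the predicted loss probability below the target."""
    boundary = []
    for n_bond in range(max_bond + 1):
        n_ast = 0
        while loss_probability((n_bond * bond_rate + (n_ast + 1) * asterix_rate)
                               / link_rate) <= loss_target:
            n_ast += 1
        boundary.append((n_bond, n_ast))
    return boundary

if __name__ == "__main__":
    for n_bond, n_ast in admission_boundary()[::5]:
        print(f"{n_bond:2d} bond sources -> admit at most {n_ast:2d} asterix sources")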
  11. Experimental Set-up and Results
      • See demonstration
  12. Performance Evaluation Cycle: Interpretation of Results
      [cycle diagram from slide 2, highlighting the use of the evaluation results in the system]
  13. Use Performance Evaluation Results
      • Use the IBBT competences to suggest improvements
      • Examples:
        • Network groups:
          • Intelligent buffer management schemes (e.g. using thresholds; see the sketch after this slide)
          • Congestion control mechanisms (e.g. explicit congestion notification)
          • Define efficient admission control algorithms
        • Video groups:
          • Codecs using feedback
          • Use layered video coding schemes
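As one illustration of the first network-group example, a sketch of a threshold-based buffer management scheme that starts dropping low-priority (B-frame) packets once the queue grows beyond a threshold; the threshold, buffer size and priority rule are illustrative assumptions:

```python
from collections import deque

def enqueue(queue, packet, buffer_size=100, threshold=80):
    """Threshold-based buffer management sketch: once the queue length exceeds
    the threshold, arriving low-priority packets (here: B-frame packets) are
    dropped so that more important I/P-frame packets still find room.
    Returns True if the packet was accepted."""
    if len(queue) >= buffer_size:
        return False                      # hard limit: buffer full
    if len(queue) >= threshold and packet["frame_type"] == "B":
        return False                      # early drop of low-priority packets
    queue.append(packet)
    return True

if __name__ == "__main__":
    q = deque()
    accepted = {"I": 0, "P": 0, "B": 0}
    dropped = {"I": 0, "P": 0, "B": 0}
    # Offer a GOP-like pattern repeatedly without serving the queue,
    # so the threshold behaviour becomes visible.
    pattern = ["I", "B", "B", "P", "B", "B", "P", "B", "B", "P", "B", "B"]
    for _ in range(12):
        for frame_type in pattern:
            if enqueue(q, {"frame_type": frame_type}):
                accepted[frame_type] += 1
            else:
                dropped[frame_type] += 1
    print("accepted:", accepted, "dropped:", dropped)
```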
  14. Use of Performance Evaluation Competence
      [diagram: protocol stack showing the layers covered: Application and Content, Network Protocols (L2 – L4), Physical Layer]
  15. Application and Content Layer
      • Application
        System: content distribution, web access, parallel file processing
        Measures: responsiveness, scalability, stability
      • Video and audio content
        System: analysis of different codecs, error resilience and concealment mechanisms, …
        Measures: coding efficiency, error resilience, error types and frequencies, synchronization
      • Traffic analysis (see the sketch after this slide)
        System: video, (aggregated) applications, web traffic
        Measures: burstiness, peak bitrate, average bitrate, mobility
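A small sketch of how the traffic-analysis measures in this table (peak bitrate, average bitrate, burstiness) could be computed from a frame-size trace; the trace format and the fixed frame rate are assumptions made for illustration:

```python
def traffic_measures(frame_sizes_bits, frames_per_second=25, window_frames=25):
    """Peak bitrate, average bitrate and burstiness (peak/mean ratio) of a
    frame-size trace, using a sliding window of one second."""
    if len(frame_sizes_bits) < window_frames:
        raise ValueError("trace shorter than one measurement window")
    # Total bits in every window of window_frames consecutive frames.
    window_bits = [sum(frame_sizes_bits[i:i + window_frames])
                   for i in range(len(frame_sizes_bits) - window_frames + 1)]
    seconds_per_window = window_frames / frames_per_second
    peak = max(window_bits) / seconds_per_window
    mean = (sum(frame_sizes_bits) / len(frame_sizes_bits)) * frames_per_second
    return {"peak_bps": peak, "average_bps": mean, "burstiness": peak / mean}

if __name__ == "__main__":
    # Tiny synthetic GOP-like trace: large I-frames, smaller P- and B-frames.
    gop = [120_000, 20_000, 20_000, 40_000, 20_000, 20_000,
           40_000, 20_000, 20_000, 40_000, 20_000, 20_000]
    print(traffic_measures(gop * 20))
```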
  16. Network Protocols Layer (L2 – L4)
      • Protocol performance evaluation
        System: scheduling mechanism, MAC protocol, routing protocol, auto-configuration protocol
        Measures: responsiveness, scalability, stability, functionality, throughput
      • Protocol test suites
        System: testing of HGW, modem/mux interaction
      • Advanced and large scale experimental testbeds (fixed and wireless)
        System: protocols for mesh, ad hoc and wireless sensor networks
        Measures: scalability, robustness, reliability, throughput
  17. Physical Layer
      • Signal propagation
        System: WiMAX, Body Area Networks
        Measures: signal propagation characteristics
  18. Generic Test and Measurement Equipment
      Use of advanced test equipment, e.g. Spirent-Smartbits, Spirent-Avalanche, Agilent-NX2, Opticom-Opera, Tracespan, Fluke-Optiview, …
  19. Testbeds - Example: Wireless testbed
