Brokerage 2007 Performance Evaluation
Brokerage 2007 Performance Evaluation: Presentation Transcript

  • Performance Evaluation (Chris Blondia, Nick Vercammen)
  • Performance Evaluation Cycle [Figure: System under Evaluation → derive Performance Measures → System Model (environment, input e.g. traces, system model, evaluation method) → Evaluation Model (analytical: compute; simulation: simulate; experimental: measure) → relate results to System Model → use evaluation results in the system]
  • Example: influence of network congestion on video quality. A streaming video is transmitted through a congested network; packets of the video get lost due to buffer overflow. What is the influence of the packet loss on the video quality?
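The loss mechanism on this slide can be illustrated with a toy simulation (an illustrative sketch, not the model used in the presentation): packets arrive at a router with a finite FIFO buffer, Poisson arrivals and exponential service, and any arrival that finds the buffer full is dropped.

```python
import random

def packet_loss_ratio(arrival_rate, service_rate, buffer_size, n_packets, seed=0):
    """Simulate a single FIFO router with a finite buffer:
    Poisson packet arrivals, exponential service times, and arrivals
    that find the buffer full are dropped. Returns the loss fraction."""
    rng = random.Random(seed)
    t_arrival = rng.expovariate(arrival_rate)
    t_departure = float("inf")
    queue = 0          # packets in the buffer, including the one in service
    dropped = 0
    for _ in range(n_packets):
        # process all departures that happen before the next arrival
        while t_departure <= t_arrival:
            queue -= 1
            t_departure = (t_departure + rng.expovariate(service_rate)
                           if queue > 0 else float("inf"))
        if queue < buffer_size:
            if queue == 0:
                t_departure = t_arrival + rng.expovariate(service_rate)
            queue += 1
        else:
            dropped += 1   # buffer overflow: packet lost
        t_arrival += rng.expovariate(arrival_rate)
    return dropped / n_packets
```

Raising the offered load past the service capacity makes the loss ratio jump, which is exactly the congestion effect the example describes.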
  • Performance Evaluation Cycle: System under Evaluation [cycle diagram repeated]
  • System under Evaluation / Performance Measures. Network: packet loss, number of consecutive packets lost, number of admitted streams. Content: type of packet lost, synchronization. User perception. [Figure: video server → congested network (buffer overflow) → end user]
  • Performance Evaluation Cycle: System Model [cycle diagram repeated]
  • System Model. Congestion can be modelled by background traffic or by a variable service time in the router. [Figure: tagged video source and background traffic share a finite queue; packet loss on overflow]
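The tagged-source model above can be sketched in discrete time (a minimal sketch assuming Bernoulli arrivals per slot; the probabilities and the one-packet-per-slot server are illustrative assumptions, not the slide's model): the tagged video source and a background source feed the same finite buffer, and we measure the loss seen by the tagged stream alone.

```python
import random

def tagged_loss(p_tagged, p_background, buffer_size, n_slots, seed=1):
    """Per slot, the tagged source and the background source each emit a
    packet with the given probability; the router serves one packet per
    slot from a finite FIFO buffer and drops arrivals that find it full.
    Returns the loss ratio of the tagged stream."""
    rng = random.Random(seed)
    queue = 0
    sent = lost = 0
    for _ in range(n_slots):
        arrivals = []
        if rng.random() < p_tagged:
            arrivals.append("tagged")
            sent += 1
        if rng.random() < p_background:
            arrivals.append("background")
        rng.shuffle(arrivals)          # no priority between the two streams
        for pkt in arrivals:
            if queue < buffer_size:
                queue += 1
            elif pkt == "tagged":
                lost += 1              # only tagged losses are counted
        if queue > 0:
            queue -= 1                 # serve one packet per slot
    return lost / sent if sent else 0.0
```

Increasing the background load while keeping the tagged source fixed raises the tagged loss, which is how background traffic stands in for congestion in the model.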
  • Performance Evaluation Cycle: Evaluation Model and Determine Performance Measures [cycle diagram repeated]
  • Analytical Approach. [Figure: frame-size trace of the bond video source (I-, P- and B-frames in the GOP pattern I B B P B B P B B P B B) and a Markov-modulated source model with states h1 … hL; the superposition of X such sources feeding a finite buffer (B < ∞) yields an approximate packet loss probability]
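The slide's analytical model is a Markov-modulated source feeding a finite buffer. As a stand-in for that derivation, the classical M/M/1/K blocking probability shows the kind of closed-form loss approximation involved (this simpler formula is an assumption for illustration, not the model on the slide):

```python
def mm1k_loss(rho, K):
    """Steady-state probability that an arriving packet finds an M/M/1/K
    buffer full: P_loss = (1 - rho) * rho**K / (1 - rho**(K + 1)),
    with the limiting value 1 / (K + 1) at rho = 1."""
    if rho == 1.0:
        return 1.0 / (K + 1)
    return (1.0 - rho) * rho**K / (1.0 - rho**(K + 1))
```

As with the slide's approximation, the loss probability grows with the offered load rho and shrinks as the buffer K grows, which is what makes such a formula usable for dimensioning.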
  • Use of Results: Admission Control. [Plot: admission boundary for a buffer of 100 packets, number of asterix sources vs number of bond sources, theoretical results vs experimental results]
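An admission boundary like the one plotted can be derived by admitting sources until the predicted loss crosses a target. This sketch uses the M/M/1/K blocking formula as the loss predictor (an illustrative assumption; the presentation uses its own Markov-modulated approximation, and the function names here are made up):

```python
def mm1k_loss(rho, K):
    """M/M/1/K blocking probability (illustrative loss approximation)."""
    if rho == 1.0:
        return 1.0 / (K + 1)
    return (1.0 - rho) * rho**K / (1.0 - rho**(K + 1))

def max_admissible_sources(per_source_load, K, target_loss):
    """Largest number of identical sources whose aggregate load keeps the
    approximated loss probability at or below `target_loss`."""
    n = 0
    while True:
        rho = (n + 1) * per_source_load
        if rho >= 1.0 or mm1k_loss(rho, K) > target_loss:
            return n
        n += 1
```

Sweeping this over two source classes would trace out a two-dimensional admission boundary like the theoretical curve in the plot.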
  • Experimental Set-up and Results: see demonstration.
  • Performance Evaluation Cycle: Interpretation of Results [cycle diagram repeated]
  • Use performance evaluation results. Use the IBBT competences to suggest improvements. Examples: network groups: intelligent buffer management schemes (e.g. using thresholds), congestion control mechanisms (e.g. explicit congestion notification), efficient admission control algorithms; video groups: codecs using feedback, layered video coding schemes.
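One of the improvements listed, threshold-based buffer management, can be sketched as partial buffer sharing (an assumed scheme for illustration, not necessarily the IBBT one; the function name and dict-based packet format are made up): above a threshold, only high-priority packets, such as I-frames, are still admitted.

```python
def enqueue(queue, packet, buffer_size, threshold):
    """Partial buffer sharing: low-priority packets are dropped once the
    queue length reaches `threshold`; the hard limit `buffer_size`
    applies to all packets. Returns True if the packet was admitted."""
    if len(queue) >= buffer_size:
        return False                                   # buffer full: drop
    if len(queue) >= threshold and not packet.get("high_priority", False):
        return False                                   # over threshold: drop low priority
    queue.append(packet)
    return True
```

Reserving the space above the threshold for I-frame packets protects the frames whose loss hurts video quality most, which connects this mechanism back to the loss measures discussed earlier.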
  • Use of Performance Evaluation Competence [Figure: layered view, top to bottom: Application and Content; Network Protocols, Layer L2 – L4; Physical Layer]
  • Application and Content Layer. System: application (content distribution, web access, parallel file processing); measures: responsiveness, scalability, stability. System: video and audio content (analysis of different codecs, error resilience and concealment mechanisms, …); measures: coding efficiency, error resilience, error types and frequencies, synchronization. System: traffic analysis (video, (aggregated) applications, web traffic, mobility); measures: burstiness, peak bitrate, average bitrate.
  • Network Protocols Layer L2 – L4. System: protocol performance evaluation (scheduling mechanism, MAC protocol, routing protocol, auto-configuration protocol); measures: responsiveness, scalability, stability, functionality, throughput. System: protocol test suites (testing of HGW, modem/mux interaction). System: advanced and large scale experimental testbeds, fixed and wireless (protocols for mesh, ad hoc and wireless sensor networks); measures: scalability, robustness, reliability, throughput.
  • Physical Layer. System: signal propagation (WiMAX, Body Area Networks); measures: signal propagation characteristics.
  • Generic Test and Measurement Equipment. Use of advanced test equipment, e.g. Spirent-Smartbits, Spirent-Avalanche, Agilent-NX2, Opticom-Opera, Tracespan, Fluke-Optiview, …
  • Testbeds. Example: wireless testbed.