Using neural networks in active queue management
 

    Presentation Transcript

    • USING NEURAL NETWORKS IN ACTIVE QUEUE MANAGEMENT
      MARCIN WASOWSKI
    • QUEUE MANAGEMENT
    • QUEUE MANAGEMENT
      PASSIVE
      1) drop-head: when a packet arrives at a full queue, drop the packet at the front of the line
      2) random drop: when a packet arrives at a full queue, drop a randomly chosen packet (more complex)
      ACTIVE
      A technique in which packets are dropped before the router's queue is full (the two passive policies are sketched in code below)
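      A minimal sketch of the two passive policies above, assuming a fixed-capacity FIFO held in a Python deque; the function names and the capacity value are illustrative, not taken from the presentation.

      import random
      from collections import deque

      CAPACITY = 50  # illustrative queue limit, in packets

      def drop_head_enqueue(queue, packet):
          # Drop-head: if the queue is full, discard the packet at the
          # front of the line, then admit the new arrival at the tail.
          if len(queue) >= CAPACITY:
              queue.popleft()
          queue.append(packet)

      def random_drop_enqueue(queue, packet):
          # Random drop: if the queue is full, discard a randomly chosen
          # queued packet, then admit the new arrival at the tail.
          if len(queue) >= CAPACITY:
              del queue[random.randrange(len(queue))]
          queue.append(packet)

      q = deque()
      for pkt in range(200):          # 200 dummy arrivals
          drop_head_enqueue(q, pkt)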
    • ACTIVE QUEUE MANAGEMENT
      Active approach: early dropping when congestion arises
      – give sources enough time to react to congestion before queues fill up
      – do not keep queues full
      – drop packets selectively to avoid global synchronization
    • RED is an Active Queue Management scheme for Internet routers
      – tailored for TCP connections across IP routers
      RED design goals
      – congestion avoidance
      – global synchronization avoidance
      – avoidance of bias against bursty traffic
      – bound on average queue length to limit delay
      RED – Random Early Detection
    • RED – Random Early Detection
      avg(t) = (1 - w) * avg(t-1) + w * q(t)
      p(t) = Maxdrop * (avg(t) - Minth) / (Maxth - Minth)
      where avg(t) is the exponentially weighted moving average of the instantaneous queue length q(t), w is the averaging weight, and p(t) is the probability of dropping an arriving packet while avg(t) lies between Minth and Maxth.
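      A sketch of the two formulas in Python; the parameter values for w, Minth, Maxth and Maxdrop are illustrative defaults, not values given in the presentation.

      import random

      W = 0.002         # averaging weight w (illustrative)
      MIN_TH = 5.0      # Minth, in packets
      MAX_TH = 15.0     # Maxth, in packets
      MAX_DROP = 0.1    # Maxdrop

      def update_average(avg_prev, q_now, w=W):
          # avg(t) = (1 - w) * avg(t-1) + w * q(t)
          return (1.0 - w) * avg_prev + w * q_now

      def drop_probability(avg):
          # p(t) = Maxdrop * (avg(t) - Minth) / (Maxth - Minth),
          # with no drops below Minth and forced drops above Maxth.
          if avg < MIN_TH:
              return 0.0
          if avg >= MAX_TH:
              return 1.0
          return MAX_DROP * (avg - MIN_TH) / (MAX_TH - MIN_TH)

      def should_drop(avg):
          # On each arrival the router updates avg and drops the arriving
          # packet with probability p(t).
          return random.random() < drop_probability(avg)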
    • ARED (Adaptive RED)
      Minth and Maxth are adjusted dynamically
      The packet drop rate increases linearly from zero, when the average queue size is at the RED parameter Minth, to a drop rate of Maxdrop when the average queue size reaches Maxth.
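      The slide says only that Minth and Maxth change, not how; the adaptation rule below is purely illustrative (nudging both thresholds toward the recently observed average queue length) and is not taken from the presentation.

      def adapt_thresholds(min_th, max_th, avg, step=0.5):
          # Illustrative only: raise both thresholds while the average
          # queue sits above Maxth, lower them while it sits below Minth.
          if avg > max_th:
              min_th += step
              max_th += step
          elif avg < min_th and min_th > step:
              min_th -= step
              max_th -= step
          return min_th, max_th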
    • WRED
      Weighted Random Early Detection
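      WRED applies a separate RED profile to each traffic class, so lower-priority classes start dropping earlier and more aggressively. A minimal sketch, reusing the same drop-probability shape as above; the class names and the (Minth, Maxth, Maxdrop) triples are illustrative.

      # Illustrative per-class RED profiles: (Minth, Maxth, Maxdrop).
      PROFILES = {
          "best_effort": (5.0, 15.0, 0.20),   # starts dropping earliest
          "assured":     (10.0, 30.0, 0.10),
          "premium":     (20.0, 40.0, 0.05),  # most tolerant of buildup
      }

      def wred_drop_probability(avg, traffic_class):
          min_th, max_th, max_drop = PROFILES[traffic_class]
          if avg < min_th:
              return 0.0
          if avg >= max_th:
              return 1.0
          return max_drop * (avg - min_th) / (max_th - min_th)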
    • USING NEURAL NETWORKS
    • STEPS
      1) SIMULATE THE NETWORK IN THE NS-2 (NETWORK SIMULATOR 2) SOFTWARE
      2) TRAIN THE NEURAL NETWORKS
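      The presentation does not specify the network's inputs, outputs or architecture, so the sketch below is only one plausible reading of the two steps: synthetic (average queue length, drop probability) pairs stand in for the NS-2 trace data, and a one-hidden-layer network is trained with plain NumPy gradient descent to reproduce a RED-like drop curve. The feature choice, layer size and learning rate are all assumptions.

      import numpy as np

      rng = np.random.default_rng(0)

      # Step 1 stand-in: in the presentation the training pairs would come
      # from NS-2 traces; here they are synthesised from the RED curve so
      # the sketch runs on its own.
      MIN_TH, MAX_TH, MAX_DROP = 5.0, 15.0, 0.1
      avg_q = rng.uniform(0.0, 25.0, size=(2000, 1))
      target = np.where(avg_q < MIN_TH, 0.0,
                        np.where(avg_q >= MAX_TH, 1.0,
                                 MAX_DROP * (avg_q - MIN_TH) / (MAX_TH - MIN_TH)))

      # Step 2: train a tiny one-hidden-layer network by gradient descent.
      x = avg_q / 25.0                        # scale input to [0, 1]
      W1 = rng.normal(0, 0.5, (1, 16)); b1 = np.zeros(16)
      W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)
      lr = 0.1

      for epoch in range(2000):
          h = np.tanh(x @ W1 + b1)            # hidden layer
          y = h @ W2 + b2                     # predicted drop probability
          grad_y = 2 * (y - target) / len(x)  # d(MSE)/dy
          gW2 = h.T @ grad_y;  gb2 = grad_y.sum(axis=0)
          grad_h = (grad_y @ W2.T) * (1 - h ** 2)
          gW1 = x.T @ grad_h;  gb1 = grad_h.sum(axis=0)
          W1 -= lr * gW1;  b1 -= lr * gb1
          W2 -= lr * gW2;  b2 -= lr * gb2

      def nn_drop_probability(queue_avg):
          # Query the trained network for a drop probability.
          h = np.tanh(np.array([[queue_avg / 25.0]]) @ W1 + b1)
          return float(np.clip(h @ W2 + b2, 0.0, 1.0))

      print(nn_drop_probability(10.0))  # compare with the analytic RED value 0.05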
    • END