Multimedia Network
Audio and its digitization
Video and its digitization
Bandwidth requirements
Multimedia transmission requirements
Problems with Internet & solutions

Presentation Transcript

  • Republic of Yemen, Thamar University, Faculty of Computer Science & Information Systems. By Eng. Mohammed Hussein, Lecturer and Researcher at Thamar University (mohammedhbi@thuniv.net).
  • Outline Introduction Audio and its digitization Video and its digitization Bandwidth requirements Multimedia transmission requirements SAS Factors for Audio andVideo Network Performance parameters Multimedia over Internet &Wireless Problems with Internet & solutions2 Eng: Mohammed Hussein
  • IntroductionThe continuous media are played during somewell-defined time interval with user interaction.The most demanding are audio and videoAudio: what we hear through our earsVideo : what we see through our eyes3 Eng: Mohammed Hussein
  • Demand for Multimedia Communication (figure slide).
  • What is streaming? Streaming media refers to media types with time constraints and continuous video and audio transmission. Playback of streaming media starts while data is still being received, i.e. it is not necessary to download the entire file before playback starts. Advantage: the complete content does not have to be downloaded first, which avoids long start-up delays.
  • Example 1: a five-point quality scale (5 Excellent, 4 Good, 3 Fair, 2 Poor, 1 Bad) applied to pixelated video and distorted speech (figure slide).
  • Example 2: the same five-point quality scale applied to clear video and clear speech (figure slide).
  • MATLAB SIMULINK: To start the Simulink software, you must first start the MATLAB technical computing environment. You can then start Simulink in two ways: click the Simulink icon on the toolbar, or enter the simulink command at the MATLAB prompt.
  • How do we digitize an audio signal? Basic steps: 1. Conversion to electronic form using a microphone (analog signal). 2. Sampling the analog signal (PAM or PCM). 3. Quantization using an analog-to-digital converter. (A small sketch of these steps follows below.)
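    A minimal sketch of steps 2-3 (sampling and uniform quantization), assuming NumPy; the sampling rate, bit depth and test tone are illustrative values, not taken from the slides.

    import numpy as np

    fs = 8000          # sampling rate in Hz (assumed, telephone quality)
    bits = 8           # bits per sample (assumed)
    duration = 0.01    # seconds of signal to digitize

    # "Analog" source: a 440 Hz tone evaluated at the sample instants (PAM-style sampling)
    t = np.arange(0, duration, 1.0 / fs)
    analog = np.sin(2 * np.pi * 440 * t)

    # Uniform quantization to 2**bits levels over the range [-1, 1]
    levels = 2 ** bits
    step = 2.0 / levels
    quantized = np.round(analog / step) * step                        # reconstructed amplitudes
    codes = ((analog + 1.0) / step).astype(int).clip(0, levels - 1)   # integer PCM codes

    print(len(codes), "samples,", bits, "bits each")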
  • Types of Waveforms (Aperiodic Signals) (figure slide).
  • Why do we use these signals? From these basic signals we can build more complex signals: sinusoids, unit steps and exponentials are used to approximate more complicated waveforms. They also help us predict and analyse system behaviour, since we can apply them as inputs to systems such as high-pass, low-pass and band-pass filters and observe the output.
  • Digitizing an audio signal: analog signal, PAM signal (sampling), quantized signal (figure slide).
  • Pulse Code Modulation (PCM) (figure slide).
  • Discrete-time and continuous-time signals: A discrete-time signal is a quantity defined only on a discrete set of times. A simple source of a discrete-time signal is the sampling of a continuous signal, approximating the signal by a sequence of its values at particular time instants. A continuous-time signal is a real-valued (or complex-valued) function defined at every time t in an interval, most commonly an infinite interval.
  • Analog and digital signals: The figure shows a digital signal that results from approximating an analog signal by its values at particular time instants. Digital signals are discrete and quantized, while analog signals possess neither property.
  • Amplitude modulation (figure slide).
  • Carrier and signal (figure slide).
  • Bandwidth requirements: Audio (table slide; NC = number of channels, UC = uncompressed, C = compressed).
  • Video Best way to understand electronic video signal is understand howwe get a picture from a digital camera. Electronic sensors Camera Gain Calibration (CCD) convertdifferent levels of light to electronic signals of different amplitudes. It is useful to know this conversion for evaluating the performance ofthe CCD camera. The light is passed through a RGB filter20 Eng: Mohammed Hussein
  • Video: A video consists of a sequence of frames. The frames are displayed fast enough to give the impression of motion. To get a flicker-free display, the frames are repeated 50 times per second, because at a rate of 50 frames per second the human eye does not perceive the individual changes between frames (figure: Frame 1, Frame 2).
  • Digitizing video Each frame is divided in small grids, called pixels. For black-and-whiteTV, grey level of each pixel is represented by 8bits. In case of color, each pixel is represented by 24 bit, 8-bit for eachprimary colors (R G B ). Assuming that a frame is made of 640*480 pixels, the bandwidthrequirement is 2*25*640*480*24 =368.64 Mbps The 25 frames repeated 2 times with 24 bits color This is too high for transmission without compression through theinternet.22 Eng: Mohammed Hussein
  • Resolution How many Megapixel of your camera? Example of resolution a 3.1 Megapixel digital cameraprovides the following resolution options: The highest resolution 2048* 1536 2048* 1536 1600*1200 1024*768 640*48023 Eng: Mohammed Hussein
  • Video and audio converter (figure slide).
  • Bandwidth requirements: Video. Consider applications such as HDTV and TV; the CIF and QCIF formats are used for conferencing applications. For HDTV with 1920 pixels horizontally (HR), 1080 pixels vertically (VR), 24 bits per pixel (CR) and 60 frames per second (FR), the required rate is HR*VR*CR*FR = 1920*1080*24*60, roughly 2986 Mbps. Compression reduces this considerably. (The calculation is written out below.)
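    The two raw-bandwidth figures quoted above can be reproduced with a few lines of Python; this is only a restatement of the slide's arithmetic.

    def raw_video_bps(h_pixels, v_pixels, bits_per_pixel, frames_per_sec):
        """Uncompressed video bit rate in bits per second."""
        return h_pixels * v_pixels * bits_per_pixel * frames_per_sec

    sd = raw_video_bps(640, 480, 24, 2 * 25)   # 25 fps, each frame displayed twice
    hd = raw_video_bps(1920, 1080, 24, 60)     # HDTV case from the slide

    print(sd / 1e6, "Mbps")   # 368.64 Mbps
    print(hd / 1e6, "Mbps")   # about 2986 Mbps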
  • Multimedia transmission requirements. Qualitative requirements are: 1. Response of the human ear (20 Hz to 20 kHz; more sensitive to changes in signal level than to absolute values; some animals can hear a wider range than humans). 2. Response of the human eye (an image is retained for a few msec before decaying). 3. Tolerance to error (uncompressed signals tolerate a higher error rate; network errors are more damaging once the signal is compressed). 4. Tolerance to delay and to variation in delay (small for live applications). 5. Lip synchronization (the most critical aspect), to keep audio and video aligned.
  • Difference from classic applications: Multimedia traffic is highly delay-sensitive (packets are useless if they arrive too late) but loss-tolerant for the most part (packet loss can be concealed).
  • Performance Parameters Synchronization Accuracy Specification (SAS) factors used tospecify the goodness of sync which are:1. Delay:Acceptable time gap between transmission andreception2. Delay jitter: the variation between the desired presentationtimes and actual presentation times of streamed multimediaobjects.3. Delay skew:Average difference between the desired andactual presentation times.4. Error rate: Level of error is specified by bit-error rate (BER)28 Eng: Mohammed Hussein
  • SAS Factors for Audio Delay : for conversation,one-way delay should be in 100-500msec range, which requires echo cancellation. Delay Jitter: 10 times better than delay Lip synchronization : between audio and video should bebetter than 80 ms BER: required less than 0.01 for telephones. Less than 0.001 for uncompressed CD quality audio less than 0.0001 for compressed CD quality audio29 Eng: Mohammed Hussein
  • SAS Factors for Video Delay and jitterRequired less than 50 ms for HDTVLess than 100 ms for broadcast qualityTVLess than 500 ms for video conference Error rateRequired less than 0.000001 for HDTVLess than 0.00001 for broadcastTVLess than 0.0001 for video conference30 Eng: Mohammed Hussein
  • Traffic characterization parameters. Bit-rate variability falls into two classes: 1. Constant bit rate (CBR) applications: for example, uncompressed digitized voice transmission requires CBR. 2. Variable bit rate (VBR) applications: for example, video transmission using compression. Most multimedia applications generate VBR traffic.
  • Network performance parameters (NPPs): 1. Throughput: the effective rate of transmission of information bits. 2. Delay: the time required for a bit to traverse the network end-to-end (250 msec max). 3. Delay variance: the variation of delay as packets traverse the network (10 msec max). 4. Error rate: specified by the bit-error rate (BER), packet error rate (PER), packet loss rate (PLR) and cell loss rate (CLR). The values of these NPPs, compared against the bandwidth and SAS factors of an application, determine the QoS a network can offer (a toy comparison is sketched below).
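    A toy comparison of measured NPPs against an application's SAS limits, using the video-conferencing numbers quoted earlier; the "measured" values are invented purely for illustration.

    # SAS limits for video conferencing, taken from the earlier slides
    requirements = {"delay_ms": 500, "jitter_ms": 500, "ber": 1e-4}

    # Hypothetical measured network performance parameters
    measured = {"delay_ms": 180, "jitter_ms": 25, "ber": 5e-5}

    ok = all(measured[k] <= requirements[k] for k in requirements)
    print("QoS acceptable for video conferencing:", ok)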
  • Multimedia over wireless and the Internet: The Internet Protocol (IP)-based Internet and wireless systems are converging into a ubiquitous all-IP information transport infrastructure. This will allow mobile and stationary users to access the wireless Internet for multimedia services anywhere, anytime.
  • Multimedia over LAN The packet may suffer of a larger number of collisions. So someLAN type are not suitable for multimedia traffic.34 Eng: Mohammed Hussein
  • Why multimedia over wireless? WLAN technologies are increasingly used for multimedia transmission. IP multimedia is partly real-time, which needs low delay. There will be multiple simultaneous services with different QoS requirements. All of these should be supported in a cost-efficient manner by using network resources efficiently.
  • QoS in broadband wireless networks: New services are based on multimedia applications such as VoIP, video conferencing and video on demand (VoD). They demand strict network guarantees, such as reserved bandwidth or bounded delays, in a Broadband Wireless Network (BWN). The challenge for a BWN is to provide QoS simultaneously to services with different characteristics.
  • Wireless Vs. Wired Multimedia supported in wireless networks is muchmore difficult than wired networks. Wireless links are moving and unpredictable. Wired links are fixed position.-That is why the QoS in wireless networks is managed ata MAC layer (medium access control)37 Eng: Mohammed Hussein
  • Multimedia over cellular networks: Voice over IP (figure slide).
  • Multimedia over Internet The internet was not designed to carry multimediatraffic. The existing protocols are:1. TCP: Includes connection establishment, errorcontrol, flow control and hence unsuitable for real-time multimedia traffic.2. UDP: connectionless protocol at the transport layer,can deliver real-time data if error can be tolerated, aswith uncompressed audio and video.39 Eng: Mohammed Hussein
  • Internet for multimedia traffic: To make the Internet support multimedia, some enhancements are needed. Multicasting is used in audio and video conferencing (one-to-many); the original IP is a best-effort, unicast approach. IP multicast, an extension to the original IP protocol, supports dynamic and distributed group membership, membership in multiple groups, and multiple send/receive nodes (a receiver sketch follows below). The Multicast Backbone (Mbone) is a real-world implementation of IP multicast.
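    A small receiver sketch that joins an IP multicast group using the standard socket API; the group address and port below are arbitrary examples, not values from the slides.

    import socket
    import struct

    GROUP = "239.1.1.1"   # example administratively scoped multicast group
    PORT = 5004           # example port

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))

    # Ask the kernel to join the group on the default interface
    mreq = struct.pack("4s4s", socket.inet_aton(GROUP), socket.inet_aton("0.0.0.0"))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

    data, sender = sock.recvfrom(2048)   # blocks until a multicast datagram arrives
    print(len(data), "bytes from", sender)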
  • Multicast Backbone It can be considered as internet radio orTV. User call up for movie and Mbone net view uncompressedmovies. It implement virtual overlay network on top of internet It consists of multicast islands connected by tunnels41 Eng: Mohammed HusseinMbonetunnel
  • Problems with the Internet & solutions: The TCP/UDP/IP suite provides best-effort service, with no guarantees on the expectation or variance of packet delay. Performance deteriorates if links are congested (e.g. transoceanic links). Most router implementations use only first-come-first-served (FCFS) packet processing and transmission scheduling.
  • Problems and solutions Limited bandwidth Solution: Compression Packet Jitter Solution: Fixed/adaptive playout delay for Audio (example:phone over IP) Packet loss Solution: FEC, Interleaving43 Eng: Mohammed Hussein
  • Forward Error Correction (FEC): FEC is a technique for controlling errors in data transmission over unreliable or noisy communication channels. The central idea is that the sender encodes its message in a redundant way using an error-correcting code (ECC). Packet-level FEC can be used for video streaming over a wireless network. Packet-level interleaving combined with FEC is a remedy for time-correlated error bursts, though it can further increase delay.
  • Problem: Limited bandwidth. Intro: digitization. Audio: x samples are taken every second (x = sampling frequency); the value of each sample is rounded to one of a finite number of values (for example 256), which is called quantization. Video: each pixel has a color, and each color has a numeric value.
  • Problem: Limited bandwidth; the need for compression. Audio: CD quality is 44100 samples per second with 16 bits per sample in stereo, so 44100*16*2 = 1.411 Mbps; a 3-minute song needs 1.411 * 180 = 254 Mb, i.e. 254/8 = 31.75 MB. Video: a 320*240 image with 24-bit color needs 320*240*24 = 1843200 bits, i.e. 1843200/8000 ≈ 230 KB per image; at 15 frames/sec that is 15*230 KB ≈ 3.456 MB per second; 3 minutes of video is 3.456*180 ≈ 622 MB. (The arithmetic is repeated below.)
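    The storage arithmetic above, written out in Python; this only restates the slide's numbers.

    # Audio: CD quality, stereo
    audio_bps = 44100 * 16 * 2                 # 1,411,200 bits/s (~1.411 Mbps)
    song_mb = audio_bps * 180 / 8 / 1e6        # 3-minute song, in megabytes

    # Video: 320x240, 24-bit colour, 15 frames/s
    frame_bits = 320 * 240 * 24                # 1,843,200 bits (~230 KB per image)
    video_mb_per_s = frame_bits * 15 / 8 / 1e6
    clip_mb = video_mb_per_s * 180             # 3 minutes of video

    print(round(song_mb, 2), "MB of audio")    # about 31.75 MB
    print(round(clip_mb), "MB of video")       # about 622 MB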
  • Audio compression Several techniques GSM (13 kbps), G.729(8 kbps), G723.3(6.4 and 5.3kbps) MPEG 1 layer 3 (also known as MP3) Typical compress rates 96kbps, 128kbps, 160kbps Very little sound degradation If file is broken up, each piece is still playable Complex (psychoacoustic masking, redundancy reduction, and bitreservoir buffering) 3-minute song (128kbps) : 2.8MB47 Eng: Mohammed Hussein
  • Discrete Cosine Transform (DCT): The DCT brings blocks of 8x8 pixels into the frequency domain; low frequencies correspond to the smoother parts of the image. Some data is more important than the rest: because of properties of the human eye, it is not necessary to give full color information for every pixel, so the image size can be reduced by keeping only the important data. If the image does not change too rapidly in the vertical direction, the DCT can be applied to each collection of eight values. (Human vision notices differences in low frequencies more than in high frequencies.)
  • Image compression: JPEG Divide digitized image in 8x8 pixel blocks Pixel blocks are transformed into frequency blocks using DCT(Discrete CosineTransform).This is similar to FFT (FastFourierTransform). The quantization phase limits the precision of the frequencycoefficient. The encoding phase packs this information in a dense fashion49 Eng: Mohammed HusseinDCT
  • Square waves to sine waves: a square wave approximated by 5 different frequency coefficients, from low frequency to high frequency (figure slide).
  • Square waves to sine waves (cont.): If we remove a number of the high-frequency components, we reduce the amount of data (compression).
  • DCT with quantization & run-length encoding (figure slide).
  • JPEG compression (figure slide).
  • Video compression Popular techniques MPEG 1 for CD-ROM quality video (1.5Mbps) MPEG 2 for high quality DVD video (3-6 Mbps) MPEG 4 for object-oriented video compression54 Eng: Mohammed Hussein
  • Video Compression: MPEG MPEG uses inter-frame encoding Exploits the similarity between consecutive frames Three frame types I frame: independent encoding of the frame (JPEG) :Coded without reference toother frames. P frame: encodes difference relative to I-frame (predicted) : Coded with reference toa previous reference frame (either I or P). B frame: encodes difference relative to interpolated frame: Coded with reference toboth previous and future reference frames (either I or P). Note that frames will have different sizes Complex encoding, e.g. motion of pixel blocks, scene changes, … Decoding is easier then encoding MPEG often uses fixed-rate encodingI PB B BB B BP55Eng: Mohammed HusseinTransmit OrderIBBPBBPBB
  • MPEG compression (cont.) (figure slide).
  • MPEG system streams: MPEG video and audio streams are combined into a single synchronized stream. It consists of a hierarchy with metadata at every level describing the data: the system level contains synchronization information, the video level is organized as a stream of groups of pictures, and pictures are organized in slices. A motion vector is not valid for a whole frame; instead the frame is divided into macroblocks of 16x16 pixels.
  • MPEG system streams (cont.) (figure slides).
  • Problem: Packet Jitter Jitter:Variation in delayExample135 4 3 2SenderNo jitter125 466ReceiverJitterpkt 6pkt 560 Eng: Mohammed Hussein
  • Dealing with packet jitter How does Phone over IP applications limit the effect of jitter? A sequence number is added to each packet A timestamp is added to each packet Playout is delayed Fixed playout delay61 Eng: Mohammed Hussein
  • Dealing with packet jitter: adaptive playout delay. The objective is to choose playout times so that the playout delay tracks the network delay performance as it varies during a transfer. The following running estimates are used (with u = 0.01, for example):
    di = (1 - u) * di-1 + u * (ri - ti)
    vi = (1 - u) * vi-1 + u * |ri - ti - di|
    where ti is the timestamp of the i-th packet (the time packet i is sent), ri is the time packet i is received, pi is the time packet i is played, di is an estimate of the average network delay, and vi is an estimate of the average deviation of the delay from that estimated average. (A sketch follows below.)
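    The two running estimates can be coded directly; a sketch that sets the playout time to d + 4*v beyond the send time, which is a common rule of thumb rather than something stated on the slide. The sample send/receive times are made up.

    class PlayoutEstimator:
        """Exponentially weighted estimates of network delay and its deviation."""

        def __init__(self, u=0.01):
            self.u = u
            self.d = 0.0   # average delay estimate (di)
            self.v = 0.0   # average deviation estimate (vi)

        def update(self, t_sent, t_received):
            delay = t_received - t_sent
            self.d = (1 - self.u) * self.d + self.u * delay
            self.v = (1 - self.u) * self.v + self.u * abs(delay - self.d)

        def playout_time(self, t_sent, k=4):
            # Schedule playout k deviations beyond the estimated delay (assumed rule)
            return t_sent + self.d + k * self.v

    est = PlayoutEstimator()
    for t, r in [(0.0, 0.12), (0.02, 0.15), (0.04, 0.13)]:   # hypothetical send/receive times
        est.update(t, r)
    print(round(est.playout_time(0.06), 3))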
  • Problem: Packet loss. Loss is meant in a broader sense: a packet never arrives, or it arrives later than its scheduled playout time. Since retransmission is inappropriate for real-time applications, FEC or interleaving is used to reduce the impact of loss.
  • Recovering from packet loss: Forward Error Correction. Send one redundant encoded chunk after every n chunks (the XOR of the original n chunks). If one packet in the group is lost, it can be reconstructed; if more than one packet is lost, the group cannot be recovered. Disadvantages: the smaller the group size, the larger the overhead, and the playout delay is increased. (A small sketch follows below.)
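    A sketch of the XOR parity scheme described above: one redundant chunk per group of n, able to rebuild a single missing chunk. The chunk contents are made up.

    from functools import reduce

    def xor_bytes(chunks):
        """Bytewise XOR of equally sized byte strings."""
        return bytes(reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), chunks))

    group = [b"AAAA", b"BBBB", b"CCCC"]      # n = 3 media chunks (illustrative)
    parity = xor_bytes(group)                # redundant chunk sent after the group

    received = [b"AAAA", None, b"CCCC"]      # chunk 1 lost in transit
    missing = received.index(None)
    recovered = xor_bytes([c for c in received if c is not None] + [parity])

    assert recovered == group[missing]       # a single loss is reconstructed
    print(recovered)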
  • Recovering from packet loss: piggybacking a low-fidelity stream. With one redundant low-quality chunk piggybacked per chunk, the scheme can recover from single packet losses.
  • Recovering from packet loss: interleaving. Divide 20 msec of audio data into smaller units of 5 msec each and interleave them before transmission. Upon a loss, the receiver has a set of partially filled chunks rather than one completely missing chunk, which makes it easier to conceal the lost packets. (See the sketch below.)
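    A small sketch of interleaving 5-msec units across packets so that losing one packet damages every 20-msec chunk only partially; the unit labels are illustrative.

    def interleave(units, depth):
        """Spread consecutive units across `depth` packets (simple block interleaver)."""
        return [units[i::depth] for i in range(depth)]

    def deinterleave(packets, total):
        out = [None] * total
        for offset, pkt in enumerate(packets):
            if pkt is None:            # whole packet lost
                continue
            for j, unit in enumerate(pkt):
                out[offset + j * len(packets)] = unit
        return out

    # Sixteen 5-msec units = four 20-msec chunks, interleaved over 4 packets
    units = [f"u{i}" for i in range(16)]
    packets = interleave(units, 4)
    packets[1] = None                  # one packet lost in the network

    print(deinterleave(packets, 16))   # each chunk is missing only one unit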
  • Recovering from packet loss: receiver-based repair. The simplest form is packet repetition, which replaces lost packets with copies of the packets that arrived immediately before the loss. A more computationally intensive form is interpolation, which uses the audio before and after the loss to interpolate a suitable packet to cover the loss.
  • Internet for Multimedia Traffic Another enhancements required. UDP is more suitable thanTCP for interactive traffic. But, it requires theservices of RTP. Real-timeTransport protocol (RTP): Application layer protocol designed to handle real time traffic RTP provides and-to-end transport services to real-time audio, videoand simulation data. So RTP used for data communication and RCTPused for control signal communication. However, It does not provide QoS guarantees. RTP encapsulation isonly seen at the end systems: it is not seen by intermediate routers.68 Eng: Mohammed Hussein
  • RTP runs on top of UDP: RTP does not provide any mechanism to ensure timely delivery of data or other quality-of-service guarantees. RTP encapsulation is only seen at the end systems; it is not seen by intermediate routers, and routers providing best-effort service make no special effort to ensure that RTP packets arrive at the destination in a timely manner. RTP extends UDP: it uses an even-numbered UDP port, and the next port number is used by its companion protocol, the Real-Time Transport Control Protocol (RTCP). On top of UDP's port numbers and IP addresses, RTP adds payload type identification, packet sequence numbering and time-stamping.
  • RTP packet format PayloadType: 7 bits, providing 128 possible different typesof encoding; eg PCM, MPEG2 video, etc. Sequence Number: 16 bits; used to detect packet loss Timestamp: 32 bytes; gives the sampling instant of the firstaudio/video byte in the packet; used to remove jitterintroduced by the network Synchronization Source identifier (SSRC): 32 bits; an idfor the source of a stream; assigned randomly by the source70 Eng: Mohammed Hussein
  • Timestamp vs. sequence number: Timestamps relate packets to real time; the timestamp value is sampled from a media-specific clock. The sequence number relates packets to other packets. Audio silence example: what do you want to send during silence? Nothing. Why might this cause problems? The other side needs to distinguish between loss and silence; the receiver uses the timestamps and sequence numbers to figure out what happened.
  • RTP Control Protocol (RTCP): Used in conjunction with RTP to exchange control information between the sender and the receiver. Three report types are defined: receiver reception, sender, and source description. Reports contain statistics such as the number of packets sent, the number of packets lost and the inter-arrival jitter. RTCP bandwidth is typically limited to 5% of the session bandwidth, with approximately one sender report for every three receiver reports.
  • Feedback during media streaming: Feedback can be used to control performance; the sender may modify its transmissions based on the feedback it receives.
  • Streaming Stored Multimedia Example Audio/Video file is segmented and sent over eitherTCP orUDP, public segmentation protocol: Real-Time Protocol(RTP) User interactive control is provided, e.g. the public protocolRealTime Streaming Protocol (RTSP)74 Eng: Mohammed Hussein
  • Streaming Stored Multimedia Example Helper Application: displays content, which is typicallyrequested via aWeb browser; e.g. RealPlayer; typical functions: Decompression Jitter removal Error correction: use redundant packets to be used for reconstructionof original stream GUI for user control75 Eng: Mohammed Hussein
  • Streaming from a Web server (cont.): Alternative: set up a connection between the server and the player, then download. The Web browser requests and receives a meta file (a file describing the object) instead of receiving the media file itself; the browser launches the appropriate player and passes it the meta file; the player sets up a TCP connection with a streaming server and downloads the file.
  • Options when using a streaming server: Use UDP, and the server sends at a rate (compression and transmission) appropriate for the client; to reduce jitter, the player buffers initially for 2-5 seconds, then starts the display. Or use TCP, and the sender sends at the maximum possible rate under TCP, retransmitting when an error is encountered; the player uses a much larger buffer to smooth the delivery rate of TCP.
  • Real-Time Streaming Protocol (RTSP): Lets the user control the display: rewind, fast forward, pause, resume, etc. It is an out-of-band protocol: it uses two connections, one for control messages (port 554) and one for the media stream. RFC 2326 permits the use of either TCP or UDP for the control-message connection, sometimes called the RTSP channel. As before, the meta file is communicated to the Web browser, which then launches the player; the player sets up an RTSP connection for control messages in addition to the connection for the streaming media.
  • Meta file example:
    <title>Twister</title>
    <session>
      <group language=en lipsync>
        <switch>
          <track type=audio e="PCMU/8000/1"
                 src="rtsp://audio.example.com/twister/audio.en/lofi">
          <track type=audio e="DVI4/16000/2" pt="90 DVI4/8000/1"
                 src="rtsp://audio.example.com/twister/audio.en/hifi">
        </switch>
        <track type="video/jpeg"
               src="rtsp://video.example.com/twister/video">
      </group>
    </session>
  • RTSP operations:
    C: SETUP rtsp://audio.example.com/twister/audio RTSP/1.0
       Transport: rtp/udp; compression; port=3056; mode=PLAY
    S: RTSP/1.0 200 1 OK
       Session 4231
    C: PLAY rtsp://audio.example.com/twister/audio.en/lofi RTSP/1.0
       Session: 4231
       Range: npt=0-
    S: RTSP/1.0 200 1 OK
       Session 4231
    C: PAUSE rtsp://audio.example.com/twister/audio.en/lofi RTSP/1.0
       Session: 4231
       Range: npt=37
    S: RTSP/1.0 200 1 OK
       Session 4231
    C: TEARDOWN rtsp://audio.example.com/twister/audio.en/lofi RTSP/1.0
       Session: 4231
    S: RTSP/1.0 200 1 OK
       Session 4231
  • References Streaming in Mobile Networks Jun-Zhao Sun, Douglas Howie,Antti Koivisto and Jaakko Sauvola,“AHierarchical Framework Model of Mobile Security,” MediaTeam Oulu, infotechOulu, Finland, 2001, pp.A56-A59 Claudio Cicconetti, Luciano Lenzini and Enzo Mingozzi,“Quality of ServiceSupport inWireless Networks,” University of Pisa, Nokia Research Center,Italy,April 2006, pp. 50-55 http://www.ece.umd.edu/~minwu/public_paper/Jnl/0603MuserVideo_IEEEfinal_NetMag.pdf http://research.microsoft.com/china/papers/Scalable_Video_Coding_Transport.pdf http://islab.cis.nctu.edu.tw/seminar/doc/1.pdf http://www.tml.tkk.fi/Opinnot/T-110.5120/2005/slides/06a.Streaming%20Multimedia%20Architecture,%20Codecs%20and%20QoS2.pdf http://www.hindawi.com/journals/am/2009/982867/81 Eng: Mohammed Hussein