Speaker notes
  • Mapping: divides the audio input into 32 subbands (frequency samples). Psychoacoustic model: calculates, for each subband, the masking threshold below which noise is imperceptible to the human ear (the Mapping and Psychoacoustic blocks can run independently). Bit allocation: bits are assigned so that the total noise-to-mask ratio is minimized over all channels and subbands. Quantizer & coding: samples are scaled and quantized according to the bit allocation. Frame packing: the header includes the bit allocation and scaling information (scale factors).
  • MPEG-2 extends the powerful video compression capabilities of the MPEG-1 system to provide digital compression for audio and video signals. MPEG-2 can code standard-definition television at bit rates of approximately 3-15 Mbit/s and high-definition television at 15-30 Mbit/s.
  • Part 1 is very similar to Part 1 of MPEG-1. This important function allows the streams to be combined into one, so that the data are well suited to digital storage or transmission. It is specified in two forms: the Program Stream and the Transport Stream. The Program Stream is similar to the MPEG-1 Systems multiplex; it results from combining one or more Packetised Elementary Streams (PES) that share a common time base into a single stream. The Transport Stream combines one or more PES with one or more independent time bases into a single stream. Part 2 of MPEG-2 builds on the powerful video compression capabilities of the MPEG-1 standard to offer a wide range of coding tools. Part 6, Digital Storage Media Command and Control (DSM-CC), is the specification of a set of protocols providing the control functions and operations specific to managing MPEG-1 and MPEG-2 bitstreams; these protocols may be used to support applications in both stand-alone and heterogeneous network environments. Part 7 is not constrained to be backwards compatible with MPEG-1 Audio. Part 8 of MPEG-2 was originally planned to cover coding of video with 10-bit input samples; work on this part was discontinued when it became apparent that there was insufficient industry interest in such a standard. Part 9 of MPEG-2 is the specification of the Real-time Interface (RTI) to Transport Stream decoders, which may be utilised for adaptation to all appropriate networks carrying Transport Streams.
  • Every macroblock contains four luminance blocks and two chrominance blocks, and every block has a dimension of 8x8 values. The luminance blocks carry the brightness of every pixel in the macroblock; the chrominance blocks carry the color information. Because of properties of the human eye, it isn't necessary to give color information for every pixel: instead, four pixels share one color value (a small subsampling sketch follows these notes).
  • An MPEG file consists of compressed video data, called the video stream. The basic unit of the video stream is the Group of Pictures (GOP), made up of three picture types, also called frames: I, P, and B. I-frames can be reconstructed without any reference to other frames; they contain information only about themselves and occur, on average, once every ten to fifteen frames of a motion picture. P-frames can only be recreated by reference to the previous I-frame or P-frame; it is impossible to construct them without data from another frame. B-frames are referred to as bi-directional frames because they are recreated from forward and backward predictions based on the nearest preceding and following I- or P-frame.
  • Video compression is based on eliminating repeated information, making a file smaller without affecting its quality. Spatial redundancy occurs because parts of the picture (pixels) are often replicated, with minor changes, within a single frame of video. Temporal redundancy arises when consecutive frames of video display images of the same scene; it is common for the content of the scene to remain fixed, or to change only slightly, between successive frames.
  • MPEG compression is accomplished by four basic techniques: pre-processing, temporal prediction, motion compensation, and quantization. Pre-processing filters out non-essential visual information from the video signal: information that is difficult to encode but is not an important component of human visual perception.
  • Motion estimation, which exploits temporal redundancy between frames, is usually the most time-consuming part of MPEG encoding. The Discrete Cosine Transform (DCT) represents the original data as a linear sum of basis cosine functions with different frequencies. Each image delivers one brightness and two color signals per pixel; the DCT converts these signals into frequency coefficients containing the color and brightness information, which can then be compressed more easily.
  • After converting the input block of 8x8 pixels into an 8x8 block of DCT coefficients, quantization is applied to the DCT block. The lower-frequency region is quantized least (finest step sizes) and the higher-frequency region most (coarsest step sizes). Since quantization reduces the accuracy of the DCT coefficients, the degree and shape of quantization are designed per picture type (intra or non-intra) so as not to cause visible damage to the reconstructed video. Quantization uses these operations to ensure that image parts important to the human eye are represented precisely, while irrelevant information is represented with less precision.
  • Emergence: MPEG-4 was first introduced in 1998 and takes advantage of nearly a decade of improvements to MPEG compression algorithms. It is a graphics and video compression standard based on MPEG-1, MPEG-2, and Apple QuickTime technology. Advantages: the main difference of the new MPEG-4 standard with respect to MPEG-1 and MPEG-2, in terms of requirements and functionalities, is that it goes beyond making the storage and transmission of digital audiovisual material more efficient through compression. It specifies a description of digital audiovisual scenes in the form of objects that have certain relations in space and time. MPEG-1 and MPEG-2 were both designed for the compression of linear content; the MPEG-4 Systems layer introduces object orientation, which allows for the secure delivery of interactive multimedia.
  • Part 1, Systems: management and protection of intellectual property (e.g., royalty rates for each encode and decode). Part 2, Visual: natural = pictures from a camera, camcorder, or teleconferencing (involving real people); synthetic = drawings of 2-D/3-D objects, cartoon characters. Part 3, Audio: natural = voice or the sound of something moving; synthetic = music. Part 4, Conformance Testing: tests MPEG-4 implementations. Part 5, Reference Software: implements products and tools that can be used according to ISO (the International Organization for Standardization). Part 6, DMIF: deals with delivery technologies and defines session protocols (in the application layer) of a network to manage multimedia streaming. This describes MPEG-4 Version 1; Version 2 primarily consists of backward-compatible extensions to the Visual, Audio, Systems, and DMIF parts.
  • Object-oriented: coding audiovisual objects allows MPEG-4 to combine 2-D and 3-D graphics, animation, and interactivity with audio and video, along with other powerful tools such as content security, targeting, and personalization. Low data rate: MPEG-4 video is optimized for low (<64 kbps), intermediate (64-384 kbps), and high (384 kbps-4 Mbps) bit rates. Interoperability: the MPEG-4 standard consists of a collection of tools that support application areas, and the way MPEG-4 codes objects creates conformance points that provide the basis for interoperability. MPEG-4 does not target a single "killer application"; instead it opens up new ways of creating, reusing, accessing, and consuming audiovisual scenes, so that working with audiovisual content becomes easier. Companies such as Apple, Cisco, Kasenna, Philips, and Sun Microsystems, among others, are joining forces to increase the deployment of interoperable solutions.
  • Here is an example of the MPEG-4 object-based coding architecture. The primitive visual and audio objects are encoded along with the composition information, which consists of the parameters the author has defined for the scene and is encoded as the Binary Format for Scenes. These streams are multiplexed and synchronized with the individual coded audiovisual objects of the scene, pass through the network or storage device, and are then demultiplexed and decoded. The coded audiovisual objects are decoded and sent to compose the scene, and uncoded audiovisual objects are sent to the compositor as well. This allows streamed data to be applied to media objects in order to modify their attributes, and this coding composes the scene.
  • Here is an example of an MPEG-4 scene. MPEG-4 provides a standardized way to describe a scene: objects are positioned in space and time, meaning media objects can be placed anywhere in a given coordinate system. The sprite and the voice are separated to form primitive objects. Primitive media objects are grouped to form compound media objects. Streamed data is then applied to media objects in order to modify their attributes (e.g., a sound, a moving texture belonging to an object, or animation parameters driving a synthetic face). This is where the user can change viewing and listening points anywhere in the scene.
  • Digital TV: MPEG-4 allows added text, pictures, audio, or graphics to be controlled by the user, so that entertainment value can be added to certain programs, or valuable information unrelated to the current program can be provided to interested viewers. Examples of increased functionality are TV station logos, customized advertising, and multi-window screen formats allowing display of sports statistics or stock quotes using datacasting. Mobile multimedia: this addresses the enormous popularity of cell phones and palm computers; MPEG-4 copes with the narrow bandwidth and limited computational capacity of mobile devices by improving error resilience, coding efficiency, and flexibility of resources. TV production: since MPEG-4 codes audiovisual objects instead of rectangular linear video frames, it allows higher-quality and more flexible scenes; for example, local TV stations could inject regional advertisement video objects better suited to the targeted viewers when international programs are broadcast. Games: here the main focus is user interaction; MPEG-4 allows video objects in games to be even more realistic, and a creator can personalize games by linking personal video databases into the games in real time. Streaming video: streaming video over the Internet is becoming very popular, for example news updates and live music shows; bandwidth is limited by the use of modems, and transmission reliability is an issue when packet loss occurs. MPEG-4 improves bitstream scalability in terms of the temporal and spatial resolution that Larissa mentioned previously.
  • Multimedia content such as audio, video, or multimedia files is encoded, secured, transmitted, and viewed. The Digital Items can be considered the "what" of the Multimedia Framework (e.g., a video collection or a music album), and the Users can be considered the "who". MPEG-21 has been tabbed as the standard for the 21st century.
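
To make the chroma handling in the macroblock note above concrete, here is a minimal Python/NumPy sketch of 4:2:0 subsampling, in which each 2x2 block of chroma samples is averaged so that four pixels share one color value; the function name and the use of simple averaging are illustrative assumptions, not taken from the MPEG specification.

```python
import numpy as np

def subsample_chroma_420(cb, cr):
    """Average each 2x2 block of chroma samples so four pixels share one color value."""
    def pool(c):
        h, w = c.shape
        # trim to even dimensions, then average non-overlapping 2x2 blocks
        return c[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    return pool(cb), pool(cr)

# A 16x16 macroblock keeps four 8x8 luminance blocks at full resolution,
# while its chroma is reduced to one 8x8 Cb block and one 8x8 Cr block.
cb = np.random.rand(16, 16)
cr = np.random.rand(16, 16)
small_cb, small_cr = subsample_chroma_420(cb, cr)   # each is now 8x8
```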
  • Slides

    1. Analysis of MPEG
       Presented by: Gray Consultants
    2. Gray Consultants
       In the order of appearance:
       • Kristie Hoang
       • Loc Trieu
       • Larissa Bachinskaya
       • Tina Nguyen
       • Amy Quach
    3. MPEG: the Organization
       • Moving Picture Experts Group
       • Established in 1988
       • Standards issued under the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC)
       • Official name: ISO/IEC JTC1 SC29 WG11
    4. MPEG vs. Competitors
       • Generally produces better quality than other formats such as:
         - Video for Windows
         - Indeo and QuickTime
       • MPEG audio/video compression can be used in many applications:
         - DVD players
         - HDTV recorders
         - Internet video
         - Video conferencing
         - Others
    5. MPEG Overview
       • MPEG-1: a standard for storage and retrieval of moving pictures and audio on storage media
       • MPEG-2: a standard for digital television
       • MPEG-4: a standard for multimedia applications
       • MPEG-7: a content representation standard for information search
       • MPEG-21: offers metadata information for audio and video files
    6. MPEG-1
       • First standard to be published by the MPEG organization (in 1992)
       • A standard for storage and retrieval of moving pictures and audio on storage media
       • Example formats: VideoCD (VCD), MP3, MP2
    7. 5 Parts of MPEG-1
       • Part 1: Combining video and audio inputs into a single/multiple data stream
       • Part 2: Video compression
       • Part 3: Audio compression
       • Part 4: Requirements verification
       • Part 5: Technical report on the software implementation of Parts 1-3
    8. Basic Structure of an Audio Encoder
       Note: a decoder basically works in just the opposite manner
    9. Processes of an Audio Encoder
       • Mapping block: divides the audio input into 32 equal-width frequency subbands (samples)
       • Psychoacoustic block: calculates a masking threshold for each subband
       • Bit-allocation block: allocates bits using the outputs of the Mapping and Psychoacoustic blocks
       • Quantizer & coding block: scales and quantizes (reduces) the samples
       • Frame-packing block: formats the samples, with headers, into an encoded stream
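
As a rough illustration of how the Mapping, Psychoacoustic, and Bit-allocation blocks fit together, here is a conceptual Python sketch; it uses a simple FFT band split and a placeholder masking threshold instead of the real MPEG-1 polyphase filterbank and psychoacoustic model, and every name and constant in it is an illustrative assumption.

```python
import numpy as np

def allocate_bits(samples, total_bits=256, n_bands=32):
    """Toy subband bit allocation: give more bits to bands whose energy
    exceeds a crude 'masking' threshold, fewer (or none) to the rest."""
    spectrum = np.abs(np.fft.rfft(samples)) ** 2        # stand-in for the mapping block
    bands = np.array_split(spectrum, n_bands)           # 32 equal-width subbands
    energy = np.array([b.mean() for b in bands]) + 1e-12
    threshold = 0.01 * energy.mean()                    # placeholder masking threshold
    need = np.maximum(np.log2(energy / threshold), 0)   # bands under the threshold need no bits
    if need.sum() == 0:
        return np.zeros(n_bands, dtype=int)
    return np.floor(total_bits * need / need.sum()).astype(int)

alloc = allocate_bits(np.random.randn(1152))   # 1152 samples = one Layer II audio frame
```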
    10. MPEG-1 Layers I, II, III
       • MPEG layer differences lie in processing power and resulting audio/sound quality:
         - MP1: little processing needed, poor quality
         - MP2: minimal processing, "okay" quality
         - MP3: massive processing, high "CD" quality
    11. MPEG-2 Overview
       • Extends the video & audio compression of MPEG-1
       • Substantially reduces the bandwidth required for high-quality transmissions
       • Optimizes the balance between resolution (quality) and bandwidth (speed)
    12. 10 Parts of MPEG-2
       • Part 1: Combines video and audio data into single/multiple streams
       • Part 2: Offers more advanced video compression tools
       • Part 3: A multi-channel extension of the MPEG-1 Audio standard
       • Parts 4/5: Correspond to and build on Parts 4/5 of MPEG-1
       • Part 6: Specifies protocols for managing MPEG-1 & MPEG-2 bitstreams
       • Part 7: Specifies a multi-channel audio coding algorithm
       • Part 8: Discontinued for lack of industry interest
       • Part 9: Specifies the Real-time Interface (RTI) to Transport Stream decoders
       • Part 10: The conformance part of Digital Storage Media Command and Control (currently under development)
    13. MPEG-2 Video Compression Overview: Video Stream Data Hierarchy (diagram)
    14. MPEG-2 Video Compression Overview
       • The video stream is organized into Groups of Pictures (GOP):
         - I-frames: can be reconstructed without any reference to other frames
         - P-frames: forward predicted from the last I-frame or P-frame
         - B-frames: forward and backward predicted
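
To show how the three frame types relate in practice, the sketch below reorders a display-order GOP into the order an encoder would typically emit it, with each B-frame following both of its anchor frames; this is a simplified illustration written for this summary, not reference code from the standard.

```python
def display_to_coded_order(gop):
    """Reorder display-order frame types so each B-frame comes after both of its anchors."""
    coded, pending_b = [], []
    for frame in gop:
        if frame == "B":
            pending_b.append(frame)        # hold B-frames until their next anchor is sent
        else:                              # I- or P-frame (an anchor)
            coded.append(frame)
            coded += pending_b             # the held B-frames can now follow
            pending_b = []
    return coded + pending_b               # trailing B-frames would really wait for the next GOP's I-frame

print(display_to_coded_order(list("IBBPBBP")))   # ['I', 'P', 'B', 'B', 'P', 'B', 'B']
```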
    15. MPEG-2 Video Compression Overview
       • Compression: eliminating redundancies
         - Spatial redundancy: pixels are replicated within a single frame of video
         - Temporal redundancy: consecutive frames of video display images of the same scene
    16. MPEG-2 Video Compression Overview
       • Four video compression techniques:
         1. Pre-processing
         2. Temporal prediction
         3. Motion compensation
         4. Quantization
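
Motion compensation depends on motion estimation, which is commonly done by block matching; the following Python sketch is a naive exhaustive search over a small window (real encoders use much faster heuristics), and all names and window sizes are illustrative.

```python
import numpy as np

def best_match(ref, cur_block, top, left, search=8):
    """Find the motion vector minimising the sum of absolute differences (SAD)
    between cur_block and candidate blocks in the reference frame."""
    n = cur_block.shape[0]
    best = (0, 0, float("inf"))
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            if 0 <= y <= ref.shape[0] - n and 0 <= x <= ref.shape[1] - n:
                sad = np.abs(ref[y:y + n, x:x + n].astype(int) - cur_block.astype(int)).sum()
                if sad < best[2]:
                    best = (dy, dx, sad)
    return best   # only the motion vector and the (DCT-coded) residual need to be transmitted

ref = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
cur_block = ref[8:24, 10:26]                        # a 16x16 block that "moved" by (-2, -2)
print(best_match(ref, cur_block, top=10, left=12))  # expected: (-2, -2, 0)
```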
    17. MPEG-2 Video Compression Overview
       • Pre-processing: filters out unnecessary information
         - Information that is difficult to encode
         - Not an important component of human visual perception
    18. MPEG-2 Video Compression Overview
       • Temporal prediction: uses the Discrete Cosine Transform (DCT) to:
         - Divide each frame into 8x8 blocks of pixels
         - Reorganize residual differences between frames
         - Encode each block separately
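
For reference, here is a small Python/NumPy implementation of the 2-D DCT-II on one 8x8 block using the standard orthonormal definition; a production encoder would use a fast, integer-friendly variant, so treat this as a didactic sketch.

```python
import numpy as np

def dct_2d(block):
    """Orthonormal 8x8 2-D DCT-II: low-frequency coefficients end up in the top-left corner."""
    n = block.shape[0]
    k = np.arange(n)
    # C[u, x] = alpha(u) * cos((2x + 1) * u * pi / (2n))
    C = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    alpha = np.full(n, np.sqrt(2.0 / n))
    alpha[0] = np.sqrt(1.0 / n)
    C = alpha[:, None] * C
    return C @ block @ C.T

block = np.random.randint(0, 256, (8, 8)).astype(float) - 128   # level-shifted pixel block
coefficients = dct_2d(block)
```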
    19. MPEG-2 Video Compression Overview (diagram)
    20. MPEG-2 Video Compression Overview (diagram)
    21. MPEG-2 Video Compression Overview (diagram)
    22. MPEG-2 Video Compression Overview
       • Quantization:
         - Operates on the DCT coefficients
         - Removes subjective redundancy
         - Controls the compression factor
         - Converts coefficients into even smaller numbers
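
The sketch below shows the quantize/dequantize step on a block of DCT coefficients; the quantization matrix is an illustrative ramp that merely grows with frequency, not the actual default matrix from the MPEG-2 specification.

```python
import numpy as np

# Illustrative matrix: the step size grows with spatial frequency, so coefficients
# the eye is less sensitive to are coded more coarsely (NOT the matrix from the spec).
u, v = np.meshgrid(np.arange(8), np.arange(8), indexing="ij")
QUANT_MATRIX = 8 + 2 * (u + v)

def quantize(coefficients, scale=1.0):
    """Divide each DCT coefficient by its step size and round: the lossy step."""
    return np.round(coefficients / (QUANT_MATRIX * scale)).astype(int)

def dequantize(levels, scale=1.0):
    """Decoder side: only an approximation of the original coefficients survives."""
    return levels * QUANT_MATRIX * scale

levels = quantize(np.random.randn(8, 8) * 100.0)
approx = dequantize(levels)
```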
    23. MPEG-2 Video Compression Overview: Where It Is Used
       • Multimedia communications
       • Webcasting
       • Broadcasting
       • Video on demand
       • Interactive digital media
       • Telecommunications
       • Mobile communications
    24. MPEG-2 Transmission Overview
       Building the MPEG bit stream:
       • Elementary Stream (ES)
         - Digital control data
         - Digital audio
         - Digital video
         - Digital data
       • Packetised Elementary Stream (PES)
         - Each ES is combined into a stream of PES packets
         - A PES packet may be a fixed- or variable-sized block
         - Each block carries up to 65,536 bytes plus a 6-byte protocol header
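
A minimal Python sketch of the 6-byte PES header mentioned above (3-byte start-code prefix 0x000001, stream_id, 16-bit length); real PES packets usually carry an optional header with timestamps (PTS/DTS), which is omitted here, and the helper name is my own.

```python
import struct

def pes_packet(stream_id, payload):
    """Simplified PES packet: start-code prefix, stream_id, 16-bit length, payload."""
    if len(payload) > 0xFFFF:
        raise ValueError("payload exceeds the 16-bit PES_packet_length field")
    return b"\x00\x00\x01" + struct.pack(">BH", stream_id, len(payload)) + payload

pes = pes_packet(0xE0, b"compressed video data")   # 0xE0 = first video elementary stream id
```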
    25. MPEG-2 Transmission (cont.)
       MPEG-2 multiplexing:
       • MPEG Program Stream
         - Tightly coupled PES packets
         - Used for video playback and network applications
       • MPEG Transport Stream
         - Each PES packet is broken into fixed-sized transport packets
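
And here is a companion sketch that splits a PES packet into fixed 188-byte transport packets, each with the 4-byte TS header (sync byte 0x47, PID, continuity counter); the final, partly filled packet is padded with zero bytes instead of proper adaptation-field stuffing, so it is illustrative rather than spec-conformant.

```python
def ts_packets(pes, pid, packet_size=188):
    """Break a PES packet into fixed-size transport packets with 4-byte headers."""
    packets, counter = [], 0
    payload_size = packet_size - 4
    for offset in range(0, len(pes), payload_size):
        chunk = pes[offset:offset + payload_size].ljust(payload_size, b"\x00")
        pusi = 0x40 if offset == 0 else 0x00            # payload_unit_start_indicator
        header = bytes([
            0x47,                                       # sync byte
            pusi | ((pid >> 8) & 0x1F),                 # PUSI + top 5 bits of the 13-bit PID
            pid & 0xFF,                                 # low 8 bits of the PID
            0x10 | (counter & 0x0F),                    # payload only + continuity counter
        ])
        packets.append(header + chunk)
        counter = (counter + 1) % 16
    return packets

video_ts = ts_packets(b"\x00\x00\x01\xe0" + b"\x00" * 1000, pid=0x100)
```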
    26. MPEG Transport Streams
    27. Combining ES from Encoders into a Transport Stream
    28. Single & Multiple Program Transport Streams
    29. Format of a Transport Stream Packet
    30. MPEG-2 Encoders
    31. Types of MPEG-2 Decoders
       1. MPEG-2 software decoder & PC-based accelerator
       2. MPEG-2 computer decoder
       3. MPEG-2 network computers/thin clients
       4. MPEG-2 set-top box
       5. MPEG-2 consumer equipment
    32. MPEG-4 Overview
       • Emergence: handles specific requirements from rapidly developing multimedia applications
       • Advantage over MPEG-1 and MPEG-2: object-oriented coding
    33. MPEG-4 Standard: 6 Parts Overview
       • Part 1: Systems - specifies scene description, multiplexing, synchronization, buffer management, and management and protection of intellectual property
       • Part 2: Visual - specifies the coded representation of natural and synthetic visual objects
       • Part 3: Audio - specifies the coded representation of natural and synthetic audio objects
       • Part 4: Conformance Testing - defines conformance conditions for bit streams and devices; used to test MPEG-4 implementations
       • Part 5: Reference Software - includes software corresponding to most parts of MPEG-4; it can be used for implementing compliant products, as ISO waives the copyright of the code
       • Part 6: Delivery Multimedia Integration Framework (DMIF) - defines a session protocol for the management of multimedia streaming over generic delivery technologies
    34. Features & Functionalities
       • Object-oriented: primitive audiovisual objects are coded
       • Low data rate: allows for high-quality video at lower data rates and smaller file sizes
       • Interoperability: opens up new ways of working with audiovisual scenes
    35. MPEG-4 Object-Based Coding Architecture
    36. MPEG-4 Scene
    37. Targeted Applications
       • Digital TV: TV logos, customized advertising, multi-window screens
       • Mobile multimedia: cell phones and palm computers
       • TV production: targeting viewers
       • Games: personalized games
       • Streaming video: news updates and live music shows over the Internet
    38. MPEG-7
       • Another ISO/IEC standard being developed by MPEG
       • A content representation standard for information search
       • Makes searching the Web for multimedia content as easy as searching for text-only files
       • Operates in both real-time and non-real-time environments
    39. The Future: MPEG-21
       • A "multimedia framework"
       • Based on two essential concepts: the Digital Item, and Users interacting with Digital Items
       • A more universal framework for digital content protection
       • Most of MPEG-21's elements are set for completion in 2003 and 2004
