1 – Introduction to DirectShow
Introduction to DirectShow programming – the first of nine lectures.

    Presentation Transcript

    • (1) Introduction to DirectShow – Microsoft’s Multimedia Streaming Architecture
    • Intel - Motivation & Goals
      • DirectShow is the most common architecture for:
      • STB and IP-STB media streaming and broadcast reception
      • PC Streaming
      • Goals
      • Understand STB architecture, its advantages and disadvantages
      • Fast prototyping and application building for testing and demonstrations
    • What is DirectX?
      • DirectX is Microsoft's component-based architecture for media processing.
      • Similar streaming architectures were designed by:
        • Real
        • Emblaze (RapidMedia)
        • Linux (GStreamer)
      • Media processing
        • Audio
        • Video
        • Transport streams
        • Text
    • DirectX – why do we need it?
      • Software reusability
        • Code blocks can be reused and inherited
        • Common Interfaces
      • Easy connection between 3rd-party modules
      • Which components?
        • Compression
        • Color conversion
        • Audio effects
        • Transport stream handlers
    • A bit of history – DirectX evolution from a proprietary codec model to a filter-based architecture
    • VFW
      • VFW architecture – a Plug-In architecture that supported CODEC and CAPTURE Interfaces for Audio & Video stream manipulation.
      • Advantage – all CODECs share the same interfaces, so no codec-specific modifications are needed
      (Diagram: Capture and CODEC blocks with Control Interfaces and IO Interfaces)
    • VFW Problems
      • Rigid architecture
      • Only three types of filters
      • How to add custom filters that split or mux a transport stream?
      • Rigid interfaces hinder the future evolution of the architecture
    • Active Movie
      • A filter-based architecture that supports custom filters
      • The architecture is built from:
        • Filters that manipulate the stream
        • Pins that handle filter connections
        • A graph manager which:
          • Connects the filters (automatically)
          • Manages filter states (run, pause, stop)
          • Transfers messages between filters
    • DirectX
      • Evolution of ActiveMovie (audio & video) into other areas:
        • DirectShow – (previously ActiveMovie) Audio/Video/Text.
        • DirectDraw/Direct3D – Graphics.
        • DirectSound – Sound effects and 3D sound.
        • DirectPlay/DirectInput – Multiplayer Game tools.
        • DirectMusic – Automatic music generation
    • Introduction to DirectShow – Hands-on DirectShow: GraphEdit
    • GraphEdit
      • Visual Interface for graph building and testing.
    • GraphEdit – Rendering a file
      • Render a media file
      • The graph that gets built is what the OS builds when you double-click a media file.
      • File rendering options:
        • Drag a media file onto GraphEdit
        • Use File -> Open File
        • Use:
          • Graph -> Insert Filters (the circled button is a shortcut)
          • Select the File Source (Async) filter
          • Right-click the output pin and select Render (a programmatic equivalent is sketched after this list)
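      For reference, the same "render a file" operation can be done from code. A minimal C++ sketch (not from the slides; the file path is a placeholder and error handling is reduced to bare checks):

        // Build a playback graph the way GraphEdit's "Render" does, via RenderFile.
        #include <dshow.h>
        #pragma comment(lib, "strmiids.lib")

        int main()
        {
            CoInitialize(NULL);

            IGraphBuilder *pGraph = NULL;
            HRESULT hr = CoCreateInstance(CLSID_FilterGraph, NULL, CLSCTX_INPROC_SERVER,
                                          IID_IGraphBuilder, (void**)&pGraph);
            if (SUCCEEDED(hr))
            {
                // RenderFile picks a source filter, parser/decoders and renderers automatically.
                hr = pGraph->RenderFile(L"C:\\media\\clip.avi", NULL);   // placeholder path
                pGraph->Release();
            }

            CoUninitialize();
            return 0;
        }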
    • GraphEdit – Play with it
      • Use the: Play, Pause, and Stop commands
        • Notice the file progress bar
      • Use the slider to change the current play position
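      The Play/Pause/Stop buttons and the position slider map onto two interfaces of the graph manager. A hedged fragment (it assumes the pGraph built in the previous sketch, with a file already rendered):

        IMediaControl *pControl = NULL;
        IMediaSeeking *pSeeking = NULL;
        pGraph->QueryInterface(IID_IMediaControl, (void**)&pControl);
        pGraph->QueryInterface(IID_IMediaSeeking, (void**)&pSeeking);

        pControl->Run();      // "Play"
        pControl->Pause();    // "Pause"

        // The slider: positions are in 100-nanosecond units by default.
        REFERENCE_TIME rtStart = 5 * 10000000LL;   // jump to the 5-second mark
        pSeeking->SetPositions(&rtStart, AM_SEEKING_AbsolutePositioning,
                               NULL, AM_SEEKING_NoPositioning);

        pControl->Stop();     // "Stop"
        pSeeking->Release();
        pControl->Release();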
    • GraphEdit – A bit more
      • Try looking at filter properties:
      • Right-click the filter and select “Filter Properties”
      • We will later look at the interface that enables communication with property pages (a sketch follows below).
      • Example of the File Source filter's property page
      • This property page is used only for monitoring, not for control. Why?
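      GraphEdit shows these pages through a standard COM mechanism. A sketch of the usual pattern (pFilter and hwnd are assumptions: an IBaseFilter* already in the graph and a window handle; not from the slides):

        // Show a filter's property pages the way GraphEdit does.
        void ShowFilterProperties(IBaseFilter *pFilter, HWND hwnd)
        {
            ISpecifyPropertyPages *pSpecify = NULL;
            if (SUCCEEDED(pFilter->QueryInterface(IID_ISpecifyPropertyPages, (void**)&pSpecify)))
            {
                FILTER_INFO fi;
                pFilter->QueryFilterInfo(&fi);   // filter name, used as the dialog caption

                CAUUID caGUID;
                pSpecify->GetPages(&caGUID);     // CLSIDs of the filter's property pages
                pSpecify->Release();

                IUnknown *pUnk = NULL;
                pFilter->QueryInterface(IID_IUnknown, (void**)&pUnk);
                OleCreatePropertyFrame(hwnd, 0, 0, fi.achName, 1, &pUnk,
                                       caGUID.cElems, caGUID.pElems, 0, 0, NULL);

                pUnk->Release();
                if (fi.pGraph) fi.pGraph->Release();
                CoTaskMemFree(caGUID.pElems);
            }
        }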
    • The “Metargemet” (Hebrew for “translator”) video
      • Let's view it using GraphEdit – Media Metargemet.wmv
      • This video is a nice example of an information-processing architecture
      • The information is:
        • Generated at the source (Keren)
        • Processed and translated by a “transform unit” (Riki – translator)
        • Rendered to the students
        • Feedback messages are passed from the Renderer to the source via the translator
      (Diagram: Keren Mor – Info. Source → Riki Blich – “Transform” → Student – “Renderer”)
    • DirectShow Basic ideas
      • Information processing is a sequential process
      • Each processing unit has a clear function
      • The source “pushes” or pumps information to the processing units (blue). Data always flows in one direction: “downstream”
      • Feedback is sent from the renderer(s) to the source (orange) – “upstream”
      • No information loops
      (Diagram: Keren Mor – Source → Riki Blich – “Transform” → Student – “Renderer”)
    • Information passing architecture
      • There are basically two methods:
        • Push
        • Pull
      • These architectures differ in:
        • Threading
        • Which filter is the active filter
        • Where the “brain” of the graph resides
    • Push Architecture
      • Source is active
      • The source generates media samples
      • Information is pushed by the source
      • As in the case of the “Metargemet”
      • Source is active, owns the “processing thread”
      • Used in
        • ASF/WMA/WMV files
        • Video capture (USB cam, video cards)
        • BDA capture
    • Pull Architecture
      • Information is pulled by a non-source filter
      • Source is passive
      • The source does not own the “processing thread”
      • In most file-processing schemes (AVI, MPEG-2) the source is “dumb” and passive, and the parser (demuxer) holds the “knowledge” of file parsing (see the fragment below)
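      To make the contrast concrete, a small fragment of the pull model (pSourceOutPin is an assumption: the output IPin* of the "File Source (Async)" filter):

        // Pull model: the downstream parser drives the data flow.
        IAsyncReader *pReader = NULL;
        if (SUCCEEDED(pSourceOutPin->QueryInterface(IID_IAsyncReader, (void**)&pReader)))
        {
            BYTE header[16];
            // The parser decides what to read and when; the source just serves bytes.
            pReader->SyncRead(0, sizeof(header), header);   // read the first 16 bytes of the file
            pReader->Release();
        }
        // In the push model, by contrast, the source owns a worker thread and calls
        // IMemInputPin::Receive() on the downstream pin whenever a sample is ready.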
    • Components required
      • Data processing unit – used for:
        • Media data Generation
        • Media data manipulation/transform
        • Media data rendering
      • Interfaces – Generic Interfaces between data processing units
        • In “Metargemet” we use “Air Interface” with two basic media descriptors
      • Media descriptors
        • In “Metargemet” we use two media types:
          • Archaic speech media between Source and Transform
          • “Slang” between Transform (Riki) and Renderer (Student)
      • Management unit (in “Metargemet” – the class/school)
    • Naming
      • Data processing unit = Filter
        • Used for data manipulation and transform.
      • Filter Interface = Pin
        • Used to define the connection between filters
        • Includes information about:
          • Direction
          • Memory required
          • Media Type
    • Naming - 2
      • Media Descriptor = Media Type
        • Includes information about the media being passed
      • Data Unit = Media Sample
        • Includes the data itself
        • Includes information about the data
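      A short fragment showing how the negotiated media type can be inspected on a connected pin (pPin is an assumed IPin*; not from the slides):

        AM_MEDIA_TYPE mt;
        if (SUCCEEDED(pPin->ConnectionMediaType(&mt)))
        {
            if (mt.majortype == MEDIATYPE_Video) { /* video stream */ }
            if (mt.majortype == MEDIATYPE_Audio) { /* audio stream */ }

            // Free the members that ConnectionMediaType allocated.
            if (mt.pbFormat) CoTaskMemFree(mt.pbFormat);
            if (mt.pUnk)     mt.pUnk->Release();
        }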
    • Filters
      • ActiveX objects
        • (Use COM Interfaces)
      • Manipulate data
      • Have pins as members for interfacing with other filters
      • Three basic types: Source, Render, Transform
      • Identification: GUID
      • Control – property pages
    • Pin
      • Filters have one or more I/O interfaces – pins
      • A pin is a class that holds the interfaces through which processing modules (filters) connect
      • Pins can be exposed or hidden
        • Look at the Infinite Pin Tee filter
        • Dynamic creation of pins for many types of filters:
          • Infinite Pin Tee
          • MPEG and AVI Splitters
      • Two basic types:
        • Input
        • Output
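      A sketch of walking a filter's pins and reading their direction (pFilter is an assumed IBaseFilter*, e.g. one already added to a graph):

        IEnumPins *pEnum = NULL;
        IPin *pPin = NULL;
        if (SUCCEEDED(pFilter->EnumPins(&pEnum)))
        {
            while (pEnum->Next(1, &pPin, NULL) == S_OK)
            {
                PIN_INFO info;
                pPin->QueryPinInfo(&info);    // direction (PINDIR_INPUT/OUTPUT), name, owning filter
                // ... use info.dir and info.achName ...
                if (info.pFilter) info.pFilter->Release();
                pPin->Release();
            }
            pEnum->Release();
        }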
    • Graph Builder
      • Creates a streaming process, for example:
        • Read from file
        • Demux
        • Decode
        • Encode
        • Stream to network
      • Tasks:
        • Loads the filters
        • Connects filters
        • Changes filter states
        • Disconnects filters
        • Removes filters from the graph (these tasks are sketched in code below)
      • The graph is always unidirectional (no loops)
      • The graph has three basic states: Play, Pause, Stop.
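      A hand-built sketch of those tasks (pSrc/pDec and their pins pSrcOut/pDecIn are hypothetical filters created elsewhere; only the graph manager calls are the point here):

        IGraphBuilder *pGraph = NULL;
        CoCreateInstance(CLSID_FilterGraph, NULL, CLSCTX_INPROC_SERVER,
                         IID_IGraphBuilder, (void**)&pGraph);

        pGraph->AddFilter(pSrc, L"Source");          // load filters into the graph
        pGraph->AddFilter(pDec, L"Decoder");

        // Connect joins an output pin to an input pin, adding intermediate
        // filters if needed (ConnectDirect would not).
        pGraph->Connect(pSrcOut, pDecIn);

        IMediaControl *pControl = NULL;
        pGraph->QueryInterface(IID_IMediaControl, (void**)&pControl);
        pControl->Run();                             // change filter states
        pControl->Stop();
        pControl->Release();

        pGraph->RemoveFilter(pDec);                  // remove filters (graph stopped first)
        pGraph->RemoveFilter(pSrc);
        pGraph->Release();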
    • DMOs
      • DirectX® Media Objects (DMOs)
      • Create stream-processing objects that can operate without a graph (filters can only work as part of a graph)
      • DMOs can operate inside a graph using the “DMO Wrapper”, which acts as a filter (see the sketch below)
      • Advantage
        • Simpler interface
        • No need to create a graph
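      A sketch of hosting a DMO inside a graph through the DMO Wrapper filter (CLSID_SomeDmo is a placeholder for a real DMO's CLSID, and the category GUID is only an example):

        #include <dshow.h>
        #include <dmodshow.h>   // IDMOWrapperFilter, CLSID_DMOWrapperFilter
        #include <dmoreg.h>     // DMO category GUIDs

        IBaseFilter *pWrapper = NULL;
        CoCreateInstance(CLSID_DMOWrapperFilter, NULL, CLSCTX_INPROC_SERVER,
                         IID_IBaseFilter, (void**)&pWrapper);

        IDMOWrapperFilter *pDmoWrap = NULL;
        pWrapper->QueryInterface(IID_IDMOWrapperFilter, (void**)&pDmoWrap);

        // Load the actual DMO into the wrapper; after this the wrapper behaves
        // like any other filter and can be added to a graph with AddFilter().
        pDmoWrap->Init(CLSID_SomeDmo, DMOCATEGORY_AUDIO_EFFECT);
        pDmoWrap->Release();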