Adobe OSMF Overview
Overview of Adobe OSMF Platform for video integration with advertisement, analytics, CDNs etc


Flash & Flex Developers

Adobe OSMF - Part I
By Chaim Sajnovsky / DSP-IP

Due to the exponential growth of Flash video over the last few years, a huge number of video players and Flash video-based applications have been developed. Even though most of them work well, there was a sense that best practices and frameworks were lacking in this new, unstoppable world. Around the middle of 2009, Adobe released the first versions of a new, compact but powerful framework called OSMF, which stands for Open Source Media Framework. You have probably already heard about it, seen or tried some examples, or perhaps even built an OSMF-based app. In this article I will share some small tricks and the fruits of my experience so far. Please note that for these exercises I use the latest sprint available at the time of writing (sprint 9), so there may be differences from earlier sprints.

Why should someone use OSMF?

Why should we use OSMF instead of plain AS3, which has worked well for us so far? After all, Flash already ships with a working, pleasant video player: FLVPlayback. It handles progressive and streaming video well, comes with skins and configurable buttons and controls, and is quick and easy to set up. Well, as with any framework, the idea is to speed up development by providing a set of pre-made elements, components, and tools that save us time and energy. Furthermore, being open source, the community behind the project is very active in debugging and suggesting improvements. And, of course, you can modify the code to fit your needs.

What is the difference between OSMF and the FLVPlayback component, then? The two concepts sound very similar. FLVPlayback is a visual component and, more importantly, a closed one (you cannot modify the code inside it if you need to customize it), whereas OSMF is more of a controller that can be added to a view, and you can modify it to build your own OSMF player.
That gives us a complete set of possibilities for building great video players and video applications, and quickly. You will be thrilled with the new gamut of opportunities. Another point: OSMF is a specific, abstract logic layer for media. You only make its displayObject (the view) appear visually; in a classical MVC pattern, OSMF is essentially just the controller. But there is more. Because it is a common framework for video application developers, you can collaboratively build on, extend, and improve other developers' apps. And one of OSMF's strongest points is the possibility of building plugins. Plugins are really useful for integrating third-party applications or service providers into OSMF players in a simple way. Whether you are a CDN, an analytics service, an ads provider, or anything else, you can now create your plugin for OSMF and release it for OSMF developers to integrate in a breeze.
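To make the plugin idea concrete, here is a minimal sketch of loading a vendor plugin at runtime. Be aware this uses the MediaFactory.loadPlugin API as it settled in OSMF 1.0 rather than the sprint 9 builds covered here, and the plugin URL is a hypothetical placeholder, so treat it as an illustration of the flow rather than sprint-accurate code:

```actionscript
// Sketch of dynamically loading an OSMF plugin. The MediaFactory/
// MediaFactoryEvent API shown matches OSMF 1.0, not sprint 9, and the
// plugin SWF URL is a made-up placeholder.
import org.osmf.events.MediaFactoryEvent;
import org.osmf.media.MediaFactory;
import org.osmf.media.URLResource;

var factory:MediaFactory = new MediaFactory();

factory.addEventListener(MediaFactoryEvent.PLUGIN_LOAD, onPluginLoad);
factory.addEventListener(MediaFactoryEvent.PLUGIN_LOAD_ERROR, onPluginLoadError);

// A CDN, analytics, or ads vendor would typically hand you a compiled plugin SWF
factory.loadPlugin(new URLResource("http://example.com/SomeVendorPlugin.swf"));

function onPluginLoad(event:MediaFactoryEvent):void
{
    // From here on, media elements created through this factory
    // pass through the plugin's logic
}

function onPluginLoadError(event:MediaFactoryEvent):void
{
    trace("plugin failed to load");
}
```

The key design point is that the player code never references the vendor's classes directly; the plugin hooks into media creation through the factory, which is what lets third parties ship integrations independently of your player.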
After this brief intro, let's get our hands dirty.

Basic elements

First things first: get the library. Go to the OSMF project site and get the source. You can get both the SWC file and the uncompiled source files; I personally use the non-compiled source files, just in case I need to extend some class. Once you have them, create a project that includes the library, and then proceed to import the basic classes (the import list was garbled in the original; these are the classes the code below actually uses, with package paths as of sprint 9 — adjust if your sprint differs):

```actionscript
import flash.display.Sprite;

import org.osmf.events.DisplayObjectEvent;
import org.osmf.events.MediaPlayerCapabilityChangeEvent;
import org.osmf.media.MediaPlayer;
import org.osmf.media.URLResource;
import org.osmf.net.NetLoader;
import org.osmf.utils.URL;
import org.osmf.video.VideoElement;
```

Those will let us begin building our very basic player for now:

```actionscript
var myPlayer:MediaPlayer;
var myVideoElement:VideoElement;
var mc_container:Sprite;

init();

function init():void
{
    // This is the container of our player
    mc_container = new Sprite();
    addChild(mc_container);

    // MediaPlayer is the basic controller of OSMF. It will manage all the logic
    myPlayer = new MediaPlayer();

    // Once the view becomes viable, it dispatches an event...
    myPlayer.addEventListener(MediaPlayerCapabilityChangeEvent.CAN_PLAY_CHANGE, onViewable);

    // And once the video is loaded, it will tell us what the new dimensions are
    myPlayer.addEventListener(DisplayObjectEvent.MEDIA_SIZE_CHANGE, onDimensionChange);

    var path:String = "here the path to your video…";

    // VideoElement is how we explicitly tell the MediaPlayer
    // that we intend to load a video
    myVideoElement = new VideoElement(new NetLoader(), new URLResource(new URL(path)));

    // Here we assign the VideoElement to the MediaPlayer
    myPlayer.media = myVideoElement;
```
```actionscript
}

function onViewable(evt:MediaPlayerCapabilityChangeEvent):void
{
    if (evt.enabled)
    {
        // We need to wait for this event before doing anything...
        // If the video is loaded and ready, just play it automatically...
        if (myPlayer.canPlay)
        {
            myPlayer.play();
        }
    }
}

function onDimensionChange(evt:DisplayObjectEvent):void
{
    // First we resize the OSMF view (the classic Video element in AS3)
    myPlayer.displayObject.width = evt.newWidth;
    myPlayer.displayObject.height = evt.newHeight;

    // Then we add the view to the stage
    // (the original printed mc_video here; the container is mc_container)
    mc_container.addChild(myPlayer.displayObject);
}
```

We already have our autoplay OSMF player. Note that I did not add buttons and controls, since this article is about getting the most out of the OSMF player itself, and I take it for granted you will manage to add those. Now that we have it nicely running, let's discover some how-tos.

Reading metadata

OSMF can read two types of metadata: metadata that comes with the media (as we classically read it with AS3's NetStream.client.onMetaData), and metadata added dynamically to our media elements, called "facets" in OSMF. For now, let's listen for the metadata that comes with our video. To do this, we should listen to the "loadable" trait on the VideoElement. But what is a trait? As Adobe defines them:

"Traits define the capabilities of a media element. Different implementations of a media element aggregate different sets of traits, depending on the characteristics of the media they represent. For example, an audio element may aggregate the IAudible, IPlayable, ITemporal and ILoadable traits. These individual traits define the capabilities of the media element independently of the specific media element implementation. For a media element that has the ILoadable trait, the trait provides the media element's load and unload capability. Similarly, the implementation of the IPlayable trait provides media element's ability to play itself.
"

First we should import some more classes (the NetStreamCodes import was missing from the original but is used below; paths are as of sprint 9):

```actionscript
import org.osmf.events.LoadEvent;
import org.osmf.net.NetStreamCodes;
import org.osmf.traits.LoadState;
import org.osmf.traits.MediaTraitType;
```

Then we proceed to modify the onViewable function to this:

```actionscript
function onViewable(evt:MediaPlayerCapabilityChangeEvent):void
{
    if (evt.enabled)
    {
        // We listen to the loadable trait in order to know
        // when the media was successfully loaded
        myVideoElement.getTrait(MediaTraitType.LOAD).addEventListener(LoadEvent.LOAD_STATE_CHANGE, onLoad);

        if (myPlayer.canPlay)
        {
            myPlayer.play();
        }
    }
}

function onLoad(event:LoadEvent):void
{
    // This means it loaded OK...
    if (event.loadState == LoadState.READY)
    {
        // Here we find the client, once the media element is loaded...
        myVideoElement.client.addHandler(NetStreamCodes.ON_META_DATA, onMetadata);
    }
}

function onMetadata(info:Object):void
{
    // Here you get your metadata by extracting it from the info Object...
}
```

Listening for loading errors

How do we know if the video failed when we attempted to load it? In our classic AS3 player, we would add a listener to the NetStream in order to watch its status. With OSMF we first import the MediaErrorEvent class:

```actionscript
import org.osmf.events.MediaErrorEvent;
```

And then modify our onLoad function to this:
```actionscript
function onLoad(event:LoadEvent):void
{
    // This means it loaded OK...
    if (event.loadState == LoadState.READY)
    {
        myVideoElement.client.addHandler(NetStreamCodes.ON_META_DATA, onMetadata);
        myVideoElement.addEventListener(MediaErrorEvent.MEDIA_ERROR, onError);
    }
}

function onError(evt:MediaErrorEvent):void
{
    trace(evt.error); // Here you can find out what the error type was...
    if (evt.type == MediaErrorEvent.MEDIA_ERROR)
    {
        trace("video not found");
    }
}
```

Modifying the VideoElement to access all the events

Sometimes we want to debug our OSMF application. I prefer to create my own custom VideoElement subclass and then replace the VideoElement with this subclass. Please be advised to use this kind of subclass only for testing, swapping back to the official OSMF classes afterwards. Here is an example (the import list was garbled in the original; these are the classes the code uses, with sprint 9 package paths):

```actionscript
package com.example
{
    import flash.events.NetStatusEvent;
    import flash.net.NetStream;

    import org.osmf.media.MediaResourceBase;
    import org.osmf.net.NetLoadedContext;
    import org.osmf.net.NetLoader;
    import org.osmf.traits.LoadTrait;
    import org.osmf.traits.MediaTraitType;
    import org.osmf.video.VideoElement;

    public class MyVideoElementDebugger extends VideoElement
    {
        private var stream:NetStream;

        public function MyVideoElementDebugger(loader:NetLoader, resource:MediaResourceBase = null)
        {
```
```actionscript
            super(loader, resource);
        }

        override protected function processReadyState():void
        {
            super.processReadyState();

            var loadTrait:LoadTrait = getTrait(MediaTraitType.LOAD) as LoadTrait;
            var context:NetLoadedContext = NetLoadedContext(loadTrait.loadedContext);

            stream = context.stream;
            stream.addEventListener(NetStatusEvent.NET_STATUS, onNetStatusEvent2);
        }

        private function onNetStatusEvent2(evt:NetStatusEvent):void
        {
            trace(evt.info.code);
        }
    }
}
```

Then, in the init() function, modify this line:

```actionscript
myVideoElement = new VideoElement(new NetLoader(), new URLResource(new URL(path)));
```

to this:

```actionscript
myVideoElement = new MyVideoElementDebugger(new NetLoader(), new URLResource(new URL(path)));
```

Check out the tracing…

Progressive vs RTMP

We can consider the VideoElement a magic one: you can load it with video, audio, and images. When loading video you can use both progressive (HTTP) and streaming (RTMP) delivery. Just change your URL string to an RTMP path, and remember to put an "flv:", "mp4:", or "f4v:" prefix before the name of your clip, e.g.: rtmp://

That is the basic approach. Then, of course, OSMF gives us more features. Change the VideoElement creation from this:

```actionscript
myVideoElement = new VideoElement(new NetLoader(), new URLResource(new URL(path)));
```
to this:

```actionscript
myVideoElement = new VideoElement(new NetLoader(), new URLResource(new FMSURL(path, false)));
```

The extra parameter is, as Adobe explains, useInstance:

"useInstance:Boolean (default = false) — If true, then the second part of the URL path is considered the instance name, such as rtmp://host/app/foo/bar/stream. In this case the instance name would be 'foo' and the stream would be 'bar/stream'. If false, then the second part of the URL path is considered to be the stream name, such as rtmp://host/app/foo/bar/stream. In this case there is no instance name and the stream would be 'foo/bar/stream'."

This FMSURL will work with both VOD and live streams. But there is more: if you want to explicitly define your stream as live, use:

```actionscript
myVideoElement = new VideoElement(new NetLoader(), new StreamingURLResource(new URL(path), "live"));
```

This way OSMF will understand your stream as live only, and will treat it as such; i.e., it will remove the seekable trait, since you generally cannot seek a live stream (note: you can seek back if you have DVR working on FMS 3.5).

This is the first part of this deep dive into OSMF. Here are some links for you to continue learning this great framework:

  • (specs and classes)
  • (Adobe OSMF group)
  • (OSMF live docs)
  • (Twitter OSMF user group)
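Earlier I mentioned the second kind of metadata, facets, which are dynamically attached to a media element rather than shipped inside the stream. As a closing taste, here is a rough sketch of writing and reading a facet. This assumes the sprint-9-era KeyValueFacet and ObjectIdentifier classes from org.osmf.metadata; the namespace URL and the "title" key are hypothetical examples, and these class names shifted between sprints, so check your sprint's sources before relying on it:

```actionscript
// Sketch of attaching dynamic metadata (a "facet") to a media element.
// Assumes the sprint-9-era KeyValueFacet / ObjectIdentifier API; the
// namespace URL and the "title" key are made-up examples.
import org.osmf.metadata.KeyValueFacet;
import org.osmf.metadata.ObjectIdentifier;
import org.osmf.utils.URL;

// Facets are grouped under a namespace URL of your choosing
var facet:KeyValueFacet = new KeyValueFacet(new URL("http://example.com/myMetadata"));

// Write a value into the facet...
facet.addValue(new ObjectIdentifier("title"), "My great video");

// ...attach the facet to the element's metadata...
myVideoElement.metadata.addFacet(facet);

// ...and read it back later
trace(facet.getValue(new ObjectIdentifier("title")));
```

This is the mechanism plugins typically use to pass settings and report state to the hosting player without either side knowing the other's classes. We will look at facets properly in the next part.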