Unique information about video stream compression in iOS from our best iOS specialist, Vladimir Predko. He's ready to answer all your questions. Go ahead!
2. Overview
1. A little bit about codecs and containers
2. A little bit about streaming
3. How to arrange it for iOS
4. Comparison of different approaches to compressing a video stream
3. About codecs and containers
A container is a file or streaming format that takes care of packaging,
transport, and presentation of the information inside it.
4. About codecs and containers
Examples of containers:
● AVI
● MKV
● QuickTime
● MP4
● MPEG-TS
5. About codecs and containers
Examples of codecs:
Video:
● MPEG-4
● DivX
● h.264/x.264
Audio:
● MP3
● AAC
● DTS
6. What about the mechanism of media streaming?
Streaming media is multimedia that is continuously received by the user from a
streaming provider.
● live-streaming
● streaming on demand
7. What about the mechanism of media streaming?
Media data -> large amounts -> expensive storage and transmission
Recommended stream bandwidth:
● SD (non-HD) video ~2 Mbit/s
● HD video ~5 Mbit/s
● UHD ~9 Mbit/s
8. What about the mechanism of media streaming?
Example of calculation of the bandwidth:
1 hour of 320 × 240 video at ~300 kbit/s ≈ 135 MB (~128 MiB)
for 1,000 users: 300 Mbit/s ≈ 135 GB/h
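The arithmetic on this slide can be checked with a short calculation (a sketch; the 300 kbit/s bitrate and 1,000-viewer count come from the slide, the variable names are illustrative):

```swift
// Rough bandwidth/storage estimate for a 300 kbit/s stream.
let bitrateKbps = 300.0                                // kbit/s per viewer
let seconds = 3600.0                                   // one hour
let viewers = 1000.0

let perViewerMB = bitrateKbps * seconds / 8 / 1000     // kbit -> MB (decimal): 135 MB/h
let aggregateMbps = bitrateKbps * viewers / 1000       // aggregate Mbit/s for all viewers
let aggregateGBPerHour = perViewerMB * viewers / 1000  // total storage/traffic per hour

print(perViewerMB, aggregateMbps, aggregateGBPerHour)  // 135.0 300.0 135.0
```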
9. What about the mechanism of media streaming?
Codecs and containers used for streaming:
For audio compression: MP3, AAC, Vorbis ...
For video compression: h.264, VP8 …
Containers: MP4, FLV...
10. What about the mechanism of media streaming?
Transport protocols:
Used for media delivery from server to client.
RTMP (Real Time Messaging Protocol)
RTP (Real-time Transport Protocol) + RTCP
RTSP (Real Time Streaming Protocol)
11. What about the mechanism of media streaming?
Transport protocols:
Newer: Apple’s HLS, Adobe’s HDS, MPEG-DASH
The process often consists of two stages:
1. Delivering the stream to the server using a streaming transport protocol
2. Broadcasting from the server to the end user (HTTP-based protocols)
12. What about the mechanism of media streaming?
HLS (HTTP Live Streaming)
It is based on the principle of splitting the stream into fragments.
It uses an extended M3U playlist (.m3u8), which is downloaded at the beginning
of the session and contains metadata about the nested streams.
13. What about the mechanism of media streaming?
HLS (HTTP Live Streaming)
It involves an intermediate server that:
1. transcodes the media stream into the proper formats (h.264 video,
MP3/HE-AAC/AC-3 audio) and packs it into an MPEG-TS container
2. splits the MPEG-TS file into fragments of equal length and creates an index
file (.m3u8) with links to the fragments
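For illustration, such an index file might look like this (a hypothetical example; the segment names and durations are invented):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
fragment0.ts
#EXTINF:10.0,
fragment1.ts
#EXTINF:10.0,
fragment2.ts
#EXT-X-ENDLIST
```

The client downloads the playlist, then fetches the listed MPEG-TS fragments over plain HTTP, re-requesting the playlist for live streams.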
14. How to compress video in iOS?
Life before iOS 8:
Hardware:
By using AVAssetWriter -> we write to a file (for online streaming, write small
files, then read and send them)
Software:
Using a third-party library (e.g. ffmpeg)
15. How to compress video in iOS?
Life after iOS 8:
VideoToolbox appears
16. How to compress video in iOS?
AVFoundation:
● Decoding directly when displaying.
● Encoding into file
VideoToolbox:
● Decoding frames into CVPixelBuffer
● Encoding frames into CMSampleBuffer
17. How to compress video in iOS?
Briefly about h.264:
● widely used
● gives much better picture quality than MPEG-2 at lower bit rates
● ideal for videostreaming
● ...
18. How to compress video in iOS?
Briefly about h.264:
It uses two approaches to reduce the size of the video:
● It compresses data within one frame
● It compresses data using information from a group of pictures (frames are
grouped into GOPs)
19. How to compress video in iOS?
Briefly about h.264: GOP (Group of Pictures)
20. How to compress video in iOS?
Briefly about h.264: GOP (Group of Pictures)
I-frames (key frames): self-contained, the largest size, the fastest to decode.
P-frames (predicted frames): use information from the nearest preceding P- or I-frame.
B-frames (bidirectional frames): use information from the frames before and after
the current one.
21. How to compress video in iOS?
VideoToolbox: the main points
It provides direct access to the decoder / encoder.
It depends on CoreMedia, CoreVideo, and CoreFoundation.
It requires additional work with the buffers obtained from the encoder.
22. The process of preparing videostream in iOS
1. Capture video from the device -> CMSampleBuffer with uncompressed
frame data
2. Compressing frame by encoder from VideoToolbox -> CMSampleBuffer
with compressed frame data
3. Converting the stream of CMSampleBuffers into NAL units for streaming over
the network
25. The process of preparing videostream in iOS
Compression process:
1. Create and configure a VTCompressionSession using
VTCompressionSessionCreate; one of the parameters is a pointer to the
encoding callback function.
2. Call VTCompressionSessionEncodeFrame, passing a CVPixelBufferRef as one
of the parameters; repeat for each frame.
3. Process the CMSampleBuffer received from the encoder callback.
26. The process of preparing videostream in iOS
The creation of the CompressionSession:
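The original slide shows this step as a code screenshot; here is a minimal Swift sketch of what creating and configuring the session might look like (the 1280×720 dimensions and property choices are illustrative, and `compressionOutputCallback` is assumed to be a function matching the `VTCompressionOutputCallback` signature):

```swift
import VideoToolbox

var session: VTCompressionSession?

// Create the session; the output callback receives each compressed frame.
let status = VTCompressionSessionCreate(
    allocator: kCFAllocatorDefault,
    width: 1280, height: 720,
    codecType: kCMVideoCodecType_H264,
    encoderSpecification: nil,
    imageBufferAttributes: nil,
    compressedDataAllocator: nil,
    outputCallback: compressionOutputCallback,
    refcon: nil,
    compressionSessionOut: &session)

if status == noErr, let session = session {
    // Tune the encoder for live streaming.
    VTSessionSetProperty(session, key: kVTCompressionPropertyKey_RealTime,
                         value: kCFBooleanTrue)
    VTSessionSetProperty(session, key: kVTCompressionPropertyKey_ProfileLevel,
                         value: kVTProfileLevel_H264_Baseline_AutoLevel)
    VTCompressionSessionPrepareToEncodeFrames(session)
}
```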
27. The process of preparing videostream in iOS
Sending the buffer to the compression:
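The original slide shows this step as a code screenshot; a minimal Swift sketch of submitting one frame, assuming `pixelBuffer` and `pts` were extracted from the capture output's CMSampleBuffer:

```swift
import VideoToolbox

// Feed one captured frame to the encoder; the session's output callback
// will be invoked asynchronously with the compressed result.
func encode(_ pixelBuffer: CVPixelBuffer, pts: CMTime,
            in session: VTCompressionSession) {
    VTCompressionSessionEncodeFrame(
        session,
        imageBuffer: pixelBuffer,
        presentationTimeStamp: pts,
        duration: .invalid,
        frameProperties: nil,
        sourceFrameRefcon: nil,
        infoFlagsOut: nil)
}
```

This is called once per captured frame, e.g. from the AVCaptureVideoDataOutput delegate.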
28. The process of preparing videostream in iOS
The signature of the callback-function:
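The original slide shows the signature as a code screenshot; in Swift the callback can be declared as a closure matching the C typedef `VTCompressionOutputCallback` (a sketch; the hand-off in the body is a placeholder):

```swift
import VideoToolbox

// void (*)(void *refCon, void *sourceFrameRefCon, OSStatus status,
//          VTEncodeInfoFlags infoFlags, CMSampleBufferRef sampleBuffer)
let compressionOutputCallback: VTCompressionOutputCallback = {
    outputCallbackRefCon, sourceFrameRefCon, status, infoFlags, sampleBuffer in
    guard status == noErr, let sampleBuffer = sampleBuffer else { return }
    // sampleBuffer now holds a compressed h.264 frame;
    // hand it off for packetization and network transmission.
}
```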
29. The process of preparing videostream in iOS
Then it is necessary to convert the stream of CMSampleBuffers into a stream of
packets suitable for transmission over the network.
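A sketch of that conversion, assuming Annex B framing: the encoder emits length-prefixed NAL units (AVCC format), so each 4-byte big-endian length field is replaced with a start code. The SPS/PPS parameter sets, obtainable from the sample buffer's format description, would be sent the same way but are omitted here:

```swift
import Foundation
import VideoToolbox

let startCode: [UInt8] = [0, 0, 0, 1]

// Convert the AVCC payload of a compressed CMSampleBuffer into
// Annex B NAL units ready to be sent over the network.
func nalUnits(from sampleBuffer: CMSampleBuffer) -> [[UInt8]] {
    guard let dataBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) else { return [] }
    var length = 0
    var pointer: UnsafeMutablePointer<Int8>?
    guard CMBlockBufferGetDataPointer(dataBuffer, atOffset: 0,
                                      lengthAtOffsetOut: nil,
                                      totalLengthOut: &length,
                                      dataPointerOut: &pointer) == noErr,
          let base = pointer else { return [] }

    var units: [[UInt8]] = []
    var offset = 0
    while offset + 4 <= length {
        // Each NAL unit is prefixed with its big-endian 4-byte length.
        var nalLength: UInt32 = 0
        memcpy(&nalLength, base + offset, 4)
        nalLength = CFSwapInt32BigToHost(nalLength)
        offset += 4
        let bytes = UnsafeRawBufferPointer(start: base + offset, count: Int(nalLength))
        units.append(startCode + Array(bytes))
        offset += Int(nalLength)
    }
    return units
}
```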