Introductory material for 360 video: the multimedia and rendering pipelines for playback, and a comparison of projections for 360 video rendering.
Introduction to 360 Video
1.
2. • 2D, 3D, VR, 360 Video: What's the difference?
– 2D vs 3D: Mono and Stereo
– VR vs 360 Video:
• Virtual Reality includes both graphically generated and live-action content
• 360 Video is video captured omnidirectionally
• Video playback pipeline
• Service scenarios for 360 Video
• How an IP-based media cloud service works
3. • 2D Video Streaming example
https://www.chromium.org/developers/design-documents/video
4. • 2 Pipelines of Playback
– Multimedia Pipeline: Same as 2D Playback
• Data Source → Demuxer → Video Decoder
– Rendering Pipeline: Different from 2D Playback
• Projection → Stereo Screen Split/Creation → Distortion Correction → Time Warping → Framebuffer → Screen
5. • 3D Object to 2D Plane Image
– Example: Sphere to Equirectangular Projection
– Distortion can occur during the process
http://clipground.com/equirectangular-projection-clipart.html
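The sphere-to-equirectangular mapping can be sketched in a few lines. The direction-to-pixel convention below (y up, z forward, longitude wrapping the image width) is an assumption for illustration, not taken from the slides:

```python
import math

def dir_to_equirect(x, y, z, width, height):
    """Map a unit direction vector on the sphere to (u, v) pixel
    coordinates in an equirectangular image of size width x height.
    Convention (assumed): y is up, z is forward."""
    lon = math.atan2(x, z)   # longitude in [-pi, pi]
    lat = math.asin(y)       # latitude in [-pi/2, pi/2]
    u = (lon / (2 * math.pi) + 0.5) * width
    v = (0.5 - lat / math.pi) * height
    return u, v

# The forward direction (0, 0, 1) lands at the image center.
print(dir_to_equirect(0.0, 0.0, 1.0, 4096, 2048))  # (2048.0, 1024.0)
```

The stretching near the poles (a whole row of pixels maps to a single point on the sphere) is the distortion the slide refers to.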
7. • Definition
– Parallax resulting from the eyes' horizontal separation of about 50–75 mm (interpupillary distance), which varies between individuals
– The human brain extracts depth information from the two-dimensional retinal images in stereopsis
• Definition in computer vision
– The difference in coordinates of similar features within two stereo images
• VR devices
– Need to show two different images for binocular parallax
– 360 2D videos: Hard to present depth information
– 360 3D videos: Different images for each eye
https://en.wikipedia.org/wiki/Binocular_disparity
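The computer-vision definition of disparity leads directly to triangulated depth via Z = f · B / d. A minimal sketch, assuming rectified cameras; the 1000 px focal length and 63 mm baseline (a typical interpupillary distance) are illustrative values, not from the slides:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Triangulate depth (meters) from the horizontal disparity of a
    feature between two rectified stereo images: Z = f * B / d."""
    if disparity_px <= 0:
        return float("inf")  # zero disparity -> point at infinity
    return focal_px * baseline_m / disparity_px

# Illustrative numbers: 1000 px focal length, 63 mm baseline,
# 10 px disparity -> object roughly 6.3 m away.
print(depth_from_disparity(1000, 0.063, 10))
```

Note the inverse relationship: nearby objects produce large disparities, which is why depth from stereo degrades quickly with distance.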
8. • The lenses in the VR device
– Magnify the image for a very wide field of view (FOV)
– Enhance immersion, but introduce pincushion distortion
• How to resolve the distortion?
– Pre-warp the image with a barrel distortion filter that cancels it out
[Figure: Pincushion Distortion vs. Barrel Distortion]
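One common form of such a pre-warp, as popularized by early VR distortion shaders, radially scales image coordinates by a polynomial in r². The coefficients below are illustrative only, not values from any real headset:

```python
def barrel_predistort(x, y, k1=0.22, k2=0.24):
    """Pre-warp normalized image coordinates (origin at the lens
    center) with a radial polynomial, r' = r * (1 + k1*r^2 + k2*r^4),
    so the lens's pincushion distortion cancels it.
    k1, k2 are illustrative coefficients (assumed, not from an HMD)."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale
```

Points at the center are unchanged, while points near the edge are pushed outward; after the lens squeezes the periphery back in, straight lines look straight again.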
9. • Definition
– A reprojection technique in VR that warps the rendered image before sending it to the display, correcting for head movement that occurred after rendering
– Uses the depth map (Z-buffer) already available to the engine
– Available from Oculus PC SDK 0.3.1 / Mobile SDK 0.4.0
• Benefits
– Decreases latency and increases FPS
– Reduces judder from missed frames
• Asynchronous Timewarp (ATW)
– Runs Timewarp and rendering on separate threads
– Creates a timewarped frame before every VSYNC
[Figure: Original vs. Timewarped frame]
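A toy, yaw-only version of the warp can illustrate the idea. Real timewarp reprojects the full 3-DoF head rotation (and, with a depth buffer, translation) in a shader just before scan-out; this sketch and its linear pixel-shift approximation are simplifications:

```python
def timewarp_shift(rendered_yaw_deg, latest_yaw_deg, fov_deg, frame_width_px):
    """Yaw-only timewarp sketch: how many pixels to shift the rendered
    frame so it matches the head pose sampled just before display.
    Assumes a linear yaw-to-pixel mapping across the FOV."""
    delta = latest_yaw_deg - rendered_yaw_deg
    return delta / fov_deg * frame_width_px

# Head turned 1 degree after rendering; with a 100-degree FOV across
# 1000 px, the frame is shifted by 10 px before display.
print(timewarp_shift(30.0, 31.0, 100.0, 1000))
```

Because the shift is computed from the newest pose sample, the displayed image tracks the head even when the renderer misses a frame, which is where the judder reduction comes from.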
10. Service scenarios:
1. Upload: Upload Local File → Ingestion Server → Streaming Server
2. Live Streaming: 360 Cameras → (RTMP) → Live Ingestion Server → (HLS)
3. Local Playback: 360 Cameras → device
11. • Local Playback
– Gear VR Example
– Put contents in VR device and play
• 360 VR Services
– Upload configuration comparisons among services
– VoD/Live Service on the Cloud
– Adaptive Streaming
12.
                 YouTube              Facebook             Samsung VR
Media Container  MP4, MKV             MP4                  MP4, MOV
File Size        128 GB               1.75 GB (<=10 min)   25 GB
Bitrates         36–45 Mbps @ 30fps   20 Mbps @ 30fps      40 Mbps
                 53–60 Mbps @ 60fps   30 Mbps @ 60fps
Video Codec      H.264                H.264                H.264
FPS              Up to 60             Up to 60             Up to 60
Resolution       3840x2160 (Mono)     4096x2048 (Mono)     4096x2048 (Mono)
                                                           4096x4096 (Stereo)
Projection       Equirectangular      Equirectangular      Equirectangular
Audio Codec      AAC                  AAC, MP3             AAC
Channels         2, 6 (5.1)           -                    1–6
3D Audio         -                    -                    Fraunhofer Cingo
13. [Figure: IP-based media cloud workflow — WEB / CMS / META front ends over Ingest → Processing → Delivery stages, with Analysis]
http://www.slideshare.net/awskorea/20151028-aws-media-customer-day-3-introducing-aws-cloud-based-media-processing-solution
14. • Viewport-dependent full encoding (Facebook)
– 30 different viewports
– 5 quality levels
– 30 x 5 = 150 full videos; 150 x 0.5 = 75
https://code.facebook.com/posts/1126354007399553/next-generation-video-encoding-techniques-for-360-video-and-vr/
• Tiled approach
– Size is not changed
– 5 quality levels
– 5 x 6 = 30 tiled videos; 30 x 0.75 = 22.5
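The slide's arithmetic can be reproduced directly. Interpreting the 0.5 and 0.75 factors as each variant's size relative to one full-size encode (so the totals are storage in units of one full video) is an assumption about the slide's intent:

```python
# Storage comparison from the slide, in units of one full-size encode.
# The per-variant size factors (0.5 and 0.75) are the slide's numbers;
# their interpretation as relative sizes is assumed.
viewports, qualities, tiles = 30, 5, 6

full_variants = viewports * qualities   # 150 full videos
full_storage = full_variants * 0.5      # 75.0

tiled_variants = qualities * tiles      # 30 tiled videos
tiled_storage = tiled_variants * 0.75   # 22.5

print(full_variants, full_storage, tiled_variants, tiled_storage)
```

Under this reading, the tiled approach needs roughly a third of the storage of per-viewport full encodes while still offering quality adaptation.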
15. • 360 Video Playback Pipeline
– Same Multimedia Pipeline as 2D video
– Different Rendering Pipeline
: Projection, Stereo/Parallax, Distortion Correction,
Timewarp
• 360 Video Service Scenarios
– Local, VoD and Live
– IP-based Media Service on Cloud
– Adaptive Streaming