The document discusses various aspects of digital video technology including:
1) Digital video recording principles such as how images are captured using CCD sensors and converted into digital files for storage.
2) Television standards and connection systems including color encoding systems, aspect ratios, and connection types like SCART, HDMI, etc.
3) Broadcast systems including terrestrial, satellite, and multiplex broadcasting which allows multiple signals to be transmitted simultaneously.
4) Elements of producing video such as how cameras capture light and focus images, microphones capture sound, and controls like shutter speed, aperture, and white balance affect the image.
5) Digital editing principles involving converting analogue to digital, compression formats, and linear and non-linear editing.
2. GRADING CRITERIA FOR U21A1 REPORT:
P1 - describe the principles of digital video technology and digital video recording with some appropriate use of subject terminology
• Give definitions of all of the terms covered in this presentation
M1 - explain the principles of digital video technology and digital video recording with reference to detailed illustrative examples and with generally correct use of subject terminology
• Find examples (pictures) that are relevant to the definitions
D1 - comprehensively explain the principles of digital video technology and digital video recording with elucidated examples and consistently using subject terminology correctly
• Use secondary sources of information (quotes/articles/screenshots from films) to support what you are writing about
3. TELEVISION STANDARDS AND CONNECTION SYSTEMS
a) The RGB system
- This is the standard colour system used in TV & Film
- The primary additive colours are Red, Green & Blue
- The human eye sees all colours as a combination of these colours
- The secondary additive colours are Yellow, Cyan and Magenta
- Televisions and computers display all colours as a combination of these colours
- Cameras convert incoming light into separate red, green and blue signals
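The additive mixing described above can be sketched in a few lines of Python, assuming 8-bit (0–255) channels:

```python
# Additive colour mixing: each colour is a (red, green, blue) triple, 0-255 per channel.
RED = (255, 0, 0)
GREEN = (0, 255, 0)
BLUE = (0, 0, 255)

def mix(*colours):
    """Add light sources channel by channel, clipping at the 8-bit maximum."""
    return tuple(min(255, sum(c[i] for c in colours)) for i in range(3))

# The secondary additive colours are pairwise mixes of the primaries:
yellow = mix(RED, GREEN)       # (255, 255, 0)
cyan = mix(GREEN, BLUE)        # (0, 255, 255)
magenta = mix(RED, BLUE)       # (255, 0, 255)
white = mix(RED, GREEN, BLUE)  # all three primaries together give white
```

This is light (additive) mixing: combining all three primaries gives white, unlike paint, where mixing pigments subtracts light.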
4. TELEVISION STANDARDS AND CONNECTION SYSTEMS
b) TV Standards
The following are colour encoding systems for analogue television
- PAL – Phase Alternating Line – was used in the UK and parts of Europe
- NTSC – National Television System Committee – was used in the US
- SECAM – Séquentiel couleur à mémoire – was used in France
The following are colour encoding systems for digital television
- DVB – Digital Video Broadcasting – used in the UK and many other countries
- DTMB - Digital Terrestrial Multimedia Broadcast – used in China
- ISDB - Integrated Services Digital Broadcasting - used in Japan
5. TELEVISION STANDARDS AND CONNECTION SYSTEMS
Two methods used for "painting" an image on a television screen:
• Interlace (for example 1080i)
– PAL – scans 50 fields per second (25 odd-line fields interleaved with 25 even-line fields, making 25 full frames)
– Needs to be de-interlaced by televisions and computer monitors
– Can create ‘artefacts’.
– Signal bandwidth – cheaper as it requires less bandwidth
• Progressive (for example 720p)
– All lines are drawn in sequence
– Requires more bandwidth
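The split into odd and even fields, and a simple "weave" de-interlacer that recombines them, can be sketched as follows (illustrative only; real de-interlacers are far more sophisticated):

```python
def split_fields(frame):
    """Split a frame (a list of scan lines) into odd and even fields.
    Interlaced systems transmit these fields alternately, halving bandwidth."""
    odd = frame[0::2]   # lines 1, 3, 5, ... (counting from the top)
    even = frame[1::2]  # lines 2, 4, 6, ...
    return odd, even

def weave(odd, even):
    """Simple 'weave' de-interlacing: interleave the two fields back into a frame.
    Works well for static images; moving objects produce 'combing' artefacts."""
    frame = []
    for o, e in zip(odd, even):
        frame.extend([o, e])
    return frame

frame = [f"line{i}" for i in range(1, 7)]
odd, even = split_fields(frame)
assert weave(odd, even) == frame  # a static frame survives the round trip
```

The combing artefact mentioned above happens because the two fields are captured at slightly different times, so a moving object sits in a different position on the odd and even lines.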
7. TELEVISION STANDARDS AND CONNECTION SYSTEMS
c) HDTV – High Definition TV
– 1080 lines at 30 frames per second (FPS)
– A lot more lines, drawn faster, than the analogue TV signal, which was 625 lines at 25 FPS
– The greater number of lines means the resolution is much higher, and the faster frame rate gives smoother motion
d) Aspect Ratio – to do with the width and height of the image
– TV and films were traditionally 4:3
– Screens with greater width are widescreen – 16:9
– Examples are on the next slide
8. [Slide of aspect ratio example images]
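The width-to-height relationship of an aspect ratio can be checked with simple arithmetic (a sketch using Python's exact fractions):

```python
from fractions import Fraction

def height_for(width, ratio):
    """Given a picture width and an aspect ratio (width:height), return the height."""
    return int(width / ratio)

SD_4_3 = Fraction(4, 3)
WIDE_16_9 = Fraction(16, 9)

# A 1920-pixel-wide widescreen image is 1080 pixels tall:
assert height_for(1920, WIDE_16_9) == 1080
# The same width at 4:3 would be taller:
assert height_for(1920, SD_4_3) == 1440
```

The same arithmetic explains letterboxing: a 4:3 picture shown at full width on a 16:9 screen needs black bars to fill the unused height.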
9. TELEVISION STANDARDS AND CONNECTION SYSTEMS
e) Connection Systems:
– Phono Connections have one video line and two audio lines
– SCART connections separate luminance (brightness) and chrominance (colour RGB)
– Serial/Parallel Port connections are used in digital systems/computers
• Please research the following and explain where they may be used – also look at USB & Firewire
11. BROADCAST SYSTEMS
B) Satellite Broadcasting
– The transmission of a TV signal that is beamed up out of the Earth's atmosphere to a satellite, which relays it back down to receivers on Earth
12. BROADCAST SYSTEMS
C) Multiplexes
– Technology that enables several signals to be transmitted at the same time along the same path – e.g. multichannel television
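A toy model of how a multiplex interleaves several channels onto one path is round-robin time-division multiplexing (the channel names and packet labels below are made up for illustration; real broadcast muxes use MPEG transport streams):

```python
def multiplex(streams):
    """Round-robin time-division multiplexing: interleave packets from several
    channels into a single transmitted stream, one packet from each in turn."""
    muxed = []
    for packets in zip(*streams):
        muxed.extend(packets)
    return muxed

# Two hypothetical channels sharing one transmission path:
channel_a = ["A1", "A2", "A3"]
channel_b = ["B1", "B2", "B3"]
stream = multiplex([channel_a, channel_b])  # ["A1", "B1", "A2", "B2", "A3", "B3"]
```

The receiver does the reverse: it picks out only the packets belonging to the channel the viewer has selected.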
14. PRODUCING IMAGES AND SOUND
• How images are produced
– Light passes through the camera lens
– The image is focussed on the prism
– The light is split into its 3 primary colours
– The 3 beams of light pass to 3 Charge Coupled Devices (CCDs) or image sensors
– The CCDs transform the light into electrical signals
– These signals are then converted through a series of microchips into a digital image
– This digital image is then stored (usually on an SD Card)
– The amount of light that reaches the CCDs is determined by the Shutter Speed and Iris Aperture
15. PRODUCING IMAGES AND SOUND
• The Iris
– Can be opened up or closed down
– Smaller aperture = less light
– Larger aperture = more light
16. PRODUCING IMAGES AND SOUND
• Shutter Speed
– This is the length of time that light falls on the sensor for each frame
• Slow shutter speed = more light
• Fast shutter speed = less light
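The combined effect of the iris and shutter on light can be expressed with the standard exposure-value formula, EV = log2(N²/t), where N is the f-number and t the shutter time in seconds (a sketch; real cameras also factor in sensor sensitivity/gain):

```python
import math

def exposure_value(f_number, shutter_seconds):
    """EV = log2(N^2 / t): a higher EV means LESS light reaches the sensor.
    Opening the iris (smaller f-number) or slowing the shutter lowers the EV."""
    return math.log2(f_number ** 2 / shutter_seconds)

# Halving the shutter time costs exactly one 'stop' of light:
ev_slow = exposure_value(5.6, 1 / 50)   # f/5.6 at 1/50 s
ev_fast = exposure_value(5.6, 1 / 100)  # f/5.6 at 1/100 s
assert abs((ev_fast - ev_slow) - 1.0) < 1e-9
```

This is why iris and shutter trade off against each other: a faster shutter (for sharp motion) can be compensated by a wider aperture, keeping the overall exposure the same.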
17. PRODUCING IMAGES AND SOUND
• White Balance
– A control on the camera that compensates for the colour cast of daylight (blue) and artificial light (yellow)
– Different light sources have different colour temperatures and will look different on camera
– White balance compensates for this change in colour temperature
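White balance can be modelled as per-channel gain correction: scale the red, green and blue signals so that a reference white object comes out neutral. This is a simplified white-patch model, not any particular camera's algorithm:

```python
def white_balance(pixel, reference_white):
    """Scale each channel so the reference white becomes neutral (equal R, G, B).
    A bluish daylight cast or a yellowish tungsten cast is corrected the same way."""
    r_ref, g_ref, b_ref = reference_white
    gains = (g_ref / r_ref, 1.0, g_ref / b_ref)  # normalise against green
    return tuple(min(255, round(c * g)) for c, g in zip(pixel, gains))

# A white card photographed under bluish daylight reads high in blue;
# after correction it becomes neutral grey:
daylight_white = (180, 200, 240)
assert white_balance(daylight_white, daylight_white) == (200, 200, 200)
```

Once the gains are computed from the white reference, the same gains are applied to every pixel in the frame, removing the cast from the whole image.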
18. PRODUCING IMAGES AND SOUND
• Focus
– Focus is adjusted by changing the position of the lens in relation to the prism
– This helps to sharpen or blur the image
19. PRODUCING IMAGES AND SOUND
• Microphones
– How sound is produced
• Sound waves striking the head of the microphone vibrate a diaphragm attached to a coil around a magnet
• The movement of the coil in the magnetic field induces an electrical signal
20. RECORDING AND PLAYBACK
• DVDs and CDs (Digital Versatile Disc/Compact Disc)
– They are made of several layers of plastic with a spiral track
– Data is written to the disc with a laser that burns ‘pits’ into the spiral track
– Data is read by a laser and detector that senses pits in the tracks
21. RECORDING AND PLAYBACK
• CDs were first used in 1982. They can contain 650MB of info
• DVDs were first used in 1996. They can contain between 4.7GB (2 hours) and 17GB (8 hours) of info (this depends on the structure of the disc and how they are used)
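The "4.7GB for 2 hours" figure implies an average video bitrate, which simple arithmetic can check (a rough estimate using decimal units and ignoring audio and disc overhead):

```python
def average_bitrate_mbps(capacity_gb, hours):
    """Average bitrate, in megabits per second, that fits `hours` of video
    on a disc of `capacity_gb` gigabytes (decimal units, overhead ignored)."""
    bits = capacity_gb * 1e9 * 8
    seconds = hours * 3600
    return bits / seconds / 1e6

# A single-layer DVD (4.7 GB) holding 2 hours averages roughly 5 Mbit/s,
# a typical MPEG-2 rate for standard-definition video:
rate = average_bitrate_mbps(4.7, 2)
assert 5.0 < rate < 5.5
```

Fitting 8 hours on a 17GB disc works out to a similar rate, which is why the capacity and the running time scale together.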
22. DIGITAL EDITING
• A) Analogue to Digital conversion
– Equipment can change an analogue signal into a digital signal and vice versa
– This can be useful for capturing analogue video to a PC for editing
• B) Digital compression
– Images can be encoded using a much smaller amount of storage space (bytes)
– MPEG – ‘Moving Picture Experts Group’ – is a common family of compression standards
– AVI files, which are often less compressed, are larger
– Look at the following types of digital file format, define them and find out about them
• Different types of Digital file formats:
– AVI
– MPG
– VOB
– MP4
– MOV
– MKV
– WMV
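To see why compression is needed at all, compare raw video's data rate with what a disc or broadcast channel can carry (rough arithmetic, assuming 8 bits per colour channel):

```python
def raw_bitrate_mbps(width, height, fps, bits_per_pixel=24):
    """Uncompressed data rate of a video stream in megabits per second."""
    return width * height * fps * bits_per_pixel / 1e6

# Uncompressed standard-definition PAL video (720x576 pixels at 25 fps):
raw = raw_bitrate_mbps(720, 576, 25)  # roughly 249 Mbit/s
# MPEG-2 on DVD runs at roughly 5 Mbit/s, a compression ratio near 50:1:
assert 40 < raw / 5 < 60
```

Compression achieves this by discarding detail the eye barely notices and by encoding only the differences between successive frames, rather than every frame in full.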
23. DIGITAL EDITING
• C) Editing Images and Sound
– Linear Editing
• Linear edit systems are analogue
• In linear editing, shots are recorded one after the other in the required order
• If an edit is to be changed, then all of the following shots have to be re-edited/re-recorded as well
• What are the positives and negatives of this kind of editing?
24. DIGITAL EDITING
• C) Editing Images and Sound
– Non-Linear Editing
• Non-Linear editing systems are digital
• Shots can be taken in any order at any time
• A Non-Linear editing system consists of:
– A tape, film or SD Card
– A computer
– A monitor
– Speakers
– Software – Adobe Premiere, Final Cut Pro, Avid
25. DIGITAL EDITING
• C) Editing Images and Sound
– Non-Linear Editing
• Recorded footage is captured from the SD Card
• Captured footage is imported into the project
• The individual clips can then be added to the timeline
• Shots can be cut to the right length and arranged in the appropriate order
• The edited film can be laid off (exported) and then uploaded to a website (YouTube, Vimeo, etc.)
What are the positives and negatives of this kind of editing?
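The non-linear workflow above – clips trimmed to length and arranged in any order – maps naturally onto a simple timeline data structure. This is a toy sketch (the file names are invented), not how Premiere, Final Cut Pro or Avid actually store projects:

```python
class Clip:
    """A clip references source footage by in/out points; nothing is re-shot."""
    def __init__(self, source, in_s, out_s):
        self.source, self.in_s, self.out_s = source, in_s, out_s

    @property
    def duration(self):
        return self.out_s - self.in_s

# Clips can be placed on the timeline in any order, regardless of shooting order:
timeline = [
    Clip("interview.mp4", 12.0, 20.5),  # 8.5 s from the middle of a take
    Clip("b_roll.mp4", 3.0, 7.0),       # 4.0 s of cutaway footage
]

# Reordering or inserting one clip never forces the others to be re-edited,
# which is the key advantage over linear editing:
timeline.insert(0, Clip("title.mp4", 0.0, 4.0))
total = sum(c.duration for c in timeline)  # 16.5 seconds
```

Because each clip only stores references into the source footage, every edit is non-destructive: the original files on the SD Card are never altered.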