UNIT 1: UNDERSTANDING SOUND
Sound is one of the most important elements of multimedia: it tells a story and sets the mood of an illustration or motion graphic. It is an effective way to communicate with an audience and capture attention while conveying the story and message of a multimedia production.
Difference between Sound and Audio
Sound
● is a mechanical vibration or frequency that travels through a solid, liquid, or gas;
● it may or may not be perceived by the ear, but it still creates a mechanical disturbance in a medium.
Audio
● is a sound, but not all sounds are audio;
● anything audible that has been made, captured, or processed by an electronic or digital device is considered audio.
Digital audio
● refers to digitally produced sound, including beats and samples that aren't achievable with traditional instruments or do not exist in the real world.
1. Pitch and Frequency
● Pitch is the perception of sound frequency by the human ear;
● the higher the frequency, the higher the pitch, and the lower the frequency, the lower the pitch;
● frequency refers to the number of vibrations that occur per second, while pitch describes how high or low a sound is perceived by the human ear.
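The frequency-pitch relationship can be sketched in a few lines of Python: a pure tone is just a sine wave whose frequency sets the pitch we hear. The 440 Hz / 880 Hz tones and the 8 kHz sample rate below are illustrative choices, not values from this unit:

```python
import math

def sine_wave(freq_hz, duration_s=1.0, sample_rate=8000, amplitude=1.0):
    """Generate samples of a pure tone; the frequency sets the perceived pitch."""
    n = int(duration_s * sample_rate)
    return [amplitude * math.sin(2 * math.pi * freq_hz * t / sample_rate)
            for t in range(n)]

# A 440 Hz tone (concert A) completes 440 cycles per second;
# an 880 Hz tone, one octave higher in pitch, completes twice as many.
low = sine_wave(440)
high = sine_wave(880)
```

Counting the sign changes in each list confirms the higher-pitched tone oscillates twice as often in the same one second of audio.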
2. Amplitude and Loudness
● Amplitude determines the loudness or volume of a sound and is measured as the height of the wave.
● The difference between amplitude and frequency is that amplitude refers to the vertical distance, or maximum displacement, of a point on a wave from its equilibrium position, while frequency refers to the number of wave cycles per second.
(Diagram: a wave labeled with its amplitude and frequency.)
● Loudness is the subjective perception of the physical strength (sound pressure level) of a sound and is commonly measured in decibels (dB).
● A whisper is about 30 dB, normal conversation is
about 60 dB, and a motorcycle engine running is
about 95 dB. Noise above 70 dB over a prolonged
period of time may start to damage your hearing.
Loud noise above 120 dB can cause immediate harm
to your ears.
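The decibel figures above follow from the standard sound-pressure-level formula, dB SPL = 20 · log10(p / p0), where p0 = 20 micropascals is the nominal threshold of hearing. A minimal Python sketch:

```python
import math

REF_PRESSURE_PA = 20e-6  # 20 micropascals, the standard hearing-threshold reference

def sound_pressure_level_db(pressure_pa):
    """Convert a sound pressure (in pascals) to decibels (dB SPL)."""
    return 20 * math.log10(pressure_pa / REF_PRESSURE_PA)

# Doubling the pressure adds about 6 dB; a tenfold increase adds 20 dB.
print(round(sound_pressure_level_db(0.02), 1))  # 60.0 -- roughly normal conversation
```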
3. Speed of Sound
● The speed of sound refers to how fast sound waves travel through a medium. The speed differs from medium to medium: compared to gases and liquids, sound travels fastest in solids, where molecules are closely packed together.
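As a rough illustration of how the medium affects speed, the sketch below compares travel times using typical textbook speeds (343 m/s in air, 1,480 m/s in water, 5,960 m/s in steel; these values are illustrative, not from this unit):

```python
# Approximate speeds of sound (m/s) at room conditions; typical textbook figures.
SPEED_M_PER_S = {"air": 343, "water": 1480, "steel": 5960}

def travel_time_s(distance_m, medium):
    """Time for sound to cover a distance in the given medium."""
    return distance_m / SPEED_M_PER_S[medium]

for medium in ("air", "water", "steel"):
    print(f"1 km through {medium}: {travel_time_s(1000, medium):.2f} s")
```

The same kilometer takes almost three seconds in air but a fraction of a second in steel, matching the bullet above.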
4. Reflection of Sound
● Sound waves, like light waves, follow the laws of reflection: once a sound wave hits a solid surface, it bounces back into the same medium.
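A common worked example of reflection is locating a surface from its echo: the reflected wave covers the distance twice, so (speed × delay) / 2 gives the one-way distance. A small sketch, assuming the textbook air speed of 343 m/s:

```python
SPEED_OF_SOUND_AIR = 343  # m/s, typical at about 20 degrees C

def distance_to_surface_m(echo_delay_s):
    """A reflected sound (echo) travels to the surface and back,
    so the one-way distance is half the round trip."""
    return SPEED_OF_SOUND_AIR * echo_delay_s / 2

print(distance_to_surface_m(2.0))  # 343.0 -- a 2-second echo means a 343 m distance
```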
5. Timbre
● Timbre, also known as tone color or tone quality, describes the quality of a sound made by an instrument, a voice, or an object.
● It is a way of differentiating sounds that have the same frequency.
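Timbre can be illustrated numerically: two tones with the same fundamental frequency (so the same pitch) but different harmonic content produce different waveforms. The harmonic weights below are made-up values purely for illustration:

```python
import math

def tone_with_harmonics(freq_hz, harmonic_weights, n=1000, sample_rate=8000):
    """Sum a fundamental and its harmonics; the mix of harmonic weights
    is what gives two same-pitch tones a different timbre."""
    return [sum(w * math.sin(2 * math.pi * freq_hz * (k + 1) * t / sample_rate)
                for k, w in enumerate(harmonic_weights))
            for t in range(n)]

# Same 440 Hz pitch, different (illustrative) harmonic mixes:
flute_like = tone_with_harmonics(440, [1.0, 0.2])
clarinet_like = tone_with_harmonics(440, [1.0, 0.0, 0.5])
```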
Waves and Medium
● Any disturbance traveling through a medium is called a wave, while a medium is anything that carries a wave; it can be a substance or material such as water, air, or a solid surface.
Major Categories of Waves
1. Transverse Waves
● In a transverse wave, the disturbance comes from a direction perpendicular to the wave's travel.
● Medium particles oscillate at 90 degrees, in an up-and-down motion, relative to the direction of the wave.
2. Longitudinal Waves
● The disturbance is along the direction in which the wave travels.
● If the disturbance is parallel to the direction of the wave, it is a longitudinal wave, also called a compression wave.
● Examples are sound waves and ultrasound waves.
3. Surface Waves
● A surface wave travels along the surface of water, in which particles undergo a circular motion.
1. Electromagnetic Waves
● Light waves produced by the sun are an example of electromagnetic waves because of their capability to transmit energy through empty space, or a vacuum.
2. Mechanical Waves
● Sound waves produced by the human voice, instruments, and other real-world objects fall under the category of mechanical waves, which require a medium in order to transport their energy from one place to another.
Recording of Sound Waves
● A microphone is a device used for recording sound; it converts mechanical energy into electrical energy. When a sound wave hits the diaphragm inside the microphone, the coil and magnet move back and forth, creating vibrations in a magnetic field; these vibrations are converted into an electrical current, which produces an audio signal.
Dynamic Microphones
● The most common type of microphone, with a unidirectional or cardioid (heart-shaped) pickup pattern that allows the device to pick up sound in the direction the diaphragm is pointing.
Condenser Microphones
● Also called capacitor microphones, and very common in professional recording studios. They are capable of capturing transients, the short, high-amplitude bursts of a sound wave, providing a richer, clearer, and more accurate recording. Many condensers have switches to pick up sound from the front and back (bidirectional) or from all directions (omnidirectional).
Ribbon Microphones
● Popular in the early days of recording and meant for softer sounds. Ribbon microphones are bidirectional, which makes them useful for capturing audience reaction at a show or recording an instrument or ensemble.
Did you know?
Pulse Code Modulation (PCM) is a standard method used by computers and other digital devices to store a sampled analog signal as a digital signal.
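A minimal sketch of the PCM idea: each sampled amplitude is mapped to the nearest integer code the bit depth allows. The helper name and the [-1.0, 1.0] input range are assumptions for illustration:

```python
def pcm_quantize(samples, bit_depth=16):
    """Map each analog sample in [-1.0, 1.0] to the nearest signed integer
    code, the way PCM stores a sampled signal digitally."""
    max_code = 2 ** (bit_depth - 1) - 1  # e.g. 32767 for 16-bit audio
    return [round(s * max_code) for s in samples]

print(pcm_quantize([0.0, 0.5, -1.0], bit_depth=16))  # [0, 16384, -32767]
```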
Sampling Rate/Sampling Frequency
refers to how many times the electric voltage (amplitude) is captured in one second of a sound file.
According to Nyquist, the sampling rate should be at least twice the highest frequency you want to capture; that is, the sampling rate F should satisfy F > 2f. Hence the sample rate must be at least 40 kHz, since the human hearing range is 20 to 20,000 Hz.
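The Nyquist rule above reduces to a one-line calculation:

```python
def nyquist_rate_hz(highest_freq_hz):
    """Minimum sampling rate needed to capture a frequency component
    without aliasing (Nyquist criterion: F > 2f)."""
    return 2 * highest_freq_hz

print(nyquist_rate_hz(20_000))  # 40000 -- for the 20 kHz top of human hearing
```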
Types of Sound
Audible
Sound perceived by the human ear, ranging from 20 Hz to 20 kHz. For sound recording, the standard minimum sample rate is 44.1 kHz.
Inaudible
Sound that can't be perceived by the human ear; inaudible sounds have frequencies below 20 Hz or above 20 kHz.
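The audible and inaudible ranges above can be expressed as a small classifier; the function name and labels are illustrative:

```python
def classify_frequency(freq_hz):
    """Label a frequency relative to the nominal human
    hearing range of 20 Hz to 20 kHz."""
    if freq_hz < 20:
        return "infrasonic (inaudible)"
    if freq_hz <= 20_000:
        return "audible"
    return "ultrasonic (inaudible)"

print(classify_frequency(440))     # audible
print(classify_frequency(40_000))  # ultrasonic (inaudible)
```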
Jitter and Dither
● When recording (during analog-to-digital conversion and vice versa), a clock distortion called jitter may occur: a small delay or advance relative to the programmed rhythm.
● It is usually caused by changes in electrical voltage and noise in the audio signal.
● To cover up this signal error, a low-level background noise called dither is applied when exporting the audio or converting it to a lower bit depth (e.g., from 24-bit to 16-bit resolution).
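A sketch of how dither works when reducing bit depth: a little random noise is added before the extra bits are truncated, so the quantization error is masked instead of forming an audible pattern. This uses simple rectangular dither purely for illustration; real converters use more refined noise shapes:

```python
import random

def reduce_bit_depth(samples, from_bits=24, to_bits=16, dither=True):
    """Requantize integer PCM samples to fewer bits; adding a little random
    noise (dither) before truncating masks the quantization error pattern."""
    shift = from_bits - to_bits
    step = 2 ** shift  # size of one output quantization step in input units
    out = []
    for s in samples:
        if dither:
            s += random.randint(-step // 2, step // 2)  # illustrative rectangular dither
        out.append(s >> shift)  # drop the extra bits
    return out
```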
Did you know?
What is bit depth?
Bit depth defines the number of measurement values available to describe the amplitude of an audio sample. Each bit of resolution contributes about 6 dB of dynamic range (Figure 1.8).
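The "6 dB per bit" rule of thumb makes dynamic range a one-line calculation (the more precise constant is about 6.02 dB per bit):

```python
def dynamic_range_db(bit_depth):
    """Approximate dynamic range of linear PCM: about 6.02 dB per bit."""
    return 6.02 * bit_depth

print(round(dynamic_range_db(16), 1))  # 96.3 -- CD audio
print(round(dynamic_range_db(24), 1))  # 144.5 -- studio recording
```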
Bitrate is the term used to describe the amount of data used to encode each second of audio.
A higher bitrate generally means better audio quality.
"Bitrate is going to determine audio fidelity,"
says producer and engineer Gus Berry.
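For uncompressed PCM, the bitrate follows directly from sample rate, bit depth, and channel count:

```python
def pcm_bitrate_kbps(sample_rate_hz, bit_depth, channels):
    """Uncompressed PCM bitrate = samples/second x bits/sample x channels."""
    return sample_rate_hz * bit_depth * channels / 1000

print(pcm_bitrate_kbps(44_100, 16, 2))  # 1411.2 -- CD-quality stereo, in kbps
```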
Major Groups of Audio File Formats
Uncompressed Audio Formats
● Used on CDs, and the standard format used by electronic devices for storing audio data
● E.g.: WAV, AIFF, and PCM
Lossy Compression
● Enables greater reduction
in file size by removing
some audio information
● E.g.: MP3 and AAC
Lossless Compression
● An audio format that stores data in less space without losing any information
● E.g.: FLAC, WavPack, ALAC, and Monkey's Audio