Salford City College
Eccles Sixth Form Centre
BTEC Extended Diploma in GAMES DESIGN
Unit 73: Sound For Computer Games
IG2 Task 1
Produce a glossary of terms specific to the methods and principles of sound design and production. Using a provided template, you must research and gather definitions specific to the provided glossary terms. Each definition must be referenced with the URL of the website from which you obtained it.
You must also, where possible, provide specific details of how the researched definitions relate to your own production practice.
Name: Adam Bailey
RESEARCHED DEFINITION (provide a short internet-researched definition and URL link)
DESCRIBE THE RELEVANCE OF THE RESEARCHED TERM TO YOUR OWN PRODUCTION PRACTICE
SOUND DESIGN
METHODOLOGY
Foley Artistry Foley is the reproduction of everyday sound effects that are added to film, video,
and other mediums in post-production to enhance audio quality. These
reproduced sounds can be anything from the swishing of clothing and footsteps
to squeaky doors and breaking glass.
http://en.wikipedia.org/wiki/Foley_(filmmaking)
This has been highly relevant to the work I have been doing: for example, I have had to gather recordings of everyday sounds and then edit them so that they sound like something different.
Sound Libraries A sample library is a collection of digital sound recordings, known as samples, for
use by composers, arrangers, performers, and producers of music. The sound
files are loaded into a sampler - either hardware or computer-based - which is
then used to create music.
http://en.wikipedia.org/wiki/Sample_library
This is essentially where all the sounds and media you have produced are stored. Libraries can contain almost anything: VST effects, VSTi instruments, recorded media items, even sounds from the web. This has been relevant to my work because I have had to store my work in a sample library to keep my media items organised.
SOUND FILE FORMATS
Uncompressed
There is one major uncompressed audio format, PCM, which is usually stored in a .wav file on Windows or in an .aiff file on Mac OS. The AIFF format is based on the Interchange File Format (IFF). The WAV format is based on the Resource Interchange File Format (RIFF), which is similar to IFF. WAV and AIFF are flexible file formats designed to store more or less any combination of sampling rates or bit rates. This makes them suitable file formats for storing and archiving an original recording. BWF (Broadcast Wave Format) is a standard audio format created by the European Broadcasting Union as a successor to WAV. BWF allows metadata to be stored in the file. See European Broadcasting Union: Specification of the Broadcast Wave Format (EBU Technical document 3285, July 1997). This is the primary recording format used in many professional audio workstations in the television and film industry. BWF files include a standardized timestamp reference which allows for easy synchronization with a separate picture element. Stand-alone, file-based, multi-track recorders from AETA, Sound Devices, Zaxcom, HHB Communications Ltd, Fostex, Nagra, Aaton, and TASCAM all use BWF as their preferred format. Uncompressed audio can also be stored in raw audio format.
http://en.wikipedia.org/wiki/Audio_file_format#Uncompressed_audio_format
We have used uncompressed file formats in our work: when we were recording, the field recorders saved our files as .wav, which is an uncompressed format, and when we exported files from Reaper it also produced .wav files so they were recognisable by media players.
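As an illustration of how uncompressed PCM ends up inside a .wav container, here is a minimal sketch using Python's standard-library wave module (the tone, filename, and parameters are my own example, not part of the coursework):

```python
import math
import struct
import wave

# One second of a 440 Hz sine tone as uncompressed 16-bit PCM samples.
SAMPLE_RATE = 44100
samples = [
    int(32767 * math.sin(2 * math.pi * 440 * n / SAMPLE_RATE))
    for n in range(SAMPLE_RATE)
]

# The wave module writes the RIFF/WAV header for us; the data chunk
# is just the raw PCM numbers packed as little-endian 16-bit integers.
with wave.open("tone.wav", "wb") as wav:
    wav.setnchannels(1)         # mono
    wav.setsampwidth(2)         # 2 bytes = 16 bits per sample
    wav.setframerate(SAMPLE_RATE)
    wav.writeframes(struct.pack("<%dh" % len(samples), *samples))
```

Because nothing is compressed, the file size is essentially sample rate × bytes per sample × channels × duration, which is why uncompressed formats suit archiving but grow quickly.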
.wav Microsoft's native uncompressed audio format for Windows. While a flexible
format, capable of storing very high quality audio, WAV has a few limitations
which have become apparent over its lifetime, and extensions to the format
have been developed to address them. One shortcoming identified in WAV is its
inability to hold any metadata describing its audio contents
http://www.jiscdigitalmedia.ac.uk/guide/uncompressed-audio-file-formats
We use this a lot in lessons, for example when exporting audio from Reaper, and it is also the file format produced when we make audio recordings on the field recorders.
.aiff Audio Interchange File Format - developed by Apple and Amiga. Works similarly
to WAV but uses a different method of dividing the PCM data into manageable
chunks. Widely available free codecs for all platforms. AIFF is the native format
for audio on Mac OSX
http://www.jiscdigitalmedia.ac.uk/guide/uncompressed-audio-file-formats
We didn’t use this audio file format
because we was using reaper which
converts the files into .wav and .aiff
is commonly found on Apple
software so probably wouldn’t be
compatible on Windows
.au The Au file format is a simple audio file format introduced by Sun Microsystems.
The format was common on NeXT systems and on early Web pages. Originally it
was headerless, being simply 8-bit µ-law-encoded data at an 8000 Hz sample
rate. Hardware from other vendors often used sample rates as high as 8192 Hz,
often integer factors of video clock signals.
https://www.princeton.edu/~achaney/tmve/wiki100k/docs/Au_file_format.html
We didn’t use .au because it is a
rare file format to find so we
couldn’t of really used it and that I
don’t think there is any software
that I know of that could open this
file format.
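The µ-law encoding the .au format originally used can be sketched with the standard companding formula; this function is a hypothetical illustration, not code we used in the unit:

```python
import math

def mulaw_encode(x, mu=255):
    """Compress a sample in [-1.0, 1.0] with mu-law companding.

    Quiet sounds are boosted relative to loud ones, which is why
    8 bits per sample were enough for speech in early .au files.
    """
    return math.copysign(math.log1p(mu * abs(x)) / math.log1p(mu), x)

print(round(mulaw_encode(1.0), 3))   # full-scale stays at 1.0
print(round(mulaw_encode(0.01), 3))  # a quiet sample comes out much larger
```

The encoded value would then be quantized to 8 bits, giving the headerless 8 kHz µ-law stream the definition above describes.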
.smp Files in the .smp file extension are in mono, and are also in a 16 bit format for
audio that has support for loop points. It was initially used by TurtleBeach during
the 90's. The Turtle Beach Sample Vision is able to create new sample files that
are stored with the .smp file extension.
http://file-extension.paretologic.com/detail.php/File-Extension-smp
This isn’t really relevant because it
isn’t really high definition therefore
we don’t really want low bit-rate
audio we are aiming for higher
audio bit-rate
Lossy Compression In information technology, "lossy" compression is the class of data encoding
methods that uses inexact approximations (or partial data discarding) for
representing the content that has been encoded. Such compression techniques
are used to reduce the amount of data that would otherwise be needed to store,
handle, and/or transmit the represented content. (In the Wikipedia article, versions of a photograph demonstrate how the approximation of an image becomes progressively coarser as more of the original image data is removed.) The amount of data reduction possible using lossy compression can often be much more substantial than what is possible with lossless data compression techniques.
http://en.wikipedia.org/wiki/Lossy_compression
We didn't use lossy compression because we didn't really compress any of our data; our work stayed in uncompressed formats.
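As a toy illustration of the lossy idea (not a real codec such as MP3), discarding the low-order bits of each sample shrinks the information content in a way that cannot be undone:

```python
def quantize(samples, keep_bits=8, depth_bits=16):
    """Crudely 'lossy-compress' 16-bit samples by discarding low-order bits."""
    shift = depth_bits - keep_bits
    return [(s >> shift) << shift for s in samples]

original = [12345, -4097, 300, 0]
coarse = quantize(original)
print(coarse)               # close approximations of the originals
print(coarse == original)   # False: the discarded detail is gone for good
```

Real lossy codecs are far smarter (they discard what the ear cannot hear), but the principle is the same: an inexact approximation in exchange for much smaller data.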
.mp3 MPEG, audio layer 3. Layer 3 is one of three coding schemes (layer 1, layer 2 and
layer 3) for the compression of audio signals. Layer 3 uses perceptual audio
coding and psychoacoustic compression to remove all superfluous information
(more specifically, the redundant and irrelevant parts of a sound signal. The stuff
the human ear doesn't hear anyway). It also adds a MDCT (Modified Discrete
Cosine Transform) that implements a filter bank, increasing the frequency
resolution 18 times higher than that of layer 2.
http://www.webopedia.com/TERM/M/MP3.html
We didn’t really use .mp3 format
because most of our work was in
.wav
AUDIO LIMITATIONS
Sound Processor Unit (SPU)
A sound card (also known as an audio card) is an internal computer expansion card that facilitates the input and output of audio signals to and from a computer under control of computer programs. The term sound card is also applied to external audio interfaces that use software to generate sound, as opposed to using hardware inside the PC. Typical uses of sound cards include providing the audio component for multimedia applications such as music composition, editing video or audio, presentation, education and entertainment (games) and video projection.
http://en.wikipedia.org/wiki/Sound_card
We used this whenever we were working on the computers, so that we could hear what we were producing in Reaper.
Digital Sound Processor
(DSP)
The goal of DSP is usually to measure, filter and/or compress continuous real-
world analogue signals. The first step is usually to convert the signal from an
analogue to a digital form, by sampling and then digitizing it using an analogue-
to-digital converter (ADC), which turns the analogue signal into a stream of
numbers. However, often, the required output signal is another analogue output
signal, which requires a digital-to-analogue converter (DAC). Even if this process
is more complex than analogue processing and has a discrete value range, the
application of computational power to digital signal processing allows for many
advantages over analogue processing in many applications, such as error
detection and correction in transmission as well as data compression.
http://en.wikipedia.org/wiki/Digital_signal_processing
We used this in our sound recording: the sounds we recorded from real life are analogue, and the recorder converted them into digital form, so DSP was involved throughout our recording sessions.
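The analogue-to-digital conversion step described above can be sketched as sampling plus quantization; this is a simplified model of an ADC, not Reaper's or the recorder's actual code:

```python
import math

def adc(signal, sample_rate, duration, levels=2**16):
    """Sample a continuous signal and quantize it to discrete levels (ADC sketch)."""
    out = []
    for n in range(int(sample_rate * duration)):
        t = n / sample_rate
        x = signal(t)                          # continuous value in [-1, 1]
        q = round((x + 1) / 2 * (levels - 1))  # map to an integer 0..levels-1
        out.append(q)
    return out

# A 1 kHz sine sampled at 8 kHz for 10 ms becomes 80 discrete numbers.
stream = adc(lambda t: math.sin(2 * math.pi * 1000 * t), 8000, 0.01)
print(len(stream))  # 80
```

The stream of numbers is what the DSP then filters or compresses; a DAC would perform the reverse mapping on playback.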
Random Access Memory
(RAM)
RAM (pronounced ramm) is an acronym for random access memory, a type of computer memory that can be accessed randomly; that is, any byte of memory can be accessed without touching the preceding bytes. RAM is the most common type of memory found in computers and other devices, such as printers.
http://www.webopedia.com/TERM/R/RAM.html
We used RAM whenever we were on the computers making the audio files; if there had been no RAM in the computer, it wouldn't have been able to open Reaper.
Mono Audio
Commonly called mono sound, mono, or non-stereo sound, this early sound system used a single channel of audio for sound output. In monophonic sound systems, the signal sent to the sound system encodes one single stream of sound and it usually uses just one speaker. Monophonic sound is the most basic format of sound output.
http://www.webopedia.com/TERM/M/monophonic_sound.html
We didn't use mono sound because it is only a single channel, which doesn't sound as good; we aimed for at least stereo sound.
Stereo Audio Commonly called stereo sound or just stereo, stereophonic sound divides sounds
across two channels (recorded on two separate sources) then the recorded
sounds are mixed so that some elements are channelled to the left and others to
the right. Stereophonic sound is generally considered the best sound technology of the 1950s and early 1960s.
http://www.webopedia.com/TERM/S/stereophonic_sound.html
We used stereo in Reaper when we were making the audio files.
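The relationship between stereo and mono can be shown with a small sketch: downmixing two channels to one by averaging matched samples (the sample values are illustrative only):

```python
def stereo_to_mono(left, right):
    """Downmix two channels to one by averaging matching samples."""
    return [(l + r) // 2 for l, r in zip(left, right)]

left = [100, 200, -50]
right = [300, 0, -150]
print(stereo_to_mono(left, right))  # [200, 100, -100]
```

Going the other way is impossible without guessing, which is one reason we recorded and mixed in stereo from the start rather than upmixing mono later.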
Surround Sound A multichannel sound technology that produces 6 channels of sound in the left,
right, centre, left surround, right surround, and rear centre positions. 6.1
systems also have 1 channel for LFE (low frequency effects) which is usually sent
to a subwoofer.
http://www.webopedia.com/TERM/6/6_1_surround_sound.html
We used surround sound when we were doing sound recording using the recorder's on-board microphones.
Direct Audio (Pulse Code
Modulation – PCM)
Pulse code modulation (PCM) is a digital scheme for transmitting analogue data. The signals in PCM are binary; that is, there are only two possible states, represented by logic 1 (high) and logic 0 (low). This is true no matter how complex the analogue waveform happens to be. Using PCM, it is possible to digitize all forms of analogue data, including full-motion video, voices, music, telemetry, and virtual reality (VR).
http://searchnetworking.techtarget.com/definition/pulse-code-modulation-PCM
PCM is essentially the digital form our recordings took, and its fixed resolution placed limits on what we could do in Reaper.
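Since PCM just stores each measured amplitude as a plain binary number, packing and unpacking samples as bytes is a direct illustration (the sample values here are arbitrary):

```python
import struct

# Four 16-bit PCM samples become exactly 8 bytes of little-endian data.
samples = [0, 16384, -16384, 32767]
pcm_bytes = struct.pack("<4h", *samples)
print(len(pcm_bytes))  # 8

# Decoding is just reading the numbers back out; nothing is lost.
decoded = list(struct.unpack("<4h", pcm_bytes))
print(decoded == samples)  # True
```

This byte stream is exactly what sits inside the data chunk of an uncompressed .wav file.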
AUDIO RECORDING
SYSTEMS
Analogue
Analogue recording methods store signals as a continual wave in or on the media. The wave might be stored as a physical texture on a phonograph record, or a fluctuation in the field strength of a magnetic recording. This differs from digital recording, which includes digital audio and digital video among many other possibilities, where signals are represented as discrete numbers.
http://en.wikipedia.org/wiki/Analog_recording
We worked with analogue signals during recording: the real-world sound we captured was analogue before the digital recorder converted it.
Digital Mini Disc A small recordable compact disc
http://dictionary.reference.com/browse/minidisc
We didn’t use this because we
didn’t use a minidisc recorder
Compact Disc (CD) Known by its abbreviation, CD, a compact disc is a polycarbonate with one or
more metal layers capable of storing digital information. The most prevalent
types of compact discs are those used by the music industry to store digital
recordings and CD-ROMs used to store computer data. Both of these types of
compact disc are read-only, which means that once the data has been recorded
onto them, they can only be read, or played.
http://www.webopedia.com/TERM/C/compact_disc.html
We didn’t use CDs because we
didn’t really get any audio from CDs
because we used digital sound
recorders.
Digital Audio Tape (DAT) Digital Audio Tape is a signal recording and playback medium developed by Sony
and introduced in 1987. In appearance it is similar to a Compact Cassette, using
4 mm magnetic tape enclosed in a protective shell, but is roughly half the size at
73 mm Ă— 54 mm Ă— 10.5 mm.
http://en.wikipedia.org/wiki/Digital_Audio_Tape
We didn’t use DAT because they
are kind of out-dated and that
computers don’t use tapes now.
MIDI Short for Musical Instrument Digital Interface, MIDI is a technical standard that
describes a protocol, digital interface and connectors and allows a wide variety
of electronic musical instruments, computers and other related devices to
connect and communicate with one another. A single MIDI link can carry up to
sixteen channels of information, each of which can be routed to a separate
device.
http://en.wikipedia.org/wiki/MIDI
We used this in Reaper when we were editing and creating sounds, in combination with software plug-ins and a MIDI keyboard instrument.
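A MIDI channel message is only a few bytes; this sketch builds a Note On message by hand (the helper function is my own illustration, not part of Reaper or any MIDI library):

```python
def note_on(channel, note, velocity):
    """Build the raw bytes of a MIDI Note On message.

    Status byte = 0x90 | channel (0-15); note and velocity are 7-bit values.
    """
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

msg = note_on(0, 60, 100)  # middle C, moderately loud, on channel 1
print(msg.hex())           # 903c64
```

Messages like this are what the MIDI keyboard sends over the cable; it is the receiving synth or VSTi that turns them into actual sound.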
Software Sequencers In 1975, New England Digital (NED) released ABLE computer (microcomputer) as
a dedicated data processing unit for Dartmouth Digital Synthesizer (1973), and
based on it, later Synclavier series were developed. The Synclavier I, released in
September 1977, was one of the earliest digital music workstation products with
multi-track sequencer.
We used sequencing in Reaper when making our audio tracks so that we could layer several sounds on top of each other.
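The core job of a software sequencer, keeping several tracks of timed events and merging them into one ordered playback stream, can be sketched like this (the track names and events are hypothetical, not Reaper's internal model):

```python
def merge_tracks(*tracks):
    """Merge per-track (start_beat, sound_name) events into one timed list."""
    events = [ev for track in tracks for ev in track]
    return sorted(events)  # sorting by start beat gives playback order

drums = [(0, "kick"), (2, "kick"), (1, "snare")]
bass = [(0, "bass_c"), (2, "bass_g")]
print(merge_tracks(drums, bass))
```

Layering sounds on top of each other, as we did in Reaper, is exactly this: events on different tracks sharing the same start time.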
Software Plug-ins In computing, a plug-in (or plugin, extension, or add-on / addon) is a software
component that adds a specific feature to an existing software application.
When an application supports plug-ins, it enables customization. The common
examples are the plug-ins used in web browsers to add new features such as
search-engines, virus scanners, or the ability to utilize a new file type such as a
new video format. Well-known browser plug-ins include the Adobe Flash Player, the QuickTime Player, and the Java plug-in, which can launch a user-activated Java applet on a web page for execution in a local Java virtual machine.
http://en.wikipedia.org/wiki/Plug-in_(computing)
We have used these in Reaper when making and editing sounds, using VST effects and VSTi instruments.
MIDI Keyboard Instruments A MIDI keyboard is typically a piano-style user interface keyboard device used
for sending MIDI signals or commands over a USB or MIDI cable to other devices
connected and operating on the same MIDI protocol interface. This could also be
a personal computer running software such as a digital audio workstation (DAW)
that listens to and sends MIDI information to other MIDI devices connected by
cable or running internal to the personal computer system. The basic MIDI
keyboard does not produce sound. Instead, MIDI information is sent to an
electronic module capable of reproducing an array of digital sounds or samples
that resemble traditional analogue musical instruments. These samples or
waveforms are also referred to as voices or timbres.
http://en.wikipedia.org/wiki/MIDI_keyboard
This was the main device we used when making our sounds, because it let us play Reaper's virtual instruments.
AUDIO SAMPLING
File Size Constraints - Bit-depth
In digital audio using pulse-code modulation (PCM), bit depth is the number of bits of information in each sample, and it directly corresponds to the resolution of each sample. Examples of bit depth include Compact Disc Digital Audio, which uses 16 bits per sample, and DVD-Audio and Blu-ray Disc, which can support up to 24 bits per sample.
http://en.wikipedia.org/wiki/Audio_bit_depth
We tried to produce the sounds in the highest quality possible, but we were limited by the bit depth used by our recordings and exports.
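Bit depth sets the theoretical dynamic range of a PCM recording, roughly 6.02 dB per bit; a quick sketch of that relationship:

```python
import math

def dynamic_range_db(bits):
    """Approximate dynamic range of linear PCM at a given bit depth (~6.02 dB/bit)."""
    return 20 * math.log10(2 ** bits)

print(round(dynamic_range_db(16), 1))  # CD audio: about 96.3 dB
print(round(dynamic_range_db(24), 1))  # DVD-Audio / Blu-ray: about 144.5 dB
```

This is why 16-bit recordings can sound noisier at low levels than 24-bit ones: there are simply fewer steps available to describe quiet detail.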
File Size Constraints - Sample Rate
A frequency of how many times audio is measured per second, usually measured in kilohertz (kHz); a usual number you might see is 44.1 kHz. This is tied closely to bit depth, the number of bits measured in each sample.
http://superuser.com/questions/388382/what-does-the-sample-rate-and-sample-size-of-audio-means
I tried to export the audio files at a good sample rate.
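The sample rate determines the highest frequency that can be captured, half the sample rate (the Nyquist limit), which is why 44.1 kHz comfortably covers the audible range; a small sketch:

```python
def nyquist(sample_rate_hz):
    """Highest frequency a given sample rate can represent (the Nyquist limit)."""
    return sample_rate_hz / 2

print(nyquist(44100))  # 22050.0 Hz, just above the limit of human hearing
print(nyquist(8000))   # 4000.0 Hz, enough for intelligible speech
```

Together with bit depth and channel count, the sample rate also fixes the file size of uncompressed audio: bytes ≈ rate × bytes-per-sample × channels × seconds.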