Adam Sporka
Charles University
Tuesdays 10.40, Room SW1, Malostranské náměstí
NCGD009 ’19/’20
Adaptive Music and Interactive Audio
Adaptive Music
Techniques, design,
production, implementation
Generative Audio
Aleatoric systems, generative
music, genetic algorithms, AI
Course Overview
Sound Effects
Design, production,
implementation
Audio Implementation
Low-level programming
FMOD, Wwise;
Unity, Unreal integration
Using Audio as an
Interactive Medium
What Is Sound to People?
• Physics
– Sound field
– Signal
– Content of signal
• Music / audio design
– Tone, noise
– Sample, loop
– Subjective property of a device
• "The headphones have a nice sound"
We’re Mostly Visual Creatures
• Audio is taught to carry only complementary information
• Easy to communicate using sound
• Difficult to talk about sound
(Diagram: sight vs. hearing vs. the rest of the senses)
Levels of Description of Sound
Physics ↔ Design
Common Applications of Sound
• Obvious:
– Film, video games
– Acoustic notification of background processes
• Less obvious:
– Subjective reduction of noise level
Sound in User Interfaces
• Interaction modality (input and output)
• Alerts
• Widget sounds
• Presentation of data
• Mood setters
• Sound branding
Sound in Video Games
• Complement of visual information
• Bears other forms of information
• Foreground
– Voices
– Sounds of objects being manipulated
• Background
– Environment
– Mood
– Music
Sound Design Exercises
• Bomb explosion
• Explosion in outer space
• Two swords colliding
• Lightsaber combat
• Blade of sword, illuminated by sunrise
• Whisper in the crowd
• Cave
• Light
• Semantics more important than correct physics?
Sound Effects
Behavior of Audio Sources
• Tuner hit – one-shot; always the same; only heard from up close
• Signal generator – sustained; randomized
• Helicopter – sustained; maneuver-sensitive; Doppler effect likely
• Explosion – one-shot; randomized; may disrupt other sounds
• Running animal – periodic; randomized; depends on the ground surface
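The Doppler effect noted for the helicopter can be sketched as a pitch multiplier. This is a minimal illustration for a stationary listener; engines such as FMOD and Wwise compute this internally, and the function name here is hypothetical.

```python
# Doppler pitch factor for a moving source and a stationary listener.
# Illustrative sketch only; not an engine API.
SPEED_OF_SOUND = 343.0  # m/s in air

def doppler_pitch(source_radial_speed: float) -> float:
    """Pitch multiplier; positive speed = source approaching the listener."""
    return SPEED_OF_SOUND / (SPEED_OF_SOUND - source_radial_speed)

# A helicopter approaching at 40 m/s sounds higher-pitched,
# and lower-pitched while receding.
approaching = doppler_pitch(40.0)   # > 1.0
receding = doppler_pitch(-40.0)     # < 1.0
```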
Audio Localization
ITD: Inter-aural Time Difference
IID: Inter-aural Intensity Difference
HRTF: Head-related Transfer Function
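The ITD and IID cues can be approximated in a few lines. The sketch below uses Woodworth's classic ITD approximation (valid for azimuths up to 90°) and constant-power panning as a crude stand-in for IID; the head radius is an assumed average, and all names are illustrative.

```python
import math

HEAD_RADIUS = 0.0875     # m, average adult head radius (assumption)
SPEED_OF_SOUND = 343.0   # m/s

def itd_seconds(azimuth_deg: float) -> float:
    """Woodworth's approximation of the inter-aural time difference
    for a source at the given azimuth (0 = straight ahead, 90 = fully lateral)."""
    a = math.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (a + math.sin(a))

def iid_gains(azimuth_deg: float):
    """Constant-power stereo gains (left, right) as a crude stand-in for IID.
    Azimuth range: -90 (full left) .. +90 (full right)."""
    a = math.radians(azimuth_deg)
    pan = (a + math.pi / 2) / math.pi   # normalize to 0..1
    return math.cos(pan * math.pi / 2), math.sin(pan * math.pi / 2)
```

A fully lateral source yields an ITD of roughly 0.65 ms, which matches the commonly cited upper bound for human listeners.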
Spatial Audio
Early echoes
Reverb
Spatial complexity
Diegetic Sounds
• Belonging to the world
• Explainable by processes in the world
• Intrinsic acoustic properties of objects
• Collisions of objects
Examples:
• Footsteps of a person we see
• Sword drawn
• Dialog of people
Diegetic Ambiences
• Without any events happening, what would you hear?
Non-diegetic Sounds
• Cannot be explained by the processes in the world
Examples:
• Narrator’s voice-overs
• Abstract sound effects
• Music (if not from a visible source)
User Interface Sounds
• Definitely not part of the world in any way
Examples:
• Buttons pressed
• Choices activated
• Alerts
• Widgets
Sound Effects by Duration
• One-shots
– Short (or finite)
– Distinct instances
– Played once or periodically
– Impacts, footsteps, …
• Sustained sounds
– Loops
– Keep playing until switched off
– Engines, wind
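The two duration categories above can be modeled as two small state machines. This is a conceptual sketch, not any engine's API; class and method names are invented for illustration.

```python
# Minimal model of the two playback behaviors (illustrative names).

class OneShot:
    """Plays once and becomes inactive (impacts, footsteps)."""
    def __init__(self):
        self.playing = False

    def trigger(self):
        self.playing = True

    def on_finished(self):
        # The sample ran to its end; the voice is freed automatically.
        self.playing = False


class Sustained:
    """Loops until explicitly switched off (engines, wind)."""
    def __init__(self):
        self.playing = False
        self.loops_completed = 0

    def start(self):
        self.playing = True

    def on_loop_end(self):
        if self.playing:
            self.loops_completed += 1  # wrap seamlessly to the loop start

    def stop(self):
        self.playing = False
```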
One-shots: “Vertical”
footsteps, shots, one-off activities
Sustains: “Horizontal”
situations, states, stable running machines
Audio Mix
Multiple One-shot Samples
Multiple sounds playing at the same time
People are not good at:
• Attributing each sound to its object
What to do:
• Only the most important visual elements have their one-shot
sounds
• The rest is simulated as a 2-minute loop
• Make sure you have perfect loops
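The advice above, giving one-shot voices only to the most important elements, can be sketched as a priority-capped voice pool with voice stealing. This is an assumed design, not a specific engine's behavior; names are hypothetical.

```python
import heapq

class VoicePool:
    """Keep at most max_voices one-shots playing;
    the lowest-priority voice is stolen first."""
    def __init__(self, max_voices: int):
        self.max_voices = max_voices
        self.voices = []  # min-heap of (priority, name)

    def play(self, name: str, priority: int) -> bool:
        """Returns True if the sound gets a voice."""
        if len(self.voices) < self.max_voices:
            heapq.heappush(self.voices, (priority, name))
            return True
        if self.voices and priority > self.voices[0][0]:
            heapq.heapreplace(self.voices, (priority, name))  # steal weakest voice
            return True
        return False  # not important enough; covered by the background loop
```

Sounds rejected by the pool are exactly those whose detail the background ambience loop is meant to stand in for.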
Multiple One-shot Samples
People are good at:
• Detecting loops, cycles
• Telling that two sounds are identical
What to do:
• For real environment recordings:
Remove easily memorable sounds
• For synthesis:
Randomize the sounds inside the loops
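Because people easily spot identical repeats and cycles, a one-shot is usually drawn from a pool of variations with the previous pick excluded, plus small pitch and gain jitter. A minimal sketch, with the jitter ranges assumed for illustration:

```python
import random

class RandomizedSample:
    """Pick among recorded variations, never repeating the previous one,
    and jitter pitch/gain so repeats are harder to notice."""
    def __init__(self, variations):
        self.variations = list(variations)
        self.last = None

    def next(self, rng=random):
        choices = [v for v in self.variations if v != self.last] or self.variations
        self.last = rng.choice(choices)
        pitch = rng.uniform(0.95, 1.05)  # +/-5% pitch jitter (assumed range)
        gain = rng.uniform(0.9, 1.0)     # slight level variation
        return self.last, pitch, gain
```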
Multiple One-shot Samples
Very important:
• Sounds are often triggered in sync with video frames
• Hundreds of reasons per second to play a sound
⇒ real danger of ending up with a 30/60 Hz hum
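One common remedy is to spread each frame's triggers randomly across the frame interval instead of stacking them all on the frame boundary. A sketch under that assumption, with illustrative names:

```python
import random

FRAME_RATE = 60.0  # assumed video frame rate

def schedule_trigger(frame_index: int, rng=random) -> float:
    """Return a playback time (seconds) for a sound requested on this frame,
    jittered within the frame so triggers don't all align on frame boundaries."""
    frame_start = frame_index / FRAME_RATE
    return frame_start + rng.uniform(0.0, 1.0 / FRAME_RATE)
```

Jittering within one frame keeps perceived timing intact while breaking the periodic energy spikes that produce the hum.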
Unexpected Realism
• No sound in space
• Can’t whisper in a crowd
• Plate armor does not sound hollow
• Camera is not the same thing as the microphone
