Introduction to Audio Content Analysis
module 3.6: learned features
alexander lerch
overview intro approaches summary
introduction
overview
corresponding textbook section
section 3.6
lecture content
• introduction to the concept of feature learning
• quick overview of important methods
learning objectives
• describe the advantages and disadvantages of feature learning
• give examples of approaches to feature learning
module 3.6: learned features 1 / 6
introduction
feature learning vs. traditional features
hand-crafted features:
• arbitrary definitions
• simple to compute
• mostly focus on one technical property
• provide limited information
feature learning:
• learned automatically from a data set
• meaning not obvious; may combine multiple properties
overview
general properties
principle
1 feed (a lot of) raw data into the input
2 learn a mapping that reduces dimensionality while keeping as much information as possible
advantages
• features may contain more useful information than hand-crafted features provide
• no expert knowledge required
disadvantages
• usually time consuming
• limited ways of controlling the type of information learned
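The principle above (keep as much information as possible in fewer dimensions) can be sketched with a PCA-style projection. This is an illustration only, not a method from the slides; the data are synthetic and numpy-only:

```python
import numpy as np

rng = np.random.default_rng(0)

# toy raw input: 500 observations of a 32-dimensional signal,
# with most of the variance concentrated in the first 3 dimensions
X = rng.normal(size=(500, 32))
X[:, :3] *= 10.0

Xc = X - X.mean(axis=0)        # center the data
# the SVD of the centered data gives the directions of maximal variance
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

k = 3
features = Xc @ Vt[:k].T       # project onto the k strongest directions

# fraction of the total variance kept by the 3-dimensional representation
explained = (s[:k] ** 2).sum() / (s ** 2).sum()
```

Here the 32-dimensional input is compressed to a 3-dimensional feature vector that still carries most of the variance, which is exactly the trade-off the principle describes.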
approaches
dictionary learning
dictionary learning (sparse coding, non-negative matrix factorization)
X = B · A
X: input signal to be modeled (often spectrogram)
B: dictionary/template matrix (often a set of single spectra that comprise the basic building blocks of X)
A: activation matrix indicating the weight and superposition of templates
• derive B and A by minimizing a cost function, e.g., ||X − BA||²
→ templates are trained; activations are used as the feature vector (length: number of templates)
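The factorization X = B · A can be sketched with scikit-learn's NMF on a synthetic "spectrogram". Note the convention swap: the slides write templates · activations, while sklearn factorizes X ≈ W · H with observations in rows, so with X as (frames × bins) the returned W plays the role of the activations A and H the templates B. The toy data below are invented for the illustration:

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)

# synthetic magnitude "spectrogram": 100 frames x 64 frequency bins,
# constructed from 3 ground-truth spectral templates
true_templates = rng.random((3, 64))      # rows: single spectra
true_activations = rng.random((100, 3))   # per-frame template weights
X = true_activations @ true_templates

# factorize X ≈ A · B by minimizing the Frobenius reconstruction error
nmf = NMF(n_components=3, init="nndsvda", max_iter=500, random_state=0)
A = nmf.fit_transform(X)   # activation matrix: one 3-d feature per frame
B = nmf.components_        # dictionary/template matrix

err = np.linalg.norm(X - A @ B) / np.linalg.norm(X)
```

Each frame is thus described by a feature vector whose length equals the number of templates, as stated above.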
approaches
deep learning
clustering
• find clusters in data set (e.g., from magnitude spectra or simple features)
• store median of clusters (compare: template matrix)
→ features:
▶ binary vector (length: number of clusters, zero except for closest cluster)
▶ distance vector (distance to each cluster)
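Both cluster-based feature variants can be computed in a few lines. The cluster centers below are made up for the illustration (in practice they would be learned from the training data, compare the template matrix):

```python
import numpy as np

# stand-in for learned cluster centers (3 clusters in 2 dimensions)
centers = np.array([[0.0, 0.0],
                    [1.0, 1.0],
                    [0.0, 2.0]])

x = np.array([0.9, 1.2])   # one observation to encode

# distance vector feature: distance to each cluster center
dist = np.linalg.norm(centers - x, axis=1)

# binary vector feature: zero except at the closest cluster
onehot = (dist == dist.min()).astype(float)
```

Both vectors have length equal to the number of clusters; the binary variant discards all but the nearest-cluster information, while the distance variant keeps a soft encoding.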
neural networks and deep architectures
• stack multiple layers of simple learning blocks
• each layer uses the output of the previous layer as input
→ feature: output of the highest layer
• task-dependency:
▶ independent: use auto-encoder structure to maximize encoded information
▶ dependent: train for specific task(s) (cf. multi-task learning) and use representation for other tasks (cf. transfer learning)
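The task-independent (auto-encoder) case can be sketched with a minimal linear auto-encoder trained by gradient descent: encode, decode, and minimize the reconstruction error so the bottleneck keeps as much information as possible. This is a numpy-only toy on synthetic data, not a production architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# toy "spectrogram frames": 200 observations of 16 spectral bins
X = rng.random((200, 16))
X = X - X.mean(axis=0)          # center the data

n_hidden = 4                    # length of the learned feature vector
W_enc = rng.normal(scale=0.1, size=(16, n_hidden))
W_dec = rng.normal(scale=0.1, size=(n_hidden, 16))

def mse():
    return float(np.mean((X @ W_enc @ W_dec - X) ** 2))

mse_start = mse()
lr = 0.05
for _ in range(500):
    H = X @ W_enc               # encode: the learned features
    E = H @ W_dec - X           # reconstruction error
    # gradient descent on the mean squared reconstruction error
    gW_dec = H.T @ E / len(X)
    gW_enc = X.T @ (E @ W_dec.T) / len(X)
    W_dec -= lr * gW_dec
    W_enc -= lr * gW_enc
mse_end = mse()

features = X @ W_enc            # 4-dimensional feature per frame
```

A deep variant stacks several such encode layers with nonlinearities; the feature is then the output of the highest layer, as the slide states.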
summary
lecture content
learned features
• not designed by hand but learned from data
▶ no hand-designed features ⇒ no expert knowledge needed
▶ requires large amounts of data and computational resources
• not easily interpretable
• potentially more powerful, as they can capture all task-relevant information

More Related Content

Similar to 03-06-ACA-Input-FeatureLearning.pdf

Ccourse 140618093931-phpapp02
Ccourse 140618093931-phpapp02Ccourse 140618093931-phpapp02
Ccourse 140618093931-phpapp02
Getachew Ganfur
 
c++ Unit III - PPT.pptx
c++ Unit III - PPT.pptxc++ Unit III - PPT.pptx
Object Oriented Programming Concepts using Java
Object Oriented Programming Concepts using JavaObject Oriented Programming Concepts using Java
Object Oriented Programming Concepts using Java
Glenn Guden
 
10 - Encapsulation(object oriented programming)- java . ppt
10 - Encapsulation(object oriented programming)- java . ppt10 - Encapsulation(object oriented programming)- java . ppt
10 - Encapsulation(object oriented programming)- java . ppt
VhlRddy
 
4-OOPS.pptx
4-OOPS.pptx4-OOPS.pptx
4-OOPS.pptx
SatyamMishra237306
 
Unit3 part3-packages and interfaces
Unit3 part3-packages and interfacesUnit3 part3-packages and interfaces
Unit3 part3-packages and interfaces
DevaKumari Vijay
 
Inheritance
InheritanceInheritance
Inheritance
abhay singh
 
Industry - Program analysis and verification - Type-preserving Heap Profiler ...
Industry - Program analysis and verification - Type-preserving Heap Profiler ...Industry - Program analysis and verification - Type-preserving Heap Profiler ...
Industry - Program analysis and verification - Type-preserving Heap Profiler ...
ICSM 2011
 
Bjarne essencegn13
Bjarne essencegn13Bjarne essencegn13
Bjarne essencegn13
Hunde Gurmessa
 
C# classes objects
C#  classes objectsC#  classes objects
C# classes objects
Dr.Neeraj Kumar Pandey
 
Deep learning with keras
Deep learning with kerasDeep learning with keras
Deep learning with keras
MOHITKUMAR1379
 
Synapseindia reviews.odp.
Synapseindia reviews.odp.Synapseindia reviews.odp.
Synapseindia reviews.odp.
Tarunsingh198
 
Week_1 Machine Learning introduction.pptx
Week_1 Machine Learning introduction.pptxWeek_1 Machine Learning introduction.pptx
Week_1 Machine Learning introduction.pptx
muhammadsamroz
 
UsingCPP_for_Artist.ppt
UsingCPP_for_Artist.pptUsingCPP_for_Artist.ppt
UsingCPP_for_Artist.ppt
vinu28455
 
Introduction to oop
Introduction to oop Introduction to oop
Introduction to oop
Kumar
 
An Online Spark Pipeline: Semi-Supervised Learning and Automatic Retraining w...
An Online Spark Pipeline: Semi-Supervised Learning and Automatic Retraining w...An Online Spark Pipeline: Semi-Supervised Learning and Automatic Retraining w...
An Online Spark Pipeline: Semi-Supervised Learning and Automatic Retraining w...
Databricks
 
Core Java unit no. 1 object and class ppt
Core Java unit no. 1 object and class pptCore Java unit no. 1 object and class ppt
Core Java unit no. 1 object and class ppt
Mochi263119
 
Core Java unit no. 1 object and class ppt
Core Java unit no. 1 object and class pptCore Java unit no. 1 object and class ppt
Core Java unit no. 1 object and class ppt
Mochi263119
 
C++ppt. Classs and object, class and object
C++ppt. Classs and object, class and objectC++ppt. Classs and object, class and object
C++ppt. Classs and object, class and object
secondakay
 
background.pptx
background.pptxbackground.pptx
background.pptx
KabileshCm
 

Similar to 03-06-ACA-Input-FeatureLearning.pdf (20)

Ccourse 140618093931-phpapp02
Ccourse 140618093931-phpapp02Ccourse 140618093931-phpapp02
Ccourse 140618093931-phpapp02
 
c++ Unit III - PPT.pptx
c++ Unit III - PPT.pptxc++ Unit III - PPT.pptx
c++ Unit III - PPT.pptx
 
Object Oriented Programming Concepts using Java
Object Oriented Programming Concepts using JavaObject Oriented Programming Concepts using Java
Object Oriented Programming Concepts using Java
 
10 - Encapsulation(object oriented programming)- java . ppt
10 - Encapsulation(object oriented programming)- java . ppt10 - Encapsulation(object oriented programming)- java . ppt
10 - Encapsulation(object oriented programming)- java . ppt
 
4-OOPS.pptx
4-OOPS.pptx4-OOPS.pptx
4-OOPS.pptx
 
Unit3 part3-packages and interfaces
Unit3 part3-packages and interfacesUnit3 part3-packages and interfaces
Unit3 part3-packages and interfaces
 
Inheritance
InheritanceInheritance
Inheritance
 
Industry - Program analysis and verification - Type-preserving Heap Profiler ...
Industry - Program analysis and verification - Type-preserving Heap Profiler ...Industry - Program analysis and verification - Type-preserving Heap Profiler ...
Industry - Program analysis and verification - Type-preserving Heap Profiler ...
 
Bjarne essencegn13
Bjarne essencegn13Bjarne essencegn13
Bjarne essencegn13
 
C# classes objects
C#  classes objectsC#  classes objects
C# classes objects
 
Deep learning with keras
Deep learning with kerasDeep learning with keras
Deep learning with keras
 
Synapseindia reviews.odp.
Synapseindia reviews.odp.Synapseindia reviews.odp.
Synapseindia reviews.odp.
 
Week_1 Machine Learning introduction.pptx
Week_1 Machine Learning introduction.pptxWeek_1 Machine Learning introduction.pptx
Week_1 Machine Learning introduction.pptx
 
UsingCPP_for_Artist.ppt
UsingCPP_for_Artist.pptUsingCPP_for_Artist.ppt
UsingCPP_for_Artist.ppt
 
Introduction to oop
Introduction to oop Introduction to oop
Introduction to oop
 
An Online Spark Pipeline: Semi-Supervised Learning and Automatic Retraining w...
An Online Spark Pipeline: Semi-Supervised Learning and Automatic Retraining w...An Online Spark Pipeline: Semi-Supervised Learning and Automatic Retraining w...
An Online Spark Pipeline: Semi-Supervised Learning and Automatic Retraining w...
 
Core Java unit no. 1 object and class ppt
Core Java unit no. 1 object and class pptCore Java unit no. 1 object and class ppt
Core Java unit no. 1 object and class ppt
 
Core Java unit no. 1 object and class ppt
Core Java unit no. 1 object and class pptCore Java unit no. 1 object and class ppt
Core Java unit no. 1 object and class ppt
 
C++ppt. Classs and object, class and object
C++ppt. Classs and object, class and objectC++ppt. Classs and object, class and object
C++ppt. Classs and object, class and object
 
background.pptx
background.pptxbackground.pptx
background.pptx
 

More from AlexanderLerch4

0A-02-ACA-Fundamentals-Convolution.pdf
0A-02-ACA-Fundamentals-Convolution.pdf0A-02-ACA-Fundamentals-Convolution.pdf
0A-02-ACA-Fundamentals-Convolution.pdf
AlexanderLerch4
 
12-02-ACA-Genre.pdf
12-02-ACA-Genre.pdf12-02-ACA-Genre.pdf
12-02-ACA-Genre.pdf
AlexanderLerch4
 
10-02-ACA-Alignment.pdf
10-02-ACA-Alignment.pdf10-02-ACA-Alignment.pdf
10-02-ACA-Alignment.pdf
AlexanderLerch4
 
09-03-ACA-Temporal-Onsets.pdf
09-03-ACA-Temporal-Onsets.pdf09-03-ACA-Temporal-Onsets.pdf
09-03-ACA-Temporal-Onsets.pdf
AlexanderLerch4
 
13-00-ACA-Mood.pdf
13-00-ACA-Mood.pdf13-00-ACA-Mood.pdf
13-00-ACA-Mood.pdf
AlexanderLerch4
 
09-07-ACA-Temporal-Structure.pdf
09-07-ACA-Temporal-Structure.pdf09-07-ACA-Temporal-Structure.pdf
09-07-ACA-Temporal-Structure.pdf
AlexanderLerch4
 
09-01-ACA-Temporal-Intro.pdf
09-01-ACA-Temporal-Intro.pdf09-01-ACA-Temporal-Intro.pdf
09-01-ACA-Temporal-Intro.pdf
AlexanderLerch4
 
07-06-ACA-Tonal-Chord.pdf
07-06-ACA-Tonal-Chord.pdf07-06-ACA-Tonal-Chord.pdf
07-06-ACA-Tonal-Chord.pdf
AlexanderLerch4
 
15-00-ACA-Music-Performance-Analysis.pdf
15-00-ACA-Music-Performance-Analysis.pdf15-00-ACA-Music-Performance-Analysis.pdf
15-00-ACA-Music-Performance-Analysis.pdf
AlexanderLerch4
 
12-01-ACA-Similarity.pdf
12-01-ACA-Similarity.pdf12-01-ACA-Similarity.pdf
12-01-ACA-Similarity.pdf
AlexanderLerch4
 
09-04-ACA-Temporal-BeatHisto.pdf
09-04-ACA-Temporal-BeatHisto.pdf09-04-ACA-Temporal-BeatHisto.pdf
09-04-ACA-Temporal-BeatHisto.pdf
AlexanderLerch4
 
09-05-ACA-Temporal-Tempo.pdf
09-05-ACA-Temporal-Tempo.pdf09-05-ACA-Temporal-Tempo.pdf
09-05-ACA-Temporal-Tempo.pdf
AlexanderLerch4
 
11-00-ACA-Fingerprinting.pdf
11-00-ACA-Fingerprinting.pdf11-00-ACA-Fingerprinting.pdf
11-00-ACA-Fingerprinting.pdf
AlexanderLerch4
 
08-00-ACA-Intensity.pdf
08-00-ACA-Intensity.pdf08-00-ACA-Intensity.pdf
08-00-ACA-Intensity.pdf
AlexanderLerch4
 
03-05-ACA-Input-Features.pdf
03-05-ACA-Input-Features.pdf03-05-ACA-Input-Features.pdf
03-05-ACA-Input-Features.pdf
AlexanderLerch4
 
07-03-03-ACA-Tonal-Monof0.pdf
07-03-03-ACA-Tonal-Monof0.pdf07-03-03-ACA-Tonal-Monof0.pdf
07-03-03-ACA-Tonal-Monof0.pdf
AlexanderLerch4
 
05-00-ACA-Data-Intro.pdf
05-00-ACA-Data-Intro.pdf05-00-ACA-Data-Intro.pdf
05-00-ACA-Data-Intro.pdf
AlexanderLerch4
 
07-03-05-ACA-Tonal-F0Eval.pdf
07-03-05-ACA-Tonal-F0Eval.pdf07-03-05-ACA-Tonal-F0Eval.pdf
07-03-05-ACA-Tonal-F0Eval.pdf
AlexanderLerch4
 
03-03-01-ACA-Input-TF-Fourier.pdf
03-03-01-ACA-Input-TF-Fourier.pdf03-03-01-ACA-Input-TF-Fourier.pdf
03-03-01-ACA-Input-TF-Fourier.pdf
AlexanderLerch4
 
07-03-04-ACA-Tonal-Polyf0.pdf
07-03-04-ACA-Tonal-Polyf0.pdf07-03-04-ACA-Tonal-Polyf0.pdf
07-03-04-ACA-Tonal-Polyf0.pdf
AlexanderLerch4
 

More from AlexanderLerch4 (20)

0A-02-ACA-Fundamentals-Convolution.pdf
0A-02-ACA-Fundamentals-Convolution.pdf0A-02-ACA-Fundamentals-Convolution.pdf
0A-02-ACA-Fundamentals-Convolution.pdf
 
12-02-ACA-Genre.pdf
12-02-ACA-Genre.pdf12-02-ACA-Genre.pdf
12-02-ACA-Genre.pdf
 
10-02-ACA-Alignment.pdf
10-02-ACA-Alignment.pdf10-02-ACA-Alignment.pdf
10-02-ACA-Alignment.pdf
 
09-03-ACA-Temporal-Onsets.pdf
09-03-ACA-Temporal-Onsets.pdf09-03-ACA-Temporal-Onsets.pdf
09-03-ACA-Temporal-Onsets.pdf
 
13-00-ACA-Mood.pdf
13-00-ACA-Mood.pdf13-00-ACA-Mood.pdf
13-00-ACA-Mood.pdf
 
09-07-ACA-Temporal-Structure.pdf
09-07-ACA-Temporal-Structure.pdf09-07-ACA-Temporal-Structure.pdf
09-07-ACA-Temporal-Structure.pdf
 
09-01-ACA-Temporal-Intro.pdf
09-01-ACA-Temporal-Intro.pdf09-01-ACA-Temporal-Intro.pdf
09-01-ACA-Temporal-Intro.pdf
 
07-06-ACA-Tonal-Chord.pdf
07-06-ACA-Tonal-Chord.pdf07-06-ACA-Tonal-Chord.pdf
07-06-ACA-Tonal-Chord.pdf
 
15-00-ACA-Music-Performance-Analysis.pdf
15-00-ACA-Music-Performance-Analysis.pdf15-00-ACA-Music-Performance-Analysis.pdf
15-00-ACA-Music-Performance-Analysis.pdf
 
12-01-ACA-Similarity.pdf
12-01-ACA-Similarity.pdf12-01-ACA-Similarity.pdf
12-01-ACA-Similarity.pdf
 
09-04-ACA-Temporal-BeatHisto.pdf
09-04-ACA-Temporal-BeatHisto.pdf09-04-ACA-Temporal-BeatHisto.pdf
09-04-ACA-Temporal-BeatHisto.pdf
 
09-05-ACA-Temporal-Tempo.pdf
09-05-ACA-Temporal-Tempo.pdf09-05-ACA-Temporal-Tempo.pdf
09-05-ACA-Temporal-Tempo.pdf
 
11-00-ACA-Fingerprinting.pdf
11-00-ACA-Fingerprinting.pdf11-00-ACA-Fingerprinting.pdf
11-00-ACA-Fingerprinting.pdf
 
08-00-ACA-Intensity.pdf
08-00-ACA-Intensity.pdf08-00-ACA-Intensity.pdf
08-00-ACA-Intensity.pdf
 
03-05-ACA-Input-Features.pdf
03-05-ACA-Input-Features.pdf03-05-ACA-Input-Features.pdf
03-05-ACA-Input-Features.pdf
 
07-03-03-ACA-Tonal-Monof0.pdf
07-03-03-ACA-Tonal-Monof0.pdf07-03-03-ACA-Tonal-Monof0.pdf
07-03-03-ACA-Tonal-Monof0.pdf
 
05-00-ACA-Data-Intro.pdf
05-00-ACA-Data-Intro.pdf05-00-ACA-Data-Intro.pdf
05-00-ACA-Data-Intro.pdf
 
07-03-05-ACA-Tonal-F0Eval.pdf
07-03-05-ACA-Tonal-F0Eval.pdf07-03-05-ACA-Tonal-F0Eval.pdf
07-03-05-ACA-Tonal-F0Eval.pdf
 
03-03-01-ACA-Input-TF-Fourier.pdf
03-03-01-ACA-Input-TF-Fourier.pdf03-03-01-ACA-Input-TF-Fourier.pdf
03-03-01-ACA-Input-TF-Fourier.pdf
 
07-03-04-ACA-Tonal-Polyf0.pdf
07-03-04-ACA-Tonal-Polyf0.pdf07-03-04-ACA-Tonal-Polyf0.pdf
07-03-04-ACA-Tonal-Polyf0.pdf
 

Recently uploaded

20240607 QFM018 Elixir Reading List May 2024
20240607 QFM018 Elixir Reading List May 202420240607 QFM018 Elixir Reading List May 2024
20240607 QFM018 Elixir Reading List May 2024
Matthew Sinclair
 
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor...
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor...Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor...
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor...
SOFTTECHHUB
 
“I’m still / I’m still / Chaining from the Block”
“I’m still / I’m still / Chaining from the Block”“I’m still / I’m still / Chaining from the Block”
“I’m still / I’m still / Chaining from the Block”
Claudio Di Ciccio
 
UiPath Test Automation using UiPath Test Suite series, part 5
UiPath Test Automation using UiPath Test Suite series, part 5UiPath Test Automation using UiPath Test Suite series, part 5
UiPath Test Automation using UiPath Test Suite series, part 5
DianaGray10
 
RESUME BUILDER APPLICATION Project for students
RESUME BUILDER APPLICATION Project for studentsRESUME BUILDER APPLICATION Project for students
RESUME BUILDER APPLICATION Project for students
KAMESHS29
 
Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0!
Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0!Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0!
Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0!
SOFTTECHHUB
 
20240605 QFM017 Machine Intelligence Reading List May 2024
20240605 QFM017 Machine Intelligence Reading List May 202420240605 QFM017 Machine Intelligence Reading List May 2024
20240605 QFM017 Machine Intelligence Reading List May 2024
Matthew Sinclair
 
Unlock the Future of Search with MongoDB Atlas_ Vector Search Unleashed.pdf
Unlock the Future of Search with MongoDB Atlas_ Vector Search Unleashed.pdfUnlock the Future of Search with MongoDB Atlas_ Vector Search Unleashed.pdf
Unlock the Future of Search with MongoDB Atlas_ Vector Search Unleashed.pdf
Malak Abu Hammad
 
GenAI Pilot Implementation in the organizations
GenAI Pilot Implementation in the organizationsGenAI Pilot Implementation in the organizations
GenAI Pilot Implementation in the organizations
kumardaparthi1024
 
GraphSummit Singapore | The Art of the Possible with Graph - Q2 2024
GraphSummit Singapore | The Art of the  Possible with Graph - Q2 2024GraphSummit Singapore | The Art of the  Possible with Graph - Q2 2024
GraphSummit Singapore | The Art of the Possible with Graph - Q2 2024
Neo4j
 
Programming Foundation Models with DSPy - Meetup Slides
Programming Foundation Models with DSPy - Meetup SlidesProgramming Foundation Models with DSPy - Meetup Slides
Programming Foundation Models with DSPy - Meetup Slides
Zilliz
 
Essentials of Automations: The Art of Triggers and Actions in FME
Essentials of Automations: The Art of Triggers and Actions in FMEEssentials of Automations: The Art of Triggers and Actions in FME
Essentials of Automations: The Art of Triggers and Actions in FME
Safe Software
 
Video Streaming: Then, Now, and in the Future
Video Streaming: Then, Now, and in the FutureVideo Streaming: Then, Now, and in the Future
Video Streaming: Then, Now, and in the Future
Alpen-Adria-Universität
 
Pushing the limits of ePRTC: 100ns holdover for 100 days
Pushing the limits of ePRTC: 100ns holdover for 100 daysPushing the limits of ePRTC: 100ns holdover for 100 days
Pushing the limits of ePRTC: 100ns holdover for 100 days
Adtran
 
GraphSummit Singapore | Graphing Success: Revolutionising Organisational Stru...
GraphSummit Singapore | Graphing Success: Revolutionising Organisational Stru...GraphSummit Singapore | Graphing Success: Revolutionising Organisational Stru...
GraphSummit Singapore | Graphing Success: Revolutionising Organisational Stru...
Neo4j
 
Microsoft - Power Platform_G.Aspiotis.pdf
Microsoft - Power Platform_G.Aspiotis.pdfMicrosoft - Power Platform_G.Aspiotis.pdf
Microsoft - Power Platform_G.Aspiotis.pdf
Uni Systems S.M.S.A.
 
Introduction to CHERI technology - Cybersecurity
Introduction to CHERI technology - CybersecurityIntroduction to CHERI technology - Cybersecurity
Introduction to CHERI technology - Cybersecurity
mikeeftimakis1
 
AI 101: An Introduction to the Basics and Impact of Artificial Intelligence
AI 101: An Introduction to the Basics and Impact of Artificial IntelligenceAI 101: An Introduction to the Basics and Impact of Artificial Intelligence
AI 101: An Introduction to the Basics and Impact of Artificial Intelligence
IndexBug
 
Observability Concepts EVERY Developer Should Know -- DeveloperWeek Europe.pdf
Observability Concepts EVERY Developer Should Know -- DeveloperWeek Europe.pdfObservability Concepts EVERY Developer Should Know -- DeveloperWeek Europe.pdf
Observability Concepts EVERY Developer Should Know -- DeveloperWeek Europe.pdf
Paige Cruz
 
Removing Uninteresting Bytes in Software Fuzzing
Removing Uninteresting Bytes in Software FuzzingRemoving Uninteresting Bytes in Software Fuzzing
Removing Uninteresting Bytes in Software Fuzzing
Aftab Hussain
 

Recently uploaded (20)

20240607 QFM018 Elixir Reading List May 2024
20240607 QFM018 Elixir Reading List May 202420240607 QFM018 Elixir Reading List May 2024
20240607 QFM018 Elixir Reading List May 2024
 
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor...
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor...Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor...
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor...
 
“I’m still / I’m still / Chaining from the Block”
“I’m still / I’m still / Chaining from the Block”“I’m still / I’m still / Chaining from the Block”
“I’m still / I’m still / Chaining from the Block”
 
UiPath Test Automation using UiPath Test Suite series, part 5
UiPath Test Automation using UiPath Test Suite series, part 5UiPath Test Automation using UiPath Test Suite series, part 5
UiPath Test Automation using UiPath Test Suite series, part 5
 
RESUME BUILDER APPLICATION Project for students
RESUME BUILDER APPLICATION Project for studentsRESUME BUILDER APPLICATION Project for students
RESUME BUILDER APPLICATION Project for students
 
Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0!
Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0!Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0!
Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0!
 
20240605 QFM017 Machine Intelligence Reading List May 2024
20240605 QFM017 Machine Intelligence Reading List May 202420240605 QFM017 Machine Intelligence Reading List May 2024
20240605 QFM017 Machine Intelligence Reading List May 2024
 
Unlock the Future of Search with MongoDB Atlas_ Vector Search Unleashed.pdf
Unlock the Future of Search with MongoDB Atlas_ Vector Search Unleashed.pdfUnlock the Future of Search with MongoDB Atlas_ Vector Search Unleashed.pdf
Unlock the Future of Search with MongoDB Atlas_ Vector Search Unleashed.pdf
 
GenAI Pilot Implementation in the organizations
GenAI Pilot Implementation in the organizationsGenAI Pilot Implementation in the organizations
GenAI Pilot Implementation in the organizations
 
GraphSummit Singapore | The Art of the Possible with Graph - Q2 2024
GraphSummit Singapore | The Art of the  Possible with Graph - Q2 2024GraphSummit Singapore | The Art of the  Possible with Graph - Q2 2024
GraphSummit Singapore | The Art of the Possible with Graph - Q2 2024
 
Programming Foundation Models with DSPy - Meetup Slides
Programming Foundation Models with DSPy - Meetup SlidesProgramming Foundation Models with DSPy - Meetup Slides
Programming Foundation Models with DSPy - Meetup Slides
 
Essentials of Automations: The Art of Triggers and Actions in FME
Essentials of Automations: The Art of Triggers and Actions in FMEEssentials of Automations: The Art of Triggers and Actions in FME
Essentials of Automations: The Art of Triggers and Actions in FME
 
Video Streaming: Then, Now, and in the Future
Video Streaming: Then, Now, and in the FutureVideo Streaming: Then, Now, and in the Future
Video Streaming: Then, Now, and in the Future
 
Pushing the limits of ePRTC: 100ns holdover for 100 days
Pushing the limits of ePRTC: 100ns holdover for 100 daysPushing the limits of ePRTC: 100ns holdover for 100 days
Pushing the limits of ePRTC: 100ns holdover for 100 days
 
GraphSummit Singapore | Graphing Success: Revolutionising Organisational Stru...
GraphSummit Singapore | Graphing Success: Revolutionising Organisational Stru...GraphSummit Singapore | Graphing Success: Revolutionising Organisational Stru...
GraphSummit Singapore | Graphing Success: Revolutionising Organisational Stru...
 
Microsoft - Power Platform_G.Aspiotis.pdf
Microsoft - Power Platform_G.Aspiotis.pdfMicrosoft - Power Platform_G.Aspiotis.pdf
Microsoft - Power Platform_G.Aspiotis.pdf
 
Introduction to CHERI technology - Cybersecurity
Introduction to CHERI technology - CybersecurityIntroduction to CHERI technology - Cybersecurity
Introduction to CHERI technology - Cybersecurity
 
AI 101: An Introduction to the Basics and Impact of Artificial Intelligence
AI 101: An Introduction to the Basics and Impact of Artificial IntelligenceAI 101: An Introduction to the Basics and Impact of Artificial Intelligence
AI 101: An Introduction to the Basics and Impact of Artificial Intelligence
 
Observability Concepts EVERY Developer Should Know -- DeveloperWeek Europe.pdf
Observability Concepts EVERY Developer Should Know -- DeveloperWeek Europe.pdfObservability Concepts EVERY Developer Should Know -- DeveloperWeek Europe.pdf
Observability Concepts EVERY Developer Should Know -- DeveloperWeek Europe.pdf
 
Removing Uninteresting Bytes in Software Fuzzing
Removing Uninteresting Bytes in Software FuzzingRemoving Uninteresting Bytes in Software Fuzzing
Removing Uninteresting Bytes in Software Fuzzing
 

03-06-ACA-Input-FeatureLearning.pdf

  • 1. Introduction to Audio Content Analysis module 3.6: learned features alexander lerch
  • 2. overview intro approaches summary introduction overview corresponding textbook section section 3.6 lecture content • introduction to the concept of feature learning • quick overview of important methods learning objectives • describe the advantages and disadvantages of feature learning • give examples of approaches to feature learning module 3.6: learned features 1 / 6
  • 3. overview intro approaches summary introduction overview corresponding textbook section section 3.6 lecture content • introduction to the concept of feature learning • quick overview of important methods learning objectives • describe the advantages and disadvantages of feature learning • give examples of approaches to feature learning module 3.6: learned features 1 / 6
  • 4. overview intro approaches summary introduction feature learning vs. traditional features hand-crafted features: • arbitrary definitions • simple to compute • mostly focus on one technical property • provide limited information feature learning: • automatically learn features from data-set • meaning not obvious, can combine multiple properties module 3.6: learned features 2 / 6
  • 5. overview intro approaches summary introduction feature learning vs. traditional features hand-crafted features: • arbitrary definitions • simple to compute • mostly focus on one technical property • provide limited information feature learning: • automatically learn features from data-set • meaning not obvious, can combine multiple properties module 3.6: learned features 2 / 6
  • 6. overview intro approaches summary overview general properties principle 1 put (a lot of) raw data at input 2 learn a way of reducing dimensionality while keeping as much information as possible advantages • features might contain more useful information than provided by hand-crafted features • no expert knowledge required disadvantages • usually time consuming • limited ways of controlling the type of information learned module 3.6: learned features 3 / 6
  • 7. overview intro approaches summary overview general properties principle 1 put (a lot of) raw data at input 2 learn a way of reducing dimensionality while keeping as much information as possible advantages • features might contain more useful information than provided by hand-crafted features • no expert knowledge required disadvantages • usually time consuming • limited ways of controlling the type of information learned module 3.6: learned features 3 / 6
  • 8. overview intro approaches summary overview general properties principle 1 put (a lot of) raw data at input 2 learn a way of reducing dimensionality while keeping as much information as possible advantages • features might contain more useful information than provided by hand-crafted features • no expert knowledge required disadvantages • usually time consuming • limited ways of controlling the type of information learned module 3.6: learned features 3 / 6
  • 9. overview intro approaches summary approaches dictionary learning dictionary learning (sparse coding, non-negative matrix factorization) X = B · A X: input signal to be modeled (often a spectrogram) B: dictionary/template matrix (often a set of single spectra that comprise the basic building blocks of X) A: activation matrix indicating the weight and superposition of templates • derive B, A by minimizing a cost function, e.g., ‖X − BA‖² → templates are trained, activations are used as feature vector (length: number of templates) module 3.6: learned features 4 / 6
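The dictionary-learning idea above can be sketched in a few lines of NumPy. This is a toy illustration, not the lecture's implementation: it factorizes a synthetic non-negative "spectrogram" X into templates B and activations A with the standard multiplicative updates for the Frobenius cost ‖X − BA‖²; the rank and data sizes are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "spectrogram": 64 frequency bins x 40 frames, built from 3 templates
# so that an exact rank-3 factorization exists.
true_B = rng.random((64, 3))
true_A = rng.random((3, 40))
X = true_B @ true_A

# Random non-negative initialization of dictionary B and activations A.
rank = 3
B = rng.random((64, rank)) + 1e-3
A = rng.random((rank, 40)) + 1e-3

# Multiplicative updates minimizing ||X - B A||^2 (Lee & Seung style NMF);
# the small epsilon guards against division by zero.
for _ in range(200):
    A *= (B.T @ X) / (B.T @ B @ A + 1e-12)
    B *= (X @ A.T) / (B @ A @ A.T + 1e-12)

# After training, each column of A is the learned feature vector for one
# frame (length: number of templates), as described on the slide.
rel_err = np.linalg.norm(X - B @ A) / np.linalg.norm(X)
```

In a real system, B would be learned once on training data and then kept fixed, so that new input frames are encoded purely through their activations.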
  • 11. overview intro approaches summary approaches deep learning clustering • find clusters in data set (e.g., from magnitude spectra or simple features) • store median of clusters (compare: template matrix) → features: ▶ binary vector (length: number of clusters, zero except for closest cluster) ▶ distance vector (distance to each cluster) neural networks and deep architectures • stack multiple layers of simple learning blocks • each layer uses the output of the previous layer as input → feature: output of the highest layer • task-dependency: ▶ independent: use auto-encoder structure to maximize encoded information ▶ dependent: train for specific task(s) (cf. multi-task learning) and use representation for other tasks (cf. transfer learning) module 3.6: learned features 5 / 6
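The clustering-based encoding above can be sketched as follows. This is a minimal illustration on synthetic data, not the lecture's code: cluster representatives are found with a simple k-medians loop (the slide stores cluster medians), and each frame is then encoded both as a binary one-hot vector (zero except for the closest cluster) and as a distance vector (distance to each cluster). The data dimensions and k are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: 600 four-dimensional "frame features" around 3 cluster centers.
centers = np.array([[0, 0, 0, 0], [5, 5, 0, 0], [0, 0, 5, 5]], float)
data = np.vstack([c + 0.3 * rng.standard_normal((200, 4)) for c in centers])

# Minimal k-medians: assign points to nearest representative, then update
# each representative to the median of its cluster.
k = 3
codebook = data[rng.choice(len(data), k, replace=False)]
for _ in range(20):
    d = np.linalg.norm(data[:, None, :] - codebook[None, :, :], axis=2)
    labels = d.argmin(axis=1)
    codebook = np.array([
        np.median(data[labels == j], axis=0) if np.any(labels == j)
        else codebook[j]                       # keep old center if cluster empties
        for j in range(k)
    ])

def encode(frame):
    """Return the two encodings from the slide: one-hot and distance vector."""
    dist = np.linalg.norm(codebook - frame, axis=1)
    onehot = np.zeros(k)
    onehot[dist.argmin()] = 1.0
    return onehot, dist

onehot, dist = encode(np.array([5.0, 5.0, 0.0, 0.0]))
```

The distance-vector encoding is a soft variant of the one-hot encoding: it keeps information about how close a frame is to every cluster rather than only naming the winner.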
  • 16. overview intro approaches summary summary lecture content learned features • not designed but learned from data ▶ no hand-designed features ⇒ no expert knowledge needed ▶ requires large amounts of data and computational resources • not easily interpretable • potentially more powerful, as they can capture all relevant information module 3.6: learned features 6 / 6