Introduction to Audio Content Analysis
module 3.6: learned features
alexander lerch
introduction
overview
corresponding textbook section
section 3.6
lecture content
• introduction to the concept of feature learning
• quick overview of important methods
learning objectives
• describe the advantages and disadvantages of feature learning
• give examples of approaches to feature learning
feature learning vs. traditional features
hand-crafted features:
• arbitrary definitions
• simple to compute
• mostly focus on one technical property
• provide limited information
feature learning:
• automatically learned from a dataset
• meaning not obvious; can combine multiple properties
overview
general properties
principle
1 feed (a lot of) raw data to the input
2 learn a mapping that reduces dimensionality while keeping as much information as possible
advantages
• features might contain more useful information than provided by hand-crafted features
• no expert knowledge required
disadvantages
• usually time consuming
• limited ways of controlling the type of information learned
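The principle above (learn a low-dimensional representation that keeps as much information as possible) can be sketched with PCA, a classic linear instance of it. This is a minimal sketch assuming random toy data in place of real audio frames:

```python
import numpy as np

# Toy "raw data": 200 observations of 64-dimensional frames (e.g., spectra).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 64))

# PCA via SVD: learn a projection that keeps as much variance as possible.
Xc = X - X.mean(axis=0)          # center the data
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

k = 8                            # target feature dimensionality
features = Xc @ Vt[:k].T         # learned 8-dim feature per observation
# features.shape == (200, 8)
```

In practice the learned projection is fit once on a (large) training set and then applied to new inputs; nonlinear methods follow the same recipe with a more expressive mapping.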
approaches
dictionary learning
dictionary learning (sparse coding, non-negative matrix factorization)
X = B · A
X: input signal to be modeled (often spectrogram)
B: dictionary/template matrix (often set of single spectra that comprise the basic building blocks of X)
A: activation matrix indicating the weight and superposition of templates
• derive B, A by minimizing a cost function, e.g., ||X − BA||²
→ templates are trained; activations are used as the feature vector (length: number of templates)
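The factorization X = B · A can be sketched with the standard Lee-Seung multiplicative updates for non-negative matrix factorization, which minimize ||X − BA||² while keeping all entries non-negative. A random non-negative matrix stands in for a real magnitude spectrogram here:

```python
import numpy as np

# Stand-in for a magnitude spectrogram X (frequency bins x time frames).
rng = np.random.default_rng(0)
X = rng.random((128, 100))

r = 5                              # number of templates
B = rng.random((128, r))           # dictionary/template matrix
A = rng.random((r, 100))           # activation matrix

# Multiplicative updates minimizing ||X - BA||^2; non-negativity is preserved
# because every factor in the update is non-negative.
eps = 1e-9
for _ in range(200):
    A *= (B.T @ X) / (B.T @ B @ A + eps)
    B *= (X @ A.T) / (B @ A @ A.T + eps)

err = np.linalg.norm(X - B @ A)
# Each time frame's feature vector is its column of A (length: r templates).
```

With sparse coding the same model is fit under an additional sparsity penalty on A; the feature usage is identical.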
deep learning
clustering
• find clusters in data set (e.g., from magnitude spectra or simple features)
• store median of clusters (compare: template matrix)
→ features:
▶ binary vector (length: number of clusters, zero except for closest cluster)
▶ distance vector (distance to each cluster)
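Both cluster-based feature variants can be sketched with a tiny k-means. Note this sketch stores the cluster mean rather than the median mentioned above, for simplicity; the data is a random stand-in for frame-wise features:

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Tiny k-means; returns cluster centers (the 'template matrix' analogue).
    Uses the mean of each cluster (the median is a common robust alternative)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(X[:, None] - centers[None], axis=2)  # (n, k)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 12))     # e.g., simple frame-wise input features
centers = kmeans(X, k=4)

x = X[0]                                       # one observation to encode
dist = np.linalg.norm(centers - x, axis=1)     # distance vector (length: #clusters)
binary = np.zeros(4)
binary[dist.argmin()] = 1.0                    # one-hot: 1 only at closest cluster
```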
neural networks and deep architectures
• stack multiple layers of simple learning blocks
• each layer uses the output of the previous layer as input
→ feature: output of the highest layer
• task-dependency:
▶ independent: use auto-encoder structure to maximize encoded information
▶ dependent: train for specific task(s) (cf. multi-task learning) and use representation for other tasks (cf. transfer learning)
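The layer-stacking idea above can be sketched with two simple blocks, each an affine transform plus a nonlinearity, where the feature is the output of the highest layer. The weights here are random placeholders; in a real system they would be learned, e.g., as the encoder half of an auto-encoder or by training on a task:

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(x, W, b):
    """One simple learning block: affine transform followed by ReLU."""
    return np.maximum(0.0, x @ W + b)

# Untrained sketch: W1/W2 stand in for learned weights.
W1, b1 = rng.normal(size=(64, 32)), np.zeros(32)
W2, b2 = rng.normal(size=(32, 8)), np.zeros(8)

x = rng.normal(size=(1, 64))     # one input frame (e.g., a spectrum)
h1 = layer(x, W1, b1)            # output of layer 1 feeds layer 2
feature = layer(h1, W2, b2)      # feature: output of the highest layer
# feature.shape == (1, 8)
```

An auto-encoder trains such a stack so that a mirrored decoder can reconstruct the input from the feature, which forces the low-dimensional output to keep as much information as possible.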
summary
lecture content
learned features
• not designed, but learned from data
▶ no hand-designed features ⇒ no expert knowledge needed
▶ require large amounts of data and computational resources
• not easily interpretable
• potentially more powerful, as they can capture all relevant information