Deep generative neural networks for novelty generation:
a foundational framework, metrics and experiments

Mehdi Cherti
LAL/CNRS, Université Paris Saclay

Supervised by:
- Balázs Kégl (LAL/CNRS, Université Paris Saclay)
- Akın Kazakçı (Mines ParisTech)

Quest for artificial intelligence
• Prediction

Quest for artificial intelligence
• Prediction
• Novelty generation

How to study novelty generation?

Studying novelty generation:
• Design theory
• Computational creativity
• Machine learning

Design theory
• Early work (Simon, 1969, 1973): design as 'problem solving', i.e. moving from an initial state to a desired state
• C-K theory (Hatchuel et al., 2003): design as the joint expansion of knowledge and concepts
• Various formalisms of knowledge: set theory (Hatchuel et al., 2007), graphs (Kazakci et al., 2010), matroids (Le Masson et al., 2017)

/ 56
• Through C-K, it acknowledges that
knowledge is central
• But lacks computer-based experimental
tools
7
Design theory
Computational creativity
• Generation as optimization with evolutionary algorithms

Computational creativity
• Enables experimentation, but the end goal is the object itself rather than studying the generative process
• Fitness function barrier
• No representation learning
• Generation and evaluation are disconnected

Machine learning proposes powerful generative models,

but these powerful models are used to regenerate objects that we can easily relate to known objects…

• Although trained to generate what we know, some models can generate unrecognizable objects
• However, these models and samples are considered spurious (Bengio et al., 2013) or a failure (Salimans et al., 2016)

Instead of ignoring or eliminating novelty, we should study it.

• Goal of the thesis: study the generative potential of deep generative networks (DGNs) for novelty generation
• Research questions:
  • What novelty can be generated by a DGN?
  • How to evaluate the generative potential of a DGN?
  • What are the general characteristics of DGNs that can generate novelty?
• Method: we use computer-based simulations with deep generative models because
  • they offer a rich and powerful set of existing techniques
  • they can learn representations of objects
  • their generative potential has not been studied systematically

Outline
1. Introduction
2. The impact of representations on novelty generation
3. Results
  3.a. Studying the generative potential of a deep net
  3.b. Evaluating the generative potential of deep nets
  3.c. Characteristics of models that can generate novelty
4. Conclusion and perspectives

2. The impact of representations on novelty generation

2. The impact of representations on novelty generation
• In the design literature, it has been acknowledged that objects can be represented in multiple ways (Reich, 1995)
• What effect do representations have on novelty generation?

2. The impact of representations on novelty generation
• Suppose we have a dataset of 16 letters

2. The impact of representations on novelty generation
• Suppose we represent images in pixel space
• We generate pixels uniformly at random
• Everything is new, but there is no structure

2. The impact of representations on novelty generation
• Suppose we re-represent each letter using strokes

2. The impact of representations on novelty generation
• Pixel space vs. stroke space
• Representations change what you can generate

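To make the contrast concrete, here is a toy sketch (not from the slides): it draws one "new" image by sampling pixels uniformly, and one by sampling a handful of stroke parameters and rendering them. The 25×25 canvas and the stroke parametrization (start point, angle, length) are assumptions made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
SIZE = 25  # assumed canvas size, for illustration only

# Pixel-space generation: every pixel sampled independently and uniformly.
# Every sample is "new", but none of them has any structure.
pixel_sample = rng.uniform(0.0, 1.0, size=(SIZE, SIZE))

# Stroke-space generation: sample a few stroke parameters and render them.
# The samples are still new, but they inherit the structure of the representation.
def render_strokes(n_strokes=3, size=SIZE):
    canvas = np.zeros((size, size))
    for _ in range(n_strokes):
        x, y = rng.integers(0, size, 2)          # start point
        angle = rng.uniform(0.0, 2.0 * np.pi)    # direction
        length = rng.integers(5, 15)             # stroke length in pixels
        for t in range(length):
            i = int(np.clip(y + t * np.sin(angle), 0, size - 1))
            j = int(np.clip(x + t * np.cos(angle), 0, size - 1))
            canvas[i, j] = 1.0
    return canvas

stroke_sample = render_strokes()
```
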
2. The impact of representations on novelty generation
• How do we choose a "useful" representation for novelty generation?
• Machine learning, and deep generative models in particular, provides ways to learn representations from data
• Q: Can we use those learned representations to generate novelty, even if these models are not designed to do so?

2. The impact of representations on novelty generation
Summary:
• Noise vs. novelty
• Likelihood
• Compression of representations

Research questions:
• What novelty can be generated by deep generative nets (DGNs)?
• How to evaluate the generative potential of a DGN?
• What are the general characteristics of DGNs that can generate novelty?

3. Experiments

3.a. Studying the generative potential of a deep net
• We observed that some models could generate novelty although they were not designed to do so
• Thus, deep generative models have an unused generative potential
• Can we demonstrate this more systematically?

3.a. Studying the generative potential of a deep net
• Train data → (learn) → generative model → (generate) → ?? (Kazakci, Cherti, Kégl, 2016)

3.a. Studying the generative potential of a deep net
• We use a convolutional sparse auto-encoder as the model
• The training objective is to minimize the reconstruction error, with a sparsity constraint on the representation

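A minimal PyTorch sketch of this kind of model; the layer sizes, the 28×28 grayscale input, the L1 penalty used to express sparsity and the `sparsity_weight` value are illustrative assumptions, not the exact architecture or training setup of the thesis.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseConvAutoencoder(nn.Module):
    """Convolutional auto-encoder trained on reconstruction error,
    with an L1 penalty on the code to encourage sparsity."""
    def __init__(self, code_channels=64):
        super().__init__()
        self.enc = nn.Sequential(
            nn.Conv2d(1, 32, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(32, code_channels, 5, stride=2, padding=2), nn.ReLU(),
        )
        self.dec = nn.Sequential(
            nn.ConvTranspose2d(code_channels, 32, 5, stride=2,
                               padding=2, output_padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 1, 5, stride=2,
                               padding=2, output_padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        code = self.enc(x)
        return self.dec(code), code

def loss_fn(x, x_rec, code, sparsity_weight=1e-3):
    # reconstruction error + sparsity penalty on the hidden representation
    return F.mse_loss(x_rec, x) + sparsity_weight * code.abs().mean()
```
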
3.a. Studying the generative potential of a deep net
• Deep auto-encoder with a bottleneck (figure from Hinton & Salakhutdinov, 2006): input (dim 625) → encode → bottleneck → decode → reconstruction

3.a. Studying the generative potential of a deep net
• We use an iterative method to generate new images:
  • start with a random image x
  • force the network to construct (i.e. interpret) it: x ← f(x) = dec(enc(x))
  • repeat until convergence

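A sketch of this iterative procedure, reusing the auto-encoder sketch above; the iteration budget and the convergence tolerance are assumptions.

```python
import torch

@torch.no_grad()
def generate(model, size=28, n_iter=100, tol=1e-4):
    """Iterate x <- dec(enc(x)) from a random image until it stops changing."""
    x = torch.rand(1, 1, size, size)     # random starting image
    for _ in range(n_iter):
        x_next, _ = model(x)             # f(x) = dec(enc(x))
        if (x_next - x).abs().max() < tol:
            break                        # (approximate) fixed point reached
        x = x_next
    return x
```
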
3.a. Studying the generative potential of a deep net
• Generated samples (Kazakçı, Cherti, Kégl, 2016)

3.a. Studying the generative potential of a deep net
Our interpretation of the results:
• Known: training digits
• Representable: "combinations of strokes"

3.a. Studying the generative potential of a deep net
• Known: training digits
• Representable: all digits that the model can generate
• Valuable: all recognizable digits

3.a. Studying the generative potential of a deep net
• Known: training digits
• Representable: "combinations of strokes"
• Valuable: human selection

3.b. Evaluating the generative potential of deep nets
• We have one example of a deep generative model that can indeed generate novelty
• Can we go further by automatically finding models that can generate novelty?

3.b. Evaluating the generative potential of deep nets
• We designed a new setup and a set of metrics to find models that are capable of generating novelty

3.b. Evaluating the generative potential of deep nets
• Idea: simulate the unknown by
  • training on known classes
  • testing on classes known to the experimenter but unknown to the model
• Proposed setup: train on digits and test on letters, where letters are used as a proxy for evaluating the capacity of models to generate novelty

3.b. Evaluating the generative potential of deep nets
• Learn a generative model, generate samples, and ask: how many of those are letters?

3.b. Evaluating the generative potential of deep nets
• To count letters, we learn a discriminator with 36 classes = 10 digits + 26 letters

3.b. Evaluating the generative potential of deep nets
• We then use the discriminator to score the models: predict a class for each generated image and count the number of letters

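A sketch of this scoring step, assuming the discriminator outputs 36 logits per image with the digit classes at indices 0-9 and the letter classes at indices 10-35 (an assumed class ordering).

```python
import torch

@torch.no_grad()
def letter_count(discriminator, images, n_digit_classes=10):
    """Count how many generated images the 36-class discriminator labels as letters."""
    logits = discriminator(images)        # shape (N, 36)
    predicted = logits.argmax(dim=1)      # hard class decision per image
    return int((predicted >= n_digit_classes).sum())
```
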
3.b. Evaluating the generative potential of deep nets
• The "number of letters" score is a proxy for finding models that generate images that are:
  • non-trivial
  • not recognizable as digits

3.b. Evaluating the generative potential of deep nets
• We run a large-scale experiment where we train ~1000 models (auto-encoders, GANs) by varying their hyperparameters
• From each model, we generate 1000 images, then we evaluate the model using our proposed metrics
• Question we tried to answer: can we find models that can generate novelty?

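The evaluation loop might look roughly like the sketch below; `sample_hyperparameters`, `train_model`, `digits_train` and `discriminator` are hypothetical placeholders for the actual experimental pipeline, and `generate` and `letter_count` are the sketches above.

```python
import torch

results = []
for run in range(1000):
    params = sample_hyperparameters()              # hypothetical: depth, code size, sparsity, AE vs GAN, ...
    model = train_model(params, digits_train)      # hypothetical: trained on digit classes only
    samples = torch.cat([generate(model) for _ in range(1000)])  # 1000 generated images
    score = letter_count(discriminator, samples)   # "number of letters" metric
    results.append((params, score))

# Rank models by how many of their samples the discriminator reads as letters.
results.sort(key=lambda r: r[1], reverse=True)
```
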
3.b. Evaluating the generative potential of deep nets
• Selecting models by letter count leads to models that can generate novelty
• Selecting models by digit count leads to models that memorize the training classes

3.b. Evaluating the generative potential of deep nets
• Pangrams

3.b. Evaluating the generative potential of deep nets
• Known: training digits
• Representable: "combinations of strokes"
• Valuable: letters

3.b. Evaluating the generative potential of deep nets
• We have shown that we can automatically find models that can generate novelty, as well as other models that cannot

3.b. Evaluating the generative potential of deep nets
• Can we characterize the difference between models that can generate novelty and models that cannot?
• We study a particular model architecture through a series of experiments

3.c. Characteristics of models that can generate novelty
• We study the effect of different ways of restricting the capacity of the representation on the same architecture
• We find that restricting the capacity of the representation hurts the ability to generate novelty: more capacity, more novelty

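A sketch of the kind of capacity sweep this suggests, holding the architecture fixed and varying only the bottleneck width; `train`, `digits_train`, `discriminator` and the tested code sizes are hypothetical placeholders, and `SparseConvAutoencoder`, `generate` and `letter_count` come from the sketches above.

```python
# Hypothetical sweep: restrict representation capacity by shrinking the code,
# keep everything else fixed, and watch how the novelty proxy responds.
import torch

for code_channels in (4, 8, 16, 32, 64, 128):
    model = SparseConvAutoencoder(code_channels=code_channels)
    train(model, digits_train)             # hypothetical training loop
    samples = torch.cat([generate(model) for _ in range(1000)])
    print(code_channels, letter_count(discriminator, samples))
```
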
Conclusion
Main contributions:
• The importance of representation for novelty generation
• Current models can generate novelty even though they are not designed for that
• We propose a new setup and a set of metrics to assess the capacity of models to generate novelty
• We show that constraining the capacity of the representation can be harmful for novelty generation

Perspectives: immediate next steps
• Explain why existing models can generate novelty
• Propose an explicit training criterion to learn a representation suitable for novelty generation
• Propose alternative generative procedures to random sampling
• Experiment on more complex datasets and domains

Perspectives: future
• Agent evolving over time: dynamic knowledge and value function
• Multi-agent system so that agents get/give feedback and cooperate

Thank you!