Buenos Aires, May 2016
Eduardo Poggi
Agenda
 Generalization rules
 Michalski's Star algorithm
 Vere's algorithm
 Learning First-Order Rules
Generalization rules (A < B reads: A is less general than B)
 Dropping a conjunct
 P(X) <- A(X) ^ B(X) ^ C(X) <
 P(X) <- A(X) ^ B(X)
 Adding a disjunct
 P(X) <- A(X) <
 P(X) <- A(X) v B(X)
 Replacing conjunctions with disjunctions
 P(X) <- A(X) ^ B(X) <
 P(X) <- A(X) v B(X)
Generalization rules
 Widening a value range
 P(v / v in R1) <
 P(v / v in R2) iff R1 ⊂ R2
 Replacing constants with variables
 P(a) <- … <
 P(X) <- …
 Inductive resolution
 { P(X) <- A(X) ^ B(X);
 P(X) <- -A(X) ^ C(X) } <
 P(X) <- B(X) v C(X)
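As a sketch (not from the slides), the dropping-a-conjunct rule can be expressed over rule bodies represented as sets of literal strings; removing a literal yields a strictly more general rule:

```python
# Sketch: a conjunctive rule body as a set of literals; removing a literal
# ("dropping a conjunct") generalizes the rule. Literal names are illustrative.
def drop_conjunct(body, literal):
    """Generalize a rule by removing one conjunct from its body."""
    assert literal in body
    return body - {literal}

def covers(body, example):
    """A conjunctive body covers an example if every literal holds in it."""
    return body <= example

rule = frozenset({"A(X)", "B(X)", "C(X)"})   # P(X) <- A(X) ^ B(X) ^ C(X)
general = drop_conjunct(rule, "C(X)")        # P(X) <- A(X) ^ B(X)

example = frozenset({"A(X)", "B(X)"})        # satisfies A and B but not C
assert not covers(rule, example)             # the original rule is too specific
assert covers(general, example)              # the generalized rule covers it
```

A more general body covers everything the original covers, plus the examples that falsified the dropped conjunct.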
Generalization rules
 Climbing the generalization tree
 P(v) <
 P(t(v)) iff v < t(v)
Distances
Example generalization tree (category labels kept in Spanish):
Producto
  Comestibles
    Animal
      Lácteos
        Leche liquida
        Leche fermentada
          Yogurt entero
            Yogurt natural
            Yogurt saborizado
          Yogurt descremado
        Quesos
        Manteca
      Cárnicos
    Vegetal
    Mineral
  Limpieza
  Indumentaria
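Tree climbing can be operationalized with a parent map; the dictionary below is a hypothetical ASCII encoding of part of the tree above, not something given in the slides:

```python
# Hypothetical parent map for a slice of the generalization tree above.
parent = {
    "yogurt_natural": "yogurt_entero",
    "yogurt_saborizado": "yogurt_entero",
    "yogurt_entero": "leche_fermentada",
    "yogurt_descremado": "leche_fermentada",
    "leche_fermentada": "lacteos",
    "lacteos": "animal",
    "animal": "comestibles",
    "comestibles": "producto",
}

def climb(value):
    """One application of the tree-climbing rule: P(v) -> P(t(v))."""
    return parent.get(value)

def distance_to_root(value):
    """Number of climbs to reach the root; a crude taxonomy-based distance."""
    steps = 0
    while value in parent:
        value = parent[value]
        steps += 1
    return steps

assert climb("yogurt_natural") == "yogurt_entero"
assert distance_to_root("yogurt_natural") == 6
```

Distances between two values can then be defined via their lowest common ancestor in the same map.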
Constructive generalization rules
 Term replacement
 P(X) <- A(X) ^ B(X) <
 P(X) <- A(X) ^ C(X) iff B(X) < C(X)
Agenda
 Generalization rules
 Michalski's Star algorithm
 Vere's algorithm
 Learning First-Order Rules
STAR (Michalski)
 Until the termination condition holds:
 Select an example
 Obtain the generalization tree (STAR) by applying to the example all the generalization (specialization) rules that do not cover counter-examples
 Evaluate the list of generalizations and sort it
 Remove the examples already covered
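The loop above can be sketched as follows. This is a simplified illustration, not Michalski's full algorithm: literals are ground strings, the only generalization rule applied is dropping a conjunct, and `score` stands in for the evaluation function:

```python
# STAR-style generation of generalizations of a seed example, pruning any
# candidate that covers a counter-example (sketch; names are illustrative).
def star(seed, negatives, score):
    """Return generalizations of `seed` (sets of literals) sorted by `score`."""
    candidates = [seed]
    star_set = []
    while candidates:
        body = candidates.pop()
        if any(body <= neg for neg in negatives):  # covers a counter-example
            continue
        star_set.append(body)
        # one generalization rule: drop a single conjunct
        candidates.extend(body - {lit} for lit in body if len(body) > 1)
    # deduplicate, then order by the evaluation function
    return sorted(set(star_set), key=score, reverse=True)

seed = frozenset({"a", "b", "c"})
neg = [frozenset({"b"})]                 # a counter-example satisfying only b
result = star(seed, neg, score=lambda b: len(b))
assert frozenset({"b"}) not in result    # pruned: it covers the negative
assert frozenset({"a", "b", "c"}) in result
```

Real STAR applies every generalization rule from the earlier slides, not just conjunct dropping, and evaluates candidates with a domain-specific preference criterion.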
STAR (Michalski)
Seed example:
venenoso(X) <-
color(X,marron)
forma(X,alargado)
tierra(X,humeda)
ambiente(X,humedo)
bajo_arbol(X,a33)
arbol(a33,fresno)
Constant a33 replaced by variable Y:
venenoso(X) <-
color(X,marron)
forma(X,alargado)
tierra(X,humeda)
ambiente(X,humedo)
bajo_arbol(X,Y)
arbol(Y,fresno)
Conjunct ambiente dropped:
venenoso(X) <-
color(X,marron)
forma(X,alargado)
tierra(X,humeda)
bajo_arbol(X,a33)
arbol(a33,fresno)
Conjuncts bajo_arbol and arbol dropped:
venenoso(X) <-
color(X,marron)
forma(X,alargado)
tierra(X,humeda)
ambiente(X,humedo)
venenoso(X) <- …
STAR (Michalski)
Merged with other examples (internal disjunction of attribute values):
venenoso(X) <-
color(X,[marron,verde])
forma(X,alargado)
tierra(X,humeda)
bajo_arbol(X,Y)
arbol(Y,[fresno,laurel])
venenoso(X) <-
forma(X,alargado)
tierra(X,humeda)
ambiente(X,humedo)
venenoso(X) <-
color(X,marron)
forma(X,alargado)
tierra(X,humeda)
ambiente(X,humedo)
bajo_arbol(X,Y)
arbol(Y,Z)
Agenda
 Generalization rules
 Michalski's Star algorithm
 Vere's algorithm
 Learning First-Order Rules
Abstraction and GCME
 Abstraction as inductive substitution
 GCME = maximally specific common generalization
 Coupling and residue
Vere
 P = GCME of the positive examples
 N = GCME of the counter-examples
 C = P ^ -N
 Iterate until:
 C = P ^ -(N1 ^ -(N2 ^ -(… ^ Nk) …))
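Restricted to ground literal sets (real GCME computation works over first-order terms with substitutions), the construction can be sketched as an intersection of the literals shared by all examples:

```python
# Sketch of the idea behind Vere's construction, simplified to ground literals.
def gcme(examples):
    """Maximally specific common generalization: literals shared by all examples."""
    common = set(examples[0])
    for ex in examples[1:]:
        common &= set(ex)
    return common

# Illustrative data (attribute values only, not the full predicates).
positives = [{"marron", "alargado", "humeda"}, {"marron", "alargado", "seca"}]
negatives = [{"verde", "alargado"}]

P = gcme(positives)          # what all positives share
N = gcme(negatives)          # what all negatives share
discriminating = N - P       # literals to negate in C = P ^ -N
assert P == {"marron", "alargado"}
assert discriminating == {"verde"}
```

The iterated form C = P ^ -(N1 ^ -(N2 ^ …)) repeats this construction, alternately excepting the counter-examples and the exceptions to the exceptions.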
Vere
 P1 =
 color(X,marron) ^ forma(X,alargado) ^ tierra(X,humeda) ^
 (ambiente(X,humedo) v ambiente(X,semi_humedo)) ^
 bajo_arbol(X,Y) ^ arbol(Y,Z)
 N1 =
 color(X,verde) ^ forma(X,redondo) ^
 ambiente(X,semi_humedo) ^
 bajo_arbol(X,Y) ^ arbol(Y,Z)
 C1 = P1 ^ -N1 = ?
Vere
 C1 = P1 ^ -N1 =
 color(X,marron) ^ forma(X,alargado) ^ tierra(X,humeda) ^
 (ambiente(X,humedo) v ambiente(X,semi_humedo)) ^
 bajo_arbol(X,Y) ^ arbol(Y,Z) ^
 - [ color(X,verde) ^ forma(X,redondo) ^
 ambiente(X,semi_humedo) ^
 bajo_arbol(X,Y) ^ arbol(Y,Z) ]
 =
 color(X,marron) ^ forma(X,alargado) ^ tierra(X,humeda) ^
 (ambiente(X,humedo) v ambiente(X,semi_humedo)) ^
 bajo_arbol(X,Y) ^ arbol(Y,Z) ^
 (-color(X,verde) v -forma(X,redondo) v
 -ambiente(X,semi_humedo) v
 -bajo_arbol(X,Y) v -arbol(Y,Z))
Vere
 ≈ …
 color(X,marron) ^ -color(X,verde)
 forma(X,alargado) ^ -forma(X,redondo)
 tierra(X,humeda) ^
 ambiente(X,humedo ^
 bajo_arbol(X,Y) ^ arbol(Y,Z)
ML as heuristic search
 k = ?
 List = {seed}
 Until the termination condition holds:
 Node = first element of List
 Select the generalization rules applicable to Node
 Apply the rules to Node, generating new Nodes
 Compute the Performance of the new Nodes
 Add the new Nodes to List
 Sort List by Performance
 Truncate List to the k best
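The steps above are a generic beam search. A sketch with placeholder `expand` (the generalization rules) and `perf` (the evaluation function), plus a toy run on integers:

```python
# Generic beam search matching the loop above (sketch; `expand` and `perf`
# are placeholders for the generalization rules and the evaluation function).
def beam_search(seed, expand, perf, k, steps):
    beam = [seed]
    best = seed
    for _ in range(steps):
        children = [c for node in beam for c in expand(node)]
        if not children:
            break
        for c in children:                 # track the best node seen so far
            if perf(c) > perf(best):
                best = c
        # sort by performance and truncate to the k best
        beam = sorted(set(children), key=perf, reverse=True)[:k]
    return best

# Toy run: states are integers, expanding n yields n+1 and n+2, and
# performance rewards closeness to the target value 10.
best = beam_search(0,
                   lambda n: [n + 1, n + 2] if n < 10 else [],
                   lambda n: -abs(10 - n),
                   k=3, steps=10)
assert best == 10
```

With k = 1 this degenerates to greedy hill climbing; larger k trades computation for robustness against locally misleading performance scores.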
Agenda
 Generalization rules
 Michalski's Star algorithm
 Vere's algorithm
 Learning First-Order Rules
Learning sets of rules
 Learning sets of rules has the advantage that the resulting hypothesis is easy to interpret.
 We use a sequential covering algorithm to learn first-order rules.
Learning rules
 First-order rule sets contain rules that have variables.
 This enables us to have stronger representational power.
 Example:
 If Parent(x,y) then Ancestor(x,y)
 If Parent(x,z) and Ancestor(z,y) then Ancestor(x,y)
 How would you represent this using a decision tree or propositional calculus?
Sequential Covering
 General idea:
 Learn one rule that covers certain number of positive examples
 Remove those examples covered by the rule
 Repeat until no positive examples are left.
[Diagram: two rules, Rule 1 and Rule 2, each covering a subset of the positive examples.]
Accuracy vs Coverage
We require each rule to have high accuracy, but not necessarily high
coverage. For example, a rule with 90% accuracy and only 50% coverage is
acceptable: in general, coverage may be low as long as accuracy is high.
Sequential Covering Algorithm
 Sequential-Covering(class, attributes, examples, threshold T)
 RuleSet = {}
 Rule = Learn-one-rule(class, attributes, examples)
 While (performance(Rule) > T) do
 RuleSet += Rule
 Examples = Examples - {examples classified correctly by Rule}
 Rule = Learn-one-rule(class, attributes, examples)
 Sort RuleSet based on the performance of the rules
 Return RuleSet
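The pseudocode above can be sketched concretely. This is a minimal propositional illustration: examples are (attribute-set, label) pairs, rules are attribute sets, `learn_one_rule` is a deliberately trivial stub, and rule accuracy serves as the performance measure; all names are illustrative:

```python
# Minimal sequential-covering loop (sketch; the stub learner simply proposes
# the attribute set of the first remaining positive example as a rule).
def learn_one_rule(examples):
    for attrs, label in examples:
        if label:
            return attrs
    return None

def performance(rule, examples):
    """Accuracy of the rule over the examples it covers."""
    covered = [lbl for attrs, lbl in examples if rule <= attrs]
    return sum(covered) / len(covered) if covered else 0.0

def sequential_covering(examples, threshold):
    rule_set = []
    rule = learn_one_rule(examples)
    while rule is not None and performance(rule, examples) > threshold:
        rule_set.append(rule)
        # remove the examples classified correctly by the rule
        examples = [(a, l) for a, l in examples if not (rule <= a and l)]
        rule = learn_one_rule(examples)
    return rule_set

data = [(frozenset({"x", "y"}), 1), (frozenset({"y"}), 0), (frozenset({"z"}), 1)]
rules = sequential_covering(data, threshold=0.5)
assert len(rules) == 2        # one rule per covered positive region
```

A real implementation would replace the stub with the beam-searched Learn-One-Rule described later in these slides.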
Sequential Covering Algorithm
 Observations:
 It performs a greedy search (no backtracking); as such, it may not find an optimal rule set.
 It learns a disjunctive set of rules by learning one disjunct (a conjunction of attribute values) at a time.
 It sequentially covers the set of positive examples until the performance of a rule falls below a threshold.
Learn One Rule
How do we learn each individual rule?
One approach is to proceed as in decision tree learning, following the
branch with the best score according to the splitting function:
[Decision-tree fragment: the root tests Luminosity; the <= T1 branch tests
Mass (<= T2 gives Type A, > T2 gives Type B), and the > T1 branch gives Type C.]
Following the best branch yields:
If Luminosity <= T1 and Mass > T2 then class is Type B
Learn One Rule
 Observations:
 We greedily choose the attribute that most improves rule performance over
the training set.
 We perform a greedy depth-first search with no backtracking.
 The algorithm can be extended using a beam search:
 We keep a list of the best k candidate hypotheses at each step.
 For each candidate we generate descendants.
 In the next step we take the best k candidates and continue.
Algorithm
 LearnOneRule(class, attributes, examples, k):
 Best-hypothesis = the most general (empty) hypothesis
 Candidate-hypotheses = {Best-hypothesis}
 While Candidate-hypotheses is not empty do
 Generate the next more specific candidate hypotheses
 Update Best-hypothesis:
 For all h in new-candidates:
 if (Performance(h) > Performance(Best-hypothesis)) then Best-hypothesis = h
 Update Candidate-hypotheses = best k members of new-candidates
 Return rule: If Best-hypothesis then prediction (the most frequent class among the examples covered by Best-hypothesis)
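A compact sketch of this algorithm for attribute=value hypotheses; the representation (frozensets of (attribute, value) pairs) and the toy data are illustrative, not from the slides:

```python
# LearnOneRule as a beam search over attribute=value tests (sketch).
def learn_one_rule(values, perf, k):
    best = frozenset()                       # most general (empty) hypothesis
    candidates = [best]
    while candidates:
        # generate the next more specific candidate hypotheses
        new = set()
        for h in candidates:
            used = {a for a, _ in h}
            new |= {frozenset(h | {(a, v)}) for a, v in values if a not in used}
        # update the best hypothesis
        for h in new:
            if perf(h) > perf(best):
                best = h
        # keep the best k members of the new candidates
        candidates = sorted(new, key=perf, reverse=True)[:k]
    return best

# Toy data: the target concept is lum=l1 AND mass=m2.
data = [({"lum": "l1", "mass": "m2"}, 1),
        ({"lum": "l1", "mass": "m1"}, 0),
        ({"lum": "l2", "mass": "m2"}, 0)]

def perf(h):
    """Accuracy over the examples the hypothesis covers."""
    covered = [lbl for ex, lbl in data if all(ex.get(a) == v for a, v in h)]
    return sum(covered) / len(covered) if covered else 0.0

values = [("lum", "l1"), ("lum", "l2"), ("mass", "m1"), ("mass", "m2")]
assert learn_one_rule(values, perf, k=2) == frozenset({("lum", "l1"), ("mass", "m2")})
```

Skipping attributes already used in a hypothesis filters out both duplicate and inconsistent specializations, matching the pruning step on the next slide.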
Algorithm
 Generate the next more specific candidate hypotheses:
 Values = the set of all attribute values, e.g., color = blue
 For each rule h in Candidate-hypotheses do
 For each attribute value v do
 Add value v to h
 new-candidates += h
 Remove from new-candidates any hypotheses that are duplicates, inconsistent, or not maximally specific
 Return new-candidates
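The specialization-plus-filtering step can be sketched for attribute=value tests. Adding a value for an attribute the hypothesis already tests produces either a duplicate (same value) or an inconsistent hypothesis (conflicting value), so both are filtered by one attribute check; names below are illustrative:

```python
# Generate next, more specific candidates, filtering duplicates (*) and
# inconsistent hypotheses (^) as in the pseudocode above (sketch).
def next_candidates(hypotheses, values):
    new = set()
    for h in hypotheses:
        used = {attr for attr, _ in h}
        for attr, val in values:
            if attr in used:      # same value -> duplicate; new value -> inconsistent
                continue
            new.add(frozenset(h | {(attr, val)}))
    return new

values = [("lum", "l1"), ("lum", "l2"), ("mass", "m1"), ("mass", "m2")]
cands = next_candidates([frozenset({("lum", "l1")})], values)
assert frozenset({("lum", "l1"), ("mass", "m2")}) in cands
assert len(cands) == 2            # only the two mass extensions survive
```

Using sets also makes hypotheses reached along different orders (l1 & m2 vs m2 & l1) collapse into one candidate automatically.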
Example
 Astronomy problem: classifying objects as stars of different types.
 Attributes: luminosity, mass, temperature, size.
 Assume the possible attribute values are encoded as follows:
 l1: Luminosity <= T1   l2: Luminosity > T1
 m1: Mass <= T2   m2: Mass > T2
 c1: Temperature <= T3   c2: Temperature > T3
 s1: Size <= T4   s2: Size > T4
Running Algorithm on Example
 Initial single-attribute hypotheses:
 l1, l2, m1, m2, c1, c2, s1, s2
 Assume Performance = P
 P(c1) > P(x) for all x different than c1
 Then best-hypothesis = c1
 Assume k = 3
 Best k hypotheses: l1, m2, c1
Running Algorithm on Example
 Candidate hypotheses: l1, m2, and c1
 New candidates:
 l1 & l1 (*)    m2 & l1       c1 & l1
 l1 & l2 (^)    m2 & l2       c1 & l2
 l1 & m1        m2 & m1 (^)   … etc
 l1 & m2        m2 & m2 (*)
 l1 & c1        m2 & c1
 l1 & c2        … etc
 l1 & s1
 l1 & s2
 (*) duplicate   (^) inconsistent
Running Algorithm on Example
 Compute the performance of each new candidate.
 Update best-hypothesis to the best new candidate
 Example:
 Best-hypothesis = l1 & c2
 Now take the best k = 3 new candidates
 And continue generating new candidates:
 l1 & c2 & s1
 l1 & c2 & s2
 … etc
Performance Evaluation
 The performance of a new candidate can be computed
using information-theoretic measures like entropy:
 Performance(h, examples, class)
 h_examples = the subset of examples covered by h
 Return -Entropy(h_examples) (negated so that purer subsets score higher)
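A sketch of this measure in Python. Negating the entropy so that higher performance means a purer set of covered examples is an assumption consistent with Learn-One-Rule maximizing Performance:

```python
import math

# Entropy-based performance (sketch): lower entropy of the covered examples
# means a purer rule, so we return the negated entropy to maximize.
def performance(h_examples):
    """h_examples: list of class labels of the examples covered by h."""
    n = len(h_examples)
    if n == 0:
        return float("-inf")     # a rule covering nothing is worthless
    entropy = 0.0
    for c in set(h_examples):
        p = h_examples.count(c) / n
        entropy -= p * math.log2(p)
    return -entropy

assert performance(["a", "a", "a"]) == 0.0   # pure subset: entropy 0
assert performance(["a", "b"]) == -1.0       # maximally impure for 2 classes
```

Relative frequency and plain accuracy, listed at the end of these slides, can be dropped into the same slot.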
Considerations
 The best hypothesis is the one with the highest performance value, not necessarily the last hypothesis generated:
 Search space:
 l1, m2, c1
 l1&m2, l2&s1, m2&c1
 l1&m2&s1, l2&s1&c2, m2&c1&s2
 …
 Possible best hypothesis: l2&s1
Variations
What happens if the proportion of examples of a class is low? In other
words, what happens if the a priori probability of a class of examples is
very low?
Example: patients with a very rare disease.
In that case we can modify the algorithm to learn rules only for those rare
examples, and to classify anything not covered by the rule set as negative.
Variations
 A second variation is used in the popular AQ and CN2 algorithms. General idea:
 Choose one positive example as a seed
 Look for the most specific rule that covers the seed and has high performance
 Repeat with another seed example until no further improvement is seen in the rule set
Variations
[Diagram: Rule 1 grown around seed1 and Rule 2 around seed2, each covering the positives near its seed.]
Final Points for Consideration
 The search can be done in a general-to-specific fashion, but one can also search specific-to-general. Which direction is best?
 Here we use a generate-then-test strategy. How about an example-driven strategy like the candidate-elimination algorithm? (That kind of strategy is more easily fooled by noise in the data.)
 When and how should we prune rules?
 Different performance metrics exist:
 Relative frequency
 Accuracy
 Entropy
Rule learning and decision trees
 What is the difference between the two?
 Decision trees: Divide and conquer
 Rule Learning: Separate and conquer
eduardopoggi@yahoo.com.ar
eduardo-poggi
http://ar.linkedin.com/in/eduardoapoggi
https://www.facebook.com/eduardo.poggi
@eduardoapoggi
 
Islamabad Escorts | Call 03274100048 | Escort Service in Islamabad
Islamabad Escorts | Call 03274100048 | Escort Service in IslamabadIslamabad Escorts | Call 03274100048 | Escort Service in Islamabad
Islamabad Escorts | Call 03274100048 | Escort Service in IslamabadAyesha Khan
 
Intro to BCG's Carbon Emissions Benchmark_vF.pdf
Intro to BCG's Carbon Emissions Benchmark_vF.pdfIntro to BCG's Carbon Emissions Benchmark_vF.pdf
Intro to BCG's Carbon Emissions Benchmark_vF.pdfpollardmorgan
 
Digital Transformation in the PLM domain - distrib.pdf
Digital Transformation in the PLM domain - distrib.pdfDigital Transformation in the PLM domain - distrib.pdf
Digital Transformation in the PLM domain - distrib.pdfJos Voskuil
 
Call Girls In Sikandarpur Gurgaon ❤️8860477959_Russian 100% Genuine Escorts I...
Call Girls In Sikandarpur Gurgaon ❤️8860477959_Russian 100% Genuine Escorts I...Call Girls In Sikandarpur Gurgaon ❤️8860477959_Russian 100% Genuine Escorts I...
Call Girls In Sikandarpur Gurgaon ❤️8860477959_Russian 100% Genuine Escorts I...lizamodels9
 

Recently uploaded (20)

8447779800, Low rate Call girls in Shivaji Enclave Delhi NCR
8447779800, Low rate Call girls in Shivaji Enclave Delhi NCR8447779800, Low rate Call girls in Shivaji Enclave Delhi NCR
8447779800, Low rate Call girls in Shivaji Enclave Delhi NCR
 
Lowrate Call Girls In Sector 18 Noida ❤️8860477959 Escorts 100% Genuine Servi...
Lowrate Call Girls In Sector 18 Noida ❤️8860477959 Escorts 100% Genuine Servi...Lowrate Call Girls In Sector 18 Noida ❤️8860477959 Escorts 100% Genuine Servi...
Lowrate Call Girls In Sector 18 Noida ❤️8860477959 Escorts 100% Genuine Servi...
 
Market Sizes Sample Report - 2024 Edition
Market Sizes Sample Report - 2024 EditionMarket Sizes Sample Report - 2024 Edition
Market Sizes Sample Report - 2024 Edition
 
8447779800, Low rate Call girls in New Ashok Nagar Delhi NCR
8447779800, Low rate Call girls in New Ashok Nagar Delhi NCR8447779800, Low rate Call girls in New Ashok Nagar Delhi NCR
8447779800, Low rate Call girls in New Ashok Nagar Delhi NCR
 
8447779800, Low rate Call girls in Kotla Mubarakpur Delhi NCR
8447779800, Low rate Call girls in Kotla Mubarakpur Delhi NCR8447779800, Low rate Call girls in Kotla Mubarakpur Delhi NCR
8447779800, Low rate Call girls in Kotla Mubarakpur Delhi NCR
 
Case study on tata clothing brand zudio in detail
Case study on tata clothing brand zudio in detailCase study on tata clothing brand zudio in detail
Case study on tata clothing brand zudio in detail
 
Call Girls in DELHI Cantt, ( Call Me )-8377877756-Female Escort- In Delhi / Ncr
Call Girls in DELHI Cantt, ( Call Me )-8377877756-Female Escort- In Delhi / NcrCall Girls in DELHI Cantt, ( Call Me )-8377877756-Female Escort- In Delhi / Ncr
Call Girls in DELHI Cantt, ( Call Me )-8377877756-Female Escort- In Delhi / Ncr
 
BEST Call Girls In Old Faridabad ✨ 9773824855 ✨ Escorts Service In Delhi Ncr,
BEST Call Girls In Old Faridabad ✨ 9773824855 ✨ Escorts Service In Delhi Ncr,BEST Call Girls In Old Faridabad ✨ 9773824855 ✨ Escorts Service In Delhi Ncr,
BEST Call Girls In Old Faridabad ✨ 9773824855 ✨ Escorts Service In Delhi Ncr,
 
Youth Involvement in an Innovative Coconut Value Chain by Mwalimu Menza
Youth Involvement in an Innovative Coconut Value Chain by Mwalimu MenzaYouth Involvement in an Innovative Coconut Value Chain by Mwalimu Menza
Youth Involvement in an Innovative Coconut Value Chain by Mwalimu Menza
 
Marketing Management Business Plan_My Sweet Creations
Marketing Management Business Plan_My Sweet CreationsMarketing Management Business Plan_My Sweet Creations
Marketing Management Business Plan_My Sweet Creations
 
8447779800, Low rate Call girls in Tughlakabad Delhi NCR
8447779800, Low rate Call girls in Tughlakabad Delhi NCR8447779800, Low rate Call girls in Tughlakabad Delhi NCR
8447779800, Low rate Call girls in Tughlakabad Delhi NCR
 
8447779800, Low rate Call girls in Saket Delhi NCR
8447779800, Low rate Call girls in Saket Delhi NCR8447779800, Low rate Call girls in Saket Delhi NCR
8447779800, Low rate Call girls in Saket Delhi NCR
 
Call Girls In Connaught Place Delhi ❤️88604**77959_Russian 100% Genuine Escor...
Call Girls In Connaught Place Delhi ❤️88604**77959_Russian 100% Genuine Escor...Call Girls In Connaught Place Delhi ❤️88604**77959_Russian 100% Genuine Escor...
Call Girls In Connaught Place Delhi ❤️88604**77959_Russian 100% Genuine Escor...
 
Call Girls In Sikandarpur Gurgaon ❤️8860477959_Russian 100% Genuine Escorts I...
Call Girls In Sikandarpur Gurgaon ❤️8860477959_Russian 100% Genuine Escorts I...Call Girls In Sikandarpur Gurgaon ❤️8860477959_Russian 100% Genuine Escorts I...
Call Girls In Sikandarpur Gurgaon ❤️8860477959_Russian 100% Genuine Escorts I...
 
Progress Report - Oracle Database Analyst Summit
Progress  Report - Oracle Database Analyst SummitProgress  Report - Oracle Database Analyst Summit
Progress Report - Oracle Database Analyst Summit
 
VIP Kolkata Call Girl Howrah 👉 8250192130 Available With Room
VIP Kolkata Call Girl Howrah 👉 8250192130  Available With RoomVIP Kolkata Call Girl Howrah 👉 8250192130  Available With Room
VIP Kolkata Call Girl Howrah 👉 8250192130 Available With Room
 
Islamabad Escorts | Call 03274100048 | Escort Service in Islamabad
Islamabad Escorts | Call 03274100048 | Escort Service in IslamabadIslamabad Escorts | Call 03274100048 | Escort Service in Islamabad
Islamabad Escorts | Call 03274100048 | Escort Service in Islamabad
 
Intro to BCG's Carbon Emissions Benchmark_vF.pdf
Intro to BCG's Carbon Emissions Benchmark_vF.pdfIntro to BCG's Carbon Emissions Benchmark_vF.pdf
Intro to BCG's Carbon Emissions Benchmark_vF.pdf
 
Digital Transformation in the PLM domain - distrib.pdf
Digital Transformation in the PLM domain - distrib.pdfDigital Transformation in the PLM domain - distrib.pdf
Digital Transformation in the PLM domain - distrib.pdf
 
Call Girls In Sikandarpur Gurgaon ❤️8860477959_Russian 100% Genuine Escorts I...
Call Girls In Sikandarpur Gurgaon ❤️8860477959_Russian 100% Genuine Escorts I...Call Girls In Sikandarpur Gurgaon ❤️8860477959_Russian 100% Genuine Escorts I...
Call Girls In Sikandarpur Gurgaon ❤️8860477959_Russian 100% Genuine Escorts I...
 

Poggi analytics - star - 1a

  • 1. Buenos Aires, May 2016. Eduardo Poggi
  • 2. Agenda
    - Generalization rules
    - Michalski's Star algorithm
    - Vere's algorithm
    - Learning first-order rules
  • 3. Agenda
    - Generalization rules
    - Michalski's Star algorithm
    - Vere's algorithm
    - Learning first-order rules
  • 4. Generalization rules (here R1 < R2 reads "R1 is less general than R2")
    - Dropping a conjunct:
      P(X) <- A(X) ^ B(X) ^ C(X)  <  P(X) <- A(X) ^ B(X)
    - Adding a disjunct:
      P(X) <- A(X)  <  P(X) <- A(X) v B(X)
    - Replacing conjunctions by disjunctions:
      P(X) <- A(X) ^ B(X)  <  P(X) <- A(X) v B(X)
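The dropping-a-conjunct rule can be made concrete. A minimal sketch, assuming a rule body is encoded as a frozenset of attribute tests (the encoding and the predicate names are illustrative, not from the slides):

```python
# Sketch of the "dropping a conjunct" generalization rule.
# A rule body is a frozenset of attribute tests; dropping any one test
# yields a strictly more general rule (it covers a superset of examples).

def drop_conjunct_generalizations(body):
    """All bodies obtained by removing exactly one conjunct."""
    return [body - {lit} for lit in body]

def covers(body, example):
    """A conjunctive body covers an example iff every test holds in it."""
    return body <= example  # subset test

body = frozenset({("color", "marron"), ("forma", "alargado"), ("tierra", "humeda")})
example = frozenset({("color", "marron"), ("forma", "alargado"), ("tierra", "seca")})

print(covers(body, example))                  # False: the tierra test fails
gens = drop_conjunct_generalizations(body)
print(any(covers(g, example) for g in gens))  # True: dropping tierra covers it
```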
  • 5. Generalization rules
    - Widening a range of values:
      P(v / v in R1)  <  P(v / v in R2)  iff R1 < R2
    - Turning constants into variables:
      P(a) <- ...  <  P(X) <- ...
    - Inductive resolution:
      { P(X) <- A(X) ^ B(X);  P(X) <- -A(X) ^ C(X) }  <  P(X) <- B(X) v C(X)
  • 6. Generalization rules
    - Climbing the generalization tree:
      P(v)  <  P(t(v))  iff v < t(v), i.e. v lies below t(v) in the tree
  • 7. Distances. An example generalization tree over products (each line lists a node's children):
      Producto: Comestibles, Limpieza, Indumentaria
      Comestibles: Animal, Vegetal, Mineral
      Animal: Lácteos, Cárnicos
      Lácteos: Leche líquida, Leche fermentada, Quesos, Manteca
      Leche fermentada: Yogurt entero, Yogurt descremado
      Yogurt entero: Yogurt natural, Yogurt saborizado
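Climbing the generalization tree amounts to replacing a value by one of its ancestors. A small sketch over the product tree above; the parent links, and in particular the placement of the yogurt leaves, are assumptions read off the slide:

```python
# Sketch: "climbing the generalization tree" replaces a value v by an
# ancestor t(v). Parent links follow the product tree on the slide
# (the placement of the yogurt leaves is an assumption).

PARENT = {
    "Yogurt natural": "Yogurt entero", "Yogurt saborizado": "Yogurt entero",
    "Yogurt entero": "Leche fermentada", "Yogurt descremado": "Leche fermentada",
    "Leche fermentada": "Lácteos", "Leche líquida": "Lácteos",
    "Quesos": "Lácteos", "Manteca": "Lácteos",
    "Lácteos": "Animal", "Cárnicos": "Animal",
    "Animal": "Comestibles", "Vegetal": "Comestibles", "Mineral": "Comestibles",
    "Comestibles": "Producto", "Limpieza": "Producto", "Indumentaria": "Producto",
}

def ancestors(v):
    """Chain of successively more general terms above v."""
    out = []
    while v in PARENT:
        v = PARENT[v]
        out.append(v)
    return out

def tree_distance(a, b):
    """Distance between two terms via their lowest common ancestor."""
    up_a = [a] + ancestors(a)
    up_b = [b] + ancestors(b)
    common = next(x for x in up_a if x in up_b)
    return up_a.index(common) + up_b.index(common)

print(ancestors("Yogurt natural"))
print(tree_distance("Quesos", "Cárnicos"))  # 3: Quesos -> Lácteos -> Animal <- Cárnicos
```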
  • 8. Constructive generalization rules
    - Term replacement:
      P(X) <- A(X) ^ B(X)  <  P(X) <- A(X) ^ C(X)  iff B(X) < C(X)
  • 9. Agenda
    - Generalization rules
    - Michalski's Star algorithm
    - Vere's algorithm
    - Learning first-order rules
  • 10. STAR (Michalski)
    - Until the termination condition holds:
      - Select a seed example.
      - Build the generalization tree (the STAR) by applying to the example every
        generalization (or specialization) rule that does not cover any counterexamples.
      - Evaluate the resulting list of generalizations and sort it.
      - Remove the examples already covered.
  • 11. STAR (Michalski). From the seed rule
      venenoso(X) <- color(X,marron) ^ forma(X,alargado) ^ tierra(X,humeda) ^
                     ambiente(X,humedo) ^ bajo_arbol(X,a33) ^ arbol(a33,fresno)
    some candidate generalizations are:
      venenoso(X) <- color(X,marron) ^ forma(X,alargado) ^ tierra(X,humeda) ^
                     ambiente(X,humedo) ^ bajo_arbol(X,Y) ^ arbol(Y,fresno)
      venenoso(X) <- color(X,marron) ^ forma(X,alargado) ^ tierra(X,humeda) ^
                     bajo_arbol(X,a33) ^ arbol(a33,fresno)
      venenoso(X) <- color(X,marron) ^ forma(X,alargado) ^ tierra(X,humeda) ^
                     ambiente(X,humedo)
      venenoso(X) <- ...
  • 12. STAR (Michalski). Further generalizations:
      venenoso(X) <- color(X,[marron,verde]) ^ forma(X,alargado) ^ tierra(X,humeda) ^
                     bajo_arbol(X,Y) ^ arbol(Y,[fresno,laurel])
      venenoso(X) <- forma(X,alargado) ^ tierra(X,humeda) ^ ambiente(X,humedo)
      venenoso(X) <- color(X,marron) ^ forma(X,alargado) ^ tierra(X,humeda) ^
                     ambiente(X,humedo) ^ bajo_arbol(X,Y) ^ arbol(Y,Z)
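One STAR step can be sketched under a simplified encoding: here the only generalization operator is dropping conjuncts, and a candidate enters the star only if it covers no negative example (the predicates and encoding are illustrative):

```python
# Sketch of one STAR step: from a seed example's maximally specific rule,
# generate generalizations (here, only by dropping conjuncts) and keep
# those that are consistent with the negative examples.

from itertools import combinations

def consistent(body, negatives):
    """True iff the body covers no negative example."""
    return not any(body <= n for n in negatives)

def star(seed_body, negatives):
    """All consistent bodies obtained by keeping any nonempty subset of conjuncts."""
    lits = list(seed_body)
    out = []
    for r in range(len(lits), 0, -1):        # from most specific to most general
        for keep in combinations(lits, r):
            body = frozenset(keep)
            if consistent(body, negatives):
                out.append(body)
    return out

seed = frozenset({("color", "marron"), ("forma", "alargado"), ("ambiente", "humedo")})
neg = [frozenset({("color", "verde"), ("forma", "redondo"), ("ambiente", "humedo")})]
s = star(seed, neg)
# {("ambiente","humedo")} alone also covers the negative, so it is excluded:
print(frozenset({("ambiente", "humedo")}) in s)  # False
```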
  • 13. Agenda
    - Generalization rules
    - Michalski's Star algorithm
    - Vere's algorithm
    - Learning first-order rules
  • 14. Abstraction and GCME
    - Abstraction as an inductive substitution.
    - GCME = maximally specific common generalization.
    - Coupling and residue.
  • 15. Vere
    - P = GCME of the positive examples.
    - N = GCME of the counterexamples.
    - C = P & -N
    - Continue iteratively until:
      C = P & -(N1 & -(N2 & -(... & Nk) ...))
  • 16. Vere
      P1 = color(X,marron) ^ forma(X,alargado) ^ tierra(X,humeda) ^
           (ambiente(X,humedo) v ambiente(X,semi_humedo)) ^
           bajo_arbol(X,Y) ^ arbol(Y,Z)
      N1 = color(X,verde) ^ forma(X,redondo) ^
           ambiente(X,semi_humedo) ^
           bajo_arbol(X,Y) ^ arbol(Y,Z)
      C1 = P1 ^ -N1 = ?
  • 17. Vere
      C1 = P1 ^ -N1
         = color(X,marron) ^ forma(X,alargado) ^ tierra(X,humeda) ^
           (ambiente(X,humedo) v ambiente(X,semi_humedo)) ^
           bajo_arbol(X,Y) ^ arbol(Y,Z) ^
           -[ color(X,verde) ^ forma(X,redondo) ^
              ambiente(X,semi_humedo) ^
              bajo_arbol(X,Y) ^ arbol(Y,Z) ]
         = color(X,marron) ^ forma(X,alargado) ^ tierra(X,humeda) ^
           (ambiente(X,humedo) v ambiente(X,semi_humedo)) ^
           bajo_arbol(X,Y) ^ arbol(Y,Z) ^
           (-color(X,verde) v -forma(X,redondo) v
            -ambiente(X,semi_humedo) v
            -bajo_arbol(X,Y) v -arbol(Y,Z))
  • 18. Vere. Simplifying (keeping the satisfiable combination):
      ≈ color(X,marron) ^ -color(X,verde) ^
        forma(X,alargado) ^ -forma(X,redondo) ^
        tierra(X,humeda) ^
        ambiente(X,humedo) ^
        bajo_arbol(X,Y) ^ arbol(Y,Z)
  • 19. ML as beam search
    - k = ?
    - List = {seed}
    - Until the termination condition holds:
      - Node = first element of List
      - Select the generalization rules applicable to Node
      - Apply the rules to Node, generating new nodes
      - Compute the performance of the new nodes
      - Add the new nodes to List
      - Sort List by performance
      - Truncate List to the k best nodes
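The loop above is a standard beam search, which might be sketched generically as follows; `expand` and `performance` are placeholders that the concrete learner must supply:

```python
# Generic sketch of ML-as-beam-search: keep the k best nodes, expand them
# with the applicable generalization operators, and repeat until no node
# can be expanded (or a step limit is reached).

def beam_search(seed, expand, performance, k, max_steps=100):
    beam = [seed]
    best = seed
    for _ in range(max_steps):
        children = [c for node in beam for c in expand(node)]
        if not children:
            break
        for c in children:
            if performance(c) > performance(best):
                best = c
        beam = sorted(children, key=performance, reverse=True)[:k]
    return best

# Toy usage: "nodes" are numbers, expanding halves the distance to 10.
print(round(beam_search(0.0,
                        lambda x: [x + (10 - x) / 2] if abs(10 - x) > 1e-6 else [],
                        lambda x: -abs(10 - x), k=2), 2))  # → 10.0
```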
  • 20. Agenda  Reglas de generalización  Algoritmo Star de Michalsky  Algoritmo de Vere  Learning First Order Rules
  • 21. Learning sets of rules
    - Learning sets of rules has the advantage that the hypothesis is easy to interpret.
    - A sequential covering algorithm can be used to learn first-order rules.
  • 22. Learning rules
    - First-order rule sets contain rules that have variables.
    - This gives us stronger representational power.
    - Example:
      If Parent(x,y) then Ancestor(x,y)
      If Parent(x,z) and Ancestor(z,y) then Ancestor(x,y)
    - How would you represent this using a decision tree or propositional calculus?
  • 23. Sequential covering
    - General idea:
      - Learn one rule that covers a certain number of positive examples.
      - Remove the examples covered by that rule.
      - Repeat until no positive examples are left.
  • 24. Accuracy vs. coverage
    - We require each rule to have high accuracy but not necessarily high coverage.
    - Example: Rule 1 has 90% accuracy and 50% coverage.
    - In general, coverage may be low as long as accuracy is high.
  • 25. Sequential covering algorithm
      Sequential-Covering(class, attributes, examples, threshold T)
        RuleSet = {}
        Rule = Learn-One-Rule(class, attributes, examples)
        While Performance(Rule) > T do
          RuleSet += Rule
          Examples = Examples - {examples correctly classified by Rule}
          Rule = Learn-One-Rule(class, attributes, examples)
        Sort RuleSet by the performance of the rules
        Return RuleSet
  • 26. Sequential covering algorithm
    - Observations:
      - It performs a greedy search (no backtracking), so it may not find an optimal rule set.
      - It learns a disjunctive set of rules one disjunct (a conjunction of attribute values) at a time.
      - It sequentially covers the set of positive examples until the performance of a rule falls below the threshold.
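The Sequential-Covering pseudocode can be sketched in runnable form; `learn_one_rule` and `performance` are placeholders supplied by the caller, and rules are modeled as predicates over examples (an assumption, not the slide's notation):

```python
# Sketch of Sequential-Covering: repeatedly learn one rule, keep it if
# its performance clears the threshold, remove the positives it covers,
# and finally sort the rule set by performance.

def sequential_covering(positives, negatives, learn_one_rule, performance, threshold):
    rule_set = []
    examples = list(positives)
    while examples:
        rule = learn_one_rule(examples, negatives)
        if rule is None or performance(rule, examples, negatives) <= threshold:
            break
        rule_set.append(rule)
        examples = [ex for ex in examples if not rule(ex)]  # drop covered positives
    rule_set.sort(key=lambda r: performance(r, positives, negatives), reverse=True)
    return rule_set

# Toy usage: each learned "rule" matches exactly one remaining positive,
# so three positives yield three rules.
pos, neg = [1, 2, 3], []
rules = sequential_covering(
    pos, neg,
    learn_one_rule=lambda ex, n: (lambda x, v=ex[0]: x == v),
    performance=lambda r, ex, n: sum(map(r, ex)),
    threshold=0)
print(len(rules))  # → 3
```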
  • 27. Learn one rule
    - How do we learn each individual rule? One approach proceeds as in decision tree
      learning, but always follows the branch with the best score under the splitting
      function. For example, descending through Luminosity (<= T1 / > T1) and then
      Mass (<= T2 / > T2) yields the rule:
      If Luminosity <= T1 and Mass > T2 then class is Type B.
  • 28. Learn one rule
    - Observations:
      - We greedily choose the attribute that most improves rule performance over the training set.
      - We perform a greedy depth-first search with no backtracking.
    - The algorithm can be extended with a beam search:
      - Keep a list of the best k attributes at each step.
      - For each attribute, generate descendants.
      - In the next step, take the best k attributes and continue.
  • 29. Algorithm
      LearnOneRule(class, attributes, examples, k):
        Best-hypothesis = {}
        Candidate-hypotheses = {Best-hypothesis}
        While Candidate-hypotheses is not empty do
          Generate the next more specific candidate hypotheses
          Update Best-hypothesis:
            For all h in New-candidates:
              if Performance(h) > Performance(Best-hypothesis) then Best-hypothesis = h
          Candidate-hypotheses = best k members of New-candidates
        Return rule: "If Best-hypothesis then prediction"
          (prediction = the most frequent class among the examples covered by Best-hypothesis)
  • 30. Algorithm
      Generate the next more specific candidate hypotheses:
        Values = the set of all attribute values, e.g., color = blue
        For each rule h in Candidate-hypotheses do
          For each attribute value v do
            Add value v to h
            New-candidates += h
        Remove from New-candidates any hypotheses that are duplicates, inconsistent,
        or not maximally specific.
        Return New-candidates
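The candidate-generation step can be sketched by extending each hypothesis with one more attribute test, filtering out duplicates and inconsistent combinations (hypotheses are modeled as frozensets of attribute/value pairs, an illustrative encoding):

```python
# Sketch of "generate next more specific candidate hypotheses": extend
# each hypothesis with one additional attribute test. Using a set drops
# duplicates; skipping already-used attributes drops both inconsistent
# hypotheses (two values for one attribute) and non-specializing repeats.

def specialize(candidates, attribute_values):
    new = set()
    for h in candidates:
        used = {attr for attr, _ in h}
        for attr, val in attribute_values:
            if attr not in used:
                new.add(h | {(attr, val)})
    return new

values = [("lum", "l1"), ("lum", "l2"), ("mass", "m1"), ("mass", "m2")]
seed = [frozenset([("lum", "l1")])]
out = specialize(seed, values)
# l1 pairs only with the mass tests: {l1,m1} and {l1,m2}
print(len(out))  # → 2
```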
  • 31. Example
    - Astronomy problem: classifying objects as stars of different types.
    - Attributes: luminosity, mass, temperature, size.
    - Assume the possible values are encoded as follows:
      Luminosity <= T1 = l1    Luminosity > T1 = l2
      Mass <= T2 = m1          Mass > T2 = m2
      Temperature <= T3 = c1   Temperature > T3 = c2
      Size <= T4 = s1          Size > T4 = s2
  • 32. Running the algorithm on the example
    - Most specific hypotheses: l1, l2, m1, m2, c1, c2, s1, s2
    - Assume Performance = P and P(c1) > P(x) for all x other than c1.
    - Then Best-hypothesis = c1.
    - Assume k = 4. Best k hypotheses: l1, m2, s1, c1.
  • 33. Running the algorithm on the example
    - Candidate hypotheses: l1, m2, and c1
    - New candidates:
      l1 & l1 (*)   l1 & l2 (^)   m2 & l1     c1 & l1
      l1 & m1       m2 & l2       c1 & l2
      l1 & m2       m2 & m1 (^)   ... etc.
      l1 & c1       m2 & m2 (*)
      l1 & c2       m2 & c1
      l1 & s1       ... etc.
      l1 & s2
      (*) duplicate   (^) inconsistent
  • 34. Running the algorithm on the example
    - Compute the performance of each new candidate.
    - Update Best-hypothesis to the best new candidate, e.g. Best-hypothesis = l1 & c2.
    - Now take the best k = 3 new candidates and continue generating new candidates:
      l1 & c2 & s1, l1 & c2 & s2, ... etc.
  • 35. Performance evaluation
    - The performance of a new candidate can be computed with an information-theoretic
      measure such as entropy:
      Performance(h, examples, class)
        h_examples = the subset of examples covered by h
        Return -Entropy(h_examples)
      (lower entropy means purer coverage, hence a higher score)
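A minimal sketch of the entropy-based score, assuming Performance returns the negated entropy of the class labels of the examples covered by h (so purer coverage scores higher):

```python
# Sketch of the entropy-based performance measure: a candidate rule is
# scored by the negated entropy of the class distribution among the
# examples it covers.

from collections import Counter
from math import log2

def performance(covered_classes):
    """-Entropy of the class labels of the covered examples."""
    n = len(covered_classes)
    counts = Counter(covered_classes)
    return sum((c / n) * log2(c / n) for c in counts.values())  # equals -entropy

print(performance(["B", "B", "B", "B"]))            # 0.0: pure coverage, best score
print(round(performance(["B", "B", "A", "A"]), 2))  # -1.0: maximally mixed
```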
  • 36. Considerations
    - The best hypothesis is the hypothesis with the highest performance value, not
      necessarily the last one generated. For example, over the search space
        l1, m2, c1
        l1&m2, l2&s1, m2&c1
        l1&m2&s1, l2&s1&c2, m2&c1&s2
        ...
      the best hypothesis could be l2&s1.
  • 37. Variations
    - What happens if the proportion of examples of a class is low, i.e. if the a priori
      probability of that class is very low? Example: patients with a very rare disease.
      In that case we can modify the algorithm to learn only from those rare examples,
      and classify anything not covered by the rule set as negative.
  • 38. Variations
    - A second variation is used in the popular AQ and CN2 algorithms. General idea:
      - Choose one positive seed example.
      - Look for the most specific rule that covers the seed and has high performance.
      - Repeat with another seed example until no further improvement is seen in the rule set.
  • 39. Variations: each seed anchors its own rule (seed1 -> Rule 1, seed2 -> Rule 2).
  • 40. Final points for consideration
    - Search can proceed general-to-specific, but also specific-to-general. Which is best?
    - Here we use a generate-then-test strategy. How about an example-driven strategy
      like the candidate elimination algorithm? (The latter is more easily fooled by
      noise in the data.)
    - When and how should we prune rules?
    - Different performance metrics exist: relative frequency, accuracy, entropy.
  • 41. Rule learning and decision trees
    - What is the difference between the two?
    - Decision trees: divide and conquer.
    - Rule learning: separate and conquer.
