Probabilistic Soft Logic (PSL) is a framework for modeling relational data with soft constraints. PSL uses soft truth values between 0 and 1 instead of Boolean values. Rules in PSL are weighted and define a probability distribution over interpretations. The Lukasiewicz t-norm and co-norm are used to measure the degree to which rules are satisfied or violated under an interpretation. This allows PSL to compute the most probable interpretation that best satisfies the rules.
This document summarizes a research paper on asymptotic attractivity results for functional differential equations in Banach algebras.
The paper proves the existence of locally asymptotically attractive solutions and asymptotic stability for nonlinear functional differential equations in Banach algebras. It establishes conditions under which the solutions are uniformly locally asymptotically attractive on a given interval.
The paper represents functional differential equations as integral equations and uses a fixed point theorem to prove the existence of solutions. It then shows that any two solutions converge uniformly, demonstrating the local asymptotic attractivity of the solutions.
Congruence Lattices of Isoform Lattices (IOSR Journals)
This document discusses isoform lattices and congruence lattices. It begins by defining isoform congruences and isoform lattices. Every finite distributive lattice D can be represented as the congruence lattice of a finite isoform lattice. A new lattice construction called N(A,B,θ) is introduced, where A is a finite bounded lattice, B is a finite lattice with a discrete transitive congruence θ, and N(A,B,θ) is their pruned direct product. It is proved that N(A,B,θ) is a lattice. The document then discusses the congruences of N(A,B,θ) and proves the main theorem.
Metalogic: The non-algorithmic side of the mind (Haskell Lambda)
The document discusses metalogic, which is the study of metatheories of logic. It defines metalogic and contrasts it with logic. It then discusses classical and quantum metalanguages, Tarski's Convention T, and Gödel's incompleteness theorems. The key points are:
1) Metalogic studies the properties of logical systems themselves, not arguments within a system like logic does.
2) A quantum metalanguage assigns degrees of certainty to assertions rather than treating them classically.
3) Tarski's Convention T relates sentences to their truth values, and this is generalized to Convention PT for quantum logic.
4) Gödel's incompleteness theorems
Last time we talked about propositional logic, a logic on simple statements.
This time we will talk about first order logic, a logic on quantified statements.
First order logic is much more expressive than propositional logic.
The topics on first order logic are:
1-Quantifiers
2-Negation
3-Multiple quantifiers
4-Arguments of quantified statements
This document discusses using deductive reasoning to verify conjectures. It introduces the law of detachment and law of syllogism as valid forms of deductive reasoning. Examples show applying these laws to determine if conjectures are valid based on given conditional statements. The document also contains warm up questions, vocabulary, example problems, and a lesson quiz to assess understanding.
This document discusses inference rules in first-order logic, specifically resolution. It defines resolution in propositional logic and first-order logic, and proves it is a sound and complete proof procedure. It also discusses how to use resolution to answer true/false and fill-in-the-blank questions by adding clauses representing the question and searching for resolutions that prove or answer the question.
This document introduces the concept of fuzzy sets, which are classes of objects that have a continuum of grades of membership rather than crisp criteria for inclusion. Fuzzy sets are characterized by membership functions that assign each object a value between 0 and 1 indicating its grade of membership. Operations like union, intersection, and complementation are extended from ordinary sets to fuzzy sets. Basic properties of these operations are established, including identities analogous to those for ordinary sets. Fuzzy sets provide a framework for dealing with imprecisely defined classes that play an important role in human thinking and domains like pattern recognition.
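The fuzzy set operations described above (pointwise max for union, min for intersection, one-minus for complement) can be sketched in a few lines of Python; the sets `tall` and `strong` and their membership grades are made up for illustration:

```python
def fuzzy_union(a, b):
    """Union of fuzzy sets: pointwise maximum of membership grades."""
    return {x: max(a.get(x, 0.0), b.get(x, 0.0)) for x in set(a) | set(b)}

def fuzzy_intersection(a, b):
    """Intersection of fuzzy sets: pointwise minimum of membership grades."""
    return {x: min(a.get(x, 0.0), b.get(x, 0.0)) for x in set(a) | set(b)}

def fuzzy_complement(a):
    """Complement of a fuzzy set: one minus each membership grade."""
    return {x: 1.0 - m for x, m in a.items()}

# Illustrative membership functions over a two-element universe.
tall = {"alice": 0.8, "bob": 0.3}
strong = {"alice": 0.6, "bob": 0.9}

print(fuzzy_union(tall, strong)["bob"])         # 0.9
print(fuzzy_intersection(tall, strong)["alice"])  # 0.6
```

Note how these reduce to the ordinary set operations when every grade is exactly 0 or 1.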
Propositional logic is a good vehicle to introduce basic properties of logic (pendragon6626)
Propositional logic uses symbols and logical connectives to evaluate the validity of compound statements based on the validity of atomic statements. Natural deduction and resolution are deductive systems that use inference rules to prove statements. Natural deduction is sound and complete, and resolution is sound and refutation-complete. Propositional resolution can check validity by constructing a refutation tree, and linear resolution with Horn clauses performs this task efficiently, as in the logic programming language Prolog.
This document provides an overview of logic, proofs, and their applications. It begins with definitions of logic and logical operations like conjunction, disjunction, and negation. It then discusses different types of proofs like direct proofs, proof by contradiction, and proof by induction. Examples are provided to illustrate logical operations and different proof techniques. The document concludes by discussing two applications of logic and proofs - translating English sentences into logical statements and performing Boolean searches.
This document discusses simulations of multinomial randomized response models. It describes how the Warner randomized response model was extended to allow for multiple mutually exclusive categories. The key points are:
1) Abul-Ela et al. (1967) defined the multinomial randomized response model which allows sampling of multiple groups during a single experiment.
2) Simulations of the trinomial randomized response model are discussed, including calculating variances, biases, and mean squared errors to compare to direct response models.
3) Optimal values of the randomization probabilities (p-values) need to be determined to minimize variance while maximizing compliance for the trinomial model.
1) The document discusses the limiting behavior of the "probability of claiming superiority" (PST) in Bayesian clinical trials as the sample size increases.
2) The main result is that under certain conditions, the PST (also called average power) converges to the prior probability that the alternative hypothesis is true.
3) The two key assumptions for this limiting result are: 1) the posterior distribution is "π-consistent", and 2) the prior probability of the boundary of the alternative hypothesis set is zero.
The document discusses knowledge representation using propositional logic and predicate logic. It begins by explaining the syntax and semantics of propositional logic for representing problems as logical theorems to prove. Predicate logic is then introduced as being more versatile than propositional logic for representing knowledge, as it allows quantifiers and relations between objects. Examples are provided to demonstrate how predicate logic can formally represent statements involving universal and existential quantification.
A Bayesian Networks Analysis of the Duhem-Quine Thesis.pdf (Liz Adams)
This document analyzes whether Bayesian network analysis can address the Duhem-Quine thesis that there is no objective way to revise scientific theories when confronted with disconfirming evidence. It summarizes the Bayesian approach proposed by some and the assumptions required. It then shows through theorems and examples that changing the probabilistic relationships between variables in different networks while relaxing certain assumptions can lead to network dependent disconfirmation, where the recommended revision changes. Specifically, relaxing independence or probability assumptions between the theory, auxiliary beliefs, and evidence can change which belief is recommended to reject given disconfirming evidence.
This document summarizes restrictions on sharing and reproducing an article from a journal published by Elsevier. The attached copy can be used by the author for non-commercial research and education purposes, including instruction and sharing with colleagues. Other uses like reproduction, distribution, selling, licensing copies or posting on websites are prohibited without permission. Authors are allowed to post their version of the article to their personal or institutional websites or repositories. The document provides a link for more information on Elsevier's archiving and manuscript policies.
This document provides an overview of simple linear regression and correlation analysis. It defines regression as estimating the relationship between two variables and correlation as measuring the strength and direction of that relationship. The key points covered include:
- Regression finds an estimating equation to relate known and unknown variables. Correlation determines how well that equation fits the data.
- Pearson's correlation coefficient r measures the linear relationship between two variables on a scale from -1 to 1.
- The coefficient of determination r² indicates what percentage of variation in the dependent variable is explained by the independent variable.
- Statistical tests can evaluate whether a correlation is statistically significant or could be due to chance.
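A minimal sketch of computing Pearson's r and the coefficient of determination r² from scratch; the sample data is invented for illustration:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # Covariance numerator and the two standard-deviation terms.
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

xs = [1, 2, 3, 4, 5]
ys = [2, 4, 5, 4, 5]
r = pearson_r(xs, ys)
print(round(r, 4), round(r * r, 4))  # r ≈ 0.7746, so r² ≈ 0.6
```

Here r² ≈ 0.6 means about 60% of the variation in y is explained by the linear relationship with x.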
Fractional Newton-Raphson Method and Some Variants for the Solution of Nonlin... (mathsjournal)
The following document presents some novel numerical methods, valid for one and several variables, which, using the fractional derivative, allow us to find solutions for some nonlinear systems in the complex space using real initial conditions. The origin of these methods is the fractional Newton-Raphson method, but unlike the latter, the orders proposed here for the fractional derivatives are functions. In the first method, a function is used to guarantee an order of convergence (at least) quadratic; in the other, a function is used to avoid the discontinuity that is generated when the fractional derivative of constants is used, and with this the method has an order of convergence (at least) linear.
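For context, the classical (integer-order) Newton-Raphson iteration that these fractional variants generalize can be sketched as follows; the target function, tolerance, and starting point are illustrative choices, not taken from the paper:

```python
def newton_raphson(f, df, x0, tol=1e-12, max_iter=50):
    """Classical Newton-Raphson: iterate x_{k+1} = x_k - f(x_k)/f'(x_k)."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:  # residual small enough: accept the root
            return x
        x -= fx / df(x)
    return x

# Find sqrt(2) as the positive root of f(x) = x^2 - 2.
root = newton_raphson(lambda x: x * x - 2.0, lambda x: 2.0 * x, 1.0)
print(root)  # ≈ 1.4142135623731
```

The fractional methods in the paper replace the first derivative here with a fractional derivative whose order varies, which is what lets real starting points reach complex roots.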
An Improved Dempster-Shafer Algorithm for Resolving Conflicting Events (Gauravv Prabhu)
This document proposes an improved Dempster-Shafer algorithm to resolve conflicting evidence. It begins with an overview of Dempster-Shafer theory and identifies shortcomings when evidence conflicts. It then presents a new method to verify and modify conflicting evidence before combination. Experiments show the new method improves reliability by modifying conflicting evidence and producing more intuitive combination results, even when evidence highly conflicts.
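As background to the combination step the paper modifies, here is a minimal sketch of Dempster's classical rule of combination; the focal elements and mass values are invented for illustration:

```python
def dempster_combine(m1, m2):
    """Dempster's rule: combine two mass functions keyed by frozenset focal elements.

    K (the conflict mass assigned to the empty intersection) is what grows
    under conflicting evidence; renormalizing by 1 - K is the step that can
    produce counterintuitive results when K is large."""
    combined = {}
    conflict = 0.0
    for b, mb in m1.items():
        for c, mc in m2.items():
            inter = b & c
            if inter:
                combined[inter] = combined.get(inter, 0.0) + mb * mc
            else:
                conflict += mb * mc
    if conflict >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    return {a: m / (1.0 - conflict) for a, m in combined.items()}

A, B = frozenset({"a"}), frozenset({"b"})
m1 = {A: 0.9, B: 0.1}
m2 = {A: 0.8, B: 0.2}
res = dempster_combine(m1, m2)
print(round(res[A], 4), round(res[B], 4))  # ≈ 0.973 0.027
```

The proposed algorithm pre-processes such conflicting mass functions before this combination step rather than changing the rule itself.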
The document provides an introduction to formal logic. It discusses how to formulate valid arguments through propositional logic and syllogistic logic. Propositional logic uses truth tables to evaluate combinations of propositions and operators like negation and conjunction. Syllogistic logic examines implications of general statements using domains and categories. The key rules of inference for valid arguments are hypothetical syllogism, modus ponens, and modus tollens.
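The truth-table notion of validity mentioned above can be checked mechanically by enumerating all truth assignments; a small sketch contrasting modus ponens with the invalid form of affirming the consequent:

```python
from itertools import product

def implies(p, q):
    """Material implication: p -> q is false only when p is true and q is false."""
    return (not p) or q

def is_valid(premises, conclusion):
    """An argument form is valid iff the conclusion holds in every
    truth assignment that makes all premises true."""
    return all(conclusion(*row)
               for row in product([True, False], repeat=2)
               if all(prem(*row) for prem in premises))

# Modus ponens: from (p -> q) and p, infer q.
modus_ponens = is_valid([lambda p, q: implies(p, q), lambda p, q: p],
                        lambda p, q: q)
# Affirming the consequent: from (p -> q) and q, infer p.
affirming = is_valid([lambda p, q: implies(p, q), lambda p, q: q],
                     lambda p, q: p)
print(modus_ponens, affirming)  # True False
```

The counterexample row for the invalid form is p = False, q = True: both premises hold but the conclusion fails.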
The document discusses Gödel's incompleteness theorems and Hilbert's program. It provides background on key figures like Hilbert, Gödel, Russell and Cantor. It then explains Hilbert's program to formalize all of mathematics and prove its consistency. Gödel showed that any effectively axiomatized theory capable of expressing elementary arithmetic cannot be both consistent and complete. Specifically, for any such formal theory T including basic arithmetic truths, T proves its own consistency if and only if T is inconsistent.
This document provides an introduction to First Order Predicate Logic (FOPL). It discusses the differences between propositional logic and FOPL, the parts and syntax of FOPL including terms, atomic sentences, quantifiers and rules of inference. The semantics of FOPL are also explained. Pros and cons are provided, such as FOPL's ability to represent individual entities and generalizations compared to propositional logic. Applications include using FOPL as a framework for formulating theories.
Keywords: Iteration Function, Order of Convergence, Fractional Derivative.
This document provides lecture notes on hypothesis testing. It begins with an introduction to hypothesis testing and how it differs from estimation in its hypothetical reasoning approach. It then discusses Fisher's significance testing approach, including defining a test statistic, its sampling distribution under the null hypothesis, and calculating a p-value. It provides examples of applying this approach. Finally, it discusses some weaknesses of Fisher's approach identified by Neyman and Pearson and how their approach improved upon it by introducing the concept of alternative hypotheses and pre-data error probabilities.
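Fisher's recipe described above (test statistic, its null sampling distribution, p-value) can be sketched for a z statistic; the choice of a two-sided standard-normal test is an illustrative assumption, not taken from the notes:

```python
import math

def normal_cdf(z):
    """Standard normal CDF, computed via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def two_sided_p_value(z):
    """p-value: probability, under the null hypothesis, of a test
    statistic at least as extreme as the one observed."""
    return 2.0 * (1.0 - normal_cdf(abs(z)))

print(round(two_sided_p_value(1.96), 3))  # 0.05
```

The Neyman-Pearson refinement would fix this threshold (e.g. α = 0.05) before seeing the data and pair it with an explicit alternative hypothesis.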
Logicians sometimes talk about sentences being “true but unprovable.” What does this mean? This presentation includes a fairly thorough introduction to mathematical logic.
This document discusses mathematical concepts related to relations including:
1. The inverse of a relation, written R⁻¹, which relates elements in the opposite direction from R.
2. The composition of two relations R and S, denoted R◦S or RS, which relates a to c whenever a is related to some b by R and b is related to c by S.
3. Matrices can represent relations and be used to calculate their composition.
4. A partial order relation on a set A is a relation that is reflexive, antisymmetric, and transitive. Examples of partial order relations include set inclusion and the less than or equal to relation on real numbers.
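A small sketch of points 2 to 4: relations as Boolean matrices, composition as a Boolean matrix product, and a check of the partial order properties. The chosen relation (≤ on {0, 1, 2}) is an illustrative example:

```python
def compose(M_R, M_S):
    """Boolean matrix product: (R composed with S)[i][j] holds iff
    some k has R[i][k] and S[k][j]."""
    n = len(M_R)
    return [[any(M_R[i][k] and M_S[k][j] for k in range(n))
             for j in range(n)] for i in range(n)]

def is_partial_order(M):
    """Check reflexivity, antisymmetry, and transitivity of a relation matrix."""
    n = len(M)
    reflexive = all(M[i][i] for i in range(n))
    antisymmetric = all(not (M[i][j] and M[j][i])
                        for i in range(n) for j in range(n) if i != j)
    transitive = all(M[i][j] or not any(M[i][k] and M[k][j] for k in range(n))
                     for i in range(n) for j in range(n))
    return reflexive and antisymmetric and transitive

# "less than or equal to" on {0, 1, 2} is a partial order.
leq = [[i <= j for j in range(3)] for i in range(3)]
print(is_partial_order(leq))  # True
```

Composing ≤ with itself returns ≤ again, which is exactly what transitivity plus reflexivity guarantee.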
This document provides an overview of one-dimensional random variables including definitions, types (discrete vs continuous), and probability distributions. It defines a random variable as a function that assigns a numerical value to each outcome of a random experiment. Random variables can be either discrete, taking on countable values, or continuous, assuming any value in an interval. The probability distribution of a discrete random variable is defined by a probability mass function, while a continuous random variable has a probability density function. Examples are given of both types of random variables and their distributions.
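The defining property of a probability mass function can be illustrated with a fair six-sided die; the example is an assumed one, not taken from the document:

```python
from fractions import Fraction

# pmf of a fair six-sided die: a discrete random variable taking
# the values 1..6, each with probability 1/6.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

# A pmf must assign non-negative mass summing to exactly one.
assert sum(pmf.values()) == 1

# Expected value: weighted sum of values by their probabilities.
expectation = sum(x * p for x, p in pmf.items())
print(expectation)  # 7/2
```

A continuous random variable would instead be described by a density integrating to one over its interval.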
Oscillation and Convergence Properties of Second Order Nonlinear Neutral Dela... (inventionjournals)
In this paper, we consider second order nonlinear neutral delay difference equations of a given form. We establish sufficient conditions which ensure that every solution is either oscillatory or tends to zero. We also give examples to illustrate our results.
2. Probabilistic Soft Logic

• Declarative language for relational probabilistic models
• Syntax based on first-order logic
• Soft truth values via Lukasiewicz t-norm
• Resulting probability distribution computed using distance to satisfaction

PSL has been applied in a variety of domains, including collective classification [3], ontology alignment [4], personalized medicine [2], opinion diffusion [1], trust in social networks [7], and graph summarization [8]. In the following, we provide an overview of the PSL modeling language and its efficient algorithms for most probable explanation and marginal inference.

A PSL program consists of a set of first order logic rules with conjunctive bodies and single literal heads. Rules are labeled with non-negative weights. The following example program encodes a simple model to predict voter behavior in a social network with two types of links denoting friend and spouse relationships:

0.3 : friend(B, A) ∧ votesFor(A, P) ⇒ votesFor(B, P)    (1)
0.8 : spouse(B, A) ∧ votesFor(A, P) ⇒ votesFor(B, P)    (2)

Consider concrete persons a and b and a party p instantiating the logical variables A, B, and P, respectively. The first rule states that if a is a friend of b, there is a chance that b votes for p as well, whereas the second makes the same statement for spouses. The rule weights indicate that spouses are more likely to vote for the same party than friends.

Semantics. While PSL shares the syntax of its rules with first order logic, it uses soft truth values from the interval [0, 1] instead of the extremes 0 (false) and 1 (true) only. Given a set of atoms ℓ = {ℓ1, . . . , ℓn}, we call the mapping I : ℓ → [0, 1]^n from atoms to soft truth values an interpretation. PSL defines a probability distribution over interpretations that makes those satisfying more ground rule instances more probable. In the example above, we prefer interpretations where a person's vote agrees with that of many friends, that is, satisfies many groundings of Rule (1), and, in case of a tradeoff between a friend and a spouse, agreement with the spouse is preferred due to the higher weight of Rule (2).

To determine the degree to which a ground rule is satisfied, PSL uses the Lukasiewicz t-norm and its corresponding co-norm as the relaxation of logical AND and OR, respectively. These relaxations are exact at the extremes, but provide a consistent mapping for values in-between. Given an interpretation I, the relaxations of logical conjunction (∧), disjunction (∨), and negation (¬) are as follows:

ℓ1 ∧̃ ℓ2 = max{0, I(ℓ1) + I(ℓ2) − 1},
ℓ1 ∨̃ ℓ2 = min{I(ℓ1) + I(ℓ2), 1},
¬̃ ℓ1 = 1 − I(ℓ1),

where we use ˜ to indicate the relaxation from the Boolean domain. For a ground PSL rule r ≡ rbody ⇒ rhead ≡ ¬̃ rbody ∨̃ rhead, an interpretation I over the atoms in r determines whether r is satisfied and, if not, its distance to satisfaction. Abusing notation, we expand the usage of I to logical formulas: the truth value of a formula is obtained by applying the above definitions of the logical operators, starting from the truth values of atoms as specified by I. Given I, a rule r is satisfied, i.e., I(r) = 1, if and only if I(rbody) ≤ I(rhead), that is, the head has at least the same truth value as the body. Again, this coincides with the usual definition of satisfaction of a rule when truth values are restricted to 0 and 1. The rule's distance to satisfaction under interpretation I then measures the degree to which this condition is violated:

dr(I) = max{0, I(rbody) − I(rhead)}.    (3)

For instance, consider the interpretation I = {spouse(b, a) ↦ 1, votesFor(a, p) ↦ 0.9, votesFor(b, p) ↦ 0.3}, and let r be the corresponding ground instance of Rule (2) above. We get I(rbody) = max{0, 1 + 0.9 − 1} = 0.9 and therefore dr(I) = max{0, 0.9 − 0.3} = 0.6, whereas the distance would be 0.9 if the head had truth value 0.

Given a set of ground atoms ℓ of interest, a PSL program induces a probability distribution over interpretations. Let R be the set of all ground rules that only mention atoms in ℓ. The probability density function f over interpretations I is then

f(I) = (1/Z) exp[ −Σ_{r∈R} λr (dr(I))^p ],    (4)

where λr is the weight of rule r, Z is the normalization constant, and p ∈ {1, 2}.

* Also at KU Leuven, Belgium.
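The operator relaxations and the worked example above translate directly into a few lines of Python. This is a minimal sketch for illustration, not the PSL implementation; the helper names (`land`, `lor`, `lnot`, `distance`) are ours:

```python
import math

def land(x, y):
    """Lukasiewicz t-norm: soft conjunction."""
    return max(0.0, x + y - 1.0)

def lor(x, y):
    """Lukasiewicz co-norm: soft disjunction."""
    return min(x + y, 1.0)

def lnot(x):
    """Lukasiewicz negation."""
    return 1.0 - x

def distance(body, head):
    """Distance to satisfaction, d_r(I) = max{0, I(body) - I(head)} (Eq. 3)."""
    return max(0.0, body - head)

# Worked example: the ground instance of Rule (2),
# spouse(b, a) AND votesFor(a, p) => votesFor(b, p).
I = {"spouse(b,a)": 1.0, "votesFor(a,p)": 0.9, "votesFor(b,p)": 0.3}
body = land(I["spouse(b,a)"], I["votesFor(a,p)"])  # ≈ 0.9
d = distance(body, I["votesFor(b,p)"])             # ≈ 0.6

# This rule's (unnormalized) contribution to the density of Eq. (4),
# using its weight 0.8 and a linear penalty (p = 1).
unnorm = math.exp(-0.8 * d)
```

Evaluating the rule itself gives I(r) = lor(lnot(body), head) = min(1 − body + head, 1), so d = 1 − I(r), consistent with Eq. (3).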
3. Trust Modeling with PSL
TRUSTS(A, B) ∧ TRUSTS(B, C) ⇒ TRUSTS(A, C),
TRUSTS(A, B) ∧ ¬TRUSTS(B, C) ⇒ ¬TRUSTS(A, C),
¬TRUSTS(A, B) ∧ ¬TRUSTS(B, C) ⇒ TRUSTS(A, C),
TRUSTS(A, B) ∧ TRUSTS(A, C) ⇒ TRUSTS(B, C),
TRUSTS(A, C) ∧ TRUSTS(B, C) ⇒ TRUSTS(A, B),
TRUSTS(A, B) ⇒ TRUSTS(B, A),
¬TRUSTS(A, B) ⇒ ¬TRUSTS(B, A).
Figure 1: Rules for PSL model of triadic closure (PSL-Triadic). Triadic closure implies the transitivity of trust, such that
individuals tend to determine whom to trust based on the opinions of those they trust.
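To see how such rules behave under soft truth values, the following sketch (hypothetical trust scores, not data from the paper) grounds the first transitivity rule, TRUSTS(A, B) ∧ TRUSTS(B, C) ⇒ TRUSTS(A, C), over a toy network and reports each violated grounding's distance to satisfaction:

```python
from itertools import permutations

def land(x, y):
    """Lukasiewicz t-norm (soft AND)."""
    return max(0.0, x + y - 1.0)

# Hypothetical soft trust scores; pairs not listed default to 0
# under the closed-world assumption.
trusts = {("alice", "bob"): 0.9, ("bob", "carol"): 0.8, ("alice", "carol"): 0.4}
people = {"alice", "bob", "carol"}

def I(a, b):
    return trusts.get((a, b), 0.0)

# Ground the rule over all ordered triples of distinct people and
# record each violated grounding's distance to satisfaction.
violations = {}
for a, b, c in permutations(people, 3):
    d = max(0.0, land(I(a, b), I(b, c)) - I(a, c))
    if d > 0.0:
        violations[(a, b, c)] = d
        print(f"{a}->{b}->{c}: distance {d:.2f}")
```

Here only the grounding alice → bob → carol is violated (distance ≈ 0.3): alice trusts bob and bob trusts carol strongly, so during inference the rule pushes alice's trust in carol upward.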
4. Trust Modeling with PSL
TRUSTS(A, B) ⇒ TRUSTING(A),
¬TRUSTS(A, B) ⇒ ¬TRUSTING(A),
TRUSTS(A, B) ⇒ TRUSTWORTHY(B),
¬TRUSTS(A, B) ⇒ ¬TRUSTWORTHY(B),
TRUSTING(A) ∧ TRUSTWORTHY(B) ⇒ TRUSTS(A, B),
¬TRUSTING(A) ∧ ¬TRUSTWORTHY(B) ⇒ ¬TRUSTS(A, B),
TRUSTING(A) ⇒ TRUSTS(A, B),
¬TRUSTING(A) ⇒ ¬TRUSTS(A, B),
TRUSTWORTHY(B) ⇒ TRUSTS(A, B),
¬TRUSTWORTHY(B) ⇒ ¬TRUSTS(A, B).
Figure 2: Rules for PSL model of basic personality (PSL-Personality). This model maintains predicates for whether users
are trusting or trustworthy, and uses these predicates to determine each pairwise trust. Trusting individuals are more prone
to offer trust, while trustworthy individuals are more prone to receive trust.
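A ground rule is fully satisfied only when the head's truth value is at least the body's, so the rule TRUSTING(A) ∧ TRUSTWORTHY(B) ⇒ TRUSTS(A, B) acts as a soft lower bound on each pairwise trust value: the Lukasiewicz conjunction of the two personality scores. A minimal sketch with hypothetical scores (not data from the paper):

```python
def land(x, y):
    """Lukasiewicz t-norm (soft AND)."""
    return max(0.0, x + y - 1.0)

# Hypothetical latent personality scores.
trusting = {"alice": 0.9, "bob": 0.2}
trustworthy = {"alice": 0.6, "bob": 0.8}

# TRUSTING(A) AND TRUSTWORTHY(B) => TRUSTS(A,B): the body's truth value
# is the soft lower bound a fully satisfying interpretation places on
# the head.
bounds = {}
for a in trusting:
    for b in trustworthy:
        if a != b:
            bounds[(a, b)] = land(trusting[a], trustworthy[b])
            print(f"TRUSTS({a},{b}) >= {bounds[(a, b)]:.1f}")
```

Because bob's trusting score is low, this rule places no pressure on TRUSTS(bob, alice) (bound 0.0), while TRUSTS(alice, bob) is pushed toward 0.7; the negated rule plays the matching role as an upper bound.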
SAMETRAITS(A, B) ⇒ TRUSTS(A, B),
¬SAMETRAITS(A, B) ⇒ ¬TRUSTS(A, B),
TRUSTS(A, B) ∧ SAMETRAITS(B, C) ⇒ TRUSTS(A, C),
Note that in the above example, we include the false (0.0) NEGATIVE predicate for completeness, though PSL uses a closed-world assumption, so in practice one does not need to enumerate false statements.

6. Group Modeling with PSL
The previously defined predicates will be fully observed in our experimental setup. We also reason about (mostly) unobserved, latent predicates, which will be inferred. The latent group affiliations are represented by the predicate MEMBEROF(U, G), which indicates that user U is a member of group G. We additionally model group sentiment toward topics by inferring predicates LIKES(G, T) and DISLIKES(G, T), which encode group G's attitude toward tag T.
From these predicates, we write rules that encode the ideas that: (1) users that message one another are likely to share group memberships, and (2) members of a group share common sentiment toward topics. The following rules encode the propagation of group affiliations through messages:

MEMBEROF(A, G) ∧ POSTED(A, P) ∧ MESSAGETO(P, B) ∧ POSITIVE(P) ⇒ MEMBEROF(B, G)
MEMBEROF(A, G) ∧ POSTED(B, P) ∧ MESSAGETO(P, A) ∧ POSITIVE(P) ⇒ MEMBEROF(B, G).

We include the POSITIVE predicate to filter out negative messages from this rule, since users who message each other with negative sentiment may be attacking one another, and thus are unlikely to share group affiliations.

The following rules encode the shared sentiment within groups:

POSTED(U, P) ∧ TAGGED(P, T) ∧ POSITIVE(T) ∧ LIKES(G, T) ⇒ MEMBEROF(U, G)
POSTED(U, P) ∧ TAGGED(P, T) ∧ NEGATIVE(T) ∧ DISLIKES(G, T) ⇒ MEMBEROF(U, G).

Since the group sentiment is also latent, we include the conceptual inverse to the above rules, which attributes the sentiment of posts by group members to the group's own sentiment. These rules allow this model to collectively infer group sentiment and affiliation:

MEMBEROF(A, G) ∧ POSTED(A, P) ∧ TAGGED(P, T) ∧ POSITIVE(P) ⇒ LIKES(G, T)
MEMBEROF(A, G) ∧ POSTED(A, P) ∧ TAGGED(P, T) ∧ NEGATIVE(P) ⇒ DISLIKES(G, T).

To enforce consistency in group sentiment, we constrain the truth values of LIKES(G, T) and DISLIKES(G, T) for any group G and tag T to sum to no more than 1.0, which in effect prevents both from being true. We additionally constrain the group memberships of any individual user to sum to no more than 1.0, such that a user can only fully belong to one group. This last constraint is not always appropriate, depending on the types of groups being considered, but it applies intuitively to the groups we consider in our experiments.

In our experiments, we weight each of these rules uniformly with weight 1.0. In settings where fully-labeled training data is available, we can learn ideal weights for particular data sources. To make predictions with this model, we seed inference with a small set of group affiliations and group sentiment information. The next section describes the application of the model described here to real social media data sets.
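The effect of these sum constraints can be pictured with a small sketch. The `cap_sum` helper below is purely illustrative (PSL enforces such linear constraints directly during inference, not by post-hoc rescaling):

```python
def cap_sum(values, limit=1.0):
    """Rescale nonnegative truth values so they sum to at most `limit`."""
    total = sum(values)
    if total <= limit:
        return list(values)
    return [v * limit / total for v in values]

# Sentiment for one hypothetical (group, tag) pair: LIKES and DISLIKES
# may not sum past 1.0, so both cannot be fully true at once.
likes, dislikes = cap_sum([0.8, 0.6])
assert likes + dislikes <= 1.0 + 1e-9

# Membership of one hypothetical user across three groups: capping the
# sum at 1.0 means the user can fully belong to at most one group.
memberships = cap_sum([0.5, 0.4, 0.3])
```

Values already within the cap pass through unchanged; only over-budget assignments are scaled back onto the feasible region.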
7. Olympic Soccer Final

Figure 1: Heat maps indicating the concentration of geotagged tweets from users predicted by PSL. (a) Mexico Group Heat Map; (b) Brazil Group Heat Map.
8. Venezuelan Election

Figure 2: Hashtag clouds for the predicted LIKES predicate. The font size is scaled according to the truth value of the inferred, latent LIKES predicate. (a) Mexico Group Preferred Hashtags; (b) Brazil Group Preferred Hashtags; (c) Chávez Supporter Preferred Hashtags; (d) Capriles Supporter Preferred Hashtags.