This document provides an introduction to probability theory. It defines key concepts like random experiments, sample spaces, events, and probabilities. Random experiments are experiments with unpredictable outcomes but a known set of possible results. The sample space is the set of all possible outcomes. Events are subsets of outcomes. Probabilities assign a numerical value between 0 and 1 to events, representing the likelihood they will occur. Two common methods to assign probabilities are the classical method, which uses equally likely outcomes, and the relative frequency method, which uses the limit of observed frequencies over many trials. Probability theory models and studies random processes, while statistics draws inferences from random process data.
This document provides an overview of key concepts in probability theory:
- An experiment yields outcomes; the set of all possible outcomes is called the sample space. Events are subsets of outcomes. Random variables assign values to outcomes.
- Probability is a measure of certainty that an event will occur, ranging from 0 (impossible) to 1 (certain). It can be defined in different ways.
- The frequentist definition is the limit of relative frequencies of an event over many trials. The Bayesian definition is a degree of belief in an event. The Laplacian definition assumes all outcomes are equally likely initially.
- Examples demonstrate random variables, events, and calculating probabilities based on the sample space and outcomes of an experiment. Key terms such as sample space and event are defined.
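The frequentist definition above can be illustrated with a short simulation. This is a minimal sketch, not from the summarized documents; the function name `relative_frequency`, the fair-coin model, and the fixed seed are our own illustrative assumptions:

```python
import random

def relative_frequency(trials: int, seed: int = 42) -> float:
    """Estimate P(heads) for a fair coin as the relative frequency
    of heads over `trials` simulated flips."""
    rng = random.Random(seed)
    heads = sum(rng.random() < 0.5 for _ in range(trials))
    return heads / trials

# The estimate settles near the true value 0.5 as the number of trials grows,
# which is the intuition behind the frequentist limit.
for n in (100, 10_000, 1_000_000):
    print(n, relative_frequency(n))
```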
This document provides an introduction to probability. It defines probability as a measure of how likely an event is to occur. Probability is expressed as a ratio of favorable outcomes to total possible outcomes. The key terms used in probability are defined, including event, outcome, sample space, and elementary events. The theoretical approach to probability is discussed, where probability is predicted without performing the experiment. Random experiments are described as those that may not produce the same outcome each time. Laws of probability are presented, such as a probability being between 0 and 1. Applications of probability in everyday life are mentioned, such as reliability testing of products. Two example probability problems are worked out.
The axiomatic power of Kolmogorov complexity (lbienven)
1. The document discusses random axioms and probabilistic proofs in Peano arithmetic. It describes a proof strategy where one could randomly select an integer n that satisfies some formula φ and add it as a new axiom.
2. While this intuition of probabilistic proofs makes sense, it is not really useful since any statement provable with sufficiently high probability is already provable in PA. However, probabilistic proofs can be exponentially more concise than deterministic proofs.
3. The document also discusses Kolmogorov complexity and how statements about it relate to provability in PA. It can be shown that if C(x) is less than some value, PA will prove it, but PA will never prove a statement of the form C(x) > n for any n above a fixed threshold (Chaitin's incompleteness theorem).
This document provides an overview of basic probability theory concepts including probability, mutually exclusive events, and independence. It discusses probability as a measure of likelihood between 0 and 1. Key concepts covered include interpretations of probability, the mathematical treatment including independent, conditional, and summary probabilities, and applications in areas like reliability and natural language processing. Mutually exclusive events are defined as events that cannot occur simultaneously, while independent events have probabilities that are unaffected by each other.
This document discusses basic concepts of probability, including:
- The addition rule and multiplication rule for calculating probabilities of compound events.
- Events can be disjoint (mutually exclusive) or not disjoint.
- The probabilities of an event and its complement must sum to 1.
- How to calculate the probability of at least one occurrence of an event using the complement.
- When applying the multiplication rule, you must consider whether events are independent or dependent.
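The "at least one" rule from the list above can be sketched in a few lines, assuming independent trials with constant probability; the helper name `p_at_least_one` is our own:

```python
def p_at_least_one(p_event: float, trials: int) -> float:
    """P(at least one occurrence in `trials` independent attempts),
    computed via the complement rule: 1 - P(no occurrences)."""
    return 1.0 - (1.0 - p_event) ** trials

# Example: probability of at least one six in four rolls of a fair die.
# P(no six in one roll) = 5/6, so P(at least one six) = 1 - (5/6)**4.
print(p_at_least_one(1/6, 4))
```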
Discrete probability distribution (complete) (ISYousafzai)
This document discusses discrete random variables. It begins by defining a random variable as a function that assigns a numerical value to each outcome of an experiment. There are two types of random variables: discrete and continuous. Discrete random variables have a countable set of possible values, while continuous variables can take any value within a range. Examples of discrete variables include the number of heads in a coin flip and the total value of dice. The document then discusses how to describe the probabilities associated with discrete random variables using lists, histograms, and probability mass functions.
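A probability mass function of the kind described above can be built by direct enumeration. A small sketch for the total of two fair dice, with variable names of our own choosing:

```python
from collections import Counter
from itertools import product

# Enumerate the 36 equally likely ordered outcomes of two fair dice
# and tabulate the probability mass function of their sum.
totals = [a + b for a, b in product(range(1, 7), repeat=2)]
pmf = {total: count / 36 for total, count in Counter(totals).items()}

for total in sorted(pmf):
    print(total, round(pmf[total], 4))

# A valid PMF assigns probabilities that sum to 1.
assert abs(sum(pmf.values()) - 1.0) < 1e-12
```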
This document describes a course on mathematical foundations for communication engineering. The course objectives are to develop understanding of probability theory, random variables, and sequences of random variables. Key topics covered include probability, random variables, functions of random variables, special distributions, sequences of random variables, and random processes. Assessment is based on assignments, tests, exams, attendance, and a final exam weighting 60%. The course aims to enable students to apply probability concepts in practical problems and communication systems analysis.
Continuous probability – Business Statistics, Management (Debjit Das)
This document discusses different types of continuous probability distributions including uniform, normal, and exponential distributions. It provides examples of how each distribution is used and defined mathematically. The normal distribution is described as the most important for describing continuous random variables. Real-world examples of when each distribution would be used are given, such as height, test scores, and time between events. Business applications like risk evaluation, sales forecasting, and manufacturing costs are also summarized. Finally, it emphasizes that probability is involved in many aspects of daily life beyond just academics.
This document discusses key concepts in probability theory including:
- Probability is a quantitative measure of uncertainty ranging from 0 to 1.
- Experiments produce outcomes that define events in a sample space.
- There are different approaches to determining probability (classical, relative frequency, subjective).
- Probability can be classified as marginal, union, joint, or conditional depending on the relationship between events.
- Common discrete and continuous probability distributions include the binomial, Poisson, and normal distributions.
The document discusses binomial, Poisson, and hypergeometric probability distributions. It provides examples of experiments that follow each distribution and how to calculate probabilities using the respective formulas. For binomial experiments, the probability of success must be constant on each trial and trials must be independent. Poisson experiments involve rare, independent events with a known average rate. Hypergeometric probabilities are used when the probability of success changes on each dependent trial, such as sampling without replacement.
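The binomial and Poisson formulas mentioned above can be written directly with standard-library functions; the function names below are our own, and the example parameters are illustrative:

```python
from math import comb, exp, factorial

def binomial_pmf(k: int, n: int, p: float) -> float:
    """P(X = k) for X ~ Binomial(n, p): n independent trials,
    constant success probability p on each trial."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

def poisson_pmf(k: int, lam: float) -> float:
    """P(X = k) for X ~ Poisson(lam): rare independent events
    occurring at a known average rate lam."""
    return exp(-lam) * lam**k / factorial(k)

print(binomial_pmf(3, 10, 0.5))  # exactly 3 heads in 10 fair coin flips
print(poisson_pmf(2, 4.0))       # exactly 2 events when the mean is 4
```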
This document provides an overview of key concepts related to random variables and probability distributions. It discusses:
- Two types of random variables - discrete and continuous. Discrete variables can take countable values, continuous can be any value in an interval.
- Probability distributions for discrete random variables, which specify the probability of each possible outcome. Examples of common discrete distributions like binomial and Poisson are provided.
- Key properties and calculations for discrete distributions like expected value, variance, and the formulas for binomial and Poisson probabilities.
- Other discrete distributions like hypergeometric are introduced for situations where outcomes are not independent. Examples are provided to demonstrate calculating probabilities for each type of distribution.
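The expected value and variance calculations listed above reduce to short sums over a discrete distribution. A sketch under our own naming, using a fair die as the worked example:

```python
def expected_value(dist: dict[int, float]) -> float:
    """E[X] = sum of x * P(X = x) over a discrete distribution."""
    return sum(x * p for x, p in dist.items())

def variance(dist: dict[int, float]) -> float:
    """Var(X) = E[X^2] - (E[X])^2."""
    mean = expected_value(dist)
    return sum(x**2 * p for x, p in dist.items()) - mean**2

# Distribution of a single fair die roll: each face has probability 1/6.
die = {x: 1/6 for x in range(1, 7)}
print(expected_value(die))  # 3.5
print(variance(die))        # 35/12, about 2.9167
```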
The following presentation is an introduction to Algebraic Methods – part one for level 4 Mathematics. This resource is part of the 2009/2010 Engineering (foundation degree, BEng and HN) courses from the University of Wales Newport (course codes H101, H691, H620, HH37 and 001H). It is one of the core modules for the full-time 1st year undergraduate programme.
The BEng & Foundation Degrees and HNC/D in Engineering are designed to meet the needs of employers by placing the emphasis on the theoretical, practical and vocational aspects of engineering within the workplace and beyond. Engineering is becoming more high profile, and therefore more in demand as a skill set, in today’s high-tech world. This course has been designed to provide you with knowledge, skills and practical experience encountered in everyday engineering environments.
This document provides details about a course on random variables and stochastic processes. It includes:
- An overview of the course content which will cover probability theory, random variables, distributions, and stochastic processes.
- Information about assignments, quizzes, grading policy, textbooks, and the instructor's office hours.
- Examples and explanations of key concepts from probability theory that will be covered, including sample spaces, probability values, events, and complements of events. Applications to games of chance, software errors, and power plant operations are discussed.
- The goal of developing mathematical tools to analyze and characterize random signals and stochastic processes is stated.
The document defines key concepts in probability and hypothesis testing. It discusses probability as a numerical quantity between 0 and 1 that expresses the likelihood of an event. Different probability distributions are covered, including the binomial, normal, and Poisson distributions. Hypothesis testing is defined as a methodology for deciding, based on sample data, whether to reject or fail to reject a null hypothesis. Types of hypotheses, terms used in testing such as test statistics and p-values, and types of errors are also summarized.
This document discusses probability and its key concepts. It begins by defining probability as a quantitative measure of uncertainty ranging from 0 to 1. Probability can be understood objectively based on problems or subjectively based on beliefs. Key probability concepts discussed include:
- Sample space, simple events, and compound events
- Classical, relative frequency, and subjective approaches to assigning probabilities
- Complement, intersection, and union of events
- Conditional probability and independence of events
- Rules for calculating probabilities of combined events like the multiplication rule
Examples are provided to illustrate concepts like defining sample spaces, calculating probabilities of individual and combined events, determining conditional probabilities, and assessing independence. Overall, the document provides a comprehensive overview of fundamental probability concepts.
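Conditional probability and independence, as summarized above, can be checked by brute-force enumeration of a sample space. A minimal sketch with two fair dice; all names here are our own:

```python
from itertools import product

# Sample space: all 36 ordered pairs from two fair dice.
space = list(product(range(1, 7), repeat=2))

def prob(event) -> float:
    """Classical probability: favourable outcomes / total outcomes."""
    return sum(1 for s in space if event(s)) / len(space)

def cond_prob(a, b) -> float:
    """P(A | B) = P(A and B) / P(B)."""
    return prob(lambda s: a(s) and b(s)) / prob(b)

def first_is_six(s):
    return s[0] == 6

def total_is_ten(s):
    return s[0] + s[1] == 10

# P(first die is 6 | total is 10) = 1/3, but P(first die is 6) = 1/6,
# so the two events are not independent.
print(cond_prob(first_is_six, total_is_ten))
print(prob(first_is_six))
```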
The document discusses probability and set theory. It defines probability as a quantitative measure of uncertainty or a measure of degree of belief in a statement. It states that probability is measured on a scale from 0 to 1, where 0 is impossibility and 1 is certainty. It then discusses key concepts in set theory such as sets, subsets, Venn diagrams, and operations on sets like union, intersection, difference, and complement. Finally, it discusses definitions of probability including the classical, relative frequency, and axiomatic definitions.
Meaning of probability; experiments; events – simple and compound; sample space; probability of events; independent and dependent events; probability laws; Bayes' theorem.
4.1 Probability and discrete probability distributions (Lama K Banna)
This document discusses probabilities and probability distributions. It begins by defining an experiment and sample space. A random variable is defined as a numerical value determined by the outcome of an experiment. Random variables can be discrete or continuous. Probability distributions show all possible outcomes of an experiment and their probabilities. The binomial distribution is discussed as modeling discrete experiments with binary outcomes and fixed probabilities. Key properties of the binomial include the mean, variance, and use of the binomial probability formula and tables to calculate probabilities of various outcomes.
The document provides an overview of key concepts in probability theory including:
- Definitions of probability, experiments, sample space, and events
- Approaches to probability including classical, statistical, and subjective
- Rules of probability including addition, multiplication, and conditional probability
- Bayes' theorem and how it differs from conditional probability
- Random variables and their probability distributions
The document is intended to introduce students to probability concepts and their applications in decision making under uncertainty.
This document summarizes key concepts from Chapter 4 on probability, including the addition rule, multiplication rule, conditional probability, dependent and independent events, and applying these concepts to calculate probabilities. The chapter covers basic probability concepts like sample spaces, events, and computing probabilities using relative frequency, classical, and subjective approaches. It also discusses odds, complementary events, and using simulations and counting to calculate probabilities.
This document discusses key concepts in probability and statistics such as population, sample, random experiments, sample space, events, and types of events. It provides examples and exercises to illustrate these concepts. Specifically, it defines a random experiment as a process that can be repeated under similar conditions leading to well-defined but unpredictable outcomes. The sample space represents all possible outcomes, while an event is a subset of outcomes of interest. Events can be elementary, impossible, or sure depending on whether they consist of one, no, or all possible outcomes.
This document defines discrete and continuous random variables and provides examples of each. It then focuses on discrete random variables and probability distributions. Specifically, it discusses the binomial probability distribution, giving its formula and providing examples of calculating binomial probabilities. It also discusses properties of the binomial distribution such as its shape and mean, and shows how binomial tables can be used to find probabilities.
Chapter 04: Random variables and probability (Juncar Tome)
This chapter discusses discrete random variables and their probability distributions. It introduces the concept of a random variable and defines discrete and continuous random variables. It then covers the probability distributions for discrete random variables including the binomial, Poisson, and hypergeometric distributions. It defines key terms like expected value and variance and provides examples of calculating probabilities using these distributions.
The document provides an introduction to probability concepts including sample spaces, events, mutually exclusive and exhaustive events, independent and dependent events, and formulas like the addition rule and multiplication rule. It explains terms used in probability like sample points, trials, outcomes, and experiments. Various approaches to probability are discussed including classical, statistical, subjective, and axiomatic approaches.
Sample Space and Event, Probability, The Axioms of Probability, Bayes Theorem (Bharath kumar Karanam)
The document discusses key concepts in statistics including:
- A sample space contains all possible outcomes of an experiment and events are subsets of the sample space.
- Probability is a branch of mathematics that quantifies the likelihood of events based on the sample space.
- The axioms of probability establish rules like probabilities being between 0 and 1 and the probability of the entire sample space being 1.
- Bayes' theorem calculates conditional probabilities and allows updating probabilities as new evidence becomes available.
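The updating step described in the last bullet can be sketched as a one-function Bayes' theorem calculation; the function name and the screening-test numbers are hypothetical, chosen only to illustrate the mechanics:

```python
def bayes(p_b_given_a: float, p_a: float, p_b_given_not_a: float) -> float:
    """P(A | B) via Bayes' theorem, with P(B) expanded by the
    law of total probability over A and not-A."""
    p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)
    return p_b_given_a * p_a / p_b

# Hypothetical screening test: 1% prevalence, 99% sensitivity,
# 5% false-positive rate. The posterior after one positive result
# is only about 1/6, far below the test's sensitivity.
posterior = bayes(p_b_given_a=0.99, p_a=0.01, p_b_given_not_a=0.05)
print(posterior)
```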
This document provides an overview of key concepts in probability theory, including:
1. It defines probability as a measure between 0 and 1 of the likelihood of an event occurring or a statement being true.
2. It discusses applications of probability theory in areas like risk assessment and financial markets.
3. It outlines some common probability experiments like coin tosses, dice rolls, and cricket games and identifies their possible experimental outcomes.
1. The document discusses key concepts in probability theory such as random experiments, sample spaces, events, mutually exclusive events, independent events, and probability definitions and formulas.
2. Examples are provided to illustrate concepts like probability, conditional probability, Bayes' theorem, and probability formulas including addition rules.
3. Common probability theorems and formulas are defined including addition rules for two and n events, multiplication rules, and Bayes' theorem.
Probability is a numerical measure of how likely an event is to occur. It is defined as the number of favorable outcomes divided by the total number of possible outcomes. A random experiment is an action with some defined outcomes that may occur by chance. The sample space is the set of all possible outcomes. Conditional probability is the probability of one event occurring given that another event has occurred.
Probability is a numerical measure of how likely an event is to occur. It is defined as the number of favorable outcomes divided by the total number of possible outcomes. A random experiment is an action with some defined outcomes that may occur by chance. The sample space is the set of all possible outcomes. Conditional probability is the probability of one event occurring given that another event has occurred.
CHAPTER 1 THEORY OF PROBABILITY AND STATISTICS.pptxanshujain54751
Probability theory is a branch of mathematics that uses concepts like sample space, probability distributions, and random variables to assign numerical likelihoods to the chances of outcomes occurring in random phenomena. It involves both theoretical and experimental approaches. Key aspects of probability theory include defining events and random variables, understanding independent and dependent events, and using formulas to calculate probabilities. Probability theory has various applications, like in finance to model markets, in product design to reduce failure probabilities, and in casinos to shape games of chance.
2 Review of Statistics. 2 Review of Statistics.WeihanKhor2
This document provides an overview of discrete probability distributions, including the binomial and Poisson distributions.
1) It defines key concepts such as random variables, probability mass functions, and expected value as they relate to discrete random variables. 2) The binomial distribution describes independent Bernoulli trials with a constant probability of success, and is used to calculate probabilities of outcomes from events like coin flips. 3) The Poisson distribution approximates the binomial when the number of trials is large and the probability of success is small. It models rare, independent events with a constant average rate and can be used for problems involving traffic accidents or natural disasters.
This document defines key probability concepts and terms:
- Probability is the numerical study of chances of events occurring. It is applied in diverse fields.
- There are two approaches to probability: classical and axiomatic. Random experiments have uncertain outcomes but known possibilities, unlike deterministic experiments.
- Key terms defined include sample space, events, elementary events, compound events, equally likely events, mutually exclusive events, independent events, dependent events, exhaustive cases, and favorable cases.
- The classical definition of probability is the number of favorable outcomes divided by the total possible outcomes for random experiments with equally likely outcomes. Probability values must be between 0 and 1.
Probability is a measure of how likely outcomes of uncertain events are to occur. It can be determined through experiments repeated over the long-term to reveal consistent patterns (the classical method), by observing the proportion of outcomes in many trials of an experiment (the empirical method), or based on subjective judgments of likelihood (the subjective method). The probability of any given event is calculated as the number of desired outcomes divided by the total number of possible outcomes, assuming all outcomes are equally likely.
This document provides an introduction to probability theory and different probability distributions. It begins with defining probability as a quantitative measure of the likelihood of events occurring. It then covers fundamental probability concepts like mutually exclusive events, additive and multiplicative laws of probability, and independent events. The document also introduces random variables and common probability distributions like the binomial, Poisson, and normal distributions. It provides examples of how each distribution is used and concludes with characteristics of the normal distribution.
This document provides an overview of probability theory concepts. It defines an experiment as any process of observation or measurement, and a random experiment as one where the exact outcome cannot be predicted but the possible outcomes can be listed. The sample space is the set of all possible outcomes, and a subset is called an event. Probability is defined as the ratio of favorable outcomes to total possible outcomes. Examples are provided of calculating probabilities of events occurring for experiments like rolling dice, tossing coins, and assigning student grades.
Probability is the branch of mathematics concerning events and numerical descriptions of how likely they are to occur. The probability of an event is a number between 0 and 1; the larger the probability, the more likely an event is to occur.[note 1][1][2] The higher the probability of an event, the more likely it is that the event will occur. A simple example is the tossing of a fair (unbiased) coin. Since the coin is fair, the two outcomes ('heads' and 'tails') are both equally probable; the probability of 'heads' equals the probability of 'tails'; and since no other outcomes are possible, the probability of either 'heads' or 'tails' is 1/2 (which could also be written as 0.5 or 50%).
These concepts have been given an axiomatic mathematical formalization in probability theory, which is used widely in areas of study such as statistics, mathematics, science, finance, gambling, artificial intelligence, machine learning, computer science, game theory, and philosophy to, for example, draw inferences about the expected frequency of events. Probability theory is also used to describe the underlying mechanics and regularities of complex systems.
This document discusses types of probability and provides definitions and examples of key probability concepts. It begins with an introduction to probability theory and its applications. The document then defines terms like random experiments, sample spaces, events, favorable events, mutually exclusive events, and independent events. It describes three approaches to measuring probability: classical, frequency, and axiomatic. It concludes with theorems of probability and references.
Classical probability is an approach based on the assumptions that random processes have a set of possible outcomes and each outcome is equally likely. There are three main types of probabilities: theoretical, experimental, and axiomatic. Theoretical probability is the ratio of favorable to possible outcomes. Experimental probability is the frequency of an outcome from experiments. Axiomatic probability describes probability through predefined axioms to ease calculating event occurrences.
Two events are independent if the occurrence of one event does not affect the probability of the other event occurring. The multiplication rule can be used to calculate the probability of independent events occurring. Two events are dependent if the occurrence of one event does affect the probability of the other event occurring. The multiplication rule must be modified to calculate probabilities of dependent events. The addition rule can be used to calculate the probability of one or more events occurring, whether the events are mutually exclusive or not mutually exclusive.
This document discusses key concepts in probability including experiments, outcomes, sample spaces, classical probability, empirical probability, subjective probability, complementary events, and the law of large numbers. Probability can be calculated classically by considering the number of outcomes in an event over the total number of outcomes, empirically by observing frequencies, or subjectively based on estimates. Understanding probability is important for properly evaluating risks and uncertainties.
This document defines common statistical terms and provides examples of sample spaces and events for random experiments involving a coin toss, dice roll, and child gender combinations. It defines statistics, parameters, probability, sample space, event, and random experiment. The examples construct sample spaces for coin tossing, dice rolling, and child gender combinations, then identify events within those sample spaces, such as rolling an even number or having two boys.
Probability is a mathematical measure of how likely events are to occur. It can be expressed as a fraction, decimal, or percentage between 0 and 1. A probability experiment involves possible outcomes that make up a sample space. An event is a subset of outcomes. Theoretical probability calculates the likelihood of an event based on equally likely outcomes. Experimental probability is based on observed frequencies. Subjective probability relies on estimates rather than calculations. As experiments are repeated, experimental probability approaches theoretical probability due to the law of large numbers.
This document provides an overview of key concepts in probability, including:
- Random experiments, sample spaces, elementary outcomes, and events
- Classical and empirical definitions of probability
- Operations on events like unions, intersections, complements
- Conditional probability and the multiplication rule
- Independent events and pairwise/mutual independence
It defines key terms and concepts and provides examples to illustrate probability calculations and relationships between events. Assignments are given to extend the formulas provided to additional events.
This document discusses theorems in probability and their proofs. It begins by stating that theorems help evaluate probabilities of compound events simply. Several probability theorems and their proofs are then presented, including:
1) The probability of an impossible event is zero.
2) The probability of a complementary event A' is 1 - P(A).
3) For any two events A and B, P(A or B) = P(A) + P(B) - P(A and B).
4) If A and B are independent, P(A and B) = P(A)P(B).
Examples are provided to demonstrate applying the theorems to probability problems involving events described using
This document provides an introduction to probability and important concepts in probability theory. It defines probability as a measure of the likelihood of an event occurring based on chance. Probability can be estimated empirically by calculating the relative frequency of outcomes in a series of trials, or estimated subjectively based on experience. Classical probability uses an a priori approach to assign probabilities to outcomes that are considered equally likely, such as outcomes of rolling dice or drawing cards. The document provides examples and definitions of key probability terms and concepts such as sample space, events, axioms of probability, and approaches to calculating probability.
Module 1
Probability
1. Introduction
In our daily life we come across many processes whose outcomes cannot be predicted in advance. Such processes are referred to as random processes. The only way to derive information about random processes is to conduct experiments. Each such experiment results in an outcome which cannot be predicted beforehand. In fact, even if the experiment is repeated under identical conditions, the outcomes of the experiment may vary from trial to trial due to factors which are beyond control. However, we may know in advance that each outcome of the experiment will be one of several given possibilities. For example, in the cast of a die under a fixed environment the outcome (the number of dots on the upper face of the die) cannot be predicted in advance and varies from trial to trial. However, we know in advance that the outcome has to be one of the numbers 1, 2, …, 6. Probability theory deals with the modeling and study of random processes. The field of Statistics is closely related to probability theory and deals with drawing inferences from data pertaining to random processes.
Definition 1.1
(i) A random experiment is an experiment in which:
(a) the set of all possible outcomes of the experiment is known in advance;
(b) the outcome of a particular performance (trial) of the experiment cannot be
predicted in advance;
(c) the experiment can be repeated under identical conditions.
(ii) The collection of all possible outcomes of a random experiment is called the sample
space. A sample space will usually be denoted by Ω. ▄
Example 1.1
(i) In the random experiment of casting a die one may take the sample space as Ω = {1, 2, 3, 4, 5, 6}, where i ∈ Ω indicates that the experiment results in i (i = 1, …, 6) dots on the upper face of the die.
(ii) In the random experiment of simultaneously flipping a coin and casting a die one may take the sample space as
Ω = {H, T} × {1, 2, …, 6} = {(r, i) : r ∈ {H, T}, i ∈ {1, 2, …, 6}},
where (H, i) ((T, i)) indicates that the flip of the coin resulted in a head (tail) on the upper face and the cast of the die resulted in i (i = 1, 2, …, 6) dots on the upper face.
(iii) Consider an experiment where a coin is tossed repeatedly until a head is observed. In this case the sample space may be taken as Ω = {1, 2, …} (or Ω = {H, TH, TTH, …}), where i ∈ Ω (or TT⋯TH ∈ Ω with (i − 1) Ts and one H) indicates that the experiment terminates on the i-th trial, with the first i − 1 trials resulting in tails on the upper face and the i-th trial resulting in a head on the upper face.
(iv) In the random experiment of measuring lifetimes (in hours) of a particular brand of batteries manufactured by a company one may take Ω = [0, 70000], where we have assumed that no battery lasts for more than 70,000 hours. ▄
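The finite sample spaces in (i) and (ii) can be enumerated directly; a minimal Python sketch (variable names are illustrative, not from the text):

```python
from itertools import product

# (i) Casting a die: Omega = {1, ..., 6}
die = set(range(1, 7))

# (ii) Flipping a coin and casting a die: Omega = {H, T} x {1, ..., 6}
coin_and_die = set(product({"H", "T"}, range(1, 7)))

print(len(die))           # 6 outcomes
print(len(coin_and_die))  # 12 outcomes
```

The sample space in (iii) is countably infinite and that in (iv) is an interval, so neither can be enumerated this way.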
Definition 1.2
(i) Let Ω be the sample space of a random experiment and let E ⊆ Ω. If the outcome of the random experiment is a member of the set E, we say that the event E has occurred.
(ii) Two events E₁ and E₂ are said to be mutually exclusive if they cannot occur simultaneously, i.e., if E₁ ∩ E₂ = ∅, the empty set. ▄
In a random experiment some events may be more likely to occur than others. For example, in the cast of a fair die (a die that is not biased towards any particular outcome), the occurrence of an odd number of dots on the upper face is more likely than the occurrence of 2 or 4 dots on the upper face. Thus it may be desirable to quantify the likelihoods of occurrence of various events. The probability of an event is a numerical measure of the chance with which that event occurs. To assign probabilities to various events associated with a random experiment, one may assign a real number P(E) ∈ [0, 1] to each event E with the interpretation that there is a (100 × P(E))% chance that the event E will occur and a (100 × (1 − P(E)))% chance that the event E will not occur. For example, if the probability of an event is 0.25, there is a 25% chance that the event will occur and a 75% chance that it will not. Note that, for any such assignment of probabilities to be meaningful, one must have P(Ω) = 1. We now discuss two methods of assigning probabilities.
I. Classical Method
This method of assigning probabilities is used for random experiments which result in a finite number of equally likely outcomes. Let Ω = {ω₁, …, ωₙ} be a finite sample space with n (∈ ℕ) possible outcomes; here ℕ denotes the set of natural numbers. For E ⊆ Ω, let |E| denote the number of elements in E. An outcome ω ∈ Ω is said to be favorable to an event E if ω ∈ E. In the classical method of assigning probabilities, the probability of an event E is given by
P(E) = (number of outcomes favorable to E) / (total number of outcomes) = |E| / |Ω| = |E| / n.
Note that probabilities assigned through the classical method satisfy the following properties of intuitive appeal:
(i) For any event E, P(E) ≥ 0;
(ii) For mutually exclusive events E₁, E₂, …, Eₙ (i.e., Eᵢ ∩ Eⱼ = ∅ whenever i, j ∈ {1, …, n}, i ≠ j),
P(E₁ ∪ ⋯ ∪ Eₙ) = |E₁ ∪ ⋯ ∪ Eₙ| / n = (|E₁| + ⋯ + |Eₙ|) / n = P(E₁) + ⋯ + P(Eₙ);
(iii) P(Ω) = |Ω| / |Ω| = 1.
Example 1.2
Suppose that in a classroom we have 25 students (with registration numbers 1, 2, …, 25) born in the same year having 365 days. Suppose that we want to find the probability of the event E that they are all born on different days of the year. Here an outcome consists of a sequence of 25 birthdays. Suppose that all such sequences are equally likely. Then
|Ω| = 365²⁵, |E| = 365 × 364 × ⋯ × 341 = ³⁶⁵P₂₅, and P(E) = |E| / |Ω| = ³⁶⁵P₂₅ / 365²⁵.
The classical method of assigning probabilities has a limited applicability as it can be used only
for random experiments which result in a finite number of equally likely outcomes. ▄
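A quick numerical check of Example 1.2 (a sketch; the counts are computed exactly as integers before dividing, to avoid intermediate overflow or rounding):

```python
from math import perm

# |Omega| = 365**25; |E| = 365 * 364 * ... * 341, the falling factorial 365_P_25
n_outcomes = 365 ** 25
n_favorable = perm(365, 25)  # math.perm(n, k) = n! / (n - k)!

p_all_distinct = n_favorable / n_outcomes
print(round(p_all_distinct, 4))  # ≈ 0.4313
```

So with 25 students there is already a better-than-even chance (about 57%) that at least two share a birthday.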
II. Relative Frequency Method
Suppose that we have independent repetitions of a random experiment (here independent repetitions means that the outcome of one trial is not affected by the outcome of another trial) under identical conditions. Let f_N(E) denote the number of times an event E occurs (also called the frequency of the event E in N trials) in the first N trials, and let r_N(E) = f_N(E)/N denote the corresponding relative frequency. Using advanced probabilistic arguments (e.g., using the Weak Law of Large Numbers, to be discussed in Module 7) it can be shown that, under mild conditions, the relative frequencies stabilize (in a certain sense) as N gets large (i.e., for any event E, lim_{N→∞} r_N(E) exists in a certain sense). In the relative frequency method of assigning probabilities, the probability of an event E is given by
P(E) = lim_{N→∞} r_N(E) = lim_{N→∞} f_N(E)/N.
Figure 1.1. Plot of relative frequencies r_N(E) of the number of heads against the number of trials N in the random experiment of tossing a fair coin (with probability of head in each trial 0.5).
In practice, to assign a probability to an event E, the experiment is repeated a large (but fixed) number of times (say N times) and the approximation P(E) ≈ r_N(E) is used for assigning a probability to the event E. Note that probabilities assigned through the relative frequency method also satisfy the following properties of intuitive appeal:
(i) for any event E, P(E) ≥ 0;
(ii) for mutually exclusive events E₁, E₂, …, Eₙ,
P(E₁ ∪ ⋯ ∪ Eₙ) = P(E₁) + ⋯ + P(Eₙ);
(iii) P(Ω) = 1.
Although the relative frequency method seems to have more applicability than the classical method, it too has limitations. A major problem with the relative frequency method is that it is imprecise, as it is based on an approximation (P(E) ≈ r_N(E)). Another difficulty with the relative frequency method is that it assumes that the experiment can be repeated a large number of times. This may not always be possible due to budgetary and other constraints (e.g., in predicting the success of a new space technology it may not be possible to repeat the experiment a large number of times due to the high costs involved).
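The stabilization of relative frequencies pictured in Figure 1.1 can be reproduced by simulating a fair coin (an illustrative sketch; the seed and trial counts are arbitrary choices, not from the text):

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def relative_frequency(n_trials: int) -> float:
    """r_N(E) for the event E = 'head' in N tosses of a simulated fair coin."""
    heads = sum(random.random() < 0.5 for _ in range(n_trials))
    return heads / n_trials

# Relative frequencies drift toward P(E) = 0.5 as N grows
for n in (10, 100, 10_000, 1_000_000):
    print(n, relative_frequency(n))
```

For small N the values fluctuate noticeably; for large N they settle near 0.5, which is exactly the imprecision-versus-repetition trade-off described above.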
The following definitions will be useful in future discussions.
Definition 1.3
(i) A set E is said to be finite if either E = ∅ (the empty set) or there exists a one-one and onto function f: {1, 2, …, n} → E (or f: E → {1, 2, …, n}) for some natural number n;
(ii) A set is said to be infinite if it is not finite;
(iii) A set E is said to be countable if either E = ∅ or there is an onto function f: ℕ → E, where ℕ denotes the set of natural numbers;
(iv) A set is said to be countably infinite if it is countable and infinite;
(v) A set is said to be uncountable if it is not countable;
(vi) A set E is said to be a continuum if there is a one-one and onto function f: ℝ → E (or f: E → ℝ), where ℝ denotes the set of real numbers. ▄
The following proposition, whose proof(s) can be found in any standard textbook on set theory,
provides some of the properties of finite, countable and uncountable sets.
Proposition 1.1
(i) Any finite set is countable;
(ii) If A is countable and B ⊆ A, then B is countable;
(iii) Any uncountable set is an infinite set;
(iv) If A is an infinite set and A ⊆ B, then B is infinite;
(v) If A is an uncountable set and A ⊆ B, then B is uncountable;
(vi) If E is a finite set and F is a set such that there exists a one-one and onto function f: E → F (or f: F → E), then F is finite;
(vii) If E is a countably infinite (continuum) set and F is a set such that there exists a one-one and onto function f: E → F (or f: F → E), then F is countably infinite (continuum);
(viii) A set E is countable if and only if either E = ∅ or there exists a one-one and onto map f: E → ℕ₀, for some ℕ₀ ⊆ ℕ;
(ix) A set E is countable if, and only if, either E is finite or there exists a one-one and onto map f: ℕ → E;
(x) A set E is countable if, and only if, either E = ∅ or there exists a one-one map f: E → ℕ;
(xi) A non-empty countable set E can be written either as E = {ω₁, ω₂, …, ωₙ}, for some n ∈ ℕ, or as E = {ω₁, ω₂, …};
(xii) The unit interval (0, 1) is uncountable. Hence any interval (a, b), where −∞ < a < b < ∞, is uncountable;
(xiii) ℕ × ℕ is countable;
(xiv) Let Λ be a countable set and let {A_α : α ∈ Λ} be a (countable) collection of countable sets. Then ⋃_{α∈Λ} A_α is countable. In other words, a countable union of countable sets is countable;
(xv) Any continuum set is uncountable. ▄
Example 1.3
(i) Define f: ℕ → ℕ by f(n) = n, n ∈ ℕ. Clearly f: ℕ → ℕ is one-one and onto. Thus ℕ is countable. Also it can easily be seen (using the contradiction method) that ℕ is infinite. Thus ℕ is countably infinite.
(ii) Let ℤ denote the set of integers. Define f: ℕ → ℤ by
f(n) = (n − 1)/2, if n is odd, and f(n) = −n/2, if n is even.
Clearly f: ℕ → ℤ is one-one and onto. Therefore, using (i) above and Proposition 1.1 (vii), ℤ is countably infinite. On using Proposition 1.1 (ii) it follows that any subset of ℤ is countable.
(iii) Using the fact that ℕ is countably infinite and Proposition 1.1 (xiv), it is straightforward to show that ℚ (the set of rational numbers) is countably infinite.
(iv) Define f: ℝ → ℝ and g: ℝ → (0, 1) by f(x) = x, x ∈ ℝ, and g(x) = 1/(1 + eˣ), x ∈ ℝ. Then f: ℝ → ℝ and g: ℝ → (0, 1) are one-one and onto functions. It follows that ℝ and (0, 1) are continuum (using Proposition 1.1 (vii)). Further, for −∞ < a < b < ∞, let h(x) = (b − a)x + a, x ∈ (0, 1). Clearly h: (0, 1) → (a, b) is one-one and onto. Again using Proposition 1.1 (vii) it follows that any interval (a, b) is a continuum. ▄
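The map f in Example 1.3 (ii) can be checked numerically on an initial segment of ℕ (a sketch only; a finite check illustrates, but of course does not prove, bijectivity):

```python
def f(n: int) -> int:
    """The map f: N -> Z of Example 1.3 (ii); here N starts at 1."""
    return (n - 1) // 2 if n % 2 == 1 else -(n // 2)

# n = 1, 2, 3, ... enumerates Z as 0, -1, 1, -2, 2, ...
values = [f(n) for n in range(1, 12)]
print(values)  # [0, -1, 1, -2, 2, -3, 3, -4, 4, -5, 5]

# No repeats on an initial segment, consistent with f being one-one
assert len({f(n) for n in range(1, 1001)}) == 1000
```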
It is clear that it may not be possible to assign probabilities in a way that applies to every situation. In the modern approach to probability theory one does not bother about how probabilities are assigned. Assignment of probabilities to various subsets of the sample space Ω that is consistent with the intuitively appealing properties (i)-(iii) of the classical (or relative frequency) method is done through probability modeling. In advanced courses on probability theory it is shown that in many situations (especially when the sample space Ω is a continuum) it is not possible to assign probabilities to all subsets of Ω such that properties (i)-(iii) of the classical (or relative frequency) method are satisfied. Therefore probabilities are assigned only to certain types of subsets of Ω.
In the following section we will discuss the modern approach to probability theory, where we will not be concerned with how probabilities are assigned to suitably chosen subsets of Ω. Rather, we will define the concept of probability for certain types of subsets of Ω using a set of axioms that are consistent with properties (i)-(iii) of the classical (or relative frequency) method. We will also study various properties of probability measures.
2. Axiomatic Approach to Probability and Properties of Probability Measure
We begin this section with the following definitions.
Definition 2.1
(i) A set whose elements are themselves sets is called a class of sets. A class of sets will usually be denoted by script letters 𝒜, ℬ, 𝒞, …. For example 𝒜 = {{1}, {1, 3}, {2, 5, 6}};
(ii) Let 𝒞 be a class of sets. A function μ: 𝒞 → ℝ is called a set function. In other words, a real-valued function whose domain is a class of sets is called a set function. ▄
As stated above, in many situations it may not be possible to assign probabilities to all subsets of the sample space Ω such that properties (i)-(iii) of the classical (or relative frequency) method are satisfied. Therefore one begins by assigning probabilities to members of an appropriately chosen class 𝒞 of subsets of Ω (e.g., if Ω = ℝ, then 𝒞 may be the class of all open intervals in ℝ; if Ω is a countable set, then 𝒞 may be the class of all singletons {ω}, ω ∈ Ω). We call the members of 𝒞 basic sets. Starting from the basic sets in 𝒞, the assignment of probabilities is extended, in an intuitively justified manner, to as many subsets of Ω as possible, keeping in mind that properties (i)-(iii) of the classical (or relative frequency) method are not violated. Let us denote by ℱ the class of sets for which the probability assignments can finally be done. We call the class ℱ the event space, and elements of ℱ are called events. It is reasonable to assume that ℱ satisfies the following properties: (i) Ω ∈ ℱ; (ii) A ∈ ℱ ⟹ Aᶜ = Ω − A ∈ ℱ; and (iii) Aᵢ ∈ ℱ, i = 1, 2, … ⟹ ⋃_{i=1}^∞ Aᵢ ∈ ℱ. This leads to the introduction of the following definition.
Definition 2.2
A sigma-field (σ-field) of subsets of Ω is a class ℱ of subsets of Ω satisfying the following properties:
(i) Ω ∈ ℱ;
(ii) A ∈ ℱ ⟹ Aᶜ = Ω − A ∈ ℱ (closed under complements);
(iii) Aᵢ ∈ ℱ, i = 1, 2, … ⟹ ⋃_{i=1}^∞ Aᵢ ∈ ℱ (closed under countably infinite unions). ▄
Remark 2.1
(i) We expect the event space to be a σ-field;
(ii) Suppose that ℱ is a σ-field of subsets of Ω. Then,
(a) ∅ ∈ ℱ (since ∅ = Ωᶜ);
(b) E₁, E₂, … ∈ ℱ ⟹ ⋂_{i=1}^∞ Eᵢ ∈ ℱ (since ⋂_{i=1}^∞ Eᵢ = (⋃_{i=1}^∞ Eᵢᶜ)ᶜ);
(c) E, F ∈ ℱ ⟹ E − F = E ∩ Fᶜ ∈ ℱ and EΔF ≝ (E − F) ∪ (F − E) ∈ ℱ;
(d) E₁, E₂, …, Eₙ ∈ ℱ, for some n ∈ ℕ, ⟹ ⋃_{i=1}^n Eᵢ ∈ ℱ and ⋂_{i=1}^n Eᵢ ∈ ℱ (take Eₙ₊₁ = Eₙ₊₂ = ⋯ = ∅ so that ⋃_{i=1}^n Eᵢ = ⋃_{i=1}^∞ Eᵢ, or Eₙ₊₁ = Eₙ₊₂ = ⋯ = Ω so that ⋂_{i=1}^n Eᵢ = ⋂_{i=1}^∞ Eᵢ);
(e) although the power set of Ω (𝒫(Ω)) is a σ-field of subsets of Ω, in general a σ-field may not contain all subsets of Ω. ▄
Example 2.1
(i) ℱ = {∅, Ω} is a sigma-field, called the trivial sigma-field;
(ii) Suppose that A ⊆ Ω. Then ℱ = {A, Aᶜ, ∅, Ω} is a σ-field of subsets of Ω. It is the smallest sigma-field containing the set A;
(iii) An arbitrary intersection of σ-fields is a σ-field (see Problem 3 (i));
(iv) Let 𝒞 be a class of subsets of Ω and let {ℱ_α : α ∈ Λ} be the collection of all σ-fields that contain 𝒞. Then
ℱ = ⋂_{α∈Λ} ℱ_α
is a σ-field and it is the smallest σ-field that contains the class 𝒞 (called the σ-field generated by 𝒞 and denoted by σ(𝒞)) (see Problem 3 (iii));
(v) Let Ω = ℝ and let 𝒥 be the class of all open intervals in ℝ. Then ℬ₁ = σ(𝒥) is called the Borel σ-field on ℝ. The Borel σ-field on ℝᵏ (denoted by ℬₖ) is the σ-field generated by the class of all open rectangles in ℝᵏ. A set B ∈ ℬₖ is called a Borel set in ℝᵏ; here ℝᵏ = {(x₁, …, x_k): −∞ < xᵢ < ∞, i = 1, …, k} denotes the k-dimensional Euclidean space;
(vi) ℬ₁ contains all singletons, and hence all countable subsets of ℝ ({a} = ⋂_{n=1}^∞ (a − 1/n, a + 1/n)). ▄
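For a finite Ω, the closure properties of Definition 2.2 reduce to closure under complements and finite unions, so a candidate class can be checked mechanically. A minimal sketch (function and variable names are my own, not from the text), applied to the smallest σ-field containing a set A from Example 2.1 (ii):

```python
def is_sigma_field(omega: frozenset, F: set) -> bool:
    """Check the sigma-field properties of Definition 2.2 for a finite class F."""
    if omega not in F:                   # (i) Omega must belong to F
        return False
    for A in F:
        if omega - A not in F:           # (ii) closed under complements
            return False
    for A in F:
        for B in F:
            if A | B not in F:           # (iii) finite unions suffice when F is finite
                return False
    return True

omega = frozenset({1, 2, 3, 4})
A = frozenset({1, 2})
F = {frozenset(), A, omega - A, omega}   # {∅, A, Aᶜ, Ω}
print(is_sigma_field(omega, F))          # True

# Dropping Aᶜ breaks closure under complements
print(is_sigma_field(omega, {frozenset(), A, omega}))  # False
```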
Let 𝒞 be an appropriately chosen class of basic subsets of Ω for which probabilities can be assigned to begin with (e.g., if Ω = ℝ then 𝒞 may be the class of all open intervals in ℝ; if Ω is a countable set then 𝒞 may be the class of all singletons {ω}, ω ∈ Ω). It turns out (a topic for an advanced course in probability theory) that, for an appropriately chosen class 𝒞 of basic sets, the assignment of probabilities that is consistent with properties (i)-(iii) of the classical (or relative frequency) method can be extended in a unique manner from 𝒞 to σ(𝒞), the smallest σ-field containing the class 𝒞. Therefore the domain ℱ of a probability measure is generally taken to be σ(𝒞), the σ-field generated by the class 𝒞 of basic subsets of Ω. We have stated before that we will not care about how the assignment of probabilities to various members of the event space ℱ (a σ-field of subsets of Ω) is done. Rather, we will be interested in the properties of a probability measure defined on the event space ℱ.
Let $\Omega$ be a sample space associated with a random experiment and let $\mathcal{F}$ be the event space (a $\sigma$-field of subsets of $\Omega$). Recall that members of $\mathcal{F}$ are called events. We now provide a mathematical definition of probability based on a set of axioms.
Definition 2.3
(i) Let $\mathcal{F}$ be a $\sigma$-field of subsets of $\Omega$. A probability function (or a probability measure) is a set function $P$, defined on $\mathcal{F}$, satisfying the following three axioms:
(a) $P(E) \ge 0, \forall E \in \mathcal{F}$ (Axiom 1: Non-negativity);
(b) if $E_1, E_2, \ldots$ is a countably infinite collection of mutually exclusive events (i.e., $E_i \in \mathcal{F}, i = 1, 2, \ldots$, and $E_i \cap E_j = \phi, i \ne j$) then
$P\left(\bigcup_{i=1}^{\infty} E_i\right) = \sum_{i=1}^{\infty} P(E_i)$ (Axiom 2: Countable additivity);
(c) $P(\Omega) = 1$ (Axiom 3: Probability of the sample space is 1).
(ii) The triplet $(\Omega, \mathcal{F}, P)$ is called a probability space. ▄
Remark 2.2
(i) Note that if $E_1, E_2, \ldots$ is a countably infinite collection of sets in a $\sigma$-field $\mathcal{F}$ then $\bigcup_{i=1}^{\infty} E_i \in \mathcal{F}$ and, therefore, $P(\bigcup_{i=1}^{\infty} E_i)$ is well defined;
(ii) In any probability space $(\Omega, \mathcal{F}, P)$ we have $P(\Omega) = 1$ (and $P(\phi) = 0$; see Theorem 2.1 (i) proved later), but if $P(A) = 1$ (or $P(A) = 0$) for some $A \in \mathcal{F}$, it does not follow that $A = \Omega$ (or $A = \phi$) (see Problem 14 (ii));
(iii) In general not all subsets of $\Omega$ are events, i.e., not all subsets of $\Omega$ are elements of $\mathcal{F}$;
(iv) When $\Omega$ is countable it is possible to assign probabilities to all subsets of $\Omega$ using Axiom 2, provided we can assign probabilities to the singleton subsets $\{x\}$ of $\Omega$. To illustrate this, let $\Omega = \{\omega_1, \omega_2, \ldots\}$ (or $\Omega = \{\omega_1, \ldots, \omega_n\}$, for some $n \in \mathbb{N}$) and let $P(\{\omega_i\}) = p_i, i = 1, 2, \ldots$, so that $0 \le p_i \le 1, i = 1, 2, \ldots$ (see Theorem 2.1 (iii) below) and $\sum_{i=1}^{\infty} p_i = \sum_{i=1}^{\infty} P(\{\omega_i\}) = P(\bigcup_{i=1}^{\infty} \{\omega_i\}) = P(\Omega) = 1$. Then, for any $A \subseteq \Omega$,
$P(A) = \sum_{i : \omega_i \in A} p_i.$
Thus in this case we may take $\mathcal{F} = \mathcal{P}(\Omega)$, the power set of $\Omega$. It is worth mentioning here that if $\Omega$ is countable and $\mathcal{C} = \{\{\omega\} : \omega \in \Omega\}$ (the class of all singleton subsets of $\Omega$) is the class of basic sets for which the assignment of probabilities can be done to begin with, then $\sigma(\mathcal{C}) = \mathcal{P}(\Omega)$ (see Problem 5 (ii));
(v) Due to certain inconsistency problems, assignment of probabilities to all subsets of $\Omega$ is not possible when $\Omega$ is a continuum (e.g., when $\Omega$ contains an interval). ▄
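For a countable sample space, the assignment described in Remark 2.2 (iv) is easy to mechanize. The following Python sketch (not part of the original notes; the fair-die probabilities are an assumed illustration) builds $P$ from singleton probabilities $p_i$ and evaluates it on arbitrary subsets:

```python
from fractions import Fraction

# Hypothetical finite sample space: one cast of a fair six-sided die.
# p[w] plays the role of p_i = P({omega_i}) in Remark 2.2 (iv).
p = {i: Fraction(1, 6) for i in range(1, 7)}

def prob(A):
    """P(A) = sum of P({omega}) over omega in A."""
    return sum(p[w] for w in A)

assert sum(p.values()) == 1          # the p_i must sum to P(Omega) = 1
even = {2, 4, 6}
assert prob(even) == Fraction(1, 2)  # P(even face) = 1/2
```

Here $\mathcal{F}$ is implicitly the power set of $\Omega$, exactly as Remark 2.2 (iv) allows for countable $\Omega$.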
Theorem 2.1
Let $(\Omega, \mathcal{F}, P)$ be a probability space. Then
(i) $P(\phi) = 0$;
(ii) $E_i \in \mathcal{F}, i = 1, 2, \ldots, n$, and $E_i \cap E_j = \phi, i \ne j \Rightarrow P(\bigcup_{i=1}^{n} E_i) = \sum_{i=1}^{n} P(E_i)$ (finite additivity);
(iii) $\forall E \in \mathcal{F}$, $0 \le P(E) \le 1$ and $P(E^c) = 1 - P(E)$;
(iv) $E_1, E_2 \in \mathcal{F}$ and $E_1 \subseteq E_2 \Rightarrow P(E_2 - E_1) = P(E_2) - P(E_1)$ and $P(E_1) \le P(E_2)$ (monotonicity of probability measures);
(v) $E_1, E_2 \in \mathcal{F} \Rightarrow P(E_1 \cup E_2) = P(E_1) + P(E_2) - P(E_1 \cap E_2)$.
Proof.
(i) Let $E_1 = \Omega$ and $E_i = \phi, i = 2, 3, \ldots$. Then $P(E_1) = 1$ (Axiom 3), $E_i \in \mathcal{F}, i = 1, 2, \ldots$, $E_1 = \bigcup_{i=1}^{\infty} E_i$ and $E_i \cap E_j = \phi, i \ne j$. Therefore,
$1 = P(E_1) = P\left(\bigcup_{i=1}^{\infty} E_i\right) = \sum_{i=1}^{\infty} P(E_i)$ (using Axiom 2) $= 1 + \sum_{i=2}^{\infty} P(\phi)$
$\Rightarrow \sum_{i=2}^{\infty} P(\phi) = 0.$
$= \begin{cases} S_{i,n}, & \text{if } i \text{ is odd} \\ -S_{i,n}, & \text{if } i \text{ is even,} \end{cases} \quad i = 1, 2, \ldots, n.$
(ii) We have
$1 \ge P(E_1 \cup E_2) = P(E_1) + P(E_2) - P(E_1 \cap E_2)$
$\Rightarrow P(E_1 \cap E_2) \ge P(E_1) + P(E_2) - 1.$
The above inequality is known as Bonferroni's inequality. ▄
Theorem 2.3
Let $(\Omega, \mathcal{F}, P)$ be a probability space and let $E_1, E_2, \ldots, E_n \in \mathcal{F}$ $(n \in \mathbb{N}, n \ge 2)$. Then, under the notation of Theorem 2.2,
(i) (Boole's Inequality) $S_{1,n} + S_{2,n} \le P(\bigcup_{i=1}^{n} E_i) \le S_{1,n}$;
(ii) (Bonferroni's Inequality) $P(\bigcap_{i=1}^{n} E_i) \ge S_{1,n} - (n - 1)$.
Proof.
(i) We will use the principle of mathematical induction. We have
$P(E_1 \cup E_2) = \underbrace{P(E_1) + P(E_2)}_{S_{1,2}} \underbrace{{} - P(E_1 \cap E_2)}_{S_{2,2}} = S_{1,2} + S_{2,2} \le S_{1,2},$
where $S_{1,2} = P(E_1) + P(E_2)$ and $S_{2,2} = -P(E_1 \cap E_2) \le 0$.
Thus the result is true for $n = 2$. Now suppose that the result is true for $n \in \{2, 3, \ldots, m\}$, for some positive integer $m$ $(\ge 2)$, i.e., suppose that for arbitrary events $F_1, \ldots, F_k \in \mathcal{F}$
$P\left(\bigcup_{i=1}^{k} F_i\right) \le \sum_{i=1}^{k} P(F_i), \quad k = 2, 3, \ldots, m, \qquad (2.5)$
and
$P\left(\bigcup_{i=1}^{k} F_i\right) \ge \sum_{i=1}^{k} P(F_i) - \sum_{1 \le i < j \le k} P(F_i \cap F_j), \quad k = 2, 3, \ldots, m. \qquad (2.6)$
Then
$P\left(\bigcup_{i=1}^{m+1} E_i\right) = P\left(\left(\bigcup_{i=1}^{m} E_i\right) \cup E_{m+1}\right)$
$\le P\left(\bigcup_{i=1}^{m} E_i\right) + P(E_{m+1})$ (using (2.5) for $k = 2$)
$\le \sum_{i=1}^{m} P(E_i) + P(E_{m+1})$ (using (2.5) for $k = m$)
$= \sum_{i=1}^{m+1} P(E_i) = S_{1,m+1}. \qquad (2.7)$
Also,
$P\left(\bigcup_{i=1}^{m+1} E_i\right) = P\left(\left(\bigcup_{i=1}^{m} E_i\right) \cup E_{m+1}\right)$
$= P\left(\bigcup_{i=1}^{m} E_i\right) + P(E_{m+1}) - P\left(\left(\bigcup_{i=1}^{m} E_i\right) \cap E_{m+1}\right)$ (using Theorem 2.2)
$= P\left(\bigcup_{i=1}^{m} E_i\right) + P(E_{m+1}) - P\left(\bigcup_{i=1}^{m} (E_i \cap E_{m+1})\right). \qquad (2.8)$
Using (2.5), for $k = m$, we get
$P\left(\bigcup_{i=1}^{m} (E_i \cap E_{m+1})\right) \le \sum_{i=1}^{m} P(E_i \cap E_{m+1}), \qquad (2.9)$
and using (2.6), for $k = m$, we get
$P\left(\bigcup_{i=1}^{m} E_i\right) \ge S_{1,m} + S_{2,m}. \qquad (2.10)$
Now using (2.9) and (2.10) in (2.8), we get
$P\left(\bigcup_{i=1}^{m+1} E_i\right) \ge S_{1,m} + S_{2,m} + P(E_{m+1}) - \sum_{i=1}^{m} P(E_i \cap E_{m+1}) = \sum_{i=1}^{m+1} P(E_i) - \sum_{1 \le i < j \le m+1} P(E_i \cap E_j) = S_{1,m+1} + S_{2,m+1}. \qquad (2.11)$
Combining (2.7) and (2.11), we get
$S_{1,m+1} + S_{2,m+1} \le P\left(\bigcup_{i=1}^{m+1} E_i\right) \le S_{1,m+1},$
and the assertion follows by the principle of mathematical induction.
(ii) We have
$P\left(\bigcap_{i=1}^{n} E_i\right) = 1 - P\left(\left(\bigcap_{i=1}^{n} E_i\right)^c\right) = 1 - P\left(\bigcup_{i=1}^{n} E_i^c\right)$
$\ge 1 - \sum_{i=1}^{n} P(E_i^c)$ (using Boole's inequality)
$= 1 - \sum_{i=1}^{n} \left(1 - P(E_i)\right) = \sum_{i=1}^{n} P(E_i) - (n - 1).$ ▄
Remark 2.4
Under the notation of Theorem 2.2 we can in fact prove the following inequalities:
$\sum_{i=1}^{2k} S_{i,n} \le P\left(\bigcup_{i=1}^{n} E_i\right) \le \sum_{i=1}^{2k-1} S_{i,n}, \quad k = 1, 2, \ldots, \left[\frac{n}{2}\right],$
where $\left[\frac{n}{2}\right]$ denotes the largest integer not exceeding $\frac{n}{2}$. ▄
Corollary 2.1
Let $(\Omega, \mathcal{F}, P)$ be a probability space and let $E_1, E_2, \ldots, E_n \in \mathcal{F}$ be events. Then
(i) $P(E_i) = 0, i = 1, \ldots, n \Leftrightarrow P(\bigcup_{i=1}^{n} E_i) = 0$;
(ii) $P(E_i) = 1, i = 1, \ldots, n \Leftrightarrow P(\bigcap_{i=1}^{n} E_i) = 1$.
Proof.
(i) First suppose that $P(E_i) = 0, i = 1, \ldots, n$. Using Boole's inequality, we get
$0 \le P\left(\bigcup_{i=1}^{n} E_i\right) \le \sum_{i=1}^{n} P(E_i) = 0.$
It follows that $P(\bigcup_{i=1}^{n} E_i) = 0$.
Conversely, suppose that $P(\bigcup_{i=1}^{n} E_i) = 0$. Then $E_i \subseteq \bigcup_{j=1}^{n} E_j, i = 1, \ldots, n$, and therefore,
$0 \le P(E_i) \le P\left(\bigcup_{j=1}^{n} E_j\right) = 0, \quad i = 1, \ldots, n,$
i.e., $P(E_i) = 0, i = 1, \ldots, n$.
(ii) We have
$P(E_i) = 1, i = 1, \ldots, n \Leftrightarrow P(E_i^c) = 0, i = 1, \ldots, n$
$\Leftrightarrow P\left(\bigcup_{i=1}^{n} E_i^c\right) = 0$ (using (i))
$\Leftrightarrow P\left(\left(\bigcup_{i=1}^{n} E_i^c\right)^c\right) = 1$
$\Leftrightarrow P\left(\bigcap_{i=1}^{n} E_i\right) = 1.$ ▄
Definition 2.4
A countable collection $\{E_i : i \in \Lambda\}$ of events is said to be exhaustive if $P(\bigcup_{i \in \Lambda} E_i) = 1$. ▄
Example 2.2 (Equally Likely Probability Models)
Consider a probability space $(\Omega, \mathcal{F}, P)$. Suppose that, for some positive integer $k \ge 2$, $\Omega = \bigcup_{i=1}^{k} C_i$, where $C_1, C_2, \ldots, C_k$ are mutually exclusive, exhaustive and equally likely events, i.e., $C_i \cap C_j = \phi$ if $i \ne j$, $P(\bigcup_{i=1}^{k} C_i) = \sum_{i=1}^{k} P(C_i) = 1$ and $P(C_1) = \cdots = P(C_k) = \frac{1}{k}$. Further suppose that an event $E \in \mathcal{F}$ can be written as
$E = C_{i_1} \cup C_{i_2} \cup \cdots \cup C_{i_r},$
where $\{i_1, \ldots, i_r\} \subseteq \{1, \ldots, k\}$, $C_{i_j} \cap C_{i_l} = \phi, j \ne l$, and $r \in \{2, \ldots, k\}$. Then
$P(E) = \sum_{j=1}^{r} P\left(C_{i_j}\right) = \frac{r}{k}.$
Note that here $k$ is the total number of ways in which the random experiment can terminate (the number of partition sets $C_1, \ldots, C_k$), and $r$ is the number of ways that are favorable to $E \in \mathcal{F}$. Thus, for any $E \in \mathcal{F}$,
$P(E) = \frac{\text{number of cases favorable to } E}{\text{total number of cases}} = \frac{r}{k},$
which is the same as the classical method of assigning probabilities. Here the assumption that $C_1, \ldots, C_k$ are equally likely is a part of the probability modeling. ▄
For a finite sample space $\Omega$, when we say that an experiment has been performed at random we mean that the various possible outcomes in $\Omega$ are equally likely. For example, when we say that two numbers are chosen at random, without replacement, from the set $\{1, 2, 3\}$, then $\Omega = \{\{1, 2\}, \{1, 3\}, \{2, 3\}\}$ and $P(\{1, 2\}) = P(\{1, 3\}) = P(\{2, 3\}) = \frac{1}{3}$, where $\{i, j\}$ indicates that the experiment terminates with the chosen numbers $i$ and $j$, $i, j \in \{1, 2, 3\}, i \ne j$.
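A Python sketch (not in the original notes) that enumerates this experiment; the event "the number 1 is among the chosen pair" is a hypothetical illustration of the classical method:

```python
from fractions import Fraction
from itertools import combinations

# All equally likely outcomes of choosing 2 numbers, without replacement, from {1, 2, 3}
omega = list(combinations([1, 2, 3], 2))
assert len(omega) == 3  # {1,2}, {1,3}, {2,3}

# Classical method: P(E) = (number of favorable cases) / (total number of cases)
E = [s for s in omega if 1 in s]  # hypothetical event: 1 is among the chosen numbers
p_E = Fraction(len(E), len(omega))
assert p_E == Fraction(2, 3)
```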
Example 2.3
Suppose that five cards are drawn at random and without replacement from a deck of 52 cards. Here the sample space $\Omega$ comprises all $\binom{52}{5}$ combinations of 5 cards. Thus the total number of cases $= \binom{52}{5} = k$, say. Let $C_1, \ldots, C_k$ be the singleton subsets of $\Omega$. Then $\Omega = \bigcup_{i=1}^{k} C_i$ and $P(C_1) = \cdots = P(C_k) = \frac{1}{k}$. Let $E_1$ be the event that each drawn card is a spade. Then
number of cases favorable to $E_1 = \binom{13}{5}$.
Therefore,
$P(E_1) = \frac{\binom{13}{5}}{\binom{52}{5}}.$
Now let $E_2$ be the event that at least one of the drawn cards is a spade. Then $E_2^c$ is the event that none of the drawn cards is a spade, and the number of cases favorable to $E_2^c = \binom{39}{5}$. Therefore,
$P(E_2^c) = \frac{\binom{39}{5}}{\binom{52}{5}}$, and $P(E_2) = 1 - P(E_2^c) = 1 - \frac{\binom{39}{5}}{\binom{52}{5}}.$
Let $E_3$ be the event that among the drawn cards three are kings and two are queens. Then the number of cases favorable to $E_3 = \binom{4}{3}\binom{4}{2}$ and, therefore,
$P(E_3) = \frac{\binom{4}{3}\binom{4}{2}}{\binom{52}{5}}.$
Similarly, if $E_4$ is the event that among the drawn cards two are kings, two are queens and one is a jack, then
$P(E_4) = \frac{\binom{4}{2}\binom{4}{2}\binom{4}{1}}{\binom{52}{5}}.$ ▄
Example 2.4
Suppose that we have $n$ $(\ge 2)$ letters and corresponding $n$ addressed envelopes. If these letters are inserted at random into the $n$ envelopes, find the probability that no letter is inserted into the correct envelope.
Solution. Let us label the letters as $L_1, L_2, \ldots, L_n$ and the respective envelopes as $A_1, A_2, \ldots, A_n$. Let $E_i$ denote the event that letter $L_i$ is (correctly) inserted into envelope $A_i$, $i = 1, 2, \ldots, n$. We need to find $P(\bigcap_{i=1}^{n} E_i^c)$. We have
$P\left(\bigcap_{i=1}^{n} E_i^c\right) = P\left(\left(\bigcup_{i=1}^{n} E_i\right)^c\right) = 1 - P\left(\bigcup_{i=1}^{n} E_i\right) = 1 - \sum_{k=1}^{n} S_{k,n},$
where, for $k \in \{1, 2, \ldots, n\}$,
$S_{k,n} = (-1)^{k-1} \sum_{1 \le i_1 < i_2 < \cdots < i_k \le n} P\left(E_{i_1} \cap E_{i_2} \cap \cdots \cap E_{i_k}\right).$
Note that $n$ letters can be inserted into $n$ envelopes in $n!$ ways. Also, for $1 \le i_1 < i_2 < \cdots < i_k \le n$, $E_{i_1} \cap E_{i_2} \cap \cdots \cap E_{i_k}$ is the event that letters $L_{i_1}, L_{i_2}, \ldots, L_{i_k}$ are inserted into the correct envelopes. Clearly the number of cases favorable to this event is $(n - k)!$. Therefore, for $1 \le i_1 < i_2 < \cdots < i_k \le n$,
$P\left(E_{i_1} \cap E_{i_2} \cap \cdots \cap E_{i_k}\right) = \frac{(n-k)!}{n!}$
$\Rightarrow S_{k,n} = (-1)^{k-1} \sum_{1 \le i_1 < i_2 < \cdots < i_k \le n} \frac{(n-k)!}{n!} = (-1)^{k-1} \binom{n}{k} \frac{(n-k)!}{n!} = \frac{(-1)^{k-1}}{k!}$
$\Rightarrow P\left(\bigcap_{i=1}^{n} E_i^c\right) = \frac{1}{2!} - \frac{1}{3!} + \frac{1}{4!} - \cdots + \frac{(-1)^n}{n!}.$ ▄
3. Conditional Probability and Independence of Events
Let $(\Omega, \mathcal{F}, P)$ be a given probability space. In many situations we may not be interested in the whole space $\Omega$. Rather, we may be interested in a subset $B \in \mathcal{F}$ of the sample space $\Omega$. This may happen, for example, when we know a priori that the outcome of the experiment has to be an element of $B \in \mathcal{F}$.
Example 3.1
Consider a random experiment of shuffling a deck of 52 cards in such a way that all $52!$ arrangements of the cards (when viewed from top to bottom) are equally likely.
Here,
$\Omega =$ all $52!$ permutations of the cards, and $\mathcal{F} = \mathcal{P}(\Omega)$.
Now suppose that it is noticed that the bottom card is the king of hearts. In the light of this information, the sample space $B$ comprises the $51!$ arrangements of the 52 cards having the king of hearts as the bottom card. Define the event
$K$: the top card is a king.
For $E \in \mathcal{F}$, define
$P(E)$ = probability of event $E$ under sample space $\Omega$,
$P_B(E)$ = probability of event $E$ under sample space $B$.
Clearly,
$P_B(K) = \frac{3 \times 50!}{51!}.$
Note that
$P_B(K) = \frac{3 \times 50!}{51!} = \frac{\frac{3 \times 50!}{52!}}{\frac{51!}{52!}} = \frac{P(K \cap B)}{P(B)}$, i.e., $P_B(K) = \frac{P(K \cap B)}{P(B)}. \qquad (3.1)$
We call $P_B(K)$ the conditional probability of event $K$ given that the experiment will result in an outcome in $B$ (i.e., the experiment will result in an outcome $\omega \in B$), and $P(K)$ the unconditional probability of event $K$. ▄
Example 3.1 lays the ground for introducing the concept of conditional probability.
Let $(\Omega, \mathcal{F}, P)$ be a given probability space. Suppose that we know in advance that the outcome of the experiment has to be an element of $B \in \mathcal{F}$, where $P(B) > 0$. In such situations the sample space is $B$ and the natural contenders for membership of the event space are $\{A \cap B : A \in \mathcal{F}\}$. This raises the question of whether $\mathcal{F}_B = \{A \cap B : A \in \mathcal{F}\}$ is an event space, i.e., whether $\mathcal{F}_B = \{A \cap B : A \in \mathcal{F}\}$ is a sigma-field of subsets of $B$.
Theorem 3.1
Let $\mathcal{F}$ be a $\sigma$-field of subsets of $\Omega$ and let $B \in \mathcal{F}$. Define $\mathcal{F}_B = \{A \cap B : A \in \mathcal{F}\}$. Then $\mathcal{F}_B$ is a $\sigma$-field of subsets of $B$ and $\mathcal{F}_B \subseteq \mathcal{F}$.
Proof. Since $B \in \mathcal{F}$ and $\mathcal{F}_B = \{A \cap B : A \in \mathcal{F}\}$, it is obvious that $\mathcal{F}_B \subseteq \mathcal{F}$. We have $\Omega \in \mathcal{F}$ and therefore
$B = \Omega \cap B \in \mathcal{F}_B. \qquad (3.2)$
Also,
$C \in \mathcal{F}_B \Rightarrow C = A \cap B$ for some $A \in \mathcal{F}$
$\Rightarrow C^c = B - C = (\Omega - A) \cap B \in \mathcal{F}_B$ (since $\Omega - A \in \mathcal{F}$), $\qquad (3.3)$
i.e., $\mathcal{F}_B$ is closed under complements with respect to $B$.
Now suppose that $C_i \in \mathcal{F}_B, i = 1, 2, \ldots$. Then $C_i = A_i \cap B$, for some $A_i \in \mathcal{F}, i = 1, 2, \ldots$. Therefore,
$\bigcup_{i=1}^{\infty} C_i = \left(\bigcup_{i=1}^{\infty} A_i\right) \cap B \in \mathcal{F}_B$ (since $\bigcup_{i=1}^{\infty} A_i \in \mathcal{F}$), $\qquad (3.4)$
i.e., $\mathcal{F}_B$ is closed under countable unions.
Now (3.2), (3.3) and (3.4) imply that $\mathcal{F}_B$ is a $\sigma$-field of subsets of $B$. ▄
Equation (3.1) suggests considering the set function $P_B : \mathcal{F}_B \to \mathbb{R}$ defined by
$P_B(C) = \frac{P(C)}{P(B)}, \quad C \in \mathcal{F}_B = \{A \cap B : A \in \mathcal{F}\}.$
Note that, for $C \in \mathcal{F}_B$, $P(C)$ is well defined as $\mathcal{F}_B \subseteq \mathcal{F}$.
Let us define another set function $P(\cdot \mid B) : \mathcal{F} \to \mathbb{R}$ by
$P(A \mid B) = P_B(A \cap B) = \frac{P(A \cap B)}{P(B)}, \quad A \in \mathcal{F}.$
Theorem 3.2
Let $(\Omega, \mathcal{F}, P)$ be a probability space and let $B \in \mathcal{F}$ be such that $P(B) > 0$. Then $(B, \mathcal{F}_B, P_B)$ and $(\Omega, \mathcal{F}, P(\cdot \mid B))$ are probability spaces.
Proof. Clearly
$P_B(C) = \frac{P(C)}{P(B)} \ge 0, \quad \forall C \in \mathcal{F}_B.$
Let $C_i \in \mathcal{F}_B, i = 1, 2, \ldots$ be mutually exclusive. Then $C_i \in \mathcal{F}, i = 1, 2, \ldots$ (since $\mathcal{F}_B \subseteq \mathcal{F}$), and
$P_B\left(\bigcup_{i=1}^{\infty} C_i\right) = \frac{P\left(\bigcup_{i=1}^{\infty} C_i\right)}{P(B)} = \frac{\sum_{i=1}^{\infty} P(C_i)}{P(B)} = \sum_{i=1}^{\infty} \frac{P(C_i)}{P(B)} = \sum_{i=1}^{\infty} P_B(C_i), \qquad (3.5)$
i.e., $P_B$ is countably additive on $\mathcal{F}_B$.
Also,
$P_B(B) = \frac{P(B)}{P(B)} = 1.$
Thus $P_B$ is a probability measure on $\mathcal{F}_B$.
Note that $P(A \mid B) \ge 0, \forall A \in \mathcal{F}$, and
$P(\Omega \mid B) = \frac{P(\Omega \cap B)}{P(B)} = \frac{P(B)}{P(B)} = 1.$
Let $E_i \in \mathcal{F}, i = 1, 2, \ldots$ be mutually exclusive. Then $C_i = E_i \cap B \in \mathcal{F}_B, i = 1, 2, \ldots$ are mutually exclusive and
$P\left(\bigcup_{i=1}^{\infty} E_i \,\Big|\, B\right) = P_B\left(\bigcup_{i=1}^{\infty} C_i\right) = \sum_{i=1}^{\infty} P_B(C_i) = \sum_{i=1}^{\infty} P_B(E_i \cap B) = \sum_{i=1}^{\infty} P(E_i \mid B)$ (using (3.5)).
It follows that $P(\cdot \mid B)$ is a probability measure on $\mathcal{F}$. ▄
Note that the domains of $P_B(\cdot)$ and $P(\cdot \mid B)$ are $\mathcal{F}_B$ and $\mathcal{F}$, respectively. Moreover,
$P(A \mid B) = P_B(A \cap B) = \frac{P(A \cap B)}{P(B)}, \quad A \in \mathcal{F}.$
Definition 3.1
Let $(\Omega, \mathcal{F}, P)$ be a probability space and let $B \in \mathcal{F}$ be a fixed event such that $P(B) > 0$. Define the set function $P(\cdot \mid B) : \mathcal{F} \to \mathbb{R}$ by
$P(A \mid B) = P_B(A \cap B) = \frac{P(A \cap B)}{P(B)}, \quad A \in \mathcal{F}.$
We call $P(A \mid B)$ the conditional probability of event $A$ given that the outcome of the experiment is in $B$, or simply the conditional probability of $A$ given $B$. ▄
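Definition 3.1 translates directly into code on a finite equally likely space. A minimal Python sketch (the two-coin space and the events are assumptions of this illustration, not from the notes):

```python
from fractions import Fraction
from itertools import product

# Hypothetical equally likely sample space: two tosses of a fair coin
omega = set(product("HT", repeat=2))

def P(E):
    return Fraction(len(E), len(omega))

def P_given(A, B):
    """Conditional probability P(A | B) = P(A ∩ B) / P(B); requires P(B) > 0."""
    assert P(B) > 0
    return P(A & B) / P(B)

A = {w for w in omega if w[0] == "H"}   # first toss is heads
B = {w for w in omega if "H" in w}      # at least one head
assert P_given(A, B) == Fraction(2, 3)  # A ∩ B has 2 outcomes, B has 3
```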
Example 3.2
Six cards are dealt at random (without replacement) from a deck of 52 cards. Find the probability of getting all cards of hearts in the hand (event $A$) given that there are at least 5 cards of hearts in the hand (event $B$).
Solution. We have $A \subseteq B$, so that $A \cap B = A$. Counting hands with exactly 5 and exactly 6 hearts,
$P(A \mid B) = \frac{P(A \cap B)}{P(B)} = \frac{\binom{13}{6}\big/\binom{52}{6}}{\left[\binom{13}{5}\binom{39}{1} + \binom{13}{6}\right]\big/\binom{52}{6}} = \frac{\binom{13}{6}}{\binom{13}{5}\binom{39}{1} + \binom{13}{6}}.$ ▄
Example 3.3
An urn contains four red and six black balls. Two balls are drawn successively, at random and without replacement, from the urn. Find the probability that the first draw results in a red ball and the second draw results in a black ball.
Solution. Define the events
$A$: first draw results in a red ball;
$B$: second draw results in a black ball.
Then,
Required probability $= P(A \cap B) = P(A)P(B \mid A) = \frac{4}{10} \times \frac{6}{9} = \frac{12}{45}.$ ▄
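A Python sketch (not part of the notes) checking this multiplication-rule computation both exactly and by simulation:

```python
import random
from fractions import Fraction

# Exact value via the multiplication rule P(A ∩ B) = P(A) P(B | A)
exact = Fraction(4, 10) * Fraction(6, 9)
assert exact == Fraction(4, 15)  # 12/45 in lowest terms

# Monte Carlo check: draw two balls without replacement from 4 red + 6 black
random.seed(0)
urn = ["R"] * 4 + ["B"] * 6
trials = 100_000
hits = sum(random.sample(urn, 2) == ["R", "B"] for _ in range(trials))
assert abs(hits / trials - float(exact)) < 0.01
```

`random.sample` draws without replacement in order, which matches the "successive draws" of the example.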
Let $(\Omega, \mathcal{F}, P)$ be a probability space. For a countable collection $\{E_i : i \in \Lambda\}$ of mutually exclusive and exhaustive events, the following theorem provides a relationship between the marginal probability $P(E)$ of an event $E \in \mathcal{F}$ and the joint probabilities $P(E \cap E_i)$ of events $E$ and $E_i$, $i \in \Lambda$.
Theorem 3.3 (Theorem of Total Probability)
Let $(\Omega, \mathcal{F}, P)$ be a probability space and let $\{E_i : i \in \Lambda\}$ be a countable collection of mutually exclusive and exhaustive events (i.e., $E_i \cap E_j = \phi$, whenever $i \ne j$, and $P(\bigcup_{i \in \Lambda} E_i) = 1$) such that $P(E_i) > 0, \forall i \in \Lambda$. Then, for any event $E \in \mathcal{F}$,
$P(E) = \sum_{i \in \Lambda} P(E \cap E_i) = \sum_{i \in \Lambda} P(E \mid E_i) P(E_i).$
Proof. Let $F = \bigcup_{i \in \Lambda} E_i$. Then $P(F) = 1$ and $P(F^c) = 1 - P(F) = 0$. Therefore,
$P(E) = P(E \cap F) + P(E \cap F^c)$
$= P(E \cap F)$ (since $E \cap F^c \subseteq F^c \Rightarrow 0 \le P(E \cap F^c) \le P(F^c) = 0$)
$= P\left(\bigcup_{i \in \Lambda} (E \cap E_i)\right) = \sum_{i \in \Lambda} P(E \cap E_i)$ (the $E_i$s are disjoint $\Rightarrow$ the $E \cap E_i$s $(\subseteq E_i)$ are disjoint)
$= \sum_{i \in \Lambda} P(E \mid E_i) P(E_i).$ ▄
Example 3.4
Urn $U_1$ contains 4 white and 6 black balls and urn $U_2$ contains 6 white and 4 black balls. A fair die is cast and urn $U_1$ is selected if the upper face of the die shows 5 or 6 dots. Otherwise urn $U_2$ is selected. If a ball is drawn at random from the selected urn, find the probability that the drawn ball is white.
Solution. Define the events:
$W$: drawn ball is white;
$E_1$: urn $U_1$ is selected;
$E_2$: urn $U_2$ is selected.
Then $\{E_1, E_2\}$ is a collection of mutually exclusive and exhaustive events. Therefore
$P(W) = P(E_1)P(W \mid E_1) + P(E_2)P(W \mid E_2) = \frac{2}{6} \times \frac{4}{10} + \frac{4}{6} \times \frac{6}{10} = \frac{8}{15}.$ ▄
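The computation is a direct instance of the Theorem of Total Probability; a Python sketch (illustrative only) with the example's numbers:

```python
from fractions import Fraction

# Priors: urn U1 is selected on a 5 or 6 (prob 2/6), urn U2 otherwise (prob 4/6)
priors = {"U1": Fraction(2, 6), "U2": Fraction(4, 6)}
# Likelihoods: P(W | E_i), the chance of a white ball from each urn
like = {"U1": Fraction(4, 10), "U2": Fraction(6, 10)}

# Theorem of Total Probability: P(W) = sum_i P(E_i) P(W | E_i)
p_white = sum(priors[u] * like[u] for u in priors)
assert p_white == Fraction(8, 15)
```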
The following theorem provides a method for finding the probability of occurrence of an event in a past trial based on information on occurrences in future trials.
Theorem 3.4 (Bayes' Theorem)
Let $(\Omega, \mathcal{F}, P)$ be a probability space and let $\{E_i : i \in \Lambda\}$ be a countable collection of mutually exclusive and exhaustive events with $P(E_i) > 0, i \in \Lambda$. Then, for any event $E \in \mathcal{F}$ with $P(E) > 0$, we have
$P(E_j \mid E) = \frac{P(E \mid E_j) P(E_j)}{\sum_{i \in \Lambda} P(E \mid E_i) P(E_i)}, \quad j \in \Lambda.$
Proof. We have, for $j \in \Lambda$,
$P(E_j \mid E) = \frac{P(E \cap E_j)}{P(E)} = \frac{P(E \mid E_j) P(E_j)}{P(E)} = \frac{P(E \mid E_j) P(E_j)}{\sum_{i \in \Lambda} P(E \mid E_i) P(E_i)}$ (using the Theorem of Total Probability). ▄
Remark 3.2
(i) Suppose that the occurrence of any one of the mutually exclusive and exhaustive events $E_i, i \in \Lambda$, causes the occurrence of an event $E$. Given that the event $E$ has occurred, Bayes' theorem provides the conditional probability that the event $E$ was caused by the occurrence of event $E_j$, $j \in \Lambda$;
(ii) In Bayes' theorem the probabilities $P(E_j), j \in \Lambda$, are referred to as prior probabilities, and the probabilities $P(E_j \mid E), j \in \Lambda$, are referred to as posterior probabilities. ▄
To see an application of Bayes' theorem let us revisit Example 3.4.
Example 3.5
Urn $U_1$ contains 4 white and 6 black balls and urn $U_2$ contains 6 white and 4 black balls. A fair die is cast and urn $U_1$ is selected if the upper face of the die shows five or six dots. Otherwise urn $U_2$ is selected. A ball is drawn at random from the selected urn.
(i) Given that the drawn ball is white, find the conditional probability that it came from urn $U_1$;
(ii) Given that the drawn ball is white, find the conditional probability that it came from urn $U_2$.
Solution. Define the events:
$W$: drawn ball is white;
$E_1$: urn $U_1$ is selected;
$E_2$: urn $U_2$ is selected.
Then $\{E_1, E_2\}$ is a collection of mutually exclusive and exhaustive events.
(i) We have
$P(E_1 \mid W) = \frac{P(W \mid E_1) P(E_1)}{P(W \mid E_1)P(E_1) + P(W \mid E_2)P(E_2)} = \frac{\frac{4}{10} \times \frac{2}{6}}{\frac{4}{10} \times \frac{2}{6} + \frac{6}{10} \times \frac{4}{6}} = \frac{1}{4}.$
(ii) Since $E_1$ and $E_2$ are mutually exclusive and $P(E_1 \cup E_2 \mid W) = P(\Omega \mid W) = 1$, we have
$P(E_2 \mid W) = 1 - P(E_1 \mid W) = \frac{3}{4}.$ ▄
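The posterior computation can be sketched in Python (illustrative only; `priors` and `like` mirror $P(E_i)$ and $P(W \mid E_i)$ from the example):

```python
from fractions import Fraction

priors = {"U1": Fraction(2, 6), "U2": Fraction(4, 6)}   # P(E1), P(E2)
like = {"U1": Fraction(4, 10), "U2": Fraction(6, 10)}   # P(W | E1), P(W | E2)

# Denominator of Bayes' theorem: P(W) by the Theorem of Total Probability
evidence = sum(priors[u] * like[u] for u in priors)

# Bayes' theorem: P(E_j | W) = P(W | E_j) P(E_j) / P(W)
posterior = {u: priors[u] * like[u] / evidence for u in priors}

assert posterior["U1"] == Fraction(1, 4)
assert posterior["U2"] == Fraction(3, 4)
assert sum(posterior.values()) == 1  # the posteriors form a probability distribution
```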
In the above example,
$P(E_1 \mid W) = \frac{1}{4} < \frac{1}{3} = P(E_1)$, and $P(E_2 \mid W) = \frac{3}{4} > \frac{2}{3} = P(E_2),$
i.e.,
(i) the probability of occurrence of event $E_1$ decreases in the presence of the information that the outcome will be an element of $W$;
(ii) the probability of occurrence of event $E_2$ increases in the presence of the information that the outcome will be an element of $W$.
These phenomena are related to the concept of association defined in the sequel.
Note that
$P(E_1 \mid W) < P(E_1) \Leftrightarrow P(E_1 \cap W) < P(E_1)P(W),$
and
$P(E_2 \mid W) > P(E_2) \Leftrightarrow P(E_2 \cap W) > P(E_2)P(W).$
Definition 3.2
Let $(\Omega, \mathcal{F}, P)$ be a probability space and let $A$ and $B$ be two events. Events $A$ and $B$ are said to be
(i) negatively associated if $P(A \cap B) < P(A)P(B)$;
(ii) positively associated if $P(A \cap B) > P(A)P(B)$;
(iii) independent if $P(A \cap B) = P(A)P(B)$. ▄
Remark 3.3
(i) If $P(B) = 0$ then $P(A \cap B) = 0 = P(A)P(B), \forall A \in \mathcal{F}$, i.e., if $P(B) = 0$ then any event $A \in \mathcal{F}$ and $B$ are independent;
(ii) If $P(B) > 0$ then $A$ and $B$ are independent if, and only if, $P(A \mid B) = P(A)$, i.e., if $P(B) > 0$, then events $A$ and $B$ are independent if, and only if, the availability of the information that event $B$ has occurred does not alter the probability of occurrence of event $A$. ▄
Now we define the concept of independence for an arbitrary collection of events.
Definition 3.3
Let $(\Omega, \mathcal{F}, P)$ be a probability space. Let $\Lambda \subseteq \mathbb{R}$ be an index set and let $\{E_\alpha : \alpha \in \Lambda\}$ be a collection of events in $\mathcal{F}$.
(i) Events $\{E_\alpha : \alpha \in \Lambda\}$ are said to be pairwise independent if any pair of events $E_\alpha$ and $E_\beta$, $\alpha \ne \beta$, in the collection are independent, i.e., if $P(E_\alpha \cap E_\beta) = P(E_\alpha)P(E_\beta)$, whenever $\alpha, \beta \in \Lambda$ and $\alpha \ne \beta$;
(ii) Let $\Lambda = \{1, 2, \ldots, n\}$, for some $n \in \mathbb{N}$, so that $\{E_\alpha : \alpha \in \Lambda\} = \{E_1, \ldots, E_n\}$ is a finite collection of events in $\mathcal{F}$. Events $E_1, \ldots, E_n$ are said to be independent if, for any subcollection $\{E_{\alpha_1}, \ldots, E_{\alpha_k}\}$ of $\{E_1, \ldots, E_n\}$ $(k = 2, 3, \ldots, n)$,
$P\left(\bigcap_{j=1}^{k} E_{\alpha_j}\right) = \prod_{j=1}^{k} P\left(E_{\alpha_j}\right). \qquad (3.6)$
(iii) Let $\Lambda \subseteq \mathbb{R}$ be an arbitrary index set. Events $\{E_\alpha : \alpha \in \Lambda\}$ are said to be independent if every finite subcollection of events in $\{E_\alpha : \alpha \in \Lambda\}$ forms a collection of independent events. ▄
Remark 3.4
(i) To verify that $n$ events $E_1, \ldots, E_n \in \mathcal{F}$ are independent one must verify the $2^n - n - 1$ $\left(= \sum_{j=2}^{n} \binom{n}{j}\right)$ conditions in (3.6). For example, to conclude that three events $E_1, E_2$ and $E_3$ are independent, the following $4$ $(= 2^3 - 3 - 1)$ conditions must be verified:
$P(E_1 \cap E_2) = P(E_1)P(E_2);$
$P(E_1 \cap E_3) = P(E_1)P(E_3);$
$P(E_2 \cap E_3) = P(E_2)P(E_3);$
$P(E_1 \cap E_2 \cap E_3) = P(E_1)P(E_2)P(E_3).$
(ii) If events $E_1, \ldots, E_n$ are independent then, for any permutation $(\alpha_1, \ldots, \alpha_n)$ of $(1, \ldots, n)$, the events $E_{\alpha_1}, \ldots, E_{\alpha_n}$ are also independent. Thus the notion of independence is symmetric in the events involved;
(iii) Events in any subcollection of independent events are independent. In particular, independence of a collection of events implies their pairwise independence. ▄
The following example illustrates that, in general, pairwise independence of a collection of events may not imply their independence.
Example 3.6
Let $\Omega = \{1, 2, 3, 4\}$ and let $\mathcal{F} = \mathcal{P}(\Omega)$, the power set of $\Omega$. Consider the probability space $(\Omega, \mathcal{F}, P)$, where $P(\{i\}) = \frac{1}{4}, i = 1, 2, 3, 4$. Let $A = \{1, 4\}, B = \{2, 4\}$ and $C = \{3, 4\}$. Then,
$P(A) = P(B) = P(C) = \frac{1}{2},$
$P(A \cap B) = P(A \cap C) = P(B \cap C) = P(\{4\}) = \frac{1}{4},$
and $P(A \cap B \cap C) = P(\{4\}) = \frac{1}{4}.$
Clearly,
$P(A \cap B) = P(A)P(B)$; $P(A \cap C) = P(A)P(C)$; and $P(B \cap C) = P(B)P(C)$,
i.e., $A, B$ and $C$ are pairwise independent.
However,
$P(A \cap B \cap C) = \frac{1}{4} \ne P(A)P(B)P(C).$
Thus $A, B$ and $C$ are not independent. ▄
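A Python sketch (not part of the notes) verifying these computations by direct enumeration:

```python
from fractions import Fraction

omega = {1, 2, 3, 4}  # each outcome has probability 1/4

def P(E):
    return Fraction(len(E), len(omega))

A, B, C = {1, 4}, {2, 4}, {3, 4}

# Pairwise independence holds ...
assert P(A & B) == P(A) * P(B)
assert P(A & C) == P(A) * P(C)
assert P(B & C) == P(B) * P(C)
# ... but the triple condition (3.6) fails, so A, B, C are not independent
assert P(A & B & C) == Fraction(1, 4)
assert P(A & B & C) != P(A) * P(B) * P(C)  # 1/4 != 1/8
```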
Theorem 3.5
Let $(\Omega, \mathcal{F}, P)$ be a probability space and let $A$ and $B$ be independent events $(A, B \in \mathcal{F})$. Then
(i) $A^c$ and $B$ are independent events;
(ii) $A$ and $B^c$ are independent events;
(iii) $A^c$ and $B^c$ are independent events.
Proof. We have
$P(A \cap B) = P(A)P(B).$
(i) Since $B = (A \cap B) \cup (A^c \cap B)$ and $(A \cap B) \cap (A^c \cap B) = \phi$, we have
$P(B) = P(A \cap B) + P(A^c \cap B)$
$\Rightarrow P(A^c \cap B) = P(B) - P(A \cap B) = P(B) - P(A)P(B) = (1 - P(A))P(B) = P(A^c)P(B),$
i.e., $A^c$ and $B$ are independent events.
(ii) Follows from (i) by interchanging the roles of $A$ and $B$.
(iii) Follows on using (i) and (ii) sequentially. ▄
The following theorem strengthens the results of Theorem 3.5.
Theorem 3.6
Let $(\Omega, \mathcal{F}, P)$ be a probability space and let $F_1, \ldots, F_n$ $(n \in \mathbb{N}, n \ge 2)$ be independent events in $\mathcal{F}$. Then, for any $k \in \{1, 2, \ldots, n-1\}$ and any permutation $(\alpha_1, \ldots, \alpha_n)$ of $(1, \ldots, n)$, the events $F_{\alpha_1}, \ldots, F_{\alpha_k}, F_{\alpha_{k+1}}^c, \ldots, F_{\alpha_n}^c$ are independent. Moreover, the events $F_1^c, \ldots, F_n^c$ are independent.
Proof. Since the notion of independence is symmetric in the events involved, it is enough to show that for any $k \in \{1, 2, \ldots, n-1\}$ the events $F_1, \ldots, F_k, F_{k+1}^c, \ldots, F_n^c$ are independent. Using backward induction and the symmetry of the notion of independence, the above-mentioned assertion would follow if, under the hypothesis of the theorem, we show that the events $F_1, \ldots, F_{n-1}, F_n^c$ are independent. For this, consider a subcollection $\{F_{i_1}, \ldots, F_{i_k}, G\}$ of $F_1, \ldots, F_{n-1}, F_n^c$ $(\{i_1, \ldots, i_k\} \subseteq \{1, \ldots, n-1\})$, where $G = F_n^c$ or $G = F_j$, for some $j \in \{1, \ldots, n-1\} - \{i_1, \ldots, i_k\}$, depending on whether or not $F_n^c$ is a part of the subcollection $\{F_{i_1}, \ldots, F_{i_k}, G\}$. Thus the following two cases arise:
Case I. $G = F_n^c$
Since $F_1, \ldots, F_n$ are independent, we have
$P\left(\bigcap_{j=1}^{k} F_{i_j}\right) = \prod_{j=1}^{k} P\left(F_{i_j}\right),$
and
$P\left(\left(\bigcap_{j=1}^{k} F_{i_j}\right) \cap F_n\right) = \prod_{j=1}^{k} P\left(F_{i_j}\right) P(F_n) = P\left(\bigcap_{j=1}^{k} F_{i_j}\right) P(F_n)$
$\Rightarrow$ the events $\bigcap_{j=1}^{k} F_{i_j}$ and $F_n$ are independent
$\Rightarrow$ the events $\bigcap_{j=1}^{k} F_{i_j}$ and $F_n^c$ are independent (Theorem 3.5 (ii))
$\Rightarrow P\left(\left(\bigcap_{j=1}^{k} F_{i_j}\right) \cap F_n^c\right) = P\left(\bigcap_{j=1}^{k} F_{i_j}\right) P(F_n^c) = \prod_{j=1}^{k} P\left(F_{i_j}\right) P(F_n^c)$
$\Rightarrow P\left(F_{i_1} \cap \cdots \cap F_{i_k} \cap G\right) = \prod_{j=1}^{k} P\left(F_{i_j}\right) P(G).$
Case II. $G = F_j$, for some $j \in \{1, \ldots, n-1\} - \{i_1, \ldots, i_k\}$.
In this case $\{F_{i_1}, \ldots, F_{i_k}, G\}$ is a subcollection of the independent events $F_1, \ldots, F_n$ and therefore
$P\left(F_{i_1} \cap \cdots \cap F_{i_k} \cap G\right) = \prod_{j=1}^{k} P\left(F_{i_j}\right) P(G).$
Now the result follows on combining the two cases. ▄
When we say that two or more random experiments are independent (or that two or more random experiments are performed independently) it simply means that the events associated with the respective random experiments are independent.
4. Continuity of Probability Measures
We begin this section with the following definition.
Definition 4.1
Let $(\Omega, \mathcal{F}, P)$ be a probability space and let $\{A_n : n = 1, 2, \ldots\}$ be a sequence of events in $\mathcal{F}$.
(i) We say that the sequence $\{A_n : n = 1, 2, \ldots\}$ is increasing (written as $A_n \uparrow$) if $A_n \subseteq A_{n+1}, n = 1, 2, \ldots$;
(ii) We say that the sequence $\{A_n : n = 1, 2, \ldots\}$ is decreasing (written as $A_n \downarrow$) if $A_{n+1} \subseteq A_n, n = 1, 2, \ldots$;
(iii) We say that the sequence $\{A_n : n = 1, 2, \ldots\}$ is monotone if either $A_n \uparrow$ or $A_n \downarrow$;
(iv) If $A_n \uparrow$ we define the limit of the sequence $\{A_n : n = 1, 2, \ldots\}$ as $\bigcup_{n=1}^{\infty} A_n$ and write $\operatorname{Lim}_{n \to \infty} A_n = \bigcup_{n=1}^{\infty} A_n$;
(v) If $A_n \downarrow$ we define the limit of the sequence $\{A_n : n = 1, 2, \ldots\}$ as $\bigcap_{n=1}^{\infty} A_n$ and write $\operatorname{Lim}_{n \to \infty} A_n = \bigcap_{n=1}^{\infty} A_n$. ▄
Throughout, we will denote the limit of a monotone sequence $\{A_n : n = 1, 2, \ldots\}$ of events by $\operatorname{Lim}_{n \to \infty} A_n$ and the limit of a sequence $\{a_n : n = 1, 2, \ldots\}$ of real numbers (provided it exists) by $\lim_{n \to \infty} a_n$.
Theorem 4.1 (Continuity of Probability Measures)
Let $\{A_n : n = 1, 2, \ldots\}$ be a monotone sequence of events in a probability space $(\Omega, \mathcal{F}, P)$. Then
$P\left(\operatorname{Lim}_{n \to \infty} A_n\right) = \lim_{n \to \infty} P(A_n).$
Proof.
Case I. $A_n \uparrow$
In this case, $\operatorname{Lim}_{n \to \infty} A_n = \bigcup_{n=1}^{\infty} A_n$. Define $B_1 = A_1$ and $B_n = A_n - A_{n-1}$, $n = 2, 3, \ldots$. Then the $B_n$s are mutually exclusive events in $\mathcal{F}$, $\bigcup_{i=1}^{n} B_i = A_n$, $n = 1, 2, \ldots$, and $\bigcup_{n=1}^{\infty} B_n = \bigcup_{n=1}^{\infty} A_n$. Therefore, using Axiom 2,
$P\left(\operatorname{Lim}_{n \to \infty} A_n\right) = P\left(\bigcup_{n=1}^{\infty} B_n\right) = \sum_{n=1}^{\infty} P(B_n) = \lim_{n \to \infty} \sum_{i=1}^{n} P(B_i) = \lim_{n \to \infty} P\left(\bigcup_{i=1}^{n} B_i\right) = \lim_{n \to \infty} P(A_n).$
Case II. $A_n \downarrow$
In this case, $\operatorname{Lim}_{n \to \infty} A_n = \bigcap_{n=1}^{\infty} A_n$ and $A_n^c \uparrow$. Therefore,
$P\left(\operatorname{Lim}_{n \to \infty} A_n\right) = P\left(\bigcap_{n=1}^{\infty} A_n\right)$
$= 1 - P\left(\left(\bigcap_{n=1}^{\infty} A_n\right)^c\right)$
$= 1 - P\left(\bigcup_{n=1}^{\infty} A_n^c\right)$
$= 1 - P\left(\operatorname{Lim}_{n \to \infty} A_n^c\right)$
$= 1 - \lim_{n \to \infty} P(A_n^c)$ (using Case I, since $A_n^c \uparrow$)
$= 1 - \lim_{n \to \infty} \left(1 - P(A_n)\right)$
$= \lim_{n \to \infty} P(A_n).$ ▄
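Problem 14 below supplies a concrete decreasing sequence on $(0, 1]$ with the length measure $P((a, b]) = b - a$. A Python sketch (illustrative only) showing $P(A_n) = 1/n$ for $A_n = (0, 1/n]$ decreasing toward $P(\operatorname{Lim}_{n \to \infty} A_n) = P(\phi) = 0$:

```python
from fractions import Fraction

# A_n = (0, 1/n] is decreasing; its limit (the intersection of all A_n) is empty,
# so the continuity theorem predicts P(A_n) = 1/n -> 0.
values = [Fraction(1, n) for n in range(1, 1001)]  # P(A_n) under P((a, b]) = b - a

# P(A_n) is strictly decreasing, mirroring A_n decreasing (Theorem 2.1 (iv))
assert all(a > b for a, b in zip(values, values[1:]))
assert values[0] == 1
assert values[-1] == Fraction(1, 1000)  # already within 1/1000 of the limit 0
```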
Remark 4.1
Let $(\Omega, \mathcal{F}, P)$ be a probability space and let $\{E_i : i = 1, 2, \ldots\}$ be a countably infinite collection of events in $\mathcal{F}$. Define
$B_n = \bigcup_{i=1}^{n} E_i$ and $C_n = \bigcap_{i=1}^{n} E_i, \quad n = 1, 2, \ldots$
Then $B_n \uparrow$, $C_n \downarrow$, $\operatorname{Lim}_{n \to \infty} B_n = \bigcup_{n=1}^{\infty} B_n = \bigcup_{i=1}^{\infty} E_i$ and $\operatorname{Lim}_{n \to \infty} C_n = \bigcap_{i=1}^{\infty} E_i$. Therefore
$P\left(\bigcup_{i=1}^{\infty} E_i\right) = P\left(\operatorname{Lim}_{n \to \infty} B_n\right)$
$= \lim_{n \to \infty} P(B_n)$ (using Theorem 4.1)
$= \lim_{n \to \infty} P\left(\bigcup_{i=1}^{n} E_i\right)$
$= \lim_{n \to \infty} \left[S_{1,n} + S_{2,n} + \cdots + S_{n,n}\right],$
where the $S_{k,n}$s are as defined in Theorem 2.2.
Moreover,
$P\left(\bigcap_{i=1}^{\infty} E_i\right) = P\left(\operatorname{Lim}_{n \to \infty} C_n\right)$
$= \lim_{n \to \infty} P(C_n)$ (using Theorem 4.1)
$= \lim_{n \to \infty} P\left(\bigcap_{i=1}^{n} E_i\right).$
Similarly, if $\{E_i : i = 1, 2, \ldots\}$ is a collection of independent events, then
$P\left(\bigcap_{i=1}^{\infty} E_i\right) = \lim_{n \to \infty} P\left(\bigcap_{i=1}^{n} E_i\right) = \lim_{n \to \infty} \left[\prod_{i=1}^{n} P(E_i)\right] = \prod_{i=1}^{\infty} P(E_i).$ ▄
Problems
1. Let $\Omega = \{1, 2, 3, 4\}$. Check which of the following is a sigma-field of subsets of $\Omega$:
(i) $\mathcal{F}_1 = \{\phi, \{1, 2\}, \{3, 4\}\}$;
(ii) $\mathcal{F}_2 = \{\phi, \Omega, \{1\}, \{2, 3, 4\}, \{1, 2\}, \{3, 4\}\}$;
(iii) $\mathcal{F}_3 = \{\phi, \Omega, \{1\}, \{2\}, \{1, 2\}, \{3, 4\}, \{2, 3, 4\}, \{1, 3, 4\}\}$.
2. Show that a class $\mathcal{F}$ of subsets of $\Omega$ is a sigma-field of subsets of $\Omega$ if, and only if, the following three conditions are satisfied: (i) $\Omega \in \mathcal{F}$; (ii) $A \in \mathcal{F} \Rightarrow A^c = \Omega - A \in \mathcal{F}$; (iii) $A_n \in \mathcal{F}, n = 1, 2, \ldots \Rightarrow \bigcap_{n=1}^{\infty} A_n \in \mathcal{F}$.
3. Let $\{\mathcal{F}_\lambda : \lambda \in \Lambda\}$ be a collection of sigma-fields of subsets of $\Omega$.
(i) Show that $\bigcap_{\lambda \in \Lambda} \mathcal{F}_\lambda$ is a sigma-field;
(ii) Using a counterexample, show that $\bigcup_{\lambda \in \Lambda} \mathcal{F}_\lambda$ may not be a sigma-field;
(iii) Let $\mathcal{C}$ be a class of subsets of $\Omega$ and let $\{\mathcal{F}_\lambda : \lambda \in \Lambda\}$ be the collection of all sigma-fields that contain the class $\mathcal{C}$. Show that $\sigma(\mathcal{C}) = \bigcap_{\lambda \in \Lambda} \mathcal{F}_\lambda$, where $\sigma(\mathcal{C})$ denotes the smallest sigma-field containing the class $\mathcal{C}$ (or the sigma-field generated by the class $\mathcal{C}$).
4. Let $\Omega$ be an infinite set and let $\mathcal{A} = \{A \subseteq \Omega : A \text{ is finite or } A^c \text{ is finite}\}$.
(i) Show that $\mathcal{A}$ is closed under complements and finite unions;
(ii) Using a counterexample, show that $\mathcal{A}$ may not be closed under countably infinite unions (and hence $\mathcal{A}$ may not be a sigma-field).
5. (i) Let $\Omega$ be an uncountable set and let $\mathcal{F} = \{A \subseteq \Omega : A \text{ is countable or } A^c \text{ is countable}\}$.
(a) Show that $\mathcal{F}$ is a sigma-field;
(b) What can you say about $\mathcal{F}$ when $\Omega$ is countable?
(ii) Let $\Omega$ be a countable set and let $\mathcal{C} = \{\{\omega\} : \omega \in \Omega\}$. Show that $\sigma(\mathcal{C}) = \mathcal{P}(\Omega)$.
6. Let $\mathcal{F} = \mathcal{P}(\Omega)$ = the power set of $\Omega = \{0, 1, 2, \ldots\}$. In each of the following cases, verify if $(\Omega, \mathcal{F}, P)$ is a probability space:
(i) $P(A) = \sum_{x \in A} e^{-\lambda} \lambda^x / x!, \ A \in \mathcal{F}, \lambda > 0$;
(ii) $P(A) = \sum_{x \in A} p(1 - p)^x, \ A \in \mathcal{F}, 0 < p < 1$;
(iii) $P(A) = 0$ if $A$ has a finite number of elements, and $P(A) = 1$ if $A$ has an infinite number of elements, $A \in \mathcal{F}$.
7. Let $(\Omega, \mathcal{F}, P)$ be a probability space and let $A, B, C, D \in \mathcal{F}$. Suppose that $P(A) = 0.6$, $P(B) = 0.5$, $P(C) = 0.4$, $P(A \cap B) = 0.3$, $P(A \cap C) = 0.2$, $P(B \cap C) = 0.2$, $P(A \cap B \cap C) = 0.1$, $P(B \cap D) = P(C \cap D) = 0$, $P(A \cap D) = 0.1$ and $P(D) = 0.2$. Find:
(i) $P(A \cup B \cup C)$ and $P(A^c \cap B^c \cap C)$;
(ii) $P((A \cup B) \cap C)$ and $P(A \cup (B \cap C))$;
(iii) $P((A^c \cup B^c) \cap C)$ and $P((A^c \cap B^c) \cup C)$;
(iv) $P(B \cap C \cap D)$ and $P(A \cap C \cap D)$;
(v) $P(A \cup B \cup D)$ and $P(A \cup B \cup C \cup D)$;
(vi) $P((A \cap B) \cup (C \cap D))$.
8. Let $(\Omega, \mathcal{F}, P)$ be a probability space and let $A$ and $B$ be two events (i.e., $A, B \in \mathcal{F}$).
(i) Show that the probability that exactly one of the events $A$ or $B$ will occur is given by $P(A) + P(B) - 2P(A \cap B)$;
(ii) Show that $P(A \cap B) - P(A)P(B) = P(A^c)P(B) - P(A^c \cap B) = P(A)P(B^c) - P(A \cap B^c) = P((A \cup B)^c) - P(A^c)P(B^c)$.
9. Suppose that $n$ $(\ge 3)$ persons $P_1, \ldots, P_n$ are made to stand in a row at random. Find the probability that there are exactly $r$ persons between $P_1$ and $P_2$; here $r \in \{1, 2, \ldots, n-2\}$.
10. A point $(X, Y)$ is randomly chosen on the unit square $S = \{(x, y) : 0 \le x \le 1, 0 \le y \le 1\}$ (i.e., for any region $R \subseteq S$ for which the area is defined, the probability that $(X, Y)$ lies in $R$ is $\frac{\text{area of } R}{\text{area of } S}$). Find the probability that the distance from $(X, Y)$ to the nearest side does not exceed $\frac{1}{3}$ units.
11. Three numbers $a$, $b$ and $c$ are chosen at random and with replacement from the set $\{1, 2, \ldots, 6\}$. Find the probability that the quadratic equation $ax^2 + bx + c = 0$ will have real root(s).
12. Three numbers are chosen at random from the set $\{1, 2, \ldots, 50\}$. Find the probability that the chosen numbers are in
(i) arithmetic progression;
(ii) geometric progression.
13. Consider an empty box in which four balls are to be placed (one by one) according to the following scheme. A fair die is cast each time and the number of dots on the upper face is noted. If the upper face shows 2 or 5 dots then a white ball is placed in the box. Otherwise a black ball is placed in the box. Given that the first ball placed in the box was white, find the probability that the box will contain exactly two black balls.
14. Let $((0, 1], \mathcal{F}, P)$ be a probability space such that $\mathcal{F}$ is the smallest sigma-field containing all subintervals of $\Omega = (0, 1]$ and $P((a, b]) = b - a$, where $0 \le a < b \le 1$ (such a probability measure is known to exist).
(i) Show that $\{b\} = \bigcap_{n=1}^{\infty} \left(b - \frac{1}{n+1}, b\right], \forall b \in (0, 1]$;
(ii) Show that $P(\{b\}) = 0, \forall b \in (0, 1]$, and $P((0, 1]) = 1$ (note that here $P(\{b\}) = 0$ but $\{b\} \ne \phi$, and $P((0, 1)) = 1$ but $(0, 1) \ne \Omega$);
(iii) Show that, for any countable set $A \in \mathcal{F}$, $P(A) = 0$;
(iv) For $n \in \mathbb{N}$, let $A_n = \left(0, \frac{1}{n}\right]$ and $B_n = \left(\frac{1}{2} + \frac{1}{n+2}, 1\right]$. Verify that $A_n \downarrow$, $B_n \uparrow$, $P(\operatorname{Lim}_{n \to \infty} A_n) = \lim_{n \to \infty} P(A_n)$ and $P(\operatorname{Lim}_{n \to \infty} B_n) = \lim_{n \to \infty} P(B_n)$.
15. Consider four coding machines $M_1, M_2, M_3$ and $M_4$ producing binary codes 0 and 1. The machine $M_1$ produces codes 0 and 1 with respective probabilities $\frac{1}{4}$ and $\frac{3}{4}$. The code produced by machine $M_k$ is fed into machine $M_{k+1}$ $(k = 1, 2, 3)$, which may either leave the received code unchanged or change it. Suppose that each of the machines $M_2, M_3$ and $M_4$ changes the received code with probability $\frac{3}{4}$. Given that the machine $M_4$ has produced code 1, find the conditional probability that the machine $M_1$ produced code 0.
16. A student appears in the examinations of four subjects: Biology, Chemistry, Physics and Mathematics. Suppose that the probabilities of the student clearing the examinations in these subjects are $\frac{1}{2}, \frac{1}{3}, \frac{1}{4}$ and $\frac{1}{5}$, respectively. Assuming that the performances of the student in the four subjects are independent, find the probability that the student will clear the examination(s) of
(i) all the subjects; (ii) no subject; (iii) exactly one subject;
(iv) exactly two subjects; (v) at least one subject.
17. Let $A$ and $B$ be independent events. Show that
$\max\left\{P\left((A \cup B)^c\right), P(A \cap B), P(A \,\Delta\, B)\right\} \ge \frac{4}{9},$
where $A \,\Delta\, B = (A - B) \cup (B - A)$.
18. For independent events $A_1, \ldots, A_n$, show that
$P\left(\bigcap_{i=1}^{n} A_i^c\right) \le e^{-\sum_{i=1}^{n} P(A_i)}.$
19. Let $(\Omega, \mathcal{F}, P)$ be a probability space and let $A_1, A_2, \ldots$ be a sequence of events (i.e., $A_i \in \mathcal{F}, i = 1, 2, \ldots$). Define $B_n = \bigcap_{i=n}^{\infty} A_i$, $C_n = \bigcup_{i=n}^{\infty} A_i$, $n = 1, 2, \ldots$, $D = \bigcup_{n=1}^{\infty} B_n$ and $E = \bigcap_{n=1}^{\infty} C_n$. Show that:
(i) $D$ is the event that all but a finite number of the $A_n$s occur and $E$ is the event that infinitely many of the $A_n$s occur;
(ii) $D \subseteq E$;
(iii) $P(D) = \lim_{n \to \infty} P(B_n) = \lim_{n \to \infty} \lim_{m \to \infty} P\left(\bigcap_{i=n}^{m} A_i\right)$ and $P(E) = \lim_{n \to \infty} P(C_n)$;
(iv) if $\sum_{n=1}^{\infty} P(A_n) < \infty$ then, with probability one, only finitely many of the $A_n$s will occur;
(v) if $A_1, A_2, \ldots$ are independent and $\sum_{n=1}^{\infty} P(A_n) = \infty$ then, with probability one, infinitely many of the $A_n$s will occur.
20. Let $A, B$ and $C$ be three events such that $A$ and $B$ are negatively (positively) associated and $B$ and $C$ are negatively (positively) associated. Can we conclude that, in general, $A$ and $C$ are negatively (positively) associated?
21. Let $(\Omega, \mathcal{F}, P)$ be a probability space and let $A$ and $B$ be two events (i.e., $A, B \in \mathcal{F}$). Show that if $A$ and $B$ are positively (negatively) associated then $A$ and $B^c$ are negatively (positively) associated.
22. A locality has $n$ houses numbered $1, \ldots, n$ and a terrorist is hiding in one of these houses. Let $H_j$ denote the event that the terrorist is hiding in house number $j$, $j = 1, \ldots, n$, and let $P(H_j) = p_j \in (0, 1)$, $j = 1, \ldots, n$. During a search operation, let $F_j$ denote the event that the search of house number $j$ will fail to nab the terrorist there, and let $P(F_j \mid H_j) = r_j \in (0, 1)$, $j = 1, \ldots, n$. For each $i, j \in \{1, \ldots, n\}, i \ne j$, show that $H_j$ and $F_j$ are negatively associated but $H_i$ and $F_j$ are positively associated. Interpret these findings.
23. Let $A, B$ and $C$ be three events such that $P(B \cap C) > 0$. Prove or disprove each of the following:
(i) $P(A \cap B \mid C) = P(A \mid B \cap C)P(B \mid C)$;
(ii) $P(A \cap B \mid C) = P(A \mid C)P(B \mid C)$ if $A$ and $B$ are independent events.
24. A $k$-out-of-$n$ system is a system comprising $n$ components that functions if, and only if, at least $k$ $(k \in \{1, 2, \ldots, n\})$ of the components function. A 1-out-of-$n$ system is called a parallel system and an $n$-out-of-$n$ system is called a series system. Consider $n$ components $C_1, \ldots, C_n$ that function independently. At any given time $t$ the probability that the component $C_i$ will be functioning is $p_i(t)$ $(\in (0, 1))$ and the probability that it will not be functioning at time $t$ is $1 - p_i(t)$, $i = 1, \ldots, n$.
(i) Find the probability that a parallel system comprising components $C_1, \ldots, C_n$ will function at time $t$;
(ii) Find the probability that a series system comprising components $C_1, \ldots, C_n$ will function at time $t$;
(iii) If $p_i(t) = p(t)$, $i = 1, \ldots, n$, find the probability that a $k$-out-of-$n$ system comprising components $C_1, \ldots, C_n$ will function at time $t$.