“MATH 307”
“THEORY OF PROBABILITY”
Cheryl B. Subaldo, EdD.
MT II – Kidapawan City NHS
Bernoulli, Binomial, Poisson, and
RVs
CENTRAL MINDANAO COLLEGE
Prepared by: JOSEPHINE R. CLAVERIA
MAED Student
BERNOULLI
INTRODUCTION
Bernoulli statistics, named after the Swiss mathematician Jacob Bernoulli,
constitutes a fundamental branch of probability theory dealing
with random experiments having binary outcomes. This report
provides an overview of Bernoulli statistics, covering its key
concepts, properties, applications, and relevance in modern data
analysis.
Bernoulli statistics lies at the heart of probability theory, providing
a framework to analyze situations where outcomes are binary,
such as success/failure, heads/tails, or yes/no. It serves as the
foundation for more complex probability distributions and
statistical models.
KEY CONCEPTS
1. Bernoulli Trial
A Bernoulli trial is an experiment with exactly two possible
outcomes: success or failure.
These outcomes are often denoted as 1 for success and 0 for
failure.
The probability of success is denoted by p, and the probability
of failure (or complement of success) is 1−p.
Example
Suppose a biased coin is tossed 10 times. The probability
of getting heads (success) on each toss is 0.3.
Calculate the probability of getting exactly 3 heads.
Given:
Number of trials, n=10
Probability of success (getting heads), p=0.3
Number of successes we want to find, k=3
Formula: P(X = k) = C(n, k) × p^k × (1 − p)^(n − k)
Substitute values: P(X = 3) = C(10, 3) × (0.3)^3 × (0.7)^7
Calculate: P(X = 3) = 120 × 0.027 × 0.0823543 ≈ 0.2668
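As a quick check, this calculation can be reproduced in a few lines of Python using only the standard library (the function name binomial_pmf is our own, not part of any library):

```python
from math import comb

def binomial_pmf(n, k, p):
    """P(X = k): probability of exactly k successes in n independent trials."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# 10 tosses of a biased coin with P(heads) = 0.3; exactly 3 heads
print(round(binomial_pmf(10, 3, 0.3), 4))  # 0.2668
```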
2. Bernoulli Distribution
The Bernoulli distribution represents the probability
distribution of a single Bernoulli trial.
It is characterized by a single parameter p, the probability of
success.
The probability mass function (PMF) of the Bernoulli
distribution is:
P(X = x) = p^x × (1 − p)^(1 − x), for x ∈ {0, 1}
Example
In a factory, the probability of a defective product coming off
the assembly line is 0.1. If a sample of 8 products is randomly
selected, what is the probability that exactly 2 of them are
defective?
Given:
Number of trials, n=8 (selecting 8 products)
Probability of success (defective product), p=0.1
Number of successes we want to find, k=2 (exactly 2 defective products)
Solution
P(X = 2) = C(8, 2) × (0.1)^2 × (0.9)^6 = 28 × 0.01 × 0.531441 ≈ 0.1488
3. Mean and Variance
Mean: The expected value or mean of a Bernoulli
random variable is E(X)=p, representing the average
proportion of successes in a series of trials.
Variance: The variance of a Bernoulli random
variable is Var(X)=p(1−p), indicating the spread or
variability of outcomes around the mean.
Example
Consider a random variable X following a Bernoulli
distribution with a probability of success p=0.6. Calculate
the mean and variance of X.
Solution
• Mean (Expected Value): The mean or expected value of a Bernoulli random variable
X is given by: E(X) = p
Given p=0.6, we can directly substitute this value into the formula:
E(X) = 0.6
• Variance: The variance of a Bernoulli random variable X is given by: Var(X) = p(1−p)
Again, substituting p=0.6 into the formula:
Var(X) = 0.6×(1−0.6)
Var(X) = 0.6×0.4
Var(X) = 0.24
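These formulas can also be checked empirically. The sketch below (standard-library Python; the seed and sample size are arbitrary choices) simulates many Bernoulli trials with p = 0.6 and compares the sample mean and variance to p and p(1 − p):

```python
import random

random.seed(1)
p = 0.6
n = 100_000

# Simulate n Bernoulli(p) trials: 1 for success, 0 for failure
trials = [1 if random.random() < p else 0 for _ in range(n)]

mean = sum(trials) / n
var = sum((x - mean) ** 2 for x in trials) / n
print(round(mean, 2), round(var, 2))  # close to 0.6 and 0.24
```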
4. Bernoulli Process
A sequence of independent Bernoulli trials, all with the
same probability of success p, is known as a Bernoulli
process.
Each trial in a Bernoulli process is independent, meaning
the outcome of one trial does not influence the outcome
of another.
Example
Suppose you're observing a series of independent
trials, each with a probability p=0.7 of success. You
want to find the probability of observing exactly 4
successes in 6 trials using the Bernoulli process.
Calculate this probability.
Given p=0.7, n=6, and we want to find the probability of observing exactly k=4 successes.
P(X = 4) = C(6, 4) × (0.7)^4 × (0.3)^2 = 15 × 0.2401 × 0.09 ≈ 0.3241
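A minimal Python sketch comparing the exact binomial value against a simulation of the Bernoulli process itself (the seed and run count are arbitrary assumptions):

```python
import random
from math import comb

random.seed(0)
p, n, k = 0.7, 6, 4

# Exact: binomial probability of exactly k successes in n trials
exact = comb(n, k) * p**k * (1 - p)**(n - k)

# Empirical: repeat the 6-trial process many times, count runs with 4 successes
runs = 100_000
hits = sum(1 for _ in range(runs)
           if sum(random.random() < p for _ in range(n)) == k)

print(round(exact, 4))        # 0.3241
print(round(hits / runs, 2))  # close to the exact value
```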
CONCLUSION
Bernoulli statistics provides a simple yet powerful framework
for modeling binary outcomes in various real-world scenarios.
Understanding the Bernoulli distribution and its properties is
fundamental in probability theory and statistics, forming the
basis for more complex statistical models and analyses. From
quality control in manufacturing to predictive modeling in
machine learning, the principles of Bernoulli statistics find
widespread applications across diverse fields.
BINOMIAL
Understanding Binomial Distributions
Binomial distribution is a fundamental concept in probability
theory and statistics, frequently used to model the number of
successes in a fixed number of independent Bernoulli trials.
These trials have only two possible outcomes, typically labeled
as success and failure. This report aims to provide a
comprehensive overview of binomial distributions, including
their definition, properties, applications, and relevant formulas.
Definition:
A binomial distribution arises from a series of independent
Bernoulli trials, where each trial has only two possible
outcomes - success (usually denoted as 1) or failure (denoted
as 0). The distribution represents the probability of getting a
certain number of successes in a fixed number of trials.
Properties of Binomial Distribution:
1. Fixed Number of Trials-- A binomial distribution involves a
fixed number of trials, denoted as n.
2. Independent Trials-- Each trial must be independent of the
others.
3. Constant Probability-- The probability of success (p) remains
constant for each trial, while the probability of failure (1−p) is
complementary.
4. Discrete Probability Distribution-- The binomial random
variable represents the count of successes, making it a discrete
distribution.
5. Two Possible Outcomes-- Each trial has only two possible
outcomes - success or failure.
Probability Mass Function (PMF):
The probability mass function for the binomial distribution, denoted as P(X=k),
gives the probability of obtaining exactly k successes in n trials. It can be
calculated using the formula:
P(X = k) = C(n, k) × p^k × (1 − p)^(n − k)
Where:
n is the number of trials,
p is the probability of success in each trial,
k is the number of successes, and
C(n, k) represents the binomial coefficient, which is the number of ways to choose k
successes out of n trials.
Cumulative Distribution Function (CDF)
The cumulative distribution function for the binomial distribution gives
the probability of having at most k successes in n trials. It is obtained
by summing the probabilities of all possible outcomes up to k:
P(X ≤ k) = Σ (i = 0 to k) C(n, i) × p^i × (1 − p)^(n − i)
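The CDF is just a running sum of the PMF, which a short Python sketch makes concrete (the helper names are our own):

```python
from math import comb

def binomial_pmf(n, k, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

def binomial_cdf(n, k, p):
    # P(X <= k): sum the PMF over 0, 1, ..., k
    return sum(binomial_pmf(n, i, p) for i in range(k + 1))

# Probability of at most 3 heads in 10 tosses with p = 0.3
print(round(binomial_cdf(10, 3, 0.3), 4))  # 0.6496
```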
Mean and Variance
The mean (μ) and variance (σ²) of a binomial distribution are given by
the formulas:
μ = n ⋅ p
σ² = n ⋅ p ⋅ (1−p)
Example
In a survey conducted among 200 people, it was
found that 120 people prefer tea over coffee. Calculate
the mean and variance of the preference for tea
assuming that preference follows a binomial distribution.
Solution
Given:
Number of trials (n) = 200
Probability of success (p), i.e., preference for tea =
120/200 = 0.6 (as 120 out of 200 prefer tea)
Probability of failure (1 - p), i.e., preference for coffee = 1
- 0.6 = 0.4
The mean (μ) of a binomial distribution is given by
the formula: μ = n ⋅ p
Substitute the given values: μ = 200 ⋅ 0.6
μ = 120
The variance (σ²) of a binomial distribution is given
by the formula: σ² = n ⋅ p ⋅ (1 − p)
Substitute the given values: σ² = 200 ⋅ 0.6 ⋅ (1 − 0.6)
σ² = 200 ⋅ 0.6 ⋅ 0.4
σ² = 48
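As a sanity check, the shortcut formulas μ = n⋅p and σ² = n⋅p⋅(1−p) agree with computing the mean and variance directly from the full PMF (a Python sketch, standard library only):

```python
from math import comb

n, p = 200, 0.6

# Full binomial PMF over all possible counts 0..n
pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

mean = sum(k * q for k, q in enumerate(pmf))
var = sum((k - mean) ** 2 * q for k, q in enumerate(pmf))
print(round(mean, 4), round(var, 4))  # 120.0 48.0
```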
POISSON
The Poisson distribution is a probability distribution used to model the
number of events occurring in a fixed interval of time or space, given
the average rate of occurrence. This report provides a comprehensive
overview of the Poisson distribution, including its definition, properties,
applications, and analytical techniques.
Definition
The Poisson distribution describes the probability of a given number of
events occurring in a fixed interval of time or space, under the
assumption that the events occur independently and at a constant
average rate. It is named after the French mathematician Siméon Denis
Poisson, who first introduced it in the early 19th century.
Properties of Poisson Distribution
1. Fixed Interval: The Poisson distribution applies to a fixed interval of
time or space, such as a minute, an hour, a mile, etc.
2. Independent Events: The occurrence of events within the interval is
assumed to be independent of each other.
3. Constant Average Rate (λ): The average rate of occurrence of events
(λ) remains constant throughout the interval.
4. Discrete Probability Distribution: The Poisson distribution is discrete,
with probabilities assigned to a countable number of outcomes (non-
negative integers).
Probability Mass Function (PMF)
The probability mass function for the Poisson distribution, denoted as
P(X=k), gives the probability of observing exactly k events in the fixed
interval. It can be expressed as:
P(X = k) = (λ^k × e^(−λ)) / k!
Where:
λ is the average rate of occurrence of events,
k is the number of events observed,
e is the base of the natural logarithm, and
k! represents the factorial of k.
Mean and Variance
The mean (μ) and variance (σ²) of a Poisson distribution are
both equal to the average rate (λ):
μ = λ
σ² = λ
EXAMPLE
A call center receives an average of 10 calls per hour. What is the
probability that the call center receives exactly 8 calls in the next hour,
assuming the number of calls follows a Poisson distribution?
SOLUTION:
Given:
Average rate of calls (λ) = 10 calls per hour
Number of calls observed (k) = 8
We can use the Poisson probability mass function (PMF) to calculate the
probability of observing exactly 8 calls in the next hour:
Substituting the given values:
P(X = 8) = (10^8 × e^(−10)) / 8!
Evaluating this expression with a calculator or software gives
P(X = 8) ≈ 0.1126
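The same arithmetic in Python, using exp and factorial from the standard library (the function name poisson_pmf is our own):

```python
from math import exp, factorial

def poisson_pmf(lam, k):
    # P(X = k) = lambda^k * e^(-lambda) / k!
    return lam**k * exp(-lam) / factorial(k)

# Average of 10 calls per hour; probability of exactly 8 calls
print(round(poisson_pmf(10, 8), 4))  # 0.1126
```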
RVs
Random Variables (RVs) are a fundamental concept in
probability theory and statistics, serving as a key tool for
modeling uncertainty and analyzing probabilistic phenomena.
This report provides a comprehensive overview of random
variables, including their definition, types, properties, and
applications in various fields.
A random variable is a numerical quantity whose value is
determined by the outcome of a random experiment. It assigns
a real number to each outcome of a probability space, thereby
providing a quantitative representation of uncertainty.
Two Types of Random Variables
1. Discrete Random Variables-- Discrete random variables
take on a countable number of distinct values.
2. Continuous Random Variables-- Continuous random
variables can take any value within a certain range or interval.
Properties of Random Variables
1. Probability Distribution-- Each random variable has an
associated probability distribution that describes the likelihood
of different outcomes.
2. Expected Value-- The expected value (or mean) of a random
variable represents the average value it would take over an
infinite number of trials.
3. Variance-- The variance of a random variable measures the
spread or dispersion of its values around the mean.
4. Moments-- Higher moments, such as skewness and kurtosis,
provide additional information about the shape and symmetry
of the distribution.
5. Cumulative Distribution Function (CDF)--The CDF of a
random variable gives the probability that the variable takes on
a value less than or equal to a given value.
EXAMPLE
Suppose you roll a fair six-sided die. Let X represent the random variable
corresponding to the number rolled. Determine the expected value and
variance of X.
SOLUTION
Given:
The random variable X represents the outcome of rolling a fair six-sided
die.
The possible outcomes of X are the numbers 1 through 6, each with
probability 1/6 (assuming the die is fair).
To find the expected value E[X], we use the formula:
E[X] = Σ xᵢ × P(X = xᵢ)
where xᵢ represents each possible outcome and P(X = xᵢ) is the
probability of obtaining outcome xᵢ. Substituting:
E[X] = (1 + 2 + 3 + 4 + 5 + 6) × (1/6) = 21/6 = 3.5
To find the variance Var(X) of X, we use the formula:
Var(X) = E[X²] − (E[X])²
Now, we find Var(X):
E[X²] = (1² + 2² + 3² + 4² + 5² + 6²) × (1/6) = 91/6 ≈ 15.1667
Var(X) = 91/6 − (3.5)² = 35/12 ≈ 2.9167
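The die calculation, spelled out in Python:

```python
# Fair six-sided die: outcomes 1..6, each with probability 1/6
outcomes = range(1, 7)
p = 1 / 6

ex = sum(x * p for x in outcomes)        # E[X]
ex2 = sum(x * x * p for x in outcomes)   # E[X^2]
var = ex2 - ex ** 2                      # Var(X) = E[X^2] - (E[X])^2

print(round(ex, 4), round(var, 4))  # 3.5 2.9167
```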
EXERCISES
1. In a multiple-choice test, each question has four possible answers,
labeled A, B, C, and D. A student randomly guesses the answers to
five questions. What is the probability that the student gets exactly
three correct answers?
2. In a survey conducted among a group of people, it was found that
the probability of a randomly selected person subscribing to a
particular magazine is 0.25. If a sample of 10 people is randomly
selected, what is the probability that exactly 3 of them subscribe to the
magazine?
3. Suppose you are conducting a quality control inspection on a
production line where each item produced either passes
(success) or fails (failure) based on certain criteria. The
probability of any item passing inspection is p=0.85. Calculate
the mean and variance of the number of items that pass
inspection out of a sample of 100 items.
4. Suppose you have a biased coin that lands on heads with a
probability p=0.6. You flip this coin 10 times. Calculate the
probability of getting at least 7 heads using the Bernoulli
process.
5. In a manufacturing plant, machine failures occur at an average rate of
3 times per day, following a Poisson distribution. What is the probability
that there will be no machine failures tomorrow?
6. Suppose you have a six-sided fair die (with faces numbered 1 through
6). Let X be a random variable representing the outcome of a single roll
of this die. Now, let Y be another random variable defined as Y=2X+1.
Determine the probability mass function (PMF) of Y.
ANSWERS
1. P(X=3) = C(5, 3) × (0.25)^3 × (0.75)^2 = 10 × 0.015625 × 0.5625 ≈ 0.0879
2. P(X=3) = C(10, 3) × (0.25)^3 × (0.75)^7 = 120 × 0.015625 × 0.1335 ≈ 0.2503
3. The number of items passing inspection out of n = 100 follows a binomial
distribution with p = 0.85, so we use the binomial mean and variance:
Mean: E(X) = n ⋅ p = 100 × 0.85 = 85
Variance: Var(X) = n ⋅ p ⋅ (1 − p) = 100 × 0.85 × 0.15 = 12.75
So, on average 85 of the 100 items pass inspection, with variance 12.75.
4. Given p = 0.6, n = 10, and we want to find the probability of getting at least k = 7 heads.
P(X≥7) = P(X=7) + P(X=8) + P(X=9) + P(X=10)
Using the binomial distribution formula P(X=k) = C(10, k) × (0.6)^k × (0.4)^(10−k),
calculate each term:
P(X=7) = 120 × (0.6)^7 × (0.4)^3 ≈ 0.2150
P(X=8) = 45 × (0.6)^8 × (0.4)^2 ≈ 0.1209
P(X=9) = 10 × (0.6)^9 × (0.4)^1 ≈ 0.0403
P(X=10) = (0.6)^10 ≈ 0.0060
Finally, summing these probabilities: P(X≥7) ≈ 0.3823
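The four terms and their sum can be checked in Python (the helper name binomial_pmf is our own):

```python
from math import comb

def binomial_pmf(n, k, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

# P(X >= 7) for n = 10 flips with p = 0.6: sum P(X=7) through P(X=10)
p_at_least_7 = sum(binomial_pmf(10, k, 0.6) for k in range(7, 11))
print(round(p_at_least_7, 4))  # 0.3823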
5. P(X=0) = (3^0 × e^(−3)) / 0! = e^(−3) ≈ 0.0498
6. When X=1, Y=2(1)+1=3
When X=2, Y=2(2)+1=5
When X=3, Y=2(3)+1=7
When X=4, Y=2(4)+1=9
When X=5, Y=2(5)+1=11
When X=6, Y=2(6)+1=13
Since each outcome of X is equally likely (because we have a fair die),
the probability of each outcome is 1/6. Therefore, the PMF of Y is:
P(Y = y) = 1/6 for y ∈ {3, 5, 7, 9, 11, 13}, and 0 otherwise.
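The mapping from X to Y and the resulting PMF, sketched in Python with exact fractions:

```python
from fractions import Fraction

# Y = 2X + 1 for a fair six-sided die X: each value of Y inherits probability 1/6
pmf_y = {2 * x + 1: Fraction(1, 6) for x in range(1, 7)}

print(sorted(pmf_y))        # [3, 5, 7, 9, 11, 13]
print(sum(pmf_y.values()))  # 1 (probabilities sum to one)
```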