For any help regarding Statistics Exam Help, visit https://www.statisticsexamhelp.com/, email support@statisticsexamhelp.com, or call us at +1 678 648 4277.
Problem 1: Multiple Choice Questions: There is only one correct answer for each question listed
below; please clearly indicate the correct answer. There will be no partial credit given for
multiple-choice questions, so explanations will not be graded.
(I) Alice and Bob each choose a number independently and uniformly at random from the interval [0,
2]. Consider the following events:
A : The absolute difference between the two numbers is greater than 1/4.
B : Alice’s number is greater than 1/4.
(II) There are m red balls and n white balls in an urn. We draw two balls simultaneously and at
random. What is the probability that the balls are of different color?
(III) 20 black pebbles are arranged in 4 rows of 5 pebbles each. We choose 4 of these pebbles at
random and color them red. What is the probability that all the red pebbles lie in different rows?
(IV) We have two light bulbs, A and B. Bulb A has an exponentially distributed lifetime with mean
lifetime 4 days. Bulb B has an exponentially distributed lifetime with mean lifetime 6 days. We
select one of the two bulbs at random; each bulb is equally likely to be chosen. Given that the bulb
we selected is still working after 12 hours, what is the probability that we selected bulb A?
(V) A test for some rare disease is assumed to be correct 95% of the time: if a person has the
disease, the test results are positive with probability 0.95, and if the person does not have the
disease, the test results are negative with probability 0.95. A random person drawn from a certain
population has probability 0.001 of having the disease. Given that the person just tested positive,
what is the probability of having the disease?
(VI) Let X be a random variable with
(VII) X and Y are independent random variables, with
(VIII) A number p is drawn from the interval [0, 1] according to the uniform distribution, and then a
sequence of independent Bernoulli trials is performed, each with success probability p. What is the
variance of the number of successes in k trials? Note k is a deterministic number.
(IX) A police radar always over-estimates the speed of incoming cars by an amount that is uniformly
distributed between 0 and 5 mph. Assume that car speeds are uniformly distributed between 60 and
75 mph and are independent of the radar over-estimate. If the radar measures a speed of 76 mph,
what is the least squares estimate of the actual car speed?
(a) 72.5 mph
(b) 73.0 mph
(c) 73.5 mph
(d) 72.9 mph
(X) You are visiting the Killian rainforest when your insect repellent runs out. This forest is infested
with lethal mosquitoes, whose second bite will kill any human instantaneously. Assume mosquitoes
land on your back and deliver a vicious bite according to a Bernoulli process. Also assume the expected
time until the first bite is 10 seconds. If you arrive at the forest at t = 0, what is the probability that you
die at exactly t = 10 seconds?
(XI) In order to estimate p, the fraction of people who will vote for George Bush in the next election,
you conduct a poll of n people drawn randomly and independently from the population. Your
estimator Mn is obtained by dividing Sn, the number of people who vote for Bush in your sample, by
n, i.e., Mn = Sn/n. Find the smallest value of n, the number of people you must poll, for which the
Chebyshev inequality yields a guarantee that
(XII) On each day of the year, it rains with probability 0.1, independent of every other day. Use the
Central Limit Theorem (CLT) (with no half step) to approximate the probability that, out of the 365
days in the year, it will rain on at least 100 days. In the answers below, Φ denotes the standard
normal CDF.
Problem 2: Please write all work for Problem 2 in your first blue book. No work recorded below will be
graded.
You take a safari trip to the Porobilati game reserve. A highlight of the game reserve is the Poseni river,
where one can watch deer and elephants coming to drink water. Deer come to the river according to a
Poisson process with arrival rate λd = 8 per hour; elephants come to the river according to an
independent Poisson process with arrival rate λe = 2 per hour. On the first day of your safari, you
reach the Poseni river early in the morning hoping to see some elephants. Assume that deer and
elephants are the only kinds of animals that visit this river.
(a) Let N be the number of animals you see during the first 3 hours. Find E[N] and var(N).
(b) What’s the probability of seeing your 3rd elephant before your 9th deer?
(c) At the end of 3 hours, you have seen 24 deer but no elephants. What is the probability that you
will see an elephant in the next hour?
(d) It is now 6 hours since you started watching. You have seen 53 deer but there is still no sight of an
elephant. How many more deer do you expect to see before you see your first elephant?
(e) Unfortunately, you forgot your camera in your lodge the first day. You come back with your camera
the next day and plan to stay at the river until you've clicked a picture of both a deer and an elephant.
Assume you're always able to click a picture as soon as an animal arrives. How long do you expect to
stay?
(f) Your friend, a wildlife photographer, is fascinated by the stories of your trip, and decides to take the
safari herself. All excited, she arrives at the Poseni river, and is prepared to stay as long as required in
order to get lots of beautiful pictures. Unfortunately, she dropped her camera in a bush on the way, as
a result of which the camera is not functioning properly; each time she clicks, there is a probability 0.1
that the camera will produce a faulty picture, independently of everything else. Assume she clicks one
picture each time an elephant or deer arrives at the river, as soon as it arrives. Let X be the time (in
hours) until she gets her third successful picture of an elephant. Find the PDF of X.
Problem 3: Please write all work for Problem 3 in your second blue book. No work recorded below will
be graded. Each question is equally weighted at 4 points.
The MIT football team's performance in any given game is correlated with its morale. If the team has
won the past two games, then it has a 0.7 probability of winning the next game. If it lost the last game
but won the one before that, it has a 0.4 probability of winning. If it won its last game but lost the one
before that, it has a 0.6 probability of winning. Finally, if it lost the last two games, it has only a 0.3
probability of winning the next game. No game can end in a draw. Consider a starting time when
the team has won its preceding two games.
(a) Graph the minimum state Markov chain model for the MIT football team’s performance. Be sure
to label all transition probabilities, and clearly indicate what state of the world each Markov state
represents. You may find it convenient for the below questions to label each state with an integer.
(b) Find the probability that the first future loss will be followed by another loss.
(c) Let X be the number of games played up to, but not including the first loss. Find the PMF of X.
(d) Either evaluate the steady-state probabilities or explain why they do not exist.
(e) Find a good approximation to the probability that the team will win its 1000th game, given that
the outcomes of games 1000 and 1001 are the same. Clearly state any assumptions you use in the
approximation.
(f) Let T be the number of games up to and including the team’s 2nd consecutive loss (i.e. 2 losses in
a row). Write a system of equations that can be used to find E[T].
(g) The coach decides that he will withdraw the team from the competition immediately after the
team's 3rd consecutive loss (i.e., three losses in a row). Let N be the number of games that the team
will play in the competition. Write a system of equations that can be used to find E[N].
Problem 1
Multiple Choice Questions: There is only one correct answer for each question listed below; please
clearly indicate the correct answer.
(1) D Alice's and Bob's choices can be illustrated in the following figure. Event A (the absolute
difference of the two numbers is greater than 1/4) corresponds to the area shaded with vertical lines.
Event B (Alice's number is greater than 1/4) corresponds to the area shaded with horizontal lines.
Therefore, the event A ∩ B corresponds to the doubly shaded area. Since both choices are uniformly
distributed between 0 and 2, we have that
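The area computation itself is in the figure; as an illustrative cross-check that is not part of the original solution, a quick Monte Carlo estimate of P(A ∩ B) under the stated model:

```python
import random

# Monte Carlo check of P(A and B) for Alice, Bob ~ Uniform[0, 2], independent.
# A: |Alice - Bob| > 1/4,  B: Alice > 1/4.
random.seed(0)
N = 1_000_000
hits = 0
for _ in range(N):
    alice = random.uniform(0, 2)
    bob = random.uniform(0, 2)
    if abs(alice - bob) > 0.25 and alice > 0.25:
        hits += 1
print("Estimated P(A and B):", hits / N)
```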
(2) B
Let B1 be the first ball drawn and B2 the second ball (you may think of this as drawing the two balls
one with each hand, then looking first at the ball in the left hand and then at the ball in the right hand).
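The slide's own expression is not reproduced here; the standard closed form for this probability is 2mn / ((m + n)(m + n − 1)). The sketch below, added as an illustrative check, compares that formula against brute-force enumeration of unordered pairs for a few sample values of m and n.

```python
from itertools import combinations

def prob_different_colors(m, n):
    """Closed form: probability that 2 balls drawn without replacement differ in color."""
    return 2 * m * n / ((m + n) * (m + n - 1))

def prob_by_enumeration(m, n):
    """Count unordered pairs of distinct balls whose colors differ."""
    balls = ["R"] * m + ["W"] * n
    pairs = list(combinations(range(m + n), 2))
    diff = sum(1 for i, j in pairs if balls[i] != balls[j])
    return diff / len(pairs)

for m, n in [(3, 5), (7, 2), (10, 10)]:
    print(m, n, prob_different_colors(m, n), prob_by_enumeration(m, n))
```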
(3) C
Without any constraint, there are 20 × 19 × 18 × 17 possible ways to pick 4 pebbles out of 20,
counting the order in which they are picked. Now consider the constraint that the 4 picked pebbles
lie in different rows. As illustrated by the following figure, there are 20 possible positions for the first
pebble, 15 for the second, 10 for the third, and finally 5 for the last pebble, giving a total of
20 × 15 × 10 × 5 ordered ways. With the uniform discrete probability law, the probability of picking
4 pebbles from different rows is
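As a cross-check that is not part of the original solution, the same probability can be computed both from the counting argument above and by brute-force enumeration of all unordered choices of 4 pebbles.

```python
from itertools import combinations

# Label each of the 20 pebbles by its row (4 rows of 5 pebbles).
rows = [r for r in range(4) for _ in range(5)]

# Counting argument from the solution, using ordered picks.
p_counting = (20 * 15 * 10 * 5) / (20 * 19 * 18 * 17)

# Brute force over all unordered choices of 4 pebbles.
choices = list(combinations(range(20), 4))
favorable = sum(1 for c in choices if len({rows[i] for i in c}) == 4)
p_bruteforce = favorable / len(choices)

print(p_counting, p_bruteforce)  # both should be about 0.129
```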
(4) A
Let XA and XB denote the lifetimes of bulb A and bulb B, respectively. Clearly, XA is exponentially
distributed with parameter λA = 1/4 and XB is exponentially distributed with parameter λB = 1/6
(in days). Let E be the event that we have selected bulb A, and let X denote the lifetime of the
selected bulb. Using Bayes' rule, we have that
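As a numerical sketch of that Bayes' rule computation (added here for illustration, with the 12-hour conditioning time expressed as 0.5 days):

```python
import math

lam_A = 1 / 4   # rate of bulb A (mean lifetime 4 days)
lam_B = 1 / 6   # rate of bulb B (mean lifetime 6 days)
t = 0.5         # 12 hours expressed in days

# P(still working after t | bulb) = exp(-lambda * t) for an exponential lifetime.
num = 0.5 * math.exp(-lam_A * t)
den = 0.5 * math.exp(-lam_A * t) + 0.5 * math.exp(-lam_B * t)
print("P(selected bulb A | working after 12 h) =", num / den)
```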
(7) B
Clearly, X is exponentially distributed with parameter λ = 3, so its transform is MX(s) = 3/(3 − s).
The transform of Y follows directly from its PMF: MY(s) = (1/2)e^s + 1/2. Since X and Y are
independent, the transform of X + Y is just the product of the transform of X and the transform of Y.
Thus,
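Reading MY(s) above as the transform of a random variable uniform on {0, 1} (an interpretation added here, consistent with that transform), the product formula can be sanity-checked by simulation; an illustrative sketch:

```python
import math
import random

def mgf_product(s):
    # M_X(s) * M_Y(s) with X ~ Exp(3) and Y uniform on {0, 1}.
    return (3 / (3 - s)) * (0.5 * math.exp(s) + 0.5)

random.seed(1)
samples = [random.expovariate(3) + random.choice([0, 1]) for _ in range(200_000)]
for s in (0.25, 0.5, 1.0):   # values well below 3, so the transform exists
    est = sum(math.exp(s * z) for z in samples) / len(samples)
    print(s, est, mgf_product(s))
```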
(8) C
Let P be the random variable for the value drawn according to the uniform distribution on the interval
[0, 1], and let X be the number of successes in k trials. Given P = p, X is a binomial random variable
with parameters k and p.
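By the law of total variance, var(X) = E[var(X | P)] + var(E[X | P]) = E[kP(1 − P)] + var(kP) = k/6 + k²/12; this derivation is added here rather than quoted from the slide. A simulation check for one value of k:

```python
import random

k = 10
N = 200_000
random.seed(2)

counts = []
for _ in range(N):
    p = random.random()                                        # P ~ Uniform[0, 1]
    counts.append(sum(random.random() < p for _ in range(k)))  # X | P=p ~ Binomial(k, p)

mean = sum(counts) / N
var = sum((c - mean) ** 2 for c in counts) / N
print("simulated var:", var)
print("k/6 + k^2/12 :", k / 6 + k ** 2 / 12)
```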
(9) B
Let X be the car speed and let Y be the radar's measurement. Then, the joint PDF of X and Y is uniform
over the pairs (x, y) such that x ∈ [60, 75] and x ≤ y ≤ x + 5. Therefore, the least squares estimator
of X given Y = y is
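The conditional expectation can also be estimated by simulating the stated model and conditioning on measurements near 76 mph; an illustrative sketch, not part of the original solution:

```python
import random

random.seed(3)
N = 2_000_000
num, cnt = 0.0, 0
for _ in range(N):
    x = random.uniform(60, 75)      # true car speed
    y = x + random.uniform(0, 5)    # radar measurement (over-estimate in [0, 5])
    if abs(y - 76) < 0.05:          # condition (approximately) on Y = 76
        num += x
        cnt += 1
print("E[X | Y = 76] ~", num / cnt)   # should be close to 73
```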
(11) B
Let Xi be the indicator random variable for the ith person's vote, as follows:
(12) D
Let Xi be the indicator random variable for the ith day being rainy, as follows:
We have that µ = E[Xi] = 0.1 and σ² = var(Xi) = 0.1 · (1 − 0.1) = 0.09. Then the number of rainy days in
a year is S365 = X1 + · · · + X365, which is distributed as a binomial with n = 365 and p = 0.1. Using the
central limit theorem, we have that
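The resulting approximation is 1 − Φ((100 − 36.5)/√32.85). The sketch below evaluates it (using math.erfc for the far tail) and, for comparison, the exact binomial tail; both are vanishingly small. The comparison is added here and is not part of the original solution.

```python
import math

n, p = 365, 0.1
mu = n * p                             # 36.5
sigma = math.sqrt(n * p * (1 - p))     # sqrt(32.85)

z = (100 - mu) / sigma
clt_approx = 0.5 * math.erfc(z / math.sqrt(2))   # 1 - Phi(z), written to avoid cancellation

# Exact tail P(S >= 100) for S ~ Binomial(365, 0.1), via the PMF.
exact = sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(100, n + 1))

print("CLT approximation :", clt_approx)
print("exact binomial tail:", exact)
```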
Problem 2
(Each question is equally weighted at 6 points.) The deer process is a Poisson process with arrival
rate 8 (λD = 8). The elephant process is a Poisson process with arrival rate 2 (λE = 2).
(a) 30
Since deer and elephants are the only animals that visit the river, the animal process is the merged
process of the deer process and the elephant process. Since the deer process and the elephant
process are independent Poisson processes with λD = 8 and λE = 2 respectively, the animal process
is also a Poisson process with λA = λD + λE = 10. Therefore, the number of animals arriving in the
first three hours is distributed according to a Poisson distribution with parameter λA · 3 = 30.
Therefore,
E[number of animals arriving in three hours] = λA · 3 = 30 ,
var(number of animals arriving in three hours) = λA · 3 = 30 .
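A short simulation of the merged process (illustrative only) confirms that the mean and variance of the 3-hour count are both about 30:

```python
import random

def poisson_count(rate, duration):
    """Count arrivals of a Poisson process by summing exponential interarrival times."""
    t, count = 0.0, 0
    while True:
        t += random.expovariate(rate)
        if t > duration:
            return count
        count += 1

random.seed(4)
totals = [poisson_count(8, 3) + poisson_count(2, 3) for _ in range(50_000)]
mean = sum(totals) / len(totals)
var = sum((x - mean) ** 2 for x in totals) / len(totals)
print("mean ~", mean, " var ~", var)   # both should be close to 30
```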
(b) Let's now focus on the animal arrivals themselves and ignore the inter-arrival times. Each animal
arrival is either a deer arrival or an elephant arrival. Let Ek denote the event that the kth animal
arrival is an elephant. It is explained in the book (Example 5.14, Page 295) that P(Ek) = λE/(λD + λE)
and that the events E1, E2, . . . are independent. Regarding each elephant arrival as a "success", the
sequence of animal arrivals therefore forms a Bernoulli process with parameter p = 1/5. Clearly,
observing the 3rd elephant before the 9th deer is equivalent to the 3rd elephant arriving at or before
trial 11 (i.e., before trial 12) in this Bernoulli process. Let T3 denote the 3rd elephant arrival time in
the Bernoulli process.
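Equivalently, the event is that at least 3 of the first 11 animal arrivals are elephants, a Binomial(11, 1/5) tail; the sketch below computes it and cross-checks by simulating the two Poisson processes directly. The cross-check is added here, not part of the original solution.

```python
import math
import random

p = 2 / (8 + 2)   # probability that an animal arrival is an elephant

# P(at least 3 elephants among the first 11 animal arrivals).
binom_tail = sum(math.comb(11, k) * p**k * (1 - p)**(11 - k) for k in range(3, 12))

# Simulation directly on the two independent Poisson processes.
random.seed(5)
wins, trials = 0, 200_000
for _ in range(trials):
    eleph_3rd = sum(random.expovariate(2) for _ in range(3))   # 3rd elephant arrival time
    deer_9th = sum(random.expovariate(8) for _ in range(9))    # 9th deer arrival time
    wins += eleph_3rd < deer_9th
print(binom_tail, wins / trials)
```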
(c) Since the deer arrival process is independent of the elephant arrival process, the event "24 deer
arrive by the end of 3 hours" is independent of any event defined on the elephant arrival process.
Let T1 denote the first elephant arrival time. Then the desired probability can be expressed as
Step (3) follows from the memoryless property of the Poisson process.
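Reconstructed from the argument above (and added here as a check rather than quoted from the slide), the desired probability is P(T1 ≤ 4 | T1 > 3) = 1 − e^(−2). A brute-force conditional simulation agrees:

```python
import math
import random

lam_e = 2
exact = 1 - math.exp(-lam_e * 1)   # memoryless answer: elephant within the next hour

# Wasteful but direct check: condition on no elephant during the first 3 hours.
random.seed(6)
n_cond = n_hit = 0
for _ in range(2_000_000):
    t = random.expovariate(lam_e)   # first elephant arrival time
    if t > 3:
        n_cond += 1
        n_hit += t <= 4
print(exact, n_hit / n_cond)
```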
(d) 4
As in part (b), regarding each elephant arrival as a "success", the animal arrivals form a Bernoulli
process with parameter p = 1/5. Owing to the memoryless property, given that there was no
"success" in the first 53 trials (53 deer seen but no elephant), the number of trials X1 up to and
including the first "success" is still a geometric random variable with parameter p = 1/5. Since the
number of "failures" (deer) before the first "success" (elephant) is X1 − 1, the expected number of
further deer is E[X1 − 1] = 1/p − 1 = 5 − 1 = 4.
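A short simulation of the geometric count (using the memorylessness argument above; not part of the original solution) confirms the value 4:

```python
import random

random.seed(7)
p_eleph = 0.2   # each animal arrival is independently an elephant with probability 1/5

# By memorylessness the 53 observed deer are irrelevant: count further deer
# before the first elephant, i.e. the failures before the first success.
runs, total = 200_000, 0
for _ in range(runs):
    deer = 0
    while random.random() >= p_eleph:
        deer += 1
    total += deer
print("simulated E[further deer] ~", total / runs)   # analytic value: 1/p - 1 = 4
```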
(e)
Let TD (resp. TE) denote the first arrival time of the deer process (resp. the elephant process). The
time S until you have clicked a picture of both a deer and an elephant is equal to max{TD, TE}. We
split S into two parts,
where S1 = min{TD, TE} is the first arrival time of an animal and S2 = max{TD, TE} − min{TD, TE} is the
additional time until the other type of animal also registers its first arrival. Since the animal arrival
process is Poisson with rate 10,
Concerning S2, there are two cases.
(1) The first arrival is a deer, which happens with probability λD/(λD + λE). Then we wait for an
elephant arrival, which takes 1/λE hours on average.
(2) The first arrival is an elephant, which happens with probability λE/(λD + λE). Then we wait for a
deer arrival, which takes 1/λD hours on average.
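Combining the two cases with E[S1] = 1/10 gives E[S] = 1/10 + 0.8 · (1/2) + 0.2 · (1/8) = 0.525 hours, i.e., 31.5 minutes; this arithmetic is reconstructed here rather than quoted from the slide. A quick numerical cross-check:

```python
import random

lam_d, lam_e = 8, 2

# E[S] = E[min] + E[extra wait], following the two cases described above.
analytic = 1 / (lam_d + lam_e) \
           + (lam_d / (lam_d + lam_e)) * (1 / lam_e) \
           + (lam_e / (lam_d + lam_e)) * (1 / lam_d)

random.seed(8)
runs, total = 200_000, 0.0
for _ in range(runs):
    t_deer = random.expovariate(lam_d)
    t_eleph = random.expovariate(lam_e)
    total += max(t_deer, t_eleph)
print("analytic:", analytic, "hours   simulated:", total / runs)
```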
(f) X denotes the time until the third successful picture of an elephant. Since the deer arrivals are
independent of the elephant arrivals and X depends only on the elephant arrivals, we can focus on
the elephant arrival process. We split the elephant arrival process into two processes: one composed
of the successful elephant pictures (we call this the "successful elephant picture arrival process") and
the other composed of the unsuccessful elephant pictures. Since the quality of each picture is
independent of everything else, the successful elephant picture arrival process is a Poisson process
with parameter λSE = 0.9 · λE = 1.8 per hour. Therefore, X is the 3rd arrival time of the successful
elephant picture arrival process. Thus X has the Erlang distribution of order 3,
fX(x) = λSE³ x² e^(−λSE x) / 2 = (1.8)³ x² e^(−1.8x) / 2 , for x ≥ 0.
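As an illustrative check, not part of the original solution, the sketch below simulates the thinned elephant process and compares the sample mean and variance of X with the Erlang(3, 1.8) values 3/1.8 and 3/1.8².

```python
import random

random.seed(9)
lam_e, p_good = 2.0, 0.9
runs = 100_000
times = []
for _ in range(runs):
    t, good = 0.0, 0
    while good < 3:
        t += random.expovariate(lam_e)   # next elephant arrival
        if random.random() < p_good:     # picture succeeds with probability 0.9
            good += 1
    times.append(t)

mean = sum(times) / runs
var = sum((x - mean) ** 2 for x in times) / runs
lam_se = p_good * lam_e                  # 1.8 per hour
print("simulated mean/var:", mean, var)
print("Erlang(3, 1.8)    :", 3 / lam_se, 3 / lam_se**2)
```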
Problem 3
(a) The performance of the MIT football team is described by a Markov chain, where the state is
taken to be
(result of the game before the last game, result of the last game).
There are four possible states: {WW, WL, LW, LL}. The corresponding transition probability graph is
as follows. Throughout this problem we use the following labels for the four states: 1 = WW, 2 = WL,
3 = LW, 4 = LL.
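With these labels, the transition probabilities stated in the problem can be encoded directly; the sketch below (added here for use in the later parts, not part of the original solution) writes down the matrix and checks that each row sums to 1.

```python
# States: 0 -> WW, 1 -> WL, 2 -> LW, 3 -> LL (0-based indices for states 1..4).
# Row i gives the probabilities of the next state after the next game.
P = [
    [0.7, 0.3, 0.0, 0.0],   # WW: win -> WW, lose -> WL
    [0.0, 0.0, 0.4, 0.6],   # WL: win -> LW, lose -> LL
    [0.6, 0.4, 0.0, 0.0],   # LW: win -> WW, lose -> WL
    [0.0, 0.0, 0.3, 0.7],   # LL: win -> LW, lose -> LL
]
for row in P:
    assert abs(sum(row) - 1.0) < 1e-12
```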
(b) 0.6 When the team's winning streak gets interrupted, the chain is in state 2 (WL). The probability
of losing the next game from state 2 is 0.6.
An alternative, more computation-intensive method: denote the starting time by n = 1 and let Xn
denote the result of the nth game. Since the team starts having won its previous two games, we
set X0 = X−1 = W. Then,
P[the first future loss is followed by another loss | X0 = X−1 = W] = Σ (n = 1 to +∞) P[the first future
loss occurs at time n and another loss occurs at time n + 1 | X0 = X−1 = W].
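Both the direct argument and the series above can be checked by simulation; the sketch below (illustrative only, using the transition probabilities read off above) plays out the chain from state WW, finds the first loss, and records whether the next game is also lost.

```python
import random

# Win probability given the results (before_last, last) of the previous two games.
WIN = {("W", "W"): 0.7, ("W", "L"): 0.4, ("L", "W"): 0.6, ("L", "L"): 0.3}

random.seed(10)
runs, hits = 200_000, 0
for _ in range(runs):
    state = ("W", "W")                       # the team has won its last two games
    while True:                              # play until the first loss
        result = "W" if random.random() < WIN[state] else "L"
        state = (state[1], result)
        if result == "L":
            break
    # one more game: is the first loss followed by another loss?
    result = "W" if random.random() < WIN[state] else "L"
    hits += result == "L"
print("P(first future loss followed by another loss) ~", hits / runs)   # about 0.6
```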
(c) Clearly, X, the number of games played before the first loss, takes values in {0, 1, . . .}. Using the
same notation as in (b), the PMF of X can be calculated as follows.
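From the chain, starting in WW, each game is won with probability 0.7 (remaining in WW) until the first loss, so the PMF works out to P(X = k) = (0.7)^k · 0.3 for k = 0, 1, 2, . . .; this closed form is derived here rather than quoted from the slide. A simulation cross-check:

```python
import random

random.seed(11)
runs = 300_000
counts = {}
for _ in range(runs):
    k = 0
    while random.random() < 0.7:   # win (stay in state WW) with probability 0.7
        k += 1
    counts[k] = counts.get(k, 0) + 1

for k in range(6):
    print(k, counts.get(k, 0) / runs, 0.7**k * 0.3)
```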
(d) The Markov chain consists of a single aperiodic recurrent class, so the steady-state distribution
exists; denote it by π = (π1, π2, π3, π4). Solving the balance equations πP = π together with
π1 + π2 + π3 + π4 = 1 gives the desired result.
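Numerically, the balance equations form a small linear system; a minimal sketch, assuming the transition matrix written down in part (a) and that numpy is available:

```python
import numpy as np

P = np.array([
    [0.7, 0.3, 0.0, 0.0],   # WW
    [0.0, 0.0, 0.4, 0.6],   # WL
    [0.6, 0.4, 0.0, 0.0],   # LW
    [0.0, 0.0, 0.3, 0.7],   # LL
])

# Solve pi (P - I) = 0 with the normalization sum(pi) = 1: keep three balance
# equations and replace the fourth by the normalization condition.
A = np.vstack([(P.T - np.eye(4))[:-1], np.ones(4)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi = np.linalg.solve(A, b)
print(dict(zip(["WW", "WL", "LW", "LL"], pi.round(4))))
```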
(e) Recall that Xn denotes the outcome of the nth game. The desired probability is
(f) T is the first passage time from state 1 to state 4. Let µ1, µ2, µ3, µ4 be the mean first passage
times from each state to state 4; then E[T] = µ1. Since we are only concerned with the first passage
time to state 4, the Markov chain's behavior out of state 4 is irrelevant. Therefore, we work with the
modified Markov chain in which state 4 is converted to an absorbing state. The mean first passage
time to state 4 in the original Markov chain is equal to the mean absorption time to state 4 in the
modified chain. The required system of linear equations is
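Numerically, the system has the form µi = 1 + Σj pij µj for i ≠ 4 with µ4 = 0; a minimal sketch that solves it, assuming the transition probabilities from part (a):

```python
import numpy as np

# Transition probabilities among the transient states {WW, WL, LW} only
# (state LL is treated as absorbing, so transitions into it are dropped).
Q = np.array([
    [0.7, 0.3, 0.0],   # WW -> WW, WL, LW
    [0.0, 0.0, 0.4],   # WL -> WW, WL, LW   (the remaining 0.6 goes to LL)
    [0.6, 0.4, 0.0],   # LW -> WW, WL, LW
])

# mu = 1 + Q mu  =>  (I - Q) mu = 1
mu = np.linalg.solve(np.eye(3) - Q, np.ones(3))
print("E[T] = mu_WW =", mu[0])
```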
(g) Three losses in a row can only be reached from state LL. Following this observation, we create a
fifth state, LLL (labeled 5), and construct the corresponding Markov transition graph. The probability
transition matrix of this Markov chain is
The number of games played by the MIT football team corresponds to the absorption time from
state 1 to state 5. Denote the mean absorption times to state 5 from states 1 through 5 by
t1, t2, t3, t4, t5. The system of linear equations is as follows.
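Numerically, the equations are ti = 1 + Σj pij tj for i = 1, . . . , 4 with t5 = 0; a minimal sketch under the same transition-probability assumptions, with the loss transition out of LL redirected to the absorbing state LLL:

```python
import numpy as np

# Transient states: 1=WW, 2=WL, 3=LW, 4=LL; state 5 (LLL) is absorbing.
Q = np.array([
    [0.7, 0.3, 0.0, 0.0],   # WW
    [0.0, 0.0, 0.4, 0.6],   # WL
    [0.6, 0.4, 0.0, 0.0],   # LW
    [0.0, 0.0, 0.3, 0.0],   # LL: win -> LW, lose (0.7) -> LLL (absorbed)
])

# t = 1 + Q t  =>  (I - Q) t = 1, with t_LLL = 0 handled by dropping that state.
t = np.linalg.solve(np.eye(4) - Q, np.ones(4))
print("E[N] = t_WW =", t[0])
```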