2.
Quantitative Methods Quantifying Uncertainty: Basic Concepts of Probability
3.
Quotes from You and Me <ul><li>Chances of your getting a handsome job should improve if you obtain an MBA. </li></ul><ul><li>Probably, collections will jump this month. </li></ul><ul><li>Most probably, ERP will be on by June. </li></ul><ul><li>Odds are great for my promotion this time. </li></ul><ul><li>Winning a cricket match against Australia is not impossible, but it is highly improbable. </li></ul><ul><li>Defects from new machines are unlikely. </li></ul>Varsha Varde
4.
Uncertainty <ul><li>Each Statement Involves Uncertainty. </li></ul><ul><li>Chances = Odds = Likelihood = Probability </li></ul><ul><li>Real Life is Usually Full of Uncertainty. </li></ul><ul><li>Almost Nothing is for Sure. </li></ul><ul><li>There are Chances of Something Happening and Chances of Something Else Happening. </li></ul><ul><li>In Such Situations, You can’t ‘Prove’ Anything. </li></ul><ul><li>All You Can Do is to Assign a Probability to Each of the Different Possible Outcomes. </li></ul>
5.
Quotes from You and Me After This MBA <ul><li>Chances of your getting a handsome job would be 90% if you obtain an MBA. </li></ul><ul><li>I am 75% confident that collections will jump this month. </li></ul><ul><li>Odds are 80:20 for my promotion this time. </li></ul><ul><li>Winning a cricket match against Australia is not impossible, but has only a 10% chance. </li></ul><ul><li>New machines churn out good product 97 out of 100 times. </li></ul>
6.
Probability Theory <ul><li>How Do You Arrive at 90% Chances, 80:20 Odds, or 75% Confidence? </li></ul><ul><li>Probability Theory Provides Tools for Decision Makers to Quantify Uncertainties. </li></ul>
7.
Assigning Probabilities <ul><li>Classical Approach: Assumes equally likely outcomes (card games, dice games, tossing coins and the like). </li></ul><ul><li>Relative Frequency Approach: Uses relative frequencies of past occurrences as probabilities (decision problems in the area of management, e.g. delay in delivery of a product). </li></ul><ul><li>Subjective Approach: A guess based on past experience or intuition (at higher levels of managerial decision-making, for important, specific and unique decisions). </li></ul>
9.
Use Relative Frequencies <ul><li>Making use of relative frequencies of past occurrences: </li></ul><ul><li>Suppose an organisation knows from past data that about 25 out of 300 employees entering every year leave due to good opportunities elsewhere. </li></ul><ul><li>Then the organisation can predict the probability of employee turnover for this reason as 25/300 = 1/12 = 0.083. </li></ul>
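The arithmetic on this slide can be checked in a couple of lines (a sketch; the counts 25 and 300 are the slide's own figures):

```python
# Relative-frequency estimate of the employee-turnover probability.
leavers = 25     # employees who left for better opportunities (past data)
entrants = 300   # employees who entered that year

p_turnover = leavers / entrants
print(round(p_turnover, 3))  # 0.083
```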
10.
Subjective Probability <ul><li>Based on personal judgement </li></ul><ul><li>Uses an individual’s experience and familiarity with the facts </li></ul><ul><li>An expert analyst of share prices may give his judgement as follows on the price of ACC shares in the next two months: </li></ul><ul><li>20% probability of an increase by Rs 500 or more </li></ul><ul><li>60% probability of an increase by less than Rs 500 </li></ul><ul><li>20% probability of remaining unchanged </li></ul>
11.
Experiment <ul><li>Experiment: An experiment is some act, trial or operation that results in a set of possible outcomes. </li></ul><ul><li>The roll of two dice to note the sum of spots </li></ul><ul><li>The toss of a coin to see the face that turns up </li></ul><ul><li>Polling </li></ul><ul><li>Inspecting an assembly line </li></ul><ul><li>Counting arrivals at an emergency room </li></ul><ul><li>Following a diet </li></ul>
12.
Event <ul><li>Event: An event means any collection of possible outcomes when an experiment is performed. For example, </li></ul><ul><li>When an unbiased die is rolled we may get spot 1, spot 2, spot 3, spot 4, spot 5 or spot 6. Appearance of any one of the spots is an event. </li></ul><ul><li>Appearance of an even spot is also an event. </li></ul>
13.
EVENT/OUTCOME <ul><li>The roll of two dice (the sum of spots that appears) </li></ul><ul><li>The toss of a coin (the face that turns up) </li></ul><ul><li>Polling (win or lose) </li></ul><ul><li>Inspecting an assembly line (number of defectives) </li></ul><ul><li>Counting arrivals at an emergency room (number of arrivals in one hour) </li></ul><ul><li>Following a diet (weight loss or gain) </li></ul>
14.
Sample space <ul><li>Sample space: The set of all sample points (simple events) for an experiment is called a sample space; that is, the set of all possible outcomes for an experiment. </li></ul><ul><li>Venn diagram: A pictorial representation of the sample space. It is usually drawn as a rectangular figure representing the sample space and circles representing events in the sample space. </li></ul>
15.
Venn Diagram for the Roll of a Die. A: Odd spots; B: Even spots
16.
Equally Likely Events <ul><li>Equiprobable or Equally Likely Events: Events are said to be equiprobable when one does not occur more often than the others. </li></ul><ul><li>When an unbiased die is thrown any one of the six spots may appear. </li></ul><ul><li>When an unbiased coin is tossed either a head or a tail appears </li></ul>
17.
Exhaustive Events <ul><li>Exhaustive Events: Events are said to be exhaustive when they include all possible cases or outcomes. For example, in tossing of fair coin, the two events “appearance of a head” and “appearance of a tail” are exhaustive events because when a coin is tossed we would get either a head or a tail. </li></ul>
18.
Independent Events <ul><li>Independent Events: Two events A and B are said to be independent if occurrence of A does not affect and is not affected by the occurrence of B. </li></ul><ul><li>When a coin is tossed twice the result of the first toss does not affect and is not affected by the result of the second toss. Thus, the result of the first toss and the result of the second toss are independent events. </li></ul>
19.
Dependent Events <ul><li>Dependent Events: Two events A and B are called dependent if the occurrence of A affects or is affected by the occurrence of B. </li></ul><ul><li>For example, there are four kings in a pack of 52 cards. The event of drawing a king at the first draw and the event of drawing another king at the second draw when the first drawn king is not replaced, are two dependent events. In the first event there are four kings in a pack of 52 cards and in the second event there are only three kings left in the pack of remaining 51 cards </li></ul>
20.
Mutually Exclusive Events <ul><li>Events are termed mutually exclusive if they cannot occur together, so that in any one trial of an experiment at most one of the events can occur. </li></ul><ul><li>Mutually Exclusive Events: </li></ul><ul><li>“throwing even” and “throwing odd” with one die, </li></ul><ul><li>“drawing a spade,” “drawing a diamond” and “drawing a club” while drawing one card from a deck, </li></ul><ul><li>purchase of a machine out of 3 brands available. </li></ul><ul><li>Not Mutually Exclusive: </li></ul><ul><li>“drawing a spade” and “drawing a queen”, </li></ul><ul><li>“even number” and “at least 3” with one die, </li></ul><ul><li>selection of a candidate with a postgraduate qualification and over 3 years’ experience. </li></ul><ul><li>A particularly easy way to obtain two mutually exclusive events is to consider an event and its negation (complement), such as “even” and “not even,” “spade” and “not spade”, or in general ‘A’ and ‘not A’. </li></ul>
21.
Notation <ul><li>Sample space: S </li></ul><ul><li>Sample points: E1, E2, . . . etc. </li></ul><ul><li>Event: A, B, C, D, E etc. (any capital letter). </li></ul><ul><li>Venn diagram: </li></ul><ul><li>Example. </li></ul><ul><li>S = {E1, E2, . . ., E6}. </li></ul><ul><li>That is, S = {1, 2, 3, 4, 5, 6}. We may think of S as a representation of the possible outcomes of a throw of a die. </li></ul>
22.
Venn Diagram for sample space S. A: Candidates with over 3 years’ experience; B: Candidates with postgraduate qualification
23.
More definitions <ul><li>Union, Intersection and Complementation </li></ul><ul><li>Given two events A and B in a sample space S: </li></ul><ul><li>1. The union of A and B, AUB, is the event containing all sample points in either A or B or both. Sometimes we write “A or B” for the union. </li></ul><ul><li>2. The intersection of A and B, A ∩ B, is the event containing all sample points that are in both A and B. Sometimes we write AB or “A and B” for the intersection. </li></ul><ul><li>3. The complement of A, Ā, is the event containing all sample points that are not in A. Sometimes we write “not A” or A c for the complement. </li></ul><ul><li>Mutually Exclusive Events (Disjoint Events) </li></ul><ul><li>4. Two events are said to be mutually exclusive (or disjoint) if their intersection is empty (i.e. A ∩ B = ∅). </li></ul>
24.
Example <ul><li>Suppose S = {E1, E2, . . ., E6}. Let </li></ul><ul><li>A = {E1, E3, E5}; </li></ul><ul><li>B = {E1, E2, E3}. Then </li></ul><ul><li>(i) A U B = {E1, E2, E3, E5}. </li></ul><ul><li>(ii) A ∩ B = {E1, E3}. </li></ul><ul><li>(iii) Ā = {E2, E4, E6}; B c = {E4, E5, E6}; </li></ul><ul><li>(iv) A and B are not mutually exclusive (why?) </li></ul><ul><li>(v) Give two events in S that are mutually exclusive. </li></ul>
25.
Probability of an event <ul><li>Relative Frequency Definition: If an experiment is repeated a large number, n, of times and the event A is observed nA times, the probability of A is </li></ul><ul><li>P(A) = nA / n </li></ul><ul><li>Interpretation: </li></ul><ul><li>n = number of trials of an experiment </li></ul><ul><li>nA = frequency of the event A </li></ul><ul><li>nA/n = relative frequency of A </li></ul><ul><li>P(A) ≈ nA/n, if n is large enough. </li></ul>
26.
Basic Formula of Probability <ul><li>Probability of an Event A: </li></ul><ul><li>No. of Outcomes Favourable to Event A </li></ul><ul><li>= ---------------------------------------------------- </li></ul><ul><li>Total Number of All Possible Outcomes </li></ul><ul><li>Probability is a Ratio (a Distribution Ratio). </li></ul><ul><li>It Varies from 0 to 1. </li></ul><ul><li>Often, It is Expressed in Percentage Terms Ranging from 0% to 100%. </li></ul><ul><li>It is Denoted as P(A) and Termed the Marginal or Unconditional Probability. </li></ul>
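The favourable-over-total formula can be sketched directly in code (a minimal illustration; the die-roll event "even spot" is chosen here as an example, not taken from this slide):

```python
from fractions import Fraction

# P(A) = (outcomes favourable to A) / (all possible outcomes),
# assuming equally likely outcomes. Example: an even spot on one die.
sample_space = [1, 2, 3, 4, 5, 6]
event_a = [s for s in sample_space if s % 2 == 0]

p_a = Fraction(len(event_a), len(sample_space))
print(p_a)  # 1/2
```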
27.
Rules of Probability: Multiplication Rule <ul><li>It Gives the Probability of Simultaneous Occurrence of Two Events. </li></ul><ul><li>If A and B are two independent events, P(A & B) = P(A) x P(B) </li></ul><ul><li>Example: Experiment: Toss Two Coins </li></ul><ul><li>A: Getting Head on Coin No. 1 </li></ul><ul><li>B: Getting Head on Coin No. 2 </li></ul><ul><li>P(A) = ½, P(B) = ½, P(A & B) = ¼ = 0.25 </li></ul>
28.
Rules of Probability: General Multiplication Rule <ul><li>If A and B are two dependent events, </li></ul><ul><li>P(A & B) = P(A) x P(B|A) </li></ul><ul><li>P(B|A): The conditional probability of the event B given that event A has occurred </li></ul><ul><li>Example: Draw Two Cards from a Deck </li></ul><ul><li>A: First Card a King </li></ul><ul><li>B: Second Card also a King </li></ul><ul><li>P(A) = 4/52 = 1/13, P(B|A) = 3/51 </li></ul><ul><li>P(A & B) = 1/13 x 3/51 = 3/663 = 1/221 ≈ 0.0045 = 0.45% </li></ul>
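A quick check of the two-kings example (note that 1/13 × 3/51 reduces to 1/221, i.e. about 0.45%):

```python
from fractions import Fraction

# General multiplication rule: P(A & B) = P(A) x P(B|A).
p_a = Fraction(4, 52)           # first card a king
p_b_given_a = Fraction(3, 51)   # second also a king, no replacement

p_both = p_a * p_b_given_a
print(p_both)                   # 1/221
print(round(float(p_both), 4))  # 0.0045
```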
29.
Rules of Probability: Addition Rule <ul><li>It Gives the Probability of Occurrence of Either of Two Events. </li></ul><ul><li>If A and B are two mutually exclusive events, P(A or B) = P(A) + P(B) </li></ul><ul><li>Example: Experiment: Roll a Die </li></ul><ul><li>A: Getting the No. 5 B: Getting the No. 6 </li></ul><ul><li>P(A) = 1/6, P(B) = 1/6, P(A or B) = 1/3 = 0.33 = 33% </li></ul><ul><li>Note: Two Events are Mutually Exclusive if They Cannot Occur Together </li></ul>
30.
Rules of Probability: General Addition Rule <ul><li>If A and B are any two events, P(A or B) = P(A) + P(B) – P(A & B) </li></ul><ul><li>Example: Toss Two Coins </li></ul><ul><li>A: Getting Head on Coin No. 1 </li></ul><ul><li>B: Getting Head on Coin No. 2 </li></ul><ul><li>P(A) = ½, P(B) = ½, P(A & B) = ¼ </li></ul><ul><li>So, P(A or B) = ½ + ½ – ¼ = ¾ = 0.75 = 75% </li></ul>
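The two-coin example can be verified by enumerating the sample space (a sketch):

```python
from itertools import product

# Enumerate the 4 equally likely outcomes of tossing two coins and
# verify the general addition rule P(A or B) = P(A) + P(B) - P(A & B).
space = set(product("HT", repeat=2))   # {HH, HT, TH, TT}
a = {s for s in space if s[0] == "H"}  # head on coin 1
b = {s for s in space if s[1] == "H"}  # head on coin 2

def p(event):
    return len(event) / len(space)

print(p(a | b))                # 0.75
print(p(a) + p(b) - p(a & b))  # 0.75
```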
31.
Exercise <ul><li>If 80% of Company guests visit the HO, 70% visit the Plant, and 60% visit both, what is the chance that a guest will visit the HO or the Plant or both? </li></ul><ul><li>What is the probability that he will visit neither the HO nor the Plant, but meet Company Executives at the Taj? </li></ul>
32.
Solution <ul><li>P(A) = 0.8, P(B) = 0.7, P(A & B) = 0.6 </li></ul><ul><li>Prob that a guest will visit HO or Plant or both = P(A or B) = 0.8 + 0.7 – 0.6 = 0.9 = 90% </li></ul><ul><li>Prob that he will visit neither the HO nor the Plant, but meet Company Executives at the Taj = 1 – Prob that a guest will visit HO or Plant or both = 1 – 0.9 = 0.1 = 10% </li></ul>
33.
Conceptual Definition of Probability <ul><li>Consider a random experiment whose sample space is S with sample points E1, E2, . . . </li></ul><ul><li>For each sample point Ei of the sample space S let P(Ei) be the probability of Ei: </li></ul><ul><li>(i) 0 ≤ P(Ei) ≤ 1 for all i </li></ul><ul><li>(ii) P(S) = 1 </li></ul><ul><li>(iii) ∑ P(Ei) = 1, where the summation is over all sample points in S. </li></ul>
34.
Example <ul><li>Definition: The probability of any event A is equal to the sum of the probabilities of the sample points in A. </li></ul><ul><li>Example. Let S = {E1, . . ., E10}, with </li></ul><ul><li>P(E1) = P(E2) = P(E3) = P(E4) = P(E5) = P(E6) = 1/20; P(E7) = P(E8) = P(E9) = 1/5; P(E10) = 1/10 </li></ul><ul><li>Question: Calculate P(A) where A = {Ei, i ≥ 6}. </li></ul><ul><li>P(A) = P(E6) + P(E7) + P(E8) + P(E9) + P(E10) </li></ul><ul><li>= 1/20 + 1/5 + 1/5 + 1/5 + 1/10 = 0.75 </li></ul>
35.
Steps in calculating probabilities of events <ul><li>1. Define the experiment </li></ul><ul><li>2. List all simple events </li></ul><ul><li>3. Assign probabilities to simple events </li></ul><ul><li>4. Determine the simple events that constitute the given event </li></ul><ul><li>5. Add up the simple events’ probabilities to obtain the probability of the given event </li></ul><ul><li>Example: Calculate the probability of observing exactly one H in a toss of two fair coins. </li></ul><ul><li>Solution. </li></ul><ul><li>S = {HH, HT, TH, TT} </li></ul><ul><li>A = {HT, TH} </li></ul><ul><li>P(A) = 0.5 </li></ul>
36.
Example. <ul><li>Example. Toss a fair coin 3 times. </li></ul><ul><li>(i) List all the sample points in the sample space. </li></ul><ul><li>Solution: S = {HHH, · · ·, TTT} (Complete this) </li></ul><ul><li>(ii) Find the probability of observing exactly two heads, and of observing at most one head. </li></ul>
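The five steps above can be applied to this 3-toss exercise by brute-force enumeration (a sketch of one way to check the answers):

```python
from itertools import product

# Step 2: list all 8 simple events for three tosses of a fair coin.
space = list(product("HT", repeat=3))

# Steps 4-5: count the simple events making up each event of interest.
p_two_heads = sum(1 for s in space if s.count("H") == 2) / len(space)
p_at_most_one = sum(1 for s in space if s.count("H") <= 1) / len(space)

print(len(space))     # 8 sample points
print(p_two_heads)    # 0.375
print(p_at_most_one)  # 0.5
```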
37.
Probability Laws <ul><li>Complementation law: </li></ul><ul><li>P(A) = 1 – P(Ā) </li></ul><ul><li>Additive law: </li></ul><ul><li>P(A U B) = P(A) + P(B) – P(A ∩ B) </li></ul><ul><li>Moreover, if A and B are mutually exclusive, then P(A ∩ B) = 0 and </li></ul><ul><li>P(A U B) = P(A) + P(B) </li></ul><ul><li>Multiplicative law (Product rule): </li></ul><ul><li>P(A ∩ B) = P(A|B) P(B) = P(B|A) P(A) </li></ul><ul><li>Moreover, if A and B are independent, </li></ul><ul><li>P(A ∩ B) = P(A) P(B) </li></ul>
38.
Example <ul><li>Let S = {E1, E2, . . ., E6}; A = {E1, E3, E5}; B = {E1, E2, E3}; C = {E2, E4, E6}; D = {E6}. Suppose that all elementary events are equally likely. </li></ul><ul><li>(i) What does it mean that all elementary events are equally likely? </li></ul><ul><li>(ii) Use the complementation rule to find P(A c). </li></ul><ul><li>(iii) Find P(A|B) and P(B|A) </li></ul><ul><li>(iv) Find P(D) and P(D|C) </li></ul><ul><li>(v) Are A and B independent? Are C and D independent? </li></ul><ul><li>(vi) Find P(A ∩ B) and P(A U B). </li></ul>
39.
Law of total probability <ul><li>Let A, A c be complementary events and let B denote an arbitrary event. Then </li></ul><ul><li>P(B) = P(B ∩ A) + P(B ∩ A c), </li></ul><ul><li>or </li></ul><ul><li>P(B) = P(B|A) P(A) + P(B|A c) P(A c). </li></ul>
40.
Bayes’ Law <ul><li>Let A, A c be complementary events and let B denote an arbitrary event. Then </li></ul><ul><li>P(A|B) = P(A ∩ B) / P(B) </li></ul><ul><li>P(A|B) = P(B|A) P(A) / [P(B|A) P(A) + P(B|A c) P(A c)] </li></ul><ul><li>Remarks. </li></ul><ul><li>(i) The events of interest here are A, A c. </li></ul><ul><li>(ii) P(A) and P(A c) are called prior probabilities. </li></ul><ul><li>(iii) P(A|B) and P(A c|B) are called posterior (revised) probabilities. </li></ul><ul><li>(iv) Bayes’ Law is important in several fields of application. </li></ul>
41.
Bayesian Approach <ul><li>English mathematician Thomas Bayes (1702–61) set out his theory of probability. </li></ul><ul><li>It is being revived now, 250 years later. </li></ul><ul><li>It is a step ahead of the subjective probability method. </li></ul><ul><li>Example: A: Digestive disorder, B: Drinking Coke, with P(B|A) = 0.65, P(A) = 0.3, P(B) = 0.37 </li></ul><ul><li>Bayes’ Rule: </li></ul><ul><li>P(A|B) = P(B|A) P(A) / P(B) = 0.65 x 0.3 / 0.37 = 0.53 </li></ul>
43.
Example <ul><li>A laboratory blood test is 95 percent effective in detecting a certain disease when it is, in fact, present. However, the test also yields a “false positive” result for 1 percent of healthy persons tested. (That is, if a healthy person is tested, then, with probability 0.01, the test result will imply he or she has the disease.) If 0.5 percent of the population actually has the disease, what is the probability a person has the disease given that the test result is positive? </li></ul><ul><li>Solution: Let D be the event that the tested person has the disease and E the event that the test result is positive. The desired probability P(D|E) is obtained by </li></ul><ul><li>P(D|E) = P(D ∩ E) / P(E) </li></ul><ul><li>= P(E|D) P(D) / [P(E|D) P(D) + P(E|D c) P(D c)] </li></ul><ul><li>= (0.95)(0.005) / [(0.95)(0.005) + (0.01)(0.995)] </li></ul><ul><li>= 95/294 ≈ 0.323. </li></ul><ul><li>Thus only about 32 percent of those persons whose test results are positive actually have the disease. </li></ul>
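The blood-test computation can be reproduced numerically (figures from the slide):

```python
# Bayes' law on the blood-test example.
p_d = 0.005            # prior: 0.5% of the population has the disease
p_pos_given_d = 0.95   # test detects the disease when present
p_pos_given_h = 0.01   # false-positive rate for healthy persons

# Law of total probability for P(E), then Bayes' law for P(D|E).
p_pos = p_pos_given_d * p_d + p_pos_given_h * (1 - p_d)
p_d_given_pos = p_pos_given_d * p_d / p_pos

print(round(p_d_given_pos, 3))  # 0.323
```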
44.
General Bayes’ Theorem <ul><li>A1, A2, . . ., Ak are k mutually exclusive and exhaustive events with known prior probabilities P(A1), P(A2), . . ., P(Ak). </li></ul><ul><li>B is an event that follows or is caused by the prior events A1, A2, . . ., Ak, with known conditional probabilities P(B|A1), P(B|A2), . . ., P(B|Ak). </li></ul><ul><li>Bayes’ formula allows us to calculate the posterior (revised) probabilities P(A1|B), P(A2|B), . . ., P(Ak|B): </li></ul><ul><li>P(Ai|B) = P(Ai) P(B|Ai) / {P(A1) P(B|A1) + . . . + P(Ak) P(B|Ak)} </li></ul>
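The general formula translates directly into a short function (the three priors and likelihoods below are made-up illustration values, not from the slides):

```python
# General Bayes' theorem: posterior P(Ai|B) for k mutually exclusive,
# exhaustive causes A1..Ak with priors P(Ai) and likelihoods P(B|Ai).
def posteriors(priors, likelihoods):
    joint = [p * l for p, l in zip(priors, likelihoods)]
    p_b = sum(joint)              # P(B), by the law of total probability
    return [j / p_b for j in joint]

post = posteriors([0.5, 0.3, 0.2], [0.1, 0.4, 0.5])
print([round(x, 3) for x in post])  # the posteriors sum to 1
```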
45.
Counting Sample Points <ul><li>Is it always necessary to list all sample points in S? </li></ul><ul><li>Coin Tosses: </li></ul><ul><li>1 coin: 2 sample points; 2 coins: 4; 3 coins: 8; 4 coins: 16; 5 coins: 32; 6 coins: 64; 10 coins: 1,024; 20 coins: 1,048,576; 30 coins: ≈ 10 9 ; 40 coins: ≈ 10 12 ; 50 coins: ≈ 10 15 ; 60 coins: ≈ 10 18 </li></ul><ul><li>Note that 2 30 ≈ 10 9 = one billion, 2 40 ≈ 10 12 = one thousand billion, 2 50 ≈ 10 15 = one trillion. </li></ul><ul><li>RECALL: P(A) = nA/n, so for some applications we need to find n and nA, where n and nA are the number of points in S and A respectively. </li></ul>
46.
Basic principle of counting: mn rule <ul><li>Suppose that two experiments are to be performed. Then if experiment 1 can result in any one of m possible outcomes and if, for each outcome of experiment 1, there are n possible outcomes of experiment 2, then together there are mn possible outcomes of the two experiments. </li></ul>
47.
Examples. <ul><li>(i) Toss two coins: mn = 2 × 2 = 4 </li></ul><ul><li>(ii) Throw two dice: mn = 6 × 6 = 36 </li></ul><ul><li>(iii) A small community consists of 10 men, each of whom has 3 sons. If one man and one of his sons are to be chosen as father and son of the year, how many different choices are possible? </li></ul><ul><li>Solution: Regarding the choice of the man as the outcome of the first experiment and the subsequent choice of one of his sons as the outcome of the second experiment, we see, from the basic principle, that there are 10 × 3 = 30 possible choices. </li></ul>
48.
Generalized basic principle of counting <ul><li>If r experiments that are to be performed are such that the first one may result in any of n1 possible outcomes, and if for each of these n1 possible outcomes there are n2 possible outcomes of the second experiment, and if for each of the possible outcomes of the first two experiments there are n3 possible outcomes of the third experiment, and so on, . . ., then there are a total of n1 x n2 x · · · x nr possible outcomes of the r experiments. </li></ul>
49.
Examples <ul><li>(i) There are 5 routes available between A and B; 4 between B and C; and 7 between C and D. What is the total number of available routes between A and D? </li></ul><ul><li>Solution: The total number of available routes is mnt = 5 x 4 x 7 = 140. </li></ul><ul><li>(ii) A college planning committee consists of 3 freshmen, 4 part-timers, 5 juniors and 2 seniors. A subcommittee of 4, consisting of 1 individual from each class, is to be chosen. How many different subcommittees are possible? </li></ul><ul><li>Solution: It follows from the generalized principle of counting that there are 3 · 4 · 5 · 2 = 120 possible subcommittees. </li></ul>
50.
Examples <ul><li>(iii) How many different 7-place license plates are possible if the first 3 places are to be occupied by letters and the final 4 by numbers? </li></ul><ul><li>Solution: It follows from the generalized principle of counting that there are 26 · 26 · 26 · 10 · 10 · 10 · 10 = 175,760,000 possible license plates. </li></ul><ul><li>(iv) In (iii), how many license plates would be possible if repetition among letters or numbers were prohibited? </li></ul><ul><li>Solution: In this case there would be 26 · 25 · 24 · 10 · 9 · 8 · 7 = 78,624,000 possible license plates. </li></ul>
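Both license-plate counts are easy to confirm:

```python
# Generalized counting principle on the 7-place license plates.
with_repetition = 26**3 * 10**4                     # 3 letters, 4 digits
without_repetition = 26 * 25 * 24 * 10 * 9 * 8 * 7  # no repeats allowed

print(with_repetition)     # 175760000
print(without_repetition)  # 78624000
```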
51.
Permutations (Ordered Arrangements) <ul><li>The number of ways of ordering n distinct objects taken r at a time (order is important) is given by </li></ul><ul><li>nPr = n!/(n - r)! = n(n - 1)(n - 2) · · · (n - r + 1) </li></ul><ul><li>Examples </li></ul><ul><li>(i) In how many ways can you arrange the letters a, b and c? List all arrangements. </li></ul><ul><li>Answer: There are 3! = 6 arrangements or permutations. </li></ul><ul><li>(ii) A box contains 10 balls. Balls are selected without replacement one at a time. In how many different ways can you select 3 balls? </li></ul><ul><li>Solution: Note that n = 10, r = 3. The number of different ways is </li></ul><ul><li>10!/7! = 10 · 9 · 8 = 720. </li></ul>
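Python's standard library exposes this count directly as `math.perm`, and `itertools.permutations` lists the arrangements themselves:

```python
import math
from itertools import permutations

# nPr = n!/(n-r)!: ordered arrangements of n objects taken r at a time.
print(math.perm(3, 3))   # 6 ways to order a, b, c
print(["".join(p) for p in permutations("abc")])  # all six listed
print(math.perm(10, 3))  # 720 ordered draws of 3 balls from 10
```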
52.
Combinations <ul><li>For r ≤ n, we define </li></ul><ul><li>nCr = n! / [(n - r)! r!] </li></ul><ul><li>and say that nCr represents the number of possible combinations of n objects taken r at a time (with no regard to order). </li></ul><ul><li>Examples </li></ul><ul><li>(i) A committee of 3 is to be formed from a group of 20 people. How many different committees are possible? </li></ul><ul><li>Solution: There are 20C3 = 20!/(3! 17!) = (20 · 19 · 18)/(3 · 2 · 1) = 1140 possible committees. </li></ul><ul><li>(ii) From a group of 5 men and 7 women, how many different committees consisting of 2 men and 3 women can be formed? </li></ul><ul><li>Solution: 5C2 x 7C3 = 10 x 35 = 350 possible committees. </li></ul>
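`math.comb` gives nCr directly, confirming both committee counts (including that 5C2 × 7C3 = 350):

```python
import math

# nCr = n!/((n-r)! r!): unordered selections of r objects from n.
print(math.comb(20, 3))                   # 1140 committees of 3 from 20
print(math.comb(5, 2) * math.comb(7, 3))  # 350: 2 of 5 men, 3 of 7 women
```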
53.
Random Sampling <ul><li>Definition. A sample of size n is said to be a random sample if the n elements are selected in such a way that every possible combination of n elements has an equal probability of being selected. In this case the sampling process is called simple random sampling. </li></ul><ul><li>Remarks. (i) If n is large, we say the random sample provides an honest representation of the population. </li></ul><ul><li>(ii) For finite populations the number of possible samples of size n is NCn. </li></ul><ul><li>For instance, the number of possible samples when N = 28 and n = 4 is 28C4 = 20475. </li></ul><ul><li>Tables of random numbers may be used to select random samples. </li></ul>
54.
Frequency Distribution: Number of Sales Orders Booked by 50 Sales Execs, April 2006 Varsha Varde <ul><li>Number of Orders – Number of SEs </li></ul><ul><li>00 – 04: 14 </li></ul><ul><li>05 – 09: 19 </li></ul><ul><li>10 – 14: 07 </li></ul><ul><li>15 – 19: 04 </li></ul><ul><li>20 – 24: 02 </li></ul><ul><li>25 – 29: 01 </li></ul><ul><li>30 – 34: 02 </li></ul><ul><li>35 – 39: 00 </li></ul><ul><li>40 – 44: 01 </li></ul><ul><li>TOTAL: 50 </li></ul>
55.
Probability Distribution <ul><li>Number of Orders – Number of SEs – Probability </li></ul><ul><li>00 – 04: 14, 0.28 </li></ul><ul><li>05 – 09: 19, 0.38 </li></ul><ul><li>10 – 14: 07, 0.14 </li></ul><ul><li>15 – 19: 04, 0.08 </li></ul><ul><li>20 – 24: 02, 0.04 </li></ul><ul><li>25 – 29: 01, 0.02 </li></ul><ul><li>30 – 34: 02, 0.04 </li></ul><ul><li>35 – 39: 00, 0.00 </li></ul><ul><li>40 – 44: 01, 0.02 </li></ul><ul><li>TOTAL: 50, 1.00 </li></ul>
56.
Standard Discrete Prob Distns <ul><li>Binomial Distribution: When a Situation can have Only Two Possible Outcomes, e.g. PASS or FAIL, ACCEPT or REJECT. </li></ul><ul><li>This distribution gives the probability of an outcome (say, ACCEPT) occurring exactly m times out of n trials of the situation, e.g. the probability of 10 ACCEPTANCES out of 15 items tested. </li></ul>
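A binomial probability can be computed from nCr (a sketch; the accept rate p = 0.97 echoes the "97 out of 100" figure from an earlier slide, and the 10-of-15 case is the one mentioned here):

```python
import math

# Binomial pmf: P(exactly m successes in n independent trials),
# each trial succeeding with probability p.
def binom_pmf(m, n, p):
    return math.comb(n, m) * p**m * (1 - p)**(n - m)

# Probability of exactly 10 ACCEPTs in 15 tests with p = 0.97.
print(binom_pmf(10, 15, 0.97))
```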
57.
Standard Discrete Prob Distns <ul><li>Poisson Distribution: When a Situation can have Only Two Possible Outcomes, and When the Total Number of Observations is Large (&gt;20), Unknown or Innumerable. </li></ul><ul><li>This distribution gives the probability of an outcome (say, ACCEPT) occurring m times, e.g. the probability of, say, 150 ACCEPTANCES. </li></ul>
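The Poisson probability has an equally short form (a sketch; the mean rate lam = 2 is an assumed illustration value, not from the slide):

```python
import math

# Poisson pmf: P(X = m) = e^(-lam) * lam^m / m!, where lam is the
# average number of occurrences per interval.
def poisson_pmf(m, lam):
    return math.exp(-lam) * lam**m / math.factorial(m)

print(round(poisson_pmf(0, 2), 4))  # 0.1353, i.e. e^-2
```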
58.
Standard Continuous Prob Distn <ul><li>Normal Distribution: Useful & Important </li></ul><ul><li>Several Variables Follow Normal Distn or a Pattern Nearing It. (Weights, Heights) </li></ul><ul><li>Skewed Distns Assume This Shape After Getting Rid of Outliers </li></ul><ul><li>For Large No. of Observations, Discrete Distributions Tend to Follow Normal Distn </li></ul><ul><li>It is Amenable to Mathematical Processes </li></ul>
59.
Features of Normal Distribution <ul><li>Symmetrical and Bell Shaped </li></ul><ul><li>Mean at the Centre of the Distribution </li></ul><ul><li>Mean = Median = Mode </li></ul><ul><li>Probabilities Cluster Around the Middle and Taper Off Gradually on Both Sides </li></ul><ul><li>Very Few Values Lie Beyond Three Standard Deviations from the Mean </li></ul>
60.
Probabilities in Normal Distn <ul><li>68% of Values Lie in the Span of Mean Plus/Minus One Standard Deviation. </li></ul><ul><li>95% of Values Lie in the Span of Mean Plus/Minus Two Standard Deviations. </li></ul><ul><li>99.7% of Values Lie in the Span of Mean Plus/Minus Three Standard Deviations. </li></ul><ul><li>Standard Normal Distribution Tables Readily Show the Probability for Every Value. </li></ul><ul><li>Use Them to Draw Inferences. </li></ul>
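The 68/95/99.7 figures follow from the standard normal CDF, which can be built from `math.erf`:

```python
import math

# P(|X - mean| <= k * sd) for a normal variable equals erf(k / sqrt(2)).
def within_k_sigma(k):
    return math.erf(k / math.sqrt(2))

for k in (1, 2, 3):
    print(k, round(within_k_sigma(k), 4))  # 0.6827, 0.9545, 0.9973
```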