PROBABILITY FUNDAS
- For the coin-tossing experiment (two tosses): σ² = (0−1)²·(1/4) + (1−1)²·(1/2) + (2−1)²·(1/4) = 1/2.

PROBABILITY FUNDAS
- Properties of variance:
- Var(c) = 0 for all constants c.
- Var(…

PROBABILITY FUNDAS
- Skewness:
- One possible measure is E[(X − µX)³]. If this number is > 0 the d…

PROBABILITY FUNDAS
- E[((X − µX)/σX)⁴]
- Kurtosis is a measure of whether a distribution …

PROBABILITY FUNDAS
- [Figure: densities with positive skew and negative skew]
PROBABILITY FUNDAS
- Independence of random variables:
- X1 and X2 are independent if and only if p(x1i, …

PROBABILITY FUNDAS
- Independence of random variables:
- Covariance & correlation: …

PROBABILITY FUNDAS
- Independence of random variables:
- Covariance & correlation:
PROBABILITY FUNDAS
- Some important discrete distributions:
- Bernoulli trials: a trial with two poss…

PROBABILITY FUNDAS
- Bernoulli trials (three trials; S = success with probability p, F = failure with probability q = 1 − p):

  Outcome(s)        R_x   p_x
  FFF                0    q·q·q = q³
  FFS, FSF, SFF      1    3pq²
  FSS, SFS, SSF      2    3p²q
  SSS                3    p³

PROBABILITY FUNDAS
- The binomial distribution:
- The random variable X that denotes the number of s…

PROBABILITY FUNDAS
- The binomial distribution:
- E(X) = np · Σ_{y=0}^{n−1} (n−1)! pʸ q^(n−1−y) / (y!·(n−1−y)!) …

PROBABILITY FUNDAS
- The binomial distribution:
- The binomial distribution is symmetric if p = 0.5. …
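The deck contains no code; as a quick numeric sketch of the binomial facts on these slides (pmf, symmetry at p = 0.5, mean np and variance npq), using only the Python standard library:

```python
from math import comb

def binom_pmf(n, p, x):
    """P(X = x) for X ~ Binomial(n, p): comb(n, x) * p^x * (1-p)^(n-x)."""
    return comb(n, x) * p**x * (1 - p) ** (n - x)

n, p = 3, 0.5
pmf = [binom_pmf(n, p, x) for x in range(n + 1)]

# Mean and variance computed directly from the pmf; they should equal
# n*p and n*p*(1-p), as the E(X) derivation on the slide shows.
mean = sum(x * q for x, q in enumerate(pmf))
var = sum((x - mean) ** 2 * q for x, q in enumerate(pmf))
print(pmf, mean, var)  # [0.125, 0.375, 0.375, 0.125] 1.5 0.75
```

For p = 0.5 the pmf reads the same forwards and backwards, which is the symmetry the slide states.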
PROBABILITY FUNDAS
- The geometric distribution:
- This is also related to a sequence of Bernoulli …
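The geometric slide is truncated here; a minimal sketch of the standard form it refers to (number of Bernoulli trials up to and including the first success, pmf q^(k−1)·p, mean 1/p), with p = 0.25 chosen purely for illustration:

```python
# Geometric pmf: first success on trial k of a Bernoulli sequence.
def geom_pmf(p, k):
    return (1 - p) ** (k - 1) * p

p = 0.25  # illustrative per-trial success probability (not from the slide)

# The pmf sums to 1 over k = 1, 2, ...; check a long partial sum,
# and check that the mean approaches 1/p = 4.
partial = sum(geom_pmf(p, k) for k in range(1, 200))
approx_mean = sum(k * geom_pmf(p, k) for k in range(1, 2000))
print(round(partial, 6), round(approx_mean, 4))  # ≈ 1.0 and 4.0
```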
PROBABILITY FUNDAS
- The Pascal distribution:
- The basis is Bernoulli trials.
- Log…

PROBABILITY FUNDAS
- The multinomial distribution:
- Assume an experiment where the sample space is …
PROBABILITY FUNDAS
- The exponential distribution:
- f(x) = λe^(−λx) for x ≥ 0, and 0 otherwise. …

PROBABILITY FUNDAS
- Relationship between the exponential and Poisson distributions:
- If the number of o…

PROBABILITY FUNDAS
- [Figure: exponential density function f(x), starting at height λ at x = 0]

PROBABILITY FUNDAS
- [Figure: distribution function F(x) = 1 − e^(−λx), rising toward 1]
PROBABILITY FUNDAS
- Normal distribution:
- A symmetrical, bell-shaped distribution.
- It…

PROBABILITY FUNDAS
- Normal distribution:
- It is used so extensively that the shorthand notation X is…

PROBABILITY FUNDAS
- Normal distribution:
- [Figure: normal density f(x) centred at µ]

PROBABILITY FUNDAS
- F(x) = P(X ≤ x) = P(Z ≤ (x−µ)/σ) = ∫₋∞^((x−µ)/σ) [1/√(2π)]·e^(−z²/2) dz
- Z h…

PROBABILITY FUNDAS
- [Figure: standard normal density φ(z) centred at z = 0]

PROBABILITY FUNDAS
- Symmetric intervals:
- P{(µ − 1σ) ≤ X ≤ (µ + 1σ)} = 0.6826
- …
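The symmetric-interval probabilities quoted on this slide can be reproduced from the standard normal CDF, which the Python standard library exposes via the error function:

```python
from math import erf, sqrt

def std_normal_cdf(z):
    """Standard normal CDF: Phi(z) = (1 + erf(z / sqrt(2))) / 2."""
    return 0.5 * (1 + erf(z / sqrt(2)))

# P(mu - k*sigma <= X <= mu + k*sigma) = Phi(k) - Phi(-k) for any normal X
cover = {k: std_normal_cdf(k) - std_normal_cdf(-k) for k in (1, 2, 3)}
for k in (1, 2, 3):
    print(k, round(cover[k], 4))  # 1 -> 0.6827, 2 -> 0.9545, 3 -> 0.9973
```

The slide's 0.6826 is the same quantity rounded down; the exact value is 0.68269 to five places.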
PROBABILITY FUNDAS
- Normal distribution:
- The normal distribution can be interpreted in different …
BINOMIAL & B&S CONVERGENCE
- Central limit theorem:
- If a random variable Y is the sum of n independ…

BINOMIAL & B&S CONVERGENCE
- Central limit theorem:
- How large n must be to get reasonable results u…

BINOMIAL & B&S CONVERGENCE
- Normal approximation of the binomial distribution:
- The normal approximation…

BINOMIAL & B&S CONVERGENCE
- Normal approximation of the binomial distribution:
- The fact that the mean a…

BINOMIAL & B&S CONVERGENCE
- Normal approximation of the binomial distribution:
- θ is the true probability asso…

BINOMIAL & B&S CONVERGENCE
- In the equations shown on the previous slide it can be seen that convergence to the B&S val…
LOGNORMAL DISTRIBUTION
- Lognormal distribution:
- The simplest form of the density function of a variable w…

LOGNORMAL DISTRIBUTION
- Lognormal distribution:
- A lognormal distribution is bounded by zero below…

LOGNORMAL DISTRIBUTION
- Lognormal distribution:
- ln(ST/S0) = r(0,t) represents the continuously comp…

LOGNORMAL DISTRIBUTION
- Lognormal distribution:
- ln(St/S0), which represents the continuously compou…

LOGNORMAL DISTRIBUTION
- [Figure: lognormal density, bounded below at 0]

LOGNORMAL DISTRIBUTION
- Consider a stock with an initial price of $40, an expected return of 16% per annum and a vol…
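The stock example is truncated here, so its volatility and horizon are unknown; as a sketch of the computation this family of examples performs, under the lognormal model ln(S_T) ~ N(ln S0 + (µ − σ²/2)T, σ²T), with σ = 0.20 and T = 0.5 assumed purely for illustration (S0 = $40 and µ = 16% are from the slide):

```python
from math import log, sqrt

S0, mu = 40.0, 0.16        # from the slide
sigma, T = 0.20, 0.5       # assumed for illustration; the slide's values are cut off

# Parameters of the normal distribution of ln(S_T):
m = log(S0) + (mu - sigma**2 / 2) * T  # mean of ln(S_T)
s = sigma * sqrt(T)                    # standard deviation of ln(S_T)
print(round(m, 4), round(s, 4))
```

Once m and s are known, probabilities for S_T reduce to standard normal probabilities for ln(S_T).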
LOGNORMAL DISTRIBUTION
- Lognormal distribution:
- The normal distribution has the additive reproductive pro…

LOGNORMAL DISTRIBUTION
- Lognormal distribution:
- If X1, X2, …, XN are independent lognormal va…

LOGNORMAL DISTRIBUTION
- Lognormal distribution:
- A lognormal distribution allows for more upside m…
PROBABILITY FUNDAS
- Sampling & descriptive statistics:
- Statistics is a science of drawing conclus…

PROBABILITY FUNDAS
- Statistics & sampling distributions:
- If X1, X2, X3, …, Xn is a random sample of s…

PROBABILITY FUNDAS
- Statistics & sampling distributions:
- Using the reproductive property of norma…

PROBABILITY FUNDAS
- Estimators:
- Suppose that X is a random variable with mean µ and variance σ…

PROBABILITY FUNDAS
- E(S²) = E[Σ(Xᵢ − X̄)²/(n − 1)]
- = [1/(n−1)]·E[Σ(Xᵢ − X̄)²]
- …
PROBABILITY FUNDAS
- Confidence interval estimation:
- In many situations a point estimate doesn't p…

PROBABILITY FUNDAS
- Confidence interval estimation:
- The confidence interval discussed earlier is …

PROBABILITY FUNDAS
- Confidence interval estimation:
- [Figure: two-tailed distribution with area α/2 in each tail]

PROBABILITY FUNDAS
- Confidence interval estimation:
- Let X be a random variable with unknown mean …

PROBABILITY FUNDAS
- Confidence interval estimation:
- Example: a quality inspector is investigating…
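The inspector example is truncated, so its numbers are not recoverable; as a sketch of the two-sided interval these slides build (x̄ ± z·σ/√n with known σ; z = 1.96 for 95% confidence), using made-up values:

```python
from math import sqrt

def ci_mean_known_sigma(xbar, sigma, n, z=1.96):
    """Two-sided confidence interval for the mean with known sigma:
    xbar +/- z * sigma / sqrt(n); z = 1.96 gives a 95% interval."""
    half = z * sigma / sqrt(n)
    return xbar - half, xbar + half

# Illustrative numbers only; the slide's example values are cut off above.
lo, hi = ci_mean_known_sigma(xbar=1000.0, sigma=25.0, n=16)
print(round(lo, 2), round(hi, 2))  # 987.75 1012.25
```

The half-width z·σ/√n shrinks like 1/√n, which is why quadrupling the sample halves the interval.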
PROBABILITY FUNDAS
- Confidence interval estimation:
- Confidence interval on the mean of a normal d…

PROBABILITY FUNDAS
- Confidence interval estimation:
- Confidence interval on the variance of a norm…
PROBABILITY FUNDAS
- Tests of hypothesis:
- Many problems require that we decide whether or not a st…

PROBABILITY FUNDAS
- Tests of hypothesis:
- H0 is called the null hypothesis.
- H1: …

PROBABILITY FUNDAS
- Type I and type II errors:

                H0 is true         H0 is false
  Accept H0    correct decision    type II error
  Reject H0    type I error        correct decision

PROBABILITY FUNDAS
- Type I and type II errors:
- In our example a type I error will occur if the sample m…

PROBABILITY FUNDAS
- Note here that β(2700) < β(2600); in other words, smaller deviations are harder to detect than …

PROBABILITY FUNDAS
- What about the following hypotheses?
- H0: µ ≤ µ0; H1: µ > µ0. Here w…

PROBABILITY FUNDAS
- What about the following hypotheses?
- In formulating one-sided hypotheses, we …

PROBABILITY FUNDAS
- Choice of sample size to control type II error:
- H0: µ = µ0 and H1: µ ≠ µ0, a…
PROBABILITY FUNDAS
- Test of hypothesis on the variance of a normal distribution:
- Here H0: σ² = …

PROBABILITY FUNDAS
- Simple linear regression:
- Independent variable x and dependent variable y…

PROBABILITY FUNDAS
- Simple linear regression:
- The second derivative leads to the equation …

PROBABILITY FUNDAS
- Simple linear regression:
- Standard error of estimate: s² = Σ(eᵢ)²/(n − 2) …

PROBABILITY FUNDAS
- Simple linear regression:
- Hypothesis testing:
- We can also s…

PROBABILITY FUNDAS
- Simple linear regression:
- Hypothesis testing:
- The F-test th…
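The regression slides are truncated, but the least-squares fit and the Σeᵢ²/(n−2) error estimate they describe are standard; a self-contained sketch on made-up data (the xs/ys values below are not from the deck):

```python
# Ordinary least squares for y = b0 + b1*x via the normal equations.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]   # illustrative data, not from the slides
ys = [2.1, 3.9, 6.2, 7.8, 10.1]

n = len(xs)
xbar = sum(xs) / n
ybar = sum(ys) / n

# Sums of cross products and squares about the means.
sxy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
sxx = sum((x - xbar) ** 2 for x in xs)

b1 = sxy / sxx           # slope
b0 = ybar - b1 * xbar    # intercept

# Error mean square from the slides: sum of squared residuals / (n - 2).
residuals = [y - (b0 + b1 * x) for x, y in zip(xs, ys)]
se2 = sum(e * e for e in residuals) / (n - 2)
print(round(b1, 3), round(b0, 3), round(se2, 4))  # slope ≈ 1.99, intercept ≈ 0.05
```

The divisor n − 2 reflects the two parameters (b0, b1) estimated from the data.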
1. PROBABILITY FUNDAMENTALS, Prof. Naveen Bhatia

2. PROBABILITY FUNDAMENTALS
- Probability is the scientific approach to uncertainty: the study of mathematical techniques for making quantitative inferences under uncertainty.
- The common value to which relative frequencies converge is called the probability of the desired outcome. For example, the relative frequency of heads approaches a common value of 1/2 as the number of coin tosses increases.
- Probability can be applied in engineering and the sciences, in areas such as reliability, quality control and the analysis of queues.
- Reliability: the branch of engineering concerned with maintaining quality in a manufacturing process. Quality control: the goal here is to minimise the number of defects produced in a manufacturing process; since it is not possible to test every item, probability can be used to analyse the uncertainty.

3. PROBABILITY FUNDAMENTALS
- Queuing: the branch of engineering concerned with the analysis and design of systems involving multiple servers and multiple clients, in which clients may be required to wait for service.
- Similarly, we can apply the theory of probability in financial engineering. The fundamental principle underlying financial engineering is the principle of no arbitrage.
- The principle asserts that two securities that provide the same future cash flow and have the same level of risk must sell for the same price.
- No arbitrage implies that there is no free lunch.
- Similarly, in portfolio optimisation (highest risk-adjusted return), financial engineers use probability and statistics to measure expected return and risk for individual securities and for portfolios.

4. PROBABILITY FUNDAMENTALS
- Probabilities are long-run relative frequencies. This perspective, in which probability is considered a constant long-run relative frequency, is known as the objectivist interpretation of probability.
- The other view is known as the Bayesian or subjectivist interpretation, in which probabilities are considered measures of personal belief.

5. PROBABILITY FUNDAMENTALS
- Random variable:
- Consider the pay-off in a game of rolling a die. The pay-off is an example of a random variable because its value varies in a well-defined way according to the outcome of the roll.
- It is random because the underlying process on which it depends is itself random.
- For instance: the game is rolling a die. The pay-off is $1 if the result is 2, 3 or 4; $2 if it is 5; if it is 1 or 6 we lose $3.
- The pay-off X is a discrete random variable because its possible values belong to the discrete set {−3, 1, 2}.
- Generally speaking, a random variable is any real-valued quantity that depends in a well-defined way on some process whose outcomes are uncertain.
- The set of all possible outcomes of the experiment is known as the "sample space", here {1, 2, 3, 4, 5, 6}, and the pay-off is a function on it.

6. PROBABILITY FUNDAMENTALS
- Random variable:
- X(1) = −3, X(2) = 1, X(3) = 1, X(4) = 1, X(5) = 2 and X(6) = −3.
- Note that for different types of bets there could be many different real-valued functions defined on a given sample space.
- Distribution of probability:
- If we assume the die is balanced, then each outcome of the experiment has probability 1/6. This assignment of 1/6 to each outcome is the distribution of probability over the sample space.
- What we need to know is the distribution of probability for the pay-off:
- P(X = −3) = 1/3, P(X = 1) = 1/2 and P(X = 2) = 1/6. This probability function suppresses information about the underlying experiment, but the pay-off is what we actually care about.

7. PROBABILITY FUNDAMENTALS

8. PROBABILITY FUNDAMENTALS
- Two properties of a probability distribution:
- 1. 0 ≤ p(x) ≤ 1 for all x.
- 2. Σ p(x) = 1 over all x.
- The first property indicates that every value of the mass function behaves like a relative frequency.
- The second property indicates that one of the possible pay-offs must occur.
- Expected pay-off:
- E[X] = Σ x·p(x) over all x. In our example it is (−3)(1/3) + (1)(1/2) + (2)(1/6) = −1/6.
- The expected pay-off here represents the average pay-off per game if we play a large number of games.
- E[X] is the arithmetic average of the values of X weighted by the probability mass function p(x).
- Two random variables Y1 and Y2 are identically distributed if pY1(y) = pY2(y) for all y.
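The expected pay-off on slide 8 can be checked with a short script (not part of the original deck); the pay-off mapping is the one defined on slide 5:

```python
from fractions import Fraction

# Pay-off function from the die game: faces 2, 3, 4 pay $1, face 5 pays $2,
# faces 1 and 6 lose $3. Each face of a fair die has probability 1/6.
payoff = {1: -3, 2: 1, 3: 1, 4: 1, 5: 2, 6: -3}

# E[X] = sum over faces of payoff(face) * P(face), computed in exact fractions.
expected = sum(Fraction(1, 6) * x for x in payoff.values())
print(expected)  # -1/6, matching the slide
```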
9. PROBABILITY FUNDAMENTALS
- Choosing pay-offs when distributions are not identical:
- If expected pay-offs are the same, one has to look at the range. When expected pay-offs are equal, risk-averse investors will select the alternative with the smaller range of pay-offs.

10. PROBABILITY FUNDAMENTALS
- Classical probability:
- Two assumptions:
- The number of possible outcomes of the random experiment is finite.
- All outcomes are equally likely, i.e. each outcome has the same probability.
- P(E) = |E|/|S|, where |E| is the number of elements in the event and |S| is the number of elements in the sample space.
- Suppose a die is rolled; what is the probability that the outcome is a multiple of 3? The event is {3, 6}, therefore the probability is 2/6 = 1/3.
- Another example: if two coins are tossed, what is the probability that both coins show the same face? Here |E| = 2 and |S| = 4, hence P = 1/2.
- One must choose the sample space in such a way that all the relevant information is captured.

11. PROBABILITY FUNDAMENTALS
- Classical probability:
- A recent study on teenage drug and alcohol use found that one in 10 teenagers uses marijuana at least once a month, one in five admitted to drinking alcohol once a week, and one in four admitted to smoking cigarettes on a daily basis.
- The research also found that 10% of the respondents admitted to smoking cigarettes daily and drinking alcohol once a week;
- 5% to cigarettes and marijuana;
- 3% to drinking alcohol and using marijuana;
- 1% to all three.
- In this example one doesn't know the size of the sample, but one can still determine the probabilities.
- 0 ≤ P(E) ≤ 1 for all events; P(S) = 1.
- When the number of events is small one can use Venn diagrams.

12. PROBABILITY FUNDAMENTALS
- Classical probability:
- Three events:
- M: the teenager takes marijuana; A: the teenager consumes alcohol once a week; C: the teenager admits to smoking cigarettes on a daily basis.
- P(M) = 10%; P(A) = 20%; P(C) = 25%; P(C∩A) = 10%; P(C∩M) = 5%; P(A∩M) = 3%; and P(M∩A∩C) = 1%.

13. PROBABILITY FUNDAMENTALS
- Other properties:
- P(Aᶜ) = 1 − P(A): this follows from the fact that A and Aᶜ are mutually exclusive and P(A) + P(Aᶜ) = 1.
- P(A∪B) = P(A) + P(B) − P(A∩B)
- P(A∪B∪C) = P(A) + P(B) + P(C) − P(A∩B) − P(B∩C) − P(A∩C) + P(A∩B∩C)
- Using this logic, the probability of a teenager indulging in at least one of the three activities is 0.10 + 0.20 + 0.25 − (0.10 + 0.05 + 0.03) + 0.01 = 0.38.
- So 1 − 0.38 = 0.62 of teenagers don't engage in any of the three activities.
- Conditional probability:
- If we roll a die, what is the probability that 2 shows up given that the result is an even number?
- P(A|B) = P(A∩B)/P(B) = (1/6)/(1/2) = 1/3.
- Consider two dice being tossed; there are 36 possible outcomes. If d1 is the value of the up face of die 1 and d2 the value of the up face of die 2, we define two events:

14. PROBABILITY FUNDAMENTALS
- Conditional probability:
- A = {(d1, d2): d1 + d2 = 4}
- B = {(d1, d2): d2 ≥ d1}
- P(A) = 3/36, the outcomes being {(1,3), (3,1), (2,2)}
- P(B) = 21/36
- P(B|A) = 2/3
- P(A|B) = 2/21
- P(A∩B) = 2/36 = P(B)·P(A|B) = (21/36)·(2/21) = 2/36
- Let's look at another example:
- A coeducational college has three courses: science, management and engineering. By sex, the enrolment is as follows:

            Science  Management  Engg   Total
  Male        250       350       200     800
  Female      100        50        50     200
  Total       350       400       350    1000
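The conditional probabilities on slide 14 can be verified by enumerating all 36 outcomes (a quick check, not part of the original deck):

```python
from fractions import Fraction
from itertools import product

# All 36 equally likely outcomes of rolling two dice.
outcomes = list(product(range(1, 7), repeat=2))

A = {(d1, d2) for d1, d2 in outcomes if d1 + d2 == 4}  # sum of faces is 4
B = {(d1, d2) for d1, d2 in outcomes if d2 >= d1}      # second face >= first

def prob(event):
    return Fraction(len(event), len(outcomes))

p_a_given_b = prob(A & B) / prob(B)  # P(A|B) = P(A intersect B) / P(B)
p_b_given_a = prob(A & B) / prob(A)  # P(B|A)
print(p_a_given_b, p_b_given_a)      # 2/21 2/3, as on the slide
```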
15. PROBABILITY FUNDAMENTALS
- Let S1: the student is male; S2: the student is female; C1: the student is from science; C2: the student is from management; C3: the student is from engineering.

         C1    C2    C3   Total
  S1    .25   .35   .20    .80
  S2    .10   .05   .05    .20
        .35   .40   .25   1.00

- P(S1) = 0.80 and P(S2) = 0.20 are also known as marginal probabilities.
- P(C3|S2) = .05/.20 = 1/4
- P(C1|S2) = .10/.20 = 1/2
- P(C2|S2) = .05/.20 = 1/4
- P(C1∪C2∪C3|S2) = P(C1|S2) + P(C2|S2) + P(C3|S2)
- If two events are independent then P(A∩B) = P(A)·P(B), and therefore P(A|B) = P(A) and P(B|A) = P(B) (for example, tossing a coin twice).
- Let's look at an example on the next slide.

16. PROBABILITY FUNDAMENTALS
- System reliability given subsystem reliability:
- [Diagram: Subsystem 1 and Subsystem 2 combined into the system]

17. PROBABILITY FUNDAMENTALS
- In the previous example, the system reliability is RS = R1·R2.
- With R1 = 0.9 and R2 = 0.8, RS = 0.9 × 0.8 = 0.72.
- We can extend the same logic to N subsystems:
- RS = R1·R2·R3·…·RN, provided the subsystems are mutually independent.
- Bayes' theorem:
- If B1, B2, …, Bk represent a partition of S and A is an arbitrary event of S, then the total probability of A is given by
- P(A) = P(B1)·P(A|B1) + P(B2)·P(A|B2) + …
- We will see a numerical example of this very shortly.
- P(Bk|A) = P(Bk∩A)/P(A) = P(Bk)·P(A|Bk)/P(A) = P(Bk)·P(A|Bk) / [P(B1)·P(A|B1) + P(B2)·P(A|B2) + …]

18. PROBABILITY FUNDAMENTALS
- Three firms supply NPN transistors to a manufacturer of telemetry equipment; all are to the same specifications.

  Firm   Fraction defective   Fraction supplied
   1           .02                  .15
   2           .01                  .80
   3           .03                  .05

- Let A define the event that an item is defective, and B1, B2, B3 the events that the item came from firm 1, 2 or 3.
- Then P(B3|A) = P(B3)·P(A|B3) / [P(B1)·P(A|B1) + P(B2)·P(A|B2) + P(B3)·P(A|B3)]
- = (.05 × .03) / (.15 × .02 + .80 × .01 + .05 × .03) = 3/25
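The posterior on slide 18 follows directly from the total-probability denominator; a short check using the slide's numbers:

```python
# Bayes' theorem for the transistor-supplier example: the prior is the
# fraction supplied by each firm, the likelihood the fraction defective.
priors = {1: 0.15, 2: 0.80, 3: 0.05}
defect_rate = {1: 0.02, 2: 0.01, 3: 0.03}

# Total probability of a defective item: P(A) = sum of P(Bi) * P(A|Bi).
p_defective = sum(priors[i] * defect_rate[i] for i in priors)

# Posterior P(B3|A) = P(B3) * P(A|B3) / P(A).
posterior_3 = priors[3] * defect_rate[3] / p_defective
print(round(posterior_3, 4))  # 0.12, i.e. 3/25 as on the slide
```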
19. PROBABILITY FUNDAMENTALS
- Another example of Bayes' theorem:

  Urn   Red balls   Black   Total
   1        3          2       5
   2        2          3       5

- We toss an unbiased coin to decide from which urn to draw a ball, but we don't know which urn is which. Suppose the first ball drawn is black and is put back. What is the probability that the second ball drawn from the same urn is also black?
- P(U1) = 1/2 (selecting urn 1 by the toss) and P(U2) = 1/2 (selecting urn 2).
- The event B1 denotes that the first ball drawn is black, and B2 that the second ball drawn is also black.
- P(U1|B1) = P(U1∩B1)/P(B1) = P(B1|U1)·P(U1) / [P(U1)·P(B1|U1) + P(U2)·P(B1|U2)] = (1/2 · 2/5) / (1/2 · 2/5 + 1/2 · 3/5) = 2/5
- P(U2|B1) = 3/5.
- When the first ball is put back, the second draw uses these posterior weights: the ball comes from U1 with probability 2/5 and from U2 with probability 3/5.

20. PROBABILITY FUNDAMENTALS
- Another example of Bayes' theorem:
- A family dog is missing. Three hypotheses are suggested:
- A: it has gone home. B: it is still worrying that big bone in the picnic area. C: it has wandered into the woods.
- A priori probabilities, which have been observed from the past habits of the dog, suggest that P(A) = 1/4, P(B) = 1/2 and P(C) = 1/4.
- A child is sent to search for the dog. If it is in the picnic area, the probability of finding it is 90%; if it is in the woods, the probability is 50%. What is the probability that the dog will be found in the park?
- Let D denote the event that the dog will be found in the park.
- P(D|A) = 0, P(D|B) = 90%, P(D|C) = 50%.
- P(D) = P(A)·P(D|A) + P(B)·P(D|B) + P(C)·P(D|C) = (1/4)·0 + (1/2)·0.9 + (1/4)·0.5 = 0.575
- What is the probability that the dog will be found at home? Let D′ denote this event.

21. PROBABILITY FUNDAMENTALS
- Another example of Bayes' theorem:
- Then P(D′) = P(A)·P(D′|A) + P(B)·P(D′|B) + P(C)·P(D′|C) = (1/4)·1 + (1/2)·0 + (1/4)·0 = 1/4
- What is the probability that the dog is lost? 1 − P(D) − P(D′).
- [Urn with b black and r red balls:] A ball is drawn from the urn and discarded. Without knowing its colour, what is the probability that the second ball drawn is also black?
- P(B1) = b/(b+r) and P(B1ᶜ) = r/(b+r)
- P(B2|B1) = (b−1)/(b+r−1)
- P(B2) (i.e. without knowing the colour of the first ball)
- = P(B1)·P(B2|B1) + P(B1ᶜ)·P(B2|B1ᶜ)
- = [b/(b+r)]·[(b−1)/(b+r−1)] + [r/(b+r)]·[b/(b+r−1)] = b·(b+r−1) / [(b+r)·(b+r−1)] = b/(b+r)
23. PROBABILITY FUNDAMENTALS
- Permutations & combinations:
- Fundamental rule: if there are multiple choices to be made, i.e. there are m1 possibilities for the first choice, m2 for the second choice and so on, and these are allowed to be combined freely, then the total number of possible choices is m1·m2·m3·…·mn.
- How many ways can 6 dice appear? The answer is 6⁶ = 46656.
- How many ways if all must show different faces? The answer is 6! = 6·5·4·3·2·1 = 720: after the first die shows up, the second must show a different face, and so on.
- An urn contains m distinguishable balls marked 1 to m, from which n balls will be drawn under various specified conditions.
- 1. Sampling with replacement and with ordering:
- We draw n balls sequentially, each ball drawn being put back before the next draw is made; what we record is the number on each ball together with the order. We are looking at n-tuples (a1, a2, …, an) where each aj can be 1 to m, therefore the answer is mⁿ.

24. PROBABILITY FUNDAMENTALS
- Permutations & combinations:
- 2. Sampling without replacement and with ordering:
- The answer is m·(m−1)·(m−2)·…·(m−n+1) = (m)ₙ
- 2a. Permutation of m distinguishable balls: the answer is m!.
- 3. Sampling without replacement and without ordering:
- Here the order of the sequence is not recorded (i.e. 123 is the same as 321), as if we draw the n balls in one grab. For example, if m = 5 and n = 3, then {3, 5, 2} can be drawn in 3! = 6 ordered ways, therefore the answer is (m)ₙ/n!.
- The expression above equals m!/(n!·(m−n)!), also known as the binomial coefficient.
- Permutation of m balls that are distinguishable by groups:
- Suppose there are m1 balls of one colour, m2 balls of a second colour, and so on; of course m1 + m2 + … + mr = m. How many distinguishable arrangements are there? The total number of permutations is m!/(m1!·m2!·…·mr!). For instance, if m1 = m2 = 2 and m = 4 with the two colours black and white, the answer is 6.

25. PROBABILITY FUNDAMENTALS
- Permutations & combinations:
- 4. Sampling with replacement and without ordering:
- Suppose we toss two coins, m = 2 and n = 2 (HT = TH); then there are three possibilities: TT, HH, TH(=HT). In general the count is C(m+n−1, m−1); here 3!/(2!·1!) = 3.
- If 6 dice are rolled and the dice are not distinguishable, the number of distinguishable patterns is 11!/(6!·5!) = 462.
- Some examples:
- Example 1: 6 mountaineers decide to divide into three groups for the final assault on the peak; the groups will be of size 1, 2 and 3 respectively. How many ways is this possible?
- 6!/(3!·2!·1!) = 60
- Having formed these groups, which group leads, which goes in the middle and so on can be chosen in 3! ways. The final answer is 60 × 3! = 360.

26. PROBABILITY FUNDAMENTALS
- Permutations & combinations:
- If a deck of cards is shuffled thoroughly, what is the probability that the four aces appear in sequence?
- The total number of permutations is 52!.
- The block of four aces can start at any of 49 places; within the block the four aces can be arranged in 4! ways, and the other 48 cards can appear in 48! ways.
- Therefore the answer is 49·4!·48!/52! = 24/(52·51·50) ≈ 0.018%.
- Another example:
- Fifteen new students are to be distributed evenly among three classes. Three are whiz kids. What is the probability that each class gets one?
- The total number of ways is 15!/(5!·5!·5!).
- There are 6 ways in which the three whiz kids can be distributed one per class, and the remaining 12 can be distributed in 12!/(4!·4!·4!) ways. Therefore the answer is [6·12!/(4!·4!·4!)] / [15!/(5!·5!·5!)]
- = 207900/756756 ≈ 27.47%
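The whiz-kid count on slide 26 is easy to mistype; a short factorial check of both the total and favourable counts:

```python
from math import factorial

# Ways to split 15 students evenly into three labelled classes of 5.
total = factorial(15) // (factorial(5) ** 3)

# Favourable: 3! assignments of the whiz kids to distinct classes, times
# the ways to split the remaining 12 students into three groups of 4.
favourable = factorial(3) * factorial(12) // (factorial(4) ** 3)

p = favourable / total
print(total, favourable, round(p, 4))  # 756756 207900 0.2747
```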
27. PROBABILITY FUNDAMENTALS
- Permutations & combinations:
- A production lot of size 100 is known to be 5% defective. A random sample of 10 items is selected without replacement. What is the probability that there is no defective in the sample?
- The number of ways in which 10 items can be drawn out of 100 (the sequence is not important) is 100!/(90!·10!) ≈ 1.73 × 10¹³. The number of ways the sample contains no defective is [5!/(0!·5!)]·[95!/(10!·85!)] ≈ 1.01 × 10¹³.
- Therefore the probability that the sample contains no defective is ≈ 0.584, i.e. about 58.4%.
- To generalise this example: for any population of N items of which D belong to a particular class, a random sample of n is selected without replacement. If A denotes the event of obtaining exactly r items from the class of interest, then
- P(A) = C(D, r)·C(N−D, n−r) / C(N, n)
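The general formula at the end of slide 27 is the hypergeometric probability; a direct check of the lot-sampling example with `math.comb`:

```python
from math import comb

def hypergeom_pmf(N, D, n, r):
    """P(exactly r items of the special class in a sample of n drawn
    without replacement from N items, of which D are special)."""
    return comb(D, r) * comb(N - D, n - r) / comb(N, n)

# Lot of 100 with 5 defectives, sample of 10, probability of no defective.
p_none = hypergeom_pmf(N=100, D=5, n=10, r=0)
print(round(p_none, 4))  # 0.5838, about 58.4%
```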
  28. 28. PROBABILITY FUNDAMENTALS <ul><li>COMBINATIONS </li></ul><ul><li>( n r )=( n n-r ) </li></ul><ul><li>( n r )=( n-1 r-1 ) + ( n-1 r ) </li></ul>
  29. 29. PROBABILITY FUNDAMENTALS <ul><li>CONTINUOUS RANDOM VARIABLES: </li></ul><ul><li>P(a≤X≤b) = ∫ FROM a TO b OF f(x)dx </li></ul><ul><li>f(x), THE PROBABILITY DENSITY FUNCTION, SATISFIES THE FOLLOWING CONDITIONS: </li></ul><ul><li>f(x)≥0 FOR ALL x OVER THE RANGE R & </li></ul><ul><li>∫ OVER R OF f(x)dx = 1 </li></ul><ul><li>FOR A DENSITY FUNCTION, THE PROBABILITY AT ANY SINGLE POINT, SAY x0, IS ZERO, BECAUSE THE INTEGRAL OVER A SINGLE POINT IS ZERO. </li></ul><ul><li>THE TIME TO FAILURE OF A CATHODE TUBE IS DESCRIBED BY THE FOLLOWING FUNCTION: f(t) = λe^(−λt) FOR t≥0 AND 0 OTHERWISE, WHERE λ>0 IS KNOWN AS THE CONSTANT FAILURE RATE. WHAT'S THE PROBABILITY P(T≥100 HRS)? </li></ul>
  30. 30. PROBABILITY FUNDAMENTALS <ul><li>f(t) = λe^(−λt) for t≥0 and 0 otherwise. </li></ul><ul><li>P(T≥100 HRS) = ∫ FROM 100 TO ∞ OF λe^(−λt)dt = e^(−100λ) </li></ul><ul><li>P(T≥100 HRS | T>99) = [∫ FROM 100 TO ∞ OF λe^(−λt)dt] / [∫ FROM 99 TO ∞ OF λe^(−λt)dt] </li></ul><ul><li>= e^(−100λ)/e^(−99λ) = e^(−λ). </li></ul><ul><li>ANOTHER EXAMPLE: </li></ul><ul><li>f(x) = x for 0≤x<1, 2−x for 1≤x<2, and 0 otherwise. </li></ul><ul><li>LET'S LOOK AT THE GRAPHICAL REPRESENTATION </li></ul>
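The conditional probability above is the memoryless property of the exponential; a quick numerical check (λ = 0.01 per hour is an illustrative value, not from the slides):

```python
import math

lam = 0.01  # assumed failure rate per hour (illustrative)
p_ge_100 = math.exp(-100 * lam)   # P(T >= 100)
p_ge_99 = math.exp(-99 * lam)     # P(T > 99)
p_conditional = p_ge_100 / p_ge_99  # P(T >= 100 | T > 99), should equal e^(-lam)
```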
  31. 31. PROBABILITY FUNDAMENTALS [FIGURE: TRIANGULAR DENSITY f(x) ON 0≤x≤2]
  32. 32. PROBABILITY FUNDAS <ul><li>THE PROBABILITY THAT −1<X<1/2 = </li></ul><ul><li>∫ FROM −1 TO 0 OF 0dx + ∫ FROM 0 TO 1/2 OF xdx = [(1/2)² − 0²]/2 = 1/8 </li></ul><ul><li>SIMILARLY WE CAN FIND THAT P(X≤3/2) = 7/8. </li></ul><ul><li>P(X≤3) = 1 & P(X≥2.5) = 0 </li></ul><ul><li>IN DESCRIBING PROBABILITY FUNCTIONS A MATHEMATICAL MODEL IS USUALLY EMPLOYED. THE AREA UNDER THE DENSITY FUNCTION CORRESPONDS TO PROBABILITY AND THE TOTAL AREA IS 1. </li></ul><ul><li>MEAN OF THE RANDOM VARIABLE: </li></ul><ul><li>µ = Σ xi·p(xi) FOR DISCRETE X </li></ul><ul><li>µ = ∫ FROM −∞ TO ∞ OF x·f(x)dx FOR CONTINUOUS X </li></ul>
  33. 33. PROBABILITY FUNDAS <ul><li>CONSIDER A COIN TOSSING EXPERIMENT (THREE TOSSES) WHERE THE RANDOM VARIABLE X REPRESENTS THE NUMBER OF HEADS. </li></ul><ul><li>P(0 HEADS) = 1/8; P(H=1) = 3/8; P(H=2) = 3/8; P(H=3) = 1/8 </li></ul><ul><li>µ = 0·(1/8) + 1·(3/8) + 2·(3/8) + 3·(1/8) = 12/8 = 3/2 </li></ul><ul><li>SIMILARLY FOR THE TRIANGULAR DENSITY FUNCTION, </li></ul><ul><li>µ = ∫ FROM 0 TO 1 OF x·xdx + ∫ FROM 1 TO 2 OF x·(2−x)dx = 1/3 + ({4−1} − {8/3 − 1/3}) </li></ul><ul><li>= 1/3 + ({3 − 7/3}) = 1/3 + 2/3 = 1. </li></ul><ul><li>VARIANCE: SPREAD OR DISPERSION AROUND THE MEAN </li></ul><ul><li>σ² = Σ (xi − µ)²·p(xi) FOR DISCRETE X. </li></ul><ul><li>σ² = ∫ FROM −∞ TO +∞ OF (x − µ)²·f(x)dx FOR CONTINUOUS X. </li></ul>
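The discrete mean above can be reproduced with exact fractions:

```python
from fractions import Fraction as F

# three fair tosses, X = number of heads
pmf = {0: F(1, 8), 1: F(3, 8), 2: F(3, 8), 3: F(1, 8)}
mu = sum(x * p for x, p in pmf.items())  # 3/2
```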
  34. 34. PROBABILITY FUNDAS <ul><li>FOR THE COIN TOSSING EXPERIMENT (TWO TOSSES) </li></ul><ul><li>σ² = (0−1)²·(1/4) + (1−1)²·(1/2) + (2−1)²·(1/4) = 1/2 </li></ul><ul><li>THE UNITS OF THE RANDOM VARIABLE AND THE MEAN ARE THE SAME, WHEREAS THE UNITS OF THE VARIANCE ARE SQUARED. </li></ul><ul><li>ANOTHER MEASURE OF DISPERSION, CALLED THE STANDARD DEVIATION, IS THE SQUARE ROOT OF THE VARIANCE = σ. </li></ul><ul><li>ANOTHER FORMULA FOR VARIANCE: </li></ul><ul><li>σ² = Σ xi²·p(xi) − µ² = E(X²) − [E(X)]² </li></ul>
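Both variance formulas can be checked against the two-toss example with exact fractions:

```python
from fractions import Fraction as F

pmf = {0: F(1, 4), 1: F(1, 2), 2: F(1, 4)}  # two fair tosses, X = number of heads
mu = sum(x * p for x, p in pmf.items())
var_definition = sum((x - mu) ** 2 * p for x, p in pmf.items())
var_shortcut = sum(x * x * p for x, p in pmf.items()) - mu ** 2  # E(X^2) - [E(X)]^2
```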
  35. 35. PROBABILITY FUNDAS <ul><li>PROPERTIES OF VARIANCE: </li></ul><ul><li>VAR(c) = 0 FOR ALL CONSTANTS c. </li></ul><ul><li>VAR(mX + b) = m²·VAR(X) </li></ul><ul><li>VAR(X + Y) = VAR(X) + VAR(Y), IF X & Y ARE INDEPENDENT </li></ul><ul><li>VAR(X + Y) = VAR(X) + 2COV(X,Y) + VAR(Y) IN GENERAL </li></ul>
  36. 36. PROBABILITY FUNDAS <ul><li>SKEWNESS: </li></ul><ul><li>ONE POSSIBLE MEASURE IS E[(X−µx)³]. IF THIS NUMBER IS >0 THEN THE DISTRIBUTION IS SKEWED TO THE RIGHT, AND IF IT'S <0 THEN THE DISTRIBUTION IS SKEWED TO THE LEFT. </li></ul><ul><li>TO DESCRIBE THE DEGREE OF SKEWNESS THE STATISTIC THAT'S OFTEN USED IS </li></ul><ul><li>γx = E[((X−µx)/σx)³] </li></ul><ul><li>IN A POSITIVELY SKEWED DISTRIBUTION, THE MODE IS AT THE HIGHEST POINT OF THE DISTRIBUTION, THE MEDIAN IS TO THE RIGHT OF THAT, AND THE MEAN IS TO THE RIGHT OF BOTH THE MODE AND THE MEDIAN. </li></ul><ul><li>KURTOSIS </li></ul><ul><li>THE DIFFERENCE IN PEAKEDNESS BETWEEN TWO DISTRIBUTIONS: </li></ul><ul><li>= E[((X−µx)/σx)⁴] </li></ul>
  37. 37. PROBABILITY FUNDAS <ul><li>E[((X−µx)/σx)⁴] </li></ul><ul><li>KURTOSIS IS A MEASURE OF WHETHER A DISTRIBUTION IS MORE OR LESS PEAKED THAN A NORMAL DISTRIBUTION. </li></ul><ul><li>LEPTOKURTIC: MORE PEAKED THAN A NORMAL </li></ul><ul><li>PLATYKURTIC: LESS PEAKED THAN A NORMAL. </li></ul><ul><li>HIGH KURTOSIS ALSO MEANS THAT A DISTRIBUTION IS MORE PEAKED AND HAS FATTER TAILS. THIS IMPLIES MORE RETURNS CLUSTERED AROUND THE MEAN AND MORE RETURNS WITH LARGE DEVIATIONS FROM THE MEAN. </li></ul><ul><li>FOR ALL NORMAL DISTRIBUTIONS KURTOSIS IS EQUAL TO 3. EXCESS KURTOSIS IS MEASURED KURTOSIS LESS THREE. A LEPTOKURTIC DISTRIBUTION HAS EXCESS KURTOSIS GREATER THAN ZERO AND A PLATYKURTIC DISTRIBUTION HAS EXCESS KURTOSIS LESS THAN ZERO. </li></ul><ul><li>MOST EQUITY RETURN SERIES HAVE BEEN FOUND TO BE LEPTOKURTIC. </li></ul><ul><li>THIS IMPLIES THAT IF WE USE STATISTICAL MODELS THAT DON'T ALLOW FOR FATTER TAILS, WE WILL UNDERESTIMATE THE LIKELIHOOD OF VERY BAD OR VERY GOOD OUTCOMES. </li></ul><ul><li>FOR EXAMPLE, THE RETURN OF THE S&P 500 FOR 19TH OCTOBER 1987 WAS 20 STANDARD DEVIATIONS AWAY FROM THE MEAN DAILY RETURN. </li></ul><ul><li>IF DAILY RETURNS ARE NORMALLY DISTRIBUTED, THEN RETURNS FOUR STANDARD DEVIATIONS FROM THE MEAN SHOULD OCCUR ONCE EVERY 50 YEARS. </li></ul><ul><li>THE MONTHLY RETURN SERIES OF THE S&P 500 HAS VERY LARGE KURTOSIS (9.5), AND BY CONTRAST THE ANNUAL S&P RETURN SERIES HAS VERY SMALL NEGATIVE EXCESS KURTOSIS (−0.2) </li></ul>
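Excess kurtosis of a sample can be sketched directly from the fourth standardized moment (population form, standard library only; the function name is my own):

```python
import statistics

def excess_kurtosis(xs):
    """Fourth standardized moment minus 3 (population form)."""
    mu = statistics.fmean(xs)
    sigma = statistics.pstdev(xs)
    n = len(xs)
    return sum(((x - mu) / sigma) ** 4 for x in xs) / n - 3.0

# a symmetric two-point distribution has the minimum possible kurtosis: excess = -2
flat = excess_kurtosis([-1.0, 1.0, -1.0, 1.0])
```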
  38. 38. PROBABILITY FUNDAS POSITIVE SKEW NEGATIVE SKEW
  39. 39. PROBABILITY FUNDAS <ul><li>INDEPENDENCE OF RANDOM VARIABLES: </li></ul><ul><li>X1, X2 ARE INDEPENDENT IF AND ONLY IF p(x1i, x2j) = p(x1i)·p(x2j) FOR ALL i & j. </li></ul><ul><li>IF THE VARIABLES ARE CONTINUOUS, WE SAY THAT THEY ARE INDEPENDENT IF AND ONLY IF f(x1,x2) = f1(x1)·f2(x2) </li></ul><ul><li>IF X1 AND X2 ARE INDEPENDENT, THEN E(X1·X2) = E(X1)·E(X2). </li></ul><ul><li>COVARIANCE & CORRELATION: </li></ul><ul><li>COVARIANCE = E[(X−µX)·(Y−µY)] </li></ul><ul><li>IT CAN BE SHOWN THAT COV(X,Y) = E(XY) − E(X)·E(Y) </li></ul><ul><li>THEREFORE IF X & Y ARE INDEPENDENT THEN COV(X,Y) = 0. </li></ul><ul><li>NOTICE THAT FOR ANY GIVEN PAIR OF VALUES OF X & Y THE COVARIANCE CONTRIBUTION WILL BE POSITIVE IF BOTH X & Y ARE ON THE SAME SIDE OF THEIR MEANS, WHEREAS IT WILL BE NEGATIVE IF ONE IS ABOVE ITS AVERAGE AND THE OTHER IS BELOW ITS AVERAGE. </li></ul>
  40. 40. PROBABILITY FUNDAS <ul><li>INDEPENDENCE OF RANDOM VARIABLES: </li></ul><ul><li>COVARIANCE & CORRELATION: </li></ul><ul><li>CORRELATION IS A STATISTIC THAT MEASURES BOTH THE SIGN AND THE DEGREE OF ASSOCIATION BETWEEN TWO RANDOM VARIABLES. </li></ul><ul><li>ρxy = Cov(x,y)/[σx·σy] </li></ul><ul><li>−1 ≤ ρxy ≤ 1 </li></ul><ul><li>THIS PROPERTY FOLLOWS FROM THE CAUCHY-SCHWARZ INEQUALITY </li></ul><ul><li>[E(XY)]² ≤ E[X²]·E[Y²] </li></ul><ul><li>THEREFORE WE CAN SAY THAT {E[(X−µX)(Y−µY)]}² ≤ E[(X−µX)²]·E[(Y−µY)²]. </li></ul><ul><li>THEREFORE IT FOLLOWS THAT ρ² ≤ 1 & −1 ≤ ρxy ≤ 1. </li></ul><ul><li>ANOTHER IMPORTANT EQUALITY IS </li></ul><ul><li>Var(aX+bY) = a²Var(X) + 2abCov(X,Y) + b²Var(Y) </li></ul>
  41. 41. PROBABILITY FUNDAS <ul><li>INDEPENDENCE OF RANDOM VARIABLES: </li></ul><ul><li>COVARIANCE & CORRELATION: </li></ul>
  42. 42. PROBABILITY FUNDAS <ul><li>SOME IMPORTANT DISCRETE DISTRIBUTIONS: </li></ul><ul><li>BERNOULLI TRIALS: A TRIAL WITH TWO POSSIBILITIES, S/F, IS KNOWN AS A BERNOULLI TRIAL. </li></ul><ul><li>SUPPOSE WE CONDUCT n BERNOULLI TRIALS. IT'S CALLED A BERNOULLI PROCESS IF THE TRIALS ARE INDEPENDENT, EACH TRIAL HAS TWO POSSIBLE OUTCOMES {S} OR {F}, & THE PROBABILITY OF SUCCESS REMAINS CONSTANT FROM TRIAL TO TRIAL. </li></ul><ul><li>FURTHER, FOR EACH SUCCESS YOU GET 1 AND FOR EACH FAILURE YOU GET 0. </li></ul><ul><li>LET p BE THE PROBABILITY OF SUCCESS AND 1−p BE THE PROBABILITY OF FAILURE, OR LET'S CALL IT q; q BEING EQUAL TO 1−p </li></ul><ul><li>E(Xi) = 0·q + 1·p = p. V(Xi) = [0²·q + 1²·p] − p² = p·(1−p) </li></ul><ul><li>SUPPOSE AN EXPERIMENT CONSISTS OF THREE BERNOULLI TRIALS AND THE PROBABILITY OF SUCCESS IS p ON EACH TRIAL. THE RANDOM VARIABLE X DEFINES THE NUMBER OF SUCCESSES. IT CAN TAKE ANY VALUE FROM 0 TO 3, AS SHOWN IN THE NEXT SLIDE. </li></ul>
  43. 43. PROBABILITY FUNDAS <ul><li>BERNOULLI TRIALS: </li></ul><ul><li>x=0: {FFF}, p(x) = q·q·q = q³ </li></ul><ul><li>x=1: {FFS, FSF, SFF}, p(x) = 3·p·q² </li></ul><ul><li>x=2: {FSS, SFS, SSF}, p(x) = 3·p²·q </li></ul><ul><li>x=3: {SSS}, p(x) = p³ </li></ul>
  44. 44. PROBABILITY FUNDAS <ul><li>THE BINOMIAL DISTRIBUTION: </li></ul><ul><li>THE RANDOM VARIABLE X THAT DENOTES THE NUMBER OF SUCCESSES IN n BERNOULLI TRIALS HAS A BINOMIAL DISTRIBUTION GIVEN BY p(x) = C(n,x)·p^x·(1−p)^(n−x) FOR x = 0,1,2,3…n. </li></ul><ul><li>THE MEAN CAN BE DETERMINED BY </li></ul><ul><li>E(X) = Σ FROM x=0 TO n OF x·n!·p^x·q^(n−x) / [x!·(n−x)!] </li></ul><ul><li>E(X) = np · Σ FROM x=1 TO n OF (n−1)!·p^(x−1)·q^(n−x) / [(x−1)!·(n−x)!] </li></ul><ul><li>LETTING y = x−1, </li></ul>
  45. 45. PROBABILITY FUNDAS <ul><li>THE BINOMIAL DISTRIBUTION: </li></ul><ul><li>E(X) = np · Σ FROM y=0 TO n−1 OF (n−1)!·p^y·q^(n−1−y) / [y!·(n−1−y)!] </li></ul><ul><li>THE SUMMATION OF TERMS = 1 AND THEREFORE E(X) = np. </li></ul><ul><li>SIMILARLY IT CAN BE SHOWN THAT VARIANCE = npq. </li></ul><ul><li>A MUCH EASIER APPROACH WOULD HAVE BEEN TO CONSIDER X AS A SUM OF n INDEPENDENT RANDOM VARIABLES, EACH WITH MEAN p AND VARIANCE pq, SO THAT X = X1+X2+X3+…Xn. THEN </li></ul><ul><li>E(X) = p + p + p… n TIMES = np & SIMILARLY VARIANCE = npq. </li></ul><ul><li>THE CUMULATIVE BINOMIAL DISTRIBUTION FUNCTION F HAS BEEN EXTENSIVELY TABLED. </li></ul>
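The results E(X) = np and Var(X) = npq can be verified by brute force over the pmf (n and p below are arbitrary illustrative values):

```python
from math import comb

n, p = 10, 0.3  # illustrative values
q = 1 - p
pmf = [comb(n, x) * p**x * q**(n - x) for x in range(n + 1)]
mean = sum(x * px for x, px in enumerate(pmf))
var = sum(x * x * px for x, px in enumerate(pmf)) - mean**2  # E(X^2) - [E(X)]^2
```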
  46. 46. PROBABILITY FUNDAS <ul><li>THE BINOMIAL DISTRIBUTION: </li></ul><ul><li>THE BINOMIAL DISTRIBUTION IS SYMMETRIC IF p = 0.5. </li></ul><ul><li>THIS FOLLOWS FROM THE FACT THAT THE NUMBER OF WAYS IN WHICH 3 SUCCESSES CAN BE ACHIEVED OUT OF 5 IS THE SAME AS FOR 2 OUT OF 5. SIMILARLY 1 OUT OF 5 IS THE SAME AS 4 OUT OF 5. </li></ul>
  47. 47. PROBABILITY FUNDAS <ul><li>THE GEOMETRIC DISTRIBUTION: </li></ul><ul><li>THIS IS ALSO RELATED TO A SEQUENCE OF BERNOULLI TRIALS. </li></ul><ul><li>THE RANDOM VARIABLE HERE IS DEFINED AS THE NUMBER OF TRIALS REQUIRED TO ACHIEVE THE FIRST SUCCESS. </li></ul><ul><li>p(x) = q^(x−1)·p, x = 1,2,… </li></ul><ul><li>IT CAN BE SEEN EASILY THAT IT'S A PROBABILITY DISTRIBUTION, SINCE Σ p·q^(x−1) (where x varies from 1 to ∞) </li></ul><ul><li>= p·Σ q^k (k varies from 0 to ∞). THIS IS EQUAL TO p·[1/(1−q)] = 1. </li></ul><ul><li>µ = E(X) = Σ FROM x=1 TO ∞ OF x·p·q^(x−1) = p·d/dq[Σ q^x] = p·d/dq[q/(1−q)] </li></ul><ul><li>E(X) = p·1/(1−q)² = 1/p </li></ul><ul><li>SIMILARLY, VARIANCE = q/p² </li></ul>
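Truncating the infinite sums gives a quick numerical check of E(X) = 1/p and Var(X) = q/p² (p = 0.25 is an illustrative value):

```python
p = 0.25  # illustrative success probability
q = 1 - p
terms = range(1, 2000)  # q**2000 is negligibly small, so truncation error is tiny
mean = sum(x * p * q**(x - 1) for x in terms)
var = sum(x * x * p * q**(x - 1) for x in terms) - mean**2
```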
  48. 48. PROBABILITY FUNDAS <ul><li>THE PASCAL DISTRIBUTION: </li></ul><ul><li>THE BASIS IS BERNOULLI TRIALS. </li></ul><ul><li>A LOGICAL EXTENSION OF THE GEOMETRIC DISTRIBUTION. </li></ul><ul><li>IN THIS CASE THE RANDOM VARIABLE X DENOTES THE TRIAL ON WHICH THE rth SUCCESS OCCURS, WHERE r IS AN INTEGER. </li></ul><ul><li>p(x) = C(x−1, r−1)·p^r·(1−p)^(x−r) FOR x = r, r+1, r+2,… </li></ul><ul><li>HERE THERE MUST BE r−1 SUCCESSES IN THE FIRST x−1 TRIALS, & THE FIRST TERM IN THE ABOVE EXPRESSION JUST REFLECTS THAT. </li></ul><ul><li>HERE µ = r/p & σ² = rq/p². </li></ul>
  49. 49. PROBABILITY FUNDAS <ul><li>THE MULTINOMIAL DISTRIBUTION: </li></ul><ul><li>ASSUME AN EXPERIMENT WHERE THE SAMPLE SPACE IS PARTITIONED INTO k MUTUALLY EXCLUSIVE EVENTS, SAY B1, B2,…Bk. </li></ul><ul><li>WE CONSIDER n INDEPENDENT REPETITIONS OF THE EXPERIMENT & LET pi = P(Bi) BE CONSTANT FROM TRIAL TO TRIAL. </li></ul><ul><li>THE RANDOM VECTOR {X1,X2,X3…Xk} HAS THE FOLLOWING DISTRIBUTION, WHERE Xi IS THE NUMBER OF TIMES Bi OCCURS IN THE n REPETITIONS. </li></ul><ul><li>P(x1, x2, x3,…xk) = [n!/(x1!·x2!·x3!…xk!)]·p1^x1·p2^x2…pk^xk </li></ul><ul><li>HERE Σ xi = n FOR ANY n REPETITIONS. </li></ul><ul><li>E(Xi) = n·pi; V(Xi) = n·pi·(1−pi) </li></ul>
  50. 50. PROBABILITY FUNDAS <ul><li>THE EXPONENTIAL DISTRIBUTION: </li></ul><ul><li>f(x) = λe^(−λx) for x≥0, = 0 otherwise. </li></ul><ul><li>λ IS A POSITIVE REAL CONSTANT. </li></ul><ul><li>POISSON DISTRIBUTION: GIVES THE NUMBER OF OCCURRENCES IN TIME t & </li></ul><ul><li>P(x) = e^(−λt)·(λt)^x/x! </li></ul><ul><li>WHAT'S THE PROBABILITY THAT 0 OCCURRENCES OCCUR IN TIME t? I.E. p(0) = e^(−λt). ANOTHER WAY TO LOOK AT THIS IS AS THE PROBABILITY THAT THE FIRST OCCURRENCE IS AT TIME T>t, I.E. </li></ul><ul><li>P(0) = P(T>t) = e^(−λt). IF T IS CONSIDERED AS A RANDOM VARIABLE WHICH DENOTES THE TIME TO OCCURRENCE, THEN </li></ul><ul><li>F(t) (CUMULATIVE PROB. FN.) = P(T≤t) = 1 − e^(−λt). </li></ul><ul><li>f(t), WHICH IS THE DERIVATIVE OF F(t), IS THE PROB. DENSITY FN. = λe^(−λt) </li></ul>
  51. 51. PROBABILITY FUNDAS <ul><li>RELATIONSHIP BETWEEN EXPONENTIAL AND POISSON DISTRIBUTION: </li></ul><ul><li>IF THE NUMBER OF OCCURRENCES HAS A POISSON DISTRIBUTION, THEN THE TIME BETWEEN OCCURRENCES HAS AN EXPONENTIAL DISTRIBUTION. FOR EXAMPLE, IF THE NUMBER OF ORDERS RECEIVED PER WEEK HAS A POISSON DISTRIBUTION, THEN THE TIME BETWEEN ORDERS WOULD HAVE AN EXPONENTIAL DISTRIBUTION. </li></ul><ul><li>∫ FROM 0 TO ∞ OF λe^(−λx)dx = [−e^(−λx)] FROM 0 TO ∞ = 1. </li></ul><ul><li>E(X) = ∫ FROM 0 TO ∞ OF x·λe^(−λx)dx = [−xe^(−λx)] FROM 0 TO ∞ + ∫ FROM 0 TO ∞ OF e^(−λx)dx = 1/λ </li></ul><ul><li>V(X) = 1/λ² </li></ul>
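A small simulation illustrates the link: drawing exponential inter-arrival gaps and checking that the mean gap is close to 1/λ (the rate λ = 2.0 and the sample size are illustrative choices, not from the slides):

```python
import random

random.seed(0)  # deterministic for reproducibility
lam = 2.0  # assumed arrival rate per unit time (illustrative)
gaps = [random.expovariate(lam) for _ in range(100_000)]
mean_gap = sum(gaps) / len(gaps)  # should be close to E(X) = 1/lam = 0.5
```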
  52. 52. PROBABILITY FUNDAS [FIGURE: EXPONENTIAL DENSITY FUNCTION f(x), WITH INTERCEPT λ AT x = 0]
  53. 53. PROBABILITY FUNDAS [FIGURE: DISTRIBUTION FUNCTION F(x) = 1 − e^(−λx), APPROACHING 1]
  54. 54. PROBABILITY FUNDAS <ul><li>NORMAL DISTRIBUTION </li></ul><ul><li>SYMMETRICAL BELL SHAPED DISTRIBUTION </li></ul><ul><li>IT WAS FIRST PRESENTED IN MATHEMATICAL FORM IN 1733 BY DE MOIVRE, WHO DERIVED IT AS A LIMITING FORM OF THE BINOMIAL DISTRIBUTION. </li></ul><ul><li>IT'S ALSO KNOWN AS THE GAUSSIAN DISTRIBUTION (THROUGH HISTORICAL ERROR). </li></ul><ul><li>A RANDOM VARIABLE X IS SAID TO HAVE A NORMAL DISTRIBUTION WITH MEAN µ (−∞<µ<∞) & VARIANCE σ². IT HAS DENSITY FUNCTION f(x) = [1/(σ√(2π))]·e^(−(1/2)·[(x−µ)/σ]²) </li></ul>
  55. 55. PROBABILITY FUNDAS <ul><li>NORMAL DISTRIBUTION </li></ul><ul><li>IT'S USED SO EXTENSIVELY THAT THERE IS A SHORTHAND NOTATION FOR "X IS NORMALLY DISTRIBUTED WITH MEAN µ & VARIANCE σ²": </li></ul><ul><li>X IS REPRESENTED BY N(µ, σ²) </li></ul><ul><li>IT HAS THE FOLLOWING IMPORTANT PROPERTIES </li></ul><ul><li>∫ FROM −∞ TO ∞ OF f(x)dx = 1 </li></ul><ul><li>f(x)≥0 for all x </li></ul><ul><li>f(x)→0 as x approaches ∞ or −∞ </li></ul><ul><li>f(µ+x) = f(µ−x); THE DENSITY IS SYMMETRIC ABOUT µ </li></ul><ul><li>The maximum value of f occurs at x=µ </li></ul><ul><li>The points of inflection of f are at x = µ ± σ </li></ul>
  56. 56. PROBABILITY FUNDAS <ul><li>NORMAL DISTRIBUTION </li></ul>[FIGURE: BELL-SHAPED DENSITY f(x) CENTRED AT x = µ]
  57. 57. PROBABILITY FUNDAS <ul><li>F(x) = P(X≤x) = P(Z≤(x−µ)/σ) = ∫ FROM −∞ TO (x−µ)/σ OF [1/√(2π)]·e^(−z²/2)dz. </li></ul><ul><li>Z HERE IS DEFINED AS THE STANDARD NORMAL DISTRIBUTED </li></ul><ul><li>VARIABLE. THIS HAS MEAN 0 AND VARIANCE 1. </li></ul><ul><li>φ(z) = [1/√(2π)]·e^(−z²/2), −∞<z<∞. DENSITY FUNCTION </li></ul><ul><li>Φ(z) = ∫ FROM −∞ TO z OF [1/√(2π)]·e^(−u²/2)du. CUMULATIVE FUNCTION </li></ul><ul><li>THE STANDARD NORMAL DISTRIBUTION IS WELL TABULATED. </li></ul><ul><li>EXAMPLE: SUPPOSE X HAS A NORMAL DISTRIBUTION N(100,4). WE WISH TO EVALUATE F(104), I.E. P(X≤104) = Φ((104−100)/2) = Φ(2) = 0.9772. </li></ul><ul><li>z MEASURES THE DEPARTURE OF X FROM THE MEAN IN STD. DEVIATION UNITS. IN OUR EXAMPLE 104 IS TWO STANDARD DEVIATIONS AWAY FROM 100. </li></ul>
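Instead of tables, Φ can be sketched from the error function in the standard library (the function name `std_normal_cdf` is my own):

```python
import math

def std_normal_cdf(z):
    """Standard normal CDF via the error function: Phi(z) = (1 + erf(z/sqrt(2)))/2."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# X ~ N(100, 4) has sigma = 2, so P(X <= 104) = Phi(2) ~ 0.9772
p = std_normal_cdf((104 - 100) / 2)
```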
  58. 58. PROBABILITY FUNDAS [FIGURE: STANDARD NORMAL DENSITY φ(z) CENTRED AT z = 0]
  59. 59. PROBABILITY FUNDAS <ul><li>SYMMETRIC INTERVALS: </li></ul><ul><li>P{(µ − 1σ) ≤ X ≤ (µ + 1σ)} = .6826 </li></ul><ul><li>P{(µ − 1.645σ) ≤ X ≤ (µ + 1.645σ)} = .90 </li></ul><ul><li>P{(µ − 1.96σ) ≤ X ≤ (µ + 1.96σ)} = .95 </li></ul><ul><li>P{(µ − 2.57σ) ≤ X ≤ (µ + 2.57σ)} = .99 </li></ul><ul><li>P{(µ − 3σ) ≤ X ≤ (µ + 3σ)} = .9973 </li></ul><ul><li>REPRODUCTIVE PROPERTY OF THE NORMAL DISTRIBUTION: </li></ul><ul><li>SUPPOSE WE HAVE n INDEPENDENT NORMAL VARIABLES X1,X2,X3…Xn, WHERE Xi ~ N(µi, σi²), & Y = X1+X2+X3+…Xn. THEN Y IS NORMAL WITH </li></ul><ul><li>E(Y) = Σ FROM i=1 TO n OF µi & V(Y) = Σ FROM i=1 TO n OF σi² </li></ul>
  60. 60. PROBABILITY FUNDAS <ul><li>NORMAL DISTRIBUTION: </li></ul><ul><li>THE NORMAL DISTRIBUTION CAN BE INTERPRETED IN DIFFERENT WAYS. </li></ul><ul><li>ONE INTERPRETATION IS THAT IT'S THE CONTINUOUS ANALOG OF THE BINOMIAL DISTRIBUTION WITH p = 1/2. </li></ul>
  61. 61. BINOMIAL & B&S CONVERGENCE <ul><li>CENTRAL LIMIT THEOREM: </li></ul><ul><li>IF A RANDOM VARIABLE Y IS THE SUM OF n INDEPENDENT RANDOM VARIABLES WHICH SATISFY CERTAIN GENERAL CONDITIONS, THEN FOR SUFFICIENTLY LARGE n, Y IS APPROXIMATELY NORMALLY DISTRIBUTED. </li></ul><ul><li>X1,X2,X3…Xn IS A SEQUENCE OF n INDEPENDENT VARIABLES WITH E(Xi) = µi & V(Xi) = σi², AND Y = X1+X2+X3+…Xn. THEN, UNDER SOME GENERAL CONDITIONS, </li></ul><ul><li>Zn = [Y − Σµi]/√(Σσi²) HAS APPROXIMATELY THE N(0,1) DISTRIBUTION AS n APPROACHES INFINITY. </li></ul><ul><li>IF ALL µ's ARE THE SAME AND ALL σ's ARE THE SAME, THEN </li></ul><ul><li>Zn = (Y − nµ)/(σ√n) HAS APPROXIMATELY THE N(0,1) DISTRIBUTION. </li></ul>
  62. 62. BINOMIAL & B&S CONVERGENCE <ul><li>CENTRAL LIMIT THEOREM: </li></ul><ul><li>HOW LARGE MUST n BE TO GET REASONABLE RESULTS USING THE NORMAL DISTRIBUTION TO APPROXIMATE Y? THE ANSWER DEPENDS UPON THE DISTRIBUTION OF THE Xi's. </li></ul><ul><li>THUMB RULES: IF THE DISTRIBUTION OF THE Xi's DOESN'T RADICALLY DEPART FROM THE NORMAL DISTRIBUTION, THEN n>=4. </li></ul><ul><li>IF THE Xi's HAVE A UNIFORM DENSITY, THEN n>=12. </li></ul><ul><li>IF THE DISTRIBUTION IS ILL BEHAVED, WITH SUBSTANTIAL MASS IN THE TAILS, THEN n>=100. </li></ul>
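The uniform thumb rule (n = 12) is easy to demonstrate: the sum of 12 independent U(0,1) variables has mean 6 and variance 12·(1/12) = 1, and its distribution is close to N(6,1). A simulation sketch (sample size is an illustrative choice):

```python
import random
import statistics

random.seed(1)  # deterministic for reproducibility
sums = [sum(random.random() for _ in range(12)) for _ in range(20_000)]
m = statistics.fmean(sums)    # should be near 6
s = statistics.pstdev(sums)   # should be near 1
```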
  63. 63. BINOMIAL & B&S CONVERGENCE <ul><li>NORMAL APPROXIMATION OF THE BINOMIAL DISTRIBUTION: </li></ul><ul><li>THE NORMAL APPROXIMATION TO THE BINOMIAL DISTRIBUTION: </li></ul><ul><li>p(x) = n!/[x!·(n−x)!]·p^x·q^(n−x) </li></ul><ul><li>[X−np]/√(npq) HAS APPROXIMATELY THE N(0,1) DISTRIBUTION IF p IS CLOSE TO ½ AND n>10. </li></ul><ul><li>HOWEVER, FOR OTHER VALUES OF p, n MUST BE FAIRLY LARGE, SO THAT np>5 IF p<=1/2 OR nq>5 WHEN p>1/2. </li></ul><ul><li>NOW LET'S LOOK AT A BINOMIAL TREE IN THE NEXT SLIDE, WHICH CLEARLY SHOWS THAT VARIANCE AND MEAN INCREASE PROPORTIONATELY WITH TIME. </li></ul>
  64. 64. BINOMIAL & B&S CONVERGENCE <ul><li>NORMAL APPROXIMATION OF THE BINOMIAL DISTRIBUTION: </li></ul><ul><li>THE FACT THAT THE MEAN AND VARIANCE ARE PROPORTIONAL TO TIME AND THE STD. DEVIATION IS PROPORTIONAL TO THE SQUARE ROOT OF TIME IS NOT UNIQUE TO BINOMIAL OUTCOMES. </li></ul><ul><li>IN FACT THIS PROPERTY WILL HOLD GOOD FOR ANY PROBABILITY DISTRIBUTION OF RETURNS PROVIDED: </li></ul><ul><li>FIRST, THE RETURNS FROM ONE PERIOD TO THE NEXT ARE INDEPENDENT. </li></ul><ul><li>SECOND, THE MEAN AND VARIANCE OF ONE-PERIOD RETURNS MUST BE STATIONARY. </li></ul><ul><li>AS SUCH, RETURNS MUST BE DRAWN FROM THE SAME PROBABILITY DISTRIBUTION IN EACH PERIOD. </li></ul>
  65. 65. BINOMIAL & B&S CONVERGENCE <ul><li>NORMAL APPROXIMATION OF THE BINOMIAL DISTRIBUTION: </li></ul>θ IS THE TRUE PROBABILITY ASSOCIATED WITH AN INCREASE IN STOCK PRICE. d = EXP(µτ/n − σ√(τ/n)·√((1−θ)/θ)); u = EXP(µτ/n + σ√(τ/n)·√((1−θ)/θ)). IF WE USE
  66. 66. BINOMIAL & B&S CONVERGENCE <ul><li>IN THE EQUATIONS SHOWN IN THE PREVIOUS SLIDE IT CAN BE SEEN THAT CONVERGENCE TO THE B&S VALUE IS FASTEST WHEN </li></ul><ul><li>THE TRUE PROBABILITY = 0.5 AND µa (CONT. COMP. RETURN) = ra − 0.5σ² </li></ul><ul><li>USING THIS VALUE OF µa CAUSES THE RISK NEUTRAL PROBABILITY TO CONVERGE TO 0.5 AT A FASTER RATE. </li></ul><ul><li>IN PRACTICE IT IS COMMON TO SET µa TO 0 AND θ = 0.5, AND UNDER THESE CONDITIONS THE VALUES OF u & d ARE </li></ul><ul><li>u = e^(σ√(T/n)) & d = e^(−σ√(T/n)) </li></ul><ul><li>IF u < e^(rT/n), THEN IT IS ADVISABLE TO USE µa (CONT. COMP. RETURN) = ra − 0.5σ² WHILE CALCULATING THE VALUES OF u & d. </li></ul>
  67. 67. PROBABILITY FUNDAS
  69. 69. LOGNORMAL DISTRIBUTION <ul><li>LOGNORMAL DISTRIBUTION: </li></ul><ul><li>SIMPLEST DESCRIPTION: THE DENSITY FUNCTION OF A VARIABLE WHOSE LOGARITHM FOLLOWS A NORMAL PROBABILITY DISTRIBUTION </li></ul><ul><li>RANDOM VARIABLE X WITH RANGE SPACE Rx: [x: 0<x<∞] </li></ul><ul><li>Y = log_e X = ln X IS NORMALLY DISTRIBUTED WITH MEAN µy AND VARIANCE σ²y. </li></ul><ul><li>E(X) = µx = e^(µy + 0.5σ²y) </li></ul><ul><li>σ²x = e^(2µy + σ²y)·(e^(σ²y) − 1) </li></ul>
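The mean formula can be checked by simulation: exponentiate normal draws and compare the sample mean with e^(µy + σ²y/2) (the parameter values and sample size below are illustrative):

```python
import math
import random
import statistics

random.seed(2)  # deterministic for reproducibility
mu_y, sigma_y = 0.5, 0.3  # parameters of the underlying normal (illustrative)
xs = [math.exp(random.gauss(mu_y, sigma_y)) for _ in range(200_000)]
sample_mean = statistics.fmean(xs)
closed_form = math.exp(mu_y + 0.5 * sigma_y**2)  # E(X) = e^(mu_y + sigma_y^2 / 2)
```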
  70. 70. LOGNORMAL DISTRIBUTION <ul><li>LOGNORMAL DISTRIBUTION: </li></ul><ul><li>A LOGNORMAL DISTRIBUTION IS BOUNDED BY ZERO BELOW AND IS SKEWED TO THE RIGHT. </li></ul><ul><li>A LOGNORMAL DISTRIBUTION IS USEFUL FOR DESCRIBING THE PRICES OF MANY FINANCIAL ASSETS, AND A NORMAL DISTRIBUTION IS OFTEN A GOOD APPROXIMATION FOR RETURNS. </li></ul><ul><li>A LOGNORMAL DISTRIBUTION IS DEFINED BY A MEAN AND VARIANCE WHICH IN TURN ARE DERIVED FROM THE MEAN AND VARIANCE OF ITS ASSOCIATED NORMAL DISTRIBUTION. </li></ul><ul><li>WHEN σ INCREASES, THE LOGNORMAL CAN SPREAD OUTWARDS TO THE RIGHT BUT CAN'T SPREAD BELOW ZERO; THEREFORE ITS MEAN INCREASES. </li></ul><ul><li>A NORMAL DISTRIBUTION IS A CLOSER FIT FOR QUARTERLY AND YEARLY HOLDING RETURNS THAN IT IS FOR DAILY OR WEEKLY RETURNS. </li></ul><ul><li>A NORMAL DISTRIBUTION IS LESS SUITABLE FOR ASSET PRICES, SINCE PRICES CAN'T FALL BELOW ZERO. </li></ul>
  71. 71. LOGNORMAL DISTRIBUTION <ul><li>LOGNORMAL DISTRIBUTION: </li></ul><ul><li>LN(ST/S0) = r(0,T) REPRESENTS THE CONTINUOUSLY COMPOUNDED RETURN. </li></ul><ul><li>ST/S0 = (ST/ST−1)·(ST−1/ST−2)…(S1/S0). TAKING LOGS ON BOTH SIDES, </li></ul><ul><li>r(0,T) = r(T−1,T) + r(T−2,T−1) +…+ r(0,1) </li></ul><ul><li>USING CONTINUOUSLY COMPOUNDED RETURNS, THE HOLDING PERIOD RETURN IS THE SUM OF THE SMALLER PERIODS' CONT. COMP. RETURNS. </li></ul><ul><li>ASSUMING THAT ONE-PERIOD RETURNS ARE INDEPENDENT AND IDENTICALLY DISTRIBUTED (IID), IT CAN BE SHOWN THAT THE MEANS GET ADDED UP AND THE VARIANCES GET ADDED UP. </li></ul>
  72. 72. LOGNORMAL DISTRIBUTION <ul><li>LOGNORMAL DISTRIBUTION: </li></ul><ul><li>Ln(St/S0), WHICH REPRESENTS THE CONTINUOUSLY COMPOUNDED RETURN, IS NORMALLY DISTRIBUTED WITH MEAN = (µ − ½σ²)T AND STD. DEVIATION = σ√T </li></ul><ul><li>A VARIABLE LIKE A STOCK PRICE, WHICH CAN TAKE VALUES BETWEEN 0 AND INFINITY, IS LOGNORMALLY DISTRIBUTED. </li></ul><ul><li>(1 + STOCK RETURN) CAN TAKE VALUES BETWEEN 0 & INFINITY AND IS LOGNORMALLY DISTRIBUTED. THE LN OF (1+STOCK RETURN) = LN(St+1/St), WHICH REPRESENTS THE CONTINUOUSLY COMPOUNDED RETURN, IS NORMALLY DISTRIBUTED. </li></ul>
  73. 73. LOGNORMAL DISTRIBUTION <ul><li>LOGNORMAL DISTRIBUTION: </li></ul>[FIGURE: LOGNORMAL DENSITY, BOUNDED BELOW AT 0 AND SKEWED TO THE RIGHT]
  74. 74. LOGNORMAL DISTRIBUTION <ul><li>CONSIDER A STOCK WITH AN INITIAL PRICE OF $40, AN EXPECTED RETURN OF 16% PER ANNUM AND A VOLATILITY OF 20% PER ANNUM. </li></ul><ul><li>LN(St) ~ Φ[ln40 + (0.16 − 0.2²/2)·0.5; 0.20·√0.5] REPRESENTS THE DISTRIBUTION OF THE STOCK PRICE 6 MONTHS FROM NOW </li></ul><ul><li>Ln(St) ~ Φ(3.759, 0.141) </li></ul><ul><li>THERE IS A 95% PROBABILITY THAT A NORMALLY DISTRIBUTED VARIABLE HAS A VALUE WITHIN 1.96 STD. DEVIATIONS OF THE MEAN. THIS MEANS THAT </li></ul><ul><li>3.759 − 1.96·0.141 < LN(St) < 3.759 + 1.96·0.141 </li></ul><ul><li>OR e^3.482 < St < e^4.035, OR 32.55 < St < 56.56 </li></ul>
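The interval in the example can be reproduced directly:

```python
import math

S0, mu, sigma, T = 40.0, 0.16, 0.20, 0.5  # initial price, drift, volatility, horizon
m = math.log(S0) + (mu - 0.5 * sigma**2) * T  # mean of ln(S_T), ~3.759
s = sigma * math.sqrt(T)                      # std dev of ln(S_T), ~0.141
lo, hi = math.exp(m - 1.96 * s), math.exp(m + 1.96 * s)  # ~ (32.5, 56.6)
```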
  75. 75. LOGNORMAL DISTRIBUTION <ul><li>LOGNORMAL DISTRIBUTION: </li></ul><ul><li>THE NORMAL DISTRIBUTION HAS ADDITIVE REPRODUCTIVE PROPERTIES. </li></ul><ul><li>THE LOGNORMAL DISTRIBUTION HAS MULTIPLICATIVE REPRODUCTIVE PROPERTIES. </li></ul><ul><li>IF X1 & X2 ARE INDEPENDENT LOGNORMAL VARIABLES WITH PARAMETERS (µY1, σ²Y1) & (µY2, σ²Y2), THEN W = X1·X2 HAS A LOGNORMAL DISTRIBUTION WITH PARAMETERS (µY1 + µY2) & (σ²Y1 + σ²Y2). </li></ul>
  76. 76. LOGNORMAL DISTRIBUTION <ul><li>LOGNORMAL DISTRIBUTION: </li></ul><ul><li>IF X1, X2…XN ARE INDEPENDENT LOGNORMAL VARIATES AND EACH ONE HAS THE SAME PARAMETERS (µY, σ²Y), THEN THE GEOMETRIC MEAN [X1·X2·X3…XN]^(1/N) HAS A LOGNORMAL DISTRIBUTION WITH PARAMETERS µY AND σ²Y/N. </li></ul><ul><li>LET'S LOOK AT AN EXAMPLE: </li></ul><ul><li>Y = Ln(X) HAS A N(10,4) DISTRIBUTION. THEN X HAS A LOGNORMAL DISTRIBUTION WITH MEAN = e^(10+2) = 162754.79 AND VARIANCE = e^24·(e^4 − 1) = 53.598·e^24. </li></ul><ul><li>P(X<1000) = P(ln X < ln 1000) = P(Y ≤ LN 1000), OR P(Z ≤ (LN1000 − 10)/2) = P(Z < −1.55) = .0606 </li></ul>
  77. 77. LOGNORMAL DISTRIBUTION <ul><li>LOGNORMAL DISTRIBUTION: </li></ul><ul><li>A LOGNORMAL DISTRIBUTION ALLOWS FOR MORE UPSIDE MOVEMENT. E.G. RS.100 CONTINUOUSLY COMPOUNDED AT +12% P.A. WOULD LEAD TO 112.75 (+12.75), WHILE A 12% DOWN MOVEMENT WOULD LEAD TO 88.69 (−11.31). </li></ul><ul><li>IN OTHER WORDS, FOR THE SAME UP AND DOWN LEVELS (SAY ±10%) WE REQUIRE A BIGGER MOVEMENT (PROPORTIONAL OR CC) ON THE DOWNSIDE THAN ON THE UPSIDE. </li></ul><ul><li>A 10% OTM CALL WILL BE MORE EXPENSIVE THAN A 10% OTM PUT. SMALLER MOVEMENTS ARE REQUIRED TO CROSS THE STRIKE PRICE FOR A 10% OTM CALL THAN FOR A 10% OTM PUT. </li></ul>
  78. 78. PROBABILITY FUNDAS <ul><li>SAMPLING & DESCRIPTIVE STATISTICS: </li></ul><ul><li>STATISTICS IS THE SCIENCE OF DRAWING CONCLUSIONS ABOUT A POPULATION BASED ON ANALYSIS OF DATA FROM THAT POPULATION. </li></ul><ul><li>MEASURES OF CENTRAL TENDENCY: </li></ul><ul><li>MEAN: [Σ Xi]/n. THE MEAN IS THE AVERAGE OF ALL OBSERVATIONS AND THEREFORE IS IMPACTED BY EXTREME VALUES. </li></ul><ul><li>MEDIAN: THE MEDIAN IS THE VALUE OF THE (n+1)/2-TH OBSERVATION IF n IS ODD. IF n=101 THE MEDIAN IS THE 51ST OBSERVATION, AND IF n=100 THE MEDIAN IS THE 50.5TH OBSERVATION (THE AVERAGE OF THE 50TH AND 51ST). THE MEDIAN IS NOT INFLUENCED BY EXTREME OBSERVATIONS. </li></ul><ul><li>MODE: MOST FREQUENT OBSERVATION. </li></ul><ul><li>IF THE DATA IS SYMMETRIC, THEN THE MEAN & MEDIAN COINCIDE. </li></ul><ul><li>IN ADDITION, IF THE DATA HAS ONLY ONE MODE, THEN THE MEAN, MEDIAN AND MODE ALL COINCIDE. </li></ul>
  79. 79. PROBABILITY FUNDAS <ul><li>STATISTICS & SAMPLING DISTRIBUTIONS: </li></ul><ul><li>IF X1, X2, X3…Xn IS A RANDOM SAMPLE OF SIZE n, THEN THE SAMPLE MEAN X̄ AND THE SAMPLE VARIANCE S² ARE ITS STATISTICS. </li></ul><ul><li>SINCE A STATISTIC IS A FUNCTION OF THE DATA FROM A RANDOM SAMPLE, IT'S A RANDOM VARIABLE. </li></ul><ul><li>IN OTHER WORDS, IF WE TAKE TWO DIFFERENT SAMPLES AND COMPUTE THEIR MEANS, THE MEANS WOULD BE DIFFERENT. </li></ul><ul><li>THE PROCESS OF DRAWING CONCLUSIONS ABOUT POPULATIONS BASED ON SAMPLE DATA MAKES CONSIDERABLE USE OF STATISTICS. </li></ul><ul><li>IN GENERAL WE CALL THE PROBABILITY DISTRIBUTION OF A STATISTIC A SAMPLING DISTRIBUTION. </li></ul><ul><li>LET'S LOOK AT THE SAMPLING DISTRIBUTION OF THE MEAN X̄. </li></ul><ul><li>HERE THE POPULATION MEAN IS µ & THE POPULATION VARIANCE IS σ². </li></ul><ul><li>NOTE THAT EACH OBSERVATION IN THE RANDOM SAMPLE IS NORMALLY DISTRIBUTED WITH MEAN µ AND VARIANCE σ². </li></ul>
  80. 80. PROBABILITY FUNDAS <ul><li>STATISTICS & SAMPLING DISTRIBUTIONS: </li></ul><ul><li>USING THE REPRODUCTIVE PROPERTY OF THE NORMAL DISTRIBUTION, THE MEAN OF THE SAMPLE MEAN WILL BE (1/n)[µ+µ+µ……µ {n times}] = µ </li></ul><ul><li>THE VARIANCE OF THE SAMPLE MEAN X̄ WILL BE </li></ul><ul><li>VAR(X̄) = VAR[(1/n){x1+x2+x3+…xn}] = (1/n²)·[σ² + σ² + σ²……+ σ² (n times)] </li></ul><ul><li>= (1/n²)·[nσ²] = σ²/n </li></ul><ul><li>THEREFORE THE DISTRIBUTION OF X̄ IS NORMAL WITH MEAN µ AND VARIANCE σ²/n. </li></ul><ul><li>THE TERM √(σ²/n) IS ALSO REFERRED TO AS THE STANDARD ERROR. </li></ul>
  81. 81. PROBABILITY FUNDAS <ul><li>ESTIMATORS: </li></ul><ul><li>SUPPOSE THAT X IS A RANDOM VARIABLE WITH MEAN µ & VARIANCE σ². </li></ul><ul><li>WE CAN SHOW THAT THE SAMPLE MEAN X̄ AND SAMPLE VARIANCE S² ARE UNBIASED ESTIMATORS OF µ & σ² RESPECTIVELY. </li></ul><ul><li>E(X̄) = E[Σ Xi/n] (i varies from 1 to n) </li></ul><ul><li>= (1/n)·E[Σ Xi] (i varies from 1 to n) </li></ul><ul><li>= (1/n)·Σ E(Xi) (i varies from 1 to n) </li></ul><ul><li>= (1/n)·Σ µ (i varies from 1 to n) = µ </li></ul><ul><li>SIMILARLY IT CAN BE SHOWN THAT E(S²) = σ². </li></ul>
  82. 82. PROBABILITY FUNDAS <ul><li>E(S²) = E[Σ(Xi − X̄)²/(n−1)] </li></ul><ul><li>= [1/(n−1)]·E[Σ(Xi − X̄)²] </li></ul><ul><li>= [1/(n−1)]·E[Σ(Xi² − 2X̄·Xi + X̄²)] </li></ul><ul><li>= [1/(n−1)]·E[Σ Xi² − n·X̄²] </li></ul><ul><li>= [1/(n−1)]·[Σ E(Xi²) − n·E(X̄²)] </li></ul><ul><li>NOW E(Xi²) = µ² + σ² AND E(X̄²) = µ² + σ²/n. </li></ul><ul><li>= [1/(n−1)]·[Σ(µ² + σ²) − n·(µ² + σ²/n)] = [1/(n−1)]·[nµ² + nσ² − nµ² − σ²] </li></ul><ul><li>= σ². </li></ul><ul><li>THEREFORE THE SAMPLE VARIANCE IS AN UNBIASED ESTIMATE OF THE POPULATION VARIANCE. </li></ul>
  83. 83. PROBABILITY FUNDAS <ul><li>CONFIDENCE INTERVAL ESTIMATION: </li></ul><ul><li>IN MANY SITUATIONS A POINT ESTIMATE DOESN'T PROVIDE ENOUGH INFORMATION ABOUT THE PARAMETER OF INTEREST. </li></ul><ul><li>WE USE AN INTERVAL ESTIMATE OF THE FORM L≤µ≤U. THE END POINTS OF THE INTERVAL ARE RANDOM VARIABLES. </li></ul><ul><li>P(L≤θ≤U) = 1−α. THIS INTERVAL IS CALLED THE 100(1−α) PERCENT CONFIDENCE INTERVAL FOR THE UNKNOWN PARAMETER θ. </li></ul><ul><li>L & U ARE CALLED THE LOWER AND UPPER CONFIDENCE LIMITS. </li></ul><ul><li>(1−α) IS CALLED THE CONFIDENCE COEFFICIENT. </li></ul><ul><li>WE SAY THAT θ LIES IN THE OBSERVED INTERVAL WITH CONFIDENCE 100·(1−α). </li></ul><ul><li>THE ABOVE STATEMENT HAS A FREQUENCY INTERPRETATION, I.E. WE DON'T KNOW IF THE STATEMENT IS TRUE FOR THIS SPECIFIC SAMPLE, BUT THE METHOD USED TO OBTAIN THE INTERVAL [L,U] YIELDS CORRECT STATEMENTS 100·(1−α) PERCENT OF THE TIME. </li></ul>
  84. 84. PROBABILITY FUNDAS <ul><li>CONFIDENCE INTERVAL ESTIMATION: </li></ul><ul><li>THE CONFIDENCE INTERVAL DISCUSSED EARLIER IS CALLED A TWO-SIDED CONFIDENCE INTERVAL, AS IT SPECIFIES BOTH A LOWER AND AN UPPER LIMIT ON θ. </li></ul><ul><li>OCCASIONALLY, A ONE-SIDED CONFIDENCE INTERVAL MIGHT BE MORE APPROPRIATE. A ONE-SIDED 100·(1−α) PERCENT LOWER CONFIDENCE INTERVAL ON θ IS GIVEN BY THE INTERVAL L≤θ, WHERE THE LOWER CONFIDENCE LIMIT L IS CHOSEN SO THAT </li></ul><ul><li>P(L≤θ) = 1−α. SIMILARLY, ON THE UPPER SIDE WE CAN HAVE P(θ≤U) = 1−α. </li></ul><ul><li>THE LENGTH OF THE CONFIDENCE INTERVAL IS AN IMPORTANT MEASURE OF THE QUALITY OF THE INFORMATION OBTAINED FROM THE SAMPLE. </li></ul><ul><li>THE LONGER THE INTERVAL, THE MORE CONFIDENT WE ARE THAT THE INTERVAL ACTUALLY CONTAINS THE TRUE VALUE OF θ. </li></ul><ul><li>ON THE OTHER HAND, THE LONGER THE INTERVAL, THE LESS INFORMATION WE HAVE ABOUT THE TRUE VALUE OF θ. </li></ul><ul><li>IN AN IDEAL SITUATION, WE OBTAIN A RELATIVELY SHORT INTERVAL WITH HIGH CONFIDENCE. </li></ul>
  85. 85. PROBABILITY FUNDAS <ul><li>CONFIDENCE INTERVAL ESTIMATION: </li></ul>[FIGURE: STANDARD NORMAL DENSITY WITH α/2 IN EACH TAIL]
  86. 86. PROBABILITY FUNDAS <ul><li>CONFIDENCE INTERVAL ESTIMATION: </li></ul><ul><li>LET X BE A RANDOM VARIABLE WITH UNKNOWN MEAN µ & KNOWN VARIANCE σ². </li></ul><ul><li>SUPPOSE THAT A RANDOM SAMPLE OF SIZE n, x1,x2,x3…xn, IS TAKEN. </li></ul><ul><li>A 100·(1−α) PERCENT CONFIDENCE INTERVAL ON µ CAN BE OBTAINED BY CONSIDERING THE SAMPLING DISTRIBUTION OF THE SAMPLE MEAN X̄. </li></ul><ul><li>THE MEAN OF X̄ IS µ & THE VARIANCE IS σ²/n. THEREFORE, THE DISTRIBUTION OF THE STATISTIC </li></ul><ul><li>Z = [X̄ − µ]/[σ/√n] IS TAKEN TO BE THE STANDARD NORMAL DISTRIBUTION. </li></ul><ul><li>P(−Zα/2 ≤ Z ≤ Zα/2) = 1−α </li></ul><ul><li>P(−Zα/2 ≤ [X̄ − µ]/[σ/√n] ≤ Zα/2) = 1−α </li></ul><ul><li>OR P[(X̄ − Zα/2·[σ/√n]) ≤ µ ≤ (X̄ + Zα/2·[σ/√n])] = 1−α </li></ul>
  87. 87. PROBABILITY FUNDAS <ul><li>CONFIDENCE INTERVAL ESTIMATION: </li></ul><ul><li>EXAMPLE: A QUALITY INSPECTOR IS INVESTIGATING THE INTERNAL PRESSURE STRENGTH OF A 1-LITRE GLASS SOFT DRINK BOTTLE. </li></ul><ul><li>PRESSURE STRENGTH IS NORMALLY DISTRIBUTED WITH A STD. DEVIATION OF 30 PSI. A RANDOM SAMPLE OF 25 BOTTLES HAD A MEAN PRESSURE OF 278 PSI. A 95% CONFIDENCE INTERVAL FOR µ IS </li></ul><ul><li>[278 − 1.96·30/5] ≤ µ ≤ [278 + 1.96·30/5], I.E. 266.24 ≤ µ ≤ 289.76. </li></ul><ul><li>WE CAN SAY WITH 95% CONFIDENCE THAT THE POPULATION MEAN WILL BE WITHIN THE GIVEN RANGE. </li></ul><ul><li>IN SITUATIONS WHERE THE SAMPLE SIZE CAN BE CONTROLLED, WE CAN CHOOSE n SO THAT WITH 100·(1−α) PERCENT CONFIDENCE THE ERROR IN ESTIMATING µ IS LESS THAN A SPECIFIED ERROR E. THE APPROPRIATE SAMPLE SIZE IS </li></ul><ul><li>n = [Zα/2·σ/E]² </li></ul>
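The bottle example can be reproduced in a few lines:

```python
import math

xbar, sigma, n, z = 278.0, 30.0, 25, 1.96  # sample mean, known sigma, n, z for 95%
half_width = z * sigma / math.sqrt(n)      # 1.96 * 30 / 5 = 11.76
lo, hi = xbar - half_width, xbar + half_width  # (266.24, 289.76)
```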
  88. 88. PROBABILITY FUNDAS <ul><li>CONFIDENCE INTERVAL ESTIMATION: </li></ul><ul><li>CONFIDENCE INTERVAL ON THE MEAN OF A NORMAL DISTRIBUTION, VARIANCE UNKNOWN </li></ul><ul><li>t = [X̄ − µ]/[S/√n] HAS A t DISTRIBUTION WITH n−1 DEGREES OF FREEDOM. </li></ul><ul><li>P(−tα/2,n−1 ≤ [X̄ − µ]/[S/√n] ≤ tα/2,n−1) = 1−α </li></ul>
  89. 89. PROBABILITY FUNDAS <ul><li>CONFIDENCE INTERVAL ESTIMATION: </li></ul><ul><li>CONFIDENCE INTERVAL ON THE VARIANCE OF A NORMAL DISTRIBUTION </li></ul><ul><li>X IS A RANDOM VARIABLE WITH UNKNOWN MEAN & VARIANCE </li></ul><ul><li>Χ² = [(n−1)·S²]/σ² HAS A Χ² DISTRIBUTION WITH n−1 DEGREES OF FREEDOM. </li></ul><ul><li>P(Χ²1−α/2,n−1 ≤ [(n−1)S²]/σ² ≤ Χ²α/2,n−1) = 1−α </li></ul><ul><li>THIS CAN BE REARRANGED AS </li></ul><ul><li>P({n−1}S²/Χ²α/2,n−1 ≤ σ² ≤ {n−1}S²/Χ²1−α/2,n−1) = 1−α </li></ul>[FIGURE: Χ² DISTRIBUTION WITH n−1 D.F., α/2 IN EACH TAIL, CRITICAL VALUES Χ²1−α/2,n−1 AND Χ²α/2,n−1]
  90. PROBABILITY FUNDAS <ul><li>TESTS OF HYPOTHESIS: </li></ul><ul><li>MANY PROBLEMS REQUIRE THAT WE DECIDE WHETHER A STATEMENT ABOUT SOME PARAMETER IS TRUE OR FALSE. </li></ul><ul><li>THE STATEMENT IS USUALLY CALLED A HYPOTHESIS, AND THE DECISION-MAKING PROCEDURE ABOUT THE TRUTH OR FALSITY OF THE HYPOTHESIS IS CALLED HYPOTHESIS TESTING. </li></ul><ul><li>A STATISTICAL HYPOTHESIS IS A STATEMENT ABOUT THE PROBABILITY DISTRIBUTION OF A RANDOM VARIABLE. </li></ul><ul><li>SUPPOSE WE ARE INTERESTED IN THE MEAN COMPRESSIVE STRENGTH OF A PARTICULAR TYPE OF CONCRETE. </li></ul><ul><li>WE ARE INTERESTED IN SAYING WHETHER THE MEAN IS 2500 psi. WE MAY EXPRESS THIS FORMALLY AS </li></ul><ul><li>H0: µ = 2500 psi H1: µ ≠ 2500 psi </li></ul>
  91. PROBABILITY FUNDAS <ul><li>TESTS OF HYPOTHESIS: </li></ul><ul><li>H0 IS CALLED THE NULL HYPOTHESIS. </li></ul><ul><li>H1 IS CALLED THE ALTERNATIVE HYPOTHESIS. </li></ul><ul><li>IN SOME SITUATIONS THE ALTERNATIVE HYPOTHESIS MAY BE ONE-SIDED, I.E. H1: µ > 2500. </li></ul><ul><li>IT IS IMPORTANT TO NOTE THAT HYPOTHESES ARE ALWAYS STATEMENTS ABOUT POPULATIONS AND NOT ABOUT SAMPLES. </li></ul><ul><li>SUPPOSE WE TEST THE SAMPLE AND WE SAY THAT WE WILL REJECT H0 IF X̄ > 2550 OR X̄ < 2450. THE SET OF ALL VALUES > 2550 OR < 2450 IS CALLED THE CRITICAL REGION OR THE REJECTION REGION. </li></ul><ul><li>THE INTERVAL 2450-2550 IS CALLED THE ACCEPTANCE REGION. </li></ul>
  92. PROBABILITY FUNDAS <ul><li>TYPE I AND TYPE II ERRORS: </li></ul><ul><li>ACCEPT H0: NO ERROR IF H0 IS TRUE; TYPE II ERROR IF H0 IS FALSE. </li></ul><ul><li>REJECT H0: TYPE I ERROR IF H0 IS TRUE; NO ERROR IF H0 IS FALSE. </li></ul><ul><li>α = P(TYPE I ERROR) = P(REJECT H0 | H0 IS TRUE) </li></ul><ul><li>β = P(TYPE II ERROR) = P(ACCEPT H0 | H0 IS FALSE) </li></ul><ul><li>SOMETIMES IT IS MORE USEFUL TO WORK WITH THE POWER OF THE TEST. POWER IS DEFINED AS (1-β) = P(REJECT H0 | H0 IS FALSE). </li></ul><ul><li>THE POWER OF THE TEST IS THE PROBABILITY THAT A FALSE NULL HYPOTHESIS IS CORRECTLY REJECTED. </li></ul><ul><li>THE PROBABILITY OF TYPE I ERROR IS CALLED THE SIGNIFICANCE LEVEL OR SIZE OF THE TEST. </li></ul>
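The definition of α can be checked by simulation: draw many samples with H0 true and count how often a two-sided z test rejects. This is a sketch; the sampling setup and function name are my own illustration, not from the slides:

```python
import math
import random

def simulated_alpha(mu0, sigma, n, z_crit=1.96, trials=2000, seed=42):
    """Monte Carlo estimate of P(reject H0 | H0 true) for the two-sided z test."""
    rng = random.Random(seed)
    rejections = 0
    for _ in range(trials):
        # Sample of size n drawn under H0 (true mean really is mu0)
        xbar = sum(rng.gauss(mu0, sigma) for _ in range(n)) / n
        z0 = (xbar - mu0) / (sigma / math.sqrt(n))
        if abs(z0) > z_crit:
            rejections += 1
    return rejections / trials
```

With z_crit = 1.96 the estimated rejection rate should hover near 0.05, the nominal significance level.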
  93. PROBABILITY FUNDAS <ul><li>TYPE I AND TYPE II ERRORS: </li></ul><ul><li>IN OUR EXAMPLE, A TYPE I ERROR WILL OCCUR IF THE SAMPLE MEAN IS > 2550 OR < 2450 WHEN, IN FACT, THE TRUE MEAN IS 2500. </li></ul><ul><li>THE TYPE I ERROR PROBABILITY IS CONTROLLED BY THE LOCATION OF THE CRITICAL REGION. </li></ul><ul><li>IT IS EASY FOR AN ANALYST TO SET THE DESIRED VALUE OF THE TYPE I ERROR. </li></ul><ul><li>WHAT IF THE ACTUAL MEAN IS DIFFERENT FROM 2500, I.E. THE NULL HYPOTHESIS IS FALSE? THE PROBABILITY OF TYPE II ERROR IS NOT A CONSTANT BUT DEPENDS UPON THE TRUE MEAN COMPRESSIVE STRENGTH OF THE CONCRETE, µ. </li></ul><ul><li>LET β(µ) DENOTE THE TYPE II ERROR PROBABILITY CORRESPONDING TO µ. </li></ul>
  94. PROBABILITY FUNDAS <ul><li>NOTE HERE THAT β(2700) < β(2600). IN OTHER WORDS, SMALLER DEVIATIONS ARE HARDER TO DETECT THAN LARGER ONES. </li></ul><ul><li>IN ADDITION, THE PROBABILITY OF TYPE II ERROR IS ALSO A FUNCTION OF THE SAMPLE SIZE. IT DECREASES AS THE SAMPLE SIZE INCREASES. </li></ul><ul><li>FINALLY, THE TYPE II ERROR ALSO DEPENDS UPON THE TYPE I ERROR α. </li></ul><ul><li>DECREASING α CAUSES THE TYPE II ERROR TO INCREASE. </li></ul><ul><li>BECAUSE THE TYPE II ERROR PROBABILITY IS A FUNCTION OF SAMPLE SIZE, TYPE I ERROR AND TRUE MEAN, WE GENERALLY SAY "WE FAIL TO REJECT H0" RATHER THAN SAYING WE ACCEPT H0. </li></ul><ul><li>MANY HYPOTHESIS TESTING PROBLEMS REQUIRE A ONE-SIDED TEST SUCH AS </li></ul><ul><li>H0: μ = μ0; H1: μ > μ0. THIS WOULD MEAN THAT THE CRITICAL REGION IS IN THE UPPER TAIL, THAT IS, WE WOULD REJECT H0 IF THE SAMPLE MEAN IS TOO LARGE. </li></ul><ul><li>NOTE THAT IF THE TRUE MEAN IS > μ0 THEN THE ONE-SIDED TEST IS BETTER. IF THE TRUE MEAN = μ0 THEN ONE- AND TWO-SIDED TESTS ARE EQUIVALENT. HOWEVER, IF THE TRUE MEAN IS < μ0, THEN THE TWO-SIDED TEST IS BETTER THAN THE ONE-SIDED TEST. </li></ul>
  95. PROBABILITY FUNDAS <ul><li>WHAT ABOUT THE FOLLOWING HYPOTHESIS: </li></ul><ul><li>H0: μ ≤ μ0; H1: μ > μ0. HERE WE ARE ASSUMING THAT μ CAN BE < μ0, BUT IT CANNOT BE GREATER. </li></ul><ul><li>IN SITUATIONS WHERE ONE-SIDED HYPOTHESES ARE APPROPRIATE, WE WILL USUALLY WRITE THE NULL HYPOTHESIS WITH AN EQUALITY SIGN, WITH THE UNDERSTANDING THAT IT ALSO COVERS THE CASES WHERE μ < μ0. </li></ul><ul><li>SUPPOSE A SOFT DRINK BOTTLER PURCHASES BOTTLES. THE BOTTLES WILL BE ACCEPTED ONLY IF THE PRESSURE STRENGTH THEY CAN STAND IS 200 PSI. </li></ul><ul><li>NOW THE HYPOTHESIS CAN BE FORMULATED IN TWO WAYS: </li></ul><ul><li>1. H0: μ ≤ 200, H1: μ > 200 2. H0: μ ≥ 200, H1: μ < 200 </li></ul><ul><li>IN 1 THERE IS A PROBABILITY THAT H0 (BOTTLES ARE NOT SATISFACTORY) WILL BE ACCEPTED EVEN THOUGH THE TRUE MEAN IS GREATER THAN 200. THIS FORMULATION IMPLIES THAT THE BOTTLER WANTS THE BOTTLES TO DEMONSTRATE THAT THEY MEET OR EXCEED THE REQUIREMENT. IN CASE 2 THERE IS A PROBABILITY THAT H0 WILL BE ACCEPTED EVEN THOUGH THE TRUE MEAN IS SLIGHTLY LESS THAN 200. </li></ul>
  96. PROBABILITY FUNDAS <ul><li>WHAT ABOUT THE FOLLOWING HYPOTHESIS: </li></ul><ul><li>IN FORMULATING ONE-SIDED HYPOTHESES, WE SHOULD ALWAYS REMEMBER THAT REJECTING H0 IS A STRONG CONCLUSION, AND CONSEQUENTLY WE SHOULD PUT THE STATEMENT ABOUT WHICH IT IS IMPORTANT TO MAKE A STRONG CONCLUSION IN THE ALTERNATIVE HYPOTHESIS. </li></ul><ul><li>TEST OF HYPOTHESIS ON THE MEAN, VARIANCE KNOWN: </li></ul><ul><li>H0: μ = μ0; H1: μ ≠ μ0. A SAMPLE IS TAKEN AND Z0 = [X̄-μ0]/[σ/√n]. </li></ul><ul><li>HERE WE WOULD REJECT H0 IF Z0 > Z α/2 OR IF Z0 < -Z α/2 (REJECTION REGION), AND WE WOULD FAIL TO REJECT H0 IF -Z α/2 ≤ Z0 ≤ Z α/2. </li></ul><ul><li>FOR EXAMPLE: H0: μ = 40 AND H1: μ ≠ 40, WITH n = 25, α = .05, σ = 2 AND SAMPLE MEAN 41.25. </li></ul><ul><li>Z .025 = 1.96 AND -Z .025 = -1.96. Z0 = (41.25-40)/(2/√25) = 3.125. </li></ul><ul><li>SINCE Z0 FALLS IN THE CRITICAL REGION, H0 IS REJECTED. </li></ul>
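The two-sided z test above can be sketched as a short function (the function name is my own):

```python
import math

def two_sided_z_test(xbar, mu0, sigma, n, z_half_alpha=1.96):
    """Return (Z0, reject) for H0: mu = mu0 vs H1: mu != mu0, sigma known.
    Reject when |Z0| > z_{alpha/2}."""
    z0 = (xbar - mu0) / (sigma / math.sqrt(n))
    return z0, abs(z0) > z_half_alpha
```

Running it on the slide's numbers (x̄ = 41.25, μ0 = 40, σ = 2, n = 25) gives Z0 = 3.125 and a rejection, matching the worked example.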
  97. PROBABILITY FUNDAS <ul><li>CHOICE OF SAMPLE SIZE TO CONTROL TYPE II ERROR </li></ul><ul><li>H0: μ = μ0 AND H1: μ ≠ μ0, AND LET US ASSUME THAT THE TRUE MEAN IS μ0 + δ. THEN Z0 ~ N(δ.√n/σ, 1). NOW A TYPE II ERROR WILL BE MADE, GIVEN H1 IS TRUE, ONLY IF Z0 LIES BETWEEN -Z α/2 AND Z α/2. </li></ul><ul><li>IN OTHER WORDS β = Φ(Z α/2 - δ.√n/σ) - Φ(-Z α/2 - δ.√n/σ). </li></ul><ul><li>TEST OF A HYPOTHESIS ON THE MEAN, VARIANCE UNKNOWN: </li></ul><ul><li>HERE THE VARIANCE IS NOT KNOWN. </li></ul><ul><li>TO TEST A HYPOTHESIS H0: μ = μ0 AND H1: μ ≠ μ0 </li></ul><ul><li>IN SUCH CASES WE USE THE t-STATISTIC </li></ul><ul><li>t0 = [X̄-μ0]/[S/√n], WHICH HAS A t DISTRIBUTION WITH n-1 DEGREES OF FREEDOM. </li></ul><ul><li>THE OTHER PRINCIPLES USED ARE THE SAME, I.E. FOR THE CRITICAL REGION ETC. IN OTHER WORDS, INSTEAD OF USING THE Z DISTRIBUTION WE USE t WITH n-1 DEGREES OF FREEDOM. </li></ul>
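The β formula above can be evaluated with only the standard library, since Φ can be written in terms of the error function (function names are my own):

```python
import math

def std_normal_cdf(z):
    """Phi(z), the standard normal CDF, via math.erf."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def type_two_error(delta, sigma, n, z_half_alpha=1.96):
    """beta = Phi(z_{a/2} - delta*sqrt(n)/sigma) - Phi(-z_{a/2} - delta*sqrt(n)/sigma)."""
    shift = delta * math.sqrt(n) / sigma
    return std_normal_cdf(z_half_alpha - shift) - std_normal_cdf(-z_half_alpha - shift)
```

Evaluating this for a fixed δ at increasing n shows β falling as the sample size grows, as the earlier slide states; at δ = 0 it returns 1-α.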
  98. PROBABILITY FUNDAS <ul><li>TEST OF HYPOTHESIS ON THE VARIANCE OF A NORMAL DISTRIBUTION: </li></ul><ul><li>HERE H0: σ² = σ0², H1: σ² ≠ σ0². </li></ul><ul><li>HERE WE USE THE CHI-SQUARE TEST STATISTIC WITH n-1 DEGREES OF FREEDOM, χ0² = (n-1).S²/σ0². </li></ul><ul><li>THEREFORE H0: σ² = σ0² WILL BE REJECTED IF χ0² > χ² α/2,n-1 OR IF </li></ul><ul><li>χ0² < χ² 1-α/2,n-1 </li></ul><ul><li>LET US CONSIDER AN EXAMPLE: H0: σ² = .02 AND H1: σ² > .02. A RANDOM SAMPLE OF 20 CANS YIELDS A SAMPLE VARIANCE OF .0225. THUS THE TEST STATISTIC IS χ0² = (19)(.0225)/.02 = 21.38. IF WE CHOOSE α = .05, WE FIND THAT χ² .05,19 = 30.14. WE CONCLUDE THAT THERE IS NO STRONG EVIDENCE THAT THE VARIANCE EXCEEDS .02. </li></ul>
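The test statistic is a one-liner; the critical value still comes from a chi-square table (the function name is my own):

```python
def chi_square_statistic(s2, sigma0_sq, n):
    """chi0^2 = (n-1) * S^2 / sigma0^2, to be compared with a table critical value."""
    return (n - 1) * s2 / sigma0_sq
```

On the slide's numbers (S² = .0225, σ0² = .02, n = 20) this gives 21.375, which the slide rounds to 21.38; it is below the critical value 30.14, so H0 is not rejected.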
  99. PROBABILITY FUNDAS <ul><li>SIMPLE LINEAR REGRESSION: </li></ul><ul><li>INDEPENDENT VARIABLE x AND DEPENDENT VARIABLE y. </li></ul><ul><li>y = β0 + β1.x + ε. HERE ε IS A RANDOM ERROR WITH MEAN 0 AND VARIANCE σ². FURTHER, THE ERROR TERMS ARE ASSUMED TO BE UNCORRELATED. </li></ul><ul><li>LEAST SQUARES MODEL: THE SUM OF SQUARES OF THE DEVIATIONS BETWEEN THE ACTUAL OBSERVATIONS AND THE ONES ESTIMATED BY THE REGRESSION LINE IS MINIMISED, </li></ul><ul><li>I.E. L = ∑εi² = ∑(yi - β0 - β1.xi)². </li></ul><ul><li>IF WE REWRITE y = β0' + β1.(x-x̄) + ε, WHERE β0' = β0 + β1.x̄, </li></ul><ul><li>AND DIFFERENTIATE THE EQUATION WRT β0' AND β1, </li></ul><ul><li>WE GET ∂L/∂β0' = -2∑[yi - β0' - β1(xi-x̄)] = 0 AND </li></ul><ul><li>∂L/∂β1 = -2∑[yi - β0' - β1(xi-x̄)].(xi-x̄) = 0 </li></ul><ul><li>SIMPLIFYING THE FIRST EQUATION WE GET n.β0' = ∑yi, OR β0' = ȳ (MEAN OF y); FINALLY WE ARE SAYING THAT β0 = ȳ - β1.x̄. </li></ul>
  100. PROBABILITY FUNDAS <ul><li>SIMPLE LINEAR REGRESSION: </li></ul><ul><li>THE SECOND DERIVATIVE LEADS TO THE EQUATION </li></ul><ul><li>β1.∑(xi-x̄)² = ∑yi.(xi-x̄), OR β1 = ∑yi.(xi-x̄) / ∑(xi-x̄)² </li></ul><ul><li>ASSUMPTIONS </li></ul><ul><li>THE RELATIONSHIP BETWEEN THE DEPENDENT VARIABLE AND THE INDEPENDENT VARIABLE IS LINEAR. </li></ul><ul><li>THE EXPECTED VALUE OF THE ERROR TERM IS ZERO. </li></ul><ul><li>THE VARIANCE OF THE ERROR TERM IS THE SAME FOR ALL OBSERVATIONS: E(ei²) = σe². </li></ul><ul><li>THE ERROR TERM IS UNCORRELATED ACROSS OBSERVATIONS: E(ei.ej) = 0. </li></ul><ul><li>THE ERROR TERM IS NORMALLY DISTRIBUTED. </li></ul>
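The closed-form estimators derived above (β1 = ∑yi(xi-x̄)/∑(xi-x̄)² and β0 = ȳ - β1x̄) can be coded directly on plain lists, with no libraries (the function name is my own):

```python
def least_squares_fit(xs, ys):
    """Return (beta0, beta1) from the least-squares normal equations:
    beta1 = sum(yi*(xi-xbar)) / sum((xi-xbar)^2), beta0 = ybar - beta1*xbar."""
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    sxx = sum((x - xbar) ** 2 for x in xs)           # sum of squares of x
    sxy = sum(y * (x - xbar) for x, y in zip(xs, ys))  # cross term
    beta1 = sxy / sxx
    beta0 = ybar - beta1 * xbar
    return beta0, beta1
```

On data lying exactly on a line, the fit recovers that line: for y = 1 + 2x it returns β0 = 1 and β1 = 2.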
  101. PROBABILITY FUNDAS <ul><li>SIMPLE LINEAR REGRESSION: </li></ul><ul><li>STANDARD ERROR OF ESTIMATE = √[ Σ(ei)²/(n-2) ] </li></ul><ul><li>TWO PARAMETERS HAVE BEEN ESTIMATED (ALPHA/BETA). </li></ul><ul><li>n-2 REPRESENTS THE DIFFERENCE BETWEEN THE NUMBER OF OBSERVATIONS AND THE NUMBER OF PARAMETERS ESTIMATED FROM THESE OBSERVATIONS. </li></ul><ul><li>THE COEFFICIENT OF DETERMINATION (R²) TELLS WHAT PERCENTAGE OF THE VARIATION IN THE DEPENDENT VARIABLE IS EXPLAINED BY THE INDEPENDENT VARIABLE. </li></ul><ul><li>IT IS ALSO EQUAL TO [TOTAL VARIATION - UNEXPLAINED VARIATION]/TOTAL VARIATION. </li></ul><ul><li>HYPOTHESIS TESTING: </li></ul><ul><li>t-STATISTIC = [ESTIMATED BETA - HYPOTHESISED VALUE OF BETA]/s beta </li></ul><ul><li>FOR EXAMPLE, IF THE ESTIMATED BETA = 1.5, THE HYPOTHESISED BETA = 1 AND s beta = 0.20, </li></ul><ul><li>THEN THE t-STATISTIC = 0.5/0.2 = 2.5. IF n = 62 AND n-2 = 60, THEN THE CRITICAL VALUE tc AT THE 95% CONFIDENCE LEVEL (5% SIGNIFICANCE LEVEL) IS 2.00. </li></ul><ul><li>SINCE THE OBSERVED STATISTIC IS 2.5 (OUTSIDE THE INTERVAL), WE REJECT THE NULL HYPOTHESIS THAT BETA = 1. THE CONFIDENCE INTERVAL = 1.10 TO 1.90. </li></ul>
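The t test on the slope and its confidence interval can be sketched together (function and parameter names are my own; the critical value t_crit comes from a t table):

```python
def slope_t_test(beta_hat, beta_null, se_beta, t_crit):
    """t = (estimated beta - hypothesised beta)/se(beta); also return the
    confidence interval beta_hat +/- t_crit*se(beta)."""
    t_stat = (beta_hat - beta_null) / se_beta
    ci = (beta_hat - t_crit * se_beta, beta_hat + t_crit * se_beta)
    return t_stat, abs(t_stat) > t_crit, ci
```

With the slide's figures (β̂ = 1.5, hypothesised β = 1, s beta = 0.2, tc = 2.00) this gives t = 2.5, a rejection, and the interval 1.10 to 1.90.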
  102. PROBABILITY FUNDAS <ul><li>SIMPLE LINEAR REGRESSION: </li></ul><ul><li>HYPOTHESIS TESTING: </li></ul><ul><li>WE CAN ALSO SAY THAT WE ARE MORE THAN 95% CONFIDENT THAT THE STOCK BETA IS DIFFERENT FROM 1. </li></ul><ul><li>IF WE USE A SIGNIFICANCE LEVEL OF 1%, THE CRITICAL t VALUE IS 2.66, AND WE WILL NOT REJECT THE NULL HYPOTHESIS. AT A HIGHER LEVEL OF CONFIDENCE (LOWER LEVEL OF SIGNIFICANCE), THE CONFIDENCE INTERVAL WIDENS AND NOW WE WILL NOT REJECT THE NULL HYPOTHESIS. THIS REDUCES THE TYPE I ERROR (REJECTING THE NULL WHEN IT IS TRUE), BUT IT INCREASES THE TYPE II ERROR, THAT IS, FAILING TO REJECT THE NULL WHEN IT IS FALSE. </li></ul><ul><li>OFTEN FINANCIAL ANALYSTS INDICATE THE p VALUE. THE p VALUE IS THE SMALLEST LEVEL OF SIGNIFICANCE AT WHICH THE NULL HYPOTHESIS CAN BE REJECTED. </li></ul><ul><li>IN FACT, IN MOST SOFTWARE THE p VALUE CORRESPONDS TO A TEST OF THE NULL HYPOTHESIS THAT THE TRUE VALUE IS EQUAL TO 0. FOR EXAMPLE, IF THE p VALUE IS .005, WE CAN REJECT THE HYPOTHESIS THAT THE TRUE PARAMETER IS EQUAL TO 0 AT THE 0.5% SIGNIFICANCE LEVEL (99.5% CONFIDENCE LEVEL). </li></ul><ul><li>STRONGER REGRESSION RESULTS LEAD TO SMALLER STANDARD ERRORS OF AN ESTIMATED PARAMETER AND RESULT IN TIGHTER CONFIDENCE INTERVALS, AND WE WILL REJECT THE NULL HYPOTHESIS EVEN AT HIGHER CONFIDENCE (LOWER LEVEL OF SIGNIFICANCE). </li></ul><ul><li>THE STANDARD ERROR OF BETA = √[ (SSE/(n-2)) / SSX ]: THE NUMERATOR IS THE AVERAGE SQUARED ERROR PER DEGREE OF FREEDOM AND THE DENOMINATOR IS THE SUM OF SQUARES OF THE INDEPENDENT VARIABLE. </li></ul><ul><li>EQUIVALENTLY, FOR THE NULL BETA = 0, THE t-STATISTIC = ρ.√(n-2) / √(1-ρ²). </li></ul>
  103. PROBABILITY FUNDAS <ul><li>SIMPLE LINEAR REGRESSION: </li></ul><ul><li>HYPOTHESIS TESTING: </li></ul><ul><li>THE F-TEST THAT THE SLOPE COEFFICIENT EQUALS ZERO IS BASED ON THE F-STATISTIC. </li></ul><ul><li>THE F-STATISTIC IS THE RATIO RSS/[SSE/(n-2)]. IT IS THE RATIO OF THE VARIATION OF Y THAT IS EXPLAINED BY THE REGRESSION TO THE VARIANCE OF THE ERROR. THE F STATISTIC HAS AN F DISTRIBUTION WITH 1 AND n-2 DEGREES OF FREEDOM. </li></ul><ul><li>IF THE REGRESSION MODEL DOES A GOOD JOB, THEN THE RATIO SHOULD BE HIGH: </li></ul><ul><li>THE EXPLAINED REGRESSION SUM OF SQUARES PER ESTIMATED PARAMETER WILL BE HIGH RELATIVE TO THE UNEXPLAINED VARIATION PER DEGREE OF FREEDOM. </li></ul><ul><li>WHEN THERE IS ONE INDEPENDENT VARIABLE, THE F STATISTIC IS THE SQUARE OF THE t-STATISTIC FOR THE SLOPE COEFFICIENT. </li></ul>
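The ratio is simple to compute once RSS and SSE are known (the function name is my own):

```python
def f_statistic(rss, sse, n):
    """F = RSS / (SSE/(n-2)): explained sum of squares over error variance,
    with (1, n-2) degrees of freedom for one independent variable."""
    return rss / (sse / (n - 2))
```

Using the regression output on the following slide (RSS = .1093, SSE = .0176, n = 60) gives roughly 360, close to the square of the slope's t-statistic, 18.9655² ≈ 359.7, as expected with one independent variable; the small gap is rounding in the reported figures.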
  104. PROBABILITY FUNDAS <ul><li>SIMPLE LINEAR REGRESSION: </li></ul><ul><li>HYPOTHESIS TESTING: </li></ul><ul><li>ONE OF THE TESTS CONDUCTED TO EVALUATE THE PERFORMANCE OF A MUTUAL FUND MANAGER IS ON THE EXCESS RETURN, WHICH IS ALPHA. </li></ul><ul><li>HERE THE NULL HYPOTHESIS IS THAT ALPHA = 0 AND THE ALTERNATIVE HYPOTHESIS IS THAT ALPHA IS NOT EQUAL TO ZERO. </li></ul><ul><li>LET US SEE THE RESULTS OF A REGRESSION: </li></ul><ul><li>MULTIPLE R = 0.9280; R SQUARE = 0.8611; STANDARD ERROR OF ESTIMATE = .0174; OBSERVATIONS = 60. </li></ul><ul><li>ANOVA: REGRESSION (DF 1, SUM OF SQUARES .1093, MEAN SS .1093, F 359.64); RESIDUAL (DF 58, SUM OF SQUARES .0176, MEAN SS .0003); TOTAL (DF 59, SUM OF SQUARES .1269). </li></ul><ul><li>COEFFICIENTS: ALPHA (COEFFICIENT 0.0009, STD ERROR 0.0023, t-STATISTIC 0.4036); BETA (COEFFICIENT 0.7902, STD ERROR 0.0417, t-STATISTIC 18.9655). </li></ul><ul><li>THE VALUE OF THE ALPHA COEFFICIENT IS ONLY ABOUT 1/3RD OF THE STANDARD ERROR FOR THAT COEFFICIENT, AND THE t-STATISTIC IS .4036. THEREFORE WE CANNOT REJECT THE NULL HYPOTHESIS THAT ALPHA = 0. </li></ul><ul><li>THE p VALUE FOR BETA'S t-STATISTIC IS 0.0001. THEREFORE THE PROBABILITY THAT THE TRUE VALUE OF THAT COEFFICIENT IS ACTUALLY ZERO IS MICROSCOPIC. SIMILARLY, THE p VALUE FOR THE F STATISTIC IS LESS THAN .0001. </li></ul>
