P(A_1 ∪ ... ∪ A_n) = Σ_i P(A_i) − Σ_{i<j} P(A_i A_j) + Σ_{i<j<k} P(A_i A_j A_k) − ... + (−1)^{n+1} P(A_1 A_2 ... A_n)
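A minimal brute-force check of inclusion-exclusion (a Python sketch; the sample space and the three events are arbitrarily chosen, not from the notes):

```python
# Uniform sample space of 12 points; A, B, C are arbitrary events.
S = set(range(12))
A, B, C = {0, 1, 2, 3}, {2, 3, 4, 5}, {3, 5, 7, 9}

def P(E):
    return len(E) / len(S)

lhs = P(A | B | C)
rhs = (P(A) + P(B) + P(C)
       - P(A & B) - P(A & C) - P(B & C)
       + P(A & B & C))
print(lhs, rhs)   # both 8/12 ≈ 0.667
```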
It is often easier to calculate P(intersections) than P(unions).

Matching Problem: You have n letters and n envelopes, and randomly stuff the letters into the envelopes. What is the probability that at least one letter will match its intended envelope?

Find P(A_1 ∪ ... ∪ A_n), where A_i = {ith position will match}.
P(A_i) = 1/n = (n−1)!/n! (permute everyone else if just A_i is in the right place.)
P(A_i A_j) = (n−2)!/n! (A_i and A_j are in the right place.)
In general, P(A_{i1} A_{i2} ... A_{ik}) = (n−k)!/n!.

By inclusion-exclusion:

P(A_1 ∪ ... ∪ A_n) = n × (1/n) − (n choose 2)(n−2)!/n! + (n choose 3)(n−3)!/n! − ... + (−1)^{n+1} (n choose n)(n−n)!/n!

general term: (n choose k)(n−k)!/n! = 1/k!

SUM = 1 − 1/2! + 1/3! − ... + (−1)^{n+1}/n!
Recall the Taylor series for e^x = 1 + x + x^2/2! + x^3/3! + ... For x = −1: e^{−1} = 1 − 1 + 1/2! − 1/3! + ... Therefore, SUM is 1 minus the limit of this Taylor series as n → ∞. When n is large, the probability converges to 1 − e^{−1} ≈ 0.63.
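As a quick numerical check (a Python sketch, not part of the notes), the partial sums of this series settle near 1 − e^{−1} within a handful of terms:

```python
import math
from fractions import Fraction

def match_probability(n):
    """P(at least one of n letters lands in its own envelope),
    via the inclusion-exclusion series 1 - 1/2! + 1/3! - ... +- 1/n!"""
    return float(sum(Fraction((-1) ** (k + 1), math.factorial(k))
                     for k in range(1, n + 1)))

print(match_probability(10))   # already within 1e-7 of 1 - 1/e ≈ 0.6321
```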
§2.1 - Conditional Probability

Given that B happened, what is the probability that A also happened? The sample space is narrowed down to the space where B has occurred: the sample space now only includes the outcomes where event B happened. (Picture a Venn diagram of A and B overlapping; we restrict attention to the region covered by B.)

Definition: the conditional probability of event A given event B is

P(A|B) = P(AB)/P(B)
It is sometimes easier to calculate an intersection given a conditional probability: P(AB) = P(A|B)P(B).

Example: Roll two dice and let T be the total. B = {T is odd}, A = {T < 8}. Find P(A|B) = P(T < 8 | T is odd).

P(B) = 18/36 = 1/2 (18 of the 36 equally likely outcomes give an odd total). AB = {T is odd and T < 8} = {T = 3, 5, 7}, covering 2 + 4 + 6 = 12 outcomes, so P(AB) = 12/36 = 1/3.

P(A|B) = P(AB)/P(B) = (1/3)/(1/2) = 2/3
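The same answer falls out of a direct enumeration (a Python sketch; it assumes T is the total of two fair dice):

```python
from itertools import product
from fractions import Fraction

outcomes = list(product(range(1, 7), repeat=2))   # 36 equally likely rolls
B  = [o for o in outcomes if sum(o) % 2 == 1]     # T odd
AB = [o for o in B if sum(o) < 8]                 # T odd and T < 8

p_B = Fraction(len(B), 36)
p_A_given_B = Fraction(len(AB), len(B))
print(p_B, p_A_given_B)   # 1/2 and 2/3
```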
Example: Roll 2 dice until a sum of 7 or 8 results (T = 7 or 8). Find P(A|B), where A = {T = 7} and B = {T = 7 or 8}. This is the same case as if you roll once and condition on getting 7 or 8:

P(A|B) = P(AB)/P(B) = (6/36)/(11/36) = 6/11
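Counting the outcomes confirms the 6/11 (a small Python check):

```python
from itertools import product

rolls = list(product(range(1, 7), repeat=2))
sevens = sum(1 for r in rolls if sum(r) == 7)   # 6 ways to roll a 7
eights = sum(1 for r in rolls if sum(r) == 8)   # 5 ways to roll an 8

p = sevens / (sevens + eights)
print(p)   # 6/11 ≈ 0.545
```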
Example: Treatments. In a clinical trial, patients are split among treatment groups and a placebo group. Let A = {patient relapses} and B = {patient received the placebo}. Considering only the placebo group, the conditional probability of relapse is

P(A|B) = (relapses among placebo patients)/(number of placebo patients) = 0.34
Multiplication rule:

P(A_1 A_2 ... A_n) = P(A_1) × P(A_2|A_1) × P(A_3|A_2 A_1) × ... × P(A_n|A_{n−1} ... A_2 A_1)

This telescopes:

P(A_1) × P(A_2 A_1)/P(A_1) × P(A_3 A_2 A_1)/P(A_2 A_1) × ... × P(A_n A_{n−1} ... A_1)/P(A_{n−1} ... A_1)
Example: Draw balls one at a time, without replacement, from an urn with r red and b blue balls. The probability of seeing the exact sequence (r, b, b, r) is

P(r, b, b, r) = r/(r+b) × b/(r+b−1) × (b−1)/(r+b−2) × (r−1)/(r+b−3)
Example: Casino game - Craps. What's the probability of actually winning? On the first roll: 7 or 11 → win; 2, 3, or 12 → lose; any other number (x_1) → you continue playing. If you then eventually roll 7 → lose; if you roll x_1 again first → you win!

P(win) = P(7 or 11 on first roll) + P(x_1 = 4)P(get 4 before 7) + P(x_1 = 5)P(get 5 before 7) + ... + P(x_1 = 10)P(get 10 before 7)
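Carrying the sum out with exact arithmetic (a Python sketch; the identity P(get x before 7) = p_x/(p_x + p_7) comes from conditioning each round on the game ending that round):

```python
from fractions import Fraction
from itertools import product

# Distribution of the sum of two fair dice.
p = {s: Fraction(0) for s in range(2, 13)}
for a, b in product(range(1, 7), repeat=2):
    p[a + b] += Fraction(1, 36)

win = p[7] + p[11]                    # immediate win on the first roll
for x in (4, 5, 6, 8, 9, 10):         # the "point" numbers
    win += p[x] * p[x] / (p[x] + p[7])

print(win, float(win))   # 244/495 ≈ 0.493 -- slightly less than fair
```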
§2.2 Independence of events

Definition: A and B are independent if P(A|B) = P(A). Since P(A|B) = P(AB)/P(B), setting this equal to P(A) gives P(AB) = P(A)P(B).
Experiments can be physically independent (roll 1 die, then roll another die), or seem physically related and still be independent.

Example: Roll one die. A = {odd} = {1, 3, 5}, B = {1, 2, 3, 4}. Related events, but independent: P(A) = 1/2, P(B) = 2/3, AB = {1, 3}, so P(AB) = 1/3 = (1/2) × (2/3) = P(A)P(B); therefore independent.
Disjoint ≠ Independent.

If A, B are independent, find P(AB^c). We know P(AB) = P(A)P(B), and AB^c = A \ AB, so:

P(AB^c) = P(A) − P(AB) = P(A) − P(A)P(B) = P(A)(1 − P(B)) = P(A)P(B^c)

Therefore, A and B^c are independent as well; similarly, A^c and B^c are independent. See Pset 3 for proof. Independence allows you to find P(intersection) through simple multiplication.
Example: Toss an unfair coin twice; the tosses are independent events. P(H) = p, 0 ≤ p ≤ 1. Find P("TH") = P(tails first, heads second):

P("TH") = P(T)P(H) = (1 − p)p

Since this is an unfair coin, the probability is not simply 1/4. If the coin were fair, P("TH") = |{TH}| / |{HH, HT, TH, TT}| = 1/4.
If you have several events A_1, A_2, ..., A_n that you need to prove independent, it is necessary to show that every subset is independent. For each subset A_{i1}, A_{i2}, ..., A_{ik}, 2 ≤ k ≤ n, prove:

P(A_{i1} A_{i2} ... A_{ik}) = P(A_{i1}) P(A_{i2}) ... P(A_{ik})

You could prove that any 2 events are independent, which is called "pairwise" independence, but this is not sufficient to prove that all events are independent.

Example of pairwise independence: Consider a tetrahedral die, equally weighted. Three of the faces are each colored red, blue, and green, but the last face is multicolored, containing red, blue, and green.

P(red) = 2/4 = 1/2 = P(blue) = P(green)
P(red and blue) = 1/4 = 1/2 × 1/2 = P(red)P(blue)

Therefore, the pair {red, blue} is independent. The same can be proven for {red, green} and {blue, green}. But what about all three together?

P(red, blue, and green) = 1/4 ≠ P(red)P(blue)P(green) = 1/8, so the colors are not fully independent.

Example: P(H) = p, P(T) = 1 − p for an unfair coin. Toss the coin 5 times:

P("HTHTT") = P(H)P(T)P(H)P(T)P(T) = p(1 − p)p(1 − p)(1 − p) = p^2 (1 − p)^3

Example: Find P(get 2 H and 3 T, in any order) = sum of probabilities over orderings = P(HHTTT) + P(HTHTT) + ... = p^2(1 − p)^3 + p^2(1 − p)^3 + ... = (5 choose 2) p^2 (1 − p)^3

General Example: Throw a coin n times.

P(k heads out of n throws) = (n choose k) p^k (1 − p)^{n−k}
Example: Toss a coin until the result is "heads"; suppose there are n tosses in total before H results. The sequence must be "TT...TH", with (n − 1) T's, so

P(tosses = n) = P(TT...TH) = (1 − p)^{n−1} p

Example: In a criminal case, witnesses give a specific description of the couple seen fleeing the scene. P(random couple meets description) = 8.3 × 10^{−8} = p. We know at the beginning that 1 couple exists. Perhaps a better question to ask is: given that a couple exists, what is the probability that another couple fits the same description? Let A = {at least 1 couple matches}, B = {at least 2 couples match}; find P(B|A). Since B ⊂ A,

P(B|A) = P(BA)/P(A) = P(B)/P(A)

Out of n couples, P(A) = P(at least 1 couple) = 1 − P(no couples) = 1 − ∏_{i=1}^{n} (1 − p). (*Each* couple fails to satisfy the description if no couples match; use the independence property and multiply.) So P(A) = 1 − (1 − p)^n.

P(B) = P(at least two) = 1 − P(0 couples) − P(exactly 1 couple) = 1 − (1 − p)^n − n p (1 − p)^{n−1}; keep in mind that P(exactly 1) is the binomial P(k out of n) with k = 1.

P(B|A) = [1 − (1 − p)^n − n p (1 − p)^{n−1}] / [1 − (1 − p)^n]
If n = 8 million, P(B|A) = 0.2966, which is within reasonable doubt! P(2 couples) < P(1 couple), but given that 1 couple exists, the probability that 2 exist is not insignificant.
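Plugging in p = 8.3 × 10^{−8} with n = 8 million couples (a Python check; floating point gives roughly 0.30, matching the notes' figure up to rounding):

```python
p, n = 8.3e-8, 8_000_000

P_A = 1 - (1 - p) ** n                      # at least one matching couple
P_exactly_1 = n * p * (1 - p) ** (n - 1)    # binomial, k = 1
P_B = P_A - P_exactly_1                     # at least two matching couples

print(P_B / P_A)   # ≈ 0.30
```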
In the large sample space, the probability that B occurs when we know that A occurred is significant!

§2.3 Bayes's Theorem

It is sometimes useful to separate a sample space S into a set of disjoint partitions:
B_i ∩ B_j = ∅ for i ≠ j, and S = ∪_{i=1}^{n} B_i (disjoint)

Total probability: P(A) = Σ_{i=1}^{n} P(A B_i) = Σ_{i=1}^{n} P(A|B_i)P(B_i)
(all A B_i are disjoint, and ∪_{i=1}^{n} A B_i = A)

Bayes' formula combines the partition with total probability.
Example: In Box 1, there are 60 short bolts and 40 long bolts. In Box 2, there are 10 short bolts and 20 long bolts. Take a box at random, and pick a bolt. What is the probability that you chose a short bolt? B_1 = choose Box 1, B_2 = choose Box 2.

P(short) = P(short|B_1)P(B_1) + P(short|B_2)P(B_2) = (60/100)(1/2) + (10/30)(1/2)

Example: Partitions B_1, B_2, ..., B_n, and you know the distribution; you also know P(A|B_i) for each B_i. If you know that A happened, what is the probability that it happened through B_i?

P(B_i|A) = P(B_i A)/P(A) = P(A|B_i)P(B_i) / Σ_{j=1}^{n} P(A|B_j)P(B_j)
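For the bolts example, exact fractions keep the arithmetic honest (a Python sketch; the posterior for Box 1 is an extra step in the same spirit):

```python
from fractions import Fraction

p_box = Fraction(1, 2)                      # either box equally likely
p_short_given_1 = Fraction(60, 100)         # Box 1: 60 short of 100
p_short_given_2 = Fraction(10, 30)          # Box 2: 10 short of 30

p_short = p_short_given_1 * p_box + p_short_given_2 * p_box
print(p_short)                              # 7/15

# Bayes: which box did a short bolt likely come from?
p_box1_given_short = p_short_given_1 * p_box / p_short
print(p_box1_given_short)                   # 9/14
```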
Example: Medical detection test, 90% accurate. Partition: you have the disease (B_1), or you don't have the disease (B_2). The accuracy means, in terms of probability: P(positive|B_1) = 0.9, P(positive|B_2) = 0.1. In the general public, the chance of having the disease is 1 in 10,000. In terms of probability: P(B_1) = 0.0001, P(B_2) = 0.9999. If the result comes up positive, what is the probability that you actually have the disease? Find P(B_1|positive):

P(B_1|positive) = (0.9)(0.0001) / [(0.9)(0.0001) + (0.1)(0.9999)] ≈ 0.0009

The probability of actually having the disease is still tiny, because the disease is so rare.
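The same computation in Python, to make the base-rate effect explicit:

```python
p_pos_given_sick    = 0.9     # test accuracy (B1: have the disease)
p_pos_given_healthy = 0.1     # false-positive rate (B2: don't have it)
p_sick = 0.0001               # 1 in 10,000 in the general public

posterior = p_pos_given_sick * p_sick / (
    p_pos_given_sick * p_sick + p_pos_given_healthy * (1 - p_sick))
print(posterior)   # ≈ 0.0009
```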
Example: Identify the source of a defective item. There are 3 machines: M_1, M_2, M_3, with P(defective) = 0.01, 0.02, 0.03, respectively. The percent of items made that come from each machine is 20%, 30%, and 50%, respectively.

Probability that an item comes from a machine: P(M_1) = 0.2, P(M_2) = 0.3, P(M_3) = 0.5
Probability that a machine's item is defective: P(D|M_1) = 0.01, P(D|M_2) = 0.02, P(D|M_3) = 0.03

Probability that a defective item came from Machine 1:

P(M_1|D) = (0.01)(0.2) / [(0.01)(0.2) + (0.02)(0.3) + (0.03)(0.5)] ≈ 0.087
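The same Bayes computation for all three machines at once (a Python sketch):

```python
prior = {"M1": 0.2, "M2": 0.3, "M3": 0.5}     # share of production
p_def = {"M1": 0.01, "M2": 0.02, "M3": 0.03}  # P(defective | machine)

p_D = sum(prior[m] * p_def[m] for m in prior)           # total probability
posterior = {m: prior[m] * p_def[m] / p_D for m in prior}
print(p_D, posterior)   # 0.023, then M1 ≈ 0.087, M2 ≈ 0.261, M3 ≈ 0.652
```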
Example: A gene has 2 alleles: A, a. The gene exhibits itself through a trait with two versions. The possible phenotypes are "dominant," with genotypes AA or Aa, and "recessive," with genotype aa. Alleles travel independently, each derived from a parent's genotype. In a population, the probability of having a particular allele: P(A) = 0.5, P(a) = 0.5. Therefore, the probabilities of the genotypes are: P(AA) = 0.25, P(Aa) = 0.5, P(aa) = 0.25.

Partitions: genotypes of the parents: (AA, AA), (AA, Aa), (AA, aa), (Aa, Aa), (Aa, aa), (aa, aa). Assume pairs match regardless of genotype.

Parent genotypes | Probability              | P(dominant phenotype in child)
(AA, AA)         | (1/4)(1/4) = 1/16        | 1
(AA, Aa)         | 2 × (1/4)(1/2) = 1/4     | 1
(AA, aa)         | 2 × (1/4)(1/4) = 1/8     | 1
(Aa, Aa)         | (1/2)(1/2) = 1/4         | 3/4
(Aa, aa)         | 2 × (1/2)(1/4) = 1/4     | 1/2
(aa, aa)         | (1/4)(1/4) = 1/16        | 0

If you see that a child has the dominant phenotype, you can infer the genotypes of the parents:

P((AA, AA)|dominant) = (1/16)(1) / [(1/16)(1) + (1/4)(1) + (1/8)(1) + (1/4)(3/4) + (1/4)(1/2) + (1/16)(0)] = (1/16)/(3/4) = 1/12

You can do the same computation to find the probabilities of each type of couple. Bayes's formula gives a prediction of the genotypes of the parents that you aren't able to directly see.
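The full posterior over parent pairs can be computed in a few lines (Python, mirroring the table of parent genotypes):

```python
from fractions import Fraction

# (prior probability of the pair, P(child shows dominant phenotype | pair))
pairs = {
    ("AA", "AA"): (Fraction(1, 16), Fraction(1)),
    ("AA", "Aa"): (Fraction(1, 4),  Fraction(1)),
    ("AA", "aa"): (Fraction(1, 8),  Fraction(1)),
    ("Aa", "Aa"): (Fraction(1, 4),  Fraction(3, 4)),
    ("Aa", "aa"): (Fraction(1, 4),  Fraction(1, 2)),
    ("aa", "aa"): (Fraction(1, 16), Fraction(0)),
}

p_dom = sum(prior * cond for prior, cond in pairs.values())   # 3/4
posterior = {pair: prior * cond / p_dom
             for pair, (prior, cond) in pairs.items()}
print(p_dom, posterior[("AA", "AA")])   # 3/4 and 1/12
```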
Example: You have 1 machine. In good condition, defective items are produced only 1% of the time; P(in good condition) = 90%. In broken condition, defective items are produced 40% of the time; P(broken) = 10%. Sample 6 items, and find that 2 are defective. Is the machine broken? This is very similar to the medical example worked earlier in lecture:

P(good | 2 out of 6 are defective) = (6 choose 2)(0.01)^2(0.99)^4(0.9) / [(6 choose 2)(0.01)^2(0.99)^4(0.9) + (6 choose 2)(0.4)^2(0.6)^4(0.1)] ≈ 0.04

So despite the 90% prior on good condition, the machine is most likely broken.
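Written out in Python (the binomial coefficients cancel in the ratio, but are kept for clarity):

```python
from math import comb

p_good, p_broken = 0.9, 0.1
# Likelihood of 2 defectives in a sample of 6, under each condition:
like_good   = comb(6, 2) * 0.01**2 * 0.99**4
like_broken = comb(6, 2) * 0.40**2 * 0.60**4

posterior_good = like_good * p_good / (
    like_good * p_good + like_broken * p_broken)
print(posterior_good)   # ≈ 0.04 -- the machine is probably broken
```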