
CS 112 Homework # 6, Due Tuesday 11/12/2002, 12:00pm

(You must show your work to receive credit)

Problems

[1] Consider an input/output database buffer that has room for N records. In any one unit of time, a new record may be inserted into the buffer, provided that the latter is not full, with probability p. In addition, in any unit interval, the buffer may be emptied completely, with probability r. If both an insertion and a clearing operation occur in the same unit interval, the former is carried out before the latter.
a) Set up the problem as a discrete-time Markov chain, mentioning any assumptions you made.
b) Draw the state-transition probability diagram and find the one-step transition probability matrix of the chain.

Solution
a) Let Xn be the number of records in the buffer at time n. We assume that the occurrences of insertions and clearings are independent of each other and of their past histories. Then {Xn, n = 0, 1, 2, ...} is a Markov chain with state space {0, 1, 2, ..., N}.
b) State-transition probability diagram:
[State-transition probability diagram omitted. Each state i with 1 <= i <= N-1 moves to i+1 with probability p(1-r), stays at i with probability (1-p)(1-r), and returns to 0 with probability r; state 0 stays at 0 with probability 1 - p(1-r) and moves to 1 with probability p(1-r); state N returns to 0 with probability r and stays at N with probability 1 - r.]
One-step transition probability matrix (rows and columns indexed by the states 0, 1, ..., N):

          0           1           2           ...  N-1          N
     0 [ 1-p(1-r)    p(1-r)      0           ...  0            0      ]
     1 [ r           (1-p)(1-r)  p(1-r)      ...  0            0      ]
 P = 2 [ r           0           (1-p)(1-r)  ...  0            0      ]
     : [ :                                        :            :      ]
   N-1 [ r           0           0           ...  (1-p)(1-r)   p(1-r) ]
     N [ r           0           0           ...  0            1-r    ]

Note we have N + 1 states, so the matrix is (N + 1) x (N + 1).
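A minimal sketch (assuming Python/NumPy and arbitrary example values N = 4, p = 0.3, r = 0.1, none of which come from the problem) builds this matrix programmatically and checks that every row sums to 1:

```python
# Hypothetical helper, for illustration only: build the (N+1) x (N+1) transition
# matrix of the buffer chain for given N, p, r.
import numpy as np

def buffer_chain(N, p, r):
    P = np.zeros((N + 1, N + 1))
    P[0, 0] = 1 - p * (1 - r)          # no insertion, or insertion followed by clearing
    P[0, 1] = p * (1 - r)              # insertion and no clearing
    for i in range(1, N):
        P[i, 0] = r                    # buffer cleared
        P[i, i] = (1 - p) * (1 - r)    # no insertion, no clearing
        P[i, i + 1] = p * (1 - r)      # insertion, no clearing
    P[N, 0] = r                        # full buffer can only be cleared...
    P[N, N] = 1 - r                    # ...or stay full
    return P

P = buffer_chain(N=4, p=0.3, r=0.1)
print(np.allclose(P.sum(axis=1), 1))   # True: each row is a probability distribution
```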

[2] In a message sent in code in the English alphabet, the probability that a vowel will be followed by another vowel is found to be .15, and the probability that a consonant will be followed by another consonant is .40.
a) Find the state-transition probability matrix for the states vowel and consonant.
b) Estimate the percentage of vowels in the messages in this code.

Solution
We consider two states: V for vowel and C for consonant.
a)
              V      C
    P =  V [ 0.15   0.85 ]
         C [ 0.60   0.40 ]

b) By computing the equilibrium distribution of vowels and consonants, that is, solving π = πP and Σ_i π_i = 1, we get that the required percentage of vowels is π_V = 0.60/1.45 ≈ 0.414, i.e. about 41% of the letters are vowels.
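As a quick numerical check of part b) (a sketch assuming Python/NumPy, not part of the assigned solution), the equilibrium distribution can be found as the normalized left eigenvector of P for eigenvalue 1:

```python
# Illustrative check: stationary distribution of the vowel/consonant chain.
import numpy as np

P = np.array([[0.15, 0.85],   # V -> V, V -> C
              [0.60, 0.40]])  # C -> V, C -> C

eigvals, eigvecs = np.linalg.eig(P.T)          # left eigenvectors of P
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1))])
pi = pi / pi.sum()                             # normalize so the entries sum to 1
print(pi)                                      # approx [0.414, 0.586]
```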

[3] The president of a chain of fast-food restaurants classifies the profitability of each restaurant as excellent (E), good (G), fair (F), or poor (P). The president's estimates of the probabilities that a restaurant will change classification from one month to the next are given below. If in a particular month, 10% of the restaurants in the chain are classified as excellent, 30% as good, 40% as fair, and 20% as poor, what is the estimate for the next month?

                               Present Month
                      Excellent   Good   Fair   Poor
   Next     Excellent    .80       .20     0      0
   Month    Good         .10       .70    .10     0
            Fair         .10       .10    .80    .10
            Poor          0         0     .10    .90

Solution
Notice the above table is the transpose of the state-transition probability matrix:

               E     G     F     P
         E [ .80   .10   .10    0  ]
   P =   G [ .20   .70   .10    0  ]
         F [  0    .10   .80   .10 ]
         P [  0     0    .10   .90 ]

and we are given the initial state probabilities for a particular month:

   π(0) = [ .1  .3  .4  .2 ]

So, to estimate the next month's classification:

   π(1) = π(0) P = [ .14  .26  .38  .22 ]
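The same update can be checked numerically (a sketch assuming Python/NumPy, not required by the problem):

```python
# Illustrative check: one-step distribution update pi(1) = pi(0) P.
import numpy as np

P = np.array([[.80, .10, .10, 0.0],    # E
              [.20, .70, .10, 0.0],    # G
              [0.0, .10, .80, .10],    # F
              [0.0, 0.0, .10, .90]])   # P
pi0 = np.array([.1, .3, .4, .2])       # current month: [E, G, F, P]

print(pi0 @ P)                         # [0.14 0.26 0.38 0.22]
```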

[4] Consider a Markov chain with state space {0, 1} and state-transition probability matrix

               0     1
    P =  0 [   1     0  ]
         1 [  1/2   1/2 ]

Classify states 0 and 1 as recurrent or transient.

Solution


f_0 = Σ_{n=1}^∞ f_0^(n) = 1 + 0 + 0 + ... = 1, that is, state 0 is recurrent.

f_1 = Σ_{n=1}^∞ f_1^(n) = 1/2 + 0 + 0 + ... = 1/2 < 1, that is, state 1 is transient (once the chain leaves state 1 it is absorbed in state 0 and never returns).
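A small simulation sketch (assuming Python/NumPy; the trial counts are arbitrary) gives a numerical sanity check of f_1 = 1/2:

```python
# Illustrative Monte Carlo estimate of f_1, the probability that the chain
# started in state 1 ever returns to state 1.
import numpy as np

rng = np.random.default_rng(0)
P = np.array([[1.0, 0.0],
              [0.5, 0.5]])

def returns_to(start, P, max_steps=50):
    """Simulate one trajectory from `start` and report whether it revisits `start`."""
    state = start
    for _ in range(max_steps):
        state = rng.choice(len(P), p=P[state])
        if state == start:
            return True
    return False

trials = 20_000
print(sum(returns_to(1, P) for _ in range(trials)) / trials)  # close to 0.5 < 1
```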

[5] Draw the state-transition probability diagrams and classify the states of the Markov chains with the following transition probability matrices:
              0    1    2
   a) P = 0 [ 0   .5   .5 ]
          1 [ .5   0   .5 ]
          2 [ .5  .5    0 ]

              0    1    2    3    4
   b) P = 0 [ .3  .4    0   .3    0 ]
          1 [ 0    1    0    0    0 ]
          2 [ 0    0    0   .6   .4 ]
          3 [ 0    0    0    0    1 ]
          4 [ 0    0    1    0    0 ]
Solution
a) [State-transition diagram omitted: states 0, 1, 2, with each pair of distinct states joined by edges of probability .5 in both directions and no self-loops.]

The Markov chain is irreducible and aperiodic. E.g., we can get back to state 0 in two steps, 0 → 1 → 0, but also in three steps, 0 → 1 → 2 → 0; hence state 0 is aperiodic. Since the MC is irreducible with finite state space, all states are recurrent non-null.
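For part a), irreducibility and aperiodicity can also be checked numerically (a sketch assuming Python/NumPy):

```python
# Illustrative check: some power of P is strictly positive (so the chain is
# irreducible), and the period of state 0 is gcd{ n : P^n[0,0] > 0 } = 1.
import numpy as np
from math import gcd
from functools import reduce

P = np.array([[0.0, 0.5, 0.5],
              [0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0]])

print((np.linalg.matrix_power(P, 3) > 0).all())   # True: every state reaches every state

return_times = [n for n in range(1, 10)
                if np.linalg.matrix_power(P, n)[0, 0] > 0]
print(reduce(gcd, return_times))                  # 1: state 0 is aperiodic
```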
b) [State-transition diagram omitted: state 0 has a self-loop of probability .3 and edges to states 1 and 3 with probabilities .4 and .3; state 1 has a self-loop of probability 1; state 2 goes to state 3 with probability .6 and to state 4 with probability .4; state 3 goes to state 4 with probability 1; state 4 goes to state 2 with probability 1.]

The Markov chain is not irreducible, since states 0 and 4 do not communicate, and state 1 is absorbing. The set {2, 3, 4} of states is a communicating class. State 0 is transient, while state 1 and the states in {2, 3, 4} are recurrent.

[6] Suppose that coin 1 has probability 0.7 of coming up heads, and coin 2 has probability 0.6 of coming up heads. If the coin flipped today comes up heads, then we select coin 1 to flip tomorrow, and if it comes up tails, then we select coin 2 to flip tomorrow. If the coin initially flipped is equally likely to be coin 1 or coin 2, then what is the probability that the coin flipped on the third day after the initial flip is coin 1?

Solution
Let the state be the number of the coin flipped on that day: 1 for coin 1, 2 for coin 2. Then the transition probability matrix is

              1    2
    P =  1 [ .7   .3 ]
         2 [ .6   .4 ]

and so

    P^2 = [ .67  .33 ]        P^3 = [ .667  .333 ]
          [ .66  .34 ]              [ .666  .334 ]

Hence, the probability that coin 1 is flipped on the third day is

    (1/2) P^3_11 + (1/2) P^3_21 = (1/2)(.667) + (1/2)(.666) = 0.6665
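The matrix powers above can be reproduced with a short sketch (assuming Python/NumPy):

```python
# Illustrative check: P^3 for the coin-selection chain and the probability that
# coin 1 is flipped on the third day, starting from the uniform initial distribution.
import numpy as np

P = np.array([[0.7, 0.3],   # today coin 1 -> tomorrow coin 1 / coin 2
              [0.6, 0.4]])  # today coin 2 -> tomorrow coin 1 / coin 2
pi0 = np.array([0.5, 0.5])  # initial coin equally likely to be 1 or 2

P3 = np.linalg.matrix_power(P, 3)
print(P3)                   # [[0.667 0.333] [0.666 0.334]]
print((pi0 @ P3)[0])        # 0.6665, the required probability
```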

[7] Three out of every four trucks on the road are followed by a car, while only one out of every five cars is followed by a truck. What fraction of vehicles on the road are trucks?

Solution
Let the state be C if the vehicle is a car and T if it is a truck. We obtain a two-state Markov chain with the following state-transition probability matrix:

              C     T
    P =  C [ .80   .20 ]
         T [ .75   .25 ]

The long-run proportions of cars and trucks are obtained by solving π = πP and Σ_i π_i = 1. We get π = [ 15/19  4/19 ], i.e. 4 out of every 19 vehicles are trucks.
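Because the second eigenvalue of this P is small, the rows of P^n converge quickly to the long-run proportions, which gives another quick check (a sketch assuming Python/NumPy):

```python
# Illustrative check: a large power of P has (nearly) identical rows equal to
# the long-run proportions of cars and trucks.
import numpy as np

P = np.array([[0.80, 0.20],   # C -> C, C -> T
              [0.75, 0.25]])  # T -> C, T -> T

print(np.linalg.matrix_power(P, 50)[0])  # approx [0.7895 0.2105] = [15/19 4/19]
```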

[8] A certain town never has two sunny days in a row. Each day is classified as being either sunny, cloudy (but dry), or rainy. If it is sunny one day, then it is equally likely to be either cloudy or rainy the next day. If it is rainy or cloudy one day, then there is one chance in two that it will be the same the next day, and if it changes then it is equally likely to be either of the other two possibilities. In the long run, what proportion of days are sunny? What proportion are cloudy?

Solution
Let the state on day n be S if sunny, C if cloudy, and R if rainy. This gives a three-state Markov chain with the following transition probability matrix:

              S     C     R
         S [  0    1/2   1/2 ]
    P =  C [ 1/4   1/2   1/4 ]
         R [ 1/4   1/4   1/2 ]

Solving the equations for the long-run proportions,

    π = πP,   π_S + π_C + π_R = 1,

we get π_S = 1/5, π_C = 2/5, π_R = 2/5.
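The long-run proportions can also be obtained by solving the balance equations as a linear system (a sketch assuming Python/NumPy, replacing one redundant equation with the normalization constraint):

```python
# Illustrative check: solve (P^T - I) pi = 0 together with sum(pi) = 1.
import numpy as np

P = np.array([[0.0,  0.5,  0.5 ],   # S
              [0.25, 0.5,  0.25],   # C
              [0.25, 0.25, 0.5 ]])  # R

A = P.T - np.eye(3)        # balance equations (P^T - I) pi = 0
A[-1, :] = 1.0             # replace the last (redundant) equation by sum(pi) = 1
b = np.array([0.0, 0.0, 1.0])

print(np.linalg.solve(A, b))  # [0.2 0.4 0.4], i.e. 1/5 sunny, 2/5 cloudy, 2/5 rainy
```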
