Reply
Sat 20 Nov, 2004 05:53 pm
I'm having a little difficulty with the following problem:
Suppose that a bowl initially contains two red marbles and two green marbles. One marble is removed at random. Instead of replacing that marble, a marble of the other color is put into the bowl. The number of red marbles in the bowl forms a Markov chain. Determine the matrix of transition probabilities.
I found a solution, but it doesn't make sense to me. I know that the states of this Markov chain are 0, 1, 2, 3, and 4 (the possible numbers of red marbles among the four marbles in the bowl). Please advise.
Are those transitions equally likely? I would have thought not (3 -> 4 is less likely than 3 -> 2), and does such a "transition matrix" reflect this?
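A quick way to check is to build the matrix directly. Assuming the standard reading of the problem (4 marbles total, so with i red marbles a red is drawn with probability i/4 and a green with probability (4-i)/4), a minimal Python sketch:

```python
# Transition matrix for the bowl problem (an Ehrenfest-style chain).
# State i = number of red marbles in the bowl (0 through 4).
# Drawing a red marble (prob i/4) and replacing it with green moves i -> i-1;
# drawing a green marble (prob (4-i)/4) moves i -> i+1.
N = 4  # total marbles in the bowl

# P[i][j] = probability of going from i red marbles to j red marbles
P = [[0.0] * (N + 1) for _ in range(N + 1)]
for i in range(N + 1):
    if i > 0:
        P[i][i - 1] = i / N        # drew red, replaced by green
    if i < N:
        P[i][i + 1] = (N - i) / N  # drew green, replaced by red

for i, row in enumerate(P):
    print(i, row)
```

This bears out the reply above: the transitions are not equally likely. From state 3 the chain moves to 2 with probability 3/4 but to 4 with only probability 1/4, and those unequal entries are exactly what the rows of the matrix record (each row still sums to 1).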