
Three white and three black balls are distributed in two urns in such a way that each contains three balls. We say that the system is in state i, i = 0, 1, 2, 3, if the first urn contains i white balls. At each step, we draw one ball from each urn and place the ball drawn from the first urn into the second, and conversely with the ball from the second urn.

Let $X_n$ denote the state of the system after the $n$th step. How can we prove that $(X_n,\ n=0,1,2,\dots)$ is a Markov chain, and how do we calculate its transition probability matrix?

Solution: At the initial stage both urns contain three balls each, and we draw one ball from each urn and place it into the urn from which it was not drawn. So after the $n$th step the state of the system will be 3, and it will remain there forever. So this is not a Markov chain. I also want to understand the meaning of the bold line.

If I am wrong, please explain why and how I am wrong, and what the transition matrix of this Markov chain is. Would anyone answer this question?

Win_odd Dhamnekar
  • "conversely with the ball from the second urn." Read: "place the ball from the second urn into the first urn." – David Feb 13 '17 at 14:29
  • Also, if a ball is randomly drawn from each urn, why do you think that the system will have an absorbing state at 3? – David Feb 13 '17 at 14:30
  • If the system is in state 3, in the next step it will, with probability 1, transition to state 2. – Zoran Loncarevic Feb 13 '17 at 14:33
  • @ZoranLoncarevic, your answer matches the answer provided to me, which is $P_{32}=1$. But what about $P_{10},P_{11},P_{12},P_{21},P_{22},P_{23},P_{01}$? – Win_odd Dhamnekar Feb 13 '17 at 14:40
  • If we are in state 0, then the second urn contains three white balls, so after the exchange we will surely have 1 white ball in urn 1; hence $P_{01}=1$. – Horan Feb 13 '17 at 14:58

2 Answers


Yes, this is a Markov chain

The state of the system is defined by the number of white balls in the first box.

There are four states:

[Figure: the four states, showing the contents of both boxes]

The figure above depicts both boxes; the left one is the box in which we count the white balls.

Based on the description of the experiment, this is a discrete-time Markov chain: if we are in a given state, it does not matter how we got there, and the distribution of the next state depends only on the current state.

Now, here are the state transition probabilities:

$$ p_{ij}: \begin{bmatrix} &\mathbf j&\mathbf0&\mathbf1&\mathbf2&\mathbf3\\ \mathbf i&\\ \mathbf0&&0&1&0&0\\ \mathbf1&&\frac19&\frac49&\frac49&0\\ \mathbf2&&0&\frac49&\frac49&\frac19\\ \mathbf3&&0&0&1&0 \end{bmatrix}.$$

$\mathbf i$ stands for the state the system is currently in, and $\mathbf j$ stands for the state the system jumps to.
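As a sanity check (a sketch of mine, not part of the original answer), one can simulate a single swap in Python and estimate each row of this matrix empirically; the helper names `step` and `estimate_row` are hypothetical:

```python
import random

# State = number of white balls in urn 1; urn 1 holds `state` white and
# 3 - state black balls, urn 2 holds the complementary 3 - state white balls.
def step(state):
    white1 = random.random() < state / 3        # ball drawn from urn 1 is white?
    white2 = random.random() < (3 - state) / 3  # ball drawn from urn 2 is white?
    # Urn 1 loses its drawn ball and gains the ball drawn from urn 2.
    return state - white1 + white2

def estimate_row(i, trials=100_000):
    counts = [0, 0, 0, 0]
    for _ in range(trials):
        counts[step(i)] += 1
    return [c / trials for c in counts]

for i in range(4):
    print(i, estimate_row(i))
```

The estimated rows should approach $(0,1,0,0)$, $(\frac19,\frac49,\frac49,0)$, $(0,\frac49,\frac49,\frac19)$ and $(0,0,1,0)$.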

For example

$$p_{22}=\frac49$$

because we need to randomly select either of the two white balls in the left box ($\frac23$) and the white ball in the right box ($\frac13$), or either of the two black balls in the right box ($\frac23$) and the black ball in the left box ($\frac13$); the two draws are independent. The following figure shows the four equally likely pairs of choices resulting in $2\to 2$.
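This count can also be confirmed by brute force. A minimal sketch, assuming Python: enumerate the nine equally likely (ball from urn 1, ball from urn 2) pairs in state 2 and count those that leave the white count in urn 1 unchanged.

```python
from fractions import Fraction
from itertools import product

urn1 = ["W", "W", "B"]  # state 2: two white balls in the left box
urn2 = ["W", "B", "B"]  # the remaining balls in the right box

# The swap leaves the state unchanged exactly when the two drawn balls
# have the same colour.
stay = sum(1 for a, b in product(urn1, urn2) if a == b)
p22 = Fraction(stay, len(urn1) * len(urn2))
print(p22)  # 4/9
```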

[Figure: the four equally likely pairs of choices resulting in $2\to 2$]

Note that the system does not remain in state $\mathbf3$; rather, it jumps to state $\mathbf 2$ with probability one.


Let $[P_0\ P_1\ P_2\ P_3]$ denote the stationary probabilities. These probabilities are the solutions of the following system of linear equations, together with the normalization condition $P_0+P_1+P_2+P_3=1$:

$$[P_0 \ P_1 \ P_2\ \ P_3] \begin{bmatrix} 0&1&0&0\\ \frac19&\frac49&\frac49&0\\ 0&\frac49&\frac49&\frac19\\ 0&0&1&0 \end{bmatrix}=[P_0\ P_1 \ P_2\ P_3]. $$

It is easy to check that

$$[P_0 \ P_1 \ P_2\ P_3]=\left[\frac1{20} \ \frac{9}{20}\ \frac{9}{20}\ \frac1{20}\right].$$
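One way to check this (a sketch of mine, not from the original answer) is exact arithmetic with Python's `fractions`: verify that the claimed vector satisfies $\pi P=\pi$ and sums to one.

```python
from fractions import Fraction as F

P = [[F(0),    F(1),    F(0),    F(0)],
     [F(1, 9), F(4, 9), F(4, 9), F(0)],
     [F(0),    F(4, 9), F(4, 9), F(1, 9)],
     [F(0),    F(0),    F(1),    F(0)]]
pi = [F(1, 20), F(9, 20), F(9, 20), F(1, 20)]

# Left-multiply the row vector: (pi P)_j = sum_i pi_i * P[i][j].
pi_P = [sum(pi[i] * P[i][j] for i in range(4)) for j in range(4)]
print(pi_P == pi, sum(pi) == 1)  # True True
```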

zoli

Define $F_n$ to be the indicator random variable that takes value 1 if at the $n^{th}$ step a white ball is chosen from the first urn, and 0 otherwise.

Similarly, define the indicator random variable $S_n$ for the second urn.

Now, to check the Markov property we need to verify

$$P(X_n=j\mid X_{n-1}=i_{n-1},X_{n-2}=i_{n-2},\dots,X_0=i_0)=P(X_n=j\mid X_{n-1}=i_{n-1}).$$

First observe that the conditional range of $X_n$ given $X_{n-1},X_{n-2},\dots,X_0$ is

$$\{X_{n-1}-1,\ X_{n-1},\ X_{n-1}+1\}.$$

Hence it is enough to check the three cases below.

If $j=i_{n-1}+1$, rewrite this probability as

$$P(F_n=0,\ S_n=1\mid X_{n-1}=i_{n-1},\dots,X_0=i_0).$$

If $j=i_{n-1}-1$, rewrite this probability as

$$P(F_n=1,\ S_n=0\mid X_{n-1}=i_{n-1},\dots,X_0=i_0).$$

If $j=i_{n-1}$, rewrite this probability as

$$P(F_n=1,\ S_n=1\mid X_{n-1}=i_{n-1},\dots,X_0=i_0)+P(F_n=0,\ S_n=0\mid X_{n-1}=i_{n-1},\dots,X_0=i_0).$$

Now observe that the joint distribution of $F_n$ and $S_n$ depends only on $X_{n-1}$ and not on $X_{n-2},X_{n-3},\dots,X_0$, since the draws are made from urn contents that are determined by $X_{n-1}$ alone.

So from here you can conclude

$$P(X_n=j\mid X_{n-1}=i_{n-1},X_{n-2}=i_{n-2},\dots,X_0=i_0)=P(X_n=j\mid X_{n-1}=i_{n-1}).$$

$\textbf{NOTE:}$ This is a long-hand mathematical approach to arguing the Markov property of $X_n$. If one wants to keep life simple, one can argue in one line that the $n^{th}$ draw depends only on $X_{n-1}$, with appropriate supporting arguments.
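The conclusion can also be checked empirically. A minimal simulation sketch (assuming Python; the helper names are mine, not the answer's): condition on two different histories that both end in state 2 and compare the empirical distribution of the next state; by the Markov property they should agree, and both should match row 2 of the transition matrix.

```python
import random
from collections import Counter

def step(state):
    # One swap: F_n ~ Bernoulli(state/3), S_n ~ Bernoulli((3-state)/3).
    f = random.random() < state / 3        # white ball drawn from urn 1?
    s = random.random() < (3 - state) / 3  # white ball drawn from urn 2?
    return state - f + s

def next_dist_given(history, trials=200_000):
    # Empirical distribution of X_n among runs whose path equals `history`.
    counts, total = Counter(), 0
    for _ in range(trials):
        path = [history[0]]
        for _ in range(len(history) - 1):
            path.append(step(path[-1]))
        if path == list(history):
            counts[step(path[-1])] += 1
            total += 1
    return {j: c / total for j, c in sorted(counts.items())}

print(next_dist_given((0, 1, 2)))  # both should be close to
print(next_dist_given((3, 2)))     # {1: 4/9, 2: 4/9, 3: 1/9}
```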

Horan
  • I understood the mathematical approach. Thank you very much! It was very smart to define these two random variables, $F_n$ and $S_n$. So, what would be the appropriate arguments to say that $X_n$ is a Markov chain in one line? – Pedro Salgado Nov 07 '21 at 21:55