
In order to characterize equivalent experiments in the Blackwell order, I would be interested in what we can say about the problem in the question.

Similarly, this was already asked: When are the inverses of stochastic matrices also stochastic matrices?

I did not find any reference, so I proved the following, hopefully correctly (points 1 and 2 do not hinge on $C,G$ being stochastic):

Lemma Fix $n,m\in\mathbb{N}$. Let $C\in\mathbb{R}^{n\times m},G\in\mathbb{R}^{m\times n}$ be stochastic matrices (i.e. columns summing to $1$, nonnegative entries) such that $CG=Id_n$. Then:

  1. $n\leq m$
  2. $C$ and $G$ have full rank equal to $n$
  3. $C$ has an entry equal to $1$ in each row
  4. In particular, if $m=n$, then $C$ and $G$ are $n\times n$ permutation matrices

Proof Observe that, from linear algebra, $$n=\mathsf{rank}(Id_n)=\mathsf{rank}(CG)\leq\min\{\mathsf{rank}(C),\mathsf{rank}(G)\}\leq \min\{m,n\},$$ where the first inequality is the standard bound on the rank of a product, and in the second we use the fact that the rank of a matrix is bounded by the minimum of its dimensions. If we had $m<n$, the chain would give $n\leq m<n$, a contradiction, so $n\leq m$. In turn we can write $$n=\mathsf{rank}(CG)\leq\min\{\mathsf{rank}(C),\mathsf{rank}(G)\}\leq\min\{m,n\}=n$$ and conclude that $\mathsf{rank}(C)=n=\mathsf{rank}(G)$. Then observe that, for any $j\in\{1,\dots,n\}$, $$1=(CG)_{jj}=\sum_{l=1}^m c_{jl}g_{lj}\leq\max_{l=1,\dots,m}\{c_{jl}\}\sum_{l=1}^m g_{lj}=\max_{l=1,\dots,m}\{c_{jl}\}\leq 1,$$ where the last equality uses the fact that $G$ is stochastic, so its $j$-th column sums to $1$, and the last inequality uses that $C$ is stochastic, so all its entries are in $[0,1]$. It follows that for each row $j$ of $C$ there is at least one entry equal to $1$.
For point 4, let $m=n$ and notice that, by stochasticity, if $c_{ij}=1$ then all other entries in the $j$-th column of $C$ are $0$. It follows that the $n$ columns of $C$ are elements of the canonical basis of $\mathbb{R}^n$. Since $C$ has rank $n$, they are linearly independent, hence pairwise distinct, so $C$ is a permutation matrix. In turn $G$, being the inverse of a permutation matrix, is itself a permutation matrix.
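The lemma can be sanity-checked in exact arithmetic. Below is a minimal Python sketch verifying points 1 and 3 on a $2\times 3$ example; the particular matrices `C` and `G` are my own illustrative choice, not taken from the question.

```python
from fractions import Fraction as F

def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def is_column_stochastic(M):
    """Nonnegative entries and every column summing to 1."""
    return (all(e >= 0 for row in M for e in row)
            and all(sum(M[i][j] for i in range(len(M))) == 1
                    for j in range(len(M[0]))))

# a 2x3 column-stochastic C and 3x2 column-stochastic G with CG = Id_2
C = [[F(1), F(0), F(1)],
     [F(0), F(1), F(0)]]
G = [[F(1, 2), F(0)],
     [F(0),    F(1)],
     [F(1, 2), F(0)]]

n, m = len(C), len(C[0])
I = [[F(int(i == j)) for j in range(n)] for i in range(n)]

assert is_column_stochastic(C) and is_column_stochastic(G)
assert matmul(C, G) == I              # CG = Id_n
assert n <= m                         # point 1 of the lemma
assert all(F(1) in row for row in C)  # point 3: a 1 in every row of C
```

Using `Fraction` avoids the floating-point round-off that would make the equality checks unreliable.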

I would like to know whether the proof is correct, and I would be interested in whether more can be said about the structure of $C,G$ in the non-square case (beyond point 3).

Jean Marie
    You talk about columns summing to $1$ while your link talks about rows; this does not really matter, though it affects which is applied first. If you regard $C$ and $G$ as transition matrices, then $CG=Id_n$ essentially says that after both steps you end up where you started. If they are not square, the first transition must go to at least as many states as it started from; the first transition can split possibilities if the second reunites them, while the second can have arbitrary transitions from states which cannot be reached with the first. – Henry Feb 22 '22 at 11:18

1 Answer


The proof looks correct. It seems possible to derive some further interesting properties of $C$ and $G$ in the non-square case:

From $$ 1=(CG)_{jj}=\sum_{l=1}^mc_{jl}\,g_{lj}\quad\quad\text{ and }\quad \sum_{l=1}^m\,g_{lj}=1 $$ we can conclude:

  • If $c_{jl}<1$ for some $l$ then $g_{lj}=0\,.$

Otherwise we would get $$ 1=\sum_{l=1}^mc_{jl}\,g_{lj}<\sum_{l=1}^mg_{lj}=1\,. $$ Further:

  • If $c_{jl}=1$ then $\sum_{i=1}^nc_{il}=1$ implies that for all rows $i\not=j$ the element $c_{il}$ must be zero, hence $c_{il}<1$. By the previous bullet, therefore, $$ g_{li}=0\quad\quad\forall i\not=j\,. $$

Since we know that for each $j$ there must be at least one $l$ such that $c_{jl}=1$ we conclude:

  • For every $j$ there must be at least one $l$ such that $g_{li}=0$ for all $i\not=j\,.$

Let's sort the columns of $C$ (and correspondingly the rows of $G$) such that $c_{ii}=1$ for all $i=1,\dots,n\,.$ We know that for each $i$ at least one such element must exist. Because every column of $C$ sums to $1$, the sorted matrix must be of the form $$ C=\left(\begin{matrix}I&C'\end{matrix}\right) $$ where $I$ is the $n\times n$ identity matrix and $C'$ an $n\times(m-n)$-matrix. By the bullets above, the (row sorted) matrix $G$ must be of the form $$\tag{1} G=\left(\begin{matrix}D\\G'\end{matrix}\right) $$ where $D={\rm diag}(g_{11},\dots,g_{nn})$ and $G'$ is an $(m-n)\times n$-matrix.
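This sorting step can be sketched in Python. The function below permutes the columns of $C$ (and the corresponding rows of $G$) into the block form above; the function name and the concrete matrices are my own illustrative choices, assuming exact `Fraction` entries.

```python
from fractions import Fraction as F

def sort_into_blocks(C, G):
    """Permute the columns of C (and the corresponding rows of G) so that
    c_{ii} = 1 for i = 1..n, yielding C = (I | C') and G = (D ; G')."""
    n, m = len(C), len(C[0])
    chosen = []
    for j in range(n):
        # a column with a 1 in row j exists by point 3 of the lemma;
        # it is exclusive to row j since its other entries must be 0
        chosen.append(next(l for l in range(m) if C[j][l] == 1))
    perm = chosen + [l for l in range(m) if l not in chosen]
    Cs = [[row[l] for l in perm] for row in C]
    Gs = [G[l] for l in perm]
    return Cs, Gs

# an unsorted 2x4 / 4x2 pair with CG = Id_2 (my own illustrative example)
C = [[F(1), F(1, 2), F(1), F(0)],
     [F(0), F(1, 2), F(0), F(1)]]
G = [[F(1, 2), F(0)],
     [F(0),    F(0)],
     [F(1, 2), F(0)],
     [F(0),    F(1)]]

Cs, Gs = sort_into_blocks(C, G)
n = len(C)
# top-left block of the sorted C is the identity ...
assert all(Cs[i][j] == (1 if i == j else 0) for i in range(n) for j in range(n))
# ... and the top block of the sorted G is diagonal
assert all(Gs[i][j] == 0 for i in range(n) for j in range(n) if i != j)
```

Note that two rows can never claim the same column, because a column containing a $1$ has zeros in every other row.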

Now:

  • If $g_{ii}<1$ for some $i=1,\dots,n$ then there must exist at least one $l$ with $g_{li}>0$ (because $\sum_{l=1}^m g_{li}=1$). Because of (1) this $l$ must be greater than $n$. Therefore, by the first bullet, $$ c_{il}=1\,, $$ and this is an element of $C'\,.$

So:

  • If $D={\rm diag}(g_{11},...,g_{nn})$ is not the identity matrix then we can sort $C'$ as we sorted $C$ and find a (possibly smaller) identity matrix at the top left corner of $C'$ and a (possibly smaller) diagonal matrix at the top left corner of $G'$.

  • Clearly, $C'G'$ must be a diagonal matrix that adds up with $D$ to the identity matrix: $$ I=CG=D+C'G'\,. $$

A typical example would be $$ C=\left(\begin{matrix}1&0&1&\frac{1}{2}\\0&1&0&\frac{1}{2}\end{matrix}\right)\,,\quad\quad G=\left(\begin{matrix}\frac{1}{2}&0\\0&1\\\frac{1}{2}&0\\0&0\end{matrix}\right) $$ where $$ C'=\left(\begin{matrix}1&\frac{1}{2}\\0&\frac{1}{2}\end{matrix}\right)\,,\quad D=\left(\begin{matrix}\frac{1}{2}&0\\0&1\end{matrix}\right)\,,\quad G'=\left(\begin{matrix}\frac{1}{2}&0\\0&0\end{matrix}\right)\,,\quad C'G'=\left(\begin{matrix}\frac{1}{2}&0\\0&0\end{matrix}\right)\,. $$ The question is whether there are examples where $G'$ is not a diagonal matrix. The examples $$ C=\left(\begin{matrix}1&0&0&1\\0&1&1&0\end{matrix}\right)\,,\quad\quad G=\left(\begin{matrix}\frac{1}{2}&0\\0&1\\0&0\\\frac{1}{2}&0\end{matrix}\right) $$ or $$ C=\left(\begin{matrix}1&0&0&1\\0&1&1&0\end{matrix}\right)\,,\quad\quad G=\left(\begin{matrix}\frac{1}{2}&0\\0&\frac{1}{2}\\0&\frac{1}{2}\\\frac{1}{2}&0\end{matrix}\right) $$ are too trivial because they can be sorted so that $C'=I$ and $G'$ is diagonal.
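The typical example above can be checked in exact arithmetic. A minimal Python sketch follows; the names `Cp`, `D`, `Gp` are my own labels for the blocks $C'$, $D$, $G'$.

```python
from fractions import Fraction as F

def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def madd(A, B):
    """Add two matrices entrywise."""
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

# the "typical example" from the answer
C = [[F(1), F(0), F(1), F(1, 2)],
     [F(0), F(1), F(0), F(1, 2)]]
G = [[F(1, 2), F(0)],
     [F(0),    F(1)],
     [F(1, 2), F(0)],
     [F(0),    F(0)]]

n = len(C)
I = [[F(int(i == j)) for j in range(n)] for i in range(n)]

Cp = [row[n:] for row in C]   # C' : the last m-n columns of C
D  = G[:n]                    # top n x n block of G
Gp = G[n:]                    # G' : the last m-n rows of G

assert matmul(C, G) == I             # CG = Id_2
assert madd(D, matmul(Cp, Gp)) == I  # D + C'G' = Id_2
```

The same two assertions can be reused to screen candidate pairs $(C,G)$ when hunting for an example with a non-diagonal $G'$.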

Kurt G.