In order to characterize equivalent experiments in the Blackwell order, I am interested in what can be said about the problem in the question.
Similarly, this was already asked: When are the inverses of stochastic matrices also stochastic matrices?
I did not find any reference, so I proved the following, hopefully correctly (points 1 and 2 do not hinge on $C,G$ being stochastic).
**Lemma.** Fix $n,m\in\mathbb{N}$. Let $C\in\mathbb{R}^{n\times m},G\in\mathbb{R}^{m\times n}$ be stochastic matrices (i.e. columns summing to $1$, nonnegative entries) such that $CG=Id_n$. Then:
- $n\leq m$
- $C$ and $G$ have full rank equal to $n$
- $C$ has an entry equal to $1$ in each row
- In particular, if $m=n$, then $C$ and $G$ are $n\times n$ permutation matrices
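As a sanity check (not part of the proof), the lemma's conclusions can be verified on a small non-square instance; the matrices below, with $n=2$ and $m=3$, are chosen purely for illustration:

```python
import numpy as np

# Column-stochastic matrices: nonnegative entries, each column sums to 1.
# n = 2, m = 3; C is n x m, G is m x n, and C G = Id_n.
C = np.array([[1.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
G = np.array([[1.0, 0.0],
              [0.0, 0.0],
              [0.0, 1.0]])

assert np.allclose(C.sum(axis=0), 1) and (C >= 0).all()  # C stochastic
assert np.allclose(G.sum(axis=0), 1) and (G >= 0).all()  # G stochastic
assert np.allclose(C @ G, np.eye(2))                     # C G = Id_2

# The lemma's conclusions hold:
assert np.linalg.matrix_rank(C) == 2 == np.linalg.matrix_rank(G)  # full rank n
assert all((C[j] == 1).any() for j in range(2))          # a 1 in every row of C
```

Note that in this example $G$ is not uniquely determined by $C$: the middle row of $G$ could hold any distribution split between the first two columns of $C$ only if those columns were equal, which hints at the flexibility available in the non-square case.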
**Proof.**
Observe that, from linear algebra,
$$n=\mathsf{rank}(Id_n)=\mathsf{rank}(CG)\leq\min\{\mathsf{rank}(C),\mathsf{rank}(G)\}\leq\min\{m,n\},$$
where the first inequality is the standard bound on the rank of a product, and in the second we use the fact that the rank of a matrix is bounded by the minimum of its dimensions.\
If $m<n$ held, the chain would read $n\leq m<n$, a contradiction; hence $n\leq m$. In turn, $\min\{m,n\}=n$, and we can write:
$$n=\mathsf{rank}(CG)\leq\min\{\mathsf{rank}(C),\mathsf{rank}(G)\}\leq\min\{m,n\}=n$$
and we can conclude that $\mathsf{rank}(C)=n=\mathsf{rank}(G)$. Then observe that, for any $j\in\{1,\dots,n\}$,
$$1=(CG)_{jj}=\sum_{l=1}^m c_{jl}g_{lj}\leq\Big(\max_{l=1,\dots,m} c_{jl}\Big)\sum_{l=1}^m g_{lj}=\max_{l=1,\dots,m} c_{jl}\leq 1$$
where the first inequality uses $g_{lj}\geq 0$, the last equality uses the fact that $G$ is stochastic, so its $j$-th column sums to $1$, and the last inequality that $C$ is stochastic, so all its entries lie in $[0,1]$. Since the chain collapses to equalities, $\max_{l} c_{jl}=1$: for each row $j$ of $C$ there is at least one entry equal to $1$.
For the last point, let $m=n$ and notice that, by stochasticity, if $c_{ij}=1$ then all other entries in the $j$-th column of $C$ are $0$; in particular, each column of $C$ contains at most one entry equal to $1$. The $n$ ones found above (one per row) therefore occupy $n$ distinct columns, so every column of $C$ contains a $1$ and is an element of the canonical basis of $\mathbb{R}^n$. Since $C$ has rank $n$, these columns are linearly independent, hence pairwise distinct, so $C$ is a permutation matrix. In turn $G=C^{-1}$ is the inverse of a permutation matrix, hence a permutation matrix.
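Conversely, every permutation matrix satisfies the hypotheses of the lemma: it is stochastic and its inverse is its transpose, which is again a permutation matrix. A small exhaustive check over $S_4$, assuming NumPy, confirms this:

```python
import numpy as np
from itertools import permutations

# Every n x n permutation matrix P is stochastic, and its inverse is P^T,
# which is also a permutation matrix (hence also stochastic).
n = 4
for perm in permutations(range(n)):
    P = np.eye(n)[list(perm)]                # permutation matrix for perm
    assert np.allclose(P @ P.T, np.eye(n))   # P^{-1} = P^T
    assert np.allclose(P.sum(axis=0), 1) and (P >= 0).all()      # P stochastic
    assert np.allclose(P.T.sum(axis=0), 1) and (P.T >= 0).all()  # P^T stochastic
```

So in the square case the lemma is an exact characterization: the stochastic matrices with stochastic inverses are precisely the permutation matrices.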
I would like to know if the proof is correct, and I would be interested in whether more can be said about the structure of $C,G$ in the non-square case (point 3).