
I have a question about a joint distribution calculated in a paper I am reading.

There are three random variables $a$, $b$ and $c$ such that $$ a,b,c \in \{+1,-1\} $$ and the joint distribution is given by: $$ p(a,b,c) = \frac{1}{8}(1 + aE_{A} + bE_{B} + cE_{C} + abE_{AB} + acE_{AC} + bcE_{BC} + abcE_{ABC})$$ where $E_{A}$, $E_{B}$ and $E_{C}$ are the single-party marginals, $E_{AB}$, $E_{BC}$ and $E_{AC}$ are the two-party marginals, and $E_{ABC}$ is the three-party correlator.

I feel like this is something I should have seen in an introductory probability course, but I can't seem to prove it. Moreover, if I convince myself that it is just a sum over all the possible cases, then the constant $1$ seems to be already accounted for in the other expected values, so I don't see the point of adding it separately. Also, would we get the same expression if the possible values were $\{+1,0\}$ (or any other set of size 2)?

I am used to the notation $E$ for an expected value, but here it seems to be unrelated to that and to refer to marginals instead. Am I correct?

I would be pleased to see a complete proof or a link to study this fact.
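As a sanity check (not a proof), I verified the identity numerically: draw an arbitrary distribution on $\{+1,-1\}^3$, compute the $E$'s as the expectations $E[A], E[AB]$, etc. (my guess for what the paper means by "marginals"), and compare both sides. A quick sketch in Python, where all variable names are my own:

```python
import itertools

import numpy as np

rng = np.random.default_rng(0)

# All eight outcomes (a, b, c) with a, b, c in {+1, -1}.
outcomes = list(itertools.product([+1, -1], repeat=3))

# An arbitrary (random) joint distribution p on these outcomes.
p = rng.random(8)
p /= p.sum()

def E(idx):
    """Expectation under p of the product of the variables at positions idx."""
    return sum(pi * np.prod([o[i] for i in idx]) for pi, o in zip(p, outcomes))

# Check p(a,b,c) = (1/8)(1 + a E_A + ... + abc E_ABC) for every outcome.
for pi, (a, b, c) in zip(p, outcomes):
    rhs = (1
           + a * E([0]) + b * E([1]) + c * E([2])
           + a * b * E([0, 1]) + a * c * E([0, 2]) + b * c * E([1, 2])
           + a * b * c * E([0, 1, 2])) / 8
    assert abs(pi - rhs) < 1e-12
```

The check passes for every random distribution I tried, so the identity seems to hold with that reading of the $E$'s.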

The paper also adds:

Note that the positivity of $p(a,b,c)$ implies constraints on the marginals; in particular $p(+++) + p(---) \geq 0$ implies $$ E_{AB} + E_{AC} + E_{BC} \geq -1.$$

which I don't understand.
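Edit: writing both probabilities out with the formula above, I would guess the intended computation is the following (every term with an odd number of parties cancels), though I would appreciate confirmation:

$$
p(+++) + p(---) = \frac{1}{8}\Big[\big(1 + E_A + E_B + E_C + E_{AB} + E_{AC} + E_{BC} + E_{ABC}\big) + \big(1 - E_A - E_B - E_C + E_{AB} + E_{AC} + E_{BC} - E_{ABC}\big)\Big] = \frac{1}{4}\big(1 + E_{AB} + E_{AC} + E_{BC}\big) \geq 0,
$$

which rearranges to $E_{AB} + E_{AC} + E_{BC} \geq -1$.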

Pegi
  • Can you add a link to the paper? – Leander Tilsted Kristensen Nov 19 '21 at 12:07
  • I think I have heard the term 'Rademacher chaos', but I am not sure how deeply it is related to your question. Anyway, the expansion is more or less a consequence of the fact that the monomials $\prod_{x\in A}x$ for $A\subseteq \{a,b,c\}$ span the space of functions $\{-1,1\}^3\to\mathbb{R}$. – Sangchul Lee Nov 19 '21 at 12:10
  • https://arxiv.org/abs/1906.06495 – Pegi Nov 19 '21 at 12:37
  • It is on the second page. – Pegi Nov 19 '21 at 12:38
  • The equations $(3)$-$(10)$ in the paper indicate that $E_A,E_B,E_C,E_{AB},E_{AC},E_{BC}$ and $E_{ABC}$ do indeed represent the expected values $E[A],E[B],E[C],E[AB],E[AC],E[BC]$ and $E[ABC]$. – Leander Tilsted Kristensen Nov 19 '21 at 14:27
  • But those are the constraints specified after this expression, i.e. they are not considered when deriving it. – Pegi Nov 19 '21 at 20:05
  • @SangchulLee so what about the $1$? I mean, even if we consider that expansion, we still can't justify the $1$. – Pegi Nov 20 '21 at 12:02
  • @Pegi, I am not sure about your question about 'justification', but note that any joint distribution $p$ must satisfy the constraint $$\sum_{a,b,c\in\{\pm1\}}p(a,b,c)=1.$$ Then, under this sum, all the "non-constant" terms vanish. So that is why we need the term $1$ there. – Sangchul Lee Nov 20 '21 at 12:11
  • @SangchulLee Thank you; is this only because the variables can be $+1$ or $-1$? Would this still hold if they were $+1$ or $0$? – Pegi Nov 20 '21 at 12:34
  • Since the space of all $\mathbb{R}$-valued functions on $\{0,1\}^3$ is $8$-dimensional and the functions $$1, \quad a, \quad b, \quad c, \quad ab, \quad bc, \quad ca, \quad abc$$ form a basis of this vector space, we can still uniquely write any joint distribution $p$ on $\{0,1\}^3$ in the form $$ p(a,b,c)= f_1 + f_a a + f_b b + f_c c + f_{ab}ab + f_{bc}bc + f_{ca}ca + f_{abc}abc$$ for some constants $f_1, f_a, \ldots, f_{abc}$. However, now the probability-theoretic meaning of these coefficients is less clear. – Sangchul Lee Nov 20 '21 at 12:44
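The basis claim on $\{0,1\}^3$ made in the last comment can also be checked numerically, by verifying that the $8\times 8$ matrix evaluating the eight monomials at the eight points is invertible. A minimal sketch in Python (the setup and names are my own, not from the paper):

```python
import itertools

import numpy as np

# The eight points of {0,1}^3 and the index sets of the monomials
# 1, a, b, c, ab, ac, bc, abc (positions 0, 1, 2 stand for a, b, c).
points = list(itertools.product([0, 1], repeat=3))
subsets = [(), (0,), (1,), (2,), (0, 1), (0, 2), (1, 2), (0, 1, 2)]

# M[x][S] = value of the monomial prod_{i in S} x_i at the point x;
# np.prod of an empty list is 1, covering the constant monomial.
M = np.array([[np.prod([x[i] for i in S]) for S in subsets]
              for x in points], dtype=float)

# An invertible matrix means the monomials form a basis, so any function
# on {0,1}^3 -- in particular any distribution -- has a unique expansion.
assert abs(np.linalg.det(M)) > 1e-9
```

The same check with points drawn from $\{+1,-1\}^3$ also succeeds, which matches the expansion in the question.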

0 Answers