I recently asked a question about pairwise versus mutual independence (also related to this and this question).
However,
(1) I inadvertently used incorrect terminology. Three events A, B, C are mutually independent when

P[A,B] = P[A]P[B],  P[B,C] = P[B]P[C],  P[A,C] = P[A]P[C],  and  P[A,B,C] = P[A]P[B]P[C].
Did and others pointed out that
"Mutual independence means the four identities you copied, pairwise independence means the first three of these identities." -- Did
Note that the term "mutual" has varying definitions across mathematics; for example, mutual information is a pairwise relation.
(2) Going back to probability, G.-C. Rota said the theory can be approached in two ways: by focusing on random variables (the event algebra) or by focusing on distributions. Here I am interested in distributions, where independence can be interpreted as factorization of the probability distribution function. The conditions are the same as above, with P interpreted as the PDF.
The following graphic is based on a standard example from Counterexamples in Probability and Statistics: a 3-dimensional binomial PDF that factorizes pairwise (i.e., each of the 3 pairs of random variables is independent, and the 2-dimensional joint distributions can all be written as products of the respective marginals) but is not 3-way independent (the joint distribution cannot be written as the product of the individual marginal distributions).
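For concreteness, here is a minimal numerical sketch of this kind of check on a 3-dimensional PMF. I am assuming a Bernstein-type construction with binary variables (X and Y are independent fair coin indicators and Z indicates whether they agree); the book's exact example may differ:

```python
import numpy as np

# Joint PMF of (X, Y, Z) on {0,1}^3 for an assumed Bernstein-type example:
# p(x, y, z) = 1/4 when z == (x == y), and 0 otherwise.
p = np.zeros((2, 2, 2))
for x in (0, 1):
    for y in (0, 1):
        p[x, y, int(x == y)] = 0.25

# 1-dimensional marginals.
pX = p.sum(axis=(1, 2))
pY = p.sum(axis=(0, 2))
pZ = p.sum(axis=(0, 1))

# Every 2-dimensional marginal factorizes into its 1-dimensional marginals ...
print(np.allclose(p.sum(axis=2), np.outer(pX, pY)))        # True
print(np.allclose(p.sum(axis=1), np.outer(pX, pZ)))        # True
print(np.allclose(p.sum(axis=0), np.outer(pY, pZ)))        # True
# ... but the full joint is NOT the product of the three 1-dimensional marginals.
print(np.allclose(p, np.einsum("i,j,k->ijk", pX, pY, pZ))) # False
```

The same marginalization checks could be applied to any candidate 3-dimensional PMF.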

My question is whether the opposite can happen, i.e., if the 3-dimensional (or higher-dimensional) joint distribution factorizes into the product of the 1-dimensional marginals, does that imply that all 2-dimensional joint distributions factorize into the products of their marginals?