Assume we have $N$ random variables $x_1, x_2, \dots, x_N$. Is there a way to express the joint entropy $H(x_1, x_2, \dots, x_N)$ in terms of single-variable or pairwise measures, such as the entropy $H(x_i)$ of each variable or the pairwise mutual information $I(x_i; x_j)$? In other words, are there functions $g_{ij}(\cdot,\cdot)$, each depending only on the pair $(x_i, x_j)$ (that is, only on the joint distribution of that pair), such that
$$ H(x_1, x_2, \dots, x_N) = \sum_{i\in \{1, \dots, N\}}\sum_{j\in \{1, \dots, N\}} g_{ij}(x_i, x_j). $$
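To make the quantities in the question concrete, here is a minimal numeric sketch, assuming a toy joint distribution (two fair coins plus $x_3 = x_1 \oplus x_2$) chosen purely for illustration; the helper names `entropy`, `marginal`, and `mutual_information` are ad hoc, not from any particular library. It simply computes the joint entropy $H(x_1, x_2, x_3)$, the marginal entropies $H(x_i)$, and the pairwise mutual informations $I(x_i; x_j)$ that a decomposition of the above form would have to be built from.

```python
import itertools
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability array (zero entries ignored)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def marginal(joint, idx):
    """Marginal distribution of the variables at positions `idx`."""
    out = {}
    for state, prob in joint.items():
        key = tuple(state[i] for i in idx)
        out[key] = out.get(key, 0.0) + prob
    return out

def mutual_information(joint, i, j):
    """Pairwise mutual information I(x_i; x_j) in bits."""
    Hi = entropy(list(marginal(joint, [i]).values()))
    Hj = entropy(list(marginal(joint, [j]).values()))
    Hij = entropy(list(marginal(joint, [i, j]).values()))
    return Hi + Hj - Hij

# Toy joint distribution (an assumption for illustration): x1 and x2 are
# independent fair coins and x3 = x1 XOR x2.
states = [(a, b, a ^ b) for a, b in itertools.product([0, 1], repeat=2)]
joint = {s: 0.25 for s in states}

H_joint = entropy(list(joint.values()))
H_single = [entropy(list(marginal(joint, [i]).values())) for i in range(3)]

print("H(x1, x2, x3) =", H_joint)   # 2.0 bits
print("H(x_i)        =", H_single)  # [1.0, 1.0, 1.0]
for i, j in itertools.combinations(range(3), 2):
    print(f"I(x{i+1}; x{j+1}) =", mutual_information(joint, i, j))  # 0.0 for every pair
```

For this particular distribution every single-variable and pairwise quantity is the same as it would be for three independent fair coins, even though the joint entropy differs (2 bits rather than 3), which is what makes the question above nontrivial.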