
Let's start with an example.

Example: Consider the PDFs of a multivariate normal distribution and a multivariate Student distribution. How can one prove that they are linearly independent as functions from $\mathbb R^n$ to $\mathbb R$?

More generally, I'm interested in general proof strategies (or references!) on the following:

Question: How can one prove that a given set of PDFs is linearly independent, possibly under additional assumptions (e.g., unimodality and distinct modes)?

To provide some motivation: I'm trying to better understand strict linear independence of probability measures. Probability measures $P_1, \dotsc, P_K$ defined on a space $\mathcal X$ are strictly linearly independent if for every vector $\lambda \neq 0$ there exists a measurable set $A$ such that $$\lambda_1 P_1(A) + \cdots + \lambda_K P_K(A) \neq 0,$$ or, equivalently, $|\lambda_1 P_1 + \cdots + \lambda_K P_K|(\mathcal X) > 0$.

In the case of $\mathcal X = \mathbb R^n$ and continuous PDFs $p_1, \dotsc, p_K$, this condition is equivalent to the linear independence of the PDFs.

A proof for exponential distributions with distinct rates is here, and there exist proof strategies based on asymptotics for normal distributions (e.g., this one or that one). However, I think more general results should follow, perhaps based on the notion of strict linear independence of measures. Intuitively, if I have several different unimodal distributions and a given $\lambda \neq 0$, then trying small balls centered at the distinct modes as candidate sets $A$ may work; however, I have not been able to formalise this argument.
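
To make this heuristic concrete, here is a minimal numerical sketch (the two Gaussians, the weights $\lambda$, and the ball radius are illustrative choices only, not part of any proof): for a fixed $\lambda \neq 0$, the signed combination integrated over a small ball around each mode is dominated by the corresponding $\lambda_k$.

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

# Illustrative choice: two unimodal densities with distinct modes.
p1 = stats.norm(loc=0.0, scale=1.0).pdf
p2 = stats.norm(loc=3.0, scale=1.0).pdf
lam = (1.0, -1.0)  # an arbitrary nonzero lambda

def signed_measure(lo, hi):
    """Evaluate (lam_1 P_1 + lam_2 P_2)(A) for the interval A = [lo, hi]."""
    value, _ = quad(lambda x: lam[0] * p1(x) + lam[1] * p2(x), lo, hi)
    return value

r = 0.25  # radius of the small balls around the modes
for mode in (0.0, 3.0):
    print(f"A = ball of radius {r} around {mode}: "
          f"{signed_measure(mode - r, mode + r):+.4f}")
# Near each mode, the density with that mode dominates, so the sign of the
# combination on the ball follows the corresponding lambda_k -- this is the
# heuristic I have not managed to turn into a proof.
```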

  • Asymptotics seems more powerful when you have densities available. You can use asymptotics as $x \to a$ for any $a$, and you can use asymptotics on the characteristic functions and moment generating functions. – Mason May 14 '24 at 15:15
  • This is a good point! I'm however not sure how to pass from linear independence of PDFs to easily-verifiable conditions on the characteristic functions or moment generating functions. Could you perhaps recommend a reference on related topics? – Paweł Czyż May 15 '24 at 12:46
  • It's just a calculus problem. Linear dependence means some linear combination of the functions is 0. By taking limits and doing other operations on that linear combination, you can show each coefficient is 0. – Mason May 15 '24 at 15:24
  • Thank you for a quick response! However, I'm still a bit lost on passing from a linear combination of PDFs to characteristic functions or moment generating functions.

    For example, let $f(x) = \lambda_1 N(x\mid 0, 1) + \lambda_2 N(x\mid 0, 2)$ be a linear combination of normal distribution PDFs. This $f$ does not need to correspond to a random variable (e.g., there may be regions with $f(x) < 0$) and I don't know how to define a characteristic function in this case. Could you please recommend a reference or show an example in an answer, so I can understand it better?

    – Paweł Czyż May 16 '24 at 13:53
  • Here is a simple example: Suppose $f(x) = ae^x + be^{2x} = 0$ for all $x \in \mathbb{R}$, where $a, b \in \mathbb{R}$. Then $f(x) \sim be^{2x}$, where $h_1(x) \sim h_2(x)$ means $\frac{h_1(x)}{h_2(x)} \to 1$ as $x \to \infty$. Thus $\frac{f(x)}{e^{2x}} = b + o(1)$. Thus $b = 0$. Similarly, $a = 0$. – Mason May 16 '24 at 18:58
  • Now I see, thank you both! I've accepted the answer :) – Paweł Czyż May 16 '24 at 20:28

1 Answer


Here is an example where MGFs make the proof simple.

Suppose $f(x) = a\,N(\mu_1, \sigma_1)(x) + b\,N(\mu_2, \sigma_2)(x) = 0$ for all $x \in \mathbb{R}$, where $a, b \in \mathbb{R}$. Taking MGFs of both sides, that is, integrating both sides against $\exp(tx)\,dx$, yields:

$$\phi(t) = a\exp(\mu_1 t + \sigma_1^2 t^2 / 2) + b\exp(\mu_2 t + \sigma_2^2 t^2 / 2) = 0 \quad \text{for all } t \in \mathbb{R}.$$

Now assume $\sigma_2^2 > \sigma_1^2$. Then $\frac{\phi(t)}{\exp(\mu_2 t + \sigma_2^2 t^2 / 2)} \to b$ as $t \to \infty$, so $b = 0$, and then $a = 0$ too. (If instead $\sigma_1^2 = \sigma_2^2$ but $\mu_1 \neq \mu_2$, divide by the exponential with the larger mean and argue the same way.)
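
As a quick numerical sanity check of this limit, here is a sketch with arbitrarily chosen parameters satisfying $\sigma_2^2 > \sigma_1^2$ (the values below are illustrative, not from the question):

```python
import numpy as np

# Arbitrary illustrative parameters with sigma2^2 > sigma1^2.
a, mu1, sigma1 = 2.0, 0.0, 1.0
b, mu2, sigma2 = -3.0, 1.0, 1.5

def phi(t):
    """The linear combination of the two normal MGFs."""
    return (a * np.exp(mu1 * t + sigma1**2 * t**2 / 2)
            + b * np.exp(mu2 * t + sigma2**2 * t**2 / 2))

for t in (1.0, 5.0, 10.0):
    ratio = phi(t) / np.exp(mu2 * t + sigma2**2 * t**2 / 2)
    print(f"t = {t:4.1f}: phi(t) / exp(mu2 t + sigma2^2 t^2 / 2) = {ratio:.6f}")
# The ratio approaches b = -3.0: if phi were identically zero, taking
# t -> infinity in this ratio would force b = 0, and then a = 0.
```

The same dominant-term comparison underlies the $ae^x + be^{2x}$ example in the comments above.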
