
Let $x_1,\ldots ,x_n$ be $n$ distinct points in $\mathbb{R}^3$. Consider the $n\times n$ real symmetric matrix $A$ defined by $A_{ij}:=|x_i-x_j|$. I would like to show that

$$\ker A\;\cap\;\{v\in\mathbb{R}^n\,:\, v_1+v_2 +\cdots +v_n=0\}=\{0\}$$

Thank you for any suggestions.

Capublanca

2 Answers


$\def\RR{\mathbb{R}}$Numerical experiments suggest that this matrix is negative semidefinite on the plane $\sum v_i=0$. Specifically, I generated 20 sets of 10 points each, drawn uniformly from the unit cube, and this was true every time. I repeated the experiment with points in 2, 4 and 5 dimensions, and the same held.
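Such an experiment is easy to reproduce. Here is a minimal NumPy sketch (the point counts, trial counts, and seed are arbitrary illustrative choices): it restricts $A$ to an orthonormal basis of the plane $\sum v_i = 0$ and records the largest eigenvalue of the restricted matrix, which should come out negative every time.

```python
import numpy as np

rng = np.random.default_rng(0)

def distance_matrix(x):
    """Pairwise Euclidean distances A_ij = |x_i - x_j|."""
    return np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)

def max_eig_restricted(x):
    """Largest eigenvalue of A restricted to the plane v_1 + ... + v_n = 0."""
    n = len(x)
    A = distance_matrix(x)
    # QR of [ones | random] gives an orthonormal basis whose first column
    # spans the all-ones direction; the remaining columns span the plane.
    Q, _ = np.linalg.qr(np.column_stack([np.ones(n), rng.normal(size=(n, n - 1))]))
    B = Q[:, 1:]
    return np.linalg.eigvalsh(B.T @ A @ B).max()

# 20 trials of 10 points drawn uniformly from the unit cube, in dimensions 2-5.
worst = {d: max(max_eig_restricted(rng.random((10, d))) for _ in range(20))
         for d in (2, 3, 4, 5)}
print(worst)  # every recorded eigenvalue is negative
```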

I am reminded of this answer by Noam Elkies but can't make a precise connection.


Switching this answer to CW to write up darij's proof from the comments. We will show that:

  • If $x_i$ are any $n$ points in $\RR^d$ and $v_i$ are any scalars with $\sum v_i=0$, then $\sum v_i v_j |x_i-x_j| \leq 0$ and

  • If the $x_i$ are distinct and the $v_i$ are not all $0$, we have strict inequality.

The latter shows that the matrix $\left[ |x_i-x_j| \right]$ times any such vector $\left[ v_i \right]$ is nonzero. We prove the former first, and then turn to the question of when equality occurs.

We start with the averaging trick: By rotational and scaling invariance, there is some positive constant $c$ such that $$\int_{|w|=1} \left| w \cdot x \right| \, dw = c |x|.$$ So $$\sum v_i v_j |x_i-x_j| = c^{-1} \int_{|w|=1} \sum v_i v_j \left| w \cdot (x_i-x_j) \right| \, dw,$$ and thus it is sufficient to show $\sum v_i v_j \left| w \cdot (x_i-x_j) \right|\leq 0$ for each fixed vector $w$. Now, $w \cdot (x_i-x_j)$ only depends on the orthogonal projections of $x_i$ and $x_j$ onto the line $\RR w$, so we may (and do) assume all the $x_i$ lie on a line. Our goal is now to show, for any $n$ values $x_i \in \RR$, that $\sum v_i v_j |x_i-x_j| \leq 0$.
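The averaging identity itself is easy to sanity-check by Monte Carlo (the sample size and test vectors below are arbitrary illustrative choices, not part of the proof): the spherical average of $|w \cdot x|$, divided by $|x|$, should give the same constant $c$ for every $x$; in $\mathbb{R}^3$ this constant is $1/2$, since the first coordinate of a uniform point on the sphere is uniform on $[-1,1]$ by Archimedes' theorem.

```python
import numpy as np

rng = np.random.default_rng(1)

def mean_abs_dot(x, samples=200_000):
    """Monte Carlo estimate of the average of |w . x| over the unit sphere."""
    w = rng.normal(size=(samples, len(x)))
    w /= np.linalg.norm(w, axis=1, keepdims=True)  # uniform points on the sphere
    return np.abs(w @ x).mean()

# The ratio (average)/|x| should be the same constant c for every x.
vectors = [np.array([3.0, 0.0, 0.0]), np.array([1.0, 2.0, 2.0])]
ratios = [mean_abs_dot(x) / np.linalg.norm(x) for x in vectors]
print(ratios)  # both close to c = 1/2 in dimension 3
```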

We have $|z| = \max(z,0) + \max(-z,0)$, so $\sum v_i v_j |x_i-x_j|=2 \sum v_i v_j \max(x_i-x_j,0)$. We use the Iverson bracket notation $\left[ \mbox{statement} \right]$ to mean $1$ if the statement is true and $0$ if it is false. So $$\max(x_i-x_j,0) = \int_{t \in \RR} \left[x_j < t < x_i \right] dt$$ and $$\sum_{i,j} v_i v_j \max(x_i-x_j,0) = \int_{t \in \RR} \sum_{i,j} v_i v_j \left[x_j < t < x_i \right] dt.$$ So it is enough to show that, for any $t$, we have $$\sum_{x_j < t < x_i} v_i v_j \leq 0 . $$ Let $I = \{ i : x_i > t \}$ and $J = \{ j : x_j < t \}$. (For almost all $t$, none of the $x_i$ equal $t$, so we can disregard the boundary case.) Then $$\sum_{x_j < t < x_i} v_i v_j = \sum_{i \in I,\ j \in J} v_i v_j = \left( \sum_{i \in I} v_i \right) \left( \sum_{j \in J} v_j \right) = - \left(\sum_{i \in I} v_i \right)^2 \leq 0 .$$ In the final equality, we have finally used the hypothesis $\sum_k v_k=0$.
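In one dimension the argument even collapses to a closed formula: sorting the $x_i$ and writing $S_k$ for the $k$-th partial sum of the correspondingly sorted $v_i$, the displayed integral is piecewise constant and evaluates to $\sum_{i,j} v_i v_j |x_i-x_j| = -2 \sum_{k=1}^{n-1} (x_{k+1}-x_k) S_k^2$. A quick numerical sketch of that identity (random data and seed are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(2)

n = 10
x = np.sort(rng.random(n))   # n distinct reals, sorted
v = rng.normal(size=n)
v -= v.mean()                # enforce v_1 + ... + v_n = 0

# Left side: the quadratic form itself.
lhs = sum(v[i] * v[j] * abs(x[i] - x[j]) for i in range(n) for j in range(n))

# Right side: -2 * integral of (partial sum)^2, evaluated exactly since the
# integrand is constant between consecutive x's.
S = np.cumsum(v)[:-1]        # S_k = v_1 + ... + v_k for k < n
rhs = -2.0 * np.sum(np.diff(x) * S**2)

print(lhs, rhs)  # the two agree, and both are <= 0
```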


Now, let's consider what happens for distinct $x_i$. As long as the $x_i$ are distinct, for almost all $w$, the orthogonal projections of the $x_i$ onto $\RR w$ remain distinct. In order for the total to be $0$, the integrand must be $0$ for almost every choice of $w$. Fix such a $w$, and relabel so that the orthogonal projections are ordered as $x_1 < x_2 < \cdots < x_n$. Once again, we must get $0$ from every choice of $t$. So we must have $\left( \sum_{i=1}^k v_i \right)^2 =0$ for all $k$, and this means that all $v_i$ are zero.
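The final step can be spelled out in one line: writing $S_k = \sum_{i=1}^k v_i$ for the partial sums, with the convention $S_0 = 0$,

```latex
S_k = 0 \ \text{for all } k
\quad\Longrightarrow\quad
v_k = S_k - S_{k-1} = 0 \ \text{for all } k .
```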

  • Yes, it is negative semidefinite on the plane given by $\sum_i v_i = 0$. Here is a sketch of a proof (feel free to write it up in detail): ... – darij grinberg Nov 23 '18 at 05:13
  • ... Step 1: Fix $n$ reals $v_1, v_2, \ldots, v_n$ satisfying $\sum_i v_i = 0$. Our goal is to prove that $\sum_{i, j} \left|x_i - x_j\right| v_i v_j \leq 0$ for any $n$ points $x_1, x_2, \ldots, x_n \in \mathbb{R}^m$ for any $m \geq 0$. (There is no reason to restrict oneself to $m = 3$.) By the classical "averaging over the sphere" trick, it suffices to prove this inequality only for $x_1, x_2, \ldots, x_n \in \mathbb{R}^1$, because each $\left|x_i - x_j\right|$ is just a scalar factor times ... – darij grinberg Nov 23 '18 at 05:15
  • ... the average of $w^\perp x_i - w^\perp x_j$ where $w$ ranges over the unit sphere in $\mathbb{R}^m$ (uniformly in Haar measure). (I hope I'm getting this trick correct; if not, replace the unit sphere by $\operatorname{SO}\left(m, \mathbb{R}\right)$ or something similar. I don't really know what I'm talking about when I'm talking integration.) ... – darij grinberg Nov 23 '18 at 05:16
  • ... Step 2: So let's prove this inequality for $m = 1$. I think this was an olympiad problem not too long ago (USAMO?), but I can't find it, so here's the idea of the solution: Let $I$ be a finite interval containing all points $x_1, x_2, \ldots, x_n \in \mathbb{R}$, and let us use the Iverson bracket notation. Then, $\left|x_i-x_j\right| = \int_I \left[t \leq x_i\right] \left[t > x_j\right] dt$ for all $i$ and $j$. Thus, ... – darij grinberg Nov 23 '18 at 05:18
  • ... we have $\sum_{i,j} \left|x_i-x_j\right| v_i v_j = \sum_{i,j} \int_I \left[t \leq x_i\right] \left[t > x_j\right] dt v_i v_j = \int_I \left(\sum_i \left[t \leq x_i\right] v_i\right) \left(\sum_j \left[t > x_j\right] v_j\right) dt \leq 0$, because each $t \in I$ satisfies $\sum_j \left[t > x_j\right] v_j = -\sum_i \left[t \leq x_i\right] v_i$ and thus $\left(\sum_i \left[t \leq x_i\right] v_i\right) \left(\sum_j \left[t > x_j\right] v_j\right) = - \left(\sum_i \left[t \leq x_i\right] v_i\right)^2 \leq 0$. – darij grinberg Nov 23 '18 at 05:18
  • @darijgrinberg Very cool! I actually came back today to write up a different proof: This is the $r \to 0^{+}$ limit of the fact that the matrix $\exp(-r d(x_i,x_j))$ is positive definite. The latter fact can be found in http://www.ams.org/journals/tran/1938-044-03/S0002-9947-1938-1501980-0/S0002-9947-1938-1501980-0.pdf . But your proof is better, not only because it is more elementary, but because it will be easier to determine when definiteness occurs. – David E Speyer Nov 23 '18 at 12:00

So, if I understand it correctly, you want to prove that $Ax=0 \implies x=0$, correct?

So, we simply need to show that $\det(A) \neq 0$; here we do this for $n = 3$.

Let $A= \begin{pmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{pmatrix}$ where $a_{ij} := d(i,j) := |x_i-x_j|$

\begin{align} \det A &= \det \begin{pmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{pmatrix} \\ &= a_{11} a_{22} a_{33} + a_{12} a_{23} a_{31} + a_{13} a_{21} a_{32} - a_{13} a_{22} a_{31} - a_{12} a_{21} a_{33} - a_{11} a_{23} a_{32} \\ &= 0 + a_{12} a_{23} a_{31} + a_{13} a_{21} a_{32} - 0 - 0 - 0 \\ &= 2a_{12}a_{13} a_{23} > 0 \end{align}

Here we used that $a_{ii}=d(i,i) = 0$ and $a_{ij} = d(i,j)>0$ for $i\neq j$.
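For what it's worth, the cofactor expansion above is easy to sanity-check numerically (the entries and seed below are arbitrary illustrative choices): for any symmetric $3\times 3$ matrix with zero diagonal, $\det A = 2 a_{12} a_{13} a_{23}$.

```python
import numpy as np

rng = np.random.default_rng(3)

# Arbitrary positive off-diagonal entries, as for a distance matrix.
a12, a13, a23 = rng.random(3) + 0.1
A = np.array([[0.0, a12, a13],
              [a12, 0.0, a23],
              [a13, a23, 0.0]])
print(np.linalg.det(A), 2 * a12 * a13 * a23)  # these agree
```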

Sudix