
How can I get lower/upper bounds on the largest eigenvalue of the following sum of a diagonal and a rank-1 matrix, for a vector $h$ with $h_i>0\ \forall i$:

$$A=2\text{diag}(h)+h \cdot 1^T$$ For instance, for $d=3$ it is the matrix below

$$2 \left( \begin{array}{ccc} h_1 & 0 & 0 \\ 0 & h_2 & 0 \\ 0 & 0 & h_3 \\ \end{array} \right)+\left( \begin{array}{ccc} h_1 & h_1 & h_1 \\ h_2 & h_2 & h_2 \\ h_3 & h_3 & h_3 \\ \end{array} \right) $$

Empirically, the following has been observed to be an upper bound: $$2\max_i h_i+\sum_i h_i\ge\lambda_\text{max}(A)$$

If we let $h=1,\frac{1}{2},\frac{1}{3},\ldots,\frac{1}{d}$, then for $d=4000$ the true value is $\approx 9.29455$, while the proposed upper bound is $10.8714$. Furthermore, the relative difference between the bound and the true value seems bounded as we vary $h$.
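A quick numerical check of these numbers (a sketch in Python/NumPy; it uses the observation, not stated above, that $A$ is conjugate via $\text{diag}(h)^{1/2}$ to the symmetric matrix $2\,\text{diag}(h)+\sqrt{h}\sqrt{h}^T$, so a symmetric eigensolver applies):

```python
import numpy as np

# h_i = 1/i for i = 1..d, the example above
d = 4000
h = 1.0 / np.arange(1, d + 1)

# A = 2 diag(h) + h 1^T is conjugate (via D^{1/2}, D = diag(h)) to the
# symmetric matrix S = 2 diag(h) + sqrt(h) sqrt(h)^T, so A and S have
# the same (real) eigenvalues and eigvalsh can be used.
s = np.sqrt(h)
S = 2.0 * np.diag(h) + np.outer(s, s)
lam_max = np.linalg.eigvalsh(S)[-1]

upper = 2.0 * h.max() + h.sum()  # proposed bound 2 max_i h_i + sum_i h_i
print(lam_max, upper)            # ~9.2946 and ~10.8714
assert lam_max <= upper
```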

Motivation: $\alpha<\lambda_1(A)$ is necessary and sufficient for the iteration $w=w-\alpha \langle w, x\rangle x$ to converge when $x$ is sampled from a centered normal distribution with diagonal covariance having $h_i$ on the diagonal (derivation).

2 Answers


We show that the bound holds, and establish a lower bound of $\sum h_j$.

For simplicity, I assume all $h_i$ are real and $h_i>0$ (and $h_1\geq h_2\geq\cdots\geq h_d$). Let's write out the eigenvalue-eigenvector equations for an eigenvector $(v_1, \ldots ,v_d)$ with $\sum v_j=1$ (such an eigenvector exists for the $\lambda$ of maximal modulus by Perron-Frobenius).

$$2h_iv_i+h_i \sum v_j = \lambda v_i$$ $$2h_iv_i+h_i = \lambda v_i$$ $$ h_i=(\lambda-2h_i)v_i$$

If $\lambda-2h_i=0$ for some $i$ then $h_i=0$ for that $i$, but we agreed this does not happen. Hence $v_i=\frac{h_i}{\lambda-2h_i}$ and

$$\sum \frac{h_i}{\lambda-2h_i}=1$$

If $|\lambda| > 2h_1+\sum h_j$ (let's denote $2h_1+\sum h_j$ by $l$) then

$$|\lambda -2h_i|\geq |\lambda|-2h_i>\sum h_j$$

$$\Bigl|\sum \frac{h_i}{\lambda-2h_i}\Bigr|\leq \sum \Bigl| \frac{h_i}{\lambda-2h_i} \Bigr| < \sum \frac{h_i}{\sum h_j}=1.$$

This is a contradiction, so in fact $|\lambda| \leq 2h_1+\sum h_j = l$.

Now, conversely, any root of the equation

$$\sum \frac{h_i}{\lambda-2h_i}=1$$

is an eigenvalue -- just take corresponding $v_i=\frac{h_i}{\lambda-2h_i}$ to get an eigenvector.

So we just need to show that $f(\epsilon)= \sum \frac{h_i}{l-\epsilon -2h_i} \geq 1$ for some $\epsilon>0$. Since $f(0)\leq 1$, there will then be some value of $\lambda$ between $l-\epsilon$ and $l$ (inclusive) where the sum equals $1$. If we take $\epsilon =2 h_1$ (assuming $\sum h_j > 2h_1$, so that the denominators below are positive) we have

$$f(\epsilon)=\sum \frac{h_i}{\sum h_j - 2h_i}>\sum \frac{h_i}{\sum h_j}=1$$

So we get a lower bound which is $2 h_1$ away from the upper.
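As a sanity check, one can locate the top eigenvalue numerically as the root of $\sum \frac{h_i}{\lambda-2h_i}=1$, bracketed exactly by the lower and upper bounds derived here (a Python sketch using SciPy's `brentq`):

```python
import numpy as np
from scipy.optimize import brentq

# h_i = 1/i, d = 4000, as in the question; l = 2 h_1 + sum h_j
d = 4000
h = 1.0 / np.arange(1, d + 1)
l = 2.0 * h.max() + h.sum()

# f is strictly decreasing for lambda > 2 h_1, so the top eigenvalue
# is the unique root of f(lambda) = 0 in (2 h_1, l].
def f(lam):
    return np.sum(h / (lam - 2.0 * h)) - 1.0

# f(sum h_j) > 0 and f(l) < 0 by the argument above, so we can bracket
lam1 = brentq(f, h.sum(), l)
print(lam1)  # ~9.294554; note l - lam1 < 2 h_1 = 2
assert h.sum() <= lam1 <= l
```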

(Note that the bound you wanted originally, i.e. $\lambda_{max}> H_d+2-C d^{-0.5}$, then boils down to proving $\sum_1^d \frac{\frac{1}{j}}{H_d+2-\frac{2}{j} - C d^{-0.5}}>1$ for some $C$ (independent of $d$). Here $H_d=\sum_1^d \frac{1}{j}$.)

Update: I am a bit suspicious. Some computations suggest that in the $h_j=1/j$ case the asymptotic value is actually $H_d$, i.e. the absolute gap (upper bound minus spectral radius) tends to $2$ and the relative gap decays as $2/\ln d$.
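One can probe this numerically (a Python sketch; the eigenvalue comes from solving the secular equation above, and recall the gap is always strictly below $2h_1=2$ by the lower bound):

```python
import numpy as np
from scipy.optimize import brentq

def lam_max(h):
    """Top eigenvalue: the root of sum h_i/(lam - 2 h_i) = 1 in (2 h_1, l]."""
    l = 2.0 * h.max() + h.sum()
    f = lambda lam: np.sum(h / (lam - 2.0 * h)) - 1.0
    return brentq(f, 2.0 * h.max() + 1e-9, l)

# absolute gap (upper bound minus spectral radius) for h_j = 1/j
gaps = []
for d in [100, 1000, 4000, 16000]:
    h = 1.0 / np.arange(1, d + 1)
    gaps.append(2.0 * h.max() + h.sum() - lam_max(h))
print(gaps)  # grows with d while staying below 2
```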

Max
  • (Already +1 some hours ago) A question: Let $n = 3$ and $h_1 = h_2 = 2, h_3 = 1$. Then $4$ is an eigenvalue of $A$, from $Av = 4v$ where $v = [v_1, v_2, v_3]^T$, we get $v_1 + v_2 + v_3 = 0$. But you let $\sum v_j=1$, so you don't consider the eigenvalue $4$ in my example? – River Li Sep 02 '22 at 10:47
  • You are right. One can either 1) restrict to the case of distinct $h_i$ as you do in your answer or 2) restrict to the case of the largest eigenvalue, where Perron-Frobenius guarantees an eigenvector with positive entries. I've added this to my answer. Thanks! – Max Sep 02 '22 at 16:39

Some thoughts:

We deal with the case $h_1 > h_2 > \cdots > h_n$.

Consider the equation $Ax = \lambda x$ ($x\ne 0$) which is written as $$(\lambda - 2h_k) x_k = h_k\sum_{i=1}^n x_i, \quad k=1, 2, \cdots, n. \tag{1}$$

We claim that $\lambda \ne 2h_j, \forall j$. Indeed, if $\lambda = 2h_j$ for some $j$, then $\sum_{i=1}^n x_i = 0$ and $(2h_j - 2h_k)x_k = 0, \forall k\ne j$ which results in $x_k = 0, \forall k \ne j$. Then we get $x = 0$. Contradiction.

From $\lambda \ne 2h_j, \forall j$, we have $\sum_{i=1}^n x_i \ne 0$. From (1), we have $$x_k = \frac{h_k}{\lambda - 2h_k}\sum_{i=1}^n x_i, \quad k=1, 2, \cdots, n. $$ Thus, we have $$\sum_{k=1}^n \frac{h_k}{\lambda - 2h_k} = 1. \tag{2}$$

Fact 1: The equation (2) has exactly $n$ distinct real solutions $\lambda_1 > \lambda_2 > \cdots > \lambda_n$ with $\lambda_1 > 2h_1$ and $\lambda_k \in (2h_k, 2h_{k-1}), k=2, 3, \cdots, n$.
(The proof is easy and thus omitted here.)

Let us give a lower bound of $\lambda_1$.

We have $$\frac{h_k}{\lambda_1 - 2h_k} = \frac{h_k}{\lambda_1}\cdot \frac{1}{1 - 2h_k/\lambda_1} > \frac{h_k}{\lambda_1} \cdot \left(1 + \frac{2h_k}{\lambda_1}\right), \quad \forall k.$$ Thus, we have $$\frac{\sum_{i=1}^n h_i}{\lambda_1} + \frac{2\sum_{i=1}^n h_i^2}{\lambda_1^2} < 1$$ which results in $$\lambda_1 > \frac12 \sum_{i=1}^n h_i + \frac12\sqrt{\left(\sum_{i=1}^n h_i\right)^2 + 8 \sum_{i=1}^n h_i^2}. \tag{3}$$

When $h = 1, \frac12, \frac13, \cdots, \frac1d$ and $d = 4000$, (3) gives $\lambda_1 > 9.227851206$. Using Maple, from (2), we get $\lambda_1 \approx 9.294554415$.
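For reference, both numbers can be reproduced with a short Python sketch (using SciPy's `brentq` on (2) in place of Maple):

```python
import numpy as np
from scipy.optimize import brentq

d = 4000
h = 1.0 / np.arange(1, d + 1)
S1, S2 = h.sum(), np.sum(h**2)

# closed-form lower bound (3)
lb3 = 0.5 * S1 + 0.5 * np.sqrt(S1**2 + 8.0 * S2)

# lambda_1 from the secular equation (2), bracketed in (2 h_1, 2 h_1 + S1]
f = lambda lam: np.sum(h / (lam - 2.0 * h)) - 1.0
lam1 = brentq(f, 2.0 * h[0] + 1e-9, 2.0 * h[0] + S1)
print(lb3, lam1)  # ~9.2278512 and ~9.2945544
assert lb3 < lam1
```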

A better lower bound:

We have $$\frac{h_1}{\lambda_1 - 2h_1} + \frac{\sum_{i=2}^n h_i}{\lambda_1} + \frac{2\sum_{i=2}^n h_i^2}{\lambda_1^2} < 1. \tag{4}$$

When $h = 1, \frac12, \frac13, \cdots, \frac1d$ and $d = 4000$, (4) gives $\lambda_1 > 9.284803103$.
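Bound (4) has no closed form, but the bound it yields is the root of the left-hand side set equal to $1$: the left-hand side is decreasing in $\lambda_1$ for $\lambda_1 > 2h_1$, and (4) says it is below $1$ at the true $\lambda_1$, so the root lower-bounds $\lambda_1$. A Python sketch:

```python
import numpy as np
from scipy.optimize import brentq

d = 4000
h = 1.0 / np.arange(1, d + 1)
h1 = h[0]
S1p, S2p = h[1:].sum(), np.sum(h[1:]**2)

# left-hand side of (4); strictly decreasing for lam > 2 h1
g = lambda lam: h1 / (lam - 2.0 * h1) + S1p / lam + 2.0 * S2p / lam**2

# the root of g(lam) = 1 lower-bounds lambda_1, since g(lambda_1) < 1
lb4 = brentq(lambda lam: g(lam) - 1.0, 2.0 * h1 + 1e-9, 2.0 * h1 + h.sum())
print(lb4)  # ~9.2848031
```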

River Li
  • I'm wondering if better bounds exist if we know that $R=(\sum_i h_i)^2/(\sum_i h_i^2)$ is large. For general $h$, the bounds above can be off by a factor of 80 in 200 dimensions. But I'm finding empirically for $R>100$, the two extreme distributions you provided in https://math.stackexchange.com/questions/4633085/bounds-on-max-i-p-i-in-terms-of-sum-i-p-i2 serve to bound the ratio $\lambda_1/(\|h\|_1+2\|h\|_\infty)$ – Yaroslav Bulatov Feb 26 '23 at 23:38
  • Plotting the ratio for a variety of spectra against $R$ https://i.sstatic.net/pF0sa.png – Yaroslav Bulatov Feb 26 '23 at 23:40
  • @YaroslavBulatov The task is to give a good bound of $\lambda_1$ based on (2). We can do better such as (4) which does not admit closed form bounds. – River Li Feb 27 '23 at 00:20
  • OK, actually, plotting true value divided by your lower bound from (4) seems to converge to 1 as $R$ increases for all the families I tried -- https://i.sstatic.net/ADGYP.png ... is it obvious why this should happen? – Yaroslav Bulatov Feb 27 '23 at 00:47
  • Forked into separate question https://math.stackexchange.com/questions/4647281/largest-value-of-lambda-1-in-frach-1-lambda-1-2h-1-frac-sum-i-2 – Yaroslav Bulatov Feb 27 '23 at 01:12
  • @YaroslavBulatov We need to compare the bound (4) with $\|h\|_1 + 2 \|h\|_\infty$? – River Li Feb 27 '23 at 01:22
  • yes. Empirically, the value obtained by optimizing bound (4) seems to converge to $\|h\|_1+2\|h\|_\infty$ as $R(h)=\frac{(\sum_i h_i)^2}{\sum_i h_i^2}$ goes to infinity, so it would be useful to know if it's true – Yaroslav Bulatov Feb 27 '23 at 01:25