Let $A_1,\dots,A_m$ be a set of $m<n$ real positive definite matrices of size $n\times n$, and let $H(\alpha) = \sum_{i=1}^m\alpha_iA_i$. How can I find nontrivial values (not all zero) $\alpha_1,\dots,\alpha_m\in\mathbb{R}$ (if they exist) such that $\det H(\alpha)=0$ and $\sum_{i=1}^m \alpha_i = 1$?

This question is related to this other one: Conditions for this vectors to be linearly dependent, but since my main concerns there were already addressed, I decided to ask a separate question.

My attempt: I posed this as a nonlinear program: $$ \min_{\alpha_1,\dots,\alpha_m} \left(\det H(\alpha)\right)^2 $$ $$ \text{subject to }\sum_{i=1}^m\alpha_i=1. $$ The idea was that, given a solution, I could check whether the minimum is $0$.

Using the Lagrange multiplier theorem, the local minima are solutions of $\nabla L(\alpha,\lambda) = 0$ with $L = \left(\det H(\alpha)\right)^2 + \lambda\left(1- \sum_{i=1}^m\alpha_i\right)$, leading to: $$ 2\det(H(\alpha))\,\operatorname{trace}(H(\alpha)^*A_i) = \lambda, \quad i=1,\dots,m, $$ where $H^*$ is the adjugate matrix and I used Jacobi's formula for differentiating the determinant. Since $\lambda$ is the same value in every equation, I can pair equations as: $$ 2\det(H(\alpha))\,\operatorname{trace}(H(\alpha)^*A_i) = 2\det(H(\alpha))\,\operatorname{trace}(H(\alpha)^*A_j). $$ However, these are satisfied if either $\det(H(\alpha))=0$ or $\operatorname{trace}(H(\alpha)^*(A_i-A_j))=0$. Hence this gives no new information: I was trying to determine when $\det(H(\alpha))=0$, and the answer I obtained is essentially "well... whenever $\det(H(\alpha))=0$", which is true... but doesn't lead to a method for computing $\alpha$, which is what I want.
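For concreteness, here is a minimal numerical sketch of this program using NumPy/SciPy. The random symmetric positive definite test matrices are my own assumption (the question has no data), and since $(\det H(\alpha))^2$ is non-convex and badly scaled, a local solver may well stop at a nonzero local minimum:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, m = 4, 3

# Hypothetical test data: random symmetric positive definite matrices.
def rand_spd():
    M = rng.standard_normal((n, n))
    return M @ M.T + n * np.eye(n)

A = [rand_spd() for _ in range(m)]

def H(alpha):
    return sum(a * Ai for a, Ai in zip(alpha, A))

def objective(alpha):
    return np.linalg.det(H(alpha)) ** 2

# Equality constraint: sum(alpha) == 1.
cons = {"type": "eq", "fun": lambda alpha: np.sum(alpha) - 1.0}
res = minimize(objective, x0=np.full(m, 1.0 / m), constraints=cons)
print(res.x, objective(res.x))
```

A vanishing final objective would indicate a singular $H(\alpha)$ was found, but a nonzero value is inconclusive, which is exactly the weakness discussed above.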

Can you think of any way to modify my attempt to obtain a method for computing $\alpha$ (if it exists)? Do you suggest another approach?

Even ignoring the restriction $\sum_{i=1}^m \alpha_i=1$, is there anything you would suggest?

At this point, any suggestion or comment is useful to me.

EDIT: I know that if I require $\alpha_i\geq 0$, or require $H(\alpha)$ to be positive semidefinite, I could modify the program above, or use semidefinite programming to obtain "some" solutions. However, I really don't want to impose those restrictions, since I'm interested in all cases where $H(\alpha)$ can be singular. Allowing $H(\alpha)$ to be non-positive-semidefinite is important for me.

1 Answer


I assume that these real positive definite matrices are in fact symmetric, and that any duplicates have been removed from the set (i.e. the matrices are all distinct).

Since all matrices are distinct, $B := A_1 - A_2$ is a nonzero symmetric matrix. This means it has some nonzero eigenvalue, WLOG $\lambda \lt 0$ (if not, take $B := A_2 - A_1$), with associated eigenvector $\mathbf v$.

For $\gamma \in \mathbb R$, define
$C_{\gamma} := \gamma \cdot B + A_m.$
Note that $C_{\gamma}$ implicitly has $\alpha_1 = \gamma$, $\alpha_2 = -\gamma$, $\alpha_m = 1$, and all other $\alpha_j=0$, so $\sum_{k=1}^m \alpha_k = 1$ as desired.

Then $\mathbf v^T C_{\gamma}\mathbf v = \gamma\lambda\,\|\mathbf v\|^2 + \mathbf v^T A_m \mathbf v \lt 0$ for positive $\gamma$ large enough (call this value $\Gamma$), while
$\mathbf v^T C_{0}\mathbf v = \mathbf v^T A_m \mathbf v \gt 0$.

In the former case the signature is $\big(*,\geq 1\big)$ (at least one negative eigenvalue), and in the latter case it is $\big(n,0\big)$. Since the eigenvalues of $C_{\gamma}$ vary continuously with $\gamma$, there exists some $\gamma^* \in \big(0,\Gamma\big)$ such that $\det\big(C_{\gamma^*}\big)=0$.

If you expand the determinant of $C_{\gamma}$ (using e.g. row reduction), you find it is a single-variable polynomial in $\gamma$ of degree at most $n$, so you can solve for a satisfying root with your favorite root finder.
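This root-finding step can be sketched numerically (with hypothetical random test matrices in place of the question's $A_i$). Since $\det(\gamma B + A_m)$ is a polynomial of degree at most $n$ in $\gamma$, sampling it at $n+1$ points and interpolating recovers its coefficients exactly, up to rounding:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# Hypothetical test data: distinct symmetric positive definite matrices.
def rand_spd():
    M = rng.standard_normal((n, n))
    return M @ M.T + n * np.eye(n)

A1, A2, Am = rand_spd(), rand_spd(), rand_spd()

B = A1 - A2                       # nonzero symmetric
if np.linalg.eigvalsh(B).min() > 0:
    B = -B                        # ensure B has a negative eigenvalue (WLOG)

# det(gamma*B + Am) is a polynomial in gamma of degree <= n:
# sample it at n+1 points and recover the coefficients by interpolation.
pts = np.arange(n + 1, dtype=float)
vals = [np.linalg.det(g * B + Am) for g in pts]
coeffs = np.polyfit(pts, vals, n)

# The signature argument above guarantees a real root exists.
roots = np.roots(coeffs)
real_roots = roots[np.abs(roots.imag) < 1e-8].real
dets = [abs(np.linalg.det(g * B + Am)) for g in real_roots]
gamma = real_roots[int(np.argmin(dets))]

C = gamma * B + Am
print(gamma, np.linalg.det(C))    # determinant is numerically zero
```

The recovered $\gamma$ then gives $\alpha_1 = \gamma$, $\alpha_2 = -\gamma$, $\alpha_m = 1$ as in the construction above.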

The original post says

EDIT: I know that if I require $\alpha_i \geq 0$, or require $H(\alpha)$ to be positive semidefinite, I could modify the program above, or use semidefinite programming to obtain "some" solutions. However, I really don't want to impose those restrictions, since I'm interested in all cases where $H(\alpha)$ can be singular. Allowing $H(\alpha)$ to be non-positive-semidefinite is important for me.

This misses an important point. If we restrict each $\alpha_i \geq 0$ (not all zero), then any nontrivial linear combination amounts to a nonnegative sum of positive definite matrices -- such a sum is always positive definite, and hence $\det\big(H(\alpha)\big) \gt 0$ in such cases.
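A quick numerical sanity check of this claim (again with hypothetical random test matrices, which are an assumption for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 4, 3

# Hypothetical test data: random symmetric positive definite matrices.
def rand_spd():
    M = rng.standard_normal((n, n))
    return M @ M.T + n * np.eye(n)

A = [rand_spd() for _ in range(m)]

# Any nonnegative, not-all-zero combination is positive definite.
for _ in range(100):
    alpha = rng.random(m)       # entries in [0, 1)
    alpha /= alpha.sum()        # normalize so sum(alpha) == 1
    H = sum(a * Ai for a, Ai in zip(alpha, A))
    assert np.linalg.eigvalsh(H).min() > 0
    assert np.linalg.det(H) > 0
print("all nonnegative combinations are positive definite")
```

So under the restriction $\alpha_i \geq 0$, singular $H(\alpha)$ simply cannot occur, which is why allowing negative coefficients is essential.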

user8675309
  • Thank you for this example! Just out of curiosity, do you think there are infinitely many combinations like this? In this example you used just three of the matrices and a single parameter $\gamma$, and you showed such a $\gamma$ can always be found. Do you think the same holds for two parameters and four matrices? My point is: is there a whole manifold (of dimension greater than 1) of $\alpha$'s for which $H(\alpha)$ is singular? – FeedbackLooper Oct 02 '20 at 09:37
  • There probably is some nice structure with the zero sets when using two parameters. – user8675309 Oct 02 '20 at 16:13
  • Thanks! You have helped me a lot! – FeedbackLooper Oct 02 '20 at 16:16