
According to the answer from @Cloudscape:

  • The first step in finding the convex hull of a given set is to visualize the set and guess its convex hull.
  • The second step is to prove that your guess contains the set whose convex hull you want to find.
  • Next, prove that your guess is convex.
  • Finally, prove that any convex set containing the set includes your guess.

But I am struggling with the first step: I don't know how to visualize the given set.

Additionally, I also wonder about the specific meaning of "compact form".

  • Is $uu^T$ the outer product? – Anon Mar 30 '21 at 06:20
  • @Kaind exactly! – Raiden Mar 30 '21 at 06:34
  • Is the first sentence you talking to us, or you reproducing the text of some other source talking to you? If the second, you should blockquote it, and also cite your source. The recommendations in the linked answer might not be helpful in $n^2$ dimensions. (Or are you restricting your vectors specifically to $\Bbb R^2$ or $\Bbb R^3$? If so, you should specify.) – anon Mar 30 '21 at 06:49
  • I don't understand the English of your last sentence. Are you saying you're talking about $\Bbb R^2$, or are you saying you're not talking about $\Bbb R^2$? – anon Mar 30 '21 at 09:07
  • Pablo Parrilo, The convex algebraic geometry of rank minimization, International Symposium on Mathematical Programming, August 2009, Chicago. – Rodrigo de Azevedo Mar 30 '21 at 09:33
  • Nowhere in your original question did you say you were only talking about two dimensions! That's important information, OP, you shouldn't hide it from us!! – anon Mar 30 '21 at 09:34
  • @runway44 Really sorry about that~ But my teacher does not mention it in the problem sets @_@ – Raiden Mar 30 '21 at 10:08
  • Well then could it be the homework is not restricting to 2D? What level of class / homework is this at? Context and background are important. The 2D version and the $n$-dimensional are very different difficulty levels. – anon Mar 30 '21 at 10:10
  • @runway44 Sorry again, I might have misled everyone. Maybe it is not restricted to 2D here. It might be in $n^2$ dimensions. It's from a graduate-level optimization course. – Raiden Mar 30 '21 at 10:17

2 Answers


Visualization of the set: a matrix $A$ belongs to that set iff

  1. $A$ is symmetric,
  2. $a_{ii} \ge 0$ and $\operatorname{tr}(A) = 1$,
  3. $a_{ij}^2 = a_{ii} a_{jj}$ for all $i, j$.

Can you carry on from here?
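For a concrete check, one might verify these conditions numerically for a matrix of the form $A = \mathbf{u}\mathbf{u}^T$. This is only a sketch (assuming NumPy; the dimension, seed, and tolerances are my own choices):

```python
import numpy as np

rng = np.random.default_rng(0)
u = rng.standard_normal(4)
u /= np.linalg.norm(u)            # unit vector, ||u|| = 1
A = np.outer(u, u)                # A = u u^T

assert np.allclose(A, A.T)                    # 1. A is symmetric
assert np.all(np.diag(A) >= 0)                # 2. nonnegative diagonal ...
assert np.isclose(np.trace(A), 1.0)           #    ... and trace 1
for i in range(4):
    for j in range(4):
        # 3. a_ij^2 = a_ii * a_jj
        assert np.isclose(A[i, j] ** 2, A[i, i] * A[j, j])
print("all three conditions hold")
```

Note that this checks the $+\mathbf{u}\mathbf{u}^T$ case only; for $-\mathbf{u}\mathbf{u}^T$ the trace is $-1$, which is exactly the sign issue raised in the comments below.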

  • The answer linked by Rodrigo above seems more complex than this hint seems to suggest, or am I missing something? – anon Mar 30 '21 at 09:22
  • @Kaind But how do we deal with the sign of $uu^T$, since $\operatorname{tr}(-A) = -1$? – Raiden Mar 30 '21 at 10:34
  • @Kaind I also don't know how to use the third condition you mentioned to construct the convex hull in matrix form – Raiden Mar 30 '21 at 12:32
  • @Raiden Yeah because that question is $uv^T$ and this question is $uu^T$ i.e. this question is a subset of that question. – Anon Mar 30 '21 at 14:06
  • @Kaind yeah $\mathbf{u}\mathbf{u}^{T}$ is a subset of the $\mathbf{u}\mathbf{v}^{T}$ case, but here it is $\pm\mathbf{u}\mathbf{u}^{T}$, which is different! In the $\mathbf{u}\mathbf{v}^{T}$ case we can substitute $-\mathbf{v}$ for $\mathbf{v}$ and get $-\mathbf{u}\mathbf{v}^{T}$, so $\left\{\mathbf{u} \mathbf{v}^{T}: \mathbf{u} \in \mathbb{R}^{m}, \mathbf{v} \in \mathbb{R}^{n}, \|\mathbf{u}\|_{2}=\|\mathbf{v}\|_{2}=1\right\} = \left\{-\mathbf{u} \mathbf{v}^{T}: \mathbf{u} \in \mathbb{R}^{m}, \mathbf{v} \in \mathbb{R}^{n}, \|\mathbf{u}\|_{2}=\|\mathbf{v}\|_{2}=1\right\}$.
    But here we cannot get $-\mathbf{u}\mathbf{u}^T$ the same way~
    – Raiden Mar 31 '21 at 01:55
  • You can still use their solution to solve your problem. – Anon Mar 31 '21 at 02:02
  • I mean $\left\{\mathbf{u} \mathbf{v}^{T}: \mathbf{u} \in \mathbb{R}^{m}, \mathbf{v} \in \mathbb{R}^{n}, \|\mathbf{u}\|_{2}=\|\mathbf{v}\|_{2}=1\right\}=\left\{\pm \mathbf{u} \mathbf{v}^{T}: \mathbf{u} \in \mathbb{R}^{m}, \mathbf{v} \in \mathbb{R}^{n}, \|\mathbf{u}\|_{2}=\|\mathbf{v}\|_{2}=1\right\}$, but $\left\{\mathbf{u} \mathbf{u}^{T} \mid \|\mathbf{u}\|=1\right\}\neq \left\{\pm \mathbf{u} \mathbf{u}^{T} \mid \|\mathbf{u}\|=1\right\}$ – Raiden Mar 31 '21 at 02:04

Interestingly, I'm facing the same question in my optimization course, and maybe I'm in the same class as the asker.

Let $A = \left\{\pm \mathbf{u} \mathbf{u}^{T} \mid\|\mathbf{u}\|=1\right\}$. First we may guess that $\operatorname{conv}A$ consists of symmetric matrices with some eigenvalue properties. The rank of $\mathbf{u} \mathbf{u}^{T}$ is $1$, so$$\lambda_1 = \operatorname{tr}(\mathbf{u} \mathbf{u}^{T}) = \operatorname{tr}(\mathbf{u}^T \mathbf{u}) = \|\mathbf{u}\|^2 = 1, \qquad \lambda_i = 0, \quad 2\le i \le n.$$ Its nuclear norm is $\|\mathbf{u} \mathbf{u}^{T}\|_* = \sum_i|\lambda_i| = \lambda_1 = 1$ (for symmetric matrices the nuclear norm equals the sum of the absolute values of the eigenvalues).

We can also find that $\mathbf{0} = \tfrac12\mathbf{u} \mathbf{u}^{T} + \tfrac12(-\mathbf{u} \mathbf{u}^{T})\in \operatorname{conv}A$, with nuclear norm $0$. Therefore we guess that $$\operatorname{conv}A = B := \left\{M\in \mathbb{R}^{n\times n}\mid M^T = M , \|M\|_* \le 1\right\}. $$
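This guess can be sanity-checked numerically: any random convex combination of elements of $A$ should be symmetric with nuclear norm at most $1$. A minimal sketch (assuming NumPy; the dimension, number of terms, and seed are my own choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 4, 6
ws = rng.random(k)
ws /= ws.sum()                                # convex weights summing to 1

# random convex combination of matrices +-u u^T with ||u|| = 1
M = np.zeros((n, n))
for w in ws:
    u = rng.standard_normal(n)
    u /= np.linalg.norm(u)
    M += w * rng.choice([-1.0, 1.0]) * np.outer(u, u)

nuc = np.linalg.norm(M, ord='nuc')            # nuclear norm = sum of singular values
assert np.allclose(M, M.T)                    # symmetric
assert nuc <= 1 + 1e-9                        # ||M||_* <= 1
print(f"symmetric, nuclear norm = {nuc:.4f} <= 1")
```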

We prove the guess in two steps.

Step 1: $B\subseteq\text{conv}A$

For every $M \in B$ we have $M=M^T$, so we can diagonalize $$ M = Q\Lambda Q^T, \quad QQ^T=I, \quad \Lambda = \operatorname{diag}\{\lambda_1,\lambda_2,\dots,\lambda_n\}.$$ Writing $Q = [\mathbf{p}_1, \mathbf{p}_2, \dots, \mathbf{p}_n]$ with columns $\mathbf{p}_i \in \mathbb{R}^n$, $\|\mathbf{p}_i\|_2 = 1$, we get, for any unit vector $\mathbf{u}$, \begin{align}M =& \sum_{i=1}^n\lambda_i \mathbf{p}_i \mathbf{p}_i^T \\ =& \sum_{i=1}^n|\lambda_i| \operatorname{sgn}(\lambda_i)\mathbf{p}_i \mathbf{p}_i^T\\ =& \sum_{i=1}^n|\lambda_i| \operatorname{sgn}(\lambda_i)\mathbf{p}_i \mathbf{p}_i^T + \frac{1-\sum_{i=1}^n|\lambda_i|}{2}\bigl(\mathbf{u}\mathbf{u}^{T} + (-\mathbf{u} \mathbf{u}^{T})\bigr). \end{align} The coefficients $|\lambda_i|$ and $\frac{1-\sum_{i=1}^n|\lambda_i|}{2}$ (used twice) are nonnegative, since $\sum_{i=1}^n|\lambda_i| = \|M\|_* \le 1$, and they sum to $1$; each matrix $\pm\mathbf{p}_i\mathbf{p}_i^T$, $\pm\mathbf{u}\mathbf{u}^T$ lies in $A$. Hence $M$ is a convex combination of elements of $A$, so $B\subseteq\operatorname{conv}A$.
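The decomposition in Step 1 can be reproduced numerically: take a symmetric matrix with nuclear norm at most $1$, expand it over its eigenvectors, and pad with the zero-sum pair $\pm\mathbf{u}\mathbf{u}^T$. A sketch (assuming NumPy; the test matrix, the choice $\mathbf{u} = e_1$, and the seed are my own):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
# random symmetric M scaled to nuclear norm 1/2, so M is in B
S = rng.standard_normal((n, n))
S = (S + S.T) / 2
M = S / (2 * np.linalg.norm(S, ord='nuc'))

lam, Q = np.linalg.eigh(M)                    # M = Q diag(lam) Q^T
u = np.zeros(n)
u[0] = 1.0                                    # any unit vector works here
slack = 1 - np.abs(lam).sum()                 # = 1 - ||M||_* >= 0

# convex combination from the proof: eigenvector terms plus a zero-sum pair
recon = sum(abs(l) * np.sign(l) * np.outer(Q[:, i], Q[:, i])
            for i, l in enumerate(lam))
recon = recon + (slack / 2) * (np.outer(u, u) + (-np.outer(u, u)))  # adds 0

weights = list(np.abs(lam)) + [slack / 2, slack / 2]
assert np.allclose(recon, M)                  # reconstruction is exact
assert min(weights) >= -1e-12                 # nonnegative weights ...
assert np.isclose(sum(weights), 1.0)          # ... summing to 1
print("M written as a convex combination of elements of A")
```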

Step 2: $B$ is convex

The nuclear norm $\|\cdot \|_*$ is a convex function, so for $$\forall M_1,M_2 \in B, \forall \theta \in [0,1], \quad M_0 = \theta M_1 + (1-\theta)M_2,$$ we have $$M_0^T = \theta M_1^T + (1-\theta)M_2^T = \theta M_1 + (1-\theta)M_2 = M_0, \\ \|M_0\|_* = \|\theta M_1 + (1-\theta)M_2\|_* \le \theta \|M_1\|_* + (1-\theta)\|M_2\|_* \le \theta + (1-\theta) = 1,$$ so $M_0\in B$, and $B$ is convex.
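Step 2 can likewise be spot-checked numerically. In this sketch the helper `random_element_of_B` is my own construction of symmetric matrices with nuclear norm at most $1$ (assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4

def random_element_of_B(rng, n):
    # symmetric matrix scaled so its nuclear norm is at most 1
    S = rng.standard_normal((n, n))
    S = (S + S.T) / 2
    return S / (np.linalg.norm(S, ord='nuc') + rng.random())

M1, M2 = random_element_of_B(rng, n), random_element_of_B(rng, n)
theta = rng.random()
M0 = theta * M1 + (1 - theta) * M2            # convex combination

assert np.allclose(M0, M0.T)                  # M0 is symmetric
assert np.linalg.norm(M0, ord='nuc') <= 1 + 1e-9  # ||M0||_* <= 1
print("theta*M1 + (1-theta)*M2 stays in B")
```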

Now we know that $B$ is convex, that $B \subseteq \operatorname{conv}A$ (Step 1), and that $A \subseteq B$ (each $\pm\mathbf{u}\mathbf{u}^T$ is symmetric with nuclear norm $1$). Since $\operatorname{conv}A$ is the smallest convex set containing $A$, the last two facts give $\operatorname{conv}A \subseteq B$, so $B$ is exactly $\operatorname{conv}A$.