
Let $J = Q^{-1} A Q$ be the Jordan form of $A$, where $Q$ is a similarity transformation matrix. How can one minimize the condition number $\kappa(Q) = \|Q^{-1}\| \|Q\|$ over all admissible $Q$? Is this a convex optimization problem?

First, let us fix $A$ and $J$. The general version can then be solved by searching over all possible permutations of the Jordan blocks, even though that is less efficient.
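For concreteness, here is a minimal numerical sketch (the $4 \times 4$ test matrix and the use of SymPy/NumPy are only illustrative choices, not part of the problem): it computes one particular $Q$ with $Q^{-1} A Q = J$ and evaluates $\kappa(Q)$ in the 2-norm.

```python
# Minimal sketch: obtain one admissible Q via SymPy's Jordan decomposition
# and evaluate its 2-norm condition number numerically.
import numpy as np
import sympy as sp

# Illustrative 4x4 matrix with a repeated eigenvalue and a nontrivial Jordan block.
A = sp.Matrix([[ 5,  4,  2,  1],
               [ 0,  1, -1, -1],
               [-1, -1,  3,  0],
               [ 1,  1, -1,  2]])

Q, J = A.jordan_form()                                # SymPy returns (Q, J) with A = Q J Q^{-1}
Q_num = np.array(Q.evalf().tolist(), dtype=complex)   # complex dtype to also cover complex spectra

print(J)                                              # Jordan form (block order chosen by SymPy)
print(np.linalg.cond(Q_num))                          # kappa_2(Q) = ||Q||_2 * ||Q^{-1}||_2
```

Minimizing this quantity over all admissible $Q$ (not just the particular one returned by the decomposition) is exactly the question.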

Ryan
  • What would be the matrix variables in such a convex program? Would $\bf A$ and $\bf J$ be given and $\bf Q$ be the only unknown? Or would the permutation of the Jordan blocks be considered? – Rodrigo de Azevedo Feb 08 '25 at 09:20
  • @RodrigodeAzevedo Thanks very much for your comments! Sorry for my delayed reply. I think we can fix $A$ and $J$ first. Then, the general version can be solved by searching all possible permutations (even though it is less efficient). – Ryan Feb 14 '25 at 12:03
  • If $\bf A$ and $\bf J$ are given, one has control over the columns of $\bf Q$, right? If so, have you tried making the columns of $\bf Q$ as orthonormal as possible? – Rodrigo de Azevedo Feb 14 '25 at 13:25
  • @RodrigodeAzevedo Inspired by your comment, I think we can parameterize $Q$ by scaling its columns, i.e., $Q(\alpha_1, \ldots, \alpha_n) = [\alpha_1 q_1~\ldots~\alpha_n q_n]$ ($\alpha_i \in \mathbb{C} \setminus {0}$), since the "directions" of the columns are fixed (eigenvectors or generalized eigenvectors). Then, $Q^{-1} =: P = [\beta_1 p_1^T~\ldots~\beta_1 p_n^T]^T$ with $\beta_i = 1/\alpha_i$, and this problem can be solved by searching $\alpha_1, \ldots, \alpha_n$. – Ryan Feb 14 '25 at 14:23
  • Are the directions truly fixed? What if there are, say, $2$-dimensional "generalized eigenspaces" corresponding to $2 \times 2$ Jordan blocks? In that case, there should be infinitely many orthonormal bases for such "generalized eigenspaces", right? – Rodrigo de Azevedo Feb 14 '25 at 17:13
  • @RodrigodeAzevedo Thanks for your comments. Yes, for the 2-dimensional case, the columns are derived by solving $A q_1 = \lambda q_1$ and $A q_2 = q_1 + \lambda q_2$, respectively. The direction of $q_1$ is fixed, but $q_2$ depends on $q_1$. Thus, the first column of $Q$ has one parameter, and the second has two parameters (one of which is shared with the first column). More specifically, $Q = [\alpha_1 q_1~~\alpha_1 q_2 + \alpha_2 q_1]$ (a small example of this is sketched after this thread). – Ryan Feb 15 '25 at 02:01
  • This might give you some intuition. – Rodrigo de Azevedo Feb 15 '25 at 08:22
  • @RodrigodeAzevedo Thanks so much! If $J$ is diagonal, $Q^{-1} =: P = [\beta_1 p_1^T~\ldots~\beta_n p_n^T]^T$, and $\beta_i = 1/\alpha_i$ is derived by solving the $n$ equations $\alpha_i \beta_i p_i q_i \stackrel{(a)}{=} \alpha_i \beta_i = 1$, where $(a)$ follows from normalizing so that $p_i q_i = 1$ (we can construct $Q$ and $P$ with this condition). If $J$ contains Jordan blocks, $p_1,\ldots,p_n$ are left eigenvectors/generalized eigenvectors, and $\beta_i$ can also be derived by solving $n$ equations in a similar way. – Ryan Feb 15 '25 at 16:09
  • So, problem solved? – Rodrigo de Azevedo Feb 15 '25 at 19:01
  • @RodrigodeAzevedo I think so (from my perspective, the problem has been successfully parameterized)! Thanks very much for your help! – Ryan Feb 19 '25 at 07:26
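To make the column-scaling idea from the comments concrete, here is a minimal numerical sketch for the diagonalizable case (the random test matrix and SciPy's Nelder–Mead local search are illustrative assumptions, not anything established above); it only searches the scalings $\alpha_i$ locally and says nothing about convexity or global optimality.

```python
# Sketch of the column-scaling parameterization Q(alpha) = [alpha_1 q_1 ... alpha_n q_n]
# for a diagonalizable A; only the scalings are free, the eigenvector directions are fixed.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))        # generic real matrix, diagonalizable in practice
eigvals, Q0 = np.linalg.eig(A)         # columns of Q0: unit-norm (possibly complex) eigenvectors

def kappa(log_alpha):
    """kappa_2 of Q0 with column i scaled by alpha_i = exp(log_alpha_i) > 0."""
    Q = Q0 * np.exp(log_alpha)         # broadcasting scales column i by alpha_i
    return np.linalg.cond(Q)

res = minimize(kappa, np.zeros(A.shape[0]), method="Nelder-Mead")  # local search only
print("kappa with unit-norm columns:", np.linalg.cond(Q0))
print("kappa after local search    :", kappa(res.x))
```

Restricting the search to positive real $\alpha_i$ loses nothing here: multiplying a column by a unit-modulus factor is right-multiplication by a diagonal unitary matrix, which does not change $\kappa_2$.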
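And here is a small check of the $2 \times 2$ Jordan-block parameterization described above (the concrete matrix $A$ and the chain $q_1, q_2$ are only an illustrative example): every admissible $Q(\alpha_1, \alpha_2) = [\alpha_1 q_1~~\alpha_1 q_2 + \alpha_2 q_1]$ reproduces the same Jordan block, while $\kappa(Q)$ varies with the parameters.

```python
# Check of the 2x2 Jordan-block parameterization: A q1 = lam*q1, A q2 = q1 + lam*q2,
# and the admissible similarity transforms are Q = [alpha1*q1, alpha1*q2 + alpha2*q1].
import numpy as np

A = np.array([[ 4.0, 1.0],
              [-1.0, 2.0]])            # double eigenvalue lam = 3, a single 2x2 Jordan block
lam = 3.0
q1 = np.array([1.0, -1.0])             # eigenvector:             A q1 = lam*q1
q2 = np.array([1.0,  0.0])             # generalized eigenvector: A q2 = q1 + lam*q2

def Q_of(alpha1, alpha2):
    return np.column_stack([alpha1 * q1, alpha1 * q2 + alpha2 * q1])

for a1, a2 in [(1.0, 0.0), (2.0, 0.5), (0.1, -1.0)]:
    Q = Q_of(a1, a2)
    J = np.linalg.solve(Q, A @ Q)      # always [[3, 1], [0, 3]], independent of (a1, a2)
    print((a1, a2), "kappa(Q) =", np.linalg.cond(Q))
```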

0 Answers