I have an operator $T$ on the space of symmetric $n \times n$ matrices defined by $$ T(S) = L^{t}SL $$ where $L$ is an $n \times n$ matrix and $L^t$ denotes the transpose of $L$. I want to find the determinant of the matrix of $T$ (expressed through the coefficients of the characteristic polynomial of $L$). For that purpose, I wanted to find the matrix itself first. My approach was to calculate the images of all basis vectors of the space of symmetric matrices and then “convert” them into vectors; the columns of the matrix of $T$ would then just be those vectors. The problem is that there are $\frac{n(n+1)}{2}$ basis vectors in my vector space, but after the “conversion” they will all have length $n^2$, which means that the matrix of $T$ won't even be square. I've been stuck on this problem for days now. Any help will be appreciated, thanks!
-
If I were you, I would simply use the relation $\mathrm{det}(T) = \mathrm{det}(L^T)\mathrm{det}(S)\mathrm{det}(L) = \mathrm{det}(S)\mathrm{det}(L)^2$. – Abezhiko Mar 06 '23 at 17:19
-
@Abezhiko: That's $\det(T(S))$, not $\det(T)$ (by which the question means the determinant of the operator $T$ on the space of symmetric matrices, not the determinant of a symmetric matrix). – joriki Mar 06 '23 at 17:23
-
I assume the field is $\mathbb R$. I suggest you first consider real symmetric PSD $L$ and get the determinant via rank one $S$'s that are eigenvectors. Using Polar Form you only then need to get the determinant of an orthogonal matrix; you can mimic the approach I took here: https://math.stackexchange.com/questions/4046572/how-do-i-find-operatornamedet-t-q/4046831#4046831 or better: use the fact that $V=M_n(\mathbb R)$ is a direct sum of symmetric and skew symmetric matrices, and each sub-space is $T$-invariant and get the result by calculating $\det T$ on $V$ and dividing by the result from the link. – user8675309 Mar 06 '23 at 17:49
-
@Jayden If you would like to use the suggested approach of dividing out by the determinant of the same function over the skew-symmetric matrix subspace, then it is useful to note that the map $T$ over all $n \times n$ matrices has determinant $\det(L)^{2n}$. – Ben Grossmann Mar 06 '23 at 18:28
-
@user8675309 With the singular value decomposition, you could even reduce the problem to the case of diagonal real symmetric PSD $L$ – Ben Grossmann Mar 06 '23 at 18:28
-
@BenGrossmann it's a good point about SVD. I suppose I typically reach for Polar Decomposition since it is unique for $GL_n(\mathbb R)$ unlike SVD.... but SVD might be conceptually simpler here. – user8675309 Mar 06 '23 at 18:35
-
@user8675309 thank you for all the advice! Unfortunately, the field is just $F$, so I can't really use the SVD or the polar form – Jayden Mar 06 '23 at 18:41
-
@Jayden Notably, this map is (up to a change of basis) the "symmetric tensor power" $L^{\vee 2}$ of $L$ – Ben Grossmann Mar 06 '23 at 18:42
-
Re "after 'conversion' they all will be of length $n^2$": remember to express those matrices in terms of your basis, which will get you back to length $\frac{n(n+1)}{2}$ – math54321 Mar 06 '23 at 18:46
-
@Jayden If you pick an ordering of the pairs $i\leq j$ and use the basis elements $e_i e_j^T + e_je_i^T$ (where $e_1,\dots,e_n$ is the standard basis), then the matrix of $T$ relative to this basis will be the matrix whose entries are the permanents of a corresponding $2 \times 2$ submatrix of $L$. – Ben Grossmann Mar 06 '23 at 18:48
-
@BenGrossmann could you please explain what "symmetric" tensor power means? I guess the plain tensor power is $L \otimes L$? Actually I just googled it and it said the second symmetric power of $V$ (where $V$ is a vector space) is a subspace of the symmetric algebra over $V$ generated by products of 2 elements from $V$. So how do we define such powers over tensors exactly? – Jayden Mar 06 '23 at 18:52
-
@Jayden People are giving you a lot of terms which you may not be familiar with, but I don't think you need to worry about them right now (unless you want to of course). Your approach is fine, you just forgot the last step of finding coefficients with respect to your chosen basis – math54321 Mar 06 '23 at 18:56
-
@Jayden There are a few equivalent definitions, but one is the restriction of $L \otimes L$ to the "symmetric" subspace of $V \otimes V$, namely the subspace generated by tensors of the form $x \otimes x$ for $x \in V$. – Ben Grossmann Mar 06 '23 at 18:57
-
@Jayden To clarify my earlier comment about permanents: if $f:\{1,\dots,\frac{n(n+1)}{2}\} \to \{1,\dots,n\}^2$ is your ordering of the pairs $1 \leq i \leq j \leq n$, then the $p,q$ entry of your matrix of $T$ should be as follows. Let $a,b,c,d$ be such that $(a,b) = f(p)$ and $(c,d) = f(q)$. Then the $p,q$ entry of the matrix is $$ \operatorname{perm} \pmatrix{L_{ac} & L_{ad}\\ L_{bc} & L_{bd}} = L_{ac}L_{bd} + L_{ad}L_{bc}. $$ – Ben Grossmann Mar 06 '23 at 19:02
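For concreteness, here is a minimal numpy sketch of the matrix-building procedure described in the comments above (the helper names `sym_basis`, `coords`, `matrix_of_T` are ad hoc, not from the thread): it expands each image $L^T B L$ of a basis element $B = e_ie_j^T + e_je_i^T$ back in that basis and prints the determinant, so one can experiment numerically with small random $L$.

```python
import numpy as np
from itertools import combinations_with_replacement

def sym_basis(n):
    """Ordered basis B_{ij} = e_i e_j^T + e_j e_i^T for i <= j."""
    pairs = list(combinations_with_replacement(range(n), 2))
    basis = []
    for i, j in pairs:
        B = np.zeros((n, n))
        B[i, j] += 1.0
        B[j, i] += 1.0          # for i == j this element is 2 e_i e_i^T
        basis.append(B)
    return pairs, basis

def coords(S, pairs):
    """Coordinates of a symmetric matrix S in the basis above."""
    return np.array([S[i, j] / (2.0 if i == j else 1.0) for i, j in pairs])

def matrix_of_T(L):
    """Matrix of S |-> L^T S L on symmetric matrices, in the basis above."""
    pairs, basis = sym_basis(L.shape[0])
    return np.column_stack([coords(L.T @ B @ L, pairs) for B in basis])

rng = np.random.default_rng(0)
n = 4
L = rng.standard_normal((n, n))
M = matrix_of_T(L)
print(M.shape)              # (10, 10), i.e. n(n+1)/2 x n(n+1)/2 for n = 4
print(np.linalg.det(M))     # compare against powers of det(L) to guess the answer
```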
-
@math54321 thanks! the thing is, even when I get a $\frac{n(n+1)}{2} \times \frac{n(n+1)}{2}$ matrix (I did this btw, thank you), its determinant is hard for me to compute, I just don't know how to do it – Jayden Mar 06 '23 at 19:08
-
@BenGrossmann thank you for all the clarifications ! – Jayden Mar 06 '23 at 19:08
-
My proof over $\mathbb R$, which implies the result over arbitrary fields, is completed. My standard suggestion for those unfamiliar with the principle of permanence of identities is to look at the first half of the Modules chapter in Artin's Algebra. – user8675309 Mar 06 '23 at 22:57
1 Answer
I am going to reformulate the problem somewhat. It suffices to consider this problem over $\mathbb R$; we can easily abstract to arbitrary fields afterwards (see the end of the post).
Let $V:=M_n(\mathbb R)$ and let $T:V\longrightarrow V$ be given by $T(X)=L^T XL$. Let $W$ be the subspace of skew-symmetric matrices and $W^\perp$ the subspace of symmetric matrices (orthogonal complements with respect to the Frobenius inner product). Then $V=W\oplus W^\perp$, and both subspaces are $T$-invariant: $TW\subseteq W$ and $TW^\perp\subseteq W^\perp$.
A standard result on this site, e.g. via the Kronecker product formulation (vectorizing gives $\operatorname{vec}(L^TXL)=(L^T\otimes L^T)\operatorname{vec}(X)$), is $\det\big(T\big)= \det\big(L\big)^{2n}$. So if we are able to compute $\det\big(T_W\big)$, then we get the answer OP is looking for via $\det\big(T_{W^\perp}\big)=\frac{\det(T)}{\det(T_W)}$.
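As a quick numerical sanity check of this standard fact, here is a minimal numpy sketch (random $L$, column-major vectorization):

```python
import numpy as np

# Sketch: on all of M_n(R), vec(L^T X L) = (L^T kron L^T) vec(X),
# so det(T) on the full matrix space is det(L)^(2n).
rng = np.random.default_rng(1)
n = 4
L = rng.standard_normal((n, n))

M_full = np.kron(L.T, L.T)                 # matrix of T acting on vec(X)
print(np.linalg.det(M_full))               # det(T) on M_n(R)
print(np.linalg.det(L) ** (2 * n))         # agrees up to rounding
```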
special case 1: $L$ is symmetric PSD.
Then $L$ has $n$ orthonormal eigenvectors $\big\{\mathbf v_1,\dots,\mathbf v_n\big\}$ with eigenvalues $\lambda_1,\dots,\lambda_n$, and $W$ has the $\binom{n}{2}$ skew-symmetric basis vectors $\mathbf v_k\mathbf v_j^T-\big(\mathbf v_k\mathbf v_j^T\big)^T=\mathbf v_k\mathbf v_j^T-\mathbf v_j\mathbf v_k^T$ for $j< k$. Since $L^T=L$,
$$T\big(\mathbf v_k\mathbf v_j^T-\mathbf v_j\mathbf v_k^T\big) = L\mathbf v_k\mathbf v_j^TL-L\mathbf v_j\mathbf v_k^TL = \lambda_j \lambda_k\big(\mathbf v_k\mathbf v_j^T-\mathbf v_j\mathbf v_k^T\big),$$
i.e. these are $\binom{n}{2}=\dim W$ eigenvectors for $T$ that live in $W$, and they are linearly independent. (To check independence, suppose $\mathbf v_k\mathbf v_j^T-\mathbf v_j\mathbf v_k^T$ were a linear combination of the other eigenvectors for $T_W$; first left-multiply by $\mathbf v_k^T$ to see that all terms remaining would have to contain $\mathbf v_j$ as one of the two vectors, then repeat, this time left-multiplying by $\mathbf v_j^T$.) This means $\det\big(T_W\big)$ is the product of the $\binom{n}{2}$ eigenvalues $\lambda_j\lambda_k$, in which e.g. $\lambda_1$ shows up $n-1$ times; this product is a symmetric function, thus
$\implies\det\big(T_W\big)=\prod_{i=1}^n \lambda_i^{n-1} =\det\big(L\big)^{n-1}$
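A small numerical sketch of this special case (numpy, random PSD $L$ built as $AA^T$; the basis bookkeeping below is ad hoc):

```python
import numpy as np
from itertools import combinations

# Sketch: for symmetric PSD L, det of T restricted to the
# skew-symmetric subspace W equals det(L)^(n-1).
rng = np.random.default_rng(2)
n = 4
A = rng.standard_normal((n, n))
L = A @ A.T                                  # symmetric PSD

pairs = list(combinations(range(n), 2))      # indices i < j
skew_basis = []
for i, j in pairs:
    B = np.zeros((n, n))
    B[i, j], B[j, i] = 1.0, -1.0             # e_i e_j^T - e_j e_i^T
    skew_basis.append(B)

def skew_coords(S):
    """Coordinates of a skew-symmetric matrix in the basis above."""
    return np.array([S[i, j] for i, j in pairs])

TW = np.column_stack([skew_coords(L.T @ B @ L) for B in skew_basis])
print(np.linalg.det(TW))                     # det(T_W)
print(np.linalg.det(L) ** (n - 1))           # agrees up to rounding
```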
special case 2: $L$ is orthogonal.
Then $\det\big(T_W\big) = \det\big(L\big)^{n-1}$; see the linked answer to "How do I find $\operatorname{det} T_Q$?" (https://math.stackexchange.com/questions/4046572/how-do-i-find-operatornamedet-t-q/4046831#4046831).
general case:
Using the polar decomposition, write a general $L$ as $L=UP$ with $U$ orthogonal and $P$ symmetric PSD. Since $T(X)=L^TXL=P\big(U^TXU\big)P$, the restriction $T_W$ is the composition of the maps from the two special cases above, so $\det\big(T_W\big)$ is the product of the two results:
$\det\big(T_W\big)=\det\big(U\big)^{n-1}\det\big(P\big)^{n-1}=\det\big(UP\big)^{n-1}=\det\big(L\big)^{n-1}$
Finally as OP desired, we have $\det\big(T_{W^\perp}\big)=\frac{\det(T)}{\det(T_W)}= \det\big(L\big)^{2n}\cdot\det\big(L\big)^{-(n-1)}=\det\big(L\big)^{n+1}$
remark: for the avoidance of doubt, this also holds when $\det\big(L\big)=0$ (where the division above would read $0/0$), because in such a case there is a non-zero $\mathbf y$ such that $\mathbf y^TL=\mathbf 0^T$, and so for any $\mathbf z$ we have $T_{W^\perp}\big(\mathbf y\mathbf z^T +\mathbf z\mathbf y^T\big)=\big(L^T\mathbf y\big)\mathbf z^TL + L^T\mathbf z\big(\mathbf y^TL\big)= \mathbf 0 +\mathbf 0\implies\det\big(T_{W^\perp}\big) = 0=\det\big(L\big)^{n+1}$
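A minimal numpy sketch checking the final identity for a random real $L$ (not assumed symmetric or orthogonal); the basis bookkeeping mirrors the sketch in the comments:

```python
import numpy as np
from itertools import combinations_with_replacement

# Sketch: det of T restricted to the symmetric subspace is det(L)^(n+1),
# checked here for a random real L with n = 5.
rng = np.random.default_rng(3)
n = 5
L = rng.standard_normal((n, n))

pairs = list(combinations_with_replacement(range(n), 2))   # i <= j
sym_b = []
for i, j in pairs:
    B = np.zeros((n, n))
    B[i, j] += 1.0
    B[j, i] += 1.0                          # diagonal elements are 2 e_i e_i^T
    sym_b.append(B)

def sym_coords(S):
    """Coordinates of a symmetric matrix in the basis above."""
    return np.array([S[i, j] / (2.0 if i == j else 1.0) for i, j in pairs])

T_sym = np.column_stack([sym_coords(L.T @ B @ L) for B in sym_b])
print(np.linalg.det(T_sym))               # det(T_{W^perp})
print(np.linalg.det(L) ** (n + 1))        # agrees up to rounding
```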
arbitrary fields
Abstracting out: let $V'$ be the submodule of symmetric matrices in $M_n\big(\mathbb Z[\mathbf x]\big)$, where $L$ is now the $n\times n$ matrix whose entries are the indeterminates $\mathbf x$, and let $T':V'\longrightarrow V'$ be given by $T'(X')=L^TX'L$. Then $\det\big(T'\big)-\det\big(L\big)^{n+1}$ is a polynomial in $\mathbb Z[\mathbf x]$ that vanishes under every substitution homomorphism $\phi:\mathbb Z[\mathbf x]\longrightarrow \mathbb R$, so we conclude by the Principle of Permanence of Identities that $\det\big(T'\big)=\det\big(L\big)^{n+1}$, and this survives any substitution $\Phi:\mathbb Z[\mathbf x]\longrightarrow \mathbb K$ for an arbitrary field $\mathbb K$.
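For a small concrete instance of this polynomial identity, here is a symbolic sketch with sympy for $n=2$ (the basis and variable names are ad hoc): with $L$ a matrix of indeterminates, $\det\big(T'\big)-\det\big(L\big)^{3}$ expands to the zero polynomial.

```python
import sympy as sp

# Symbolic sketch, n = 2: with L a matrix of indeterminates,
# det(T') - det(L)^(n+1) is identically zero in Z[a, b, c, d].
a, b, c, d = sp.symbols('a b c d')
L = sp.Matrix([[a, b], [c, d]])

# Basis of 2x2 symmetric matrices: E11, E12 + E21, E22
basis = [sp.Matrix([[1, 0], [0, 0]]),
         sp.Matrix([[0, 1], [1, 0]]),
         sp.Matrix([[0, 0], [0, 1]])]

def coords(S):
    # S = S[0,0]*E11 + S[0,1]*(E12 + E21) + S[1,1]*E22 for symmetric S
    return [S[0, 0], S[0, 1], S[1, 1]]

# Columns are the coordinate vectors of T'(B) = L^T B L
T_mat = sp.Matrix([coords(L.T * B * L) for B in basis]).T
print(sp.expand(T_mat.det() - L.det() ** 3))   # prints 0
```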