
Let $V$ be an $n$-dimensional vector space over a field $F$ and let $T$ be a linear operator on $V$. Assume that the characteristic of $F$ is not $2$.

Definition. Consider the map $f_1:V^n\to \Lambda^n V$ defined by $$f_1(v_1, \ldots, v_n)= \sum_{i=1}^n v_1\wedge \cdots \wedge v_{i-1}\wedge Tv_i\wedge v_{i+1} \wedge \cdots \wedge v_n$$ This is an alternating multilinear map and thus it induces a unique linear map $\Lambda^n V\to \Lambda^n V$. Since $\dim(\Lambda^n V)=1$, this linear map is multiplication by a constant, which we call the trace of $T$.
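(A quick numerical sanity check, not part of the original question: identifying vectors with coordinate columns, $v_1\wedge\cdots\wedge v_n$ corresponds to $\det[v_1|\cdots|v_n]$, so the defining identity $f_1(v_1,\ldots,v_n)=\operatorname{trace}(T)\,v_1\wedge\cdots\wedge v_n$ becomes an identity of determinants that NumPy can test.)

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
T = rng.standard_normal((n, n))   # matrix of the operator T
V = rng.standard_normal((n, n))   # columns are v_1, ..., v_n (a random basis)

# f_1(v_1, ..., v_n) in coordinates: sum of determinants where the
# i-th column v_i is replaced by T v_i
lhs = 0.0
for i in range(n):
    W = V.copy()
    W[:, i] = T @ V[:, i]
    lhs += np.linalg.det(W)

# By the definition above, this should equal trace(T) * det(V)
assert np.isclose(lhs, np.trace(T) * np.linalg.det(V))
```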

The above is standard, and it naturally calls for the following generalization; before stating it we fix some notation.

Given an $n$-tuple $(v_1, \ldots, v_n)$ of vectors in $V$ and an increasing $k$-tuple $I=(i_1, \ldots, i_k)$ of integers between $1$ and $n$, write $v_{I, j}$ to denote $Tv_j$ if $j$ appears in $I$ and simply $v_j$ if $j$ does not appear in $I$. Further write $v_I$ to denote $v_{I, 1}\wedge \cdots \wedge v_{I, n}$.

Definition. Let $f_k:V^n\to \Lambda^n V$ be defined as $$f_k(v_1, \ldots, v_n)= \sum_{I \text{ an increasing }k\text{-tuple}}v_I$$ Then $f_k$ is an alternating multilinear map and this induces a unique linear map $\Lambda^n V\to \Lambda^n V$. Again, this linear map is multiplication by a constant which we call the $k$-th trace of $T$ and denote it as $\text{trace}_k(T)$.

From this post I am convinced that the following is true.

Statement. $\text{trace}_k(T)= \text{trace}(\Lambda^k T)$.

I am unable to prove this.
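(Editorial addition, not part of the question: the Statement can be checked numerically. In coordinates, $\operatorname{trace}_k(T)$ is computed from the definition by replacing $k$ columns of a basis matrix by their images, while $\operatorname{trace}(\Lambda^k T)$ is the sum of the principal $k\times k$ minors of the matrix of $T$.)

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)
n, k = 5, 2
T = rng.standard_normal((n, n))
V = rng.standard_normal((n, n))   # columns v_1, ..., v_n form a random basis

# trace_k(T): the scalar by which f_k rescales v_1 ^ ... ^ v_n
trace_k = 0.0
for I in combinations(range(n), k):
    W = V.copy()
    for i in I:
        W[:, i] = T @ V[:, i]     # replace v_i by T v_i for i in I
    trace_k += np.linalg.det(W)
trace_k /= np.linalg.det(V)

# trace(Lambda^k T): sum of principal k x k minors of T
minors = sum(np.linalg.det(T[np.ix_(I, I)]) for I in combinations(range(n), k))

assert np.isclose(trace_k, minors)
```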

4 Answers


It is convenient to use the Hodge star to simplify the calculations. Choose a non-degenerate symmetric bilinear form $\left< \cdot, \cdot \right>$ on $V$ that admits an orthonormal basis (for example, the one whose matrix in some basis is the identity) and let $(e_1, \ldots, e_n)$ be an orthonormal basis with respect to the chosen bilinear form. We will use $\sum_{I}$ to denote summation over increasing multi-indices $I$ of size $k$. Thus,

$$ \mathrm{trace}(\Lambda^k T)(e_1 \wedge \cdots \wedge e_n) = \sum_{I} \left< (\Lambda^kT)(e_I), e_I \right> \left( e_1 \wedge \cdots \wedge e_n \right) = \sum_{I} \Lambda^k T(e_I) \wedge (*e_I) = \sum_{I \coprod J = [n]} \pm \left( \Lambda^k T(e_I) \wedge e_J \right) = \sum_{I \coprod J = [n]} \pm \left(Te_{i_1} \wedge \cdots \wedge Te_{i_k} \wedge e_{j_1} \wedge \cdots \wedge e_{j_{n-k}} \right) $$

where $J$ is the increasing multi-index such that $I \coprod J = [n]$ and we used the fact that $*e_I = \pm e_J$. A sign calculation using the definition of the Hodge star shows that the sign is in fact plus, so that

$$ \mathrm{trace}(\Lambda^k T)(e_1 \wedge \cdots \wedge e_n) = f_k(e_1, \cdots, e_n) $$

and thus $\mathrm{trace}(\Lambda^k T) = \mathrm{trace}_k(T)$.


One can also show this without using the Hodge star. Choose some basis $(e_1,\dots,e_n)$ for $V$. The expression for $f_k(e_1,\dots,e_n)$ is the sum of $n \choose k$ terms, where each term is obtained from $e_1 \wedge \dots \wedge e_n$ by choosing an increasing tuple $I = (i_1, \dots, i_k)$ and applying $T$ to each $e_{i_j}$ while leaving the rest of the vectors intact and in the same order. Let $J$ be the unique increasing tuple such that $I \coprod J = [n]$; then by reordering the vectors in the wedge product, we can write each term as

$$ (-1)^{\sigma(I)} Te_{i_1} \wedge \dots \wedge Te_{i_k} \wedge e_{j_1} \wedge \dots \wedge e_{j_{n-k}} = (-1)^{\sigma(I)} \Lambda^k(T)(e_I) \wedge e_J $$

where $(-1)^{\sigma(I)}$ is the sign that comes from the reordering. Now,

$$ \operatorname{trace}(f_k) = (e^1 \wedge \dots \wedge e^n)(f_k(e_1, \dots, e_n)) = (e^1 \wedge \dots \wedge e^n)\left( \sum_{I} (-1)^{\sigma(I)} \Lambda^k(T)(e_I) \wedge e_J \right) = \sum_{I} (-1)^{\sigma(I)} (-1)^{\sigma(I)} (e^I \wedge e^J)(\Lambda^k(T)(e_I) \wedge e_J) = \sum_{I} (e^I \wedge e^J)(\Lambda^k(T)(e_I) \wedge e_J). $$

Each $(e^I \wedge e^J)(\Lambda^k(T)(e_I) \wedge e_J)$ is the determinant of an upper triangular block matrix whose lower $(n-k) \times (n-k)$ block is the identity. The vanishing of the upper-right $k \times (n-k)$ block comes from "$e^I(e_J)$" while the fact that the lower $(n -k) \times (n-k)$ block is the identity comes from "$e^J(e_J)$". Hence,

$$ \operatorname{trace}(f_k) = \sum_{I} e^I(\Lambda^k(T)(e_I)) = \operatorname{trace}(\Lambda^k(T)). $$

levap
  • So we are using the fact that if $(e_1, \ldots, e_n)$ is an orthonormal basis of $V$ under the chosen bilinear form, then $(e_I: I\text{ an increasing } k\text{-tuple})$ is an orthonormal basis for the induced bilinear form on $\Lambda^k V$? This is why we can write $\text{trace}(\Lambda^k T)= \sum_I\langle(\Lambda^k T)e_I, e_I\rangle$. Am I right? – caffeinemachine Feb 03 '16 at 10:14
  • @caffeinemachine Yep. – levap Feb 03 '16 at 12:09
  • Is there a proof without using the Hodge star? Employing an arbitrarily-constructed scalar product seems too ad-hoc here. (Still, this is a great answer already, thank you!) – lisyarus Mar 02 '18 at 09:53
  • @lisyarus: Yeah, sure. I've added a proof along those lines. – levap Mar 05 '18 at 11:08

Here's another proof which uses mixed exterior algebra. The advantage is that it doesn't involve messy computations with bases, sign factors, determinants, etc. but the disadvantage is that it uses a bunch of other machinery. I summarize facts that are needed below, but for details see Chapter 6 of [1].

Background

As above let $V$ be $n$-dimensional, let $V^*$ be dual to $V$, and define $$\textstyle\bigwedge(V^*,V)=\bigwedge V^*\otimes\bigwedge V$$ to be the mixed exterior algebra over $V^*,V$, which is the canonical tensor product of the exterior algebras $\bigwedge V^*$ and $\bigwedge V$. The mixed exterior product (or "dot" product) satisfies $$(u^*\otimes u)\cdot(v^*\otimes v)=(u^*\wedge v^*)\otimes(u\wedge v)$$ for all $u^*,v^*\in\bigwedge V^*$ and $u,v\in\bigwedge V$. Importantly, this product is commutative in the diagonal subalgebra $$\Delta(V^*,V)=\bigoplus_{p=0}^n\Delta_p(V^*,V)\qquad\text{where}\qquad\Delta_p(V^*,V)=\textstyle\bigwedge^p V^*\otimes\bigwedge^p V$$ to which we restrict attention. For $z\in\Delta(V^*,V)$, we write $$z^k=\frac{1}{k!}\underbrace{z\cdots z}_{k\text{ factors}}$$

There is an inner product in $\Delta(V^*,V)$ induced by the scalar product between $V^*$ and $V$ which satisfies $$\langle u^*\otimes u,v^*\otimes v\rangle=\langle u^*,v\rangle\langle v^*,u\rangle$$

There is a canonical linear isomorphism $T:\Delta(V^*,V)\to H(\bigwedge V;\bigwedge V)$ to the space of homogeneous linear transformations of $\bigwedge V$ which satisfies $$T(u^*\otimes u)(v)=\langle u^*,v\rangle u$$ Making appropriate identifications under $T$, we write $$\textstyle H(\bigwedge V;\bigwedge V)=\displaystyle\bigoplus_{p=0}^n L(\textstyle\bigwedge^p V;\bigwedge^p V)$$ so in particular all linear transformations of $V$ and their exterior powers are in there. We make $T$ an algebra isomorphism by defining the (generalized) box product $$\alpha\mathbin{\square}\beta=T(T^{-1}\alpha\cdot T^{-1}\beta)$$ for homogeneous $\alpha,\beta:\bigwedge V\to\bigwedge V$. For $z\in V^*\otimes V$ and $\varphi=T(z):V\to V$, it can be shown that $$T(z^k)=\frac{1}{k!}T(\underbrace{z\cdots z}_{k\text{ factors}})=\frac{1}{k!}\underbrace{\varphi\mathbin{\square}\cdots\mathbin{\square}\varphi}_{k\text{ factors}}=\textstyle\bigwedge^k\varphi$$

$T$ is also an isometry from the inner product above to the trace form $\langle\alpha,\beta\rangle=\mathop{\mathrm{tr}}(\alpha\circ\beta)$. If we define the unit tensor $t=T^{-1}(\iota)$ where $\iota:V\to V$ is the identity map, then for $z\in\Delta_k(V^*,V)$ it follows that $$\langle t^k,z\rangle=\mathop{\mathrm{tr}}(T(z))$$

Finally, it can be shown that $$i(t^k)t^n=t^{n-k}$$ where for $z\in\Delta(V^*,V)$, $i(z)$ is the insertion of $z$, dual to exterior multiplication by $z$: $$\langle i(z)x,y\rangle=\langle x,z\cdot y\rangle=\langle x,y\cdot z\rangle$$

Solution

For $\varphi:V\to V$ with $z=T^{-1}(\varphi)$, we have $$\begin{align*} \textstyle\mathop{\mathrm{tr}}(\bigwedge^k\varphi)&=\langle t^k,z^k\rangle\\ &=\langle i(t^{n-k})t^n,z^k\rangle\\ &=\langle t^n,z^k\cdot t^{n-k}\rangle\\ &=\textstyle\mathop{\mathrm{tr}}(\bigwedge^k\varphi\mathbin{\square}\bigwedge^{n-k}\iota) \end{align*}$$

Let $\Phi=\bigwedge^k\varphi\mathbin{\square}\bigwedge^{n-k}\iota$. Then $\Phi:\bigwedge^n V\to\bigwedge^n V$, and analysis of the box product shows that $$\Phi(v_1\wedge\cdots\wedge v_n)=\frac{1}{k!\,(n-k)!}\sum_{\sigma\in S_n}\varphi_{\sigma(1)}v_1\wedge\cdots\wedge\varphi_{\sigma(n)}v_n\tag{1}$$ where $$\varphi_i=\begin{cases} \varphi&\text{if }1\le i\le k\\ \iota&\text{if }k<i\le n \end{cases}$$

A moment's thought shows that (1) is your desired sum, since for every $1\le i_1<\cdots<i_k\le n$, there will be precisely $k!\,(n-k)!$ terms of form $$v_1\wedge\cdots\wedge\varphi v_{i_1}\wedge\cdots\wedge\varphi v_{i_k}\wedge\cdots\wedge v_n$$ and every term is of one of these forms.
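(An editorial check of formula (1), not part of the original answer: in coordinates, the symmetrized sum over $S_n$ divided by $k!\,(n-k)!$ should agree with the sum $\sum_I v_I$ over increasing $k$-tuples from the question.)

```python
import numpy as np
from itertools import permutations, combinations
from math import factorial

rng = np.random.default_rng(2)
n, k = 4, 2
phi = rng.standard_normal((n, n))  # the operator phi
V = rng.standard_normal((n, n))    # columns v_1, ..., v_n

# phi_i = phi for i <= k, identity otherwise (0-indexed here)
ops = [phi] * k + [np.eye(n)] * (n - k)

# Left side of (1): average over all permutations sigma of which slots get phi
lhs = 0.0
for sigma in permutations(range(n)):
    W = np.column_stack([ops[sigma[j]] @ V[:, j] for j in range(n)])
    lhs += np.linalg.det(W)
lhs /= factorial(k) * factorial(n - k)

# Right side: f_k(v_1, ..., v_n) = sum over increasing k-tuples I of v_I
rhs = 0.0
for I in combinations(range(n), k):
    W = V.copy()
    for i in I:
        W[:, i] = phi @ V[:, i]
    rhs += np.linalg.det(W)

assert np.isclose(lhs, rhs)
```

Each set $I$ of $k$ slots is indeed hit by exactly $k!\,(n-k)!$ permutations, which is what the division accounts for.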

References

  1. Greub, W. Multilinear Algebra, 2nd ed. Springer, 1978.
blargoner

Late to the party, and not sure if this reply adds much. If anything, perhaps the assurance that the first exterior trace is just like the regular trace and, in fact, that the $k$-th exterior trace is also like a regular trace for $k>1$.

Suppose $\{a_1,\dots,a_n\}$ is a basis of $V$. Let $\omega\in\Lambda^n(V^*)$ be nonzero; then $$ x=\sum_{i=1}^n \frac{\omega(a_1\wedge\dots\wedge a_{i-1} \wedge x\wedge a_{i+1}\wedge\dots\wedge a_n)}{\omega(a_1\wedge\dots\wedge a_n)}a_i, $$ which is essentially an application of Cramer's rule. Defining
$$ e^i(x)=\frac{\omega(a_1\wedge\dots\wedge a_{i-1} \wedge x\wedge a_{i+1}\wedge\dots\wedge a_n)}{\omega(a_1\wedge\dots\wedge a_n)} $$ and $e_i=a_i$, we obtain $\{e^1,\dots, e^n\}$ as a basis for $V^*$, dual to $\{e_1,\dots, e_n\}$, i.e. $\langle e_i,e^j\rangle=\delta^j_i$. (Note: $\langle \cdot,\cdot\rangle:V\times V^*\to F$ is defined by $(x,f)\mapsto\langle x,f \rangle=f(x)$.) We can also write $$ e^i(x)=\frac{\det(a,i\to x)}{\det(a)}, $$ the quotient of the determinant of the matrix $a$ whose columns are the $a_i$, but with column $i$ replaced by $x$, and the determinant of $a$.

Given a basis and dual basis like this, the trace of an operator $A:V\to V$ is computed as $$ \text{tr}(A)=\sum_{1\leq i\leq n}\langle Ae_i,e^i\rangle $$ (Of course, $\text{tr}(A)$ famously does not depend on the chosen basis.)
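(Editorial addition: the basis-independence of this pairing formula is easy to check numerically. Below, the dual covectors $e^i$ are the rows of the inverse of the basis matrix, which is exactly the Cramer's-rule construction above in coordinates.)

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4
A = rng.standard_normal((n, n))
E = rng.standard_normal((n, n))   # columns e_1, ..., e_n: a random basis
E_dual = np.linalg.inv(E)         # rows are the dual covectors e^1, ..., e^n

# tr(A) = sum_i < A e_i, e^i >
tr = sum(E_dual[i] @ (A @ E[:, i]) for i in range(n))

assert np.isclose(tr, np.trace(A))
```

This is just $\operatorname{tr}(E^{-1}AE)=\operatorname{tr}(A)$ written out entrywise.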

If we have $A:V\to V$, then $A$ also induces linear operators on the higher exterior powers via $$ (\Lambda^k A)(u_1\wedge \dots \wedge u_k)=A u_1\wedge \dots \wedge A u_k, $$ and those induced linear operators have traces too. To compute their traces we need a dual basis. This is reasonably straightforward. For example, we get a basis for $\Lambda^2(V^*)$, dual to $(e_i\wedge e_j)_{1\leq i<j\leq n}$, in this way: $$ (e^i\wedge e^j)(x,y)=\frac{\det(a,i\to x,j\to y)}{\det(a)}, $$ and $$ \text{tr}_2(A)=\sum_{1\leq i<j\leq n}\langle (\Lambda^2 A)(e_i\wedge e_j),e^i\wedge e^j\rangle. $$ Again, we sum the contributions of all basis vectors, this time with a double index.
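(Editorial addition, not part of the answer: the double-index formula for $\text{tr}_2$ can be made concrete. Each pairing $\langle (\Lambda^2 A)(e_i\wedge e_j), e^i\wedge e^j\rangle$ is a $2\times 2$ determinant of pairings, and the total is basis-independent, equal to the sum of principal $2\times 2$ minors of $A$.)

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(4)
n = 4
A = rng.standard_normal((n, n))
E = rng.standard_normal((n, n))   # columns e_1, ..., e_n: a random basis
E_dual = np.linalg.inv(E)         # rows are the dual covectors e^1, ..., e^n

# tr_2(A) = sum_{i<j} det [[e^i(A e_i), e^i(A e_j)], [e^j(A e_i), e^j(A e_j)]]
tr2 = 0.0
for i, j in combinations(range(n), 2):
    M = np.array([[E_dual[i] @ A @ E[:, i], E_dual[i] @ A @ E[:, j]],
                  [E_dual[j] @ A @ E[:, i], E_dual[j] @ A @ E[:, j]]])
    tr2 += np.linalg.det(M)

# Basis-independent reference: sum of principal 2x2 minors of A
ref = sum(np.linalg.det(A[np.ix_(I, I)]) for I in combinations(range(n), 2))

assert np.isclose(tr2, ref)
```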

Next consider $f_k(v_1, \dots, v_n)$ acted upon by an $n$-covector $\omega$. We can express this as $$ \langle f_k(v_1,\dots,v_n),\omega\rangle=\langle v_1\wedge \dots\wedge v_n,\omega\rangle\sum_{1\leq i_1<\dots<i_k\leq n}\frac {\det(v,i_1\to Av_{i_1},\dots,i_k\to Av_{i_k})}{\det(v)}. $$ With $$ \text{tr}_k(A)=\sum_{1\leq i_1<\dots<i_k\leq n}\langle (\Lambda^k A)(e_{i_1}\wedge\dots\wedge e_{i_k}),e^{i_1}\wedge\dots\wedge e^{i_k}\rangle $$ and using that $\Lambda^n(V)$ is one-dimensional, we conclude that $$ f_k(v_1,\dots,v_n)=\text{tr}_k(A)\,v_1\wedge \dots \wedge v_n. $$ Hence $\text{trace}_k(A)=\text{tr}_k(A)$.

(Sorry about using different symbols like $A$ instead of $T$, $\text{tr}_k(A)$ instead of $\text{trace}(\Lambda^k A)$. I will try to be good next time.)

awsnap
  • @amWhy, why should that be done?! I've answered tons of questions that had answers — even accepted answers (even answers of mine that had been accepted!) — and I have never ever been asked to do that, let alone done that. – Mariano Suárez-Álvarez Dec 16 '22 at 00:34
  • @amWhy, Yes good point. Not quite sure how things work around here. Right at the top of my reply I explained a potential improvement. – awsnap Dec 16 '22 at 00:40
  • @awsnap Thank you for joining the party (better late than never!). Even adding the same answer written differently (in my opinion) adds to the discussion, so there is no need to explain how this is an improvement over the previous answer. – caffeinemachine Dec 16 '22 at 07:19

Here's another way to see this:

Let $e_1,\ldots,e_n$ be a basis of $V$ and $e^{*1},\ldots,e^{*n}$ the dual basis of the dual space $V^*$. Then $\langle e^{*j},e_i\rangle=\delta^j_i$, so

$$\begin{align*} \mathrm{trace}_k(T)&=\mathrm{trace}_k(T)\det\langle e^{*j},e_i\rangle\\ &=\mathrm{trace}_k(T)\langle e^{*1}\wedge\cdots\wedge e^{*n},e_1\wedge\cdots\wedge e_n\rangle\\ &=\langle e^{*1}\wedge\cdots\wedge e^{*n},f_k(e_1,\ldots,e_n)\rangle\\ &=\sum_I\langle e^{*1}\wedge\cdots\wedge e^{*n},e_1\wedge\cdots\wedge Te_{I_1}\wedge\cdots\wedge Te_{I_k}\wedge\cdots\wedge e_n\rangle\tag{1} \end{align*}$$

Now for a given increasing $k$-tuple $I$, the summand in (1) is the determinant $$\begin{vmatrix} 1&&\\ \vdots&\ddots&\\ \langle e^{*1},Te_{I_1}\rangle&\cdots&\langle e^{*n},Te_{I_1}\rangle\\ \vdots&\ddots&\vdots\\ \langle e^{*1},Te_{I_k}\rangle&\cdots&\langle e^{*n},Te_{I_k}\rangle\\ &\ddots&\vdots\\ &&1 \end{vmatrix}\tag{2}$$ By the generalized Laplace expansion along the rows indexed by $I$, (2) expands to the single principal $k\times k$ minor $\det\left(\langle e^{*I_j},Te_{I_l}\rangle\right)_{1\le j,l\le k}$. But that's clearly just the $I$-th principal $k\times k$ minor of the matrix of $T$ with respect to the basis $e_1,\ldots,e_n$, which is just the $I$-th diagonal entry of the matrix of $\bigwedge^k T$ with respect to the induced basis of $\bigwedge^k V$. Substituting into (1) we have $$\mathrm{trace}_k(T)=\sum_I[\textstyle\bigwedge^k T]_{II}=\mathrm{tr}(\bigwedge^k T)$$
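(Editorial check, not part of the answer: the Laplace-expansion step says that the determinant (2), which is the identity matrix with the rows indexed by $I$ replaced by the corresponding rows of $T$, equals the principal minor $\det T[I,I]$. This is easy to confirm numerically.)

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(5)
n, k = 5, 3
T = rng.standard_normal((n, n))

for I in combinations(range(n), k):
    # The matrix in (2): identity rows outside I, rows of T on the rows in I
    M = np.eye(n)
    M[list(I), :] = T[list(I), :]
    # Laplace expansion along the rows in I leaves the principal minor T[I, I]
    assert np.isclose(np.linalg.det(M), np.linalg.det(T[np.ix_(I, I)]))
```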

blargoner