
Suppose we have an $\mathbb{R}$-vector space $E$, $\text{dim}(E)=n$, and two bases $\alpha:=\{v_i\}$ and $\beta:=\{w_i\}$ of it.

We can consider the maps to $\mathbb{R}^n$ given by the coordinates $v\mapsto [v]_\alpha$ and $v\mapsto[v]_\beta$. Let $I:E\rightarrow E$ be the identity function and let $[I]_{\alpha}^{\beta}$ be the matrix of $I$ in the bases $\alpha$ and $\beta$. We then have, from the construction of the matrix of a linear transformation, that $$[v]_{\beta}=[I]_{\alpha}^{\beta}\cdot[v]_{\alpha}.$$

This matrix $[I]_{\alpha}^{\beta}$ is often called the matrix of change of coordinates (from $\alpha$ to $\beta$), and sometimes the matrix of change of basis (from $\alpha$ to $\beta$). But let us call it here only the matrix of change of coordinates (from $\alpha$ to $\beta$), since that is what it does.

Consider now $T:E\rightarrow E$, the linear transformation defined by $T(v_i):=w_i$ for all $i$. This linear transformation sends the vectors of the basis $\alpha$ to the vectors of the basis $\beta$. Let us call it the transformation of change of basis (because that is what it does).

Assume from this point on that $E=\mathbb{R}^n$. Then we have the standard basis $e:=\{e_i\}$, which has the property that coordinates in that basis coincide with the vectors themselves, i.e. $[v]_e=v$ for every $v\in E=\mathbb{R}^n$. Therefore the matrix $[T]_{e}^{e}$ of $T$ in the basis $e$ satisfies

$$w_i=[w_i]_e=[T(v_i)]_e=[T]_{e}^{e}\cdot[v_i]_e=[T]_{e}^{e}v_i$$

Therefore, this matrix $[T]_{e}^{e}$ sends the vectors of the basis $\alpha$ to the vectors of the basis $\beta$. We could call it the matrix of change of basis, since that is what it does.

Questions:

  1. What is the relationship between $[T]_{e}^{e}$ and $[I]_{\alpha}^{\beta}$?
  2. What is $[T]_{e}^{e}$ usually called?

Example:

Assume that $E=\mathbb{R}^2$, $\alpha=\{v_1=\begin{bmatrix}1\\1\end{bmatrix}, v_2=\begin{bmatrix}2\\1\end{bmatrix}\}$, and $\beta=\{w_1=\begin{bmatrix}2\\2\end{bmatrix}, w_2=\begin{bmatrix}3\\2\end{bmatrix}\}$.

Then we have

$$\begin{align}v_1&=\phantom{-}\frac{1}{2}w_1+0w_2\\v_2&=-\frac{1}{2}w_1+1w_2\end{align}$$

Therefore

$$[I]_{\alpha}^{\beta}=\begin{bmatrix}\frac{1}{2}&-\frac{1}{2}\\0&\phantom{-}1\end{bmatrix}$$
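As a quick numerical check of this matrix (a small Python sketch with exact fractions; `matvec` is an ad-hoc helper, not part of the question): for $v=v_1+v_2=(3,2)^t$ we have $[v]_\alpha=(1,1)^t$, and the matrix should produce $[v]_\beta$.

```python
from fractions import Fraction as F

# Change-of-coordinates matrix from alpha to beta, as computed above.
I_ab = [[F(1, 2), F(-1, 2)],
        [F(0),    F(1)]]

def matvec(M, x):
    """Multiply a 2x2 matrix by a 2-vector."""
    return [M[0][0]*x[0] + M[0][1]*x[1],
            M[1][0]*x[0] + M[1][1]*x[1]]

# v = v1 + v2 = (3, 2), so [v]_alpha = (1, 1).
v_alpha = [F(1), F(1)]
v_beta = matvec(I_ab, v_alpha)
print(v_beta)  # equals [0, 1]: indeed v = 0*w1 + 1*w2 = (3, 2)
```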

To compute $[T]_{e}^{e}$ we can evaluate $T(e_1)$ and $T(e_2)$, expand them in the basis $e$ and put the coefficients as columns in a matrix. Since $$\begin{align}e_1&=-1v_1+1v_2\\e_2&=2v_1-v_2\end{align}$$

we get that $$\begin{align}T(e_1)&=T(-v_1+v_2)&=-w_1+w_2\\T(e_2)&=T(2v_1-v_2)&=2w_1-w_2\end{align}$$

This tells us that

$$[T]_{e}^{e}=\begin{bmatrix}1&1\\0&2\end{bmatrix}.$$

Notice that when we compute the inverse of $[I]_{\alpha}^{\beta}$ in this case, we don't get $[T]_{e}^{e}$, as this answer seems to imply. We get

$$([I]_{\alpha}^{\beta})^{-1}=\begin{bmatrix}2&1\\0&1\end{bmatrix}$$
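This can be confirmed with a short Python check (exact arithmetic via fractions; `inv2` is an ad-hoc helper for $2\times 2$ inverses):

```python
from fractions import Fraction as F

def inv2(M):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    a, b = M[0]; c, d = M[1]
    det = a*d - b*c
    return [[ d/det, -b/det],
            [-c/det,  a/det]]

I_ab = [[F(1, 2), F(-1, 2)], [F(0), F(1)]]  # change of coordinates
T_ee = [[F(1), F(1)], [F(0), F(2)]]         # change of basis

print(inv2(I_ab))           # equals [[2, 1], [0, 1]], as above
print(inv2(I_ab) == T_ee)   # prints False: they are different matrices
```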

1 Answer


(This is a much-revised answer).

Doing my best to follow your notation, I'm going to say that $$ [s]_B $$ is the vector of coefficients $(c_1, \ldots, c_n)^t$ with the property that $c_1 b_1 + \ldots + c_n b_n = s$, where $B$ is the basis whose vectors are $b_1, b_2, \ldots$.

I'm going to write $e_i$ for the $i$th standard basis vector in $\mathbb{R}^n$ -- the one with all entries 0 except the $i$th, which is 1. This unfortunately means that $$ [e_i]_E = e_i $$ where $E = e_1, e_2, \ldots$ is the standard basis.

Let $V$ be a matrix such that $$ V[e_i]_E = [v_i]_E $$

i.e., the $i$th column of $V$ contains the standard coordinates of $v_i$. In your example, $$ V = \begin{bmatrix} 1 & 2 \\ 1 & 1\end{bmatrix}. $$

Define $W$ analogously.

Then we have \begin{align} [e_i]_E &= V^{-1} [v_i]_E \text{ (1) }\\ [e_i]_E &= W^{-1} [w_i]_E \text{ (2) }\\ V[e_i]_E &= [v_i]_E \text{ (3) }\\ W[e_i]_E &= [w_i]_E \text{ (4) }\\ \end{align}

Your definition of $T_e^e$ is, in this notation, that it's the matrix such that \begin{align} [w_i]_E &= T_e^e [v_i]_E \\ W[e_i]_E &= T_e^e [v_i]_E \text{, by eq 4}\\ W[e_i]_E &= T_e^e V[e_i]_E \text{, by eq 3} \end{align} Since this holds for each $i$, we can laminate together all the $[e_i]_E$ into the identity matrix to get \begin{align} W I &= T_e^e VI \text{, so that}\\ W V^{-1} &= T_e^e. \end{align}
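As a sanity check of this identity against the question's example (a small Python sketch; `matmul` and `inv2` are ad-hoc $2\times 2$ helpers, not library calls):

```python
from fractions import Fraction as F

def matmul(A, B):
    """Product of two 2x2 matrices."""
    return [[sum(A[i][k]*B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv2(M):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    a, b = M[0]; c, d = M[1]
    det = a*d - b*c
    return [[ d/det, -b/det],
            [-c/det,  a/det]]

V = [[F(1), F(2)], [F(1), F(1)]]  # columns are v1, v2
W = [[F(2), F(3)], [F(2), F(2)]]  # columns are w1, w2

T_ee = matmul(W, inv2(V))
print(T_ee)  # equals [[1, 1], [0, 2]], matching the hand computation
```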

$$\newcommand{\Iab}{I_\alpha^\beta}$$ Now let's look at $\Iab$.

First, look at $v = v_i$, the $i$th vector of the first basis. In the first basis, its coordinates are just $(1, 0, 0, \ldots)^t$, i.e., $$ [v_i]_\alpha = [e_i]_E $$ That's a statement about equality of coordinate representations.

Second, according to equation (1) above, we have \begin{align} [e_i]_E &= V^{-1} [v_i]_E \end{align}

Combining, we get \begin{align} [v_i]_\alpha &= V^{-1} [v_i]_E \end{align} We can apply this to a linear combination of the $v_i$ to get \begin{align} [v]_\alpha &= V^{-1} [v]_E \text{ (5) } \end{align} and correspondingly, \begin{align} [v]_\beta &= W^{-1} [v]_E \text{ (6) } \end{align}

Now it all falls into place. With substitution, we get

\begin{align} [v]_\beta &= \Iab [v]_\alpha \text{, from the definition of $\Iab$}\\ W^{-1} [v]_E &= \Iab [v]_\alpha \text{, applying Eq. (6)}\\ W^{-1} [v]_E &= \Iab V^{-1} [v]_E \text{, applying Eq. (5)}\\ W^{-1} &= \Iab V^{-1} \text{, because prev. eqn is true for all $v$}\\ W^{-1}V &= \Iab \end{align}

Here's a MATLAB script verifying that these computations lead to the result you got when you worked it out by hand:

% Basis vectors from the example in the question
v1 = [1;1]
v2 = [2;1]
w1 = [2;2]
w2 = [3;2]
V = [v1, v2]        % columns are the alpha basis
W = [w1, w2]        % columns are the beta basis
Tee = W * inv(V)    % change-of-basis matrix [T]_e^e
Iab = inv(W) * V    % change-of-coordinates matrix [I]_alpha^beta

Summary: $\Iab = W^{-1} V$; $T_e^e = W V^{-1}$.

And just to make the comment-stream below continue to make sense, I'll suggest that putting the $\alpha$ up and the $\beta$ down leads to a nice situation in which "up" and "down" indices appear to cancel.

For your second question ("what is $T_e^e$ usually called?") I'd say "The matrix of the transformation taking the $v$s to the $w$s," at least in the case where the $v$s and $w$s are in $\mathbb{R}^n$. That's the term I've heard used most often, anyhow. It may be that a poll of mathematicians would reveal something different.

One last thought: when one of the two bases actually is the standard basis, then either $W$ or $V$ is the identity, and the two matrices $\Iab$ and $T_e^e$ end up being inverses.

John Hughes
  • I don't like Einstein's notation. –  Feb 09 '14 at 04:22
  • OK. I don't always, either. Does your notation attach any semantic meaning to "up" and "down", or did you just put one up and one down to avoid having both in the same place? (which seems perfectly reasonable to me) – John Hughes Feb 09 '14 at 04:28
  • If putting them in one position has a meaning, putting them in the opposite position, necessarily, has a meaning too. –  Feb 09 '14 at 04:36
  • Good. That's what I got. –  Feb 09 '14 at 15:35