
Rotation matrices in $\mathbb R^3$ are given by $$S = \begin{pmatrix} \hat e'_1\cdot\hat e_1 & \hat e'_1\cdot\hat e_2 & \hat e'_1\cdot\hat e_3 \\ \hat e'_2\cdot\hat e_1 & \hat e'_2\cdot\hat e_2 & \hat e'_2\cdot\hat e_3 \\ \hat e'_3\cdot\hat e_1 & \hat e'_3\cdot\hat e_2 & \hat e'_3\cdot\hat e_3 \end{pmatrix} $$ where $\{\hat e_1, \hat e_2, \hat e_3\}$ are orthogonal unit vectors in the original space and $\{\hat e'_1, \hat e'_2, \hat e'_3\}$ are orthogonal unit vectors in the rotated space.
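As a concrete sanity check of this definition (my own sketch, not from the question; the angle $\theta$ and the choice of a rotation about the $z$-axis are arbitrary), one can build $S$ entry by entry from the dot products and verify that the result is orthogonal:

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

theta = 0.7  # rotation angle about the z-axis (arbitrary sample value)

# Original orthonormal basis of R^3.
e = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)]
# Rotated orthonormal basis: each e'_mu is e_mu rotated by theta about the z-axis.
ep = [(math.cos(theta), math.sin(theta), 0.0),
      (-math.sin(theta), math.cos(theta), 0.0),
      (0.0, 0.0, 1.0)]

# S_{mu nu} = e'_mu . e_nu
S = [[dot(ep[m], e[n]) for n in range(3)] for m in range(3)]

# Orthogonality check: the rows of S are orthonormal, i.e. S S^T = I.
for i in range(3):
    for j in range(3):
        expected = 1.0 if i == j else 0.0
        assert abs(dot(S[i], S[j]) - expected) < 1e-12
```

The rows of $S$ are orthonormal precisely because the $\hat e'_\mu$ are; this is why $S^{-1} = S^T$.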

I quote the following reasoning from a textbook that I couldn't understand.

It is useful to make one observation about the elements of $S$, namely $S_{\mu\nu} = \hat e'_\mu\cdot\hat e_\nu$. This dot product is the projection of $\hat e'_\mu$ onto the $\hat e_\nu$ direction, and is therefore the change in $x_\nu$ that is produced by a unit change in $x'_\mu$. Since the relation between the coordinates is linear, we can identify $\hat e'_\mu\cdot\hat e_\nu$ as $\frac{\partial x_\nu}{\partial x'_\mu}$, so our transformation matrix $S$ can be written in the alternate form

$$S = \begin{pmatrix} \frac{\partial x_1}{\partial x'_1} & \frac{\partial x_2}{\partial x'_1} & \frac{\partial x_3}{\partial x'_1} \\ \frac{\partial x_1}{\partial x'_2} & \frac{\partial x_2}{\partial x'_2} & \frac{\partial x_3}{\partial x'_2} \\ \frac{\partial x_1}{\partial x'_3} & \frac{\partial x_2}{\partial x'_3} & \frac{\partial x_3}{\partial x'_3} \end{pmatrix} $$

I cannot absorb that a dot product is the same as a partial derivative. Any help will be appreciated.
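To make the identification $S_{\mu\nu} = \partial x_\nu/\partial x'_\mu$ concrete, here is a numerical sketch (my own illustration; the rotation angle and test point are arbitrary choices). Since $S$ is orthogonal, the inverse coordinate transform is $x_\nu = \sum_\mu S_{\mu\nu}\, x'_\mu$, and a finite-difference quotient recovers each entry of $S$:

```python
import math

theta = 0.7  # rotation angle about the z-axis (arbitrary sample value)
c, s = math.cos(theta), math.sin(theta)
# S_{mu nu} = e'_mu . e_nu for a rotation about the z-axis.
S = [[c, s, 0.0], [-s, c, 0.0], [0.0, 0.0, 1.0]]

def x_from_xp(xp):
    # Inverse transform: x_nu = sum_mu S_{mu nu} x'_mu  (S orthogonal, S^-1 = S^T)
    return [sum(S[m][n] * xp[m] for m in range(3)) for n in range(3)]

xp0 = [0.3, -1.2, 0.5]  # arbitrary point in the primed coordinates
h = 1e-6
for mu in range(3):
    xp = list(xp0)
    xp[mu] += h  # unit change in x'_mu, scaled by h
    dx = [(a - b) / h for a, b in zip(x_from_xp(xp), x_from_xp(xp0))]
    for nu in range(3):
        # dx_nu / dx'_mu matches S_{mu nu} (exactly, since the map is linear)
        assert abs(dx[nu] - S[mu][nu]) < 1e-6
```

Because the transformation is linear, the difference quotient equals the partial derivative exactly (up to floating-point error), which is precisely the textbook's point.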

levitt
  • 359
  • Note that as defined your change of basis matrix may be orthogonal but not proper orthogonal, i.e. it is not a rotation matrix, but a reflection instead (determinant = -1). – user_of_math Sep 20 '14 at 08:13
  • What does x in the question stand for? – Apoorv Mishra Jun 30 '23 at 18:13
  • @ApoorvMishra $x\mapsto x'$ is an injective and differentiable map from $\mathbb R^3$ to (a subset of) $\mathbb R^3.$ We call that a coordinate transformation. If you write $x'=T(x)$ and use Travis Willse's answer it should all become clear. – Kurt G. Jul 01 '23 at 08:47
  • The notations used by Willse are a little complex for me to understand. Also, the explanation given by them is mathematically correct, but it doesn't give an intuition for why it is true. And in the question itself, what do $x_\nu$ and $x'_\mu$ represent? – Apoorv Mishra Jul 01 '23 at 15:22

2 Answers


For $\mathbb R^2$ it is easy to visualize, and you can extend it to $\mathbb R^3$. The thing to note is that the projection of one unit vector onto another unit vector is the cosine of the angle between them.

$\hat e'_\nu \cdot \hat e_\mu = |\hat e'_\nu|\,|\hat e_\mu|\cos\theta = \cos\theta$, where $\theta$ is the angle between the unit vectors $\hat e'_\nu$ and $\hat e_\mu$.

Now, refer to the image attached.

The partial derivatives come into play when we do something similar in $R^3$.
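As a concrete sketch of the cosine picture (my own check; the angle $\phi$ is an arbitrary choice), for a rotation by $\phi$ in the plane each dot product equals the cosine of the geometrically known angle between the two basis vectors:

```python
import math

phi = 0.4  # rotation angle in the plane (arbitrary sample value)

# Original orthonormal basis of R^2 and its rotation by phi.
e1, e2 = (1.0, 0.0), (0.0, 1.0)
ep1 = (math.cos(phi), math.sin(phi))
ep2 = (-math.sin(phi), math.cos(phi))

def dot(u, v):
    return u[0] * v[0] + u[1] * v[1]

# Each entry of S is a dot product of unit vectors, i.e. the cosine of the
# angle between them; those angles are known from the geometry of the rotation.
assert abs(dot(ep1, e1) - math.cos(phi)) < 1e-12              # angle phi
assert abs(dot(ep1, e2) - math.cos(math.pi/2 - phi)) < 1e-12  # angle pi/2 - phi
assert abs(dot(ep2, e1) - math.cos(math.pi/2 + phi)) < 1e-12  # angle pi/2 + phi
assert abs(dot(ep2, e2) - math.cos(phi)) < 1e-12              # angle phi
```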

Prakhar
  • 137
  • Your answer does explain a simplified case but it would be helpful if you can suggest some reading on this topic that describes it briefly. – Apoorv Mishra Jul 03 '23 at 04:10
  • @ApoorvMishra You seem quite hard to please. This user has made a big effort to show what is going on. There is not much you have to read on this topic. You should extend (as suggested) to $\mathbb R^3$ to learn those things actively. – Kurt G. Jul 03 '23 at 06:55
  • I get it. It's just that I never came across such a form of rotation matrix anywhere before and in the book that has been quoted in the question, the idea is further extended to explain covariant and contravariant tensors using the same form of transformation matrix. This complicates the text even further and so I was asking for a reading on the topic. – Apoorv Mishra Jul 03 '23 at 08:11
  • @ApoorvMishra In this answer I derive from scratch (without looking at any literature) the covariant transformation rule for basis vector fields. Please study that. Hopefully you find it simple some day. The contravariant rules for the vector components are a consequence of the covariant rules by assuming that the vector is the same no matter in which coordinate system we express it. – Kurt G. Jul 03 '23 at 13:07

For any $m \times n$ matrix $A$, the coordinate representation of the linear transformation $$T_A:x \mapsto A x$$ is $$T(x)_i = (Ax)_i = \sum_{k = 1}^n A_{ik} x_k, \qquad 1 \leq i \leq m.$$ In other words, the $i$th entry of the transformed vector $Ax$ is just the usual dot product of the $i$th row of $A$ (regarded as a vector) with $x$.

On the other hand, differentiating the above coordinate formula shows that the partial derivative of $T(x)_i$ is $$\frac{\partial T(x)_i}{\partial x_j} = \frac{\partial}{\partial x_j}\left(\sum_{k = 1}^n A_{ik} x_k\right) = A_{ij}.$$
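A quick numerical sketch of this fact (my own illustration; the matrix $A$ and test point are arbitrary choices): because $T_A$ is linear, a finite-difference quotient recovers $A_{ij}$ up to floating-point error:

```python
A = [[2.0, -1.0, 0.5],
     [0.0, 3.0, 1.0]]  # arbitrary 2x3 matrix

def T(x):
    # (Ax)_i = sum_k A_{ik} x_k : dot product of row i of A with x
    return [sum(A[i][k] * x[k] for k in range(3)) for i in range(2)]

x0 = [0.1, -0.7, 2.0]  # arbitrary point
h = 1e-6
for j in range(3):
    x = list(x0)
    x[j] += h
    d = [(a - b) / h for a, b in zip(T(x), T(x0))]
    for i in range(2):
        # dT(x)_i / dx_j matches A_{ij}
        assert abs(d[i] - A[i][j]) < 1e-6
```

For a rotation, $A = S^T$ maps primed to unprimed coordinates, so this is exactly the identification $\partial x_\nu/\partial x'_\mu = S_{\mu\nu}$ from the question.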

Travis Willse
  • 108,056