
I would like to rotate a multidimensional vector $v \in \mathbb{R}^d$ in a random direction. I have two problems here.

  1. How do I define a direction in a multidimensional space?
  2. How do I rotate the vector in the given direction?

To make it a bit more clear, I illustrate the logic for the d=2 and d=3 cases:

[Figure: sketch of the rotation by angle $\alpha$ for the d=2 and d=3 cases]

In the d=3 case, the blue circle indicates all possible vectors that are $\alpha$ away from the vector $v$.

Assume that I am writing a program and I want to call a function such as v = rotateIt(v, direction, alpha), which returns the rotated vector. I am having difficulty defining the direction for an arbitrary dimension d, and also rotating the vector towards the given direction by the angle alpha. Can anybody give me a solution or direct me to some literature?

Thanks for the help.

3 Answers


General rotations (linear transformations preserving length and orientation) in higher dimensions get tricky:

  • In 2 dimensions, a rotation is completely determined by an angle.
  • In 3 dimensions, a rotation always has an axis (a line that is unmoved by the rotation), so is determined by a rotation in the plane orthogonal to the axis.
  • In higher dimensions things get more complicated, and there is no well-defined angle of rotation. (For example, a general rotation in 4 dimensions has 2 orthogonal invariant planes, each with its own angle of rotation.)

Luckily, you are not asking for a rotation of the entire space, but of a single vector $\mathbf{v}$. One way to think about it is to pick a random plane containing $\mathbf{v}$, and apply a rotation in that plane. Here is a way to do that:

  1. Choose a random vector $\mathbf{w}$ orthogonal to $\mathbf{v}$, with the same length. The plane of rotation will be the one containing $\mathbf{v}$ and $\mathbf{w}$. For example, one may start with a random vector $\mathbf{u}$ that is not parallel to $\mathbf{v}$, project it onto the hyperplane orthogonal to $\mathbf{v}$, and rescale it to have the same length; explicitly: $$ \mathbf{w}' = \mathbf{u} - \frac{\mathbf{v} \cdot \mathbf{u}}{|\mathbf{v}|^2} \mathbf{v}; \qquad \mathbf{w} = \frac{|\mathbf{v}|}{|\mathbf{w}'|} \mathbf{w}'. $$ (This is essentially Gram-Schmidt applied to $\mathbf{v}$ and $\mathbf{u}$, except with scaling.)
  2. Given a rotation angle $\alpha$, the result should be $$ \mathbf{v}' = \cos(\alpha) \mathbf{v} + \sin(\alpha) \mathbf{w}.$$ This will have the same length as $\mathbf{v}$, and the angle between $\mathbf{v}$ and $\mathbf{v}'$ will be $\alpha$.
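Here is a minimal NumPy sketch of these two steps (the function name `rotate_random`, the RNG handling, and the retry when $\mathbf{u}$ comes out nearly parallel to $\mathbf{v}$ are my own choices, not part of the recipe above):

```python
import numpy as np

def rotate_random(v, alpha, rng=None):
    """Rotate v by angle alpha within a randomly chosen plane containing v."""
    rng = np.random.default_rng(rng)
    v = np.asarray(v, dtype=float)

    # Step 1: random u, projected onto the hyperplane orthogonal to v,
    # then rescaled to the length of v (Gram-Schmidt with scaling).
    while True:
        u = rng.standard_normal(v.shape)
        w = u - (v @ u) / (v @ v) * v
        norm_w = np.linalg.norm(w)
        if norm_w > 1e-12:      # retry in the unlikely case u is (almost) parallel to v
            break
    w *= np.linalg.norm(v) / norm_w

    # Step 2: rotate in the plane spanned by v and w.
    return np.cos(alpha) * v + np.sin(alpha) * w

# Quick check: the result keeps the length of v, and the angle between them is alpha.
v = np.array([1.0, 2.0, 3.0, 4.0])
v2 = rotate_random(v, alpha=0.3)
print(np.linalg.norm(v), np.linalg.norm(v2))                          # equal lengths
print(np.arccos(v @ v2 / (np.linalg.norm(v) * np.linalg.norm(v2))))   # ~0.3
```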
arkeet
  1. A unit vector can be used to define directions.
  2. Multiplying by a rotation matrix is a way to rotate a vector (see the sketch below the links).

Two useful links:

Rotation matrix - Wiki

Finding the rotation matrix in n-dimensions
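For illustration, a rotation matrix acting in a single coordinate plane of $\mathbb{R}^d$ (a Givens rotation) can be built and applied by matrix multiplication; a small NumPy sketch, with the helper name `givens_rotation` being my own choice:

```python
import numpy as np

def givens_rotation(d, i, j, alpha):
    """d x d rotation matrix that rotates by angle alpha in the (i, j)-coordinate plane."""
    R = np.eye(d)
    R[i, i] = R[j, j] = np.cos(alpha)
    R[i, j] = -np.sin(alpha)
    R[j, i] = np.sin(alpha)
    return R

R = givens_rotation(4, 0, 1, np.pi / 6)
v = np.array([1.0, 0.0, 2.0, 0.0])
print(R @ v)                                      # v rotated by 30 degrees in that plane
print(np.linalg.norm(R @ v), np.linalg.norm(v))   # length is preserved
```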

Andy

A rotation in $(2n+1)$ dimensions has the real eigenvalue $+1$; its eigenvector spans an invariant one-dimensional axis. Except in dimension 3, where a single angle determines the rotation matrix in the 2d-plane orthogonal to the axis, in higher dimensions the $2n$-dimensional subspace orthogonal to that invariant axis carries the full complexity of the rotation.

So let $2n$ denote the even-dimensional case. There is a set of antisymmetric unit matrices $$(L_{(i,k)})_{m,n} = \delta_{m,k}\,\delta_{n,i}-\delta_{m,i}\,\delta_{n,k},$$

e.g. $$L_{1,2}\ = \ \left( \begin{array}{cccc} 0 & -1 & 0 & 0 \\ 1 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ \end{array} \right) , \quad L_{1,2}^2 \ = \ \ \left( \begin{array}{cccc} -1 & 0 & 0 & 0 \\ 0 & -1 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ \end{array} \right) $$

Any such matrix generates a rotation matrix $O_{(i,k)}(\alpha_{ik})$ that leaves all directions invariant except those in the 2d-plane $(x_i,x_k)$: $$\big(e^{\alpha_{ik} L_{(i,k)}}\big)_{mn}= \delta_{mn} + (\cos (\alpha_{ik}) -1)\left(\delta_{im}\,\delta_{in}+\delta_{km}\,\delta_{kn}\right) + \sin (\alpha_{ik})\, (L_{(i,k)})_{mn} $$

because the even powers of $L_{(i,k)}$ alternate between $\pm$ the diagonal projector onto that plane, which has only two nonzero entries:

$$ \left(\left(L_{(i,k)}\right)^2\right)_{m n} \ = \ -\left(\delta_{im}\,\delta_{in}+\delta_{km}\,\delta_{kn}\right) $$
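A small numerical check of this closed form against SciPy's matrix exponential (the helper names below are my own, assuming the generator convention of the $4\times 4$ example above):

```python
import numpy as np
from scipy.linalg import expm

def generator(d, i, k):
    """Antisymmetric unit matrix L_(i,k), following the 4x4 example above."""
    L = np.zeros((d, d))
    L[i, k] = -1.0
    L[k, i] = 1.0
    return L

d, i, k, alpha = 4, 0, 1, 0.7
L = generator(d, i, k)

# Diagonal projector onto the (i, k)-plane, i.e. -L @ L.
P = np.zeros((d, d))
P[i, i] = P[k, k] = 1.0

R_closed = np.eye(d) + (np.cos(alpha) - 1.0) * P + np.sin(alpha) * L
print(np.allclose(expm(alpha * L), R_closed))   # True
```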

It follows that a general $2n$-dimensional rotation matrix is generated by products of such 2d rotation matrices, in any order. In the vicinity of the unit matrix the linear approximation is given by

$$O(\boldsymbol \alpha) = 1 + \sum_{1\le i<k\le 2n}\alpha_{ik} L_{(i,k)} + \mathrm O\!\left( \boldsymbol\alpha^{2}\right)$$

that extends to the exponential by the group identity

$$\lim_{s\to \infty} \ \left(1 + \frac{1}{s}\ \sum_{1\le i<k\le 2n}\alpha_{ik} L_{(i,k)} \right)^{s} \ = \ e^{\sum_{i<k} \ \alpha_{ik} \ L_{(i,k)}}$$

Such general rotations are entangled in such a way that, in the end, the rotation angle in any 2-plane is extracted by taking the trace of the rotation restricted to that plane, which reproduces the well-known $\cos$-formula from dimension 2:

$$2 \cos(\alpha) \ = \ Tr\left(\begin{array}{cc} \cos \alpha & -\sin \alpha \\ \sin \alpha & \cos \alpha \end{array} \right)$$
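As a small self-contained sketch (again using SciPy's expm and a single-plane rotation as above), the angle can be recovered from the trace of that $2\times 2$ block:

```python
import numpy as np
from scipy.linalg import expm

d, i, k, alpha = 4, 0, 1, 0.7

# Rotation acting only in the (x_i, x_k)-plane.
L = np.zeros((d, d))
L[i, k], L[k, i] = -1.0, 1.0
R = expm(alpha * L)

# The (i, k) block of R is the 2d rotation matrix; its trace gives 2 cos(alpha).
block = R[np.ix_([i, k], [i, k])]
print(np.trace(block) / 2, np.cos(alpha))   # both approximately 0.7648
```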

Roland F