
I'm trying to understand the curvature tensor $R^{\rho}_{\sigma\mu\nu}$ by playing with it in certain contexts. I already have an understanding, to some degree, of how it measures the change in a vector as it moves along a path in curved space, and how it's related to parallel transport. The issue is that there are just too many components for me to understand succinctly which parts correspond to what and why, in detail.

I tried constructing a differential equation with the intention of having the curvature of a manifold embedded into Euclidean space decrease as a function of some Euclidean distance from a point source. I began by writing

$$u^{\alpha}\partial_{\alpha}R^{\rho}_{\sigma\mu\nu}=$$

where $u$ is a vector pointing towards the source, and I am at a loss for what the right-hand side should be. Fundamentally, why does this curvature tensor need as many components as it does to describe something which is a relatively simple concept? Also, any help with the differential equation above would be nice. Again, I'm trying to build a solid concept of this thing, so what is the best way to understand it?

  • "Why does this curvature tensor need as many components as it does to describe something which is a relatively simple concept?" I would say it's only "simple" (or simplest) in the case of a surface, which is really the only case I can visualize all at once. Indeed, in this case, the Riemann tensor is essentially given entirely by one scalar function (the Gauss curvature of the surface), which speaks to the fact that in 2D it is indeed a relatively simpler concept. In higher dimensions, you have more components, owing to the many different directions in which one can measure curvature. – Chris Jun 29 '21 at 17:09

2 Answers


Note that I only recently started carefully studying Riemannian geometry, so I'm not aware, off the top of my head, of all the conventions regarding the signs of the curvature, placement of indices, etc., so take those issues with a grain of salt.


For the components $R^{\rho}_{\,\sigma\mu\nu}$, even though all the indices $\rho,\sigma,\mu,\nu$ lie in the same range $\{1,\dots, \dim M\}$, it is useful to use a different notation. There are really two pairs of indices here, so let me write this as $R^{\alpha}_{\,ij\beta}$. In this way, the coordinate expression in terms of the $\Gamma$'s is \begin{align} R^{\alpha}_{\,\,ij\beta}&=\Gamma^{\alpha}_{i\mu}\Gamma^{\mu}_{j\beta}-\Gamma^{\alpha}_{j\mu}\Gamma^{\mu}_{i\beta} +\frac{\partial \Gamma^{\alpha}_{j\beta}}{\partial x^i} - \frac{\partial \Gamma^{\alpha}_{i\beta}}{\partial x^j}. \end{align} Here, we can think of each $\Gamma_i$ as a matrix $(\Gamma^{\mu}_{i\nu})_{\mu,\nu}$, in which case if we define $R_{ij}$ to be the endomorphism whose matrix entries (relative to the usual coordinate induced basis) are given as $(R_{\,ij\beta}^{\alpha})_{\alpha,\beta}$, then the above expression can be written more succinctly as

\begin{align} R_{ij}&=\Gamma_i\cdot\Gamma_j-\Gamma_j\cdot \Gamma_i +\frac{\partial \Gamma_j}{\partial x^i}-\frac{\partial \Gamma_i}{\partial x^j}, \end{align} where the $\cdot$ refers to matrix multiplication. So, the curvature $R$ is an object which in coordinates yields for each $i,j$, a certain matrix $R_{ij}=(R_{\,ij\beta}^{\alpha})_{\alpha,\beta}$, which happens to be skew-symmetric in $i,j$. Hopefully this clarifies the role of the different indices.
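To make this matrix picture concrete, here is a small sympy sketch (my own illustration, not part of the original answer) that builds the matrices $\Gamma_i$ and $R_{ij}$ for the round unit $2$-sphere with metric $d\theta^2+\sin^2\theta\,d\phi^2$, and checks the skew-symmetry $R_{ij}=-R_{ji}$:

```python
import sympy as sp

# Unit 2-sphere in coordinates (theta, phi); an illustrative choice.
th, ph = sp.symbols('theta phi', positive=True)
x = [th, ph]
g = sp.Matrix([[1, 0], [0, sp.sin(th)**2]])
ginv = g.inv()
n = 2

def christoffel(i):
    """The matrix Gamma_i = (Gamma^mu_{i nu})_{mu, nu}."""
    G = sp.zeros(n, n)
    for mu in range(n):
        for nu in range(n):
            G[mu, nu] = sum(
                ginv[mu, s] * (sp.diff(g[s, nu], x[i])
                               + sp.diff(g[s, i], x[nu])
                               - sp.diff(g[i, nu], x[s])) / 2
                for s in range(n))
    return sp.simplify(G)

Gam = [christoffel(i) for i in range(n)]

def R(i, j):
    """Curvature matrix R_ij = Gam_i Gam_j - Gam_j Gam_i + d_i Gam_j - d_j Gam_i."""
    return sp.simplify(Gam[i] * Gam[j] - Gam[j] * Gam[i]
                       + sp.diff(Gam[j], x[i]) - sp.diff(Gam[i], x[j]))

# R(0, 1) works out to [[0, sin(theta)^2], [-1, 0]],
# and R(0, 1) + R(1, 0) is the zero matrix (skew-symmetry in i, j).
print(R(0, 1))
print(sp.simplify(R(0, 1) + R(1, 0)))
```

Note that the whole curvature of the sphere sits in the single matrix $R_{\theta\phi}$, in line with the comment above that in 2D everything reduces to one scalar function.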


I think that abstracting one level up can clarify the situation significantly. Let us consider a vector bundle $(E,\pi,M)$, and say $\dim M=n$ and $\text{rank}(E)=k$. Suppose we have some connection/covariant derivative operator $\nabla$. Using this we can define the curvature $R$ analogously to the case where $E=TM$ is the tangent bundle. In this case $R$ is an $\text{End}(E)$-valued $2$-form on $M$. Being a $2$-form is reflected in the fact that in the coordinate expression, $R_{ij}=-R_{ji}$. Being endomorphism-valued is why we have the two other indices $\alpha,\beta$.

You said you're aware that curvature is related to the change of a vector by parallel transport. Let us make this more precise:

Let $(E,\pi,M)$ be a vector bundle where $\dim M=n$ and $\text{rank}(E)=k$, and say we have a connection $\nabla$. Fix a point $p\in M$ and let $(U,\phi=(x^1,\dots, x^n))$ be a local coordinate chart about $p$. For any distinct $i,j\in\{1,\dots, n\}$ and any sufficiently small $\epsilon,\delta>0$, let $\gamma_{ij,\epsilon,\delta}$ be that curve in the manifold $M$ such that $\phi\circ \gamma_{ij,\epsilon,\delta}$ is the rectangular loop in the $x^i$-$x^j$ plane, with bottom-left vertex at the point $\phi(p)$, which has side-lengths $\epsilon,\delta$ and which is traversed counterclockwise (see picture below). Let $P_{\gamma_{ij,\epsilon,\delta}}$ denote the parallel transport along this loop; this is an isomorphism of the fiber $E_p\to E_p$. Then, \begin{align} P_{\gamma_{ij,\epsilon,\delta}}&=\text{id}_{E_p}-\epsilon\delta R_p\left(\frac{\partial}{\partial x^i}(p),\frac{\partial}{\partial x^j}(p)\right) + \mathcal{O}((\epsilon^2+\delta^2)^{3/2})\\ &:=\text{id}_{E_p}-\epsilon\delta R_{ij}(p)+\mathcal{O}(\|(\epsilon,\delta)\|^3) \end{align}

[Figure: a rectangular loop in the $x^i$-$x^j$ coordinate plane, with bottom-left vertex at $\phi(p)$ and side-lengths $\epsilon,\delta$, traversed counterclockwise]

In fact, some books even use this equation as the definition of the curvature $R$. In words: if you parallel transport along a small rectangular loop in the $x^i$-$x^j$ plane based at $p$, then the deviation of the result from the identity is $-\epsilon\delta\,R_{ij}(p)$, up to third-order corrections.
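One can see this loop formula numerically. The sketch below (my own illustration, again using the unit $2$-sphere with metric $d\theta^2+\sin^2\theta\,d\phi^2$, whose $\Gamma_i$ matrices are hard-coded) parallel transports a frame around a small coordinate rectangle with RK4 and compares the deviation from the identity with $\epsilon\delta\,R_{ij}(p)$:

```python
import numpy as np

def Gamma(theta, phi):
    """[Gamma_theta, Gamma_phi], each the matrix (Gamma^mu_{i nu}),
    for the unit 2-sphere; phi is unused but kept for generality."""
    cot = np.cos(theta) / np.sin(theta)
    G_theta = np.array([[0.0, 0.0], [0.0, cot]])
    G_phi = np.array([[0.0, -np.sin(theta) * np.cos(theta)], [cot, 0.0]])
    return [G_theta, G_phi]

def transport(p, legs, steps=500):
    """Parallel transport along straight coordinate legs: solve
    dP/dt = -(Gamma_i dx^i/dt) P with RK4, composing leg after leg."""
    P = np.eye(2)
    x = np.array(p, dtype=float)
    for dx in legs:
        dx = np.asarray(dx, dtype=float)
        h = 1.0 / steps
        for k in range(steps):
            def rhs(t, M):
                A = -sum(d * G for d, G in zip(dx, Gamma(*(x + t * dx))))
                return A @ M
            t = k * h
            k1 = rhs(t, P)
            k2 = rhs(t + h / 2, P + h / 2 * k1)
            k3 = rhs(t + h / 2, P + h / 2 * k2)
            k4 = rhs(t + h, P + h * k3)
            P = P + (h / 6) * (k1 + 2 * k2 + 2 * k3 + k4)
        x = x + dx
    return P

p = (1.0, 0.3)            # base point (theta, phi)
eps, delta = 1e-3, 1e-3
# counterclockwise rectangle in the theta-phi coordinate plane
P = transport(p, [(eps, 0), (0, delta), (-eps, 0), (0, -delta)])

R_ij = np.array([[0.0, np.sin(p[0])**2], [-1.0, 0.0]])  # R_{theta phi}(p)
print((np.eye(2) - P) / (eps * delta))  # close to R_ij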

I hope this makes it clear why so many indices are required. We need to describe parallel-transport along loops which lie in every possible plane (hence the $ij$ indices); then since parallel transport is an endomorphism for each fiber, if you want to describe the change in components of the vectors in that fiber, you need to introduce two more indices $\alpha,\beta\in\{1,\dots, \text{rank}(E)\}$ to describe this. So, in the expression $R^{\alpha}_{\,ij\beta}$, I think of $i,j$ as "planar indices" and $\alpha,\beta$ as "endomorphism indices" (though I like to suppress the $\alpha,\beta$ indices whenever possible because for me anyway it makes more sense conceptually to enlarge the target space to the space of endomorphisms and only keep track of the planar indices).

peek-a-boo
    I really should write $R^{\alpha}_{,\beta ij}$ to conform with the more standard notation, but I don't want to bump the post, so I'll just leave it here as a comment. – peek-a-boo Jul 27 '22 at 14:37

First, the last two indices are skew-symmetric and therefore the curvature tensor can be viewed as a $2$-form that depends on two indices, $$ \Omega^i{}_j = R^i{}_{jkl}\,dx^k\wedge dx^l. $$ Therefore, for given $i$ and $j$, $\Omega^i{}_j$ is something that can be integrated along a surface. Second, if written with respect to an orthonormal frame $(e_1, \dots, e_n)$, the tensor is also skew-symmetric with respect to the indices $i$ and $j$. Here, that should be interpreted as saying that, for given $k$ and $l$, $$R^i{}_{jkl}\,e_i\otimes \omega^j$$ (where $(\omega^1, \dots, \omega^n)$ is the dual basis to $(e_1, \dots, e_n)$) is $o(n)$-valued, where $o(n)$ is the Lie algebra of the orthogonal group $O(n)$. In other words, for given $k$ and $l$, the curvature tensor is an infinitesimal rotation.
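Here is a small sympy illustration of this point (my own example, not from the answer): on the unit $2$-sphere the coordinate-frame curvature matrix $R_{\theta\phi}$ is not antisymmetric, but conjugating into the orthonormal frame $e_1=\partial_\theta$, $e_2=(1/\sin\theta)\,\partial_\phi$ makes it a genuine infinitesimal rotation, i.e. $o(2)$-valued:

```python
import sympy as sp

th = sp.symbols('theta', positive=True)
s = sp.sin(th)

# Coordinate-frame curvature matrix R_{theta phi} of the unit 2-sphere;
# the (alpha, beta) entry is the component R^alpha_{theta phi beta}.
R = sp.Matrix([[0, s**2], [-1, 0]])
print(R + R.T)       # nonzero: NOT antisymmetric in the coordinate frame

# Columns of E are the orthonormal frame e_1 = d/dtheta, e_2 = (1/sin)d/dphi.
E = sp.Matrix([[1, 0], [0, 1 / s]])
R_frame = sp.simplify(E.inv() * R * E)

# R_frame equals sin(theta) * [[0, 1], [-1, 0]]: an infinitesimal rotation.
print(R_frame)
```

The single function $\sin\theta$ appearing here is the Gauss curvature ($K=1$ for the unit sphere) times the area factor $\sqrt{\det g}=\sin\theta$, which again reflects the fact that in 2D the whole tensor is governed by one scalar function.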

In particular, it measures the infinitesimal rotation of the given orthonormal frame relative to parallel translation. And, if you integrate the curvature $2$-form along a surface with boundary in $M$, you get the actual difference between the orthonormal frame at a point on the boundary and its parallel translate around the boundary. This is known as the Ambrose-Singer theorem.

A rather terse explanation for this (for the more general case of a connection on a vector bundle) can be found here: https://deaneyang.github.io/blog/blog/math/differential-geometry/riemannian-geometry/2020/08/07/Holonomy.html

Deane