
In my notes, there is a proof of the fact that if $c_1,c_2:I \to M$ are geodesics from an interval $I \subset \mathbb{R}$ to a smooth manifold $M$ (presumably semi-Riemannian), where $c_1(a) = c_2(a)$ and $c_1'(a) = c_2'(a)$ for some $a \in I$, then $c_1 = c_2$.

Now, I read somewhere else (Christian Bär's notes) that this "easily" follows from the fact that the geodesic equation is a system of second-order non-linear ordinary differential equations, hence we can use Picard-Lindelöf, so that by uniqueness of solutions to the IVP, $c_1 = c_2$.

Here is the proof in my notes:

Proof: Let $$J := \{t \in I: c_1(t) = c_2(t), \quad c_1'(t) = c_2'(t)\}$$.

  1. $J \neq \emptyset$, since $a \in J$.
  2. J is closed since it is defined by a continuous equation.
  3. J is open; For each $b \in J$, lemma 5.2.2 gives us that there exists an $\epsilon > 0$ such that $c_1(t) = c_2(t)$ for $t \in (b-\epsilon,b+\epsilon)$. This means that $c_1'(t) = c_2'(t)$ so that $(b-\epsilon,b+\epsilon) \subset J \subset I$ hence $J$ is open.

Now, I sort of get the general idea. We want to show that $J$ is non-empty and both open and closed in $I$. Since $I$ is an interval in $\mathbb{R}$, it is connected, hence the only subsets of $I$ that are both open and closed are $\emptyset$ and $I$. By implication, since $J$ is non-empty, we must have $I = J$.

My question relates to $2)$. What does he mean by "$J$ is closed since it is defined by a continuous equation"? I know that a continuous function $f:X \to Y$ between topological spaces $X,Y$ is such that if $A \subset Y$ is closed, then $f^{-1}(A)$ is closed. Is that what he is referring to?

Also, I am not entirely sure how to think about the geodesic equation as a system of second-order non-linear ordinary differential equations. How shall I think of $$(c^k)''+(c^i)'(c^j)'\Gamma^k_{ij} = 0 \quad (1 \leq k \leq n)?$$

Generally, a second-order differential equation is of the form $f(t,x,x',x'') = 0$, but here we have $(c^k)'',(c^i)',(c^j)'$ and also variable coefficients $\Gamma^k_{ij}:U \to \mathbb{R}$ that are only locally defined.

Note that $c^j = x^j \circ c$ for a chart $(U, \varphi = (x^1,\ldots,x^n))$.

Ben123
  • regarding your just-updated ODE question, you need a sufficiently general version of the existence and uniqueness theorems, particularly those that deal directly with the vector-valued case (or, what amounts to the same thing in finite dimensions, treat it as a system of ODEs). In other words, it is perfectly fine if you have $x,x',x''$ taking values in $\Bbb{R}^n$ for $f(t,x,x',x'')$. – peek-a-boo Aug 11 '23 at 03:12
  • Hm, yes, I just meant that here we actually have different functions $(c^i),(c^j)$ and $(c^k)$, as compared to $x,x',x''$. Could you maybe rewrite it in the form $f(t,c(t),c'(t),c''(t))$, since we here define $$c(t) = \varphi \circ c = (x^1 \circ c,\ldots,x^n \circ c) = (c^1(t),\ldots,c^n(t))?$$ – Ben123 Aug 11 '23 at 03:16

1 Answer


If $f,g:X\to Y$ are continuous maps between topological spaces, with $Y$ being Hausdorff, then the set of points of equality $E_{f,g}:=\{x\in X\,:f(x)=g(x)\}$ is closed.

To prove this, consider the product map $f\times g:X\to Y\times Y$, $x\mapsto (f(x),g(x))$. Also, let $\Delta_Y=\{(y,y)\,:y\in Y\}$ be the diagonal in $Y\times Y$. Then, we have $E_{f,g}=(f\times g)^{-1}(\Delta_Y)$. Now, if we equip $Y\times Y$ with the product topology, then we have that $f\times g$ is continuous, and also $\Delta_Y$ is closed if and only if $Y$ is Hausdorff (a standard fact). Since we are indeed assuming $Y$ is Hausdorff, we see that $E_{f,g}$ is the preimage of a closed set under a continuous map, hence closed.

In the case where $Y$ is a normed vector space (like $\Bbb{R}$) you can offer a less abstract proof by noting that $E_{f,g}=\{x\in X\,:\, f(x)-g(x)=0\}$, i.e. it is the zero-level set of the difference $f-g$ (which is continuous). Note that $\{0\}$, a singleton, is indeed a closed set. So, $E_{f,g}$ is the preimage of a closed set under a continuous map, hence closed.

In your case, you have two equality conditions, so you can think of $J$ as the intersection of two such equality sets (one defined by $c_1,c_2$ and one by $c_1',c_2'$). Or you can think of it as the equality set of the maps $C_1(t)=(c_1(t),c_1'(t))$ and $C_2(t)=(c_2(t),c_2'(t))$.
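To spell the second option out in the question's notation (my rewriting, not part of the original notes): setting $$C_i(t) := \bigl(c_i(t),\, c_i'(t)\bigr),$$ one has $$J \;=\; \{t \in I : C_1(t) = C_2(t)\} \;=\; (C_1 \times C_2)^{-1}(\Delta),$$ where $\Delta$ is the diagonal in the product of the (Hausdorff) target with itself, so $J$ is closed as the preimage of a closed set under a continuous map.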


For the ODEs, you just need a sufficiently general version of the existence and uniqueness theorem. To write it in a slightly recognizable format, let me abuse notation slightly and think of 'already applying the chart', meaning I'll consider the Christoffel symbols $\Gamma^i_{jk}\circ\phi^{-1}$ as mappings from $U':=\phi[U]\subset\Bbb{R}^n$ to $\Bbb{R}$. But, for ease of typing, I'll simply write $\Gamma^i_{jk}$ instead.

Consider the function $f:U'\times\Bbb{R}^n\to\Bbb{R}^n$ defined as \begin{align} f(x,v):=\left(-\Gamma^1_{jk}(x)v^jv^k, \dots, -\Gamma^n_{jk}(x)v^jv^k \right). \end{align} This is a smooth function, so the existence and uniqueness theorems (we're dealing with the autonomous case here) tell us that for any $x_0\in U'$ and $v_0\in\Bbb{R}^n$, there is a unique smooth curve $\alpha:I\to U'$ (where $I$ is some open interval around the origin in $\Bbb{R}$) such that for all $t\in I$, we have $\alpha''(t)=f(\alpha(t),\alpha'(t))$, $\alpha(0)=x_0$ and $\alpha'(0)=v_0$. This is the desired (chart representation of the) geodesic.

At this point you may object that I'm dealing with second-order ODEs, while the usual existence and uniqueness theorems are phrased for first-order ODEs (e.g. on Wikipedia). However, the second-order case follows trivially (since the theorem applies generally to vector-valued functions and curves). The trick for converting a second-order system into a first-order one is explained in any good ODE book.
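To make the reduction concrete, here is a minimal numerical sketch (my own illustration, not part of the answer), assuming the hyperbolic upper half-plane as the example metric; the function `geodesic_rhs`, the initial data, and the use of `scipy.integrate.solve_ivp` are all choices of mine:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Example metric (an assumption, not from the answer): the hyperbolic
# upper half-plane, ds^2 = (dx^2 + dy^2)/y^2, whose nonzero Christoffel
# symbols are
#   Gamma^x_{xy} = Gamma^x_{yx} = -1/y,
#   Gamma^y_{xx} = 1/y,   Gamma^y_{yy} = -1/y.
# Writing xi = (x, y, vx, vy), the second-order geodesic equation
# (c^k)'' + Gamma^k_{ij} (c^i)' (c^j)' = 0 becomes the first-order system
#   (x, y)' = (vx, vy),   (vx, vy)' = f(x, v) := -Gamma^k_{ij}(x) v^i v^j.
def geodesic_rhs(t, xi):
    x, y, vx, vy = xi
    ax = 2.0 * vx * vy / y        # -2 * Gamma^x_{xy} * vx * vy
    ay = (vy**2 - vx**2) / y      # -(Gamma^y_{xx} vx^2 + Gamma^y_{yy} vy^2)
    return [vx, vy, ax, ay]

# Initial point (0, 1) with unit horizontal initial velocity; the geodesic
# should trace the Euclidean unit semicircle centered at the origin.
sol = solve_ivp(geodesic_rhs, (0.0, 2.0), [0.0, 1.0, 1.0, 0.0],
                rtol=1e-10, atol=1e-12)

x, y, vx, vy = sol.y
radius = np.sqrt(x**2 + y**2)       # stays ~1 along the whole curve
speed = np.sqrt(vx**2 + vy**2) / y  # hyperbolic speed, conserved by geodesics
```

The two quantities at the end (the Euclidean radius of the known semicircular geodesic, and the hyperbolic speed, which is conserved because geodesics are affinely parametrized) give a cheap sanity check that the first-order system really encodes the geodesic equation.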

peek-a-boo
  • Thank you, that makes sense. I remember my friend telling me the same fact about diagonals in Hausdorff-spaces a while back. :) – Ben123 Aug 11 '23 at 03:18
  • @Ben123 see my edit – peek-a-boo Aug 11 '23 at 03:29
  • Thank you. It is not immediately obvious why you are looking at, well, maybe you are looking the christoffel-symbol composed with the pullback of inverse of the chart $\phi$ because you want a real-valued function. Hm, oh, I see, sort of, well, I sort of see it. It does not light my fire immediately in how it all comes together, but I sort of see what you are doing. Thanks a lot for taking the time :) – Ben123 Aug 11 '23 at 03:39
  • I’m trying to blindly apply some (second-order) version of the existence/uniqueness theorem. These theorems are formulated in $\Bbb{R}^n$, so if I want to apply them blindly, I had better situate myself there. Hence, I pushed forward with the chart so we can delete ‘manifold’ from our vocabulary temporarily and apply the theorem. btw, minor correction: it’s not that I want a real-valued function (because the $\Gamma$’s are already real-valued). I want them to have domain a part of $\Bbb{R}^n$ too. – peek-a-boo Aug 11 '23 at 03:43
  • About the real-valued function: yes, you are correct; as you say, $\Gamma^k_{ij}$ already is a real-valued function.

    What do you mean by push-forward by the chart? Is not $(\phi^{-1})^*(\Gamma^k_{ij}) = \Gamma^k_{ij} \circ \phi^{-1}$ a pull-back?

    Is not the push-forward the differential/tangent map in this setting?

    – Ben123 Aug 11 '23 at 03:47
  • Pullback by the inverse of $\phi$ is also called the pushforward by $\phi$ … same thing. You just have to get used to the vocabulary. – peek-a-boo Aug 11 '23 at 16:35
  • Another question, what causes the equation to be called non-linear? Naive question, perhaps. Is it the $v^j \cdot v^k$ factors? – Ben123 Aug 17 '23 at 19:19
  • @Ben123 that, and the $\Gamma$’s are not necessarily linear functions of $x$. Of course in the trivial case where the $\Gamma$’s all vanish, this is a linear ODE (and the solution is that $\alpha(t)=x_0+v_0t$, which parametrizes a line). – peek-a-boo Aug 17 '23 at 19:41
  • But for an ODE $$a_0(x)y+a_1(x)y'+\ldots+a_n(x)y^{(n)} = b(x)$$ to be called linear, the coefficients $a_i(x)$ for $0 \leq i \leq n$ do not themselves have to be linear (first-degree) functions. Now, I don't know if we can treat the factors $\Gamma^k_{ij}$ as I treat the $a_i(x)$ above, but if we can, then I don't see the relevance of whether the $\Gamma$'s are linear functions of $x$ or not? – Ben123 Aug 17 '23 at 19:47
  • You’re mixing up the $t,x,v$ in our problem with the $x$ and $y$ there. They’re different things. – peek-a-boo Aug 17 '23 at 19:52
  • Hm, that is possible; I think the chapter on geodesics is still the one that has not clicked yet. I took a rigorous course on ODEs quite recently (although my brain usually starts forgetting as soon as the exam is over), so I am aware of the relevant notation/theorems, but I can't really connect it directly to the notation here. Thanks for your input, I need to tinker and think some more, I guess :)

    But, yes, that is correct: both the $\Gamma$'s and $(c^k)$ and its derivatives, for arbitrary $k$, take their arguments from $t$ (originally), so I see your point.

    – Ben123 Aug 17 '23 at 19:54
  • At least if we think of the original equation as $$\Big((c^k)''+(c^i)'(c^j)'\,\Gamma^k_{ij} \circ c\Big) \ \partial_k \circ c = 0 \quad (1 \leq k \leq n).$$ – Ben123 Aug 17 '23 at 19:59
  • for completeness, I’ll add the definition. Let $E$ be a Banach space. A linear (though more properly it should be called affine) ODE is one that can be written as $\frac{d\xi}{dt}=A(t)[\xi(t)] +B(t)$, where $A:\Bbb{R}\to\text{Hom}(E)$ and $B:\Bbb{R}\to E$ are maps. So, the linearity is encoded in the fact that $A(t)$ belongs to $\text{Hom}(E)$. In our situation, we should take $E=\Bbb{R}^{n+n}$ and $\xi=(x_1,\dots, x_n,v_1,\dots, v_n)$. You’ll see that if you write out the equation satisfied by our $\xi$, it does not satisfy this format (barring some trivial cases). – peek-a-boo Aug 17 '23 at 20:03
  • The key point is that $x(t)$ and $v(t)$ are both things we need to solve for (since of course $v(t)=x’(t)$). That’s why the $\Gamma(x)$ behavior on $x$ is important. Though, to really challenge your brain, there’s another equation, namely that for parallel transport of $u(t)$ along a given base curve $x(t)$ (everything written in local coordinates). That looks like $\frac{du^{\alpha}}{dt}=-\Gamma^{\alpha}_{i\beta}(x(t))\dot{x}^i(t)u^{\beta}(t)$. This is a linear ODE for $u(t)$. Here, we don’t care that the dependence on $x$ is crazy since we’re not trying to solve for it; we are given it. – peek-a-boo Aug 17 '23 at 20:08
  • Yes, I am aware of the equation for parallel vector fields. That this equation is linear is the reason that parallel transport is a linear isometry, apart from the fact that $\frac{\nabla}{dt}X = 0$ for parallel vector fields, so that $$\frac{d}{dt}\langle X,Y\rangle = \langle \tfrac{\nabla}{dt}X,Y \rangle+\langle X, \tfrac{\nabla}{dt}Y \rangle = 0$$ for a covariant derivative $\frac{\nabla}{dt}$ along a smooth curve. – Ben123 Aug 17 '23 at 20:11
  • I see, it seems like your $v^i$'s are my $c^i$'s. I was somewhat confused by your notation. It will be easier for me to understand, I believe, if we agree on notation.

    I see it as $$(X^k)'+(X^i)(c^j)'\Gamma^k_{ij} = 0 \quad (1 \leq k \leq n).$$ By ODE theory we get a unique solution $\{X^i\}_{1 \leq i \leq n} \subset C^{\infty}(I,\mathbb{R})$ such that $X^i(a) = v^i$, given the initial condition $X(a) = v$.

    – Ben123 Aug 17 '23 at 20:19
  • NO. Your $c^i(t)$ is my $x^i(t)$. Then, my $v^i(t)$ is $(x^i)'(t)$, or what you would write as $(c^i)'(t)$. You really should review everything I've said and process it. – peek-a-boo Aug 17 '23 at 20:22
  • Oh, yes, I see. Thanks for clarifying, makes sense – Ben123 Aug 17 '23 at 20:25
  • Big caps come off as somewhat aggressive. No need for aggression. Also, a comment on your answer regarding my original formulation. As you pointed out, you need another function $$f' \times g':X \to Y' \times Y' \subset TM \times TM$$ defined by $$a \in X \mapsto (f'(a),g'(a)) = (c_1'(a),c_2'(a))$$ that is continuous, I believe. But anyhow, I can't deal with people who are aggressive; I am just trying to learn and discuss math, so if you are aggressive, I will shut down this discussion. No offence. – Ben123 Aug 17 '23 at 20:45
  • @Ben123 lol the caps is a mechanical issue with the shift key on my keyboard. I hold it down to capitalize the first letter, but there’s a lag (as it gets ‘stuck’), so it stays on and usually capitalizes the next one or two letters. (In an answer, I can go back and edit, but on a comment, I can’t, unless I spot it in time). – peek-a-boo Aug 17 '23 at 20:47
  • Oh, then I apologize. – Ben123 Aug 17 '23 at 20:52
  • I think I understand this better now; your comment was very helpful, especially $$\alpha''(t) = f(\alpha(t),\alpha'(t))$$ for $$f(x,v) = \ldots$$ etc.

    With respect to the differential equation for parallel vector fields, would you reformulate it in any way, as you did with the geodesic equation?

    – Ben123 Aug 18 '23 at 08:21
  • I suppose you could set $f(X^k) = -X^i(c^j)'\Gamma^k_{ij}$ for $1 \leq k \leq n$. Then you get a system of $n$ equations $(X^k)' = f(X^k)$ with $n$ initial conditions $X^k(a) = v^k$ for $1 \leq k \leq n$, where the $v^k$'s are the coefficients of a vector $v \in T_{c(a)}M$ written in the basis (with the Einstein summation convention) as $$v = v^i \partial_i|_{c(a)}.$$ – Ben123 Aug 18 '23 at 09:49
  • I should probably write it as $$F_k(X^1,\ldots,X^n) = -X^i(c^j)'\Gamma^k_{ij}.$$ Then we get $n$ equations $$(X^k)' = F_k(X^1,\ldots,X^n).$$ hm – Ben123 Aug 18 '23 at 11:07
  • Or probably as $$F_k(X^1,\ldots,X^n)(t) = -X^i(c^j)'\Gamma^k_{ij}$$ – Ben123 Aug 18 '23 at 11:14
  • Yes, that's the idea. – peek-a-boo Aug 18 '23 at 16:26
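The linearity of the parallel-transport equation discussed in the comments above can also be checked numerically. Here is a sketch under assumptions of mine (the upper half-plane metric and the base curve $t \mapsto (t,1)$ are my choices; the helper `transport` is hypothetical, not from the thread):

```python
import numpy as np
from scipy.integrate import solve_ivp

# Parallel transport du^a/dt = -Gamma^a_{ib}(x(t)) x'^i(t) u^b(t) is linear
# in u.  Assumed example: the upper half-plane metric ds^2=(dx^2+dy^2)/y^2
# along the (non-geodesic) base curve x(t) = (t, 1), so x' = (1, 0), y = 1.
# Plugging in the Christoffel symbols, the equation reduces to
#   u_x' = u_y,   u_y' = -u_x   (a rotation of the transported vector).
def transport_rhs(t, u):
    ux, uy = u
    return [uy, -ux]

def transport(u0, T=1.5):
    """Parallel-transport the vector u0 along the base curve for time T."""
    sol = solve_ivp(transport_rhs, (0.0, T), list(u0), rtol=1e-10, atol=1e-12)
    return sol.y[:, -1]

u1 = transport([1.0, 0.0])
u2 = transport([0.0, 2.0])
u_sum = transport([1.0, 2.0])
# Linearity: transporting a linear combination of vectors gives the same
# combination of the transported vectors -- what "linear ODE" buys you.
```

Here `u_sum` agrees with `u1 + u2` to solver precision, exactly because the right-hand side is linear in $u$ (the possibly complicated dependence on the given base curve $x(t)$ is irrelevant, as peek-a-boo notes, since $x(t)$ is not being solved for).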