
From Serge Lang's Linear Algebra:

Let $V$ be a finite dimensional space over $\mathbb{R}$ of infinitely differentiable functions (0) vanishing outside some interval. Let the scalar product be defined as usual by:

$$\langle f, g \rangle = \int_0^1 {f(t)g(t)} \, dt$$

Let $D$ be the derivative. (1) Show that one can define $D^T$ as before, and $D^T=-D$.

Definitions (and assumptions):

(0): The bounds of the interval outside which the functions vanish are not specified in the book. Since the scalar product only integrates over $[0, 1]$, I assume it is possible that for nonzero $f, g \in V$ the product $f(t)g(t)$ vanishes for every $t \in [0, 1]$, in which case $\langle f, g \rangle = 0$.

(1): $D^T$ is the adjoint of the linear operator $D$ with respect to the scalar product defined above (such that $\langle Df, g \rangle = \langle f, D^{T}g \rangle$), which was proven to exist in my textbook using the isomorphism between an inner product space and its dual space (the Riesz representation theorem); but in this case the scalar product has not been proven non-degenerate, so existence is not certain.

Furthermore, it seems to me that $D^T$ in this specific case plays the role of a transpose, so the equality $D^T=-D$ simply says that $D$ is skew-symmetric.

Formulated problem:

The relation between $D$ and $D^T$ is expressed by the equality from (1) (provided the scalar product above is non-degenerate):

$$\int_0^1 {D(f(t))g(t)} \, dt = \int_0^1 {f(t)D^T(g(t))} \, dt$$

In order to show skew-symmetry, the following equality must hold:

$$\int_0^1 {D(f(t))g(t)} \, dt = -\int_0^1 {f(t)D(g(t))} \, dt$$

or, in accordance with the initial equation: $$\int_0^1 {f(t)D^T(g(t))} \, dt = -\int_0^1 {f(t)D(g(t))} \, dt$$

and thus, due to linearity, we must show that:

$$\int_0^1 f(t)\left(D^T(g(t)) + D(g(t))\right) \, dt = 0 $$

Question:

$V$ is a finite-dimensional vector space of smooth, non-analytic functions. Can this fact be utilized to show non-degeneracy of the integral scalar product, thus proving my initial equation (without resorting to numerical approximation methods)? Finally, does the fact that these functions vanish outside an interval have anything to do with the last equation that I derived in the formulated problem?

In general, how can the existence of the adjoint operator be proven from the given information, and how is the skew-symmetry property related to it?

Thank you!

ShellRox
  • 977

1 Answer


Suppose that $V$ is a vector space of real-valued infinitely differentiable (smooth) functions with compact support inside $[a, b]$ (i.e., $f(x) = 0$ for all $x \notin [a, b]$).

If $f \in V$ is not identically zero, then it is non-analytic by the identity theorem for real-analytic functions; such functions are commonly referred to as bump functions.
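A concrete instance is the standard bump $\exp(-1/(1-u^2))$, rescaled to a chosen interval. A minimal numerical sketch (the helper name and the default interval are my own choices):

```python
import numpy as np

def bump(x, a=0.0, b=1.0):
    """Standard smooth bump exp(-1/(1-u^2)), rescaled to vanish outside (a, b).

    Hypothetical helper for illustration; any smooth compactly supported
    function would serve equally well.
    """
    u = 2.0 * (x - a) / (b - a) - 1.0     # map (a, b) onto (-1, 1)
    out = np.zeros_like(u)
    inside = np.abs(u) < 1.0              # strictly inside the support
    out[inside] = np.exp(-1.0 / (1.0 - u[inside] ** 2))
    return out

x = np.linspace(-0.5, 1.5, 9)
print(bump(x))  # vanishes identically outside (0, 1), positive inside
```

All derivatives of this function also vanish at the endpoints of the support, which is exactly what makes it smooth but non-analytic there.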

Existence of Adjoint

Let $D$ be the first-order derivative operator (which is linear); then we claim:

$$\exists D^*, \langle Df, g \rangle = \langle f, D^*g \rangle, \forall f, g \in V$$

where $\langle \cdot, \cdot \rangle: V \times V \rightarrow \mathbb{R}$ is defined by $(f, g) \mapsto \int_0^1 {f(t)g(t)} \, dt$.

Proof:

We start by showing that the scalar product defined above is non-degenerate. Suppose $\langle f, f \rangle = 0$, and assume for contradiction that $f^2(c) > 0$ for some $c \in [0, 1]$. Since $f^2$ is continuous, the epsilon–delta definition with $\epsilon = \frac{f^2(c)}{2}$ gives:

$$\exists \delta>0: |x - c| < \delta \implies |f^2(x) - f^2(c)| < \frac{f^2(c)}{2}$$

It can then be algebraically verified that $f^2(x) > \frac{f^2(c)}{2}$ for all $x \in (c - \delta, c + \delta)$. Shrinking $\delta$ if necessary so that $(c - \delta, c + \delta) \cap [0, 1]$ has length at least $\delta$, and using $f^2 \geq 0$:

$$\int_0^1 {f^2(t)} \, dt \geq \int_{(c - \delta, c + \delta) \cap [0, 1]} {f^2(t)} \, dt \geq \delta \cdot \frac{f^2(c)}{2} > 0$$

This contradicts $\langle f, f \rangle = 0$; therefore $f = 0$ whenever $\langle f, f \rangle = 0$, i.e. the scalar product is non-degenerate (in fact positive definite).

Now fix $g \in V$ and let $L: V \rightarrow \mathbb{R}$ be the linear functional defined by $L(f) = \langle Df, g \rangle$. Since the scalar product is non-degenerate and $V$ is finite-dimensional, the Riesz representation theorem gives:

$$\exists g' \in V: L(f) = \langle f, g' \rangle, \forall f \in V$$

where the element $g'$ is unique and depends on the choice of $g$ (and on the operator $D$). We denote this element by $D^*g$, so that $g \mapsto D^*g$ is a well-defined map.
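In finite dimensions this Riesz argument is concrete matrix algebra: if $G$ is the (symmetric, positive-definite, hence non-degenerate) Gram matrix of a basis and $A$ is the matrix of the operator, then $A^* = G^{-1} A^T G$ satisfies $\langle Ax, y \rangle = \langle x, A^*y \rangle$. A sketch with randomly chosen $G$ and $A$ (all names are my own, not from the source):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Symmetric positive definite Gram matrix -> a non-degenerate scalar product
M = rng.standard_normal((n, n))
G = M @ M.T + n * np.eye(n)

A = rng.standard_normal((n, n))       # matrix of an arbitrary linear operator
A_star = np.linalg.inv(G) @ A.T @ G   # its adjoint w.r.t. <x, y> = x^T G y

x = rng.standard_normal(n)
y = rng.standard_normal(n)
lhs = (A @ x) @ G @ y        # <Ax, y>
rhs = x @ G @ (A_star @ y)   # <x, A* y>
print(np.isclose(lhs, rhs))  # the two scalar products agree
```

The formula $A^* = G^{-1} A^T G$ also makes the textbook notation transparent: when the basis is orthonormal ($G = I$), the adjoint is literally the transpose.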

Finally, it can be verified from the linearity of the integral (and the uniqueness of $g'$) that $D^{*}$ is linear; I can expand on this if necessary.

Q.E.D.

$D^*$ in this case will be denoted as $D^T$.

Skew-Symmetry

By definition, a linear operator $A$ on a real scalar product space $V$ is said to be skew-symmetric if $\langle Ax, y \rangle = -\langle x, Ay \rangle$ for all $x, y \in V$. An equivalent definition of skew-symmetry is:

$$\langle Ax, x \rangle = 0, \quad \forall x \in V$$
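The non-trivial direction of this equivalence is a polarization argument over $\mathbb{R}$: if $\langle Ax, x \rangle = 0$ for all $x$, then expanding at $x + y$ gives

```latex
\begin{aligned}
0 = \langle A(x+y),\, x+y \rangle
  &= \underbrace{\langle Ax, x \rangle}_{0}
   + \langle Ax, y \rangle
   + \langle Ay, x \rangle
   + \underbrace{\langle Ay, y \rangle}_{0} \\
  &= \langle Ax, y \rangle + \langle x, Ay \rangle,
\end{aligned}
```

using the symmetry of the scalar product in the last step; the converse direction is just the special case $y = x$.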

In our case, it must therefore hold for the first-order derivative $D$ that:

$$\int_0^1 f(t)\,Df(t) \, dt = 0$$

We know that $V$ is a vector space of bump functions with compact support. If $K$ denotes such a support, assume $K \subset (0, 1)$, i.e. the functions vanish outside an interval contained in the domain of integration. Then $\{0, 1\} \subset \mathbb{R} \setminus K$, and therefore $f(0) = f(1) = 0$.

Finally, we can use the identity $f\,Df = \frac{1}{2}D(f^2)$ and the fundamental theorem of calculus to see that:

$$\int_0^1 f(t)\,Df(t) \, dt = \frac{1}{2} \int_0^1 D(f^2)(t) \, dt = \frac{1}{2}\left(f^2(1) - f^2(0)\right) = 0$$
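This boundary-term computation can be sanity-checked numerically with a bump supported inside $(0, 1)$ (a sketch; the grid size and bump profile are arbitrary choices of mine):

```python
import numpy as np

x = np.linspace(0.0, 1.0, 20001)

# Smooth bump exp(-1/(1-u^2)) supported in (0.1, 0.9)
u = (x - 0.5) / 0.4
f = np.zeros_like(x)
m = np.abs(u) < 1.0
f[m] = np.exp(-1.0 / (1.0 - u[m] ** 2))

Df = np.gradient(f, x)    # numerical derivative of f
y = f * Df                # integrand f * Df = (1/2) D(f^2)
integral = np.sum((y[1:] + y[:-1]) / 2 * np.diff(x))  # trapezoid rule
print(abs(integral))      # vanishes up to discretization error
```

Replacing the bump with one whose support touches an endpoint of $[0, 1]$ makes the integral nonzero, which is exactly the role the vanishing boundary values play in the proof.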

Thus $D$ is skew-symmetric with respect to the scalar product defined; combining $\langle Df, g \rangle = \langle f, D^T g \rangle$ with $\langle Df, g \rangle = -\langle f, Dg \rangle$ and using non-degeneracy gives $D^T = -D$.
