
I have now asked this on MO.

Let $f:\mathbb{R}^d \to \mathbb{R}$ be smooth. The mixed derivatives commute: $f_{xy}=f_{yx}$. This identity is "universal" in the sense that it holds for any smooth map.

Question:

Are there any universal identities which are not consequences of the commutation of the mixed derivatives?

More explicitly, let $D_i$ be the differential operator which takes the partial derivative with respect to $x_i$. The symmetry can be written as an algebraic statement

$$ D_i \circ D_j = D_j \circ D_i \tag{1}.$$

(Here we take the domain of these operators to be the space of smooth maps $\mathbb{R}^d \to \mathbb{R}^d$, so that we can compose operators.)

Consider the subset $A$ of differential operators (which map $C^{\infty}(\mathbb{R}^d) \to C^{\infty}(\mathbb{R}^d)$) that is "generated" by the $D_i$ via addition, composition and multiplication*.

Note that this algebraic structure $A$ has $3$ binary operations.

(I am not sure if there is a term for such an "algebraic creature": $A$ is a ring with respect to both pairs of operations $(+,\cdot)$ and $(+,\circ)$, but these two operations are related, namely

$$(f \cdot g) \circ h = (f \circ h) \cdot (g \circ h),$$

i.e. they "commute". Does such a structure have a name?)

Concrete Question:

Are there relations in $A$ which are not consequences of the fundamental relation $(1)$?


*By multiplication (as opposed to composition) of operators I mean the following:

$$D_x \times D_y(f)=f_x \cdot f_y \, , \, D_x \circ D_y(f)=f_{xy} \, , \, (D_x \circ D_x) \times D_y(f)=f_{xx}f_y$$ etc. (Here $f$ is a scalar function; to extend the operations to $\mathbb{R}^d$-valued maps, just act on each component separately. I am also allowing the $i$-th component of the output to depend on partial derivatives of all components of $f:\mathbb{R}^d \to \mathbb{R}^d$.)

Multiplication is needed in order to talk about relations of the form $f_x f_y=f_yf_x$ (trivially true) or $f_{xx}=f_yf_{xy}$ (clearly not universal). (Without multiplication there are no additional relations, as observed in this answer.)
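For concreteness, here is a minimal SymPy sketch of these operations acting on a generic scalar function of two variables (the names `Dx`, `Dy` and the sample `f` are just illustrative placeholders):

```python
import sympy as sp

x, y = sp.symbols('x y')
f = sp.Function('f')(x, y)      # a generic smooth scalar function f(x, y)

Dx = lambda g: sp.diff(g, x)    # the operator D_x
Dy = lambda g: sp.diff(g, y)    # the operator D_y

print(Dx(f) * Dy(f))            # D_x × D_y (f) = f_x · f_y        (multiplication)
print(Dx(Dy(f)))                # D_x ∘ D_y (f) = f_xy             (composition)
print(Dx(Dy(f)) - Dy(Dx(f)))    # relation (1): identically 0
print(Dx(Dx(f)) * Dy(f))        # (D_x ∘ D_x) × D_y (f) = f_xx · f_y
```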


In particular, I am interested to know whether the "Cofactor Lemma" (divergence-free rows, see below) is a "consequence" of the commutation of mixed derivatives (for dimension $d>2$ this involves multiplication, as well as addition and composition).


The Cofactor Lemma:

Let $f:\mathbb{R}^d \to \mathbb{R}^d$ be smooth. Then the cofactor matrix of $df$ has divergence-free rows:

$$\sum_{j=1}^d \frac{\partial(\operatorname{Cof} df)_{kj}}{\partial x_j} = 0, \qquad k=1,\dots,d.$$

In dimension $d=2$, it reduces to relation $(1)$:

Given $A= \begin{pmatrix} a & b \\ c & d \end{pmatrix}$, $\operatorname{Cof}A= \begin{pmatrix} d & -c \\ -b & a \end{pmatrix}$, so

$$ df= \begin{pmatrix} (f_1)_x & (f_1)_y \\ (f_2)_x & (f_2)_y \end{pmatrix}, \qquad \operatorname{Cof}df= \begin{pmatrix} (f_2)_y & -(f_2)_x \\ -(f_1)_y & (f_1)_x \end{pmatrix} .$$

We see that the rows of $\operatorname{Cof}df$ are divergence-free if and only if $(f_1)_{xy}=(f_1)_{yx}$ and $(f_2)_{xy}=(f_2)_{yx}$.

As stated above, for dimension $d>2$ we need multiplication to even phrase the question properly.
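For what it's worth, the $d=3$ case can be checked symbolically; here is a minimal SymPy sketch (the helper `cofactor` and the component names `f1, f2, f3` are just illustrative):

```python
import sympy as sp

xs = sp.symbols('x1 x2 x3')
f = [sp.Function(f'f{i + 1}')(*xs) for i in range(3)]       # a generic smooth map R^3 -> R^3

Df = sp.Matrix(3, 3, lambda i, j: sp.diff(f[i], xs[j]))     # Jacobian, (df)_{ij} = d f_i / d x_j

def cofactor(M, i, j):
    """(-1)^(i+j) times the determinant of M with row i and column j removed."""
    sub = sp.Matrix(M)
    sub.row_del(i)
    sub.col_del(j)
    return (-1) ** (i + j) * sub.det()

for k in range(3):
    row_div = sum(sp.diff(cofactor(Df, k, j), xs[j]) for j in range(3))
    print(k + 1, sp.expand(row_div))                        # each row divergence expands to 0
```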

Asaf Shachar

2 Answers

0

Here is how the cofactor lemma follows from your identity. I'm not giving all details as it would be a bit messy, but I hope you can reconstruct a more pedantic proof from this sketch if you want one. Let me know if you want more details or an example ($3\times3$?) to show it in action.

Let's look at the entry $\operatorname{cof}(Du)_{ij}$ of the cofactor matrix. Up to sign it is the determinant of a submatrix of $Du$, i.e. a signed sum of products of factors $\partial_au_b$, where $a\neq j$ and $b\neq i$. When you differentiate such a product w.r.t. $x_j$, you can use the product rule. All the resulting terms look similar: each contains exactly one second derivative $\partial_j\partial_au_b$, and the remaining factors of the product are unchanged.

In $\sum_j\partial_j\operatorname{cof}(Du)_{ij}$ you get each such term twice: the term containing $\partial_j\partial_au_b$ comes both from $\partial_j\operatorname{cof}(Du)_{ij}$ (where the cofactor contributed the factor $\partial_au_b$) and from $\partial_a\operatorname{cof}(Du)_{ia}$ (where it contributed $\partial_ju_b$). These two contributions come with opposite signs. The sum $\sum_j\partial_j\operatorname{cof}(Du)_{ij}$ contains no repeated derivatives like $\partial_j^2$, so you can organize the sum into these pairs. Each pair vanishes because the second-order partials of $u_b$ commute for any $b\neq i$, and therefore the entire sum vanishes.
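For concreteness, here is how the pairing looks in the $3\times 3$ case for the first row ($i=1$), writing $\partial_a$ for $\partial/\partial x_a$:

$$\operatorname{cof}(Du)_{11}=\partial_2u_2\,\partial_3u_3-\partial_3u_2\,\partial_2u_3,\qquad \operatorname{cof}(Du)_{12}=-\left(\partial_1u_2\,\partial_3u_3-\partial_3u_2\,\partial_1u_3\right),\qquad \operatorname{cof}(Du)_{13}=\partial_1u_2\,\partial_2u_3-\partial_2u_2\,\partial_1u_3.$$

Expanding $\sum_{j=1}^3\partial_j\operatorname{cof}(Du)_{1j}$ by the product rule gives twelve terms. For instance, $\partial_3u_3\,\partial_1\partial_2u_2$ appears with a $+$ sign in $\partial_1\operatorname{cof}(Du)_{11}$ and with a $-$ sign in $\partial_2\operatorname{cof}(Du)_{12}$, so the pair cancels precisely because $\partial_1\partial_2u_2=\partial_2\partial_1u_2$. The remaining ten terms cancel in pairs in the same way.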

  • Thanks. I am not sure I see why the signs are opposite... Is there a chance you can elaborate on this? – Asaf Shachar May 20 '17 at 11:53
  • @AsafShachar I don't have the time for more elaboration on this now, but the idea is that each term in the sum is the product of $n$ (dimension) terms. The two similar terms differ only by a permutation of the relevant indices ($j,a$), whence the sign difference, given the determinantal structure. I can write more later if you want. – Joonas Ilmavirta May 21 '17 at 07:55
0

Terry Tao has answered the question on MO. There are no non-trivial universal identities.

Asaf Shachar