
I see via wiki that: $$\begin{aligned} \nabla \cdot(\mathbf{A}+\mathbf{B}) &=\nabla \cdot \mathbf{A}+\nabla \cdot \mathbf{B} \\ \nabla \times(\mathbf{A}+\mathbf{B}) &=\nabla \times \mathbf{A}+\nabla \times \mathbf{B} \end{aligned}$$

But... Does curl distribute like this?

$$\nabla \times(\mathbf{A} \cdot \mathbf{B}) =\nabla \times \mathbf{A} \cdot \nabla \times \mathbf{B}$$

Edit: Was told the above doesn't make sense. Here's another try? $$\nabla \times (\vec{A} \cdot \nabla \vec{A}) \stackrel{?}{=} (\nabla\times\vec{A}) \cdot (\nabla \times \nabla\vec{A}) $$

    The dot product produces a scalar, so the left-hand side of the “identity” doesn’t make sense. – Clayton Jan 11 '22 at 02:20
  • @Clayton I added the example that I'm actually trying to expand. Does this work? I think no as well, because it still results in a scalar right? – user267298 Jan 11 '22 at 02:25
  • Did you mean this by chance? https://proofwiki.org/wiki/Divergence_of_Vector_Cross_Product – Golden_Ratio Jan 11 '22 at 05:16

1 Answer


You should properly bracket $A\cdot\nabla A\let\del\partial$ as $(A\cdot\nabla)A$, at least until you know what you're doing. The operator $A\cdot \nabla$ is a "scalar operator", i.e. it acts in the same way on each coordinate of the vector field $A$ it is differentiating. In components, we have $$ \big((A\cdot\nabla)A\big)_i = (A\cdot\nabla) A_i= \sum_jA_j \del_jA_i.$$ One skips writing $\sum_j$ for repeated indices; this is called Einstein summation.
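(Not part of the original answer: the componentwise action above is easy to check with a computer algebra system. Here is a quick SymPy sketch; the field $A=(xy,\ yz,\ zx)$ is an arbitrary choice for illustration.)

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
coords = (x, y, z)

# An arbitrary sample field for illustration
A = [x*y, y*z, z*x]

# ((A·∇)A)_i = sum_j A_j ∂_j A_i : the scalar operator A·∇ acts on each component A_i
adv_A = [sum(A[j] * sp.diff(A[i], coords[j]) for j in range(3))
         for i in range(3)]
print([sp.expand(c) for c in adv_A])
```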

Also, it is not clear what $\nabla \times \nabla A$ is in your guess; note that $\nabla A$ is a matrix. Regardless, your guess cannot be right because (either $\nabla\times\nabla = 0$ or) it doesn't have the right number of derivatives, which can be seen by a scaling argument: suppose it were true for all vector fields $A(x)$. Then it must also be true for $A_\lambda(x) := A(\lambda x)$, for all $\lambda>0$. But the chain rule applied to your guess gives $$RHS =\lambda^3\, (\nabla \times A \cdot \nabla \times \nabla A)(\lambda x)$$ and $$ LHS = \lambda\, \nabla \times \big(((A \cdot\nabla) A)(\lambda x)\big) = \lambda^2\, \big(\nabla \times ((A \cdot\nabla) A)\big)(\lambda x), $$ which is a contradiction for $\lambda\neq 1$.
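(The scaling behaviour used here, e.g. $\nabla\times A_\lambda = \lambda\,(\nabla\times A)(\lambda x)$ since each derivative hitting $A(\lambda x)$ brings down one factor of $\lambda$, can itself be verified symbolically. A SymPy sketch with an arbitrarily chosen sample field:)

```python
import sympy as sp

x, y, z, lam = sp.symbols('x y z lam', positive=True)
coords = (x, y, z)

def curl(F):
    return [sp.diff(F[2], y) - sp.diff(F[1], z),
            sp.diff(F[0], z) - sp.diff(F[2], x),
            sp.diff(F[1], x) - sp.diff(F[0], y)]

# Arbitrary sample field for illustration
A = [x*y, y*z, z*x]

# A_lambda(x) := A(lam * x)
A_lam = [a.subs({x: lam*x, y: lam*y, z: lam*z}) for a in A]

# Claim: curl(A_lambda) = lam * (curl A)(lam * x)
scaled = [lam * c.subs({x: lam*x, y: lam*y, z: lam*z}) for c in curl(A)]
assert all(sp.simplify(u - v) == 0 for u, v in zip(curl(A_lam), scaled))
```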

The correct answer, as far as I know, is not nicely expressed in vector notation. Here it is with Einstein summation, writing the curl with the Levi-Civita symbol:

$$ \big(\nabla\times ((A\cdot \nabla) B)\big)_i=\epsilon_{ijk}\del_j (A_l\del_lB_k)=\epsilon_{ijk}(\del_jA_l)(\del_l B_k) + \epsilon_{ijk}A_l\del_j\del_lB_k. $$ The second term is $\big((A\cdot\nabla)( \nabla \times B)\big)_i$, but the first term isn't so nice. One can write it as

$$\epsilon_{ijk}(\del_jA_l)(\del_l B_k) = \bigg(\sum_{l=1}^3 \nabla A_l \times \del_l B\bigg)_i$$ (I write the sum for emphasis; note that $\nabla A_l$ and $\del_l B$ are vector fields, so the expression makes sense) or $$\epsilon_{ijk}(\del_jA_l)(\del_l B_k) = \sum_{j=1}^3 \sum_{k=1}^3 \epsilon_{ijk} (\del_j A\cdot\nabla) B_k$$

but I don't think these are better than just writing out the indices.
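(For what it's worth, the index identity above can be machine-checked. The following SymPy sketch verifies $\nabla\times((A\cdot\nabla)B) = \sum_l \nabla A_l\times\partial_l B + (A\cdot\nabla)(\nabla\times B)$ componentwise, for a pair of arbitrarily chosen fields.)

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
coords = (x, y, z)

def curl(F):
    return [sp.diff(F[2], y) - sp.diff(F[1], z),
            sp.diff(F[0], z) - sp.diff(F[2], x),
            sp.diff(F[1], x) - sp.diff(F[0], y)]

def cross(U, V):
    return [U[1]*V[2] - U[2]*V[1],
            U[2]*V[0] - U[0]*V[2],
            U[0]*V[1] - U[1]*V[0]]

def adv(A, B):
    # (A·∇)B componentwise: sum_j A_j ∂_j B_i
    return [sum(A[j]*sp.diff(B[i], coords[j]) for j in range(3))
            for i in range(3)]

def grad(f):
    return [sp.diff(f, c) for c in coords]

# Arbitrary sample fields for the check
A = [x*y, y*z, z*x]
B = [sp.sin(x)*z, x*y**2, sp.cos(y)*x]

lhs = curl(adv(A, B))

# sum_l ∇A_l × ∂_l B
term1 = [0, 0, 0]
for l in range(3):
    cr = cross(grad(A[l]), [sp.diff(B[i], coords[l]) for i in range(3)])
    term1 = [term1[i] + cr[i] for i in range(3)]

# (A·∇)(∇×B)
term2 = adv(A, curl(B))

rhs = [term1[i] + term2[i] for i in range(3)]
assert all(sp.simplify(lhs[i] - rhs[i]) == 0 for i in range(3))
```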

Calvin Khor