
In a book on continuum mechanics I read that the balance of rotational momentum follows from the principle of virtual work when $\delta \boldsymbol{r} = \boldsymbol{\delta \varphi} \times \boldsymbol{r}, \; \boldsymbol{\delta \varphi} = \boldsymbol{\mathsf{const}}$ (here $\boldsymbol{r}$ is the location vector, $\delta \boldsymbol{r}$ is its variation, and $\boldsymbol{\delta \varphi}$ is not itself a variation; it is merely written with a “$\delta$”, apparently because it is small enough to make $\delta \boldsymbol{r}$ infinitesimal). Then, without any explanation, the book states $\boldsymbol{\nabla} \delta \boldsymbol{r} = - \boldsymbol{E} \times \boldsymbol{\delta \varphi}$. I know that $\boldsymbol{E}$ is the bivalent “metric unit identity” tensor (the one which is neutral with respect to the dot product), and that $\boldsymbol{\nabla} \boldsymbol{r} = \boldsymbol{E}$. I also know that $\boldsymbol{a} \times \boldsymbol{E} = \boldsymbol{E} \times \boldsymbol{a} \:\: \forall\boldsymbol{a}$, with no minus sign there. To get a minus sign, transposition is needed: $\left( \boldsymbol{E} \times \boldsymbol{\delta \varphi} \right)^{\mathsf{T}} \! = - \boldsymbol{E} \times \boldsymbol{\delta \varphi}$. So I cannot see why $\boldsymbol{\nabla} \delta \boldsymbol{r} = - \boldsymbol{E} \times \boldsymbol{\delta \varphi}$ carries a minus sign.
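
To make the signs concrete, here is how I read these identities in components (a sketch in an orthonormal basis $\boldsymbol{e}_k$, with $\left( \boldsymbol{u} \times \boldsymbol{v} \right)_j = \epsilon_{jkl} \, u_k v_l$ and the cross product acting on the vector of the dyad nearest to it):

\begin{gather*}
\boldsymbol{E} \times \boldsymbol{a}
= \boldsymbol{e}_k \bigl( \boldsymbol{e}_k \times \boldsymbol{a} \bigr)
= \epsilon_{jkl} \, a_l \, \boldsymbol{e}_k \boldsymbol{e}_j ,
\qquad
\boldsymbol{a} \times \boldsymbol{E}
= \bigl( \boldsymbol{a} \times \boldsymbol{e}_j \bigr) \boldsymbol{e}_j
= \epsilon_{klj} \, a_l \, \boldsymbol{e}_k \boldsymbol{e}_j
= \epsilon_{jkl} \, a_l \, \boldsymbol{e}_k \boldsymbol{e}_j ,
\\
\bigl( \boldsymbol{E} \times \boldsymbol{a} \bigr)^{\mathsf{T}}
= \epsilon_{jkl} \, a_l \, \boldsymbol{e}_j \boldsymbol{e}_k
= \epsilon_{kjl} \, a_l \, \boldsymbol{e}_k \boldsymbol{e}_j
= - \, \epsilon_{jkl} \, a_l \, \boldsymbol{e}_k \boldsymbol{e}_j
= - \, \boldsymbol{E} \times \boldsymbol{a} ,
\end{gather*}

so both products give one and the same antisymmetric tensor, and only transposition flips its sign.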

For constant $\boldsymbol{\delta \varphi}$, $\boldsymbol{\nabla} \boldsymbol{\delta \varphi} = {^2\boldsymbol{0}}$ (the bivalent zero tensor). Isn’t it true that $\boldsymbol{\nabla} \! \left( \boldsymbol{\delta \varphi} \times \boldsymbol{r} \right) = \boldsymbol{\delta \varphi} \times \boldsymbol{\nabla} \boldsymbol{r} = \boldsymbol{\delta \varphi} \times \boldsymbol{E} = \boldsymbol{E} \times \boldsymbol{\delta \varphi}$? Searching for the gradient of a cross product of two vectors turns up the gradient of a dot product, the divergence ($\boldsymbol{\nabla} \cdot$) of a cross product, and many other relations, but no gradient of a cross product, $\boldsymbol{\nabla} \! \left( \boldsymbol{a} \times \boldsymbol{b} \right) = \ldots$ Is it impossible or unknown how to find it, at least for the case when the first vector is constant?
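
For what it’s worth, a quick numerical check does reproduce the book’s minus sign, so my question is really about how to see it symbolically. Here is the sketch I used (with `numpy`, assuming Cartesian coordinates and the component layout $\left( \boldsymbol{\nabla} \boldsymbol{b} \right)_{ij} = \partial_i b_j$, the first index coming from $\boldsymbol{\nabla}$; the names `cross_tensor_vector` and `grad` are just mine):

```python
import numpy as np

# Levi-Civita symbol eps[j, k, l]
eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0

def cross_tensor_vector(T, a):
    """(T x a)_ij = T_ik eps_jkl a_l : the cross acts on the last leg of T."""
    return np.einsum('ik,jkl,l->ij', T, eps, a)

def grad(f, x, h=1e-6):
    """(grad f)_ij = d f_j / d x_i, by central differences in Cartesian coordinates."""
    G = np.zeros((3, 3))
    for i in range(3):
        d = np.zeros(3)
        d[i] = h
        G[i, :] = (f(x + d) - f(x - d)) / (2.0 * h)
    return G

dphi = np.array([0.3, -1.1, 0.7])        # the constant "delta phi"
delta_r = lambda x: np.cross(dphi, x)    # delta r = delta phi x r
E = np.eye(3)                            # the unit tensor E
x0 = np.array([1.2, -0.5, 2.0])          # an arbitrary point

print(np.allclose(grad(delta_r, x0),  cross_tensor_vector(E, dphi)))   # False
print(np.allclose(grad(delta_r, x0), -cross_tensor_vector(E, dphi)))   # True: the book's sign
```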

update

As “gradient” I mean the tensor product with “nabla” $\boldsymbol{\nabla}$: $\operatorname{^{+1}grad} \boldsymbol{A} \equiv \boldsymbol{\nabla} \! \boldsymbol{A}$, where $\boldsymbol{A}$ may be a tensor of any valence (and I don’t use “$\otimes$” or any other symbol for the tensor product). Nabla (the differential Hamilton operator) is $\boldsymbol{\nabla} \equiv (\sum_i)\, \boldsymbol{r}^i \partial_i$, with $\:(\sum_i)\, \boldsymbol{r}^i \boldsymbol{r}_i = \boldsymbol{E} \,\Leftrightarrow\, \boldsymbol{r}^i \cdot \boldsymbol{r}_j = \delta^{i}_{j}$ (Kronecker’s delta), $\,\boldsymbol{r}_i \equiv \partial_i \boldsymbol{r}$ (basis vectors), $\,\partial_i \equiv \frac{\partial}{\partial q^i}$, where $\:\boldsymbol{r}(q^i)$ is the location vector and $q^i$ $(i = 1, 2, 3)$ are the coordinates.
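
A quick example in this notation, which is also the identity $\boldsymbol{\nabla} \boldsymbol{r} = \boldsymbol{E}$ used above:

$$ \boldsymbol{\nabla} \boldsymbol{r} = (\sum_i)\, \boldsymbol{r}^i \partial_i \boldsymbol{r} = (\sum_i)\, \boldsymbol{r}^i \boldsymbol{r}_i = \boldsymbol{E} $$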

  • The gradient of the cross product is not defined (usually). The gradient is only defined for scalar-valued functions – rubikscube09 May 05 '19 at 13:47
  • @rubikscube09 As “gradient” I mean tensor product with nabla $\boldsymbol{\nabla}$: $\operatorname{grad} \boldsymbol{A} \equiv \boldsymbol{\nabla} \! \boldsymbol{A}$, here $\boldsymbol{A}$ may be tensor of any valency. Nabla (differential Hamilton’s operator) is $\boldsymbol{\nabla} \equiv \boldsymbol{r}^i \partial_i$ – Douglas Mencken May 05 '19 at 13:58
  • I see. Perhaps you are referring to this : https://en.wikipedia.org/wiki/Tensor_derivative_(continuum_mechanics) ? – rubikscube09 May 05 '19 at 14:02
  • @rubikscube09 Pretty much like that (I use $\boldsymbol{r}^i$ where that article uses $\boldsymbol{g}^i$). But again, I see nothing about the symbolic calculation of $\boldsymbol{\nabla} \! \left( \boldsymbol{a} \times \boldsymbol{b} \right)$ in that article. – Douglas Mencken May 05 '19 at 14:16

1 Answer


Well, it’s easy to find such a gradient. You have already mentioned almost everything you need for it, except the following:

  • the anticommutativity ${\; \boldsymbol{p} \times \boldsymbol{q} = - \, \boldsymbol{q} \times \boldsymbol{p} \;}$ for any two vectors ${\; \boldsymbol{p} \;}$ and ${\; \boldsymbol{q}}$

  • a partial derivative of any vector with respect to a scalar (such as a coordinate) is not some more complex tensor; it is a vector too

  • for differentiating a “$\circ$”-product of two factors, the famous “product rule” https://en.wikipedia.org/wiki/Product_rule applies: \begin{equation*} \displaystyle \frac{\partial}{\partial q^i} \bigl( u \circ v \bigr) = \biggl( \frac{\partial}{\partial q^i} \, u \biggr) \! \circ v \, + \, u \circ \! \biggl( \frac{\partial}{\partial q^i} \, v \biggr) \end{equation*}

So we have

\begin{gather*}
\boldsymbol{\nabla} \bigl( \boldsymbol{a} \times \boldsymbol{b} \bigr)
= \boldsymbol{r}^i \partial_i \bigl( \boldsymbol{a} \times \boldsymbol{b} \bigr)
= \boldsymbol{r}^i \bigl( \partial_i \boldsymbol{a} \times \boldsymbol{b} + \boldsymbol{a} \times \partial_i \boldsymbol{b} \bigr)
= \boldsymbol{r}^i \bigl( \partial_i \boldsymbol{a} \times \boldsymbol{b} - \partial_i \boldsymbol{b} \times \boldsymbol{a} \bigr)
\\
= \boldsymbol{r}^i \partial_i \boldsymbol{a} \times \boldsymbol{b} \, - \, \boldsymbol{r}^i \partial_i \boldsymbol{b} \times \boldsymbol{a}
= \boldsymbol{\nabla} \boldsymbol{a} \times \boldsymbol{b} \, - \, \boldsymbol{\nabla} \boldsymbol{b} \times \boldsymbol{a}
\end{gather*}

The minus sign in front of the second term appears because we have to swap the factors of the cross product so that, in each term, the differentiated factor stands right after $\boldsymbol{r}^i$ and the full $\boldsymbol{\nabla}$ can be assembled.
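
This general identity is easy to spot-check numerically. Below is a sketch with `numpy`, assuming Cartesian coordinates, the component layout $\left( \boldsymbol{\nabla} \boldsymbol{a} \right)_{ij} = \partial_i a_j$, and $\left( \boldsymbol{T} \times \boldsymbol{c} \right)_{ij} = T_{ik} \, \epsilon_{jkl} \, c_l$ for the cross product of a bivalent tensor with a vector; the two vector fields `a` and `b` are arbitrary examples chosen only for the test:

```python
import numpy as np

# Levi-Civita symbol eps[j, k, l]
eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0

def cross_tv(T, c):
    """(T x c)_ij = T_ik eps_jkl c_l : cross the last leg of T with c."""
    return np.einsum('ik,jkl,l->ij', T, eps, c)

def grad(f, x, h=1e-5):
    """(grad f)_ij = d f_j / d x_i, by central differences in Cartesian coordinates."""
    G = np.zeros((3, 3))
    for i in range(3):
        d = np.zeros(3)
        d[i] = h
        G[i, :] = (f(x + d) - f(x - d)) / (2.0 * h)
    return G

# two arbitrary non-constant vector fields, used only for the check
a = lambda x: np.array([x[0] * x[1], np.sin(x[2]), x[0]**2 - x[2]])
b = lambda x: np.array([np.cos(x[0]), x[1] * x[2], x[0] + x[1]])
a_cross_b = lambda x: np.cross(a(x), b(x))

x0 = np.array([0.4, -1.3, 2.1])
lhs = grad(a_cross_b, x0)
rhs = cross_tv(grad(a, x0), b(x0)) - cross_tv(grad(b, x0), a(x0))
print(np.allclose(lhs, rhs, atol=1e-7))   # True
```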

When the first factor, say $\boldsymbol{\phi}$, is constant in space, that is, it stays the same while the coordinates change, this becomes

$$ {\boldsymbol{\nabla} \bigl( \boldsymbol{\phi} \times \boldsymbol{b} \bigr)} = {\boldsymbol{\nabla} \!\, \boldsymbol{\phi} \times \boldsymbol{b} - \! \boldsymbol{\nabla} \boldsymbol{b} \times \boldsymbol{\phi}} = {{^2{\boldsymbol{0}}} \times \boldsymbol{b} - \! \boldsymbol{\nabla} \boldsymbol{b} \times \boldsymbol{\phi}} = {{^2{\boldsymbol{0}}} - \! \boldsymbol{\nabla} \boldsymbol{b} \times \boldsymbol{\phi}} = {- \boldsymbol{\nabla} \boldsymbol{b} \times \boldsymbol{\phi}} $$

That’s why there’s a minus sign.
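
The same minus sign can also be seen directly in components (a sketch in a Cartesian orthonormal basis $\boldsymbol{e}_i$ with $\boldsymbol{r} = x_l \boldsymbol{e}_l$, $\left( \boldsymbol{u} \times \boldsymbol{v} \right)_j = \epsilon_{jkl} \, u_k v_l$, and the cross product acting on the last vector of each dyad):

\begin{gather*}
\boldsymbol{\nabla} \bigl( \boldsymbol{\phi} \times \boldsymbol{r} \bigr)
= \boldsymbol{e}_i \, \partial_i \bigl( \epsilon_{jkl} \, \phi_k \, x_l \bigr) \, \boldsymbol{e}_j
= \epsilon_{jki} \, \phi_k \, \boldsymbol{e}_i \boldsymbol{e}_j
= - \, \epsilon_{jik} \, \phi_k \, \boldsymbol{e}_i \boldsymbol{e}_j
= - \, \boldsymbol{e}_i \bigl( \boldsymbol{e}_i \times \boldsymbol{\phi} \bigr)
= - \, \boldsymbol{E} \times \boldsymbol{\phi}
\end{gather*}

With $\boldsymbol{b} = \boldsymbol{r}$ and $\boldsymbol{\nabla} \boldsymbol{r} = \boldsymbol{E}$ this is exactly the book’s $\boldsymbol{\nabla} \delta \boldsymbol{r} = - \boldsymbol{E} \times \boldsymbol{\delta \varphi}$.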

  • How do you define $\boldsymbol{\nabla} \boldsymbol{a} \times \boldsymbol{b}$? The first term is a rank 2 tensor (or matrix) and the last one is a vector, isn't it? –  May 22 '20 at 19:10
  • @macnguyen Through the completely antisymmetric isotropic Levi-Civita trivalent (pseudo)tensor ${^3\!\boldsymbol{\epsilon}}$: ${\boldsymbol{a}\boldsymbol{b} \times \boldsymbol{c} \equiv - \, \boldsymbol{a}\boldsymbol{b} \cdot {^3\!\boldsymbol{\epsilon}} \cdot \boldsymbol{c}}$ (spelled out in components just after this comment); more info here https://math.stackexchange.com/questions/496060/gradient-of-a-dot-product/3322522#3322522

    post scriptum: a bivalent tensor is not a matrix

    – Vadique Myself May 23 '20 at 23:52
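
Spelled out in components (a sketch assuming an orthonormal basis $\boldsymbol{e}_n$ and the convention ${^3\!\boldsymbol{\epsilon}} = \epsilon_{mnp} \, \boldsymbol{e}_m \boldsymbol{e}_n \boldsymbol{e}_p$ with the ordinary permutation symbol), that definition gives

$$ \boldsymbol{a}\boldsymbol{b} \times \boldsymbol{c} = - \, \boldsymbol{a}\boldsymbol{b} \cdot {^3\!\boldsymbol{\epsilon}} \cdot \boldsymbol{c} = - \, \epsilon_{mnp} \, b_m c_p \, \boldsymbol{a} \boldsymbol{e}_n = \boldsymbol{a} \left( \boldsymbol{b} \times \boldsymbol{c} \right) , $$

so $\boldsymbol{\nabla} \boldsymbol{a} \times \boldsymbol{b}$ is again a bivalent tensor: the cross product acts on the last vector of each dyad of $\boldsymbol{\nabla} \boldsymbol{a}$, while the vector coming from $\boldsymbol{\nabla}$ stays untouched.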