In my current research, a curious problem arose. I will elaborate.
I have a Volume function, which is given by: $$ V(u,v,w) = \sum_{i=1}^n N_i(u)\,M_i(v)\,L_i(w)\,\omega_i P_i \text{,} $$
where $N_i(u)$, $M_i(v)$, $L_i(w)$ are B-Spline basis functions (polynomials, really), $\omega_i$ are constant weights and $P_i$ are control points, which are also constant, in $x,y,z$ coordinates. This means that $V(u,v,w)$ is a mapping from parameter space $(u,v,w)$ to physical space $(x,y,z)$.
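To make the setup concrete, here is a minimal sketch of such a mapping in Python. Everything in it is made up for illustration: degree-2 Bernstein polynomials stand in for the B-spline bases $N_i$, $M_i$, $L_i$ (a Bézier patch is the single-span special case of a B-spline), and the control grid `P` and weights `W` are arbitrary.

```python
import numpy as np
from math import comb

# Degree-2 Bernstein polynomials as stand-ins for the basis functions
# N_i(u), M_i(v), L_i(w); a Bezier volume is the single-span B-spline case.
def bernstein(k, n, t):
    return comb(n, k) * t**k * (1 - t)**(n - k)

deg = 2
rng = np.random.default_rng(0)
P = rng.random((deg + 1, deg + 1, deg + 1, 3))  # control points P_i in (x,y,z)
W = np.ones((deg + 1, deg + 1, deg + 1))        # weights omega_i

def V(u, v, w):
    """V(u,v,w) = sum_i N_i(u) M_i(v) L_i(w) omega_i P_i, as in the question."""
    out = np.zeros(3)
    for a in range(deg + 1):
        for b in range(deg + 1):
            for c in range(deg + 1):
                R = bernstein(a, deg, u) * bernstein(b, deg, v) * bernstein(c, deg, w)
                out += R * W[a, b, c] * P[a, b, c]
    return out
```

At the parameter-box corners the basis is interpolatory, so e.g. `V(0, 0, 0)` returns the corner control point `P[0, 0, 0]`.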
I'm working with differential equations that use this mapping's image as their domain. More specifically, I'm working with elastostatics, so I need all the second derivatives of the displacement: $\frac{\partial ^2 V}{\partial x^2}$, $\frac{\partial ^2 V}{\partial x \partial y}$, etc. EDIT: Please consider this as an $f(V(u,v,w)) = (f_x, f_y, f_z)$.
This is where things get complicated, so let's get to it: differentiating $V(u,v,w)$ with respect to $x$. To avoid excessive chain rules, we will write the tensor product $N_i(u)M_i(v)L_i(w) = R_i(u,v,w)$: $$\frac{\partial V}{\partial x} = \frac{\partial}{\partial x} \sum_{i=1}^n R_i(u,v,w)\,\omega_i P_i \,,$$ and, after applying the chain rule and rearranging the constants and the summation, $$\frac{\partial V}{\partial x} = \sum_{i=1}^n \omega_i P_i \left[\frac{\partial R_i}{\partial u}\frac{\partial u}{\partial x} + \frac{\partial R_i}{\partial v}\frac{\partial v}{\partial x} + \frac{\partial R_i}{\partial w}\frac{\partial w}{\partial x} \right] .$$
To compute this, we need the partial derivatives with respect to $u,v,w$, which we have, as the tensor-product splines have well-defined derivatives. We also need the differentials $\frac{\partial u}{\partial x}$, $\frac{\partial v}{\partial x}$ and $\frac{\partial w}{\partial x}$, which are obtained by inverting the Jacobian matrix: $$J(V) = \begin{bmatrix} \frac{\partial V_x}{\partial u} & \frac{\partial V_x}{\partial v} & \frac{\partial V_x}{\partial w} \\ \frac{\partial V_y}{\partial u} & \frac{\partial V_y}{\partial v} & \frac{\partial V_y}{\partial w} \\ \frac{\partial V_z}{\partial u} & \frac{\partial V_z}{\partial v} & \frac{\partial V_z}{\partial w} \end{bmatrix} $$
and, by the multivariate inverse function theorem, $J^{-1}(V) = (J(V))^{-1}$, with rows indexed by $u,v,w$ and columns by $x,y,z$: $$J^{-1}(V) = \begin{bmatrix}\frac{\partial u}{\partial x} & \frac{\partial u}{\partial y} & \frac{\partial u}{\partial z} \\ \frac{\partial v}{\partial x} & \frac{\partial v}{\partial y} & \frac{\partial v}{\partial z} \\ \frac{\partial w}{\partial x} & \frac{\partial w}{\partial y} & \frac{\partial w}{\partial z} \end{bmatrix}. $$
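Numerically, this first-order step can be sanity-checked as follows. The mapping below is a made-up smooth toy (not the NURBS volume), the Jacobian is taken by central finite differences, and the gradient values `dR_duvw` are hypothetical; the point is only the bookkeeping of $J$, $J^{-1}$, and the chain rule.

```python
import numpy as np

# Toy smooth invertible mapping (u,v,w) -> (x,y,z), standing in for V.
def V(p):
    u, v, w = p
    return np.array([u + 0.1 * v**2, v + 0.1 * w**2, w + 0.1 * u**2])

def jacobian(F, p, h=1e-6):
    """Central-difference Jacobian J[i, j] = dF_i / dp_j."""
    p = np.asarray(p, float)
    J = np.zeros((3, 3))
    for j in range(3):
        e = np.zeros(3)
        e[j] = h
        J[:, j] = (F(p + e) - F(p - e)) / (2 * h)
    return J

p0 = np.array([0.3, 0.5, 0.7])
J = jacobian(V, p0)        # rows: x,y,z; columns: u,v,w
Jinv = np.linalg.inv(J)    # rows: u,v,w; columns: x,y,z, so Jinv[0,0] = du/dx

# Chain rule: dR/dx_j = sum over a in {u,v,w} of (dR/da)(da/dx_j),
# i.e. a parameter-space gradient row times Jinv.
dR_duvw = np.array([1.0, 2.0, 3.0])   # hypothetical (dR/du, dR/dv, dR/dw)
dR_dxyz = dR_duvw @ Jinv
```

The check `J @ Jinv` recovering the identity confirms the inversion is consistent with the finite-difference Jacobian.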
So far, so good. Everything is obtainable. The problem starts when we go to second-order derivatives: $$\frac{\partial ^2 V}{\partial x^2} = \sum_{i=1}^n \omega_i P_i \frac{\partial}{\partial x} \left[\frac{\partial R_i}{\partial u}\frac{\partial u}{\partial x} + \frac{\partial R_i}{\partial v}\frac{\partial v}{\partial x} + \frac{\partial R_i}{\partial w}\frac{\partial w}{\partial x} \right] ,$$ which, barring any chain-rule mistakes, becomes: $$ \begin{split} \frac{\partial ^2 V}{\partial x^2} & = \sum_{i=1}^n \omega_i P_i \left\{ \left[ \left( \frac{\partial ^2 R_i}{\partial u^2}\frac{\partial u}{\partial x} +\frac{\partial ^2 R_i}{\partial u \partial v} \frac{\partial v}{\partial x} + \frac{\partial ^2 R_i}{\partial u \partial w}\frac{\partial w}{\partial x}\right) \frac{\partial u}{\partial x} + \frac{\partial R_i}{\partial u}\frac{\partial ^2 u}{\partial x^2} \right] + \ldots \\ \ldots + \left[ \left( \frac{\partial ^2 R_i}{\partial u \partial v}\frac{\partial u}{\partial x} +\frac{\partial ^2 R_i}{\partial v^2} \frac{\partial v}{\partial x} + \frac{\partial ^2 R_i}{\partial v \partial w}\frac{\partial w}{\partial x}\right)\frac{\partial v}{\partial x} + \frac{\partial R_i}{\partial v}\frac{\partial ^2 v}{\partial x^2} \right] + \ldots \\ \ldots + \left[ \left( \frac{\partial ^2 R_i}{\partial w \partial u}\frac{\partial u}{\partial x} +\frac{\partial ^2 R_i}{\partial w \partial v} \frac{\partial v}{\partial x} + \frac{\partial ^2 R_i}{\partial w^2}\frac{\partial w}{\partial x} \right)\frac{\partial w}{\partial x} + \frac{\partial R_i}{\partial w}\frac{\partial ^2 w}{\partial x^2} \right] \right\}. \end{split} $$
And that's where the problem lies: how can I obtain $\frac{\partial ^2 u}{\partial x^2}$? When we evaluate $\frac{\partial ^2 V}{\partial x \partial y}$, differentials of the kind $\frac{\partial ^2 u}{\partial x \partial y}$ also appear.
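For reference, one standard identity that seems relevant here (stated with the convention that row $u$, column $x$ of $J^{-1}$ holds $\frac{\partial u}{\partial x}$): differentiating $J^{-1}J = I$ with respect to $x$ gives $$\frac{\partial J^{-1}}{\partial x} = -\,J^{-1}\,\frac{\partial J}{\partial x}\,J^{-1}\, , \qquad \frac{\partial J}{\partial x} = \frac{\partial J}{\partial u}\frac{\partial u}{\partial x} + \frac{\partial J}{\partial v}\frac{\partial v}{\partial x} + \frac{\partial J}{\partial w}\frac{\partial w}{\partial x}\, ,$$ where $\frac{\partial J}{\partial u}$, $\frac{\partial J}{\partial v}$, $\frac{\partial J}{\partial w}$ contain only second parameter-space derivatives of $V$, which the splines provide; the $(u,x)$ entry of $\frac{\partial J^{-1}}{\partial x}$ is then $\frac{\partial ^2 u}{\partial x^2}$, and the other entries give the cross terms.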
Is inverting the Hessian matrix the way to approach this? It feels wrong to me, as I can't fathom how the cross derivatives would behave in this inversion. I'm currently considering two approaches:
The first is a "scalar" route, going coordinate by coordinate with this procedure: denoting $P_i = (P_{ix}, P_{iy}, P_{iz})$ and $V(u,v,w) = (x, y, z)$, we start by differentiating the first component with respect to $u$: $$\frac{\partial V_x}{\partial u} = \sum_{i=1}^n \omega_i P_{ix} \frac{\partial R_i}{\partial u} = \frac{\partial x}{\partial u}\, .$$ By the inverse function theorem, we can write: $$\frac{\partial u}{\partial x} = \left[\sum_{i=1}^n \omega_i P_{ix} \frac{\partial R_i}{\partial u}\right]^{-1}.$$ Then we differentiate this expression with respect to $x$ to obtain an expression for $\frac{\partial ^2 u}{\partial x^2}$. The same procedure goes for the cross-derivatives.
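One quick numerical check a reader might run on the scalar route, using a made-up linear map so that the Jacobian is exact: compare the reciprocal $\left[\frac{\partial x}{\partial u}\right]^{-1}$ with the $(u,x)$ entry of the full matrix inverse.

```python
import numpy as np

# Toy linear map (u,v,w) -> (x,y,z) with coupling between u and v.
J = np.array([[1.0, 1.0, 0.0],    # dx/du, dx/dv, dx/dw
              [1.0, 2.0, 0.0],    # dy/du, dy/dv, dy/dw
              [0.0, 0.0, 1.0]])   # dz/du, dz/dv, dz/dw

Jinv = np.linalg.inv(J)
du_dx_matrix = Jinv[0, 0]         # du/dx from the full inverse
du_dx_scalar = 1.0 / J[0, 0]      # the scalar reciprocal (dx/du)^(-1)

# Approximately 2.0 vs 1.0: the two quantities disagree once the axes couple.
print(du_dx_matrix, du_dx_scalar)
```

For a diagonal (axis-aligned) Jacobian the two values coincide, so the toy shear is what exposes the difference.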
This is the procedure I'm working with right now, but if it works, why should I bother inverting the Jacobian in the first place? Is this right? Is there any other way to rigorously obtain these derivatives?
The second procedure I'm looking into is obtaining an analytical expression for the inverse Jacobian and then differentiating it with respect to $x$ to get an "inverse Hessian"-style matrix. The problem I'm facing with this route is that my adjugate (classical adjoint) matrix is null, which is something that is bothering me (either I'm right, and then I can't invert the Jacobian, or I'm wrong and can't do simple algebra).
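As a way to debug that algebra, here is a small sketch (on the same kind of made-up Jacobian as above, not the actual spline one) that builds the adjugate from cofactors and checks it against `np.linalg.inv`; for an invertible $J$ the adjugate can never be the zero matrix, since $\operatorname{adj}(J) = \det(J)\,J^{-1}$.

```python
import numpy as np

def adjugate(A):
    """Classical adjoint (adjugate): transpose of the cofactor matrix."""
    C = np.zeros_like(A)
    for i in range(3):
        for j in range(3):
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T

# Made-up invertible Jacobian for the check.
J = np.array([[1.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 1.0]])

adj = adjugate(J)
# For invertible J: inv(J) = adj(J) / det(J), so adj(J) vanishes entirely
# only if every 2x2 minor of J is zero, i.e. the map is badly degenerate.
```

Comparing `adj / det(J)` against the library inverse on the real Jacobian should reveal whether the hand-derived cofactors or the map itself is the problem.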
Dear Mathexchange: a penny for your thoughts?
The Jacobian, as mentioned, follows exactly the Wikipedia definition: $$ \boldsymbol{J} = \left[ \frac{\partial \boldsymbol{f} }{\partial x_1} \cdots \frac{\partial \boldsymbol{f} }{\partial x_n} \right]. $$
– Guilherme Henrique da Silva Nov 19 '19 at 16:42