
Let's say we have a second-order central difference approximation for the first derivative:

$$\frac{\partial f(x_j)}{\partial x} = \frac{f(x_j + h)-f(x_j - h)}{2h} + O(h^2)$$

Is it true that:

$$\left(\frac{\partial f(x_j)}{\partial x}\right)^2 = \left(\frac{f(x_j + h)-f(x_j - h)}{2h}\right)^2+O(h^2)$$

For what functions $g$ is it true that:

$$g\left(\frac{\partial f(x_j)}{\partial x}\right) = g\left(\frac{f(x_j + h)-f(x_j - h)}{2h}\right)+O(h^2)$$

My initial thought is that applying any nonlinear function to the finite difference will not preserve the order of accuracy, but I don't know how to prove that.
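To make this concrete, here is a quick numerical sketch of what I have been trying (my own check; the choices $f(x) = \sin x$, $x_j = 1$ and $g(y) = y^2$ are arbitrary), which estimates the observed order from successive halvings of $h$:

```python
import numpy as np

# Observed order of accuracy of g(central difference) versus g(f'(x_j)).
# Assumed test case: f(x) = sin(x), x_j = 1.0, g(y) = y**2.
f, fprime = np.sin, np.cos
g = lambda y: y ** 2
xj = 1.0

hs = 0.1 / 2.0 ** np.arange(6)
errors = []
for h in hs:
    dh = (f(xj + h) - f(xj - h)) / (2.0 * h)   # central difference quotient
    errors.append(abs(g(dh) - g(fprime(xj))))  # error after applying g

errors = np.array(errors)
print("errors:         ", errors)
print("observed orders:", np.log2(errors[:-1] / errors[1:]))  # error ratio per halving of h
```

The same script can be rerun with other choices of $g$ to probe the second question.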

S. Green

1 Answer


Here is my attempt to show that it doesn't depend on $g$:

In my opinion the solution lies in the origin of the FDM schemes. The central difference scheme is the result of subtracting the Taylor series: $$f(x_{j+1}) = f(x_j)+h\cdot f'(x_j)+h^2\cdot \frac{f''(x_j)}{2}+h^3 \cdot \frac{f'''(\xi_1)}{6}, \quad \xi_1 \in[x_j,x_{j+1}]$$

$$f(x_{j-1}) = f(x_j)-h\cdot f'(x_j)+h^2\cdot \frac{f''(x_j)}{2}-h^3 \cdot \frac{f'''(\xi_2)}{6}, \quad \xi_2 \in[x_{j-1},x_{j}]$$

$$f(x_{j+1}) - f(x_{j-1}) = 2h \cdot f'(x_j) + \frac {h^3}{6} \cdot \left(f'''(\xi_1) + f'''(\xi_2)\right)$$

If we bound the error terms from above, we can replace them with:

$$\left|\frac{f'''(\xi_1) + f'''(\xi_2)}{6}\right| \leq \frac{\max_{\xi \in [x_{j-1},x_{j+1}]} |f'''(\xi)|}{3} = C$$

Dividing by $2h$, this leads us to the final form of the central difference scheme, with $|\tilde{C}| \leq C/2$:

$$f'(x_j)=\frac {f(x_{j+1}) - f(x_{j-1})}{2h}+\tilde{C}h^2$$
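As a side check of the expansion above, one can let a CAS do the bookkeeping. A minimal SymPy sketch (assuming a generic smooth $f$); its output should show that the error of the difference quotient starts at order $h^2$:

```python
import sympy as sp

# Symbolic check that the central-difference error starts at h^2.
# Assumption: f is a generic smooth function of one variable.
x, h = sp.symbols('x h')
f = sp.Function('f')

# Taylor-expand the numerator around h = 0, then divide by 2h.
num = sp.series(f(x + h) - f(x - h), h, 0, 6).removeO()
central = sp.expand(num / (2 * h))

# Difference between the difference quotient and the exact derivative.
error = sp.simplify(central.doit() - f(x).diff(x))
print(error)   # expected leading term: h**2 * f'''(x) / 6
```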

Let us take a general case for a differential equation:

$$g(u'(x))=f(x)$$

On our finite domain $\Omega$, with our difference matrix $A$, right-hand side $F_i$ and local error $R_i$, which scales like $h^p$ ($p =$ order of convergence, $h =$ step size):

$$g(A\,u_i + R_i) = F_i$$

Supposing $g(x)$ has an inverse and our matrix $A$ is well defined (problem is well posed...):

$$A\,u_i = g^{-1}(F_i) - R_i$$

$$u_i = A^{-1}\left(g^{-1}(F_i) - R_i\right)$$
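Here is a small numerical sketch of that argument (my own construction for illustration: $g(y)=y^3$, exact solution $u(x)=\sin x$ on $[0,1]$, hence $F_i = \cos^3(x_i)$, and $A$ is the second-order first-derivative matrix with one-sided rows at the two boundaries). Instead of inverting $g$ and $A$, it simply inserts the exact $u_i$ and measures the residual $g(A\,u_i) - F_i$, which should shrink like $h^2$:

```python
import numpy as np

# Residual check for the discretised relation g(A u_i) ~ F_i.
# Assumed test case: g(y) = y**3, exact u(x) = sin(x) on [0, 1],
# so f(x) = g(u'(x)) = cos(x)**3.

def derivative_matrix(n):
    """Second-order first-derivative matrix on n+1 equidistant nodes of [0, 1]."""
    h = 1.0 / n
    A = np.zeros((n + 1, n + 1))
    A[0, :3] = [-3.0, 4.0, -1.0]               # second-order one-sided row (left)
    A[-1, -3:] = [1.0, -4.0, 3.0]              # second-order one-sided row (right)
    for j in range(1, n):
        A[j, j - 1], A[j, j + 1] = -1.0, 1.0   # central difference inside
    return A / (2.0 * h), np.linspace(0.0, 1.0, n + 1)

g = lambda y: y ** 3
u_exact = np.sin
f_rhs = lambda x: np.cos(x) ** 3

prev = None
for n in (20, 40, 80, 160):
    A, x = derivative_matrix(n)
    residual = np.max(np.abs(g(A @ u_exact(x)) - f_rhs(x)))
    order = "--" if prev is None else f"{np.log2(prev / residual):.2f}"
    print(f"n = {n:4d}  max residual = {residual:.3e}  observed order ~ {order}")
    prev = residual
```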

Here you can observe that the error remains the same as in the case of a linear $g(x) = ax+b$. I am not a mathematician, so don't quote me here, but another hint in favour of my view might be the following relation (from the Burgers equation):

$$uu_x = \frac{1}{2}\left(u^2\right)_x$$
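For completeness, that identity is just the chain rule:

$$\frac{1}{2}\left(u^2\right)_x = \frac{1}{2}\cdot 2\,u\,u_x = u\,u_x$$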

What do you think?

Pablo Jeken Rico