8

In single-variable calculus, the second-derivative test states that if $x$ is a real number such that $f'(x)=0$, then:

  1. If $f''(x)>0$, then $f$ has a local minimum at $x$.
  2. If $f''(x)<0$, then $f$ has a local maximum at $x$.
  3. If $f''(x)=0$, then the test is inconclusive.

But there's no need to despair if the second-derivative test is inconclusive, because there is the higher-order derivative test. It states that if $x$ is a real number such that $f'(x)=0$, and $n$ is the smallest natural number such that $f^{(n)}(x)\neq 0$, then:

  1. If $n$ is even and $f^{(n)}(x)>0$, then $f$ has a local minimum at $x$.
  2. If $n$ is even and $f^{(n)}(x)<0$, then $f$ has a local maximum at $x$.
  3. If $n$ is odd, then $f$ has an inflection point at $x$.
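
For instance, for $f(x)=x^4$ at $x=0$ the first nonvanishing derivative is $f^{(4)}(0)=24$, so $n=4$ and we get a local minimum. The test can be sketched in a few lines of Python (the derivative values are supplied by hand, and the helper name is my own):

```python
# A small sketch of the higher-order derivative test. The successive
# derivative values at the critical point are supplied by hand.

def higher_order_test(derivs):
    """derivs[k-1] = f^(k)(x): first, second, ... derivatives of f at x."""
    for n, d in enumerate(derivs, start=1):
        if d != 0:
            if n % 2 == 1:
                return "inflection point"
            return "local minimum" if d > 0 else "local maximum"
    return "inconclusive"

# f(x) = x**4 at x = 0: f' = f'' = f''' = 0 and f''''(0) = 24, so n = 4.
print(higher_order_test([0, 0, 0, 24]))  # -> local minimum
# f(x) = x**3 at x = 0: n = 3 is odd.
print(higher_order_test([0, 0, 6]))      # -> inflection point
```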

Similarly, in multivariable calculus the second-derivative test states that if $(x,y)$ is an ordered pair such that $\nabla f(x,y) = 0$, then:

  1. If $D(x,y)>0$ and $f_{xx}(x,y)>0$, then $f$ has a local minimum at $(x,y)$.
  2. If $D(x,y)>0$ and $f_{xx}(x,y)<0$, then $f$ has a local maximum at $(x,y)$.
  3. If $D(x,y)<0$, then $f$ has a saddle point at $(x,y)$.
  4. If $D(x,y)=0$, then the test is inconclusive.

where $D(x,y)=f_{xx}(x,y)f_{yy}(x,y)-(f_{xy}(x,y))^2$ is the determinant of the Hessian matrix of $f$ evaluated at $(x,y)$.
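
To make this concrete, here is a minimal numerical sketch of the test; the central-difference step $h$ and the tolerance used to flag the degenerate case $D\approx 0$ are my own choices, not part of the test itself:

```python
# A numerical sketch of the two-variable second-derivative test, using
# central differences with step h and a tolerance for the D = 0 case.

def second_derivative_test(f, x, y, h=1e-4, tol=1e-6):
    # Central-difference approximations to the second partials at (x, y).
    fxx = (f(x + h, y) - 2*f(x, y) + f(x - h, y)) / h**2
    fyy = (f(x, y + h) - 2*f(x, y) + f(x, y - h)) / h**2
    fxy = (f(x + h, y + h) - f(x + h, y - h)
           - f(x - h, y + h) + f(x - h, y - h)) / (4 * h**2)
    D = fxx * fyy - fxy**2
    if abs(D) < tol:
        return "inconclusive"
    if D > 0:
        return "local minimum" if fxx > 0 else "local maximum"
    return "saddle point"

print(second_derivative_test(lambda x, y: x**2 + y**2, 0, 0))  # -> local minimum
print(second_derivative_test(lambda x, y: x**2 - y**2, 0, 0))  # -> saddle point
print(second_derivative_test(lambda x, y: x**4 + y**4, 0, 0))  # -> inconclusive
```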

My question is, what do you do if this test is inconclusive? What is the analogue of the higher-order derivative test in multivariable calculus?

3 Answers

6

This webpage states and proves a version of the higher-order derivative test that applies not only to functions defined on $\mathbb{R}^2$ or $\mathbb{R}^N$, but also to functions defined on arbitrary Banach spaces. First there is this theorem:

Theorem 38 (Higher derivative test). Let $A\subseteq E$ be an open set and let $f:A\to\mathbb{R}$. Assume that $f$ is $(p-1)$ times continuously differentiable and that $D^p f(x)$ exists for some $p\ge 2$ and $x\in A$. Also assume that $f'(x)=\dots=f^{(p-1)}(x)=0$ and $f^{(p)}(x)\ne 0$. Write $h^{(p)}$ for the $p$-tuple $(h,\dots,h)$.

  1. If $f$ has an extreme value at $x$, then $p$ is even and the form $f^{(p)}(x)h^{(p)}$ is semidefinite.
  2. If there is a constant $c$ such that $f^{(p)}(x)h^{(p)}\ge c > 0$ for all $|h|=1$, then $f$ has a strict local minimum at $x$ and (1) applies.
  3. If there is a constant $c$ such that $f^{(p)}(x)h^{(p)}\le c < 0$ for all $|h|=1$, then $f$ has a strict local maximum at $x$ and (1) applies.

Then there is this corollary for the finite dimensional case, which is what we’re interested in:

Corollary 39 (Higher derivative test, finite-dimensional case). In Theorem 38, further assume that $E$ is finite-dimensional. Then $h\mapsto f^{(p)}(x)h^{(p)}$ has both a minimum and maximum value on the set $\{h\in E:|h|=1\}$, and:

  1. If the form $f^{(p)}(x)h^{(p)}$ is indefinite, then $f$ does not have an extreme value at $x$.
  2. If the form $f^{(p)}(x)h^{(p)}$ is positive definite, then $f$ has a strict local minimum at $x$.
  3. If the form $f^{(p)}(x)h^{(p)}$ is negative definite, then $f$ has a strict local maximum at $x$.

Here $f^{(p)}(x)$ denotes a tensor containing all the pure and mixed partial derivatives of $f$ of order $p$, evaluated at $x$.
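
As an illustration (sampling can only suggest definiteness, never certify it), take $f(x,y)=x^4+y^4$ at the origin: there $p=4$, and the form $f^{(4)}(0)h^{(4)}$ works out to $24(h_1^4+h_2^4)$, which the sketch below probes on the unit circle:

```python
import math

# Probing Corollary 39 for f(x,y) = x**4 + y**4 at the origin.
# Here p = 4 and the form f^(4)(0)h^(4) is 24*(h1**4 + h2**4).

def form(h1, h2):
    # p! times the degree-4 Taylor term of x**4 + y**4, evaluated at h.
    return 24 * (h1**4 + h2**4)

# Sample the form on the unit circle |h| = 1.
values = [form(math.cos(t), math.sin(t))
          for t in (2 * math.pi * k / 360 for k in range(360))]

if all(v > 0 for v in values):
    print("positive definite: strict local minimum")
elif all(v < 0 for v in values):
    print("negative definite: strict local maximum")
elif any(v > 0 for v in values) and any(v < 0 for v in values):
    print("indefinite: no extreme value")
else:
    print("semidefinite: inconclusive by Corollary 39")
```

For this $f$ the first branch fires, since $\cos^4 t+\sin^4 t\ge \tfrac12$ on the whole circle; but note that a finite sample can never rule out a sign change between sample points, which is exactly the practical difficulty discussed in the comments below.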

  • It seems like a general theorem that is not useful as a practical test when the Hessian test fails. – user Aug 02 '18 at 16:45
  • @gimusi Why isn’t it useful? It gives a precise procedure you can feed into a computer to determine whether a point is a local maximum or minimum. – Keshav Srinivasan Aug 02 '18 at 18:53
  • Sorry, I thought you were looking for a test similar to the Hessian test to check for max, min or saddle also by hand calculation. The possibility of numerical algorithms was already noticed by Robert Israel. Then if you are looking for that, your problem is now solved. Bye – user Aug 02 '18 at 19:49
  • @gimusi I'm not interested in numerical algorithms at all. I was looking for something that works when the Hessian test fails, just like you can use the higher-order derivative test if the second derivative test fails in single-variable calculus. In what sense do you think Corollary 39 is not useful when the Hessian test fails? – Keshav Srinivasan Aug 02 '18 at 20:10
  • How is this not similar to the Hessian test? – Keshav Srinivasan Aug 03 '18 at 01:25
  • @KeshavSrinivasan I know this thread is old but did you ever try this out to see if it worked? – Caleb Williams Oct 20 '20 at 15:21
  • The link in your solution appears to be broken. In addition, am I correct in understanding that there is a typo in your solution? Namely, is the $p$-tuple $h$ defined properly? – Michael Levy Nov 15 '22 at 00:53
  • @Keshav Srinivasan, Hi. You indicate to "Write $h(p)$ for the $p$-tuple $(h,…,h)$;" however, this is not defined anywhere. Meanwhile the link is broken. Your solution would be made better if you amend it to define $h(p)$ – Michael Levy Nov 24 '22 at 15:58
  • @MichaelLevy Here’s an archived version of the link: http://web.archive.org/web/20130326111714/https://wj32.org/wp/2013/02/25/differentiation-done-correctly-5-maxima-and-minima/ In any case h is an arbitrary real number. And we’re concerned with what is and is not true for all values of h, or all values of h with absolute value less than or equal to 1. – Keshav Srinivasan Nov 25 '22 at 22:02
2

Consider a homogeneous polynomial $f(x_1, \ldots, x_n)$ of total degree $d > 0$ in $n$ variables. In order to tell that $(0,\ldots,0)$ is a local minimum, we would need to know that $f(x_1,\ldots,x_n) \ge 0$ for all $x_1,\ldots,x_n$. Unfortunately this is a difficult problem in general, and I'm pretty sure there are no very simple tests, although there are algorithms related to Hilbert's 17th problem.
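
To see why this is hard, a hedged numerical sketch: the Motzkin form $x^4y^2 + x^2y^4 + z^6 - 3x^2y^2z^2$ is nonnegative everywhere (by AM–GM on its first three terms) yet is famously not a sum of squares, so the most obvious algebraic certificate of nonnegativity fails; random sampling gives evidence, never a proof:

```python
import random

# The Motzkin form: nonnegative on all of R^3 by AM-GM, yet not a sum
# of squares -- a classic illustration of why certifying nonnegativity
# of a homogeneous polynomial is hard in general.
def motzkin(x, y, z):
    return x**4 * y**2 + x**2 * y**4 + z**6 - 3 * x**2 * y**2 * z**2

random.seed(0)  # fixed seed so the probe is reproducible
samples = [motzkin(random.uniform(-5, 5),
                   random.uniform(-5, 5),
                   random.uniform(-5, 5)) for _ in range(10_000)]

# Every sampled value is nonnegative -- evidence of nonnegativity,
# but no finite sample constitutes a proof.
print(min(samples))
```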

Robert Israel
1

In that case, when the test is inconclusive, there is no general rule or test that always works. We need to show case by case what kind of critical point we have, using some manipulation and inequalities.

As a simple example for

$$f(x,y)=x^4-2x^2y^2+y^4$$

at $(x,y)=(0,0)$ the test is inconclusive but

$$f(x,y)=x^4-2x^2y^2+y^4=(x^2-y^2)^2\ge 0$$
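
A quick script confirms both halves of the claim (the second partials below are my own hand calculation): every second partial vanishes at the origin, so $D=0$ and the Hessian test says nothing, while the factorization shows $f\ge 0=f(0,0)$, making the origin a non-strict local minimum:

```python
def f(x, y):
    return (x**2 - y**2)**2  # = x**4 - 2*x**2*y**2 + y**4

# Second partials, computed by hand from f = x^4 - 2x^2y^2 + y^4:
#   f_xx = 12x^2 - 4y^2,  f_yy = 12y^2 - 4x^2,  f_xy = -8xy,
# all of which vanish at (0, 0):
fxx, fyy, fxy = 0, 0, 0
D = fxx * fyy - fxy**2
print(D)  # 0, so the second-derivative test is inconclusive

# But f is a perfect square, hence f >= 0 = f(0, 0) everywhere:
print(all(f(i / 10, j / 10) >= 0
          for i in range(-20, 21) for j in range(-20, 21)))  # True
```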

user
  • Are you saying that the higher-order derivative test has no analogue in multivariable calculus? – Keshav Srinivasan Aug 02 '18 at 06:16
  • @KeshavSrinivasan Yes exactly, there is not an analogue test in multivariable calculus. – user Aug 02 '18 at 06:20
  • How do you know there isn't an analogue? The higher-order derivative test springs from Taylor series, and Taylor series exist for multivariable functions, so why wouldn't you be able to derive an analogue? – Keshav Srinivasan Aug 02 '18 at 06:24
  • The higher-order derivative test does not always work either. But that does not mean that a multivariable version of that test does not exist. – nicomezi Aug 02 '18 at 06:26
  • @KeshavSrinivasan Other examples https://math.stackexchange.com/questions/716100/what-to-do-when-the-multivariable-second-derivative-test-is-inconclusive?rq=1 https://math.stackexchange.com/questions/2715753/how-to-deal-with-inconclusive-hessian-test-for-maximization-of-xy2-x2y2?rq=1 https://math.stackexchange.com/questions/2018823/inconclusive-second-derivative-test-rigorous-proof – user Aug 02 '18 at 06:30
  • @nicomezi What kind of test are you referring to? – user Aug 02 '18 at 06:30
  • @gimusi None of that answers why a higher-order derivative test would not exist for multivariable functions. – Keshav Srinivasan Aug 02 '18 at 06:34
  • @gimusi I am just saying that there may be a test that will work in some cases (like the higher order derivative test for one variable calculus). You are just providing examples where the second derivative test is inconclusive actually. – nicomezi Aug 02 '18 at 06:34
  • @KeshavSrinivasan The examples I given are aimed to show the techniques we use when hessian test fails. – user Aug 02 '18 at 06:36
  • @nicomezi See my previous comment here above. – user Aug 02 '18 at 06:37
  • Just because Taylor series exist in several variables does not mean everything you do with single-variable Taylor series must generalize. Specifically, in one variable a nonzero Taylor series has a unique monomial of lowest degree (a unique leading term) but this is false for Taylor series in more than one variable. (Analogue: all ideals in $\mathbf R[x]$ are principal but that is false for polynomials in more than one variable.) Local behavior of a function in more than one variable is genuinely more complicated than in one variable. – KCd Aug 02 '18 at 06:39
  • @nicomezi I'm claiming that there is not a general rule or test when hessian test is not conclusive. You are claiming that "may be a test that will work in some cases". The two things do not seem in contrast to me. – user Aug 02 '18 at 06:39
  • "The examples I given are aimed to show the techniques we use when hessian test fails." OK, but I'm not interested in that. I'm specifically interested in the higher-order derivative test. – Keshav Srinivasan Aug 02 '18 at 06:44
  • @KeshavSrinivasan There is not a general higher-order derivative test for functions of two or more variables. – user Aug 02 '18 at 06:47
  • @gimusi Again, how do you know that? Why can't you derive such a test from the Taylor series for multivariable functions? – Keshav Srinivasan Aug 02 '18 at 06:48
  • @KeshavSrinivasan The reason is that the Hessian test is based on the theory of quadratic forms. As noticed here above by KCd, higher-order Taylor polynomials become (in general) too complicated to be studied with similar methods. Of course we can use the Taylor expansion around the point of interest, but there is no test which always works. We need to consider the problem case by case. – user Aug 02 '18 at 06:56
  • To add to gimusi's answer about the special feature of the degree-two terms (a quadratic form) compared to higher degree, the degree-$n$ terms in a Taylor expansion are a homogeneous polynomial ("form") of degree $n$, and many properties of these when $n=2$ do not generalize to $n>2$ when there is more than one variable. For example, all quadratic forms over the real numbers in any (finite) number of variables can be diagonalized, but this is generally false for forms over $\mathbf R$ in more than one variable of degree greater than $2$. – KCd Aug 02 '18 at 07:28
  • For a concrete example of a cubic form that can't be diagonalized, see https://math.stackexchange.com/questions/940432/diagonalizing-xyz. – KCd Aug 02 '18 at 07:37
  • @KCd Thanks for your contribution and for the examples given, very helpful. – user Aug 02 '18 at 07:52
  • At a more advanced level one could consider results like the Morse lemma, which show the degree-2 part of local approximations really is special compared to higher degrees. – KCd Aug 02 '18 at 10:15
  • gimusi, I just found a version of the higher-order derivative test for multivariable functions. See my answer. @KCd – Keshav Srinivasan Aug 02 '18 at 15:36
  • @KeshavSrinivasan the reason people are saying the higher-order derivative test you found is not in the same spirit as the 2nd-derivative test is that the 2nd-derivative test is a numerical test: there are standard algorithms to determine if a quadratic form is positive definite, negative definite, or indefinite. The test you found involves deciding if a higher-degree homogeneous polynomial is (positive or negative) definite or indefinite, and this is a hard thing to check in practice. The lowest-degree nonvanishing term in the Taylor expansion, if it has even degree, answers (contd.) – KCd Dec 31 '19 at 19:59
  • your question in theory, but did you ever try to use it in practice? That's a big contrast to the 2nd-derivative test. In a comment to Robert Israel's answer you say that you found a "simple test". In what sense do you consider the higher-derivative test you found to be simple? Did you really use it (including feeding it into a computer if needed) in many examples to check that each time it is simple to apply? – KCd Dec 31 '19 at 19:59