
This question is motivated by Log-convexity of a function defined by an integral (Normal Mills ratio), where it is shown that the “Mill's ratio for the normal distribution” $$ f(x)=e^{x^2/2}\int_x^{+\infty}e^{-t^2/2}dt $$ is log-convex on $(0, \infty)$, i.e. $\log f(x)$ is a convex function.

$f$ satisfies the differential equation $$ \tag{$*$} \boxed{ y'(x) = x y(x) - 1 \, .} $$ The general solution of $(*)$ on $(0, \infty)$ is $$ y(x) = e^{x^2/2} \left( \int_x^{+\infty}e^{-t^2/2}dt + C \right) = f(x) + C e^{x^2/2}\, . $$ If $C \ge 0$ then $y$ is positive and log-convex as a sum of log-convex functions (see for example How to prove that the sum of two log-convex functions is log-convex?).
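As a quick numerical sanity check (a sketch in Python; the constant $C = 0.3$ is an arbitrary nonnegative choice), one can verify on a grid that such a $y$ satisfies $(*)$ and that $\log y$ has nonnegative second differences:

```python
import math
import numpy as np

def mills(x):
    # Mills ratio f(x) = e^{x^2/2} * int_x^oo e^{-t^2/2} dt
    #            = sqrt(pi/2) * e^{x^2/2} * erfc(x / sqrt(2))
    return math.sqrt(math.pi / 2) * math.exp(x * x / 2) * math.erfc(x / math.sqrt(2))

C = 0.3  # any C >= 0 should give a positive, log-convex solution

def y(x):
    return mills(x) + C * math.exp(x * x / 2)

# the ODE y' = x*y - 1, checked with a central difference
h = 1e-5
for x in (0.5, 1.0, 2.0):
    dy = (y(x + h) - y(x - h)) / (2 * h)
    assert abs(dy - (x * y(x) - 1)) < 1e-6

# log-convexity: second differences of log y on a grid are nonnegative
xs = np.linspace(0.1, 4.0, 400)
logy = np.log([y(x) for x in xs])
assert np.all(np.diff(logy, 2) >= -1e-12)
print("ODE and log-convexity checks pass for C =", C)
```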

The log-convexity of the Mill's ratio $f$ is proven in the above-referenced Q&A by deriving the relation $f''(x)f(x)-f'(x)^2=f(x)^2+xf(x)-1$, and then showing that $f(x)^2+xf(x)-1 \ge 0$ for $x > 0$.
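Both the identity and the positivity claim can be spot-checked numerically (a sketch; the derivatives of $f$ are approximated by finite differences):

```python
import math

def mills(x):
    # Mills ratio via the complementary error function
    return math.sqrt(math.pi / 2) * math.exp(x * x / 2) * math.erfc(x / math.sqrt(2))

# From f' = x f - 1 one gets f'' = f + x f', hence f'' f - f'^2 = f^2 + x f - 1.
# Spot-check the identity and the claimed positivity at a few points.
h = 1e-4
for x in (0.25, 1.0, 3.0):
    f = mills(x)
    f1 = (mills(x + h) - mills(x - h)) / (2 * h)
    f2 = (mills(x + h) - 2 * f + mills(x - h)) / h**2
    lhs = f2 * f - f1**2
    rhs = f * f + x * f - 1
    assert abs(lhs - rhs) < 1e-5
    assert rhs >= 0
print("identity and positivity verified at sample points")
```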

I wonder if the log-convexity (i.e. $y''y - y'^2 \ge 0$) can be derived from the differential equation $(*)$ alone, without using the explicit form of its solutions. So my question is: Can

Let $y: (0, \infty) \to \Bbb R$ be a solution of the differential equation $(*)$ with $y(x) > 0$ for all $x > 0$. Then $y''(x) y(x) - y'(x)^2 \ge 0$ for all $x > 0$, i.e. $\log \circ y$ is a convex function.

be proven without solving the differential equation?

Martin R
    Related question: Here assume that there is a solution $y(x) > 0, \forall x > 0$. Can we prove that there exists such a solution without using the explicit form of the solutions? – River Li Dec 30 '23 at 01:53

2 Answers


Here is a (very...) hand-made approach that becomes conjectural at its end.

I will use just $y$ for $y(x)$, the functional dependence being understood.

We start from the premise that for all $x \in(0,\infty)$ $$y> 0 \quad {\rm and}\quad y' = xy-1. \tag{1}$$

We want to know whether $(1)$ implies $$y''y-(y')^2 \geq 0, \quad\text{i.e. (since $y>0$)}\quad y'' \geq \frac{(y')^2}{y}. \tag {2}$$

Differentiating $(1)$ we get $$y'' = y + xy'\tag{3}$$

Combining with $(2)$ we require

$$ y + xy' \geq \frac{(y')^2}{y} \implies y^2 + xyy' -(y')^2 \geq 0. \tag{4}$$

Using $(1)$ (i.e. $xy = y'+1$), this becomes $$y^2 + xyy' -(y')^2 = y^2 + (y'+1)y' -(y')^2 = y^2 + y' \geq 0.$$
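This reduction is pure algebra, and can be confirmed symbolically, e.g. with sympy (a sketch, treating $y$ as a free positive symbol and substituting $y' = xy - 1$):

```python
import sympy as sp

x, y = sp.symbols("x y", positive=True)
yp = x * y - 1  # y' from the ODE (1)

# (4): y^2 + x*y*y' - (y')^2 should reduce to y^2 + y'
expr = y**2 + x * y * yp - yp**2
assert sp.expand(expr - (y**2 + yp)) == 0
print("y^2 + x*y*y' - (y')^2  ==  y^2 + y'   (under y' = x*y - 1)")
```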

This will obviously hold if $y'\geq 0$.

We next examine what happens if/when $y'<0$.

For this case we have $$y'<0 \implies x < \frac 1y \implies xy' > \frac {y'}{y} \implies y+ xy' > y + \frac {y'}{y}$$ $$\implies y'' > \frac {y^2+y'}{y}.\tag{5}$$

So to prove that a solution $y$ is log-convex while $y'<0$, it is sufficient (but not necessary) to show that, combining $(5)$ and $(2)$,

$$\frac {y^2+y'}{y} \geq \frac{(y')^2}{y} \implies y^2+y' \geq (y')^2 \implies y^2 \geq y'\cdot (y' -1)$$

or show that $$y^2 \geq (xy-1)(xy-2) = y^2(x-1/y)(x-2/y)$$

or $$1 \geq \left(x - \frac 1y\right)\left(x-\frac 2y\right).$$

Re-arranging this as a $2$nd degree polynomial in $x$ we require

$$x^2 - \frac 3y x + \left(\frac 2{y^2} -1\right) \leq 0,$$

so we require that $x$ lies in the interval in between the roots of this polynomial, which are

$$x_{1,2} = \frac{3\pm\sqrt{1+4y^2}}{2y}$$ and one can verify that

$$\frac{3-\sqrt{1+4y^2}}{2y} < \frac 1y < \frac{3+\sqrt{1+4y^2}}{2y}.$$
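This ordering amounts to $\sqrt{1+4y^2} > 1$, which holds for every $y > 0$; a quick sampled check:

```python
import math

# check (3 - sqrt(1+4y^2))/(2y) < 1/y < (3 + sqrt(1+4y^2))/(2y) for sampled y > 0
for k in range(1, 200):
    yv = k / 20.0                      # y ranges over 0.05, 0.10, ..., 9.95
    s = math.sqrt(1 + 4 * yv * yv)
    x_lo = (3 - s) / (2 * yv)
    x_hi = (3 + s) / (2 * yv)
    assert x_lo < 1 / yv < x_hi
print("root ordering verified at all sampled y")
```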

What we have just proved is that, at a point where $y'<0$, $y$ is log-convex whenever $$\frac{3-\sqrt{1+4y^2}}{2y} \leq x < \frac 1 y$$

and we still need to deal with $0< x < \frac{3-\sqrt{1+4y^2}}{2y}$, an interval that is non-empty only when $y < \sqrt{2}$ (needed for its upper endpoint to be strictly positive). Since we do not know $y$ explicitly, to cover all cases we assume this situation occurs. So we are left to examine the log-convexity of the solution $y$ when $$y'<0, \quad y< \sqrt{2},\quad 0<x < \frac {g_1}y,\qquad g_1\equiv \frac{3-\sqrt{1+4y^2}}{2} < 1.$$

There is a reason why we defined $g_1$: we can now restart the process that led to eq. $(5)$,

$$x < \frac 1y g_1 \implies xy' > \frac {y'}{y}g_1...$$

and arrive at the sufficient condition for log-convexity $$1 \geq \left(x - \frac 1y\right)\left(x-\frac {1+g_1}{y}\right),$$

consider the new $2$nd degree polynomial in $x$, etc., which shrinks even further the range of $x$ for which log-convexity is not yet proved... so I conjecture that, somewhat like a fixed-point iteration, repeating this should eventually cover all values of $x$, and fully prove what the OP was wondering about.
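The iteration can be made concrete; the recursion below is my own reading of the scheme (hence hedged): each pass replaces the sufficient condition by $1 \ge (x - \frac 1y)(x - \frac{1+g_n}{y})$, a quadratic in $x$ whose smaller root is $\frac{2+g_n-\sqrt{g_n^2+4y^2}}{2y}$, so the lower bound $g_n/y$ updates via $g_{n+1} = \tfrac12\big(2+g_n-\sqrt{g_n^2+4y^2}\big)$, starting from $g_0 = 1$ (the initial bound $x < 1/y$; one pass then reproduces $g_1$ above). A short experiment with a fixed sample $y$:

```python
import math

def next_g(g, yv):
    # one pass of the iteration (my reading of the answer's scheme):
    # smaller root of x^2 - ((2+g)/y) x + (1+g)/y^2 - 1 <= 0 is
    # (2 + g - sqrt(g^2 + 4 y^2)) / (2 y), i.e. the bound g/y updates as:
    return (2 + g - math.sqrt(g * g + 4 * yv * yv)) / 2

yv = 0.5   # a sample value with y < sqrt(2), held fixed for the experiment
g = 1.0    # g_0 = 1 encodes the initial bound x < 1/y
for n in range(100):
    g = next_g(g, yv)
print(f"g after 100 iterations: {g:.6f}")

# solving g = next_g(g, y) by hand gives the fixed point g* = 1 - y^2,
# which the iteration indeed approaches for this sample y
assert abs(g - (1 - yv * yv)) < 1e-9
```

Under this one-variable reading (holding $y$ fixed), the bounds decrease to the fixed point $g^* = 1 - y^2$ rather than to $0$; along an actual solution $y$ varies with $x$, so this experiment alone neither proves nor refutes the conjecture.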

  • @MartinR Well, if $y(x_0) < 1/x_0$ it follows that $y'(x_0) < 0$ so the function $y$ is decreasing at this point. This is what I claim. – Alecos Papadopoulos Jan 01 '24 at 12:52
  • @MartinR I just realized that I have accidentally deleted a crucial piece of argument as regards this aspect of the situation, during the writing and the re-writing of my post. I have added it now just after eq. $(1)$ and deleted the assertion from its previous position in the text. – Alecos Papadopoulos Jan 01 '24 at 15:35
  • Why does $y(x_0) = 1/x_0$ imply $y'(x_0) = -1/x_0^2 $? – Martin R Jan 01 '24 at 16:14
  • @MartinR That indeed is unfounded. I deleted this separation of cases. – Alecos Papadopoulos Jan 02 '24 at 10:24

This seems a tough question. By the way, the proof by Martin R that the sum of two log-convex functions is still log-convex was very clever. Here I am giving another proof of the fact that $f(x)+Ce^{x^2/2}$ is log-convex when $C\geq 0$. We use the fact that a Laplace transform of a positive measure is log-convex.

Indeed, substituting $t=x+y$ we have

$$f(x)=\int_0^{\infty}e^{-y^2/2-xy}\,dy,$$ and since $e^{x^2/2}=\frac{1}{\sqrt{2\pi}}\int_{-\infty}^{\infty}e^{-y^2/2-xy}\,dy$, the function $f(x)+Ce^{x^2/2}$ is the Laplace transform (over $\Bbb R$) of the positive density

$$y\mapsto e^{-y^2/2}\left(\frac{C}{\sqrt{2\pi}}+1_{(0,\infty)}(y)\right).$$

Now let us try to answer the initial question and River Li's follow-up... Thanks to Martin R for having corrected my first foolish answer.