
With regard to this answer to an inverse Laplace transform question, I derived the following result:

$$\frac1{i 2 \pi} \int_{c-i \infty}^{c+i \infty} ds \, e^{s t} \Gamma(s)^2 = 2 K_0 \left ( 2 e^{-t/2} \right ) - t I_0 \left ( 2 e^{-t/2} \right ) $$

This is not a result I can claim to have seen before, and I have had a hard time finding anything related to it. While I am confident that it is correct, I wanted to perform a simple check by deriving the Laplace transform and hopefully recovering the original $\Gamma(s)^2$. When I did this, I got the following:

$$\int_0^{\infty} dt \, e^{-s t} \left [ 2 K_0 \left ( 2 e^{-t/2} \right ) - t I_0 \left ( 2 e^{-t/2} \right ) \right ] = \sum_{n=0}^{\infty} \frac{\displaystyle \operatorname*{Res}_{s=-n} \Gamma(s)^2}{s+n} $$

Now, I know about partial fraction expansions for rational functions with a finite number of poles.

My question is as follows: Is the above "partial fraction" expansion for the function $F(s) = \Gamma(s)^2$ a valid representation of $\Gamma(s)^2$?

EDIT

The above result is incorrect. Rather, the result is

$$\frac1{i 2 \pi} \int_{c-i \infty}^{c+i \infty} ds \, e^{s t} \Gamma(s)^2 = 2 K_0 \left ( 2 e^{-t/2} \right ) $$

This result holds up: its inverse states that

$$\int_{-\infty}^{\infty} dt \, e^{-s t} 2 K_0 \left ( 2 e^{-t/2} \right ) = \Gamma(s)^2 $$

Subbing $u=e^{-t}$, we get

$$2 \int_0^{\infty} du \, u^{s-1} K_0 \left ( 2 \sqrt{u} \right ) = \Gamma(s)^2 $$

Note that this is a two-sided transform. See the link for why it converges for all real values of $t$.
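As a sanity check on the corrected result, the Mellin-form identity $2 \int_0^{\infty} du \, u^{s-1} K_0(2\sqrt{u}) = \Gamma(s)^2$ can be verified numerically in plain Python. This is only a sketch: $K_0$ is computed from its standard integral representation $K_0(z) = \int_0^\infty e^{-z \cosh t} \, dt$, both integrals use composite Simpson's rule with hand-picked truncation parameters, and the test value $s = 5/2$ is my choice.

```python
import math

def K0(z, T=14.0, n=500):
    # Standard integral representation K_0(z) = ∫_0^∞ exp(-z cosh t) dt,
    # truncated at t = T and evaluated by composite Simpson's rule.
    h = T / n
    total = math.exp(-z) + math.exp(-z * math.cosh(T))
    for i in range(1, n):
        total += (4 if i % 2 else 2) * math.exp(-z * math.cosh(i * h))
    return total * h / 3.0

def mellin_K0(s, V=25.0, n=2500):
    # 2 ∫_0^∞ u^{s-1} K_0(2√u) du; after substituting u = v² this equals
    # 4 ∫_0^∞ v^{2s-1} K_0(2v) dv, again evaluated by composite Simpson's rule.
    h = V / n
    f = lambda v: v**(2*s - 1) * K0(2*v)
    total = 0.0 + f(V)          # integrand vanishes at v = 0 for s > 1/2
    for i in range(1, n):
        total += (4 if i % 2 else 2) * f(i * h)
    return 4 * total * h / 3.0

s = 2.5
print(mellin_K0(s), math.gamma(s)**2)   # both ≈ 9π/16 ≈ 1.7671
```

The crude quadrature already agrees with $\Gamma(s)^2$ to several digits, which is all one can ask of a spot check.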

Ron Gordon

2 Answers


I felt the answer to the original question posed had to be "no": since $\Gamma(s)$ has simple poles at the nonpositive integers, $\Gamma(s)^2$ has double poles, yet your series was suspiciously convergent, suggesting that the right-hand side has only simple poles. Thankfully, this proved to be correct.

So, onto answering the modified question. I'll run through a simpler example first to work out what's going on, then tackle $\Gamma(s)$. The answer to the broader question, "Is there a generalisation of partial fractions to analytic functions", is yes: this is provided by Mittag-Leffler's Theorem:

Let $D$ be an open set in $\mathbb{C}$ and $E \subset D$ a closed discrete subset. For each $a$ in $E$, let $p_a(z)$ be a polynomial in $1/(z-a)$. There is a meromorphic function $f$ on $D$ such that for each $a \in E$, the function $f(z)-p_a(z)$ is holomorphic at $a$. In particular, the principal part of $f$ at $a$ is $p_a(z)$.

This is an existence theorem, but we can in principle extend it to build functions we are actually interested in. For example, the Gamma function is given by the Mellin-type integral $$ \Gamma(s) = \int_0^{\infty} x^{s-1} e^{-x} \, dx, \quad (\Re(s)>0). $$ We can produce a partial fractions expansion for $\Gamma$ in the following way: notice that the problems with this integral converging all stem from the possible singularity at zero. Therefore, we expect that the part of the integral near zero will tell us about the poles of $\Gamma$. Therefore, write $$ \Gamma(s) = \int_0^1 x^{s-1} e^{-x} \, dx + \Gamma_1(s), \quad (\Re(s)>0). $$ $\Gamma_1(s)$ is the upper incomplete Gamma function evaluated at $1$; by trivial bounding arguments, it is convergent for all $s$ and hence defines an entire function (with a convergent power series [...]), so we can safely ignore it from the point of view of partial fractions, residues and so on.

The lower integral's integrand is finite for $s>0$; in particular, we expand $e^{-x}$ in a power series and can then interchange the order of integration and summation as follows: $$ \begin{align*} \int_0^1 x^{s-1} e^{-x} \, dx &= \int_0^1 x^{s-1} \left( \sum_{k=0}^{\infty} (-1)^k \frac{x^k}{k!} \right) \, dx \\ &= \sum_{k=0}^{\infty} \frac{(-1)^k}{k!} \int_0^1 x^{s+k-1} \, dx \\ &= \sum_{k=0}^{\infty} \frac{(-1)^k}{k!} \frac{1}{s+k} \end{align*}$$ Ah-ha! This is a convergent sum for any complex $s$ that is not a nonpositive integer (just lop off the finite number of terms with $\Re(s)+k<0$ and compare with the exponential sum, for example). It has exactly the same poles as $\Gamma(s)$ (as we hoped for, given that it is $\Gamma(s)$ with an analytic bit subtracted off), and, even better, we got all of $\Gamma(s)$'s residues for free, too (and inspection tells us we were right about what they were all along).
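To make this concrete, here is a quick numerical check of the resulting expansion $\Gamma(s) = \sum_{k \ge 0} \frac{(-1)^k}{k!\,(s+k)} + \Gamma_1(s)$, in plain Python. A sketch only: $\Gamma_1(s)$ is evaluated by composite Simpson's rule on a truncated interval, and the truncation parameters and test points are my choices.

```python
import math

def gamma_singular(s, terms=50):
    # Σ_{k≥0} (-1)^k / (k! (s+k)): the partial-fraction ("singular") part of Γ(s)
    return sum((-1)**k / (math.factorial(k) * (s + k)) for k in range(terms))

def gamma_entire(s, upper=50.0, n=20000):
    # Γ_1(s) = ∫_1^∞ x^{s-1} e^{-x} dx, truncated at x = upper (the tail beyond
    # is smaller than e^{-50}), evaluated by composite Simpson's rule.
    h = (upper - 1.0) / n
    f = lambda x: x**(s - 1) * math.exp(-x)
    total = f(1.0) + f(upper)
    for i in range(1, n):
        total += (4 if i % 2 else 2) * f(1.0 + i * h)
    return total * h / 3.0

# The expansion also continues Γ to negative non-integer s, e.g. s = -1.5:
for s in (0.3, 1.7, -1.5):
    print(s, gamma_singular(s) + gamma_entire(s), math.gamma(s))
```

Note that the sum plus $\Gamma_1$ reproduces `math.gamma` even at $s = -3/2$, where the original integral diverges: the series provides the analytic continuation.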


Okay, so we know about $\Gamma(s)$. Now we can move on to your question: what about $\Gamma(s)^2$? I could do this by squaring the expansion I just found, but that looks like a seriously unpleasant idea. Thankfully you have provided us with a better way using the integral representation $$ \int_0^{\infty} x^{s-1} \left( 2K_0 ( 2 \sqrt{x} ) \right) \, dx = \Gamma(s)^2. $$ First we write $K_0$ in a more useful form using the identity (mentioned in your original answer, and also in DLMF) $$ 2K_0(2\sqrt{x}) = \sum_{k=0}^{\infty} \frac{2H_k}{(k!)^2} x^k - \left( 2\gamma+\log{x} \right) I_0(2\sqrt{x}) = \sum_{k=0}^{\infty} \frac{1}{(k!)^2} \left( 2H_k-2\gamma- \log{x} \right) x^k . $$
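This series identity can itself be spot-checked numerically against the integral representation $K_0(z) = \int_0^\infty e^{-z\cosh t}\,dt$ (the same one used in the other answer). A plain-Python sketch, with the Euler–Mascheroni constant hardcoded and hand-picked quadrature parameters:

```python
import math

EULER_GAMMA = 0.5772156649015329

def K0_integral(z, T=14.0, n=2000):
    # K_0(z) = ∫_0^∞ exp(-z cosh t) dt, truncated at t = T, composite Simpson's rule
    h = T / n
    total = math.exp(-z) + math.exp(-z * math.cosh(T))
    for i in range(1, n):
        total += (4 if i % 2 else 2) * math.exp(-z * math.cosh(i * h))
    return total * h / 3.0

def two_K0_series(x, terms=30):
    # Σ_k (1/(k!)²) (2H_k − 2γ − log x) x^k, the expansion of 2 K_0(2√x)
    total, H = 0.0, 0.0
    for k in range(terms):
        if k > 0:
            H += 1.0 / k          # H_k = 1 + 1/2 + … + 1/k, with H_0 = 0
        total += (2*H - 2*EULER_GAMMA - math.log(x)) * x**k / math.factorial(k)**2
    return total

for x in (0.1, 0.5, 2.0):
    print(x, two_K0_series(x), 2 * K0_integral(2 * math.sqrt(x)))
```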

Now we do as we did before, splitting at $x=1$ and ignoring the analytic part of the integral; the singular part is: $$ \begin{align*} \int_0^1 x^{s-1} 2K_0(2\sqrt{x}) \, dx &= \int_0^1 x^{s-1} \left( \sum_{k=0}^{\infty} \frac{1}{(k!)^2} \left( 2H_k-2\gamma- \log{x} \right) x^k \right) \, dx \\ &=\sum_{k=0}^{\infty} \frac{1}{(k!)^2} \left( 2(H_k-\gamma) \int_0^1 x^{s+k-1} \, dx -\int_0^1 x^{s+k-1}\log{x} \, dx \right) \end{align*}$$

As every schoolchild knows, $$ \int_0^{1} x^{n}\log{x} \, dx = \left. x^{n+1} \left( \frac{\log{x}}{n+1} - \frac{1}{(n+1)^2} \right) \right|_0^1 = -\frac{1}{(n+1)^2}, $$ and hence we obtain the expansion $$ \Gamma(s)^2 = \sum_{k=0}^{\infty} \left( \frac{1}{(k!)^2} \frac{1}{(s+k)^2} + 2\frac{H_k-\gamma}{(k!)^2} \frac{1}{s+k} \right) + \int_1^{\infty} x^{s-1} 2K_0(2\sqrt{x}) \, dx $$

This has all the properties we could hope for: double poles, the right residues, and the coefficients of the double poles are what we'd expect naïvely.
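In particular, the claimed principal part at $s=-k$, namely $\frac{1}{(k!)^2 (s+k)^2} + \frac{2(H_k-\gamma)}{(k!)^2 (s+k)}$, can be checked numerically: subtracting it from $\Gamma(s)^2$ near the pole should leave something bounded as $s \to -k$. A plain-Python sketch (the test values $k=2$ and $\varepsilon \in \{10^{-3}, 10^{-4}\}$ are my choices; $\gamma$ is hardcoded):

```python
import math

def principal_part(k, eps):
    # 1/((k!)² ε²) + 2(H_k − γ)/((k!)² ε): claimed principal part of Γ(s)² at s = −k,
    # evaluated at s = −k + ε
    H = sum(1.0 / j for j in range(1, k + 1))
    fact2 = math.factorial(k) ** 2
    return 1.0 / (fact2 * eps**2) + 2 * (H - 0.5772156649015329) / (fact2 * eps)

k = 2
for eps in (1e-3, 1e-4):
    # math.gamma accepts negative non-integer arguments, so this probes the pole directly
    diff = math.gamma(-k + eps) ** 2 - principal_part(k, eps)
    print(eps, diff)   # stays O(1) as ε shrinks, so both pole coefficients are right
```

If either coefficient were wrong, `diff` would blow up like $\varepsilon^{-2}$ or $\varepsilon^{-1}$ instead of settling toward a constant.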


Final important remark: the alert reader may have noticed that I swept something under the rug: I chose to split the integral at $x=1$. Won't this choice cause problems for the values of the residues? In particular, why are the residues well-defined when I produced them out of a hat with this procedure?

Suppose instead I chopped the gamma integral at $x=a$, for example. Then the $k$th term in the expansion of the finite interval's integral is proportional to $$ \int_0^a x^{s+k-1} \, dx = \frac{a^s a^k}{s+k}, $$ and $a^s$ is only analytic in $s$ throughout $\mathbb{C}$ when $a=1$, so the only way to get meromorphic terms in the series was to choose $a=1$ (or $0$ or $\infty$, neither of which gives us anything). Hence the choice is no choice at all, and the residues are what I stated they would be.

Chappers
  • That was terrific. I'll have to ponder the splitting a bit more, but you answered my question perfectly by alerting me to Mittag-Leffler. My experience through this adventure tells me that the difference between integrating to $x=1$ and $x=\infty$ is deciding whether the transform is one-sided or bilateral. (See the answer in the link, which I have edited to reflect this realization.) BTW The integral result, which does indeed follow from the residue theorem, agrees with a result in Whittaker & Watson. That made my day. – Ron Gordon Mar 13 '15 at 00:59
  • Thank you, I'm glad I could help such an experienced user of complex function theory! But I'm not sure your result is in Whittaker & Watson: isn't it in Watson's book on Bessel functions? (the giveaway is that at least my copy of W&W has no equation numbers). It's a shame that Mittag-Leffler is normally only taught in a second course on complex functions. (And indeed, it's entirely possible to get a maths degree from my university without doing any complex analysis!) – Chappers Mar 13 '15 at 14:04
  • Excellent answer, I know these expansions relate to the Mittag-Leffler theorem, but I wasn't able to put all the details in such a nice form! (+1) – tired Mar 13 '15 at 14:36
  • @Chappers: I was talking about the Laplace transform relation I derived. Section 13-21, Eq. (8). (But who's counting?) As for me, I haven't taken any formal complex analysis since 1993, a devoted course since 1991, so it's just been me and my wits here. People would be surprised by how little I know. I get very happy when I see some weird thing happen in my computation and it turns out to be a use case of one of those abstruse theorems from my bygone days. Excellent. – Ron Gordon Mar 13 '15 at 14:42
  • https://books.google.co.uk/books?id=Mlk3FrNoEVoC&lpg=PA64&pg=PA388#v=onepage&q&f=false This is the page, is it not? (I had a look in Heaviside's book, too... no idea what his notation for the Gamma function means without more rummaging.) It's really quite staggering the amount of integration they used to do in the heyday of complex integration: one of the pinnacles of romantic nonsense, before abstract nonsense took over. – Chappers Mar 13 '15 at 15:45
  • @Chappers:can't see the page, but I trust you have found it. Yes, I love reading these old, pre-computer texts on integration and special functions because these problems required a different kind of cleverness which is not as useful. That's why I love it here: there are loads of us trying to outdo each other with this kind of thinking that puzzles many people. But everyone should have Hardy on his or her shelf. It never really gets old. – Ron Gordon Mar 13 '15 at 16:30
  • Funny that you should mention Hardy: I saw a letter of his once that contained the immortal line "I tried very hard not to spend time on your integrals, but to me the challenge of a definite integral is irresistible." One could almost take it as a motto... – Chappers Mar 13 '15 at 21:24
  • @Chappers: BTW I totally missed your question about W & W vs Watson. I think, yet again, you're right. But still, awesome that it's available. And yes, I saw the Heaviside reference and it took me, like, a half hour to deconstruct his gamma function notation (which looks like synthetic division...do you even have that in the UK?). Re: Hardy, yup. I think I have a book called "Irresistible Integrals"; now I know where the name comes from. – Ron Gordon Mar 16 '15 at 16:04

Somewhat similar to how I derived the Mellin transform of $K_{2\alpha}(x) K_{2 \beta}(x)$ here, we can use the integral representation $$ K_{\nu}(x) = \int_{0}^{\infty} e^{-x \cosh(t)} \, \cosh (\nu t) \, \mathrm dt \, , \quad x >0,$$ to show that $$\int_{0}^{\infty} x^{2s-1} K_{2 \nu}(x) \, \mathrm dx = 2^{2(s-1)} \, \Gamma(s+ \nu) \Gamma(s- \nu) \, , \quad s>\nu \ge 0. $$

(This result is deduced in A Treatise on the Theory of Bessel Functions from a more complicated integral.)

It then follows that indeed $$2\int_{0}^{\infty} x^{s-1} K_{0}(2 \sqrt{x}) \, \mathrm dx = 2^{2(1-s)} \int_{0}^{\infty} u^{2s-1} K_{0} (u) \, \mathrm du = \Gamma(s)^{2}.$$


All we need is the well known integral formula $$\int_{0}^{\infty} \frac{\cosh(2 \nu t)}{\cosh^{2s}(t)} \, \mathrm dt = 2^{2(s-1)} \, B (s+ \nu, s-\nu), \quad s > |\nu|. $$

(There's a proof in my previous answer.)
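For what it's worth, the formula is easy to spot-check numerically in plain Python (composite Simpson's rule with a hand-picked truncation; the test point $s=2$, $\nu=1/2$, where the right-hand side happens to equal $\pi/4$, is my choice):

```python
import math

def cosh_integral(s, nu, T=20.0, n=4000):
    # ∫_0^∞ cosh(2νt) / cosh(t)^{2s} dt, truncated at t = T
    # (the integrand decays like e^{-2(s-ν)t}), composite Simpson's rule
    h = T / n
    f = lambda t: math.cosh(2 * nu * t) / math.cosh(t) ** (2 * s)
    total = f(0.0) + f(T)
    for i in range(1, n):
        total += (4 if i % 2 else 2) * f(i * h)
    return total * h / 3.0

def rhs(s, nu):
    # 2^{2(s-1)} B(s+ν, s−ν), writing B(a,b) = Γ(a)Γ(b)/Γ(a+b)
    return 4.0**(s - 1) * math.gamma(s + nu) * math.gamma(s - nu) / math.gamma(2 * s)

print(cosh_integral(2.0, 0.5), rhs(2.0, 0.5))   # both ≈ π/4 ≈ 0.785398
```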

Then we have

$$ \begin{align} \int_{0}^{\infty} x^{2s-1} K_{2 \nu}(x) \, \mathrm dx &= \int_{0}^{\infty} x^{2s-1} \int_{0}^{\infty} e^{-x \cosh (t)} \cosh(2 \nu t) \, \mathrm dt \, \mathrm d x \\ &= \int_{0}^{\infty} \cosh(2 \nu t) \int_{0}^{\infty} x^{2s-1} e^{-x \cosh (t)} \, \mathrm dx \, \mathrm dt \\ &= \Gamma(2s) \int_{0}^{\infty} \frac{\cosh(2 \nu t)}{\cosh^{2s}(t)} \, \mathrm dt \\ &= \Gamma(2s) 2^{2(s-1)} B (s+ \nu, s-\nu) \\ &= 2^{2(s-1)} \, \Gamma(s+ \nu) \Gamma(s- \nu). \end{align}$$


$(1)$ $\int_{0}^{\infty} x^{s-1} e^{-ax} \, \mathrm dx = a^{-s} \, \Gamma(s), \quad a >0$