
Wednesday, June 23, 2021

Feynman's trick II

log integral

A formula involving integrals of the logarithm function

We prove the following formula involving the integral of powers of the function $\ln(1-x)$:

$$\int_{0}^{1}\frac{\ln^{n}(1-x)\ln(x)}{x}\,dx=(-1)^{n+1}n!\left[\frac{n+1}{2}\zeta(n+2)-\frac{1}{2}\sum_{k=1}^{n-1}\zeta(k+1)\zeta(n+1-k)\right]$$

Some examples for the first few naturals:

\begin{align*}
n=1:&\quad \int_{0}^{1}\frac{\ln(1-x)\ln(x)}{x}\,dx=\zeta(3)\\
n=2:&\quad \int_{0}^{1}\frac{\ln^{2}(1-x)\ln(x)}{x}\,dx=-3\zeta(4)+\zeta(2)\zeta(2)=-\frac{\pi^{4}}{180}\\
n=3:&\quad \int_{0}^{1}\frac{\ln^{3}(1-x)\ln(x)}{x}\,dx=12\zeta(5)-6\zeta(2)\zeta(3)=12\zeta(5)-\pi^{2}\zeta(3)\\
n=4:&\quad \int_{0}^{1}\frac{\ln^{4}(1-x)\ln(x)}{x}\,dx=-60\zeta(6)+24\zeta(2)\zeta(4)+12\zeta(3)\zeta(3)=12\zeta^{2}(3)-\frac{2\pi^{6}}{105}\\
n=5:&\quad \int_{0}^{1}\frac{\ln^{5}(1-x)\ln(x)}{x}\,dx=360\zeta(7)-120\zeta(2)\zeta(5)-120\zeta(3)\zeta(4)=-\frac{4\pi^{4}\zeta(3)}{3}-20\pi^{2}\zeta(5)+360\zeta(7)
\end{align*}

Proof

\begin{align*}
\int_{0}^{1}\frac{\ln^{n}(1-x)\ln(x)}{x}\,dx
=&\int_{0}^{1}\frac{\left[\frac{d^{n}}{dt^{n}}\Big|_{t=0^{+}}(1-x)^{t}\right]\left[\frac{d}{ds}\Big|_{s=0^{+}}x^{s}\right]}{x}\,dx\\
=&\frac{d^{n}}{dt^{n}}\Big|_{t=0^{+}}\frac{d}{ds}\Big|_{s=0^{+}}\int_{0}^{1}\frac{(1-x)^{t}x^{s}}{x}\,dx\\
=&\frac{d^{n}}{dt^{n}}\Big|_{t=0^{+}}\frac{d}{ds}\Big|_{s=0^{+}}\int_{0}^{1}(1-x)^{t}x^{s-1}\,dx\\
=&\frac{d^{n}}{dt^{n}}\Big|_{t=0^{+}}\frac{d}{ds}\Big|_{s=0^{+}}B(t+1,s)
\end{align*}

Now, recall the series expansion of the beta function (proof in Appendix 1):

$$B(x,y)=\sum_{j=0}^{\infty}\frac{(1-y)_{j}}{j!\,(x+j)}\qquad y>0$$

Therefore

\begin{align*}
\frac{d^{n}}{dt^{n}}\Big|_{t=0^{+}}\frac{d}{ds}\Big|_{s=0^{+}}B(t+1,s)
=&\frac{d^{n}}{dt^{n}}\Big|_{t=0^{+}}\frac{d}{ds}\Big|_{s=0^{+}}\sum_{j=0}^{\infty}\frac{(1-s)_{j}}{j!\,(t+1+j)}\\
=&\frac{d^{n}}{dt^{n}}\Big|_{t=0^{+}}\left[\lim_{s\to 0^{+}}\frac{d}{ds}\sum_{j=0}^{\infty}\frac{(1-s)_{j}}{j!\,(t+1+j)}\right]\\
=&\frac{d^{n}}{dt^{n}}\Big|_{t=0^{+}}\left[\lim_{s\to 0^{+}}\sum_{j=0}^{\infty}\frac{(1-s)_{j}\left(\psi^{(0)}(1-s)-\psi^{(0)}(1-s+j)\right)}{j!\,(t+1+j)}\right]\\
=&\frac{d^{n}}{dt^{n}}\Big|_{t=0^{+}}\left[\sum_{j=0}^{\infty}\frac{(1)_{j}\left(\psi^{(0)}(1)-\psi^{(0)}(1+j)\right)}{j!\,(t+1+j)}\right]\\
=&\frac{d^{n}}{dt^{n}}\Big|_{t=0^{+}}\left[\sum_{j=0}^{\infty}\frac{\Gamma(j+1)\left(-\gamma-\psi^{(0)}(1+j)\right)}{j!\,(t+1+j)}\right]\\
=&\frac{d^{n}}{dt^{n}}\Big|_{t=0^{+}}\left[-\sum_{j=0}^{\infty}\frac{\Gamma(j+1)H_{j}}{j!\,(t+1+j)}\right]\\
=&\frac{d^{n}}{dt^{n}}\Big|_{t=0^{+}}\left[-\sum_{j=0}^{\infty}\frac{H_{j}}{t+1+j}\right]\\
=&-\lim_{t\to 0^{+}}\sum_{j=0}^{\infty}\frac{d^{n}}{dt^{n}}\frac{H_{j}}{t+1+j}\\
=&-\lim_{t\to 0^{+}}\sum_{j=0}^{\infty}\frac{(-1)^{n}n!\,H_{j}}{(t+1+j)^{n+1}}\\
=&\sum_{j=0}^{\infty}\frac{(-1)^{n+1}n!\,H_{j}}{(1+j)^{n+1}}
\end{align*}

Now we know that $H_{j+1}=\frac{1}{j+1}+H_{j}$, therefore

\begin{align*}
\sum_{j=0}^{\infty}\frac{(-1)^{n+1}n!\,H_{j}}{(1+j)^{n+1}}
=&\sum_{j=0}^{\infty}\frac{(-1)^{n+1}n!\,H_{j+1}}{(1+j)^{n+1}}-\sum_{j=0}^{\infty}\frac{(-1)^{n+1}n!}{(1+j)^{n+2}}\\
=&(-1)^{n+1}n!\left[\sum_{j=0}^{\infty}\frac{H_{j+1}}{(1+j)^{n+1}}-\sum_{j=0}^{\infty}\frac{1}{(1+j)^{n+2}}\right]
\end{align*}

If $m=j+1$ then

\begin{align*}
(-1)^{n+1}n!\left[\sum_{j=0}^{\infty}\frac{H_{j+1}}{(1+j)^{n+1}}-\sum_{j=0}^{\infty}\frac{1}{(1+j)^{n+2}}\right]
=&(-1)^{n+1}n!\left[\sum_{m=1}^{\infty}\frac{H_{m}}{m^{n+1}}-\sum_{m=1}^{\infty}\frac{1}{m^{n+2}}\right]\\
=&(-1)^{n+1}n!\left[\sum_{m=1}^{\infty}\frac{H_{m}}{m^{n+1}}-\zeta(n+2)\right]
\end{align*}

Now, recall this beautiful result proven by Euler (proof in Appendix 2):

$$\sum_{n=1}^{\infty}\frac{H_{n}}{n^{r}}=\left(1+\frac{r}{2}\right)\zeta(r+1)-\frac{1}{2}\sum_{k=1}^{r-2}\zeta(k+1)\zeta(r-k)$$

Then

\begin{align*}
(-1)^{n+1}n!\left[\sum_{m=1}^{\infty}\frac{H_{m}}{m^{n+1}}-\zeta(n+2)\right]
=&(-1)^{n+1}n!\left[\left(1+\frac{n+1}{2}\right)\zeta(n+2)-\frac{1}{2}\sum_{k=1}^{n-1}\zeta(k+1)\zeta(n+1-k)-\zeta(n+2)\right]\\
=&(-1)^{n+1}n!\left[\frac{n+1}{2}\zeta(n+2)-\frac{1}{2}\sum_{k=1}^{n-1}\zeta(k+1)\zeta(n+1-k)\right]
\end{align*}

Therefore

$$\int_{0}^{1}\frac{\ln^{n}(1-x)\ln(x)}{x}\,dx=(-1)^{n+1}n!\left[\frac{n+1}{2}\zeta(n+2)-\frac{1}{2}\sum_{k=1}^{n-1}\zeta(k+1)\zeta(n+1-k)\right]$$
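As a quick numerical sanity check of the formula just proved (my own addition, not part of the proof): the sketch below estimates the integral with a plain midpoint rule in pure Python and the zeta values with truncated partial sums. Function names and tolerances are mine; the midpoint rule is crude but adequate for the mild logarithmic singularities at the endpoints.

```python
import math

def zeta(s, terms=200_000):
    """Truncated partial sum of the Riemann zeta function (fine for s >= 2)."""
    return sum(1.0 / k**s for k in range(1, terms + 1))

def log_integral(n, steps=100_000):
    """Midpoint-rule estimate of the integral of ln^n(1-x) ln(x)/x over (0,1).

    The midpoint rule avoids evaluating the integrand at the (mildly
    singular) endpoints x = 0 and x = 1."""
    h = 1.0 / steps
    return h * sum(math.log1p(-x)**n * math.log(x) / x
                   for x in ((i + 0.5) * h for i in range(steps)))

def closed_form(n):
    """(-1)^(n+1) n! [ (n+1)/2 zeta(n+2) - 1/2 sum_{k=1}^{n-1} zeta(k+1) zeta(n+1-k) ]."""
    tail = sum(zeta(k + 1) * zeta(n + 1 - k) for k in range(1, n))
    return (-1)**(n + 1) * math.factorial(n) * ((n + 1) / 2 * zeta(n + 2) - tail / 2)

I1, I2 = log_integral(1), log_integral(2)
C1, C2 = closed_form(1), closed_form(2)
print(I1, C1)   # both should be close to  zeta(3)    ≈  1.2020569
print(I2, C2)   # both should be close to -pi^4/180   ≈ -0.5411616
```

Both the quadrature and the zeta partial sums agree with the closed forms to a few decimal places, which is all this check aims for.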

Appendix 1: Series expansion for the beta function

From the definition of the beta function:

$$B(x,y)=\int_{0}^{1}t^{x-1}(1-t)^{y-1}\,dt$$

Recall the binomial series expansion of $(1-t)^{y-1}$:

$$(1-t)^{y-1}=\sum_{j=0}^{\infty}(-1)^{j}\binom{y-1}{j}t^{j}$$

Then

\begin{align*}
\int_{0}^{1}t^{x-1}(1-t)^{y-1}\,dt
=&\int_{0}^{1}t^{x-1}\sum_{j=0}^{\infty}(-1)^{j}\binom{y-1}{j}t^{j}\,dt\\
=&\sum_{j=0}^{\infty}(-1)^{j}\binom{y-1}{j}\int_{0}^{1}t^{x+j-1}\,dt\\
=&\sum_{j=0}^{\infty}(-1)^{j}\binom{y-1}{j}\frac{1}{x+j}
\end{align*}

Then we have

\begin{align*}
\sum_{j=0}^{\infty}(-1)^{j}\binom{y-1}{j}\frac{1}{x+j}
=&\sum_{j=0}^{\infty}\binom{j-y}{j}\frac{1}{x+j}\\
=&\sum_{j=0}^{\infty}\frac{(j-y)!}{j!\,(-y)!}\cdot\frac{1}{x+j}\\
=&\sum_{j=0}^{\infty}\frac{\Gamma(j-y+1)}{j!\,\Gamma(1-y)}\cdot\frac{1}{x+j}\\
=&\sum_{j=0}^{\infty}\frac{(1-y)_{j}}{j!\,(x+j)}
\end{align*}

where the last equality uses the definition of the Pochhammer symbol (rising factorial). Therefore

$$B(x,y)=\sum_{j=0}^{\infty}\frac{(1-y)_{j}}{j!\,(x+j)}$$
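This expansion can be sanity-checked numerically (my own addition): compare partial sums of the series against $B(x,y)=\Gamma(x)\Gamma(y)/\Gamma(x+y)$ computed with `math.gamma`. Convergence is slow (the terms decay like $j^{-1-y}$), so many terms are needed; the function names and the test point $x=2$, $y=1/2$, where $B(2,1/2)=4/3$, are mine.

```python
import math

def beta_exact(x, y):
    """B(x, y) via the Gamma-function identity."""
    return math.gamma(x) * math.gamma(y) / math.gamma(x + y)

def beta_series(x, y, terms=1_000_000):
    """Partial sum of B(x,y) = sum_j (1-y)_j / (j! (x+j)).

    The ratio (1-y)_j / j! is updated iteratively to avoid computing
    large factorials; the series converges slowly for small y."""
    ratio = 1.0          # (1-y)_0 / 0! = 1
    total = 0.0
    for j in range(terms):
        total += ratio / (x + j)
        ratio *= (1 - y + j) / (j + 1)   # advance (1-y)_j / j! to j+1
    return total

x, y = 2.0, 0.5
B_exact, B_series = beta_exact(x, y), beta_series(x, y)
print(B_exact, B_series)   # both ≈ 4/3
```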

Appendix 2: Proof of Euler's theorem for harmonic series

I found a beautiful proof of this result which relies on the residue theorem. I really liked this proof because it shows that the famous result due to Euler is just a residue of a certain kernel function. I present here the original proof in the spirit of Flajolet and Salvy [1], but I fill in the parts omitted by the authors in the original paper.

We need the following Lemma due to Cauchy and Lindelöf.

Lemma. Let $\xi$ be a kernel function and let $r(s)$ be a rational function which is $\mathscr{O}(s^{-2})$ at infinity. Then

$$\underbrace{\sum_{\alpha}\left\{\operatorname{Res}\left(r(s)\xi(s),\alpha\right)\Big|\ \alpha \textrm{ is a pole of } \xi \textrm{ that is not a pole of } r(s)\right\}}_{\textrm{infinite series}}=-\underbrace{\sum_{\beta}\left\{\operatorname{Res}\left(r(s)\xi(s),\beta\right)\Big|\ \beta \textrm{ is a pole of } r(s)\right\}}_{\textrm{finite series}}$$

We will apply the lemma to the kernel $\xi(s)=(\psi(-s)+\gamma)^{2}$.

Proposition. If $\xi(s)=(\psi(-s)+\gamma)^{2}$ then

$$\sum_{\alpha}\left\{\operatorname{Res}\left(r(s)\xi(s),\alpha\right)\Big|\ \alpha \textrm{ is a pole of } \xi \textrm{ that is not a pole of } r(s)\right\}=2\sum_{n=1}^{\infty}H_{n}r(n)+\sum_{n=1}^{\infty}r'(n)
$$
Proof

The kernel $\xi(s)$ has countably many poles, located at the nonnegative integers; the poles relevant here are the positive ones (in the application below, $s=0$ is a pole of $r(s)$ and is treated separately).
Let $n\in\mathbb{N}$; then the function $\psi(-s)+\gamma$ has the following expansion as $s\to n$:

$$\psi(-s)+\gamma=\frac{1}{s-n}+H_{n}+\sum_{k=1}^{\infty}\left[(-1)^{k}H_{n}^{(k+1)}-\zeta(k+1)\right](s-n)^{k}$$

Therefore
\begin{align*}
(\psi(-s)+\gamma)^{2}
=&\Bigg[\frac{1}{s-n}+H_{n}+\sum_{k=1}^{\infty}\underbrace{\left[(-1)^{k}H_{n}^{(k+1)}-\zeta(k+1)\right]}_{A_{k}}(s-n)^{k}\Bigg]^{2}\\
=&\left(\frac{1}{s-n}+H_{n}\right)^{2}+2\left(\frac{1}{s-n}+H_{n}\right)\sum_{k=1}^{\infty}A_{k}(s-n)^{k}+\left(\sum_{k=1}^{\infty}A_{k}(s-n)^{k}\right)^{2}\\
=&\frac{1}{(s-n)^{2}}+\frac{2H_{n}}{s-n}+H_{n}^{2}+2\left(\frac{1}{s-n}+H_{n}\right)\sum_{k=1}^{\infty}A_{k}(s-n)^{k}+\left(\sum_{k=1}^{\infty}A_{k}(s-n)^{k}\right)^{2}\\
=&\frac{1}{(s-n)^{2}}+\frac{2H_{n}}{s-n}+H_{n}^{2}+2A_{1}+\mathscr{O}(s-n)
\end{align*}

Now, suppose $r(s)$ is a rational function that can be expanded in its power series at $s=n$ (in reality, we have assumed something weaker: that $r(s)$ is $\mathscr{O}(s^{-2})$ at infinity). Then for $s\to n$:

$$r(s)=r(n)+r'(n)(s-n)+\frac{r''(n)}{2!}(s-n)^{2}+\cdots$$

Therefore

\begin{align*} r(s)(\psi(-s)+\gamma)^2 =& r(n)(\psi(-s)+\gamma)^2+ r'(n)(s-n)(\psi(-s)+\gamma)^2+\frac{r''(n)}{2!}(s-n)^2(\psi(-s)+\gamma)^2+\cdots\\ =& \frac{r(n)}{(s-n)^2}+\frac{2H_{n}r(n)}{(s-n)} + H^2_{n}r(n) + 2A_{1}r(n) + \mathscr{O}(s-n)r(n) + \frac{r'(n)}{(s-n)}+2H_{n}r'(n) + H^2_{n}r'(n)(s-n) + 2A_{1}r'(n)(s-n) + \mathscr{O}(s-n)r'(n)(s-n) + \cdots \\ =& H^2_{n}r(n)+ 2A_{1}r(n)+2H_{n}r'(n)+\frac{r''(n)}{2!}+\frac{1}{s-n}\underbrace{\left[2H_{n}r(n)+r'(n)\right]}_{\textrm{residue at } n} +\frac{r(n)}{(s-n)^2} + \mathscr{O}(s-n) \end{align*}

$$\therefore \operatorname{Res}\left(r(s)(\psi(-s)+\gamma)^2, n\right) = 2H_{n}r(n)+r'(n)$$

Summing over each $n \in \mathbb{N}\setminus\left\{0\right\}$:

$$\boxed{\sum_{n=1}^{\infty} \operatorname{Res}\left(r(s)(\psi(-s)+\gamma)^2, n\right) = 2\sum_{n=1}^{\infty}H_{n}r(n)+ \sum_{n=1}^{\infty}r'(n)}$$

or, equivalently,

$$\boxed{\sum_{\alpha } \left\{\operatorname{Res}\left(r(s)\xi(s), \alpha\right) \Big|\ \alpha \textrm{ is a pole of } \xi \textrm{ that is not a pole of } r(s) \right\} = 2\sum_{n=1}^{\infty}H_{n}r(n)+ \sum_{n=1}^{\infty}r'(n)}$$
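The residue formula can also be checked numerically (my own sketch, not part of the original argument). Take $r(s)=1/s^{2}$ and $n=3$: the formula predicts $2H_{3}r(3)+r'(3)=\frac{11}{27}-\frac{2}{27}=\frac{1}{3}$. Below, $\psi$ is evaluated from the standard series $\psi(z)=-\gamma+\sum_{k\geq 0}\left(\frac{1}{k+1}-\frac{1}{k+z}\right)$, and the residue is computed as a contour integral over a small circle around $s=3$ with the trapezoid rule; all function names and parameters are mine.

```python
import cmath

GAMMA = 0.57721566490153286  # Euler–Mascheroni constant

def digamma(z, terms=20_000):
    """psi(z) from the series -gamma + sum_{k>=0} (1/(k+1) - 1/(k+z)).

    Valid for complex z away from the nonpositive integers; the series
    converges slowly, so many terms are used."""
    return -GAMMA + sum(1 / (k + 1) - 1 / (k + z) for k in range(terms))

def residue_at(n, radius=0.5, nodes=64):
    """Numerical residue of r(s) (psi(-s)+gamma)^2 at s=n, with r(s)=1/s^2.

    Uses (1/2 pi i) * contour integral on the circle |s-n| = radius,
    discretized with the (spectrally accurate) trapezoid rule."""
    total = 0j
    for m in range(nodes):
        s = n + radius * cmath.exp(2j * cmath.pi * m / nodes)
        f = (digamma(-s) + GAMMA) ** 2 / s**2
        total += f * (s - n)   # f(s) ds / (2 pi i) reduces to f * (s-n) / nodes
    return (total / nodes).real

H3 = 1 + 1/2 + 1/3
predicted = 2 * H3 / 9 - 2 / 27   # 2 H_n r(n) + r'(n) = 1/3
res = residue_at(3)
print(res, predicted)
```

The circle of radius 0.5 around $s=3$ encloses no other pole of the integrand, so the contour integral isolates exactly the residue the proposition describes.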

We are ready to prove Euler's theorem

Theorem (Euler). For integer $q \geq 2$,

$$\sum_{n=1}^{\infty} \frac{H_{n}}{n^q} = \left(1+\frac{q}{2}\right)\zeta(q+1) -\frac{1}{2}\sum_{k=1}^{q-2} \zeta(k+1)\zeta(q-k)$$

Proof

If we apply the last result to the function $r(s)=\frac{1}{s^q}$ (noting that $\sum_{n=1}^{\infty} r'(n) = -q\sum_{n=1}^{\infty} n^{-q-1} = -q\zeta(q+1)$) we have

$$\sum_{n=1}^{\infty} \operatorname{Res}\left(\frac{(\psi(-s)+\gamma)^2}{s^q}, n\right) = 2\sum_{n=1}^{\infty}\frac{H_{n}}{n^q} -q\zeta(q+1)$$

By the first lemma,

$$-\sum_{\beta} \left\{\operatorname{Res}\left(\frac{(\psi(-s)+\gamma)^2}{s^q}, \beta \right) \Big|\ \beta \textrm{ is a pole of } \frac{1}{s^q} \right\} = 2\sum_{n=1}^{\infty}\frac{H_{n}}{n^q} -q\zeta(q+1)$$

Given that $r(s)=\frac{1}{s^q}$ has a single pole at $s=0$, of order $q$, we have:

$$-\operatorname{Res}\left(\frac{(\psi(-s)+\gamma)^2}{s^q}, 0\right) =2\sum_{n=1}^{\infty}\frac{H_{n}}{n^q} -q\zeta(q+1)$$

To find the residue we can expand $\frac{(\psi(-s)+\gamma)^2}{s^q}$.
First, recall the series expansion of $\psi(-s)+\gamma$ at $s=0$:
$$\psi(-s)+\gamma = \frac{1}{s} - \sum_{k=1}^{\infty} \zeta(k+1)s^{k}$$

Therefore

\begin{align*} (\psi(-s)+\gamma)^2 =& \left[\frac{1}{s} -\sum_{k=1}^{\infty} \zeta(k+1)s^{k}\right]^2 \\ =& \frac{1}{s^2} -2\sum_{k=1}^{\infty} \zeta(k+1)s^{k-1} + \left(\sum_{k=1}^{\infty} \zeta(k+1)s^{k}\right)^2\\ =& \frac{1}{s^2} -2\sum_{k=1}^{\infty} \zeta(k+1)s^{k-1} + \sum_{k=1}^{\infty}\sum_{j=1}^{k} \zeta(j+1)\zeta(k-j+2)s^{k+1} \end{align*}

Then

$$\frac{(\psi(-s)+\gamma)^2}{s^q} = \frac{1}{s^{2+q}} -2\underbrace{\sum_{k=1}^{\infty} \zeta(k+1)s^{k-1-q}}_{A} + \underbrace{\sum_{k=1}^{\infty}\sum_{j=1}^{k} \zeta(j+1)\zeta(k-j+2)s^{k+1-q}}_{B}$$

From here it is easy to see that the residue contribution from $A$ is the coefficient of $s^{-1}$, i.e. $k-1-q=-1 \Longrightarrow k=q$, and the contribution from $B$ is the coefficient of $s^{-1}$, i.e. $k+1-q=-1 \Longrightarrow k=q-2$. Therefore

$$\operatorname{Res}\left(\frac{(\psi(-s)+\gamma)^2}{s^q},0\right) = -2\zeta(q+1) + \sum_{j=1}^{q-2} \zeta(j+1)\zeta(q-j)$$

Then

$$2\zeta(q+1) - \sum_{j=1}^{q-2} \zeta(j+1)\zeta(q-j) = 2\sum_{n=1}^{\infty}\frac{H_{n}}{n^q} -q\zeta(q+1)$$

$$\boxed{\therefore\sum_{n=1}^{\infty} \frac{H_{n}}{n^q} = \left(1+\frac{q}{2}\right)\zeta(q+1) -\frac{1}{2}\sum_{k=1}^{q-2} \zeta(k+1)\zeta(q-k)}$$
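As a final sanity check (my own addition), Euler's theorem can be confirmed numerically: for $q=2$ it gives $\sum H_n/n^2 = 2\zeta(3)$, and for $q=3$ it gives $\sum H_n/n^3 = \frac{5}{2}\zeta(4)-\frac{1}{2}\zeta(2)^2 = \frac{\pi^4}{72}$. The function name and truncation level below are mine.

```python
import math

def euler_sum(q, terms=200_000):
    """Partial sum of sum_{n>=1} H_n / n^q, with H_n accumulated on the fly."""
    total, H = 0.0, 0.0
    for n in range(1, terms + 1):
        H += 1.0 / n
        total += H / n**q
    return total

# q = 2:  (1 + 2/2) zeta(3)               = 2 zeta(3)  ≈ 2.4041138
# q = 3:  (5/2) zeta(4) - (1/2) zeta(2)^2 = pi^4 / 72  ≈ 1.3529041
print(euler_sum(2), 2 * 1.2020569031595943)
print(euler_sum(3), math.pi**4 / 72)
```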

[1] Philippe Flajolet, Bruno Salvy. Euler Sums and Contour Integral Representations. Research Report RR-2917, INRIA, 1996. ⟨inria-00073780⟩

