Since integration is not my strong suit, I need some feedback on this, please:

Let $Y$ be $\mathcal{N}(\mu,\sigma^2)$, the normal distribution with parameters $\mu$ and $\sigma^2$. I know $\mu$ is the expected value and $\sigma^2$ is the variance of $Y$.

I want to calculate the $n$-th central moments of $Y$.

The density function of $Y$ is $$f(x)=\frac{1}{\sigma\sqrt {2\pi}}e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^2}$$

The $n$-th central moment of $Y$ is $$E[(Y-E(Y))^n]$$

The $n$-th moment of $Y$ is $$E(Y^n)=\psi^{(n)}(0)$$ where $\psi$ is the moment-generating function $$\psi(t)=E(e^{tY})$$

So I started calculating:

$$\begin{align} E[(Y-E(Y))^n]&=\int_\mathbb{R}\left(f(x)-\int_\mathbb{R}f(x)dx\right)^n\,dx \\ &=\int_\mathbb{R}\sum_{k=0}^n\left[\binom{n}{k}(f(x))^k\left(-\int_\mathbb{R}f(x)dx\right)^{n-k}\right]\,dx \\ &=\sum_{k=0}^n\binom{n}{k}\left(\int_\mathbb{R}\left[(f(x))^k\left(-\int_\mathbb{R}f(x)dx\right)^{n-k}\right]\,dx\right) \\ &=\sum_{k=0}^n\binom{n}{k}\left(\int_\mathbb{R}\left[(f(x))^k\left(-\mu\right)^{n-k}\right]\,dx\right) \\ &=\sum_{k=0}^n\binom{n}{k}\left((-\mu)^{n-k}\int_\mathbb{R}(f(x))^k\,dx\right) \\ &=\sum_{k=0}^n\binom{n}{k}\left((-\mu)^{n-k}E\left(Y^k\right)\right) \\ \end{align}$$
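Whatever the status of the intermediate steps, the final line — writing the $n$-th central moment as $\sum_{k=0}^n\binom{n}{k}(-\mu)^{n-k}E(Y^k)$ — is the standard identity relating central and raw moments, and it can be sanity-checked exactly in Python. The discrete distribution below is hypothetical, chosen only so the arithmetic is exact; the identity does not depend on normality:

```python
from fractions import Fraction
from math import comb

# Hypothetical discrete distribution (value -> probability), chosen only
# to make the check exact with rational arithmetic.
dist = {Fraction(0): Fraction(1, 4), Fraction(1): Fraction(1, 2), Fraction(3): Fraction(1, 4)}

def raw_moment(k):
    """E[Y^k], computed directly from the distribution."""
    return sum(p * y**k for y, p in dist.items())

def central_moment(n):
    """E[(Y - mu)^n], computed directly from the definition."""
    mu = raw_moment(1)
    return sum(p * (y - mu)**n for y, p in dist.items())

def binomial_formula(n):
    """The last line above: sum_k C(n,k) (-mu)^{n-k} E[Y^k]."""
    mu = raw_moment(1)
    return sum(comb(n, k) * (-mu)**(n - k) * raw_moment(k) for k in range(n + 1))

for n in range(6):
    assert central_moment(n) == binomial_formula(n)  # exact equality of Fractions
```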

Am I on the right track or completely misguided? If I have made no mistakes so far, I would be glad to get some inspiration because I am stuck here. Thanks!

  • Since $Y−E(Y)$ has mean $0$ and in this case is normally distributed $N(0,\sigma^2)$, the $n$-th central moment should not be affected by the original mean $\mu$. – Henry Dec 19 '11 at 00:42
  • What you have so far is correct, but as @Henry points out, the central moments are invariant under a shift. So you may as well simplify things by taking $\mu=0$ from the start. In any case, you still need to find $E[Y^n]$ for the normal distribution with mean $0$. – mjqxxxx Dec 19 '11 at 01:02
  • Your question has a typo in the normal density: there should be a square in the exponent. Also, I disagree with @mjqxxxx's statement that "what you have so far is correct." The first step $$E[(Y-E(Y))^n]=\int_\mathbb{R}\left(f(x)-\int_\mathbb{R}f(x)dx\right)^n\,dx$$ is wrong: it should read $$E[(Y-E(Y))^n]=\int_\mathbb{R}\left(x-\int_\mathbb{R}xf(x)dx\right)^nf(x)\,dx=\int_\mathbb{R}\left(x-\mu\right)^nf(x)\,dx$$ and the last step follows immediately upon expanding $(x-\mu)^n$ via the binomial theorem, separating into a sum of integrals, and identifying $\int_\mathbb{R}x^kf(x)\,dx=E[Y^k]$. – Dilip Sarwate Dec 19 '11 at 01:41
  • "Am I on the right track.....?" It depends on where you want to go! As Sasha shows in his answer, $$E[(Y-\mu)^n] = \hat{m}_{n} = \begin{cases}0, & n~\text{odd},\\\sigma^n(n-1)(n-3)\cdots 3\cdot 1,& n~\text{even},\end{cases}$$ can be evaluated in straightforward fashion. On the other hand, your approach succeeded in expressing the central $\hat{m}_n$ in terms of the standard (non-central) moments $m_k = E[Y^k]$ and so now you have the task of evaluating $n$ different integrals to find the $m_k$'s. So your approach does not seem too promising, to say the least. – Dilip Sarwate Dec 19 '11 at 03:28
  • @Dilip: Thank you for pointing that out. – Aufwind Dec 19 '11 at 04:11

2 Answers

The $n$-th central moment $\hat{m}_n = \mathbb{E}\left( \left(X-\mathbb{E}(X)\right)^n \right)$. Notice that for the normal distribution $\mathbb{E}(X) = \mu$, and that $Y = X-\mu$ also follows a normal distribution, with zero mean and the same variance $\sigma^2$ as $X$.

Therefore, finding the central moment of $X$ is equivalent to finding the raw moment of $Y$.

In other words, $$ \begin{eqnarray} \hat{m}_n &=& \mathbb{E}\left( \left(X-\mathbb{E}(X)\right)^n \right) = \mathbb{E}\left( \left(X-\mu\right)^n \right) = \int_{-\infty}^\infty \frac{1}{\sqrt{2\pi} \sigma} (x-\mu)^n \mathrm{e}^{-\frac{(x-\mu)^2}{2 \sigma^2}} \mathrm{d} x\\ & \stackrel{y=x-\mu}{=}& \int_{-\infty}^\infty \frac{1}{\sqrt{2\pi} \sigma} y^n \mathrm{e}^{-\frac{y^2}{2 \sigma^2}} \mathrm{d} y \stackrel{y = \sigma u}{=} \int_{-\infty}^\infty \frac{1}{\sqrt{2\pi} \sigma} \sigma^n u^n \mathrm{e}^{-\frac{u^2}{2}} \sigma \mathrm{d} u \\ &=& \sigma^n \int_{-\infty}^\infty \frac{1}{\sqrt{2\pi} } u^n \mathrm{e}^{-\frac{u^2}{2}} \mathrm{d} u \end{eqnarray} $$ The latter integral is zero for odd $n$, as it is the integral of an odd function over the real line. So consider $$ \begin{eqnarray} && \int_{-\infty}^\infty \frac{1}{\sqrt{2\pi} } u^{2n} \mathrm{e}^{-\frac{u^2}{2}} \mathrm{d} u = 2 \int_{0}^\infty \frac{1}{\sqrt{2\pi} } u^{2n} \mathrm{e}^{-\frac{u^2}{2}} \mathrm{d} u \\ && \stackrel{u=\sqrt{2 w}}{=} \frac{2}{\sqrt{2\pi}} \int_0^\infty (2 w)^n \mathrm{e}^{-w} \frac{\mathrm{d} w }{\sqrt{2 w}} = \frac{2^n}{\sqrt{\pi}} \int_0^\infty w^{n-1/2} \mathrm{e}^{-w} \mathrm{d} w = \frac{2^n}{\sqrt{\pi}} \Gamma\left(n+\frac{1}{2}\right) \end{eqnarray} $$ where $\Gamma(x)$ stands for Euler's Gamma function. Using its properties we get $$ \hat{m}_{2n} = \sigma^{2n} (2n-1)!! \qquad\qquad \hat{m}_{2n+1} = 0 $$
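
The closing step, that $\frac{2^n}{\sqrt{\pi}}\Gamma\left(n+\frac{1}{2}\right)$ equals the double factorial $(2n-1)!!$, can be checked numerically with nothing but the standard library. A quick sketch (not part of the proof):

```python
import math

def odd_double_factorial(n):
    # (2n-1)!! = 1 * 3 * 5 * ... * (2n-1); the empty product (n = 0) is 1
    result = 1
    for k in range(1, 2 * n, 2):
        result *= k
    return result

def gamma_form(n):
    # 2^n / sqrt(pi) * Gamma(n + 1/2), as in the final integral above
    return 2**n / math.sqrt(math.pi) * math.gamma(n + 0.5)

for n in range(10):
    # compare in floating point, with a relative tolerance
    assert abs(gamma_form(n) - odd_double_factorial(n)) < 1e-9 * odd_double_factorial(n)
```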

– Sasha
  • Would you be so kind as to explain this a little further, please: *The latter integral is zero for odd n as an integral of even and odd functions over a real line.* – Aufwind Dec 19 '11 at 02:02
If $X\sim N(\mu,\sigma^2)$ then the $k$th central moment $E[(X-\mu)^k]$ is the same as the $k$th moment $E(Y^k)$ of $Y\sim N(0,\sigma^2)$.

For $Y\sim N(0,\sigma^2)$ the moment-generating function is$^\color{red}a$: $$E(e^{tY})=e^{t^2\sigma^2/2}.\tag1$$ One of the uses of the moment-generating function is, ahem, to generate moments. You can do this by expanding both sides of (1) as power series in $t$, and then matching coefficients. This is easily done for the normal distribution: Using $\displaystyle e^x=\sum\limits_{k=0}^\infty \frac {x^k}{k!}$, the LHS of (1) expands as $$ E(e^{tY})=E\left(\sum_{k=0}^\infty \frac{(tY)^k}{k!}\right)=\sum_{k=0}^\infty\frac{E(Y^k)}{k!}t^k\tag2 $$ while the RHS expands as $$ e^{t^2\sigma^2/2}=\sum_{k=0}^\infty \frac {(t^2\sigma^2/2)^k}{k!}=\sum_{k=0}^\infty\frac{\sigma^{2k}}{k!2^k}t^{2k}.\tag3 $$ By comparing coefficients of like powers of $t$ in (2) and (3), we see:

  • If $k$ is odd, then $E(Y^k)=0$.

  • If $k$ is even, say $k=2n$, then $\displaystyle\frac{E(Y^{2n})}{(2n)!}$, which is the coefficient of $t^{2n}$ in (2), equals the coefficient of $t^{2n}$ in (3), which is $\displaystyle\frac{\sigma^{2n}}{n!2^n}$. In other words: $$E(Y^{2n})=\frac{(2n)!}{n!2^n}\sigma^{2n}.\tag4 $$ By using $n!2^n=2(n)\cdot 2(n-1)\cdots2(1)=(2n)\cdot(2n-2)\cdots(2)$, we can rewrite (4) as: $$E(Y^{2n})=(2n-1)!!\,\sigma^{2n}.\tag5 $$
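
The rewriting from (4) to (5), i.e. that $(2n)!/(n!\,2^n)=(2n-1)!!$, is easy to confirm exactly in Python (a quick check, not part of the argument):

```python
import math

def odd_double_factorial(n):
    # (2n-1)!! = 1 * 3 * 5 * ... * (2n-1); the empty product (n = 0) is 1
    result = 1
    for k in range(1, 2 * n, 2):
        result *= k
    return result

# (2n)! / (n! 2^n) is an integer, so integer division is exact here
for n in range(20):
    assert math.factorial(2 * n) // (math.factorial(n) * 2**n) == odd_double_factorial(n)
```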


$\color{red}a:$ If $Z$ has a standard normal distribution then its moment generating function is

$$E(e^{tZ})=\int e^{tz}\frac1{\sqrt{2\pi}}e^{-\frac12z^2}\,dz=\int\frac1{\sqrt{2\pi}}e^{-\frac12(z^2-2tz)}dz=e^{t^2/2}\underbrace{ \int\frac1{\sqrt{2\pi}}e^{-\frac12(z-t)^2}dz }_{1}=e^{t^2/2}.$$

If $X\sim N(\mu,\sigma^2)$ then $X$ is distributed like $\mu+\sigma Z$ hence the moment generating function of $X$ is $$E(e^{tX})=E(e^{t(\mu +\sigma Z)})=e^{t\mu} E(e^{t\sigma Z}) = e^{t\mu+(t\sigma)^2/2}.$$
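
The closed form $E(e^{tZ})=e^{t^2/2}$ can also be verified numerically. A rough midpoint-rule sketch, with integration limits and step count chosen ad hoc (the Gaussian tails decay so fast that truncating at $\pm 12$ costs essentially nothing):

```python
import math

def mgf_numeric(t, lo=-12.0, hi=12.0, steps=100_000):
    """Approximate E[e^{tZ}] for standard normal Z by the midpoint rule."""
    h = (hi - lo) / steps
    total = 0.0
    for i in range(steps):
        z = lo + (i + 0.5) * h
        total += math.exp(t * z - 0.5 * z * z)  # e^{tz} * e^{-z^2/2}
    return total * h / math.sqrt(2.0 * math.pi)

for t in (0.0, 0.5, 1.0, 2.0):
    exact = math.exp(t * t / 2.0)  # the closed form e^{t^2/2}
    assert abs(mgf_numeric(t) - exact) < 1e-9 * exact
```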

– grand_chat