
What is the distribution of a random variable that is the product of two normal random variables?

Let $X\sim N(\mu_1,\sigma_1)$, $Y\sim N(\mu_2,\sigma_2)$, and $Z=XY$.

That is, what is its probability density function, its expected value, and its variance?

I'm kind of stuck and I can't find a satisfying answer on the web. If anybody knows the answer, or a reference or link, I would be really thankful...

Justin
Clara
  • Please check if this [post](http://math.stackexchange.com/questions/133938/what-is-the-density-of-the-product-of-k-i-i-d-normal-random-variables) answers your questions. – Sasha Jun 22 '12 at 18:04
  • Saying that each is normally distributed falls short of saying what the joint distribution is. Often it's intended that they are independent but that doesn't get mentioned. But it should be. – Michael Hardy Jun 22 '12 at 18:06
  • The distribution is fairly messy. For independent normals with mean $0$, we are dealing with the *product normal*, which has been studied. For general independent normals, mean and variance of the product are not hard to compute from general properties of expectation. – André Nicolas Jun 22 '12 at 18:13
  • @Clara, I attempted to edit your post to make it more clear. Please let me know if anything was incorrect. – Justin Jun 22 '12 at 19:56

5 Answers


I will assume $X$ and $Y$ are independent. By scaling, we may assume for simplicity that $\sigma_1 = \sigma_2 = 1$. You might then note that $XY = (X+Y)^2/4 - (X-Y)^2/4$ where $X+Y$ and $X-Y$ are independent normal random variables; $(X+Y)^2/2$ and $(X-Y)^2/2$ have noncentral chi-squared distributions with $1$ degree of freedom. If $f_1$ and $f_2$ are the densities for those, the PDF for $XY$ is $$ f_{XY}(z) = 2 \int_0^\infty f_1(t) f_2(t-2z)\ dt$$
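A quick way to gain confidence in this convolution formula is to evaluate it numerically and compare it with a simulation. The sketch below assumes NumPy and SciPy are available; the means and the evaluation points are arbitrary illustrative choices, not part of the original answer ($\sigma_1=\sigma_2=1$ as above).

```python
# A sketch, not from the original answer: mu1, mu2 and the evaluation points
# are arbitrary illustrative choices (sigma1 = sigma2 = 1 as in the answer).
import numpy as np
from scipy.stats import ncx2
from scipy.integrate import quad

mu1, mu2 = 0.5, -0.3

# (X+Y)^2/2 and (X-Y)^2/2 are noncentral chi-squared with 1 df and
# noncentrality (mu1+mu2)^2/2 and (mu1-mu2)^2/2, respectively.
f1 = lambda t: ncx2.pdf(t, df=1, nc=(mu1 + mu2) ** 2 / 2)
f2 = lambda t: ncx2.pdf(t, df=1, nc=(mu1 - mu2) ** 2 / 2)

def pdf_xy(z):
    # f_XY(z) = 2 * int_0^inf f1(t) f2(t - 2z) dt ; f2 vanishes for t < 2z,
    # and the tiny shift keeps quad away from the integrable endpoint singularity
    lo = max(0.0, 2 * z) + 1e-12
    val, _ = quad(lambda t: f1(t) * f2(t - 2 * z), lo, np.inf)
    return 2 * val

# Compare against a Monte Carlo histogram of Z = XY.
rng = np.random.default_rng(0)
z = rng.normal(mu1, 1, 10**6) * rng.normal(mu2, 1, 10**6)
hist, edges = np.histogram(z, bins=200, range=(-4, 4), density=True)
centers = (edges[:-1] + edges[1:]) / 2
for zi in (-1.0, 0.5, 2.0):
    i = np.argmin(np.abs(centers - zi))
    print(f"z={zi:+.1f}  formula={pdf_xy(zi):.4f}  simulation={hist[i]:.4f}")
```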

Robert Israel
  • Very nice. (+1) – robjohn Jun 22 '12 at 21:49
  • If $X$ and $Y$ are independent standard gaussians, the characteristic function should be $$\phi_{XY}(t)=\frac{1}{\sqrt{1+t^2}}$$ as done here http://math.stackexchange.com/questions/101062/is-the-product-of-two-gaussian-random-variables-also-a-gaussian/101120#101120 (see the numerical check after these comments) – qwertyuio Jan 18 '13 at 22:54
  • @Robert Israel: Why are $X+Y$ and $X-Y$ independent? – May Sep 13 '13 at 18:05
  • Two jointly normal random variables are independent iff they are uncorrelated (and $X+Y$ and $X-Y$ are jointly normal here). Calculate the covariance of $X+Y$ and $X-Y$. – Robert Israel Sep 14 '13 at 01:40
  • is this correct? $Cov(X+Y,X-Y)=Cov(X,X)-Cov(X,Y)+Cov(X,Y)-Cov(Y,Y)=Var(X)-Var(Y)$ so if their variances are not equal they will not be independent. – May Sep 14 '13 at 01:47
  • Yes, but I said "By scaling, we may assume for simplicity that $\sigma_1 = \sigma_2 = 1$." – Robert Israel Sep 15 '13 at 06:57
  • Could you please go further into why the PDF for XY is $2 \int f_1(t)f_2(t-2z)\,dt$? I read through the wikis for non-central and central Chi but I'm still having trouble understanding the leap you make to write down the PDF. Why do you have an integral? Can't you just leave the PDF as the product of the independent r.v.s PDFs? – goldisfine Nov 01 '14 at 03:11
  • If $A$ and $B$ are independent with densities $f_A$ and $f_B$, then the density for $A+B$ is $f_{A+B}(x) = \int f_A(t) f_B(x-t)\; dt$ (the convolution of the two densities). In this case take $A = (X+Y)^2/4$ and $B = -(X-Y)^2/4$. – Robert Israel Nov 02 '14 at 02:59
  • Thanks Prof. Israel, much appreciated! – goldisfine Nov 03 '14 at 15:25
  • Why do you assume that $X$ and $Y$ are independent? – Diego Fonseca Feb 26 '17 at 02:09
  • Because otherwise there is no answer: you need more information. – Robert Israel Feb 26 '17 at 07:34
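The characteristic function quoted in the comments above is easy to sanity-check by simulation. Here is a minimal sketch assuming NumPy; the sample size and the $t$ values are arbitrary choices.

```python
# A minimal Monte Carlo sanity check of phi_XY(t) = 1/sqrt(1+t^2) for the
# product of two independent standard normals.
import numpy as np

rng = np.random.default_rng(1)
z = rng.standard_normal(10**6) * rng.standard_normal(10**6)
for t in (0.5, 1.0, 2.0):
    est = np.mean(np.cos(t * z))  # the imaginary part vanishes by symmetry
    print(f"t={t}:  E[cos(tZ)]={est:.4f}   1/sqrt(1+t^2)={1/np.sqrt(1+t**2):.4f}")
```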

For the special case that both Gaussian random variables $X$ and $Y$ have zero mean and unit variance, and are independent, the answer is that $Z=XY$ has the probability density $p_Z(z)={\rm K}_0(|z|)/\pi$. The brute force way to do this is via the transformation theorem: \begin{align} p_Z(z)&=\frac{1}{2\pi}\int_{-\infty}^\infty{\rm d}x\int_{-\infty}^\infty{\rm d}y\;{\rm e}^{-(x^2+y^2)/2}\delta(z-xy) \\ &= \frac{1}{\pi}\int_0^\infty\frac{{\rm d}x}{x}{\rm e}^{-(x^2+z^2/x^2)/2}\\ &= \frac{1}{\pi}{\rm K}_0(|z|) \ . \end{align}
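A short simulation confirms the $K_0$ density. The sketch below assumes NumPy and SciPy; the sample size and the evaluation points are arbitrary choices.

```python
# A sketch comparing p_Z(z) = K_0(|z|)/pi with a simulation for independent
# standard normals.
import numpy as np
from scipy.special import k0

rng = np.random.default_rng(2)
z = rng.standard_normal(10**6) * rng.standard_normal(10**6)
hist, edges = np.histogram(z, bins=200, range=(-4, 4), density=True)
centers = (edges[:-1] + edges[1:]) / 2
for zi in (0.25, 1.0, 2.5):
    i = np.argmin(np.abs(centers - zi))
    print(f"z={zi}:  K0(|z|)/pi={k0(abs(zi))/np.pi:.4f}   simulation={hist[i]:.4f}")
```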

Markus Deserno

Given the densities $\varphi$ and $\psi$ of two independent random variables, the probability that their product is less than $z$ is $$ \iint_{xy< z}\varphi(x)\psi(y)\,\mathrm{d}x\,\mathrm{d}y\tag{1} $$ For fixed $y$, substituting $w=xy$ so that $x=w/y$ and $\mathrm{d}x=\mathrm{d}w/|y|$ (the absolute value accounts for $y<0$) yields $$ \iint_{w< z}\varphi\left(\frac{w}{y}\right)\psi(y)\,\mathrm{d}w\,\frac{\mathrm{d}y}{|y|}\tag{2} $$ Taking the derivative of $(2)$ with respect to $z$ gives the density of the product of the random variables to be $$ \phi(z)=\int\varphi\left(\frac{z}{y}\right)\psi(y)\,\frac{\mathrm{d}y}{|y|}\tag{3} $$ We can compute the expected value using this distribution as $$ \begin{align} \mathrm{E}(Z) &=\int z\phi(z)\,\mathrm{d}z\\ &=\iint z\,\varphi\left(\frac{z}{y}\right)\psi(y)\,\frac{\mathrm{d}y}{|y|}\,\mathrm{d}z\\ &=\iint xy\,\varphi(x)\psi(y)\,\mathrm{d}y\,\mathrm{d}x\tag{4} \end{align} $$ which is exactly what one would expect when computing the expected value of the product directly.

In the same way, we can also compute $$ \begin{align} \mathrm{E}(Z^2) &=\int z^2\phi(z)\,\mathrm{d}z\\ &=\iint z^2\,\varphi\left(\frac{z}{y}\right)\psi(y)\,\frac{\mathrm{d}y}{|y|}\,\mathrm{d}z\\ &=\iint x^2y^2\,\varphi(x)\psi(y)\,\mathrm{d}y\,\mathrm{d}x\tag{5} \end{align} $$ again getting the same result as when computing this directly.

The variance is then, as usual, $\mathrm{E}(Z^2)-\mathrm{E}(Z)^2$.
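Formula $(3)$, with the absolute value, can be checked numerically for any pair of normal densities. The sketch below assumes NumPy and SciPy; the parameter values are purely illustrative and not part of the original answer.

```python
# A sketch checking formula (3) for two assumed independent normals.
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

mu1, s1, mu2, s2 = 1.0, 1.5, -0.5, 1.0
varphi = lambda x: norm.pdf(x, mu1, s1)   # density of X
psi    = lambda y: norm.pdf(y, mu2, s2)   # density of Y

def product_pdf(z):
    # phi(z) = int varphi(z/y) psi(y) dy/|y|, split at y = 0
    f = lambda y: 0.0 if y == 0 else varphi(z / y) * psi(y) / abs(y)
    neg, _ = quad(f, -np.inf, 0)
    pos, _ = quad(f, 0, np.inf)
    return neg + pos

# Compare against a Monte Carlo histogram of Z = XY.
rng = np.random.default_rng(3)
z = rng.normal(mu1, s1, 10**6) * rng.normal(mu2, s2, 10**6)
hist, edges = np.histogram(z, bins=400, range=(-10, 10), density=True)
centers = (edges[:-1] + edges[1:]) / 2
for zi in (-2.0, 0.5, 2.0):
    i = np.argmin(np.abs(centers - zi))
    print(f"z={zi:+.1f}  formula={product_pdf(zi):.4f}  simulation={hist[i]:.4f}")
```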

robjohn

To your first point, let's look at how we calculate the expectation of a product. The idea is easy enough, but Gaussian distributions can make it look messier than it really is.

Specifically, to look at the distribution function, I would start here: http://en.wikipedia.org/wiki/Product_distribution

For expectation though, recall (or note):

For independent random variables, the joint probability density function $h(x,y)$ is simply the product of the marginal densities, say $f(x)$ and $g(y)$.

That is, $h(x,y)=f(x)\,g(y)$. You find the expectation in the same way you would for a single variable with a single pdf. Namely,

$$E(XY)=E(Z)=\iint xy\,h(x,y)\,dy\,dx=\iint xy\,f(x)g(y)\,dy\,dx=\left[\int xf(x)\,dx\right]\left[\int yg(y)\,dy\right]=E(X)E(Y)$$

For standard normal RVs, this is simple to compute. If, in fact, your variables are not independent, then you need to incorporate a covariance term, since in general $E(XY)=E(X)E(Y)+\mathrm{Cov}(X,Y)$.
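As a concrete illustration of the calculation above, here is a minimal sketch assuming NumPy and SciPy; the parameter values are arbitrary illustrative choices. It evaluates the double integral and compares it with $E(X)E(Y)$.

```python
# A minimal sketch of the expectation calculation for two assumed
# independent normals; parameter values are illustrative only.
import numpy as np
from scipy.integrate import dblquad
from scipy.stats import norm

mu_x, s_x, mu_y, s_y = 1.5, 2.0, -0.75, 0.5
f = lambda x: norm.pdf(x, mu_x, s_x)
g = lambda y: norm.pdf(y, mu_y, s_y)

# E(XY) = iint x*y*f(x)*g(y) dy dx, truncated to +/- 10 sd for the quadrature
exy, _ = dblquad(lambda y, x: x * y * f(x) * g(y),
                 mu_x - 10 * s_x, mu_x + 10 * s_x,
                 lambda x: mu_y - 10 * s_y, lambda x: mu_y + 10 * s_y)
print("E(XY) via the double integral:", exy)
print("E(X)E(Y)                     :", mu_x * mu_y)
```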

Hope that helps.

Justin
  • More generally, if $X,Y$ are independent, then $E(Z^m)=E(X^m)E(Y^m)$. In this way, we get $\sigma^2_Z=\sigma^2_X \sigma^2_Y + \sigma^2_Y \mu_X^2 +\sigma^2_X \mu_Y^2 $ – leonbloy Jun 22 '12 at 20:12
  • ... and more generally, $E[X^m] = \sum_{k=0}^{\lfloor m/2 \rfloor} \dfrac{\mu_1^{m-2k} \sigma_1^{2k} m!}{2^k k! (m-2k)!}$ and similarly for $E[Y^m]$, so you can get all the moments (both formulas are checked numerically below). – Robert Israel Jun 22 '12 at 20:33
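Both closed forms quoted in the comments above, the variance of the product and the raw normal moments, are easy to verify numerically. A minimal sketch assuming NumPy; the parameter values and sample size are arbitrary.

```python
# A sketch verifying the variance of XY and the raw moments E[X^m] of a normal.
import numpy as np
from math import factorial

mu1, s1, mu2, s2 = 0.7, 1.3, -1.1, 0.4
rng = np.random.default_rng(4)
x = rng.normal(mu1, s1, 10**6)
y = rng.normal(mu2, s2, 10**6)

var_formula = s1**2 * s2**2 + s2**2 * mu1**2 + s1**2 * mu2**2
print("Var(XY):  simulated", np.var(x * y), " formula", var_formula)

def normal_moment(m, mu, sigma):
    # E[X^m] = sum_k mu^(m-2k) sigma^(2k) m! / (2^k k! (m-2k)!)
    return sum(mu**(m - 2 * k) * sigma**(2 * k) * factorial(m)
               / (2**k * factorial(k) * factorial(m - 2 * k))
               for k in range(m // 2 + 1))

for m in (1, 2, 3, 4):
    print(f"E[X^{m}]:  simulated {np.mean(x**m):.4f}  formula {normal_moment(m, mu1, s1):.4f}")
```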

There is a book devoted to this, The Algebra of Random Variables by Melvin D. Springer (Wiley, 1979), which includes a lot on products: http://www.amazon.com/Algebra-Variables-Probability-Mathematical-Statistics/dp/0471014060/ref=sr_1_1?s=books&ie=UTF8&qid=1340403029&sr=1-1&keywords=the+algebra+of+random+variables

In searching I also found this book by Galambos and Simonelli: http://www.amazon.com/s/ref=nb_sb_noss?url=search-alias%3Dstripbooks&field-keywords=product+of+random+variables

Michael R. Chernick