
Say $X_1, X_2, \ldots, X_n$ are independent and identically distributed uniform random variables on the interval $(0,1)$.

What is the distribution of the product of two such random variables, e.g., $Z_2 = X_1 \cdot X_2$?

What about three, $Z_3 = X_1 \cdot X_2 \cdot X_3$?

And what about $n$ such uniform variables, $Z_n = X_1 \cdot X_2 \cdot \ldots \cdot X_n$?

holeInAce
lulu
  • If you are after the product of $n$ independent standard Uniform random variables, then the pdf of the product, say $f(z)$, will be: $$f(z) = \frac{(-1)^{n-1} \log ^{n-1}(z)}{(n-1)!} \qquad \text{ for } 0 < z < 1$$ – wolfies Feb 01 '14 at 07:20
  • @wolfies Thanks, this is very helpful – lulu Feb 01 '14 at 07:28
  • @wolfies -log(z) gives 4.6 for `z=.01`. I must be doing something wrong; the probability cannot go over 1. – Asad Iqbal Dec 29 '14 at 04:13
  • My question is answered. I found this: http://math.stackexchange.com/questions/105455/how-can-a-probability-density-be-greater-than-one-and-integrate-to-one Thanks! – Asad Iqbal Dec 29 '14 at 04:32
  • @AsadIqbal: The probability *density* **can** go over $1$. – robjohn Jun 08 '18 at 14:21
  • How do you use the PDF to get a mean? E.g. If I have 2 uniform random numbers between 0 and 1, what is the mean of their product? – Robert Feb 02 '21 at 13:15
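Regarding the mean asked about in the last comment above: with the $n = 2$ density $f_{Z_2}(z) = -\log z$ on $(0,1]$ (derived in the answers below), a short worked calculation gives $$\operatorname{E}[Z_2] = \int_0^1 z\,(-\log z)\,dz = \Big[-\tfrac{z^2}{2}\log z\Big]_0^1 + \int_0^1 \tfrac{z}{2}\,dz = \tfrac14,$$ which agrees with $\operatorname{E}[X_1]\,\operatorname{E}[X_2] = \tfrac12 \cdot \tfrac12$ by independence.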

3 Answers


We can at least work out the distribution of two IID ${\rm Uniform}(0,1)$ variables $X_1, X_2$: Let $Z_2 = X_1 X_2$. Then the CDF is $$\begin{align*} F_{Z_2}(z) &= \Pr[Z_2 \le z] = \int_{x=0}^1 \Pr[X_2 \le z/x] f_{X_1}(x) \, dx \\ &= \int_{x=0}^z \, dx + \int_{x=z}^1 \frac{z}{x} \, dx \\ &= z - z \log z. \end{align*}$$ Thus the density of $Z_2$ is $$f_{Z_2}(z) = -\log z, \quad 0 < z \le 1.$$

For a third variable, we would write $$\begin{align*} F_{Z_3}(z) &= \Pr[Z_3 \le z] = \int_{x=0}^1 \Pr[X_3 \le z/x] f_{Z_2}(x) \, dx \\ &= -\int_{x=0}^z \log x \, dx - \int_{x=z}^1 \frac{z}{x} \log x \, dx. \end{align*}$$ Then taking the derivative gives $$f_{Z_3}(z) = \frac{1}{2} \left( \log z \right)^2, \quad 0 < z \le 1.$$

In general, we can conjecture that $$f_{Z_n}(z) = \begin{cases} \frac{(- \log z)^{n-1}}{(n-1)!}, & 0 < z \le 1 \\ 0, & {\rm otherwise},\end{cases}$$ which we can prove via induction on $n$. I leave this as an exercise.
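As a quick numerical sanity check of the closed form $F_{Z_2}(z) = z - z\log z$ above, here is a minimal simulation sketch (NumPy; the sample size and evaluation grid are arbitrary choices):

```python
import numpy as np

# Compare the empirical CDF of Z_2 = X_1 * X_2 with the closed form
# F_{Z_2}(z) = z - z*log(z) derived above.
rng = np.random.default_rng(0)
samples = 1_000_000
z2 = np.sort(rng.uniform(size=samples) * rng.uniform(size=samples))

grid = np.linspace(0.01, 0.99, 99)
cdf_exact = grid - grid * np.log(grid)
cdf_empirical = np.searchsorted(z2, grid) / samples

print(np.max(np.abs(cdf_exact - cdf_empirical)))  # typically on the order of 1e-3
```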

heropup
  • Fantastic!!! thanks – lulu Feb 01 '14 at 07:31
  • For $Z_2$, how do you go from step 2 to step 3? And 3 to 4? – Sycorax Mar 03 '16 at 18:21
  • I don't understand why $\Pr[Z_2 \le z] = \int_{x=0}^1 \Pr[X_2 \le z/x] f_{X_1}(x) \, dx$. Can you help? Is this using conditional distribution? I'm not seeing it. – JKEG Oct 17 '18 at 15:35
  • @JKEG: These are just properties of functions of random variables. – MSIS Nov 20 '19 at 20:58
  • +1 for breaking up the integral in step 1 into a sum of 2 integrals. Thanks a lot! – DSP Rookie Apr 16 '20 at 12:29
  • I don't understand where the first summand of the integral comes from. Why is the integral broken into pieces in that step? – synack Jul 24 '20 at 06:56
  • @synack Suppose $z = 1/2$. Then if $x = 1/4$, $z/x = 2$. What is $\Pr[X_2 \le 2]$ if $X_2$ is a random variable that can only ever be between $0$ and $1$? You have two cases for $\Pr[X_2 \le z/x]$, depending on whether $x < z$ or $x \ge z$; that is why the integral gets split up. – heropup Jul 24 '20 at 07:21
  • got it. thanks @heropup – synack Jul 24 '20 at 07:30
  • I just want to clarify that $P(Z_2 \leq z) = \int_0^1 \Pr(X_2 \leq z/x) f_{X_1}(x) \, dx$ is just an application of the law of total probability, right? – student010101 Apr 05 '21 at 15:15
  • @MSIS In what way is that just a property of random variables? I cannot see how I would transform one into the other. – TheOutZ Mar 07 '22 at 15:16
  • @TheOutZ: I'm not 100% sure what your question is, but we're computing $P(X_1X_2 \leq z)$, which is equivalent to $P(X_1 \leq z/X_2)$. – MSIS Mar 16 '22 at 15:02
  • @TheOutZ: Is that what you meant? – MSIS Mar 17 '22 at 22:05
  • @MSIS Not really. I think what JKEG and I are trying to get at is why the probability can be represented as the integral $\mathbb{P}(Z_2 \leq z) = \int_{0}^{1} \mathbb{P}(X_2 \leq \frac{z}{x}) f_{X_1}(x) \, dx$. If I remember where my confusion came from correctly, I couldn't work out this identity with the transformation theorem ($\int_{\Omega} X \, d\mathbb{P}(\omega) = \int_{\mathbb{R}} x \, d\mathbb{P}_X(x)$), because the change of variables would always affect $X_2$ (both $X_1$ and $X_2$ vary with the same variable). – TheOutZ Mar 17 '22 at 23:30
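To spell out the conditioning step discussed in the comments above: by the law of total probability (conditioning on $X_1$) and independence, $$\Pr[Z_2 \le z] = \int_0^1 \Pr[X_1 X_2 \le z \mid X_1 = x]\, f_{X_1}(x)\,dx = \int_0^1 \Pr[X_2 \le z/x]\, f_{X_1}(x)\,dx,$$ and since $X_2$ lives in $(0,1)$, the integrand equals $1$ when $x \le z$ and $z/x$ when $x > z$, which is why the integral splits into two pieces.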

If $X_1$ is uniform, then $-\log X_1 \sim \textrm{Exp}(1)$. Therefore, $$- \log (X_1 \cdots X_n) = -\log X_1 - \dots - \log X_n$$ is a sum of independent exponential random variables and has a Gamma distribution with parameters $(n,1)$ and density $g(y) = \frac{1}{(n-1)!} y^{n-1}e^{-y}$ for $y\geq 0$. Let $f$ be the density of the product $X_1 \cdots X_n$; then the change-of-variables (Jacobi transformation) formula yields $$ f( h^{-1}(y) ) \, | \partial h^{-1}(y) | = g(y), $$ with $h(x) = -\log x$ and $h^{-1}(y) = \exp(-y)$. The substitution $y=h(x)$ in the above equation gives $$ f(x) = \frac{1}{(n-1)!}(-\log x)^{n-1} \, 1_{ (0,1]}(x).$$
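A minimal simulation sketch of this reduction (assuming NumPy and SciPy are available; $n$ and the sample size are arbitrary choices) checks that $-\log(X_1 \cdots X_n)$ is statistically indistinguishable from a $\mathrm{Gamma}(n,1)$ sample:

```python
import numpy as np
from scipy import stats

# -log(X_1 * ... * X_n) should follow a Gamma(n, 1) distribution.
rng = np.random.default_rng(0)
n, samples = 5, 200_000
y = (-np.log(rng.uniform(size=(samples, n)))).sum(axis=1)

# One-sample Kolmogorov-Smirnov test against the Gamma(n, 1) CDF;
# a large p-value means no detectable discrepancy.
print(stats.kstest(y, stats.gamma(a=n).cdf))
```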

Julian Wergieluk

An adaptation of this answer is given here.


PDF of a Function of a Random Variable

If $P(X\le x)=F(x)$ is the CDF of $X$ and $P(Y\le y)=G(y)$ is the CDF of $Y$ where $Y=f(X)$, then $$ F(x)=P(X\le x)=P(Y\le f(x))=G(f(x))\tag1 $$ Taking the derivative of $(1)$, we get $$ F'(x)=G'(f(x))\,f'(x)\tag2 $$ where $F'$ is the PDF of $X$ and $G'$ is the PDF of $Y$.


PDF of the Product of Independent Uniform Random Variables

If $[0\le x\le1]$ is the PDF for $X$ and $Y=\log(X)$, then by $(2)$ the PDF of $Y$ is $e^y[y\le0]$. The PDF for the sum of $n$ samples of $Y$ is the $n$-fold convolution of $e^y[y\le0]$ with itself. The Fourier Transform of this $n$-fold convolution is the $n^\text{th}$ power of the Fourier Transform of $e^y[y\le0]$, which is $$ \int_{-\infty}^0 e^{-2\pi iyt}e^y\,\mathrm{d}y=\frac1{1-2\pi it}\tag3 $$ Thus, the PDF for the sum of $n$ samples of $Y$ is $$ \begin{align} \sigma_n(y) &=\int_{-\infty}^\infty\frac{e^{2\pi iyt}}{(1-2\pi it)^n}\,\mathrm{d}t\tag{4a}\\ &=\frac{e^y}{2\pi i}\int_{1-i\infty}^{1+i\infty}\frac{e^{-yz}}{z^n}\,\mathrm{d}z\tag{4b}\\ &=e^y\frac{(-y)^{n-1}}{(n-1)!}\,[y\le0]\tag{4c} \end{align} $$ Explanation:
$\text{(4a)}$: take the inverse Fourier Transform
$\text{(4b)}$: substitute $t=\frac{1-z}{2\pi i}$
$\text{(4c)}$: if $y\gt0$, close the contour on the right half-plane, missing the singularity at $z=0$
$\phantom{\text{(4c):}}$ if $y\le0$, close the contour on the left half-plane, enclosing the singularity at $z=0$

We can get the PDF for the product of $n$ samples of $X$ by applying $(2)$ to $(4)$ $$ \bbox[5px,border:2px solid #C0A000]{\pi_n(x)=\frac{(-\log(x))^{n-1}}{(n-1)!}\,[0\le x\le1]}\tag5 $$
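As a rough numerical illustration of the convolution step, the sketch below (NumPy only; $n$, the grid spacing, and the truncation at $y=-30$ are arbitrary choices) builds the $n$-fold convolution of $e^y[y\le0]$ on a grid and compares it with $\text{(4c)}$:

```python
import numpy as np
from math import factorial

# Approximate the n-fold convolution of e^y [y <= 0] with itself on a grid
# and compare it with (4c): e^y (-y)^(n-1)/(n-1)! for y <= 0.
n = 4
y, dy = np.linspace(-30.0, 0.0, 6001, retstep=True)
g = np.exp(y)                               # density of Y = log(X), sampled on the grid

conv = g.copy()
for _ in range(n - 1):
    # Riemann-sum convolution; keeping the last y.size points restricts
    # the result back to the interval [-30, 0].
    conv = np.convolve(conv, g)[-y.size:] * dy

exact = np.exp(y) * (-y) ** (n - 1) / factorial(n - 1)
print(np.max(np.abs(conv - exact)))         # roughly dy-sized discretization error
```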

robjohn