
Is there a simple example of random variables $X,Y$ that are uncorrelated (covariance is zero) but not independent?

I have looked up two references, however, I am dissatisfied with both.

  • In Reference $1$, $X,Y$ are assumed to be independent uniform RVs on $(0,1)$, and one constructs $Z = X+Y$, $W = X - Y$; the claim is that $Z,W$ are uncorrelated but not independent. Unfortunately, finding the joint PDF of $(Z,W)$ is not trivial.

  • In Reference $2$, $\phi$ is assumed to be a uniform RV on $(0, 2\pi)$, and one constructs $X = \cos(\phi)$, $Y = \sin(\phi)$. The claim is that $X,Y$ are uncorrelated but not independent. Unfortunately, the PDFs of $X,Y$ take the form of the rarely mentioned arcsine distribution.

I just wish to have an example at hand that I can whip out to show that uncorrelated does not necessarily imply independent. Is this doable?

Olórin
  • Do you really need the joint distributions? In example $1$, for instance, it seems fairly straightforward to calculate $E[ZW]$ without explicitly calculating the distribution of $(Z,W)$. – Theoretical Economist Oct 15 '17 at 02:17
  • It is indeed a popular question on [stats.stackexchange](https://stats.stackexchange.com/questions/85363/simple-examples-of-uncorrelated-but-not-independent-x-and-y/). – StubbornAtom Oct 15 '17 at 14:18
  • The PDFs of $Z,$ $W,$ and the joint distribution of $Z$ and $W$ look quite simple to me: the joint distribution is uniform on a square rotated $45$ degrees (which is one of the examples given on stats.SE). And the arcsine distribution may be "rarely mentioned" but it's quite simple. If you are looking for an example that's as often-used as a Gaussian, I think you're out of luck. But you now have a lot of answers to choose from; is there no answer among them that's good enough? – David K Oct 17 '17 at 00:01
  • https://math.stackexchange.com/q/249422/321264, https://math.stackexchange.com/q/1215345/321264 – StubbornAtom Apr 18 '21 at 10:54

8 Answers


Here's a (perhaps) simpler example. Let $X$ be $N(0,1)$ and $Y = X^2.$ Then $$ E(XY) = E(X^3) = 0 = E(X)E(Y),$$ so $X$ and $Y$ are uncorrelated, but clearly they aren't independent (if you know $X$, then you know $Y$).

amWhy
spaceisdarkgreen
  • This applies to any symmetric distribution, actually (except for the trivial point mass at zero). For example it would apply with $X$ such that $p_X(1)=1/2,p_X(-1)=1/2,p_X(x)=0$ otherwise, where $p_X$ is the PMF. – Ian Oct 15 '17 at 02:36
  • @Ian Any nondegenerate, symmetric, *integrable* RV :). – spaceisdarkgreen Oct 15 '17 at 02:39
  • In fact you need a finite third moment, but yes, you do need some integrability hypothesis. – Ian Oct 15 '17 at 02:45
  • @Ian while we're quibbling here, actually now that I think about it, your example doesn't quite work ($X$ and $X^2$ are independent)... so there is some other nontriviality assumption like $X^2$ is not constant. – spaceisdarkgreen Oct 15 '17 at 02:50
  • Oh, yes, $X^2$ in that case has the "trivial independence problem", where conditioning doesn't do anything simply because you already know everything there is to know. OK, so yeah, starting from any nonnegative nondegenerate distribution with finite $3/2$ moment, taking the square root and independently choosing the sign uniformly at random gives an $X$ with the desired properties. That's...a somewhat ugly set of hypotheses :/ – Ian Oct 15 '17 at 03:02

Two fair coins are tossed independently; the first has sides labelled $0$ and $1,$ the second has sides labelled $1$ and $-1.$ Let $X$ be the number that comes up on the first coin, and let $Y$ be the product of the two numbers that come up.

The variables $X$ and $Y$ are uncorrelated: since $XY=Y,$ $$E(XY)=E(Y)=0=\frac12\cdot0=E(X)E(Y).$$ The variables $X$ and $Y$ are not independent: $$P(X=0,Y=0)=P(X=0)=\frac12\ne\frac12\cdot\frac12=P(X=0)P(Y=0).$$
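Since the sample space here has only four equally likely outcomes, both claims can be verified exactly by enumeration (a small Python sketch; variable names are my own):

```python
# First coin shows a in {0, 1}; second shows b in {1, -1}; X = a, Y = a * b.
from itertools import product

outcomes = [(a, a * b) for a, b in product([0, 1], [1, -1])]  # each has prob 1/4
p = 1 / len(outcomes)

E_X  = sum(x for x, y in outcomes) * p      # 1/2
E_Y  = sum(y for x, y in outcomes) * p      # 0
E_XY = sum(x * y for x, y in outcomes) * p  # 0
assert E_XY == E_X * E_Y  # uncorrelated: covariance is 0

# Not independent: P(X=0, Y=0) = 1/2, but P(X=0) * P(Y=0) = 1/2 * 1/2 = 1/4.
P_X0   = sum(p for x, y in outcomes if x == 0)
P_Y0   = sum(p for x, y in outcomes if y == 0)
P_X0Y0 = sum(p for x, y in outcomes if x == 0 and y == 0)
assert P_X0Y0 != P_X0 * P_Y0
```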

bof

How about $(X,Y)$ taking values $(1,0)$, $(-1,0)$, $(0,1)$ and $(0,-1)$ each with probability $1/4$? Then $E(X)=E(Y)=0$ and $XY=0$, so the covariance is zero, but $X$ and $Y$ are not independent.

Angina Seng

Really boring example: $$ \begin{array}{c|ccc} Y \backslash X & -1 & 0 & 1 \\ \hline -1 & p & p & p \\ 0 & p & 1-8p & p \\ 1 & p & p & p \end{array}. $$ Then the marginal distributions are both $$ \begin{array}{ccc} -1 & 0 & 1 \\ \hline 3p & 1-6p & 3p \end{array}, $$ so $E[X]=E[Y]=0$. A similar calculation shows $E[XY]=0$, but an independent joint distribution would have to be the product of the marginals, $$ \begin{array}{c|ccc} & -1 & 0 & 1 \\ \hline -1 & 9p^2 & 3p(1-6p) & 9p^2 \\ 0 & 3p(1-6p) & (1-6p)^2 & 3p(1-6p) \\ 1 & 9p^2 & 3p(1-6p) & 9p^2 \end{array}. $$ It is easy to see this corresponds to the original table precisely when $p=1/9$ or $0$: $p$ anything between $0$ and $1/9$ exclusive gives a counterexample.
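A sketch verifying the table exactly for one admissible choice, $p = 1/12$ (the specific value is an arbitrary pick from $(0, 1/9)$; `fractions` keeps the arithmetic exact):

```python
# Build the joint table, check the marginals, the zero covariance, and the
# failure of independence, all in exact rational arithmetic.
from fractions import Fraction
from itertools import product

p = Fraction(1, 12)
vals = [-1, 0, 1]
joint = {(x, y): (1 - 8 * p if x == y == 0 else p) for x, y in product(vals, vals)}
assert sum(joint.values()) == 1

# Marginals: 3p at -1 and 1, and 1 - 6p at 0, for both X and Y.
pX = {x: sum(joint[(x, y)] for y in vals) for x in vals}
pY = {y: sum(joint[(x, y)] for x in vals) for y in vals}
assert pX == pY == {-1: 3 * p, 0: 1 - 6 * p, 1: 3 * p}

E_X  = sum(x * q for x, q in pX.items())             # 0
E_XY = sum(x * y * joint[(x, y)] for x, y in joint)  # 0
assert E_X == 0 and E_XY == 0  # covariance is 0

# But the joint is not the product of the marginals, so X, Y are dependent.
assert any(joint[(x, y)] != pX[x] * pY[y] for x, y in joint)
```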

Chappers
  • An even smaller example is $$ \begin{array}{c|ccc} Y \backslash X & -1 & 0 & 1 \\ \hline -1 & p & 1/2-2p & p \\ 1 & q & 1/2-2q & q \end{array}, $$ which has the same properties, but is not independent unless $p=q$. – Chappers Oct 15 '17 at 03:05

Let $X$ be uniform over $[-1,1]$ and let $Y = |X|$. Then $E[XY] = E[X\,|X|] = 0$, since $X\,|X|$ is symmetric about $0$, and $E[X] = 0$, so $X$ and $Y$ are uncorrelated. But $Y$ is uniform over $[0,1]$ and $Y$ given $X$ is deterministic, so they are not independent.

andrepd
Vignesh

Let $X$ be any symmetric, square integrable random variable, and let $Y$ be independent of $X$, with $P(Y=-1)=P(Y=1)=\frac{1}{2}$.

Then $X$ and $XY$ are trivially uncorrelated, but certainly not independent, as $\lvert X\rvert=\lvert XY\rvert$.

tomasz

A very simple example: let $(X,Y)$ take the values $(0,0)$, $(1,1)$, $(2,0)$, each with probability $1/3$. Then $Y$ is functionally dependent on $X$, yet the covariance is $0$.
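This can also be checked exactly by enumeration; the means are nonzero here, so one computes $\operatorname{Cov}(X,Y) = E[XY] - E[X]E[Y]$ directly (a small Python sketch):

```python
# (X, Y) uniform on {(0,0), (1,1), (2,0)}; exact arithmetic with Fraction.
from fractions import Fraction

points = [(0, 0), (1, 1), (2, 0)]
p = Fraction(1, 3)

E_X  = sum(x * p for x, y in points)      # 1
E_Y  = sum(y * p for x, y in points)      # 1/3
E_XY = sum(x * y * p for x, y in points)  # 1/3
cov = E_XY - E_X * E_Y
assert cov == 0

# Y is a function of X, so they are not independent:
# e.g. P(X=0, Y=1) = 0 but P(X=0) * P(Y=1) = 1/3 * 1/3 = 1/9.
P_X0Y1 = sum(p for x, y in points if x == 0 and y == 1)
assert P_X0Y1 == 0 and P_X0Y1 != p * p
```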


Another example that is easy to explain on the run:

For $\theta$ uniform on $[0, 2\pi]$, set $X = \sin(\theta)$, $Y = \cos(\theta)$.

Then $X^2 + Y^2 = 1$, so clearly $X$ and $Y$ are not independent.

However, correlation detects only linear relationships, and here $E[X] = E[Y] = E[XY] = 0$, so the correlation between $X$ and $Y$ is zero.

Ian