Questions tagged [independence]

For questions involving the notion of independence of events, of independence of collections of events, or of independence of random variables. Use this tag along with the tags (probability), (probability-theory) or (statistics). Do not use it for linear independence of vectors or similar algebraic notions.

For events: Two events $A$ and $B$ are independent if $$P(A\cap B)=P(A)P(B)$$ More generally, a family $\mathscr F$ of events is independent if, for every finite number of distinct events $A_1$, $A_2$, $\ldots$, $A_n$ in $\mathscr F$, $$P\left(\bigcap_{i=1}^nA_i\right) =\prod_{i=1}^nP(A_i)$$
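For a concrete check of the definition: roll a fair die, let $A$ be the event "the outcome is even" and $B$ the event "the outcome is at most $4$". Then $P(A)=\frac12$, $P(B)=\frac23$ and $P(A\cap B)=P(\{2,4\})=\frac13=P(A)P(B)$, so $A$ and $B$ are independent.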

Two collections of events (for example, two $\sigma$-algebras) $\mathscr F$ and $\mathscr G$ are mutually independent (or simply, independent) if every $A$ in $\mathscr F$ and every $B$ in $\mathscr G$ are independent.
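For instance, if $A$ and $B$ are independent events, then the generated $\sigma$-algebras $\sigma(A)=\{\emptyset,A,A^c,\Omega\}$ and $\sigma(B)=\{\emptyset,B,B^c,\Omega\}$ are independent: e.g. $P(A^c\cap B)=P(B)-P(A\cap B)=(1-P(A))P(B)=P(A^c)P(B)$, and the cases involving $\emptyset$ or $\Omega$ are immediate.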

More generally, some collections $\mathscr F_i$ of events, indexed by a finite or infinite set $I$, are mutually independent (or simply, independent) if, for every finite subset $\{i_1,i_2,\ldots,i_n\}$ of $I$ and every choice of events $A_k$ in $\mathscr F_{i_k}$, $k=1,\ldots,n$, the family $\{A_1,\ldots,A_n\}$ is independent.

For random variables: Two random variables $X$ and $Y$ (defined on the same probability space) are independent if their $\sigma$-algebras $\sigma(X)$ and $\sigma(Y)$ are (mutually) independent.
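Equivalently, real-valued $X$ and $Y$ are independent if and only if their joint distribution function factors: $$P(X\le x,\,Y\le y)=P(X\le x)\,P(Y\le y)\quad\text{for all }x,y\in\mathbb R.$$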

In particular, two events $A$ and $B$ are independent if and only if the indicator random variables $1_A$ and $1_B$ are independent.
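Indeed, $\sigma(1_A)=\{\emptyset,A,A^c,\Omega\}$, so independence of $1_A$ and $1_B$ amounts to independence of the $\sigma$-algebras generated by $A$ and $B$, which reduces to $P(A\cap B)=P(A)P(B)$ together with the analogous identities for the complements.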

More generally, a family $\mathscr X$ of random variables (defined on the same probability space) is independent if, for every finite sub-family $\{X_1,X_2,\ldots,X_n\}$ of $\mathscr X$, the $\sigma$-algebras $\sigma(X_1)$, $\sigma(X_2)$, $\ldots$, $\sigma(X_n)$ are (mutually) independent.

2560 questions
119 votes · 4 answers

Could someone explain conditional independence?

My understanding right now is that an example of conditional independence would be: If two people live in the same city, the probability that person A gets home in time for dinner, and the probability that person B gets home in time for dinner are…
Ryan
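(Formally, events $A$ and $B$ are conditionally independent given an event $C$ with $P(C)>0$ when $P(A\cap B\mid C)=P(A\mid C)\,P(B\mid C)$; conditional independence neither implies nor is implied by ordinary independence.)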
36 votes · 6 answers

Why does zero correlation not imply independence?

Although independence implies zero correlation, zero correlation does not necessarily imply independence. While I understand the concept, I can't imagine a real world situation with zero correlation that did not also have independence. Can someone…
user86403
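(A standard counterexample: let $X\sim N(0,1)$ and $Y=X^2$. Then $\operatorname{Cov}(X,Y)=E(X^3)-E(X)E(X^2)=0$, yet $Y$ is a function of $X$, so the two are far from independent.)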
27 votes · 3 answers

Existence of independent and identically distributed random variables.

I often see the sentence "let $X_1, X_2, \ldots$ be a sequence of i.i.d. random variables with a certain distribution". But given a random variable $X$ on a probability space $\Omega$, how do I know that there is a sequence of INDEPENDENT random…
Spook
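(One standard construction: take $\Omega=[0,1]$ with Lebesgue measure and let $U(\omega)=\omega$; the binary digits of $U$ are i.i.d. fair coin flips, and splitting them into infinitely many disjoint subsequences yields a sequence of independent uniform variables, each of which can then be transformed to any prescribed distribution via its quantile function. Kolmogorov's extension theorem gives an alternative product-space construction.)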
17 votes · 2 answers

Factoring $1+x+\dots +x^n$ into a product of polynomials with positive coefficients

Can the polynomial $1+x+x^2+\dots +x^n$ be factored, for some $n\ge 1$, into a product of two non-constant polynomials with positive coefficients? Thoughts: It is easy to factor it into polynomials with non-negative coefficients, e.g. $$ 1+x+x^2+x^3…
zhoraster
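(The connection with this tag: normalizing at $x=1$, a factorization $1+x+\cdots+x^n=P(x)Q(x)$ with positive coefficients would express a uniform random variable on $\{0,1,\ldots,n\}$ as the sum $X+Y$ of two independent integer-valued random variables, each taking every value in its range with positive probability.)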
17 votes · 6 answers

Example of Pairwise Independent but not Jointly Independent Random Variables?

I am asked to: Find a joint probability distribution $P(X_1,\dots, X_n)$ such that $X_i , \, X_j$ are independent for all $i \neq j$, but $(X_1, \dots , X_n)$ are not jointly independent. I have no idea where to start, please help.
user2262504
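(The classical example is due to Bernstein: let $X_1,X_2$ be independent fair coin flips with values in $\{0,1\}$ and set $X_3=X_1\oplus X_2$ (addition mod $2$). Each pair is independent — e.g. $P(X_1=a,\,X_3=b)=\frac14$ for all $a,b$ — but $X_3$ is determined by $(X_1,X_2)$, so the triple is not jointly independent.)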
16 votes · 2 answers

Independence of disjoint events

I'm taking a class in Probability Theory, and I was asked this question in class today: Given disjoint events $A$ and $B$ with $P(A)>0$ and $P(B)>0$, can $A$ and $B$ be independent? My answer was: $A$ and $B$ are disjoint, so $P(A\cap…
Jakob Weisblat
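(The short answer: no. Disjointness gives $P(A\cap B)=0$, while $P(A)P(B)>0$ by hypothesis, so the product rule fails; disjoint events of positive probability are maximally dependent — knowing that one occurred rules the other out.)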
16 votes · 1 answer

Uniform distribution on a simplex via i.i.d. random variables

For which $N \in \mathbb{N}$ is there a probability distribution such that $\frac{1}{\sum_i X_i} (X_1, \cdots, X_{N+1})$ is uniformly distributed over the $N$-simplex? (Here $X_1, \cdots, X_{N+1}$ are i.i.d. random variables with that distribution.)
charles.y.zheng
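(A classical answer for every $N$: take $X_1,\ldots,X_{N+1}$ i.i.d. exponential with rate $1$; then $\frac{1}{\sum_i X_i}(X_1,\ldots,X_{N+1})$ has the Dirichlet$(1,\ldots,1)$ distribution, i.e. it is uniform on the $N$-simplex.)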
15 votes · 5 answers

Two tails in a row - what's the probability that the game started with a head?

We're tossing a coin until two heads or two tails in a row occur. The game ended with a tail. What's the probability that it started with a head? Let's say we denote the game as a sequence of heads and tails, e.g. $(T_1, H_2, T_3, H_4, H_5)$ is a…
Angie
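(One way to set it up: a game ending in $TT$ with no earlier repeat is an alternating string followed by the final doubled tail. Starting with $H$ the possible games are $HTT, HTHTT,\ldots$, with total probability $\frac{1/8}{1-1/4}=\frac16$; starting with $T$ they are $TT, THTT,\ldots$, with total probability $\frac{1/4}{1-1/4}=\frac13$. Bayes' rule then gives $P(\text{started with }H\mid\text{ended }TT)=\frac{1/6}{1/6+1/3}=\frac13$.)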
14 votes · 0 answers

Distribution of the sum of absolute values of T-distributed random variables

Here $X$ is a r.v. following a symmetric $T$ distribution with mean $0$ and tail parameter $\alpha$. I am looking for the distribution of the sum of $n$ independent copies, $\sum_{1 \leq i \leq n}|X_i|$. $Y=|X|$ has PDF $\frac{2…
14 votes · 2 answers

A criterion for independence based on Characteristic function

Let $X$ and $Y$ be real-valued random variables defined on the same space. Let's use $\phi_X$ to denote the characteristic function of $X$. If $\phi_{X+Y}=\phi_X\phi_Y$ then must $X$ and $Y$ be independent?
Jeff
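(No: a standard counterexample takes $X$ standard Cauchy and $Y=X$. Then $\phi_X(t)=e^{-|t|}$, so $\phi_{X+Y}(t)=\phi_{2X}(t)=e^{-2|t|}=\phi_X(t)\phi_Y(t)$, yet $X$ and $Y$ are certainly not independent.)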
13 votes · 1 answer

Independence and conditional expectation

So, it's pretty clear that for independent $X,Y\in L_1(P)$ (with $E(X|Y)=E(X|\sigma(Y))$), we have $E(X|Y)=E(X)$. It is also quite easy to construct an example (for instance, $X=Y=1$) which shows that $E(X|Y)=E(X)$ does not imply independence of…
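(A less degenerate example: let $X$ be uniform on $\{-1,0,1\}$ and $Y=X^2$. Then $E(X\mid Y)=0=E(X)$ by symmetry, but $P(X=1,\,Y=0)=0\neq P(X=1)P(Y=0)$, so $X$ and $Y$ are not independent.)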
13 votes · 1 answer

Relation between Borel–Cantelli lemmas and Kolmogorov's zero-one law

I was wondering what is the relation between the first and second Borel–Cantelli lemmas and Kolmogorov's zero-one law? The former is about limsup of a sequence of events, while the latter is about tail event of a sequence of independent sub sigma…
Tim
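(One link: $\limsup_n A_n=\bigcap_m\bigcup_{n\ge m}A_n$ is a tail event of the sequence $(A_n)$, so when the $A_n$ are independent, Kolmogorov's zero-one law already forces $P(\limsup_n A_n)\in\{0,1\}$; the two Borel–Cantelli lemmas sharpen this by identifying which value occurs, according to whether $\sum_n P(A_n)$ converges or diverges.)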
13 votes · 4 answers

Independence intuition

Toss two fair dice. There are $36$ outcomes in the sample space $\Omega$, each with probability $\frac{1}{36}$. Let: $A$ be the event '$4$ on first die'. $B$ be the event 'sum of numbers is $7$'. $C$ be the event 'sum of numbers is $8$'. It says…
Bobby
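(Working it out: $P(A)=\frac16$, $P(B)=\frac{6}{36}=\frac16$ and $A\cap B=\{(4,3)\}$, so $P(A\cap B)=\frac1{36}=P(A)P(B)$ and $A,B$ are independent. By contrast, $P(C)=\frac{5}{36}$ and $P(A\cap C)=P(\{(4,4)\})=\frac1{36}\neq\frac16\cdot\frac{5}{36}$, so $A$ and $C$ are not: a sum of $8$ cannot occur when the first die shows $1$, so $C$ carries information about the first die.)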
12 votes · 3 answers

Independent, identically distributed (IID) random variables

I am having trouble understanding IID random variables. I've tried reading http://scipp.ucsc.edu/~haber/ph116C/iid.pdf, http://www.math.ntu.edu.tw/~hchen/teaching/StatInference/notes/lecture32.pdf, and…