Questions tagged [central-limit-theorem]

Use this tag for questions involving the term "central limit theorem", typically together with the tag (tag:probability-limit-theorems). The central limit theorem states that the sampling distribution of the mean of independent, identically distributed random variables with finite variance is normal or nearly normal if the sample size is large enough.

The central limit theorem (CLT) is one of the most important results in probability theory. It states the following.

Let $X_1, X_2, \ldots, X_n$ be independent and identically distributed (i.i.d.) random variables with expected value $E[X_i]=\mu<\infty$ and variance $0<\operatorname{Var}(X_i)=\sigma^2<\infty$. Then the random variable $$Z_n=\frac{\overline{X}-\mu}{\sigma / \sqrt{n}}=\frac{X_1+X_2+\cdots+X_n-n\mu}{\sqrt{n}\,\sigma}$$ converges in distribution to the standard normal random variable as $n$ goes to infinity; that is, $$\lim_{n \rightarrow \infty} P(Z_n \leq x)=\Phi(x) \qquad \text{for all } x \in \mathbb{R},$$ where $\Phi(x)$ is the standard normal CDF.
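The convergence in distribution can be checked numerically. The sketch below (my own illustration, not part of the tag wiki; it uses only the Python standard library) draws many realizations of $Z_n$ for Exponential(1) summands, where $\mu=\sigma=1$, and compares the empirical CDF against $\Phi$.

```python
import math
import random

def phi(x):
    """Standard normal CDF, expressed via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def z_n_sample(n, trials=20000, seed=0):
    """Draw `trials` realizations of Z_n = (S_n - n*mu) / (sqrt(n)*sigma)
    for i.i.d. Exponential(1) summands, where mu = sigma = 1."""
    rng = random.Random(seed)
    mu, sigma = 1.0, 1.0
    samples = []
    for _ in range(trials):
        s = sum(rng.expovariate(1.0) for _ in range(n))
        samples.append((s - n * mu) / (math.sqrt(n) * sigma))
    return samples

# The empirical P(Z_n <= x) should be close to Phi(x) for large n.
zs = z_n_sample(n=100)
for x in (-1.0, 0.0, 1.0):
    emp = sum(z <= x for z in zs) / len(zs)
    print(f"x = {x:+.1f}   empirical = {emp:.3f}   Phi(x) = {phi(x):.3f}")
```

Even at $n=100$ the agreement is already good, despite the exponential distribution being quite skewed; the residual gap near $x=0$ is the familiar $O(1/\sqrt{n})$ skewness correction.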

How to apply the central limit theorem:

  • Write the random variable of interest, $Y$, as the sum of $n$ i.i.d. random variables $X_i$: $$Y=X_1+X_2+\cdots+X_n$$
  • Find $E[Y]$ and $\operatorname{Var}(Y)$ by noting that $$E[Y]=n\mu,\qquad \operatorname{Var}(Y)=n\sigma^2,$$ where $\mu=E[X_i]$ and $\sigma^2=\operatorname{Var}(X_i)$.
  • According to the CLT, conclude that $$\frac{Y-E[Y]}{\sqrt{\operatorname{Var}(Y)}}=\frac{Y-n\mu}{\sqrt{n}\,\sigma}$$ is approximately standard normal; thus, to find $P(y_1\leq Y\leq y_2)$, we can write $$P(y_1\leq Y\leq y_2)=P\left(\frac{y_1-n\mu}{\sqrt{n}\,\sigma}\leq\frac{Y-n\mu}{\sqrt{n}\,\sigma}\leq\frac{y_2-n\mu}{\sqrt{n}\,\sigma}\right)$$ $$\approx\Phi\left(\frac{y_2-n\mu}{\sqrt{n}\,\sigma}\right)-\Phi\left(\frac{y_1-n\mu}{\sqrt{n}\,\sigma}\right).$$
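As a concrete instance of the three steps above, here is a minimal sketch (the dice example and all function names are my own, not from the tag wiki): take $Y$ to be the total shown by $n=100$ fair six-sided dice, so $\mu=3.5$ and $\sigma^2=35/12$ per die, and compare the CLT estimate of $P(340\leq Y\leq 360)$ with a Monte Carlo estimate.

```python
import math
import random

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def clt_interval_prob(y1, y2, n, mu, sigma):
    """CLT approximation of P(y1 <= Y <= y2) for Y = X_1 + ... + X_n,
    where the X_i are i.i.d. with mean mu and standard deviation sigma."""
    s = math.sqrt(n) * sigma
    return phi((y2 - n * mu) / s) - phi((y1 - n * mu) / s)

# Step 1: Y = X_1 + ... + X_100, one fair die per term.
# Step 2: E[Y] = n*mu = 350, Var(Y) = n*sigma^2 = 100 * 35/12.
n, mu, sigma = 100, 3.5, math.sqrt(35.0 / 12.0)

# Step 3: standardize and read off the normal CDF.
approx = clt_interval_prob(340, 360, n, mu, sigma)

# Sanity check against simulation.
rng = random.Random(1)
trials = 20000
hits = sum(340 <= sum(rng.randint(1, 6) for _ in range(n)) <= 360
           for _ in range(trials))
print(f"CLT approximation: {approx:.3f}   simulation: {hits / trials:.3f}")
```

The two numbers differ by roughly $0.02$ here; since the dice total is integer-valued, a continuity correction (using $339.5$ and $360.5$ as the endpoints) closes most of that gap.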

References:

https://en.wikipedia.org/wiki/Central_limit_theorem

http://mathworld.wolfram.com/CentralLimitTheorem.html

https://www.analyticsvidhya.com/blog/2019/05/statistics-101-introduction-central-limit-theorem/

1318 questions
19 votes, 2 answers

Rate of convergence in the central limit theorem (Lindeberg–Lévy)

There are similar posts to this one on stackexchange but none of those seem to actually answer my questions. So consider the CLT in the most common form. Let $(X_n)_{n \in \mathbb{N}}$ be a sequence of i.i.d. random variables with $X_1 \in L^2(P)$…
14 votes, 0 answers

Interpreting Lindeberg's condition

I know Lindeberg's CLT but I don't have a good grasp of the intuition behind Lindeberg's condition. Could you please give some intuition behind said condition via an example (or, perhaps, via 2 related examples, one satisfying the condition…
13 votes, 2 answers

Limiting distribution of $\frac1n \sum_{k=1}^{n}|S_{k-1}|(X_k^2 - 1)$ where $X_k$ are i.i.d standard normal

Let $(X_n)$ be a sequence of i.i.d $\mathcal N(0,1)$ random variables. Define $S_0=0$ and $S_n=\sum_{k=1}^n X_k$ for $n\geq 1$. Find the limiting distribution of $$\frac1n \sum_{k=1}^{n}|S_{k-1}|(X_k^2 - 1)$$ This problem is from Shiryaev's…
12 votes, 2 answers

Convergence in distribution of conditional expectations

I was just reading this question, which is about how the classical central limit theorem can be interpreted as giving a rate of convergence for the law of large numbers for iid random variables. I was wondering whether the same idea can be…
10 votes, 3 answers

Counterexamples concerning the central limit theorem

The central limit theorem states that, if $X$ is a random variable with finite variance $\sigma^2$ and expected value $\mu$, and if $(X_n)$ is a sequence of independent random variables identically distributed like $ X $, then \begin{equation} Z_n =…
10 votes, 1 answer

Intuition for $N(\mu, \sigma^2)$ in terms of its infinite expansion

To gain deeper insight to the Poisson and exponential random variables, I found that I could derive the random variables as follows: I consider an experiment which consists of a continuum of trials on an interval $[0,t)$. The result of the…
10 votes, 0 answers

How do the Kolmogorov 0-1 law and the CLT imply that the normalized sample mean converges neither in probability nor a.s.?

From Wikipedia, the central limit theorem states that the sums $S_n$ scaled by the factor $1/\sqrt{n}$ converge in distribution to a standard normal distribution. Combined with Kolmogorov's zero-one law, this implies that these quantities…
Tim
10 votes, 1 answer

On the central limit theorem

The Central Limit Theorem states for a sequence of i.i.d. random variables $\{X_i\}$, $$\frac{\overline{X} - \mu}{\sigma/\sqrt{n}} \to N(0,1)$$ in distribution as $n \to \infty$. I saw in some lecture notes that this implies $$\overline{X} \to…
user217285
9 votes, 2 answers

Limiting distribution of binary variable (Central limit theorem fails)

Suppose we have a random variable $$Y_i = i \text{ with probability } \frac{1}{i}$$ and $0$ otherwise. Here all the $Y_i$ are independent. We can redefine $X_i = Y_i -1 $ so that $E(X_i)=0$. Then the variance of $X_i$ is $(i-1)^2\cdot 1/i +…
9 votes, 2 answers

How can I prove these two questions without using the following theorem?

Question 1: Let $X_1, X_2, \cdots$ be independent random variables such that $$P(X_n=-n^{\theta})=P(X_n=n^{\theta})=\frac{1}{2}.$$ If $\theta > -\frac{1}{2}$ prove that the Lyapunov condition works and the sequence satisfies the central limit…
9 votes, 3 answers

Applying Central Limit Theorem to show that $E\left(\frac{|S_n|}{\sqrt{n}}\right) \to \sqrt{\frac{2}{\pi}}\sigma$

In the book Probability Essentials, by Jacod and Protter, the following question has bugged me for a long while and I'm wondering if it is bugged. The question is an application of Central Limit Theorem: Let $(X_j)_{j\geq1}$ be iid with $E[X_j] =0$…
9 votes, 1 answer

Checking the Lindeberg condition (central limit theorem)

Problem. Let $W_1, W_2,...$ be independent and identically distributed random variables such that $E(W_1)=0$ and $\sigma^2 := V(W_1) \in (0,\infty)$. Let $T_n = \frac{1}{\sqrt{n}} \sum_{j=1}^n a_j W_j$ where $a_j\neq 0$ for all $j\in \Bbb{N}$. If…
8 votes, 2 answers

Why aren't the strong LLNs and CLT contradicting each other?

Given $n$ i.i.d. random variables $\{X_1, X_2, \dots , X_n\}$, each with mean $M$ and variance $V$, both strong and weak LLNs seem to say that the average of the $n$ random variables, $S_n = \frac{X_1 + X_2 + \dots + X_n }{n}$, approaches $M$, as $n…
8 votes, 0 answers

Normal distribution and inflection points intuition

My professor made a point that I had not seen before. He said that to figure out (graphically) where the first standard deviation ($\sigma$) in $N(0,\sigma^2)$ lies, you can look at the point of inflection. This makes sense mathematically but I am confused…
8 votes, 0 answers

Show that $\frac1n\log X_n$ converges almost surely

Let $X_0$ follow $\mathrm{Uniform}(0,1)$. Define $X_{n+1}$ iteratively as $X_{n+1}$ follows $\mathrm{Uniform}(0,X_n)$, $n\geq0$. Show that $\dfrac{\log X_n}{n}$ converges almost surely and find the limit. I believe the limit is $0$. I tried to…