Questions tagged [probability-theory]

For questions solely about the modern theoretical footing for probability, for example, probability spaces, random variables, law of large numbers, and central limit theorems. Use [tag:probability] instead for specific problems and explicit computations. Use [tag:probability-distributions] for specific distribution functions, and consider [tag:stochastic-processes] when appropriate.

The modern theory of probability is formulated using measure theory. Use this tag if your question either involves the theoretical foundations of probability or you are seeking responses at the level of rigor used in modern probability theory. Examples of subtopics include probability spaces, random variables, convergence of random variables, laws of large numbers, central limit theorems, and other limit theorems, as well as applications of famous theorems including the strong and weak laws of large numbers, the central limit theorem, the law of the iterated logarithm, et cetera.

Use [tag:probability] for explicit computation of probabilities or expectation values, and use [tag:probability-distributions] for specific distribution functions.

39387 questions
304
votes
8 answers

Intuition for the definition of the Gamma function?

In these notes by Terence Tao is a proof of Stirling's formula. I really like most of it, but at a crucial step he uses the integral identity $$n! = \int_{0}^{\infty} t^n e^{-t}\,dt,$$ coming from the Gamma function. I have a mathematical…
Qiaochu Yuan
  • 359,788
  • 42
  • 777
  • 1,145
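
As a side note for readers skimming this excerpt: the quoted identity is a standard fact about the Gamma function and follows by induction on $n$ using integration by parts (this sketch is added here for orientation and is not part of the original post):
$$\int_{0}^{\infty} t^{n} e^{-t}\,dt = \Big[-t^{n}e^{-t}\Big]_{0}^{\infty} + n\int_{0}^{\infty} t^{n-1}e^{-t}\,dt = n\int_{0}^{\infty} t^{n-1}e^{-t}\,dt,$$
and since $\int_{0}^{\infty} e^{-t}\,dt = 1$, iterating the reduction gives $\int_{0}^{\infty} t^{n}e^{-t}\,dt = n!$.
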
154
votes
11 answers

What is the best book to learn probability?

The question is quite straightforward... I'm not very good at this subject but I need to understand it at a good level.
153
votes
6 answers

Is the product of two Gaussian random variables also a Gaussian?

Say I have $X \sim \mathcal N(a, b)$ and $Y\sim \mathcal N(c, d)$. Is $XY$ also normally distributed? Is the answer any different if we know that $X$ and $Y$ are independent?
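
A minimal Monte Carlo sketch (illustrative only, and only for the independent standard-normal case; the variable names and sample size are arbitrary choices): if $XY$ were Gaussian its kurtosis would be $3$, whereas the simulation below gives roughly $9$, so the product cannot be normal. A simulation suggests rather than proves this, of course.

```python
# Sketch: kurtosis of the product of two independent standard normals.
# A Gaussian has kurtosis 3; the product comes out near 9.
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)
y = rng.standard_normal(1_000_000)
z = x * y  # the product under study

def kurtosis(s):
    s = s - s.mean()
    return np.mean(s**4) / np.mean(s**2) ** 2

print(kurtosis(x))  # ~3.0, consistent with a Gaussian
print(kurtosis(z))  # ~9.0, so XY is not Gaussian in this case
```
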
123
votes
6 answers

Intuition behind Conditional Expectation

I'm struggling with the concept of conditional expectation. First of all, if you have a link to any explanation that goes beyond showing that it is a generalization of elementary intuitive concepts, please let me know. Let me get more specific. Let…
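
For orientation (a worked example added here, not taken from the post): let $X$ be a fair die roll and $\mathcal G = \sigma(\{X \text{ is even}\})$. Then
$$\mathbb E[X \mid \mathcal G] = 4\cdot\mathbf 1_{\{X \in \{2,4,6\}\}} + 3\cdot\mathbf 1_{\{X \in \{1,3,5\}\}},$$
a $\mathcal G$-measurable random variable whose integral over each event in $\mathcal G$ agrees with that of $X$; this averaging-over-the-available-information picture is exactly what the abstract definition generalizes.
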
108
votes
8 answers

Lebesgue integral basics

I'm having trouble finding a good explanation of the Lebesgue integral. As per the definition, it is the expectation of a random variable. Then how does it model the area under the curve? Let's take for example a function $f(x) = x^2$. How do we…
user957
  • 3,079
  • 7
  • 30
  • 33
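
To make the excerpt's example concrete (a sketch added for orientation, using Lebesgue measure $\lambda$ on $[0,1]$ rather than a probability space): the simple functions
$$s_n = \sum_{k=0}^{2^n-1} \Big(\tfrac{k}{2^n}\Big)^{2}\,\mathbf 1_{[k/2^n,\,(k+1)/2^n)}$$
increase pointwise to $f(x)=x^2$, and
$$\int s_n\,d\lambda = \sum_{k=0}^{2^n-1}\Big(\tfrac{k}{2^n}\Big)^{2}\frac{1}{2^n} \xrightarrow[n\to\infty]{} \frac13,$$
so by monotone convergence $\int_{[0,1]} x^2\,d\lambda = \tfrac13$, the familiar area under the curve.
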
93
votes
4 answers

Probability density function vs. probability mass function

I have a confession to make. I've been using PDFs and PMFs without actually knowing what they are. My understanding is that density equals area under the curve, but if I look at it that way, then it doesn't make sense to refer to the "mass" of a…
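
A concrete contrast (added for orientation, not from the post): for a fair die the mass function is itself a probability, $p_X(k)=\mathbb P(X=k)=\tfrac16$ for $k=1,\dots,6$, whereas for $U\sim\mathrm{Unif}(0,1)$ the density is $f_U(u)=1$ and probabilities only arise by integration,
$$\mathbb P(a\le U\le b)=\int_a^b f_U(u)\,du=b-a,\qquad \mathbb P(U=u)=0 \text{ for every single } u,$$
so a density value is not itself a probability and can even exceed $1$.
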
83
votes
16 answers

Reference book on measure theory

I post this question with some personal specifications, and I hope it does not overlap with previously posted questions. Recently I have strongly felt that I need to review measure theory before starting my thesis. I am not totally new…
82
votes
8 answers

Zero probability and impossibility

I read a comment under this question: "There are plenty of events that can occur that have zero probability." This reminds me that I have seen similar claims elsewhere and have never been able to make sense of them. So I was wondering if…
Tim
  • 43,663
  • 43
  • 199
  • 459
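
A standard illustration of the distinction (not taken from the linked comment): let $U$ be uniform on $[0,1]$. Then
$$\mathbb P\big(U=\tfrac12\big)=0,\qquad \mathbb P\big(U\in\mathbb Q\cap[0,1]\big)=\sum_{q\in\mathbb Q\cap[0,1]}\mathbb P(U=q)=0,$$
yet $U$ certainly takes some value in $[0,1]$, so "probability zero" is strictly weaker than "impossible"; only the empty event is impossible in the set-theoretic sense.
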
80
votes
12 answers

The Monty Hall problem

I was watching the movie $21$ yesterday, and in the first 15 minutes or so the main character is in a classroom, being asked a "trick" question (in the sense that the teacher believes that he'll get the wrong answer) which revolves around…
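
For readers who want to check the classic answer numerically, here is a minimal Monte Carlo sketch of the game (an illustration added here, not part of the post; the function name is arbitrary). Staying wins about $1/3$ of the time and switching about $2/3$.

```python
# Monty Hall simulation: compare the win rate of staying vs. switching.
import random

def play(switch: bool) -> bool:
    doors = [0, 1, 2]
    car = random.choice(doors)
    pick = random.choice(doors)
    # The host opens a door that hides a goat and is not the player's pick.
    opened = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

n = 100_000
print(sum(play(False) for _ in range(n)) / n)  # ~0.333 (stay)
print(sum(play(True) for _ in range(n)) / n)   # ~0.667 (switch)
```
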
72
votes
3 answers

What is the importance of the infinitesimal generator of Brownian motion?

I have read that the infinitesimal generator of Brownian motion is $\frac{1}{2}\small\triangle$. Unfortunately, I have no background in semigroup theory, and the expositions of semigroup theory I have found lack any motivation or intuition. What is…
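
A one-dimensional heuristic for why the generator is $\tfrac12\Delta$ (a sketch under smoothness and integrability assumptions, added for orientation, not a semigroup-theoretic argument): for $f\in C_b^2(\mathbb R)$, Taylor expansion together with $\mathbb E^x[B_t-x]=0$ and $\mathbb E^x[(B_t-x)^2]=t$ gives
$$\mathbb E^x[f(B_t)] = f(x) + f'(x)\,\mathbb E^x[B_t-x] + \tfrac12 f''(x)\,\mathbb E^x[(B_t-x)^2] + o(t) = f(x) + \tfrac{t}{2} f''(x) + o(t),$$
so $(\mathcal A f)(x) = \lim_{t\downarrow 0}\frac{\mathbb E^x[f(B_t)]-f(x)}{t} = \tfrac12 f''(x)$, i.e. $\tfrac12\Delta f$ in higher dimensions.
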
72
votes
3 answers

Intuitive explanation of a definition of the Fisher information

I'm studying statistics. When I read the textbook about Fisher Information, I couldn't understand why the Fisher Information is defined like this: $$I(\theta)=E_\theta\left[-\frac{\partial^2 }{\partial \theta^2}\ln P(\theta;X)\right].$$ Could anyone…
maple
  • 2,453
  • 2
  • 23
  • 33
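
For context (a standard derivation sketch, not part of the post): under regularity conditions that allow differentiating $\int P(\theta;x)\,dx = 1$ twice under the integral sign,
$$0=\int \partial_\theta^2 P(\theta;x)\,dx=\int\Big(\partial_\theta^2\ln P(\theta;x)+\big(\partial_\theta\ln P(\theta;x)\big)^2\Big)P(\theta;x)\,dx,$$
hence $I(\theta)=\mathbb E_\theta\big[-\partial_\theta^2\ln P(\theta;X)\big]=\mathbb E_\theta\big[(\partial_\theta\ln P(\theta;X))^2\big]=\operatorname{Var}_\theta\big(\partial_\theta\ln P(\theta;X)\big)$, the variance of the score, since the score has mean zero.
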
71
votes
9 answers

Proving that $1$- and $2D$ simple symmetric random walks return to the origin with probability $1$

How does one prove that a simple (steps of length $1$ in directions parallel to the axes) symmetric (each possible direction is equally likely) random walk in $1$ or $2$ dimensions returns to the origin with probability $1$? Edit: note that while…
Isaac
  • 35,106
  • 14
  • 99
  • 136
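
One standard route (a sketch, not necessarily the argument used in the answers): the walk returns to the origin with probability $1$ if and only if $\sum_n \mathbb P(S_{2n}=0)=\infty$. By Stirling's formula,
$$\text{in dimension }1:\ \mathbb P(S_{2n}=0)=\binom{2n}{n}2^{-2n}\sim\frac{1}{\sqrt{\pi n}},\qquad \text{in dimension }2:\ \mathbb P(S_{2n}=0)=\Big(\binom{2n}{n}2^{-2n}\Big)^{2}\sim\frac{1}{\pi n}$$
(the two-dimensional identity comes from rotating coordinates so the walk splits into two independent one-dimensional walks), and both series diverge; in dimension $3$ the terms are $O(n^{-3/2})$, the series converges, and the walk is transient.
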
69
votes
6 answers

Chance of meeting in a bar

Two people have to spend exactly 15 consecutive minutes in a bar on a given day, between 12:00 and 13:00. Assuming uniform arrival times, what is the probability they will meet? I am mainly interested to see how people would model this formally. I…
Beltrame
  • 2,976
  • 1
  • 25
  • 33
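
A minimal Monte Carlo sketch under one common reading of the problem (an assumption about the model, not the post's own formalization): each arrival time is uniform on 12:00–12:45, so that the full 15-minute stay fits before 13:00, and the two people meet exactly when their stays overlap, i.e. when $|X-Y|<15$. The geometric computation under this model gives $1-\big(\tfrac{30}{45}\big)^2=\tfrac59$.

```python
# Monte Carlo check of the meeting probability under the stated assumptions.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.uniform(0, 45, n)  # first arrival, minutes after 12:00
y = rng.uniform(0, 45, n)  # second arrival, minutes after 12:00
print(np.mean(np.abs(x - y) < 15))  # ~0.556, i.e. 5/9 under this model
```
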
66
votes
2 answers

Cardinality of Borel sigma algebra

It seems to be well known that if a sigma-algebra is generated by countably many sets, then its cardinality is either finite or $c$ (the cardinality of the continuum). But it seems hard to prove, and actually hard to find a proof of. Can…
66
votes
1 answer

Distinguishing probability measure, function and distribution

I have a bit of trouble distinguishing the following concepts: probability measure; probability function (with special cases probability mass function and probability density function); probability distribution. Are some of these interchangeable? Which…
Marc
  • 3,057
  • 2
  • 17
  • 27