Questions tagged [random-variables]

Questions about measurable maps from a probability space to a measurable space.

A random variable $X: \Omega \to E$ is a measurable function from a set of possible outcomes $\Omega$ to a measurable space $E$. The technical axiomatic definition requires $\Omega$ to be the sample space of a probability triple $(\Omega, \mathcal{F}, P)$. Usually $X$ is real-valued.

The probability that $X$ takes on a value in a measurable set $S \subseteq E$ is written as:

$$P(X \in S) = P(\{ \omega \in \Omega \mid X(\omega) \in S\})$$

where $P$ is the probability measure with which $\Omega$ is equipped.
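As a concrete illustration of the definition above, here is a minimal sketch using a fair die as the probability triple (the names `omega`, `P`, `X`, and `S` are just illustrative):

```python
from fractions import Fraction

# A toy finite probability space: a fair six-sided die.
# omega is the outcome set; P assigns probability 1/6 to each outcome.
omega = [1, 2, 3, 4, 5, 6]
P = {w: Fraction(1, 6) for w in omega}

# A real-valued random variable X: Omega -> R (here, the identity map).
X = lambda w: w

# P(X in S) = P({w in Omega : X(w) in S}) for the event S = "even roll".
S = {2, 4, 6}
prob = sum(P[w] for w in omega if X(w) in S)
print(prob)  # 1/2
```

The point is only that "$X \in S$" is shorthand for an event in $\Omega$, whose probability is computed by the measure $P$ on $\Omega$, not on $E$.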

10873 questions
30
votes
2 answers

Expectation of the min of two independent random variables?

How do you compute the expectation of the minimum of two independent random variables in the general case? In the particular case of two uniform variables with different supports, how should one proceed? EDIT: specified that they were independent and…
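Not drawn from the posted answers, just a sketch of the standard approach: for independent nonnegative $X, Y$, $P(\min(X,Y) > t) = P(X > t)\,P(Y > t)$, and integrating the tail gives the expectation. For iid $\mathrm{Uniform}(0,1)$ this is $\int_0^1 (1-t)^2\,dt = 1/3$, which a Monte Carlo check confirms:

```python
import random

# For independent nonnegative X, Y: P(min(X,Y) > t) = P(X > t) * P(Y > t),
# so E[min(X,Y)] = integral of P(X > t) P(Y > t) dt over t >= 0.
# For X, Y ~ Uniform(0,1) iid this evaluates to 1/3.
random.seed(0)
n = 200_000
est = sum(min(random.random(), random.random()) for _ in range(n)) / n
print(round(est, 2))  # close to 1/3
```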
30
votes
8 answers

Is there a *simple* example showing that uncorrelated random variables need not be independent?

Is there a simple example showing that given $X,Y$ uncorrelated (covariance is zero), $X,Y$ are not independent? I have looked up two references, however, I am dissatisfied with both. In Reference $1$, $X,Y$ are assumed to be independent…
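One classic counterexample (a sketch, not taken from the posted answers): take $X$ uniform on $\{-1, 0, 1\}$ and $Y = X^2$. The covariance is exactly zero, yet $Y$ is a deterministic function of $X$:

```python
from fractions import Fraction

# X uniform on {-1, 0, 1}, Y = X^2.
# Cov(X, Y) = E[X^3] - E[X] E[X^2] = 0 - 0 = 0, yet Y is a function of X.
support = [-1, 0, 1]
p = Fraction(1, 3)

EX = sum(p * x for x in support)           # E[X]   = 0
EY = sum(p * x**2 for x in support)        # E[X^2] = 2/3
EXY = sum(p * x * x**2 for x in support)   # E[X^3] = 0
cov = EXY - EX * EY
print(cov)  # 0

# Not independent: P(X = 0, Y = 0) = 1/3, but P(X = 0) * P(Y = 0) = 1/9.
p_joint = p        # only the outcome x = 0 gives X = 0 and Y = 0
p_prod = p * p
print(p_joint == p_prod)  # False
```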
30
votes
4 answers

Expected absolute difference between two iid variables

Suppose $X$ and $Y$ are iid random variables taking values in $[0,1]$, and let $\alpha > 0$. What is the maximum possible value of $\mathbb{E}|X-Y|^\alpha$? I have already asked this question for $\alpha = 1$ here: one can show that…
J Richey
  • 1,049
  • 9
  • 22
29
votes
5 answers

How to generate points uniformly distributed on the surface of an ellipsoid?

I am trying to find a way to generate random points uniformly distributed on the surface of an ellipsoid. If it was a sphere there is a neat way of doing it: Generate three $N(0,1)$ variables $\{x_1,x_2,x_3\}$, calculate the distance from the…
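The sphere recipe the excerpt alludes to can be sketched as follows. Note that naively rescaling the result onto an ellipsoid does *not* preserve uniformity of surface area, which is precisely what makes the question nontrivial:

```python
import math
import random

# Uniform point on the unit sphere S^2: draw three independent N(0,1)
# coordinates and rescale to unit length. Rotational symmetry of the
# standard Gaussian makes the resulting direction uniform.
def random_sphere_point(rng=random):
    x = [rng.gauss(0.0, 1.0) for _ in range(3)]
    r = math.sqrt(sum(c * c for c in x))
    return [c / r for c in x]

random.seed(1)
p = random_sphere_point()
print(sum(c * c for c in p))  # ~1.0, i.e. the point lies on the sphere
```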
28
votes
7 answers

Expected Value of Flips Until HT Consecutively

Suppose you flip a fair coin repeatedly until you see a Heads followed by a Tails. What is the expected number of coin flips you have to flip? By manipulating an equation based on the result of the first flip, shown at this…
Dhruv
  • 283
  • 1
  • 4
  • 5
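For context, the well-known answer for the pattern HT is $4$ (each of "first H" and "then first T" takes a geometric number of flips with mean $2$). A simulation sketch, not taken from the posted answers:

```python
import random

# Flip a fair coin until a Heads is immediately followed by a Tails;
# return the number of flips used.
def flips_until_HT(rng):
    prev, count = None, 0
    while True:
        cur = rng.choice("HT")
        count += 1
        if prev == "H" and cur == "T":
            return count
        prev = cur

rng = random.Random(2)
n = 100_000
avg = sum(flips_until_HT(rng) for _ in range(n)) / n
print(round(avg, 1))  # close to 4
```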
27
votes
4 answers

How to derive the mean and variance of a Gaussian random variable?

How do we go about deriving the values of the mean and variance of a Gaussian random variable $X$ given its probability density function?
Raaj
  • 671
  • 3
  • 8
  • 15
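A sketch of the standard derivation, for reference: with the substitution $u = x - \mu$, the integrand contributing to the mean beyond $\mu$ is odd and vanishes,

$$\mathbb{E}[X] = \int_{-\infty}^{\infty} x \, \frac{1}{\sigma\sqrt{2\pi}} e^{-(x-\mu)^2/(2\sigma^2)} \, dx = \mu + \int_{-\infty}^{\infty} \frac{u}{\sigma\sqrt{2\pi}} e^{-u^2/(2\sigma^2)} \, du = \mu,$$

and integration by parts on $\int u^2 e^{-u^2/(2\sigma^2)}\,du$ then yields $\operatorname{Var}(X) = \sigma^2$.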
27
votes
2 answers

Conditional expectation on more than one sigma-algebra

I'm facing the following issue. Let $X$ be an integrable random variable on the probability space $(\Omega,\mathcal{F},\mathbb{P})$ and $\mathcal{G},\mathcal{H} \subseteq \mathcal{F}$ be two sigma-algebras. We assume that $X$ is independent of…
27
votes
3 answers

Infinite expected value of a random variable

How can a positive random variable $X$ which never takes on the value $+\infty$, have expected value $\mathbb{E}[X] = +\infty$?
lodhb
  • 641
  • 2
  • 6
  • 10
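One standard way this happens, sketched concretely: take $P(X = 2^k) = 2^{-k}$ for $k = 1, 2, \dots$ (these probabilities sum to $1$). Then $X$ is finite on every outcome, yet

$$\mathbb{E}[X] = \sum_{k=1}^{\infty} 2^k \cdot 2^{-k} = \sum_{k=1}^{\infty} 1 = +\infty.$$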
25
votes
6 answers

Correlation between three variables question

I was asked this question regarding correlation recently, and although it seems intuitive, I still haven't worked out the answer satisfactorily. I hope you can help me out with this seemingly simple question. Suppose I have three random variables…
tanvach
  • 375
  • 1
  • 3
  • 8
24
votes
3 answers

Expected time to convergence

Consider the following process: we place $n$ points labelled $1, \dots, n$ uniformly at random on the interval $[0,1]$. At each time step, two points $i, j$ are selected uniformly at random and $i$ updates its position to be a point chosen uniformly at…
michael
  • 970
  • 8
  • 18
24
votes
1 answer

Events in the tail $\sigma$-algebra

I am having a little trouble understanding what exactly the tail $\sigma$-algebra is. Just so we are all on the same page, my book defined the tail $\sigma$-algebra like this: Let $X_n$ be a sequence of random variables defined on $(\Omega, …
Ant
  • 20,466
  • 5
  • 41
  • 97
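For reference, the usual definition: the tail $\sigma$-algebra of a sequence $(X_n)$ is

$$\mathcal{T} = \bigcap_{n=1}^{\infty} \sigma(X_n, X_{n+1}, X_{n+2}, \dots),$$

the collection of events unaffected by changing any finite number of the $X_n$. For example, $\{\lim_n X_n \text{ exists}\}$ is a tail event, while $\{X_1 > 0\}$ is not.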
22
votes
3 answers

Proof of analogue of the Cauchy-Schwarz inequality for random variables

The Cauchy-Schwarz inequality tells us that for two vectors $u$ and $v$ in an inner product space, $$\lvert (u,v)\rvert \leq \lVert u\rVert \lVert v \rVert$$ with equality holding iff one vector is a scalar multiple of the other. Prove the…
RADIOACTIVE
  • 223
  • 1
  • 3
  • 7
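A sketch of the standard proof of the probabilistic version $(\mathbb{E}[XY])^2 \le \mathbb{E}[X^2]\,\mathbb{E}[Y^2]$, assuming both second moments are finite: for every real $t$,

$$0 \le \mathbb{E}[(X - tY)^2] = \mathbb{E}[X^2] - 2t\,\mathbb{E}[XY] + t^2\,\mathbb{E}[Y^2],$$

so this quadratic in $t$ has nonpositive discriminant, i.e. $4(\mathbb{E}[XY])^2 - 4\,\mathbb{E}[X^2]\,\mathbb{E}[Y^2] \le 0$.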
21
votes
2 answers

Independence of a random variable $X$ from itself

In our lecture on probability, my professor made the comment that "a random variable X is not independent from itself." (Here he was specifically talking about discrete random variables.) I asked him why that was true. (My intuition for two…
Eric Auld
  • 26,353
  • 9
  • 65
  • 174
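The one-line reason, for reference: if $X$ were independent of itself, then $P(X \in A) = P(X \in A)^2$ for every measurable $A$, forcing each such probability to be $0$ or $1$; so $X$ is independent of itself exactly when it is almost surely constant.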
21
votes
2 answers

Probability distribution of a sum of uniform random variables

Given a random variable $$X = \sum_{i=1}^n x_i,$$ where the $x_i$ are independent uniform random variables on $(a_i,b_i)$, how does one find the probability distribution of $X$?
Matt Munson
  • 1,337
  • 2
  • 10
  • 22
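A sketch of the idea in the simplest case (not taken from the posted answers): the density of an independent sum is the convolution of the summands' densities. For two $\mathrm{Uniform}(0,1)$ variables this convolution is the triangular density $f(t) = t$ on $[0,1]$ and $2-t$ on $[1,2]$, so $P(X \le 1) = 1/2$, which a quick simulation confirms:

```python
import random

# Monte Carlo check of P(x1 + x2 <= 1) = 1/2 for two independent
# Uniform(0,1) variables (the triangular / n=2 Irwin-Hall case).
random.seed(3)
n = 200_000
hits = sum(random.random() + random.random() <= 1.0 for _ in range(n))
print(round(hits / n, 2))  # near 0.5
```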
21
votes
3 answers

Collection of standard facts about convergence of random variables

The goal of this question is to collect standard general facts about convergence of random variables (in $\mathbb L^p$, in probability, in distribution) in order to use them when answering questions. I am aware of the existence of this meta-thread…