Questions tagged [expected-value]

Questions about the expected value of a random variable.

The average value of a randomly chosen quantity is its expectation or expected value. For example, the expected value of the number you get when you roll a fair 6-sided die is 3.5.
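As a quick check of the die example, the definition $E[X] = \sum_x x\,P(X = x)$ can be evaluated directly (a small illustrative sketch):

```python
from fractions import Fraction

# E[X] = sum over faces of x * P(X = x) for a fair six-sided die
expectation = sum(Fraction(x, 6) for x in range(1, 7))
print(float(expectation))  # 3.5
```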

In general, if $X$ is a random variable defined on a probability space $(\Omega, \Sigma, P)$, then the expected value of $X$, denoted by $E[X]$, $\langle X \rangle$, or $\bar{X}$, is defined as the Lebesgue integral

$$E[X]= \int_{\Omega} X(\omega) dP(\omega)$$

The expected value is often the first and most important thing you want to know about a random variable. For example, in a betting game, the best strategy is often the one that maximizes the expected value of the amount you win.

This tag is for questions about:

  • Computing the expected value in a specific situation.
  • Understanding the properties of expected values, such as Markov's inequality or linearity of expectation.
  • Proving theorems about the expected value of abstract random variables.
  • Understanding what the expected value means and what it tells you about a random variable.
5346 questions
94 votes · 13 answers

Expected Number of Coin Tosses to Get Five Consecutive Heads

A fair coin is tossed repeatedly until 5 consecutive heads occur. What is the expected number of coin tosses?
leava_sinus
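One standard route to the answer (sketched here, not taken from the original post) is the first-step recurrence $E_i = 1 + \frac12 E_{i+1} + \frac12 E_0$ with $E_5 = 0$, where $E_i$ is the expected number of further tosses after $i$ consecutive heads. Solving it by value iteration:

```python
# Expected tosses until 5 consecutive heads, via value iteration on
#   E_i = 1 + 0.5 * E_{i+1} + 0.5 * E_0,   E_5 = 0,
# where E_i is the expected number of further tosses after i straight heads.
def expected_tosses(n=5, iters=10_000):
    E = [0.0] * (n + 1)
    for _ in range(iters):
        E = [1 + 0.5 * E[i + 1] + 0.5 * E[0] for i in range(n)] + [0.0]
    return E[0]

print(round(expected_tosses(), 6))  # 62.0
```

The result agrees with the closed form $2^{n+1} - 2$ for $n$ consecutive heads.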
79 votes · 4 answers

Intuition behind using complementary CDF to compute expectation for nonnegative random variables

I've read the proof for why $\int_0^\infty P(X >x)dx=E[X]$ for nonnegative random variables (located here) and understand its mechanics, but I'm having trouble understanding the intuition behind this formula or why it should be the case at all. Does…
bouma
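One way to build intuition is to check the identity numerically. For $X \sim \text{Exp}(1)$ we have $P(X > x) = e^{-x}$ and $E[X] = 1$, so the tail integral should come out to $1$ (a rough Riemann-sum sketch):

```python
import math

# For X ~ Exponential(1):  P(X > x) = e^{-x}  and  E[X] = 1,
# so the integral of the tail probability should also be 1.
dx = 1e-4
tail_integral = sum(math.exp(-k * dx) * dx for k in range(200_000))  # up to x = 20
print(round(tail_integral, 3))  # ~1.0
```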
56 votes · 5 answers

Why does the median minimize $E(|X-c|)$?

Suppose $X$ is a real-valued random variable and let $P_X$ denote the distribution of $X$. Then $$ E(|X-c|) = \int_\mathbb{R} |x-c| dP_X(x). $$ The medians of $X$ are defined as any number $m \in \mathbb{R}$ such that $P(X \leq m) \geq \frac{1}{2}$…
Tim
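The claim is easy to check by brute force on a small sample, where $E(|X-c|)$ becomes the mean absolute deviation from $c$ (illustrative data, not from the question):

```python
# For a finite sample, E|X - c| is minimized at the sample median.
# Brute-force check over integer candidates c = 0..30.
data = [1, 2, 2, 5, 9, 11, 30]

def mean_abs_dev(c):
    return sum(abs(x - c) for x in data) / len(data)

best = min(range(0, 31), key=mean_abs_dev)
print(best)  # 5, the median of the sample
```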
54 votes · 3 answers

Expectation of Minimum of $n$ i.i.d. uniform random variables.

$X_1, X_2, \ldots, X_n$ are $n$ i.i.d. uniform random variables. Let $Y = \min(X_1, X_2,\ldots, X_n)$. Then, what's the expectation of $Y$ (i.e., $E(Y)$)? I have conducted some simulations in Matlab, and the results show that $E(Y)$ may equal to…
jet
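Assuming the variables are Uniform$(0,1)$, the known answer is $E(Y) = \frac{1}{n+1}$, since $P(Y > x) = (1-x)^n$. A quick Monte Carlo check:

```python
import random

# E[min of n iid Uniform(0,1)] should be 1/(n+1).
random.seed(0)
n, trials = 4, 200_000
est = sum(min(random.random() for _ in range(n)) for _ in range(trials)) / trials
print(round(est, 3))  # close to 1/(n+1) = 0.2
```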
52 votes · 10 answers

Striking applications of linearity of expectation

Linearity of expectation is a very simple and "obvious" statement, but has many non-trivial applications, e.g., to analyze randomized algorithms (for instance, the coupon collector's problem), or in some proofs where dealing with non-independent…
Clement C.
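The coupon collector's problem mentioned in the question is a canonical example: writing the total wait as a sum of geometric stages and applying linearity gives $E = n H_n$. A short sketch for $n = 6$:

```python
from fractions import Fraction

# Coupon collector via linearity: the wait for the k-th new coupon is
# geometric with success probability (n - k + 1)/n, so
#   E[total] = sum_{k=1}^{n} n/(n - k + 1) = n * H_n.
n = 6
expected = sum(Fraction(n, n - k + 1) for k in range(1, n + 1))
print(float(expected))  # 14.7 rolls to see all six faces of a die
```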
49 votes · 3 answers

Explain why $E(X) = \int_0^\infty (1-F_X (t)) \, dt$ for every nonnegative random variable $X$

Let $X$ be a non-negative random variable and $F_{X}$ the corresponding CDF. Show, $$E(X) = \int_0^\infty (1-F_X (t)) \, dt$$ when $X$ has : a) a discrete distribution, b) a continuous distribution. I assumed that for the case of a continuous…
Jon Gan
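For the discrete case, the analogue is $E(X) = \sum_{k \ge 0} P(X > k)$ for nonnegative integer $X$: summing the tails counts each outcome $x$ exactly $x$ times. A quick check on a fair die:

```python
from fractions import Fraction

# Discrete tail-sum formula: E[X] = sum_{k>=0} P(X > k).
# For a fair die, P(X > k) = (6 - k)/6 for k = 0..5, and E = 3.5.
tail_sum = sum(Fraction(6 - k, 6) for k in range(6))
print(float(tail_sum))  # 3.5
```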
42 votes · 4 answers

Free throw interview question

I recently had an interview question that posed the following... Suppose you are shooting free throws and each shot has a 60% chance of going in (there are no "learning" or "depreciation" effects; all shots have the same probability no matter how many…
sedavidw
25 votes · 2 answers

'Trace trick' for expectations of quadratic forms

I am trying to understand the proof for the Kullback-Leibler divergence between two multivariate normal distributions. On the way, a sort of trace trick is applied for the expectation of the quadratic form $$E[ (x-\mu)^T \Sigma^{-1} (x-\mu) ]=…
tomka
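The identity in question is $E[(x-\mu)^T A (x-\mu)] = \operatorname{tr}(A\Sigma)$ when $\operatorname{Cov}(x) = \Sigma$, which follows from writing the scalar as a trace and swapping trace with expectation: $E[\operatorname{tr}(A\,dd^T)] = \operatorname{tr}(A\,E[dd^T])$. A Monte Carlo sanity check (illustrative matrices, assuming NumPy is available):

```python
import numpy as np

# Trace trick:  E[(x - mu)^T A (x - mu)] = tr(A @ Sigma)  when Cov(x) = Sigma.
rng = np.random.default_rng(0)
mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.5], [0.5, 1.0]])
A = np.array([[1.0, 0.2], [0.2, 3.0]])

x = rng.multivariate_normal(mu, Sigma, size=200_000)
d = x - mu
mc = np.mean(np.einsum('ni,ij,nj->n', d, A, d))  # Monte Carlo estimate
exact = np.trace(A @ Sigma)                      # 5.2 for these matrices
print(round(exact, 3), round(mc, 2))
```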
24 votes · 1 answer

Average swaps needed for a random bubble sort algorithm

Suppose we have $n$ elements in a random permutation (each permutation has equal probability initially). While the elements are not fully sorted, we swap two adjacent elements at random (e.g. the permutation $(1, 3, 2)$ can go to $(1, 2, 3)$ or $(3,…
Kyky
20 votes · 3 answers

Expected value of an expected value

I am looking at a proof that $\text{Var}(X)= E((X - EX)^2) = E(X^2) - (E(X))^2$: $E((X - EX)^2) =$ $E(X^2 - 2XE(X) + (E(X))^2) =$ $E(X^2) - 2E(X)E(X) + (E(X))^2$. I can't see how the second line can be equal to the third line. I would have had the…
csss
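The step in question is linearity of expectation together with the fact that $E(X)$ is a constant, so $E[2X\,E(X)] = 2E(X)\,E(X)$. A numeric check on a fair die (a small sketch):

```python
from fractions import Fraction

# Check  E[(X - EX)^2] = E[X^2] - (E[X])^2  exactly for a fair die.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}
EX = sum(x * p for x, p in pmf.items())
EX2 = sum(x * x * p for x, p in pmf.items())
lhs = sum((x - EX) ** 2 * p for x, p in pmf.items())  # definition of variance
rhs = EX2 - EX ** 2                                   # shortcut formula
print(lhs == rhs, round(float(lhs), 4))  # True 2.9167
```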
19 votes · 2 answers

Expected value of maximum and minimum of $n$ normal random variables

Let $X_1, \dots, X_n \sim N(\mu,\sigma)$ be normal random variables. Find the expected value of random variables $\max_i(X_i)$ and $\min_i(X_i)$. The sad truth is I don't have any good idea how to start and I'll be glad for a hint.
user65985
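There is no simple closed form for general $n$; one hint is to write $E[\max_i X_i] = \mu + \sigma\,E[\max_i Z_i]$ with $Z_i \sim N(0,1)$, and by symmetry $E[\min_i Z_i] = -E[\max_i Z_i]$. The standard-normal constant can be estimated by simulation (for $n = 5$ it is about $1.163$); a sketch:

```python
import random

# Estimate E[max of n iid standard normals] by Monte Carlo.
random.seed(1)
n, trials = 5, 100_000
est = sum(max(random.gauss(0, 1) for _ in range(n)) for _ in range(trials)) / trials
print(round(est, 2))  # close to the known value of about 1.163 for n = 5
```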
19 votes · 1 answer

What is the expected value of this game for large N?

For a given even $N$, I have $N/2$ red cards and $N/2$ black cards. Each time I draw a black card I win a dollar, each time I draw a red card I lose a dollar. I can stop at any time I like (and choose to do so in such a way that would maximize my…
confucious
17 votes · 6 answers

Four coins with reflip problem?

I came across the following problem today. Flip four coins. For every head, you get $\$1$. You may reflip one coin after the four flips. Calculate the expected returns. I know that the expected value without the extra flip is $\$2$. However, I am…
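Assuming you play optimally (reflip a tail if there is one, and stand pat on four heads), the optional reflip adds $\frac12 P(\text{at least one tail}) = \frac{15}{32}$ in expectation, for $\$2.46875$ overall. Enumerating the $16$ outcomes:

```python
from fractions import Fraction
from itertools import product

# Enumerate all 2^4 equally likely outcomes (1 = head).  With at least one
# tail, reflipping a tail adds 1/2 head in expectation; with four heads,
# reflipping can only lose, so you keep the $4.
total = Fraction(0)
for flips in product([0, 1], repeat=4):
    heads = sum(flips)
    value = heads + (Fraction(1, 2) if heads < 4 else 0)
    total += Fraction(value) / 16
print(float(total))  # 2.46875 = 2 + (15/16) * (1/2)
```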
15 votes · 1 answer

Make $2$ cubes out of $1729$ unit cubes, expected number of times you have to paint

I'm trying to solve question 6 from the PuMac 2007 Combinatorics A competition: Joe has $1729$ randomly oriented randomly arranged unit cubes, which are initially unpainted. He makes two cubes of sidelengths $9$ and $10$ or of sidelengths $1$ and…
15 votes · 3 answers

Proving $\operatorname{Var}(X) = E[X^2] - (E[X])^2$

I want to understand something about the derivation of $\text{Var}(X) = E[X^2] - (E[X])^2$ Variance is defined as the expected squared difference between a random variable and the mean (expected value): $\text{Var}(X) = E[(X -…