Questions tagged [random-variables]

Questions about measurable maps from a probability space to a measurable space.

A random variable $X: \Omega \to E$ is a measurable function from a set of possible outcomes $\Omega$ to a measurable space $E$. The technical axiomatic definition requires $\Omega$ to be a sample space of a probability triple. Usually $X$ is real-valued.

The probability that $X$ takes a value in a measurable set $S \subseteq E$ is written as:

$$P(X \in S) = P(\{ \omega \in \Omega|X(\omega) \in S\})$$

where $P$ is the probability measure with which $\Omega$ is equipped.
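On a finite sample space the definition above can be made concrete in a few lines. The following sketch (the fair-die example and all names in it are my own, not part of the tag description) computes $P(X \in S)$ as the measure of the preimage $\{\omega \in \Omega : X(\omega) \in S\}$:

```python
from fractions import Fraction

# Finite sample space: one roll of a fair six-sided die.
omega = {1, 2, 3, 4, 5, 6}
P = {w: Fraction(1, 6) for w in omega}  # uniform probability measure on omega

def X(w):
    # A real-valued random variable X: omega -> R (parity of the outcome).
    return w % 2

def prob(S):
    # P(X in S) = P({w in omega : X(w) in S}) -- the measure of the preimage.
    return sum(P[w] for w in omega if X(w) in S)

print(prob({1}))  # P(X = 1) = P({1, 3, 5}) = 1/2
```

On a finite space every function is measurable, so the measurability condition is invisible here; it only bites for uncountable $\Omega$.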

10873 questions
2
votes
1 answer

For random variables $X$ and $Y$, $F_{X,Y}(x,y)=F_X(x)F_Y(y)$ if and only if $f_{X,Y}(x,y)=f_X(x)f_Y(y)$

I have seen either statement used as a definition of independent random variables. I was trying to prove their equivalence for discrete random variables. I am able to prove that if the joint density function (or mass function as some books call it…
2
votes
1 answer

Indicator - Random Variable

To each event $A$ in a probability space $(S,P)$ we associate a random variable (called the indicator of $A$) as follows: $I_A:S \rightarrow R$ where $\forall s\in S$ we have $$I_A(s) = \begin{cases} 1&\text{if}~s\in A \\ 0&\text{otherwise}. …
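The key property of an indicator is that its expectation is the probability of the event, $E[I_A] = P(A)$. A minimal sketch on a finite space (the die example is an assumption of mine, not part of the question):

```python
from fractions import Fraction

# Finite probability space (S, P): a fair six-sided die.
S = {1, 2, 3, 4, 5, 6}
P = {s: Fraction(1, 6) for s in S}
A = {2, 4, 6}  # the event "the outcome is even"

def I_A(s):
    # Indicator of A: 1 on A, 0 elsewhere.
    return 1 if s in A else 0

E_IA = sum(I_A(s) * P[s] for s in S)  # expectation of the indicator
P_A = sum(P[s] for s in A)            # probability of the event
print(E_IA, P_A)  # both equal 1/2
```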
2
votes
4 answers

Inferring covariance cov[X,Z] from cov[X,Y] and cov[Y,Z] of known distributions

Suppose X, Y and Z are real random variables of known distributions. If one knows the covariance $COV(X,Y)$ and $COV(Y,Z)$, is it possible to infer $COV(X,Z)$?
TimY
  • 151
  • 2
  • 7
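The short answer is no: $\operatorname{Cov}(X,Y)$ and $\operatorname{Cov}(Y,Z)$ do not determine $\operatorname{Cov}(X,Z)$; they only constrain it, since the full covariance matrix must be positive semidefinite. A sketch of a counterexample pair, using finite $\pm 1$-valued variables of my own choosing:

```python
from itertools import product
from fractions import Fraction

def cov(joint, i, j):
    # joint maps (x, y, z) -> probability; i, j index the coordinates.
    Ei = sum(t[i] * p for t, p in joint.items())
    Ej = sum(t[j] * p for t, p in joint.items())
    Eij = sum(t[i] * t[j] * p for t, p in joint.items())
    return Eij - Ei * Ej

q = Fraction(1, 4)
# Case A: Y independent of X, and Z = X (perfectly correlated with X).
caseA = {(x, y, x): q for x in (-1, 1) for y in (-1, 1)}
# Case B: X, Y, Z fully independent.
caseB = {t: Fraction(1, 8) for t in product((-1, 1), repeat=3)}

# Both cases share cov(X,Y) = cov(Y,Z) = 0, yet cov(X,Z) differs (1 vs 0).
print(cov(caseA, 0, 2), cov(caseB, 0, 2))
```

Both cases also have identical marginal distributions for $X$, $Y$ and $Z$, so even "known distributions" plus the two covariances leave $\operatorname{Cov}(X,Z)$ undetermined.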
2
votes
1 answer

inclusion of $\sigma$-algebra generated by random variables

Consider the following random variables $$X:\Omega\to\mathbb{R}\quad\text{and}\quad Y:\Omega\to \mathbb{R}$$ and $$Z:=XY,$$ i.e. $$Z(\omega) = X(\omega)Y(\omega).$$ In general, we cannot say much about the relations…
newbie
  • 3,261
  • 2
  • 25
  • 46
2
votes
1 answer

Moments of Geometric Random Variable

Let $X$ be a geometric random variable, i.e. it represents the number of consecutive failures before you get the first success, where the success probability is $\rho$. We know $E[X] = 1/\rho$ and $E[X^2] = (1-\rho)/\rho^2$. Does it generally hold…
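Moment identities like these are easy to sanity-check numerically. A sketch under one common convention (beware: conventions differ, with $X$ counting either the trials up to and including the first success or only the failures before it; the truncated sums below use the trials convention, $P(X=k) = (1-\rho)^{k-1}\rho$ for $k \ge 1$, and the value of $\rho$ is arbitrary):

```python
# Truncated-series check of the geometric moments under the "number of
# trials" convention: P(X = k) = (1 - rho)**(k - 1) * rho, k = 1, 2, ...
rho = 0.3
ks = range(1, 2000)  # the neglected tail is of order (1 - rho)**2000
pmf = [(1 - rho) ** (k - 1) * rho for k in ks]
EX = sum(k * p for k, p in zip(ks, pmf))
EX2 = sum(k * k * p for k, p in zip(ks, pmf))
var = EX2 - EX ** 2
print(EX, var)  # ~ 1/rho and ~ (1 - rho)/rho**2
```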
2
votes
0 answers

conditional expectation of square given sum of iid variables

I have a simple question on conditional expectation but it appears I'm stuck a bit. I want to compute $E[X_1^2|X_1+\cdots+X_k=n]$ where the variables $X_1,\ldots,X_k$ are iid. Now, due to the iid property, we have $E[X_1^2|X_1+\cdots+X_k=n] =…
mathse
  • 2,338
  • 10
  • 15
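By symmetry of iid variables, $E[X_1^2 \mid S = n] = E[X_i^2 \mid S = n]$ for every $i$, so it equals $\frac{1}{k}E\bigl[\sum_i X_i^2 \mid S = n\bigr]$. A small exact check by enumeration (the uniform distribution on $\{1,2,3\}$ and the values of $k$ and $n$ are my own choices):

```python
from itertools import product
from fractions import Fraction

# iid X_1, ..., X_k uniform on {1, 2, 3}; condition on S = X_1 + ... + X_k = n.
k, n = 3, 6
conditioned = [t for t in product((1, 2, 3), repeat=k) if sum(t) == n]

def cond_E(f):
    # E[f | S = n]: the uniform weights cancel, leaving a plain average.
    return Fraction(sum(f(t) for t in conditioned), len(conditioned))

E_first = cond_E(lambda t: t[0] ** 2)            # E[X_1^2 | S = n]
E_sum = cond_E(lambda t: sum(x * x for x in t))  # E[sum X_i^2 | S = n]
print(E_first, E_sum / k)  # equal, by symmetry of the iid coordinates
```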
2
votes
2 answers

Maximum of independent exponential R.V.s: need help understanding

I have a question about this problem. What I don't understand: I do understand that $P(\text{2nd max of } \{X_1, X_2, \ldots, X_n\} < t) = P(X_1 < t)\,P(X_2 < t)\cdots P(X_{n-1} < t)$, which becomes $(1-e^{-\lambda t})^{n-1}$, the second term above. However, I don't…
kou
  • 339
  • 1
  • 6
  • 19
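For the maximum itself, independence gives $P(\max_i X_i \le t) = \prod_i P(X_i \le t) = (1-e^{-\lambda t})^n$. A Monte Carlo sanity check of that closed form (the parameter values are arbitrary choices of mine):

```python
import math
import random

random.seed(0)
lam, n, t = 1.5, 5, 1.0
trials = 200_000

# Fraction of trials in which the maximum of n iid Exp(lam) draws is <= t.
hits = sum(
    max(random.expovariate(lam) for _ in range(n)) <= t for _ in range(trials)
)
empirical = hits / trials
exact = (1 - math.exp(-lam * t)) ** n  # product of the n identical CDFs
print(empirical, exact)  # agree up to Monte Carlo error ~ 1/sqrt(trials)
```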
2
votes
0 answers

A nice sequence of random variables

Let $f:U\to \mathbb{R}^k$ with $U\subset \mathbb{R}$ be a smooth injective function. Suppose that $\sqrt n(Y_n- Y)\to N(0,\Omega)$ in distribution with $Y=f(X)$. Define $X_n$ by $X_n=\arg\min_{x\in U}\|f(x)-Y_n\|$ for all $n\geq…
2
votes
1 answer

Monotone convergence and uniform integrability: an application.

Suppose $E[X_n] < \infty$ for $n = 1,2,\ldots,\infty$ and $X_n$ increases to $X_\infty$ almost everywhere. Prove that $$E\left[|X_n - X_\infty|\right]\to 0$$ as $n$ tends to $\infty$. Here's what I tried (but was told it is wrong): $E[X_n] < \infty$…
2
votes
1 answer

Density of Gaussian Unitary Ensemble

I'm trying to learn a bit about Gaussian matrix ensembles, and am having some trouble making the following connection. Sorry if I'm being a bit obtuse. Take the Gaussian unitary ensemble (GUE) of $n \times n$ Hermitian matrices. A matrix $H$ from…
2
votes
1 answer

Calculating joint MGF

This is an end-of-chapter question from a Korean textbook, and unfortunately it only has solutions to the even-numbered questions, so I'm seeking some hints or tips to work out this particular joint moment generating function question. or I better…
2
votes
1 answer

Coupling Pairs of Random Variable.

Let $\{X_i\}_{i=1}^{n}$ and $\{Z_i\}_{i=1}^{n}$ be sets of independent random variables with couplings $\{\hat{X}_i\}_{i=1}^{n}$ and $\{\hat{Z}_i\}_{i=1}^{n}$, respectively. It then states $$\Big(\sum_{i=1}^{n} \hat{X}_i,\sum_{i=1}^{n}…
2
votes
1 answer

A basic question about the sigma-algebra generated by a random variable

Why do we need the concept of sigma algebra generated by a random variable ?
aaaaaa
  • 2,516
  • 1
  • 20
  • 25
2
votes
1 answer

How to show that a sequence of random variables doesn't converge in probability?

Say, we have the sequence of random variables defined on $\Omega=[0,1]$ with uniform distribution: $$X_n(\omega) := \begin{cases} \omega, & \text{if $n$ is odd} \\ 1-\omega, & \text{if $n$ is even} \\ \end{cases}$$ And we'd like to investigate if…
Leo
  • 7,400
  • 5
  • 27
  • 56
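One way to see the failure: for every $n$, $|X_{n+1}(\omega) - X_n(\omega)| = |1 - 2\omega|$, so $P(|X_{n+1} - X_n| > \varepsilon) = 1 - \varepsilon$ for $\varepsilon < 1$, a constant independent of $n$; the sequence is not Cauchy in probability, hence cannot converge in probability. A quick simulation of that constant gap (the choice $\varepsilon = 0.5$ is mine):

```python
import random

random.seed(1)
eps, trials = 0.5, 100_000

# X_{n+1}(w) - X_n(w) = +/-(1 - 2w) for every n, so the gap between
# consecutive terms has the same distribution at every index.
count = sum(
    abs((1 - w) - w) > eps for w in (random.random() for _ in range(trials))
)
empirical = count / trials
print(empirical)  # stays near 1 - eps = 0.5 for every n
```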
2
votes
2 answers

Product of standard normal and uniform random variable

I'm trying to find the PDF of the product of two random variables by first finding the CDF. I don't know where I'm going wrong. Let $X\sim N(0,1)$ and $Y\sim Uniform\{-1,1\}$ and let $Z = XY$, then: $F_Z(Z
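A useful cross-check on any derived $F_Z$ here: since $X$ is symmetric about $0$ and $Y \in \{-1,+1\}$ is independent of it, $Z = XY$ is again standard normal. A simulation comparing the empirical CDF of $Z$ to $\Phi$ (all names and probe points below are my own):

```python
import math
import random

random.seed(2)
trials = 200_000
# Z = X * Y with X ~ N(0, 1) and Y uniform on {-1, +1}, independent.
zs = [random.gauss(0, 1) * random.choice((-1, 1)) for _ in range(trials)]

def std_normal_cdf(t):
    return 0.5 * (1 + math.erf(t / math.sqrt(2)))

# Largest gap between the empirical CDF of Z and Phi at a few probe points.
gaps = []
for t in (-1.0, 0.0, 1.0):
    empirical = sum(z <= t for z in zs) / trials
    gaps.append(abs(empirical - std_normal_cdf(t)))
print(max(gaps))  # small: Z is standard normal, by symmetry of N(0,1)
```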