
Let $X_i$ be an iid sequence of random variables with support in $(0,1)$. I'm looking for references (even just a name) for the following infinite sum random variable:

$$S:=X_1+X_1X_2+X_1X_2X_3+X_1X_2X_3X_4+\cdots.$$

This came up in a waiting-time problem. I can easily calculate the expected value and variance of the above sum, but I'm interested in whether other people have studied this in the literature, specifically whether there are asymptotics for the limiting distribution. I presume there are also issues when $P(X>1-\epsilon)$ falls off too slowly as $\epsilon\to 0$.
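For concreteness, here is a quick Monte Carlo sketch of $S$. The choice of Uniform$(0,1)$ factors is purely illustrative (the question leaves the common law of the $X_i$ open); for that choice the sample mean should approach $\sum_{k\ge 1} E[X]^k = \sum_{k\ge 1} 2^{-k} = 1$:

```python
import random

random.seed(0)

def sample_S(n_terms=60):
    """One (truncated) draw of S = X1 + X1*X2 + X1*X2*X3 + ...
    The X_i are taken Uniform(0,1) purely for illustration."""
    total, prod = 0.0, 1.0
    for _ in range(n_terms):      # the tail after 60 terms has mean 2^-60
        prod *= random.random()
        total += prod
    return total

samples = [sample_S() for _ in range(50_000)]
mean = sum(samples) / len(samples)   # should be close to E[S] = 1 for uniform X
```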

Alex R.
  • Just did a bit of searching. If I compute the deterministic version of this series (say, the expectation), then I get https://www.wolframalpha.com/input/?i=sum+(product+1%2F2%5Ek,+k%3D1+to+n),+n+%3D+1+to+infinity the theta function, which has some relationship with $q$-series. But at the moment I cannot find anything else similar. – BGM Jan 11 '17 at 05:22
  • Dumb comment on my part, but maybe there are more results when you take the logarithm of this, although probably not, since the logarithm doesn't distribute over addition. – Chill2Macht Jan 12 '17 at 22:35
  • Keyword: stochastic (random) perpetuity. – zhoraster Jan 15 '17 at 18:15
  • Let $Y$ be a random variable that is independent of $S$ and has the same law as $X_1$. Then $S$ is equal in law to $Y(1+S)$. This may help you obtain the desired asymptotics. – pre-kidney Jan 16 '17 at 12:22
  • I doubt that this helps, but... using the above observation by @pre-kidney leads to a recursion $$s_k = \frac{t_k}{1-t_k} \sum_{j=0}^{k-1} {k \choose j } s_j$$ where $s_k,t_k$ are the raw moments of the variables $S,X$ resp. – leonbloy Jan 16 '17 at 14:32
  • This is probably a good paper to start with: http://link.springer.com/chapter/10.1007/978-3-642-57984-4_6 – zhoraster Jan 17 '17 at 12:00
  • What is the distribution of $X_i$ please? – PSPACEhard Jan 18 '17 at 13:10
  • @NP-hard I guess the idea is to express the distribution of $S$ in terms of the (general) distribution of $X_i$ – leonbloy Jan 18 '17 at 18:36
  • @zhoraster: Thanks for the reference! If you want to make it an answer, I'll award the bounty. – Alex R. Jan 19 '17 at 20:44
  • Link-only answers are discouraged here. Moreover, I'm not sure that this answers the question (it is not very clear what exactly the question is). I'm glad if I was able to help. – zhoraster Jan 20 '17 at 08:27
  • Since $S \sim X(1 + S)$, don't we have $f_S(s) = \int_{0}^{1} \! x^{-1} \, f_X(x) \, f_S(s/x-1) \, dx$ as an integral equation for the p.d.f.? –  Oct 01 '17 at 22:57
  • https://math.stackexchange.com/questions/2130264/sum-of-random-decreasing-numbers-between-0-and-1-does-it-converge – Carlos Toscano-Ochoa Mar 29 '18 at 12:23
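For what it's worth, leonbloy's moment recursion from the comments, $s_k = \frac{t_k}{1-t_k}\sum_{j=0}^{k-1}\binom{k}{j}s_j$, is easy to check symbolically. The sketch below assumes Uniform$(0,1)$ factors (an illustrative choice, so $t_k = E[X^k] = 1/(k+1)$) and recovers $E[S]=1$, $E[S^2]=3/2$, hence $\mathrm{Var}(S)=1/2$:

```python
from fractions import Fraction
from math import comb

def raw_moments(t, K):
    """Raw moments s_k = E[S^k] from the recursion
    s_k = t_k/(1-t_k) * sum_{j<k} C(k,j) s_j,
    which follows from S =d X(1+S) with X independent of S.
    t(k) must return E[X^k]; s_0 = 1."""
    s = [Fraction(1)]
    for k in range(1, K + 1):
        acc = sum(comb(k, j) * s[j] for j in range(k))
        s.append(t(k) / (1 - t(k)) * acc)
    return s

# Uniform(0,1): E[X^k] = 1/(k+1)  (illustrative choice of law)
t = lambda k: Fraction(1, k + 1)
s = raw_moments(t, 3)
# s[1] = 1, s[2] = 3/2, so Var(S) = s[2] - s[1]^2 = 1/2
```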

1 Answer


Let $\Sigma_1=\sum_{i=1}^{\infty}\prod_{j=1}^{i}X_j$ and notice that $\Sigma_1=X_1\left(1+\sum_{i=2}^{\infty}\prod_{j=2}^{i}X_j\right)=X_1(1+\Sigma_2)$, where $\Sigma_2:=\sum_{i=2}^{\infty}\prod_{j=2}^{i}X_j$.

Also note that $\Sigma_1$ and $\Sigma_2$ are identically distributed (which can be seen by shifting the dummy indices).

Let us also assume that each $\Sigma_i$ has a pdf $f_{\Sigma}(\sigma)$.

Define $Y=1+\Sigma_2$, so that $f_Y(y)=f_{\Sigma}(y-1)$.

Moreover, since $\Sigma_2$ does not involve $X_1$, the two factors are independent.

Finally we can write down:

$f_{\Sigma}(\sigma)=\int_{\mathscr{Y}}{f_X(\sigma/y)f_Y(y)\frac{dy}{y}}=\int_{x=0}^{1}{f_X(x)f_Y(\sigma/x)\frac{dx}{x}}=\int_{x=0}^{1}{f_X(x)f_{\Sigma}(\sigma/x-1)\frac{dx}{x}}$

For an arbitrary $f_X(x)$ this equation cannot be solved in closed form; even the existence of $f_{\Sigma}$ is not guaranteed.
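Even without a closed form, the integral equation can be attacked numerically as a fixed point: the map $f \mapsto \int_0^1 f_X(x)\,f(\,\cdot\,/x-1)\,\frac{dx}{x}$ is just the density version of the recursion $\Sigma \stackrel{d}{=} X(1+\Sigma)$, so iterating it from a rough initial guess should converge to $f_{\Sigma}$. A sketch, assuming Uniform$(0,1)$ factors and a truncated grid (both choices are illustrative, not part of the original answer); for uniform $X$ the resulting mean should be close to $E[\Sigma]=1$:

```python
import numpy as np

# Fixed-point iteration for f(sigma) = int_0^1 f_X(x) f(sigma/x - 1) dx / x,
# taking f_X = 1 on (0,1) (uniform X, illustrative) and truncating sigma at L_max.
L_max, M, N = 8.0, 801, 400
sig = np.linspace(0.0, L_max, M)
ds = sig[1] - sig[0]
dx = 1.0 / N
x = (np.arange(N) + 0.5) * dx          # midpoint rule on (0, 1), avoids x = 0

f = np.where(sig < 2.0, 0.5, 0.0)      # crude initial density with mass 1 on [0, 2]
for _ in range(60):
    arg = sig[:, None] / x[None, :] - 1.0              # sigma/x - 1 for every (sigma, x)
    vals = np.interp(arg, sig, f, left=0.0, right=0.0) # f is 0 outside the grid
    f = (vals / x[None, :]).sum(axis=1) * dx           # one application of the map
    f /= f.sum() * ds                                  # renormalize against drift

mean = (sig * f).sum() * ds            # should approach E[Sigma] = 1 for uniform X
```

Each iteration is one step of the Markov chain $\Sigma_{n+1} = X(1+\Sigma_n)$ written at the level of densities, which contracts at rate $E[X]$, so a few dozen sweeps suffice here.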

One trivial solution for the pair $(X,\Sigma)$ is $f_X(t)=f_\Sigma(t)=\delta(t)$, i.e. $X=0$ almost surely, so that $\Sigma=0$ as well.

One trivial choice of $X$ for which $\Sigma$ has no pdf at all is $f_X(t)=\delta(t-1)$: with $X=1$ almost surely, the series diverges.

So if you can give us $f_X(x)$, we can say more about the solution; otherwise, the integral equation above is about as far as one can go.

keoxkeox