Questions tagged [markov-process]

A stochastic process satisfying the Markov property: the distribution of future states, given the value of the current state, does not depend on the past states. Use this tag for general state space processes (in both discrete and continuous time); use (markov-chains) for countable state space processes.


2266 questions
72
votes
3 answers

What is the importance of the infinitesimal generator of Brownian motion?

I have read that the infinitesimal generator of Brownian motion is $\frac{1}{2}\Delta$. Unfortunately, I have no background in semigroup theory, and the expositions of semigroup theory I have found lack any motivation or intuition. What is…
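A quick Monte Carlo sanity check of the fact behind this question — that the generator of Brownian motion acts on smooth $f$ as $\frac{1}{2}f''$ — can be sketched in Python (an illustrative example, not from the question itself):

```python
import math
import random

# Check numerically that  lim_{t->0} ( E[f(x + B_t)] - f(x) ) / t = (1/2) f''(x)
# for Brownian motion.  With f(x) = x**3 at x = 1 the limit is (1/2)*6*1 = 3.
rng = random.Random(0)
t = 1e-3          # small time step
n = 200_000       # Monte Carlo samples
x = 1.0

acc = 0.0
for _ in range(n):
    b = rng.gauss(0.0, math.sqrt(t))   # B_t ~ N(0, t)
    acc += (x + b) ** 3

est = (acc / n - x ** 3) / t
print(est)   # should be close to 3
```

Here there is no discretization bias, since $\mathbb{E}[(1+B_t)^3] = 1 + 3t$ exactly; only Monte Carlo noise remains.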
27
votes
6 answers

Why Markov matrices always have 1 as an eigenvalue

Also called a stochastic matrix. Let $A=[a_{ij}]$ be a matrix over $\mathbb{R}$ with $0\le a_{ij} \le 1\ \forall i,j$ and $\sum_{j}a_{ij}=1\ \forall i$, i.e. the sum along each row of $A$ is 1. I want to show $A$ has an eigenvalue of 1. The way I've seen…
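The key observation in this question can be verified numerically: a row-stochastic matrix satisfies $A\mathbf{1}=\mathbf{1}$, so $1$ is always an eigenvalue. A small sketch with a made-up matrix:

```python
import numpy as np

# An arbitrary row-stochastic matrix (entries in [0,1], each row sums to 1).
A = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])

# Rows summing to 1 means A @ (1,...,1) = (1,...,1), so 1 is an
# eigenvalue with eigenvector (1,...,1).
ones = np.ones(len(A))
print(np.allclose(A @ ones, ones))                # True
print(np.isclose(np.linalg.eigvals(A), 1).any())  # True
```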
27
votes
4 answers

Markov process vs. Markov chain vs. random process vs. stochastic process vs. collection of random variables

I'm trying to understand each of the above terms, and I'm having a lot of trouble deciphering the difference between them. According to Wikipedia: A Markov chain is a memoryless, random process. A Markov process is a stochastic process, which…
14
votes
2 answers

Interpretation for the determinant of a stochastic matrix?

Is there a probabilistic interpretation for the determinant of a stochastic matrix (i.e. an $n \times n$ matrix whose columns sum to unity)?
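One constraint worth noting (a numerical sketch, not an answer to the question): all eigenvalues of a stochastic matrix lie in the closed unit disk, so $|\det A| \le 1$. Checking this on random row-stochastic matrices:

```python
import numpy as np

# The eigenvalues of a stochastic matrix lie in the closed unit disk,
# and det A is the product of the eigenvalues, so |det A| <= 1.
rng = np.random.default_rng(0)
ok = True
for _ in range(100):
    A = rng.random((4, 4))
    A /= A.sum(axis=1, keepdims=True)   # normalize rows to sum to 1
    ok &= abs(np.linalg.det(A)) <= 1 + 1e-12
print(ok)   # True
```

The same bound holds for column-stochastic matrices, since $A$ and $A^{\mathsf T}$ have the same determinant and spectrum.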
14
votes
1 answer

Markov chains: is "aperiodic + irreducible" equivalent to "regular"?

I have two books on stochastic processes. In one book, it says that the limiting matrix is possible to find if the matrix is regular, that is, if for some $n$, $P^n$ has only positive entries. The other book says that the limiting values are possible…
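A small numerical illustration of the "regular" condition (a hypothetical two-state example, not from either book): the one-step matrix below has a zero entry, yet it is irreducible and aperiodic, and its square is already strictly positive.

```python
import numpy as np

# Irreducible, aperiodic two-state chain: P itself has a zero entry,
# but P^2 is strictly positive, so P is regular.
P = np.array([[0.0, 1.0],
              [0.5, 0.5]])

print((P > 0).all())                          # False: P has a zero entry
P2 = np.linalg.matrix_power(P, 2)
print((P2 > 0).all())                         # True: P^2 is strictly positive
```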
14
votes
3 answers

Probability that all lights $\mathbf{X}=(X_1,X_2,\cdots)$ turn off again, when at each step we toggle the light whose number is $n\sim\text{geom}(\frac{1}{2})$

Problem: Let $\mathbf{X} = (\mathbb{Z}_2)^\mathbb N$, i.e., $\mathbf{X} = (X_1,X_2,\cdots,X_N,\cdots)$, $X_i\in \{0,1\} $. It can be considered as countable lightbulbs. $0$ means off, $1$ means on. We start with $\mathbf{X}_0 = 0$. Keep generating…
14
votes
1 answer

Motivation of Feynman-Kac formula and its relation to Kolmogorov backward/forward equations?

Kolmogorov backward/forward equations are PDEs derived for the semigroups constructed from Markov transition kernels. The Feynman-Kac formula is also a PDE corresponding to a stochastic process defined by an SDE. But I was wondering if the…
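For reference, one common statement of the Feynman-Kac formula (stated informally here; consult a textbook for the precise regularity conditions): if $u$ solves the terminal-value problem

$$\partial_t u(x,t) + \mu(x,t)\,\partial_x u(x,t) + \tfrac{1}{2}\sigma^2(x,t)\,\partial_x^2 u(x,t) - V(x,t)\,u(x,t) = 0, \qquad u(x,T)=\psi(x),$$

then it has the stochastic representation

$$u(x,t) = \mathbb{E}\!\left[\left. e^{-\int_t^T V(X_s)\,ds}\,\psi(X_T)\,\right|\, X_t = x\right], \qquad dX_s = \mu(X_s,s)\,ds + \sigma(X_s,s)\,dW_s.$$

Taking $V\equiv 0$ recovers the stochastic representation of the Kolmogorov backward equation, which is one way to see the relation the question asks about.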
13
votes
1 answer

Hilbert's Barber Shop

Hilbert opens a barber shop with an infinite number of chairs and an infinite number of barbers. Customers arrive via a Poisson random process with an expected 1 person every 10 minutes. Upon arrival, they sit in the first unoccupied chair and their…
bhensley
13
votes
2 answers

Difference in probability distributions from two different kernels

I wonder if the probability kernels of Markov processes on the same state space are close enough, does it also hold for the probabilities of the event that depend only on first $n$ values of the process. More formally, let $(E,\mathscr E)$ be a…
12
votes
2 answers

Expected number of steps between states in a Markov Chain

Suppose I am given a state space $S=\{0,1,2,3\}$ with transition probability matrix $\mathbf{P}= \begin{bmatrix} \frac{2}{3} & \frac{1}{3} & 0 & 0 \\[0.3em] \frac{2}{3} & 0 & \frac{1}{3} & 0\\[0.3em] \frac{2}{3} & 0 & 0…
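The standard approach for this kind of question is first-step analysis: the mean hitting times $h_i$ of a target state satisfy $h_i = 1 + \sum_j P_{ij} h_j$ for $i \neq$ target, with $h_{\text{target}} = 0$. A sketch on a hypothetical 3-state chain (the question's 4-state matrix is truncated above):

```python
import numpy as np

# Hypothetical 3-state chain; expected number of steps to first reach
# `target`, solving (I - Q) h = 1 over the non-target states, where Q
# is P restricted to those states.
P = np.array([[0.5, 0.5, 0.0],
              [0.2, 0.3, 0.5],
              [0.0, 0.0, 1.0]])
target = 2

states = [i for i in range(len(P)) if i != target]
Q = P[np.ix_(states, states)]
h = np.linalg.solve(np.eye(len(states)) - Q, np.ones(len(states)))
print(dict(zip(states, h)))   # mean hitting times from states 0 and 1
```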
12
votes
3 answers

Select a new value from last $N$ values; how long until the last $N$ are all the same?

Say we start with $N$ distinct numbers in a line, like $1,2,3,\ldots,N$. In each round, we choose a random one from the last $N$ numbers and put it at the end. What is the expected number of rounds until the last $N$ numbers are all the same? E.g. for $N = 2$, first…
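The process described above is easy to simulate, which gives a way to check candidate formulas (a Monte Carlo sketch; variable names are my own):

```python
import random

def rounds_until_same(n, rng):
    # Window of the last n numbers; each round append a uniformly random
    # element of the window and drop the oldest, until all n are equal.
    window = list(range(n))   # n distinct starting values
    rounds = 0
    while len(set(window)) > 1:
        window.append(rng.choice(window))
        window.pop(0)
        rounds += 1
    return rounds

rng = random.Random(0)
trials = 20_000
est = sum(rounds_until_same(2, rng) for _ in range(trials)) / trials
print(est)   # for n = 2 the round count is geometric with p = 1/2, mean 2
```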
11
votes
0 answers

Uniqueness of an infinite system of linear ODEs

How to prove that $\dot{x}=ax,\space x(0)=1$ has a unique solution if $a,x$ are infinite dimensional matrices? More specifically, let $Q$ be a bounded infinitesimal generator, i.e. $Q=(q_{i,j})_{i,j\in\mathbb{N}_0}$, its entries comprise a bounded…
Evan Aad
11
votes
0 answers

Limit distributions for Markov chains $X\to\sqrt{U+X}$

This question spawned from a recent, very interesting problem. Let $\varphi=\frac{1+\sqrt{5}}{2}$ and $T$ denote the map on the space of continuous probability density functions supported over $\left(0,\varphi\right)$, defined by $$ (T f)(x) =…
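The support $(0,\varphi)$ can be seen directly from the underlying chain $X_{n+1}=\sqrt{U_n+X_n}$ with $U_n\sim\text{Uniform}(0,1)$: since $\varphi^2 = \varphi + 1$, we have $\sqrt{u+x} < \sqrt{1+\varphi} = \varphi$ whenever $x<\varphi$. A simulation sketch:

```python
import math
import random

# Iterate X_{n+1} = sqrt(U_n + X_n), U_n ~ Uniform(0,1).  By induction,
# starting from x = 0 the chain never leaves (0, phi), where phi is the
# positive root of phi^2 = phi + 1.
phi = (1 + math.sqrt(5)) / 2
rng = random.Random(1)

x = 0.0
samples = []
for _ in range(10_000):
    x = math.sqrt(rng.random() + x)
    samples.append(x)

print(max(samples) < phi)   # True: every sample stays below phi
```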
Jack D'Aurizio
10
votes
1 answer

Mean hitting time: reference request

After answering this question (Expectation of a stopping time uniquely determined by a function), I was looking for literature on the mean hitting/exit time for a discrete-time Markov process. In Meyn and Tweedie, Durrett and some other books I…
Ilya
10
votes
1 answer

Strong Markov property of Brownian motion

I was able to understand that Brownian motion $\{B(t):t\geq0\}$ has the strong Markov property, i.e. for any stopping time $\tau$, $P(B(t+\tau)\leq y \mid \mathcal{F}_{\tau})=P(B(t+\tau)\leq y\mid B(\tau))$ a.s., $y \in \mathbb{R}$. I want to prove the following…