Questions tagged [markov-chains]

Stochastic processes (with either discrete or continuous time dependence) on a discrete (finite or countably infinite) state space in which the distribution of the next state depends only on the current state. For Markov processes on continuous state spaces please use (markov-process) instead.

A Markov chain is a stochastic process on a discrete (finite or countably infinite) state space in which the distribution of the next state depends only on the current state. These objects show up in probability and computer science in both discrete-time and continuous-time models. For Markov processes on continuous state spaces please use (markov-process) instead.

A discrete-time Markov chain is a sequence of random variables $\{X_n\}_{n\geq1}$ with the Markov property, namely that the probability of moving to the next state depends only on the present state and not on the previous states, i.e. $$\mathbb P(X_{n+1}=x\mid X_{1}=x_{1},X_{2}=x_{2},\ldots ,X_{n}=x_{n})=\mathbb P(X_{n+1}=x\mid X_{n}=x_{n}),$$ if both conditional probabilities are well defined, i.e. if $\mathbb P(X_{1}=x_{1},\ldots ,X_{n}=x_{n})>0.$
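For readers who prefer code, here is a minimal simulation sketch of the definition above; the two-state transition matrix is purely illustrative.

```python
import numpy as np

# Minimal sketch: simulate a discrete-time Markov chain on a finite state
# space from a row-stochastic transition matrix P (illustrative values).
P = np.array([[0.9, 0.1],   # P[i, j] = probability of jumping from state i to state j
              [0.5, 0.5]])

rng = np.random.default_rng(0)

def simulate(P, x0, n_steps):
    """Return a trajectory of length n_steps + 1 started at state x0."""
    path = [x0]
    for _ in range(n_steps):
        # The next state is drawn from the row of the current state alone --
        # this is exactly the Markov property.
        path.append(int(rng.choice(len(P), p=P[path[-1]])))
    return path

print(simulate(P, x0=0, n_steps=20))
```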

5294 questions
113 votes · 6 answers

How often does it happen that the oldest person alive dies?

Today, we are brought the sad news that Europe's oldest woman died. A little over a week ago the oldest person in the U.S. unfortunately died. Yesterday, the Netherlands' oldest man died peacefully. The Gerontology Research Group keeps records:…
Řídící • 3,048
52 votes · 5 answers

Time to reach a final state in a random dynamical system (answer known, proof unknown)

Consider a dynamical system with state space $2^n$ represented as a sequence of $n$ black or white characters, such as $BWBB\ldots WB$. At every step, we choose a random pair $(i,j)$ with $i<j$…
39 votes · 9 answers

Probability brain teaser with infinite loop

I found this problem and I've been stuck on how to solve it. A miner is trapped in a mine containing 3 doors. The first door leads to a tunnel that will take him to safety after 3 hours of travel. The second door leads to a tunnel that will return…
user3.14259 • 677
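For reference, in the classic version of this puzzle the second door returns the miner after 5 hours and the third after 7 (those numbers are an assumption here, since the excerpt is cut off); first-step analysis then gives the expected time $E$ to safety:
$$E=\tfrac{1}{3}\cdot 3+\tfrac{1}{3}(5+E)+\tfrac{1}{3}(7+E)\quad\Longrightarrow\quad E=15\text{ hours}.$$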
31 votes · 3 answers

Is an ergodic Markov chain both irreducible and aperiodic, or just irreducible?

Some definitions I've found say: ergodic = irreducible. And then irreducible + aperiodic + positive recurrent gives a regular Markov chain. A Markov chain is called an ergodic chain if it is possible to go from every state to every state (not necessarily in one…
colinfang • 777
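Terminology here genuinely varies between textbooks, which is worth keeping in mind when comparing the answers; one common convention for a finite chain with transition matrix $P$ is
$$P \text{ regular} \iff \exists k:\ (P^k)_{ij}>0 \ \forall i,j \iff P \text{ irreducible and aperiodic},$$
while for a finite irreducible chain positive recurrence is automatic.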
29 votes · 2 answers

Nice references on Markov chains/processes?

I am currently learning about Markov chains and Markov processes as part of my study of stochastic processes. I feel there are so many properties of Markov chains, but the book I have makes me miss the big picture, and I might do better to look…
Tim • 43,663
28 votes · 5 answers

What is a Markov Chain?

What is an intuitive explanation of Markov chains, and how they work? Please provide at least one practical example.
28 votes · 5 answers

When the product of dice rolls yields a square

Succinct Question: Suppose you roll a fair six-sided die $n$ times. What is the probability that the product of the rolls is a square? Context: I used this as one question in a course for elementary school teachers when $n=2$, and thought the…
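One standard reduction for this problem (sketched here as a hypothetical approach, since the excerpt is cut off): the product is a square iff the exponents of $2$, $3$ and $5$ in its factorization are all even, so it suffices to track a parity vector in $(\mathbb Z/2)^3$, a Markov chain on $8$ states. A quick Monte-Carlo check of that reduction:

```python
import numpy as np

rng = np.random.default_rng(0)

# Parities of the exponents of (2, 3, 5) contributed by each die face.
FACE_PARITY = {1: (0, 0, 0), 2: (1, 0, 0), 3: (0, 1, 0),
               4: (0, 0, 0), 5: (0, 0, 1), 6: (1, 1, 0)}

def prob_square(n, trials=100_000):
    """Monte-Carlo estimate of P(product of n rolls is a perfect square)."""
    hits = 0
    for _ in range(trials):
        v = np.zeros(3, dtype=int)
        for face in rng.integers(1, 7, size=n):
            v ^= np.array(FACE_PARITY[int(face)])  # xor = addition mod 2
        hits += not v.any()                        # square iff all parities are 0
    return hits / trials

print(prob_square(2))  # enumerating all 36 pairs gives exactly 8/36 = 2/9
```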
27 votes · 6 answers

Why Markov matrices always have 1 as an eigenvalue

Also called a stochastic matrix. Let $A=[a_{ij}]$ be a matrix over $\mathbb{R}$ with $0\le a_{ij} \le 1$ for all $i,j$ and $\sum_{j}a_{ij}=1$ for all $i$, i.e. the sum along each row of $A$ is $1$. I want to show $A$ has an eigenvalue of $1$. The way I've seen…
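Whatever approach the question has in mind, the one-line argument under the row-stochastic convention is that the all-ones vector is an eigenvector:
$$(A\mathbf 1)_i=\sum_j a_{ij}=1=(\mathbf 1)_i \quad\Longrightarrow\quad A\mathbf 1=\mathbf 1;$$
for a column-stochastic matrix, apply the same to $A^{\mathsf T}$, which has the same eigenvalues.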
27 votes · 4 answers

Markov process vs. Markov chain vs. random process vs. stochastic process vs. collection of random variables

I'm trying to understand each of the above terms, and I'm having a lot of trouble deciphering the difference between them. According to Wikipedia: A Markov chain is a memoryless random process. A Markov process is a stochastic process, which…
27 votes · 7 answers

Die fixed so it can't roll the same number twice in a row, using Markov chains?

I'm studying for a probability test and the following question came up: A six-sided die is 'fixed' so that it cannot roll the same number twice consecutively. The other 5 sides each show up with probability $\frac{1}{5}$. Calculate P($X_{n+1} = 5 \mid…
simples123 • 343
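A minimal sketch of the transition matrix this question is after, assuming the states are the six faces: the diagonal is $0$ (no immediate repeats), every off-diagonal entry is $\frac15$, and $n$-step probabilities are matrix powers.

```python
import numpy as np

# "Fixed die" chain: never repeat the current face; the other 5 faces
# are equally likely. Rows index the current face, columns the next.
P = (np.ones((6, 6)) - np.eye(6)) / 5

# Distribution of the face after 10 rolls, starting from face 1 (row 0).
print(np.linalg.matrix_power(P, 10)[0])   # converges to the uniform 1/6 vector
```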
27 votes · 2 answers

Knight returning to corner on chessboard -- average number of steps

Context: My friend gave me a problem at breakfast some time ago. It is supposed to have an easy, trick-involving solution. I can't figure it out. Problem: Let there be a knight (horse) at a particular corner $(0,0)$ of an 8x8 chessboard. The knight…
SSF • 1,180
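The breakfast trick is very likely the standard return-time identity for a random walk on a graph: the stationary distribution is $\pi(v)=\deg(v)/2|E|$, and the expected return time to $v$ is $1/\pi(v)$. The knight's-move graph on an $8\times 8$ board has $168$ edges and a corner square has degree $2$, so
$$\mathbb E[\text{return time}]=\frac{2|E|}{\deg(v)}=\frac{336}{2}=168\text{ steps}.$$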
26 votes · 2 answers

Drunkard's walk on the $n^{th}$ roots of unity.

Fix an integer $n\geq 2$. Suppose we start at the origin in the complex plane, and on each step we choose an $n^{th}$ root of unity at random and go $1$ unit distance in that direction. Let $X_N$ be the distance from the origin after the $N^{th}$…
Eric Naslund • 69,703
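One quick observation: writing the position as $S_N=\sum_{k=1}^N \xi_k$ with i.i.d. steps $\xi_k$ uniform on the $n^{th}$ roots of unity (which have mean $0$ for $n\ge 2$), the cross terms vanish and
$$\mathbb E\,X_N^2=\mathbb E\,|S_N|^2=\sum_{j,k=1}^{N}\mathbb E\,\xi_j\overline{\xi_k}=N,$$
so the root-mean-square distance is exactly $\sqrt N$ regardless of $n$.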
24 votes · 1 answer

Average swaps needed for a random bubble sort algorithm

Suppose we have $n$ elements in a random permutation (each permutation has equal probability initially). While the elements are not fully sorted, we swap two adjacent elements at random (e.g. the permutation $(1, 3, 2)$ can go to $(1, 2, 3)$ or $(3,…
Kyky • 2,615
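A small simulation sketch of the process as described in the excerpt (the function name and the uniformly random choice of adjacent pair are assumptions on my part), useful for checking a conjectured formula on small $n$:

```python
import random

def random_bubble_steps(n, rng):
    """Count random adjacent swaps until the permutation is sorted."""
    perm = list(range(n))
    rng.shuffle(perm)
    target = list(range(n))
    steps = 0
    while perm != target:
        i = rng.randrange(n - 1)                     # pick a random adjacent pair
        perm[i], perm[i + 1] = perm[i + 1], perm[i]  # swap it
        steps += 1
    return steps

rng = random.Random(0)
# Monte-Carlo estimate of the expected number of swaps for n = 4.
print(sum(random_bubble_steps(4, rng) for _ in range(20_000)) / 20_000)
```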
23 votes · 5 answers

Good introductory book for Markov processes

Which is a good introductory book for Markov chains and Markov processes? Thank you.
lmsasu • 354
22 votes · 2 answers

What happens to a random walk when we increase the probabilities of going right?

Consider a random walk on the integers where the probability of transitioning from $n$ to $n+1$ is $p_n$ (and of course, the probability of transitioning from $n$ to $n-1$ is $1-p_n$); we assume all $p_n$ are strictly less than $1$. Suppose we know…
yves • 221
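For context, the classical recurrence criterion for the one-sided version of such a birth-death walk (on $\{0,1,2,\dots\}$, with the convention $q_n=1-p_n$) is
$$\text{the walk is recurrent} \iff \sum_{n\ge 1}\ \prod_{k=1}^{n}\frac{q_k}{p_k}=\infty,$$
and pushing the $p_n$ up can tip this sum from divergent to convergent, turning a recurrent walk transient.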