Questions tagged [markov-process]

A stochastic process satisfying the Markov property: the distribution of future states given the current state does not depend on the past states. Use this tag for general state space processes (in both discrete and continuous time); use (markov-chains) for countable state space processes.


2674 questions
92 votes · 3 answers

What is the importance of the infinitesimal generator of Brownian motion?

I have read that the infinitesimal generator of Brownian motion is $\frac{1}{2}\small\triangle$. Unfortunately, I have no background in semigroup theory, and the expositions of semigroup theory I have found lack any motivation or intuition. What is…
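A quick heuristic for why the generator is $\frac{1}{2}\Delta$ (a sketch, not a rigorous semigroup argument): Taylor-expand $f$ around $x$ and use the moments of a Brownian increment, $\mathbb E[B_t - x] = 0$ and $\mathbb E[(B_t - x)^2] = t$ under $\mathbb P^x$:

$$\mathbb E^x[f(B_t)] = f(x) + f'(x)\,\mathbb E[B_t - x] + \tfrac12 f''(x)\,\mathbb E[(B_t - x)^2] + o(t) = f(x) + \tfrac{t}{2}\,f''(x) + o(t),$$

so

$$(\mathcal A f)(x) = \lim_{t\downarrow 0}\frac{\mathbb E^x[f(B_t)] - f(x)}{t} = \tfrac12 f''(x),$$

which in $d$ dimensions becomes $\tfrac12\Delta f(x)$.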
30 votes · 4 answers

Markov process vs. markov chain vs. random process vs. stochastic process vs. collection of random variables

I'm trying to understand each of the above terms, and I'm having a lot of trouble deciphering the difference between them. According to Wikipedia: A Markov chain is a memoryless random process. A Markov process is a stochastic process, which…
30 votes · 3 answers

'Intuitive' difference between Markov Property and Strong Markov Property

It seems that similar questions have come up a few times regarding this, but I'm struggling to understand the answers. My question is a bit more basic: can the difference between the strong Markov property and the ordinary Markov property be…
27 votes · 6 answers

Why Markov matrices always have 1 as an eigenvalue

Also called a stochastic matrix. Let $A=[a_{ij}]$ be a matrix over $\mathbb{R}$ with $0\le a_{ij} \le 1$ for all $i,j$ and $\sum_{j}a_{ij}=1$ for all $i$, i.e. the sum along each row of $A$ is 1. I want to show $A$ has an eigenvalue of 1. The way I've seen…
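The row-sum condition already exhibits the eigenvector: since each row of a stochastic matrix sums to 1, the all-ones vector satisfies $A\mathbf{1}=\mathbf{1}$, so 1 is an eigenvalue. A minimal numerical check in pure Python (the matrix below is my own illustrative example):

```python
def mat_vec(A, v):
    """Multiply matrix A (given as a list of rows) by vector v."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

# Example row-stochastic matrix (each row sums to 1), chosen for illustration.
A = [[0.5, 0.3, 0.2],
     [0.1, 0.6, 0.3],
     [0.4, 0.4, 0.2]]

ones = [1.0] * len(A)
# A applied to the all-ones vector returns the row sums, i.e. the
# all-ones vector again: A 1 = 1, so 1 is an eigenvalue.
print(mat_vec(A, ones))
```

The same identity applied to $A^\top$ shows why the stationary distribution corresponds to the eigenvalue 1 of the transpose.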
19 votes · 2 answers

Probability of Sisyphus laboring forever

Zeus has decreed that Sisyphus must spend each day removing all the rocks in a certain valley and transferring them to Mount Olympus. Each night, each rock Sisyphus places on Mount Olympus is subject to the whims of Zeus: it will either be…
18 votes · 2 answers

Interpretation for the determinant of a stochastic matrix?

Is there a probabilistic interpretation for the determinant of a stochastic matrix (i.e. an $n \times n$ matrix whose columns sum to unity)?
18 votes · 2 answers

Expected number of steps between states in a Markov Chain

Suppose I am given a state space $S=\{0,1,2,3\}$ with transition probability matrix $\mathbf{P}= \begin{bmatrix} \frac{2}{3} & \frac{1}{3} & 0 & 0 \\[0.3em] \frac{2}{3} & 0 & \frac{1}{3} & 0\\[0.3em] \frac{2}{3} & 0 & 0…
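Expected hitting times for a target state $t$ solve the linear system $h_i = 1 + \sum_{j} p_{ij}\,h_j$ with $h_t = 0$. A sketch in pure Python using exact `Fraction` arithmetic; the three-state chain below is my own illustrative example, not the (truncated) matrix from the question:

```python
from fractions import Fraction as F

def expected_hitting_times(P, target):
    """Solve (I - Q) h = 1 by Gaussian elimination, where Q is P
    restricted to the non-target states and h_target = 0."""
    states = [i for i in range(len(P)) if i != target]
    n = len(states)
    M = [[(F(1) if a == b else F(0)) - F(P[a][b]) for b in states]
         for a in states]
    rhs = [F(1)] * n
    # Forward elimination with partial pivoting.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            rhs[r] -= f * rhs[col]
            for c in range(col, n):
                M[r][c] -= f * M[col][c]
    # Back substitution.
    h = [F(0)] * n
    for r in reversed(range(n)):
        h[r] = (rhs[r] - sum(M[r][c] * h[c] for c in range(r + 1, n))) / M[r][r]
    return dict(zip(states, h))

# Illustrative chain with absorbing target state 2 (my own example).
P = [[F(1, 2), F(1, 2), F(0)],
     [F(0),    F(1, 2), F(1, 2)],
     [F(0),    F(0),    F(1)]]
print(expected_hitting_times(P, target=2))  # h_0 = 4, h_1 = 2
```

For this chain, $h_1 = 1 + \tfrac12 h_1$ gives $h_1 = 2$, and $h_0 = 1 + \tfrac12 h_0 + \tfrac12 h_1$ gives $h_0 = 4$, matching the solver.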
17 votes · 1 answer

Markov chains: is "aperiodic + irreducible" equivalent to "regular"?

I have two books on stochastic processes. One book says that the limiting matrix can be found if the matrix is regular, that is, if $P^n$ has only positive entries for some $n$. The other book says that the limiting values are possible…
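Regularity is decidable by checking a single power: by Wielandt's theorem, a nonnegative $m\times m$ matrix is primitive (equivalently, the chain is regular) iff $P^{(m-1)^2+1}$ is strictly positive. A sketch in pure Python that works only with the positivity pattern, to avoid floating-point underflow (the two example chains are my own):

```python
def is_regular(P):
    """Check regularity via Wielandt's bound: P is regular iff
    P**((m-1)**2 + 1) has all entries positive.  Only the boolean
    positivity pattern is tracked."""
    m = len(P)
    B = [[p > 0 for p in row] for row in P]

    def mul(X, Y):  # boolean matrix product: positivity of the real product
        return [[any(X[i][k] and Y[k][j] for k in range(m)) for j in range(m)]
                for i in range(m)]

    power = [[i == j for j in range(m)] for i in range(m)]  # identity pattern
    for _ in range((m - 1) ** 2 + 1):
        power = mul(power, B)
    return all(all(row) for row in power)

# Periodic two-state flip chain: irreducible but not aperiodic, so not regular.
flip = [[0.0, 1.0], [1.0, 0.0]]
# Lazy chain: irreducible and aperiodic, hence regular.
lazy = [[0.5, 0.5], [0.5, 0.5]]
print(is_regular(flip), is_regular(lazy))  # False True
```

The flip chain shows why "aperiodic" is needed on top of "irreducible": every power of it alternates between the identity pattern and the flip pattern, so no power is strictly positive.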
14 votes · 0 answers

Is there a Measure-Theoretic Proof of this new Result from Categorical Probability?

Recently, I stumbled across a new paper in categorical probability. Interestingly, they prove a result which may be formulated in purely measure-theoretic terms about which they note that "As far as we know, this strengthening is new, and in…
14 votes · 5 answers

Probability that all lights $\mathbf{X}=(X_1,X_2,\cdots)$ turn off again when at each step the light with number $n\sim\text{geom}(\frac{1}{2})$ is toggled.

Problem: Let $\mathbf{X} = (\mathbb{Z}_2)^\mathbb N$, i.e., $\mathbf{X} = (X_1,X_2,\cdots,X_N,\cdots)$, $X_i\in \{0,1\}$. It can be thought of as a countable collection of lightbulbs: $0$ means off, $1$ means on. We start with $\mathbf{X}_0 = 0$. Keep generating…
14 votes · 1 answer

Motivation of Feynman-Kac formula and its relation to Kolmogorov backward/forward equations?

Kolmogorov backward/forward equations are pdes, derived for the semigroups constructed from the Markov transition kernels. Feynman-Kac formula is also a pde corresponding to a stochastic process defined by a SDE. But I was wondering if the…
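For reference, one standard statement of the connection (a sketch, assuming sufficient regularity of the coefficients): if $dX_s = \mu(X_s)\,ds + \sigma(X_s)\,dW_s$ and

$$u(t,x) = \mathbb E\!\left[\,e^{-\int_t^T V(X_s)\,ds} f(X_T) \,\middle|\, X_t = x\right],$$

then $u$ solves the backward PDE

$$\partial_t u + \mu\,\partial_x u + \tfrac12\sigma^2\,\partial_{xx} u - V u = 0, \qquad u(T,x) = f(x).$$

With $V = 0$ this is exactly the Kolmogorov backward equation for the transition semigroup of $X$, so Feynman–Kac can be read as the backward equation with an extra discounting (killing) term $-Vu$.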
13 votes · 1 answer

Hilbert's Barber Shop

Hilbert opens a barber shop with an infinite number of chairs and an infinite number of barbers. Customers arrive via a Poisson random process with an expected 1 person every 10 minutes. Upon arrival, they sit in the first unoccupied chair and their…
13 votes · 2 answers

Time-dependent transition probabilities

I need to solve the following question, but I got stuck. I would really appreciate it if someone could help me with it! Here is the question: A well-disciplined man, who smokes exactly one half of a cigar each day, buys a box containing $N$ cigars.…
13 votes · 2 answers

Is every Markov Process a Martingale Process?

According to the definition (2.3.6) of a Markov Process in Shreve's book titled Stochastic Calculus for Finance II: Let $(\Omega,\mathcal F,\mathbb P)$ be a probability space, let $T$ be a fixed positive number, and let $\mathcal F(t)$, $0\leqslant…
13 votes · 2 answers

Difference in probability distributions from two different kernels

I wonder: if the probability kernels of two Markov processes on the same state space are close enough, are the probabilities of events that depend only on the first $n$ values of the process also close? More formally, let $(E,\mathscr E)$ be a…
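One standard way such a statement is proved (a sketch, under the assumption that the two kernels satisfy $\sup_x \|P_1(x,\cdot) - P_2(x,\cdot)\|_{TV} \le \varepsilon$) is a telescoping argument over the $n$ steps:

$$\|\mu P_1^n - \mu P_2^n\|_{TV} \;\le\; \sum_{k=0}^{n-1} \bigl\|\mu P_1^{k}(P_1 - P_2)P_2^{\,n-k-1}\bigr\|_{TV} \;\le\; n\,\varepsilon,$$

using that applying a Markov kernel never increases total variation distance. The bound degrades linearly in $n$, which matches the intuition that small per-step kernel errors can accumulate over the horizon.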