In the book *Continuous-Time Markov Processes: An Introduction* by Thomas M. Liggett, on page 22 (Section 1.7), the author introduces the notation $E^x$, corresponding to a probability measure $P^x$, which is the law of $x + B$, where $B$ is a Brownian motion.
I found a helpful discussion in this Math Stack Exchange post, but I still have some confusion regarding two points:
1. Definition of $E^x(Y)$ for a general bounded random variable $Y$:
- From the book, I see that $E^x(B(t)) = E(x+B(t))$, which suggests that the expectation under $P^x$ simply shifts the process by $x$.
- However, for a general bounded random variable $Y$, how is $E^x(Y)$ defined? Is it always true that $E^x(Y) = E(Y + x)$? In particular, since $Y$ is not necessarily a function of the Brownian motion $B$, the expression $\int Y \, dP^x$ seems meaningless to me. (I write out my best guess below.)
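For concreteness, here is the reading I would guess is intended, assuming (my assumption, not something stated explicitly on that page) that $Y$ is a bounded measurable function on the path space $C[0,\infty)$ on which $P^x$ lives:
$$ E^x(Y) = \int_{C[0,\infty)} Y(\omega)\, dP^x(\omega) = E\big(Y(x+B)\big). $$
Taking $Y(\omega) = \omega(t)$ recovers $E^x(B(t)) = E(x+B(t))$, but in general this gives $E\big(Y(x+B)\big)$ rather than $E(Y+x)$. Is this the correct definition?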
2. Understanding the Markov property (Theorem 1.46) and the meaning of $E^{X(s)}(Y)$:
- The theorem states the Markov property in terms of $E^{X(s)}(Y)$, but I am struggling to interpret this expression.
- It seems that $E^{X(s)}(Y)$ modifies the expectation by shifting the starting point of the process, rather than taking the form of the more typical statement of the Markov property via conditional expectation,
$$ E(Y \mid \mathcal{F}_s) = E(Y \mid B(s)), $$
where $Y$ is measurable with respect to the future of the Brownian motion from time $s$ onward.
- How exactly should one interpret $E^{X(s)}(Y)$, and how does this formulation relate to the classical Markov property? My attempted reading is below.
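To fix notation for my attempted reading: writing $\theta_s$ for the shift map $(\theta_s \omega)(t) = \omega(s+t)$ on path space (my notation, taken from the usual canonical setup rather than copied from the book), I would guess the theorem asserts
$$ E^x\big(Y \circ \theta_s \mid \mathcal{F}_s\big) = E^{X(s)}(Y) \qquad P^x\text{-a.s.}, $$
where the right-hand side means $\varphi(X(s))$ for the deterministic function $\varphi(y) = E^y(Y)$, evaluated at the random point $X(s)$. Is this the right way to read $E^{X(s)}(Y)$, and is it equivalent to the conditional-expectation formulation above?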
I would appreciate any clarifications on these points!