Consider the stochastic process with $X_0=0$ and
$$ X_t= \begin{cases} 0 & \text{ for } \ \ t<\tau_1 \\ 1 & \text{ for } \ \ \tau_1\leq t < \tau_1+\tau_2 \\ 2 & \text{ for } \ \ \tau_1+\tau_2\leq t \end{cases} $$ where the $\tau_i\sim \mathcal{U}(0,1]$ are i.i.d. uniform on $(0,1].$
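For concreteness, here is a minimal Python sketch (my own illustration, not part of the problem statement; the helper name `sample_path` is made up) that draws one path of this process:

```python
import random

def sample_path():
    """Draw the two uniform holding times and return a function t -> X_t."""
    # random.uniform(0, 1) approximates U(0,1]; the endpoint makes no practical difference.
    tau1 = random.uniform(0.0, 1.0)   # time spent in state 0
    tau2 = random.uniform(0.0, 1.0)   # time spent in state 1

    def X(t):
        if t < tau1:
            return 0
        elif t < tau1 + tau2:
            return 1
        return 2

    return X

X = sample_path()
print([X(t) for t in (0.0, 0.5, 1.0, 1.5, 2.0)])   # the last entry is always 2
```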
My conclusion: This is not a Markov process. Of course it is a valid stochastic process with some nice properties, e.g. $X_0=0$, it makes exactly two jumps, and $X_t=2$ with certainty for $t\geq 2$.
Question: Is that conclusion correct? And does it follow from the work below?
I want to show that $$ P(X_{T+s}\in A\mid X_t \text{ for } t\leq T)= P(X_{T+s}\in A\mid X_T) \tag{1} $$ does not always hold.
For example, let $A=\{2\}$ and $T=s=0.5$. If we know the complete path of the process up to time $T=0.5$, then we know the exact value of $\tau_1$, i.e. the exact moment the process jumped to state $1$. Let that time be $u\leq 0.5$, so that $X_{0.5}=1$. Knowing the path up to time $0.5$ also tells us that the second jump has not happened yet, i.e. $\tau_2>0.5-u$. The left side of $(1)$ is therefore $$ \begin{aligned} P(X_{1}=2\mid X_t \text{ for } t\leq 0.5,\ X_{0.5}=1) &=P(\tau_2\leq 1-u \mid \tau_1=u,\ \tau_2>0.5-u)\\ &= \frac{P(0.5-u<\tau_2\leq 1-u)}{P(\tau_2>0.5-u)}\\ &= \frac{0.5}{0.5+u}=\frac{1}{1+2u} . \end{aligned} $$ We could stop here: $u$ is the specific value of $\tau_1$ realized by this path, so the answer depends precisely on when the first jump occurred, which is more information than the value $X_{0.5}=1$ alone. Thus we conclude that this is not a Markov process. However, we'll go ahead and calculate the right side of $(1)$ in this case:
$$ \begin{aligned} P(X_{1}=2\mid X_{0.5}=1) &=P(\tau_1+\tau_2\leq 1 \mid \tau_1\leq 0.5<\tau_1+\tau_2)\\ &=\frac{\int_0^{0.5} P(0.5-u<\tau_2\leq 1-u)\, du}{\int_0^{0.5} P(\tau_2>0.5-u)\, du}\\ &= \frac{\int_0^{0.5} 0.5\, du}{\int_0^{0.5} (0.5+u)\, du} = \frac{1/4}{3/8} = \frac{2}{3}. \end{aligned} $$
The right side is the constant $2/3$, while the left side $\frac{1}{1+2u}$ varies with $u$; the two sides disagree whenever $u\neq 1/4$. This shows that the Markov property is not satisfied.
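As a sanity check, here is a quick Monte Carlo sketch of my own (the function names are made up for illustration): conditioning on the full history with $\tau_1=u$ should give roughly $\frac{1}{1+2u}$, while conditioning only on $X_{0.5}=1$ should give roughly $2/3$.

```python
# Monte Carlo sanity check (my own sketch, not part of the original argument).
# Estimates the two conditional probabilities computed above.
import random

N = 200_000

def lhs_given_history(u, n=N):
    """Estimate P(X_1 = 2 | tau_1 = u and no second jump by t = 0.5),
    i.e. tau_2 > 0.5 - u.  Expected value: 0.5 / (0.5 + u) = 1 / (1 + 2u)."""
    hits = trials = 0
    while trials < n:
        tau2 = random.uniform(0.0, 1.0)
        if tau2 > 0.5 - u:              # condition: still in state 1 at t = 0.5
            trials += 1
            hits += (u + tau2 <= 1.0)   # event: X_1 = 2
    return hits / n

def rhs_given_state(n=N):
    """Estimate P(X_1 = 2 | X_{0.5} = 1).  Expected value: 2/3."""
    hits = trials = 0
    while trials < n:
        tau1 = random.uniform(0.0, 1.0)
        tau2 = random.uniform(0.0, 1.0)
        if tau1 <= 0.5 < tau1 + tau2:     # condition: X_{0.5} = 1
            trials += 1
            hits += (tau1 + tau2 <= 1.0)  # event: X_1 = 2
    return hits / n

print(lhs_given_history(0.1))   # ~ 0.833 = 1/(1 + 2*0.1)
print(lhs_given_history(0.4))   # ~ 0.556 = 1/(1 + 2*0.4)
print(rhs_given_state())        # ~ 0.667 = 2/3
```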