Coming a little late to the party, I will work along the same lines as @d.k.o did with the underlying random process, except that I will prove that:
$$E(X_{n+1}|\sigma (X_1,...,X_n))=X_n$$
directly without using the lemma he quotes.
Using your notation, let $F_n = \sigma (X_1,\dots,X_n)$, let $\{U_n\}_{n=1}^\infty$ be a sequence of i.i.d. uniform random variables on $[0,1]$, and define:
$$Y_n = 1\{U_n \leq X_{n-1}\}$$
that is, $Y_n$ is $1$ if the ball drawn in the $n$-th trial is red, and $0$ otherwise.
We have $X_0 = \frac{r}{r + b}$, and:
$$X_n = \frac{r + t\sum_{i=1}^{n} Y_i}{r + b + tn}$$
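(As a side note, this construction is easy to simulate. Below is a minimal Python sketch, with arbitrary hypothetical values for $r$, $b$, $t$, that draws the $U_n$'s and computes $X_n$ from the closed form above; it is just a sanity check, not part of the proof.)

```python
import random

r, b, t = 3.0, 2.0, 1.0   # initial red balls, initial black balls, balls added per draw (hypothetical values)
n_steps = 20

x = r / (r + b)            # X_0
ys = []                    # realized Y_1, ..., Y_n
for n in range(1, n_steps + 1):
    u = random.random()            # U_n ~ Uniform[0, 1]
    y = 1 if u <= x else 0         # Y_n = 1{U_n <= X_{n-1}}
    ys.append(y)
    x = (r + t * sum(ys)) / (r + b + t * n)   # closed form for X_n

print(x)   # proportion of red balls after n_steps draws
```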
First, note that each $Y_i$, $i \in \{1,\dots,n\}$, is $F_n$-measurable: from the closed form above, $X_i(r+b+ti) - X_{i-1}(r+b+t(i-1)) = tY_i$, so each $Y_i$ is a measurable function of $X_{i-1}$ and $X_i$ (with $X_0$ a constant), hence measurable with respect to $\sigma(X_1, X_2, \dots, X_n) = F_n$.
So one gets:
$$E(X_{n+1}|F_n)= E\left(\frac{r + t\sum_{i=1}^{n+1} Y_i}{r + b + t(n + 1)}\,\middle|\,F_n\right)=\frac{r + t\sum_{i=1}^{n} Y_i}{r + b + t(n+1)} + \frac{t\,E(Y_{n+1}|F_n)}{r+b+t(n+1)}$$
Now we want to prove that: $E(Y_{n+1}|F_n) = X_n$.
We will prove that $\forall B \in F_n$: $E(Y_{n+1}\,\mathbf{1}_ {B}) = E(X_n\,\mathbf{1}_ {B})$, which (together with the $F_n$-measurability of $X_n$) is a characterization of the conditional expectation of a random variable with respect to a given $\sigma$-algebra.
First, consider the set of binary strings $W_n = \{(w_1, w_2,\dots, w_n): w_i\in \{0,1\}, \forall i\in \{1,\dots,n\} \}$ and define:
$\forall w\in W_n : B_w = \{Y_1=w_1, Y_2=w_2,\dots,Y_n=w_n \}$. Note that for any $w \neq w'$ in $W_n$, $B_{w} \cap B_{w'} = \varnothing$.
$\forall w\in W_n$:
$$Y_{n+1}\,\mathbf{1}_ {B_w} = 1\{U_{n+1} \leq \frac{r + t\sum_{i=1}^{n} Y_i}{r + b + tn}\}\cdot 1\{Y_1=w_1, Y_2=w_2,\dots,Y_n=w_n \} = 1\{U_{n+1} \leq p_w \}\cdot\mathbf{1}_ {B_w}$$
where $p_w = \frac{r + t\sum_{i=1}^{n} w_i}{r + b + tn}$, since on $B_w$ we have $\sum_{i=1}^{n} Y_i = \sum_{i=1}^{n} w_i$.
Now $U_{n+1}$ is independent of $(U_1,\dots,U_n)$, hence of $B_w$, and $1\{U_{n+1} \leq p_w\}$ follows a Bernoulli law of parameter $p_w$. Therefore:
$$E(Y_{n+1}\,\mathbf{1}_ {B_w}) = p_w\,P(B_w) = E(p_w\,\mathbf{1}_ {B_w}) = E\left(\frac{r + t\sum_{i=1}^{n} Y_i}{r + b + tn}\,\mathbf{1}_ {B_w}\right) = E(X_n\,\mathbf{1}_ {B_w})$$
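(If you want to see this identity numerically: the following Monte Carlo sketch, with a hypothetical pattern `w = (1, 0, 1, 1)` and the same hypothetical parameters as before, estimates both $E(Y_{n+1}\,\mathbf{1}_ {B_w})$ and $E(X_n\,\mathbf{1}_ {B_w})$; both estimates should come out close to $p_w\,P(B_w)$.)

```python
import random

r, b, t, n = 3.0, 2.0, 1.0, 4
w = (1, 0, 1, 1)          # a fixed pattern in W_n (arbitrary choice)
trials = 200_000

lhs = rhs = 0.0
for _ in range(trials):
    x, ys = r / (r + b), []
    for i in range(1, n + 1):
        ys.append(1 if random.random() <= x else 0)   # Y_i
        x = (r + t * sum(ys)) / (r + b + t * i)       # X_i
    on_Bw = 1 if tuple(ys) == w else 0                # indicator of B_w
    y_next = 1 if random.random() <= x else 0         # Y_{n+1} = 1{U_{n+1} <= X_n}
    lhs += y_next * on_Bw                             # accumulates Y_{n+1} * 1_{B_w}
    rhs += x * on_Bw                                  # accumulates X_n * 1_{B_w}

print(lhs / trials, rhs / trials)   # both ≈ p_w * P(B_w)
```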
Now, the sets $\{B_w, w\in W_n\}$ form a finite partition of the sample space, and they are exactly the atoms of $F_n = \sigma(X_1,X_2,\dots,X_n)$ (each $X_i$ is a function of $(Y_1,\dots,Y_i)$ and conversely, by the measurability argument above). So for any $B\in F_n$ there are $w^{(1)}, w^{(2)},\dots,w^{(p)} \in W_n$ such that:
$$B = \bigcup_{i=1}^{p} B_{w^{(i)}}, \qquad B_{w^{(i)}}\cap B_{w^{(j)}} = \varnothing, \quad \forall i \neq j.$$
So one gets $\mathbf{1}_ {B} = \sum_{i = 1}^p \mathbf{1}_ {B_{w^{(i)}}}$ and, by linearity of the expectation:
$$E(Y_{n+1}\,\mathbf{1}_ {B}) = \sum_{i=1}^p E(Y_{n+1}\,\mathbf{1}_ {B_{w^{(i)}}}) = \sum_{i=1}^p E(X_n\,\mathbf{1}_ {B_{w^{(i)}}}) = E(X_n\,\mathbf{1}_ {B})$$
So we get that:
$$E(Y_{n+1}|F_n) = X_n$$
and plugging this into the first computation gives:
$$E(X_{n+1}|F_n)= E\left(\frac{r + t\sum_{i=1}^{n+1} Y_i}{r + b + t(n + 1)}\,\middle|\,F_n\right)=\frac{r + t\sum_{i=1}^{n} Y_i}{r + b + t(n+1)} + \frac{t\,X_n}{r+b+t(n+1)} = \frac{r + b + tn}{r + b + t(n+1)}\,X_n + \frac{t\,X_n}{r+b+t(n+1)} = X_n$$
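As a corollary, taking expectations on both sides and iterating gives $E(X_n) = E(X_0) = \frac{r}{r+b}$ for every $n$. The following Monte Carlo sketch (same hypothetical parameters as above) illustrates this consequence numerically:

```python
import random

r, b, t, N = 3.0, 2.0, 1.0, 50   # same hypothetical parameters
trials = 100_000

total = 0.0
for _ in range(trials):
    x, s = r / (r + b), 0
    for n in range(1, N + 1):
        s += 1 if random.random() <= x else 0   # running sum of the Y_i
        x = (r + t * s) / (r + b + t * n)       # X_n
    total += x

print(total / trials, r / (r + b))  # Monte Carlo estimate of E(X_N) vs X_0
```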