
We define the continuous-time, multi-type branching process $(X(t))_{t\ge0}$ as follows: $X(0)=\alpha\in\mathbb{R}^d$, where $\alpha=(\alpha_1,\dots,\alpha_d)$ is the initial composition of the urn, meaning that, at time $0$, there are $\alpha_i$ particles of type $i$ alive in the system. Each particle reproduces (or “splits”) independently of the rest at rate $1$, and at a reproduction event triggered by a particle of type $i$, we add to the system $R_{i,j}$ particles of type $j$, for all $1\le i,j\le d$, where $R$ is the replacement matrix of the urn process $X(t)$.

For all $1\le i\le d$, we let $X^{(i)}$ be the urn process with initial composition $e_i$ (the $i$-th vector of the canonical basis of $\mathbb{R}^d$, equal to $1$ in the $i$-th coordinate and $0$ in all the others) and replacement matrix $R$. For example, $X^{(1)}$ has initial composition $e_1=(1,0,\dots,0)$.

We want to calculate $\mathbb{E}[X^{(i)}(t)]$ for all $t\ge 0$. This is given by $$\mathbb{E}[X^{(i)}(t)]=e_ie^{-t}+\int^t_0 e^{-s}\,\mathbb{E}\bigg[\sum^d_{j=1}\sum^{R_{i,j}+\delta_{i,j}}_{k=1}X^{(j,k)}(t-s)\bigg]\text{d}s,$$ where, for all $1\le j\le d$, $(X^{(j,k)})_{k\ge1}$ is a sequence of i.i.d. copies of $X^{(j)}$, and the double-indexed family $(X^{(j,k)})_{j,k\ge1}$ consists of independent processes.

This apparently comes from the following reasoning: "We look at the time when the ball in the urn at time zero splits (with probability $e^{-t}$, the initial ball hasn’t split yet at time $t$)"

I don't understand how to get this formula. I would split the expectation over two events: "the ball hasn't split yet at time $t$" and "the ball splits before time $t$". The first term $e_ie^{-t}$ corresponds to the first event, while for the second event I should get the integral written above. The term $e^{-s}$ is the density of the split time, which is exponentially distributed with rate $1$, but I don't know how to get $\mathbb{E}\big[\sum^d_{j=1}\sum^{R_{i,j}+\delta_{i,j}}_{k=1}X^{(j,k)}(t-s)\big]$.

The mechanism I know is that, if a ball of type $i$ splits at time $s<t$, it is replaced by $R_{i,j}+\delta_{i,j}$ balls of type $j$. Any help on how I should get that formula?

Reference: PÓLYA URNS AND OTHER REINFORCEMENT PROCESSES

Dada
  • I don't understand what you mean by $X^{(i)}$, or what "initial composition $e_i$ means -- $X$ is a random variable taking values in $R^d$, is $X^{(i)}$ not a random variable taking values in $R$, whose values equal the $i$th component of $X$? If not, could you clarify what $X^{(i)}$ is then? It's even more unclear to me what the random variables $X^{(j,k)}$ are supposed to represent. Presumably how all of these random variables are defined and related to one another is crucial to the proof of the formula, but none of that information is (in my opinion) very clear currently from the question – Chill2Macht Apr 13 '24 at 14:49
  • Also, are the first two paragraphs a quote from a source? (If so, could you format the question to reflect this, use ">" in the editor.) If so, which source? (In case someone interested wants to look there for clarifying context.) And only the beginning of the third paragraph is the actual question? But then it and the fourth paragraph are intermixed with an attempt to get the answer, followed by the question being restated again. So in short it would help to organize the question better, so people know what to focus on and what to allow their attention to drift away from when necessary. – Chill2Macht Apr 13 '24 at 14:53
  • @Chill2Macht Thank you for your remark, I tried to edit my question in order to leave it a bit clearer. – Dada Apr 13 '24 at 15:48
  • This is similar to the renewal equation from renewal theory, and I think the proofs are similar. It might be helpful to read about the renewal equation, then come back to this. – Ziv Apr 15 '24 at 11:52
  • @Ziv Do you have any good reference where you find this similar result? I looked a bit around but didn't really find anything like it. – Dada Apr 16 '24 at 08:57
  • The "condition on the time of the first arrival" idea is used in the proof of claim 22(a) on this page (though it states things using convolutions): https://www.randomservices.org/random/renewal/Equations.html#ren3 – Ziv Apr 17 '24 at 11:42

1 Answer


As a toy example, consider a Yule process $(Y_t,\, t\geq 0)$. Its law can be described as follows:

  • $Y_0=1$, a.s.
  • If $Y_t=n$, then the transition to $n+1$ happens at rate $n$.

This is very similar to your urn process. You can think of this as a population evolving in continuous time where every individual branches independently into two at rate $1$ and their offspring evolves independently.

Let's compute $\mathbb{E}[Y(t)]$ by conditioning on the time of the first branching event, which we denote by $\tau$: $$m(t) := \mathbb{E}[Y(t)] = \mathbb{P}(\tau > t) + \int_0^t \mathbb{E}[Y(t) \mid \tau = s]\, \mathbb{P}(\tau \in ds),$$ where the second equality follows from conditioning on the value of $\tau$. Next, observe that $\tau$ is exponentially distributed with rate $1$, and that the two offspring created at time $\tau$ evolve independently. Hence, for an independent copy $Y'$ of $Y$, the Markov property gives $$\mathbb{E}[Y(t) \mid \tau = s] = \mathbb{E}[Y(t-s)+Y'(t-s)] = 2\, \mathbb{E}[Y(t-s)] = 2 m(t-s).$$ Combining this with the distribution of $\tau$, we obtain the following integral equation (the renewal equation) for $m$: $$m(t) = e^{-t}+\int_0^t 2m(t-s)e^{-s}\,ds.$$ Together with $m(0)=1$, this yields $m(t)=e^t$ (hint: differentiate).
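As a quick sanity check (a minimal sketch, not part of the original answer), the Yule process can be simulated directly with the standard library. The key fact the code uses is the one from the derivation above: when $n$ particles are alive, each splits at rate $1$, so the time to the next split is exponential with rate $n$.

```python
import math
import random

def simulate_yule(t, rng):
    """Return Y(t) for one run of a Yule process started from one particle.

    When n particles are alive, each splits independently at rate 1, so by
    the memoryless property the time to the next split is Exp(n).
    """
    n = 1
    clock = 0.0
    while True:
        clock += rng.expovariate(n)  # waiting time for the next split
        if clock > t:
            return n
        n += 1  # one particle became two

def mean_yule(t, trials=20000, seed=0):
    """Monte Carlo estimate of m(t) = E[Y(t)]."""
    rng = random.Random(seed)
    return sum(simulate_yule(t, rng) for _ in range(trials)) / trials

# The empirical mean should be close to the renewal-equation solution e^t.
print(mean_yule(1.0), math.exp(1.0))
```

With 20,000 trials the Monte Carlo error is well below the gap one would notice, so the printed values agree to within a few percent.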


The exact same approach works for multi-type branching processes. The difference is what you get for the offspring: in the Yule process a split produces two independent copies $Y, Y'$ of the same process, whereas in your process a split of a type-$i$ particle produces $$1+ \sum_{j=1}^d R_{i,j}$$ new processes: one for the initial particle (which keeps its type $i$, accounting for the $\delta_{i,j}$ term) and $R_{i,j}$ for the new particles of type $j$. The processes emanating from these particles are denoted by $X^{(j,k)}$. They all evolve independently starting from time $s$, so by the Markov property there is $t-s$ time left until time $t$. That is why, in the end, you sum over $X^{(j,k)}(t-s)$.
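To make the multi-type case concrete, here is a hedged sketch (the $2\times 2$ replacement matrix below is a made-up example, not from the question). Differentiating the renewal equation gives $m_i'(t)=\sum_j R_{i,j}\, m_j(t)$, so $\mathbb{E}[X^{(i)}(t)]$ is the $i$-th row of $e^{tR}$; for the triangular $R$ chosen here that row has the closed form $(e^t,\; e^{2t}-e^t)$, which a Gillespie-style simulation reproduces.

```python
import math
import random

# Hypothetical 2-type replacement matrix, for illustration only:
# a split of a type-i particle ADDS R[i][j] particles of type j
# (the splitting particle itself survives, which is the delta_{i,j} term).
R = [[1, 1],
     [0, 2]]

def simulate_urn(alpha, R, t, rng):
    """Gillespie simulation of X(t) started from composition alpha."""
    counts = list(alpha)
    clock = 0.0
    while True:
        total = sum(counts)
        clock += rng.expovariate(total)  # each particle splits at rate 1
        if clock > t:
            return counts
        # The splitting particle is uniform among all particles, so its
        # type is i with probability counts[i] / total.
        u = rng.random() * total
        i = 0 if u < counts[0] else 1
        for j in range(len(counts)):
            counts[j] += R[i][j]

def mean_urn(alpha, R, t, trials=20000, seed=1):
    """Monte Carlo estimate of E[X(t)], componentwise."""
    rng = random.Random(seed)
    sums = [0.0] * len(alpha)
    for _ in range(trials):
        x = simulate_urn(alpha, R, t, rng)
        for j, v in enumerate(x):
            sums[j] += v
    return [s / trials for s in sums]

# Start from e_1, i.e. the process X^{(1)}, and compare with the first
# row of exp(tR), which for this R equals (e^t, e^{2t} - e^t).
t = 0.5
print(mean_urn([1, 0], R, t))
print([math.exp(t), math.exp(2 * t) - math.exp(t)])
```

The simulation mirrors the renewal-equation picture exactly: each event picks the splitting particle's type $i$ proportionally to the current counts and adds the row $R_{i,\cdot}$ to the composition.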

I hope this helps!

David
  • This is very well-written and clear. Do you have any recommended references for introductions to Yule processes and/or multi-type branching processes? Whenever I've encountered them in a textbook in a paper it was always without much exposition (even if it was supposed to be an "introduction") or the reader was assumed to have somehow already been familiar with the topics / have an intuition for them from somewhere else. – Chill2Macht Apr 21 '24 at 00:25
  • Thank you David, as @Chill2Macht said, I would really appreciate some good reference about it! Then I will also accept your answer :) – Dada Apr 21 '24 at 19:49
  • Not really sure about good references; maybe reading something about renewal theory and queueing theory could be good. There are some chapters about this in 'Probability and Random Processes' by Grimmett and Stirzaker (though I never personally looked at the book). – David Apr 22 '24 at 09:56