These results follow from the general formulas above for $$m_t$$ and $$v_t$$, since $$\mu = 2 p$$ and $$\sigma^2 = 4 p (1 - p)$$. If $$\mu \gt 1$$ then $$m_t \to \infty$$ as $$t \to \infty$$. We can easily relate extinction for the continuous-time branching chain $$\bs X$$ to extinction for any of the embedded discrete-time branching chains. Without the assumption that $$\mu \lt \infty$$, explosion can actually occur in finite time. For $$t \in [0, \infty)$$, the generating function $$\Phi_t$$ can be given in closed form. From partial fractions, $$\frac{1}{u^2 - u} = \frac{1}{u - 1} - \frac{1}{u}$$, so the result follows by standard integration and algebra. Hence $U(x, y) = \frac{1}{\alpha} \binom{y - 1}{x - 1} B(x, y - x + 1) = \frac{1}{\alpha} \binom{y - 1}{x - 1} \frac{(x - 1)! (y - x)!}{y!} = \frac{1}{\alpha y}$ In general, we know that sampling a (homogeneous) continuous-time Markov chain at multiples of a fixed $$t \in (0, \infty)$$ results in a (homogeneous) discrete-time Markov chain. Given $$X_0 = x$$, extinction has occurred by time $$t$$ if and only if extinction has occurred by time $$t$$ for each of the $$x$$ independent branching chains formed from the descendants of the $$x$$ initial particles. From the proof of the previous theorem, the other results then follow easily. Parts (a) and (b) follow from the general moment results above, with $$\mu = 2$$ and $$\sigma^2 = 0$$. Continuous-time Branching. Given $$X_0 = 1$$, the population is $$n$$ at time $$\tau_{n-1}$$. Next we consider the pure birth branching chain in which each particle, at the end of its life, is replaced by 2 new particles. For $$n \in \{0, 1, \ldots, X_t - 1\}$$, the age at time $$t$$ of the particle born at time $$\tau_n$$ is $$t - \tau_n$$. Conversely, if extinction occurs for $$\bs Z_t$$ for some $$t \in (0, \infty)$$, then extinction occurs for $$\bs Z_t$$ for every $$t \in (0, \infty)$$ and extinction occurs for $$\bs X$$.
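As a numerical sanity check on the Yule-process facts above, here is a minimal sketch of our own (not from the text; the names `alpha`, `t`, and the helper `beta` are ours, and the parameter values are arbitrary). It confirms the potential matrix identity $U(x, y) = 1/(\alpha y)$ for $y \ge x$, and that the negative binomial probabilities with parameters $x$ and $e^{-\alpha t}$ sum to 1.

```python
from math import comb, exp, factorial

# Our own sanity check (not from the text). For the Yule process with
# birth rate alpha:
#   1) U(x, y) = (1/alpha) C(y-1, x-1) B(x, y-x+1) = 1/(alpha y) for y >= x
#   2) given X_0 = x, X_t is negative binomial with parameters x, e^{-alpha t}
alpha, t = 1.5, 0.7
p = exp(-alpha * t)

def beta(a, b):
    # Beta function B(a, b) for positive integer arguments
    return factorial(a - 1) * factorial(b - 1) / factorial(a + b - 1)

# 1) the potential matrix identity
for x in range(1, 6):
    for y in range(x, 10):
        U = (1 / alpha) * comb(y - 1, x - 1) * beta(x, y - x + 1)
        assert abs(U - 1 / (alpha * y)) < 1e-12

# 2) the negative binomial pmf P_t(x, y) = C(y-1, x-1) p^x (1-p)^(y-x)
#    sums to (essentially) 1 over y = x, x+1, ...
x = 3
total = sum(comb(y - 1, x - 1) * p**x * (1 - p)**(y - x) for y in range(x, 400))
assert abs(total - 1) < 1e-9
```

The truncation at $y = 400$ is harmless since the tail decays geometrically.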
By definition, $$\Psi(r) = (1 - p) r^0 + p r^2$$. Note that $$X_s = n$$ for $$\tau_{n-1} \le s \lt \tau_n$$ and $$n \in \{1, 2, \ldots, k\}$$, while $$X_s = k + 1$$ for $$\tau_k \le s \le t$$. The Kolmogorov backward equation, combined with the forward equation, gives \[ \frac{d}{dt} \Phi_t(r) = \alpha \left[\Psi(\Phi_t(r)) - \Phi_t(r)\right] = \Phi_t^\prime(r) \, \alpha[\Psi(r) - r] \] Then $$\bs Z_t$$ is a discrete-time branching chain with offspring probability density function $$f_t$$ given by $$f_t(x) = P_t(1, x)$$ for $$x \in \N$$. We will use the notation established above, so that $$\alpha$$ is the parameter of the exponential lifetime of a particle, $$Q$$ is the transition matrix of the jump chain, $$G$$ is the infinitesimal generator matrix, and $$P_t$$ is the transition matrix at time $$t \in [0, \infty)$$. Recall that $$U(x, y)$$ is the expected time spent in state $$y$$ starting in state $$x$$. But this is the negative binomial distribution on $$\N_+$$ with parameters $$x$$ and $$e^{-\alpha t}$$. Finally, in the context of part (b), note that if $$\mu = 1$$ we must have $$\sigma^2 \gt 0$$ since we have assumed that $$f(1) = 0$$. Another relationship is given in the following theorem. In our study of discrete-time Markov chains, we studied branching chains in terms of generational time. If we think of the Yule process in terms of particles that never die, but each particle gives birth to a new particle at rate $$\alpha$$, then we can study the age of the particles at a given time. If $$\mu = 1$$ then $$m_t = 1$$ for all $$t \in [0, \infty)$$. So given $$X_0 = 1$$, $$X_t$$ has the geometric distribution with parameter $$e^{-\alpha t}$$.
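The backward equation can also be integrated numerically. The sketch below is our own illustration (the RK4 helper and the parameter choices are assumptions, not from the text): it integrates $\frac{d}{dt} \Phi_t(r) = \alpha[\Psi(\Phi_t(r)) - \Phi_t(r)]$ for the binary offspring distribution $\Psi(r) = p r^2 + (1 - p)$ starting from $r = 0$, so that $\Phi_t(0) = \P(X_t = 0 \mid X_0 = 1)$ increases to the extinction probability $q$, the smallest root of $\Psi(r) = r$.

```python
# Our own numerical sketch (names and parameters are assumptions, not from
# the text): integrate d/dt phi = alpha [Psi(phi) - phi] with classical RK4,
# where Psi(r) = p r^2 + (1 - p) is the binary offspring generating function.
# Started from r = 0, phi(t) = Phi_t(0) = P(X_t = 0 | X_0 = 1) increases to
# the extinction probability q, the smallest root of Psi(r) = r.

def rk4(f, y, t0, t1, steps):
    """Integrate dy/dt = f(y) from t0 to t1 with fixed-step RK4."""
    h = (t1 - t0) / steps
    for _ in range(steps):
        k1 = f(y)
        k2 = f(y + h * k1 / 2)
        k3 = f(y + h * k2 / 2)
        k4 = f(y + h * k3)
        y += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
    return y

alpha, p = 1.0, 0.6                     # supercritical: mu = 2p > 1
Psi = lambda r: p * r**2 + (1 - p)
phi = rk4(lambda r: alpha * (Psi(r) - r), 0.0, 0.0, 80.0, 8000)
q = (1 - p) / p                         # smallest root of Psi(r) = r
assert abs(phi - q) < 1e-6
```

By $t = 80$ the flow has converged to $q = 2/3$ well within the asserted tolerance, since the approach is exponential at rate $\alpha p (1 - q)$.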
Hence we have $\E(A_t) = \E\left(\int_0^t X_s \, ds\right) = \int_0^t \E(X_s) \, ds = \int_0^t e^{\alpha s} \, ds = \frac{e^{\alpha t} - 1}{\alpha}$. So with a branching chain, there are essentially two types of behavior: population extinction or population explosion. $\frac{d}{dt} m_t = \alpha (\mu - 1) m_t$ $\frac{d}{dt} \Phi_t^{\prime \prime}(r) = \alpha \Phi_t^{\prime \prime \prime}(r)[\Psi(r) - r] + 2 \alpha \Phi_t^{\prime \prime}(r)[\Psi^\prime(r) - 1] + \alpha \Phi_t^\prime(r) \Psi^{\prime \prime}(r)$ That is, $$\bs Y$$ is not a discrete-time branching chain, since in discrete time, the index $$n$$ represents the $$n$$th generation, whereas here it represents the $$n$$th time that a particle reproduces. As mentioned earlier, $$\bs X$$ is also a continuous-time birth-death chain on $$\N$$, with 0 absorbing. This differential equation, along with the initial condition $$\Phi_0(r) = r$$ for all $$r \in \R$$, determines the collection of generating functions $$\bs \Phi$$. The infinitesimal generator $$G$$ is given by $$G(x, x + k - 1) = \alpha x f(k)$$ for $$x \in \N_+$$ and $$k \in \N$$ with $$k \ne 1$$, while $$G(x, x) = -\alpha x$$. If $$p = \frac{1}{2}$$ the factorization is $$\frac{1}{2}(u - 1)^2$$ and partial fractions are not necessary. From a result in the section on the exponential distribution, it follows that $$\tau_n = \sum_{k=1}^n (\tau_k - \tau_{k-1})$$ has distribution function given by $$\P(\tau_n \le t) = \left(1 - e^{-\alpha t}\right)^n$$ for $$t \in [0, \infty)$$. From the basic theory of probability generating functions, $$m_t = \Phi_t^\prime(1)$$ and similarly, $$\mu = \Psi^\prime(1)$$. $\P(X_t = 1 \mid X_0 = 1) = \P(\tau \gt t \mid X_0 = 1) = e^{-\alpha t}$ The moment functions are given next. In particular, $$P_t(x, \cdot) = f_t^{*x}$$, where $$f_t^{*x}$$ is the convolution power of $$f_t$$ of order $$x$$. This is also easy. Furthermore, particles can also disappear in an extinction process. From the result above, $$\Phi_t(q) = q$$ for every $$t \in (0, \infty)$$.
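The inter-birth description of the Yule process translates directly into a simulation. The following Monte Carlo sketch is our own (the seed, sample size, and tolerances are arbitrary choices): it draws the gaps $\tau_n - \tau_{n-1}$ as independent exponentials with rate $\alpha n$ and checks the formulas $\E(X_t) = e^{\alpha t}$ and $\E(A_t) = (e^{\alpha t} - 1)/\alpha$.

```python
import random
from math import exp

# Our own Monte Carlo sketch (seed, sample size, tolerances are arbitrary):
# simulate the Yule process from X_0 = 1 by drawing the inter-birth gaps
# tau_n - tau_{n-1} ~ Exponential(alpha n), and check E(X_t) = e^{alpha t}
# and E(A_t) = (e^{alpha t} - 1)/alpha, where A_t integrates X over [0, t].
random.seed(2024)
alpha, t, runs = 1.0, 1.0, 20000

def yule_path(alpha, t):
    """Return (X_t, A_t) for one simulated Yule path started at 1."""
    n, s, area = 1, 0.0, 0.0
    while True:
        gap = random.expovariate(alpha * n)   # time until the next birth
        if s + gap >= t:
            area += (t - s) * n               # last partial segment
            return n, area
        area += gap * n
        s += gap
        n += 1

xs, areas = zip(*(yule_path(alpha, t) for _ in range(runs)))
mean_x, mean_a = sum(xs) / runs, sum(areas) / runs
assert abs(mean_x - exp(alpha * t)) < 0.1
assert abs(mean_a - (exp(alpha * t) - 1) / alpha) < 0.1
```

With 20000 paths the standard errors are roughly $0.015$, so the asserted tolerances are very conservative.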
So the random interval $$\tau_n - \tau_{n-1}$$ (the time until the next birth) has the exponential distribution with parameter $$\alpha n$$, and these intervals are independent as $$n$$ varies. Once in state $$y$$, the time spent in $$y$$ has an exponential distribution with parameter $$\lambda(y) = \alpha y$$, and so the mean is $$1 / (\alpha y)$$. $\P(X_t = 0 \text{ for some } t \in (0, \infty) \mid X_0 = x) = \lim_{t \to \infty} \P(X_t = 0 \mid X_0 = x) = q^x$ $A_t = \int_0^t X_s \, ds, \quad t \in [0, \infty)$ Suppose that $$X_t = k + 1$$ where $$k \in \N$$, so that $$\tau_k \le t \lt \tau_{k+1}$$. A branching process creates a tree whose branches can split into further branches at each step, or at each generation, of the chain. When $$\mu = 1$$ the variance equation reduces to $$\frac{d}{dt} v_t = \alpha \sigma^2$$ with initial condition $$v_0 = 0$$, so trivially $$v_t = \alpha \sigma^2 t$$. The particles are biological organisms that reproduce. Using the generator above, $$Q(x, x + k - 1) = f(k)$$ for $$x \in \N_+$$ and $$k \in \N$$. Suppose now that $$\mu = 1$$. Next we consider the continuous-time branching chain in which each particle, at the end of its life, leaves either no children or two children. $$\Psi(r) = p r^2 + (1 - p)$$ for $$r \in \R$$. Since 0 is an isolated, absorbing state, we will sometimes restrict our attention to positive states. So whether or not extinction is certain depends on the critical parameter $$\mu$$. The statements about the extinction event follow immediately from the fact that $$0$$ is absorbing, so that if $$X_t = 0$$ for some $$t \in (0, \infty)$$ then $$X_s = 0$$ for every $$s \in [t, \infty)$$. $m_t = \E(X_t \mid X_0 = 1), \; v_t = \var(X_t \mid X_0 = 1), \quad t \in [0, \infty)$ Is it possible that the system becomes empty?
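Since extinction for $$\bs X$$ agrees with extinction for the embedded discrete-time jump chain, the extinction probability $q^x$ can be estimated by simulating the jump chain alone. Here is a minimal Monte Carlo sketch of our own for the binary offspring chain (the population cap and sample size are arbitrary choices, not from the text):

```python
import random

# Our own Monte Carlo sketch (cap and sample size are arbitrary choices):
# extinction for the continuous-time binary chain agrees with extinction for
# its embedded jump chain, so we simulate the jump chain: at each jump one
# particle is replaced by 2 children (prob p) or 0 children (prob 1 - p).
# For p > 1/2 the extinction probability starting from 1 is q = (1 - p)/p.
random.seed(7)
p, runs, cap = 0.6, 10000, 200

def goes_extinct(x=1):
    """Run the jump chain until absorption at 0 or escape past cap."""
    while 0 < x < cap:
        x += 1 if random.random() < p else -1
    return x == 0

est = sum(goes_extinct() for _ in range(runs)) / runs
assert abs(est - (1 - p) / p) < 0.03    # q = 2/3 for p = 0.6
```

Treating escape past the cap as explosion introduces only a negligible bias, since the chance of returning to 0 from the cap is of order $q^{200}$.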
$\frac{d}{ds} \Phi_{t+s}(r) = \Phi_t^\prime[\Phi_s(r)] \frac{d}{ds} \Phi_s(r) = \Phi_t^\prime[\Phi_s(r)] \alpha \left[ \Psi(\Phi_s(r)) - \Phi_s(r)\right]$ If $$p \ne \frac{1}{2}$$, use partial fractions, standard integration, and some algebra. As always, be sure to try these exercises yourself before looking at the proofs and solutions. $\frac{d}{dt} \Phi_t^\prime(r) = \alpha \Phi_t^{\prime \prime}(r)[\Psi(r) - r] + \alpha \Phi_t^\prime(r)[\Psi^\prime(r) - 1]$ With probability 1, the jump chain $$\bs Y$$ visits a transient state only finitely many times, so with probability 1 either $$Y_n = 0$$ for some $$n \in \N$$ or $$Y_n \to \infty$$ as $$n \to \infty$$.
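The identity $\Phi_{t+s}(r) = \Phi_t(\Phi_s(r))$ that underlies the chain-rule computation above can also be checked numerically. A small sketch of our own (the RK4 integrator and parameter values are assumptions, not from the text), again for the binary offspring function $\Psi(r) = p r^2 + (1 - p)$:

```python
# Our own numerical check (integrator and parameters are assumptions): the
# semigroup property Phi_{t+s}(r) = Phi_t(Phi_s(r)) used in the chain-rule
# computation above, for the binary offspring function Psi(r) = p r^2 + (1 - p).

def flow(r, duration, alpha, p, steps=2000):
    """Phi_duration(r): integrate d/du phi = alpha [Psi(phi) - phi] with RK4."""
    f = lambda u: alpha * (p * u**2 + (1 - p) - u)
    h = duration / steps
    for _ in range(steps):
        k1 = f(r)
        k2 = f(r + h * k1 / 2)
        k3 = f(r + h * k2 / 2)
        k4 = f(r + h * k3)
        r += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
    return r

alpha, p, r, t, s = 1.0, 0.6, 0.3, 0.8, 1.3
direct = flow(r, t + s, alpha, p)             # Phi_{t+s}(r)
composed = flow(flow(r, s, alpha, p), t, alpha, p)   # Phi_t(Phi_s(r))
assert abs(direct - composed) < 1e-9
```

Both routes approximate the same flow, so they agree to within the RK4 discretization error.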