
Take rates $v$ and $\{f_i\}$, where $i$ is a position index on the periodic ring

[figure: a periodic ring of nodes]

with $n$ nodes. Consider the random variable $$ T_i=\min_j\{A_j+F_{|j-i|}\} $$ over all indices $j$, where the $A_j$ are independent with $A_j\sim \text{Exp}(f_j)$ and $F_{k}\sim\text{Erlang}(k,v)$.

What can we say about $E[T_i]$, for each $i$? Could I hope for a closed-form approximation of this expectation? Could convolution be used here?
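Before chasing a closed form, a Monte Carlo estimate of $E[T_i]$ can serve as ground truth for any approximation. The sketch below assumes the distance $|j-i|$ is taken as the ring (cyclic) distance and that $F_0 = 0$ (an Erlang with zero stages degenerates to the constant $0$); the function name and all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def estimate_E_T(i, f, v, n_samples=100_000):
    """Monte Carlo estimate of E[T_i] = E[min_j (A_j + F_{|j-i|})].

    Assumes |j - i| is the cyclic distance on the ring and F_0 = 0.
    """
    n = len(f)
    # ring distances k_j = min(|j - i|, n - |j - i|)
    d = np.abs(np.arange(n) - i)
    k = np.minimum(d, n - d)
    # A_j ~ Exp(f_j): numpy parameterises the exponential by scale = 1/rate
    A = rng.exponential(scale=1.0 / np.asarray(f, dtype=float),
                        size=(n_samples, n))
    # F_k ~ Erlang(k, v) = Gamma(shape=k, scale=1/v); shape 0 -> constant 0
    F = np.where(k > 0,
                 rng.gamma(shape=np.maximum(k, 1), scale=1.0 / v,
                           size=(n_samples, n)),
                 0.0)
    return (A + F).min(axis=1).mean()
```

Since $T_i \le A_i$ (the $j=i$ term has $F_0=0$), the estimate must come out below $1/f_i$, which gives a quick sanity check.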

For more context, this is a follow-up to this question.

sam wolfe

1 Answer


This is not a complete answer, but an attempt. Any comments/suggestions are welcome.

For simplicity, let's drop the $i$ dependence and write $X_j\equiv A_j$ and $Y_j\equiv F_{|j-i|}$, so that $$ Z = X_{j^*} + Y_{j^*}, $$ where $j^*$ is the index that minimizes $X_j + Y_j$ over all $j = 1,\dots,n$.

To find the expected value of $Z$, we need the distribution of $X_{j^*} + Y_{j^*}$. Each $X_j$ is independent of each $Y_j$; if we assume that $X_{j^*}$ and $Y_{j^*}$ are also independent (this step needs care, since $j^*$ depends on all the variables), then the pdf of $X_{j^*} + Y_{j^*}$ is given by the convolution $$f_{X_{j^*}+Y_{j^*}}(z) = \int_{0}^{z} f_{X_{j^*}}(z-y) \, f_{Y_{j^*}}(y) \, dy, $$ where $j^*$ is the index that minimizes $X_j + Y_j$.
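For a single fixed $j$, the convolution of an exponential pdf with an Erlang pdf is straightforward to evaluate numerically on a grid. A sketch below, where the rate $\lambda$, the Erlang rate $v$, and the stage count $k$ are illustrative assumptions; the resulting density should integrate to $1$ and have mean $1/\lambda + k/v$.

```python
import numpy as np
from math import factorial

lam, v, k = 1.5, 2.0, 3          # illustrative parameters (assumptions)
z = np.linspace(0.0, 40.0, 16001)  # grid covering effectively all the mass
dz = z[1] - z[0]

f_X = lam * np.exp(-lam * z)                                  # Exp(lam) pdf
f_Y = v**k * z**(k - 1) * np.exp(-v * z) / factorial(k - 1)   # Erlang(k, v) pdf

# f_{X+Y}(z) = ∫_0^z f_X(z - y) f_Y(y) dy, as a discrete convolution
f_Z = np.convolve(f_X, f_Y)[: len(z)] * dz

mass = f_Z.sum() * dz            # should be ≈ 1
mean = (z * f_Z).sum() * dz      # should be ≈ 1/lam + k/v
```

This checks the convolution machinery for one summand before it gets tangled up with the minimisation over $j$.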

Let's assume that we can find the index $j^*$ and that it is unique (if not, we can redefine $Z$ to be the sum $X_j + Y_j$ at whichever index attains the minimum value). Then the expected value of $Z$ is $$ E[Z] = E[X_{j^*} + Y_{j^*}] = \int_{0}^{\infty} z \, f_{X_{j^*}+Y_{j^*}}(z) \, dz. $$ We can evaluate this integral by breaking it into two parts, one from $0$ to $x^*$ and another from $x^*$ to infinity, where $x^*$ is the minimum value of $X_j + Y_j$. We can find $x^*$ by finding the minimum of each sum $X_j + Y_j$, which is a continuous random variable with pdf given by the convolution $$ f_{X_j+Y_j}(z) = \int_{0}^{z} f_{X_j}(z-y) \, f_{Y_j}(y) \, dy $$ for $j = 1,\dots,n$.

Then the expected value of $Z$ is $$ E[Z] = \int_{0}^{x^*} z \, f_{X_{j^*}+Y_{j^*}}(z) \, dz + \int_{x^*}^{\infty} z \, f_{X_{j^*}+Y_{j^*}}(z) \, dz. $$ We can evaluate the first integral using the pdf of $X_{j^*} + Y_{j^*}$ given above. To evaluate the second integral, note that $$ \int_{x^*}^{\infty} z \, f_{X_{j^*}+Y_{j^*}}(z) \, dz = \int_{0}^{\infty} z \, f_{X_{j^*}+Y_{j^*}}(z) \, dz - \int_{0}^{x^*} z \, f_{X_{j^*}+Y_{j^*}}(z) \, dz. $$ The first term on the right is the expected value of $X_{j^*} + Y_{j^*}$, which we can calculate using the convolution formula above. The left-hand side is the probability that $X_{j^*} + Y_{j^*}$ is at least $x^*$, multiplied by the expected value of $X_{j^*} + Y_{j^*}$ conditioned on $X_{j^*} + Y_{j^*} \geq x^*$. That is, $$ \int_{x^*}^{\infty} z \, f_{X_{j^*}+Y_{j^*}}(z) \, dz = P(X_{j^*} + Y_{j^*} \geq x^*) \cdot E[X_{j^*}+Y_{j^*} \,|\, X_{j^*}+Y_{j^*} \geq x^*]. $$ The probability that $X_{j^*} + Y_{j^*}$ is at least $x^*$ is $$ P(X_{j^*} + Y_{j^*} \geq x^*) = 1 - P(X_{j^*} + Y_{j^*} < x^*) = 1 - \int_{0}^{x^*} f_{X_{j^*}+Y_{j^*}}(z) \, dz, $$ which we can evaluate using the convolution formula for the pdf of $X_j + Y_j$.

The conditional expected value of $X_{j^*} + Y_{j^*}$, given that $X_{j^*} + Y_{j^*} \geq x^*$, is $$ E[X_{j^*}+Y_{j^*} \,|\, X_{j^*}+Y_{j^*} \geq x^*] = \frac{\int_{x^*}^{\infty} z \, f_{X_{j^*}+Y_{j^*}}(z) \, dz}{P(X_{j^*} + Y_{j^*} \geq x^*)}. $$ Substituting these expressions into the equation for $E[Z]$, we get $$ E[Z] = \int_{0}^{x^*} z \, f_{X_{j^*}+Y_{j^*}}(z) \, dz + \left(1 - \int_{0}^{x^*} f_{X_{j^*}+Y_{j^*}}(z) \, dz\right) \cdot \frac{\int_{x^*}^{\infty} z \, f_{X_{j^*}+Y_{j^*}}(z) \, dz}{1 - \int_{0}^{x^*} f_{X_{j^*}+Y_{j^*}}(z) \, dz}. $$ To evaluate this expression, we need to compute the convolution formula for the pdf of $X_j + Y_j$ and find the minimum value $x^*$ of $X_j + Y_j$. Once we know $x^*$, we can compute the integrals above and obtain the expected value of the minimum of the sums $X_j + Y_j$.
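As a cross-check: since the sums $S_j = X_j + Y_j$ are independent across $j$ and nonnegative, the minimum also admits the survival-function representation $$E[Z] = \int_0^\infty \prod_{j=1}^n P(S_j > z)\,dz,$$ which sidesteps locating $x^*$ explicitly. A numerical sketch, where the rates, stage counts, and $v$ below are all illustrative assumptions, compared against a direct Monte Carlo estimate:

```python
import numpy as np
from math import factorial

rng = np.random.default_rng(1)
rates = [1.0, 2.0, 0.5]   # illustrative Exp rates f_j (assumption)
stages = [1, 2, 3]        # illustrative Erlang stage counts |j - i| (assumption)
v = 2.0

z = np.linspace(0.0, 60.0, 24001)
dz = z[1] - z[0]

def sum_survival(lam, k):
    """P(X + Y > z) for X ~ Exp(lam), Y ~ Erlang(k, v), via numeric convolution."""
    fX = lam * np.exp(-lam * z)
    fY = v**k * z**(k - 1) * np.exp(-v * z) / factorial(k - 1)
    fZ = np.convolve(fX, fY)[: len(z)] * dz
    return 1.0 - np.cumsum(fZ) * dz

# E[min_j S_j] = ∫_0^∞ ∏_j P(S_j > z) dz for independent sums
surv = np.prod([sum_survival(l, k) for l, k in zip(rates, stages)], axis=0)
e_min = surv.sum() * dz

# Monte Carlo cross-check of the same quantity
X = np.column_stack([rng.exponential(1.0 / l, 200_000) for l in rates])
Y = np.column_stack([rng.gamma(k, 1.0 / v, 200_000) for k in stages])
mc = (X + Y).min(axis=1).mean()
```

The two estimates should agree to a few decimal places, which suggests the survival-product route is a workable alternative to conditioning on $x^*$.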

I wonder whether this is correct, or if there is a cleverer way to deal with the minimisation problem. Any ideas?
