I've got a system recursively characterized by:

$$ f_i(x) = \alpha_i(x) + \sum_{j = 1}^J \beta_{i, j}(x) f_j(x) $$

Or, in matrix notation:

$$ F(x) = A(x) + B(x)F(x) $$

Where $A,F$ are vectors and $B$ is a square matrix.
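
(Assuming $I - B(x)$ is invertible, this solves explicitly to

$$ F(x) = (I - B(x))^{-1} A(x), $$

the matrix analogue of the scalar $g(x) = \frac{a(x)}{1 - b(x)}$ used below.)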

What can we say about the nature of $\frac{\partial F}{\partial x}$?

The product rule tells us:

$$ \frac{\partial F}{\partial x} = \frac{\partial A}{\partial x} + \frac{\partial B}{\partial x}F(x) + B(x)\frac{\partial F}{\partial x} $$

And in turn (exactly analogous to the scalar case of $\frac{\partial}{\partial x}(g(x))$ for $g(x) = \frac{a(x)}{1 - b(x)}$):

$$ \frac{\partial F}{\partial x} = (I - B(x))^{-1} \left( \frac{\partial A}{\partial x} + \frac{\partial B}{\partial x} F(x) \right) $$
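
As a quick sanity check on this closed form, here's a minimal numerical sketch; the particular $\alpha_i$ and $\beta_{i,j}$ are made up for illustration, chosen only to satisfy the sign conditions listed below:

```python
import numpy as np

J = 3
x = 0.7
h = 1e-6

# Made-up example functions (illustration only):
# alpha_i(x) = exp(-i x) >= 0 and decreasing; beta_{ij}(x) = c_{ij} exp(-x), in [0, 1) and decreasing.
C = np.array([[0.2, 0.3, 0.1],
              [0.1, 0.4, 0.2],
              [0.3, 0.1, 0.2]])

def A(x):  return np.exp(-np.arange(1, J + 1) * x)
def dA(x): return -np.arange(1, J + 1) * A(x)
def B(x):  return C * np.exp(-x)
def dB(x): return -B(x)

def F(x):
    # Solve F = A + B F, i.e. (I - B) F = A
    return np.linalg.solve(np.eye(J) - B(x), A(x))

# Closed form: dF = (I - B)^{-1} (dA + dB F)
dF_closed = np.linalg.solve(np.eye(J) - B(x), dA(x) + dB(x) @ F(x))

# Central finite difference for comparison
dF_numeric = (F(x + h) - F(x - h)) / (2 * h)

print(np.allclose(dF_closed, dF_numeric))   # True
print(dF_closed)                            # every component is <= 0 in this example
```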

I know the following ($\forall x$):

$$ \frac{\partial \alpha_i}{\partial x} \leq 0 \, \forall i \qquad 0 \leq \beta_{i, j}(x) < 1 \, \forall i,j \qquad \frac{\partial \beta_{i,j}}{\partial x} \leq 0 \, \forall i,j $$

And the facts that $\alpha_i(x) \geq 0$ and $0 \leq \beta_{i,j}(x) < 1$ should imply that $F(x) \geq 0$ as well (this is easy to establish by contradiction if there is a unique $i^{*}$ with $f_{i^{*}}(x^{*}) < 0$ for some $x^{*}$, but the possibility that there are several negative components at that $x^{*}$ makes a full proof messier).
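
(One route that seems to work, under the extra assumption that the spectral radius $\rho(B(x)) < 1$: since $B(x) \geq 0$ entrywise, the Neumann series

$$ (I - B(x))^{-1} = \sum_{k \geq 0} B(x)^k $$

has nonnegative entries, so $F(x) = (I - B(x))^{-1} A(x) \geq 0$.)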

Can we assert that $\frac{\partial F}{\partial x} \leq 0$ for all components?
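
(Under the same Neumann-series assumption, every piece would have a known sign: $(I - B(x))^{-1} \geq 0$ entrywise, while $\frac{\partial A}{\partial x} \leq 0$ and $\frac{\partial B}{\partial x} F(x) \leq 0$, which would give $\frac{\partial F}{\partial x} \leq 0$ componentwise. The delicate case seems to be exactly when $\rho(B(x)) \to 1$, as below.)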

I'm also particularly interested in the limiting case when $x = 0$.

I know that:

$$ \nexists \left( I - B(0) \right)^{-1} \qquad \frac{\partial \alpha_i}{\partial x}(0) = \frac{\partial \beta_{i,j}}{\partial x}(0) = 0 $$

So I suppose that means there's some sort of higher-dimensional L'Hôpital's rule to apply (I've not encountered one, so I'm at a loss here)? Can I say this limit is equal to the limit of the derivative of the "numerator" (the right-side matrix) premultiplied by the inverse of the derivative of the "denominator" (the left-side matrix $I - B(x)$)?

1 Answer

Let me abbreviate $\frac{\partial F}{\partial x}$ as $dF$. Then for the limiting case, go back to the result from the product rule and evaluate its terms at $x=0$, where $dA$ and $dB$ both vanish:

$$\eqalign{ dF &= dA + dB\,F + B\,dF \cr dF &= B\,dF \cr }$$

So, unless it is zero, $dF(0)$ is an eigenvector of $B(0)$ whose corresponding eigenvalue is $1$.

greg
  • Good observation. In fact $B(0)$ is a Markov matrix, so it appears it's guaranteed to have 1 as an eigenvalue: https://math.stackexchange.com/questions/351142/why-markov-matrices-always-have-1-as-an-eigenvalue – MichaelChirico Mar 15 '18 at 02:26
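
A quick numerical illustration of the comment's point, with a made-up row-stochastic stand-in for $B(0)$: each row of $\beta_{i,j}(0)$ summing to $1$ gives $B(0)\mathbf{1} = \mathbf{1}$, which is exactly what makes $I - B(0)$ singular.

```python
import numpy as np

# Made-up row-stochastic (Markov) matrix standing in for B(0)
B0 = np.array([[0.5, 0.3, 0.2],
               [0.1, 0.6, 0.3],
               [0.4, 0.4, 0.2]])

w, V = np.linalg.eig(B0)
k = np.argmin(np.abs(w - 1.0))
print(w[k].real)      # 1.0 up to rounding: rows sum to 1, so B0 @ ones = ones
v = V[:, k].real
print(v / v[0])       # [1. 1. 1.]: the eigenvalue-1 eigenvector is the all-ones vector
```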