I've got a system recursively characterized by:
$$ f_i(x) = \alpha_i(x) + \sum_{j = 1}^J \beta_{i, j}(x) f_j(x) $$
Or, in matrix notation:
$$ F(x) = A(x) + B(x)F(x) $$
where $A = (\alpha_i)$ and $F = (f_i)$ are vectors and $B = (\beta_{i,j})$ is a square matrix.
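For concreteness, here's a minimal numerical sketch (Python/NumPy; the particular $\alpha_i$ and $\beta_{i,j}$ are hypothetical choices consistent with the conditions listed further down) that solves the fixed point as $F(x) = (I - B(x))^{-1} A(x)$:

```python
import numpy as np

J = 3

def A(x):
    # Hypothetical alpha_i(x): nonnegative and nonincreasing in x.
    return np.exp(-x) * np.array([1.0, 2.0, 0.5])

def B(x):
    # Hypothetical beta_{i,j}(x): entries in [0, 1), nonincreasing in x.
    base = np.array([[0.1, 0.2, 0.0],
                     [0.0, 0.1, 0.3],
                     [0.2, 0.0, 0.1]])
    return np.exp(-x) * base

def F(x):
    # Solve (I - B(x)) F = A(x); solving beats forming the inverse explicitly.
    return np.linalg.solve(np.eye(J) - B(x), A(x))

print(F(1.0))
```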
What can we say about the nature of $\frac{\partial F}{\partial x}$?
The product rule tells us:
$$ \frac{\partial F}{\partial x} = \frac{\partial A}{\partial x} + \frac{\partial B}{\partial x}F(x) + B(x)\frac{\partial F}{\partial x} $$
Solving for $\frac{\partial F}{\partial x}$ (exactly analogous to differentiating the scalar case $g(x) = \frac{a(x)}{1 - b(x)}$):
$$ \frac{\partial F}{\partial x} = (I - B(x))^{-1} \left( \frac{\partial A}{\partial x} + \frac{\partial B}{\partial x} F(x) \right) $$
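Continuing the sketch above, a quick finite-difference check of this closed form (for that hypothetical exponential family, $\frac{\partial A}{\partial x} = -A(x)$ and $\frac{\partial B}{\partial x} = -B(x)$):

```python
def dF(x):
    # Closed form: (I - B)^{-1} (A' + B' F), with A' = -A and B' = -B
    # for the particular exponential family chosen above.
    Bx = B(x)
    rhs = -A(x) - Bx @ F(x)
    return np.linalg.solve(np.eye(J) - Bx, rhs)

h, x0 = 1e-6, 1.0
fd = (F(x0 + h) - F(x0 - h)) / (2 * h)   # central difference
print(np.max(np.abs(dF(x0) - fd)))       # tiny (~1e-10): the two agree
```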
I know the following ($\forall x$):
$$ \frac{\partial \alpha_i}{\partial x} \leq 0 \, \forall i \qquad 0 \leq \beta_{i, j}(x) < 1 \, \forall i,j \qquad \frac{\partial \beta_{i,j}}{\partial x} \leq 0 \, \forall i,j $$
And the fact that $\alpha_i(x) \geq 0$ and $\beta_{i,j}(x) < 1$ should imply that $F(x) \geq 0$ as well (this is easy to establish by contradiction when there is a unique $i^{*}$ with $f_{i^{*}}(x^{*}) < 0$ at some $x^{*}$, but the possibility that several components are negative at that $x^{*}$ makes a full proof messier).
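(One hedged route to $F \geq 0$, for what it's worth: under the stronger assumption that the spectral radius satisfies $\rho(B(x)) < 1$ (the entrywise bound $\beta_{i,j} < 1$ alone does not guarantee this), the Neumann series
$$ (I - B(x))^{-1} = \sum_{k = 0}^{\infty} B(x)^k $$
is entrywise nonnegative since $B(x) \geq 0$, and then $F(x) = (I - B(x))^{-1} A(x) \geq 0$ follows immediately.)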
Can we assert that $\frac{\partial F}{\partial x} \leq 0$ for all components?
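For whatever it's worth, a numerical spot check on the hypothetical family above (not a proof, obviously):

```python
# Sample the example family on a grid and check the sign componentwise.
xs = np.linspace(0.1, 5.0, 50)
print(all(np.all(dF(x) <= 0) for x in xs))   # True for this particular family
```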
I'm also particularly interested in the limiting case when $x = 0$.
I know that:
$$ \nexists \left( I - B(0) \right)^{-1} \qquad \frac{\partial \alpha_i}{\partial x}(0) = \frac{\partial \beta_{i,j}}{\partial x}(0) = 0 $$
So I suppose that means there's some sort of higher-dimensional L'Hôpital's rule to apply (I've not encountered one; I've found no help on this)? Can I say this limit is equal to the limit of the "numerator's" derivative (the right-side vector) premultiplied by the inverse of the "denominator's" derivative (the left-side matrix)?
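In lieu of an answer, here's a numerical probe of one hypothetical family satisfying the $x = 0$ conditions above ($B(x) = (1 - x^2)M$ with $M$ having a unit eigenvalue, so $I - B(0)$ is singular, and $\alpha_i(x) = e^{-x^2}$, so both derivatives vanish at $0$); it reuses the NumPy import from the earlier snippet:

```python
M = np.array([[0.5, 0.5],
              [0.5, 0.5]])                   # eigenvalues 1 and 0
A2 = lambda x: np.exp(-x**2) * np.ones(2)    # alpha'(0) = 0, alpha >= 0
B2 = lambda x: (1 - x**2) * M                # beta'(0) = 0 entrywise

def dF2(x):
    # dF/dx = (I - B)^{-1} (A' + B' F), with A'(x) = -2x A(x), B'(x) = -2x M.
    IB = np.eye(2) - B2(x)
    Fx = np.linalg.solve(IB, A2(x))
    return np.linalg.solve(IB, -2 * x * A2(x) - 2 * x * (M @ Fx))

for x in [1e-1, 1e-2, 1e-3]:
    print(x, dF2(x))   # grows like -1/x**3 here: the 0/0 form need not settle
```

At least for this particular family the quotient diverges as $x \to 0^+$ rather than converging, so any L'Hôpital-style argument would seem to need extra structure beyond the conditions I've listed.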