Suppose $A, B, C$ are real symmetric positive definite matrices. Consider the function $f: \mathbb R \to \mathbb R$ given by $$ a \mapsto {\bf tr}\left[ A (B- (I-aC)B(I-aC) ) \right],$$ where $I$ is the identity matrix. Clearly $f(0) = 0$; assume further that there exists some $\tau > 0$ such that $f(x) > 0$ for every $x \in (0, \tau)$, and that $(0, \tau)$ is the maximal such interval, so $f(x) > 0$ for $x \in (0, \tau)$ and $f(0) = f(\tau) = 0$. I am wondering whether, with this information, it is possible to deduce $\tau \ge \frac{1}{\lambda_{\max}(C)}$.
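For reference, $f$ is just a quadratic in $a$: expanding $(I-aC)B(I-aC) = B - a(CB + BC) + a^2 CBC$ and using ${\bf tr}[ACB] = {\bf tr}[ABC]$ (transpose, then cycle) gives $$ f(a) = 2a\,{\bf tr}[ABC] - a^2\,{\bf tr}[(CAC)B], $$ where ${\bf tr}[(CAC)B] > 0$ because $CAC$ and $B$ are positive definite. So $f$ is a concave parabola through the origin, the assumption above amounts to ${\bf tr}[ABC] > 0$, and $\tau = \frac{2\,{\bf tr}[ABC]}{{\bf tr}[(CAC)B]}$.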
Essentially, I know that the trace is positive for small $a$, so by continuity there should be some maximal interval on which it stays positive, and I want to estimate this interval. I tried the crude bound \begin{align*} {\bf tr}\left[ A (B- (I-aC)B(I-aC) ) \right] \ge \lambda_{\min}(A)\, {\bf tr}(B) - \|I-aC\|_2^2\, \lambda_{\max}(A)\, {\bf tr}(B), \end{align*} which follows from ${\bf tr}[AB] \ge \lambda_{\min}(A)\,{\bf tr}(B)$ and ${\bf tr}[A(I-aC)B(I-aC)] = {\bf tr}[(I-aC)A(I-aC)B] \le \|I-aC\|_2^2\, \lambda_{\max}(A)\, {\bf tr}(B)$. But this bound is useless here: its right-hand side is positive only if $\|I-aC\|_2^2 < \lambda_{\min}(A)/\lambda_{\max}(A)$, and this inequality may have no solution in $a$ at all (already at $a = 0$ the right-hand side is $\le 0$ unless $A$ is a multiple of the identity).
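Here is a quick NumPy sanity check, just a sketch (the instance generator `random_spd`, the dimension, and the grid of $a$ values are my own choices), that evaluates $f(a)$ next to the crude lower bound on a random instance:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_spd(n):
    # Random symmetric positive definite test matrix (my own choice of model).
    M = rng.standard_normal((n, n))
    return M @ M.T + 1e-3 * np.eye(n)

n = 4
A, B, C = random_spd(n), random_spd(n), random_spd(n)
I = np.eye(n)

def f(a):
    # f(a) = tr[A (B - (I - aC) B (I - aC))]
    D = I - a * C
    return np.trace(A @ (B - D @ B @ D))

def crude_bound(a):
    # lambda_min(A) tr(B) - ||I - aC||_2^2 lambda_max(A) tr(B)
    eig_A = np.linalg.eigvalsh(A)
    nrm = np.max(np.abs(np.linalg.eigvalsh(I - a * C)))
    return eig_A[0] * np.trace(B) - nrm**2 * eig_A[-1] * np.trace(B)

a_ref = 1.0 / np.linalg.eigvalsh(C)[-1]   # 1 / lambda_max(C)
for a in np.linspace(0.0, a_ref, 6):
    print(f"a = {a:.4f}   f(a) = {f(a):9.4f}   crude bound = {crude_bound(a):9.4f}")
```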
On the other hand, I feel the answer must be related to $\|I-aC\|_2 = \max_i |1 - a\lambda_i(C)|$, so we can choose $a$ to minimize this quantity, which gives $a' = \frac{2}{\lambda_{\min}(C) + \lambda_{\max}(C)}$; note $a' \ge \frac{1}{\lambda_{\max}(C)}$, consistent with the bound I am hoping for. Intuitively, I would expect $f$ to be positive on $(0, a']$.
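To test this intuition, a rough numerical experiment along the same lines (again only a sketch with my own random test instances and grid search, not a proof): it estimates $\tau$ as the first sign change of $f$ on a grid and compares it with $\frac{1}{\lambda_{\max}(C)}$ and $a'$. Instances for which $f$ is not positive near $0$ will simply report $\tau$ close to $0$.

```python
import numpy as np

rng = np.random.default_rng(1)

def random_spd(n):
    # Random symmetric positive definite test matrix (my own choice of model).
    M = rng.standard_normal((n, n))
    return M @ M.T + 1e-3 * np.eye(n)

def estimate_tau(A, B, C, grid=20000):
    # First a > 0 at which tr[A(B - (I-aC)B(I-aC))] stops being positive,
    # located by a crude grid search well past 1/lambda_max(C).
    n = A.shape[0]
    I = np.eye(n)
    lam_max = np.linalg.eigvalsh(C)[-1]
    for a in np.linspace(1e-9, 10.0 / lam_max, grid):
        D = I - a * C
        if np.trace(A @ (B - D @ B @ D)) <= 0:
            return a
    return np.inf   # no sign change found on the searched range

n = 4
for _ in range(5):
    A, B, C = random_spd(n), random_spd(n), random_spd(n)
    lam = np.linalg.eigvalsh(C)
    a_prime = 2.0 / (lam[0] + lam[-1])
    print(f"tau ~ {estimate_tau(A, B, C):.4f}   "
          f"1/lambda_max(C) = {1.0 / lam[-1]:.4f}   a' = {a_prime:.4f}")
```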