Let ${\bf A} \in \mathbb{R}^{n \times n}$ be a symmetric positive semidefinite (PSD) matrix, let ${\bf a} \in \mathbb{R}^n$, and let ${\bf B} \in \mathbb{R}^{n \times n}$ be a symmetric indefinite matrix whose eigenvalues come in nonzero pairs $\pm \lambda_i$ (i.e. a symmetric spectrum). Let $c > 0$. Consider the optimization problem
$$ \begin{array}{ll} \underset {{\bf x} \in \mathbb{R}^n} {\text{minimize}} & {\bf x}^\top {\bf A} \, {\bf x} + {\bf a}^\top {\bf x} \\ \text{subject to} & {\bf x}^\top {\bf B} \, {\bf x} = 0\\ & 0 \le x_i \le c \end{array} $$
I am aware that this is not a convex problem because of the quadratic equality constraint. I did find a paper that proposes methods for quadratic optimization under a single quadratic equality constraint, but that setting doesn't quite fit my problem because of the additional bound constraints $0 \le x_i \le c$.
How would one tackle this problem, and ideally, are there good free solvers available?
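
For concreteness, here is a minimal sketch of a toy instance (random placeholder data, not my actual matrices) fed to SciPy's SLSQP, which accepts the bounds and the quadratic equality constraint but, given the nonconvexity, will only return a local stationary point depending on the starting guess:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 6

# Placeholder data: random PSD A, and a symmetric indefinite B whose
# eigenvalues come in nonzero pairs +/- lambda_i.
M = rng.standard_normal((n, n))
A = M @ M.T                                   # symmetric PSD
lam = rng.uniform(0.5, 2.0, n // 2)
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
B = Q @ np.diag(np.concatenate([lam, -lam])) @ Q.T
a = rng.standard_normal(n)
c = 1.0

def obj(x):
    # Objective x' A x + a' x
    return x @ A @ x + a @ x

def obj_grad(x):
    # Gradient 2 A x + a (A symmetric)
    return 2 * A @ x + a

# Quadratic equality constraint x' B x = 0, with gradient 2 B x.
cons = [{"type": "eq",
         "fun": lambda x: x @ B @ x,
         "jac": lambda x: 2 * B @ x}]

x0 = rng.uniform(0.0, c, n)   # satisfies the box, not the equality
res = minimize(obj, x0, jac=obj_grad, method="SLSQP",
               bounds=[(0.0, c)] * n, constraints=cons)
print(res.success, res.x, res.fun)
```

This only illustrates what a concrete instance looks like; it says nothing about global optimality, which is exactly what I am unsure how to address.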