The background of my question is that I want to determine an arbitrary square matrix $A \in \mathbb{R}^{n \times n}$ as the decision variable of a convex optimization problem and ensure that it is regular, i.e., invertible. For this reason, I would like to add a second, also convex, summand to the objective function or, alternatively, add a convex inequality constraint to the optimization problem. Adding, e.g., $-\log(|\det(A)|)$ to the objective function would ensure invertibility, but this term is not convex.
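To illustrate the non-convexity, here is a small numpy check (a minimal sketch; the helper function and the two example matrices are only my own illustration): the midpoint of two invertible matrices can be singular, so $-\log(|\det(\cdot)|)$ jumps to $+\infty$ there and the convexity inequality fails.

```python
import numpy as np

def neg_log_abs_det(A):
    """-log(|det(A)|), with +inf if A is singular."""
    det = np.linalg.det(A)
    return np.inf if det == 0 else -np.log(abs(det))

A1 = np.diag([1.0, -1.0])
A2 = np.diag([-1.0, 1.0])
mid = 0.5 * (A1 + A2)   # the zero matrix, hence singular

# Convexity would require f(mid) <= 0.5 * (f(A1) + f(A2)),
# but the left-hand side is +inf while the right-hand side is 0.
print(neg_log_abs_det(A1), neg_log_abs_det(A2), neg_log_abs_det(mid))
```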
Invertibility can be ensured in a convex way by the methods described in "Bounds for determinants with positive diagonals for diagonally dominant matrices" or, allowing for somewhat more general matrices, "Verified bounds for singular values, in particular for the spectral norm of a matrix and its inverse" (e.g., Lemma 2.1). This, however, restricts the matrices to a certain structure, whereas general matrices would be much better suited for the optimization I have in mind.
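For concreteness, here is a minimal cvxpy sketch of the kind of structural restriction I mean (strict diagonal dominance with positive diagonal, which guarantees invertibility); the margin eps, the target matrix T, and the placeholder objective are my own choices for illustration, not part of the actual problem I want to solve:

```python
import cvxpy as cp
import numpy as np

n = 4
eps = 1e-3                       # margin enforcing *strict* dominance
A = cp.Variable((n, n))

# Strict diagonal dominance with positive diagonal:
# A[i, i] >= eps + sum_{j != i} |A[i, j]| for every row i.
# Each constraint is "affine >= convex", hence convex.
constraints = []
for i in range(n):
    off_diag = cp.hstack([cp.abs(A[i, j]) for j in range(n) if j != i])
    constraints.append(A[i, i] >= eps + cp.sum(off_diag))

# Placeholder objective: stay close to some given target matrix T.
T = np.random.randn(n, n)
prob = cp.Problem(cp.Minimize(cp.sum_squares(A - T)), constraints)
prob.solve()
print("min singular value:", np.linalg.svd(A.value, compute_uv=False).min())
```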
From my research, it seems most promising to me to consider the smallest singular value of $A$, which equals the reciprocal of the largest singular value of its inverse:
$\sigma_{\mathrm{min}}(A)=\frac{1}{\lVert A^{-1}\rVert_2}$,
where $\sigma_{\mathrm{min}}(A)$ denotes the smallest singular value of $A$ and $\lVert\cdot\rVert_2$ the spectral norm, i.e., the largest singular value. My key question is whether one can obtain a (concave) positive lower bound for the smallest singular value using $\sigma_{\mathrm{min}}(A)=\frac{1}{\lVert A^{-1}\rVert_2}$. Maximizing such a concave lower bound ensures $\sigma_{\mathrm{min}}(A)>0$ and hence that all eigenvalues of $A$ have absolute value greater than zero, see this question. The bound itself may be very loose; the main point is that $\sigma_{\mathrm{min}}(A)>0$ is ensured. As far as I have seen, the triangle inequality (subadditivity) and the submultiplicativity of matrix norms only lead to upper bounds for $\sigma_{\mathrm{min}}(A)$, not to lower ones. Maybe it is possible to combine subadditivity, submultiplicativity, and other matrix norm properties to obtain such a lower bound that depends on $A$ or its entries, but not on its inverse, so that the optimization remains tractable?
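For completeness, a quick numerical sanity check of the identity above (a minimal sketch; the random test matrix is arbitrary and almost surely invertible):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))   # generic random matrix, almost surely invertible

sigma_min = np.linalg.svd(A, compute_uv=False).min()
spec_norm_inv = np.linalg.norm(np.linalg.inv(A), ord=2)   # largest singular value of A^{-1}

# sigma_min(A) == 1 / ||A^{-1}||_2, up to floating-point error
print(sigma_min, 1.0 / spec_norm_inv)
```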
Answers on this specific idea (a lower bound for the minimal singular value) or on how to ensure invertibility in a convex way in general would be greatly appreciated.