Let us ask a slightly more general question first.
Let $F$ be a proper subfield of an extension field $E$ and let $A,B\in M_n(F)$. If $A+\lambda B\in M_n(E)$ is invertible for some $\lambda\in E\setminus F$, is it true that $A+cB$ is invertible for some $c\in F$?
It turns out that the answer depends on the cardinality of $F$. When $|F|\le n$, the answer can be negative. For example, when
$$
F=GF(2)=\{0,1\},\quad A=\pmatrix{0&0\\ 1&1}\quad\text{and}\quad B=I_2,
$$
$A+cB$ is singular for all $c\in F$. However, since $\det(A+xB)=x(x+1)=x(x-1)$ vanishes only at $x=0$ and $x=1$ (the two elements of $F$), $A+\lambda B$ is invertible for every $\lambda\in E\setminus F$.
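If you want to double-check this counterexample mechanically, here is a small sympy sketch (purely illustrative; it computes $\det(A+xB)$ over the integers and reduces modulo $2$):

```python
# Sanity check of the GF(2) counterexample (illustrative only).
from sympy import Matrix, Poly, eye, symbols

x = symbols('x')
A = Matrix([[0, 0], [1, 1]])
B = eye(2)

p = (A + x * B).det()            # x**2 + x, i.e. x(x - 1) over GF(2)
print(Poly(p, x, modulus=2))     # the determinant as a polynomial mod 2

for c in (0, 1):                 # A + cB is singular for every c in GF(2)
    print(c, p.subs(x, c) % 2)   # both residues are 0
```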
Yet, when $|F|>n$, the answer is affirmative. As you already know, this is evident from the usual determinantal approach: since $p(x)=\det(A+xB)$ is not the zero polynomial (because $p(\lambda)\ne0$) and its degree is at most $n$ (because the matrices are $n\times n$), it has at most $n$ roots in $F$. Hence, when $|F|>n$, $p(c)$ must be nonzero for some $c\in F$, and for such a $c$ the matrix $A+cB$ is invertible.
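For a concrete feel of this root-counting argument, here is a short sympy sketch (with hypothetical random integer matrices standing in for an infinite field such as $\mathbb Q$) that finds such a $c$ among $n+1$ candidates:

```python
# Illustration of the determinantal argument (illustrative sketch).
from sympy import symbols, degree, randMatrix

x = symbols('x')
n = 4

# Retry until p is not the zero polynomial, mirroring the
# hypothesis that p(lambda) != 0 for some lambda.
while True:
    A, B = randMatrix(n, n, -5, 5), randMatrix(n, n, -5, 5)
    p = (A + x * B).det().expand()
    if p != 0:
        break

assert degree(p, x) <= n              # deg det(A + xB) <= n

# p has at most n roots, so among any n+1 distinct field
# elements some c satisfies p(c) != 0, i.e. A + cB is invertible.
c = next(c for c in range(n + 1) if p.subs(x, c) != 0)
print(c, (A + c * B).det())
```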
As you want other approaches, we present two proofs below. The first proof works over any field $F$ with $|F|>n$, but for brevity we will consider only the case where $F$ is infinite. The second proof is tailor-made for your specific case, where $F=\mathbb R$, $E=\mathbb C$ and $\lambda=i$.
Proof 1. Let $x$ be an indeterminate and let $Q=F(x)$ be the quotient field of the polynomial ring $F[x]$. If $A+xB\in M_n(Q)$ is singular, there exists some nonzero vector $v(x)\in Q^n$ such that $(A+xB)v(x)=0$. By definition of $Q$, each entry $v_j(x)$ of $v(x)$ is a fraction of two polynomials in $x$ with coefficients in $F$, and we may assume that each fraction is in lowest terms. Therefore, by multiplying $v(x)$ by the least common multiple of the denominators of the $v_j(x)$s, we may assume that each $v_j(x)$ is a polynomial. Then, by dividing $v(x)$ by $\gcd(v_1(x),\ldots,v_n(x))$, we may further assume that $v_1(x),\ldots,v_n(x)$ are coprime.
Since $v_1(x),\ldots,v_n(x)$ are coprime, Bézout's identity gives polynomials $a_1(x),\ldots,a_n(x)\in F[x]$ with $\sum_j a_j(x)v_j(x)=1$; evaluating at $\lambda$ shows that $v_j(\lambda)$ must be nonzero for some $j$. Let $h:F[x]\to E$ be the evaluation homomorphism determined by $h(x)=\lambda$ and $h(a)=a$ for all $a\in F$. For any vector $w(x)\in F[x]^n$, let $h(w)$ be the application of $h$ to $w$ entrywise. Then $(A+\lambda B)v(\lambda)=h\left((A+xB)v(x)\right)=h(0)=0$ but $v(\lambda)\ne0$. This contradicts the assumption that $A+\lambda B$ is invertible over $E$. Hence $A+xB$ must be invertible over $Q=F(x)$. Write $(A+xB)^{-1}$ as $\frac{1}{q(x)}P(x)$, where $q(x)\in F[x]\setminus\{0\}$ and $P(x)\in M_n(F[x])$. Since $F$ is an infinite field, we can always pick some $c\in F$ such that $q(c)\ne0$. (The degree of $q$ can actually be shown to be $\le n$; therefore it suffices to assume only that $|F|>n$, but to keep our answer short, we shall skip the proof of $\deg q\le n$ here.) Then $\frac{1}{q(c)}P(c)(A+cB)=I$. Hence $A+cB$ is invertible.
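If it helps, the last step of Proof 1 can be played with in sympy: invert $A+xB$ over $\mathbb Q(x)$, read off a common denominator $q(x)$, and pick $c$ with $q(c)\ne0$. (The matrices below are hypothetical sample data.)

```python
# Sketch of the final step of Proof 1 over F = Q (sample data only).
from sympy import Matrix, symbols, together, fraction, lcm, simplify

x = symbols('x')
A = Matrix([[0, 1], [2, 3]])
B = Matrix([[1, 0], [1, 1]])

Minv = (A + x * B).inv()    # inverse over the rational function field Q(x)

# A common denominator q(x), so that Minv = P(x)/q(x) with P polynomial.
q = lcm([fraction(together(entry))[1] for entry in Minv])
print(q)                    # here q divides det(A + xB) = x**2 + 2*x - 2

# q has finitely many roots, so some c in F avoids them.
c = next(c for c in range(10) if q.subs(x, c) != 0)
print(simplify((A + c * B).inv() - Minv.subs(x, c)))  # zero matrix
```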
Proof 2. (Here $F=\mathbb R$, $E=\mathbb C$ and $A+iB$ is invertible.) Suppose, on the contrary, that $A+cB$ is singular for every real number $c$. Let $c_1,c_2,\ldots,c_{n+1}$ be any $n+1$ distinct real numbers. Then for each $j$ we may pick a nonzero vector $\mathbf v_j$ such that $(A+c_jB)\mathbf v_j=0$. Let $k\ge2$ be the smallest integer such that $\mathbf v_1,\mathbf v_2,\ldots,\mathbf v_k$ are linearly dependent (such a $k$ exists because $\mathbf v_1,\mathbf v_2,\ldots,\mathbf v_{n+1}$ must be linearly dependent in $\mathbb R^n$). Write $\mathbf v_k$ as a linear combination $\sum_{j=1}^{k-1}w_j\mathbf v_j$ of the other $\mathbf v_j$s. By removing from the list any $\mathbf v_j$ whose coefficient $w_j$ is zero (and relabelling), we may assume that each $w_j$ is nonzero. So, if we define $V=\pmatrix{\mathbf v_1&\mathbf v_2&\cdots&\mathbf v_k}$, then $\ker(V)=\operatorname{span}\{\mathbf w\}$, where $\mathbf w=(w_1,\ldots,w_{k-1},-1)^T$ is an entrywise nonzero vector; the kernel is one-dimensional because $\mathbf v_1,\ldots,\mathbf v_{k-1}$ are linearly independent by the minimality of $k$.
Let $C=\operatorname{diag}(c_1,c_2,\ldots,c_k)$. By construction, we have $AV=-BVC$. Since $C$ is a diagonal matrix with distinct diagonal entries and $\mathbf w$ is entrywise nonzero, $C\mathbf w$ and $\mathbf w$ are linearly independent (if $C\mathbf w=\mu\mathbf w$ for some $\mu$, then $c_jw_j=\mu w_j$ and hence $c_j=\mu$ for every $j$, contradicting the distinctness of the $c_j$s). Therefore, if
$$
\pmatrix{C&I_k\\ I_k&-C}\pmatrix{p\mathbf w\\ q\mathbf w}=\pmatrix{r\mathbf w\\ s\mathbf w}
$$
for some $p,q,r,s\in\mathbb R$, i.e., if $pC\mathbf w=(r-q)\mathbf w$ and $-qC\mathbf w=(s-p)\mathbf w$, we must have $p=q=r=s=0$. It follows that for any nonzero vector
$$
\pmatrix{\mathbf x\\ \mathbf y}\in\ker\left[\pmatrix{V\\ &V}\pmatrix{C&I_k\\ I_k&-C}\right]
$$
(such a vector exists because the rank of this product is at most $\operatorname{rank}\operatorname{diag}(V,V)=2(k-1)<2k$), at least one of $\mathbf x$ and $\mathbf y$ is not a scalar multiple of $\mathbf w$: the kernel condition says that $C\mathbf x+\mathbf y$ and $\mathbf x-C\mathbf y$ lie in $\ker V=\operatorname{span}\{\mathbf w\}$, so if both $\mathbf x=p\mathbf w$ and $\mathbf y=q\mathbf w$, the displayed implication above would force $\mathbf x=\mathbf y=0$. Equivalently, at least one of $V\mathbf x$ and $V\mathbf y$ is nonzero.
By our choice of $\mathbf x$ and $\mathbf y$, we have $VC\mathbf x=-V\mathbf y$ and $VC\mathbf y=V\mathbf x$. Therefore $VC(\mathbf x+i\mathbf y)=-V\mathbf y+iV\mathbf x=iV(\mathbf x+i\mathbf y)$, i.e., $VC\mathbf z=iV\mathbf z$, where $\mathbf z=\mathbf x+i\mathbf y$. Since at least one of $V\mathbf x$ and $V\mathbf y$ is nonzero, $V\mathbf z=V\mathbf x+iV\mathbf y$ is nonzero. Therefore
$$
(A+iB)V\mathbf z=(AV)\mathbf z+iBV\mathbf z=-BVC\mathbf z+iBV\mathbf z=-B(iV\mathbf z)+iBV\mathbf z=0,
$$
which contradicts the assumption that $A+iB$ is invertible. Hence $A+cB$ must be invertible for some real number $c$.
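Finally, the linear-algebra core of Proof 2 (from a kernel vector of $\operatorname{diag}(V,V)\pmatrix{C&I_k\\ I_k&-C}$ to a $\mathbf z$ with $VC\mathbf z=iV\mathbf z$ and $V\mathbf z\ne0$) can be checked on toy data with sympy; $V$, $C$ and $\mathbf w$ below are hypothetical sample choices.

```python
# Toy check of the key construction in Proof 2 (sample data only).
from sympy import Matrix, diag, eye, zeros, I, simplify

k = 3
V = Matrix([[2, -1, 0], [0, 3, -2]])  # ker V = span{(1, 2, 3)^T}, entrywise nonzero
C = diag(1, 2, 3)                     # distinct diagonal entries c_1, c_2, c_3

blockV = Matrix([[V, zeros(2, k)], [zeros(2, k), V]])
blockC = Matrix([[C, eye(k)], [eye(k), -C]])
M = blockV * blockC

xy = M.nullspace()[0]                 # a nonzero kernel vector (x; y)
xvec, yvec = xy[:k, 0], xy[k:, 0]
z = xvec + I * yvec

print(simplify(V * C * z - I * (V * z)))  # zero vector: VCz = iVz
print(V * z)                              # nonzero: Vz != 0
```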