All of it is hidden in conditional expectation: in modern probability, conditional probability is defined as a special case of conditional expectation.
$p(A)= E(I_A)=E\big(E(I_A|X)\big)$
By definition of conditional expectation, $E(I_A|X)$ is a function of $X$ (it is $\sigma(X)$-measurable and satisfies the projection property), so
$E\big(E(I_A|X)\big)= \int E(I_A|X=x)\, f_X(x)\,dx=\int p(A|X=x)\, f_X(x)\,dx .$
Hence $p(A)=E(I_A)=\int p(A|X=x)\, f_X(x)\,dx$, and, replacing $A$ by $AB$ (or by $B$), also
$E(I_{AB})=\int p(AB|X=x)\, f_X(x)\,dx$ $\hspace{.5cm}$ (1)
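To make this concrete, here is a toy example of my own (not part of the question): let $X\sim\mathrm{Uniform}(0,1)$ and, given $X=x$, flip two conditionally independent coins, each landing heads with probability $x$; let $A$ be "the first coin is heads" and $B$ "the second coin is heads". Then $p(A|X=x)=x$, $p(B|X=x)=x$, $p(AB|X=x)=x^2$, and the identity above gives
$p(A)=\int_0^1 x\,dx=\tfrac12, \qquad p(AB)=\int_0^1 x^2\,dx=\tfrac13, \qquad p(B)=\tfrac12 .$
(Any $X$ with density $f_X$ works the same way; this particular choice is only for illustration.)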
Note that in modern probability we have the following definition (it is the same for continuous and discrete variables; it does not depend on the type of $Y$):
$E(Y|B)=\frac{E(YI_B)}{E(I_B)}$ $\hspace{.5cm}$ (2)
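For instance (again just an illustration of my own): if $Y$ is a fair die and $B=\{Y\text{ is even}\}$, then $E(YI_B)=\frac{2+4+6}{6}=2$ and $E(I_B)=\frac12$, so (2) gives $E(Y|B)=4$, which is exactly the elementary average of $\{2,4,6\}$.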
So
$p(A|B)=E(I_A|B)\overset{(2)}{=}\frac{E(I_AI_B)}{E(I_B)}=\frac{E(I_{AB})}{E(I_B)}=\frac{p(AB)}{p(B)}\overset{(1)}{=}\frac{\int p(AB|X=x)\, f_X(x)\,dx}{\int p(B|X=x)\, f_X(x)\,dx} .$
That already settles the question, since you can compute
$p(AB)=E(I_{AB})=\int p(AB|X=x)\, f_X(x)\,dx$
and likewise $p(B)=\int p(B|X=x)\, f_X(x)\,dx$.
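In the toy coin example above (still only an illustration), this ratio is
$p(A|B)=\frac{\int_0^1 x^2\,dx}{\int_0^1 x\,dx}=\frac{1/3}{1/2}=\frac23 .$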
But to go one step further,
$$\begin{aligned}
\frac{\int p(AB|X=x)\, f_X(x)\,dx}{\int p(B|X=x)\, f_X(x)\,dx}
&=\frac{\int p(AB|X=x)\, f_X(x)\,dx}{p(B)}
=\int\frac{p(AB|X=x)}{p(B)}\, f_X(x)\,dx \\
&=\int\frac{p(AB|X=x)}{p(B)\,p(B|X=x)}\, p(B|X=x)\, f_X(x)\,dx
=\int\frac{p(AB|X=x)}{p(B|X=x)}\, \frac{p(B|X=x)\,f_X(x)}{p(B)}\,dx \\
&=\int p(A|B,X=x)\, \frac{p(B|X=x)\,f_X(x)}{p(B)}\,dx
=\int p(A|B,X=x)\, f_X(x|B)\,dx ,
\end{aligned}$$
where the last two steps use $p(A|B,X=x)=\frac{p(AB|X=x)}{p(B|X=x)}$ and the Bayes-type identity $f_X(x|B)=\frac{p(B|X=x)\,f_X(x)}{p(B)}$.
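To close the loop on the toy example (again only an illustration): there $f_X(x|B)=\frac{x\cdot 1}{1/2}=2x$ on $(0,1)$ and $p(A|B,X=x)=\frac{x^2}{x}=x$, so the last integral is $\int_0^1 x\cdot 2x\,dx=\frac23$, matching the value obtained from the ratio formula above.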