The problem is that you can't work with $E[g(X)]$ at all if this quantity is not even defined.
So what does it take to define the expectation of a random variable? If $Y \geq 0$ is measurable, then we can always define $E[Y]$, though it may be infinite.
For a general $Y$ we can split it up into $Y= Y^+ - Y^-$, where $Y^+ = \max\{Y,0\}$ and $Y^- = \max\{-Y,0\}$. Since $Y^+$ and $Y^-$ are nonnegative, we can always define $E[Y^+]$ and $E[Y^-]$.
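It is worth recording the basic pointwise identities behind this split (immediate from the definitions, and used again below):
$$Y = Y^+ - Y^-, \qquad |Y| = Y^+ + Y^-, \qquad Y^+\,Y^- = 0.$$
For instance, if $Y(\omega) = -3$, then $Y^+(\omega) = 0$ and $Y^-(\omega) = 3$.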
It is then natural to define $E[Y] := E[Y^+] - E[Y^-]$. This is exactly what we do, and it makes sense as long as we haven't just written $\infty - \infty$; as long as at least one of the two terms is finite, $E[Y]$ is well defined. Now back to $E[g(X)]$: $Y := g(X)$ is a random variable, and if $g \geq 0$ there is nothing to worry about, but if $g$ is nasty enough it can happen that $E[Y^+] = E[Y^-] = \infty$ even when $E[|X|] < \infty$.
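For a concrete instance of this failure (an example added here for illustration; it is not part of the original argument): take $X \sim \mathrm{Uniform}[-1,1]$ and $g(x) = 1/x$ (the value at $x = 0$ doesn't matter, since $\{X = 0\}$ is a null set). Then $E[|X|] = \tfrac{1}{2} < \infty$, yet
$$E[g(X)^+] = \int_0^1 \frac{1}{x}\cdot\frac{1}{2}\,dx = \infty, \qquad E[g(X)^-] = \int_{-1}^0 \left(-\frac{1}{x}\right)\cdot\frac{1}{2}\,dx = \infty,$$
so $E[g(X)]$ would be $\infty - \infty$ and is left undefined.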
So how can we tell if $E[g(X)]$ exists? The easiest route is to show $E[|g(X)|] < \infty$; since $|g(X)| = g(X)^+ + g(X)^-$, this is equivalent to having both $E[g(X)^+] < \infty$ and $E[g(X)^-] < \infty$, and since $|g(X)|$ is nonnegative, we know $E[|g(X)|]$ is at least defined. Less commonly, one shows directly that at least one of $E[g(X)^+]$ and $E[g(X)^-]$ is finite. Here is where the fact that $X$ has a pdf comes in: if $h \geq 0$, then $E[h(X)]$ is always defined and equals $\int_{-\infty}^{\infty} h(x) f_X(x)\,dx$. Applying this with $h = |g|$, $h = g^+$, or $h = g^-$ lets us actually compute $E[|g(X)|]$, $E[g(X)^+]$, or $E[g(X)^-]$ and check that at least one is finite.
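If you want to carry out this check mechanically, here is a minimal sketch in Python using sympy; the particular distributions and choices of $g$ are illustrative assumptions, not part of the argument above:

```python
import sympy as sp

x = sp.symbols('x', positive=True)

# Example 1: X ~ Uniform[-1, 1], g(x) = 1/x (the bad case above).
# By symmetry it is enough to check the positive part:
#   E[g(X)^+] = integral over (0, 1] of (1/x) * (1/2) dx
E_g_plus = sp.integrate((1 / x) * sp.Rational(1, 2), (x, 0, 1))
print(E_g_plus)  # oo -- the negative part diverges too, so E[g(X)] is undefined

# Example 2: X ~ N(0, 1), g(x) = x**3.
# E[|g(X)|] = integral of |x|^3 * phi(x) dx = 2 * integral over (0, oo) of x^3 * phi(x) dx
phi = sp.exp(-x**2 / 2) / sp.sqrt(2 * sp.pi)  # standard normal density
E_abs_g = 2 * sp.integrate(x**3 * phi, (x, 0, sp.oo))
print(E_abs_g)  # 2*sqrt(2)/sqrt(pi), about 1.60 -- finite, so E[X^3] exists
```

In the first case the check fails and $E[g(X)]$ simply does not exist; in the second, $E[|g(X)|] < \infty$ guarantees $E[X^3]$ is defined (and it equals $0$ by symmetry).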