In every ODE textbook I have looked at, absolute values that arise during integration of the dependent variable are carefully respected, but absolute values that arise during integration with respect to the independent variable are dropped. Dropping the absolute values on the independent variable seems to make things much easier, and it works out in the end. However, I can't find anywhere that this strategy is explicitly discussed and justified. I'm hoping to see an argument for why it is always okay.
Here is an example: $y'=y/t$. If we solve via separation of variables we quickly arrive at $\ln |y|=\ln|t|+c$ if we are diligent about absolute values everywhere. Then solving for $y$ we would get
$ \begin{align*} |y|&=e^ce^{\ln|t|}\\ |y|&=e^c|t|\\ y&=\pm e^c|t|\\ y&=c|t|, \text{ noting that } y=0 \text{ is a solution} \end{align*} $
If we ignore the absolute value when integrating with respect to $t$, and start out with $\ln|y|=\ln t+c$ then with those same steps we get
$ \begin{align*} |y|&=e^ce^{\ln t}\\ |y|&=e^ct\\ y&=\pm e^ct\\ y&=ct, \text{ noting that } y=0 \text{ is a solution} \end{align*} $
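As a sanity check (using SymPy; the symbol names are my own choices), SymPy's `dsolve` also returns the solution without absolute values, and one can confirm symbolically that $y=ct$ satisfies the ODE for all $t\neq 0$:

```python
import sympy as sp

t = sp.symbols('t')
c = sp.symbols('c')
y = sp.Function('y')

# SymPy's dsolve also drops the absolute values and returns y(t) = C1*t
sol = sp.dsolve(sp.Eq(y(t).diff(t), y(t)/t), y(t))
print(sol)

# Verify that the returned solution satisfies y' = y/t
assert sp.simplify(sol.rhs.diff(t) - sol.rhs / t) == 0

# Verify symbolically that y = c*t is a solution wherever t != 0
assert sp.simplify(sp.diff(c * t, t) - (c * t) / t) == 0
```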
Both are perfectly valid solutions and any textbook I've seen would have started from $\ln|y|=\ln t+c$. Can you always get away with that? Why?
It also doesn't sit well with me that ignoring the absolute values initially costs you generality, which somehow returns at the end. For example, $\ln|y|=\ln t+c$ and $|y|=e^ce^{\ln t}$ define implicit solutions of the ODE only for $t>0$, but then suddenly $y=\pm e^ct$ is back to being a fully general solution of the ODE.
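For what it's worth, a quick numerical spot-check (the constant $c=2$ and the test points are arbitrary choices) confirms that both families, $y=c|t|$ and $y=ct$, satisfy $y'=y/t$ at negative $t$, which is exactly the regime the intermediate steps seemed to exclude:

```python
# Numerical spot-check that both y = c*|t| and y = c*t satisfy y' = y/t
# for t < 0 (c = 2.0 and the test points below are arbitrary choices).
c = 2.0
h = 1e-6
for t0 in (-3.0, -0.5, -10.0):
    for f in (lambda u: c * abs(u), lambda u: c * u):
        deriv = (f(t0 + h) - f(t0 - h)) / (2 * h)  # central difference
        assert abs(deriv - f(t0) / t0) < 1e-6
print("both y = c|t| and y = ct satisfy the ODE for t < 0")
```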