You are making two wrong assumptions here.
The first one: "the conditional entropy should not have to be negative". That's true for a "true" Shannon entropy (conditional or not). But it's not true for differential entropies - which is what we are dealing with here. (Don't fall into the trap of thinking that differential entropy is just the entropy of a continuous variable - it's not, and this mistake leads to many paradoxes.)
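If you want to see a negative differential entropy concretely, here's a quick numerical sketch (the uniform density on $[0,\,1/2]$ is just an illustrative choice of mine):

```python
import numpy as np
from scipy.integrate import quad

# Differential entropy of the uniform density on [0, 1/2]:
# p(x) = 2 there, so h = -∫ p log p dx = -log 2 < 0.
# A "true" Shannon entropy could never be negative.
p = 2.0
h, _ = quad(lambda x: -p * np.log(p), 0, 0.5)
print(h, np.log(0.5))  # both ≈ -0.6931
```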
However, that's not the main error (your contradiction would still hold if we switched to discrete variables and replaced the integrals by sums).
Your more fundamental problem is assuming that
$$\iint p(x, y) \log \frac{p(x, y)}{p(x)} \, dxdy =D_{KL}\left[p(x, y) \Vert\, p(x)\right]\tag{1}$$
(Note: in the above equation, and in what follows, it makes no difference whether we have discrete or continuous variables, Shannon entropies or differential entropies.)
That might look right on the surface, but it's wrong. The definition of $D_{KL}\left[p \Vert\, q\right]$ requires both $p$ and $q$ to be valid probability densities (or probability mass functions) defined on the same domain, over which the integral or sum is taken.
Here, the integral is taken over ${\mathbb R}^2$ (or some subset of it) - and there $p(x)$ is not a valid density.
But, you might object, we can always regard $p(x)$ (my $q$) as defined on the $(x,y)$ plane; it just doesn't depend on $y$. But then it would not be a valid density, unless we normalized it. Let's assume, for the sake of illustration, that the domain is the rectangle $[0,A]\times[0,B]$.
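To see the normalization failure concretely, here's a small numerical sketch (the marginal $p(x)=2x$ is just a hypothetical example of mine):

```python
from scipy.integrate import dblquad

# Hypothetical marginal on [0, A]: p(x) = 2x (a valid density on [0, 1]).
A, B = 1.0, 0.5
p_x = lambda x: 2.0 * x

# Viewed as a function on the rectangle [0, A] x [0, B], p(x) integrates
# to B, not to 1 - so it is not a valid density there until we divide by B.
total, _ = dblquad(lambda y, x: p_x(x), 0, A, 0, B)
print(total)      # 0.5  (= B, not 1)
print(total / B)  # 1.0  (p(x)/B is properly normalized)
```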
Then a valid density would be $q(x,y)=p(x)/B$, and we'd have
$$\begin{align}0 &\le D_{KL}\left[p(x, y) \Vert\, p(x)/B\right]\\
&=\iint p(x,y) \log \frac{p(x,y)}{p(x)/B} \, dxdy\\
&=\iint p(x,y) \log B \,dxdy+ \iint p(x,y) \log\frac{p(x,y)}{p(x)} \, dxdy\\
&=\log B +\iint p(x,y) \log p(y|x) \, dxdy \\
&=\log B - h(y|x)
\end{align}$$
This only leads us to $h(y|x)\le \log B$ ... which is indeed true: because $h(y|x)\le h(y)$ (a property of Shannon entropy that also applies to differential entropy), and $h(y) \le \log B$ for any variable restricted to the interval $[0,B]$
(under that restriction, the uniform density maximizes the differential entropy).
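Indeed, for the uniform density $p(y)=1/B$ on $[0,B]$, a direct computation gives
$$h(y) = -\int_0^B \frac{1}{B}\log\frac{1}{B}\,dy = \log B.$$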
True, $\log B$ can be negative if $B<1$, and hence so can $h(y|x)$ - but that's fine for a differential entropy.
So, all is fine, no contradiction.
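If you want to check the whole chain numerically, here's a short sketch - the joint density $p(x,y)=\frac{8}{3}(x+y)$ on $[0,1]\times[0,1/2]$ and all the names below are just a hypothetical example of mine, chosen with $B<1$ so that the entropies come out negative:

```python
import numpy as np
from scipy.integrate import dblquad

# Hypothetical joint density on [0, A] x [0, B] with A = 1, B = 1/2:
# p(x, y) = (8/3)(x + y), which integrates to 1 on that rectangle.
A, B = 1.0, 0.5
p_xy = lambda x, y: (8/3) * (x + y)
p_x  = lambda x: (4/3) * x + 1/3       # marginal: integrate p(x,y) over y

# D_KL[p(x,y) || p(x)/B], against the properly normalized q = p(x)/B
dkl, _ = dblquad(lambda y, x: p_xy(x, y) * np.log(p_xy(x, y) / (p_x(x) / B)),
                 0, A, 0, B)

# h(y|x) = -∬ p(x,y) log p(y|x) dx dy, with p(y|x) = p(x,y)/p(x)
h_cond, _ = dblquad(lambda y, x: -p_xy(x, y) * np.log(p_xy(x, y) / p_x(x)),
                    0, A, 0, B)

print(dkl)                  # >= 0, as any KL divergence must be
print(np.log(B) - h_cond)   # equals dkl: the identity derived above
print(h_cond, np.log(B))    # h(y|x) <= log B, and both are negative here
```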