I am trying to understand the derivation of the relative entropy formula, but I have trouble with the reasoning and intuition behind the modification made to the differential entropy formula.
My current understanding is as follows: the entropy of a discrete distribution, as defined by Shannon, is derived from the expected information content, or surprise:
$$H(X) = \mathbb{E}[-\log p(X)] = -\sum_{x \in X} p(x) \log p(x) $$
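To make sure I read this correctly, here is a small numerical sanity check I wrote (my own illustration, not from any reference): computing the entropy once as the explicit sum and once as the average surprise gives the same value.

```python
import numpy as np

# a small discrete distribution (probabilities sum to 1)
p = np.array([0.5, 0.25, 0.125, 0.125])

surprise = -np.log2(p)                 # information content -log p(x) of each outcome
H_expectation = np.sum(p * surprise)   # expected surprise E[-log p(X)]
H_sum = -np.sum(p * np.log2(p))        # the explicit sum -sum p(x) log p(x)

print(H_expectation, H_sum)            # both give 1.75 bits
```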
Shannon then assumed that this definition could simply be generalized to continuous distributions by replacing the summation with an integral. However, this turned out not to be a proper generalization, as the resulting quantity does not share all properties of the discrete entropy (e.g. it can be negative and is not invariant under a change of variables). It is known as the differential entropy:
$$H(X) = -\int_{-\infty}^\infty p(x) \log p(x) ~\mathrm dx $$
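For example (again my own sanity check, assuming a uniform density on an interval of width $1/2$), the differential entropy can indeed come out negative, which is impossible for the discrete entropy:

```python
import numpy as np
from scipy.integrate import quad

# uniform density on [0, 0.5]: p(x) = 2 on its support, 0 elsewhere
p = lambda x: 2.0

# differential entropy H(X) = -∫ p(x) log p(x) dx over the support
h, _ = quad(lambda x: -p(x) * np.log(p(x)), 0.0, 0.5)
print(h)   # ≈ -0.693 = -log 2, i.e. negative
```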
Jaynes argued that the correct formula should look like this:
$$ H_N(X) \rightarrow \log N - \int p(x) \log \frac{p(x)}{m(x)} ~\mathrm dx \quad \text{as } N \rightarrow \infty $$
where $N$ is the number of discrete points in an interval $x \in ]a, b[$ and, as $N$ approaches infinity, the density of these points converges to a function $m(x)$, the so-called limiting density of discrete points. The quantity known today as the relative entropy is then obtained, up to its sign, by omitting the first term:
$$H(X) = -\int p(x) \log \frac{p(x)}{m(x)} ~\mathrm dx $$
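To convince myself of the limit, I tried the following numerical check (my own sketch, with a simple linear density for $p(x)$ and $m(x)$ taken as the uniform density on the interval): the discrete entropy $H_N$ of the quantized distribution does seem to track $\log N - \int p(x) \log \frac{p(x)}{m(x)} ~\mathrm dx$.

```python
import numpy as np
from scipy.integrate import quad

a, b = 0.0, 1.0
p = lambda x: 1.5 - x          # a simple linear density on (a, b)
m = lambda x: 1.0 / (b - a)    # uniform limiting density of the N points

# the integral ∫ p(x) log(p(x)/m(x)) dx, evaluated once
integral, _ = quad(lambda x: p(x) * np.log(p(x) / m(x)), a, b)

for N in [10, 100, 1000, 10000]:
    # N quantization points spread according to m(x) (here: equally spaced)
    x = a + (np.arange(N) + 0.5) * (b - a) / N
    probs = p(x) / p(x).sum()                # p quantized onto the N points
    H_N = -np.sum(probs * np.log(probs))     # discrete entropy of the quantized p
    print(N, H_N, np.log(N) - integral)      # the two columns agree and grow like log N
```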
Questions
I do not understand the reasoning and intuition behind introducing $m(x)$ at this point in the formula. Also, why $m(x)$ is restricted to an interval is a mystery to me.
The omission of the $\log N$ term is understandable from a computational point of view, as the term diverges to infinity and thus all distributions would have infinite entropy. But what is the intuitive justification for this omission?
The relative entropy is just another name for the KL divergence, a measure of how much one distribution $p(x)$ diverges from another $m(x)$. How is using the idea of average surprise (the original idea) as a measure of (dis)similarity between distributions justified?
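For what it is worth, here is a small numerical experiment I did (my own sketch, using SciPy's `norm` and `quad`): the integral $\int p(x) \log \frac{p(x)}{m(x)} ~\mathrm dx$ is zero when the two distributions coincide and grows as they move apart, so it behaves more like a distance than a similarity score, which only adds to my confusion about the "average surprise" reading.

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

def relative_entropy(p, m):
    """Numerically evaluate ∫ p(x) log(p(x)/m(x)) dx on a finite range."""
    integrand = lambda x: p.pdf(x) * np.log(p.pdf(x) / m.pdf(x))
    val, _ = quad(integrand, -10, 10)   # wide enough for these Gaussians
    return val

p = norm(loc=0, scale=1)
for mu in [0.0, 0.5, 1.0, 2.0]:
    m = norm(loc=mu, scale=1)
    # closed form for two unit-variance Gaussians is mu**2 / 2
    print(mu, relative_entropy(p, m))   # ≈ 0.0, 0.125, 0.5, 2.0
```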
I apologize if this is a trivial question to some of you, but I really do struggle to wrap my mind around it. I appreciate your help!