The best simple upper bound depends on how big the edge probability $p$ is.
In all cases, we start from the exact formula (which is shown in the question you link to): $$\sum_{k=2}^n \frac{n^{\underline{k}} p^k}{k}$$ where by $n^{\underline{k}}$ I mean the falling power $n (n-1) \dotsb (n-k+1)$.
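(If you want to experiment numerically, here is a minimal Python sketch of that exact sum; the function name `expected_cycles` is mine, not something from the linked question.)

```python
def expected_cycles(n, p):
    """Evaluate the exact sum: sum_{k=2}^{n} n^{falling k} * p^k / k."""
    total = 0.0
    falling = n * (n - 1)  # n^{falling 2} = n(n-1)
    pk = p * p             # p^2
    for k in range(2, n + 1):
        total += falling * pk / k
        falling *= n - k   # extend the falling power by one factor
        pk *= p
    return total
```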
If $np < 1$, then most of the contribution comes from short cycles, and it makes sense to use the upper bound $n^{\underline k} \le n^k$ and extend the sum to $\infty$. This gives the upper bound $$\sum_{k=2}^\infty \frac{(np)^k}{k} = \log \frac1{1-np} - np.$$
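(As a quick sanity check, again my own and reusing the `expected_cycles` sketch above, the exact sum does sit below this bound for a sample value of $np < 1$:)

```python
from math import log

n, p = 1000, 0.0005                   # np = 0.5 < 1
exact = expected_cycles(n, p)         # exact sum, sketch above
bound = log(1 / (1 - n * p)) - n * p  # log(1/(1-np)) - np
print(exact <= bound)                 # True, since n^{falling k} <= n^k
```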
This is valid for all $np < 1$ (past that, the infinite sum doesn't converge), but it stops being reasonable around $np = 1 - \frac1n$, the point at which extending the sum to infinity becomes a bad idea; past that, a different approach makes more sense.
Let $k^*$ be the value of $k$ maximizing the numerator $n^{\underline k} p^k$. Then we have $$\sum_{k=2}^n \frac{n^{\underline{k}} p^k}{k} \le \sum_{k=2}^n \frac{n^{\underline{k^*}} p^{k^*}}{k} = n^{\underline{k^*}} p^{k^*} \sum_{k=2}^n \frac1k < n^{\underline{k^*}} p^{k^*} \log n.$$
I claim that $k^* = \lceil n - 1/p\rceil$. To see this, consider the ratio $n^{\underline k} p^k : n^{\underline{k+1}} p^{k+1} = 1/p : n-k$. As long as this ratio is less than $1$ (i.e. while $k < n - 1/p$), incrementing $k$ by $1$ increases the value of $n^{\underline k} p^k$, so we should stop at exactly the point where the ratio first reaches $1$ or more, which is $k = \lceil n - 1/p\rceil$.
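(One can check the claim by brute force; this is my own sketch, with the numbers chosen so that $n - 1/p$ is not an integer and there are no ties:)

```python
from math import ceil

def argmax_term(n, p):
    """Brute-force maximizer of n^{falling k} * p^k over k = 2..n."""
    best_k, best_val = 2, n * (n - 1) * p * p
    falling, pk = n * (n - 1), p * p
    for k in range(2, n + 1):
        val = falling * pk
        if val > best_val:
            best_k, best_val = k, val
        falling *= n - k
        pk *= p
    return best_k

n, p = 100, 0.03                           # n - 1/p = 66.67
print(argmax_term(n, p), ceil(n - 1 / p))  # both print 67
```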
So at this point, we can once again say $n^{\underline k} \le n^k$ and get an upper bound of $$(np)^{k^*} \log n$$ on the expectation. Since $n - 1/p \le k^* < n - 1/p + 1$, this is at most $(np)^{n+1-1/p} \log n$ if $np \ge 1$, and at most $(np)^{n-1/p} \log n$ if $np \le 1$.
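(Putting the pieces together, one more hedged numerical check, reusing both sketches above:)

```python
from math import ceil, log

n, p = 100, 0.03                       # np = 3 >= 1
kstar = ceil(n - 1 / p)                # k* = 67
bound = (n * p) ** kstar * log(n)      # (np)^{k*} log n
print(expected_cycles(n, p) <= bound)  # True
```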
The bound $(np)^{k^*} \log n$ is always valid, but we can also be more careful in bounding $n^{\underline k}$ (e.g. using Stirling's formula); whether that's worth it depends on your use case.