It is commonly said that a software video encoder produces higher quality output than a hardware encoder, where higher quality means better picture quality at a given bit rate.
Hardware encoders are typically built for real-time use, and some target mobile applications, so they make trade-offs to achieve real-time performance and low power consumption.
What exactly is the trade-off (e.g., which encoding algorithm parameter) in a hardware encoder that makes its quality lower than a software encoder's?
Would simply changing some encoding parameter (at the cost of more chip area and power consumption) give a hardware encoder the same quality as a software encoder?
H.264/H.265 are the codecs under consideration.
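To make the comparison concrete, here is a minimal sketch of the kind of test I have in mind: encode the same clip with a software H.264 encoder and a hardware one at the same bit rate, then score both against the source with VMAF. It assumes an ffmpeg build that includes libx264, the h264_nvenc hardware encoder, and the libvmaf filter; the preset names and filter options are version-dependent, and the file names are placeholders.

```python
# Sketch: compare a software (libx264) and a hardware (NVENC) H.264 encode
# at the same target bit rate, then score each against the source with VMAF.
# Assumes ffmpeg is on PATH and was built with libx264, h264_nvenc and libvmaf;
# encoder and preset names vary by ffmpeg version and GPU driver.
import subprocess

SOURCE = "source.y4m"      # hypothetical raw/lossless reference clip
BITRATE = "4M"             # identical target bit rate for both encoders

ENCODES = {
    # libx264 "veryslow" enables exhaustive motion search, many reference
    # frames and full rate-distortion optimization -- the kind of knobs a
    # hardware encoder restricts to stay real-time and low-power.
    "sw_x264.mp4":  ["-c:v", "libx264",    "-preset", "veryslow", "-b:v", BITRATE],
    # NVENC's highest-quality preset; still bounded by the fixed-function
    # hardware's search range, reference count and lookahead depth.
    "hw_nvenc.mp4": ["-c:v", "h264_nvenc", "-preset", "p7",       "-b:v", BITRATE],
}

def encode(outfile, codec_args):
    subprocess.run(["ffmpeg", "-y", "-i", SOURCE, *codec_args, outfile], check=True)

def vmaf(distorted):
    # libvmaf takes the distorted clip as the first input, the reference as the second;
    # the score appears in ffmpeg's log output.
    subprocess.run(
        ["ffmpeg", "-i", distorted, "-i", SOURCE, "-lavfi", "libvmaf", "-f", "null", "-"],
        check=True,
    )

for outfile, codec_args in ENCODES.items():
    encode(outfile, codec_args)
    vmaf(outfile)
```

The gap between the two VMAF scores at equal bit rate is the quality difference the question is about.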