There is a classic book by Wilkinson ("Rounding Errors in Algebraic Processes"), but a reviewer on Amazon writes:

"Unfortunately, the book discusses these critical topics in terms of 1963 technology. A modern eye finds it jarring to see computer arithmetic described in base 10, when binary is the natural language of machines. Wilkinson discusses fixed-point arithmetic, which is still used heavily in computing for multimedia and embedded systems. His approach is haphazard, though; a more systematic development would have served fixed-point users much better. In its day, I'm sure this book was current and practical. It's easy to see how the author has distilled more complex discussions into applicable analysis. A lot has happened between that day and this, however. Techniques have developed (and are still being developed), and expository techniques have improved for the topics that the book does cover. IEEE floating point has since become the standard, and demands discussion of its own. The modern reader looking for a practical guide should keep looking."
Is there a modern book on the same subject that treats these topics (including IEEE floating point)?
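
For concreteness, here is a minimal sketch of the kind of IEEE 754 rounding behavior I would want such a book to explain (Python, assuming the platform float is an IEEE 754 binary64, as it is in CPython on common hardware):

    # 0.1 and 0.2 have no exact binary64 representation, so each is
    # rounded on input; the sum is rounded again and lands just above 0.3.
    print(0.1 + 0.2 == 0.3)       # False
    print(f"{0.1 + 0.2:.17g}")    # 0.30000000000000004

    # repr() prints the shortest decimal that round-trips to the stored
    # value, which hides the underlying representation error:
    print(repr(0.1))              # 0.1
    print(f"{0.1:.20f}")          # 0.10000000000000000555...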