
Analysis of Finite Word-Length Effects in Fixed-Point Systems

Abstract: Systems based on fixed-point arithmetic, when carefully designed, seem to behave as their infinite-precision analogues. Most often, however, this is only a macroscopic impression: finite word-lengths inevitably approximate the reference behavior, introducing quantization errors, and confine the macroscopic correspondence to a restricted range of input values. Understanding these differences is crucial to designing optimized fixed-point implementations that will behave "as expected" upon deployment. Thus, in this chapter, we survey the main approaches proposed in the literature to model the impact of finite precision in fixed-point systems. In particular, we focus on the rounding errors introduced by reducing the number of least significant bits in signals and coefficients during the so-called quantization process.
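The rounding errors mentioned in the abstract can be made concrete with a small sketch (not taken from the chapter; the Qm.f format, values, and function names below are illustrative assumptions): quantizing a real value to a signed fixed-point format with f fractional bits and observing that the round-to-nearest error stays within half a quantization step, 2^-f / 2.

```python
import math

def quantize(x, f=7, total_bits=8, mode="round"):
    """Quantize x to a signed fixed-point format with f fractional bits.

    Illustrative sketch: an 8-bit Q1.7 format by default, with either
    round-to-nearest or truncation (dropping the least significant bits),
    and saturation at the representable range.
    """
    step = 2.0 ** -f                       # quantization step q = 2^-f
    lo = -(2 ** (total_bits - 1)) * step   # most negative representable value
    hi = (2 ** (total_bits - 1) - 1) * step
    if mode == "round":
        k = round(x / step)                # round to nearest code
    else:
        k = math.floor(x / step)           # truncation: drop LSBs
    return min(max(k * step, lo), hi)      # saturate to the format's range

x = 0.3
xq = quantize(x)                           # Q1.7: step = 1/128
err = x - xq
# Away from saturation, |err| <= 2^-f / 2 for rounding,
# and 0 <= x - xq < 2^-f for truncation.
assert abs(err) <= 2.0 ** -7 / 2
```

Comparing the two modes on the same signal shows why the chapter treats them separately: truncation errors are biased (always non-negative here), whereas round-to-nearest errors are roughly zero-mean.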

Cited literature: 149 references
Contributor: Olivier Sentieys
Submitted on: Monday, February 4, 2019 - 4:53:48 PM
Last modification on: Tuesday, October 19, 2021 - 10:36:34 PM
Long-term archiving on: Sunday, May 5, 2019 - 4:39:10 PM





Daniel Ménard, Gabriel Caffarena, Juan Antonio Lopez, David Novo, Olivier Sentieys. Analysis of Finite Word-Length Effects in Fixed-Point Systems. In: Shuvra S. Bhattacharyya (Ed.), Handbook of Signal Processing Systems, pp. 1063-1101, 2019, ISBN 978-3-319-91733-7. ⟨10.1007/978-3-319-91734-4_29⟩. ⟨hal-01941888⟩


