ApprOchs: A Memristor-Based In-Memory Adaptive Approximate Adder


Bibliographic details
Main authors: Ochs, Dominik (author); Rapp, Lukas (author); Borzyk, Leandro (author); Amirafshar, Nima (author); Taherinejad, Nima (author)
Document type: Article (Journal)
Language: English
Published: 31 January 2025
In: IEEE journal on emerging and selected topics in circuits and systems
Year: 2025, Volume: 15, Issue: 1, Pages: 105-119
ISSN:2156-3365
DOI:10.1109/JETCAS.2025.3537328
Online access: Publisher, licensed, full text: https://doi.org/10.1109/JETCAS.2025.3537328
Publisher, licensed, full text: https://ieeexplore.ieee.org/document/10859167/authors
Author statement: Dominik Ochs, Lukas Rapp, Leandro Borzyk, Nima Amirafshar, Nima TaheriNejad
Description
Abstract: As silicon scaling nears its limits and the Big Data era unfolds, in-memory computing is increasingly important for overcoming the von Neumann bottleneck and thus enhancing modern computing performance. One of the rising in-memory technologies is the memristor, a resistor capable of memorizing its state based on an applied voltage, which makes it useful for both storage and computation. Another emerging computing paradigm is approximate computing, which tolerates errors in calculations in order to reduce die area, processing time, and energy consumption. In an attempt to combine both concepts and leverage their benefits, we propose the memristor-based adaptive approximate adder ApprOchs, which can selectively compute segments of an addition either approximately or exactly. ApprOchs is designed to adapt to the given input data and thus compute only as much as is needed, a quality that current state-of-the-art (SoA) in-memory adders lack. Although it also uses OR-based approximation in the lower k bits, ApprOchs has the edge over S-SINC because it can skip the computation of the upper n−k bits for a small number of possible input combinations (2^(2k) of the 2^(2n) possible combinations skip the upper bits). Compared to SoA in-memory approximate adders, ApprOchs outperforms them in energy consumption while remaining highly competitive in error behavior, with moderate speed and area efficiency. In application use cases, ApprOchs demonstrates its energy efficiency, particularly in machine learning applications. In MNIST classification using deep convolutional neural networks, we achieve 78.4% energy savings compared to SoA approximate adders with the same accuracy as exact adders (98.9%), while for k-means clustering we observed a 69% reduction in energy consumption with no quality drop in clustering results compared to exact computation.
For image blurring, we achieve up to 32.7% energy reduction over exact computation, and in its most promising configuration (k=3), the ApprOchs adder consumes 13.4% less energy than the most energy-efficient competing SoA design (S-SINC+), while achieving similarly excellent median image quality at 43.74 dB PSNR and 0.995 SSIM.
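The adder described in the abstract can be illustrated in software. The sketch below is a hedged behavioral model, not the paper's memristor circuit: it assumes the lower k bits are approximated with a carry-free bitwise OR (as in S-SINC-style designs), the upper n−k bits are added exactly, and the upper computation is skipped when both operands' upper parts are zero, which accounts for the 2^(2k) of 2^(2n) input pairs mentioned above. The function name and exact carry handling are assumptions for illustration.

```python
def approchs_add(a: int, b: int, n: int, k: int) -> int:
    """Behavioral sketch of an adaptive OR-based approximate adder.

    Assumptions (not from the paper's circuit description):
    - lower k bits: bitwise OR, no carry generated or propagated
    - upper n-k bits: exact addition with carry-in 0
    - upper addition skipped when both upper parts are zero
    """
    mask = (1 << k) - 1
    low = (a & mask) | (b & mask)      # OR approximation of the lower k bits
    a_hi, b_hi = a >> k, b >> k
    if a_hi == 0 and b_hi == 0:        # 2**(2k) of 2**(2n) input pairs land here
        return low                     # upper n-k bits need no computation
    high = a_hi + b_hi                 # exact addition of the upper parts
    return (high << k) | low
```

For example, with n=8 and k=3, adding 20 and 10 gives the exact sum 30 because the low-order OR happens to match the exact partial sum, while 3 + 5 with k=4 yields the approximate result 7 instead of 8, showing the error introduced by dropping the lower-bit carries.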
Description: Viewed on 29.08.2025
Description: Online resource