Online Quantization Adaptation for Fault-Tolerant Neural Network Inference

Authored by
Michael Beyer, Jan Micha Borrmann, Andre Guntoro, Holger Blume
Abstract

Neural networks (NNs) are commonly used for environmental perception in autonomous driving applications. In such systems, safety plays a crucial role alongside performance and efficiency. Since NNs exhibit enormous computational demands, safety measures that rely on traditional spatial or temporal redundancy to mitigate hardware (HW) faults are far from ideal. In this paper, we combine algorithmic properties with dedicated HW features to achieve lightweight fault tolerance. We leverage the fact that many NNs maintain their accuracy when quantized to lower bit widths, and we adapt their quantization configuration at runtime to counteract HW faults. Instead of masking computations performed on faulty HW, we introduce a fail-degraded operating mode: rather than fully losing compute capability, faulty units continue to execute NN operations at reduced precision. This allows us to retain important synapses of the network and thus preserve its accuracy. The HW overhead of our method is minimal because we reuse existing HW features that were originally implemented for functional reasons. To demonstrate the effectiveness of our method, we simulate permanent HW faults in an NN accelerator and evaluate the impact on the network's classification performance. With our method, the NN's accuracy is preserved even at higher error rates, whereas without it the network completely loses its prediction capability. Accuracy drops in our experiments range from a few percent to a maximum of 10 %, confirming the improved fault tolerance of the system.
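The fail-degraded idea in the abstract can be sketched in a few lines: a multiply-accumulate (MAC) lane that is known to be faulty keeps computing at a reduced quantization bit width instead of being masked to zero, so the corresponding synapse still contributes to the result. This is a minimal, hypothetical illustration, not the authors' implementation; the lane model, bit widths, and function names are all assumptions.

```python
# Hypothetical sketch of a "fail-degraded" MAC: faulty lanes fall back to
# reduced-precision quantization instead of being masked (zeroed out).
# Bit widths and the per-lane fault model are illustrative assumptions.

def quantize(x, bits):
    """Symmetric uniform quantization of a value in [-1, 1) to `bits` bits."""
    levels = 1 << (bits - 1)
    q = max(-levels, min(levels - 1, round(x * levels)))
    return q / levels

def mac(weights, acts, faulty_lanes=(), mode="degraded",
        full_bits=8, degraded_bits=4):
    """Multiply-accumulate across lanes.

    Fault-free lanes compute at `full_bits`. A faulty lane either
    contributes nothing ("masked") or computes at `degraded_bits`
    precision ("degraded"), preserving that synapse's contribution.
    """
    acc = 0.0
    for i, (w, a) in enumerate(zip(weights, acts)):
        if i in faulty_lanes:
            if mode == "masked":
                continue  # computation on faulty HW is dropped entirely
            acc += quantize(w, degraded_bits) * quantize(a, degraded_bits)
        else:
            acc += quantize(w, full_bits) * quantize(a, full_bits)
    return acc

# Example: lane 0 (which carries a large weight) is permanently faulty.
w = [0.9, -0.5, 0.3, 0.7]
a = [0.6, 0.8, -0.4, 0.2]
exact = sum(x * y for x, y in zip(w, a))
degraded = mac(w, a, faulty_lanes={0}, mode="degraded")
masked = mac(w, a, faulty_lanes={0}, mode="masked")
# The degraded result stays close to the exact dot product, while masking
# loses the large contribution of lane 0 and lands far from it.
print(exact, degraded, masked)
```

Under these assumptions, the degraded-mode error is a small quantization error, whereas masking removes the synapse entirely, which mirrors the paper's point that reduced-precision computation preserves accuracy better than losing compute capability.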

Organisation(s)
Architectures and Systems Section
External Organisation(s)
Robert Bosch GmbH
Type
Contribution to book/anthology
Pages
243–256
No. of pages
14
Publication date
2023
Publication status
Published
Peer reviewed
Yes
ASJC Scopus subject areas
Theoretical Computer Science, General Computer Science
Electronic version(s)
https://doi.org/10.1007/978-3-031-40923-3_18 (Access: Closed)