
Noise Resilience of Reduced Precision Neural Networks

Extended Abstract. Published: 19 July 2023. DOI: 10.1145/3597031.3597058

Abstract

Reduced-precision neural networks, in which computations are performed with as little as one or two bits of precision, are starting to find relevance in a wide range of applications, including vision, speech, and natural language processing. Such networks can run at low power and cost on embedded platforms such as FPGAs (field-programmable gate arrays). Recent research has extensively studied and advanced the accuracy of these networks. However, unlike regular neural networks, little is known about how resilient they are in the presence of noisy input data. From old photographs rediscovered in an attic to images captured from thousands of miles away in space, noisy input data is a common fact of everyday life. In this study, we characterize the behavior of reduced-precision neural networks on noisy input data and identify techniques to improve their resilience. Benchmark image data is injected with different noise profiles, and the inference capabilities of reduced-precision networks (based on YOLO and DoReFa-Net) are studied and contrasted with those of full-precision networks. Experimental results show that reduced-precision networks perform well in the presence of significant levels of noise, remaining within 1-5% of the accuracy of full-precision networks. We also show that significant improvements to overall image-recognition accuracy can be achieved by creating a high-quality ensemble that combines multiple reduced-precision neural networks.
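As a rough illustration of the methodology described above, the sketch below injects noise into benchmark images and combines the outputs of several classifiers into an ensemble. It is a minimal sketch in Python with NumPy (an assumption, since the abstract includes no code), and the specific noise models (Gaussian, salt-and-pepper), parameters, and soft-voting rule are illustrative stand-ins rather than the paper's exact setup.

```python
# Minimal sketch of the experimental setup sketched in the abstract:
# (1) inject noise profiles into benchmark images, and
# (2) combine the outputs of several reduced-precision classifiers.
# The noise models and the soft-voting rule are illustrative assumptions.
import numpy as np


def add_gaussian_noise(image, sigma=0.1):
    """Add zero-mean Gaussian noise to an image with values in [0, 1]."""
    noisy = image + np.random.normal(0.0, sigma, size=image.shape)
    return np.clip(noisy, 0.0, 1.0)


def add_salt_and_pepper(image, amount=0.05):
    """Set a random fraction of pixels to pure white (salt) or black (pepper)."""
    noisy = image.copy()
    h, w = image.shape[:2]
    flip = np.random.rand(h, w) < amount   # pixels to corrupt
    salt = np.random.rand(h, w) < 0.5      # half become white, half black
    noisy[flip & salt] = 1.0
    noisy[flip & ~salt] = 0.0
    return noisy


def ensemble_predict(prob_outputs):
    """Average class probabilities from several reduced-precision networks
    and take the per-image argmax (a simple soft-voting ensemble)."""
    return np.mean(np.stack(prob_outputs), axis=0).argmax(axis=-1)


if __name__ == "__main__":
    image = np.random.rand(224, 224, 3)    # stand-in for a benchmark image
    noisy = add_salt_and_pepper(add_gaussian_noise(image, sigma=0.2), amount=0.02)

    # Stand-in class-probability outputs of three hypothetical
    # reduced-precision classifiers (batch of 4 images, 10 classes).
    outputs = [np.random.rand(4, 10) for _ in range(3)]
    print("ensemble labels:", ensemble_predict(outputs))
```

In the actual study the individual models would be quantized networks (e.g., YOLO- or DoReFa-Net-based) evaluated on noisy benchmark images, with the ensemble combining their predictions to recover accuracy lost to noise.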

References

[1] 2018. Quantized neural networks on Pynq environment. https://github.com/Xilinx/QNN-MO-PYNQ. Accessed: 2023-02-14.
[2] 2019. Xilinx Ultra96-V2 FPGA board. https://www.avnet.com/wps/portal/us/products/avnet-boards/avnet-board-families/ultra96-v2. Accessed: 2023-02-14.
[3] Michaela Blott, Thomas B. Preußer, Nicholas J. Fraser, Giulio Gambardella, Kenneth O'Brien, Yaman Umuroglu, Miriam Leeser, and Kees Vissers. 2018. FINN-R: An End-to-End Deep-Learning Framework for Fast Exploration of Quantized Neural Networks. ACM Trans. Reconfigurable Technol. Syst. 11, 3, Article 16 (Dec. 2018), 23 pages. https://doi.org/10.1145/3242897
[4] M. Everingham, L. Van Gool, C. K. I. Williams, J. Winn, and A. Zisserman. 2010. The Pascal Visual Object Classes (VOC) Challenge. International Journal of Computer Vision 88, 2 (June 2010), 303-338.
[5] Micah Gorsline, James Smith, and Cory E. Merkel. 2021. On the Adversarial Robustness of Quantized Neural Networks. CoRR abs/2105.00227 (2021). arXiv:2105.00227. https://arxiv.org/abs/2105.00227
[6] Fangzhou Liao, Ming Liang, Yinpeng Dong, Tianyu Pang, Jun Zhu, and Xiaolin Hu. 2017. Defense against Adversarial Attacks Using High-Level Representation Guided Denoiser. CoRR abs/1712.02976 (2017). arXiv:1712.02976. http://arxiv.org/abs/1712.02976
[7] Ji Lin, Chuang Gan, and Song Han. 2019. Defensive Quantization: When Efficiency Meets Robustness. CoRR abs/1904.08444 (2019). arXiv:1904.08444. http://arxiv.org/abs/1904.08444
[8] Mengchen Liu, Shixia Liu, Hang Su, Kelei Cao, and Jun Zhu. 2018. Analyzing the Noise Robustness of Deep Neural Networks. CoRR abs/1810.03913 (2018). arXiv:1810.03913. http://arxiv.org/abs/1810.03913
[9] Joseph Redmon and Ali Farhadi. 2018. YOLOv3: An Incremental Improvement. CoRR abs/1804.02767 (2018). arXiv:1804.02767. http://arxiv.org/abs/1804.02767
[10] Taylor Simons and Dah-Jye Lee. 2019. A Review of Binarized Neural Networks. Electronics 8, 6 (2019). https://doi.org/10.3390/electronics8060661
[11] Chang Song, Elias Fallon, and Hai Helen Li. 2020. Improving Adversarial Robustness in Weight-quantized Neural Networks. CoRR abs/2012.14965 (2020). arXiv:2012.14965. https://arxiv.org/abs/2012.14965
[12] Shuchang Zhou, Zekun Ni, Xinyu Zhou, He Wen, Yuxin Wu, and Yuheng Zou. 2016. DoReFa-Net: Training Low Bitwidth Convolutional Neural Networks with Low Bitwidth Gradients. CoRR abs/1606.06160 (2016). arXiv:1606.06160. http://arxiv.org/abs/1606.06160


Published In

HEART '23: Proceedings of the 13th International Symposium on Highly Efficient Accelerators and Reconfigurable Technologies
June 2023, 127 pages
ISBN: 9798400700439
DOI: 10.1145/3597031

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. FPGA
    2. noise resilience
    3. reduced precision neural networks

Conference

HEART 2023. Overall acceptance rate: 22 of 50 submissions (44%).
