
A Spiking Neuromorphic Architecture Using Gated-RRAM for Associative Memory

Published: 31 December 2021

Abstract

This work reports a spiking neuromorphic architecture for associative memory, simulated in a SPICE environment using recently reported gated-RRAM (resistive random-access memory) devices as synapses alongside neurons built from complementary metal-oxide-semiconductor (CMOS) circuitry. The network uses a Verilog-A model to capture the behavior of the gated-RRAM devices within the architecture, with model parameters obtained from experimental gated-RRAM devices fabricated and tested in this work. Using these devices in tandem with the CMOS neuron circuitry, our results indicate that the proposed architecture can learn an association in real time and retrieve the learned association when incomplete information is provided. These results demonstrate the promise of gated-RRAM devices for associative memory tasks within a spiking neuromorphic framework.
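The core behavior the abstract describes, learning an association and recalling it from an incomplete cue, can be illustrated with a minimal software sketch. The example below is a conventional Hopfield-style attractor network with Hebbian learning; it is only a conceptual stand-in and does not model the paper's gated-RRAM synapses, spiking dynamics, or CMOS neuron circuits (the weight matrix here loosely plays the role of the analog synaptic conductances).

```python
import numpy as np

def train(patterns):
    """Hebbian outer-product learning over +/-1 patterns.

    Each weight accumulates the correlation between two units,
    analogous to a synapse strengthening when its neurons co-activate.
    """
    n = patterns.shape[1]
    w = np.zeros((n, n))
    for p in patterns:
        w += np.outer(p, p)
    np.fill_diagonal(w, 0.0)  # no self-connections
    return w / patterns.shape[0]

def recall(w, cue, steps=10):
    """Synchronous threshold updates until the state stops changing."""
    s = cue.copy()
    for _ in range(steps):
        s_new = np.where(w @ s >= 0, 1, -1)
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s

# Store one association, then probe with an incomplete cue
# (unknown bits set to 0, i.e., no evidence either way).
stored = np.array([[1, -1, 1, -1, 1, -1]])
w = train(stored)
partial = np.array([1, -1, 1, 0, 0, 0])
print(recall(w, partial))  # recovers the stored pattern [ 1 -1  1 -1  1 -1]
```

The partial cue drives the network into the stored attractor, which is the "retrieval from incomplete information" property the architecture demonstrates in hardware-level simulation.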



    Published In

    ACM Journal on Emerging Technologies in Computing Systems, Volume 18, Issue 2
    April 2022, 411 pages
    ISSN: 1550-4832
    EISSN: 1550-4840
    DOI: 10.1145/3508462
    Editor: Ramesh Karri

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Received: 01 August 2020
    Revised: 01 March 2021
    Accepted: 01 April 2021
    Published: 31 December 2021, in JETC Volume 18, Issue 2

    Author Tags

    1. Associative memory
    2. gated-RRAM
    3. neuromorphic applications
    4. segmented attractor network

    Qualifiers

    • Research-article
    • Refereed

    Funding Sources

    • National Science Foundation
