Explainable Neuro-Memristive Systems
  • Rahul Kottappuzhackal, School of Electronic Systems and Automation, Digital University Kerala
  • Sruthi Pallathuvalappil, School of Electronic Systems and Automation, Digital University Kerala
  • Alex James, School of Electronic Systems and Automation, Digital University Kerala

Corresponding Author: [email protected]


Abstract

System-level simulation of neuro-memristive circuits under variability is complex and typically follows a black-box neural network approach. In realistic hardware, such simulations are often difficult to cross-check for accuracy and reproducibility. Accurate memristor model prediction therefore becomes critical for deciphering the overall circuit function across a wide range of non-ideal and practical conditions. In most neuro-memristive systems, the crossbar configuration is essential for implementing multiply-and-accumulate operations, which form the primary unit of neural network implementations. Predicting the specific memristor model that best fits the crossbar simulations, and thereby making them explainable, is an open challenge addressed in this paper. As the size of the crossbar increases, cross-validation becomes even more challenging. This paper proposes predicting the memristor device under test by automatically evaluating its I-V behavior using Random Forest and Extreme Gradient Boosting algorithms. Starting with a single memristor model, the prediction approach is extended to make memristor crossbar-based circuits explainable. The performance of both algorithms is analysed in terms of precision, recall, F1-score, and support. The accuracy, macro average, and weighted average of both algorithms at different operational frequencies are also explored.
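To make the classification step concrete, the sketch below shows how a memristor model could be predicted from sampled I-V sweeps using Random Forest and Extreme Gradient Boosting, the two algorithms named in the abstract. This is not the authors' code: the synthetic dataset, the flattened-sweep feature layout, and all parameter choices are assumptions for illustration only; in the paper the inputs would come from simulated device and crossbar I-V behavior.

```python
# Illustrative sketch (not the authors' implementation): classify which memristor
# model produced a given I-V sweep using Random Forest and XGBoost.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report
from xgboost import XGBClassifier  # assumes the xgboost package is installed

rng = np.random.default_rng(0)

# Placeholder dataset (assumption): each row is a flattened I-V sweep, i.e. the
# current sampled at fixed voltage points; each label is the index of the
# memristor model that generated it. In practice these would come from
# circuit-level simulations of the device or crossbar under test.
n_samples, n_voltage_points, n_models = 600, 64, 4
X = rng.normal(size=(n_samples, n_voltage_points))
y = rng.integers(0, n_models, size=n_samples)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

# Random Forest classifier
rf = RandomForestClassifier(n_estimators=200, random_state=0)
rf.fit(X_train, y_train)
print("Random Forest:")
print(classification_report(y_test, rf.predict(X_test)))

# Extreme Gradient Boosting classifier
xgb = XGBClassifier(n_estimators=200, max_depth=4, eval_metric="mlogloss")
xgb.fit(X_train, y_train)
print("XGBoost:")
print(classification_report(y_test, xgb.predict(X_test)))
```

`classification_report` prints precision, recall, F1-score, and support per class, along with accuracy, macro average, and weighted average, matching the metrics the abstract says are used to compare the two algorithms.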
07 Apr 2024: Submitted to TechRxiv
08 Apr 2024: Published in TechRxiv