This paper presents an attempt to model the computational biological circuitry of the human retina using spiking neural networks (SNNs). Furthermore, a hardware implementation and characterization of a fully connected SNN using leaky integrate-and-fire (LIF) neurons on a mixed-signal application-specific integrated circuit (ASIC) are presented.
The aim is to provide a proof of concept that neuromorphic systems on chips (NeuroSoCs) can model the human retina in an energy- and active-area-efficient manner. A detailed characterization of the mixed-signal neuromorphic ASIC is provided, including measurements of energy consumption per inference and inference time. Experimental results obtained from multiple chip samples validate the performance of the implemented SNN over process, voltage, and temperature (PVT) variations. Moreover, a Python-based framework extracted from chip measurements is developed, enabling integration with machine learning techniques. Using this model, we train an artificial retinal ganglion cell (RGC) layer based on Parasol ON/OFF RGC types, employing the CIFAR-10 dataset for validation.
The results demonstrate the feasibility of the proposed approach and provide insights into the practical implementation of SNNs for modeling biological neural networks such as the human retina, toward a hardware-based retinal implant application.
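As background for the neuron model referenced above, the leaky integrate-and-fire dynamics can be sketched in Python. This is a minimal behavioral sketch; all parameter values (time constant, threshold, membrane resistance) are illustrative and do not correspond to the characterized ASIC.

```python
import numpy as np

def lif_simulate(input_current, dt=1e-4, tau=20e-3, v_rest=0.0,
                 v_reset=0.0, v_thresh=1.0, r_m=1.0):
    """Simulate a single LIF neuron: tau * dv/dt = -(v - v_rest) + r_m * I.
    The membrane potential v is reset to v_reset when it crosses v_thresh."""
    v = v_rest
    trace, spikes = [], []
    for i in input_current:
        # Forward-Euler integration of the leaky membrane equation
        v += dt * (-(v - v_rest) + r_m * i) / tau
        if v >= v_thresh:
            spikes.append(True)
            v = v_reset  # spike-and-reset
        else:
            spikes.append(False)
        trace.append(v)
    return np.array(trace), np.array(spikes)

# A constant suprathreshold current (r_m * I > v_thresh) yields regular spiking.
trace, spikes = lif_simulate(np.full(1000, 2.0))
```

In a fully connected SNN, each neuron's input current would be a weighted sum of presynaptic spike trains; the hardware realizes the same integrate, threshold, and reset behavior in analog CMOS.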
Neuromorphic circuit, spiking analog CMOS neuron, Machine Learning, Spiking Neural Networks (SNN), EdgeML, Biomedical Retinal Implant, Retinal Ganglion Cells.