This invention introduces a composite SNN-based back-propagation method that leverages Integrate-and-Fire (IF) neurons to encode gradient signals. It approximates a SoftMax cross-entropy loss function using lateral inhibitory connections and employs Resistive Processing Units (RPUs) for weight updates based on stochastic computing techniques. This enables concurrent feed-forward activations and back-propagated gradient computations, facilitating efficient and localized learning within spiking neuromorphic hardware.
FIG. 1 illustrates a block diagram comparing the proposed composite SNN-based back-propagation method with the existing composite SNN approach, according to the embodiments disclosed herein; FIG. 2A illustrates a network architecture and serial pipeline for learning in ANNs and SNNs, according to the embodiments disclosed herein; FIG. 2B illustrates a network architecture for concurrent rate-coded computation of the forward and backward passes in the composite SNN-based back-propagation method, according to the embodiments disclosed herein.
Problem: Spiking Neural Networks (SNNs) offer power efficiency and biological plausibility but face challenges in implementing back-propagation (BP) because spike events are non-differentiable. Traditional BP methods for Artificial Neural Networks (ANNs) do not translate directly to SNNs, limiting their effectiveness and their implementation in neuromorphic hardware.
Solution: A composite SNN-based back-propagation function in which IF neurons encode the gradient signals, lateral inhibitory connections approximate the SoftMax cross-entropy loss, and RPUs perform the weight updates through stochastic computing. Feed-forward activations and back-propagated gradients are computed concurrently, enabling efficient and localized learning within spiking neuromorphic hardware.
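As a rough illustration of the gradient-encoding idea, the sketch below splits a signed gradient value into its positive and negative parts and rate-codes each part with a simple IF neuron over a window of T timesteps. The function names, threshold, and window length are illustrative assumptions, not details taken from the disclosure:

```python
import numpy as np

def if_rate_encode(value, T=100, threshold=1.0):
    """Rate-code a value in [0, 1] as an IF spike train of length T."""
    v = 0.0                               # membrane potential
    spikes = np.zeros(T, dtype=np.uint8)
    for t in range(T):
        v += value                        # integrate a constant input current
        if v >= threshold:                # fire, then reset by subtraction
            spikes[t] = 1
            v -= threshold
    return spikes

def encode_signed_gradient(g, T=100):
    """Split a signed gradient into positive and negative spike streams."""
    return if_rate_encode(max(g, 0.0), T), if_rate_encode(max(-g, 0.0), T)

pos, neg = encode_signed_gradient(-0.37)
print(pos.mean() - neg.mean())            # decoded estimate, close to -0.37
```

Downstream spiking layers can then treat the two streams as ordinary inputs, recovering the sign of the gradient as the difference of their firing rates.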
Key features of the technology:
- Splitting Gradient Signals: Divides each gradient signal into positive and negative streams for encoding by IF neurons (illustrated in the sketch above).
- Loss Function Approximation: Uses lateral inhibitory connections to approximate the SoftMax cross-entropy loss function (see the first sketch after this list).
- Simultaneous Computation: Enables concurrent determination of feed-forward activations and back-propagated gradient signals.
- Stochastic Weight Updates: Utilizes RPUs and stochastic multiplication operations for energy-efficient learning (see the second sketch after this list).
- Purely Spike-Based Processing: Ensures all computations are performed within the spiking network, eliminating the need for external numerical computations.
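One plausible reading of the loss-function approximation is a soft winner-take-all circuit: output IF neurons connected by all-to-all subtractive lateral inhibition, whose normalized spike counts play the role of a SoftMax distribution without any explicit exponential. The toy simulation below is a sketch under that assumption; the inhibition strength, threshold, and simulation window are invented parameters, not the patented circuit:

```python
import numpy as np

def soft_wta_rates(drive, T=500, threshold=1.0, w_inh=0.2):
    """IF output layer with all-to-all subtractive lateral inhibition.

    Normalized spike counts act as a SoftMax-like class distribution:
    each spike suppresses the membrane potential of rival neurons, so
    competition normalizes the rates without computing exponentials.
    """
    v = np.zeros(len(drive))
    counts = np.zeros(len(drive))
    for _ in range(T):
        v += drive                              # excitatory input current
        spikes = (v >= threshold).astype(float)
        v -= threshold * spikes                 # reset by subtraction
        v -= w_inh * (spikes.sum() - spikes)    # inhibition from rival spikes
        v = np.maximum(v, 0.0)                  # clamp membrane at rest
        counts += spikes
    total = counts.sum()
    return counts / total if total > 0 else counts

print(soft_wta_rates(np.array([0.9, 0.5, 0.1])))  # roughly [0.7, 0.3, 0.0]
```

Subtracting a one-hot spike train for the target class from these output rates would then give the rate-coded error signals that the gradient-encoding scheme above propagates backward, mirroring the usual SoftMax cross-entropy gradient.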
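For the stochastic weight updates, a minimal sketch in the spirit of RPU crossbar training with stochastic pulse trains (after Gokmen and Vlasov, 2016) is shown below: activations and error signals become Bernoulli pulse streams, and each crosspoint device takes a small signed conductance step whenever its row and column pulses coincide, so the expected update is proportional to the outer product of the error and activation vectors. All names and parameters here are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def rpu_stochastic_update(W, x, delta, BL=32, dw_min=1e-3):
    """Stochastic-computing outer-product update on a crossbar array.

    x and delta are assumed scaled to [-1, 1]; each is converted into
    BL Bernoulli pulses, and a pulse coincidence (logical AND) at
    crosspoint (i, j) applies a signed step of dw_min, giving
    E[dW] = -BL * dw_min * outer(delta, x).
    """
    px, pd = np.abs(x), np.abs(delta)             # pulse probabilities
    sx, sd = np.sign(x), np.sign(delta)           # signs carried separately
    for _ in range(BL):
        ax = rng.random(x.shape) < px             # column pulse train
        ad = rng.random(delta.shape) < pd         # row pulse train
        W -= dw_min * np.outer(ad * sd, ax * sx)  # step only on coincidence
    return W

W = np.zeros((2, 3))
rpu_stochastic_update(W, x=np.array([0.5, 0.2, 0.8]),
                      delta=np.array([0.3, -0.6]))
print(W)  # close to -BL * dw_min * outer(delta, x)
```

Because a device steps only on pulse coincidences, the multiplication in the outer-product update is realized statistically rather than arithmetically, which is what makes the scheme attractive for energy-efficient in-memory learning.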
Advantages of the technology:
- Energy efficiency due to spike-based processing.
- Biological plausibility and improved compatibility with neuromorphic hardware.
- Reduction in external memory and computational requirements.
- Enhanced online learning capabilities.
The invention emphasizes the development of a method and algorithm for implementing SNN-based back-propagation. The method is illustrated through block diagrams and network architectures, showcasing its theoretical underpinnings and potential applications.
The invention is currently at the testing and evaluation stage toward its final objective.
The technology promises significant advancements in the efficiency and capability of AI systems, particularly in areas requiring real-time processing and low power consumption. It can lead to the development of smarter, more efficient AI-driven devices and systems, potentially transforming sectors like healthcare, robotics, and consumer electronics.
Applications of the technology:
- Neuromorphic processors and hardware.
- Real-time pattern recognition and classification tasks.
- Autonomous systems and robotics.
- Brain-inspired computing systems.
- Advanced AI research and development.
Geography of IP
Type of IP
202221047747
426019