Temporal Intelligence in Spiking Neural Networks: A New Framework for Learning and Adaptation
Author(s)
Chakraborty, Biswadeep
Abstract
Artificial intelligence is at a crossroads: conventional deep learning models, while powerful, remain fundamentally limited in their ability to process information in time, adapt seamlessly to changing environments, and efficiently encode structured memory. The brain, by contrast, operates through spikes—discrete, event-driven signals that inherently capture temporal dependencies. Spiking Neural Networks (SNNs) have long been viewed primarily as energy-efficient alternatives to artificial neural networks (ANNs). This dissertation takes a fundamentally different view: it positions SNNs as a new computational paradigm, capable of expressing forms of temporal reasoning and adaptive intelligence that conventional deep learning struggles to achieve.
However, realizing this vision requires overcoming a key limitation: most existing SNN models are built on homogeneous neuron and synapse dynamics, which constrains their expressivity and adaptability. From a dynamical systems perspective, this homogeneity forces all neurons to evolve along similar timescales, reducing the network's ability to capture multi-scale dependencies and limiting the richness of its attractor landscape. By contrast, in complex dynamical systems, including the brain, heterogeneous timescales create diverse trajectories in state space, enhancing stability, memory capacity, and computational flexibility. To address this limitation, the dissertation introduces Heterogeneous Recurrent Spiking Neural Networks (HRSNNs), a novel class of SNNs that leverages diverse neuronal and synaptic timescales to improve learning efficiency and temporal representation. By incorporating heterogeneity, HRSNNs enable structured memory retention, greater robustness to non-stationary inputs, and improved real-time adaptability.
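To make the role of heterogeneity concrete, the sketch below simulates a recurrent layer of leaky integrate-and-fire neurons in which each neuron receives its own membrane time constant. This is a minimal illustration of the principle, not the dissertation's implementation; the log-uniform timescale distribution and all parameter values are assumptions chosen for clarity.

    import numpy as np

    rng = np.random.default_rng(seed=0)

    N, T, dt = 100, 200, 1.0   # neurons, timesteps, step size (ms)

    # Heterogeneity: each neuron gets its own membrane time constant,
    # drawn log-uniformly over 5-100 ms, rather than one shared value.
    tau_m = np.exp(rng.uniform(np.log(5.0), np.log(100.0), size=N))

    v_th, v_reset = 1.0, 0.0                            # threshold / reset
    W = rng.normal(0.0, 1.0, size=(N, N)) / np.sqrt(N)  # recurrent weights
    v = np.zeros(N)                                     # membrane potentials
    spikes = np.zeros((T, N))

    for t in range(T):
        i_ext = rng.normal(0.0, 0.5, size=N)            # stand-in input current
        i_rec = W @ spikes[t - 1] if t > 0 else np.zeros(N)
        # Leaky integration: neuron i decays on its own timescale tau_m[i],
        # so the population mixes fast and slow dynamics in a single layer.
        v += (dt / tau_m) * (-v + i_ext + i_rec)
        fired = v >= v_th
        spikes[t] = fired.astype(float)
        v[fired] = v_reset                              # hard reset on spike

Because tau_m varies across the population, some neurons track fast input fluctuations while others integrate over long windows, which is the multi-scale behavior a homogeneous network cannot express.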
Yet heterogeneity alone is insufficient to fully harness the computational power of SNNs. To systematically extract their unique advantages, this dissertation develops a new mathematical framework that bridges spike-based processing with dynamical systems theory, state-space models (SSMs), and Lyapunov stability analysis. These tools provide formal guarantees on stability, convergence, and learning efficiency: key properties that have remained elusive in SNN research. Additionally, this work proposes a task-agnostic pruning methodology that sparsifies SNNs not through task-specific heuristics but by preserving key dynamical properties, yielding efficient and generalizable representations. Beyond pruning, the dissertation extends SNNs to structured data and continuous domains through innovations such as Spiking Graph Neural Networks (SGNNs) and Spiking State-Space Models (S-SSMs), demonstrating their potential in real-world applications.
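For readers unfamiliar with the machinery the abstract invokes, the standard linear form of a state-space model and its quadratic Lyapunov certificate are shown below. This is the textbook template in assumed notation, not the dissertation's spiking-specific formulation.

    % Linear state-space model: state x, input u, output y
    \dot{x}(t) = A\,x(t) + B\,u(t), \qquad y(t) = C\,x(t)

    % Quadratic Lyapunov certificate: if P \succ 0 satisfies the matrix
    % inequality below, the autonomous dynamics \dot{x} = A x are
    % exponentially stable.
    V(x) = x^{\top} P x, \qquad A^{\top} P + P A \prec 0
    \;\Rightarrow\; \dot{V}(x) = x^{\top}\!\left(A^{\top} P + P A\right) x < 0

Guarantees of this form are what the framework seeks to carry over to spiking dynamics, and they likewise motivate pruning criteria that preserve the network's dynamical properties rather than performance on any single task.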
Through applications spanning unsupervised learning, time-series prediction, multi-agent interactions, and event-based perception, this dissertation reframes SNNs not as mere energy-efficient alternatives, but as a fundamentally new class of adaptive, real-time intelligent systems. By combining architectural innovations with deep theoretical insights, this work establishes a foundation for spike-based artificial intelligence that is not only efficient, but computationally powerful—offering a new perspective on how learning, memory, and intelligence can be reimagined through the lens of dynamical systems.
Date
2025-04-11
Resource Type
Text
Resource Subtype
Dissertation