Leveraging neuro-inspired mechanisms for adaptive and efficient deep learning

Author(s)
Gurbuz, Mustafa Burak
Associated Organization(s)
School of Computer Science
Abstract
Deep Neural Networks (DNNs) have transformed artificial intelligence, yet their success still depends heavily on large static datasets, rigid architectures, and computationally intensive training. In contrast, biological brains excel in dynamic, resource-constrained environments. This thesis explores how principles from neuroscience can be abstracted and adapted to improve the adaptability and efficiency of modern deep learning systems. We propose three neuro-inspired methods that address two key challenges in real-world machine learning: continual learning and data efficiency.

First, we introduce NISPA, a method inspired by the brain's sparse, dynamic connectivity and synaptic stability. Designed for task-incremental continual learning, NISPA preserves previously acquired knowledge by dynamically rewiring a sparse network and selectively freezing crucial connections. This approach significantly outperforms existing methods while requiring up to ten times fewer parameters.

Second, we present NICE, a method inspired by adult neurogenesis and contextual memory encoding in the hippocampus. NICE eliminates the need for data replay in class-incremental learning by grouping neurons according to when they were integrated into the network's function and by using context detection to route inputs appropriately. Without storing or replaying past data, NICE matches, and often exceeds, the performance of popular replay-based methods while avoiding their computational overhead.

Finally, we address efficient learning from streaming data with PEAKS, a method inspired by the brain's top-down attention mechanisms. PEAKS incrementally selects informative training samples based on prediction errors and kernel similarity, effectively filtering out noisy or redundant data. Our experiments show that PEAKS achieves competitive accuracy using as little as one-fourth of the data required by random selection.
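To illustrate the kind of selection criterion PEAKS embodies, the sketch below scores a streaming sample by its prediction error discounted by its RBF-kernel similarity to already-kept samples. The function name, the multiplicative combination, and the choice of an RBF kernel are illustrative assumptions for this sketch, not the thesis's exact formulation.

```python
import numpy as np

def selection_score(feature, error, kept_features, gamma=1.0):
    """Score one streaming sample for training-set selection.

    High prediction error suggests the sample is informative; high kernel
    similarity to already-kept samples suggests it is redundant. Combining
    the two favors samples that are both surprising and novel. (Hypothetical
    sketch; the exact scoring rule in PEAKS may differ.)
    """
    if len(kept_features) == 0:
        return error  # nothing kept yet: prediction error alone decides
    diffs = np.asarray(kept_features) - np.asarray(feature)
    sq_dists = (diffs ** 2).sum(axis=1)
    similarity = np.exp(-gamma * sq_dists).max()  # RBF similarity to nearest kept sample
    return error * (1.0 - similarity)             # discount redundant samples

# A duplicate of a kept sample scores ~0; a distant sample keeps its error score.
kept = [np.array([0.0, 0.0])]
redundant = selection_score(np.array([0.0, 0.0]), error=1.0, kept_features=kept)
novel = selection_score(np.array([5.0, 5.0]), error=1.0, kept_features=kept)
```

Under a fixed training budget, only the top-scoring samples from the stream would be used for updates, with the remainder discarded.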
Collectively, these approaches demonstrate how neuro-inspired mechanisms—such as synaptic rewiring, contextual memory encoding, and attentional filtering—can substantially enhance the adaptability and efficiency of deep learning systems. They represent a step toward AI systems that are not only accurate but also resource-efficient, context-aware, and capable of lifelong learning.
Date
2025-05-13
Resource Type
Text
Resource Subtype
Dissertation