PhD Defense | Leveraging Neuro-inspired Mechanisms for Adaptive and Efficient Deep Learning

Title: Leveraging Neuro-inspired Mechanisms for Adaptive and Efficient Deep Learning

Date: Friday, May 2, 2025

Time: 1:30 PM - 3:30 PM

Location: Coda C1215 Midtown

Zoom Link: https://gatech.zoom.us/j/95801786053?pwd=3zEgT7TVLusVwRJV29piNIftwE0el6.1


Mustafa Burak Gurbuz

Machine Learning PhD Student

School of Computer Science
Georgia Institute of Technology


Committee

  1. Dr. Constantine Dovrolis (Advisor), School of Computer Science, Georgia Tech
  2. Dr. Zsolt Kira, School of Interactive Computing, Georgia Tech
  3. Dr. Stephen Mussmann, School of Interactive Computing, Georgia Tech
  4. Dr. Sashank Varma, School of Interactive Computing & Psychology, Georgia Tech
  5. Dr. Decebal Mocanu, Department of Computer Science, University of Luxembourg


Abstract

Deep neural networks (DNNs) have transformed artificial intelligence (AI), yet their success depends heavily on static datasets, rigid architectures, and computationally intensive training. In contrast, biological brains excel in dynamic, resource-constrained environments. This thesis explores how principles from neuroscience can be abstracted and adapted to improve the adaptability and efficiency of modern deep learning systems. We propose three neuro-inspired methods that address two key challenges in real-world machine learning: continual learning (CL) and efficiency.

First, we introduce NISPA, a method inspired by the brain's sparse dynamic connectivity and synaptic stability. Designed for task-incremental CL, NISPA preserves previously acquired knowledge by dynamically rewiring a sparse network and selectively freezing crucial connections. This approach significantly outperforms existing methods while requiring up to ten times fewer parameters.
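
To make the mechanism concrete, the following is a minimal PyTorch sketch of the two ideas above: a sparse binary connectivity mask that is periodically rewired, and a "frozen" mask that shields important weights from further gradient updates. The layer name, the magnitude-based importance criterion, and the rewiring rule are illustrative assumptions, not NISPA's actual design.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class SparseLinear(nn.Module):
        """Linear layer with a rewirable sparse mask and freezable weights."""

        def __init__(self, in_f, out_f, density=0.1):
            super().__init__()
            self.weight = nn.Parameter(torch.randn(out_f, in_f) * 0.01)
            # Binary connectivity: only a small fraction of weights is active.
            self.register_buffer("mask", (torch.rand(out_f, in_f) < density).float())
            # Frozen connections are protected from further updates.
            self.register_buffer("frozen", torch.zeros(out_f, in_f))

        def forward(self, x):
            return F.linear(x, self.weight * self.mask)

        def apply_grad_mask(self):
            # Call between loss.backward() and optimizer.step():
            # only active, unfrozen connections receive gradients.
            if self.weight.grad is not None:
                self.weight.grad *= self.mask * (1.0 - self.frozen)

        def rewire(self, frac=0.05):
            # Drop the weakest unfrozen active connections and grow the
            # same number of new connections at random inactive positions.
            with torch.no_grad():
                flat = self.mask.flatten().clone()
                cand = flat.bool() & ~self.frozen.flatten().bool()
                scores = self.weight.abs().flatten().masked_fill(~cand, float("inf"))
                k = max(1, int(frac * int(cand.sum())))
                grow_pool = torch.where(flat == 0)[0]
                flat[torch.topk(scores, k, largest=False).indices] = 0.0
                flat[grow_pool[torch.randperm(len(grow_pool))[:k]]] = 1.0
                self.mask = flat.view_as(self.mask)

In a training loop, apply_grad_mask() would run after loss.backward() and before optimizer.step(), with rewire() invoked periodically (for example, once per epoch).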

Second, we present NICE, a method inspired by adult neurogenesis and contextual memory encoding in the hippocampus. NICE eliminates the need for data replay in class-incremental CL by grouping neurons according to when they were integrated into the network's function and by using context detection to route inputs appropriately. Without storing or replaying past data, NICE matches—and often exceeds—the performance of popular replay-based methods while avoiding their computational overhead.
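
As a rough illustration of the routing idea (leaving aside the neurogenesis-style grouping of neurons by age), the sketch below stores one activation prototype per learning episode and routes a test input to the head of the most similar context. The class name, the mean-embedding prototype, and the cosine criterion are assumptions for illustration, not NICE's exact mechanism.

    import torch

    class ContextRouter:
        """Route inputs to per-context heads via stored activation prototypes."""

        def __init__(self):
            self.prototypes = []  # one mean embedding per context (episode)
            self.heads = []       # one classifier head per context

        def add_context(self, embeddings, head):
            # Signature of a context: the mean embedding of its training data.
            self.prototypes.append(embeddings.mean(dim=0))
            self.heads.append(head)

        def route(self, embedding):
            # Detect the context whose prototype best matches the input,
            # then classify with that context's head; no raw data is stored.
            sims = torch.stack([
                torch.cosine_similarity(embedding, p, dim=0)
                for p in self.prototypes
            ])
            return self.heads[int(sims.argmax())](embedding)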

Finally, we address efficient learning from streaming data with PEAKS, a method inspired by the brain's top-down attention mechanisms. PEAKS incrementally selects informative training samples based on prediction errors and kernel similarity, effectively filtering out noisy or redundant data. Our experiments show that PEAKS achieves competitive accuracy using as little as one-fourth of the data required by random selection.
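
The selection rule can be sketched as scoring each streaming sample by its prediction error, discounted by its embedding similarity to already-kept samples, and retaining the top scorers. The function below is a simplified stand-in under those assumptions (the linear score combination, lam, and budget are placeholders), not PEAKS's exact scoring.

    import torch
    import torch.nn.functional as F

    def select_from_batch(model, embed, x, y, kept, budget, lam=0.5):
        # Score = prediction error minus lam * redundancy w.r.t. kept samples.
        with torch.no_grad():
            z = F.normalize(embed(x), dim=1)          # feature embeddings
            err = F.cross_entropy(model(x), y, reduction="none")
            if kept is not None and len(kept) > 0:
                sim = z @ F.normalize(kept, dim=1).T  # kernel: cosine similarity
                redundancy = sim.max(dim=1).values    # closeness to kept set
            else:
                redundancy = torch.zeros_like(err)
            score = err - lam * redundancy            # informative and novel
            idx = torch.topk(score, min(budget, score.numel())).indices
        return idx, z[idx]                            # selected indices + embeddings

Here embed could be, for instance, the model's penultimate layer; the selected embeddings are appended to kept so that later batches are scored against everything chosen so far.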

Together, these approaches demonstrate how neuro-inspired mechanisms can substantially enhance the adaptability and efficiency of DNNs. They represent a step toward AI systems that are not only accurate but also resource-efficient, context-aware, and capable of lifelong learning.
