Scale-Invariant Temporal History (SITH): A Neural First Principle Approach to Designing Efficient and High-Performing Deep Learning Networks

Author:
Jacques, Brandon, Psychology - Graduate School of Arts and Sciences, University of Virginia
Advisor:
Sederberg, Per, Psychology, University of Virginia
Abstract:

This dissertation addresses the limitations of current deep learning architectures by proposing a principled approach to designing more efficient, better-performing networks. Drawing on computational cognitive neuroscience research, we mathematically formalize how the brain represents the "what happened when" of time-varying signals, a representation we name the Scale-Invariant Temporal History (SITH). This representation has several properties that are useful for machine learning: it captures long-range temporal dynamics without encountering the vanishing gradient problem, it is invariant to rescaling of the input, and it represents large amounts of data efficiently through logarithmic compression. Over the course of three chapters, we introduce three neural network architectures, DeepSITH, SITHCon, and SITHEyeCon, that place SITH at their core while learning the statistical properties of a dataset. DeepSITH uses SITH to model temporal dynamics in a deep learning setting; SITHCon extends this approach with convolutional layers, making the network scale-invariant; and SITHEyeCon builds on that architecture to create a scale-invariant computer vision system. These networks, built from neural first principles, outperformed their direct counterparts on a wide variety of benchmarks. Our approach has the potential to significantly advance the field of deep learning and, we hope, will serve as a proof of concept that the brain can provide a blueprint for designing the neural networks of the future.
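
For readers who want a concrete picture of the representation the abstract describes, the following is a minimal sketch, not the dissertation's implementation, of a SITH-style encoder. It assumes the Laplace-transform formulation of scale-invariant memory from the cognitive neuroscience literature the work builds on; all parameter names and values here (tau_min, tau_max, n_taus, k) are hypothetical choices made for illustration.

    import math
    import numpy as np

    def sith_encode(signal, dt=0.1, k=4, tau_min=1.0, tau_max=100.0, n_taus=32):
        """Encode a 1-D signal as a fuzzy, log-compressed temporal history.

        Timeline nodes tau* are geometrically spaced (log compression).
        Each node is backed by a leaky integrator dF/dt = -s*F + f(t)
        with decay rate s = k / tau*; an approximate inverse Laplace
        transform (the Post formula, via repeated finite differences
        over s) recovers an estimate of "what happened when," blurred
        in proportion to elapsed time.
        """
        taustar = np.geomspace(tau_min, tau_max, n_taus)
        s = k / taustar                         # integrator decay rates
        F = np.zeros(n_taus)                    # Laplace-domain memory
        history = []
        for f_t in signal:
            # Exact exponential decay plus input injection for one step.
            F = F * np.exp(-s * dt) + f_t * dt
            # k-th derivative of F with respect to s (finite differences).
            dF = F.copy()
            for _ in range(k):
                dF = np.gradient(dF, s)
            # Post approximation of the inverse Laplace transform.
            f_tilde = ((-1) ** k / math.factorial(k)) * s ** (k + 1) * dF
            history.append(f_tilde)
        return taustar, np.array(history)

    # A single impulse drifts along the log-spaced timeline as time
    # passes, and its blur grows with elapsed time: the scale-invariance
    # property the abstract highlights.
    impulse = np.zeros(500)
    impulse[0] = 1.0
    taustar, hist = sith_encode(impulse)

Because the timeline nodes are geometrically spaced, rescaling the input in time simply shifts activity along the tau* axis rather than changing its shape, which is the property that lets convolutional layers stacked on top of such a representation respond consistently to sped-up or slowed-down inputs.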

Degree:
PhD (Doctor of Philosophy)
Keywords:
Machine Learning, Deep Learning, Scale-Invariant, Neurally Inspired AI
Language:
English
Rights:
All rights reserved (no additional license for public reuse)
Issued Date:
2023/05/02