Estimation and Evaluation of Energy Efficient Neural Communication Channel

Author:
Sungkar, Mustafa, Electrical Engineering - School of Engineering and Applied Science, University of Virginia
Advisor:
Berger, Toby, Department of Electrical and Computer Engineering, University of Virginia
Abstract:

The brain is an energy-efficient computational device. At rest, it runs on 15 watts of power. What can scientists and engineers learn from the brain to make computational devices more energy-efficient? This dissertation begins to address that question by studying the behavior of cortical neurons in the sensory cortex.

The task of the cortical neuron is to send information about its input to other target neurons while expending as little energy as possible. To quantify the neuron's performance, Shannon's mutual information (MI) is used as a measure of neural information, and the neuron is assumed to maximize MI subject to a fixed energy budget. Thus, an information-theoretic framework can be used to analyze the neuron's energy efficiency.
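As a sketch of this framework, the problem can be written as a capacity-cost optimization; the notation below is generic and is not taken from the dissertation:

    % Capacity-cost formulation: X is the neuron's input, Y its output,
    % e(x) the energy cost of input x, and E the energy budget.
    \begin{equation*}
      C(E) \;=\; \max_{p_X \,:\, \mathbb{E}[e(X)] \le E} I(X;Y)
    \end{equation*}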

This dissertation consists of four major parts: the generalized inverse Gaussian (GIG) neuron model, an assessment of the model's energy efficiency, optimization of the model, and a rate-distortion (R-D) problem inspired by the model. The GIG neuron model accounts for the fast sodium channels that enable a rapid rise of the postsynaptic potential (PSP). This behavior of the PSP determines the input-output behavior of the neuron, which allows the neuron to be modeled as a communication channel. Methods for estimating the parameters of the GIG neuron model are then developed, and the accuracy of the model is evaluated with simulations.
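As a rough illustration of this kind of estimation (not the estimators developed in the dissertation), GIG parameters can be fit to simulated data by maximum likelihood, for example with SciPy's generalized inverse Gaussian distribution. The parameterization and the treatment of the samples as simulated interspike-type data are assumptions made only for this sketch:

    # A minimal sketch of GIG parameter estimation by maximum likelihood,
    # using SciPy's generalized inverse Gaussian distribution. The (p, b)
    # parameterization and the simulated data below are illustrative
    # assumptions, not the dissertation's estimators or data.
    import numpy as np
    from scipy.stats import geninvgauss

    rng = np.random.default_rng(0)

    # "True" parameters of the simulated data (SciPy's (p, b) shape
    # parameters, with location fixed to 0).
    p_true, b_true = -0.5, 1.5
    samples = geninvgauss.rvs(p_true, b_true, size=2000, random_state=rng)

    # Maximum-likelihood fit; location is pinned to 0 because the data
    # (e.g., interspike-type intervals) are nonnegative.
    p_hat, b_hat, loc_hat, scale_hat = geninvgauss.fit(samples, floc=0)
    print(f"estimated p = {p_hat:.3f}, b = {b_hat:.3f}, scale = {scale_hat:.3f}")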

Next, the maximum MI the GIG neuron model can transmit for a fixed energy budget is determined. Surprisingly, for some parameter sets the input distribution that achieves this constrained capacity is discrete with a finite number of mass points. This implies that, to maximize the MI transmitted by the neuron, the neural network (NN) should operate in discrete states. To further optimize the GIG neuron model, the parameter sets that produce the most MI for a given energy budget are then discussed. An additional variance constraint is imposed to prevent the MI from increasing without bound, and a numerical example illustrates the theory.
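One standard way to compute such an energy-constrained capacity numerically, on a discretized channel, is the Blahut-Arimoto algorithm with a Lagrange-multiplier term for the input cost. The sketch below uses a toy channel matrix, cost function, and multiplier chosen only for illustration; it is not the GIG neuron channel analyzed in the dissertation:

    # Blahut-Arimoto with an input-cost term: maximizes I(X;Y) - s*E[cost(X)]
    # over the input distribution on a small discretized channel.
    import numpy as np

    def blahut_arimoto_cost(W, cost, s, n_iter=500):
        # W    : (nx, ny) channel matrix; row x is P(Y | X = x) and sums to 1
        # cost : (nx,) energy cost of each input letter
        # s    : Lagrange multiplier trading information against cost
        nx, ny = W.shape
        p = np.full(nx, 1.0 / nx)
        tiny = 1e-300
        for _ in range(n_iter):
            q_y = p @ W                                       # output marginal P(y)
            q_xy = (p[:, None] * W) / np.maximum(q_y, tiny)   # backward channel P(x|y)
            log_r = (W * np.log(np.maximum(q_xy, tiny))).sum(axis=1) - s * cost
            r = np.exp(log_r - log_r.max())                   # unnormalized update
            p = r / r.sum()
        q_y = p @ W
        mi_nats = (p[:, None] * W *
                   np.log(np.maximum(W / np.maximum(q_y, tiny), tiny))).sum()
        return p, mi_nats / np.log(2), p @ cost               # p*, MI in bits, mean cost

    # Toy example: noisy 3-letter channel with unequal input energy costs.
    W = np.array([[0.8, 0.1, 0.1],
                  [0.1, 0.8, 0.1],
                  [0.1, 0.1, 0.8]])
    cost = np.array([1.0, 2.0, 4.0])
    p_opt, mi_bits, mean_cost = blahut_arimoto_cost(W, cost, s=0.3)
    print("optimal input distribution:", np.round(p_opt, 3))
    print("MI (bits): %.3f   mean cost: %.3f" % (mi_bits, mean_cost))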

Finally, an R-D problem inspired by the GIG neuron model is developed. The source is GIG-distributed, and the distortion function is related to the energy expenditure of the GIG neuron model. The result is that, for some parameter sets, the optimal reconstruction alphabet is discrete.
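For reference, this follows the standard rate-distortion formulation below, written here in generic notation rather than the dissertation's own:

    % Rate-distortion function: X is the (GIG-distributed) source,
    % \hat{X} its reconstruction, d the distortion measure tied to the
    % neuron's energy expenditure, and D the distortion budget.
    \begin{equation*}
      R(D) \;=\; \min_{p_{\hat{X} \mid X} \,:\, \mathbb{E}[d(X,\hat{X})] \le D} I(X;\hat{X})
    \end{equation*}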

This dissertation is part of a nascent approach to the study of energy-efficient computing. The next steps must involve studying the network in which the neuron operates, including the feedback loops that exist within it.

Degree:
PHD (Doctor of Philosophy)
Keywords:
Energy Efficiency, Information Theory, Cortical Neuron
Language:
English
Rights:
All rights reserved (no additional license for public reuse)
Issued Date:
2018/04/24