Energy Efficiency of Neural Information Processing

Author:
Xing, Jie, Electrical Engineering - School of Engineering and Applied Science, University of Virginia
Advisors:
Berger, Toby, Department of Electrical and Computer Engineering, University of Virginia
Sejnowski, Terrence, Salk Institute
Abstract:

The energy efficiency of information processing in the human brain is astonishingly superior to that of any machine yet designed. It is estimated that the roughly 10^11 neurons composing the human brain consume on average 20 watts of power, whereas the recent "real-time" simulation of some 10 million neurons in the cat visual cortex, led by the IBM Almaden Research Center, was about 10^9 times more energy costly per neuron. Brains have evolved to compute and communicate information prodigiously and with remarkable efficiency. Since neurons are expressly designed to exchange information with one another, it is fundamental to understand information processing and energy expenditure at the nodal level of the network. Furthermore, a steadily increasing fraction of neuroscientists subscribes to the view that each neuron's design should maximize the ratio between the rate at which it conveys information and the rate at which it expends energy. For all of these reasons, my doctoral research explores single-neuron modeling of information processing and energy efficiency from both theoretical and experimental perspectives.
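As a quick sanity check on these figures, the following sketch computes the per-neuron power budgets they imply. The constants are simply the estimates quoted above, and the variable names are illustrative, not taken from the thesis.

```python
# Back-of-the-envelope check of the per-neuron power figures quoted above.
# All constants are the abstract's own estimates, not measured values.

BRAIN_NEURONS = 1e11   # estimated number of neurons in the human brain
BRAIN_POWER_W = 20.0   # estimated whole-brain power consumption (watts)
SIM_PENALTY = 1e9      # quoted per-neuron energy cost ratio, simulation vs. brain

brain_w_per_neuron = BRAIN_POWER_W / BRAIN_NEURONS
sim_w_per_neuron = brain_w_per_neuron * SIM_PENALTY

print(f"Brain:      {brain_w_per_neuron:.1e} W per neuron")  # 2.0e-10 W
print(f"Simulation: {sim_w_per_neuron:.1e} W per neuron")    # 2.0e-01 W
```

On these estimates, a biological neuron runs on about 0.2 nanowatts, while its simulated counterpart costs on the order of 0.2 watts, which is the nine-orders-of-magnitude gap the abstract cites.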

The overall goal of this thesis is to analyze the performance of a single neuron, the smallest working unit of the brain, from an information-energy efficiency perspective. In particular, using information theory, random Poisson measures, Laplace transforms, and the calculus of variations, we propose a mathematical framework for the stochastic processing and transmission of information performed at the neuronal level. We find the optimum distribution that characterizes the afferent excitatory/inhibitory postsynaptic potential (EPSP/IPSP) intensity by maximizing the Shannon mutual information rate subject to a constraint on the total energy that a neuron expends on metabolism, postsynaptic potential generation, and action potential propagation during one interspike interval (ISI). This optimum distribution of the incoming EPSP/IPSP intensity serves as a bridge that specifies how an energy-efficient brain needs to match the long-term statistics of each neuron's inputs to that neuron's particular design. Note that bits per joule (bpj) measures the performance of a neuron viewed as a communication channel, since bits versus joules, equivalently bits/sec versus joules/sec, i.e., information rate versus power, is the standard tradeoff considered by information theorists when studying a channel's capacity. We treat this tradeoff both analytically and through computational simulations for a series of increasingly sophisticated models.
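In schematic form, the optimization just described can be stated as below. The notation here is chosen for illustration rather than taken from the thesis: f_Lambda denotes the density of the afferent EPSP/IPSP intensity Lambda, T the interspike interval, and E_b the per-ISI energy budget.

```latex
% Schematic statement of the bpj-maximizing problem described above.
% Notation (f_\Lambda, T, E_b) is illustrative, not the thesis's own.
\begin{align*}
  \max_{f_\Lambda}\ & I(\Lambda; T)
    && \text{mutual information conveyed over one ISI} \\
  \text{subject to}\ & \mathbb{E}\!\left[\mathcal{E}(\Lambda, T)\right] \le E_b
    && \text{energy budget: metabolism, PSP generation, spike propagation} \\[4pt]
  \text{bpj} \;=\; & \frac{I(\Lambda; T)}{\mathbb{E}\!\left[\mathcal{E}(\Lambda, T)\right]}
    && \text{bits per joule, the figure of merit}
\end{align*}
```

Maximizing the information rate at a fixed energy budget, and sweeping that budget, traces out exactly the information-rate-versus-power tradeoff that defines a channel's capacity-per-unit-cost.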

In collaboration with the Salk Institute, we have tested the validity of this information-energy optimizing hypothesis using in vivo recordings from the cat visual thalamus. The experimentally obtained statistical histograms closely fit the theoretically derived optimum distributions. Imposing a bpj-maximizing condition on single-neuron function not only allows us to obtain key analytical conclusions that are in good agreement with experimental observations but also yields an intriguing bridge between single-neuron theory and the theory of real neural networks, perhaps paving the way to wider applications in neuroscience and engineering. Overall, this research is a further step in the endeavor of reverse engineering the brain.

Degree:
PHD (Doctor of Philosophy)
Language:
English
Rights:
All rights reserved (no additional license for public reuse)
Issued Date:
2014/08/04