Energy-Efficient Cortical Neuron Models with GIG Likelihood Functions
Sungkar, Mustafa, Electrical Engineering - School of Engineering and Applied Science, University of Virginia
Berger, Toby, Department of Electrical and Computer Engineering, University of Virginia
Neurons are remarkably efficient. The human brain consumes 20 percent of the resting metabolic rate (RMR), yet expends energy at a rate of just 25 watts. This efficiency can be explained by natural selection having pressured organisms to develop ever more efficient neural structures, especially during times of scarcity. Hence we have reason to believe that neurons' energy usage is highly optimized. We hypothesize that a neuron maximizes the mutual information between its inputs and its output, given a fixed energy budget.
We propose the generalized inverse Gaussian (GIG) distribution as the pdf of the random interpulse interval (IPI) given an input excitation intensity. A strong reason is that the GIG distribution is the first-hitting-time distribution of the Barndorff-Nielsen (BN) diffusion, which exhibits attraction toward a threshold. Biological data reveal that the rate of postsynaptic potential (PSP) buildup increases as the PSP approaches the threshold, so the buildup can be modeled by the BN diffusion.
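For concreteness, a minimal Python sketch of the GIG density follows, written in one common (p, a, b) parameterization; the abstract does not specify the parameterization used in the thesis, so the parameter names and form below are assumptions.

    import numpy as np
    from scipy.special import kv  # modified Bessel function of the second kind, K_p

    def gig_pdf(t, p, a, b):
        # Assumed (p, a, b) parameterization of the GIG density:
        # f(t) = (a/b)^(p/2) / (2 K_p(sqrt(a b))) * t^(p-1) * exp(-(a t + b/t) / 2)
        # valid for t > 0, with a > 0 and b > 0.
        t = np.asarray(t, dtype=float)
        norm = (a / b) ** (p / 2.0) / (2.0 * kv(p, np.sqrt(a * b)))
        return norm * t ** (p - 1.0) * np.exp(-(a * t + b / t) / 2.0)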
Using the GIG model, we obtain the optimal input and output marginal distributions. In a given IPI, let the input intensity be Λ and the output IPI be T. The energy cost is the sum of a constant term and terms proportional to T, T^-1, log(T), and ΛT; the physiological sources of these terms are discussed. Under these assumptions, the marginal output distribution is also GIG, with parameters determined by the energy costs and the conditional GIG. The input distribution is obtained as the inverse Fourier transform of an expression involving modified Bessel functions of the second kind; a procedure for computing it numerically is described. Plotting the information per IPI against the average energy expended yields a concave curve of bits versus energy, analogous to the familiar curves of channel capacity versus constrained input power in classical information theory: information increases with energy, but with diminishing returns. A point of interest on the curve is the one that maximizes information per unit energy.
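To illustrate the stated cost structure, the sketch below (reusing gig_pdf above) evaluates the average energy of a GIG-distributed IPI by numerical quadrature. The coefficients c0 through c3 and the treatment of the ΛT term are hypothetical placeholders, since the abstract describes the cost terms only qualitatively.

    from scipy.integrate import quad

    def expected_energy(p, a, b, c=(1.0, 1.0, 1.0, 1.0), lam_energy=0.0):
        # Average of the cost c0 + c1*T + c2/T + c3*log(T) over T ~ GIG(p, a, b),
        # plus a placeholder lam_energy standing in for the Lambda*T contribution.
        # The coefficient values are illustrative, not taken from the thesis.
        integrand = lambda t: (c[0] + c[1] * t + c[2] / t
                               + c[3] * np.log(t)) * gig_pdf(t, p, a, b)
        value, _ = quad(integrand, 0.0, np.inf)
        return value + lam_energy

Sweeping such an average energy against a numerical estimate of the information per IPI, over a grid of conditional-GIG parameters, would trace out the concave bits-versus-energy curve described above.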
This neural model can be viewed as a channel with multiplicative noise that is independent of its input. Accordingly, we discuss possible connections between the neuron model and fading channels in communications. We also discuss how the optimization condition generalizes to other channels.
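As a hedged sketch of the multiplicative-noise view, one can sample the channel output as T = N / Λ, with the noise N drawn independently of the input intensity; the exact noise law implied by the thesis's conditional GIG is not stated in this abstract, so a generic GIG noise is assumed below.

    from scipy.stats import geninvgauss

    def sample_channel(lam, size, p=1.0, b=1.0, seed=0):
        # Multiplicative-noise view (assumed form): T = N / lam, where the
        # GIG-distributed noise N is independent of the input intensity lam.
        rng = np.random.default_rng(seed)
        n = geninvgauss.rvs(p, b, size=size, random_state=rng)
        return n / lam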
MS (Master of Science)
Neuron, Generalized Inverse Gaussian, Information Theory, Energy-Efficiency
English
All rights reserved (no additional license for public reuse)
2015/04/09