TY - JOUR

T1 - Spike train statistics and dynamics with synaptic input from any renewal process

T2 - A population density approach

AU - Ly, Cheng

AU - Tranchina, Daniel

PY - 2009/2

Y1 - 2009/2

N2 - In the probability density function (PDF) approach to neural network modeling, a common simplifying assumption is that the arrival times of elementary postsynaptic events are governed by a Poisson process. This assumption ignores temporal correlations in the input that sometimes have important physiological consequences. We extend PDF methods to models with synaptic event times governed by any modulated renewal process. We focus on the integrate-and-fire neuron with instantaneous synaptic kinetics and a random elementary excitatory postsynaptic potential (EPSP), A. Between presynaptic events, the membrane voltage, v, decays exponentially toward rest, while s, the time since the last synaptic input event, evolves with unit velocity. When a synaptic event arrives, v jumps by A, and s is reset to zero. If v crosses the threshold voltage, an action potential occurs, and v is reset to vreset. The probability per unit time of a synaptic event at time t, given the elapsed time s since the last event, h(s, t), depends on specifics of the renewal process. We study how regularity of the train of synaptic input events affects output spike rate, the PDF and coefficient of variation (CV) of the interspike interval, and the autocorrelation function of the output spike train. In the limit of a deterministic, clocklike train of input events, the PDF of the interspike interval converges to a sum of delta functions, with coefficients determined by the PDF for A. The limiting autocorrelation function of the output spike train is a sum of delta functions whose coefficients fall under a damped oscillatory envelope. When the EPSP CV, σA/μA, is equal to 0.45, a CV for the intersynaptic event interval, σT/μT = 0.35, is functionally equivalent to a deterministic periodic train of synaptic input events (CV = 0) with respect to spike statistics. We discuss the relevance to neural network simulations.

AB - In the probability density function (PDF) approach to neural network modeling, a common simplifying assumption is that the arrival times of elementary postsynaptic events are governed by a Poisson process. This assumption ignores temporal correlations in the input that sometimes have important physiological consequences. We extend PDF methods to models with synaptic event times governed by any modulated renewal process. We focus on the integrate-and-fire neuron with instantaneous synaptic kinetics and a random elementary excitatory postsynaptic potential (EPSP), A. Between presynaptic events, the membrane voltage, v, decays exponentially toward rest, while s, the time since the last synaptic input event, evolves with unit velocity. When a synaptic event arrives, v jumps by A, and s is reset to zero. If v crosses the threshold voltage, an action potential occurs, and v is reset to vreset. The probability per unit time of a synaptic event at time t, given the elapsed time s since the last event, h(s, t), depends on specifics of the renewal process. We study how regularity of the train of synaptic input events affects output spike rate, the PDF and coefficient of variation (CV) of the interspike interval, and the autocorrelation function of the output spike train. In the limit of a deterministic, clocklike train of input events, the PDF of the interspike interval converges to a sum of delta functions, with coefficients determined by the PDF for A. The limiting autocorrelation function of the output spike train is a sum of delta functions whose coefficients fall under a damped oscillatory envelope. When the EPSP CV, σA/μA, is equal to 0.45, a CV for the intersynaptic event interval, σT/μT = 0.35, is functionally equivalent to a deterministic periodic train of synaptic input events (CV = 0) with respect to spike statistics. We discuss the relevance to neural network simulations.

UR - http://www.scopus.com/inward/record.url?scp=66249117496&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=66249117496&partnerID=8YFLogxK

U2 - 10.1162/neco.2008.03-08-743

DO - 10.1162/neco.2008.03-08-743

M3 - Letter

C2 - 19431264

AN - SCOPUS:66249117496

SN - 0899-7667

VL - 21

SP - 360

EP - 396

JO - Neural Computation

JF - Neural Computation

IS - 2

ER -