Cecilia Romaro, Fernando Araujo Najman, Morgan André
In this paper we present a numerical study of a mathematical model of spiking neurons introduced by Ferrari et al. in an article entitled “Phase Transition for Infinite Systems of Spiking Neurons”. In this model a countable number of neurons are linked together in a network, each of them having a membrane potential taking values in the integers, and each of them spiking over time at a rate which depends on the membrane potential through some rate function ϕ. Besides being affected by spikes, each neuron is also subject to leakage: at each leak time, which occurs for a given neuron at a fixed rate γ, the membrane potential of the neuron concerned is spontaneously reset to 0.
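The dynamics described above can be sketched with a small Gillespie-style simulation. This is a minimal illustration, not the authors' numerical study: it assumes a finite ring of n neurons as a stand-in for the infinite network, and the hard-threshold rate function ϕ(u) = 1{u > 0} (the paper allows more general ϕ).

```python
import random

def simulate_ring(n=100, gamma=0.5, t_max=50.0, seed=0):
    """Gillespie-style sketch of a finite version of the model: n neurons
    on a ring (an assumption standing in for the infinite network), with
    hard-threshold rate phi(u) = 1{u > 0} (also an assumption).
    A spiking neuron resets to 0 and adds 1 to its two neighbours;
    a leak resets the potential of the affected neuron to 0."""
    rng = random.Random(seed)
    x = [1] * n                            # integer membrane potentials
    t = 0.0
    spikes = 0
    while t < t_max:
        active = [i for i in range(n) if x[i] > 0]
        if not active:
            break                          # all potentials at 0: no more spikes
        total = len(active) + n * gamma    # total spike rate + total leak rate
        t += rng.expovariate(total)
        if rng.random() < len(active) / total:
            i = rng.choice(active)         # neuron i spikes...
            x[i] = 0                       # ...its potential resets to 0
            x[(i - 1) % n] += 1            # ...and its neighbours gain one unit
            x[(i + 1) % n] += 1
            spikes += 1
        else:
            x[rng.randrange(n)] = 0        # leak: spontaneous reset to 0
    return t, spikes, x
```

Larger γ makes leaks dominate and activity die out sooner; smaller γ lets spiking sustain itself, which is the regime where the phase transition result is relevant.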
Antonio Galves, Eva Löcherbach, Christophe Pouzat, Errico Presutti
In this paper we present a simple microscopic stochastic model describing short term plasticity within a large homogeneous network of interacting neurons. Each neuron is represented by its membrane potential and by the residual calcium concentration within the cell at a given time. Neurons spike at a rate depending on their membrane potential. When spiking, the residual calcium concentration of the spiking neuron increases by one unit. Moreover, an additional amount of potential is given to all other neurons in the system. This amount depends linearly on the current residual calcium concentration within the cell of the spiking neuron. In between successive spikes, the potentials and the residual calcium concentrations of each neuron decrease at a constant rate.
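A rough event-driven sketch of these dynamics is given below. Several details are assumptions, not taken from the abstract: a linear spike rate ϕ(v) = v, exponential decay at rate lam between spikes (reading “decrease at a constant rate” as exponential decay), a synaptic weight w per unit of residual calcium, and reset of the spiker's potential to 0 (standard in related models but not stated here). Rates are frozen between events, so this is a sketch rather than an exact simulation.

```python
import math
import random

def simulate_calcium(n=10, t_max=5.0, lam=1.0, w=0.1, seed=0):
    """Approximate event-driven sketch of the membrane-potential /
    residual-calcium dynamics under the assumptions stated above."""
    rng = random.Random(seed)
    v = [1.0] * n             # membrane potentials
    c = [0.0] * n             # residual calcium concentrations
    t = 0.0
    for _ in range(100_000):  # hard cap on the number of events
        total = sum(v)        # total spike rate with phi(v) = v
        if total <= 1e-12:
            break             # no neuron can spike any more
        dt = rng.expovariate(total)
        if t + dt > t_max:
            break
        t += dt
        decay = math.exp(-lam * dt)
        v = [u * decay for u in v]    # potentials decay between spikes
        c = [u * decay for u in c]    # so do the calcium concentrations
        # pick the spiking neuron with probability proportional to its rate
        r = rng.random() * sum(v)
        acc, i = 0.0, n - 1
        for k in range(n):
            acc += v[k]
            if acc >= r:
                i = k
                break
        c[i] += 1.0           # the spiker's residual calcium goes up by one
        v[i] = 0.0            # assumed reset of the spiking neuron
        for j in range(n):
            if j != i:
                v[j] += w * c[i]   # gain linear in the spiker's calcium
    return t, v, c
```

The key short-term-plasticity ingredient is the last loop: the amount of potential handed to the other neurons scales with the spiker's current calcium, so recently active neurons transmit more strongly.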
In 2018, Ferrari et al. wrote a paper called “Phase Transition for Infinite Systems of Spiking Neurons” in which they introduced a continuous-time stochastic model of interacting neurons. This model consists of a countable number of neurons, each of them having an integer-valued membrane potential whose value determines the rate at which the neuron spikes. The model also has a parameter 𝛾, corresponding to the rate of the leak times of the neurons, that is, the times at which the membrane potential of a given neuron is spontaneously reset to its resting value (which is 0 by convention). As its title indicates, that article proved that this model presents a phase transition phenomenon with respect to 𝛾. Here we prove that this model also exhibits a metastable behavior. By this we mean that if 𝛾 is small enough, then the renormalized time of extinction of a finite version of this system converges to an exponential random variable of mean 1 as the number of neurons goes to infinity.
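The renormalization in the statement above can be illustrated with a Monte Carlo sketch. The setup is assumed, not taken from the paper: a complete graph on n neurons and the hard-threshold rate ϕ(u) = 1{u > 0}. Under those two assumptions the state reduces to the number A of active (positive-potential) neurons: a spike resets the spiker and pushes every other neuron above 0, so A jumps to n−1, while a leak on an active neuron lowers A by one (leaks on inactive neurons change nothing).

```python
import random

def extinction_time(n, gamma, rng):
    """Extinction time of the finite system on the complete graph with
    phi(u) = 1{u > 0} (both assumptions), started from all neurons active.
    Only the active count A matters: spikes (total rate A) send A to n-1,
    leaks on active neurons (total rate gamma*A) send A to A-1."""
    a, t = n, 0.0
    while a > 0:
        total = a * (1.0 + gamma)
        t += rng.expovariate(total)
        if rng.random() < 1.0 / (1.0 + gamma):
            a = n - 1        # a spike reactivates every other neuron
        else:
            a -= 1           # a leak deactivates one active neuron
    return t                 # time at which A first hits 0

rng = random.Random(1)
n, runs = 15, 300
samples = [extinction_time(n, 3.0, rng) for _ in range(runs)]
mean = sum(samples) / runs
normalized = [s / mean for s in samples]   # renormalized extinction times
```

Note that γ = 3.0 is used only so that the runs terminate quickly; the exponential limit is claimed for small γ, where extinction times become astronomically long, so this sketch demonstrates the renormalization machinery rather than the theorem itself.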
Vinícius Lima Cordeiro, Rodrigo Felipe de Oliveira Pena, Cesar Augusto Celis Ceballos, Renan Oliveira Shimoura and Antonio Carlos Roque
Neurons respond to external stimuli by emitting sequences of action potentials (spike trains). In this sense, the spike train is the neuronal response to an input stimulus. Action potentials are “all-or-none” phenomena, which means that a spike train can be represented by a sequence of zeros and ones. In the context of information theory, one can then ask: how much information about a given stimulus does the spike train convey? Or rather, which aspects of the stimulus are encoded by the neuronal response? In this article, we present an introduction to information theory covering its historical aspects, its fundamental concepts, and its applications to neuroscience. The connection to neuroscience is made through demonstrations and discussions of different information-theoretic methods. Examples are given through computer simulations of two neuron models, the Poisson neuron and the integrate-and-fire neuron, and of a cellular automata network model. In the latter case, we show how information theory measures can be used to retrieve the connectivity matrix of a network.
R. F. O. Pena, V. Lima, R. O. Shimoura, C. C. Ceballos, H. G. Rotstein and A. C. Roque
The conventional impedance profile of a neuron can identify the presence of resonance and other properties of the neuronal response to oscillatory inputs, such as nonlinear response amplifications, but it cannot distinguish other nonlinear properties such as asymmetries in the shape of the voltage response envelope. Experimental observations have shown that the response of neurons to oscillatory inputs preferentially enhances either the upper or the lower part of the voltage envelope in different frequency bands. These asymmetric voltage responses arise in a neuron model when it is subjected to oscillatory currents of high enough amplitude and variable frequency. We show how the nonlinearities associated with different ionic currents, or present in the model as captured by its voltage equation, lead to asymmetric responses, and how high-amplitude oscillatory currents emphasize them. We propose a geometric explanation for the phenomenon, in which asymmetries result not only from nonlinearities in the activation curves of the ionic currents but also from nonlinearities captured by the nullclines in the phase-plane diagram and from the system’s time-scale separation. In addition, we identify an unexpected frequency-dependent pattern which develops in the gating variables of these currents and is a product of strong nonlinearities in the system, as we show by controlling this behavior through manipulation of the activation curve parameters. The results reported in this paper shed light on the ionic mechanisms by which brain-embedded neurons process oscillatory information.
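The “conventional impedance profile” referred to above can be sketched as follows: drive a model neuron with sinusoidal currents of varying frequency and record the ratio of voltage-response amplitude to input amplitude. The example uses a passive (linear RC) membrane as a stand-in, with illustrative parameter values; the paper's models are nonlinear and conductance-based, which is precisely where the upper and lower envelopes can differ.

```python
import math

def voltage_response(freq_hz, amp, t_max=5.0, dt=1e-4, tau=0.02, r_in=1.0):
    """Peak-to-peak steady-state voltage of a passive (linear RC) neuron,
    dV/dt = (-V + R*I(t)) / tau, driven by a sinusoidal current.
    tau: membrane time constant (s); r_in: input resistance (arbitrary
    units). Forward-Euler integration; the first half of the run is
    discarded as transient."""
    v = 0.0
    vs = []
    n = int(t_max / dt)
    for k in range(n):
        t = k * dt
        i_ext = amp * math.sin(2 * math.pi * freq_hz * t)
        v += dt * (-v + r_in * i_ext) / tau
        if t > t_max / 2:
            vs.append(v)
    return max(vs) - min(vs)

# Impedance amplitude |Z(f)| ~ half peak-to-peak response / input amplitude.
amp = 1.0
profile = {f: voltage_response(f, amp) / (2 * amp) for f in (1, 5, 20, 80)}
```

For this passive cell the profile is low-pass (|Z| decays monotonically with frequency), and the response envelope is symmetric about rest; resonance requires an additional slow restorative current, and the envelope asymmetries studied in the paper require nonlinearities absent from this linear sketch.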