Robustness of Persistent Firing in a Minimal Recurrent Network of Working Memory

A minimal network of two QIF neurons that reciprocally excite each other forms a kind of neural oscillator, simulating the persistent activity of cortical delay-selective neurons in a working memory (WM) task. The effect of random perturbations on this activity is investigated.


Chapter 1. Introduction

From a reductionist point of view, it is trivial that understanding the individual mechanisms and interactions of neurons, as the basic constituents of the nervous system, is essential for understanding the brain as a whole. On the other hand, a great tendency is growing toward applying reduced models in theoretical neuroscience. Biologically realistic population models consisting of thousands of interacting neurons are highly nonlinear and laborious to analyze thoroughly, while reduced models can be very simple yet physiologically plausible frameworks for studying perceptual mechanisms.

A relevant example of the application of reduced models is a study by Gutkin, Laing, Colby, Chow, & Ermentrout (2001), focusing on the role of spike timing asynchrony and synchrony in sustained neural activity. In their paper a simple two-cell circuit was compared with a population model comprising 500 connected neurons, and it was shown that the dynamical properties of their proposed reduced model are closely analogous to those of the population model. As another instance, we can point to the popular model of decision making by Wang (2002) consisting of 2000 neurons, which was later replaced by a successful simplified version of two coupled neurons, capable of reproducing the same properties as the original model (Wong & Wang, 2006).

Apart from the complexities of large-scale population models that must be tackled, the individual single-cell models must also be relatively simple while remaining biologically credible. Both electrophysiological and dynamical properties of neurons are important for information processing in the brain (Izhikevich, 2007), and numerous computational models of neurons have been introduced to capture neural dynamics at different levels of complexity. Physiologically detailed computational models of cells, such as the famous Hodgkin-Huxley (HH) model (Hodgkin & Huxley, 1952) or detailed compartmental models (Brunel, Hakim, & Richardson, 2014; Almog & Korngreen, 2016), are too complicated to analyze mathematically, while relatively simple canonical models are advantageous due to their generic nature, retaining many important features of a whole family of models.

Minimal models of cells, according to Izhikevich's (2007) definition, are those having minimal sets of currents that enable them to generate action potentials. Such models are appealing because they are relatively simple and their parameters carry electrophysiological meaning, so that their role in the dynamics can be easily identified. Moreover, they have a limited number of phase variables, amenable to analysis using geometrical phase plane methods. This is a great benefit, especially for systems that are not solvable analytically.

In this thesis, the dynamics of neural interaction will be investigated for two coupled excitatory neurons, proposed as a minimal model of working memory (WM). We will demonstrate how information is stored at the cellular level and how dynamical features lead to cessation of mnemonic activity in different conditions. We will take advantage of the generic nature of the well-known Quadratic Integrate-and-Fire (QIF) neuron, a canonical model of type 1 neural activity.

1.1 Type 1 and Type 2 Neurons

Hodgkin (1948) for the first time delineated two classes of axonal firing. Type 1 activity refers to nerve cells whose firing rate rises smoothly from zero as a function of their input current. The firing rate versus input current (F-I) curve of type 1 neurons has the general form presented in figure 1.a. As observed in this figure, there exists a critical value of input current for neurons of this type that switches on periodic firing activity in the cells (here I = 40). Type 1 cells are silent for input values below this threshold, while further increase of I gradually raises their firing rate over a large range of frequencies, as a function of the injected input.

In contrast, the so-called type 2 behavior refers to neurons that produce only a limited range of frequencies. The F-I curve of type 2 cells, as indicated in figure 1.b, is discontinuous: a jump from silence to a non-zero firing rate appears at a critical value of input current (Hodgkin, 1948; Gutkin & Ermentrout, 1998).

Figure 1. Firing rate versus input current a) Type 1 neurons, b) Type 2 neurons (Gutkin & Ermentrout, 1998)

The behavior of neural oscillations when neurons are coupled together in networks strongly depends on whether they are type 1 or type 2. Methods of dynamical systems theory suggest that these two types of behavior are rooted in the type of bifurcation that carries the excitable membrane from rest to rhythmic firing. In the terminology of dynamical systems, type 1 and type 2 behaviors correspond to a `saddle node bifurcation on an invariant circle' and an `Andronov-Hopf bifurcation' respectively (Ermentrout & Chow, 2002).

Consequently, whether a mathematical model is of type 1 or 2 can typically be determined by bifurcation analysis, but as Skinner (2013) has noted, distinguishing neurons between these categories experimentally is much trickier. He points to experimental studies that have provided neural evidence of switching between type 1 and type 2 behavior.

Stiefel, Gutkin, and Sejnowski (2009) have described the switching mechanism between type 1 and 2 behaviors resulting from modulation of biophysical characteristics in three different cellular models. More work is still needed in the field to explain how model parameters relate to the experimental measurements being performed, and how neurons can vary their behavioral characteristics in different conditions.

In this work, we will partly discuss how two coupled Quadratic Integrate-and-Fire (QIF) neurons, each individually type 1, may show more type 2-like behavior, depending on the coupling parameter.

1.2 Quadratic Integrate and Fire Model

The QIF model describes the action potentials of biological neurons by an autonomous differential equation with a discrete reset rule, as indicated in equation (1).

$$\frac{dx}{dt} = x^2 + I_{ext}, \qquad x \leftarrow V_{reset} \ \text{ when } x \geq V_{peak} \tag{1}$$

In this equation, x characterizes the membrane voltage of the neuron and Iext is a parameter determining the activity mode. On the one hand, a negative Iext introduces two real equilibria to the system, x = ±√(−Iext). For simplicity these fixed points are referred to as `a' and `-a' in this work. The negative root `-a' is a stable node that plays the role of a rest state, while the positive root `a' is a saddle node, playing the role of a threshold for spike initiation. The voltage trajectory always converges to the rest state unless it crosses the threshold. Any x value beyond the threshold grows to create an action potential: the voltage quickly hits Vpeak, regresses to a reset value, Vreset, and eventually tends toward the rest state again. In fact, the existence of these two real roots keeps the cell in an `excitable' mode, meaning that the neuron is disposed to fire a single spike in response to a sufficiently large external pulse, but does not exhibit a regular pattern of firing.

On the other hand, a neuron with a positive Iext loses its real equilibria and launches into a `periodic' mode of activity. Thus, Iext = 0 is a bifurcation point for this model. In figure 2 the QIF vector field is indicated for three values of Iext: below, at, and above the bifurcation point. This figure illustrates how increasing the external current from panel a to c leads to the occurrence of a saddle node bifurcation and the disappearance of the fixed points.

Figure 2. dX/dT versus X and fixed points of the QIF model. a) Excitable neuron. b) Saddle node bifurcation, c) Periodic firing mode (Izhikevich, 2007).
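To make these two regimes concrete, the following minimal sketch (in Python; the values chosen for Vpeak, Vreset, and Iext are illustrative, not prescribed by the model) integrates equation (1) with the Euler method and contrasts the excitable and periodic modes.

```python
import numpy as np

def simulate_qif(i_ext, x0=-1.0, v_peak=10.0, v_reset=-2.0, dt=1e-3, t_max=20.0):
    """Euler integration of dx/dt = x^2 + I_ext with the hard reset of equation (1)."""
    x = x0
    spike_times = []
    for k in range(int(t_max / dt)):
        x += dt * (x * x + i_ext)   # QIF vector field
        if x >= v_peak:             # spike: the trajectory hits Vpeak ...
            spike_times.append(k * dt)
            x = v_reset             # ... and is reset to Vreset
    return spike_times

# Excitable mode (Iext < 0): resting at -a = -sqrt(-Iext), the cell never fires
# on its own; a pulse pushing x above +a would elicit exactly one spike.
print(len(simulate_qif(-1.0)))                      # 0 spikes
# Periodic mode (Iext > 0): the fixed points have vanished and the cell fires
# regularly, at a rate that grows with Iext (type 1 behavior).
print(len(simulate_qif(0.5)), len(simulate_qif(2.0)))
```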

1.3 Coupled QIF neurons

Gutkin, Jost, and Tuckwell (2008a) proposed that a pair of coupled QIF neurons is able to produce self-sustained neural activity resembling the persistent activity of Prefrontal Cortex (PFC) neurons during the delay period of a working memory task.

Following that study, a more recent publication by Novikov and Gutkin (2016) investigated the dynamical structure of the same network in fine mathematical detail. They showed that, although the QIF neurons are individually excitable, the system of mutually coupled QIF neurons demonstrates bi-stability between periodic and quiescent modes of activity. To solve the system of equations analytically and to find the attraction basins in the phase space, time-dependent (non-autonomous) synaptic currents were transformed into voltage-dependent (autonomous) currents in that work. One limitation of this approach is that the analysis holds only for the range of parameters in which the synaptic currents decay very fast compared to the period of spiking. Thus, the analytical solutions are not credible in higher frequency bands. What's more, although the autonomous model works perfectly well in typical conditions of slow-rate activity, some features of the system's reaction to synchronization or to external perturbations are not comparable with biological neurons.

Realistic synaptic currents produced following an action potential are time dependent (Purves et al., 2014), i.e. regardless of what happens to the presynaptic cell's membrane voltage, postsynaptic currents decay as a function of time. But in autonomous models the variations of the currents are always tied to the voltage variations of the presynaptic cell. Consequently, it is believed that for studying the effect of external perturbations on neurons, it is more realistic to stay with the non-autonomous models.

In this thesis the same minimal model of working memory (Gutkin et al., 2008a; Gutkin et al., 2008b; Novikov & Gutkin, 2016), comprising two QIF neurons with more realistic time-dependent exponential synaptic kernels, will be investigated. It will be shown that the intended network is a bistable system able to switch between a sustained firing mode and a quiescent state. The mechanisms by which a regular pattern of spiking is generated, sustained, and extinguished will be discussed. For the non-autonomous model, finding attraction basins in the 2D phase space is impossible, but using numerical analysis the effect of noise on the dynamical properties of the network will be studied in particular. It is believed that the model is generalizable to the whole class of type 1 neuron models, and that its properties in stochastic conditions are generalizable to other, non-biological dynamical systems with bi-stability between a Stable Limit Cycle (SLC) and a stable rest state.
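As a concrete illustration of the network just described, here is a minimal sketch of two excitable QIF neurons coupled by time-dependent exponential synapses. All parameter values (coupling strength, synaptic time constant, cue amplitude, reset value) are illustrative assumptions, not the values used later in the thesis; the point is only the qualitative bistability, quiescence before the cue and self-sustained firing after it.

```python
import numpy as np

# Two excitable QIF cells (I_EXT < 0) with mutual excitation through exponentially
# decaying synaptic gates s_i: ds_i/dt = -s_i / TAU_S, and s_i -> s_i + 1 on a spike.
I_EXT, G, TAU_S = -1.0, 10.0, 1.0     # drive, coupling strength, synaptic decay
V_PEAK, V_RESET = 10.0, -2.0
DT, T_MAX = 1e-3, 30.0

def run(cue_at=5.0, cue_amp=9.0, cue_len=1.0):
    x = np.array([-1.0, -1.0])        # both cells start at the rest state -a
    s = np.zeros(2)                   # synaptic gates
    spikes = [[], []]
    for k in range(int(T_MAX / DT)):
        t = k * DT
        cue = cue_amp if cue_at <= t < cue_at + cue_len else 0.0
        # each cell receives the *other* cell's synapse (s[::-1]); the cue goes to cell 0
        x += DT * (x * x + I_EXT + G * s[::-1] + np.array([cue, 0.0]))
        s -= DT * s / TAU_S
        for i in range(2):
            if x[i] >= V_PEAK:
                x[i] = V_RESET
                s[i] += 1.0           # a presynaptic spike kicks the synapse
                spikes[i].append(t)
    return spikes

spikes = run()
before = sum(t < 5.0 for t in spikes[0] + spikes[1])
print(before, len(spikes[0]) + len(spikes[1]))  # no spikes before the cue; many after
```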

In the next chapter, a short introduction to working memory will first be given, and then a few computational models of this cognitive function that are related to this work will be overviewed. In chapter 3, we will bring forward the importance of considering the effect of noise in neural models. `Stochastic Resonance' and `Inverse Stochastic Resonance', two phenomena that occur in nonlinear dynamical systems in the presence of noise, will be introduced. It will be shown that the dynamical structure of our minimal model is disposed to exhibit Inverse Stochastic Resonance (ISR) in reaction to random noise. Chapters 4 and 5 will contain the simulations, results, and conclusion.

Chapter 2. Working Memory

The term working memory (WM) became famous after Baddeley and Hitch (1974) introduced a model of the same name for a kind of temporary information storage in the brain. The distinction between WM and Short Term Memory (STM) is still challenging, and these terms are sometimes used interchangeably (Aben, Stapert, & Blokland, 2012). What is given in this chapter follows a paradigm in which WM and STM are not identical constructs.

2.1 Definition

In contrast to Short Term Memory, WM involves not only the maintenance but also the controlled manipulation of stored information before recall (Baddeley, 1992). A very common example of this mental capacity is the process of reading a phone number, keeping it in mind, dialing it, and then immediately forgetting it. In fact, WM has been considered a system with limited capacity that functions like an interface between Long Term Memory (LTM), perception, and action (Cowan, 2008).

A few years after the identification of WM in human cognition studies (Baddeley, 1992; Baddeley & Hitch, 1974), the well-known neurobiologist Goldman-Rakic (1995) described the cellular basis of WM in the primate prefrontal cortex (PFC). These underlying biological mechanisms will be discussed in the next section, but perhaps the most intelligible explanation of WM is the one she stated in her paper:

“If semantic and procedural memory are the processes by which stimuli and events acquire archival permanence, working memory is the process for the retrieval and proper utilization of this acquired knowledge.” (Goldman-Rakic, 1995)

This kind of ability to store information is considered by some scientists to pertain to attentional processes as well. For example, Purves et al. (2014) have stated that WM is a special category of attention which provides an internal representation of the sensory inputs. It also has a substantial influence on other cognitive functions such as learning, planning, reasoning, language comprehension, and thinking.

In addition to the capacity for storing information, the ability to forget acquired information is equally important (Purves et al., 2001). The stored information must be steadily wiped out to provide free space for incoming information.

In a recent review paper by Dipoppa and Gutkin (2015), five basic operations are enumerated for WM in the brain. The first operator is `Load', functioning when the nervous system encounters a sensory stimulus and encodes it in memory. When the stimulus disappears, the second operator, `Maintain', performs the successful retention of the memory. Moreover, almost always in real life, some irrelevant stimuli appear between the cue and response phases of tasks requiring WM. Therefore, there is a need to block such distractors, which might perturb the retention of relevant information; this process relates to the third operator, called `Prevent'. Next, in the response phase of the task, the stored relevant information is extracted by the operator `Read Out' to be used for completing the task. And lastly, the stored information that is useless after the response phase is erased by the operator `Clear', providing free slots for incoming information.

2.2 Neural Basis of Working Memory

During the last 45 years, considerable evidence has confirmed the prominent role of the prefrontal cortex (PFC) in memory-guided behaviors (Frank, Loughry, & Reilly, 2001). Studies have introduced three classes of neurons in PFC that are active during a WM task: first, the `cue selective neurons', which respond to the onset of the cue; second, the `delay selective neurons', which exhibit persistent tonic activity during the delay period; and third, the `response selective neurons', which react to the occurrence of the response phase (Goldman-Rakic, 1995). Figure 3 indicates neural activity recorded from these three neural groups. The letter C indicates the cue period of the WM task, D the delay period, and R the response time.

The key feature of the neurons in the second group, which also makes them responsible for the retention of information, is that their activity can persist in the absence of any input. Although such self-sustained activity has also been observed in inferotemporal cortex (ITC) as well as in parietal cortices and the basal ganglia (Frank et al., 2001; Fuster, 1997), the property which distinguishes PFC neurons from others is that their activity is greatly robust to distractors (Miller, Erickson, & Desimone, 1996). Such robust neural firing in PFC neurons is called `persistent activity' and is the foundation of WM mechanisms (Goldman-Rakic, 1995).

Figure 3. Different types of neuron in PFC (Goldman-Rakic, 1995)

Moreover, as further evidence for this idea, a sustained change in human event related potentials (ERP) analogous to persistent activity, called the Contralateral Delay Activity (CDA), has been reported during the delay interval of visual working memory tasks (Vogel, McCollough, & Machizawa, 2005; Vogel & Machizawa, 2004).

2.3 Computational Models of Working Memory

Attempts to model working memory, in order to explain the underlying mechanisms of its procedural operations in neural circuits, started soon after the famous papers of Goldman-Rakic (1995) and Fuster (1997), in which the cellular basis of working memory was properly explained. Indeed, some earlier models of short term memory were already available, whose main ideas of information storage were used in the later WM models. For example, Hebb (1949) had introduced the paradigm of attractor states (an `attractor', in the context of dynamical systems theory, refers to a set toward which the variables converge in time) to theorize the mechanisms of information storage. This idea became famous, and plenty of computational models later exploited it with various neural structures capable of sustaining attractor states. For a review of different approaches to modelling WM, see Dipoppa and Gutkin (2013).

One prominent class of computational models of WM suggests local recurrent neural networks as a tool for encoding memory and maintaining the focus of persistent activity. The first publication that greatly progressed this approach is the work by Amit and Brunel (1997). They utilized a network of connected integrate and fire (IF) neurons in their proposed model and successfully simulated electrophysiological data from a particular kind of working memory task. A follow-up study by Brunel and Wang (2001) implemented realistic GABAergic and NMDA synapses, claiming that although persistent activity can be sustained in a subpopulation of neurons by excitatory reverberation, robustness of the activity in the presence of distractors is provided only by synaptic inhibition.

Another recurrence-based model was proposed containing inhibitory and excitatory neural populations connected to each other with spatially Gaussian connection profiles. This network was in fact proposed as a model of spatial WM in which each neuron had a preferred memory field and its firing rate was a function of the spatial location of the cue (Compte, Brunel, Goldman-Rakic, & Wang, 2000).

All of the models mentioned above apply a multi-synaptic inhibition mechanism for the operator `Clear' in WM. Although the clear signal is introduced as an excitatory signal which globally enhances both excitatory and inhibitory activity in these models, inhibition finally dominates and erases the memory. To clarify the main idea of this mechanism, a figure reproduced from Compte et al. (2000) is shown in figure 4. Panels a and b indicate the color maps of firing rate for excitatory and inhibitory neurons respectively. The letters C, D, and R indicate the cue period, the delay period, and the response time of a WM task, respectively. The population firing profile, averaged over the delay period, is also shown in panel c. As can be seen, after accomplishment of the task, global enhancement of inhibitory activity has terminated the persistent firing in the selective excitatory population.

Figure 4. Information maintenance and erasure in a recurrent population model of WM by Compte et al. (2000). a) Activity in pyramidal excitatory cells, b) Activity in inhibitory interneurons, c) The Gaussian profile of mean firing rates due to the Gaussian distribution of spatial connections.

Another approach to implementing the operator `Clear' in recurrent models of WM was suggested by Gutkin et al. (2001). They employed HH models, which are more faithful to the biophysics of cortical neurons, and argued that inhibition is not necessary to switch off the persistent activity: even a transient excitation can synchronize the spikes of the neurons, so that all the synaptic inputs arrive right in the refractory periods when the cells are inactive, which consequently shuts off the activity. The pattern of raster plots during the delay and after injection of a transient excitation in figure 5 indicates this phenomenon.

Figure 5. Stopping the persistent activity in the WM model of Gutkin et al. (2001). The raster plot of excitatory neurons (upper) and the effect of a transient external excitation on it (lower) show that synchronization can switch off the activity.

Gutkin et al. (2001) have also suggested that a copy of the motor command issued after accomplishment of the task is perhaps the source of both the read-out and a direct clearing signal in the brain, while Brunel and Wang (2001) have pointed to `reward' as a possible source for the implementation of erasure.

As mentioned earlier, in this work we will investigate the dynamical features of a minimal network with time-dependent synapses in reaction to both excitatory and inhibitory pulses. We will show that for such non-autonomous systems, robustness varies with respect to the phase of the limit cycle. Finding the attraction basins and regions of stability is not straightforward for this system, because everything depends on time. However, it will be shown by numerical computation that, in general, inhibition is more reliable for wiping out the memory.
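A sketch of that numerical experiment, reusing the illustrative two-cell network from chapter 1 (all parameters again assumed rather than taken from the thesis): start the network on its firing attractor, deliver one brief pulse common to both cells at a chosen time, and test whether the persistent activity survives. Sweeping the pulse amplitude, sign, and timing probes how robustness depends on the phase of the limit cycle and whether inhibition clears the memory more reliably than excitation.

```python
import numpy as np

I_EXT, G, TAU_S = -1.0, 10.0, 1.0
V_PEAK, V_RESET = 10.0, -2.0
DT, T_MAX = 1e-3, 40.0

def survives(pulse_amp, pulse_at, pulse_len=0.2):
    """Return True if the network is still firing at the end of the run."""
    x = np.array([-1.0, V_RESET])       # a state on (or near) the firing attractor:
    s = np.array([0.0, 1.0])            # cell 1 has just spiked and excites cell 0
    last_spike = 0.0
    for k in range(int(T_MAX / DT)):
        t = k * DT
        pulse = pulse_amp if pulse_at <= t < pulse_at + pulse_len else 0.0
        x += DT * (x * x + I_EXT + G * s[::-1] + pulse)   # common pulse to both cells
        s -= DT * s / TAU_S
        for i in range(2):
            if x[i] >= V_PEAK:
                x[i], s[i] = V_RESET, s[i] + 1.0
                last_spike = t
    return last_spike > T_MAX - 5.0

# Sweep the pulse timing across one firing period to expose the phase dependence,
# comparing an inhibitory (-20) and an excitatory (+20) pulse of equal magnitude:
for t0 in np.arange(20.0, 21.0, 0.2):
    print(round(t0, 1), survives(-20.0, t0), survives(+20.0, t0))
```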

Chapter 3. Ubiquitous Noise

According to numerous studies of the nervous system, stochastic noise is ubiquitous in neural networks, whether intrinsic to the cells and ion channels or coming from external sources (Stein, Gossen, & Jones, 2005). Moreover, random noise occurs at many different scales, ranging from the molecular level to large brain networks (Faisal, Selen, & Wolpert, 2008). Research interest, both in new experimental methods for the identification of noise and in computational modelling approaches for investigating the functional roles of noise, has significantly increased recently (McDonnell & Ward, 2011). It has even been proposed that the nervous system has evolved not only to adapt to unavoidable perturbations, but also to take advantage of noise in some cases (Faisal et al., 2008). Perhaps that is why our brain still performs well in spite of exposure to significant noise levels. Across this wide range of studies, from synapses to neurons and networks, to the whole brain and to behavior, diverse positive and negative kinds of influence have been identified for noise (McDonnell et al., 2015).

Noise, as one of the main sources of trial-wise variation in brain-behavior studies, was long assumed to be troublesome. It was strongly believed that the influence of noise had to be controlled in brain recording analyses as well as in computational modelling. Averaging activity over populations of neurons and over several trials is a very typical approach for cancelling out the randomness caused by noise, but the problem is that this approach can smooth over some properties of neurons and misrepresent neural dynamics (Stokes & Spaak, 2016; Lundqvist, Rose, Herman, Brincat, Buschman, & Miller, 2016). Consequently, the new lines of research on the role of noise are greatly appealing: firstly, because they make it easier to decide, in experimental methods, what should be considered signal and what should be considered noise (McDonnell et al., 2015); and secondly, because in theoretical neuroscience the dream of obtaining highly realistic models can come true by applying realistic noise sources (Ashida & Kubo, 2010).

Moreover, understanding the effect of noise can have interesting benefits for pathological and therapeutic approaches. For example, Priplata et al. (2006), after reporting the observation of a facilitating effect of noise on their debilitated human subjects, proposed that randomly vibrating shoes may improve posture and balance in patients. Also, as suggested by Gutkin, Jost, and Tuckwell (2009), unsound oscillatory activities in the brain, such as epilepsy, might be the consequence of small noisy signals. Besides, understanding the role of noise in higher order cognitive functions such as learning (Buesing, Bill, Nessler, & Maass, 2011), memory (Rolls, 2013), and decision making (Miller & Katz, 2010) would hopefully lead to ways of improving cognition by the use of appropriate noise. In this thesis, we will show how random perturbations influence the mechanism of information storage in the WM model under discussion. It is proposed that randomness in the neural activity of WM circuits might take advantage of a phenomenon called Inverse Stochastic Resonance to implement the operation `Clear'.

3.1 Stochastic Resonance - History

In 1995, Kurt Wiesenfeld and Frank Moss wrote with rapture that, although engineers had always sought to minimize the effect of noise in electrical and communication circuits, recent research had shown that noise can play a constructive role in the detection of weak periodic signals via a mechanism called Stochastic Resonance (Wiesenfeld & Moss, 1995). Actually, this phenomenon was first propounded in 1981 by Roberto Benzi as a means of explaining the periodic recurrence of ice ages. At that time, it was already known that the average period of interglacial transitions on the Earth is about 100,000 years and that the periodicity of climate fluctuations coincides with the glaciation cycles. Benzi proposed that random fluctuations in the amount of energy received by the Earth make it possible for this small periodic perturbation to manifest itself in the form of periodic glaciation cycles. This proposed mechanism was termed Stochastic Resonance (SR) (Benzi, Sutera, & Vulpiani, 1981).

After this initial idea, the field was largely quiescent until the late 1980s, when the presence of SR in a bistable ring laser, switching between clockwise and counterclockwise modes, was demonstrated experimentally (McNamara, Wiesenfeld, & Roy, 1988). Another early work was the observation of SR in bistable electron-paramagnetic-resonance systems (Gammaitoni, Martinelli, Pardi, & Santucci, 1991). The idea then became more and more popular and appeared in other contexts.

In 1993, SR was demonstrated in the sensory neurons of crayfish (Douglass, Wilkens, Pantazelou, & Moss, 1993). The idea crossed disciplinary boundaries and led to widespread interest in its applications to biological systems and neuroscience (Wiesenfeld & Moss, 1995). Researchers started investigating SR in neuronal models and exploiting it for decoding neural data. Longtin (1993) showed for the first time that an excitable cell model, the FitzHugh-Nagumo model, demonstrates stochastic resonance when driven simultaneously by a periodic and a stochastic input.

3.1.1 Definition

Despite being a single phenomenon, SR has been defined in different ways. McNamara et al. (1988) pointed to the improvement of the signal to noise ratio in the presence of noise as the primary signature of stochastic resonance. David Lyttle (2008) has written that SR is a nonlinear phenomenon whereby the activity of a dynamical system correlates with a periodic input in the presence of an optimal level of noise. Longtin (1993) specified that SR occurs when the time scale imposed by the external periodic modulation becomes commensurate with an appropriately defined switching rate of a bistable system. Russell, Wilkens, & Moss (1999) stated that SR is the phenomenon whereby an optimal level of noise added to a weak information-carrying input enhances the information content at the output of a certain nonlinear system. All of these statements point to the beneficial effect of noise in natural or artificial systems possessing certain prerequisite conditions.

3.1.2 SR in Biological Networks

An early and interesting piece of evidence for the role of SR in biological species was the work by Russell et al. (1999). They applied noisy electric fields in the water to investigate the feeding behavior of paddlefish. Paddlefish in nature employ a kind of biological electroreceptor to detect electrical signals from their planktonic prey. Russell and his colleagues found an optimal level of noise at which the spatial range over which paddlefish detect plankton was significantly broadened. Figure 6, reproduced from their paper, indicates the spatial distribution of prey in the water from the fish's point of view. Panel b of this figure demonstrates the enhancement of plankton detection in comparison with the noise-free condition shown in panel a. Also, as shown in panel c, further increase of noise decreased the performance again.

Figure 6. Spatial distributions of the locations of plankton prey at different noise levels from the perspective of a paddlefish. a) Control without noise, b) Optimal moderate noise, c) High noise (Russell et al., 1999).

The first demonstration of SR in the neuronal networks of the brain came from Gluckman et al. (1996). They applied a time-varying electric field carrying both signal and noise to a population of neurons in a mammalian brain. By varying the magnitude of the stochastic component of the field they observed stochastic resonance in the responses of the neurons to the weak periodic signals. (Results of this work are presented as an example in section 3.1.7 to discuss methods of measuring SR.)

A finely detailed study of the role of SR in different membrane patch sizes of the cell was carried out by Schmid, Goychuk, and Hänggi (2001). Using a stochastic generalization of the HH model, they showed that SR occurs only for sufficiently large membrane patches, and thus that biological SR is not rooted in the individual stochastic dynamics of single ion channels, but rather in the collective properties of ion channel populations.

One of the very first experiments confirming the enhancement effect of noise on human behavior was published by Kitajo, Nozaki, Ward, and Yamamoto (2003). In this study, the authors reported that the behavioral responses of their human subjects to weak visual stimuli were optimized by presenting randomly changing grey-level signals to their contralateral eyes. There also exist plenty of studies underscoring the important role of SR in the enhancement of auditory signal detection and discrimination in human subjects, to the extent that Zeng, Fu, and Morse (2000, p. 1) have written: “Noise is an integral part of the normal sensory process and should be added to auditory prostheses”.

The beneficial effects of noise have been investigated in several human tactile experiments as well. For instance, it has been shown that noise can improve balance and reduce postural sway in the elderly (Priplata et al., 2006). Moss, Ward, and Sannita (2004) have reviewed a few landmark experiments in this field, covering psychophysics, electrophysiology, animal behavior, fMRI, human vision, hearing, and tactile function, plus single and multiunit activity recordings.

3.1.3 Mechanism of Classic SR

SR was classically considered to happen in any bistable nonlinear dynamical system subject to noise and a periodic signal (Gammaitoni et al., 1998). A generic form of such systems can be described by equation (2).

$$\dot{x} = -\frac{dV(x)}{dx} + A\sin(\omega t) + \xi(t) \tag{2}$$

Here ξ(t) is a random noisy input and A sin(ωt) is the reference periodic input signal to the system. V(x) is a double well potential possessing three equilibria: two stable fixed points separated by an unstable equilibrium above a potential barrier. This system is illustrated by a mechanical analogy in figure 7, which helps in understanding the phenomenon.

Figure 7. a) A symmetric double well with three equilibria, equivalent to equation (2), b) Periodic modulation of the double well (Lyttle, 2008).

If we imagine a particle subject to friction moving in this double well, then the particle's position, x(t), can be treated as the state variable of this system. The periodic forcing successively tilts the double well, raising and lowering the potential barriers. In the absence of this periodic forcing (A = 0), the random noise will cause x to fluctuate around the minima with a variance proportional to the intensity of the noise. It may also occasionally lead to noise-induced hops from one well to the other, forming an irregular pattern of switching. On the other hand, we can apply to this system a weak periodic signal which by itself is far too small to push the particle across the barrier. However, the concurrence of this weak signal with noise causes the transition probability density to vary periodically. As a result, the weak periodic signal, whose frequency could not be observed by watching the state variable of the double well alone, becomes pronounced in the presence of a small amount of noise (Lyttle, 2008; Wiesenfeld, & Moss, 1995).
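The following sketch integrates equation (2) by the Euler-Maruyama method for the standard quartic double well V(x) = x⁴/4 − x²/2 (so the force is x − x³); the amplitude, frequency, and noise values are illustrative choices. It measures the output power at the drive frequency: for a subthreshold drive this power is small at low noise, peaks at an intermediate noise intensity, and falls off again, which is the signature of SR.

```python
import numpy as np

rng = np.random.default_rng(0)

def power_at_drive(d, amp=0.25, f=0.01, dt=0.01, t_max=2000.0):
    """Simulate dx = (x - x^3 + amp*sin(2*pi*f*t)) dt + sqrt(2*d) dW and return
    the output power at the drive frequency f (a simple Fourier coefficient)."""
    n = int(t_max / dt)
    t = np.arange(n) * dt
    drive = amp * np.sin(2 * np.pi * f * t)   # subthreshold: amp < 2/(3*sqrt(3))
    kick = np.sqrt(2 * d * dt)
    x = np.empty(n)
    xv = -1.0                                 # start in the left well
    for k in range(n):
        xv += dt * (xv - xv**3 + drive[k]) + kick * rng.standard_normal()
        x[k] = xv
    coeff = np.sum(x * np.exp(-2j * np.pi * f * t)) * dt
    return np.abs(coeff) ** 2 / t_max

# Sweeping the noise intensity traces out the non-monotonic SR curve:
for d in (0.02, 0.08, 0.15, 0.4, 1.0):
    print(d, round(power_at_drive(d), 3))
```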

3.1.4 Threshold or Non-dynamical SR

A more general characterization of stochastic resonance emerged from the work of Gingl et al. (1995), discarding the necessity of bi-stability: the so-called non-dynamical stochastic resonance. In this context, a nonlinear system having a threshold, a subthreshold information-bearing stimulus, and a noise source is sufficient for SR phenomena to occur (Moss, Ward, & Sannita, 2004).

This new idea widely extended the class of systems in which SR could be expected to be found, because the three required ingredients are extremely ubiquitous, present in many man-made and natural systems. In particular, models of neural activity which are not bistable but exhibit threshold-like dynamics became relevant systems in which to search for stochastic resonance.

An example of such a system from the Gingl et al. (1995) paper is presented in figure 8. In this figure a subthreshold periodic signal is combined with noise (panel a) and applied to the system. As soon as this noisy signal crosses the threshold, an action potential is produced at the output (panel b). The resonance here refers to the correlation of the signal frequency with the distribution of the produced spikes. Panel c of this figure shows the salience of the frequency components of the reference periodic signal in the Power Spectral Density (PSD) of the spike trains.

Figure 8. a) Sinusoidal signal plus Gaussian white noise, below a threshold shown by the straight line, b) output spikes, which are the threshold-crossing events, c) averaged power spectrum of the spike trains (Gingl et al., 1995).
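A minimal sketch of this threshold mechanism (all values illustrative): a subthreshold sinusoid plus white noise is compared against a fixed threshold, an output spike is an upward crossing, and the phase locking of spikes to the drive is summarized by the length of the mean resultant vector. With too little noise there are no events at all; with too much, the phase preference washes out; an intermediate noise level works best.

```python
import numpy as np

rng = np.random.default_rng(1)

def phase_locking(noise_sd, thresh=1.0, amp=0.5, f=5.0, dt=1e-3, t_max=200.0):
    """Spikes = upward crossings of `thresh` by a subthreshold sinusoid plus noise.
    Returns (spike count, mean resultant length of spike phases; 1 = perfect locking)."""
    n = int(t_max / dt)
    t = np.arange(n) * dt
    x = amp * np.sin(2 * np.pi * f * t) + noise_sd * rng.standard_normal(n)
    up = (x[1:] >= thresh) & (x[:-1] < thresh)     # threshold-crossing events
    if up.sum() == 0:
        return 0, 0.0
    phases = (2 * np.pi * f * t[1:][up]) % (2 * np.pi)
    return int(up.sum()), float(np.abs(np.exp(1j * phases).mean()))

for sd in (0.05, 0.2, 0.4, 0.8, 2.0):
    print(sd, phase_locking(sd))
```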

3.1.5 Computational Implications of SR

In essence, we can say that SR occurs whereby a weak signal entrains the noise-induced hopping and makes the transitions between the two states of a system more regular. Such regularity develops with increasing noise intensity up to a particular point, and then the switching becomes more and more irregular again. Therefore, there exists an intermediate value of noise which is optimal for signal detection. It is important to notice that for SR to be useful, positive detection of a sub-threshold input must be more desirable than a failure to detect a supra-threshold input (Faisal, Selen, & Wolpert, 2008).

Practically, we can think about the detection of a weak signal in a noisy environment while the output of the system is observed as a time series of switching events. In such a condition, the more noise is added, the more sensitive the detector becomes (Wiesenfeld & Moss, 1995). Phenomena with such a mechanism occur very often in biological networks, for example in neurons of the auditory fiber of the squirrel monkey (Rose, Brugge, Anderson, & Hind, 1967), in hand mechanoreceptive afferents of the macaque monkey (Talbot & Mountcastle, 1968), and in the primary visual cortex of the cat when subjected to a periodic stimulus (Ogawa, Bishop, & Levick, 1966). These neurons respond to periodic signals and exhibit multimodal inter-spike interval histograms, with the peaks of the histograms located at integer multiples of the period of the driving force. A histogram of interspike intervals from a cat auditory nerve fiber, taken from Longtin's (1993) paper, is shown in figure 9.

Figure 9. ISI data from extracellular recordings of cat auditory nerve fibers. The stimulus was a pure 800-Hz tone (Longtin, 1993).

This phenomenon was simulated with a FitzHugh-Nagumo model in the presence of different noise intensities by Longtin (1993). The results of this work, presented in figure 10, suggested that as the noise intensity increases, more spikes occur at the first and second integer multiples of the reference period, narrowing down the Inter Spike Interval Histogram (ISIH). This work was theoretical evidence for the enhancement of signal detection by noise in neurons.

Figure 10. Histograms of the ISI distribution for the FHN equations with four different noise intensities (Longtin, 1993).

3.1.6 Measures of SR

Over the years a number of methods have been used to quantitatively demonstrate the SR effect in dynamical systems. Early works in this field mostly focused on the behavior of the output signal in the presence of noise, but this focus has since shifted. Both theoretically and experimentally, it is more common today to define stochastic resonance in terms of the Signal to Noise Ratio (SNR) (McNamara & Wiesenfeld, 1989).

In general, power spectra are the tools which provide the necessary information for the calculation of the SNR. However, the pattern of the output Power Spectral Density (PSD) itself can be used directly as evidence for the presence of SR. For this purpose, the spectral power at the input frequency should be greater than the power at other frequencies, and this prominence of the input component in the PSD should be greatest at a particular noise level. Correspondingly, the amplitude of the spectral peak corresponding to the periodic component of the response increases, reaches a maximum, and then decays as a function of noise. As a result, in systems with SR, the SNR versus noise intensity forms an inverted-U shaped curve (Lyttle, 2008; Gluckman et al., 1996).

Another measure of SR often used in neuroscience applications is the Inter Spike Interval Histogram (ISIH). For classical SR in systems without a periodic signal, the time spent in each of the attraction domains is a random variable, and the probability distribution of this time span decreases exponentially with time. In the presence of periodic forcing, the ISI distribution turns into a multimodal function with Gaussian-like peaks at integer multiples of the reference period, while keeping its exponential form of decay (similar to the example in figure 8). As noise increases, the first-order peaks of the ISIH get higher and narrower, which is a result of the phase synchronization of the switching events (Lyttle, 2008).
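The two measures can be written down compactly. The sketch below (function and parameter names are ours, for illustration) builds the ISIH of a spike train in units of the drive period, and estimates the SNR as the PSD power in the drive's frequency bin relative to the neighbouring noise floor.

```python
import numpy as np

def isi_histogram(spike_times, period, n_multiples=6, bins_per_period=20):
    """ISI histogram in units of the drive period; with SR the mass piles up in
    narrow peaks near the integer multiples 1, 2, 3, ... of the period."""
    isi = np.diff(np.sort(np.asarray(spike_times))) / period
    edges = np.linspace(0.0, n_multiples, n_multiples * bins_per_period + 1)
    counts, _ = np.histogram(isi, bins=edges)
    return counts, edges

def snr_at_drive(x, dt, f_drive, n_neighbours=10):
    """SNR estimate from the output PSD: power in the bin at f_drive divided by
    the mean power of the surrounding bins (the local noise floor)."""
    psd = np.abs(np.fft.rfft(x - np.mean(x))) ** 2
    freqs = np.fft.rfftfreq(len(x), dt)
    k = int(np.argmin(np.abs(freqs - f_drive)))
    lo, hi = max(k - n_neighbours, 1), k + 1 + n_neighbours
    floor = np.concatenate((psd[lo:k], psd[k + 1:hi])).mean()
    return psd[k] / floor
```

Feeding these helpers the spike trains and output traces from the threshold sketch above, and sweeping the noise level, reproduces the inverted-U SNR curve described in the text.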

3.1.7 Example

The experimental results of Gluckman et al. (1996) are briefly reviewed in the following as an example of the mechanism and measures of SR. The data in this work were obtained from mammalian brain cells subject to Gaussian white noise with an rms of Anoise and periodic forcing of amplitude Asin. Figure 11.A represents the local field potentials (LFP) of a typical network of neurons in response to two different types of input. The upper traces are LFPs and the second traces are the processed signals after filtering the LFP to identify bursts; the heights of these spikes indicate the burst rate. As observed in the upper panel, for the network with purely noisy input, only randomly occurring bursts were recorded. Adding a subthreshold sinusoidal signal to the input (lower panel) caused bursts to occur preferentially near the peaks of the sinusoid. The Bursting Probability Density (BPD) as a function of the phase of the periodic forcing, plotted in figure 11.B, shows that in a purely noisy network bursts occurred randomly with respect to phase. However, as Anoise increases, from panel a to d, the BPD becomes a peaked function of phase while also getting broader. In panel d, the BPD has become so broad that the bursts no longer synchronize with a particular phase; that is why the SNR decreases for this noise value.

The ISI patterns and PSD in figures 11.C and 11.D, respectively, are other measures of SR. In panel a, where Asin = 0, the ISIH is completely featureless and the PSD is widely distributed over all frequency ranges. Then, with increasing noise (panels b and c), the peaks of the ISIH tend to lie at integer multiples of the sinusoid's period, while PSD peaks appear at harmonics of the sinusoidal signal's frequency. And finally, as expected, the largest noise, in panel d, destroys the regularity of the ISIH, as well as shrinking the ratio of signal components to noise components. The essence of SR can be grasped by looking at the shape of the SNR versus noise intensity curve in figure 11.E. It is clear that there exists an optimal noise level, at about Anoise = 10, which maximizes the SNR.

Figure 11. A) Activity in the CA1 layer of a longitudinal slice with an imposed electric field. First trace: purely random input. Second trace: a sinusoid plus random input. B) Probability density of bursting as a function of the phase of the periodic forcing. C) ISIH, D) PSD, E) SNR versus noise intensity (Gluckman et al., 1996).

3.2 Inverse Stochastic Resonance - History and Definition

About two decades after the excitement over the discovery of SR in the field, a study by Gutkin, Hely, and Jost (2004) opened a new view of the effects of noise on nonlinear dynamical systems. Noise, hitherto assumed to advance bifurcation in multistable systems, was shown in that study to delay switching in a bistable neural oscillator. The minimal network discussed there comprised two synaptically connected theta neurons that reciprocally excited each other and produced sustained firing. The surprising point was that although the addition of noise to each neuron model individually advanced the saddle node bifurcation intrinsic to the theta neuron model, the insertion of small noise into the coupled system showed the opposite effect: the observed effect of noise in their model was the cessation, or at least the diminution, of firing activity, and not its enhancement. A few years later, this phenomenon, which had been theoretically demonstrated in other computational models as a counterpart to SR, was denominated Inverse Stochastic Resonance (ISR) (Gutkin et al., 2009).

Some experimental studies have also supported the inhibitory effect of noise on the repetitive firing of neurons. For example, a squid axon pacemaker having a quiescent and a firing mode of activity has been studied (Paydarfar, Forger, & Clay, 2006), indicating that the pattern of transitions in this system is a function of the noise intensity present, and that small noise has an inhibitory effect on spiking.

Gutkin et al. released further publications (2008a; 2008b) about the effect of noise on coupled neurons of type one and type two, using HH and QIF models respectively. They indicated that by synaptically coupling these neuron models it is possible to create a system capable of producing sustained firing without requiring any input. This oscillatory activity can be stopped by small amounts of noise, due to the bi-stability of the system between a periodic state and a resting state. In their papers, the termination of activity by stochastic inputs is portrayed in terms of the probability of escape from the attraction domain of the periodic attractor and absorption by the stable rest point.

One year later, Tuckwell, Jost and Gutkin (2009) claimed that such a seemingly odd effect of noise on the dynamics of neural systems is not restricted to coupled neurons, but is generalizable to any system of nonlinear ordinary differential equations in which a stable limit cycle coexists with a stable rest state, such as occurs in circadian rhythms and cardiology, or even in ecological and astronomical systems. It was shown in this study that the activity of a single regularly firing HH neuron, which has the aforesaid properties intrinsically, can be terminated by noise in such a way that the mean firing rate is a U-shaped function of the noise intensity. Figure 12.a, taken from this study, shows this function.

Figure 12. a) Mean firing rate versus noise intensity in the HH model for three different mean input currents: subcritical (5.5), supercritical (8), and right at the critical bifurcation point (6.8) (Tuckwell, Jost & Gutkin, 2009). b) Mean firing rate versus noise intensity in the same model for differently coloured noise (Guo, 2011).

The noise effect has also been studied in a more realistic HH model including spatial extent (Tuckwell & Jost, 2010; Tuckwell & Jost, 2011), supporting the notion and existence of ISR in biological neurons. The same phenomenon arising from a finite number of stochastic ion channels in the HH model has also been demonstrated by Uzuntarla, Cressman, Ozer, and Barreto (2013).

Some experimental studies have revealed that the background noise input of real biological neural systems is closer to coloured noise than to white noise (Destexhe & Contreras, 2006). Guo (2011) investigated the effect of coloured noise on the firing activity of the HH model. He claimed that the inhibitory influence of coloured noise is stronger than that of Gaussian white noise. A function indicating the mean firing rate versus noise intensity for three coloured noises plus a white noise in an HH model was reported in his paper and is presented in figure 12.b. This figure suggests that the more coloured a noise is, the more probable it is to extinguish the activity.

Besides, Guo suggested that although ISR diminishes the activity in neural systems, it is not necessarily a negative impact. He proposed that ISR might be a mechanism for enriching the intrinsic stochastic dynamics of biological neural systems.

The first clear experimental evidence for a functional role of ISR in biological neurons has come out very recently (Buchin, Rieubland, Häusser, Gutkin, & Roth, 2016), suggesting that a particular level of noise can efficiently inhibit cerebellar Purkinje cells. These cells possess two stable states of activity, and synaptic perturbations help them switch quickly between the silent and repetitive firing modes.

3.2.1 ISR Mechanism

As Uzuntarla et al. (2013) have argued, three key features of the dynamical structure of a nonlinear system lead to the occurrence of the ISR phenomenon. (1) The first of these factors is the coexistence of a limit cycle with a stable resting equilibrium. In such a condition, the state space divides into the attraction basins of the limit cycle and of the stable rest state, so the system is disposed to switch between these two regions under the force of a random perturbation. (2) The second factor is that the basin boundary must be relatively close to the stable limit cycle, so that small noise can easily deflect phase trajectories off the limit cycle and force them into the attraction domain of the rest state. (3) The last factor is a relatively large distance between the boundary and the rest point. Due to this property, small noise will not be able to turn the system back into the active mode once it is silent. As long as such a deterministic structure is present in a system, the probability of transitions between the states will decrease in the presence of low noise, reach a minimum at an intermediate level of noise, and increase again for large noise intensities.
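Given the two-cell network sketched in chapter 1, the ISR experiment itself is short to write down. The sketch below (parameters are again illustrative, and whether a clean U-shape appears depends on their tuning) starts the network on its firing attractor, adds white noise of intensity D to both membrane equations, and reports the mean firing rate over trials; under conditions (1)-(3) the rate should dip at an intermediate D.

```python
import numpy as np

rng = np.random.default_rng(2)

I_EXT, G, TAU_S = -1.0, 10.0, 1.0
V_PEAK, V_RESET = 10.0, -2.0
DT = 1e-3

def mean_rate(d, t_max=30.0, trials=10):
    """Mean firing rate (spikes per cell per unit time) at noise intensity d."""
    kick = np.sqrt(2 * d * DT)
    rates = []
    for _ in range(trials):
        x = np.array([-1.0, V_RESET])   # start on the firing attractor:
        s = np.array([0.0, 1.0])        # cell 1 has just spiked
        n_spikes = 0
        for k in range(int(t_max / DT)):
            x += DT * (x * x + I_EXT + G * s[::-1]) + kick * rng.standard_normal(2)
            s -= DT * s / TAU_S
            for i in range(2):
                if x[i] >= V_PEAK:
                    x[i], s[i] = V_RESET, s[i] + 1.0
                    n_spikes += 1
        rates.append(n_spikes / (2 * t_max))
    return float(np.mean(rates))

for d in (0.0, 0.05, 0.2, 1.0, 5.0):
    print(d, round(mean_rate(d), 2))
```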

3.2.2 Example

The dynamical structure just described, as investigated for a single HH model by Uzuntarla et al. (2013), will be briefly presented as an example here. For this system, the limit cycle and the equilibrium correspond to the firing activity and rest state of an HH neuron, respectively. The neuron exhibits two kinds of bifurcation, as follows:

For small input currents the only attractor of the neuron is a stable equilibrium, corresponding to the resting membrane voltage. As the input current increases, one stable and one unstable limit cycle emerge at a saddle node bifurcation of limit cycles and coexist with the rest state. For larger values of the input, the unstable limit cycle shrinks gradually, and at an Andronov-Hopf bifurcation it collapses entirely onto the stable equilibrium. As a result, between these bifurcations the neuron is bistable, and sufficiently large noise can induce switching behavior in it. Figure 13 indicates the neuron's transitions between the spiking and resting states in phase space. The neuron's reaction to large noise (panels a and b) is compared with a condition of exposure to small noise (panels c and d). In this figure the right panels show magnified versions of the left panels close to the rest state. As can be seen, low noise spirals the trajectories toward the equilibrium, which lies too far from the basin boundary for the noise to carry them back out. A large noise, however, is able to carry the trajectories across this distance and push them back into the attraction domain of the limit cycle; therefore, an intermittent pattern of firing appears in the case of large noise. In the ISR curves, as in figure 12, the growth of the spike counts on the right side of the optimal noise level is due to this phenomenon.

...
