Transcript of "Elements of Theory for Multi-Neuronal Systems"
ELEMENTS OF THEORY FOR MULTI-NEURONAL SYSTEMS
Witali L. Dunin-Barkowski, Scientific Research Institute for System Analysis, Russian Academy of Sciences, Moscow, Russia
Achievements and Applications of Contemporary Physics and Informatics, Kiev, August 17, 2011
[email_address]
The course consists of two parts: (1) representation of continuous values in neuronal systems; and (2) informational evaluations of neural systems.
(1) Inputs and outputs of neural systems, and sometimes structures located deep inside them, use the neural impulse frequency code for continuous variables. In models, this code is often substituted with continuous variables, stripped of the impulses. The lectures will give a systematic approach that is correct at both extremes – just impulses and just continuous entities – as well as anywhere between them. In other cases, groups (pools) of neurons are used to represent continuous variables. We consider two cases of multi-neuronal representation of continuous variables: the simultaneous (or parallel) representation and the sequential (or bump) representation. The first is implemented, for example, in the set of motor neurons of a single muscle. The second is thought to be implemented in cortical columns, in the hippocampal place cell system, and in many other working sites of the neural computational machinery. We will consider in detail the qualitative and quantitative properties of multi-neuronal coding of continuous variables (4 hours).
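The idea of the impulse frequency code can be illustrated with a minimal sketch (not a model from the lectures; the rate, window length, and time step are illustrative assumptions): a continuous value is encoded as the rate of a Poisson spike train, and a decoder recovers it by counting spikes in a time window.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode_rate(value, t_window=2.0, dt=0.001):
    """Encode a continuous value (in spikes/s) as a Poisson spike train."""
    n_bins = int(t_window / dt)
    # in each small time bin a spike occurs with probability value * dt
    return rng.random(n_bins) < value * dt

def decode_rate(spikes, t_window=2.0):
    """Recover the continuous value as the observed mean firing rate."""
    return spikes.sum() / t_window

true_rate = 40.0                 # the continuous value to be transmitted
spikes = encode_rate(true_rate)  # "just impulses" end of the spectrum
estimate = decode_rate(spikes)   # "just continuous" end of the spectrum
```

The decoded `estimate` fluctuates around the true value; the fluctuation shrinks as the counting window grows, which is one reason the continuous substitute is only an approximation of the impulse code.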
(2) The theory for evaluating the informational efficiency and informational capacity of neuronal systems, in theoretical models and in physiological experiments, will be presented. In conclusion, a plausible roadmap to a complete understanding of the principles of neural operation in the human brain will be briefly discussed (2 hours).
Course language: English
Course duration: 6 hours
References
Borisyuk G.N., Borisyuk R.M., Dunin-Barkowski W.L., Kovalenko V.N., Kovalenko E.I. Estimation of Information Capacity of Purkinje Cells. In: "Locally Interacting Systems and their Application in Biology", Lecture Notes in Mathematics, Vol. 653, Springer-Verlag, Berlin, N.Y., 1978, pp. 72-90.
Dunin-Barkowski W.L. Informational Processes in Neuronal Structures. Moscow, Nauka Publishers, 1978, 166 pp. (in Russian).
Dunin-Barkowski W.L. Multineuronal systems: theory and experiment. Uspekhi Fizicheskikh Nauk, 1986, Vol. 150, No. 2, pp. 321-323 (in Russian; an English version may be purchased on the Internet).
Dunin-Barkowski W.L., Osovets N.B. Hebb-Hopfield neural networks based on one-dimensional sets of neuronal states. Neural Processing Letters, 1995, Vol. 2, No. 4, pp. 28-31.
Orem J.M., Lovering A.T., Dunin-Barkowski W.L., Vidruk E.H. Endogenous excitatory drive to the respiratory system in rapid eye movement sleep. J. Physiol., 2000, Vol. 527.2, pp. 365-376.
Dunin-Barkowski W.L., Sirota M.G., Lovering A.T., Orem J.M., Vidruk E.H., Beloozerova I.N. Precise rhythmicity in activity of neocortical, thalamic and brain stem neurons in behaving cats and rabbits. Behavioral Brain Research, 2006, Vol. 175, No. 1, pp. 27-42.
Hopfield J.J. Neurodynamics of mental exploration. PNAS, 2010, Vol. 107, No. 4, pp. 1648-1653.
Romani S., Tsodyks M. Continuous attractors with morphed/correlated maps. PLoS Comput. Biol., 2010, Vol. 6, No. 8.
Szatmáry B., Izhikevich E.M. Spike-timing theory of working memory. PLoS Comput. Biol., 2010, Vol. 6, No. 8.
Dunin-Barkowski W.L., Lovering A.T., Orem J.M., Baekey D.M., Dick T.E., Rybak I.A., Morris K.F., O'Connor R., Nuding S.C., Shannon R., Lindsey B.G. L-plotting - a method for visual analysis of physiological experimental and modeling multi-component data. Neurocomputing, 2010, Vol. 74, No. 1-3, pp. 328-336.
Dunin-Barkowski W.L., Flerov Yu.A., Wyshinsky L.L. Prognosis of dynamical systems behavior based on cerebellar-type neural technologies. Optical Memory and Neural Networks (Information Optics), 2011, Vol. 20, No. 1, pp. 43-58.
Itskov P.M., Vinnik E., Diamond M.E. Hippocampal representation of touch-guided behavior in rats: persistent and independent traces of stimulus and reward location. PLoS ONE, 2011, Vol. 6, No. 1, e16462. doi:10.1371/journal.pone.0016462
PLAN FOR AUGUST 17, 2011
- Concrete applications of informational approaches in neurophysiology.
- What is INFORMATION?
- Information: deterministic and stochastic approaches.
- In search of live bump attractors and their identification.
- The Code of Mind.
- Language as a series of inventions by unknown geniuses of the primordial ages.
On the other hand, that is where the famous expression for the amount of information comes from: I = -log2 p, i.e. p = 2^(-I) (note the "-" in the exponent).
So, the amount of information is always the number (often, the logarithm of this number) of unknown versions of the response to the question. If this number is reduced after some procedure, the procedure is considered to give information to its user. The information gain is usually measured by the logarithm of the ratio of the number of initial versions of the response to the number of final versions. If the latter is 1, the information gain equals the information content of the source. Complicated informational probabilistic calculations are always aimed at estimating these numbers. Physical entropy is the same kind of quantity.
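As a worked example of this log-ratio measure (the numbers 16 and 2 are illustrative, not from the lectures): narrowing 16 equally likely versions of the response down to 2 gains log2(16/2) = 3 bits, and narrowing down to a single version yields the full log2(16) = 4 bits of the source.

```python
import math

def information_gain(n_initial, n_final):
    """Bits gained when the number of equally likely versions of the
    response drops from n_initial to n_final."""
    return math.log2(n_initial / n_final)

print(information_gain(16, 2))  # 3.0 bits
print(information_gain(16, 1))  # 4.0 bits: the full information of the source
```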
The CODING of information is performed for three purposes:
1. To hide the information. It seems that this was the main engine behind the creation of information theory (e.g. Fomin and Shannon).
2. To compress the information (effective coding).
3. To preserve the information in the presence of noise. For this purpose, redundancy is fed into the informational system. The simplest way is just repetition of the transmitted signals, but there are much more efficient ways of fighting noise.
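The simplest redundancy scheme named above, repetition with majority-vote decoding, can be sketched as follows (the channel flip probability and repetition count are illustrative assumptions):

```python
import random

random.seed(1)

def transmit(bit, flip_prob=0.1):
    """Binary symmetric channel: flip the bit with probability flip_prob."""
    return bit ^ (random.random() < flip_prob)

def send_with_repetition(bit, n_repeat=5):
    """Send the bit n_repeat times and decode by majority vote."""
    received = [transmit(bit) for _ in range(n_repeat)]
    return int(sum(received) > n_repeat // 2)

# residual error rate of the repetition code over many trials
errors = sum(send_with_repetition(1) != 1 for _ in range(1000))
```

With a 10% flip probability, a single transmission fails 10% of the time, while 5-fold repetition fails only when 3 or more copies flip (probability below 1%); codes such as Hamming or convolutional codes achieve the same protection far more efficiently.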
Why "bump"? In the time intervals N/4 cycles before moments "2" and "3", in this numbering of the neurons the activity looks like a bump. In a symmetric network the "bumps" represent stable states, while in an asymmetric network they represent the dynamics of the network activity.
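The symmetric/asymmetric contrast can be sketched with a minimal ring rate model (an illustration only, not the model from the lectures; N, the cosine connectivity amplitudes, and the sine perturbation are all assumptions): symmetric connectivity cos(θi − θj) holds a stationary bump of activity, while adding an asymmetric sin(θi − θj) component makes the bump travel around the ring.

```python
import numpy as np

N = 100
theta = 2 * np.pi * np.arange(N) / N
diff = theta[:, None] - theta[None, :]

J_sym = 12.0 * np.cos(diff) - 4.0       # symmetric: local excitation, global inhibition
J_asym = J_sym + 1.5 * np.sin(diff)     # asymmetric: direction-breaking component

def run(J, r0, steps=300, dt=0.1):
    """Rate dynamics r' = -r + f(J r / N), with f = clip to [0, 1]."""
    r = r0.copy()
    for _ in range(steps):
        r += dt * (-r + np.clip(J @ r / N, 0.0, 1.0))
    return r

def bump_center(r):
    """Circular center of mass of the activity profile, in radians."""
    return np.angle(np.sum(r * np.exp(1j * theta))) % (2 * np.pi)

r0 = np.zeros(N)
r0[45:55] = 1.0                          # seed a narrow bump of activity

r_sym = run(J_sym, r0)                   # symmetric net: the bump settles and stays put
r_asym = run(J_asym, r_sym, steps=100)   # asymmetric net: the same bump travels
```

In the symmetric case the bump is a stable state at (essentially) any position on the ring; the asymmetric term converts that family of stable states into an activity wave, which is the sense in which asymmetric networks represent dynamics rather than fixed points.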
L-plot of the activity of 17 respiratory neurons of the cat medulla oblongata (W.L. Dunin-Barkowski, 2006; W.L. Dunin-Barkowski et al., 2006)