Neural Information Processing with Dynamical Synapses
Author: Si Wu
Publisher: Frontiers E-books
Total Pages: 179
Release: 2015-01-08
ISBN-10: 2889193837
ISBN-13: 9782889193837
Rating: 4/5 (37 Downloads)
Author: Michael I. Jordan
Publisher: MIT Press
Total Pages: 1114
Release: 1998
ISBN-10: 0262100762
ISBN-13: 9780262100762
Rating: 4/5 (62 Downloads)
The annual conference on Neural Information Processing Systems (NIPS) is the flagship conference on neural computation. These proceedings contain all of the papers that were presented.
Author: Wulfram Gerstner
Publisher: Cambridge University Press
Total Pages: 591
Release: 2014-07-24
ISBN-10: 1107060834
ISBN-13: 9781107060838
Rating: 4/5 (38 Downloads)
This solid introduction uses the principles of physics and the tools of mathematics to approach fundamental questions of neuroscience.
Author: Jannik Luboeinski
Total Pages: 201
Release: 2021-09-02
Rating: 4/5
Memory serves to process and store information about experiences such that this information can be used in future situations. The transfer from transient storage into long-term memory, which retains information for hours, days, and even years, is called consolidation. In brains, information is primarily stored via the alteration of synapses, so-called synaptic plasticity. While these changes initially occur in a transient early phase, they can be transferred to a late phase, meaning that they become stabilized over the course of several hours. This stabilization has been explained by so-called synaptic tagging and capture (STC) mechanisms.

To store and recall memory representations, emergent dynamics arise from the synaptic structure of recurrent networks of neurons. This happens through so-called cell assemblies, which feature particularly strong synapses. It has been proposed that the stabilization of such cell assemblies by STC corresponds to so-called synaptic consolidation, which is observed in humans and other animals in the first hours after acquiring a new memory. The exact connection between the physiological mechanisms of STC and memory consolidation remains, however, unclear. It is equally unknown what influence STC mechanisms exert on further cognitive functions that guide behavior. On timescales of minutes to hours (that is, the timescales of STC), such functions include memory improvement, modification of memories, interference and enhancement of similar memories, and transient priming of certain memories. Thus, diverse memory dynamics may be linked to STC, which can be investigated by employing theoretical methods based on experimental data from the neuronal and the behavioral level.

In this thesis, we present a theoretical model of STC-based memory consolidation in recurrent networks of spiking neurons, which are particularly suited to reproduce biologically realistic dynamics. Furthermore, we combine the STC mechanisms with calcium dynamics, which have been found to guide the major processes of early-phase synaptic plasticity in vivo. In three included research articles as well as additional sections, we develop this model and investigate how it can account for a variety of behavioral effects. We find that the model enables the robust implementation of the cognitive memory functions mentioned above. The main steps to this are:
1. demonstrating the formation, consolidation, and improvement of memories represented by cell assemblies;
2. showing that neuromodulator-dependent STC can retroactively control whether information is stored in a temporal or rate-based neural code; and
3. examining the interaction of multiple cell assemblies with transient and attractor dynamics in different organizational paradigms.

In summary, we demonstrate several ways by which STC controls the late-phase synaptic structure of cell assemblies. Linking these structures to functional dynamics, we show that our STC-based model implements functionality that can be related to long-term memory. Thereby, we provide a basis for the mechanistic explanation of various neuropsychological effects.

Keywords: synaptic plasticity; synaptic tagging and capture; spiking recurrent neural networks; memory consolidation; long-term memory
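The early-phase/late-phase transfer described in this abstract can be caricatured in a few lines of Python. All names, constants, and the update rule below are illustrative assumptions for exposition, not the thesis's actual equations:

```python
def simulate_stc(steps=200, dt=0.1,
                 tau_h=20.0,      # decay time constant of the early-phase change
                 theta_tag=0.5,   # tagging threshold
                 proteins=True):  # are plasticity-related proteins available?
    h, z = 1.0, 0.0              # strong early-phase induction at t = 0; no late phase yet
    tagged = False
    trace = []
    for _ in range(steps):
        if abs(h) > theta_tag:
            tagged = True        # a large early-phase change sets the synaptic tag
        if tagged and proteins:
            z += dt * 0.05 * h   # capture: the tagged change is consolidated into z
        h -= dt * h / tau_h      # an untended early-phase change decays away
        trace.append((h, z))
    return trace

h_end, z_end = simulate_stc()[-1]
print(f"early phase: {h_end:.2f}, late phase: {z_end:.2f}")
```

With `proteins=False` the tag is never captured, the late-phase component stays at zero, and the early-phase change simply decays, mirroring the transient nature of unconsolidated plasticity.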
Author: Peiji Liang
Publisher: Springer
Total Pages: 338
Release: 2015-12-22
ISBN-10: 9401773939
ISBN-13: 9789401773935
Rating: 4/5 (35 Downloads)
This book provides an overview of neural information processing research, which is one of the most important branches of neuroscience today. Neural information processing is an interdisciplinary subject, and the interaction between neuroscience and mathematics, physics, and information science plays a key role in the development of this field. The book begins with the anatomy of the central nervous system, followed by an introduction to various information processing models at different levels. The authors all have extensive experience in mathematics, physics and biomedical engineering, and have worked in this multidisciplinary area for a number of years. They present classical examples of how the pioneers in this field used theoretical analysis, mathematical modeling and computer simulation to solve neurobiological problems, and share their experiences and lessons learned. The book is intended for researchers and students with a mathematics, physics or informatics background who are interested in brain research and keen to understand the necessary neurobiology and how they can use their specialties to address neurobiological problems. It also provides inspiration for neuroscience students who are interested in learning how to use mathematics, physics or informatics approaches to solve problems in their field.
Author: Christof Koch
Publisher: Oxford University Press
Total Pages: 588
Release: 2004-10-28
ISBN-10: 0190292857
ISBN-13: 9780190292850
Rating: 4/5 (50 Downloads)
Neural network research often builds on the fiction that neurons are simple linear threshold units, completely neglecting the highly dynamic and complex nature of synapses, dendrites, and voltage-dependent ionic currents. Biophysics of Computation: Information Processing in Single Neurons challenges this notion, using richly detailed experimental and theoretical findings from cellular biophysics to explain the repertoire of computational functions available to single neurons. The author shows how individual nerve cells can multiply, integrate, or delay synaptic inputs and how information can be encoded in the voltage across the membrane, in the intracellular calcium concentration, or in the timing of individual spikes. Key topics covered include the linear cable equation; cable theory as applied to passive dendritic trees and dendritic spines; chemical and electrical synapses and how to treat them from a computational point of view; nonlinear interactions of synaptic input in passive and active dendritic trees; the Hodgkin-Huxley model of action potential generation and propagation; phase space analysis; linking stochastic ionic channels to membrane-dependent currents; calcium and potassium currents and their role in information processing; the role of diffusion, buffering and binding of calcium, and other messenger systems in information processing and storage; short- and long-term models of synaptic plasticity; simplified models of single cells; stochastic aspects of neuronal firing; the nature of the neuronal code; and unconventional models of sub-cellular computation. Biophysics of Computation: Information Processing in Single Neurons serves as an ideal text for advanced undergraduate and graduate courses in cellular biophysics, computational neuroscience, and neural networks, and will appeal to students and professionals in neuroscience, electrical and computer engineering, and physics.
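Among the topics this blurb lists, the Hodgkin-Huxley model is the canonical example of how voltage-dependent ionic currents generate action potentials. The following minimal forward-Euler sketch uses the standard squid-axon parameters; it is a generic illustration, not code from the book:

```python
import math

# Standard Hodgkin-Huxley gating-variable rate functions (V in mV)
def alpha_n(V): return 0.01 * (V + 55) / (1 - math.exp(-(V + 55) / 10))
def beta_n(V):  return 0.125 * math.exp(-(V + 65) / 80)
def alpha_m(V): return 0.1 * (V + 40) / (1 - math.exp(-(V + 40) / 10))
def beta_m(V):  return 4.0 * math.exp(-(V + 65) / 18)
def alpha_h(V): return 0.07 * math.exp(-(V + 65) / 20)
def beta_h(V):  return 1 / (1 + math.exp(-(V + 35) / 10))

def run_hh(I_ext=10.0, T=50.0, dt=0.01):
    """Membrane potential trace (mV) under constant current I_ext (uA/cm^2)."""
    C, gNa, gK, gL = 1.0, 120.0, 36.0, 0.3       # capacitance and max conductances
    ENa, EK, EL = 50.0, -77.0, -54.4             # reversal potentials
    V, m, h, n = -65.0, 0.05, 0.6, 0.32          # resting state
    Vs = []
    for _ in range(int(T / dt)):
        INa = gNa * m**3 * h * (V - ENa)         # fast sodium current
        IK = gK * n**4 * (V - EK)                # delayed-rectifier potassium current
        IL = gL * (V - EL)                       # leak current
        V += dt * (I_ext - INa - IK - IL) / C
        m += dt * (alpha_m(V) * (1 - m) - beta_m(V) * m)
        h += dt * (alpha_h(V) * (1 - h) - beta_h(V) * h)
        n += dt * (alpha_n(V) * (1 - n) - beta_n(V) * n)
        Vs.append(V)
    return Vs

Vs = run_hh()
spikes = sum(1 for a, b in zip(Vs, Vs[1:]) if a < 0 <= b)
print("upward zero crossings (spikes):", spikes)
```

A sustained suprathreshold current produces repetitive firing; dropping `I_ext` below rheobase suppresses spiking entirely, the kind of nonlinear single-cell behavior the book analyzes in depth.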
Author: Dietmar Plenz
Publisher: John Wiley & Sons
Total Pages: 734
Release: 2014-04-14
ISBN-10: 3527651020
ISBN-13: 9783527651023
Rating: 4/5 (23 Downloads)
Neuroscientists are searching for answers to the questions of how we learn and store information, which processes in the brain are responsible, and on what timescales these processes take place. The concepts involved, which originate in physics and continue to be developed further, can find application in medicine and sociology as well as in robotics and image analysis. The central topic of this book is so-called critical phenomena in the brain. These are described using mathematical and physical models of the kind also used to model earthquakes, forest fires, or the spread of epidemics. Recent findings have shown that these self-organized instabilities also occur in the nervous system. This reference work presents theoretical and experimental findings from international brain research and outlines the perspectives of this new field of research.
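The critical phenomena referred to here are often introduced with a toy branching process: each active neuron activates on average sigma successors, and sigma = 1 marks the critical point at which avalanches of all sizes occur. The simulation below is a generic illustration of this idea; the offspring distribution and parameters are assumptions, not taken from the book:

```python
import random

def avalanche_size(sigma, cap=10_000):
    """Total activations triggered by one seed unit in a branching process."""
    active, total = 1, 1
    while active and total < cap:
        # each active unit activates 2 successors with probability sigma/2,
        # so the mean number of successors (the branching ratio) is sigma
        new = sum(2 for _ in range(active) if random.random() < sigma / 2)
        active, total = new, total + new
    return total

random.seed(42)
mean_sub = sum(avalanche_size(0.5) for _ in range(1000)) / 1000   # subcritical
mean_crit = sum(avalanche_size(1.0) for _ in range(1000)) / 1000  # critical
print(f"mean avalanche size: sigma=0.5 -> {mean_sub:.1f}, sigma=1.0 -> {mean_crit:.1f}")
```

Below the critical point, avalanches stay small (mean size 1/(1 - sigma)); at sigma = 1 the size distribution develops a heavy, power-law-like tail, the signature of criticality discussed in the book.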
Author: Suzanna Becker
Publisher: MIT Press
Total Pages: 1738
Release: 2003
ISBN-10: 0262025507
ISBN-13: 9780262025508
Rating: 4/5 (07 Downloads)
Proceedings of the 2002 Neural Information Processing Systems Conference.
Author: Sensen Liu
Total Pages: 153
Release: 2018
OCLC: 1057899841
Rating: 4/5 (41 Downloads)
The brain produces complex patterns of activity that occur at different spatio-temporal scales. One of the fundamental questions in neuroscience is to understand how exactly these dynamics are related to brain function, for example our ability to extract and process information from the sensory periphery. This dissertation presents two distinct lines of inquiry related to different aspects of this high-level question.

In the first part of the dissertation, we study the dynamics of burst suppression, a phenomenon in which brain electrical activity exhibits bistable dynamics. Burst suppression is frequently encountered in individuals who are rendered unconscious through general anesthesia and is thus a brain state associated with profound reductions in awareness and, presumably, information processing. Our primary contribution in this part of the dissertation is a new type of dynamical systems model whose analysis provides insights into the mechanistic underpinnings of burst suppression. In particular, the model yields explanations for the emergence of the two characteristic timescales within burst suppression and for its synchronization across wide regions of the brain.

The second part of the dissertation takes a different, more abstract approach to the question of multiple-timescale brain dynamics. Here, we consider how such dynamics might contribute to the process of learning in brain and brain-like networks, so as to enable neural information processing and subsequent computation. In particular, we consider the problem of optimizing information-theoretic quantities in recurrent neural networks via synaptic plasticity. In a recurrent network, such a problem is challenging because the modification of any one synapse (connection) depends nontrivially on the entire state of the network. This form of global learning is computationally demanding and, moreover, is not plausible from a biological standpoint.

In our results, we overcome these issues by deriving a local learning rule, one that modifies synapses based only on the activity of neighboring neurons. To do this, we augment the dynamics of each neuron, from first principles, with several auxiliary variables, each evolving at a different timescale. The purpose of these variables is to support the estimation of global information-based quantities from local neuronal activity. It turns out that the synthesized dynamics, while providing only an approximation of the true solution, are nonetheless highly effective in enabling the learning of representations of afferent input. Later, we generalize this framework in two ways, first to allow for goal-directed reinforcement learning and then to allow for information-based neurogenesis, the creation of neurons within a network based on task needs. Finally, the proposed learning dynamics are demonstrated on a range of canonical tasks, as well as a new application domain: the exogenous control of neural activity.
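The flavor of such a local rule can be conveyed with Oja's rule, a classic and much simpler example than the information-based rule derived in the dissertation: each weight update uses only presynaptic activity, postsynaptic activity, and the weight itself, yet the weight vector converges to the input's first principal component, the direction of maximal output variance:

```python
import random

def oja_train(samples, eta=0.05, steps=5000, rng=random.Random(0)):
    """Train a single linear neuron with Oja's local plasticity rule."""
    dim = len(samples[0])
    w = [rng.gauss(0.0, 0.1) for _ in range(dim)]     # small random initial weights
    for _ in range(steps):
        x = rng.choice(samples)
        y = sum(wi * xi for wi, xi in zip(w, x))      # postsynaptic activity
        # local update: each w_i sees only x_i (pre), y (post), and itself
        w = [wi + eta * y * (xi - y * wi) for wi, xi in zip(w, x)]
    return w

# toy inputs varying mostly along the (1, 1) direction
data = [(a + 0.1 * b, a - 0.1 * b) for a in (-1.0, 1.0) for b in (-1.0, 1.0)]
w = oja_train(data)
print("learned weights:", [round(wi, 2) for wi in w])
```

Despite never seeing the full covariance structure, the neuron's weight vector aligns with the dominant input direction and settles at unit norm, a simple instance of a global objective being optimized through purely local updates.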
Author: Michael S. Kearns
Publisher: MIT Press
Total Pages: 1122
Release: 1999
ISBN-10: 0262112450
ISBN-13: 9780262112451
Rating: 4/5 (50 Downloads)
The annual conference on Neural Information Processing Systems (NIPS) is the flagship conference on neural computation. It draws preeminent academic researchers from around the world and is widely considered to be a showcase conference for new developments in network algorithms and architectures. The broad range of interdisciplinary research areas represented includes computer science, neuroscience, statistics, physics, cognitive science, and many branches of engineering, including signal processing and control theory. Only about 30 percent of the papers submitted are accepted for presentation at NIPS, so the quality is exceptionally high. These proceedings contain all of the papers that were presented.