

SUBJECT: BOOKS > BIOLOGY > BIOLOGY > NEUROBIOLOGY

Moritz Helias; David Dahmen

Statistical Field Theory for Neural Networks




Availability: Normally available within 15 days


PRICE: 81.98 €
NICEPRICE: 77.88 €
DISCOUNT: 5%



This product qualifies for FREE SHIPPING when the Corriere Veloce (express courier) option is selected at checkout.


It can also be paid for with the Carta della cultura giovani e del merito, the 18App Bonus Cultura, and the Carta del Docente.





Details

Genre: Book
Language: English
Publisher: Springer
Publication date: 08/2020
Edition: 1st ed. 2020





Description

This book presents a self-contained introduction to techniques from field theory applied to stochastic and collective dynamics in neuronal networks. These powerful analytical techniques, well established in other fields of physics, form the basis of current developments and offer solutions to pressing open problems in theoretical neuroscience as well as machine learning. They enable a systematic and quantitative understanding of the dynamics of recurrent and stochastic neuronal networks.

This book is intended for physicists, mathematicians, and computer scientists. It is designed for self-study by researchers who want to enter the field, or as the main text for a one-semester course at the advanced undergraduate or graduate level. The theoretical concepts are developed systematically from the very beginning and require only basic knowledge of analysis and linear algebra.





Table of Contents

Introduction
Probabilities, moments, cumulants
Gaussian distribution and Wick's theorem
Perturbation expansion
Linked cluster theorem
Functional preliminaries
Functional formulation of stochastic differential equations
Ornstein-Uhlenbeck process: The free Gaussian theory
Perturbation theory for stochastic differential equations
Dynamic mean-field theory for random networks
Vertex generating function
Application: TAP approximation
Expansion of cumulants into tree diagrams of vertex functions
Loopwise expansion of the effective action - Tree level
Loopwise expansion in the MSRDJ formalism
Nomenclature
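
As an informal illustration of the kind of material listed above (not an excerpt from the book), the sketch below simulates the Ornstein-Uhlenbeck process, the "free Gaussian theory" named in the table of contents, using a simple Euler-Maruyama scheme; all parameter names and values are assumptions chosen for this example.

```python
# Illustrative sketch only, not taken from the book: Euler-Maruyama simulation of
# the Ornstein-Uhlenbeck process dx = -(x/tau) dt + sqrt(2*D) dW.
# Parameter values below are assumptions chosen for this example.
import numpy as np

tau, D = 1.0, 0.5            # assumed relaxation time and noise strength
dt, n_steps = 1e-3, 200_000  # time step and number of steps
rng = np.random.default_rng(0)

x = np.zeros(n_steps)
for t in range(n_steps - 1):
    dW = rng.normal(0.0, np.sqrt(dt))
    x[t + 1] = x[t] - (x[t] / tau) * dt + np.sqrt(2.0 * D) * dW

# The stationary variance of this process is D * tau; compare it with the sample
# estimate from the second half of the trajectory (after the initial transient).
print("sample variance:", np.var(x[n_steps // 2:]), "expected:", D * tau)
```

The printed sample variance should approach the stationary value D * tau, the exact result for this Gaussian process.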




Authors

Moritz Helias is a group leader at the Jülich Research Centre and assistant professor in the Department of Physics of RWTH Aachen University, Germany. He obtained his diploma in theoretical solid-state physics at the University of Hamburg and his PhD in computational neuroscience at the University of Freiburg, Germany. Post-doctoral positions at RIKEN, Wako-shi, Japan, and at the Jülich Research Centre followed. His main research interests are neuronal network dynamics and function, and their quantitative analysis with tools from statistical physics and field theory.

David Dahmen is a post-doctoral researcher in the Institute of Neuroscience and Medicine at the Jülich Research Centre, Germany. He obtained his Master's degree in physics from RWTH Aachen University, Germany, working on effective field theory approaches to particle physics. He then moved to the field of computational neuroscience, where he received his PhD in 2017. His research comprises the modeling, analysis, and simulation of recurrent neuronal networks, with a special focus on the development and knowledge transfer of mathematical tools and simulation concepts. His main interests are field-theoretic methods for random neural networks, correlations in recurrent networks, and modeling of the local field potential.










Other Information

ISBN: 9783030464431
Condition: New
Series: Lecture Notes in Physics
Dimensions: 235 x 155 mm
Format: Paperback
Illustration notes: XVII, 203 p., 127 illus., 5 illus. in color
Arabic-numbered pages: 203
Roman-numbered pages: xvii

