


Thomas P. Trappenberg - Fundamentals of Computational Neuroscience

Fundamentals of Computational Neuroscience




Availability: normally available within 10 days.
Due to Brexit-related supply issues, delivery delays are possible.


Price: 45,98 €
NICEPRICE: 43,68 €
Discount: 5%



This product is eligible for FREE SHIPPING when the Corriere Veloce (express courier) option is selected at checkout.





Details

Language: English





Publisher's Note

Computational neuroscience is the theoretical study of the brain, aimed at uncovering the principles and mechanisms that guide the development, organization, information processing, and mental functions of the nervous system. Fundamentals of Computational Neuroscience is the first introductory textbook on the subject. It presents the theoretical foundations of neuroscience with a focus on understanding information processing in the brain. The book is aimed at readers in the brain and cognitive sciences, from graduate level upwards.




Table of Contents

1: Introduction
1.1: What is computational neuroscience?; 1.2: Domains in computational neuroscience; 1.3: What is a model?; 1.4: Emergence and adaptation; 1.5: From exploration to a theory of the brain; 1.6: Some notes on the book

2: Neurons and conductance-based models
2.1: Modelling biological neurons; 2.2: Neurons are specialized cells; 2.3: Basic synaptic mechanisms; 2.4: The generation of action potentials: Hodgkin-Huxley equations; 2.5: Dendritic trees, the propagation of action potentials, and compartmental models; 2.6: Above and beyond the Hodgkin-Huxley neuron: fatigue, bursting, and simplifications

3: Spiking neurons and response variability
3.1: Integrate-and-fire neurons; 3.2: The spike-response model; 3.3: Spike time variability; 3.4: Noise models for IF neurons

4: Neurons in a network
4.1: Organizations of neuronal networks; 4.2: Information transmission in networks; 4.3: Population dynamics: modelling the average behaviour of neurons; 4.4: The sigma node; 4.5: Networks with non-classical synapses: the sigma-pi node

5: Representations and the neural code
5.1: How neurons talk; 5.2: Information theory; 5.3: Information in spike trains; 5.4: Population coding and decoding; 5.5: Distributed representation

6: Feed-forward mapping networks
6.1: Perception, function representation, and look-up tables; 6.2: The sigma node as perceptron; 6.3: Multi-layer mapping networks; 6.4: Learning, generalization, and biological interpretations; 6.5: Self-organizing network architectures and genetic algorithms; 6.6: Mapping networks with context units; 6.7: Probabilistic mapping networks

7: Associators and synaptic plasticity
7.1: Associative memory and Hebbian learning; 7.2: An example of learning association; 7.3: The biochemical basis of synaptic plasticity; 7.4: The temporal structure of Hebbian plasticity: LTP and LTD; 7.5: Mathematical formulation of Hebbian plasticity; 7.6: Weight distributions; 7.7: Neuronal response variability, gain control, and scaling; 7.8: Features of associators and Hebbian learning

8: Auto-associative memory and network dynamics
8.1: Short-term memory and reverberating network activity; 8.2: Long-term memory and auto-associators; 8.3: Point attractor networks: the Grossberg-Hopfield model; 8.4: The phase diagram and the Grossberg-Hopfield model; 8.5: Sparse attractor neural networks; 8.6: Chaotic networks: a dynamical systems view; 8.7: Biologically more realistic variations of attractor networks

9: Continuous attractor and competitive networks
9.1: Spatial representations and the sense of direction; 9.2: Learning with continuous pattern representations; 9.3: Asymptotic states and the dynamics of neural fields; 9.4: Path integration, Hebbian trace rule, and sequence learning; 9.5: Competitive networks and self-organizing maps

10: Supervised learning and reward systems
10.1: Motor learning and control; 10.2: The delta rule; 10.3: Generalized delta rules; 10.4: Reward learning

11: System-level organization and coupled networks
11.1: System-level anatomy of the brain; 11.2: Modular mapping networks; 11.3: Coupled attractor networks; 11.4: Working memory; 11.5: Attentive vision; 11.6: An interconnecting workspace hypothesis

12: A MATLAB guide to computational neuroscience
12.1: Introduction to the MATLAB programming environment; 12.2: Spiking neurons and numerical integration in MATLAB; 12.3: Associators and Hebbian learning; 12.4: Recurrent networks and network dynamics; 12.5: Continuous attractor neural networks; 12.6: Error-backpropagation network

Appendix A: Some Useful Mathematics
Appendix B: Basic Probability Theory
Appendix C: Numerical Integration
Index
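Chapter 12 walks through these models in MATLAB. As a rough flavour of the kind of exercise covered there (this is not the book's own code), the sketch below simulates a leaky integrate-and-fire neuron with simple Euler integration, written in Python rather than the book's MATLAB; all parameter values are illustrative assumptions.

import numpy as np

# Minimal leaky integrate-and-fire (LIF) neuron, Euler integration.
# All parameters are illustrative assumptions, not values from the book.
tau_m    = 10.0    # membrane time constant (ms)
v_rest   = -65.0   # resting potential (mV)
v_reset  = -65.0   # reset potential after a spike (mV)
v_thresh = -50.0   # firing threshold (mV)
r_m      = 10.0    # membrane resistance (MOhm)
i_ext    = 2.0     # constant external current (nA)
dt       = 0.1     # integration time step (ms)
t_max    = 100.0   # simulation length (ms)

v = v_rest
spike_times = []
for step in range(int(t_max / dt)):
    t = step * dt
    # Euler step of tau_m * dv/dt = -(v - v_rest) + r_m * i_ext
    v += dt * (-(v - v_rest) + r_m * i_ext) / tau_m
    if v >= v_thresh:          # threshold crossing: record spike and reset
        spike_times.append(t)
        v = v_reset

print(f"{len(spike_times)} spikes in {t_max} ms")

With the assumed constant input, the membrane potential repeatedly charges toward threshold and resets, producing a regular spike train.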










Additional Information

ISBN: 9780198515838
Condition: New
Dimensions: 246 x 171 mm
Format: Paperback
Illustration notes: numerous figures
Pages: 360

