A Field Guide to Dynamical Recurrent Networks
John F. Kolen, Stefan C. Kremer
John Wiley & Sons, January 15, 2001 - 464 pages

Acquire the tools for understanding new architectures and algorithms of dynamical recurrent networks (DRNs) from this valuable field guide, which documents recent forays into artificial intelligence, control theory, and connectionism. This unbiased introduction to DRNs and their application to time-series problems (such as classification and prediction) provides a comprehensive overview of the recent explosion of leading research in this prolific field. A Field Guide to Dynamical Recurrent Networks emphasizes the issues driving the development of this class of network structures. It provides a solid foundation in DRN systems theory and practice using consistent notation and terminology. Theoretical presentations are supplemented with applications ranging from cognitive modeling to financial forecasting. A Field Guide to Dynamical Recurrent Networks will enable engineers, research scientists, academics, and graduate students to apply DRNs to various real-world problems and learn about different areas of active research. It provides both state-of-the-art information and a road map to the future of cutting-edge dynamical recurrent networks.
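To make the subject matter concrete: the "dynamical recurrent networks" the book covers maintain an internal state that is fed back at each time step, which is what lets them address time-series problems. The following is a minimal illustrative sketch (not taken from the book) of an Elman-style recurrent update with a single hidden unit; all weight values and names here are arbitrary assumptions chosen for the example.

```python
import math

def sigmoid(x):
    """Logistic sigmoid activation, as commonly used in DRNs."""
    return 1.0 / (1.0 + math.exp(-x))

def elman_step(x, h, w_in, w_rec, w_out):
    """One update of a single-unit Elman-style recurrent network.

    The hidden state h feeds back into itself through the recurrent
    weight w_rec, giving the network a short-term memory of past inputs.
    """
    h_new = sigmoid(w_in * x + w_rec * h)   # new hidden state
    y = w_out * h_new                       # linear readout
    return h_new, y

# Run the network over a short input sequence, carrying state across steps.
h = 0.0
outputs = []
for x in [1.0, 0.0, 1.0, 1.0]:
    h, y = elman_step(x, h, w_in=0.8, w_rec=0.5, w_out=1.0)
    outputs.append(y)
```

Because the hidden state persists between steps, the last two outputs differ even though the last two inputs are identical; a purely feedforward network would map equal inputs to equal outputs.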
Contents
Dynamical Recurrent Networks | 3 |
Chapter | 6 |
Chapter | 8 |
Chapter | 11 |
Networks with Adaptive State Transitions | 15 |
Chapter | 19 |
Representation of Discrete States | 83 |
Simple Stable Encodings of Finite-State Machines in Dynamic | 103 |
4 | 132 |
Generalization and Inductive Bias | 141 |
PART IV | 148 |
Insertion of Prior Knowledge | 155 |
Gradient Calculations for Dynamic Recurrent Neural Networks | 179 |
Understanding and Explaining DRN Behavior | 207 |
LIMITATIONS | 229 |
The Difficulty of Learning | 237 |
APPLICATIONS | 255 |
Sentence Processing and Linguistic Structure | 291 |
Neural Network Architectures for the Modeling of Dynamic | 311 |
Theory and Applications | 351 |
Looking Back and Looking | 377 |
Glossary | 409 |
About the Editors | 415 |
Common terms and phrases
activation, algorithm, attractor, automaton, back propagation, behavior, Bengio, bifurcation, BPTT, Chapter, Chomsky hierarchy, computational, connectionist, consider, context-free, context-free languages, defined, delay, derivative, described, discrete, discriminant functions, dynamical recurrent networks, dynamical systems, Elman, encoding, equations, error, example, extraction, feedback, feedforward networks, filter module, finite, finite-state machines, first-order, fixed point, Giles, gradient, hidden neurons, hidden units, implemented, input symbol, iterated, Laguerre filter, learning algorithm, limit cycle, linear, logistic, logistic mapping, long-term dependencies, mapping, Mealy machine, memory kernel, Moore machines, NARX networks, network architecture, neurons, node, nonlinear, oscillation, output function, parameters, performance, prediction, problem, processing units, recurrent connections, recurrent neural networks, regular language, represent, representation, S₁, second-order, Section, sequence, short-term memory, Siegelmann, Sierpinski triangle, sigmoid function, signals, simulate, space, stack, strings, structure, Theorem, training algorithm, trajectory, transformations, transition, Turing machine, update, values, VC dimension, vector, weights
Popular passages
Page iii - Department of Computing and Information Science, University of Guelph, Guelph, Ontario, Canada...
Page 387 - "Constructive learning of recurrent neural networks: limitations of recurrent cascade correlation and a simple solution," IEEE Transactions on Neural Networks, vol. 6, no.