Fundamentals of Artificial Neural Networks
MIT Press, 1995 - 511 pages

"In Fundamentals of Artificial Neural Networks, Mohamad Hassoun provides the first systematic account of artificial neural network paradigms by identifying clearly the fundamental concepts and major methodologies underlying most of the current theory and practice employed by neural network researchers. Such a systematic and unified treatment makes the subject more accessible to students and practitioners. Here, important results are integrated in order to more fully explain a wide range of existing empirical observations and commonly used heuristics. There are numerous illustrative examples, more than 200 end-of-chapter analytical and computer-based problems that will aid in the development of neural network analysis and design skills, and a bibliography of nearly 700 references. Proceeding in a clear and logical fashion, the book presents the basic building blocks and concepts of artificial neural networks, brings together supervised, reinforcement, and unsupervised learning rules in simple nets in a common framework, and then covers such topics as the convergence and solution properties of these learning rules, learning multilayer nets using backprop and its variants, major neural network paradigms, associative memories, energy minimizing nets, Boltzmann machines and Boltzmann learning, and other global search/optimization algorithms such as stochastic gradient search, simulated annealing, and genetic algorithms." --Page 4 of cover
Contents
Mathematical Theory of Neural Learning | 143
Adaptive Multilayer Neural Networks I | 197
Adaptive Multilayer Neural Networks II | 285
Associative Neural Memories | 345
Global Search Methods for Neural Networks | 417
Common terms and phrases
activation function adaptive approximation arbitrary architecture artificial neural networks associative memory assumed asymptotically autoassociative binary Boolean functions capable classifier clustering computational convergence criterion function derived discrete-time dynamical system eigenvalues eigenvector employing equilibrium error example f(net) feedforward neural fundamental memories genetic algorithm given gradient descent gradient-descent Hassoun Hebbian Hebbian learning hidden layer hidden units Hopfield Hopfield net IEEE initial input pattern input space input vector interconnected Iteration learning rate learning rule Liapunov function linearly separable local minima matrix method minimizing multilayer nets neural net nonlinear Note optimal output layer output unit parameters perceptron polynomial positive preceding problem prototype random RBF network realized receptive field recurrent region retrieval rule in Equation Section Show shown in Figure sigmoidal signal simulated annealing solution stability stochastic strings supervised learning term theorem training set update values w₁ weight vector wₖ₊₁ zero
Popular passages
Page 488 - "Real-Time Dynamic Control of an Industrial Manipulator Using a Neural-Network-Based Learning Controller," IEEE Transactions on Robotics and Automation, Vol.