Neural Networks: A Comprehensive Foundation. Prentice Hall, 1999. 842 pages.

Contents: Introduction; Learning processes; Single-layer perceptrons; Multilayer perceptrons; Radial-basis function networks; Support vector machines; Committee machines; Principal components analysis; Self-organizing maps; Information-theoretic models; Stochastic machines and their approximates rooted in statistical mechanics; Neurodynamic programming; Temporal processing using feedforward networks; Neurodynamics; Dynamically driven recurrent networks; Epilogue; Bibliography; Index.
Contents
Introduction | 1
Learning Processes | 50
Single Layer Perceptrons | 117
Copyright
15 other sections are not shown.
Common terms and phrases
activation function, adaptive, applied, approximation, attractor, back-propagation, back-propagation algorithm, back-propagation learning, bias, Boltzmann machine, Chapter, classification, computation, condition, convergence, corresponding, cost function, defined, denote, derivative, described in Eq, desired response, discussed, distribution, dynamic, eigenvalue, entropy, equation, estimate, example, experts, feature map, feedback, feedforward, FIGURE, follows, Gaussian, gradient, Hebbian, Hessian matrix, hidden layer, hidden neurons, HME model, Hopfield network, induced local field, input layer, input space, input vector, iteration, kernel, Kullback-Leibler divergence, learning algorithm, learning-rate parameter, linear, LMS algorithm, method, minimization, multilayer perceptron, mutual information, neural network, neuron, nodes, noise, nonlinear, operation, optimal, output layer, output neuron, performance, principal components, principal components analysis, probability density function, problem, radial-basis function, random variable, RBF network, recurrent network, represents, respect, result, Section, self-organizing, sigmoid, signal-flow graph, statistical, stochastic, supervised learning, support vector machine, theorem, training data, training sample, training set, VC dimension, x₁, zero