MARC details
000 - LEADER |
fixed length control field |
02570nmm a22002415i 4500 |
005 - DATE AND TIME OF LATEST TRANSACTION |
control field |
20230705150632.0 |
008 - FIXED-LENGTH DATA ELEMENTS--GENERAL INFORMATION |
fixed length control field |
121227s2000 xxk| s |||| 0|eng d |
020 ## - INTERNATIONAL STANDARD BOOK NUMBER |
International Standard Book Number |
9781846285677 |
-- |
978-1-84628-567-7 |
082 ## - DEWEY DECIMAL CLASSIFICATION NUMBER |
Classification number |
629.8 |
Edition number |
23 |
100 ## - MAIN ENTRY--PERSONAL NAME |
Personal name |
Hammer, Barbara. |
9 (RLIN) |
20171 |
245 ## - TITLE STATEMENT |
Title |
Learning with Recurrent Neural Networks |
Medium |
[electronic resource] / |
Statement of responsibility, etc. |
by Barbara Hammer. |
250 ## - EDITION STATEMENT |
Edition statement |
1st ed. 2000. |
260 ## - PUBLICATION, DISTRIBUTION, ETC. |
Place of publication, distribution, etc. |
London : |
Name of publisher, distributor, etc. |
Springer London : |
-- |
Imprint: Springer, |
Date of publication, distribution, etc. |
2000. |
300 ## - PHYSICAL DESCRIPTION |
Extent |
150 p. |
Other physical details |
online resource. |
505 ## - FORMATTED CONTENTS NOTE |
Formatted contents note |
Introduction, Recurrent and Folding Networks: Definitions, Training, Background, Applications -- Approximation Ability: Foundations, Approximation in Probability, Approximation in the Maximum Norm, Discussions and Open Questions -- Learnability: The Learning Scenario, PAC Learnability, Bounds on the VC-dimension of Folding Networks, Consequences for Learnability, Lower Bounds for the LRAAM, Discussion and Open Questions -- Complexity: The Loading Problem, The Perceptron Case, The Sigmoidal Case, Discussion and Open Questions -- Conclusion. |
520 ## - SUMMARY, ETC. |
Summary, etc. |
Folding networks, a generalisation of recurrent neural networks to tree-structured inputs, are investigated as a mechanism to learn regularities on classical symbolic data, for example. The architecture, the training mechanism, and several applications in different areas are explained. Afterwards a theoretical foundation, proving that the approach is appropriate as a learning mechanism in principle, is presented: their universal approximation ability is investigated, including several new results for standard recurrent neural networks such as explicit bounds on the required number of neurons and the super-Turing capability of sigmoidal recurrent networks. The information-theoretical learnability is examined, including several contributions to distribution-dependent learnability, an answer to an open question posed by Vidyasagar, and a generalisation of the recent luckiness framework to function classes. Finally, the complexity of training is considered, including new results on the loading problem for standard feedforward networks with an arbitrary multilayered architecture, a correlated number of neurons and training-set size, a varying number of hidden neurons but fixed input dimension, or the sigmoidal activation function, respectively. |
650 ## - SUBJECT ADDED ENTRY--TOPICAL TERM |
Topical term or geographic name entry element |
Control engineering. |
9 (RLIN) |
20172 |
650 ## - SUBJECT ADDED ENTRY--TOPICAL TERM |
Topical term or geographic name entry element |
Robotics. |
9 (RLIN) |
20173 |
650 ## - SUBJECT ADDED ENTRY--TOPICAL TERM |
Topical term or geographic name entry element |
Automation. |
9 (RLIN) |
20174 |
650 ## - SUBJECT ADDED ENTRY--TOPICAL TERM |
Topical term or geographic name entry element |
Control, Robotics, Automation. |
9 (RLIN) |
20175 |
856 ## - ELECTRONIC LOCATION AND ACCESS |
Uniform Resource Identifier |
<a href="https://doi.org/10.1007/BFb0110016">https://doi.org/10.1007/BFb0110016</a> |
942 ## - ADDED ENTRY ELEMENTS (KOHA) |
Koha item type |
e-Book |
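The fields above follow the usual MARC 21 pattern of a three-digit tag with repeatable subfields (e.g. 245 $a title, $h medium, $c statement of responsibility; repeated 650 fields for subjects). As a minimal sketch of how these data hang together, the record could be modelled in plain Python as a mapping from tag to a list of subfield dictionaries. This is an illustrative in-memory model only, not a real MARC parser; actual MARC 21 records are ISO 2709 or MARCXML and are normally handled with a library such as pymarc.

```python
# Simplified in-memory model of the record above: tag -> list of subfield dicts.
# Illustrative only; real MARC records carry indicators and fixed fields too.
record = {
    "020": [{"a": "9781846285677"}],
    "100": [{"a": "Hammer, Barbara.", "9": "20171"}],
    "245": [{"a": "Learning with Recurrent Neural Networks",
             "h": "[electronic resource] /",
             "c": "by Barbara Hammer."}],
    "650": [{"a": "Control engineering.", "9": "20172"},
            {"a": "Robotics.", "9": "20173"},
            {"a": "Automation.", "9": "20174"}],
    "856": [{"u": "https://doi.org/10.1007/BFb0110016"}],
}

def first(tag, code):
    """Return the first occurrence of subfield `code` in field `tag`, or None."""
    for field in record.get(tag, []):
        if code in field:
            return field[code]
    return None

title = first("245", "a")
isbn = first("020", "a")
subjects = [f["a"] for f in record.get("650", [])]

print(title)     # Learning with Recurrent Neural Networks
print(isbn)      # 9781846285677
print(subjects)  # all three topical headings, in field order
```

Repeatable tags (here 650) map naturally to a list, while `first` mirrors the common case of taking the first instance of a non-repeatable subfield such as 245 $a.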