
Learning with recurrent neural networks

Barbara Hammer
Folding networks, a generalisation of recurrent neural networks to tree-structured inputs, are investigated as a mechanism for learning regularities on, for example, classical symbolic data. The architecture, the training mechanism, and several applications in different areas are explained. Afterwards, a theoretical foundation is presented, proving that the approach is in principle appropriate as a learning mechanism. The universal approximation ability of these networks is investigated, including several new results for standard recurrent neural networks such as explicit bounds on the required number of neurons and the super-Turing capability of sigmoidal recurrent networks. The information-theoretical learnability is examined, including several contributions to distribution-dependent learnability, an answer to an open question posed by Vidyasagar, and a generalisation of the recent luckiness framework to function classes. Finally, the complexity of training is considered, including new results on the loading problem for standard feedforward networks with an arbitrary multilayered architecture, with a number of neurons correlated to the training set size, with a varying number of hidden neurons but fixed input dimension, or with the sigmoidal activation function, respectively.
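
For orientation only, the following is a minimal, hypothetical sketch (Python with NumPy) of the folding idea described above: a tree is encoded bottom-up by applying one shared transition function to each node's label together with the codes of its children, producing a fixed-size vector that a standard feedforward output part can then process. All names, dimensions, the fixed fan-out, and the tanh activation are illustrative assumptions, not the book's actual formulation or training procedure.

import numpy as np

# Hypothetical folding-network encoder: one shared transition function
# is applied recursively to (node label, child codes) to fold a tree
# into a fixed-size code vector. Sizes and activation are illustrative.
rng = np.random.default_rng(0)
LABEL_DIM, CODE_DIM, FANOUT = 3, 4, 2                  # assumed sizes
W_label = rng.normal(scale=0.1, size=(CODE_DIM, LABEL_DIM))
W_child = rng.normal(scale=0.1, size=(CODE_DIM, FANOUT * CODE_DIM))
b = np.zeros(CODE_DIM)
EMPTY = np.zeros(CODE_DIM)                             # code of the empty tree

def encode(tree):
    """Recursively fold a tree (label, [children]) into a fixed-size code."""
    if tree is None:
        return EMPTY
    label, children = tree
    children = (children + [None] * FANOUT)[:FANOUT]   # pad to fixed fan-out
    child_codes = np.concatenate([encode(c) for c in children])
    return np.tanh(W_label @ np.asarray(label) + W_child @ child_codes + b)

# Example: encode a small binary tree with 3-dimensional labels.
tree = ([1.0, 0.0, 0.0],
        [([0.0, 1.0, 0.0], []),
         ([0.0, 0.0, 1.0], [])])
print(encode(tree))   # fixed-size vector for a feedforward output layer

Training such an encoder by gradient descent unfolds this recursion over the given tree structure, analogously to backpropagation through time for ordinary recurrent networks on sequences.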
Year: 2000
Publisher: Springer
Language: English
Pages: 159
ISBN 10: 185233343X
ISBN 13: 9781852333430
Series: Lecture Notes in Control and Information Sciences 254
File: DJVU, 1.51 MB