Hardware inaccuracy and imprecision are important considerations when implementing neural algorithms. This book presents a study of synaptic weight noise as a typical fault model for analogue VLSI realisations of MLP neural networks and examines its implications for learning and network performance. In particular, it shows how incorporating an imprecision model into the learning scheme as a "fault tolerance hint" can aid understanding of the accuracy and precision requirements of a particular implementation, and how such a scheme can also yield significant performance enhancement.
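To make the idea concrete, here is a minimal sketch (not the book's implementation) of training an MLP while injecting multiplicative Gaussian noise into the synaptic weights on every forward/backward pass, as a crude stand-in for analogue weight imprecision. All names, the noise model, and the network architecture are illustrative assumptions.

```python
# Sketch: weight-noise injection during MLP training (assumed noise model:
# multiplicative Gaussian, w * (1 + N(0, level^2))). Not the book's code.
import numpy as np

rng = np.random.default_rng(0)

def noisy(w, level):
    """Return a perturbed copy of the weights: w * (1 + N(0, level^2))."""
    return w * (1.0 + level * rng.standard_normal(w.shape))

def train_with_weight_noise(X, y, hidden=8, noise_level=0.1,
                            lr=0.5, epochs=3000):
    """Train a one-hidden-layer MLP; noise is injected into the weights
    used for each pass, while updates are applied to the clean weights."""
    n_in, n_out = X.shape[1], y.shape[1]
    W1 = rng.standard_normal((n_in, hidden)) * 0.5
    W2 = rng.standard_normal((hidden, n_out)) * 0.5
    for _ in range(epochs):
        V1, V2 = noisy(W1, noise_level), noisy(W2, noise_level)
        h = np.tanh(X @ V1)                       # hidden activations
        out = 1.0 / (1.0 + np.exp(-(h @ V2)))     # sigmoid outputs
        err = out - y                             # gradient at the output pre-activation
        dW2 = h.T @ err
        dW1 = X.T @ ((err @ V2.T) * (1.0 - h ** 2))
        W1 -= lr * dW1 / len(X)                   # update the clean weights
        W2 -= lr * dW2 / len(X)
    return W1, W2

# Usage: XOR toy problem, then evaluation with a fresh noise sample to
# mimic deployment on imprecise analogue hardware.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([[0], [1], [1], [0]], float)
W1, W2 = train_with_weight_noise(X, y)
pred = 1.0 / (1.0 + np.exp(-(np.tanh(X @ noisy(W1, 0.1)) @ noisy(W2, 0.1))))
print(np.round(pred, 2))
```

The design choice being illustrated is that the network never sees its "true" weights during training, so gradient descent is driven towards solutions that remain accurate under the perturbations expected at deployment.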
Language English ● Format PDF ● Pages 192 ● ISBN 9789812830012 ● File size 84.1 MB ● Publisher World Scientific Publishing Company ● City Singapore ● Country SG ● Published 1996 ● Downloadable 24 months ● Currency EUR ● ID 2682497 ● Copy protection Adobe DRM
Requires a DRM-capable e-book reader