Hardware inaccuracy and imprecision are important considerations when implementing neural algorithms. This book presents a study of synaptic weight noise as a typical fault model for analogue VLSI realisations of MLP neural networks and examines its implications for learning and network performance. It shows how incorporating an imprecision model into a learning scheme as a "fault tolerance hint" can aid understanding of the accuracy and precision requirements of a particular implementation, and how such a scheme can also give rise to significant performance enhancement.
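The core idea described above can be sketched as follows. This is a minimal illustrative example, not the book's exact scheme: a small MLP is trained while zero-mean Gaussian noise is injected into the weights on every forward pass, emulating analogue-VLSI synaptic imprecision as a "fault tolerance hint". The architecture, noise level, task, and hyperparameters are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_with_weight_noise(noise_std=0.05, epochs=8000, lr=0.5):
    """Train a 2-4-1 MLP on XOR with Gaussian weight noise injected
    during training (a hypothetical stand-in for the book's scheme)."""
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
    y = np.array([[0], [1], [1], [0]], float)
    W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)
    W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        # Perturb the weights for this pass -- the "fault tolerance hint".
        N1 = rng.normal(0, noise_std, W1.shape)
        N2 = rng.normal(0, noise_std, W2.shape)
        h = sigmoid(X @ (W1 + N1) + b1)
        out = sigmoid(h @ (W2 + N2) + b2)
        # Backpropagate through the noisy weights; update the clean ones.
        d_out = (out - y) * out * (1 - out)
        d_h = d_out @ (W2 + N2).T * h * (1 - h)
        W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(0)
        W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(0)
    # Evaluate with noiseless weights: the trained network should have
    # settled into a solution that is robust to the injected imprecision.
    preds = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
    return bool((preds.round() == y).all())
```

Training through the noisy weights penalises solutions that are sensitive to small weight perturbations, which is how such a hint can improve tolerance to hardware imprecision.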
Language English ● Format PDF ● Pages 192 ● ISBN 9789812830012 ● Publisher World Scientific Publishing Company ● City Singapore ● Published 1996