INEB
Title: Neural network classification: Maximizing zero-error density
Publication Type: Conference Paper
Year of Publication: 2005
Authors: Silva, LM; Alexandre, LA; De Sá, JM
Editors: Singh, S; Singh, M; Apte, C; Perner, P
Conference Name: Lecture Notes in Computer Science
Date Published: 2005
Conference Location: Bath
ISSN: 0302-9743
Keywords: Algorithms, Backpropagation, Backpropagation algorithm, Classification (of information), Costs, Error analysis, Function evaluation, Learning systems, Mean square error, Neural network classification, Neural networks, Zero-error density
Abstract: We propose a new cost function for neural network classification: the error density at the origin. This method provides a simple objective function that can be easily plugged into the usual backpropagation algorithm, giving a simple and efficient learning scheme. Experimental work shows the effectiveness and superiority of the proposed method compared to the usual mean square error criterion on four well-known datasets. © Springer-Verlag Berlin Heidelberg 2005.
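Notes: To make the idea in the abstract concrete, below is a minimal sketch, not the authors' code, of how an "error density at the origin" cost could be computed and differentiated for use inside backpropagation. It assumes a Gaussian Parzen-window estimate of the error density with a fixed bandwidth h; the function names, the kernel choice, and the bandwidth value are illustrative assumptions, not taken from the paper.

```python
# Sketch (assumption, not the authors' implementation): estimate the density of the
# errors e_i = t_i - y_i at e = 0 with a Gaussian Parzen window, and maximize it
# (equivalently, minimize its negative) so that training concentrates errors at zero.
import numpy as np

def zero_error_density(targets, outputs, h=1.0):
    """Parzen-window estimate of the error density at the origin (higher is better)."""
    e = targets - outputs                          # per-example errors
    k = np.exp(-e**2 / (2.0 * h**2))               # Gaussian kernel evaluated at e
    return k.mean() / (np.sqrt(2.0 * np.pi) * h)

def zed_gradient(targets, outputs, h=1.0):
    """Gradient of the negative density w.r.t. the network outputs.

    Backpropagation can use this in place of the MSE derivative -2 (t - y).
    """
    e = targets - outputs
    n = e.size
    k = np.exp(-e**2 / (2.0 * h**2))
    # d/dy of (1 / (n * sqrt(2*pi) * h)) * sum_i exp(-e_i^2 / (2 h^2)), with e = t - y
    d_density_d_y = (k * e) / (n * np.sqrt(2.0 * np.pi) * h**3)
    return -d_density_d_y                          # descend on the negative density

# Toy usage: outputs close to the targets yield a larger density at zero.
if __name__ == "__main__":
    t = np.array([1.0, 1.0, -1.0, -1.0])
    good = np.array([0.9, 0.8, -0.95, -0.7])
    bad = np.array([0.1, -0.3, 0.2, 0.4])
    print(zero_error_density(t, good), ">", zero_error_density(t, bad))
```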
URL: http://www.scopus.com/inward/record.url?eid=2-s2.0-27244450344&partnerID=40&md5=73ee3c0c073fb3bc7bc85013bf7dcce6