Please use this identifier to cite or link to this item:
Title: A Comparison of Architectural Varieties in Radial Basis Function Neural Networks
Authors: Efe, Mehmet Önder
Kasnakoğlu, Coşku
Keywords: [No Keywords]
Issue Date: 2008
Publisher: IEEE
Source: International Joint Conference on Neural Networks (IJCNN), June 1-8, 2008, Hong Kong, China
Series/Report no.: IEEE International Joint Conference on Neural Networks (IJCNN)
Abstract: Representation of knowledge within a neural model is an active field of research concerned with the development of alternative structures, training algorithms, learning modes and applications. Radial Basis Function Neural Networks (RBFNNs) constitute an important part of neural networks research, as their operating principle is to discover and exploit similarities between an input vector and a feature vector. In this paper, we compare nine architectures in terms of learning performance. The Levenberg-Marquardt (LM) technique is coded for every individual configuration, and the model with a linear part augmentation is observed to perform better in terms of the final least mean squared error level in almost all experiments. Furthermore, according to the results, this model rarely gets trapped in local minima. Overall, this paper presents clear and concise figures of comparison among the nine architectures, and this constitutes its major contribution.
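The abstract's central idea, a Gaussian RBF layer whose output is augmented with a linear part in the input, can be sketched as follows. This is a minimal illustrative implementation, not the paper's exact formulation; the function name, parameter names, and shapes are assumptions chosen for clarity.

```python
import math

def rbf_linear_forward(x, centers, sigma, w_rbf, w_lin, b):
    """Forward pass of a hypothetical RBFNN with linear-part augmentation.

    x       : input vector (list of floats)
    centers : hidden-unit centers (list of vectors), compared to x by distance
    sigma   : common Gaussian width (a simplification; widths may vary per unit)
    w_rbf   : output weights on the RBF activations
    w_lin   : weights of the augmenting linear term in the raw input
    b       : output bias
    """
    # Gaussian activation of each hidden unit: similarity of x to its center
    phi = [math.exp(-sum((ci - xi) ** 2 for ci, xi in zip(c, x))
                    / (2.0 * sigma ** 2))
           for c in centers]
    rbf_part = sum(w * p for w, p in zip(w_rbf, phi))
    lin_part = sum(w * xi for w, xi in zip(w_lin, x))  # the augmentation
    return rbf_part + lin_part + b

# Toy usage: an input sitting on the first center activates it fully (phi = 1)
y = rbf_linear_forward(x=[0.0, 0.0],
                       centers=[[0.0, 0.0], [1.0, 1.0]],
                       sigma=1.0,
                       w_rbf=[1.0, 1.0],
                       w_lin=[0.0, 0.0],
                       b=0.0)
```

In the paper the free parameters (centers, widths, weights) are tuned by the Levenberg-Marquardt algorithm; the sketch above covers only the forward evaluation.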
ISBN: 978-1-4244-1820-6
ISSN: 2161-4393
Appears in Collections:Elektrik ve Elektronik Mühendisliği Bölümü / Department of Electrical & Electronics Engineering
Scopus İndeksli Yayınlar Koleksiyonu / Scopus Indexed Publications Collection
WoS İndeksli Yayınlar Koleksiyonu / WoS Indexed Publications Collection

Items in GCRIS Repository are protected by copyright, with all rights reserved, unless otherwise indicated.