Multi-Kernel Fusion for RBF Neural Networks

Syed Muhammad Atif, Shujaat Khan, Imran Naseem, Roberto Togneri, Mohammed Bennamoun

Research output: Contribution to journal › Article › peer-review

6 Citations (Scopus)

Abstract

A simple yet effective architectural design makes radial basis function neural networks (RBFNNs) among the most popular conventional neural networks. The current generation of RBFNNs is equipped with multiple kernels, which provide significant performance benefits compared to the previous generation that used only a single kernel. In existing multi-kernel RBF algorithms, the multi-kernel is formed as a convex combination of the base/primary kernels. In this paper, we propose a novel multi-kernel RBFNN in which every base kernel has its own (local) weight. This added flexibility in the network yields better performance, such as a faster convergence rate, better local minima, and resilience against getting stuck in poor local minima. These performance gains are achieved at a computational complexity competitive with contemporary multi-kernel RBF algorithms. The proposed algorithm is thoroughly analysed for performance gains using mathematical and graphical illustrations, and it is evaluated on three different types of problems, namely: (i) pattern classification, (ii) system identification and (iii) function approximation. Empirical results clearly show the superiority of the proposed algorithm over existing state-of-the-art multi-kernel approaches.
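The structural idea highlighted in the abstract, namely each base kernel carrying its own (local) output weight per centre rather than sharing a single convex mixing coefficient, can be sketched in a few lines. The snippet below is only an illustration of that structure, not the authors' algorithm: the choice of Gaussian and multiquadric base kernels, the centre selection, and the closed-form ridge fit (the paper trains the network iteratively) are all assumptions made for the example.

```python
# Minimal sketch (not the paper's implementation) of a two-kernel RBF layer
# in which each base kernel owns its own weight vector (one weight per
# centre per kernel), instead of a single shared mixing coefficient.
import numpy as np

def gaussian_kernel(X, centres, width=1.0):
    # Pairwise squared distances between samples and centres
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

def multiquadric_kernel(X, centres, width=1.0):
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
    return np.sqrt(d2 + width ** 2)

def design_matrix(X, centres):
    # Stacking the two kernel responses side by side gives every kernel its
    # own block of per-centre weights in the output layer.
    return np.hstack([gaussian_kernel(X, centres),
                      multiquadric_kernel(X, centres)])   # shape (N, 2*C)

def fit_local_weight_rbf(X, y, centres, ridge=1e-6):
    # Regularised least-squares fit of the local weights (illustrative only;
    # the paper's training procedure is gradient-based).
    Phi = design_matrix(X, centres)
    A = Phi.T @ Phi + ridge * np.eye(Phi.shape[1])
    return np.linalg.solve(A, Phi.T @ y)

def predict(X, centres, w):
    return design_matrix(X, centres) @ w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(200, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)   # toy regression target
    centres = X[rng.choice(len(X), size=20, replace=False)]
    w = fit_local_weight_rbf(X, y, centres)
    print("training MSE:", np.mean((predict(X, centres, w) - y) ** 2))
```

In a conventional multi-kernel RBFNN the two kernel blocks would be collapsed into one via a convex combination (a single scalar per kernel); keeping separate per-centre weights, as above, is the extra degree of freedom the paper attributes its convergence and local-minima benefits to.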

Original language: English
Pages (from-to): 1045-1069
Number of pages: 25
Journal: Neural Processing Letters
Volume: 55
Issue number: 2
Early online date: 24 Jun 2022
DOIs
Publication status: Published - Apr 2023
