A Novel Fractional Gradient-Based Learning Algorithm for Recurrent Neural Networks

Shujaat Khan, Jawwad Ahmad, Imran Naseem, Muhammad Moinuddin

Research output: Contribution to journal › Article › peer-review

37 Citations (Scopus)

Abstract

In this research, we propose a novel algorithm, the fractional back-propagation through time (FBPTT), for training recurrent neural networks. Exploiting the potential of fractional calculus, we derive the FBPTT algorithm from a fractional-calculus-based gradient descent method. The proposed FBPTT method is shown to outperform the conventional back-propagation through time algorithm on three major estimation problems: nonlinear system identification, pattern classification, and Mackey–Glass chaotic time series prediction.
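The abstract does not reproduce the FBPTT derivation, but the core idea of fractional-calculus-based gradient descent can be illustrated with a minimal sketch. The example below assumes a common Caputo-style approximation in which the conventional gradient is rescaled by |w|^(1-α)/Γ(2-α); the function `fractional_gd_step`, the fractional order α = 0.9, and the toy objective are illustrative choices, not the authors' exact formulation.

```python
import math

def fractional_gd_step(w, grad, alpha=0.9, lr=0.05):
    """One fractional gradient descent step (Caputo-style approximation).

    The ordinary gradient is scaled by |w|^(1 - alpha) / Gamma(2 - alpha).
    With alpha = 1 the scale is 1, recovering plain gradient descent.
    """
    scale = (abs(w) ** (1.0 - alpha)) / math.gamma(2.0 - alpha)
    return w - lr * grad * scale

# Toy example: minimise f(w) = (w - 3)^2, whose gradient is 2(w - 3).
w = 0.5
for _ in range(500):
    grad = 2.0 * (w - 3.0)
    w = fractional_gd_step(w, grad, alpha=0.9, lr=0.05)
# w converges toward the minimiser at 3.0
```

In FBPTT this kind of fractional update would replace the ordinary gradient step inside back-propagation through time; the scalar sketch only shows the shape of the weight update.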

Original language: English
Pages (from-to): 593-612
Number of pages: 20
Journal: Circuits, Systems, and Signal Processing
Volume: 37
Issue number: 2
DOIs
Publication status: Published - 1 Feb 2018

