A Novel Fractional Gradient-Based Learning Algorithm for Recurrent Neural Networks

Shujaat Khan, Jawwad Ahmad, Imran Naseem, Muhammad Moinuddin

Research output: Contribution to journal › Article

1 Citation (Scopus)

Abstract

In this research, we propose a novel algorithm, termed fractional back-propagation through time (FBPTT), for training recurrent neural networks. Exploiting the potential of fractional calculus, we derive the FBPTT algorithm using a fractional-calculus-based gradient descent method. The proposed FBPTT method is shown to outperform the conventional back-propagation through time algorithm on three major estimation problems, namely nonlinear system identification, pattern classification, and Mackey–Glass chaotic time series prediction.
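The derivation rests on replacing the conventional gradient in the weight update with a fractional-calculus-based one. As a rough illustration of that idea only (not the paper's actual FBPTT update rules, which are derived for recurrent networks in the article), the sketch below applies a Caputo-type fractional term of order nu alongside the ordinary gradient on a toy quadratic cost; the step sizes mu and mu_f and the |w|^(1-nu)/Gamma(2-nu) factor follow the general form used in fractional LMS-style algorithms and are illustrative assumptions here.

# Illustrative sketch only: a fractional gradient-descent step on a toy cost.
# The exact FBPTT weight updates are derived in the paper; nu, mu and mu_f
# below are assumed, illustrative hyper-parameters.
import math
import numpy as np

def fractional_gd_step(w, grad, nu=0.9, mu=0.05, mu_f=0.05):
    """One update combining the conventional gradient with a
    Caputo-type fractional term grad * |w|**(1 - nu) / Gamma(2 - nu)."""
    frac_term = grad * np.abs(w) ** (1.0 - nu) / math.gamma(2.0 - nu)
    return w - mu * grad - mu_f * frac_term

# Toy usage: minimise J(w) = 0.5 * ||w - w_star||**2, whose gradient is w - w_star.
w_star = np.array([1.0, -2.0, 0.5])
w = np.zeros_like(w_star)
for _ in range(1000):
    w = fractional_gd_step(w, w - w_star)
print(w)  # converges towards w_star

In the paper the same principle is applied to the back-propagation-through-time gradients of a recurrent network rather than to a scalar cost; the sketch only conveys how a fractional-order term supplements the integer-order gradient in each update.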

Original language: English
Pages (from-to): 593-612
Number of pages: 20
Journal: Circuits, Systems, and Signal Processing
Volume: 37
Issue number: 2
DOIs: 10.1007/s00034-017-0572-z
Publication status: Published - 1 Feb 2018

Cite this

Khan, Shujaat; Ahmad, Jawwad; Naseem, Imran; Moinuddin, Muhammad. A Novel Fractional Gradient-Based Learning Algorithm for Recurrent Neural Networks. In: Circuits, Systems, and Signal Processing. 2018; Vol. 37, No. 2, pp. 593-612.
@article{9d201abb260c4fd395318f52a7f86ef9,
title = "A Novel Fractional Gradient-Based Learning Algorithm for Recurrent Neural Networks",
abstract = "In this research, we propose a novel algorithm, termed fractional back-propagation through time (FBPTT), for training recurrent neural networks. Exploiting the potential of fractional calculus, we derive the FBPTT algorithm using a fractional-calculus-based gradient descent method. The proposed FBPTT method is shown to outperform the conventional back-propagation through time algorithm on three major estimation problems, namely nonlinear system identification, pattern classification, and Mackey–Glass chaotic time series prediction.",
keywords = "Back-propagation through time (BPTT), Fractional calculus, Gradient descent, Mackey–Glass chaotic time series, Minimum redundancy and maximum relevance (mRMR), Recurrent neural network (RNN)",
author = "Shujaat Khan and Jawwad Ahmad and Imran Naseem and Muhammad Moinuddin",
year = "2018",
month = "2",
day = "1",
doi = "10.1007/s00034-017-0572-z",
language = "English",
volume = "37",
pages = "593--612",
journal = "Circuits, Systems, and Signal Processing",
issn = "0278-081X",
publisher = "Springer",
number = "2",

}

A Novel Fractional Gradient-Based Learning Algorithm for Recurrent Neural Networks. / Khan, Shujaat; Ahmad, Jawwad; Naseem, Imran; Moinuddin, Muhammad.

In: Circuits, Systems, and Signal Processing, Vol. 37, No. 2, 01.02.2018, pp. 593-612.

Research output: Contribution to journal › Article

TY - JOUR

T1 - A Novel Fractional Gradient-Based Learning Algorithm for Recurrent Neural Networks

AU - Khan, Shujaat

AU - Ahmad, Jawwad

AU - Naseem, Imran

AU - Moinuddin, Muhammad

PY - 2018/2/1

Y1 - 2018/2/1

N2 - In this research, we propose a novel algorithm, termed fractional back-propagation through time (FBPTT), for training recurrent neural networks. Exploiting the potential of fractional calculus, we derive the FBPTT algorithm using a fractional-calculus-based gradient descent method. The proposed FBPTT method is shown to outperform the conventional back-propagation through time algorithm on three major estimation problems, namely nonlinear system identification, pattern classification, and Mackey–Glass chaotic time series prediction.

AB - In this research, we propose a novel algorithm, termed fractional back-propagation through time (FBPTT), for training recurrent neural networks. Exploiting the potential of fractional calculus, we derive the FBPTT algorithm using a fractional-calculus-based gradient descent method. The proposed FBPTT method is shown to outperform the conventional back-propagation through time algorithm on three major estimation problems, namely nonlinear system identification, pattern classification, and Mackey–Glass chaotic time series prediction.

KW - Back-propagation through time (BPTT)

KW - Fractional calculus

KW - Gradient descent

KW - Mackey–Glass chaotic time series

KW - Minimum redundancy and maximum relevance (mRMR)

KW - Recurrent neural network (RNN)

UR - http://www.scopus.com/inward/record.url?scp=85040836202&partnerID=8YFLogxK

U2 - 10.1007/s00034-017-0572-z

DO - 10.1007/s00034-017-0572-z

M3 - Article

VL - 37

SP - 593

EP - 612

JO - Circuits, Systems, and Signal Processing

JF - Circuits, Systems, and Signal Processing

SN - 0278-081X

IS - 2

ER -