TY - JOUR
T1 - A Novel Fractional Gradient-Based Learning Algorithm for Recurrent Neural Networks
AU - Khan, Shujaat
AU - Ahmad, Jawwad
AU - Naseem, Imran
AU - Moinuddin, Muhammad
PY - 2018/2/1
Y1 - 2018/2/1
N2 - In this research, we propose a novel algorithm for the learning of recurrent neural networks, called the fractional back-propagation through time (FBPTT) algorithm. Exploiting the potential of fractional calculus, we derive the FBPTT algorithm using a fractional calculus-based gradient descent method. The proposed FBPTT method is shown to outperform the conventional back-propagation through time algorithm on three major estimation problems, namely nonlinear system identification, pattern classification, and Mackey–Glass chaotic time series prediction.
AB - In this research, we propose a novel algorithm for the learning of recurrent neural networks, called the fractional back-propagation through time (FBPTT) algorithm. Exploiting the potential of fractional calculus, we derive the FBPTT algorithm using a fractional calculus-based gradient descent method. The proposed FBPTT method is shown to outperform the conventional back-propagation through time algorithm on three major estimation problems, namely nonlinear system identification, pattern classification, and Mackey–Glass chaotic time series prediction.
KW - Back-propagation through time (BPTT)
KW - Fractional calculus
KW - Gradient descent
KW - Mackey–Glass chaotic time series
KW - Minimum redundancy and maximum relevance (mRMR)
KW - Recurrent neural network (RNN)
UR - http://www.scopus.com/inward/record.url?scp=85040836202&partnerID=8YFLogxK
U2 - 10.1007/s00034-017-0572-z
DO - 10.1007/s00034-017-0572-z
M3 - Article
AN - SCOPUS:85040836202
SN - 0278-081X
VL - 37
SP - 593
EP - 612
JO - Circuits, Systems, and Signal Processing
JF - Circuits, Systems, and Signal Processing
IS - 2
ER -