Load-flow parallel processing system. Conjugate-gradient neural network architecture

Tien Nguyen

Research output: Contribution to journal › Article

3 Citations (Scopus)

Abstract

Whilst very many previously published papers have been devoted to the non-linear mapping functions of neural networks, the possible scope of neural networks in numerical-analysis applications has not so far been the subject of correspondingly extensive investigation. In recognition of this, research has been undertaken into solving by neural networks the large non-linear equation systems that arise in power network system analysis. From this research, the paper reports the development of neural network architectures for Newton-Raphson load-flow analysis.

Load-flow analysis is interpreted as an unconstrained minimisation formulation. The linearised load-flow equations at each Newton iteration are transformed into a scalar objective function of quadratic form. Minimising this objective function by standard minimisation procedures requires only the multiplications and additions that arise in matrix/vector operations, and it is here that neural networks can excel. The paper shows how a massively parallel processing structure can be achieved in which very many multiplications are carried out simultaneously. The neural network architecture developed therefore achieves ultra-high-speed load-flow analysis: the computing time to achieve a converged load-flow solution for a 1000-node power network is less than about 20 ms.

Reference is made to validation studies in which load-flow solutions from the new neural network architecture are compared with those from a standard sequential-processor Newton-Raphson load-flow program. © 1996 Elsevier Science S.A.
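The abstract describes recasting each linearised Newton step, J·Δx = −f, as minimising a quadratic objective using only matrix/vector multiplications and additions. As a minimal sketch of that idea (the variable names and the normal-equations formulation A = JᵀJ, b = −Jᵀf are assumptions for illustration, not the paper's exact scheme), a conjugate-gradient minimiser of such a quadratic looks like:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Minimise F(x) = 0.5 * x.T @ A @ x - b.T @ x for symmetric
    positive-definite A, i.e. solve A @ x = b, using only the
    matrix/vector multiplications and additions the abstract refers to."""
    x = np.zeros_like(b)
    r = b - A @ x            # residual (negative gradient of F)
    p = r.copy()             # initial search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)   # exact line-search step along p
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p  # conjugate update of direction
        rs_old = rs_new
    return x

# Illustrative use at one hypothetical Newton iteration: given a Jacobian
# J and mismatch vector f, form the quadratic and minimise it.
J = np.array([[2.0, 0.5], [0.5, 1.5]])
f = np.array([0.3, -0.1])
A = J.T @ J
b = -J.T @ f
dx = conjugate_gradient(A, b)   # the Newton correction step
```

Each iteration is dominated by the product `A @ p`, which is exactly the kind of operation a massively parallel multiply-accumulate structure can evaluate with all multiplications performed at the same time.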
Original language: English
Pages (from-to): 73-79
Journal: Electric Power Systems Research
Volume: 39
Issue number: 12
DOIs
Publication status: Published - 1996
