Abstract
A model order reduction algorithm is presented that constructs a reduced-order model of the original high-order model with high fidelity within the desired time interval. The reduced model satisfies a subset of the first-order optimality conditions for the time-limited ℋ2-model reduction problem. The algorithm uses a computationally efficient Krylov subspace-based framework to generate the reduced model, and it is applicable to large-scale systems. The reduced-order model is parameterised to enforce a subset of the first-order optimality conditions in an iteration-free way. An adaptive framework of the algorithm is also proposed, which ensures a monotonic decay in the error irrespective of the choice of interpolation points and tangential directions. The efficacy of the algorithm is validated on benchmark model reduction problems.
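For context on the Krylov subspace-based framework the abstract refers to, the sketch below illustrates standard one-sided interpolatory (rational-Krylov) projection in NumPy. The function name, the example system, and the choice of interpolation points are illustrative assumptions only; the paper's algorithm additionally parameterises the reduced model so that the time-limited ℋ2 optimality conditions are enforced without iteration.

```python
import numpy as np

def interpolatory_reduction(A, B, C, sigmas):
    """One-sided rational-Krylov (interpolatory) projection sketch.

    Builds a basis V spanning (sigma_i*I - A)^{-1} B for each
    interpolation point sigma_i and projects the state-space model
    (A, B, C) onto that subspace. Hypothetical helper for
    illustration; it does NOT implement the paper's time-limited
    selection of points and tangential directions.
    """
    n = A.shape[0]
    cols = []
    for s in sigmas:
        # Each shifted linear solve adds one interpolation condition.
        v = np.linalg.solve(s * np.eye(n) - A, B)
        cols.append(v.ravel())
    V, _ = np.linalg.qr(np.column_stack(cols))  # orthonormal basis
    Ar = V.T @ A @ V   # reduced state matrix
    Br = V.T @ B       # reduced input matrix
    Cr = C @ V         # reduced output matrix
    return Ar, Br, Cr

# Usage: reduce a random stable 100-state SISO model to order 4.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 100)) - 60 * np.eye(100)  # shifted for stability
B = rng.standard_normal((100, 1))
C = rng.standard_normal((1, 100))
Ar, Br, Cr = interpolatory_reduction(A, B, C, sigmas=[0.1, 1.0, 10.0, 100.0])
```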
| Original language | English |
|---|---|
| Pages (from-to) | 1995-2007 |
| Number of pages | 13 |
| Journal | IET Control Theory and Applications |
| Volume | 14 |
| Issue number | 14 |
| DOIs | |
| Publication status | Published - 24 Sept 2020 |