Abstract
This letter is concerned with the problem of selecting the best or most informative dimension for dimension reduction and feature extraction in high-dimensional data. The dimension of the data is reduced by principal component analysis; subsequent application of independent component analysis to the principal component scores determines the most nongaussian directions in the lower-dimensional space. A criterion for choosing the optimal dimension based on bias-adjusted skewness and kurtosis is proposed. This new dimension selector is applied to real data sets and compared to existing methods. Simulation studies for a range of densities show that the proposed method performs well and is more appropriate for nongaussian data than existing methods.
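As a rough illustration of the pipeline the abstract describes (and not the paper's actual selector), the sketch below reduces the data with PCA, applies FastICA to the principal component scores for each candidate dimension, and ranks dimensions by a simple skewness-and-kurtosis score. The bias adjustment and the exact form of the criterion proposed in the paper are not reproduced here; the helper names, the score definition, and the toy data are illustrative assumptions only.

```python
import numpy as np
from scipy.stats import skew, kurtosis
from sklearn.decomposition import PCA, FastICA


def nongaussianity_score(components):
    """Mean of squared skewness plus squared excess kurtosis over components.

    A deliberately simple stand-in for the paper's bias-adjusted criterion;
    both quantities are zero in expectation for Gaussian data.
    """
    s = skew(components, axis=0)
    k = kurtosis(components, axis=0)  # excess kurtosis, 0 for a Gaussian
    return float(np.mean(s ** 2 + k ** 2))


def select_dimension(X, max_dim, random_state=0):
    """Pick the candidate dimension whose ICA components look least Gaussian."""
    scores = {}
    for q in range(2, max_dim + 1):
        pcs = PCA(n_components=q).fit_transform(X)  # dimension reduction
        ics = FastICA(n_components=q,
                      random_state=random_state).fit_transform(pcs)  # most nongaussian directions
        scores[q] = nongaussianity_score(ics)
    return max(scores, key=scores.get), scores


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy example: two nongaussian sources mixed into 20 observed variables.
    n = 1000
    sources = np.column_stack([rng.laplace(size=n), rng.exponential(size=n)])
    X = sources @ rng.normal(size=(2, 20)) + 0.1 * rng.normal(size=(n, 20))
    best_q, scores = select_dimension(X, max_dim=6)
    print("selected dimension:", best_q)
```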
| Original language | English |
| --- | --- |
| Pages (from-to) | 513-545 |
| Number of pages | 33 |
| Journal | Neural Computation |
| Volume | 19 |
| Issue number | 2 |
| DOIs | |
| Publication status | Published - 1 Jan 2007 |
| Externally published | Yes |