Dimension selection for feature selection and dimension reduction with principal and independent component analysis

Inge Koch, Kanta Naito

Research output: Contribution to journal › Article › peer-review

20 Citations (Scopus)

Abstract

This letter is concerned with the problem of selecting the best or most informative dimension for dimension reduction and feature extraction in high-dimensional data. The dimension of the data is reduced by principal component analysis; subsequent application of independent component analysis to the principal component scores determines the most nongaussian directions in the lower-dimensional space. A criterion for choosing the optimal dimension based on bias-adjusted skewness and kurtosis is proposed. This new dimension selector is applied to real data sets and compared to existing methods. Simulation studies for a range of densities show that the proposed method performs well and is more appropriate for nongaussian data than existing methods.
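The pipeline described in the abstract — reduce dimension with PCA, then apply ICA to the principal component scores and measure nongaussianity of the recovered directions — can be sketched as follows. This is a minimal illustration on synthetic data: it scores nongaussianity with plain sample skewness and kurtosis, not the bias-adjusted statistics proposed in the paper, and the function name `nongaussianity` is an assumption for this sketch.

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA
from scipy.stats import skew, kurtosis

def nongaussianity(X, p, seed=0):
    """Project X onto p principal components, run ICA on the scores,
    and return the largest per-component nongaussianity score.
    Note: uses raw sample skewness/kurtosis, not the paper's
    bias-adjusted criterion."""
    scores = PCA(n_components=p).fit_transform(X)
    sources = FastICA(n_components=p, random_state=seed,
                      whiten="unit-variance").fit_transform(scores)
    # squared skewness plus squared excess kurtosis, per independent component
    return max(skew(s) ** 2 + kurtosis(s) ** 2 for s in sources.T)

# Synthetic example: one exponential (nongaussian) source mixed into
# otherwise gaussian high-dimensional data.
rng = np.random.default_rng(0)
n, d = 500, 10
X = rng.normal(size=(n, d))
X[:, 0] = rng.exponential(size=n)
X = X @ rng.normal(size=(d, d))  # random mixing

score = nongaussianity(X, 3)
```

A dimension selector in this spirit would evaluate such a score over a range of candidate dimensions p and pick the one that maximizes a suitably adjusted criterion; the paper's contribution is the bias adjustment that makes this comparison valid across dimensions.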

Original language: English
Pages (from-to): 513-545
Number of pages: 33
Journal: Neural Computation
Volume: 19
Issue number: 2
DOIs
Publication status: Published - 1 Jan 2007
Externally published: Yes

