%0 Journal Article
%T L1 Major Component Detection and Analysis (L1 MCDA): Foundations in Two Dimensions
%A Ye Tian
%A Qingwei Jin
%A John E. Lavery
%A Shu-Cherng Fang
%J Algorithms
%D 2013
%I MDPI AG
%R 10.3390/a6010012
%X Principal Component Analysis (PCA) is widely used for identifying the major components of statistically distributed point clouds. Robust versions of PCA, often based in part on the L1 norm (rather than the L2 norm), are increasingly used, especially for point clouds with many outliers. Neither standard PCA nor robust PCAs can provide, without additional assumptions, reliable information for outlier-rich point clouds and for distributions with several main directions (spokes). We carry out a fundamental and complete reformulation of the PCA approach in a framework based exclusively on the L1 norm and heavy-tailed distributions. The L1 Major Component Detection and Analysis (L1 MCDA) that we propose can determine the main directions and the radial extent of 2D data from single or multiple superimposed Gaussian or heavy-tailed distributions, with and without patterned artificial outliers (clutter). In nearly all cases in the computational results, 2D L1 MCDA has accuracy superior to that of standard PCA and of two robust PCAs, namely, the projection-pursuit method of Croux and Ruiz-Gazen and the L1 factorization method of Ke and Kanade. (Standard PCA is, of course, superior to L1 MCDA for Gaussian-distributed point clouds.) The computing time of L1 MCDA is competitive with the computing times of the two robust PCAs.
%K heavy-tailed distribution
%K L1
%K L2
%K major component
%K multivariate statistics
%K outliers
%K principal component analysis
%K 2D
%U http://www.mdpi.com/1999-4893/6/1/12