Statistical learning
Dimension reduction
We study a range of dimension reduction techniques. Our interests fall into four broad problems:
- the effect of unsupervised dimension reduction techniques in a regression setting;
- supervised dimension reduction, and more specifically extending the class of methods known as Sufficient Dimension Reduction (SDR);
- how machine learning ideas, such as Support Vector Machines (SVMs), can be combined with SDR methodology to improve estimation of the reduced subspace;
- the theoretical framework of dimension reduction methods in a text mining setting.
The projects we work on can be methodological or computational.
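To give a flavour of SDR, the sketch below implements Sliced Inverse Regression (SIR), the classical starting point of the SDR literature, on simulated single-index data. This is an illustrative example only, not code from any of the papers below; the model, slice count, and data-generating link function are all assumptions chosen for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated single-index model: y depends on x only through beta' x
n, p = 2000, 6
beta = np.zeros(p)
beta[0] = 1.0                               # true direction (e_1, for illustration)
X = rng.standard_normal((n, p))
y = np.exp(X @ beta) + 0.1 * rng.standard_normal(n)

def sir(X, y, n_slices=10, d=1):
    """Sliced Inverse Regression: estimate a d-dimensional basis of the
    central subspace from within-slice means of the standardized predictors."""
    n, p = X.shape
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    # Standardize: Z = (X - mu) Sigma^{-1/2}
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = (X - mu) @ Sigma_inv_sqrt
    # Slice the data by the ordered response and average Z within each slice
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M span the subspace on the Z scale;
    # map them back to the original X scale
    w, v = np.linalg.eigh(M)
    directions = Sigma_inv_sqrt @ v[:, ::-1][:, :d]
    return directions / np.linalg.norm(directions, axis=0)

b_hat = sir(X, y)[:, 0]
print(abs(b_hat @ beta))   # near 1 when the direction is recovered
```

SVM-based approaches such as the Principal Support Vector Machine (Li, Artemiou and Li, 2011, below) replace the slice-mean step with optimal separating hyperplanes between slices, which yields both linear and nonlinear variants of the reduction.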
Cardiff Investigators
- Dr. Andreas Artemiou
- Dr. Jenny Morgan
- Mr. Luke Smallman (Ph.D. student)
Collaborators
- Prof. Bing Li (Pennsylvania State University)
- Dr. Yuexiao Dong (Temple University)
- Dr. Seung-Jun Shin (Korea University)
Selected publications
- Andreas Artemiou and Lipu Tian (2015), “Using Sliced Inverse Mean Difference for Sufficient Dimension Reduction”, To appear in Statistics and Probability Letters.
- Luke Smallman and Andreas Artemiou (2015), “A Study on Imbalance Support Vector Machine Algorithms for Sufficient Dimension Reduction”, To appear in Communications in Statistics, Theory and Methods.
- Andreas Artemiou and Bing Li (2013), “Predictive power of principal components for single-index model and sufficient dimension reduction”, Journal of Multivariate Analysis, 119, 176-184.
- Bing Li, Andreas Artemiou and Lexin Li (2011), “Principal support vector machine for linear and nonlinear sufficient dimension reduction”, Annals of Statistics, 39, 3182-3210.
- Andreas Artemiou and Bing Li (2009), “On principal components and regression: A statistical explanation of a natural phenomenon”, Statistica Sinica, 19, 1557-1565.