Some Examples of Regularized Matrix Decomposition
Department of Statistics
Texas A&M University
Abstract: In this talk, I will review some of my recent work on regularized matrix decomposition. Depending on the application, the matrix under consideration can be a data matrix, a latent canonical parameter matrix of an exponential family distribution, or the regression coefficient matrix of a multivariate regression. I will use penalized least squares or penalized maximum likelihood as a common approach to these matrix decomposition problems. I will discuss the use of various penalty functions for regularization, including sparsity-inducing penalties, roughness penalties, and their combinations. Governed by the structure of the problem, the penalty can be designed for one-way or two-way regularization. I will illustrate the key ideas with several examples, including functional principal components analysis, biclustering, reconstruction of MEG/EEG source signals, and protein structure clustering using protein backbone angular distributions. This talk is based on joint work with Andreas Buja, Xin Gao, Seokho Lee, Mehdi Maadooliat, Haipeng Shen, Siva Tian, and Lan Zhou.
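To make the idea of a sparsity-inducing penalty in matrix decomposition concrete, below is a minimal illustrative sketch, not the speaker's actual algorithms: a rank-one decomposition X ≈ uvᵀ fitted by alternating penalized least squares, where an L1 penalty on v is handled by soft-thresholding (a one-way regularization). The function names (`soft_threshold`, `sparse_rank1`) and all parameter choices are illustrative assumptions.

```python
import numpy as np

def soft_threshold(x, lam):
    # Elementwise soft-thresholding: the proximal operator of the L1 penalty.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def sparse_rank1(X, lam=0.5, n_iter=50):
    """Rank-one decomposition X ~ u v^T with an L1 (sparsity-inducing)
    penalty on v, fitted by alternating penalized least squares."""
    # Initialize from the leading singular vectors.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    u, v = U[:, 0], s[0] * Vt[0]
    for _ in range(n_iter):
        # With v fixed, the least-squares update for the unit-norm factor u.
        u = X @ v
        u /= np.linalg.norm(u)
        # With u fixed, penalized least squares for v reduces to
        # soft-thresholding the ordinary least-squares update X^T u.
        v = soft_threshold(X.T @ u, lam)
    return u, v

# Simulate data whose right factor is sparse, plus noise.
rng = np.random.default_rng(0)
u0 = rng.standard_normal(30)
v0 = np.concatenate([np.array([3.0, -2.0, 2.5]), np.zeros(17)])
X = np.outer(u0, v0) + 0.1 * rng.standard_normal((30, 20))

u, v = sparse_rank1(X, lam=0.5)
print(np.count_nonzero(v))  # the 17 noise entries of v are shrunk exactly to zero
```

The penalized update for v is what distinguishes this from a plain SVD power iteration: shrinkage trades a little bias for exact zeros in the recovered factor. A two-way regularization would apply a penalty (e.g., a roughness penalty for functional data) to u as well.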
Brief bio: Dr. Jianhua Huang received his M.S. degree in Probability and Statistics from Beijing University and his Ph.D. in Statistics from the University of California at Berkeley. He previously held a position at the Wharton School, University of Pennsylvania. He is currently a professor in the Department of Statistics at Texas A&M University and an adjunct professor at the MD Anderson Cancer Center. Dr. Huang is an active researcher in the areas of statistical machine learning, nonparametric and semiparametric methods, functional data analysis, and applications of statistics in business and engineering. He has published more than 70 refereed papers. He is a fellow of the ASA and the IMS.