Lan Liu, PhD, University of Minnesota, Twin Cities
Title: The Inner Partial Least Square: A Probe into the "Necessary" Dimension Reduction
Abstract:
The partial least square (PLS) algorithm retains the linear combinations of predictors that maximize the covariance with the outcome. The Fisherian interpretation of PLS remained a mystery until Cook et al. (2013) showed that it estimates a predictor envelope, the smallest reducing subspace of Σ_X that contains the coefficient space. This paper is motivated by findings that follow from a seemingly trivial change to PLS: what if we change the max in PLS to a min? Counterintuitively, this does not compute the complement of the traditional PLS space. Instead, it yields a new space: the largest reducing subspace of Σ_X that is contained in the coefficient space. We call the modified algorithm the inner PLS and the resulting space the inner predictor envelope space. Whereas the traditional PLS removes irrelevant information, the inner PLS incorporates the knowledge that some information is purely relevant. Consequently, the inner PLS algorithm can lead to a more efficient regression estimator than PLS in certain scenarios; it is not, however, the most efficient under the inner predictor envelope model. We therefore derive the maximum likelihood estimator and provide a non-Grassmannian optimization technique to compute it. We confirm the efficiency gain of our estimators both in simulations and in real data from the China Health and Nutrition Survey.
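For intuition on the max-versus-min contrast, recall the textbook first PLS direction: the unit vector w maximizing Cov(Xw, y)^2 has the closed form w ∝ X^T y (after centering). The sketch below (NumPy; variable names are mine, and this is the standard PLS direction, not the paper's inner PLS algorithm) computes that direction and contrasts it with a unit direction orthogonal to X^T y, whose covariance with the outcome is zero. This illustrates why naively minimizing the covariance is degenerate for a single direction, and hence why the inner PLS must target something subtler: the largest reducing subspace of Σ_X inside the coefficient space.

```python
import numpy as np

# Toy regression data: n samples, p predictors, scalar outcome.
rng = np.random.default_rng(0)
n, p = 200, 5
X = rng.standard_normal((n, p))
y = X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.standard_normal(n)

# Center predictors and outcome, as PLS does.
Xc = X - X.mean(axis=0)
yc = y - y.mean()

# First PLS weight vector: argmax_{||w||=1} Cov(Xw, y)^2 = X^T y normalized.
s = Xc.T @ yc
w_max = s / np.linalg.norm(s)
cov_max = abs(w_max @ s) / (n - 1)  # |Cov(X w_max, y)|

# A "min" direction: any unit vector orthogonal to X^T y has covariance zero,
# so the minimizer of the first-direction objective carries no signal at all.
w_perp = np.zeros(p)
w_perp[int(np.argmin(np.abs(s)))] = 1.0
w_perp = w_perp - (w_perp @ w_max) * w_max  # project out the PLS direction
w_perp = w_perp / np.linalg.norm(w_perp)
cov_min = abs(w_perp @ s) / (n - 1)  # |Cov(X w_perp, y)|, essentially 0

print(cov_max, cov_min)
```

The maximizing direction concentrates all of the predictor-outcome covariance, while the orthogonal direction carries none; the inner PLS of the abstract avoids this degeneracy by operating on reducing subspaces of Σ_X rather than on single covariance-minimizing directions.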