Regularized factor models

Martha White. Regularized factor models. PhD Thesis, University of Alberta. 2014.

Download

[PDF] 

Abstract

This dissertation explores regularized factor models as a simple unification of machine learning problems, with a focus on algorithmic development within this known formalism. The main contributions are (1) the development of generic, efficient algorithms for a subclass of regularized factorizations and (2) new unifications that facilitate application of these algorithms to problems previously without known tractable algorithms. Concurrently, the generality of the formalism is further demonstrated with a thorough summary of known, but often scattered, connections between supervised and unsupervised learning problems and algorithms.

The dissertation first presents the main algorithmic advances: convex reformulations of non-convex regularized factorization objectives. A convex reformulation is developed for a general subset of regularized factor models, with an efficiently computable optimization for five different regularization choices. The thesis then describes advances using these generic convex reformulation techniques in three important problems: multi-view subspace learning, semi-supervised learning and estimating autoregressive moving average models. These novel settings are unified under regularized factor models by incorporating problem properties in terms of regularization. Once expressed as regularized factor models, we can take advantage of the convex reformulation techniques to obtain novel algorithms that produce global solutions. These advances include the first global estimation procedure for two-view subspace learning and for autoregressive moving average models. The simple algorithms obtained from these general convex reformulation techniques are empirically shown to be effective across these three problems on a variety of datasets.
This dissertation illustrates that many problems can be specified as a simple regularized factorization, that this class is amenable to global optimization and that it is advantageous to represent machine learning problems as regularized factor models.
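To make the objective concrete: a basic instance of a regularized factorization is minimizing a reconstruction loss plus regularizers on both factors. The sketch below uses the standard (non-convex) alternating-minimization approach with Frobenius-norm regularization; note that the thesis's contribution is convex *reformulations* of such objectives, which this illustration does not implement. All function and parameter names here are hypothetical.

```python
import numpy as np

def regularized_factorization(X, k, alpha=0.1, n_iters=200, seed=0):
    """Illustrative sketch: approximately solve
        min_{U,V} ||X - U V||_F^2 + alpha (||U||_F^2 + ||V||_F^2)
    by alternating minimization. Each subproblem is a ridge
    regression with a closed-form solution."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    U = rng.standard_normal((n, k))
    V = rng.standard_normal((k, d))
    I = np.eye(k)
    for _ in range(n_iters):
        # Fix V, solve the ridge problem for U; then fix U, solve for V.
        U = X @ V.T @ np.linalg.inv(V @ V.T + alpha * I)
        V = np.linalg.inv(U.T @ U + alpha * I) @ U.T @ X
    return U, V
```

Alternating minimization converges only to a stationary point in general; the convex reformulations in the thesis are designed to avoid exactly this limitation and recover global solutions.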

BibTeX

@phdthesis{14thesis-factors,
  title  = "Regularized factor models",
  author = "Martha White",
  school = "University of Alberta",
  year   = "2014",
  bib2html_dl_pdf = "../publications/martha_white_phd_thesis.pdf"
}