Abstract
Many learning algorithms are formulated in terms of finding model parameters which minimize a data-fitting loss function plus a regularizer. When the regularizer involves the ℓ₀ pseudonorm, the resulting regularization path consists of a finite set of models. The fastest existing algorithm for computing the breakpoints in the regularization path is quadratic in the number of models, so it scales poorly to high-dimensional problems. We provide new formal proofs that a dynamic programming algorithm can be used to compute the breakpoints in linear time. Our empirical results include analysis of the proposed algorithm in the context of various learning problems (regression, changepoint detection, clustering, and matrix factorization). We use a detailed analysis of changepoint detection problems to demonstrate the improved accuracy and speed of our approach relative to grid search and a previous quadratic-time algorithm.
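The linear-time computation described in the abstract can be illustrated with a lower-envelope scan: each candidate model k defines a line loss[k] + λ·size_k in the penalty λ, and the regularization path is the lower envelope of these lines. Below is a minimal sketch of that idea, not the authors' implementation; the `loss` array and the slope convention (model k has k + 1 parameters) are assumptions for illustration.

```python
def path_breakpoints(loss):
    """Sketch of a linear-time breakpoint computation (lower-envelope scan).

    Assumed setup: model k has complexity k + 1 and data-fitting loss loss[k],
    so the selected model at penalty lam is argmin_k loss[k] + lam * (k + 1).
    Returns (models, breaks): model indices on the path, ordered from lam = 0
    upward (decreasing complexity), and the lam values where selection changes.
    Each model is pushed and popped at most once, so the scan is linear time.
    """
    stk = []  # model indices on the envelope; stk[-1] is optimal near lam = 0

    def cross(i, j):
        # penalty lam at which models i and j have equal cost (requires i < j)
        return (loss[i] - loss[j]) / (j - i)

    for k in range(len(loss)):
        if stk and loss[k] >= loss[stk[-1]]:
            continue  # higher loss AND higher complexity: never optimal
        while len(stk) >= 2 and cross(stk[-1], k) >= cross(stk[-2], stk[-1]):
            stk.pop()  # stk[-1] is nowhere optimal once line k is added
        stk.append(k)

    models = stk[::-1]  # most complex model first (selected for small lam)
    breaks = [cross(models[i + 1], models[i]) for i in range(len(models) - 1)]
    return models, breaks
```

For example, with losses `[10, 9.9, 1]`, model 1 is never selected for any λ ≥ 0 (it is popped from the envelope), so the path contains only models 2 and 0 with a single breakpoint at λ = 4.5.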
Original language: English (US)
Pages (from-to): 313-323
Number of pages: 11
Journal: Journal of Computational and Graphical Statistics
Volume: 31
Issue number: 2
DOIs:
State: Published - 2022
Keywords
 Binary segmentation
 Changepoint detection
 Dynamic programming
 Model selection
ASJC Scopus subject areas
 Statistics and Probability
 Statistics, Probability and Uncertainty
 Discrete Mathematics and Combinatorics
Datasets
Linear time dynamic programming for computing breakpoints in the regularization path of models selected from a finite set
Vargovich, J. (Creator) & Hocking, T. (Creator), Taylor & Francis, 2021
DOI: 10.6084/m9.figshare.17014513.v1, https://tandf.figshare.com/articles/dataset/Linear_time_dynamic_programming_for_computing_breakpoints_in_the_regularization_path_of_models_selected_from_a_finite_set/17014513/1