STATISTICS SEMINAR :: Feature Selection through Lasso: Blasso algorithm and model selection consistency

Event Details
  • Date/Time:
    • Tuesday October 17, 2006
11:00 am - 12:00 pm
  • Location: Executive Classroom, Main Building
Contact
Barbara Christopher
Industrial and Systems Engineering
404.385.3102
Advances in information technology are making data collection possible in most, if not all, fields of science and engineering, and beyond. Often data reduction or feature selection is the first step toward solving the resulting large-scale problems. However, data reduction through model selection, i.e. l_0-constrained least-squares (LS) optimization, leads to a combinatorial search that is computationally infeasible for massive data problems. A computationally efficient alternative is l_1-constrained LS optimization, known as Lasso.
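The l_1-penalized LS problem can be solved coordinate-wise by soft-thresholding, which is what makes Lasso a computationally feasible alternative to l_0 search. The sketch below (function names and synthetic data are illustrative, not from the talk) shows how the l_1 penalty sets irrelevant coefficients exactly to zero, i.e. performs feature selection:

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator: shrink z toward zero by t."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Minimize (1/2n)||y - Xb||^2 + lam*||b||_1 by coordinate descent
    (an illustrative solver, not the method discussed in the talk)."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n       # per-feature curvature x_j'x_j / n
    for _ in range(n_iter):
        for j in range(p):
            # partial residual with feature j's contribution removed
            r = y - X @ b + X[:, j] * b[j]
            rho = X[:, j] @ r / n
            b[j] = soft_threshold(rho, lam) / col_sq[j]
    return b
```

With a sufficiently large penalty `lam`, coefficients of noise features are thresholded to exactly zero while strong signals survive (shrunk by roughly `lam`), which is the feature-selection behavior the abstract refers to.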

In this talk, we first study the model selection property of Lasso in linear regression models. We show that an Irrepresentable Condition on the design matrix is almost necessary and sufficient for the model selection consistency of Lasso, in both the fixed-p and p >> n cases, provided that the true model is sparse. Moreover, we describe the Boosted Lasso (BLasso) algorithm, which produces an approximation to the complete regularization path of Lasso. BLasso consists of both a forward step and a backward step. The forward step is similar to Boosting and Forward Stagewise Fitting, but the backward step is new and is crucial for BLasso to approximate the Lasso path in all situations. For cases with a finite number of base learners, the BLasso path is shown to converge to the Lasso path as the step size goes to zero. Finally, the BLasso algorithm is extended to give an approximate path for the general case of a convex loss function plus a convex penalty.
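The forward/backward interplay described above can be sketched for squared loss as follows. This is a simplified illustration of the published BLasso idea (not the speakers' code): a forward step grows one coefficient by a small step size eps, a backward step shrinks an active coefficient whenever doing so lowers the lambda-penalized loss, and lambda is relaxed as the fit improves, tracing an approximate regularization path:

```python
import numpy as np

def blasso(X, y, eps=0.01, n_steps=500, xi=1e-10):
    """Illustrative Boosted Lasso sketch for squared loss.  Forward steps
    move one coefficient by eps; a backward step is taken when shrinking an
    active coefficient lowers the penalized loss by more than tolerance xi."""
    n, p = X.shape
    loss = lambda b: 0.5 * np.sum((y - X @ b) ** 2) / n

    b = np.zeros(p)
    # Initial forward step: best single coordinate move of size eps.
    l0 = loss(b)
    cand = []
    for j in range(p):
        for s in (-1.0, 1.0):
            bb = b.copy(); bb[j] += s * eps
            cand.append((loss(bb), j, s))
    l_new, j0, s0 = min(cand)
    lam = (l0 - l_new) / eps          # initial regularization level
    b[j0] = s0 * eps
    path = [b.copy()]

    for _ in range(n_steps):
        if lam <= 0:
            break                     # reached the unregularized end of the path
        cur = loss(b)
        moved = False
        active = np.nonzero(b)[0]
        if active.size:
            # Backward step: try shrinking an active coordinate toward zero.
            back = []
            for j in active:
                bb = b.copy(); bb[j] -= np.sign(b[j]) * eps
                back.append((loss(bb), j))
            lb, jb = min(back)
            if lb - cur < lam * eps - xi:
                b[jb] -= np.sign(b[jb]) * eps
                moved = True
        if not moved:
            # Forward step: best coordinate move of size eps, then relax lambda.
            fwd = []
            for j in range(p):
                for s in (-1.0, 1.0):
                    bb = b.copy(); bb[j] += s * eps
                    fwd.append((loss(bb), j, s))
            lf, jf, sf = min(fwd)
            lam = min(lam, (cur - lf) / eps)
            b[jf] += sf * eps
        path.append(b.copy())
    return b, np.array(path)
```

The backward step is the point the abstract stresses: a pure forward (boosting-style) procedure can never undo an early move, whereas the backward step lets the path shed coordinates and thereby track the exact Lasso path as eps shrinks.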

Additional Information

In Campus Calendar
No
Groups

H. Milton Stewart School of Industrial and Systems Engineering (ISYE)

Invited Audience
No audiences were selected.
Categories
Seminar/Lecture/Colloquium
Keywords
No keywords were submitted.
Status
  • Created By: Barbara Christopher
  • Workflow Status: Published
  • Created On: Oct 8, 2010 - 7:33am
  • Last Updated: Oct 7, 2016 - 9:52pm