Sep 21, 2010 · We describe a fast method to eliminate features (variables) in l1-penalized least-squares regression (or LASSO) problems.
Our method immediately extends the scope of existing algorithms, allowing us to run them on data sets of sizes that were previously out of their reach.
Sep 21, 2010 · Our framework applies to a large class of problems, including support vector machine classification, logistic regression and least-squares. The ...
This work investigates fast methods for quickly eliminating variables (features) in supervised learning problems involving a convex loss function ...
Safe feature elimination in sparse supervised learning. Laurent El Ghaoui, Vivian Viallon and Tarek Rabbani. Key words, Mathematics Subject Classification.
Motivated by the row sparsity of the optimal solution, an improved safe feature elimination rule termed IEDPP is proposed to accelerate the training process. It ...
Sep 21, 2010 · The method extends the scope of existing LASSO algorithms to treat larger data sets, previously out of their reach, and can be extended to ...
Laurent El Ghaoui, Vivian Viallon, and Tarek Rabbani. Safe feature elimination for the lasso and sparse supervised learning problems. Pacific Journal of ...
Safe Feature Elimination in Sparse Learning. Laurent El Ghaoui. We describe methods that quickly eliminate variables (features) in supervised learning ...
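As a rough illustration of the kind of screening test these results describe, below is a minimal Python sketch of the basic SAFE rule for the LASSO problem min_w ||Xw - y||_2^2 + lam*||w||_1. The function name, data sizes, and the chosen penalty level are purely illustrative, and the exact test derived in the paper may differ.

import numpy as np

def safe_screen_lasso(X, y, lam):
    # Basic SAFE-style test (illustrative sketch): feature j can be discarded when
    #   |x_j^T y| < lam - ||x_j||_2 * ||y||_2 * (lam_max - lam) / lam_max,
    # where lam_max = max_k |x_k^T y| is the smallest penalty for which the
    # LASSO solution is identically zero. Returns a boolean mask of features
    # that may still be active and therefore must be kept before solving.
    corr = X.T @ y
    lam_max = np.max(np.abs(corr))
    col_norms = np.linalg.norm(X, axis=0)
    threshold = lam - col_norms * np.linalg.norm(y) * (lam_max - lam) / lam_max
    return np.abs(corr) >= threshold

# Usage on synthetic data (hypothetical sizes): screen features before solving the LASSO.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 500))
y = rng.standard_normal(100)
keep = safe_screen_lasso(X, y, lam=0.8 * np.max(np.abs(X.T @ y)))
print(f"kept {keep.sum()} of {keep.size} features")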