- research-article, December 2024
Enhancing classification with hybrid feature selection: A multi-objective genetic algorithm for high-dimensional data
Expert Systems with Applications: An International Journal (EXWA), Volume 255, Issue PA
https://doi.org/10.1016/j.eswa.2024.124518
Abstract: Feature selection is a fundamental step in machine learning, serving to reduce dataset redundancy, accelerate training speed, and improve model quality. This is particularly crucial in high-dimensional datasets, where the excess of features ...
Highlights:
- Hybrid method optimizes feature selection with multi-objective genetic algorithm.
- Achieves improved classification metrics with optimized and smaller feature sets.
- Combines multiple feature selection mechanisms for superior ...
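The entry above follows a familiar pattern: evolve feature-subset masks under two competing objectives (classification error and subset size) and keep the Pareto-optimal trade-offs. As a rough illustration of that pattern only, not the paper's hybrid method, here is a minimal toy GA; `toy_error`, the population sizes, and the genetic operators are all invented for the sketch:

```python
import random

random.seed(0)

def pareto_front(pop, scores):
    """Keep individuals whose (error, n_features) pair is not dominated."""
    front = []
    for i, si in enumerate(scores):
        dominated = any(
            sj[0] <= si[0] and sj[1] <= si[1] and sj != si
            for j, sj in enumerate(scores) if j != i
        )
        if not dominated:
            front.append(pop[i])
    return front

def ga_feature_select(fitness, n_feats, pop_size=20, gens=30):
    """Tiny multi-objective GA: jointly minimise (error, subset size)."""
    pop = [[random.random() < 0.5 for _ in range(n_feats)]
           for _ in range(pop_size)]
    for _ in range(gens):
        scores = [(fitness(m), sum(m)) for m in pop]
        elite = pareto_front(pop, scores) or pop
        children = []
        while len(children) < pop_size:
            a, b = (random.sample(elite, 2) if len(elite) > 1
                    else (elite[0], elite[0]))
            cut = random.randrange(1, n_feats)   # one-point crossover
            child = a[:cut] + b[cut:]
            i = random.randrange(n_feats)        # bit-flip mutation
            child[i] = not child[i]
            children.append(child)
        pop = children
    scores = [(fitness(m), sum(m)) for m in pop]
    return pareto_front(pop, scores)

# Toy fitness: only features 0-2 matter; "error" drops as they are included.
def toy_error(mask):
    return 3 - sum(mask[:3])

front = ga_feature_select(toy_error, n_feats=10)
best = min(front, key=lambda m: (toy_error(m), sum(m)))
print(toy_error(best), sum(best))
```

The returned front holds the error-versus-size trade-offs; a practitioner would pick one point from it according to the cost of extra features.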
- research-article, November 2024
Solving high-dimensional parametric engineering problems for inviscid flow around airfoils based on physics-informed neural networks
Journal of Computational Physics (JOCP), Volume 516, Issue C
https://doi.org/10.1016/j.jcp.2024.113285
Highlights:
- Solving a high-dimensional parametric engineering problem for inviscid airfoil flow.
- Simultaneous solutions across a broad spectrum of flow conditions and shapes.
- Introducing pretraining-finetuning for rapid task-specific accuracy ...
Abstract: Engineering problems often involve solving partial differential equations (PDEs) over a range of similar problem setups, leading to high computational costs when using traditional numerical approaches to solve each setup individually. Recently ...
- research-article, November 2024
Hierarchical learning multi-objective firefly algorithm for high-dimensional feature selection
Applied Soft Computing (APSC), Volume 165, Issue C
https://doi.org/10.1016/j.asoc.2024.112042
Abstract: Feature selection is a crucial data preprocessing technique extensively employed in machine learning and image processing. However, feature selection encounters significant challenges when addressing high-dimensional data due to the huge and ...
Highlights:
- HMOFA is proposed to solve high-dimensional feature selection tasks.
- A clustering initialization method is introduced to reduce redundant features and improve the quality of the initial population.
- The population updates its position ...
- research-article, October 2024
Feature selection using a classification error impurity algorithm and an adaptive genetic algorithm improved with an external repository
Knowledge-Based Systems (KNBS), Volume 301, Issue C
https://doi.org/10.1016/j.knosys.2024.112345
Abstract: Feature selection in small-sample high-dimensional datasets enhances classification accuracy and reduces computational time for model training. This paper introduces the filter Classification Error Impurity (CEI) as a frequency-based ranker that ...
Highlights:
- A Classification Error Impurity (CEI) algorithm is proposed as a frequency-based filter ranker.
- An Adaptive Genetic Algorithm with an External Repository (AGAwER) is proposed as a wrapper method to augment the exploration of GA.
- It ...
- research-article, October 2024
A fast dual-module hybrid high-dimensional feature selection algorithm
Information Sciences: an International Journal (ISCI), Volume 681, Issue C
https://doi.org/10.1016/j.ins.2024.121185
Abstract: When dealing with large-scale datasets, high-dimensional feature selection plays a crucial role in improving the performance and interpretability of machine learning models. However, it still faces the problems of the “dimensionality curse” and ...
- research-article, October 2024
Consistent skinny Gibbs in probit regression
Computational Statistics & Data Analysis (CSDA), Volume 198, Issue C
https://doi.org/10.1016/j.csda.2024.107993
Abstract: Spike and slab priors have emerged as effective and computationally scalable tools for Bayesian variable selection in high-dimensional linear regression. However, the crucial model selection consistency and efficient computational strategies ...
- Article, September 2024
High-Dimensional Bayesian Optimization via Random Projection of Manifold Subspaces
Machine Learning and Knowledge Discovery in Databases. Research Track and Demo Track, Pages 288–305
https://doi.org/10.1007/978-3-031-70371-3_17
Abstract: Bayesian Optimization (BO) is a popular approach to optimizing expensive-to-evaluate black-box functions. Despite the success of BO, its performance may decrease exponentially as the dimensionality increases. A common framework to tackle this ...
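The common framework the abstract above alludes to is the random-embedding idea: when a high-dimensional objective has low effective dimensionality, one can search a randomly projected low-dimensional subspace instead of the full space. A minimal sketch of the embedding alone, with plain random search standing in for the BO loop; the dimensions, box bounds, and toy objective are assumptions, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(0)

D, d = 100, 2                       # ambient and embedding dimensions (assumed)
A = rng.normal(size=(D, d))         # random projection matrix

def objective(x):
    # toy black-box with only 2 effective dims: minimum near x[0]=1, x[1]=-1
    return (x[0] - 1.0) ** 2 + (x[1] + 1.0) ** 2

def embed(z):
    # map a low-dim point into the ambient box [-2, 2]^D
    return np.clip(A @ z, -2.0, 2.0)

# stand-in for the BO loop: random search over the low-dim embedding
best_z, best_f = None, np.inf
for _ in range(2000):
    z = rng.uniform(-3, 3, size=d)
    f = objective(embed(z))
    if f < best_f:
        best_z, best_f = z, f
```

Because the search happens in `d` dimensions while the objective lives in `D`, the sample complexity no longer scales with the ambient dimension, which is the point of this family of methods.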
- research-article, September 2024
Roulette wheel-based level learning evolutionary algorithm for feature selection of high-dimensional data
Applied Soft Computing (APSC), Volume 163, Issue C
https://doi.org/10.1016/j.asoc.2024.111948
Abstract: Feature selection in high-dimensional data is a large-scale sparse and discrete optimization problem. Most evolutionary algorithms are designed to tackle continuous optimization problems. However, when dealing with high-dimensional feature ...
Highlights:
- A roulette wheel-based level learning evolutionary algorithm is introduced for high-dimensional feature selection.
- The population is divided into levels, with individuals at higher levels guiding the updates of individuals at lower levels.
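The entry above combines two standard ingredients: roulette-wheel (fitness-proportionate) selection and level-based guidance. A minimal sketch of the selection step only, with invented level weights; this is not the paper's algorithm:

```python
import random

random.seed(1)

def roulette_pick(items, weights):
    """Roulette-wheel selection: the probability of picking an item is
    proportional to its weight (its fitness share of the wheel)."""
    total = sum(weights)
    r = random.uniform(0, total)
    acc = 0.0
    for item, w in zip(items, weights):
        acc += w
        if r <= acc:
            return item
    return items[-1]   # guard against floating-point round-off

# Example: level-based guidance. Higher levels get larger weights, so
# lower-level individuals are more often paired with strong guides.
levels = ["level-1", "level-2", "level-3"]   # level-1 = best performers
weights = [3.0, 2.0, 1.0]                    # illustrative weights
counts = {lv: 0 for lv in levels}
for _ in range(6000):
    counts[roulette_pick(levels, weights)] += 1
print(counts)
```

Over many draws the pick frequencies approach the 3:2:1 weight ratio, which is the behaviour a level-learning update relies on.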
- research-article, August 2024
A physics and data co-driven surrogate modeling method for high-dimensional rare event simulation
Journal of Computational Physics (JOCP), Volume 510, Issue C
https://doi.org/10.1016/j.jcp.2024.113069
Abstract: This paper presents a physics and data co-driven surrogate modeling method for efficient rare event simulation of civil and mechanical systems with high-dimensional input uncertainties. The method fuses interpretable low-fidelity physical models ...
Highlights:
- Physics-data-driven surrogate modeling method for estimating small probabilities.
- Fusion of physical model, error correction, active learning, and importance sampling.
- Highly efficient for rare event simulation with high-...
- research-article, May 2024
Information criteria for structured parameter selection in high-dimensional tree and graph models
Digital Signal Processing (DISP), Volume 148, Issue C
https://doi.org/10.1016/j.dsp.2024.104437
Abstract: Parameter selection in high-dimensional models is typically fine-tuned in a way that keeps the (relative) number of false positives under control. This is because otherwise the few true positives may be dominated by the many possible false ...
- article, April 2024
fabisearch: A package for change point detection in and visualization of the network structure of multivariate high-dimensional time series in R
Neurocomputing (NEUROC), Volume 578, Issue C
https://doi.org/10.1016/j.neucom.2024.127321
Abstract: In this work, we introduce the R package fabisearch, available on the Comprehensive R Archive Network (CRAN), which implements an original change point detection method for multivariate high-dimensional time series data and a new interactive, 3-...
- research-article, April 2024
Variable selection for high-dimensional incomplete data
Computational Statistics & Data Analysis (CSDA), Volume 192, Issue C
https://doi.org/10.1016/j.csda.2023.107877
Abstract: Regression analysis is often affected by high dimensionality, severe multicollinearity, and a large proportion of missing data. These problems may mask important relationships and even lead to biased conclusions. This paper proposes a novel ...
Highlights:
- Proposed MultiAIS-hrlasso, an efficient variable selection algorithm for high-dimensional incomplete data.
- MultiAIS offered better consistency under challenging missing patterns.
- The hrlasso showed accuracy in variable selection ...
- research-article, March 2024
Sparse estimation of linear model via Bayesian method
Computational Statistics (CSTAT), Volume 39, Issue 4, Pages 2011–2038
https://doi.org/10.1007/s00180-024-01474-5
Abstract: This paper considers the sparse estimation problem of regression coefficients in the linear model. Since global–local shrinkage priors do not allow the regression coefficients to be estimated as exactly zero, we propose three threshold rules ...
- research-article, March 2024
A transparent and nonlinear method for variable selection
Expert Systems with Applications: An International Journal (EXWA), Volume 237, Issue PA
https://doi.org/10.1016/j.eswa.2023.121398
Abstract: Variable selection is a procedure to obtain truly important predictors from inputs. Complex nonlinear dependencies and strong coupling pose great challenges for variable selection in high-dimensional data. Real-world applications have increased ...
Highlights:
- Proposing transparent information decoupling to group inputs into four subsets.
- Adopting a nonlinear partial correlation to identify nonmonotonic relevance.
- Proposing a model-free variable selection framework for high-dimensional ...
- research-article, January 2024
The sparse dynamic factor model: a regularised quasi-maximum likelihood approach
Statistics and Computing (KLU-STCO), Volume 34, Issue 2
https://doi.org/10.1007/s11222-023-10378-1
Abstract: The concepts of sparsity and regularised estimation have proven useful in many high-dimensional statistical applications. Dynamic factor models (DFMs) provide a parsimonious approach to modelling high-dimensional time series; however, it is ...
- research-article, January 2024
Estimation of banded time-varying precision matrix based on SCAD and group lasso
Computational Statistics & Data Analysis (CSDA), Volume 189, Issue C
https://doi.org/10.1016/j.csda.2023.107849
Abstract: A new banded time-varying precision matrix estimator is proposed for high-dimensional time series. The estimator utilizes the modified Cholesky decomposition, and the two factors in the decomposition are dynamically estimated by applying the ...
Highlights:
- A novel method for modeling high-dimensional time-varying banded precision matrices.
- The method ensures that both the Cholesky factor and the innovation variance are dynamic over time.
- Both the SCAD and group lasso penalties are used to obtain the ...
- research-article, December 2023
Ensemble LDA via the modified Cholesky decomposition
Computational Statistics & Data Analysis (CSDA), Volume 188, Issue C
https://doi.org/10.1016/j.csda.2023.107823
Abstract: A binary classification problem in high-dimensional settings is studied via ensemble learning, with each base classifier constructed from linear discriminant analysis (LDA), and these base classifiers are integrated by the ...
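Both this entry and the banded precision-matrix entry above build on the modified Cholesky decomposition of a covariance matrix: find a unit lower-triangular T and a diagonal D (the innovation variances) with T Σ Tᵀ = D. A minimal NumPy sketch, derived here from the ordinary Cholesky factor; the test matrix is arbitrary:

```python
import numpy as np

def modified_cholesky(sigma):
    """Modified Cholesky decomposition: T @ sigma @ T.T = D, with T unit
    lower-triangular and D diagonal (the innovation variances)."""
    C = np.linalg.cholesky(sigma)   # sigma = C @ C.T, C lower-triangular
    d_half = np.diag(C)
    Tinv = C / d_half               # scale columns -> unit lower-triangular
    T = np.linalg.inv(Tinv)
    D = np.diag(d_half ** 2)
    return T, D

rng = np.random.default_rng(0)
A = rng.normal(size=(4, 4))
sigma = A @ A.T + 4 * np.eye(4)     # an arbitrary well-conditioned SPD matrix
T, D = modified_cholesky(sigma)
```

The rows of T can be read as autoregressive coefficients and D as the residual variances, which is why the decomposition is convenient for regularised, dynamically varying estimates.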
- research-article, October 2023
High-dimensional local polynomial regression with variable selection and dimension reduction
Statistics and Computing (KLU-STCO), Volume 34, Issue 1
https://doi.org/10.1007/s11222-023-10308-1
Abstract: Variable selection and dimension reduction have been considered in nonparametric regression for improving the precision of estimation, via the formulation of a semiparametric multiple index model. However, most existing methods are ill-equipped to ...
- research-article, August 2023
Aligned deep neural network for integrative analysis with high-dimensional input
Journal of Biomedical Informatics (JOBI), Volume 144, Issue C
https://doi.org/10.1016/j.jbi.2023.104434
Abstract: Objective: Deep neural network (DNN) techniques have demonstrated significant advantages over regression and some other techniques. In recent studies, DNN-based analysis has been conducted on data with high-dimensional input such as omics ...