Essentially: one at a time, take one of the E features out and compute the partial fractal dimension of the dataset with respect to the remaining E-1 features. Keep the grouping whose partial fractal dimension is closest to the overall fractal dimension, i.e. drop the feature whose removal changes the fractal dimension the least. Iterate this process until only J features remain.
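The elimination loop described above can be sketched in Python. This is a minimal illustration, not the paper's exact procedure: the box-counting D2 estimator, the grid depth, and all function names are illustrative choices.

```python
import numpy as np

def fractal_dimension(X, n_scales=5):
    """Estimate the correlation fractal dimension D2 by box counting:
    slope of log(number of same-cell point pairs) versus log(cell side)."""
    X = np.asarray(X, dtype=float)
    # scale every attribute into [0, 1] so grid cells are comparable
    mins, maxs = X.min(axis=0), X.max(axis=0)
    X = (X - mins) / np.where(maxs > mins, maxs - mins, 1.0)
    logs_r, logs_s = [], []
    for level in range(1, n_scales + 1):
        r = 2.0 ** -level                          # grid cell side
        cells = np.floor(X / r).astype(int)        # cell index per axis
        _, counts = np.unique(cells, axis=0, return_counts=True)
        pairs = np.sum(counts * (counts - 1.0))    # same-cell point pairs
        logs_r.append(np.log(r))
        logs_s.append(np.log(max(pairs, 1.0)))
    slope, _ = np.polyfit(logs_r, logs_s, 1)       # D2 is the slope
    return slope

def backward_select(X, n_keep):
    """Backward elimination as described above: repeatedly drop the
    feature whose removal moves the partial fractal dimension least
    from the full dataset's fractal dimension."""
    kept = list(range(X.shape[1]))
    target = fractal_dimension(X)
    while len(kept) > n_keep:
        gaps = {j: abs(fractal_dimension(X[:, [k for k in kept if k != j]]) - target)
                for j in kept}
        kept.remove(min(gaps, key=gaps.get))
    return kept

# Toy run (illustrative data): three independent uniform attributes plus a
# duplicate of the first -- the duplicate adds nothing to the intrinsic
# dimension, so one of the two copies should be eliminated first.
rng = np.random.default_rng(0)
Z = rng.random((20000, 3))
X = np.column_stack([Z, Z[:, 0]])
kept = backward_select(X, n_keep=3)
```

A duplicated attribute leaves the partial fractal dimension unchanged when removed, while removing an independent attribute lowers it by roughly one, which is exactly the signal the selection criterion exploits.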
May 27, 2010 · In this paper we present a fast, scalable algorithm to quickly select the most important attributes (dimensions) for a given set of n-dimensional vectors.
Section 4 presents the fractal dimension algorithm developed as well as the datasets used in the experiments. Section 5 gives the proposed method for attribute selection.
The idea is to use the 'fractal' dimension of a dataset as a good approximation of its intrinsic dimension, and to drop attributes that do not affect it.
May 27, 2010 · It is shown that the Fractal Theory is indeed helpful to a large spectrum of activities required to manage large amounts of data.
Jan 9, 2024 · This paper proposes a feature selection model based on the fractal concept to improve the performance of intelligent systems in classifying high-dimensional data.
In this paper we propose an algorithm, based on a "tug-of-war" idea, which computes the fractal dimension in a single pass over the dataset using only constant memory.
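As a rough illustration of that idea (a simplified re-creation, not the paper's algorithm): per grid scale, estimate the second frequency moment of cell occupancies with random ±1 projections (the classic AMS "tug-of-war" estimator), then fit the log-log slope. The multiply-shift sign hash below is a convenience stand-in for the 4-wise independent hashes the analysis assumes, and the per-scale work is shown vectorized over a batch rather than as a literal streaming loop.

```python
import numpy as np

def tug_of_war_d2(points, n_scales=5, n_estimators=32, seed=0):
    """Sketch-based D2 estimate: for each grid scale, approximate the sum of
    squared box occupancies (F2) via tug-of-war sums, subtract the diagonal
    term, and fit log(pairs) against log(cell side).
    Assumes `points` lie in [0, 1)^d with at most 4 dimensions."""
    points = np.asarray(points, dtype=float)
    n = len(points)
    rng = np.random.default_rng(seed)
    # one random odd multiplier per (scale, estimator); the top bit of
    # multiplier * cell_id serves as a cheap +-1 hash (an assumption --
    # the analysis calls for 4-wise independent sign hashes)
    mult = rng.integers(1, 2**63, size=(n_scales, n_estimators),
                        dtype=np.uint64) | np.uint64(1)
    logs_r, logs_s = [], []
    for level in range(1, n_scales + 1):
        r = 2.0 ** -level
        cells = np.floor(points / r).astype(np.uint64)
        # pack the per-axis cell indices into a single 64-bit id
        ids = np.zeros(n, dtype=np.uint64)
        for axis in range(points.shape[1]):
            ids = (ids << np.uint64(16)) ^ cells[:, axis]
        bits = (ids[:, None] * mult[level - 1]) >> np.uint64(63)   # (n, m)
        sums = np.where(bits == 1, 1.0, -1.0).sum(axis=0)          # tug-of-war sums
        f2 = np.mean(sums ** 2)              # E[sum^2] = sum of counts^2
        pairs = max(f2 - n, 1.0)             # drop the diagonal (counts of 1)
        logs_r.append(np.log(r))
        logs_s.append(np.log(pairs))
    slope, _ = np.polyfit(logs_r, logs_s, 1)
    return slope

# illustrative call: uniform points in the unit square have D2 close to 2
demo = tug_of_war_d2(np.random.default_rng(1).random((4000, 2)))
```

Only the per-estimator running sums need to be kept per scale, which is where the constant-memory, single-pass character of the approach comes from.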
Based on fractal dimension computation, we propose a new feature selection algorithm that identifies how many attributes are sufficient for representing the dataset.
Nov 26, 2015 · These articles propose techniques for choosing the Kmax value; you can try their approaches on your problem.
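Kmax most commonly appears as the scale parameter of Higuchi's fractal-dimension estimator for time series; assuming that is the Kmax in question, a minimal sketch of the standard method shows where the parameter enters (all names here are illustrative, and this is not a tuned implementation):

```python
import numpy as np

def higuchi_fd(x, kmax):
    """Higuchi fractal dimension of a 1-D series; kmax is the largest
    time stride considered, i.e. the parameter whose choice is at issue."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    lks = []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):                        # k shifted subseries
            idx = np.arange(m, N, k)
            diffs = np.abs(np.diff(x[idx]))
            # normalization factor from Higuchi's definition
            norm = (N - 1) / ((len(idx) - 1) * k) if len(idx) > 1 else 0.0
            lengths.append(diffs.sum() * norm / k)
        lks.append(np.mean(lengths))              # mean curve length L(k)
    # FD is the slope of log L(k) against log(1/k)
    slope, _ = np.polyfit(np.log(1.0 / np.arange(1, kmax + 1)), np.log(lks), 1)
    return slope

# illustrative calls: a smooth curve should come out near FD = 1,
# white noise near FD = 2
smooth = higuchi_fd(np.sin(np.linspace(0, 10 * np.pi, 2000)), kmax=10)
noisy = higuchi_fd(np.random.default_rng(2).standard_normal(2000), kmax=10)
```

A too-small Kmax leaves few points for the log-log fit, while a too-large one pushes k past the scales where L(k) follows a power law, which is why its choice gets discussed at all.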