An Adaptive Multi-levels Sequential Feature Selection
Keywords:
Classification accuracy, Data mining, Dimensionality reduction, Feature selection, Sequential search, Supervised learning

Abstract
Dealing with large amounts of data is a major challenge in data mining and machine learning. Feature selection is a significant preprocessing step that selects the most informative features by removing irrelevant and redundant ones, which is especially important for large datasets. The selected features play an important role in information searching and enhance the performance of machine learning tasks such as classification and prediction. Several strategies have been proposed over the past few decades. In this study, we propose a new technique called Multi-levels Forward Inclusion (MLFI). The proposed algorithm consists of two parts. The first part searches for the maximum classification accuracy by applying a multi-level forward-searching technique. The second part improves on that result by replacing a weak feature. Hence, the idea is to apply an adaptive multi-level forward search together with a replacement step during feature addition, without any backtracking search. To keep execution time low, we limit the depth of the forward search by introducing an adaptive variable called the generalization limit. We have tested our algorithm on eight UCI datasets and compared its accuracy with standard methods. MLFI outperforms the other sequential forward floating techniques on the majority of the tested datasets.
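The two-part structure described above (forward inclusion followed by replacement of a weak feature) can be illustrated with a minimal sketch. This is not the authors' MLFI implementation — it omits the multi-level search depth and the adaptive generalization limit — and the `score` criterion, `max_size` bound, and feature labels are hypothetical placeholders; in practice the score would be a classifier's cross-validated accuracy on the candidate subset.

```python
def sfs_with_replacement(features, score, max_size):
    """Greedy forward inclusion, then a weak-feature replacement pass.

    features : iterable of feature identifiers
    score    : callable mapping a feature subset to a quality value
               (e.g. classification accuracy); placeholder here
    max_size : cap on the number of selected features
    """
    selected, remaining = [], list(features)
    best = score(selected)

    # Part 1: forward inclusion -- repeatedly add the single feature
    # that most improves the score, stopping when nothing improves.
    while remaining and len(selected) < max_size:
        s, f = max((score(selected + [f]), f) for f in remaining)
        if s <= best:
            break
        selected.append(f)
        remaining.remove(f)
        best = s

    # Part 2: replacement -- try swapping each selected ("weak")
    # feature for an unselected one; keep any swap that improves
    # the score, with no backtracking over subset sizes.
    improved = True
    while improved:
        improved = False
        for i in range(len(selected)):
            for f in remaining:
                trial = selected[:i] + [f] + selected[i + 1:]
                s = score(trial)
                if s > best:
                    remaining.append(selected[i])
                    selected[i] = f
                    remaining.remove(f)
                    best = s
                    improved = True
                    break
            if improved:
                break
    return selected, best


# Toy usage with a synthetic criterion: features 0, 2, 5 are "relevant",
# and each extra feature carries a small penalty.
relevant = {0, 2, 5}
def toy_score(subset):
    return len(set(subset) & relevant) - 0.1 * len(subset)

sel, s = sfs_with_replacement(range(6), toy_score, max_size=3)
```

The replacement pass is what distinguishes this family of methods from plain sequential forward selection: a feature that looked good early on can later be swapped out once better companions have been added, without the cost of a full backtracking (floating) search.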
License
Copyright (c) 2023 International Journal of Computer Information Systems and Industrial Management Applications

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.