Multiobjective optimization solution for the selection of Quasi equally informative subsets in classification models
DOI: https://doi.org/10.4314/jobasr.v3i3.33

Keywords: Quasi-Equally Informative Subsets, Extreme Learning Machine (ELM), Multi-objective Optimization, Pareto Efficiency, Machine Learning

Abstract
Feature selection is crucial in machine learning, particularly for high-dimensional data. This study presents two multi-objective techniques, Improved Wrapper QEISS (IW-QEISS) and Improved Filter QEISS (IF-QEISS), designed to identify multiple quasi-equally informative feature subsets. Unlike traditional methods, which focus solely on accuracy and subset size, the proposed approach also targets robustness and interpretability. Using a four-objective NSGA-II framework with a population of 100 evolved over 100 generations, we optimize accuracy, redundancy, and feature importance (importance threshold = 0.05). In experiments on the Heart dataset, IW-QEISS identified seven subsets of cardinality four, achieving 0.836 accuracy, on par with W-MOSS; IF-QEISS offered similar accuracy at reduced computational cost. These results validate the efficiency and effectiveness of the proposed methods.
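The selection principle the abstract describes, keeping every Pareto-efficient feature subset whose accuracy is quasi-equal to the best one found, can be sketched in a simplified two-objective form (classification error vs. subset cardinality). This is an illustrative sketch only, not the authors' four-objective NSGA-II implementation: the subset error values and the `delta` tolerance below are made-up assumptions.

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Indices of the non-dominated objective vectors (minimization)."""
    return [i for i, p in enumerate(points)
            if not any(dominates(q, p) for j, q in enumerate(points) if j != i)]

# Toy candidate feature subsets with hypothetical classification errors
# (error = 1 - accuracy); in the paper these would come from an ELM wrapper.
subsets = {
    frozenset({0, 1, 2, 3}):    0.164,
    frozenset({0, 2, 4, 6}):    0.170,
    frozenset({1, 3}):          0.230,
    frozenset({0, 1, 2, 3, 4}): 0.162,
}
keys = list(subsets)
objs = [(err, len(s)) for s, err in subsets.items()]  # (error, cardinality)
front = [keys[i] for i in pareto_front(objs)]

# Quasi-equally informative subsets: Pareto-efficient subsets whose error
# is within an assumed tolerance delta of the best error on the front.
delta = 0.01
best = min(subsets[s] for s in front)
quasi = [s for s in front if subsets[s] - best <= delta]
```

Here the front retains both a small, slightly less accurate subset and larger, more accurate ones; the `quasi` filter then keeps only the near-best performers, which is the set of alternative subsets the QEISS methods expose to the user.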
License

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.