Preprint Article, Version 1 (not peer-reviewed)

A Closest Resemblance Classifier with Feature Interval Learning and Outranking Measures for Improved Performance

Version 1 : Received: 15 July 2024 / Approved: 16 July 2024 / Online: 16 July 2024 (10:13:11 CEST)

How to cite: Belacel, N. A Closest Resemblance Classifier with Feature Interval Learning and Outranking Measures for Improved Performance. Preprints 2024, 2024071257. https://doi.org/10.20944/preprints202407.1257.v1

Abstract

Classifiers face a myriad of challenges in today’s data-driven world, ranging from overfitting and high computational costs to low accuracy, imbalanced training datasets, and the notorious black-box effect. Furthermore, many traditional classifiers struggle to handle noisy and missing feature values robustly. In response to these hurdles, we present classification methods that leverage the power of feature partitioning learning and outranking measures. Our classification algorithms offer an innovative approach, eliminating the need for prior domain knowledge by automatically discerning feature intervals directly from the data. These intervals capture essential patterns and characteristics within the dataset, giving our classifiers newfound adaptability and insight. In addition, we employ outranking measures to mitigate the influence of noise and uncertainty in the data. Through pairwise comparisons of alternatives on each feature, we enhance the robustness and reliability of our classification outcomes. The developed classifiers are empirically evaluated on several datasets from the UCI repository and compared with well-known classifiers, including k-Nearest Neighbors (k-NN), Support Vector Machine (SVM), Random Forest (RF), Neural Network (NN), Naive Bayes (NB), and Nearest Centroid (NC). The experimental results demonstrate that the classifiers based on feature interval learning and outranking approaches are robust to imbalanced data and irrelevant features, and in some cases achieve comparable or even better performance than the well-known classifiers. Moreover, our proposed classifiers produce more explainable models whilst preserving high predictive performance levels.
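To make the two core ideas concrete, here is a minimal toy sketch, not the paper's actual algorithm: it learns a [min, max] interval per class and feature from the training data, and classifies a sample by a simple concordance score in the spirit of outranking, i.e. the fraction of features on which the sample falls inside a class's intervals. All function names and the scoring rule are illustrative assumptions.

```python
# Toy sketch (illustrative only, not the published method): per-class
# feature intervals plus a concordance-style resemblance score.

def learn_intervals(X, y):
    """Map each class label to a list of per-feature (lo, hi) intervals."""
    intervals = {}
    for label in set(y):
        rows = [x for x, lbl in zip(X, y) if lbl == label]
        # zip(*rows) iterates over feature columns of this class's samples.
        intervals[label] = [(min(col), max(col)) for col in zip(*rows)]
    return intervals

def concordance(sample, feat_intervals):
    """Fraction of features on which the sample lies inside the interval."""
    hits = sum(lo <= v <= hi for v, (lo, hi) in zip(sample, feat_intervals))
    return hits / len(sample)

def classify(sample, intervals):
    """Assign the class whose intervals the sample resembles most."""
    return max(intervals, key=lambda c: concordance(sample, intervals[c]))

# Tiny illustration with two well-separated classes in 2-D.
X = [(1.0, 2.0), (1.2, 2.2), (5.0, 6.0), (5.5, 6.3)]
y = ["a", "a", "b", "b"]
model = learn_intervals(X, y)
print(classify((1.1, 2.1), model))  # -> a
```

Because the model is just a table of per-class intervals, a prediction can be explained directly by listing which feature intervals the sample did or did not satisfy, which mirrors the explainability claim in the abstract.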

Keywords

Classification; Machine learning; Supervised learning; Feature Interval Learning; Outranking measures

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning



×
Alerts
Notify me about updates to this article or when a peer-reviewed version is published.
We use cookies on our website to ensure you get the best experience.
Read more about our cookies here.