Article
Preserved in Portico. This version is not peer-reviewed.
Multi-label feature selection based on logistic regression and manifold learning
Version 1: Received: 14 July 2021 / Approved: 15 July 2021 / Online: 15 July 2021 (08:15:00 CEST)
How to cite: Zhang, Y.; Ma, Y.; Yang, X. Multi-label feature selection based on logistic regression and manifold learning. Preprints 2021, 2021070341. https://doi.org/10.20944/preprints202107.0341.v1
Abstract
Like traditional single-label learning, multi-label learning also faces the curse of dimensionality. Feature selection is an effective technique for reducing the dimensionality of high-dimensional data and improving learning efficiency. In this paper, logistic regression, manifold learning, and sparse regularization are combined into a joint framework for multi-label feature selection (LMFS). First, the sparsity of the feature-weight matrix is constrained by the $L_{2,1}$-norm. Second, the feature manifold and the label manifold constrain the feature-weight matrix so that it better fits the data information and the label information. An iterative updating algorithm is designed, and its convergence is proved. Finally, the LMFS algorithm is compared with DRMFS, SCLS, and other algorithms on eight classical multi-label data sets; the experimental results demonstrate the effectiveness of LMFS.
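The abstract's central ingredient, the $L_{2,1}$-norm penalty on the feature-weight matrix, induces row-wise sparsity: whole rows of the matrix shrink toward zero, and the surviving row norms can be used to rank features. The sketch below is a minimal illustration of that idea only (it is not the authors' LMFS algorithm, and the toy matrix `W` is invented for the example); it assumes a weight matrix of shape features × labels.

```python
import numpy as np

def l21_norm(W):
    """L_{2,1}-norm: the sum of the L2 norms of the rows of W.
    Penalizing this quantity drives entire rows toward zero,
    so the corresponding features can be discarded."""
    return np.sum(np.sqrt(np.sum(W ** 2, axis=1)))

def rank_features(W):
    """Rank features by the L2 norm of each row of the
    feature-weight matrix W (features x labels): a larger row
    norm means the feature contributes more across all labels."""
    scores = np.sqrt(np.sum(W ** 2, axis=1))
    return np.argsort(-scores)  # indices, most important first

# Toy weight matrix: 4 features, 3 labels (hypothetical values).
W = np.array([[0.9, 0.8, 0.7],
              [0.0, 0.1, 0.0],
              [0.5, 0.4, 0.6],
              [0.0, 0.0, 0.0]])
print(rank_features(W))  # features ordered by row norm
print(l21_norm(W))
```

In a full method such as LMFS, this penalty would be minimized jointly with the logistic-regression loss and the manifold-regularization terms; here it only shows why row norms yield a feature ranking.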
Keywords
feature selection; manifold learning; multi-label learning; $L_{2,1}$-norm; logistic regression
Subject
Computer Science and Mathematics, Artificial Intelligence and Machine Learning
Copyright: This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.