Preprint Article, Version 1 (not peer-reviewed)

Review Non-convex Optimization Method for Machine Learning

Version 1 : Received: 3 October 2024 / Approved: 3 October 2024 / Online: 4 October 2024 (04:22:32 CEST)

How to cite: Popovich, P.; Fotopoulos, G.; Papadopoulos, N. Review Non-convex Optimization Method for Machine Learning. Preprints 2024, 2024100278. https://doi.org/10.20944/preprints202410.0278.v1

Abstract

Non-convex optimization is a critical tool in advancing machine learning, especially for complex models such as deep neural networks and support vector machines. Despite challenges such as multiple local minima and saddle points, non-convex techniques offer various pathways to reduce computational cost. These include promoting sparsity through regularization, efficiently escaping saddle points, and employing subsampling and approximation strategies such as stochastic gradient descent. Additionally, non-convex methods enable model pruning and compression, which reduce model size while maintaining performance. By targeting good local minima instead of exact global minima, non-convex optimization can achieve competitive accuracy with faster convergence and lower computational overhead. This paper examines the key methods and applications of non-convex optimization in machine learning, exploring how they can lower computational cost while enhancing model performance. It also outlines future research directions and challenges, including scalability and generalization, that will shape the next phase of non-convex optimization in machine learning.
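To make two of the ideas summarized above concrete, the following is a minimal Python sketch, not taken from the paper itself, of proximal stochastic gradient descent with an L1 penalty (promoting sparsity) combined with small noise injection (a common device for escaping strict saddle points). All names and parameters here (prox_sgd, soft_threshold, lr, lam, noise) are illustrative assumptions, not the authors' method.

```python
import numpy as np

def soft_threshold(w, t):
    """Proximal operator of the L1 norm: sets small entries exactly to zero,
    which is how the L1 penalty promotes sparse solutions."""
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def prox_sgd(grad_fn, w0, X, y, lr=0.01, lam=0.001, noise=1e-4,
             epochs=10, seed=0):
    """Proximal SGD sketch for a (possibly non-convex) objective.

    grad_fn(w, x_i, y_i) returns the gradient of the per-example loss.
    Subsampling one example per step cheaply approximates the full gradient,
    and the small isotropic noise added to each step helps iterates leave
    strict saddle points (in the spirit of perturbed gradient methods).
    """
    rng = np.random.default_rng(seed)
    w = w0.astype(float).copy()
    for _ in range(epochs):
        for i in rng.permutation(len(y)):
            # Stochastic gradient on one example, plus saddle-escaping noise.
            g = grad_fn(w, X[i], y[i]) + noise * rng.standard_normal(w.shape)
            # Gradient step followed by the L1 proximal (soft-threshold) step.
            w = soft_threshold(w - lr * g, lr * lam)
    return w
```

For a genuinely non-convex instance, grad_fn would return, for example, the backpropagated gradient of a small neural network's per-example loss; the gradient step, noise injection, and proximal step together illustrate the subsampling, saddle-escape, and sparsity themes the abstract describes.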

Keywords

Deep Neural Networks; Support Vector Machine; Non-convex Optimization; Gradient Methods; Machine Learning

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning
