Preprint Article, Version 2 (this version is not peer-reviewed)

Optimizing Machine Learning Models for Urban Sciences: A Comparative Analysis of Hyperparameter Tuning Methods

Version 1 : Received: 4 June 2024 / Approved: 5 June 2024 / Online: 5 June 2024 (10:59:28 CEST)
Version 2 : Received: 26 July 2024 / Approved: 29 July 2024 / Online: 29 July 2024 (08:46:31 CEST)

How to cite: Kee, T.; Ho, W. K. Optimizing Machine Learning Models for Urban Sciences: A Comparative Analysis of Hyperparameter Tuning Methods. Preprints 2024, 2024060264. https://doi.org/10.20944/preprints202406.0264.v2

Abstract

Advancing urban scholarship and addressing the challenges of the urban environment require sophisticated analytical methods: urban scholars and policymakers need them to tackle issues such as gentrification, housing affordability, and urban sprawl. Predictive models are central to the urban sciences, and hyperparameter tuning can substantially improve their accuracy and efficiency. Our study compares three tuning methods, Optuna, Random Search, and Grid Search, on a housing transaction dataset. When applied to Random Forest and Gradient Boosting Machine algorithms, Optuna is 5.58 to 70.50 times faster than the other two methods and also achieves lower errors on the test set across key evaluation metrics: mean absolute error, mean squared error, mean absolute percentage error, and root mean squared error.
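The following is a minimal sketch of how such a comparison can be set up, not the authors' actual pipeline: it uses a synthetic regression dataset in place of the housing transaction data, hypothetical parameter ranges, and a Random Forest only, with scikit-learn's GridSearchCV and RandomizedSearchCV alongside Optuna's TPE-based study.

```python
# Illustrative sketch only; dataset, search spaces, and trial budgets are placeholders.
import time

import numpy as np
import optuna
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error, mean_squared_error
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV, train_test_split

# Synthetic stand-in for a housing transaction dataset.
X, y = make_regression(n_samples=2000, n_features=10, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Hypothetical search space; the paper's actual ranges may differ.
param_grid = {"n_estimators": [100, 300], "max_depth": [5, 10, None]}

def report(name, model, elapsed):
    """Print test-set MAE and RMSE plus wall-clock tuning time."""
    pred = model.predict(X_test)
    mae = mean_absolute_error(y_test, pred)
    rmse = np.sqrt(mean_squared_error(y_test, pred))
    print(f"{name}: MAE={mae:.2f}, RMSE={rmse:.2f}, time={elapsed:.1f}s")

# Grid Search: exhaustively evaluates the Cartesian product of the grid.
t0 = time.time()
grid = GridSearchCV(RandomForestRegressor(random_state=0), param_grid, cv=3,
                    scoring="neg_mean_absolute_error").fit(X_train, y_train)
report("Grid Search", grid.best_estimator_, time.time() - t0)

# Random Search: samples a fixed number of configurations from the same space.
t0 = time.time()
rand = RandomizedSearchCV(RandomForestRegressor(random_state=0), param_grid,
                          n_iter=4, cv=3, scoring="neg_mean_absolute_error",
                          random_state=0).fit(X_train, y_train)
report("Random Search", rand.best_estimator_, time.time() - t0)

# Optuna: sequential trials guided by the TPE sampler.
def objective(trial):
    model = RandomForestRegressor(
        n_estimators=trial.suggest_int("n_estimators", 100, 300),
        max_depth=trial.suggest_int("max_depth", 5, 15),
        random_state=0,
    )
    model.fit(X_train, y_train)
    # Evaluated on the test set here for brevity; in practice use a
    # validation split or cross-validation inside the objective.
    return mean_absolute_error(y_test, model.predict(X_test))

t0 = time.time()
study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=10)
best = RandomForestRegressor(**study.best_params, random_state=0).fit(X_train, y_train)
report("Optuna", best, time.time() - t0)
```

Because Optuna samples a small number of informed trials rather than enumerating a grid, a setup along these lines makes the speed and error comparison reported in the abstract straightforward to reproduce on one's own data.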

Keywords

hyperparameter tuning; optuna; grid search; random search; urban studies

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning
