Article
Version 1
Preserved in Portico. This version is not peer-reviewed.
Resource Demand Forecasting Model Based on Dynamic Cloud Workload
Received: 9 October 2017 / Approved: 9 October 2017 / Online: 9 October 2017 (12:40:34 CEST)
How to cite: An, C.; Zhou, J. Resource Demand Forecasting Model Based on Dynamic Cloud Workload. Preprints 2017, 2017100051. https://doi.org/10.20944/preprints201710.0051.v1
Abstract
The primary attraction of IaaS is the provision of elastic resources on demand. It is therefore imperative that IaaS users have an effective methodology for determining which resources they require, in what quantity, and for how long. However, the heterogeneity of resources, the diversity of resource demands across cloud applications, and the variability of application-user behavior pose significant challenges to IaaS users. In this paper, we propose a unified resource demand forecasting model suited to different applications, various resources, and diverse time-varying workload patterns. With the model, taking parameterized applications, resources, and workload scenarios as input, the corresponding resource demands during any time interval can be deduced as output. The experiments configure concrete functions and parameters to aid understanding of the model.
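The abstract's core idea, deducing resource demand over a time interval from a parameterized, time-varying workload, can be sketched as follows. This is a minimal illustrative sketch only: the workload function, its parameters, and the per-request resource cost below are assumptions for demonstration and are not taken from the paper.

```python
import math

def workload(t):
    """Assumed time-varying request rate (requests/second) at time t.

    Illustrative example only: a base load plus a sinusoidal daily cycle.
    """
    return 100 + 50 * math.sin(2 * math.pi * t / 86400)

def demand(resource_per_request, t_start, t_end, step=60):
    """Approximate resource demand over [t_start, t_end] by summing
    workload(t) * resource_per_request over small time steps
    (a left-endpoint Riemann sum of the demand integral)."""
    total = 0.0
    t = t_start
    while t < t_end:
        total += workload(t) * resource_per_request * step
        t += step
    return total

# Example: CPU-seconds demanded during the first hour, assuming
# (hypothetically) that each request costs 0.02 CPU-seconds.
cpu_demand = demand(0.02, 0, 3600)
```

In this sketch, swapping in a different `workload` function or `resource_per_request` value corresponds to the paper's idea of parameterizing workload scenarios and resource types while keeping the forecasting procedure itself unchanged.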
Keywords
cloud computing; workload model; workload-aware resource forecasting model
Subject
Computer Science and Mathematics, Computational Mathematics
Copyright: This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.