Indexed in:
Abstract:
Services provided by cloud/edge computing systems have become an essential part of Internet services. Despite their numerous benefits, cloud/edge providers face challenging issues, e.g., accurately predicting large-scale workload and resource usage traces. Due to the complexity of cloud computing environments, these traces are highly variable, which makes them difficult for traditional models to predict accurately: such models fail to capture nonlinear characteristics and long-term memory dependencies. To solve this problem, this work proposes an integrated prediction method that combines Bi-directional and Grid Long Short-Term Memory network (BG-LSTM) models to predict workload and resource usage traces. In this method, the traces are first smoothed by a Savitzky-Golay filter to eliminate extreme points and noise interference. Then, an integrated prediction model is established to achieve accurate prediction of highly variable traces. Using real-world workload and resource usage traces from Google cloud data centers, we have conducted extensive experiments to show the effectiveness and adaptability of BG-LSTM for different traces. The results demonstrate that BG-LSTM achieves better prediction performance than several typical prediction methods for highly variable real-world cloud systems.
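The pipeline described in the abstract (Savitzky-Golay smoothing followed by LSTM-based prediction) can be sketched as below. This is a minimal illustration, not the authors' implementation: the synthetic trace, window length, polynomial order, and network size are all assumptions, and only the bidirectional branch of BG-LSTM is shown, since the Grid LSTM branch would require a custom layer.

import numpy as np
from scipy.signal import savgol_filter
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Bidirectional, LSTM, Dense

# Hypothetical highly-variable trace (random stand-in for a cloud CPU-usage series).
trace = np.random.rand(2000).astype("float32")

# Step 1: Savitzky-Golay smoothing to suppress extreme points and noise.
# Window length and polynomial order are assumed values, not taken from the paper.
smoothed = savgol_filter(trace, window_length=11, polyorder=3)

# Step 2: sliding-window samples; predict the next value from the previous `lookback` values.
lookback = 30
X = np.stack([smoothed[i:i + lookback] for i in range(len(smoothed) - lookback)])
y = smoothed[lookback:]
X = X[..., np.newaxis]  # (samples, timesteps, features)

# Step 3: a bidirectional LSTM predictor standing in for the Bi-LSTM part of BG-LSTM.
model = Sequential([
    Input(shape=(lookback, 1)),
    Bidirectional(LSTM(64)),
    Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=64, verbose=0)

# One-step-ahead forecast from the most recent window.
next_point = model.predict(smoothed[-lookback:].reshape(1, lookback, 1))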
Keywords:
Corresponding author:
Email address:
Source:
2020 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC)
ISSN: 1062-922X
Year: 2020
Pages: 1206-1211
Language: English
Affiliated department: