Indexed in:
Abstract:
With the arrival of the 5G era, a new service paradigm known as mobile-edge computing (MEC) has been introduced to provide high-quality mobile services by offloading delay-sensitive and computation-intensive tasks from mobile devices to nearby MEC servers. In this paper, we investigate the problem of delay-sensitive task scheduling and resource (e.g., CPU, memory) management on the server side in a multi-user MEC scenario, and propose a new online algorithm based on deep reinforcement learning (DRL) to reduce the average slowdown and average timeout period of tasks in the queue. We also design a new reward function that guides the algorithm to learn directly from experience how to schedule tasks and manage resources. Simulation results show that our algorithm outperforms multiple traditional algorithms and adapts intelligently to the workload and environment. © 2019 Association for Computing Machinery.
Keywords:
Corresponding author:
Email address: