Cost-Effective Task Offloading Scheduling for Hybrid Mobile Edge-Quantum Computing
In this paper, we address the challenge of hybrid mobile edge-quantum computing (MEQC) for sustainable task offloading scheduling in mobile networks. We develop cost-effective designs for both task offloading mode selection and resource allocation, subject to individual latency constraints for mobile devices while satisfying the required success ratio of their computation tasks. This yields a time-coupled offloading scheduling optimization problem that demands a computationally affordable yet effective solution. To this end, we propose a deep reinforcement learning (DRL)-based Lyapunov approach. Specifically, we reformulate the original time-coupled problem into a mixed-integer optimization problem by adding to the objective a penalty term built from virtual queues that capture the time-coupled constraints. A Deep Q-Network (DQN) is then adopted for task offloading mode selection, and a Deep Deterministic Policy Gradient (DDPG)-based algorithm is designed for partial-task offloading decision-making. Finally, extensive experiments in a realistic network setting demonstrate that the proposed approach is significantly more cost-effective and sustainable than existing methods.
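To make the pipeline concrete, the sketch below illustrates one decision step of the kind of architecture the abstract describes: a Lyapunov virtual-queue update with a drift-plus-penalty per-slot objective, a DQN head for discrete offloading mode selection, and a DDPG-style actor for the continuous partial-offloading ratio. This is a minimal illustrative sketch, not the paper's exact design; the state and action dimensions, network sizes, cost and violation quantities, and the trade-off parameter V are all assumptions.

```python
# Hypothetical sketch: hybrid DQN (discrete offloading mode) + DDPG actor
# (continuous partial-offloading ratio) with a Lyapunov virtual queue.
# Dimensions, network sizes, and the cost/violation values are illustrative.
import torch
import torch.nn as nn

STATE_DIM, N_MODES = 8, 3   # assumed: e.g., local / edge / quantum-edge modes

class QNet(nn.Module):
    """DQN: maps a state to Q-values over discrete offloading modes."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(STATE_DIM, 64), nn.ReLU(),
                                 nn.Linear(64, N_MODES))
    def forward(self, s):
        return self.net(s)

class Actor(nn.Module):
    """DDPG actor: outputs a partial-offloading ratio in [0, 1]."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(STATE_DIM, 64), nn.ReLU(),
                                 nn.Linear(64, 1), nn.Sigmoid())
    def forward(self, s):
        return self.net(s)

def virtual_queue_update(q, violation):
    """Lyapunov virtual queue: Q(t+1) = max(Q(t) + g(t), 0),
    where g(t) is the per-slot constraint violation (e.g., latency excess)."""
    return max(q + violation, 0.0)

def drift_plus_penalty(cost, q, violation, V=10.0):
    """Per-slot objective after the Lyapunov reformulation:
    V * cost + Q(t) * g(t); V trades off cost against queue stability."""
    return V * cost + q * violation

# One illustrative decision step (untrained networks, random state).
qnet, actor = QNet(), Actor()
state = torch.randn(1, STATE_DIM)          # assumed observation (queues, channel, ...)
mode = int(qnet(state).argmax(dim=1))      # DQN: greedy offloading-mode selection
ratio = float(actor(state))                # DDPG: fraction of the task to offload
q = virtual_queue_update(q=0.0, violation=0.2)
obj = drift_plus_penalty(cost=1.5, q=q, violation=0.2)
print(mode, round(ratio, 3), q, obj)
```

In a full training loop, the DQN and DDPG networks would be updated with experience replay and target networks, with the drift-plus-penalty value serving as the per-slot cost signal.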