Abstract
In Vehicular Fog Computing (VFC), the dynamic nature of vehicular networks poses substantial challenges for effective task scheduling and resource allocation. Rapidly changing vehicle mobility patterns complicate the management of both service delays for vehicular requests and the energy consumption of servers. Our research addresses these challenges by focusing on cooperative and mobility-aware task scheduling in VFC, aiming to optimize fog server performance and ensure the timely processing of vehicular tasks. We model vehicle mobility as a Markov renewal process. The task scheduling problem is formulated as a mixed-integer nonlinear programming (MINLP) problem subject to constraints on task deadlines, resource capacities, and vehicle mobility. To solve this problem, we employ a deep reinforcement learning (DRL) technique, which enables adaptive and intelligent task scheduling and resource allocation. Extensive simulations show that our approach significantly outperforms existing benchmark techniques, achieving a 12% reduction in service delay and a 14% decrease in energy consumption.