Resource Management for Stochastic Parallel Synchronous Tasks: Bandits to the Rescue

  • Anna Friebe
  • Alberto Marchetti-Spaccamela
  • Tommaso Cucinotta
  • Alessandro Vittorio Papadopoulos
  • Thomas Nolte
  • Sanjoy Baruah

Research output: Contribution to journal › Article › peer-review

Abstract

In scheduling real-time tasks, we face the challenge of meeting hard deadlines while optimizing for some other objective, such as minimizing energy consumption. Formulating the optimization as a Multi-Armed Bandit (MAB) problem allows us to use MAB strategies to balance the exploitation of good choices based on observed data with the exploration of potentially better options. In this paper, we integrate hard real-time constraints with MAB strategies for resource management of a Stochastic Parallel Synchronous Task. On a platform with M cores available for the task, m≤M cores are initially assigned. Prior work has shown how to compute a virtual deadline such that assigning all M cores to the task if it has not completed by this virtual deadline guarantees that the deadline will be met. An MAB strategy is used to select the value of m. A Dynamic Power Management (DPM) energy model considering CPU sockets and sleep states is described. Experimental evaluation shows that MAB strategies consistently learn suitable values of m and perform well compared to binary exponential search and greedy methods.
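The selection mechanism described in the abstract can be sketched as a standard bandit loop: a UCB1 strategy picks the initial core count m, the task runs (falling back to all M cores at the virtual deadline, which guarantees the hard deadline is met), and the observed energy cost is fed back as a reward. The sketch below is illustrative only, not the paper's implementation; the energy-based reward model in the usage example is entirely hypothetical.

```python
import math
import random


class UCB1CoreSelector:
    """UCB1 bandit over initial core counts m in {1, ..., M} (illustrative sketch)."""

    def __init__(self, M):
        self.M = M
        self.counts = [0] * M    # times arm i (i.e., m = i + 1) was chosen
        self.sums = [0.0] * M    # cumulative reward per arm
        self.t = 0               # total rounds played

    def select_m(self):
        self.t += 1
        # Play each arm once before applying the UCB index.
        for i in range(self.M):
            if self.counts[i] == 0:
                return i + 1
        # UCB1 index: empirical mean plus an exploration bonus.
        ucb = [
            self.sums[i] / self.counts[i]
            + math.sqrt(2.0 * math.log(self.t) / self.counts[i])
            for i in range(self.M)
        ]
        return max(range(self.M), key=lambda i: ucb[i]) + 1

    def update(self, m, reward):
        i = m - 1
        self.counts[i] += 1
        self.sums[i] += reward


random.seed(0)
sel = UCB1CoreSelector(M=8)
best_m = 3  # hypothetical: pretend m = 3 minimizes energy in this toy model
for _ in range(2000):
    m = sel.select_m()
    # Toy reward in [0, 1]: highest near best_m, with observation noise.
    # A real reward would come from the DPM energy model after the job runs.
    reward = min(1.0, max(0.0, 1.0 - 0.2 * abs(m - best_m) + random.gauss(0, 0.05)))
    sel.update(m, reward)
```

Under this toy reward model the bandit concentrates its pulls on the arm with the lowest simulated energy; the hard-deadline guarantee is orthogonal, coming from the virtual-deadline fallback rather than from the bandit itself.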

Original language: English
Pages (from-to): 359-402
Number of pages: 44
Journal: Real-Time Systems
Volume: 61
Issue number: 3-4
DOIs
State: Published - Dec 2025

Keywords

  • DAG task
  • Multi-Armed Bandit
  • Multicore scheduling
  • Parallel Synchronous Task
