

Cache-aware Task Decomposition for Efficient Intermittent Computing Systems
Description
Energy harvesting offers a scalable and cost-effective power solution for IoT devices, but it introduces the challenge of frequent and unpredictable power failures caused by the unstable ambient energy supply.
To address this, intermittent computing has been proposed, which periodically backs up the system state to non-volatile memory (NVM), enabling robust and sustainable computing even in the face of unreliable power supplies.
In modern processors, a write-back cache is extensively utilized to enhance system performance.
However, it complicates backup operations: because the cache buffers updates to memory, a backup may capture an inconsistent system state.
One solution is to adopt a write-through cache, which avoids the inconsistency issue but incurs increased memory access latency for each write reference.
Some existing work enforces a cache flush before each backup to maintain a consistent system state, resulting in significant backup overhead.
In this paper, we point out that although the cache delays updates to main memory, this very delay can preserve a recoverable system state in main memory.
Leveraging this characteristic, we propose a cache-aware task decomposition method that divides an application into multiple tasks, ensuring that no dirty cache lines are evicted during their execution.
Furthermore, the cache-aware task decomposition keeps the main-memory state unchanged during the execution of each task, enabling us to parallelize the backup process with task execution and effectively hide the backup latency.
Experimental results with different power traces demonstrate the effectiveness of the proposed system.
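To make the task-boundary commit idea concrete, below is a minimal C sketch of a task-based intermittent execution loop. It is an illustration under simplifying assumptions, not the system described in this abstract: NVM is simulated with a static structure, volatile cache contents are modeled by a scratch buffer, and the cache-aware task sizing and the parallelized backup are not modeled. All identifiers (nvm_state, scratch, commit, task_init, task_scale) are hypothetical.

/*
 * Illustrative sketch only: a task-based intermittent-computing loop.
 * "nvm_state" stands in for non-volatile memory and "scratch" for
 * volatile working data held in cache/SRAM. Each task reads only
 * committed NVM data and keeps its updates in scratch, so the NVM
 * image stays unchanged and recoverable while the task runs; results
 * are published in one commit step at the task boundary.
 */
#include <stdint.h>
#include <string.h>
#include <stdio.h>

#define N 64

/* Simulated NVM image: always holds a consistent snapshot. */
static struct {
    uint32_t progress;   /* index of the last committed task */
    int32_t  data[N];    /* application state */
} nvm_state;

/* Volatile scratch area: models dirty data kept in the cache. */
static int32_t scratch[N];

typedef void (*task_fn)(void);

/* Task 0: initialize the data set, touching only scratch. */
static void task_init(void) {
    for (int i = 0; i < N; i++) scratch[i] = i;
}

/* Task 1: scale the committed data, again touching only scratch. */
static void task_scale(void) {
    for (int i = 0; i < N; i++) scratch[i] = nvm_state.data[i] * 3;
}

/* Commit at a task boundary: copy scratch to NVM, then advance progress.
 * On real hardware this is where a cache flush / backup would happen;
 * because nothing was written to NVM mid-task, a power failure before
 * this point simply re-runs the task from the previous snapshot. */
static void commit(uint32_t task_id) {
    memcpy(nvm_state.data, scratch, sizeof(scratch));
    nvm_state.progress = task_id + 1;   /* made durable last */
}

int main(void) {
    task_fn tasks[] = { task_init, task_scale };
    const uint32_t ntasks = sizeof(tasks) / sizeof(tasks[0]);

    /* Resume from the last committed task after a (simulated) power loss. */
    for (uint32_t t = nvm_state.progress; t < ntasks; t++) {
        tasks[t]();   /* execute entirely out of scratch */
        commit(t);    /* backup at the boundary */
    }
    printf("data[5] = %d\n", (int)nvm_state.data[5]);  /* expect 15 */
    return 0;
}

Because a task never writes to the simulated NVM before its commit, a power failure at any point inside a task leaves the last committed snapshot intact, and execution can simply resume from nvm_state.progress.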
Event Type
Research Manuscript
Time
Wednesday, June 26, 5:00pm - 5:15pm PDT
Location
3001, 3rd Floor
Topics
Embedded Systems
Keywords
Embedded Software