Srinivasan, S and Juve, G and Da Silva, R F and Vahi, K and Deelman, E (2014)
A cleanup algorithm for implementing storage constraints in scientific workflow executions.
In: 9th Workshop on Workflows in Support of Large-Scale Science (WORKS), 16 November 2014, New Orleans, United States.
Abstract
Scientific workflows are often used to automate large-scale data analysis pipelines on clusters, grids, and clouds. However, because workflows can be extremely data-intensive, and are often executed on shared resources, it is critical to be able to limit or minimize the amount of disk space that workflows use on shared storage systems. This paper proposes a novel and simple approach that constrains the amount of storage space used by a workflow by inserting data cleanup tasks into the workflow task graph. Unlike previous solutions, the proposed approach provides guaranteed limits on disk usage, requires no new functionality in the underlying workflow scheduler, and does not require estimates of task runtimes. Experimental results show that this algorithm significantly reduces the number of cleanup tasks added to a workflow and yields better workflow makespans than the strategy currently used by the Pegasus Workflow Management System.
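The sketch below is not the paper's algorithm; it only illustrates the general idea the abstract describes, namely inserting cleanup tasks into a workflow task graph so an intermediate file is removed once all of its consumers have finished. The Task structure, field names, and add_cleanup_tasks function are hypothetical, and the sketch does not enforce a hard storage limit as the proposed approach does.

```python
# Illustrative sketch of per-file cleanup-task insertion into a workflow DAG.
# All names here are hypothetical; this is not the Pegasus implementation.

from collections import defaultdict
from dataclasses import dataclass, field


@dataclass
class Task:
    name: str
    inputs: set = field(default_factory=set)    # files the task reads
    outputs: set = field(default_factory=set)   # files the task writes
    parents: set = field(default_factory=set)   # names of tasks it depends on


def add_cleanup_tasks(tasks):
    """Add one cleanup task per workflow-produced file, depending on every
    task that reads the file, so the file is deleted as soon as its last
    consumer completes."""
    consumers = defaultdict(set)
    producers = {}
    for t in tasks:
        for f in t.inputs:
            consumers[f].add(t.name)
        for f in t.outputs:
            producers[f] = t.name

    cleanup_tasks = []
    for f, readers in consumers.items():
        if f not in producers:
            continue  # file staged in from outside the workflow; skip it
        cleanup_tasks.append(Task(
            name=f"cleanup_{f}",
            inputs={f},
            parents=set(readers),  # run only after every consumer of f
        ))
    return tasks + cleanup_tasks


# Example: t2 and t3 both read data.dat, so cleanup_data.dat depends on both.
workflow = [
    Task("t1", outputs={"data.dat"}),
    Task("t2", inputs={"data.dat"}, outputs={"a.out"}, parents={"t1"}),
    Task("t3", inputs={"data.dat"}, outputs={"b.out"}, parents={"t1"}),
]
for task in add_cleanup_tasks(workflow):
    print(task.name, sorted(task.parents))
```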