Scheduling MapReduce tasks based on estimated workload distribution
First Claim
1. A method comprising:
receiving a set of task statistics corresponding to task execution within a MapReduce job, wherein the set of task statistics includes a job input size, a cluster size, an average map process rate, a shuffle data size, a network bandwidth, and a convergence of a workload distribution corresponding to the set of executed tasks;
estimating a completion time corresponding to a map task completion time and a shuffle operation completion time to provide an estimated completion time;
calculating a soft decision point based on a convergence of a workload distribution corresponding to a set of executed tasks, wherein the soft decision point corresponds to a point at which a workload is most evenly distributed among available resources;
calculating a hard decision point according to the equation HDP=max{0, map task completion time−shuffle operation completion time};
determining a selected decision point based on the soft decision point and the hard decision point, wherein the selected decision point is the lesser of the soft decision point and the hard decision point; and
scheduling a next set of tasks to be executed at the selected decision point.
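The decision-point arithmetic recited above can be sketched in a few lines. This is an illustrative reading of the claim, not the patent's implementation; the names `TaskStatistics`, `hard_decision_point`, and `selected_decision_point` are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class TaskStatistics:
    """The task statistics recited in the claim (units are assumptions)."""
    job_input_size: float        # bytes of input to the MapReduce job
    cluster_size: int            # number of worker nodes
    avg_map_rate: float          # average map process rate, bytes/sec per node
    shuffle_data_size: float     # bytes of intermediate shuffle data
    network_bandwidth: float     # bytes/sec available for the shuffle
    workload_convergence: float  # soft decision point (sec), per the claim

def hard_decision_point(map_completion: float, shuffle_completion: float) -> float:
    """HDP = max{0, map task completion time - shuffle operation completion time}."""
    return max(0.0, map_completion - shuffle_completion)

def selected_decision_point(sdp: float, hdp: float) -> float:
    """Per the claim, the selected decision point is the lesser of the two."""
    return min(sdp, hdp)
```

For example, if the map phase is estimated to finish at t=100 s and the shuffle at t=40 s, HDP = 60 s; with a soft decision point of 50 s, the next set of tasks would be scheduled at t=50 s.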
Abstract
A method for scheduling MapReduce tasks includes receiving a set of task statistics corresponding to task execution within a MapReduce job, estimating a completion time for a set of tasks to be executed to provide an estimated completion time, calculating a soft decision point based on a convergence of a workload distribution corresponding to a set of executed tasks, calculating a hard decision point based on the estimated completion time for the set of tasks to be executed, determining a selected decision point based on the soft decision point and the hard decision point, and scheduling upcoming tasks for execution based on the selected decision point. The method may also include estimating a map task completion time and estimating a shuffle operation completion time. A computer program product and computer system corresponding to the method are also disclosed.
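The abstract ties the hard decision point to estimated map and shuffle completion times. One plausible way to derive those estimates from the listed statistics is sketched below; the patent does not disclose these exact formulas, so the arithmetic here is an assumption for illustration only.

```python
def estimate_map_completion(job_input_size: float,
                            cluster_size: int,
                            avg_map_rate: float) -> float:
    """Assumed estimate: time to map all input with work spread evenly
    across the cluster (input size / aggregate map rate)."""
    return job_input_size / (cluster_size * avg_map_rate)

def estimate_shuffle_completion(shuffle_data_size: float,
                                network_bandwidth: float) -> float:
    """Assumed estimate: time to move the shuffle data over the network."""
    return shuffle_data_size / network_bandwidth
```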
Specification