
Cache-aware thread scheduling in multi-threaded systems

  • US 8,533,719 B2
  • Filed: 04/05/2010
  • Issued: 09/10/2013
  • Est. Priority Date: 04/05/2010
  • Status: Active Grant
First Claim

1. A method for predictively scheduling a thread in a multi-threaded processor, comprising:

  • executing a first thread in a processor core associated with a shared cache that is also associated with a second processor core;

  • executing a second thread in a third processor core which is not associated with the cache;

  • executing a third thread in the second processor core;

  • while executing the first thread, measuring one or more metrics to characterize the first thread;

  • measuring one or more metrics to characterize the second thread;

  • measuring one or more metrics to characterize the third thread;

  • using the characterization of the first thread and the characterization of the second thread to predict a performance impact associated with simultaneously executing the second thread in the second processor core, which is associated with the cache, wherein predicting the performance impact comprises predicting an impact on an execution metric for the first thread and an impact on an execution metric for the second thread while the first thread and the second thread access the shared cache when simultaneously executing, and wherein predicting the performance impact involves using the characterization of the third thread and the characterization of the second thread, in addition to the characterization of the first thread, to determine whether migrating the second thread to the second processor core will improve performance for the multi-threaded processor; and

  • when the predicted performance impact indicates that executing the second thread on the second processor core will improve performance for the multi-threaded processor, executing the second thread on the second processor core.
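
In plain terms, the claimed method characterizes each thread from metrics measured while it runs, predicts how co-scheduling two threads on cores that share a cache would change their execution metrics, and migrates the second thread only when the prediction indicates an overall improvement. The sketch below illustrates that decision logic; it is not the patent's method. The metric names (ipc_alone, cache_intensity), the linear contention model, and the assumption that the second and third threads simply swap placements are all illustrative assumptions.

```python
"""Illustrative sketch of cache-aware co-scheduling prediction (assumed model)."""

from dataclasses import dataclass


@dataclass
class ThreadProfile:
    name: str
    ipc_alone: float        # measured instructions per cycle when running undisturbed (assumed metric)
    cache_intensity: float  # 0..1, how heavily the thread uses the shared cache (assumed metric)


def predicted_ipc(thread: ThreadProfile, co_runner: ThreadProfile) -> float:
    """Predict the thread's IPC when it shares the cache with co_runner.

    Simple assumed contention model: the slowdown grows with the product of the
    two threads' cache intensities, capped at a 50% loss.
    """
    slowdown = 1.0 - 0.5 * thread.cache_intensity * co_runner.cache_intensity
    return thread.ipc_alone * slowdown


def should_migrate(first: ThreadProfile, second: ThreadProfile, third: ThreadProfile) -> bool:
    """Decide whether migrating `second` onto the cache-sharing core improves aggregate performance.

    Current placement: first and third share the cache; second runs on an unrelated core.
    Proposed placement: first and second share the cache; third runs on an unrelated core.
    """
    current = predicted_ipc(first, third) + predicted_ipc(third, first) + second.ipc_alone
    proposed = predicted_ipc(first, second) + predicted_ipc(second, first) + third.ipc_alone
    return proposed > current


if __name__ == "__main__":
    first = ThreadProfile("first", ipc_alone=1.2, cache_intensity=0.8)
    second = ThreadProfile("second", ipc_alone=1.0, cache_intensity=0.1)  # cache-light candidate
    third = ThreadProfile("third", ipc_alone=0.9, cache_intensity=0.7)    # cache-heavy current co-runner
    print("Migrate second thread onto the shared-cache core:", should_migrate(first, second, third))
```

With these example profiles the cache-light second thread displaces the cache-heavy third thread, so the predicted aggregate IPC rises and the migration is taken, matching the claim's final limitation.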
