Multi-threaded processing using path locks
First Claim
1. A method comprising:
causing threads of a plurality of threads that access a particular portion of a shared computational resource during a particular instruction path to be scheduled at a thread scheduler for a set of one or more processors;
receiving, at the thread scheduler, data that indicates a first thread of the plurality of threads is to execute next the particular instruction path to access the particular portion of the shared computational resource, wherein a path identifier (ID) is received and is indicative of the particular instruction path to access the particular portion of the shared computational resource;
determining whether a second thread of the plurality of threads is eligible to execute the particular instruction path on any processor of the set of one or more processors to access the particular portion of the shared computational resource;
if it is determined that the second thread is eligible to execute the particular instruction path, then preventing the first thread from executing any instruction from the particular instruction path on any processor of the set of one or more processors; and
storing data that indicates the path ID in association with data that indicates the first thread.
2 Assignments
0 Petitions
Abstract
In one embodiment, a method includes receiving at a thread scheduler data that indicates a first thread is to execute next a particular instruction path in software to access a particular portion of a shared computational resource. The thread scheduler determines whether a different second thread is exclusively eligible to execute the particular instruction path on any processor of a set of one or more processors to access the particular portion of the shared computational resource. If so, then the thread scheduler prevents the first thread from executing any instruction from the particular instruction path on any processor of the set of one or more processors. This enables several threads of the same software to share a resource without obtaining locks on the resource or holding a lock on a resource while a thread is not running.
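The scheduling behavior described in the abstract can be sketched in a few lines. The following is a minimal single-process illustration, not the patented implementation: the names (`PathScheduler`, `request_path`, `release_path`) and the hand-off policy on release are assumptions made for illustration.

```python
class PathScheduler:
    """Tracks which thread, if any, is eligible to run each instruction path."""

    def __init__(self):
        self._eligible = {}   # path_id -> thread currently eligible for that path
        self._blocked = {}    # thread -> path_id it is waiting to execute

    def request_path(self, thread_id, path_id):
        """Return True if thread_id may execute path_id now.

        If a different thread is already eligible for this path, the
        requesting thread is prevented from running any instruction of
        the path, and the path ID is stored with the blocked thread.
        """
        holder = self._eligible.get(path_id)
        if holder is not None and holder != thread_id:
            self._blocked[thread_id] = path_id   # store path ID with the thread
            return False
        self._eligible[path_id] = thread_id
        return True

    def release_path(self, thread_id, path_id):
        """Thread finished the path; hand eligibility to one blocked thread."""
        if self._eligible.get(path_id) == thread_id:
            del self._eligible[path_id]
            for tid, pid in list(self._blocked.items()):
                if pid == path_id:
                    del self._blocked[tid]
                    self._eligible[path_id] = tid
                    break
```

Note that no lock is acquired on the shared resource itself: the scheduler simply refuses to run a second thread on the same path, which is the property the abstract emphasizes.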
119 Citations
20 Claims
1. A method comprising:
causing threads of a plurality of threads that access a particular portion of a shared computational resource during a particular instruction path to be scheduled at a thread scheduler for a set of one or more processors;
receiving, at the thread scheduler, data that indicates a first thread of the plurality of threads is to execute next the particular instruction path to access the particular portion of the shared computational resource, wherein a path identifier (ID) is received and is indicative of the particular instruction path to access the particular portion of the shared computational resource;
determining whether a second thread of the plurality of threads is eligible to execute the particular instruction path on any processor of the set of one or more processors to access the particular portion of the shared computational resource;
if it is determined that the second thread is eligible to execute the particular instruction path, then preventing the first thread from executing any instruction from the particular instruction path on any processor of the set of one or more processors; and
storing data that indicates the path ID in association with data that indicates the first thread.
Dependent claims: 2, 3, 4, 5, 6, 7, 8
9. A method comprising:
receiving data that indicates a unique path identifier (ID) for a particular instruction path, wherein a particular portion of a shared computational resource is accessed within the particular instruction path;
receiving data that indicates a processor is to execute the particular instruction path;
sending, to a thread scheduler, switch data that indicates the processor should switch to another thread and that indicates the path ID; and
accessing the particular portion of the shared computational resource when the thread scheduler determines, based at least on the path ID, that another thread is not eligible to execute the particular instruction path to access the particular portion of the shared computational resource,
wherein the receiving of data that indicates the path ID further comprises:
receiving data that indicates a particular queue of data to be processed by the particular instruction path; and
determining the path ID based on the particular queue of data.
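The last two steps of claim 9 derive the path ID from the queue the instruction path will process. A trivial sketch of such a mapping follows; the function name and the `"path:queue"` naming scheme are assumptions for illustration, not anything recited in the claim.

```python
def path_id_for_queue(queue_name, instruction_path="process"):
    """Derive a path ID from the queue the instruction path will access.

    Two threads processing the same queue map to the same path ID, so the
    thread scheduler can serialize them; distinct queues yield distinct
    IDs, so threads on different queues are not blocked by each other.
    """
    return f"{instruction_path}:{queue_name}"
```

Any deterministic, collision-free mapping from queue to path ID would serve the same purpose.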
10. An apparatus for processing a thread of a plurality of threads that share a processor, comprising:
means for causing threads of a plurality of threads that access a particular portion of a shared computational resource during a particular instruction path to be received at a thread scheduler for a set of one or more processors;
means for receiving, at the thread scheduler, data that indicates a first thread of the plurality of threads is to execute next the particular instruction path to access the particular portion of the shared computational resource, wherein a path identifier (ID) is received and is indicative of the particular instruction path to access the particular portion of the shared computational resource;
means for determining whether a second thread of the plurality of threads is eligible to execute the particular instruction path on any processor of the set of one or more processors to access the particular portion of the shared computational resource;
means for preventing the first thread from executing any instruction from the particular instruction path on any processor of the set of one or more processors, if it is determined that the second thread is eligible to execute the particular instruction path; and
storing data that indicates the path ID in association with data that indicates the first thread.
11. An apparatus comprising:
one or more processors;
software encoded as instructions in one or more non-transitory computer-readable media for execution on the one or more processors;
a computational resource to be shared among a plurality of execution threads executing the software on the one or more processors;
a single thread scheduler for scheduling the plurality of execution threads, wherein the thread scheduler includes logic encoded in one or more tangible media for execution and, when executed, operable for:
receiving data that indicates a first thread of the plurality of execution threads is to execute next a particular instruction path in the software to access a particular portion of the computational resource, wherein a path identifier (ID) is received and is indicative of the particular instruction path to access the particular portion of the shared computational resource;
determining whether a second thread of the plurality of threads is eligible to execute the particular instruction path on any processor of the one or more processors to access the particular portion of the computational resource;
if it is determined that the second thread is eligible to execute the particular instruction path, then preventing the first thread from executing any instruction from the particular instruction path on any processor of the one or more processors; and
storing data that indicates the path ID in association with data that indicates the first thread.
Dependent claims: 12, 13, 14, 15, 16, 17, 18, 19
20. A method for achieving near line rate processing of data packets at a router, comprising:
receiving over a first network interface a first set of one or more data packets;
storing data based on the first set of one or more data packets in a particular queue in a shared memory; and
determining a second set of one or more data packets to send through a different second network interface based on data in the particular queue, including:
receiving, at the thread scheduler, data that indicates a first thread of the plurality of threads is to execute next a particular instruction path that includes one or more instructions to access the particular queue, wherein a path identifier (ID) is received and is indicative of the particular instruction path to access the particular queue;
determining whether a second thread of the plurality of threads is eligible to execute the particular instruction path on any processor of a set of one or more processors to access the particular queue;
if it is determined that the second thread is eligible to execute the particular instruction path to access the particular queue, then preventing the first thread from executing any instruction from the particular instruction path on any processor of the set of one or more processors to access the particular queue; and
storing data that indicates the path ID in association with data that indicates the first thread.
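Claim 20's packet-forwarding flow can be illustrated with a small single-process toy in which cooperatively scheduled "workers" stand in for real threads. Everything here is an assumption made for illustration: the function name, the path ID string, and the one-packet-per-pass round-robin are not recited in the claim.

```python
from collections import deque

def forward_packets(packets, workers=2):
    """Drain a shared packet queue under the path-eligibility rule."""
    queue = deque(packets)        # the particular queue in shared memory
    eligible = {}                 # path_id -> worker eligible for that path
    path_id = "drain:rx-queue"    # ID of the instruction path that accesses the queue
    sent, prevented = [], 0
    while queue:
        for w in range(workers):
            holder = eligible.get(path_id)
            if holder is not None and holder != w:
                prevented += 1    # another worker is eligible: w may not run the path
                continue
            eligible[path_id] = w   # w becomes exclusively eligible for the path
            if queue:
                sent.append(queue.popleft())
        eligible.pop(path_id, None)  # path complete: release eligibility
    return sent, prevented
```

With the default two workers, every pass lets exactly one worker access the queue while the other is prevented, so packets leave in arrival order even though no lock is ever taken on the queue itself.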
Specification