Multi-threaded transmit transport engine for storage devices
Abstract
An embodiment of the present invention is a technique to process a plurality of I/O sequences associated with a storage device. A task context pre-fetch engine pre-fetches a task context from a task context memory based on a pre-fetch request. At least a multi-threaded transmit transport layer (T×TL) processes the plurality of I/O sequences from an I/O pool simultaneously. The multi-threaded T×TL generates the pre-fetch request and one or more frames from the plurality of I/O sequences. A switch fabric and controller routes the frame to a link layer associated with the storage device.
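The dataflow the abstract describes can be modeled as a minimal sketch (all names are hypothetical; this is an illustrative simulation of the described pipeline, not the patented implementation): a task context is pre-fetched from context memory, the transmit transport layer turns an I/O sequence into frames, and a switch routes each frame to the link layer for its target device.

```python
from queue import Queue

class TaskContextMemory:
    """Holds task contexts, keyed by tag (hypothetical model)."""
    def __init__(self, contexts):
        self._contexts = contexts          # tag -> context dict

    def prefetch(self, tag):
        # Task context pre-fetch engine: fetch context ahead of processing.
        return self._contexts[tag]

def transmit_transport_layer(io_sequence, context):
    # Turn an I/O sequence into transmit frames using its task context.
    for op in io_sequence:
        yield {"device": context["device"], "op": op}

def switch_route(frame, link_layers):
    # Switch fabric/controller: route the frame to the target link layer.
    link_layers[frame["device"]].put(frame)

memory = TaskContextMemory({"t0": {"device": "disk-A"}})
links = {"disk-A": Queue()}
for frame in transmit_transport_layer(["read", "write"], memory.prefetch("t0")):
    switch_route(frame, links)

print(links["disk-A"].qsize())  # → 2: both frames routed to disk-A's link layer
```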
20 Claims
1. An apparatus comprising:
- a task context pre-fetch engine to pre-fetch a first and a second task context from a task context memory based on a first and a second pre-fetch request, respectively, wherein the first and second task contexts each include context information for accessing one of a plurality of storage devices remotely coupled via a network external to the apparatus;
- a multi-threaded transmit transport layer (T×TL) coupled to the task context pre-fetch engine, including a first transport layer to process a first input/output (I/O) sequence from the pool of I/O sequences based on the first task context to generate a first transmit data sequence, and a second transport layer to process a second I/O sequence from the pool of I/O sequences based on the second task context to generate a second transmit data sequence, wherein processing the second I/O sequence is associated with a latency period and each I/O sequence of the pool of I/O sequences includes I/O operations for reading and writing data to a different storage device of the plurality of storage devices; and
- a switch fabric/controller coupled to the multi-threaded T×TL to route the first transmit data sequence to a single link layer interface during the latency period and the second transmit data sequence to the single link layer interface.

(Dependent claims: 2, 3, 4, 5, 6)
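The latency-hiding element of this claim can be sketched with two threads standing in for the transport layers (names and timings are hypothetical, chosen only to make the interleaving observable): while the second transport layer stalls during its latency period, the first sequence's frames still reach the single shared link layer interface.

```python
import threading
import time
from queue import Queue

link_layer = Queue()            # the single link layer interface
stalled = threading.Event()

def transport_layer(seq_id, frames, stall=0.0):
    """Emit frames for one I/O sequence; optionally stall first."""
    for f in frames:
        if stall:
            stalled.set()
            time.sleep(stall)   # latency period for this sequence
        link_layer.put((seq_id, f))  # switch routes frame to the link layer

t2 = threading.Thread(target=transport_layer, args=(2, ["f0"], 0.2))
t1 = threading.Thread(target=transport_layer, args=(1, ["f0", "f1"]))
t2.start()
stalled.wait()                  # sequence 2 has entered its latency period
t1.start()
t1.join(); t2.join()

order = [seq for seq, _ in link_layer.queue]
print(order)  # → [1, 1, 2]: sequence 1 transmits during sequence 2's latency
```

The single queue models the claim's single link layer interface: throughput is preserved because the switch fabric/controller need not idle while one sequence waits.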
7. A method comprising:
- pre-fetching a first and a second task context from a task context memory based on a first and a second pre-fetch request, respectively, using a task context pre-fetch engine, wherein the first and second task contexts each include context information for accessing one of a plurality of storage devices remotely coupled via a network;
- processing a first input/output (I/O) sequence from a pool of I/O sequences using a first transport layer of a multi-threaded transmit transport layer (T×TL) based on the first task context to generate a first transmit data sequence, wherein each I/O sequence of the pool of I/O sequences includes I/O operations for reading and writing data to a different remote storage device of the plurality of storage devices;
- processing a second I/O sequence from the pool of I/O sequences via a second transport layer thread of the multi-threaded T×TL based on the second task context to generate a second transmit data sequence, wherein processing the second I/O sequence is associated with a latency period;
- routing the first transmit data sequence to a single link layer interface using a switch fabric/controller during the latency period; and
- routing the second transmit data sequence to the single link layer interface using the switch fabric/controller.

(Dependent claims: 8, 9, 10, 11, 12)
13. A system comprising:
- a plurality of storage device interfaces to interface to a plurality of storage devices remotely coupled via a network;
- an input/output (I/O) sequence scheduler coupled to the storage device interfaces to schedule processing of a first I/O sequence and a second I/O sequence from a pool of I/O sequences associated with the storage devices, wherein each I/O sequence of the pool includes I/O operations for reading and writing data to a different storage device of the plurality of storage devices; and
- a multi-threaded transmit transport engine (TTE) coupled to the storage device interfaces and the I/O sequence scheduler, the multi-threaded TTE comprising:
  - a task context pre-fetch engine to pre-fetch the first and the second task context from a task context memory based on a first and a second pre-fetch request, respectively, wherein the first and second task contexts each include context information for accessing one of the plurality of storage devices remotely coupled via the network;
  - a multi-threaded transmit transport layer (T×TL) coupled to the task context pre-fetch engine, including a first transport layer to process the first I/O sequence based on the first task context to generate a first transmit data sequence, and a second transport layer to process the second I/O sequence based on the second task context to generate a second transmit data sequence, wherein processing the second I/O sequence is associated with a latency period; and
  - a switch fabric/controller coupled to the multi-threaded T×TL to route the first transmit data sequence during the latency period and the second transmit data sequence to a single link layer interface.

(Dependent claims: 14, 15, 16, 17, 18, 19, 20)
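The system claim's distinguishing element is the I/O sequence scheduler that hands sequences from the pool to the transport-layer threads. A minimal sketch (round-robin dispatch, hypothetical names; the claim does not fix a scheduling policy) might look like:

```python
from itertools import cycle

# Pool of I/O sequences, each targeting a different storage device.
io_pool = [
    ("disk-A", ["read"]),
    ("disk-B", ["write"]),
    ("disk-C", ["read", "write"]),
]
transport_layers = ["txtl-0", "txtl-1"]  # two transport-layer threads

# Round-robin: each sequence in the pool is assigned to the next
# transport layer in turn.
assignments = {}
for txtl, (device, _ops) in zip(cycle(transport_layers), io_pool):
    assignments.setdefault(txtl, []).append(device)

print(assignments)  # → {'txtl-0': ['disk-A', 'disk-C'], 'txtl-1': ['disk-B']}
```

Because each sequence targets a different device, any per-device latency in one transport layer leaves the other free to keep the shared link layer interface busy, which is the point of the multi-threaded design.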
Specification