Pattern Matching Technique
First Claim
1. A standalone cache unit for caching data operations requested from one or more networked data storage devices by one or more remote clients, the standalone cache unit comprising:
- a cache memory for caching data that is requested by a remote client;
- a payload memory for storing payload data for one or more packet flows;
- a packet processor coupled to the cache memory, the payload memory, and to one or more I/O ports for transparently splicing connections between the data storage devices and remote clients, where the packet processor comprises a pattern detection module configured to find a matching pattern by scanning a received packet for one or more predetermined trigger patterns and generating a direct memory address for the matching pattern in memory; and
- a host processor coupled to the packet processor and the payload memory, where the host processor uses the direct memory address to directly retrieve the matching pattern from memory and applies a cache policy profile to the matching pattern to make a caching decision for a data cache request associated with the matching pattern.
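As a rough illustration of the division of labor the claim recites, the sketch below models a packet processor that scans a packet for trigger patterns and hands the host processor a direct memory address, so the host can fetch the matching pattern without re-scanning the packet. All names and the memory layout are hypothetical, not taken from the patent.

```python
# Hypothetical sketch of the claimed packet-processor / host-processor split;
# names and data layout are illustrative, not from the patent.

TRIGGER_PATTERNS = [b"NFS_READ", b"NFS_WRITE", b"CIFS_OPEN"]

# "Payload memory": trigger patterns stored contiguously so a match can be
# addressed directly by (offset, length).
payload_memory = b"".join(TRIGGER_PATTERNS)
PATTERN_TABLE = {}
_offset = 0
for _pat in TRIGGER_PATTERNS:
    PATTERN_TABLE[_pat] = (_offset, len(_pat))
    _offset += len(_pat)

def packet_processor(packet: bytes):
    """Fast path: scan the packet for any trigger pattern and return a
    direct memory address (offset, length) for the match, or None."""
    for pat, addr in PATTERN_TABLE.items():
        if pat in packet:
            return addr
    return None

def host_processor(direct_address, cache_policy):
    """Slow path: fetch the matching pattern directly from memory and make
    a caching decision by applying the policy profile."""
    off, length = direct_address
    pattern = payload_memory[off:off + length]
    return cache_policy(pattern)

addr = packet_processor(b"....NFS_READ /export/home....")
decision = host_processor(addr, lambda p: p.startswith(b"NFS"))
print(decision)  # True: this toy policy caches NFS operations
```

The point of the split is that the host processor never touches the raw packet: it dereferences the address the fast path produced.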
Abstract
A method, system and program are disclosed for accelerating data storage in a cache appliance that transparently monitors NFS and CIFS traffic between clients and NAS subsystems and caches files in a cache memory by using a perfect hashing memory index technique to rapidly detect predetermined patterns in received packet payloads and retrieve matching patterns from memory by generating a data structure pointer and index offset to directly address the pattern in the datagram memory, thereby accelerating evaluation of the packet with the matching pattern by the host processor.
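The abstract's "data structure pointer and index offset" addressing can be sketched as a single slice into datagram memory. This is a minimal illustration with invented names and an invented memory layout; the patent does not specify these details.

```python
# Illustrative only: datagram memory holds packet payloads back-to-back, and a
# match is reported as (base pointer, index offset, length) so the host
# processor can address the pattern directly, without searching again.
datagram_memory = bytearray(b"hdr|GETATTR|payload-bytes|trailer")

def direct_address(base: int, index_offset: int, length: int) -> bytes:
    """Dereference a pointer-plus-offset pair straight into datagram memory."""
    start = base + index_offset
    return bytes(datagram_memory[start:start + length])

# The pattern detection module found "GETATTR" at offset 4 from the start of
# this datagram (base 0), so the host retrieves it with one slice:
print(direct_address(0, 4, 7))  # b'GETATTR'
```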
20 Claims
1. A standalone cache unit for caching data operations requested from one or more networked data storage devices by one or more remote clients (recited in full above as the First Claim). - View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12)
13. A pattern search method for retrieving a matching trigger pattern from memory that matches a pattern contained in a data packet, comprising:
- receiving a data packet which contains a pattern comprising a plurality of data segments;
- hashing one or more data segments from the received data packet to generate a pattern identifier which is used to retrieve one of a plurality of trigger patterns from a first memory;
- comparing a retrieved trigger pattern from the first memory with the plurality of data segments from the received data packet to determine if the retrieved trigger pattern is a matching trigger pattern that matches at least part of the plurality of data segments; and
- generating a direct memory address for use by a host processor in locating a copy of the matching trigger pattern in a second memory. - View Dependent Claims (14, 15, 16, 17, 18, 19)
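The steps of claim 13 can be sketched as follows. A real implementation would use a reconfigurable perfect hash over the installed trigger set; here a toy hash function plus the claim's explicit compare step stands in for it, and all names are invented for illustration.

```python
# Hypothetical sketch of the claim 13 steps: hash segments -> pattern
# identifier -> retrieve candidate from first memory -> compare -> emit a
# direct memory address into second memory. Names are illustrative.

FIRST_MEMORY = {}   # pattern identifier -> trigger pattern
SECOND_MEMORY = []  # host-visible copies, addressed directly by index

def toy_hash(segment: bytes) -> int:
    # Stand-in for the perfect hash: collision-free only for this trigger set.
    return sum(segment) % 16

def install_trigger(pattern: bytes) -> None:
    FIRST_MEMORY[toy_hash(pattern[:4])] = pattern
    SECOND_MEMORY.append(pattern)

def search(packet: bytes):
    """Hash the leading data segments, retrieve the candidate trigger
    pattern, compare it against the packet, and return a direct memory
    address (an index into SECOND_MEMORY) on a confirmed match."""
    pattern_id = toy_hash(packet[:4])
    candidate = FIRST_MEMORY.get(pattern_id)
    if candidate is not None and packet.startswith(candidate):
        return SECOND_MEMORY.index(candidate)  # direct memory address
    return None  # no candidate, or hash hit but the compare step failed

install_trigger(b"READ")
install_trigger(b"WRIT")
addr = search(b"READ /vol0/file42")
print(addr, SECOND_MEMORY[addr])  # 0 b'READ'
```

The compare step matters because the hash alone only identifies a candidate; the byte-for-byte comparison confirms the match before the address is handed to the host processor.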
20. A method for accelerating data storage cache access, comprising:
- storing a plurality of trigger patterns in a first memory;
- applying a reconfigurable perfect hash matching scheme to a received data packet to identify a matching trigger pattern from the plurality of trigger patterns, where the matching trigger pattern matches a data pattern contained in the received data packet;
- generating a direct memory address for use by a host processor in locating a copy of the matching trigger pattern in a second memory; and
- applying a cache policy profile to the copy of the matching trigger pattern from the second memory to make a caching decision for a data cache request associated with the matching trigger pattern.
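The final step of claim 20, applying a cache policy profile to the matched pattern, can be sketched as a simple lookup from operation pattern to caching decision. The profile structure and decision names below are invented for illustration; the patent does not define them.

```python
# Illustrative cache policy profile (invented structure): the host processor
# maps the matched trigger pattern to a caching decision for the request.
CACHE_POLICY_PROFILE = {
    b"READ":    "cache",          # serve repeated reads from the cache unit
    b"WRITE":   "write-through",  # forward to NAS, keep the cache coherent
    b"GETATTR": "cache",
    b"REMOVE":  "invalidate",     # drop any cached copy of the target
}

def caching_decision(matching_trigger_pattern: bytes) -> str:
    """Apply the cache policy profile to the pattern copy fetched from the
    second memory; unknown operations pass through uncached."""
    return CACHE_POLICY_PROFILE.get(matching_trigger_pattern, "pass-through")

print(caching_decision(b"READ"))    # cache
print(caching_decision(b"RENAME"))  # pass-through
```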
Specification