Apparatus and method for cache provisioning, configuration for optimal application performance
First Claim
1. A method, comprising:
recording, by an application monitor, application-level heuristics;
wherein the application-level heuristics are monitored and recorded at an application level by the application monitor;
recording, by an IO (input/output) monitor, IO-level (input/output-level) heuristics in IO requests and subsequently passing, by the IO monitor, the IO-level heuristics to a storage stack;
wherein the IO monitor identifies IO stream characteristics in the IO requests and correlates the IO stream characteristics with application-defined objects;
wherein the application-level heuristics indicate a first plurality of components that can be accelerated in an application, a second plurality of components that are frequently accessed in the application, and a third plurality of components that are key to an application performance of the application;
wherein the IO-level heuristics indicate the IO stream characteristics and indicate a nature of the IO requests;
wherein the IO stream characteristics include a size of a dataset in an IO stream having the IO stream characteristics, an access pattern indicating whether the IO stream comprises a sequential access pattern or a random access pattern, and whether the IO stream is predominantly reads or predominantly writes;
wherein the application-level heuristics indicate which components of the application can be accelerated, which components of the application are frequently accessed, and which components of the application are key to application performance;
correlating and analyzing, by an analyzer, the application-level heuristics and IO-level heuristics;
based on and in response to an analysis and a correlation of statistics in the application-level heuristics and in the IO-level heuristics, generating, by the analyzer, a caching policy, including parameters related to a cache and a primary storage coupled to the cache, for achieving optimal application performance;
based on the analysis and the correlation of the statistics in the application-level heuristics and in the IO-level heuristics, generating and provisioning a cache configuration for achieving optimal application performance including selecting settings for cache unit size, cache capacity, read cache size, write cache size, and regions of the primary storage that are candidates for cache acceleration;
wherein the caching policy comprises cache provisioning hints and an acceleration strategy; and
using, by a caching engine, the caching policy during caching operations on the cache in order to optimize a utilization of the cache and optimize a performance of the application.
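The IO-level heuristics recited above (dataset size, sequential versus random access pattern, read/write mix) can be sketched as a small classifier over observed IO requests. This is an illustrative reconstruction, not the patent's implementation; the `IORequest` type, the field names, and the thresholds are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class IORequest:
    offset: int    # byte offset into primary storage
    size: int      # bytes transferred
    is_read: bool  # True for a read, False for a write

def classify_stream(requests):
    """Derive IO-level heuristics from a list of IORequests:
    dataset size, sequential vs. random access, read vs. write dominance."""
    if not requests:
        return None
    # Dataset size: span of bytes touched by the stream.
    dataset_size = (max(r.offset + r.size for r in requests)
                    - min(r.offset for r in requests))
    # Access pattern: count how often a request starts where the previous ended.
    sequential = sum(
        1 for prev, cur in zip(requests, requests[1:])
        if cur.offset == prev.offset + prev.size
    )
    pattern = "sequential" if sequential >= (len(requests) - 1) / 2 else "random"
    # Read/write mix: which direction dominates the stream.
    reads = sum(1 for r in requests if r.is_read)
    mix = "predominantly_reads" if reads * 2 >= len(requests) else "predominantly_writes"
    return {"dataset_size": dataset_size, "access_pattern": pattern, "rw_mix": mix}
```

For example, four back-to-back 4 KiB reads would be classified as a sequential, read-dominated stream spanning a 16 KiB dataset.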
Abstract
In an embodiment of the invention, a method comprises: recording application-level heuristics and IO-level (input/output-level) heuristics; correlating and analyzing the application-level heuristics and IO-level heuristics; and based on an analysis and correlation of the application-level heuristics and IO-level heuristics, generating a policy for achieving optimal application performance. In another embodiment of the invention, an apparatus comprises: a system configured to record application-level heuristics and IO-level heuristics, to correlate and analyze the application-level heuristics and IO-level heuristics, and based on an analysis and correlation of the application-level heuristics and IO-level heuristics, to generate a policy for achieving optimal application performance.
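As a rough illustration of the analyzer step described in the abstract, the sketch below maps correlated application-level and IO-level heuristics to a policy containing the parameters named in claim 1: cache unit size, cache capacity, read cache size, write cache size, and candidate regions for acceleration. Every field name and sizing rule here is hypothetical, chosen only to make the correlation step concrete.

```python
def generate_caching_policy(app_heuristics, io_heuristics, cache_capacity):
    """Sketch of the analyzer: correlate application-level and IO-level
    heuristics into a caching policy with provisioning hints."""
    # Components the application marks as hot or performance-critical become
    # candidate regions of primary storage for cache acceleration.
    candidate_regions = [
        comp["region"] for comp in app_heuristics
        if comp.get("frequently_accessed") or comp.get("performance_key")
    ]
    # Favor a larger read cache for read-heavy streams, a write cache otherwise.
    read_heavy = io_heuristics["rw_mix"] == "predominantly_reads"
    read_share = 0.75 if read_heavy else 0.25
    # Larger cache units suit sequential streams; smaller units suit random IO.
    unit_size = 64 * 1024 if io_heuristics["access_pattern"] == "sequential" else 4 * 1024
    read_cache_size = int(cache_capacity * read_share)
    return {
        "cache_unit_size": unit_size,
        "cache_capacity": cache_capacity,
        "read_cache_size": read_cache_size,
        "write_cache_size": cache_capacity - read_cache_size,
        "accelerated_regions": candidate_regions,
    }
```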
20 Claims
1. (Claim 1 is set forth in full under First Claim above.) - View Dependent Claims (2, 3, 4, 5, 6, 7, 8)
9. An apparatus, comprising:
a system comprising an application monitor, an IO (input/output) monitor, an analyzer, and a caching engine;
wherein the application monitor is configured to record application-level heuristics;
wherein the application-level heuristics are monitored and recorded at an application level by the application monitor;
wherein the IO monitor is configured to record IO-level (input/output-level) heuristics in IO requests and is configured to subsequently pass the IO-level heuristics to a storage stack;
wherein the IO monitor identifies IO stream characteristics in the IO requests and correlates the IO stream characteristics with application-defined objects;
wherein the analyzer is configured to correlate and analyze statistics in the application-level heuristics and in the IO-level heuristics and to generate a caching policy, including parameters related to a cache and a primary storage coupled to the cache, for achieving optimal application performance based on and in response to an analysis and a correlation of the application-level heuristics and IO-level heuristics;
wherein, based on the analysis and the correlation of the statistics in the application-level heuristics and in the IO-level heuristics, the analyzer generates and provisions a cache configuration for achieving optimal application performance including selecting settings for cache unit size, cache capacity, read cache size, write cache size, and regions of the primary storage that are candidates for cache acceleration;
wherein the caching engine is configured to use the caching policy during caching operations on the cache in order to optimize a utilization of the cache and optimize a performance of an application;
wherein the application-level heuristics indicate a first plurality of components that can be accelerated in the application, a second plurality of components that are frequently accessed in the application, and a third plurality of components that are key to an application performance of the application;
wherein the IO-level heuristics indicate the IO stream characteristics and indicate a nature of the IO requests;
wherein the IO stream characteristics include a size of a dataset in an IO stream having the IO stream characteristics, an access pattern indicating whether the IO stream comprises a sequential access pattern or a random access pattern, and whether the IO stream is predominantly reads or predominantly writes;
wherein the application-level heuristics indicate which components of the application can be accelerated, which components of the application are frequently accessed, and which components of the application are key to application performance; and
wherein the caching policy comprises cache provisioning hints and an acceleration strategy. - View Dependent Claims (10, 11, 12, 13, 14, 15, 16)
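The apparatus of claim 9 can be pictured as four cooperating components. The sketch below wires hypothetical `ApplicationMonitor`, `IOMonitor`, `Analyzer`, and `CachingEngine` classes into a system; every class, method, and field name is invented for illustration and is not taken from the patent.

```python
class ApplicationMonitor:
    """Records application-level heuristics (hot / performance-critical regions)."""
    def __init__(self):
        self.heuristics = []

    def record(self, region, frequently_accessed=False, performance_key=False):
        self.heuristics.append({
            "region": region,
            "frequently_accessed": frequently_accessed,
            "performance_key": performance_key,
        })

class IOMonitor:
    """Records IO-level heuristics observed in IO requests (stubbed here)."""
    def __init__(self):
        self.heuristics = {"access_pattern": "random", "rw_mix": "predominantly_reads"}

class Analyzer:
    """Correlates both heuristic sets into a caching policy."""
    def generate_policy(self, app_h, io_h, capacity):
        regions = [h["region"] for h in app_h
                   if h["frequently_accessed"] or h["performance_key"]]
        return {"cache_capacity": capacity,
                "accelerated_regions": regions,
                "rw_mix": io_h["rw_mix"]}

class CachingEngine:
    """Applies the policy during caching operations."""
    def __init__(self):
        self.policy = None

    def apply(self, policy):
        self.policy = policy

class System:
    """Composes the four components named in the apparatus claim."""
    def __init__(self):
        self.app_monitor = ApplicationMonitor()
        self.io_monitor = IOMonitor()
        self.analyzer = Analyzer()
        self.engine = CachingEngine()

    def provision(self, capacity):
        policy = self.analyzer.generate_policy(
            self.app_monitor.heuristics, self.io_monitor.heuristics, capacity)
        self.engine.apply(policy)
        return policy
```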
17. An article of manufacture, comprising:
a non-transitory computer-readable medium having stored thereon instructions operable to permit an apparatus to perform a method comprising:
recording, by an application monitor, application-level heuristics;
wherein the application-level heuristics are monitored and recorded at an application level by the application monitor;
recording, by an IO (input/output) monitor, IO-level (input/output-level) heuristics in IO requests and subsequently passing, by the IO monitor, the IO-level heuristics to a storage stack;
wherein the IO monitor identifies IO stream characteristics in the IO requests and correlates the IO stream characteristics with application-defined objects;
wherein the application-level heuristics indicate a first plurality of components that can be accelerated in an application, a second plurality of components that are frequently accessed in the application, and a third plurality of components that are key to an application performance of the application;
wherein the IO-level heuristics indicate the IO stream characteristics and indicate a nature of the IO requests;
wherein the IO stream characteristics include a size of a dataset in an IO stream having the IO stream characteristics, an access pattern indicating whether the IO stream comprises a sequential access pattern or a random access pattern, and whether the IO stream is predominantly reads or predominantly writes;
wherein the application-level heuristics indicate which components of the application can be accelerated, which components of the application are frequently accessed, and which components of the application are key to application performance;
correlating and analyzing, by an analyzer, the application-level heuristics and IO-level heuristics;
based on and in response to an analysis and a correlation of statistics in the application-level heuristics and in the IO-level heuristics, generating, by the analyzer, a caching policy, including parameters related to a cache and a primary storage coupled to the cache, for achieving optimal application performance;
based on the analysis and the correlation of the statistics in the application-level heuristics and in the IO-level heuristics, generating and provisioning a cache configuration for achieving optimal application performance including selecting settings for cache unit size, cache capacity, read cache size, write cache size, and regions of the primary storage that are candidates for cache acceleration;
wherein the caching policy comprises cache provisioning hints and an acceleration strategy; and
using, by a caching engine, the caching policy during caching operations on the cache in order to optimize a utilization of the cache and optimize a performance of the application. - View Dependent Claims (18, 19, 20)
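To show how a caching engine might use such a policy during caching operations, here is a minimal read cache that admits only blocks falling within the policy's accelerated regions and evicts the least-recently-used unit once capacity is reached. The `PolicyCache` class and the policy field names are assumptions for illustration, not the claimed implementation.

```python
from collections import OrderedDict

class PolicyCache:
    """Minimal sketch of a caching engine honoring a caching policy:
    only blocks inside accelerated regions are admitted, with LRU
    eviction once cache_capacity (in cache units) is exceeded."""
    def __init__(self, policy):
        self.unit = policy["cache_unit_size"]
        self.max_units = policy["cache_capacity"] // self.unit
        self.regions = policy["accelerated_regions"]  # list of (start, end)
        self.store = OrderedDict()                    # unit index -> data

    def _accelerated(self, offset):
        return any(start <= offset < end for start, end in self.regions)

    def read(self, offset, backing):
        """Read one cache unit at `offset`; `backing` fetches from primary storage."""
        key = offset // self.unit
        if key in self.store:
            self.store.move_to_end(key)               # LRU touch on a hit
            return self.store[key]
        data = backing(offset)                        # miss: go to primary storage
        if self._accelerated(offset):                 # admit only policy candidates
            self.store[key] = data
            if len(self.store) > self.max_units:
                self.store.popitem(last=False)        # evict least-recently-used
        return data
```

A second read of the same accelerated block is then served from the cache, while blocks outside the accelerated regions always go to primary storage.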
Specification