Resource provisioning systems and methods
First Claim
1. A system comprising a resource manager having one or more hardware processors configured to:
- monitor received data processing requests from a plurality of computerized query sources to be executed by an execution platform, the data processing requests directed to database data stored on a plurality of shared storage devices collectively storing the database data, wherein the execution platform comprises a plurality of nodes, each node independent of the plurality of computerized query sources and comprising at least one processor and at least one local cache caching at least a portion of the database data;
- monitor query response rates corresponding to the database data;
- determine that at least one of additional data storage capacity and additional processing capacity are needed based on the data processing requests and the query response rates; and
- increase, in response to the determining, an amount of the data cached by the execution platform by adding one or more nodes to the plurality of nodes, wherein each of the one or more nodes added comprise at least one processor and at least one local cache.
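The monitor-determine-increase logic of claim 1 can be sketched in a few lines of Python. This is an illustrative model only: the `ResourceManager` and `Node` classes, the dict-backed cache, and the load and response-rate thresholds are all assumptions, not details from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """One execution-platform node: a processor plus a local cache (modeled as a dict)."""
    cache: dict = field(default_factory=dict)

@dataclass
class ResourceManager:
    """Sketch of the claimed resource manager; thresholds are assumed, not from the patent."""
    nodes: list
    max_requests_per_node: int = 100   # assumed load threshold
    max_response_ms: float = 200.0     # assumed response-rate threshold

    def needs_capacity(self, pending_requests: int, avg_response_ms: float) -> bool:
        # "determine that ... additional ... capacity [is] needed based on the
        # data processing requests and the query response rates"
        overloaded = pending_requests > self.max_requests_per_node * len(self.nodes)
        too_slow = avg_response_ms > self.max_response_ms
        return overloaded or too_slow

    def maybe_add_node(self, pending_requests: int, avg_response_ms: float) -> int:
        # "increase ... an amount of the data cached ... by adding one or more
        # nodes", each with its own processor and local cache
        if self.needs_capacity(pending_requests, avg_response_ms):
            self.nodes.append(Node())
        return len(self.nodes)
```

Starting from a single node, a burst of pending requests above the assumed threshold triggers one node addition, which also grows the aggregate cache.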
2 Assignments
0 Petitions
Abstract
Example resource provisioning systems and methods are described. In one implementation, an execution platform accesses multiple remote storage devices. The execution platform includes multiple virtual warehouses, each of which includes a cache to store data retrieved from the remote storage devices and a processor that is independent of the remote storage devices. A resource manager is coupled to the execution platform and monitors received data processing requests and resource utilization. The resource manager also determines whether additional virtual warehouses are needed based on the data processing requests and the resource utilization. If additional virtual warehouses are needed, the resource manager provisions a new virtual warehouse.
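The cache-plus-independent-processor arrangement described in the abstract can be illustrated with a minimal read path. The dict modeling the shared remote storage and the key names are assumptions for the sketch, not structures from the patent.

```python
# Shared remote storage, modeled here as a plain dict (illustrative assumption).
REMOTE_STORAGE = {"orders/part-0": b"row data", "orders/part-1": b"more row data"}

class VirtualWarehouse:
    """A virtual warehouse: its own cache plus processing that is
    independent of the remote storage devices (per the abstract)."""
    def __init__(self, storage):
        self.storage = storage   # handle to the shared remote storage
        self.cache = {}          # local cache of retrieved data

    def read(self, key):
        # Serve from the local cache when possible; otherwise retrieve the
        # data from remote storage and cache it for later queries.
        if key not in self.cache:
            self.cache[key] = self.storage[key]
        return self.cache[key]
```

Because each warehouse caches independently, provisioning a new warehouse adds both compute and cache capacity without touching the shared storage layer.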
157 Citations
17 Claims
1. A system comprising a resource manager having one or more hardware processors configured to:
- monitor received data processing requests from a plurality of computerized query sources to be executed by an execution platform, the data processing requests directed to database data stored on a plurality of shared storage devices collectively storing the database data, wherein the execution platform comprises a plurality of nodes, each node independent of the plurality of computerized query sources and comprising at least one processor and at least one local cache caching at least a portion of the database data;
- monitor query response rates corresponding to the database data;
- determine that at least one of additional data storage capacity and additional processing capacity are needed based on the data processing requests and the query response rates; and
- increase, in response to the determining, an amount of the data cached by the execution platform by adding one or more nodes to the plurality of nodes, wherein each of the one or more nodes added comprise at least one processor and at least one local cache.

Dependent claims: 2, 3, 4, 5, 6, 7, 8
9. A method comprising:
- controlling, by one or more processors, an execution platform comprising a plurality of nodes, each node thereof comprising at least one processor and at least one local cache, the execution platform configured to process database queries corresponding to database data stored by a plurality of shared storage devices independent from the execution platform;
- monitoring, by the one or more processors, the data processing requests that originate from a plurality of query sources independent from the execution platform, the data processing requests directed to the database data;
- monitoring, by the one or more processors, query response rates corresponding to the database data;
- determining, by the one or more processors, that at least one of additional data storage capacity and additional processing capacity are needed based on the data processing requests and the query response rates; and
- increasing, by the one or more processors, in response to the determining, an amount of the database data cached by the execution platform by adding one or more nodes to the plurality of nodes of the execution platform, wherein each of the one or more nodes added comprise at least one processor and at least one local cache.

Dependent claims: 10, 11, 12
13. Non-transitory computer readable storage media storing instructions that, when executed by one or more processors, cause the one or more processors to:
- allocate storage resources in a storage platform, the storage resources comprising a plurality of shared storage devices storing database data;
- allocate computing resources in an execution platform comprising a plurality of nodes, each node comprising at least one processor and at least one local cache, wherein the execution platform is independent from the storage resources and processes queries corresponding to database data stored by the plurality of shared storage devices of the storage platform;
- monitor data processing requests that originate from a plurality of query sources independent from the execution platform, the data processing requests directed to the database data;
- monitor query response rates corresponding to the database data;
- increase or decrease a number of nodes of the execution platform based on the data processing requests and the query response rates; and
- increase or decrease the storage resources in the storage platform based on the data processing requests and current data allocation in the plurality of shared storage devices;
wherein the number of nodes of the execution platform is independent from the amount of storage resources in the storage platform.

Dependent claims: 14, 15, 16, 17
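Claim 13's distinguishing feature is that compute and storage scale independently. A minimal sketch of that separation follows; the function names, thresholds, and per-device capacity are illustrative assumptions, not values from the patent.

```python
def scale_compute(num_nodes: int, pending_requests: int, avg_response_ms: float,
                  max_per_node: int = 100, max_ms: float = 200.0) -> int:
    """Increase or decrease the node count from request load and query
    response rates (thresholds are illustrative assumptions)."""
    if pending_requests > max_per_node * num_nodes or avg_response_ms > max_ms:
        return num_nodes + 1          # scale out: add a node
    if num_nodes > 1 and pending_requests < (max_per_node * (num_nodes - 1)) // 2:
        return num_nodes - 1          # scale in: remove a node
    return num_nodes

def scale_storage(num_devices: int, bytes_allocated: int,
                  bytes_per_device: int = 10**12) -> int:
    """Increase or decrease the shared storage devices from the current data
    allocation, independently of the compute decision above."""
    needed = -(-bytes_allocated // bytes_per_device)  # ceiling division
    return max(1, needed)
```

Neither function reads the other's inputs or outputs, mirroring the claim's requirement that the node count be independent of the amount of storage provisioned.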
Specification