Opportunistic block transmission with time constraints
First Claim
1. A system comprising:
- a virtual execution environment;
- a block granularity caching engine;
- a cache;
- wherein, in operation:
- a process associated with a stream-enabled application is executed in the virtual execution environment;
- the virtual execution environment intercepts a resource request for a resource from the process executing in the virtual execution environment;
- the virtual execution environment identifies one or more blocks that are associated with the resource;
- the virtual execution environment makes a block request requesting a block associated with the resource;
- the block granularity caching engine makes the block request available to a streaming server so as to prefetch a predictively streamed block based on the block request, wherein the predictively streamed block is ordered in a priority queue by probability, based at least in part on the requested block, that the predictively streamed block will be requested;
- the block granularity caching engine receives blocks of the stream-enabled application, including the predictively streamed block and the requested block, within an interactivity threshold;
- the block granularity caching engine checks the cache for blocks to satisfy the block request;
- the block granularity caching engine provides the predictively streamed block to the virtual execution environment if the predictively streamed block is found in the cache;
- the virtual execution environment satisfies the resource request of the process using the predictively streamed block.
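The client-side flow recited above can be sketched in code: a resource request is resolved at block granularity, missing blocks are fetched from the server, and any predictively streamed blocks that arrive alongside them are cached for later requests. This is a minimal illustrative sketch, not the patented implementation; the names `BlockCache`, `satisfy_resource_request`, and `fetch_blocks` are hypothetical.

```python
class BlockCache:
    """Block-id -> block-data store standing in for the claim's cache."""

    def __init__(self):
        self._blocks = {}

    def get(self, block_id):
        return self._blocks.get(block_id)

    def put(self, block_id, data):
        self._blocks[block_id] = data


def satisfy_resource_request(resource, resource_to_blocks, cache, fetch_blocks):
    """Resolve a resource request at block granularity.

    resource_to_blocks maps a resource name to the block ids that back it;
    fetch_blocks asks the streaming server for the missing block ids and may
    return extra, predictively streamed blocks alongside the requested ones.
    """
    needed = resource_to_blocks[resource]
    missing = [b for b in needed if cache.get(b) is None]
    if missing:
        for block_id, data in fetch_blocks(missing).items():
            cache.put(block_id, data)  # cache predicted blocks too
    return [cache.get(b) for b in needed]
```

Because predicted blocks land in the same cache as requested ones, a later resource request that maps to an already-streamed block is satisfied without a round trip to the server.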
Abstract
A technique for determining a data window size allows a set of predicted blocks to be transmitted along with requested blocks. A stream-enabled application executing in a virtual execution environment may use the blocks when needed.
21 Claims
5. A method comprising:
- executing a process associated with a stream-enabled application that provides a first request requesting a block for resources, wherein a streaming server provides a first predictively streamed block using a priority queue of predictively streamed blocks ordered by probability based at least in part on the first request, wherein the first predictively streamed block is first in the priority queue;
- receiving, within a data window, one or more blocks including resources used to satisfy the first request for resources as well as the first predictively streamed block;
- receiving the first predictively streamed block and the requested block in an amount of time that is less than or equal to an interactivity threshold;
- storing the first predictively streamed block in a cache;
- providing a second request for resources;
- checking the cache to find the first predictively streamed block to satisfy the second request for resources;
- at least partially satisfying the second request for resources using the first predictively streamed block in the cache.
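Two pieces of the method above lend themselves to a short sketch: checking that delivery of the requested and predictively streamed blocks fits the interactivity threshold, and splitting a later request into cache hits plus a residual to fetch. This is an illustrative sketch under assumed names (`receive_blocks`, `satisfy_from_cache`), not the patent's implementation.

```python
import time


def receive_blocks(fetch, interactivity_threshold_s):
    """Fetch requested plus predictively streamed blocks and report whether
    delivery stayed within the interactivity threshold."""
    start = time.monotonic()
    blocks = fetch()
    elapsed = time.monotonic() - start
    return blocks, elapsed <= interactivity_threshold_s


def satisfy_from_cache(request_ids, cache):
    """Split a second request into cache hits and a residual to fetch, so the
    request can be at least partially satisfied from the cache."""
    hits = {b: cache[b] for b in request_ids if b in cache}
    misses = [b for b in request_ids if b not in cache]
    return hits, misses
```

A second request that overlaps the first predictively streamed block is then partially satisfied locally, and only the misses travel to the streaming server.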
7. A system comprising:
- blocks of a stream-enabled application embodied in a computer-readable medium;
- an output buffer, wherein, in operation, the output buffer stores one or more of the blocks of the stream-enabled application;
- a data window engine, wherein, in operation, the data window engine determines a data window size using an interactivity threshold to limit predicted block aggregation;
- a predicted block aggregation engine, wherein, in operation, the predicted block aggregation engine orders predicted blocks by probability, based on a requested block, in a priority queue and stores a set of predicted blocks from the front of the priority queue in the output buffer until or before the size of the output buffer reaches the data window size;
- an interface, wherein, in operation, the interface sends the one or more blocks, a requested block, and the set of predicted blocks in the output buffer.
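The data window engine and predicted block aggregation engine recited above can be sketched as two functions: one that converts the interactivity threshold into a byte budget, and one that drains a probability-ordered queue into the output buffer until the budget is reached. This is a hedged sketch; the sizing formula (remaining time budget times bandwidth) and the names are assumptions for illustration, not the claimed formula.

```python
def data_window_size(interactivity_threshold_s, latency_s, bandwidth_bytes_per_s):
    """Bytes that can be sent after one round trip while staying within the
    interactivity threshold (remaining time budget times bandwidth)."""
    budget_s = max(0.0, interactivity_threshold_s - latency_s)
    return int(budget_s * bandwidth_bytes_per_s)


def aggregate_predicted(ordered_predictions, requested_size, window_size, block_size):
    """Take predicted blocks from the front of a probability-ordered queue
    until adding another would push the output buffer past the window size."""
    chosen = []
    used = requested_size  # the requested block is always sent
    for block_id in ordered_predictions:  # highest probability first
        if used + block_size > window_size:
            break
        chosen.append(block_id)
        used += block_size
    return chosen
```

The effect is that prediction is opportunistic: a tight interactivity threshold or high latency shrinks the window and fewer predicted blocks ride along with the requested one.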
11. A method comprising:
- predicting one or more blocks that are associated with a request requesting a block for resources of a stream-enabled application;
- ordering in a priority queue the one or more predicted blocks based at least in part on the requested block;
- adding a set of predicted blocks from the front of the priority queue to an output buffer until the output buffer has reached a data window size;
- continuously updating the data window size using a continuously updated value for latency;
- transmitting, within an interactivity threshold, the set of predicted blocks and the requested block in the output buffer.
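The "continuously updated value for latency" in the method above could, for example, be maintained as a running estimate over observed round trips, with the data window recomputed from it before each transmission. The exponentially weighted moving average below is one plausible estimator chosen for illustration; the patent does not specify this particular scheme, and all names are hypothetical.

```python
class LatencyTracker:
    """Continuously updated latency estimate via an exponentially weighted
    moving average of observed round-trip samples."""

    def __init__(self, alpha=0.2, initial_s=0.1):
        self.alpha = alpha
        self.latency_s = initial_s

    def observe(self, sample_s):
        self.latency_s = self.alpha * sample_s + (1 - self.alpha) * self.latency_s
        return self.latency_s


def updated_window_size(tracker, interactivity_threshold_s, bandwidth_bytes_per_s):
    """Recompute the data window from the current latency estimate."""
    budget_s = max(0.0, interactivity_threshold_s - tracker.latency_s)
    return int(budget_s * bandwidth_bytes_per_s)
```

As measured latency rises, the window shrinks so the requested block plus its predicted companions still arrive within the interactivity threshold.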
16. A system comprising:
- a streaming playback device;
- a streaming server;
- wherein, in operation, the streaming server:
- adds blocks of the stream-enabled application to an output buffer;
- predicts one or more blocks that will be needed by the streaming playback device based on a request requesting a block;
- orders the one or more blocks, by probability based on the requested block, in a priority queue;
- adds a set of predicted blocks from the front of the priority queue to the output buffer, up to a limit of a data window size set in accordance with an interactivity threshold limiting the amount of time for transmission of blocks to preserve a quality of execution of the stream-enabled application;
- provides the contents of the output buffer, including the requested block and the set of predicted blocks, to the streaming playback device.
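The server-side steps recited above (predict, order by probability in a priority queue, fill the output buffer up to the data window) can be sketched end to end with a standard binary heap. This is a minimal sketch assuming per-block sizes are known and using the hypothetical name `build_response`; it illustrates the claimed ordering, not the patent's actual code.

```python
import heapq


def build_response(requested_block, predictions, sizes, window_size):
    """Fill the output buffer with the requested block, then predicted blocks
    popped from a max-priority queue, up to the data window size in bytes.

    predictions maps block id -> probability the playback device will need it;
    sizes maps block id -> size in bytes.
    """
    # heapq is a min-heap, so negate probabilities to pop the most likely first.
    heap = [(-prob, block_id) for block_id, prob in predictions.items()]
    heapq.heapify(heap)
    buffer = [requested_block]
    used = sizes[requested_block]
    while heap:
        _, block_id = heapq.heappop(heap)
        if used + sizes[block_id] > window_size:
            break
        buffer.append(block_id)
        used += sizes[block_id]
    return buffer
```

The playback device then receives the requested block and its most probable successors in a single window-bounded transmission, preserving interactivity while opportunistically warming its cache.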
Specification