System for distributed data processing with automatic caching at various system levels
First Claim
1. A data processing system for distributed data processing with automatic caching at multiple system levels, the data processing system comprising:
a memory device with computer-readable program code stored thereon;
a communication device;
a processing device operatively coupled to the memory device and the communication device, wherein the processing device is configured to execute the computer-readable program code to:
access a master queue of data processing work comprising a plurality of data processing jobs stored in a long term memory cache;
select at least one of the plurality of data processing jobs from the master queue of data processing work;
push the selected data processing jobs to an interface layer, comprising:
accessing the selected data processing jobs from the long term memory cache;
saving the selected data processing jobs in an interface layer cache of data processing work;
pushing at least a first portion of the selected data processing jobs to a memory cache of a first user system associated with a first user for minimizing latency in user data processing of the pushed data processing jobs; and
pushing at least a second portion of the selected data processing jobs to the memory cache of the first user system before the first user system has finished processing the first portion of the selected data processing jobs.
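The claim above describes a three-level push pipeline: jobs move from a master queue in a long term memory cache, into an interface layer cache, and finally into the memory cache of a user system, with a second portion pushed before the first is done. The following is a minimal illustrative sketch of that flow; the class and variable names (`LongTermCache`, `InterfaceLayer`, `UserSystemCache`) are hypothetical and do not come from the patent itself.

```python
from collections import deque

class LongTermCache:
    """Holds the master queue of data processing work."""
    def __init__(self, jobs):
        self.master_queue = deque(jobs)

    def select(self, n):
        # Select at least one job from the master queue.
        count = min(n, len(self.master_queue))
        return [self.master_queue.popleft() for _ in range(count)]

class InterfaceLayer:
    """Intermediate cache between long-term storage and user systems."""
    def __init__(self):
        self.cache = []

    def push(self, jobs):
        # Save the selected jobs in the interface layer cache.
        self.cache.extend(jobs)

    def portion(self, k):
        # Hand off a portion of the cached jobs to a user system.
        portion, self.cache = self.cache[:k], self.cache[k:]
        return portion

class UserSystemCache:
    """Local memory cache on the first user's system."""
    def __init__(self):
        self.jobs = deque()

    def receive(self, jobs):
        self.jobs.extend(jobs)

# Flow: master queue -> interface layer cache -> user-system memory cache.
long_term = LongTermCache(["job%d" % i for i in range(6)])
interface = InterfaceLayer()
user = UserSystemCache()

interface.push(long_term.select(4))   # access jobs + save in interface cache
user.receive(interface.portion(2))    # push first portion to the user cache
user.receive(interface.portion(2))    # push second portion before the first
                                      # portion has finished processing
```

Keeping a local copy of upcoming jobs in the user system's own memory cache is what lets the claimed system minimize latency: the user system never has to reach back to the long term cache mid-stream.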
Abstract
Embodiments enable distributed data processing with automatic caching at multiple system levels by accessing a master queue of data processing work comprising a plurality of data processing jobs stored in a long term memory cache; selecting at least one of the plurality of data processing jobs from the master queue of data processing work; pushing the selected data processing jobs to an interface layer including (i) accessing the selected data processing jobs from the long term memory cache; and (ii) saving the selected data processing jobs in an interface layer cache of data processing work; and pushing at least a portion of the selected data processing jobs to a memory cache of a first user system for minimizing latency in user data processing of the pushed data processing jobs.
20 Claims
1. A data processing system for distributed data processing with automatic caching at multiple system levels, the data processing system comprising:
a memory device with computer-readable program code stored thereon;
a communication device;
a processing device operatively coupled to the memory device and the communication device, wherein the processing device is configured to execute the computer-readable program code to:
access a master queue of data processing work comprising a plurality of data processing jobs stored in a long term memory cache;
select at least one of the plurality of data processing jobs from the master queue of data processing work;
push the selected data processing jobs to an interface layer, comprising:
accessing the selected data processing jobs from the long term memory cache;
saving the selected data processing jobs in an interface layer cache of data processing work;
pushing at least a first portion of the selected data processing jobs to a memory cache of a first user system associated with a first user for minimizing latency in user data processing of the pushed data processing jobs; and
pushing at least a second portion of the selected data processing jobs to the memory cache of the first user system before the first user system has finished processing the first portion of the selected data processing jobs.
Dependent claims: 2–9
10. A computer program product for distributed data processing with automatic caching at multiple system levels, the computer program product comprising at least one non-transitory computer-readable medium having computer-readable program code portions embodied therein, the computer-readable program code portions comprising:
an executable portion configured for accessing a master queue of data processing work comprising a plurality of data processing jobs stored in a long term memory cache;
an executable portion configured for selecting at least one of the plurality of data processing jobs from the master queue of data processing work;
an executable portion configured for pushing the selected data processing jobs to an interface layer, comprising:
accessing the selected data processing jobs from the long term memory cache;
saving the selected data processing jobs in an interface layer cache of data processing work;
pushing at least a first portion of the selected data processing jobs to a memory cache of a first user system associated with a first user for minimizing latency in user data processing of the pushed data processing jobs by the first user system; and
pushing at least a second portion of the selected data processing jobs to the memory cache of the first user system before the first user system has finished processing the first portion of the selected data processing jobs.
Dependent claims: 11–18
19. A computer-implemented method for distributed data processing with automatic caching at multiple system levels, the method comprising:
accessing a master queue of data processing work comprising a plurality of data processing jobs stored in a long term memory cache;
selecting at least one of the plurality of data processing jobs from the master queue of data processing work;
pushing the selected data processing jobs to an interface layer, comprising:
accessing the selected data processing jobs from the long term memory cache; and
saving the selected data processing jobs in an interface layer cache of data processing work;
pushing at least a first portion of the selected data processing jobs to a memory cache of a first user system associated with a first user for minimizing latency in user data processing of the pushed data processing jobs by the first user system; and
pushing at least a second portion of the selected data processing jobs to the memory cache of the first user system before the first user system has finished processing the first portion of the selected data processing jobs.
Dependent claim: 20
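The distinguishing step of the method claim is pushing a second portion of jobs to the user-system memory cache before the first portion has finished processing, so that processing and transfer overlap. The sketch below illustrates that overlap with a queue standing in for the user system's memory cache; all names and the simulated delay are illustrative assumptions, not details from the patent.

```python
import queue
import threading
import time

user_cache = queue.Queue()   # stands in for the user system's memory cache
processed = []

def user_system():
    # The user system drains jobs from its local memory cache.
    for _ in range(4):
        job = user_cache.get()
        time.sleep(0.01)          # simulated per-job processing latency
        processed.append(job)

worker = threading.Thread(target=user_system)
worker.start()

user_cache.put("job-0")
user_cache.put("job-1")           # first portion pushed to the user cache
# The second portion is pushed immediately, before the first portion has
# finished processing, so the user system never idles waiting on the
# interface layer.
user_cache.put("job-2")
user_cache.put("job-3")
worker.join()
```

Because the second portion is already resident in local memory when the first portion completes, the hand-off between portions adds no round-trip latency, which is the effect the claim attributes to multi-level caching.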
Specification