System for wireless push and pull based services
First Claim
1. A method for optimizing performance of at least one pull service and at least one push service to a plurality of mobile users comprising the steps of:
- reducing access latency for said at least one pull service running on at least one Web server to optimize said at least one pull service by prefetching documents into a cache of at least one proxy gateway by using factors of a frequency of access of said plurality of mobile users to said pull content of said pull service, an update cycle of said pull content, said update cycle being the average length of time between two successive expiration times or two successive modifications of said pull content, and a response delay for fetching said pull content from said at least one Web server to said at least one proxy gateway, said at least one proxy gateway being connected between said mobile user and said Web server; and
iteratively estimating a state of each of said plurality of mobile users for determining push content to be forwarded to said mobile user by said at least one push service running on said at least one Web server to optimize said at least one push service;
wherein said step of reducing access latency comprises the step of selecting a predetermined number of documents to be prefetched into the cache of a proxy gateway, and said step of selecting a predetermined number of documents uses said factors of:
said frequency of access, said update cycle and said response delay, wherein frequently accessed pull documents having a shorter update cycle and a longer response delay are prioritized for prefetching into said cache of said proxy gateway;
wherein said step of iteratively estimating a state of each of said plurality of mobile users is determined from tracking data of said plurality of mobile users and geo-location measurement and behavior observation data.
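The patent does not specify the estimator behind this iterative user-state step; as one hedged illustration, a simple exponentially weighted update could blend each new geo-location measurement and behavior observation into a running per-user state. All names, the smoothing rule, and the `alpha` parameter below are assumptions for illustration, not the claimed method:

```python
class UserStateEstimator:
    """Iteratively refine a per-user state estimate from new observations.

    Illustrative sketch only: each call to observe() blends a new
    geo-location measurement and a set of observed behavior topics into
    the running state via exponential smoothing with factor alpha.
    """

    def __init__(self, alpha: float = 0.3):
        self.alpha = alpha
        self.location = None   # (lat, lon) running estimate
        self.interests = {}    # topic -> weight derived from behavior data

    def observe(self, geo, behavior_topics):
        # Smooth the location estimate toward the newest measurement.
        if self.location is None:
            self.location = geo
        else:
            lat, lon = self.location
            self.location = (lat + self.alpha * (geo[0] - lat),
                             lon + self.alpha * (geo[1] - lon))
        # Reinforce the weight of each topic seen in this observation.
        for topic in behavior_topics:
            prev = self.interests.get(topic, 0.0)
            self.interests[topic] = prev * (1 - self.alpha) + self.alpha

    def push_topics(self, k: int = 3):
        """Topics most relevant for push content, best first."""
        return sorted(self.interests, key=self.interests.get, reverse=True)[:k]
```

The push service would then select content matching the top-ranked topics for each user's current estimated state.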
Abstract
The present invention relates to a method and system for providing Web content from pull- and push-based services running on Web content providers to mobile users. A proxy gateway connects the mobile users to the Web content providers. A prefetching module at the proxy gateway optimizes performance of the pull services by reducing average access latency. The average access latency can be reduced by using at least three factors: first, the frequency of access to the pull content; second, the update cycle of the pull content, determined by the Web content providers; and third, the response delay for fetching pull content from the content provider to the proxy gateway. Pull content, such as documents, having the greatest average access latency is sorted, and a predetermined number of the documents are prefetched into the cache. Push services are optimized by iteratively estimating a state of each mobile user to determine relevant push content to be forwarded to that user.
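The three-factor selection described above can be sketched as follows. The scoring formula is a hypothetical reading of the claim language (frequently accessed documents with a shorter update cycle and a longer response delay rank higher); the patent does not publish an explicit formula, and all identifiers here are illustrative:

```python
from dataclasses import dataclass

@dataclass
class Document:
    url: str
    access_freq: float     # accesses per unit time by the mobile-user population
    update_cycle: float    # avg time between successive modifications/expirations
    response_delay: float  # time to fetch from the Web server to the proxy gateway

def prefetch_score(doc: Document) -> float:
    # Hypothetical priority: higher access frequency and longer response
    # delay raise the score; a shorter update cycle also raises it,
    # mirroring how the claim combines the three factors.
    return doc.access_freq * doc.response_delay / doc.update_cycle

def select_for_prefetch(docs: list, k: int) -> list:
    """Pick the k highest-priority documents for the proxy-gateway cache."""
    return sorted(docs, key=prefetch_score, reverse=True)[:k]
```

Under this sketch, the gateway would rescore its candidate document set whenever access statistics or update cycles change, then prefetch the top k.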
Citations
10 Claims
1. A method for optimizing performance of at least one pull service and at least one push service to a plurality of mobile users comprising the steps of:
- reducing access latency for said at least one pull service running on at least one Web server to optimize said at least one pull service by prefetching documents into a cache of at least one proxy gateway by using factors of a frequency of access of said plurality of mobile users to said pull content of said pull service, an update cycle of said pull content, said update cycle being the average length of time between two successive expiration times or two successive modifications of said pull content, and a response delay for fetching said pull content from said at least one Web server to said at least one proxy gateway, said at least one proxy gateway being connected between said mobile user and said Web server; and
- iteratively estimating a state of each of said plurality of mobile users for determining push content to be forwarded to said mobile user by said at least one push service running on said at least one Web server to optimize said at least one push service;
wherein said step of reducing access latency comprises the step of selecting a predetermined number of documents to be prefetched into the cache of a proxy gateway, and said step of selecting a predetermined number of documents uses said factors of:
said frequency of access, said update cycle and said response delay, wherein frequently accessed pull documents having a shorter update cycle and a longer response delay are prioritized for prefetching into said cache of said proxy gateway;
wherein said step of iteratively estimating a state of each of said plurality of mobile users is determined from tracking data of said plurality of mobile users and geo-location measurement and behavior observation data. (Dependent claims: 2, 3, 4, 5, 6, 7)
8. A system for optimizing performance of at least one pull service and at least one push service to a plurality of mobile users comprising:
- means for reducing access latency for said at least one pull service running on at least one Web server by prefetching documents into a cache of a proxy gateway, said proxy gateway being connected between said mobile user and said pull service and push service, by using factors of a frequency of access of said plurality of mobile users to said pull content of said pull service, an update cycle of said pull content, said update cycle being the average length of time between two successive expiration times or two successive modifications of said pull content, and a response delay for fetching said pull content from said at least one Web server to said at least one proxy gateway; and
- means for iteratively estimating a state of each of said plurality of mobile users for determining push content to be forwarded to said mobile user by said at least one push service running on said at least one Web server to optimize said at least one push service;
wherein said pull content is a plurality of documents and said means for reducing access latency comprises:
means for selecting a predetermined number of documents to be prefetched into the cache of a proxy gateway, said selecting using said factors of:
said frequency of access, said update cycle and said response delay, wherein frequently accessed pull documents having a shorter update cycle and a longer response delay are prioritized for prefetching into said cache of said proxy gateway;
wherein said means for iteratively estimating a state of each of said plurality of mobile users uses tracking data of said plurality of mobile users and geo-location measurement and behavior observation data. (Dependent claims: 9, 10)
Specification