
Predictive memory caching for media-on-demand systems

  • US 5,815,662 A
  • Filed: 08/15/1996
  • Issued: 09/29/1998
  • Est. Priority Date: 08/15/1995
  • Status: Expired due to Fees
First Claim

1. A system for predictive memory caching for a media-on-demand network connected to a plurality of clients, comprising:

  • (a) a network server for handling requests from multiple clients for streaming media data for media programs provided by said system, said server being provided with a memory buffer;

  • (b) a data storage device associated with said network server for storing media data for media programs provided by said system;

  • (c) a server scheduler for allocating a plurality of sections of the memory buffer for requests for a given media program, wherein each buffer section can store one of a series of data blocks of predetermined size into which the media program is divided;

  • (d) said server scheduler having first process program means which, upon receiving requests from one or more clients constituting a first group of network clients for the media program within a first predetermined time interval TI1, establishes a first streaming data process for sending the data blocks of the requested media program to the first group of clients, including process program means for checking for and, otherwise, for reading in a first data block of the requested media program from the data storage device in the server's memory buffer and for sending it to the first group of clients;

  • (e) said server scheduler having second process program means for sending a current data block to a current group of clients, and for checking for and, otherwise, for reading in a next data block of the media program in the server's memory buffer;

  • (f) said server scheduler having third process program means for checking if requests for the same media program have been made by another group of clients close enough in time to the first time interval TI1 that the first data block of the large data file remains in memory, and for establishing a streaming data process for such group and sending the first data block to such group;

  • (g) said server scheduler having fourth process program means for checking if the allocated sections of the memory buffer are full and, if so, for removing an oldest-in-time, lowest priority data block to free a section of the memory buffer;

    wherein the operations of said second, third, and fourth process program means are performed iteratively until all data blocks of the requested large data file have been sent to the groups of clients for which the streaming data processes were established; and

    wherein said network server employs a microprocessor having a capacity for addressing up to 4 gigabytes of random access memory for the memory buffer which is capable of handling multiple accesses by a multiplicity of clients for "Top 10" video-on-demand service during Prime Time hours.
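
Read as an algorithm, the claim describes a block cache plus a scheduler loop: client groups whose requests arrive close together in time share cached blocks of the same program, the next block is read ahead into the buffer while the current one is sent, and a full buffer frees a section by evicting its oldest-in-time, lowest-priority block. The Python sketch below is a minimal, hypothetical rendering of that behaviour, not the patented implementation; every identifier in it (BlockCache, Scheduler, num_sections, the toy program "movie-1") is invented for illustration, and the grouping of requests arriving within the interval TI1 is simplified to one call per client group.

    # Illustrative sketch only -- not the patented implementation.
    import time
    from collections import OrderedDict

    class BlockCache:
        """Memory buffer divided into sections, each holding one data block."""
        def __init__(self, num_sections):
            self.num_sections = num_sections
            # (program, block_no) -> (priority, load_time, data)
            self.sections = OrderedDict()

        def get(self, program, block_no):
            return self.sections.get((program, block_no))

        def put(self, program, block_no, data, priority=0):
            # Fourth process means: when every section is full, remove the
            # lowest-priority, oldest-in-time block to free a section.
            if len(self.sections) >= self.num_sections:
                victim = min(self.sections,
                             key=lambda k: (self.sections[k][0], self.sections[k][1]))
                del self.sections[victim]
            self.sections[(program, block_no)] = (priority, time.monotonic(), data)

    class Scheduler:
        def __init__(self, storage, cache):
            self.storage = storage      # callable: (program, block_no) -> bytes or None
            self.cache = cache
            self.streams = []           # [program, client group, next block number]

        def _fetch(self, program, block_no):
            # Check the memory buffer first; otherwise read the block in
            # from the data storage device and cache it.
            hit = self.cache.get(program, block_no)
            if hit is not None:
                return hit[2]
            data = self.storage(program, block_no)
            if data is not None:
                self.cache.put(program, block_no, data)
            return data

        def request(self, program, clients):
            # First and third process means: start a streaming data process
            # for this group; if the group arrived close enough in time to an
            # earlier one, block 0 is still in the buffer and is reused.
            first = self._fetch(program, 0)
            if first is not None:
                self.send(clients, first)
                self.streams.append([program, list(clients), 1])

        def tick(self):
            # Second process means: send the current block to each group and
            # read the next block ahead; iterate until all blocks are sent.
            active = []
            for program, group, block_no in self.streams:
                data = self._fetch(program, block_no)
                if data is None:
                    continue                    # past the last block, stream ends
                self.send(group, data)
                self._fetch(program, block_no + 1)   # prefetch the next block
                active.append([program, group, block_no + 1])
            self.streams = active

        @staticmethod
        def send(clients, data):
            for client in clients:
                print(f"send {len(data)} bytes to {client}")

    # Toy usage: two groups request the same 3-block program; the second
    # group's first block is served from the buffer rather than storage.
    if __name__ == "__main__":
        BLOCKS = {("movie-1", i): bytes(1024) for i in range(3)}
        sched = Scheduler(storage=lambda p, n: BLOCKS.get((p, n)),
                          cache=BlockCache(num_sections=8))
        sched.request("movie-1", ["client-A", "client-B"])
        sched.request("movie-1", ["client-C"])
        for _ in range(4):
            sched.tick()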
