OBJECT ORIENTED VIDEO SYSTEM
First Claim
1. A method of generating an object oriented interactive multimedia file, including:
encoding data comprising at least one of video, text, audio, music and/or graphics elements as a video packet stream, text packet stream, audio packet stream, music packet stream and/or graphics packet stream respectively;
combining said packet streams into a single self-contained object, said object containing its own control information;
placing a plurality of said objects in a data stream; and
grouping one or more of said data streams in a single contiguous self-contained scene, said scene including format definition as the initial packet in a sequence of packets.
Abstract
A method of generating an object oriented interactive multimedia file, including encoding data comprising at least one of video, text, audio, music and/or graphics elements as a video packet stream, text packet stream, audio packet stream, music packet stream and/or graphics packet stream respectively, combining the packet streams into a single self-contained object, said object containing its own control information, placing a plurality of the objects in a data stream, and grouping one or more of the data streams in a single contiguous self-contained scene, the scene including format definition as the initial packet in a sequence of packets. An encoder for executing the method is provided together with a player or decoder for parsing and decoding the file, which can be wirelessly streamed to a portable computer device, such as a mobile phone or a PDA. The object controls provide rendering and interactive controls for objects allowing users to control dynamic media composition, such as dictating the shape and content of interleaved video objects, and control the objects received.
167 Claims
1. A method of generating an object oriented interactive multimedia file, including:
encoding data comprising at least one of video, text, audio, music and/or graphics elements as a video packet stream, text packet stream, audio packet stream, music packet stream and/or graphics packet stream respectively;
combining said packet streams into a single self-contained object, said object containing its own control information;
placing a plurality of said objects in a data stream; and
grouping one or more of said data streams in a single contiguous self-contained scene, said scene including format definition as the initial packet in a sequence of packets. - View Dependent Claims (2, 3, 4, 5, 6, 8, 135, 136, 141, 142, 143, 144, 147, 154, 156, 157)
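The packing order recited in claim 1 (packet streams combined into a self-contained object, objects placed in a data stream, the stream grouped into a scene whose initial packet is the format definition) can be sketched as follows. This is a hypothetical illustration only: the tag values and length-prefixed packet layout are assumptions, not a format the patent defines.

```python
import struct

# Assumed tag values; the claim itself fixes no wire format.
FORMAT_DEFN, OBJECT_PKT, VIDEO_PKT, AUDIO_PKT = 0x01, 0x02, 0x10, 0x11

def packet(tag: int, payload: bytes) -> bytes:
    """One packet: a tag byte plus a length-prefixed payload."""
    return struct.pack(">BI", tag, len(payload)) + payload

def make_object(control: bytes, *packet_streams: bytes) -> bytes:
    """Combine packet streams into a single self-contained object that
    carries its own control information."""
    return packet(OBJECT_PKT, control + b"".join(packet_streams))

def make_scene(format_defn: bytes, objects: list) -> bytes:
    """A scene: the format definition as the initial packet, followed by
    the data stream holding a plurality of objects."""
    return packet(FORMAT_DEFN, format_defn) + b"".join(objects)

obj = make_object(b"\x00",
                  packet(VIDEO_PKT, b"frames"),
                  packet(AUDIO_PKT, b"samples"))
scene = make_scene(b"v1", [obj, obj])
assert scene[0] == FORMAT_DEFN  # format definition leads the scene
```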
7. A method of generating an object oriented interactive multimedia file, including:
encoding data comprising at least one of video and audio elements as a video packet stream and audio packet stream respectively;
combining said packet streams into a single self-contained object;
placing said object in a data stream;
placing said stream in a single contiguous self-contained scene, said scene including format definition; and
combining a plurality of said scenes. - View Dependent Claims (9, 10)
11. An interactive multimedia file format comprising single objects containing video, text, audio, music, and/or graphical data wherein at least one of said objects comprises a data stream, and at least one of said data streams comprises a scene, at least one of said scenes comprises a file, and wherein directory data and metadata provide file information.
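The containment hierarchy of claim 11 (objects within data streams, streams within scenes, scenes within a file, with directory data and metadata describing the file) might be modelled in memory as below. All class and field names are illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class MediaObject:
    kind: str       # "video", "text", "audio", "music" or "graphics"
    payload: bytes

@dataclass
class DataStream:
    objects: List[MediaObject] = field(default_factory=list)

@dataclass
class Scene:
    streams: List[DataStream] = field(default_factory=list)

@dataclass
class MultimediaFile:
    scenes: List[Scene] = field(default_factory=list)
    directory: Dict[str, int] = field(default_factory=dict)  # scene name -> index
    metadata: Dict[str, str] = field(default_factory=dict)   # e.g. title, author

clip = MediaObject("video", b"frames")
f = MultimediaFile(scenes=[Scene([DataStream([clip])])],
                   directory={"intro": 0},
                   metadata={"title": "demo"})
assert f.scenes[f.directory["intro"]].streams[0].objects[0].kind == "video"
```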
12. A system for dynamically changing the actual content of a displayed video in an object-oriented interactive video system comprising:
a dynamic media composition process including an interactive multimedia file format including objects containing video, text, audio, music, and/or graphical data wherein at least one of said objects comprises a data stream, at least one of said data streams comprises a scene, at least one of said scenes comprises a file;
a directory data structure for providing file information;
selecting mechanism for allowing the correct combination of objects to be composited together;
a data stream manager for using directory information and knowing the location of said objects based on said directory information; and
control mechanism for inserting, deleting, or replacing in real time while being viewed by a user, said objects in said scene and said scenes in said video. - View Dependent Claims (13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 116)
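The control mechanism of claim 12 (inserting, deleting, or replacing objects in a scene in real time while the video is viewed) reduces to mutation of the currently rendered object list. A minimal sketch, with invented object names:

```python
# Hypothetical sketch: a scene holds an ordered list of objects, and
# insert/delete/replace mutate it while the render loop keeps reading
# the current list - this is the dynamic media composition effect.
class SceneComposition:
    def __init__(self, objects):
        self.objects = list(objects)

    def insert(self, index, obj):
        self.objects.insert(index, obj)

    def delete(self, obj):
        self.objects.remove(obj)

    def replace(self, old, new):
        self.objects[self.objects.index(old)] = new

scene = SceneComposition(["intro", "ad_a"])
scene.replace("ad_a", "ad_b")   # swap one video object for another mid-view
scene.insert(0, "logo")
assert scene.objects == ["logo", "intro", "ad_b"]
```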
23. An object oriented interactive multimedia file, comprising:
a combination of one or more of contiguous self-contained scenes, each said scene comprising scene format definition as the first packet, and a group of one or more data streams following said first packet;
each said data stream apart from first data stream containing objects which may be optionally decoded and displayed according to a dynamic media composition process as specified by object control information in said first data stream; and
each said data stream including one or more single self-contained objects and demarcated by an end stream marker;
said objects each containing its own control information and formed by combining packet streams;
said packet streams formed by encoding raw interactive multimedia data including at least one or a combination of video, text, audio, music, or graphics elements as a video packet stream, text packet stream, audio packet stream, music packet stream and graphics packet stream respectively. - View Dependent Claims (24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34)
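Parsing a scene laid out as in claim 23, where each data stream is demarcated by an end stream marker, could look like this sketch. The one-byte marker value is an assumption; the claim names the marker but not its encoding.

```python
END_STREAM = b"\xff"  # assumed marker byte; the claim does not specify one

def split_streams(scene_body: bytes) -> list:
    """Split a scene body into its data streams, each demarcated by an
    end-stream marker."""
    return [part for part in scene_body.split(END_STREAM) if part]

streams = split_streams(b"objA1objA2" + END_STREAM + b"objB1" + END_STREAM)
assert streams == [b"objA1objA2", b"objB1"]
```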
35. A method of providing a voice command operation of a low power device capable of operating in a streaming video system, comprising the following steps:
capturing a user's speech on said device;
compressing said speech;
inserting encoded samples of said compressed speech into user control packets;
sending said compressed speech to a server capable of processing voice commands;
said server performs automatic speech recognition;
said server maps the transcribed speech to a command set;
said system checks whether said command is generated by said user or said server;
if said transcribed command is from said server, said server executes said command;
if said transcribed command is from said user, said system forwards said command to said user device; and
said user executes said command. - View Dependent Claims (36)
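The branch in claim 35 (the server executes server-generated commands itself, and forwards user-generated ones to the device) can be sketched as a routing function. The two command sets here are invented for illustration; the claim does not enumerate a command set.

```python
# Assumed command sets for illustration only.
SERVER_COMMANDS = {"search", "list_videos"}       # executed on the server
DEVICE_COMMANDS = {"play", "pause", "volume_up"}  # forwarded to the device

def route_command(command: str) -> str:
    """Decide where a transcribed voice command executes."""
    if command in SERVER_COMMANDS:
        return "server"   # server executes the command itself
    if command in DEVICE_COMMANDS:
        return "device"   # system forwards the command to the user device
    raise ValueError(f"unrecognised command: {command}")

assert route_command("search") == "server"
assert route_command("pause") == "device"
```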
37. A method of processing objects, comprising the steps of:
parsing information in a script language;
reading a plurality of data sources containing a plurality of objects in the form of at least one of video, graphics, animation, and audio;
attaching control information to the plurality of objects based on the information in the script language; and
interleaving the plurality of objects into at least one of a data stream and a file. - View Dependent Claims (38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 113, 114, 115)
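The steps of claim 37 (parse a script, read objects from multiple sources, attach control information, interleave into one stream) can be sketched as below. The `name = value` script syntax and round-robin interleaving policy are assumptions; the claim specifies neither.

```python
# Hypothetical sketch: a tiny "script language" of name=value lines
# drives which control info is attached to each object before the
# objects are interleaved into a single stream.
def parse_script(script: str) -> dict:
    pairs = (line.split("=", 1) for line in script.splitlines() if "=" in line)
    return {k.strip(): v.strip() for k, v in pairs}

def interleave(sources, controls):
    """Attach control info to each object, then round-robin interleave
    one object from each source per round (sources of equal length)."""
    tagged = [[(controls, obj) for obj in src] for src in sources]
    stream = []
    for group in zip(*tagged):
        stream.extend(group)
    return stream

controls = parse_script("loop = true\nlayer = 2")
stream = interleave([["v0", "v1"], ["a0", "a1"]], controls)
assert [obj for _, obj in stream] == ["v0", "a0", "v1", "a1"]
```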
71. A system for processing objects, comprising:
means for parsing information in a script language;
means for reading a plurality of data sources containing a plurality of objects in the form of at least one of video, graphics, animation, and audio;
means for attaching control information to the plurality of objects based on the information in the script language; and
means for interleaving the plurality of objects into at least one of a data stream and a file. - View Dependent Claims (72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104)
105. A method of transmitting an electronic greeting card, comprising the steps of:
inputting information indicating features of a greeting card;
generating image information corresponding to the greeting card;
encoding the image information as an object having control information;
transmitting the object having the control information over a wireless connection;
receiving the object having the control information by a wireless hand-held computing device;
decoding the object having the control information into a greeting card image by the wireless hand-held computing device; and
displaying the greeting card image which has been decoded on the hand-held computing device. - View Dependent Claims (106)
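An end-to-end sketch of the claim 105 pipeline: card features become image data, which is wrapped with control information, compressed for the wireless link, and decoded back into a displayable card on the device. The JSON/zlib encoding and the string stand-in for image rendering are assumptions, not the patent's codec.

```python
import json
import zlib

def encode_card(features: dict) -> bytes:
    """Generate image info for the card and encode it as an object
    carrying its own control information."""
    image = f"card[{features['style']}]:{features['message']}"  # stand-in render
    obj = {"control": {"type": "greeting_card"}, "image": image}
    return zlib.compress(json.dumps(obj).encode())

def decode_card(wire: bytes) -> str:
    """Decode the received object back into the greeting card image."""
    return json.loads(zlib.decompress(wire))["image"]

card = decode_card(encode_card({"style": "birthday",
                                "message": "Happy Birthday!"}))
assert card == "card[birthday]:Happy Birthday!"
```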
107. A system for transmitting an electronic greeting card, comprising:
means for inputting information indicating features of a greeting card;
means for generating image information corresponding to the greeting card;
means for encoding the image information as an object having control information;
means for transmitting the object having the control information over a wireless connection;
means for receiving the object having the control information by a wireless hand-held computing device;
means for decoding the object having the control information into a greeting card image by the wireless hand-held computing device; and
means for displaying the greeting card image which has been decoded on the hand-held computing device. - View Dependent Claims (108)
109. An object oriented multimedia video system capable of supporting multiple arbitrary shaped video objects without the need for extra data overhead or processing overhead to provide video object shape information.
117. A video encoding method, including:
encoding video data with object control data as a video object; and
generating a data stream including a plurality of video objects with respective video data and object control data. - View Dependent Claims (118, 119, 120, 121, 122, 123, 124, 125, 129, 130, 137, 138, 139, 140, 145, 146, 148, 149, 150, 151, 152, 153, 163, 164, 165, 166, 167)
126. A video encoding method, including:
quantising colour data in a video stream based on a reduced representation of colours;
generating encoded video frame data representing said quantised colours and transparent regions; and
generating encoded audio data and object control data for transmission with said encoded video data as a video object. - View Dependent Claims (127, 128, 131, 132, 133)
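The colour quantisation step of claim 126 (representing a frame's colours by a reduced set) can be sketched as nearest-palette mapping. Squared RGB distance is used here as the nearest-colour criterion; the claim does not specify the distance measure.

```python
# Hypothetical sketch: snap each pixel to the nearest colour in a
# reduced palette, by squared RGB distance (an assumption).
def quantise(frame, palette):
    def nearest(px):
        return min(palette,
                   key=lambda c: sum((a - b) ** 2 for a, b in zip(px, c)))
    return [nearest(px) for px in frame]

palette = [(0, 0, 0), (255, 255, 255)]       # reduced representation
frame = [(10, 10, 10), (250, 240, 245)]      # one row of pixels
assert quantise(frame, palette) == [(0, 0, 0), (255, 255, 255)]
```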
134. A video encoding method, including:
(i) selecting a reduced set of colours for each video frame of video data;
(ii) reconciling colours from frame to frame;
(iii) executing motion compensation;
(iv) determining update areas of a frame based on a perceptual colour difference measure;
(v) encoding video data for said frames into video objects based on steps (i) to (iv); and
(vi) including in each video object animation, rendering and dynamic composition controls.
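Step (iv) above, determining update areas from a perceptual colour difference measure, might look like the following sketch. Plain Euclidean RGB distance and the threshold value stand in for a true perceptual measure (such as a CIELAB delta-E), which the claim names but does not define.

```python
# Hypothetical sketch of step (iv): mark a block of the frame for update
# when its colour difference from the previous frame exceeds a threshold.
def colour_diff(a, b):
    """Euclidean RGB distance - a stand-in for a perceptual measure."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def update_areas(prev, curr, threshold=20.0):
    """Return indices of blocks whose colour changed perceptibly."""
    return [i for i, (p, c) in enumerate(zip(prev, curr))
            if colour_diff(p, c) > threshold]

prev = [(0, 0, 0), (100, 100, 100), (40, 40, 40)]
curr = [(0, 0, 0), (160, 160, 160), (42, 41, 40)]
assert update_areas(prev, curr) == [1]  # only block 1 changed perceptibly
```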
155. A video decoding method as claimed in claim 135, including processing voice commands from a user to control a video display generated on the basis of said video objects.
158. A wireless streaming video and animation system, including:
(i) a portable monitor device and first wireless communication means;
(ii) a server for storing compressed digital video and computer animations and enabling a user to browse and select digital video to view from a library of available videos; and
(iii) at least one interface module incorporating a second wireless communication means for transmission of transmittable data from the server to the portable monitor device, the portable monitor device including means for receiving said transmittable data, converting the transmittable data to video images, displaying the video images, and permitting the user to communicate with the server to interactively browse and select a video to view. - View Dependent Claims (159)
160. A method of providing wireless streaming of video and animation including at least one of the steps of:
(a) downloading and storing compressed video and animation data from a remote server over a wide area network for later transmission from a local server;
(b) permitting a user to browse and select digital video data to view from a library of video data stored on the local server;
(c) transmitting the data to a portable monitor device; and
(d) processing the data to display the image on the portable monitor device.
161. A method of providing an interactive video brochure including at least one of the steps of:
(a) creating a video brochure by:
(i) specifying the various scenes in the brochure and the various video objects that may occur within each scene;
(ii) specifying the preset and user selectable scene navigational controls and the individual composition rules for each scene;
(iii) specifying rendering parameters on media objects;
(iv) specifying controls on media objects to create forms to collect user feedback;
(v) integrating the compressed media streams and object control information into a composite data stream. - View Dependent Claims (162)
Specification