AUDIO/VIDEO METHODS AND SYSTEMS
Abstract
Audio and/or video data is structurally and persistently associated with auxiliary sensor data (e.g., relating to acceleration, orientation or tilt) through use of a unitary data object, such as a modified MPEG file or data stream. In this form, different rendering devices can employ co-conveyed sensor data to alter the audio or video content. Such use of the sensor data may be personalized to different users, e.g., through preference data. For example, accelerometer data can be associated with video data, allowing some users to view a shake-stabilized version of a video, and other users to view the video with such motion artifacts undisturbed. In like fashion, camera parameters, such as focal plane distance, can be co-conveyed with audio/video content—allowing the volume to be diminished (or not, again depending on user preference) when a camera captures audio/video from a distant subject. Some arrangements employ multiple image sensors and/or multiple audio sensors—each also collecting auxiliary data. A great number of other features and arrangements are also detailed.
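As a rough illustration of the abstract's central idea, the following sketch packs video, audio, and sensor chunks into a single byte stream whose header structurally binds them together. The `UNIT` magic value, the chunk layout, and the JSON sensor encoding are all invented here for illustration; the patent itself contemplates containers such as a modified MPEG file or data stream.

```python
import json
import struct

def pack_unitary(video: bytes, audio: bytes, sensor: dict) -> bytes:
    """Concatenate three chunks behind a fixed header of chunk lengths,
    so the sensor data is co-conveyed with the audio/video in one object."""
    sensor_blob = json.dumps(sensor).encode("utf-8")
    header = struct.pack(">III", len(video), len(audio), len(sensor_blob))
    return b"UNIT" + header + video + audio + sensor_blob

def unpack_unitary(blob: bytes):
    """Recover the three structurally associated chunks from the single object."""
    assert blob[:4] == b"UNIT", "not a unitary data object (illustrative magic)"
    v_len, a_len, s_len = struct.unpack(">III", blob[4:16])
    video = blob[16:16 + v_len]
    audio = blob[16 + v_len:16 + v_len + a_len]
    sensor = json.loads(blob[16 + v_len + a_len:16 + v_len + a_len + s_len])
    return video, audio, sensor
```

Because the three chunks travel in one object, any receiver that can parse the container gets the sensor data "for free" alongside the media, which is what lets a downstream processor alter the audio or video using it.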
65 Citations
33 Claims
1. A method comprising the acts:
- receiving video information, and transforming the video information for representation in a video portion of a unitary data object;
- receiving audio information, and transforming the audio information for representation in an audio portion of the unitary data object;
- receiving sensor information, comprising at least one parameter relating to acceleration, orientation or tilt, and transforming the sensor information for representation in the unitary data object; and
- transmitting the unitary data object to a data receiver, or storing the unitary data object on a computer readable storage medium, so that the sensor information is structurally associated with the audio and video information by the unitary data object, and thereby adapted for use by a processor in altering the audio or video information.

Dependent claims: 2-10.

11. A method comprising the acts:
- receiving a unitary data object;
- recovering audio data from an audio portion of the unitary data object; and
- recovering sensor data from the audio portion of the unitary data object, the sensor data comprising at least one parameter relating to acceleration, orientation or tilt;
wherein at least one of said recovering acts is performed by a hardware processor.

Dependent claims: 12-14.

15. A method comprising the acts:
- receiving a unitary data object;
- recovering audio data from an audio portion of the unitary data object; and
- recovering camera data from the audio portion of the unitary data object, the camera data comprising at least one parameter relating to focus, zoom, aperture size, depth-of-field, exposure time, ISO setting, and/or focal depth;
wherein at least one of said recovering acts is performed by a hardware processor.

Dependent claims: 16-17.

18. A method comprising the acts:
- receiving a unitary data object;
- recovering both video data and sensor data from the unitary data object, the sensor data comprising at least one parameter relating to acceleration, orientation or tilt; and
- processing the video data in accordance with at least some of the sensor data, to yield altered video data for rendering to a user;
wherein at least one of said acts is performed by a hardware processor.

Dependent claims: 19-21.

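The processing step recited in claim 18 (altering recovered video in accordance with co-conveyed motion data) might look like the following minimal sketch. Frames are modeled as 1-D pixel rows, and the per-frame shake offsets, in pixels, are assumed to have already been derived from the accelerometer samples carried in the unitary data object; both simplifications are illustrative only.

```python
def shift(row, amount, fill=0):
    """Shift a pixel row right by `amount` (left if negative), padding with `fill`."""
    n = len(row)
    out = [fill] * n
    for i, px in enumerate(row):
        j = i + amount
        if 0 <= j < n:
            out[j] = px
    return out

def stabilize(frames, offsets, fill=0):
    """Counter-shift each frame by the negative of its measured shake offset."""
    return [shift(frame, -off, fill) for frame, off in zip(frames, offsets)]
```

Per the abstract, a rendering device could apply this counter-shift only for users whose preference data requests a shake-stabilized view, and leave the frames untouched for users who want the motion artifacts preserved.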
22. A computer readable storage medium containing non-transitory software instructions causing a processor programmed thereby to:
- recover both video data and sensor data from a received unitary data object, the sensor data comprising at least one parameter relating to acceleration, orientation or tilt; and
- process the video data in accordance with at least some of the sensor data, to yield altered video data for rendering to a user.

23. A method comprising the acts:
- receiving a unitary data object;
- recovering both video data and camera data from the unitary data object, the camera data comprising at least one parameter relating to focus, zoom, aperture size, depth-of-field, exposure time, ISO, and/or lens focal-length; and
- processing the video data in accordance with at least some of the camera data to yield altered video data for rendering to a user;
wherein at least one of said acts is performed by a hardware processor.

24. A method comprising the acts:
- receiving a unitary data object;
- recovering both audio data and sensor data from the unitary data object, the sensor data comprising at least one parameter relating to acceleration, orientation or tilt; and
- processing the audio data in accordance with at least some of the sensor data to yield altered audio data for rendering to a user;
wherein at least one of said acts is performed by a hardware processor.

25. A method comprising the acts:
- receiving a unitary data object;
- recovering both audio data and camera data from the unitary data object, the camera data comprising at least one parameter relating to focus, zoom, aperture size, depth-of-field, exposure time, ISO, and/or lens focal-length; and
- processing the audio data in accordance with at least some of the camera data to yield altered audio data for rendering to a user;
wherein at least one of said acts is performed by a hardware processor.

Dependent claims: 26.

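The processing step recited in claim 25 (altering recovered audio in accordance with co-conveyed camera data) could be sketched as a distance-dependent gain, echoing the abstract's example of diminishing volume when the camera captures a distant subject. The 1/d roll-off and the 2-meter reference distance are assumptions made here for illustration; the claims do not specify any particular attenuation law.

```python
def attenuate(samples, focus_distance_m, reference_m=2.0):
    """Scale audio samples down when the focused subject lies beyond the
    reference distance; leave them unchanged for near subjects."""
    gain = min(1.0, reference_m / focus_distance_m)
    return [s * gain for s in samples]
```

As with the video case, the abstract notes that whether the volume is actually diminished can depend on the individual user's preference data.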
27. A method comprising:
- collecting sensor information from a sensor conveyed by a movable object in a sporting event;
- collecting video information from the sporting event using a camera at a location remote from said movable object;
- creating a unitary data object that includes both data corresponding to the collected sensor information, and data corresponding to said collected video information; and
- storing the unitary data object in a computer readable storage medium, or transmitting the unitary data object to a data receiver.

Dependent claims: 28-29.

30. A mobile phone comprising a processor, and at least first and second sensors, the processor being configured to create a unitary data object including information sensed by the first sensor and information sensed by the second sensor, wherein the first sensor comprises an image or audio sensor, and the second sensor comprises an acceleration sensor, an orientation sensor, or a tilt sensor.
Specification