Method, system and computer program product for distributed video editing
Abstract
A network editor comprises a central location with stored videos such as movies that can be edited by editors at remote locations. An editor receives a representation of a video and specifies edits relative to the representation, enabling the editor to use a device lacking sufficient processing capability to edit the video directly, and also reducing the volume of information transmitted between the central location and the remote editor. The central location is able to provide the edited movie in a format suitable to the display capabilities of the viewing device of the viewer requesting the edited video.
21 Claims
1. A system of distributed non-linear video editing, comprising:
a processor; and
a non-transitory computer readable medium storing instructions translatable by the processor to:
generate, by a transcoder, a texture strip for visually representing a plurality of frames of a video corresponding to moving image data over time, wherein the texture strip comprises a sequence of textured frame representations corresponding to the plurality of frames of the video formatted as a single still image;
send the texture strip to a remote client device;
receive an edit command from the client device, the edit command associated with a selected frame identified by a location of a positioner relative to the texture strip;
apply the edit command based on the selected frame to generate an edited video; and
send a representation of the edited video to the client device.
(Dependent claims 2-7 not shown.)
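The claimed "texture strip" is a row of per-frame representations laid out as one still image. A minimal sketch of that idea follows; it is illustrative only, not the patent's implementation: frames are stand-in 2D pixel grids, `make_texture_strip` and its parameters are hypothetical names, and thumbnails are sampled by nearest neighbour and concatenated left to right.

```python
# Illustrative sketch (not the patented transcoder): each frame is
# downsampled to a small thumbnail, and the thumbnails are concatenated
# side by side into a single still image (a list of pixel rows).

def make_texture_strip(frames, thumb_w=4, thumb_h=4):
    """Reduce each frame to thumb_w x thumb_h by nearest-neighbour
    sampling and lay the thumbnails out left-to-right as one image."""
    strip = [[] for _ in range(thumb_h)]
    for frame in frames:
        h, w = len(frame), len(frame[0])
        for ty in range(thumb_h):
            for tx in range(thumb_w):
                sy = ty * h // thumb_h   # sample row in source frame
                sx = tx * w // thumb_w   # sample column in source frame
                strip[ty].append(frame[sy][sx])
    return strip

# Two dummy 8x8 frames: one all zeros, one all ones.
frames = [[[0] * 8 for _ in range(8)], [[1] * 8 for _ in range(8)]]
strip = make_texture_strip(frames)
assert len(strip) == 4                  # thumb_h rows
assert len(strip[0]) == 8               # 2 frames x thumb_w columns
assert strip[0] == [0, 0, 0, 0, 1, 1, 1, 1]
```

Because the strip is a single still image, it can be sent to a thin client in one transfer, which is the bandwidth point the abstract makes.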
8. A computer program product comprising a non-transitory computer readable medium storing instructions translatable by a processor to:
generate, by a transcoder, a texture strip for visually representing a plurality of frames of a video corresponding to moving image data over time, wherein the texture strip comprises a sequence of textured frame representations corresponding to the plurality of frames of the video formatted as a single still image;
send the texture strip to a remote client device;
receive an edit command from the client device, the edit command associated with a selected frame identified by a location of a positioner relative to the texture strip;
apply the edit command based on the selected frame to generate an edited video; and
send a representation of the edited video to the client device.
(Dependent claims 9-14 not shown.)
15. A method of distributed non-linear video editing, comprising:
generating, by a transcoder at a computer, a texture strip for visually representing a plurality of frames of a video corresponding to moving image data over time, wherein the texture strip comprises a sequence of textured frame representations corresponding to the plurality of frames of the video formatted as a still image;
sending, by the computer, the texture strip to a client device communicatively connected to and remote from the computer;
receiving, by the computer, an edit command from the client device via a user interface displaying the texture strip, the edit command associated with a selected frame identified by a location of a positioner relative to the texture strip;
applying, by the transcoder, the edit command to the selected frame to generate an edited video; and
sending, by the computer, a representation of the edited video to the client device.
(Dependent claims 16-21 not shown.)
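The method claim hinges on identifying the selected frame from "a location of a positioner relative to the texture strip". A hedged server-side sketch of that step, and of applying a simple edit at the resulting frame, is below; the function names, the pixel geometry, and the two trim commands are invented for illustration and are not taken from the patent.

```python
# Hypothetical sketch of the claimed edit loop: the client reports the
# positioner's x-offset within the texture strip; the server maps that
# offset to a frame index and applies the edit anchored at that frame.

def frame_at_positioner(x, strip_width, num_frames):
    """Map a positioner x-offset (pixels) on the texture strip to the
    index of the frame whose thumbnail contains that offset."""
    if not 0 <= x < strip_width:
        raise ValueError("positioner outside texture strip")
    return x * num_frames // strip_width

def apply_edit(frames, command, selected):
    """Apply a simple edit anchored at the selected frame."""
    if command == "trim-before":       # drop everything before the frame
        return frames[selected:]
    if command == "trim-after":        # drop everything after the frame
        return frames[:selected + 1]
    raise ValueError(f"unknown command: {command}")

frames = list(range(10))               # stand-ins for decoded frames
# Strip is 200 px wide, so each of the 10 thumbnails spans 20 px.
idx = frame_at_positioner(130, strip_width=200, num_frames=10)
assert idx == 6                        # 130 // 20 == 6
edited = apply_edit(frames, "trim-before", idx)
assert edited == [6, 7, 8, 9]
```

Note the client never touches the video itself: it sends only the offset and the command name, and receives back a representation of the edited result, consistent with the thin-client rationale in the abstract.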
Specification