Control system using aesthetically guided gesture recognition
First Claim
1. A method for authoring gesture definitions for use in gesture-based control systems, comprising:
during performance of a gesture by a performer with a plurality of sensors, generating a plurality of sets of raw sensor data each corresponding to differing parameters of the performance of the gesture;
displaying a graphical user interface (GUI) on a display device, wherein the GUI includes graphical plots of the plurality of sets of raw sensor data;
receiving director input identifying a subset of the parameters and storing the identified subset of parameters in a gesture definition, thereby first authoring the gesture definition;
for the graphical plots of the plurality of sets of sensor data corresponding to each of the parameters in the subset of the parameters in the gesture definition, receiving director input defining at least two of a starting position, an ending position, a maximum value, and a minimum value and storing the defined at least two of the starting position, the ending position, the maximum value, and the minimum value in the memory in the gesture definition, thereby second authoring the gesture definition;
receiving additional director input modifying the definition of at least one of the starting position, the ending position, the maximum value, and the minimum value for at least one of the parameters in the subset of the parameters of the gesture definition; and
receiving director input defining for at least one of the graphical plots associated with the subset of the parameters used in the gesture definition at least one of a must pass window and a must not pass window defining an area a plot of sensor data must pass or must not pass, respectively, for gesture recognition.
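The claim above describes a two-stage authoring flow: the director first selects a subset of the sensed parameters, then annotates each parameter's plot with at least two of a starting position, an ending position, a maximum value, and a minimum value, plus optional must pass / must not pass windows, and may later modify any stored value. A minimal sketch of a data structure that could hold such a definition (all names here are illustrative assumptions, not taken from the patent):

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Window:
    """A rectangular region a sensor plot must (or must not) pass through."""
    t_start: float   # start of the time span covered by the window
    t_end: float     # end of the time span covered by the window
    lo: float        # lower bound on the sensor value inside the span
    hi: float        # upper bound on the sensor value inside the span
    must_pass: bool  # True: plot must enter the region; False: must avoid it

@dataclass
class ParameterSpec:
    """Per-parameter constraints authored from its graphical plot."""
    start_position: Optional[float] = None
    ending_position: Optional[float] = None
    max_value: Optional[float] = None
    min_value: Optional[float] = None
    windows: list = field(default_factory=list)  # list of Window objects

@dataclass
class GestureDefinition:
    """A chosen subset of performance parameters and their constraints."""
    name: str
    parameters: dict = field(default_factory=dict)  # param name -> ParameterSpec

# First authoring: director selects a subset of the sensed parameters.
gd = GestureDefinition("wave")
gd.parameters["right_wrist_y"] = ParameterSpec()

# Second authoring: at least two of start/end/max/min for the chosen plot.
spec = gd.parameters["right_wrist_y"]
spec.start_position, spec.max_value = 0.1, 0.9

# Additional input later modifies a stored value, as the claim recites.
spec.max_value = 0.85
```

The split into two dataclasses mirrors the claim's two authoring steps: selecting parameters populates `parameters`, and annotating plots fills in each `ParameterSpec`.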
Abstract
A method for facilitating and enhancing computer-based authoring of gesture definitions that are useful in controlling a walk-around character and other systems using gesture controls. The method includes, during performance of a gesture by a performer, collecting sets of raw sensor data each corresponding to differing parameters of the performance of the gesture. The method includes displaying a graphical user interface with a graphical plot of each of the sets of raw sensor data. The method includes receiving user input identifying which of the parameters to include in a gesture definition. The method includes, for the graphical plots associated with the chosen parameters, receiving user input defining a starting position, an ending position, a maximum value, and a minimum value. The gesture may involve movement of a performer's arms, legs, hands, head, eyes, and so on in a particular manner.
15 Claims
1. A method for authoring gesture definitions for use in gesture-based control systems, comprising:
during performance of a gesture by a performer with a plurality of sensors, generating a plurality of sets of raw sensor data each corresponding to differing parameters of the performance of the gesture;
displaying a graphical user interface (GUI) on a display device, wherein the GUI includes graphical plots of the plurality of sets of raw sensor data;
receiving director input identifying a subset of the parameters and storing the identified subset of parameters in a gesture definition, thereby first authoring the gesture definition;
for the graphical plots of the plurality of sets of sensor data corresponding to each of the parameters in the subset of the parameters in the gesture definition, receiving director input defining at least two of a starting position, an ending position, a maximum value, and a minimum value and storing the defined at least two of the starting position, the ending position, the maximum value, and the minimum value in the memory in the gesture definition, thereby second authoring the gesture definition;
receiving additional director input modifying the definition of at least one of the starting position, the ending position, the maximum value, and the minimum value for at least one of the parameters in the subset of the parameters of the gesture definition; and
receiving director input defining for at least one of the graphical plots associated with the subset of the parameters used in the gesture definition at least one of a must pass window and a must not pass window defining an area a plot of sensor data must pass or must not pass, respectively, for gesture recognition.
- View Dependent Claims (2, 3, 4, 5, 6, 7)
8. A system for authoring gestures for use by a gesture-based controller, comprising:
a computer with a processor managing a display and a memory device;
a gesture monitoring assembly including a set of sensors sampling data during performance of a gesture by a performer; and
a gesture authoring tool run by the processor to:
during the performance of the gesture by the performer, collect sets of sensor data from the sensors;
display a graphical user interface (GUI) on a display device, wherein the GUI includes graphical plots of the sets of sensor data;
generate a gesture definition by receiving user input identifying a subset of the sensors to include in the gesture definition and storing the identified subset of the sensors in the memory device;
for the graphical plots of each of the sensors in the subset of the sensors included in the gesture definition, receiving user input defining at least two of a starting position, an ending position, a maximum value, and a minimum value of the sensor data and storing, in the memory device, the defined at least two of the starting position, the ending position, the maximum value, and the minimum value of the sensor data as part of the gesture definition;
wherein the gesture authoring tool further acts to receive user input defining for one of the graphical plots associated with the subset of the sensors used in the gesture definition at least one of a must pass window and a must not pass window defining, respectively, an area a plot of sensor data must pass for gesture recognition and an area a plot of sensor data must not pass for gesture recognition; and
wherein the gesture authoring tool receives additional user input modifying the stored definition of at least one of the starting position, the ending position, the maximum value, and the minimum value for the sensor data for at least one of the sensors in the subset of the sensors included in the gesture definition.
- View Dependent Claims (9, 10)
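Claim 8's must pass and must not pass windows amount to a containment test over a plotted run of sensor samples. The following sketch (hypothetical names and dict layout, assumed for illustration only) shows one way such a test could work:

```python
def window_satisfied(samples, window):
    """Check one authored window against a plot of sensor data.

    samples: list of (time, value) pairs forming the plot.
    window:  dict giving the window's time span ('t0'..'t1'), value
             bounds ('lo'..'hi'), and a 'must_pass' flag.

    A must pass window is satisfied when at least one sample falls
    inside the region; a must not pass window is satisfied when no
    sample falls inside it.
    """
    inside = any(
        window["t0"] <= t <= window["t1"] and window["lo"] <= v <= window["hi"]
        for t, v in samples
    )
    return inside if window["must_pass"] else not inside

# Example plot: the value rises through the 0.5-0.7 band between t=1 and t=2.
plot = [(0.0, 0.1), (1.0, 0.4), (1.5, 0.6), (2.0, 0.9)]
must_pass = {"t0": 1.0, "t1": 2.0, "lo": 0.5, "hi": 0.7, "must_pass": True}
must_not = {"t0": 0.0, "t1": 0.5, "lo": 0.8, "hi": 1.0, "must_pass": False}
```

Here the sample `(1.5, 0.6)` lands inside `must_pass`, and no early sample reaches the 0.8-1.0 band, so both windows are satisfied for this performance.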
11. A gesture authoring method, comprising:
collecting a set of data from a plurality of sensors during a data collection window;
receiving first user input defining a gesture performance time period with a start time and an end time for performance of a gesture by a performer during the collecting;
creating a first gesture definition for the gesture including the set of data from a subset of the plurality of sensors during the gesture performance time period;
receiving second user input marking in the first gesture definition, for each of the sensors in the subset of the plurality of sensors, at least two of a start position, an end position, a maximum value, and a minimum value for the set of data;
receiving third user input marking in the first gesture definition, for at least one of the sensors in the subset of the plurality of sensors, at least one of a must pass window and a must not pass window defining an area sensor data must pass or must not pass, respectively, for gesture recognition;
creating a second gesture definition for the gesture comprising a modified version of the first gesture definition including the markings from the second user input and the third user input; and
operating a gesture-based control system to generate a control signal when the gesture-based control system determines a performer has performed the gesture defined by the second gesture definition.
- View Dependent Claims (12, 13, 14, 15)
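Claim 11 ends with the runtime side: a gesture-based control system fires a control signal once recorded data satisfies the second (refined) gesture definition. A minimal sketch of that matching step, under assumed names and a simple dict-based definition (none of this is taken from the patent's specification):

```python
def matches(samples, spec, tol=0.05):
    """Return True when one parameter's plotted data satisfies its
    authored constraints; any constraint may be absent (None)."""
    values = [v for _, v in samples]
    if spec.get("min_value") is not None and min(values) < spec["min_value"]:
        return False
    if spec.get("max_value") is not None and max(values) > spec["max_value"]:
        return False
    if spec.get("start") is not None and abs(values[0] - spec["start"]) > tol:
        return False
    if spec.get("end") is not None and abs(values[-1] - spec["end"]) > tol:
        return False
    return True

def control_signal(recorded, gesture_definition):
    """Fire the control signal only when every parameter in the
    refined gesture definition is satisfied by its recorded data."""
    return all(
        matches(recorded[name], spec)
        for name, spec in gesture_definition.items()
    )

# Example: a one-parameter definition checked against a new performance.
definition = {"wrist_y": {"start": 0.1, "end": 0.9,
                          "min_value": 0.0, "max_value": 1.0}}
performance = {"wrist_y": [(0.0, 0.12), (0.5, 0.5), (1.0, 0.88)]}
```

The tolerance parameter `tol` is an assumption added here: start and end positions marked on a plot are unlikely to be hit exactly by a live performer, so some slack is needed in practice.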
Specification