USER INTERFACES FOR BI-MANUAL CONTROL
1 Assignment
0 Petitions
Abstract
Techniques for bi-manual control of an operation may allow gestural input for the operation to be inputted blindly with one hand while a user's attention remains on the other hand. An operation invoked and controlled by the gestural input may automatically associate with an operation controlled by the other hand. The gestural input may facilitate blind control of the operation by being able to be initially recognized without regard for the scale, orientation, and/or location of the invoking gestural input. Moreover, scale and/or orientation of the operation may be determined by the gestural input. Changes in scale and/or orientation of the gestural input during the blind operation may change the scale and/or orientation of the operation. The operation may manifest by displaying corresponding graphics at a location independent of the gestural input or by associating functionality of the blind operation with an operation performed by the other hand.
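The abstract's key technical idea is recognizing a gesture "without regard for the scale, orientation, and/or location" of the input. One well-known way to achieve this kind of invariance is template matching over a normalized stroke, in the spirit of unistroke recognizers: translate to the centroid, rotate the first point to a reference angle, and scale to a unit box before comparing. The sketch below illustrates that general technique only; it is not the patent's implementation, and all names are illustrative.

```python
import math

def resample(points, n=32):
    """Resample a stroke to n roughly equidistant points along its path."""
    path_len = sum(math.dist(points[i - 1], points[i]) for i in range(1, len(points)))
    interval = path_len / (n - 1)
    pts = list(points)
    out = [pts[0]]
    d = 0.0
    i = 1
    while i < len(pts):
        seg = math.dist(pts[i - 1], pts[i])
        if d + seg >= interval and seg > 0:
            t = (interval - d) / seg
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            out.append(q)
            pts.insert(i, q)  # the new point becomes the next segment start
            d = 0.0
        else:
            d += seg
        i += 1
    while len(out) < n:  # guard against floating-point shortfall
        out.append(pts[-1])
    return out

def normalize(points):
    """Make a stroke invariant to translation, rotation, and scale."""
    pts = resample(points)
    cx = sum(p[0] for p in pts) / len(pts)
    cy = sum(p[1] for p in pts) / len(pts)
    # Rotate so the vector from centroid to first point lies at angle 0.
    theta = math.atan2(pts[0][1] - cy, pts[0][0] - cx)
    rot = [((p[0] - cx) * math.cos(-theta) - (p[1] - cy) * math.sin(-theta),
            (p[0] - cx) * math.sin(-theta) + (p[1] - cy) * math.cos(-theta))
           for p in pts]
    # Scale the larger bounding-box dimension to 1.
    w = max(p[0] for p in rot) - min(p[0] for p in rot)
    h = max(p[1] for p in rot) - min(p[1] for p in rot)
    s = max(w, h) or 1.0
    return [(p[0] / s, p[1] / s) for p in rot]

def match(stroke, templates):
    """Return the name of the template closest to the normalized stroke."""
    norm = normalize(stroke)
    def dist(a, b):
        return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)
    return min(templates, key=lambda name: dist(norm, normalize(templates[name])))
```

Because both the candidate stroke and every template pass through the same normalization, a small circle drawn sideways in a corner matches the same template as a large upright circle in the center, which is what lets the gesture be entered "blindly".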
24 Citations
20 Claims
1. A method performed by a computing device comprising storage hardware, processing hardware, a display, and an input device, the method comprising:
while a graphical user operation displayed at a first region of the display is being interactively performed by a first hand that is providing input via the input device at the first region of the display, recognizing a gesture inputted by a second hand and responding to the recognized gesture by associating inputs inputted by the second hand with the graphical user operation being interactively performed by the first hand, wherein the gesture is initially recognized as such based on invocation features thereof that are independent of the orientation and scale of the gesture, and wherein the graphical user operation is augmented, controlled, or modified based on the inputs being associated with the graphical user operation. (Dependent claims: 2, 3, 4, 5, 6, 8, 9, 10)
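Claim 1 centers on associating a second hand's input stream with an operation already being performed by the first hand, once an invocation gesture is recognized. A minimal sketch of one way such routing could work, assuming hypothetical `Operation` and `InputRouter` types (none of these names come from the patent):

```python
class Operation:
    """A first-hand operation, e.g. an interactive drag at one region of the display."""
    def __init__(self, name):
        self.name = name
        self.modifier = None  # driven by the second hand once associated

class InputRouter:
    """Routes second-hand events into the first-hand operation after an
    invocation gesture associates the two input streams."""
    def __init__(self, recognizer):
        self.recognizer = recognizer  # tests invocation features only
        self.active = None            # operation under the first hand
        self.associated = False

    def first_hand_begin(self, op):
        self.active = op
        self.associated = False

    def second_hand_event(self, event):
        if self.active is None:
            return
        if not self.associated:
            if self.recognizer(event):
                self.associated = True  # bind second-hand stream to the operation
        else:
            # Subsequent second-hand inputs augment/control/modify the operation.
            self.active.modifier = event
```

The point of the two-phase flow is that second-hand input is ignored until a recognized gesture establishes the association, after which every second-hand event flows to whatever operation the first hand is performing.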
7. A method performed by a computing device comprised of input hardware, a display, storage hardware, and processing hardware, the method comprising:
responsive to receiving first user input, recognizing the first user input as a summoning gesture; and determining a value of a characteristic of the first user input;
based on the recognizing of the summoning gesture, invoking a corresponding user interface control and associating second user input with the user interface control, wherein the user interface control is configured to be operated by the second user input according to the determined value of the first user input, and wherein the user interface control, while being operated by the second user input, either (i) lacks any graphic representation displayed on the display or (ii) comprises a graphic representation displayed at a location of the display that is not dependent on a location of the first user input. (Dependent claims: 11, 12, 13)
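Claim 7 describes a control that is summoned by a gesture, parameterized by a characteristic of that gesture (such as its scale), and then operated without any graphic tied to the gesture's location. A sketch of that idea, under the assumption that the measured characteristic sets the control's sensitivity (the `HiddenDial` type and the bounding-box-diagonal measure are illustrative, not from the patent):

```python
import math

def gesture_characteristics(points):
    """Measure a stroke's scale (bounding-box diagonal) and orientation
    (angle from first point to last point)."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    scale = math.hypot(max(xs) - min(xs), max(ys) - min(ys))
    angle = math.atan2(points[-1][1] - points[0][1],
                       points[-1][0] - points[0][0])
    return scale, angle

class HiddenDial:
    """A control with no on-screen graphic; second-hand motion drives its value."""
    def __init__(self, scale):
        self.value = 0.0
        # Larger invoking gesture -> coarser dial (one illustrative mapping).
        self.sensitivity = 1.0 / max(scale, 1e-6)

    def move(self, dx):
        self.value += dx * self.sensitivity

def summon(points):
    """Invoke a control configured by the measured value of the invoking input."""
    scale, _angle = gesture_characteristics(points)
    return HiddenDial(scale)
```

A user who draws a large summoning stroke gets a dial that moves in coarse steps, while a small stroke yields a fine-grained one, all without the user ever looking at the summoning hand.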
14. A computing device comprising:
input hardware; a display; processing hardware; and storage hardware;
the storage hardware storing instructions executable by the processing hardware, the instructions configured to perform a process, the process comprising:
allowing summoning gestures to be recognized for user inputs entered above arbitrary applications and based on recognition features of the user inputs that do not depend on orientation and/or scale of the user inputs;
when a summoning gesture is recognized for a user input, automatically causing a corresponding user interface control to be summoned, by the user input, for user-manipulation (i) at a display location of the user input and (ii) at a display scale and/or orientation of the user input; and
automatically associating a functional and/or graphical manifestation of the user interface control with an object displayed on the display or at a location that does not depend on the display location of the user input. (Dependent claims: 15, 16, 17, 18, 19, 20)
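Claim 14 adds a system-level layer: summoning gestures are recognized above arbitrary applications, the control appears at the input's own location and scale, and its functional manifestation attaches to some other object. A sketch of such a shell layer, assuming a pluggable recognizer and control factories (all names here are hypothetical):

```python
class SummonShell:
    """A system layer above arbitrary applications: recognizes summoning
    gestures from raw strokes, summons the matching control at the stroke's
    own location and scale, and associates its function with another object."""
    def __init__(self, recognizer, factories):
        self.recognizer = recognizer  # stroke -> gesture name, or None
        self.factories = factories    # gesture name -> control factory

    def on_stroke(self, points, target):
        name = self.recognizer(points)
        if name is None:
            return None  # unrecognized input falls through to the focused app
        xs = [p[0] for p in points]
        ys = [p[1] for p in points]
        center = (sum(xs) / len(xs), sum(ys) / len(ys))
        scale = max(max(xs) - min(xs), max(ys) - min(ys))
        # The control is placed and sized by the invoking input itself...
        control = self.factories[name](center, scale)
        # ...while its functional manifestation binds to a separate object.
        control["target"] = target
        return control
```

Because the shell sits above the applications and the recognizer ignores orientation and scale, the same gesture summons the same control no matter which application is underneath or how the gesture is drawn.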
Specification