Gesture library for natural user input
Abstract
A method to decode natural user input from a human subject. The method includes detection of a gesture and concurrent grip state of the subject. If the grip state is closed during the gesture, then a user-interface (UI) canvas of the computer system is transformed based on the gesture. If the grip state is open during the gesture, then a UI object arranged on the UI canvas is activated based on the gesture.
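The closed-grip/open-grip dispatch described in the abstract can be sketched as follows. This is a minimal illustration, not the patented implementation: the `Grip` enum, the `Gesture` displacement fields, and the return strings are all assumptions introduced here.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Grip(Enum):
    """Grip states the vision system is assumed to report."""
    CLOSED = auto()
    OPEN = auto()

@dataclass
class Gesture:
    # Hypothetical 3-D displacement of the hand, as one possible
    # representation of a detected gesture.
    dx: float
    dy: float
    dz: float

def decode_nui(gesture: Gesture, grip: Grip) -> str:
    """Dispatch per the abstract: a closed grip transforms the UI canvas;
    an open grip activates a UI object arranged on the canvas."""
    if grip is Grip.CLOSED:
        return f"transform canvas by ({gesture.dx}, {gesture.dy}, {gesture.dz})"
    return "activate UI object at gesture target"
```

The key point the abstract makes is that the same hand motion means different things depending on the concurrent grip state, which the single `if` above captures.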
17 Claims
1. Enacted in a computing machine operatively coupled to a display and to a vision system having at least one camera, a method to decode natural user input (NUI) from a human subject, the method comprising:

displaying an operating system (OS) shell of an OS of the computing machine, the OS shell including a user-interface (UI) element for each of a plurality of applications of the computing machine;

detecting in three dimensions a hand gesture and concurrent grip state of the human subject while displaying the OS shell;

modifying an extent of visibility of the OS shell based on the gesture if the grip state is close-fisted, wherein modifying the extent of visibility includes displaying a larger portion of the OS shell than currently displayed, to enable different applications to be launched; and

launching an application of the computing machine if the gesture is directed to a UI element presented on the OS shell and associated with the application and if the grip state is open-handed.

(Dependent claims: 2, 3, 4, 5, 6, 7, 8)
9. A computing machine configured to decode natural user input (NUI) from a human subject, the computing machine operatively coupled to a display and to a vision system having at least one camera and configured to detect a hand gesture of the subject in three dimensions, the computing machine comprising:

an operating system (OS) accessible to a user of the computing machine via an OS shell, the OS shell being configured to present a user-interface (UI) element for each of a plurality of applications of the computing machine, the OS having an application programming interface (API) to provide, based on the detected hand gesture, input to one or more processes of the computing machine, the input including:

input to signal engagement of the subject as a user of the computing machine,

input to signal disengagement of the subject as a user of the computing machine,

input to shift subsequent input focus to the OS shell upon detection of a close-fisted gesture, wherein shifting subsequent input focus includes reducing a process currently executing on the computing machine to an icon and displaying a larger portion of the OS shell than currently displayed, and

input to launch an application of the computing machine upon detection of an open-handed push gesture directed to a UI element presented on the OS shell and associated with the application.

(Dependent claims: 10, 11, 12, 13, 14, 15)
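The four API inputs enumerated in claim 9 can be modeled as an enum plus a classifier. The gesture-name strings are hypothetical, and in particular the claim does not say which gestures signal engagement or disengagement; the "wave-in"/"wave-out" names below are placeholders for illustration only.

```python
from enum import Enum, auto
from typing import Optional

class NuiInput(Enum):
    """The four kinds of input the claimed API provides to processes."""
    ENGAGE = auto()                 # subject engages as a user
    DISENGAGE = auto()              # subject disengages as a user
    SHIFT_FOCUS_TO_SHELL = auto()   # close-fisted gesture
    LAUNCH_APPLICATION = auto()     # open-handed push at a UI element

def classify(gesture: str, directed_at_element: bool) -> Optional[NuiInput]:
    """Map a detected hand gesture to one of the API's input kinds."""
    if gesture == "close-fisted":
        return NuiInput.SHIFT_FOCUS_TO_SHELL
    if gesture == "open-handed-push" and directed_at_element:
        return NuiInput.LAUNCH_APPLICATION
    if gesture == "wave-in":        # hypothetical engagement gesture
        return NuiInput.ENGAGE
    if gesture == "wave-out":       # hypothetical disengagement gesture
        return NuiInput.DISENGAGE
    return None
```

Note that an open-handed push only maps to a launch when it is directed at a UI element, matching the claim's "directed to a UI element presented on the OS shell" condition.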
16. Enacted in a computer system operatively coupled to a vision system, a method to decode natural user input (NUI) from a human subject, the method comprising:

displaying an operating system (OS) shell of an OS of the computing machine, the OS shell including a user-interface (UI) element for each of a plurality of applications of the computing machine;

from a set of gestural inputs detectable by the vision system, selecting a context-appropriate subset based on an operating condition of the computer system;

from a set of actions executable in the computer system, selecting a context-appropriate subset based on the operating condition of the computer system, the operating condition including a grip state of the human subject;

detecting a gestural input from the context-appropriate subset of gestural inputs while displaying the OS shell; and

executing an action from the context-appropriate subset of executable actions responsive to the detected gestural input, the action including modifying an extent of visibility of the OS shell based on the gesture if the grip state is close-fisted, wherein modifying the extent of visibility includes displaying a larger portion of the OS shell than currently displayed, to enable different applications to be launched; and launching an application of the computing machine if the gesture is directed to a UI element presented on the OS shell and associated with the application and if the grip state is open-handed.

(Dependent claim: 17)
Specification