User interface systems and methods for managing multiple regions
Abstract
A user interface system includes a plane registration module configured to identify a first plane within an environment, and a gesture and posture recognition (GPR) module configured to observe a first allocation gesture, a second allocation gesture, a first modal gesture, a second modal gesture, and a third modal gesture within the environment. A region definition module is configured to determine a first region comprising a first portion of the first plane based on the first allocation gesture, and to determine a second region comprising a second portion of the first plane based on the second allocation gesture. A mode determination module is configured to determine different interaction modes for the various regions. A visual feedback module is configured to provide visual feedback associated with a parameter of the first region.
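As an illustration of the modular architecture summarized in the abstract, the following Python sketch models regions carved out of a plane and the per-region interaction modes. It is a minimal sketch only; every name in it (`Region`, `UserInterfaceSystem`, `MODES`) is hypothetical and does not appear in the patent.

```python
from dataclasses import dataclass
from typing import Dict, Tuple

# The three interaction modes recited in the claims.
MODES = {"touch", "key_press", "handwritten_text"}

@dataclass
class Region:
    plane_id: int
    bounds: Tuple[float, float, float, float]  # portion of the plane (x, y, w, h)
    mode: str = "touch"

class UserInterfaceSystem:
    """Hypothetical composition of the claimed region-definition and
    mode-determination roles; gesture recognition itself is out of scope."""

    def __init__(self) -> None:
        self.regions: Dict[str, Region] = {}

    def allocate(self, name: str, plane_id: int,
                 bounds: Tuple[float, float, float, float]) -> None:
        # Region definition: an allocation gesture defines a region
        # as a portion of an identified plane.
        self.regions[name] = Region(plane_id, bounds)

    def set_mode(self, name: str, mode: str) -> None:
        # Mode determination: a modal gesture selects one of the
        # three recited interaction modes for an existing region.
        if mode not in MODES:
            raise ValueError(f"unknown mode {mode!r}")
        self.regions[name].mode = mode

ui = UserInterfaceSystem()
ui.allocate("first", plane_id=0, bounds=(0.0, 0.0, 0.5, 1.0))
ui.allocate("second", plane_id=0, bounds=(0.5, 0.0, 0.5, 1.0))
ui.set_mode("first", "key_press")          # first modal gesture
ui.set_mode("second", "handwritten_text")  # second modal gesture
ui.set_mode("first", "touch")              # third modal gesture re-modes region 1
print(ui.regions["first"].mode, ui.regions["second"].mode)  # touch handwritten_text
```

Note how the third modal gesture replaces the first region's mode, mirroring the claim's requirement that the first and second interaction modes of the first region differ.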
16 Claims
1. A user interface system comprising:

a plane registration module, the plane registration module configured to identify a first plane within an environment;

a gesture and posture recognition (GPR) module, the GPR module configured to observe a first allocation gesture, a second allocation gesture, a first modal gesture, a second modal gesture, and a third modal gesture within the environment;

a region definition module, the region definition module configured to:

determine, as defined by the first allocation gesture, a first region comprising a first portion of the first plane, and
determine, as defined by the second allocation gesture, a second region comprising a second portion of the first plane;

a mode determination module, the mode determination module configured to:

determine, as defined by the first modal gesture, a first interaction mode of the first region, the first interaction mode of the first region comprising a first plurality of commands for a user to interact with a device,
determine, as defined by the second modal gesture, a first interaction mode of the second region, the first interaction mode of the second region comprising a second plurality of commands for the user to interact with the device, and
determine, as defined by the third modal gesture, a second interaction mode of the first region, the second interaction mode of the first region comprising a third plurality of commands for the user to interact with the device,

wherein the first interaction mode of the first region and the second interaction mode of the first region are different,

wherein the first interaction mode and the second interaction mode are each at least one selected from a group consisting of a touch input mode, a key-press interaction mode, and a handwritten text interaction mode; and

a visual feedback module including visual feedback circuitry, the visual feedback module configured to provide visual feedback associated with a parameter of the first region.

(Dependent claims 2-10 not shown.)
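Claim 1 ties each interaction mode to a distinct plurality of commands. A minimal sketch of such per-mode command sets and a dispatcher that accepts only commands belonging to a region's current mode; the command names are entirely hypothetical, as the claim does not enumerate them.

```python
from typing import Set

# Hypothetical command sets: each interaction mode comprises
# "a plurality of commands for a user to interact with a device".
COMMANDS = {
    "touch": {"tap", "drag", "pinch"},
    "key_press": {"key_down", "key_up"},
    "handwritten_text": {"stroke_begin", "stroke_end", "recognize"},
}

def commands_for(mode: str) -> Set[str]:
    """Return the plurality of commands associated with an interaction mode."""
    return COMMANDS[mode]

def dispatch(region_mode: str, command: str) -> bool:
    """Accept a command only if it belongs to the region's current mode."""
    return command in COMMANDS.get(region_mode, set())

print(dispatch("key_press", "key_down"))  # True
print(dispatch("key_press", "tap"))       # False
```

Because the command sets are disjoint, re-moding a region (the third modal gesture of claim 1) changes which inputs it will accept.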
11. A user interface system comprising:

a gesture and posture recognition (GPR) module, the GPR module configured to observe a first allocation posture and a first modal gesture within an environment;

a plane registration module, the plane registration module configured to identify, based on the first allocation posture, a first plane within the environment, the first allocation posture comprising a first digit of a hand and a second digit of the hand held substantially perpendicular to each other;

a region definition module, the region definition module configured to determine, as defined by the first allocation posture, a first region comprising a first portion of the first plane; and

a mode determination module, the mode determination module configured to determine, based on the first modal gesture, a first interaction mode of the first region, the first interaction mode comprising a plurality of commands for a user to interact with a device, wherein the first interaction mode is at least one selected from a group consisting of a touch input mode, a key-press interaction mode, and a handwritten text interaction mode.

(Dependent claims 12 and 13 not shown.)
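Claim 11's allocation posture requires two digits of a hand held "substantially perpendicular" to each other. One way such a posture test could be implemented is an angle check on the digits' direction vectors; the 15-degree tolerance below is an assumption for illustration, as the claim specifies none.

```python
import math

def is_allocation_posture(d1, d2, tol_deg=15.0):
    """Return True when two digit direction vectors are substantially
    perpendicular, i.e. within tol_deg of 90 degrees.

    d1, d2: 3-D direction vectors (e.g. from a hand-tracking skeleton).
    tol_deg: hypothetical tolerance; the patent does not specify one.
    """
    dot = d1[0] * d2[0] + d1[1] * d2[1] + d1[2] * d2[2]
    n1 = math.sqrt(sum(c * c for c in d1))
    n2 = math.sqrt(sum(c * c for c in d2))
    # Clamp to guard against floating-point drift outside acos's domain.
    cos_angle = max(-1.0, min(1.0, dot / (n1 * n2)))
    angle = math.degrees(math.acos(cos_angle))
    return abs(angle - 90.0) <= tol_deg

print(is_allocation_posture((1, 0, 0), (0, 1, 0)))  # True  (exactly 90 degrees)
print(is_allocation_posture((1, 0, 0), (1, 0, 0)))  # False (parallel digits)
```

The two perpendicular digit vectors could then also serve as in-plane axes when the plane registration module identifies the first plane from this posture.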
14. A user interface method comprising:

identifying a first plane within an environment;

observing a first allocation gesture, a second allocation gesture, a first modal gesture, a second modal gesture, and a third modal gesture within the environment;

determining, as defined by the first allocation gesture, a first region comprising a first portion of the first plane;

determining, as defined by the second allocation gesture, a second region comprising a second portion of the first plane;

determining, as defined by the first modal gesture, a first interaction mode of the first region, the first interaction mode of the first region comprising a first plurality of commands for a user to interact with a device;

determining, as defined by the second modal gesture, a first interaction mode of the second region, the first interaction mode of the second region comprising a second set of commands for the user to interact with the device;

determining, as defined by the third modal gesture, a second interaction mode of the first region, the second interaction mode of the first region comprising a third set of commands for the user to interact with the device,

wherein the first interaction mode of the first region and the second interaction mode of the first region are different,

wherein the first interaction mode and the second interaction mode are each at least one selected from a group consisting of a touch input mode, a key-press interaction mode, and a handwritten text interaction mode; and

providing, via a processor, visual feedback associated with a parameter of the first region.

(Dependent claims 15 and 16 not shown.)
Specification