Controlling an object within an environment using a pointing device
First Claim
1. A computer-implemented process for controlling an electronic component within an environment using a pointing device, comprising:
receiving orientation messages transmitted by the pointing device, said orientation messages comprising at least one orientation sensor reading generated by at least one orientation sensor of the pointing device;
upon receiving an orientation message, determining whether a switch state indicates that a pointing device switch has been activated by a user to indicate that a gesture is being performed;
repeating the reception of orientation messages and the determination of whether a switch state indicates that the pointing device switch has been activated whenever it is determined that the switch state does not indicate that the pointing device switch is activated;
recording a prescribed one or ones of the pointing device sensor outputs taken from the orientation message received, in accordance with a determination that the switch state indicates that the pointing device switch is activated;
determining whether the threshold (if just one), or all the thresholds (if more than one), of a gesture threshold definition under consideration are exceeded by the recorded sensor output associated with the same sensor output as the threshold, for the gesture threshold definition assigned to the electronic component;
designating that the user has performed the gesture associated with the gesture threshold definition whenever it is determined that the threshold (if just one), or all the thresholds (if more than one), of the gesture threshold definition are exceeded by the recorded sensor output associated with the same sensor output; and
implementing the control action represented by the gesture that the user was designated to have performed.
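The claimed steps amount to a simple loop: ignore orientation messages until the switch state shows the gesture switch is held, record the prescribed sensor outputs while it is held, and designate a gesture only when every threshold of a component's gesture threshold definition is exceeded by the recorded output of the same sensor. A minimal Python sketch, where the message format, the running-max style of recording, and all names are illustrative assumptions rather than the patent's implementation:

```python
# Hypothetical sketch of the claim-1 loop. The (switch_pressed, readings)
# message tuples, the running-max recording, and the gesture names are
# assumptions for illustration only.

def detect_gesture(messages, gesture_defs):
    """messages: iterable of (switch_pressed, {sensor_name: reading}) tuples.
    gesture_defs: {gesture_name: {sensor_name: threshold}}.
    Returns the first gesture whose thresholds are all exceeded, else None."""
    recorded = {}
    for switch_pressed, readings in messages:
        if not switch_pressed:
            continue  # repeat reception until the switch indicates a gesture
        # record the prescribed sensor outputs from this orientation message
        for sensor, value in readings.items():
            recorded[sensor] = max(recorded.get(sensor, 0.0), abs(value))
        # designate a gesture only when every threshold in its definition is
        # exceeded by the recorded output associated with the same sensor
        for gesture, thresholds in gesture_defs.items():
            if all(recorded.get(sensor, 0.0) > limit
                   for sensor, limit in thresholds.items()):
                return gesture  # caller implements the mapped control action
    return None

defs = {"flick_up": {"pitch_rate": 2.0}, "twist": {"roll": 1.5, "yaw": 1.0}}
msgs = [(False, {"pitch_rate": 3.0}),   # switch not pressed: ignored
        (True, {"pitch_rate": 0.5}),
        (True, {"pitch_rate": 2.5})]    # exceeds the flick_up threshold
print(detect_gesture(msgs, defs))       # -> flick_up
```

Note how a large reading arriving while the switch is up is discarded: recording is gated entirely on the switch state, which is what distinguishes claim 1 from the switch-free process of claim 17.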
1 Assignment
0 Petitions
Abstract
The present invention is directed toward a system and process that controls a group of networked electronic components using a multimodal integration scheme in which inputs from a speech recognition subsystem, gesture recognition subsystem employing a wireless pointing device and pointing analysis subsystem also employing the pointing device, are combined to determine what component a user wants to control and what control action is desired. In this multimodal integration scheme, the desired action concerning an electronic component is decomposed into a command and a referent pair. The referent can be identified using the pointing device to identify the component by pointing at the component or an object associated with it, by using speech recognition, or both. The command may be specified by pressing a button on the pointing device, by a gesture performed with the pointing device, by a speech recognition event, or by any combination of these inputs.
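The abstract's multimodal scheme decomposes a desired action into a (command, referent) pair, with each half supplied by any combination of speech, gesture, and pointing inputs. A toy illustration of that fusion step, in which the subsystem output format, the priority order, and all names are assumptions rather than the patented integration scheme:

```python
# Illustrative fusion of the (command, referent) pair from the abstract.
# The dict-per-subsystem outputs and the first-non-None priority rule are
# assumptions, not the patent's integration scheme.

def fuse(speech, gesture, pointing):
    """Each input is a dict that may supply 'command' and/or 'referent'.
    Returns the resolved (command, referent) pair, or None if incomplete."""
    def first(key):
        for source in (pointing, gesture, speech):  # assumed priority order
            value = source.get(key)
            if value is not None:
                return value
        return None
    command, referent = first("command"), first("referent")
    return (command, referent) if command and referent else None

# User points at the lamp and says "turn on": the referent comes from the
# pointing analysis subsystem, the command from speech recognition.
print(fuse({"command": "turn_on"}, {}, {"referent": "desk_lamp"}))
```

The point of the decomposition is that no single subsystem must supply both halves: an action fires only once the pair is complete, from whichever modalities happen to contribute.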
20 Claims
1. A computer-implemented process for controlling an electronic component within an environment using a pointing device, comprising the steps set forth in full under First Claim above. - View Dependent Claims (2, 3, 4, 5, 6, 7, 8)
9. A computer-implemented process for controlling an object within an environment using a pointing device, comprising:
receiving orientation messages transmitted by the pointing device, said orientation messages comprising at least one orientation sensor reading generated by at least one orientation sensor of the pointing device;
determining whether a switch state indicates that a pointing device switch has been activated by a user to indicate that a gesture is being performed;
in accordance with a determination that the switch state indicates that the pointing device switch is activated, identifying one or ones of the pointing device sensor outputs corresponding with at least one orientation message received;
determining whether at least one threshold of a gesture threshold definition associated with the object is exceeded by at least one of the identified sensor outputs;
designating that the user has performed the gesture associated with the gesture threshold definition whenever it is determined that the at least one threshold is exceeded by the at least one of the identified sensor outputs; and
implementing the control action represented by the gesture that the user was designated to have performed.
- View Dependent Claims (10, 11, 12, 13, 14, 15, 16)
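Claim 9 relaxes claim 1's test: a gesture is designated when at least one threshold of the object's gesture threshold definition is exceeded by at least one of the identified sensor outputs, rather than requiring every threshold to be met. A sketch of that any-match test, with an assumed dictionary representation of the threshold definition:

```python
# Assumed {sensor_name: limit} representation of a gesture threshold
# definition, used to sketch claim 9's "at least one threshold exceeded"
# test (contrast with claim 1's all-thresholds requirement).

def any_threshold_exceeded(thresholds, identified_outputs):
    """thresholds: {sensor_name: limit}; identified_outputs: {sensor_name: value}.
    True when any identified sensor output exceeds its matching threshold."""
    return any(identified_outputs.get(sensor, 0.0) > limit
               for sensor, limit in thresholds.items())

lamp_gesture = {"pitch": 1.2, "roll": 0.8}
print(any_threshold_exceeded(lamp_gesture, {"pitch": 0.4, "roll": 1.0}))  # -> True (roll exceeds)
print(any_threshold_exceeded(lamp_gesture, {"pitch": 0.4}))               # -> False
```

Under this reading a single strong axis of motion suffices, so the same wrist flick can trigger a gesture even when the other listed sensors stay quiet.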
-
17. A computer-implemented process for controlling an object within an environment using a pointing device, comprising:
waiting for an orientation message to be received;
receiving the orientation message transmitted by the pointing device, said orientation message comprising at least one orientation sensor reading generated by at least one orientation sensor of the pointing device;
identifying the at least one orientation sensor reading corresponding with at least one orientation message received;
determining whether at least one threshold of a gesture threshold definition corresponding with the object is exceeded by the at least one orientation sensor reading;
designating that the user has performed a gesture associated with the gesture threshold definition in accordance with a determination that the at least one threshold of the gesture threshold definition is exceeded by the at least one orientation sensor reading; and
implementing a control action represented by the gesture that the user was designated to have performed.
- View Dependent Claims (18, 19, 20)
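Claim 17 drops the switch gating entirely: every received orientation message is tested directly against the object's gesture threshold definition. A minimal event-loop sketch, where the message source, the action table, and all names are illustrative assumptions:

```python
# Sketch of claim 17's switch-free loop: each orientation message's
# readings are tested directly against the object's gesture threshold
# definition. The iterable message source and action table are assumed.

def run(message_source, threshold_defs, actions):
    """message_source yields {sensor: reading} dicts as messages arrive."""
    performed = []
    for readings in message_source:  # wait for / receive an orientation message
        for gesture, thresholds in threshold_defs.items():
            if any(readings.get(s, 0.0) > t for s, t in thresholds.items()):
                performed.append(actions[gesture])  # implement the control action
    return performed

defs = {"shake": {"accel": 9.0}}
actions = {"shake": "toggle_power"}
print(run(iter([{"accel": 2.0}, {"accel": 11.0}]), defs, actions))  # -> ['toggle_power']
```

Without the switch of claims 1 and 9, the threshold definition itself is the only gate, so thresholds would in practice be set high enough that incidental motion does not fire a control action.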
Specification