Adaptive non-contact computer user-interface system and method
Abstract
The present invention provides a method and apparatus for defining and redefining a non-contact dynamic user interface, which can control multiple applications or applets with one user interface. Sensors are used to determine the position and motion of a user's body or a pre-selected object, such as a wand. As new users train the user interface, the user interface determines the functionality of newly programmed commands and directions associated with physical positions and orientations of each user's body, or of a physical object that is under the user's control. Existing functions and input meanings are used if desired. New input command meanings and functionality are added as required. In an example system, a computer is directed by hand and arm positions of the user. The speed with which the user or physical object moves may also be interpreted as an input parameter by the user interface.
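The abstract describes a trainable interface in which sensed body or object states are bound to user-assigned meanings. A minimal sketch of that training-and-interpretation flow, using purely illustrative class and method names (the patent specifies no particular implementation):

```python
# Hypothetical sketch of the trainable non-contact interface described
# in the abstract: sensor outputs are stored and bound to user-assigned
# meanings (commands or data values). All names are illustrative.

class GestureInterface:
    def __init__(self):
        # Maps a stored sensor output to its assigned meaning.
        self.bindings = {}

    def train(self, sensor_output, meaning):
        """Store a sensed state and associate it with an assigned meaning."""
        self.bindings[sensor_output] = meaning

    def interpret(self, sensor_output):
        """Return the assigned meaning for a recognized state, or None."""
        return self.bindings.get(sensor_output)

ui = GestureInterface()
ui.train(("hand_raised", "right"), "scroll_up")   # programming step
command = ui.interpret(("hand_raised", "right"))  # recognized state
```

An unrecognized state simply yields no meaning, which is where the abstract's "new input command meanings... are added as required" would apply.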
29 Citations
17 Claims
1. A method of defining a user interface in a computer system in response to the state of a physical object, comprising:

providing a computer system, the computer system having a computer and input sensors, the input sensors coupled with the computer, the input sensors for sensing a state of the physical object;
placing the physical object into a first state;
employing the input sensors to sense the first state, and communicating a sensor output to the computer;
storing the sensor output in the computer;
associating the first state with an assigned meaning;
programming the computer to associate the sensor output with the assigned meaning;
placing the physical object into a second state; and
returning the physical object into the first state, whereby the input sensors inform the computer system of the occurrence of the first state, and whereby the assigned meaning is provided as an input, such as a command or a data value, to the computer system. - View Dependent Claims (2, 3, 4, 5)
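The steps of claim 1 can be sketched as a short sequence: sense a first state, store and associate its sensor output with a meaning, then deliver that meaning when the state recurs. This is a hypothetical illustration only; the state names, the use of a hash as a stand-in sensor output, and the dictionary storage are all assumptions, not the claimed implementation.

```python
# Illustrative walk-through of the method steps of claim 1.

def sense(state):
    """Stand-in for the input sensors: map a physical state to an output."""
    return hash(state)

storage = {}                       # the computer's stored sensor outputs

# Place the object into a first state, sense it, store the sensor
# output, and program the association with an assigned meaning.
first_output = sense("palm_open")
storage[first_output] = "select"

# Place the object into a second state; it carries no assigned meaning.
second_output = sense("fist_closed")

# Return the object to the first state; the recognized sensor output
# causes the assigned meaning to be provided as an input (a command).
returned_output = sense("palm_open")
command = storage.get(returned_output)
```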
6. A computer system comprising:

a sensor for sensing the position of a physical object and generating a sensor output indicating a sensing of the position;
a processor, responsive to the sensor output, for executing an application program, the processor including:
an input module, responsive to at least one sensor output, wherein the input module accepts the sensor output corresponding to the position of the physical object; and
an interpreter, the interpreter associating the at least one sensor output with an input to the application program, whereby the input may be a command or a data value. - View Dependent Claims (7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17)
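Claim 6 names three cooperating elements: a sensor, an input module, and an interpreter that maps sensor output to an application input. A minimal sketch of that architecture, assuming a simple position-to-command mapping (all class names and the mapping itself are illustrative, not from the patent):

```python
# Hypothetical sketch of the system of claim 6: sensor -> input module
# -> interpreter -> application input (a command or data value).

class Sensor:
    def sense(self, position):
        # Generate a sensor output indicating a sensing of the position.
        return {"position": position}

class InputModule:
    def accept(self, sensor_output):
        # Accept the sensor output corresponding to the object's position.
        return sensor_output["position"]

class Interpreter:
    def __init__(self, mapping):
        self.mapping = mapping
    def to_input(self, position):
        # Associate the sensor output with an input to the application.
        return self.mapping.get(position)

sensor = Sensor()
module = InputModule()
interp = Interpreter({"upper_left": "open_menu"})

output = sensor.sense("upper_left")
command = interp.to_input(module.accept(output))
```

Separating the input module from the interpreter mirrors the claim's structure: acceptance of raw sensor data is decoupled from its translation into commands, so new meanings can be bound without changing the sensing path.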
Specification