System and method for issuing commands based on pen motions on a graphical keyboard
First Claim
1. A method of issuing a command using a graphical keyboard, comprising:
- inputting a movement on the graphical keyboard;
automatically recognizing the inputted movement as an actual command, based on the inputted movement in relation to a layout of the graphical keyboard, by:
analyzing pattern aspects of the inputted movement by channels, wherein said analyzing utilizes said channels to determine location of said inputted movement and to determine shape of a single said inputted movement;
comparing the pattern aspects to a command gesture database;
if no match exists between the pattern aspects and the command gesture database, using context clues that are based on one or more previous commands gestured by a user and comparing the context clues with potential commands;
executing the actual command;
utilizing multiple command template representations for each of said actual commands whether or not said graphical keyboard contains duplicate keys;
utilizing shortcut commands of a current application as command template representations of said actual commands;
teaching the user with a dynamic morphing process, comprising:
projecting said command template representation onto said graphical keyboard, allowing the user to see which parts of a shape least match said actual command, wherein future inputted movements more closely match said command template; and
teaching the user by coloring points on a morphed command gesture, based on how closely said points match said command template, wherein:
said coloring points comprise outputting colored points to said graphical keyboard, and said morphed command gesture comprises inputted movements more closely matching said command template, resulting from the user adjusting to said command template representation projected onto said graphical keyboard with colored points, thereby allowing the user to see which parts of a shape least match said actual command.
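The recognition steps of claim 1 (analyze a stroke through shape and location channels, compare against a command gesture database, and fall back to context clues from previous commands when no match exists) can be sketched as follows. This is a minimal illustration, not the patent's implementation: the resampling size, equal channel weights, and match threshold are invented assumptions.

```python
import math

def resample(pts, n=32):
    """Resample a stroke to n points evenly spaced along its arc length."""
    cum = [0.0]
    for a, b in zip(pts, pts[1:]):
        cum.append(cum[-1] + math.dist(a, b))
    total = cum[-1] or 1e-9
    out, seg = [], 0
    for i in range(n):
        target = total * i / (n - 1)
        while seg < len(pts) - 2 and cum[seg + 1] < target:
            seg += 1
        span = (cum[seg + 1] - cum[seg]) or 1e-9
        t = (target - cum[seg]) / span
        (ax, ay), (bx, by) = pts[seg], pts[seg + 1]
        out.append((ax + t * (bx - ax), ay + t * (by - ay)))
    return out

def shape_distance(a, b):
    """Shape channel: compare strokes after removing position and scale."""
    def norm(pts):
        cx = sum(x for x, _ in pts) / len(pts)
        cy = sum(y for _, y in pts) / len(pts)
        centered = [(x - cx, y - cy) for x, y in pts]
        s = max(math.dist(p, (0, 0)) for p in centered) or 1e-9
        return [(x / s, y / s) for x, y in centered]
    return sum(math.dist(p, q) for p, q in zip(norm(a), norm(b))) / len(a)

def location_distance(a, b):
    """Location channel: compare absolute positions on the keyboard layout."""
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

def recognize(stroke, templates, history, threshold=0.5):
    """Match a stroke against command templates; if no confident match,
    fall back to context clues based on previously gestured commands."""
    g = resample(stroke)
    best, best_score = None, float("inf")
    for name, tmpl in templates.items():
        t = resample(tmpl)
        score = 0.5 * shape_distance(g, t) + 0.5 * location_distance(g, t)
        if score < best_score:
            best, best_score = name, score
    if best_score <= threshold:
        return best
    # No match in the template database: use context (most recent command).
    return history[-1] if history else None
```

A near-horizontal stroke drawn where a horizontal "copy" template lies matches through the channels; a stroke far from every template falls through to the context clue.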
Abstract
A command pattern recognition system based on a virtual keyboard layout combines pattern recognition with a virtual, graphical, or on-screen keyboard to provide a command control method with relative ease of use. The system allows the user to conveniently issue commands on pen-based computing or communication devices. The system supports a very large set of commands, including practically all commands needed for any application. By utilizing shortcut definitions, it can work with any existing software without any modification. In addition, the system utilizes various techniques to achieve reliable recognition of a very large gesture vocabulary. Further, the system provides feedback and display methods to help the user effectively use and learn command gestures for commands.
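The abstract's point about shortcut definitions can be made concrete: if each command's gesture template is derived from the key positions of the application's existing shortcut, the gesture vocabulary works with unmodified software. A minimal sketch under stated assumptions: the key-center coordinates and the shortcut table below are invented for illustration, and the template is taken to be the polyline through the shortcut's keys, which is one plausible construction.

```python
# Assumed key-center coordinates on a simplified on-screen keyboard
# (the numbers are illustrative, not an actual layout).
KEY_CENTERS = {"ctrl": (0.0, 3.0), "c": (2.8, 2.0), "v": (3.8, 2.0),
               "x": (1.8, 2.0), "z": (0.8, 2.0)}

def template_from_shortcut(shortcut, keys=KEY_CENTERS):
    """Build a command template as the stroke through the centers of the
    keys in the application's existing shortcut."""
    return [keys[k] for k in shortcut]

# A gesture vocabulary built entirely from existing application shortcuts,
# requiring no modification of the application itself.
templates = {cmd: template_from_shortcut(keys)
             for cmd, keys in {"copy": ("ctrl", "c"), "cut": ("ctrl", "x"),
                               "paste": ("ctrl", "v"), "undo": ("ctrl", "z")}.items()}
```

Each template here is simply the path from the modifier key to the letter key, so adding a new command means adding one shortcut entry.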
18 Claims
1. A method of issuing a command using a graphical keyboard (set forth above as the First Claim).
Dependent claims: 2-14.
15. A first computer program product having a plurality of instruction codes stored on a computer-readable medium, for issuing a command using a graphical keyboard, the computer program product comprising:
a first set of instruction codes for capturing a gesture from an input movement on the graphical keyboard, such that said input movement is capturable from one of an electronic white board, a touch screen monitor, a personal digital assistant, an eye-tracker, a cellular phone, a tablet computer, an electronic pen, a court reporting system, a dictation system, and a retail sales terminal;
a second set of instruction codes for automatically recognizing the movement on said graphical keyboard as an actual command, based on: determining whether said movement is short or long; if said movement is short, relating said input movement to a single letter matched to said graphical keyboard at the location of said movement; analyzing the captured gesture using at least one shape channel and at least one location channel; and matching the trajectory of the analyzed movement with at least one command template in a stored command template database;
a third set of instruction codes for comparing the trajectory of an ambiguous movement with a context model channel to provide an actual command if there is no match in the command template database;
a fourth set of instruction codes for executing the actual command;
a fifth set of instruction codes for adapting said first computer program product to an executing second computer program product, such that said second computer program product captures said gesture from said input movement on said graphical keyboard;
a sixth set of instruction codes for teaching the user to match the analyzed movement trajectory with one of said command templates, such that said sixth set of codes outputs to the user how far said command templates differ from the analyzed movement trajectory;
a seventh set of instruction codes for teaching said user to match the analyzed movement trajectory with one of said command templates by outputting colored points along the analyzed movement trajectory on said graphical keyboard; and
an eighth set of instruction codes for gradually changing the analyzed movement trajectory to match one of said command templates.
Dependent claims: 16, 17.
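The teaching limitations (colored points along the drawn trajectory, plus gradually changing the trajectory to match a template) suggest per-point feedback like the following sketch. The color thresholds and the morphing rate are assumptions for illustration, and the stroke and template are presumed already resampled to equal length.

```python
import math

def point_colors(stroke, template, good=0.05, bad=0.3):
    """Color each drawn point by its distance to the aligned template point:
    green = close match, red = poor match, yellow = in between.
    Thresholds are illustrative, not values from the patent."""
    colors = []
    for p, q in zip(stroke, template):
        d = math.dist(p, q)
        colors.append("green" if d <= good else "red" if d >= bad else "yellow")
    return colors

def morph_step(stroke, template, alpha=0.25):
    """One frame of dynamic morphing: move each drawn point a fraction
    alpha of the way toward its template point, so the displayed gesture
    gradually changes to match the command template."""
    return [(px + alpha * (tx - px), py + alpha * (ty - py))
            for (px, py), (tx, ty) in zip(stroke, template)]
```

Repeatedly applying `morph_step` animates the drawn gesture into the template, while the colors show the user which parts of the shape least match the command.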
18. An apparatus for issuing a command using a graphical keyboard, the apparatus comprising:
a first sensing interface for recording movement on said graphical keyboard;
a plurality of channels for automatically recognizing said movement as a command, based on the trajectory of said movement in relation to a layout of said graphical keyboard, wherein the plurality of channels identifies more than one set of pattern aspects of said trajectory in relation to the layout, analyzes the pattern aspects, and compares them to a command gesture database;
an integrator connected to the plurality of channels for analyzing the movement trajectory, wherein the integrator uses context clues from a context model channel, the context clues being based on one or more previous commands gestured by a user and compared with a potential command provided by the plurality of channels, and wherein the integrator then outputs a best-matched command to be executed;
wherein the best-matched command comprises a menu action;
wherein said first sensing interface adapts to a second sensing interface installed on said apparatus, such that said movement on said graphical keyboard records movement on said second sensing interface;
wherein said apparatus uses a plurality of said channels to analyze the shape of said movement and the location of said movement in relation to said graphical keyboard;
wherein said integrator teaches the user to match movement on said graphical keyboard to said command gesture by outputting to said first sensing interface a comparison of information from said command gesture database to the pattern aspects of said movement, allowing the user to see which parts of said movement least match said command gesture database; and
wherein said apparatus teaches the user to match said movement to said command gesture database by outputting colored points to said first sensing interface, at the location of said movement on said graphical keyboard, colored based on how closely said pattern aspects match said command gesture database.
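Claim 18's integrator blends the channels' match quality with context clues from previous commands before outputting a best-matched command. A minimal sketch under stated assumptions: the cost formula, the context weight, and the frequency-count context model are invented for illustration.

```python
def integrate(channel_scores, context_counts, weight=0.3):
    """Integrator sketch: combine per-channel match distances (lower is
    better) with a context prior built from previously issued commands,
    and return the best-matched command.

    channel_scores: {command: [distance per channel]}
    context_counts: {command: times the user recently issued it}
    """
    total = sum(context_counts.values()) or 1
    best, best_cost = None, float("inf")
    for cmd, dists in channel_scores.items():
        match_cost = sum(dists) / len(dists)          # average over channels
        prior = context_counts.get(cmd, 0) / total    # context model channel
        cost = (1 - weight) * match_cost - weight * prior
        if cost < best_cost:
            best, best_cost = cmd, cost
    return best
```

With an ambiguous stroke, the context term can flip the decision toward a command the user has gestured repeatedly, mirroring the claim's comparison of context clues with the channels' potential commands.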
Specification