Method and system for a full screen user interface and data entry using sensors to implement handwritten glyphs
Abstract
A computer implemented method of implementing a touch screen user interface in conjunction with sensors for a computer system. A touchscreen area is provided for accepting text input strokes and for accepting icon manipulation strokes. A sensor is provided adjacent to the touchscreen area for registering sensor actuations from a user. In implementing the user interface, user input strokes are accepted into the touchscreen input area. User input strokes into the touchscreen input area are interpreted as text input strokes when the sensor is actuated by the user during the user input strokes. User input strokes into the touchscreen input area are interpreted as icon manipulation strokes and not text input strokes when the sensor is not actuated by the user during the user input strokes. The nature of the sensor actuation is configured to require a deliberate touching, holding, grasping, or the like, to register an actuation of the sensor in order to distinguish between text input strokes and icon manipulation strokes, but not to require so significant a user actuation as to interfere with a fluid and intuitive text entry process of the user. Interpreted input strokes can be rapidly displayed as characters on the touchscreen when the sensor is actuated by the user. An audible feedback can be generated to confirm the icon manipulation when the sensor is not actuated by the user. Similarly, a visual feedback can be generated to confirm the icon manipulation when the sensor is not actuated by the user.
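The interpretation logic described in the abstract amounts to a dispatch on the grasp-sensor state: the same stroke is routed to handwriting recognition when the sensor is held, and to icon handling otherwise. The sketch below is purely illustrative; the names (`Stroke`, `interpret_stroke`, the recognizer and feedback placeholders) are assumptions, not identifiers from the patent:

```python
# Hypothetical sketch of the claimed stroke interpretation: the grasp
# sensor acts as a mode switch between text entry and icon manipulation.
from dataclasses import dataclass

@dataclass
class Stroke:
    points: list           # (x, y) samples captured on the touchscreen
    sensor_actuated: bool  # True if the grasp sensor was held during the stroke

def recognize_text(stroke):
    # Placeholder for a handwriting/glyph recognizer.
    return "a"

def manipulate_icon(stroke):
    # Placeholder for icon handling: tap, drag, launch, etc.
    return f"icon action at {stroke.points[0]}"

def interpret_stroke(stroke, display, confirm):
    """Dispatch a stroke based on the grasp-sensor state."""
    if stroke.sensor_actuated:
        char = recognize_text(stroke)
        display(char)       # rapidly echo the recognized character on screen
        return ("text", char)
    action = manipulate_icon(stroke)
    confirm()               # audible/visual confirmation of the icon action
    return ("icon", action)
```

In a real device the recognizer and the feedback mechanisms would be hardware-specific; the point is only that the sensor state, not the stroke shape, selects the interpretation mode.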
20 Claims
1. A computer implemented method of implementing a touch screen user interface for a computer system, said method comprising the steps of:
providing a touchscreen area for accepting text input strokes and for accepting icon manipulation strokes;
providing a sensor adapted to detect a grasp pressure adjacent to the touchscreen area for registering an actuation by a user;
accepting user input strokes into the touchscreen area;
interpreting user input strokes into the touchscreen area as text input strokes when the sensor is actuated by the user; and
interpreting user input strokes into the touchscreen area as icon manipulation strokes and not text input strokes when the sensor is not actuated by the user.
(Dependent claims: 2, 3, 4, 5, 6, 7)
8. A computer implemented method of implementing a user interface for a computer system, said method comprising the steps of:
providing a user input area for accepting text input strokes and for accepting icon manipulation strokes;
providing a sensor adapted to detect a grasp pressure adjacent to the user input area for registering an actuation by a user;
accepting user input strokes into the user input area;
interpreting user input strokes into the user input area as text input strokes when the sensor is actuated by the user; and
interpreting user input strokes into the user input area as icon manipulation strokes and not text input strokes when the sensor is not actuated by the user.
(Dependent claims: 9, 10, 11, 12, 13, 14, 15, 16)
17. A touchscreen equipped computer apparatus, comprising:
a touchscreen area for accepting text input strokes and for accepting icon manipulation strokes;
a sensor adapted to detect a grasp pressure adjacent to the touchscreen area for registering an actuation by a user; and
a computer system coupled to the touchscreen area, the computer system configured to implement the steps of:
accepting user input strokes into the touchscreen area;
interpreting user input strokes into the touchscreen area as text input strokes when the sensor is actuated by the user; and
interpreting user input strokes into the touchscreen area as icon manipulation strokes and not text input strokes when the sensor is not actuated by the user.
(Dependent claims: 18, 19, 20)
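The apparatus of claim 17 can be pictured as a computer system coupled to a touchscreen and a grasp sensor, where a deliberate grasp, but not an incidental touch, switches the stroke route. This minimal model uses invented names (`GraspSensor`, `TouchscreenComputer`, the `threshold` parameter); the patent specifies no API:

```python
# Illustrative model of the claim-17 apparatus. All class and method
# names are hypothetical assumptions made for this sketch.
class GraspSensor:
    def __init__(self, threshold=0.5):
        self.threshold = threshold  # minimum pressure counted as deliberate
        self.pressure = 0.0

    def actuated(self):
        # A light, incidental touch stays below threshold, so text mode
        # requires a deliberate grasp without demanding excessive force.
        return self.pressure >= self.threshold

class TouchscreenComputer:
    def __init__(self, sensor):
        self.sensor = sensor
        self.text_buffer = []   # strokes routed to handwriting recognition
        self.icon_events = []   # strokes routed to icon manipulation

    def accept_stroke(self, points):
        if self.sensor.actuated():
            self.text_buffer.append(points)
            return "text"
        self.icon_events.append(points)
        return "icon"
```

Making the grasp threshold a tunable parameter reflects the abstract's requirement that actuation be deliberate yet not so forceful as to interfere with fluid text entry.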
Specification