TOUCH SCREEN APPARATUS AND METHOD FOR INPUTTING USER INFORMATION ON A SCREEN THROUGH CONTEXT AWARENESS
Abstract
The present invention provides a touch screen apparatus comprising a first light-emitting unit for generating an optical signal for performing non-touch sensing, a second light-emitting unit for generating an optical signal for performing touch sensing together with the non-touch sensing, an optical guide unit for guiding light emitted from the second light-emitting unit, and a light-receiving unit for receiving the light emitted and changed by an object. The present invention further provides a method for inputting user information on a screen through context awareness, which can input user information accurately and conveniently through the awareness of a variety of user contexts, and which can effectively prevent an erroneous operation caused by contact of the user's palm by ignoring contact coordinates input on the screen by any means other than the user's finger.
46 Citations
17 Claims
1. A touch screen apparatus comprising:
a first light-emitting section for emitting light of an optical signal to perform non-touch sensing;
a second light-emitting section for emitting light of an optical signal to perform touch sensing along with the non-touch sensing;
a light guide section for guiding the light emitted from the second light-emitting section; and
a light-receiving section for receiving the lights emitted from the first light-emitting section and the second light-emitting section varying with an object.
(Dependent Claims: 2, 3, 4, 5, 6)
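The apparatus of claim 1 can be read as a two-channel optical scheme: the first emitter's free-space signal detects a hovering object, while the second emitter's light, carried in the light guide along the screen surface, is disturbed only on actual contact. A minimal sketch of that decision logic, with invented signal names and thresholds (the claim specifies none):

```python
# Hypothetical sketch of the dual-emitter sensing in claim 1.
# Signal names and threshold values are illustrative assumptions.

def classify(non_touch_level: float, touch_level: float,
             hover_threshold: float = 0.2, touch_threshold: float = 0.5) -> str:
    """Classify the object state from the two received signal levels.

    non_touch_level: disturbance of the first emitter's free-space signal
                     (proximity / non-touch sensing).
    touch_level:     disturbance of the second emitter's guided light
                     (contact / touch sensing).
    """
    if touch_level >= touch_threshold:
        return "touch"   # object disturbs the guided light: surface contact
    if non_touch_level >= hover_threshold:
        return "hover"   # object affects only the free-space signal
    return "none"
```

For example, `classify(0.3, 0.1)` reports a hover, while `classify(0.3, 0.6)` reports a touch.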
7. A touch screen apparatus comprising:
first and second light-emitting sections for emitting lights of optical signals to perform non-touch sensing and touch sensing; and
a light-receiving section for receiving the lights emitted from the first and second light-emitting sections varying with an object,
wherein the light-receiving section separates and senses the lights emitted from the first and second light-emitting sections.
(Dependent Claims: 8, 9, 10)
11. A method for inputting user information on a screen through context awareness, comprising the steps of:
(a) recognizing a position of a user by sensing the user accessing the screen;
(b) recognizing a position of the user's hand by sensing an access state of the user located on the screen;
(c) recognizing right and left hands of the user using an angle and a distance according to the position of the user and the position of the user's hand recognized in steps (a) and (b);
(d) recognizing a shape and a specific motion of the user's hand by sensing a motion of the user located on the screen;
(e) recognizing a type of finger of the user located on the screen using a real-time image processing method; and
(f) allocating, after sensing an object making contact on the screen and recognizing coordinates of the object, a specific command for recognized contact coordinates on the basis of at least one of the left and right hands of the user, the shape and the specific motion of the user's hand, and the type of finger of the user recognized in steps (c) to (e).
(Dependent Claims: 12, 13, 14, 15)
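The context-awareness pipeline of claim 11 can be sketched end-to-end: step (c) infers handedness from the geometry between the user's position and the hand's position, and step (f) maps the recognized context onto a command for the sensed contact coordinates. Everything below is a hypothetical illustration, since the claim leaves the recognizers and the command table unspecified:

```python
# Minimal sketch of claim 11's steps (c) and (f). The angle rule and the
# command table are invented placeholders, not taken from the patent.
import math

def which_hand(user_pos, hand_pos):
    """Step (c): infer left vs. right hand from the angle and distance
    between the user's body position and the hand position.
    Illustrative rule: a hand above the body axis is taken as 'right'."""
    angle = math.atan2(hand_pos[1] - user_pos[1], hand_pos[0] - user_pos[0])
    return "right" if angle >= 0 else "left"

def allocate_command(hand, finger, commands):
    """Step (f): allocate a specific command for the recognized contact
    coordinates based on the recognized hand and finger type."""
    return commands.get((hand, finger), "default")

# Hypothetical command table keyed on (hand, finger type).
COMMANDS = {
    ("right", "index"): "select",
    ("left", "index"): "context-menu",
}
```

A contact by any other means (e.g., the palm) would simply find no entry in the table, matching the abstract's point about ignoring non-finger contact coordinates.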
16. A method for inputting user information on a screen through context awareness, comprising the steps of:
(a′) recognizing a shape and a specific motion of a user's hand by sensing a motion of the user located on the screen; and
(b′) allocating a specific command on the basis of the recognized shape and specific motion of the user's hand.
(Dependent Claims: 17)
Specification