Methods, circuits, apparatus and systems for human machine interfacing with an electronic appliance
Abstract
Disclosed are methods, circuits, apparatus and systems for human machine interfacing with a computational platform or any other electronic device, such as a cell-phone, smart-phone, e-book reader, notebook computer or tablet computer. According to some embodiments, there may be provided an adaptive touch-screen input arrangement, such as a keyboard, keypad or any other touch-screen input arrangement, including one or more input elements such as keys or buttons which may be projected onto or rendered on a touch-screen display. The adaptive touch-screen input arrangement may be adapted to alter the size, shape or location of input elements within proximity of a finger, limb or implement used by a user to touch the screen.
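The adaptive behavior the abstract describes, sensing an approaching finger and enlarging the input element it is headed for, can be sketched in a few lines. This is an illustrative sketch only, not code from the patent: the rectangle geometry, the one-step projection of the motion vector and the 1.5x enlargement factor are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class InputElement:
    """A rendered touch-activated element: name plus bounding box."""
    name: str
    x: float
    y: float
    w: float
    h: float

def predict_target(elements, finger_pos, motion_vec, step=1.0):
    """Project the finger's sensed motion vector one step ahead and
    pick the element whose center is closest to the projected point."""
    px = finger_pos[0] + motion_vec[0] * step
    py = finger_pos[1] + motion_vec[1] * step
    def dist2(e):
        cx, cy = e.x + e.w / 2, e.y + e.h / 2
        return (cx - px) ** 2 + (cy - py) ** 2
    return min(elements, key=dist2)

def enlarge(element, factor=1.5):
    """Grow the targeted element in place around its center."""
    dw = element.w * (factor - 1)
    dh = element.h * (factor - 1)
    element.x -= dw / 2
    element.y -= dh / 2
    element.w *= factor
    element.h *= factor
    return element
```

A real implementation would filter noisy sensor samples and animate the resize; the sketch only shows the target-selection and enlargement steps.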
Claims
1. A user interface apparatus, comprising:
- a touch-screen input arrangement including a touchscreen and graphics processing circuitry adapted to render touch activated user input elements on said touchscreen, wherein a touch activated user input element is a control element having an associated function triggered by the user physically touching a point on the touchscreen where the element is rendered;
- one or more image sensors adapted to sense a position or motion vector, relative to said touch activated user input elements, of a user finger, limb or control implement approaching said touch-screen;
- mapping information correlating locations of the rendered touch activated user input elements to functions of applications;
- processing circuitry adapted to determine a given rendered touch activated input element the user finger, limb or control implement is approaching, based on the position or motion vector sensed by said one or more image sensors; and
- a controller adapted to: (1) cause said graphics processing circuitry to facilitate interaction with said given rendered touch activated input element by altering a size, shape or location of said given rendered touch activated input element towards the finger, limb or implement, and (2) modify the mapping information to account for the altered size, shape or location of the given rendered touch activated input element.
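Claim 1's controller performs two coupled steps: alter the element's geometry, then modify the mapping information so that hit-testing matches the new geometry. A minimal sketch of that mapping update, illustrative only; the dictionary-of-rectangles representation and the function names are assumptions, not from the patent:

```python
def build_hit_map(elements):
    """Mapping information correlating each element's on-screen
    rectangle to the function it triggers. 'elements' maps a
    function name to an (x, y, w, h) rectangle."""
    return {fn: (x, y, x + w, y + h) for fn, (x, y, w, h) in elements.items()}

def resize_toward(elements, fn, dx, dy, factor):
    """Step (1): move/enlarge the targeted element toward the finger.
    Step (2): rebuild the mapping so touches resolve against the
    altered geometry rather than the original one."""
    x, y, w, h = elements[fn]
    elements[fn] = (x + dx, y + dy, w * factor, h * factor)
    return build_hit_map(elements)

def hit_test(hit_map, tx, ty):
    """Resolve a physical touch point to the function it triggers."""
    for fn, (x0, y0, x1, y1) in hit_map.items():
        if x0 <= tx <= x1 and y0 <= ty <= y1:
            return fn
    return None
```

Without step (2), a touch landing on the enlarged portion of the element would miss in the stale map, which is exactly the failure the claim's mapping update avoids.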
2. An electronic device, comprising:
- a processor;
- a battery;
- a touch-screen input arrangement including a touchscreen and graphics processing circuitry adapted to render touch activated user input elements on said touchscreen, wherein a touch activated user input element is a control element having an associated function triggered by the user physically touching a point on the touchscreen where the element is rendered;
- a touchless sensor adapted to sense a motion vector, relative to said touch activated user input elements, of a user finger, limb or control implement approaching said touch-screen; and
- a controller adapted to: (1) receive from said touchless sensor the motion vector of the user finger, limb or control implement, (2) determine a given rendered touch activated input element the received motion vector is directed towards, and (3) cause said graphics processing circuitry to facilitate interaction with said given rendered touch activated input element by altering a location, size, position or shape of said given rendered touch activated input element; wherein altering the given touch activated input element includes enlarging the given input element.
4. A method for human-machine interfacing, said method comprising:
- providing, upon a touchscreen, a graphic user interface including touch activated user input elements, wherein a touch activated user input element is a control element having an associated function triggered by the user physically touching a point on the touchscreen where the element is rendered;
- determining, by use of one or more image sensors, a motion vector, relative to the touch activated user input elements, of a user finger, limb or control implement approaching the touch-screen input arrangement;
- determining, based on the determination of a motion vector of a user finger, limb or implement, a given rendered touch activated input element the motion vector is directed towards; and
- facilitating interaction with the given rendered touch activated input element by altering a location, size, position or shape of the given rendered touch activated input element; wherein altering the given touch activated input element includes enlarging the given input element.
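The method of claim 4 can be sketched as a short pipeline: estimate a motion vector from successive image-sensor samples, then project along it to find the element the vector is directed towards. Again an illustrative sketch; the two-sample difference estimate and the fixed number of projection steps are assumptions, not the patent's disclosure.

```python
def motion_vector(prev_pos, cur_pos):
    """Estimate the finger's motion vector from two successive
    image-sensor position samples (real sensors would filter noise)."""
    return (cur_pos[0] - prev_pos[0], cur_pos[1] - prev_pos[1])

def element_in_path(elements, cur_pos, vec, steps=10):
    """Walk forward along the motion vector and return the first
    element whose bounding box the projected path enters, i.e. the
    element the vector is directed towards. 'elements' maps a name
    to an (x, y, w, h) rectangle."""
    for i in range(1, steps + 1):
        px = cur_pos[0] + vec[0] * i
        py = cur_pos[1] + vec[1] * i
        for name, (x, y, w, h) in elements.items():
            if x <= px <= x + w and y <= py <= y + h:
                return name
    return None
```

The element returned here would then be enlarged, as in the final step of the claimed method.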
Specification