Hover angle
Abstract
Example apparatus and methods concern detecting an angle at which an object is interacting with a hover-sensitive input/output interface. An example apparatus may include a proximity detector configured to detect an object in a hover-space associated with the hover-sensitive input/output interface. The proximity detector may provide three-dimensional position information for the object (e.g., x,y,z). The angle may be determined from a first (x,y,z) measurement associated with a first portion (e.g., tip) of the object and a second (x,y,z) measurement associated with a second portion (e.g., end) of the object. The position of the object may determine a hover point on the interface while the position and angle may determine an intersection point on the interface. User interface elements or other information displayed on the interface may be manipulated based, at least in part, on the intersection point. Multiple objects interacting at multiple angles may be detected and responded to.
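The geometry the abstract describes can be sketched in a few lines. This is an illustrative reconstruction, not the patent's implementation: the function name, the convention that z = 0 is the interface surface, and the assumption that the tip sample sits closer to the surface than the end sample are all mine.

```python
import math

def hover_geometry(tip, end):
    """Given (x, y, z) samples for two portions of a hovering object
    (tip and end), return the hover point, the pitch angle in degrees,
    and the point where the object's axis, extended past the tip,
    intersects the z = 0 interface plane."""
    tx, ty, tz = tip
    ex, ey, ez = end
    hover_point = (tx, ty)                  # tip projected straight down
    dx, dy, dz = tx - ex, ty - ey, tz - ez  # axis direction, end -> tip
    # Pitch: angle between the object's axis and the interface plane.
    pitch = math.degrees(math.atan2(abs(dz), math.hypot(dx, dy)))
    # Extend the axis until it reaches z = 0 (parallel objects never do).
    t = tz / -dz if dz != 0 else 0.0
    intersection = (tx + t * dx, ty + t * dy)
    return hover_point, pitch, intersection
```

A stylus tilted 45° toward +x with its tip at (5, 5, 1) yields a hover point of (5, 5) but an intersection point of (6, 5), which is why the two are distinct concepts in the claims.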
20 Claims
1. A method, comprising:
acquiring position data about a portion of an object located at least partially in a three-dimensional hover space produced by a portable apparatus having a hover-sensitive interface, where the position data is acquired solely by a passive capacitive sensing node that detects a capacitance change in the hover space caused by the object;
where the position data describes the position of the object in the three-dimensional hover space, where a first dimension and a second dimension in the hover space define a plane that is parallel to the surface of the interface, where the second dimension is orthogonal to the first dimension, and where a third dimension in the hover space is orthogonal to both the first dimension and the second dimension and perpendicular to the plane;
computing an angle at which the object is pitched with respect to the hover-sensitive interface based, at least in part, on the position data; and
controlling the hover-sensitive interface based, at least in part, on the angle at which the object is pitched with respect to the hover-sensitive interface.
- View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14)
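The final "controlling" limitation of claim 1 leaves the policy open. One way to picture it is a dispatch on the computed pitch; the thresholds and action names below are hypothetical illustrations, not anything recited in the claim.

```python
def control_for_pitch(pitch_degrees):
    # Hypothetical policy mapping the computed pitch to an interface
    # behavior. The thresholds (75, 30) and behaviors are illustrative.
    if pitch_degrees >= 75.0:
        return "treat as pointing: show a precise cursor"
    if pitch_degrees >= 30.0:
        return "treat as writing: enable ink input"
    return "treat as shading: enable broad-stroke input"
```

The point is only that a single scalar (the pitch) is enough to drive qualitatively different interface responses.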
15. A computer-readable storage device storing computer-executable instructions that when executed by a computer cause the computer to perform a method, the method comprising:
acquiring position data about a portion of an object located at least partially in a three-dimensional hover space produced by a phone or tablet computer having a hover-sensitive interface, where the position data is acquired by a passive capacitive sensing node that detects a capacitance change in the hover space caused by the object;
where the position data describes the position of the object in the three-dimensional hover space, where a first dimension and a second dimension in the hover space define a plane that is parallel to the surface of the hover-sensitive interface, where the second dimension is orthogonal to the first dimension, and where a third dimension in the hover space is orthogonal to both the first dimension and the second dimension and perpendicular to the plane;
computing an angle at which the object is pitched with respect to the hover-sensitive interface based on the position data, where the angle is computed based on position data associated with two different portions of the object;
establishing a hover point for the object based, at least in part, on information in the position data about the location of a portion of the object in the first dimension and the location of the object in the second dimension;
establishing an intersection point for the object based, at least in part, on information in the position data about the location of the object in the first dimension, the location of the object in the second dimension, the location of the object in the third dimension, and on the angle;
identifying a portion of the hover-sensitive interface that is occluded by the object based, at least in part, on the position data and the angle, and selectively manipulating the portion of the hover-sensitive interface that is occluded by the object; and
selectively controlling the appearance of the hover-sensitive interface based, at least in part, on the angle, the hover point, and the intersection point, where controlling the appearance of the hover-sensitive interface comprises re-orienting an item displayed on the hover-sensitive interface, or dynamically reconfiguring a user interface element, where reconfiguring the user interface element includes changing an appearance of the user interface element, changing a position of the user interface element, changing an orientation of the user interface element, changing a size of the user interface element, or simulating a mouse-over event for the user interface element;
selectively controlling the operation of the hover-sensitive interface based, at least in part, on the angle, the hover point, and the intersection point, where controlling the operation of the hover-sensitive interface includes enhancing a functionality of a first user interface element located in an area within a threshold distance of the intersection point or diminishing the functionality of a second user interface element located beyond the threshold distance from the intersection point; or
selectively controlling the operation of a user interface element on the hover-sensitive interface based, at least in part, on the angle, the hover point, or the intersection point, where controlling the operation of the user interface element includes controlling a direction of a graphical effect associated with the user interface element, controlling an intensity of a graphical effect associated with the user interface element, or controlling an area impacted by a graphical effect associated with the user interface element.
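Claim 15's enhance-or-diminish limitation amounts to partitioning interface elements by their distance from the intersection point. A minimal sketch, assuming elements are keyed by name with (x, y) centers on the interface plane (all names and the threshold are illustrative):

```python
import math

def classify_elements(elements, intersection, threshold):
    """Split UI elements into those within `threshold` of the
    intersection point (candidates for enhanced functionality) and
    those beyond it (candidates for diminished functionality)."""
    enhanced, diminished = [], []
    ix, iy = intersection
    for name, (x, y) in elements.items():
        if math.hypot(x - ix, y - iy) <= threshold:
            enhanced.append(name)
        else:
            diminished.append(name)
    return enhanced, diminished
```

For example, with the intersection point at the origin and a threshold of 5, an element centered at (1, 1) would be enhanced while one at (30, 0) would be diminished.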
16. An apparatus, comprising:
a processor;
a hover-sensitive input/output interface;
a proximity detector configured to detect a portion of an object in a hover-space associated with the hover-sensitive input/output interface and to provide location data concerning the portion of the object, where the proximity detector includes a passive capacitive sensing node that detects a capacitance change in the hover-space;
a memory;
a set of logics configured to process events associated with the hover-space and the object; and
an interface configured to connect the processor, the hover-sensitive input/output interface, the proximity detector, the memory, and the set of logics;
the set of logics including:
a first logic configured to handle a hover event associated with the object in the hover-space, where handling the hover event includes producing first location data that identifies a first location in the hover space at which a first portion of the object is located and producing second location data that identifies a second location in the hover space at which a second portion of the object is located, where the first location data and second location data are acquired passively by the hover-sensitive input/output interface using the passive capacitive sensing node; and
a second logic configured to produce angle information from the first location data and the second location data, where the angle information describes an angle at which the object intersects a normal of the input/output interface.
- View Dependent Claims (17, 18, 19, 20)
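Note that claim 16's second logic measures the angle against the interface normal rather than against the interface plane, i.e. the complement of the pitch in claims 1 and 15. A sketch of that computation from the two location samples (the function name and the z = 0 surface convention are assumptions):

```python
import math

def angle_to_normal(first, second):
    """Angle, in degrees, between the axis through two (x, y, z)
    locations on the object and the interface normal (the z axis).
    0 degrees means the object points straight into the interface."""
    dx = second[0] - first[0]
    dy = second[1] - first[1]
    dz = second[2] - first[2]
    return math.degrees(math.atan2(math.hypot(dx, dy), abs(dz)))
```

An object held perpendicular to the screen gives 0°; one tilted 45° gives 45°, which is 90° minus the pitch reported by the claim 1 formulation.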
Specification