Grip Detection
Abstract
Example apparatus and methods detect how a portable (e.g., handheld) device (e.g., phone, tablet) is gripped (e.g., held, supported). Detecting the grip may include detecting and characterizing touch points for fingers, thumbs, palms, or surfaces that are involved in supporting and positioning the apparatus. Example apparatus and methods may determine whether and how an apparatus is being held and then may exercise control based on the grip detection. For example, a display on an input/output interface may be reconfigured, physical controls (e.g., push buttons) on the apparatus may be remapped, user interface elements may be repositioned, resized, or repurposed, portions of the input/output interface may be desensitized or hyper-sensitized, virtual controls may be remapped, or other actions may be taken. Touch sensors may detect the pressure with which a smart phone is being gripped and produce control events (e.g., on/off, louder/quieter, brighter/dimmer, press and hold) based on the pressure.
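The pipeline described in the abstract (detect grip points, classify them into a grip context, then adjust the interface) can be sketched as follows. This is an illustrative sketch only, not the patent's implementation; all names (`GripPoint`, `infer_grip_context`) and the left/right-half heuristic are hypothetical.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class GripPoint:
    x: float          # horizontal position, normalized 0..1 (left to right)
    y: float          # vertical position, normalized 0..1
    pressure: float   # touch pressure, normalized 0..1
    kind: str         # "finger", "thumb", "palm", or "surface"

def infer_grip_context(points: List[GripPoint]) -> str:
    """Classify which hand(s) are gripping the device from touch-point positions.

    Crude heuristic for illustration: points only on the left half suggest a
    left-hand grip, only on the right half a right-hand grip, both halves a
    two-hand grip, and an empty set a no-hands grip.
    """
    if not points:
        return "no-hands"
    left = any(p.x < 0.5 for p in points)
    right = any(p.x >= 0.5 for p in points)
    if left and right:
        return "two-hands"
    return "left-hand" if left else "right-hand"
```

A real classifier would also weigh touch duration, pressure, and the finger/thumb/palm distinction the abstract mentions, rather than position alone.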
20 Claims
1. A method, comprising:

identifying a non-empty set of points where an apparatus is being gripped, the apparatus being a portable device configured with a touch or hover-sensitive display;

determining a grip context based on the set of points; and

controlling the operation or appearance of the apparatus based, at least in part, on the grip context.

(Dependent claims 2-13 not shown.)
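The third step of the claim, controlling the appearance of the apparatus from the grip context, could for example move a user interface element under the gripping thumb. The sketch below is a hypothetical illustration of that idea; the function name and the left/right placement rule are assumptions, not taken from the patent.

```python
def reposition_for_grip(grip_context: str, screen_width: int, element_width: int) -> int:
    """Return an x-position (pixels) for a UI element based on the grip context.

    For a right-hand grip the element moves to the right edge, within thumb
    reach; for a left-hand grip, to the left edge; otherwise it is centered.
    """
    if grip_context == "right-hand":
        return screen_width - element_width
    if grip_context == "left-hand":
        return 0
    return (screen_width - element_width) // 2
```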
14. A computer-readable storage medium storing computer-executable instructions that when executed by a computer cause the computer to perform a method, the method comprising:

identifying a non-empty set of points where an apparatus is being gripped, the apparatus being a portable device configured with a touch or hover-sensitive display, where the set of points are identified from first information provided by the display or where the set of points are identified from second information provided by a plurality of touch sensors, where the plurality of touch sensors are located on the front, side, or back of the apparatus, and where the touch sensors are not part of the display, where the first information includes a touch location, a touch duration, or a touch pressure, and where the first information identifies a member of the set of points as being associated with a finger, a thumb, a palm, or a surface;

determining a grip context based on the set of points, where the grip context identifies whether the apparatus is being gripped in a right hand, in a left hand, by a left hand and a right hand, or by no hands, and where the grip context identifies whether the apparatus is being gripped in a portrait orientation or in a landscape orientation;

controlling the operation or appearance of the apparatus based, at least in part, on the grip context, where controlling the operation or appearance of the apparatus includes controlling the operation or appearance of the display based, at least in part, on the set of points and the grip context, where controlling the operation or appearance of the display includes manipulating a position of a user interface element displayed on the display, manipulating a color of the user interface element, manipulating a size of the user interface element, manipulating a shape of the user interface element, manipulating a sensitivity of the user interface element, controlling whether the display presents information in a portrait or landscape orientation, or changing the sensitivity of a portion of the display, where controlling the operation of the apparatus includes controlling the operation of a physical control on the apparatus based, at least in part, on the set of points and the grip context, where the physical control is not part of the display;

detecting an action performed on a touch sensitive input region on the apparatus or on the display, where the touch sensitive input region is not part of the display;

characterizing the action to produce a characterization data that describes a duration of the action, a location of the action, a pressure of the action, or a direction of the action;

selectively controlling the apparatus based, at least in part, on the action or the characterization data, where selectively controlling the apparatus includes controlling an appearance of the display, controlling an operation of the display, controlling an operation of the touch sensitive input region, controlling an application running on the apparatus, generating a control event for the application, or controlling a component of the apparatus; and

detecting a squeeze pressure with which the apparatus is being squeezed based, at least in part, on the touch pressure associated with at least two members of the set of points, and controlling the apparatus based, at least in part, on the squeeze pressure, to selectively answer a phone call, selectively adjust a volume for the apparatus, selectively adjust a brightness of the display, or selectively control an intensity of an effect in a video game being played on the apparatus.
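The final element of claim 14 recognizes a squeeze from the touch pressure at two or more grip points and maps it to a control action (answer a call, adjust volume, and so on). A minimal sketch of that idea follows; the function name, the thresholds, and the pressure-to-event mapping are all hypothetical choices for illustration, not values from the patent.

```python
from typing import List, Optional

def squeeze_event(pressures: List[float], threshold: float = 0.7) -> Optional[str]:
    """Map a squeeze gesture to a control event.

    `pressures` holds the normalized (0..1) touch pressure at each grip
    point. A squeeze is recognized only when at least two points exceed
    the threshold, matching the claim's "at least two members" language.
    Returns an event name, or None when no squeeze is detected.
    """
    squeezed = [p for p in pressures if p >= threshold]
    if len(squeezed) < 2:
        return None
    # Assumed mapping: a hard squeeze answers a call, a softer one adjusts volume.
    return "answer-call" if max(squeezed) > 0.9 else "adjust-volume"
```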
15. An apparatus, comprising:

a processor;

a hover-sensitive input/output interface configured to detect a first point at which the apparatus is being held;

a touch interface configured to detect a second point at which the apparatus is being held, the touch interface being configured to detect touches in locations other than the hover-sensitive input/output interface;

a memory;

a set of logics configured to determine and respond to how the apparatus is being held; and

an interface configured to connect the processor, the hover-sensitive input/output interface, the touch interface, the memory, and the set of logics;

the set of logics including:

a first logic configured to handle a first hold event generated by the hover-sensitive input/output interface;

a second logic configured to handle a second hold event generated by the touch interface; and

a third logic configured to determine a hold parameter for the apparatus based, at least in part, on the first point, the first hold event, the second point, or the second hold event, where the hold parameter identifies whether the apparatus is being held in a right hand grip, a left hand grip, a two hands grip, or a no hands grip, and where the hold parameter identifies an edge of the apparatus as the current top edge of the apparatus, and to generate a control event based, at least in part, on the hold parameter, where the control event controls a property of the hover-sensitive input/output interface, a property of the touch interface, or a property of the apparatus.

(Dependent claims 16-20 not shown.)
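The third logic of claim 15 fuses hold points from two separate sensing surfaces (the hover-sensitive display and the independent touch sensors) into one hold parameter covering both grip and current top edge. A hypothetical sketch of that fusion step is below; the function name, the `(x, y)` point representation, and especially the crude top-edge rule are illustrative assumptions only.

```python
from typing import Dict, Iterable, Tuple

Point = Tuple[float, float]  # (x, y), each normalized 0..1

def hold_parameter(hover_points: Iterable[Point],
                   touch_points: Iterable[Point]) -> Dict[str, str]:
    """Fuse hold points from the display and the separate touch sensors
    into a single hold parameter: grip classification plus current top edge."""
    points = list(hover_points) + list(touch_points)
    if not points:
        return {"grip": "no-hands", "top_edge": "north"}
    left = any(x < 0.5 for x, _y in points)
    right = any(x >= 0.5 for x, _y in points)
    grip = "two-hands" if left and right else ("left-hand" if left else "right-hand")
    # Placeholder assumption: a two-hand grip on opposite edges suggests
    # landscape use, so report a side edge as the current top edge.
    top_edge = "west" if grip == "two-hands" else "north"
    return {"grip": grip, "top_edge": top_edge}
```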
Specification