Methods and apparatuses for gesture-based user input detection in a mobile device
Abstract
Methods and apparatuses are provided that may be implemented in a mobile device to: determine whether the mobile device is in a gesture command input ready state based, at least in part, on a display portion of the mobile device remaining in a horizontal viewable position for a threshold period of time; with the mobile device in a gesture command input ready state, determine whether a detected movement of the mobile device represents a gesture command input; and in response to the determined gesture command input, affect a user perceivable output.
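The ready-state test the abstract describes (the display held near a horizontal viewable position for a threshold period, in an environment where gestures are distinguishable from incidental motion) can be sketched roughly as follows. This is an illustrative reconstruction, not the patented implementation: the thresholds, the function names, and the `in_gesture_friendly_environment` flag are all hypothetical.

```python
import math

TILT_THRESHOLD_DEG = 20.0  # hypothetical "threshold angle" from horizontal
DWELL_THRESHOLD_S = 1.5    # hypothetical "threshold period of time"

def tilt_from_horizontal_deg(ax, ay, az):
    """Angle between the display normal and vertical, from accelerometer
    readings in m/s^2; 0 degrees = display face-up and level."""
    g = math.sqrt(ax * ax + ay * ay + az * az)
    # cos(tilt) is the share of gravity along the device's z (screen) axis.
    return math.degrees(math.acos(max(-1.0, min(1.0, az / g))))

def is_gesture_ready(samples, in_gesture_friendly_environment):
    """samples: list of (t_seconds, ax, ay, az) accelerometer readings.
    Ready when the display has stayed within the tilt threshold for the
    dwell period AND the estimated location suggests gesture movements
    are distinguishable from non-gesture movements."""
    if not in_gesture_friendly_environment:
        return False
    dwell_start = None
    for t, ax, ay, az in samples:
        if tilt_from_horizontal_deg(ax, ay, az) <= TILT_THRESHOLD_DEG:
            dwell_start = t if dwell_start is None else dwell_start
            if t - dwell_start >= DWELL_THRESHOLD_S:
                return True
        else:
            dwell_start = None  # tilt exceeded; restart the dwell timer
    return False
```

A real device would feed this from a live inertial-sensor stream and combine it with a location estimate (e.g., suppressing gesture detection while in a moving vehicle), per the environment condition in the claims.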
38 Claims
1. A method comprising, at a mobile device:
- subsequent to initiation of a user perceivable output, determining whether said mobile device is in a gesture command input ready state based, at least in part, on a display portion of said mobile device remaining within a threshold angle of a horizontal viewable position for a threshold period of time and an estimated location of said mobile device indicating that said mobile device is in an environment where movements of said mobile device corresponding to gesture command inputs are likely to be distinguishable from movements of said mobile device that do not correspond to gesture command inputs;
- with said mobile device in said gesture command input ready state, determining whether a detected movement of said mobile device represents a gesture command input; and
- in response to a determination that said detected movement represents said gesture command input, affecting said user perceivable output.
(Dependent claims: 2, 3, 4, 5, 6, 7, 8, 9)
10. An apparatus for use in a mobile device, the apparatus comprising:
- means for initiating a user perceivable output;
- means for determining whether said mobile device is in a gesture command input ready state subsequent to initiating said user perceivable output based, at least in part, on a display portion of said mobile device remaining within a threshold angle of a horizontal viewable position for a threshold period of time and an estimated location of said mobile device indicating that said mobile device is in an environment where movements of said mobile device corresponding to gesture command inputs are likely to be distinguishable from movements of said mobile device that do not correspond to gesture command inputs;
- means for detecting movement of said mobile device;
- means for determining whether said detected movement of said mobile device represents a gesture command input with said mobile device in said gesture command input ready state; and
- means for affecting said user perceivable output, in response to said determined gesture command input.
(Dependent claims: 11, 12, 13, 14, 15, 16, 17, 18, 19)
20. A mobile device comprising:
- one or more output devices, comprising at least a display device;
- one or more inertial sensors; and
- a processing unit configured to, subsequent to initiation of a user perceivable output via at least one of said one or more output devices:
  - determine whether said mobile device is in a gesture command input ready state based, at least in part, on said display device remaining within a threshold angle of a horizontal viewable position for a threshold period of time and an estimated location of said mobile device indicating that said mobile device is in an environment where movements of said mobile device corresponding to gesture command inputs are likely to be distinguishable from movements of said mobile device that do not correspond to gesture command inputs;
  - determine whether a movement of said mobile device represents a gesture command input with said mobile device in said gesture command input ready state, said movement being based, at least in part, on at least one signal associated with at least one of said one or more inertial sensors; and
  - affect said user perceivable output, in response to said determined gesture command input.
(Dependent claims: 21, 22, 23, 24, 25, 26, 27, 28, 29)
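Claim 20's processing unit must decide whether an inertial-sensor movement "represents a gesture command input" and then affect the output. One common way to do this is a shake classifier that counts reversals of strong lateral acceleration; the sketch below is a hypothetical example under assumed thresholds, not the claimed algorithm, and `silence_alert` stands in for any user-perceivable-output change (e.g., silencing a ringtone).

```python
ACCEL_PEAK = 12.0   # m/s^2, hypothetical magnitude for a deliberate shake
MIN_REVERSALS = 3   # hypothetical number of direction changes required

def is_shake_gesture(ax_samples):
    """ax_samples: lateral accelerometer readings (m/s^2) over a short
    window. Counts sign reversals of above-threshold acceleration so that
    a deliberate back-and-forth shake passes but incidental jostling does
    not."""
    reversals, last_sign = 0, 0
    for a in ax_samples:
        if abs(a) < ACCEL_PEAK:
            continue  # too weak to count as part of a deliberate shake
        sign = 1 if a > 0 else -1
        if last_sign and sign != last_sign:
            reversals += 1
        last_sign = sign
    return reversals >= MIN_REVERSALS

def on_movement(ax_samples, silence_alert):
    """Affect the user perceivable output in response to a determined
    gesture command input (here: call silence_alert on a detected shake)."""
    if is_shake_gesture(ax_samples):
        silence_alert()
        return True
    return False
```

In practice this check would only run while the device is in the gesture command input ready state, which is what lets weak thresholds like these work: movements are evaluated only when the environment and orientation make gestures distinguishable from ordinary handling.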
30. An article comprising:
a non-transitory computer readable medium having stored therein computer executable instructions executable by a processing unit of a mobile device to, subsequent to initiation of a user perceivable output:
- determine whether said mobile device is in a gesture command input ready state based, at least in part, on a display portion of said mobile device remaining within a threshold angle of a horizontal viewable position for a threshold period of time and an estimated location of said mobile device indicating that said mobile device is in an environment where movements of said mobile device corresponding to gesture command inputs are likely to be distinguishable from movements of said mobile device that do not correspond to gesture command inputs;
- with said mobile device in said gesture command input ready state, determine whether a detected movement of said mobile device represents a gesture command input; and
- in response to a determination that said detected movement represents said gesture command input, affect said user perceivable output.
(Dependent claims: 31, 32, 33, 34, 35, 36, 37, 38)
Specification