Motion-based gestures for a computing device
First Claim
1. A computing device, comprising:
one or more sensors configured to capture sensor data, the sensor data indicating a physical movement of the computing device performed by at least a first hand of a user;
a touch screen display, generating touch location data representing touch locations on the touch screen display by at least a first finger and a second finger of a second hand of the user;
a processor for executing instructions stored in a memory which, when executed by the processor, cause the computing device to:
determine, during a first period, using the touch location data, first touch location data associated with the first finger contacting the touch screen display;
determine, during the first period, using the touch location data, second touch location data associated with the second finger contacting the touch screen display;
determine, during a second period after the first period, using the touch location data, third touch location data associated with the first finger contacting the touch screen display;
determine, during the second period, using the touch location data, fourth touch location data associated with the second finger contacting the touch screen display;
determine, using the sensor data, information about the physical movement of the computing device between the first period and the second period;
determine, based at least on the first touch location data and the third touch location data being different, that a first position of the first finger on the touch screen display during the first period is different than a third position of the first finger on the touch screen display during the second period;
determine, based at least on the second touch location data and the fourth touch location data being different, that a second position of the second finger on the touch screen display during the first period is different than a fourth position of the second finger on the touch screen display during the second period;
determine a first gesture during a third period based at least on the physical movement of the computing device between the first period and the second period, the first position of the first finger on the touch screen display during the first period being different than the third position of the first finger on the touch screen display during the second period, and the second position of the second finger on the touch screen display during the first period being different than the fourth position of the second finger on the touch screen display during the second period, wherein the first gesture corresponds to the second hand remaining steady while the computing device is rotated between the first period and the second period;
select the first gesture;
invoke a first function of the computing device, the first function corresponding to the first gesture, the first function including switching the touch screen display from a landscape display mode to a portrait display mode and then maintaining the touch screen display in the portrait display mode, or switching the touch screen display from a portrait display mode to a landscape display mode and then maintaining the touch screen display in the landscape display mode;
determine a second gesture during a fourth period, the second gesture different than the first gesture; and
invoke a second function of the computing device, the second function corresponding to the second gesture, the second function including maintaining the touch screen display in either the portrait display mode or the landscape display mode while the computing device is rotated.
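Read together, the limitations above amount to a small decision procedure: if both fingers changed screen position between the two periods while the device itself rotated, treat it as the "hand steady, device rotated" gesture and switch-then-lock the display mode. The sketch below is a minimal illustration of that logic under stated assumptions, not the patent's implementation; TouchSample, classify_and_invoke, the 45-degree and 5-pixel thresholds, and the display object's methods are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class TouchSample:
    """Screen-space positions of the two fingers during one period."""
    finger1: tuple  # (x, y) of the first finger
    finger2: tuple  # (x, y) of the second finger

def _moved(p, q, threshold_px=5.0):
    """True if a touch point shifted more than threshold_px on screen."""
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5 > threshold_px

def classify_and_invoke(first: TouchSample, second: TouchSample,
                        device_rotation_deg: float, display) -> str:
    """Report 'rotate-and-lock' when both finger positions changed between
    the two periods while the device itself rotated, i.e. the hand stayed
    steady in space and the screen turned beneath it (the first gesture of
    claim 1). `display` is a hypothetical object, not a real API."""
    fingers_shifted = (_moved(first.finger1, second.finger1) and
                       _moved(first.finger2, second.finger2))
    if fingers_shifted and abs(device_rotation_deg) >= 45.0:  # threshold assumed
        display.toggle_orientation()  # landscape <-> portrait
        display.lock_orientation()    # then maintain that display mode
        return "rotate-and-lock"
    return "none"
```

A second, different gesture (claim 1's fourth period) would map to the complementary function: keeping the current display mode locked while the device is rotated.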
Abstract
Approaches are described that enable a computing device, such as a phone or tablet computer, to use sensor information obtained from its sensors to interpret one or more gestures and/or other input provided by the user. In particular, the computing device may combine information about its own movement, gathered from an accelerometer or gyroscope, with input data detected by its touch screen or cameras in order to disambiguate between several different types of gestures for the device.
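One way to obtain the device-movement signal the abstract describes is to integrate gyroscope angular-rate samples over the interval spanning the two touch periods. The sketch below is a minimal example assuming samples of (timestamp in seconds, angular rate in degrees per second) about the axis normal to the screen; the sample format, the trapezoidal integration, and the function name are illustrative assumptions, not details from the patent.

```python
def rotation_between_periods(gyro_samples, t_start, t_end):
    """Estimate degrees of device rotation between t_start and t_end by
    trapezoidal integration of (timestamp_s, rate_deg_per_s) samples."""
    angle_deg = 0.0
    prev = None  # previous (timestamp, rate) inside the window
    for t, rate in gyro_samples:
        if t_start <= t <= t_end:
            if prev is not None:
                prev_t, prev_rate = prev
                angle_deg += 0.5 * (rate + prev_rate) * (t - prev_t)
            prev = (t, rate)
    return angle_deg

# Hypothetical samples: the device spins up to ~90 deg/s and back down.
samples = [(0.0, 0.0), (0.2, 90.0), (1.0, 90.0), (1.2, 0.0)]
print(rotation_between_periods(samples, 0.0, 1.2))  # 90.0
```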
12 Claims
1. A computing device, comprising: (set forth in full under First Claim above). Dependent claims: 2, 3, 4.
5. A computer implemented method, comprising:
under the control of a computing device configured with executable instructions,
receiving user input data generated by a touch screen, the user input data representing at least:
first touch location data of a first finger of a second hand of a user on the touch screen during a first period;
second touch location data of a second finger of the second hand on the touch screen during the first period;
third touch location data of the first finger of the second hand on the touch screen during a second period, the second period after the first period; and
fourth touch location data of the second finger of the second hand on the touch screen during the second period;
processing data captured by one or more sensors of the computing device to determine a rotation of the computing device performed by at least a first hand of the user between the first period and the second period;
determining, based at least in part on the first touch location data and the third touch location data being different, that a first position of the first finger on the touch screen during the first period is different than a third position of the first finger on the touch screen during the second period;
determining, based at least in part on the second touch location data and the fourth touch location data being different, that a second position of the second finger on the touch screen during the first period is different than a fourth position of the second finger on the touch screen during the second period;
determining a first gesture during a third period based at least on the rotation of the computing device between the first period and the second period, the first position of the first finger on the touch screen during the first period being different than the third position of the first finger on the touch screen during the second period, and the second position of the second finger on the touch screen during the first period being different than the fourth position of the second finger on the touch screen during the second period, wherein the first gesture corresponds to the second hand remaining steady while the computing device is rotated between the first period and the second period;
invoking a first function of the computing device, the first function corresponding to the first gesture, the first function comprising at least switching the touch screen from a portrait display mode to a landscape display mode and then maintaining the touch screen in the landscape display mode;
determining a second gesture during a fourth period, the second gesture different than the first gesture; and
invoking a second function of the computing device, the second function corresponding to the second gesture, the second function including maintaining the touch screen in either the portrait display mode or the landscape display mode while the computing device is rotated.
Dependent claims: 6, 7, 8.
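The disambiguation at the heart of this method is deciding whether the fingers' changed screen positions were caused by the hand moving or by the screen rotating beneath a steady hand. One hedged way to test this, sketched below, is to predict where a fingertip that stayed fixed in space would reappear in screen coordinates after the measured rotation and compare it with the observed touch; the screen-center pivot point, the tolerance value, and the function name are assumptions introduced for illustration, not details from the claims.

```python
import math

def consistent_with_steady_hand(p_before, p_after, rotation_deg,
                                screen_center, tol_px=20.0):
    """Predict where a fingertip that stayed fixed in space would land in
    screen coordinates after the device rotated by rotation_deg about the
    screen center, and accept if the observed point is within tol_px."""
    theta = math.radians(-rotation_deg)  # screen turns opposite to the world
    dx = p_before[0] - screen_center[0]
    dy = p_before[1] - screen_center[1]
    predicted = (screen_center[0] + dx * math.cos(theta) - dy * math.sin(theta),
                 screen_center[1] + dx * math.sin(theta) + dy * math.cos(theta))
    return math.dist(predicted, p_after) <= tol_px
```

If both fingers pass this check while the sensors report a rotation, the "second hand remaining steady" condition of the claimed first gesture is plausibly satisfied; if the touch motion is inconsistent with the rotation, the input is better read as an ordinary two-finger drag.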
9. A non-transitory computer readable storage medium storing one or more sequences of instructions that, when executed by one or more processors, cause a computing device to:
receive user input data generated by a touch screen, the user input data representing at least:
first touch location data of a first finger of a second hand of a user relative to the touch screen during a first period;
second touch location data of a second finger of the second hand relative to the touch screen during the first period;
third touch location data of the first finger of the second hand relative to the touch screen during a second period; and
fourth touch location data of the second finger of the second hand relative to the touch screen during the second period;
process data captured by one or more sensors of the computing device to determine a rotation of the computing device performed by at least a first hand of the user between the first period and the second period;
determine, based at least in part on the first touch location data and the third touch location data being different, that a first position of the first finger on the touch screen during the first period is different than a third location of the first finger on the touch screen during the second period;
determine, based at least in part on the second touch location data and the fourth touch location data being different, that a second position of the second finger on the touch screen during the first period is different than a fourth location of the second finger on the touch screen during the second period;
determine a first gesture during a third period based at least on the rotation of the computing device between the first period and the second period, the first position of the first finger on the touch screen during the first period being different than the third location of the first finger on the touch screen during the second period, and the second position of the second finger on the touch screen during the first period being different than the fourth location of the second finger on the touch screen during the second period, wherein the first gesture corresponds to the second hand remaining steady while the computing device is rotated between the first period and the second period;
invoke a first function of the computing device, the first function corresponding to the first gesture, the first function comprising at least switching the touch screen from a portrait display mode to a landscape display mode and then maintaining the touch screen in the landscape display mode;
determine a second gesture during a fourth period, the second gesture different than the first gesture; and
invoke a second function of the computing device, the second function corresponding to the second gesture, the second function including maintaining the touch screen in either the portrait display mode or the landscape display mode while the computing device is rotated.
Dependent claims: 10, 11, 12.
Specification