Motion-based interface control on computing device
First Claim
1. A computer-implemented user interface method, comprising:
displaying an image on a graphical user interface of a mobile computing device;
receiving a first user input on a touchscreen of the mobile computing device;
sensing motion of the mobile computing device in one or more of a plurality of directions, wherein the sensed motion comprises a sensed tilting of the mobile computing device about an axis of the mobile computing device;
determining whether to interpret the sensed motion as indicating a zooming input based on the first user input, wherein the sensed motion is interpreted as indicating a zooming input when the user is determined to have been contacting the touchscreen while the sensed motion was detected by the mobile computing device;
based on the determination of whether to interpret the sensed motion as indicating a zooming input, correlating the sensed motion with either a zooming in or a zooming out zooming direction;
determining a tilt angle of the mobile computing device;
identifying the tilt angle as being within a particular tilt angle range from among a plurality of tilt angle ranges, wherein each respective tilt angle range from among the plurality of tilt angle ranges is associated with a distinct zoom level of a plurality of zoom levels;
identifying a particular zoom level of the plurality of zoom levels as being associated with the particular tilt angle range; and
changing a display of the image on the graphical user interface to correspond to the particular zoom level in the correlated zooming direction.
Abstract
A computer-implemented user interface method includes displaying an image on a graphical user interface of a mobile computing device, receiving a first user input indicating an intent to perform a zooming operation on the graphical user interface, transitioning the mobile computing device into a zooming mode in response to the user input, sensing motion of the mobile computing device in one or more of a plurality of directions, correlating the sensed motion in one or more of a plurality of directions with either a zooming in or a zooming out zooming direction, and changing a zoom level of the display of the image on the graphical user interface to correspond to the correlated zooming direction.
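As an illustrative, non-limiting sketch of the flow the abstract describes, a first user input could transition the device into a zooming mode, after which sensed motion is correlated with a zoom direction. All names, the step size, and the sign convention below are assumptions for illustration, not taken from the specification:

```python
# Hypothetical sketch: first input enters zooming mode; only motion sensed
# while in that mode changes the zoom level. Names and values are assumed.
class ZoomController:
    def __init__(self):
        self.zooming_mode = False
        self.zoom_level = 1.0

    def on_first_input(self):
        # The first user input indicates intent to zoom: enter zooming mode.
        self.zooming_mode = True

    def on_motion(self, tilt_angle_deg, step=0.25):
        # Motion sensed outside zooming mode leaves the zoom level unchanged.
        if not self.zooming_mode:
            return self.zoom_level
        direction = 1 if tilt_angle_deg > 0 else -1  # zoom in vs. zoom out
        self.zoom_level = max(0.25, self.zoom_level + direction * step)
        return self.zoom_level
```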
19 Claims
1. A computer-implemented user interface method, comprising:
displaying an image on a graphical user interface of a mobile computing device;
receiving a first user input on a touchscreen of the mobile computing device;
sensing motion of the mobile computing device in one or more of a plurality of directions, wherein the sensed motion comprises a sensed tilting of the mobile computing device about an axis of the mobile computing device;
determining whether to interpret the sensed motion as indicating a zooming input based on the first user input, wherein the sensed motion is interpreted as indicating a zooming input when the user is determined to have been contacting the touchscreen while the sensed motion was detected by the mobile computing device;
based on the determination of whether to interpret the sensed motion as indicating a zooming input, correlating the sensed motion with either a zooming in or a zooming out zooming direction;
determining a tilt angle of the mobile computing device;
identifying the tilt angle as being within a particular tilt angle range from among a plurality of tilt angle ranges, wherein each respective tilt angle range from among the plurality of tilt angle ranges is associated with a distinct zoom level of a plurality of zoom levels;
identifying a particular zoom level of the plurality of zoom levels as being associated with the particular tilt angle range; and
changing a display of the image on the graphical user interface to correspond to the particular zoom level in the correlated zooming direction.
- View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9)
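The tilt-angle-range mapping recited in claim 1 can be sketched as code: tilt is interpreted as zooming input only while the touchscreen is being contacted, and the tilt angle is bucketed into one of several ranges, each associated with a distinct zoom level. The thresholds, zoom levels, and sign convention below are illustrative assumptions, not values from the patent's disclosure:

```python
# Hypothetical sketch of claim 1: touch-gated tilt-to-zoom with discrete
# tilt-angle ranges, each mapped to one distinct zoom level.
import bisect

# Upper bounds (degrees) of each tilt-angle range: [0,10), [10,20), ...
TILT_RANGE_UPPER_BOUNDS = [10, 20, 30, 40]
ZOOM_LEVELS = [1.0, 1.5, 2.0, 3.0, 4.0]  # one level per range (+ overflow)

def zoom_for_tilt(tilt_angle_deg, touch_is_down):
    """Interpret sensed tilt as zooming input only while the user is
    contacting the touchscreen; otherwise ignore the motion."""
    if not touch_is_down:
        return None  # motion is not interpreted as a zooming input
    index = bisect.bisect_right(TILT_RANGE_UPPER_BOUNDS, abs(tilt_angle_deg))
    level = ZOOM_LEVELS[min(index, len(ZOOM_LEVELS) - 1)]
    direction = "in" if tilt_angle_deg > 0 else "out"
    return level, direction
```

Because each range maps to one level, small jitter within a range leaves the zoom level unchanged, which is one plausible reason for quantizing the tilt angle.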
10. A computer-implemented user interface method, comprising:
displaying an image on a graphical user interface of a mobile computing device;
receiving a first user input on a touchscreen of the mobile computing device that indicates that a user of the mobile computing device has contacted the touchscreen;
sensing motion of the mobile computing device in one or more of a plurality of directions;
determining whether to interpret the sensed motion as indicating a zooming input based on the first user input, wherein the sensed motion is interpreted as indicating a zooming input when the user is determined to have been contacting the touchscreen while the sensed motion was detected by the mobile computing device;
based on the determination of whether to interpret the sensed motion as indicating a zooming input, correlating the sensed motion in one or more of a plurality of directions with either a zooming in or a zooming out zooming direction;
changing a zoom level of the display of the image on the graphical user interface to correspond to the correlated zooming direction;
while changing the zoom level of the display, receiving a second user input that indicates an intent to perform a panning operation on the graphical user interface, wherein the second user input comprises a dragging input defined by a user maintaining contact with the touchscreen while moving a point of contact on the touchscreen;
correlating motion of the dragging input with one or more panning directions; and
panning the content in a direction of the one or more correlated panning directions;
wherein the image comprises a list of files, sensed tilting of the device in a direction that is aligned with the list causes the list to scroll, and sensed motion of the device in a direction that is not aligned with the list causes the list to zoom.
- View Dependent Claims (11)
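Claim 10's final limitation distinguishes motion aligned with a displayed list (scroll) from motion not aligned with it (zoom). One way to sketch that dispatch, assuming a vertical list and a simple dominant-axis comparison (axis names and the comparison rule are illustrative assumptions, not the patent's method):

```python
# Hypothetical dispatch for claim 10: tilt aligned with the list's axis
# scrolls the list; tilt about the other axis zooms it.
def interpret_list_motion(tilt_x_deg, tilt_y_deg, list_axis="vertical"):
    """Return which operation the sensed tilt maps to for a displayed list."""
    # For a vertical list, tilt about the device's x-axis pitches the screen
    # along the list (aligned: scroll); tilt about the y-axis is not aligned
    # with the list (zoom). Swap the roles for a horizontal list.
    if list_axis == "vertical":
        aligned, crossed = tilt_x_deg, tilt_y_deg
    else:
        aligned, crossed = tilt_y_deg, tilt_x_deg
    if abs(aligned) >= abs(crossed):
        return ("scroll", aligned)
    return ("zoom", crossed)
```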
12. A computer-implemented user interface method, comprising:
displaying an image on a graphical user interface of a mobile computing device;
receiving a first user input on a touchscreen of the mobile computing device that indicates that a user of the mobile computing device has contacted the touchscreen;
sensing motion of the mobile computing device in one or more of a plurality of directions;
determining whether to interpret the sensed motion as indicating a zooming input based on the first user input, wherein the sensed motion is interpreted as indicating a zooming input when the user is determined to have been contacting the touchscreen while the sensed motion was detected by the mobile computing device;
based on the determination of whether to interpret the sensed motion as indicating a zooming input, correlating the sensed motion in one or more of a plurality of directions with either a zooming in or a zooming out zooming direction;
changing a zoom level of the display of the image on the graphical user interface to correspond to the correlated zooming direction;
while changing the zoom level of the display, receiving a second user input that indicates an intent to perform a panning operation on the graphical user interface, wherein the second user input comprises a dragging input defined by a user maintaining contact with the touchscreen while moving a point of contact on the touchscreen;
correlating motion of the dragging input with one or more panning directions; and
panning the content in a direction of the one or more correlated panning directions;
wherein the sensed motion comprises lateral motion of the mobile computing device and the zooming occurs in a direction that corresponds to a direction of the lateral motion.
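Claim 12's distinguishing limitation ties the zoom direction to the direction of lateral device motion. A minimal sketch, assuming positive displacement means motion toward the user (zoom in) and a linear scale factor (both conventions are assumptions, not disclosed values):

```python
# Hypothetical sketch of claim 12: lateral motion of the device sets the
# zoom direction; the displacement scales the zoom level.
def zoom_from_lateral_motion(displacement_cm, zoom_level, scale=0.1):
    """Correlate lateral device motion with a zoom direction and new level.

    Positive displacement (assumed: toward the user) zooms in; negative
    displacement zooms out. The 0.1/cm scale is illustrative.
    """
    direction = "in" if displacement_cm > 0 else "out"
    new_level = max(0.1, zoom_level * (1.0 + scale * displacement_cm))
    return direction, round(new_level, 3)
```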
13. An article comprising a non-transitory computer-readable data storage medium storing program code that is operable to cause one or more machines to perform operations, the operations comprising:
displaying an image on a graphical user interface of a mobile computing device;
receiving a first user input on a touchscreen of the mobile computing device;
sensing motion of the mobile computing device in one or more of a plurality of directions, wherein the sensed motion comprises a sensed tilting of the mobile computing device about an axis of the mobile computing device;
determining whether to interpret the sensed motion as indicating a zooming input based on the first user input, wherein the sensed motion is interpreted as indicating a zooming input when the user is determined to have been contacting the touchscreen while the sensed motion was detected by the mobile computing device;
based on the determination of whether to interpret the sensed motion as indicating a zooming input, correlating the sensed motion with either a zooming in or a zooming out zooming direction;
determining a tilt angle of the mobile computing device;
identifying the tilt angle as being within a particular tilt angle range from among a plurality of tilt angle ranges, wherein each respective tilt angle range from among the plurality of tilt angle ranges is associated with a distinct zoom level of a plurality of zoom levels;
identifying a particular zoom level of the plurality of zoom levels as being associated with the particular tilt angle range; and
changing a display of the image on the graphical user interface to correspond to the particular zoom level in the correlated zooming direction.
- View Dependent Claims (14, 15, 16)
17. A mobile computing device, comprising:
a touch input manager to receive and interpret user inputs on a touch input device of a computing device, wherein the touch input manager is arranged to receive first user input that indicates that a user of the mobile computing device has contacted a touchscreen of the mobile computing device and a second user input comprising a dragging input defined by the user maintaining contact with the touchscreen while moving a point of contact on the touchscreen;
one or more sensors arranged to sense tilting motion of the mobile computing device about an axis of the mobile computing device;
one or more computing applications stored on the mobile computing device; and
an input method editor programmed to:
receive information indicating that tilting motion of the mobile computing device was sensed by the one or more motion sensors;
determine whether to interpret the sensed tilting motion as indicating zooming input or other input based on the first user input on the touchscreen of the mobile computing device;
determine a tilt angle of the mobile computing device;
identify the tilt angle as being within a particular tilt angle range from among a plurality of tilt angle ranges, wherein each respective tilt angle range from among the plurality of tilt angle ranges is associated with a distinct zoom level of a plurality of zoom levels;
identify a particular zoom level of the plurality of zoom levels as being associated with the particular tilt angle range; and
provide data relating to the received information to a plurality of different applications that include the one or more applications if the sensed tilting motion is interpreted as indicating a zooming input, wherein the sensed tilting motion is interpreted as indicating a zooming input when the user is determined to have been contacting the touchscreen while the sensed tilting motion was detected by the mobile computing device, and wherein the one or more applications are programmed to convert the data from the input method editor into commands for changing a display of an image on the computing device to correspond to the particular zoom level.
- View Dependent Claims (18)
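Claim 17's architecture separates an input-method-editor layer, which gates tilt events on touch contact, from the applications that receive the resulting zoom data and convert it into display commands. The class, method names, and the tilt-to-level mapping below are illustrative assumptions about that fan-out structure, not the patent's implementation:

```python
# Hypothetical sketch of claim 17: an input-method-editor layer interprets
# tilt only while the touchscreen is contacted, then provides the zoom data
# to every registered application.
class InputMethodEditor:
    def __init__(self):
        self.applications = []  # callbacks registered by applications

    def register(self, app_callback):
        self.applications.append(app_callback)

    def on_tilt(self, tilt_angle_deg, touch_is_down):
        # Tilt is interpreted as zooming input only during touch contact.
        if not touch_is_down:
            return False
        # Assumed mapping: one 0.5x zoom step per 10 degrees of tilt.
        zoom_level = 1.0 + abs(tilt_angle_deg) // 10 * 0.5
        for app in self.applications:
            app({"zoom_level": zoom_level, "tilt": tilt_angle_deg})
        return True
```

Keeping the interpretation in one editor layer means every application receives consistent zoom data without reimplementing the touch-gating logic, which is one plausible reading of why the claim routes the data through an input method editor.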
19. A computer-implemented user interface method, comprising:
displaying an image on a graphical user interface of a mobile computing device;
receiving a first user input on a touchscreen of the mobile computing device that indicates that a user of the mobile computing device has contacted the touchscreen;
sensing motion of the mobile computing device in one or more of a plurality of directions;
determining whether to interpret the sensed motion as indicating a zooming input based on the first user input, wherein the sensed motion is interpreted as indicating a zooming input when the user is determined to have been contacting the touchscreen while the sensed motion was detected by the mobile computing device;
based on the determination of whether to interpret the sensed motion as indicating a zooming input, correlating the sensed motion in one or more of a plurality of directions with either a zooming in or a zooming out zooming direction;
changing a zoom level of the display of the image on the graphical user interface to correspond to the correlated zooming direction;
while changing the zoom level of the display, receiving a second user input that indicates an intent to perform a panning operation on the graphical user interface, wherein the second user input comprises a dragging input defined by a user maintaining contact with the touchscreen while moving a point of contact on the touchscreen;
correlating motion of the dragging input with one or more panning directions;
panning the content in a direction of the one or more correlated panning directions; and
based on the determination of whether to interpret the sensed motion as indicating a zooming input, displaying a first label positioned near an edge of the graphical user interface and a second label positioned near an opposite edge of the graphical user interface, wherein the first label indicates that motion of the mobile computing device in a direction toward the edge of the graphical user interface will cause a zoom in operation to be performed, and wherein the second label indicates that motion of the mobile computing device in an opposite direction toward the opposite edge of the graphical user interface will cause a zoom out operation to be performed.
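Claim 19's distinguishing limitation displays hint labels on opposite screen edges once motion is being interpreted as zooming input. A minimal sketch, assuming the zoom-in edge is the top of the screen and the label text shown (both are illustrative assumptions, not disclosed details):

```python
# Hypothetical sketch of claim 19: when sensed motion is interpreted as
# zooming input, label opposite edges with the operation that moving the
# device toward each edge will perform.
def zoom_edge_labels(interpreting_as_zoom, zoom_in_edge="top"):
    """Return (edge, text) labels to draw, or an empty list otherwise."""
    if not interpreting_as_zoom:
        return []
    opposite = {"top": "bottom", "bottom": "top",
                "left": "right", "right": "left"}[zoom_in_edge]
    return [(zoom_in_edge, "Zoom in"), (opposite, "Zoom out")]
```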
Specification