User interface for initiating activities in an electronic device
Abstract
In one embodiment, a user interface is presented for initiating activities in an electronic device. The user interface includes an element referred to as a “launch wave”, which can be activated at substantially any time, even if the user is engaged with an activity, without requiring the user to first return to a home screen. In various embodiments, the user can activate the launch wave by performing a gesture, or by pressing a physical button, or by tapping at a particular location on a touchscreen, or by activating a keyboard command. In one embodiment, activation of the launch wave and selection of an item from the launch wave can be performed in one continuous operation on a touch-sensitive screen, so as to improve the expediency and convenience of launching applications and other items.
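The continuous gesture the abstract describes can be sketched as a small state machine: the contact must begin on the gesture region, and the launcher appears once the motion crosses onto the display, with its shape tracking the object's location. The sketch below is a minimal, hypothetical model in Python; the coordinate convention (y < 0 for the gesture region), the class and method names, and the `{"center_x", "amplitude"}` shape description are all illustrative assumptions, not the patent's actual implementation.

```python
DISPLAY_TOP = 0  # assumption: y >= 0 is the display; y < 0 is the gesture region below it


class LaunchWaveDetector:
    """Hypothetical model of the claimed gesture flow."""

    def __init__(self):
        self.started_in_gesture_region = False
        self.launcher_visible = False
        self.launcher_shape = None

    def touch_down(self, x, y):
        # Claim step (i): the contact is initiated on the gesture region.
        self.started_in_gesture_region = y < DISPLAY_TOP

    def touch_move(self, x, y):
        # Claim step (ii): the motion continues onto the display screen,
        # at which point the launcher interface is presented.
        if self.started_in_gesture_region and y >= DISPLAY_TOP:
            self.launcher_visible = True
        if self.launcher_visible:
            # The launcher's shape changes with the object's location
            # (a "wave" centered under the finger; the geometry here
            # is an assumption for illustration).
            self.launcher_shape = {"center_x": x, "amplitude": max(y, 0)}

    def touch_up(self):
        # Releasing ends the continuous operation.
        self.launcher_visible = False
        self.started_in_gesture_region = False
```

Under this model, a drag starting at y = -10 and continuing to y = 40 presents the launcher, while a drag that begins on the display itself does not, matching the claim's requirement that the motion be initiated on the gesture region.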
15 Claims
1. A method for operating a mobile computing device, the method being performed by one or more processors and comprising:
detecting a gesture performed by a user, the gesture including at least a motion that (i) is initiated by user contact on a touch-sensitive gesture region of the mobile computing device with an object, and (ii) continues onto a touch-sensitive display screen of the mobile computing device; and
in response to detecting the gesture, presenting a launcher interface on the touch-sensitive display screen, wherein a shape of the launcher interface dynamically changes based, at least in part, on a location of the object while the gesture is performed by the user.
(Dependent claims: 2, 3, 4, 5, 6, 7, 8)
9. A mobile computing device comprising:
a touch-sensitive display screen;
a touch-sensitive gesture region; and
one or more processors coupled to the touch-sensitive display screen and the touch-sensitive gesture region, the one or more processors configured to:
detect a gesture performed by a user, the gesture including at least a motion that (i) is initiated by user contact on the touch-sensitive gesture region with an object, and (ii) continues onto the touch-sensitive display screen; and
in response to detecting the gesture, present a launcher interface on the touch-sensitive display screen, wherein a shape of the launcher interface dynamically changes based, at least in part, on a location of the object while the gesture is performed by the user.
(Dependent claims: 10, 11, 12, 13, 14)
15. A non-transitory computer-readable medium that stores instructions that, when executed by one or more processors, cause the one or more processors to perform operations comprising:
detecting a gesture performed by a user, the gesture including at least a motion that (i) is initiated by user contact on a touch-sensitive gesture region of a mobile computing device with an object, and (ii) continues onto a touch-sensitive display screen of the mobile computing device; and
in response to detecting the gesture, presenting a launcher interface on the touch-sensitive display screen, wherein a shape of the launcher interface dynamically changes based, at least in part, on a location of the object while the gesture is performed by the user.
Specification