User interface indirect interaction
First Claim
1. A method comprising:
receiving, in an indirect interaction device configured to facilitate interactions with a user interface, a first input data indicative of a first gesture being applied in a first portion of a sensory surface of the indirect interaction device, the sensory surface comprising the first portion and a second portion, the first portion having a first area prior to receiving the first input data and the second portion having a second area prior to receiving the first input data, where the first input data indicates a location on the indirect interaction device at which the first gesture was applied;
responsive to receiving the first input data:
launching, on the display, a menu comprising a plurality of selections; and
expanding the first area to create an expanded first area and reducing the second area to create a reduced second area, where the menu is launched at a location on the display that corresponds to the location on the indirect interaction device at which the first gesture was applied, where the size of the expanded first area varies directly with and depends on the size of the menu, and where the size of the reduced second area varies inversely with and depends on the size of the expanded first area; and
wherein the first portion of the sensory surface is mapped to a first launched object and the second portion of the sensory surface is mapped to a second object that is different than the first launched object, and wherein the amount of the sensory surface that is mapped to the launched object is based in part on the type of object that is launched.
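A minimal sketch of the resizing behavior recited in claim 1, assuming a rectangular touchpad-style sensory surface; the class and parameter names (`SensorySurface`, `first_fraction`, `scale`, the half-surface cap) are illustrative assumptions, not from the patent:

```python
class SensorySurface:
    """Illustrative model of a sensory surface split into two portions."""

    def __init__(self, width, height, first_fraction=0.1):
        self.width = width
        self.height = height
        # Prior to receiving the first input data, the first portion is a
        # thin strip and the second portion is the remainder of the surface.
        self.first_area = first_fraction * width * height
        self.second_area = width * height - self.first_area

    def launch_menu(self, menu_area, scale=1.0):
        """Expand the first area in direct proportion to the menu's size;
        the second area shrinks by exactly what the first gains, so it
        varies inversely with the expanded first area."""
        total = self.width * self.height
        expanded_first = min(scale * menu_area, 0.5 * total)  # assumed cap
        self.first_area = expanded_first
        self.second_area = total - expanded_first
        return self.first_area, self.second_area
```

A larger launched menu thus yields a larger expanded first area and, by construction, a smaller reduced second area, matching the "varies directly"/"varies inversely" language of the claim.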
Abstract
One or more techniques and/or systems are provided for utilizing input data received from an indirect interaction device (e.g., mouse, touchpad, etc.) to launch, engage, and/or close, etc. an object within a user interface. For example, a sensory surface of the indirect interaction device may be divided into two (or more) portions, a first portion utilized to launch, engage, and/or close an object and a second portion utilized to navigate (e.g., a cursor) within the user interface. When an object is launched based upon receipt of a predefined gesture(s), the first portion of the sensory surface may be mapped to the object to provide for interaction with the object via an interaction between a contact (e.g., finger) and the first portion. Also, the surface area of the first portion may be altered (e.g., enlarged) when it is mapped to the object and/or according to operations performed on the object.
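The abstract describes launching the menu at a display location corresponding to where the gesture was applied on the device. A simple absolute device-to-display mapping could look like the following sketch (the function name and the linear-scaling choice are assumptions for illustration):

```python
def device_to_display(loc, device_size, display_size):
    """Map a contact location on the indirect interaction device to the
    corresponding point on the display using absolute linear scaling."""
    device_w, device_h = device_size
    display_w, display_h = display_size
    x, y = loc
    return (x / device_w * display_w, y / device_h * display_h)
```

For example, a gesture at the center of a 100x60 sensory surface would launch the menu at the center of a 1920x1080 display.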
14 Claims
1. (Independent; recited in full above as the First Claim.)

View Dependent Claims: 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13
14. An indirect interaction device comprising:
one or more processing units; and
a memory that stores instructions that, when executed by at least one of the one or more processing units, perform a method comprising:
receiving a first input data indicative of a first gesture being applied at a first location in a first portion of a sensory surface of the indirect interaction device, the sensory surface comprising the first portion and a second portion, the first portion having a first area prior to receiving the first input data and the second portion having a second area prior to receiving the first input data;
receiving a second input data indicative of a second gesture being applied to the expanded first portion;
responsive to receiving the first input data:
launching a menu having a plurality of selections on a user interface displayed on a display located on a device other than the indirect interaction device; and
expanding the first area to create an expanded first area and reducing the second area to create a reduced second area; and
responsive to receiving the second input data:
upon determining that the second gesture is the opposite gesture of the first gesture, removing the menu from the display,
where the menu is launched at a location on the display that corresponds to the first location, where the size of the expanded first area varies directly with and depends on the size of the menu, where the size of the reduced second area varies inversely with and depends on the size of the expanded first area, where the reduced second area is configured to receive inputs to navigate in an application whose output is displayed on the display, where the expanded first area is partitioned into a plurality of sub-partitions that correspond to the plurality of selections, where members of the plurality of sub-partitions are mapped to the corresponding members of the plurality of selections so that a touch of a member of the plurality of sub-partitions causes selection of the corresponding member of the plurality of selections, and where the size and location of a member of the plurality of sub-partitions is determined by the size and location of a corresponding member of the plurality of selections; and
wherein the first portion of the sensory surface is mapped to a first launched object and the second portion of the sensory surface is mapped to a second object that is different than the first launched object, and wherein the amount of the sensory surface that is mapped to the launched object is based in part on the type of object that is launched.
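Claim 14's partitioning of the expanded first area into sub-partitions, each mapped to a menu selection, can be sketched as follows; the equal-width split, one-dimensional layout, and function names are simplifying assumptions for illustration only:

```python
def partition(first_area_width, selections):
    """Split the expanded first portion into one sub-partition per menu
    selection; here sub-partitions are equal-width strips laid out in the
    same order as the selections."""
    w = first_area_width / len(selections)
    return [(i * w, (i + 1) * w, sel) for i, sel in enumerate(selections)]

def hit_test(x, sub_partitions):
    """Return the selection whose sub-partition contains the touch at x,
    so a touch of a sub-partition selects the corresponding selection."""
    for x0, x1, sel in sub_partitions:
        if x0 <= x < x1:
            return sel
    return None
```

In the claim, a sub-partition's size and location track the corresponding selection's size and location on the display; this sketch uses a uniform split purely to keep the mapping legible.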
Specification