Force-based interactions with digital agents
2 Assignments
0 Petitions
Abstract
Embodiments relate to enabling force-based interactions with an intelligent personal assistant (IPA). A computing device capable of sensing the force exerted to input touch inputs is configured with a pressure-based filter that checks the pressures of touch inputs to determine which are to be diverted to the IPA and which are to be passed on to the underlying user interface that is not related to the IPA. Touch inputs designated for the IPA based on their pressure characteristics can become part of the IPA's context. Some IPA uses of the touch inputs include selecting graphic objects on the display, resolving exophoric phrases (e.g., "that", "those") as referring to such selected graphic objects, displaying a transient user interface to provide information about (or actions for) the selected object, incorporating a selected object into the current context of the IPA, etc.
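The pressure-based filter the abstract describes can be sketched as a small dispatcher. This is an illustrative sketch only, not the patented implementation; the threshold value, the touch-input shape, and the handler names (`identify_target`, `handle_touch`) are assumptions:

```python
# Illustrative pressure-based filter: divert "deep" presses to the IPA,
# pass ordinary taps through to the underlying application.
# Threshold, data shapes, and handler names are assumed for this sketch.

FORCE_THRESHOLD = 0.6  # normalized force; values at or above this satisfy the force condition


def route_touch(touch, ipa, application):
    """Route one touch input based on its force component."""
    if touch["force"] >= FORCE_THRESHOLD:
        # Force condition satisfied: the IPA identifies the target object
        # from the touch location.
        return ipa.identify_target(touch["location"])
    # Force condition not satisfied: the application handles the touch.
    return application.handle_touch(touch["location"])


class StubIPA:
    def identify_target(self, location):
        return ("ipa", location)


class StubApp:
    def handle_touch(self, location):
        return ("app", location)
```

Under this sketch, a deep press on a photo's thumbnail is diverted to the IPA, which can later resolve an exophoric "that" to the selected photo, while a light tap scrolls or selects in the application as usual.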
16 Citations
20 Claims
1. A method performed by a computing device comprising storage hardware, processing hardware, a display, and a touch input device that senses touch inputs and force components thereof, the method performed by the processing hardware executing instructions stored in the storage hardware, the method comprising:

executing an intelligent personal assistant (IPA) on the computing device, the executing comprising using speech recognition to recognize commands inputted through a microphone of the computing device and invoking respective operations for the recognized commands;

executing an application comprised of a graphical user interface, the executing including displaying the graphical user interface, the graphical user interface comprising graphic objects representing objects stored on the computing device, the application configured to respond to touch inputs that are directed to the graphical user interface;

receiving the touch inputs respectively comprised of locations, the touch inputs associated with the force components, respectively, each force component corresponding to a measure of force with which a corresponding touch input was inputted via the touch input device;

evaluating the force components against a force condition, wherein (i) each time the force condition is determined to be satisfied by an evaluated force component, based on such determination, a corresponding touch input is passed to the IPA which uses the touch input to identify a target object according to a location of a graphic representation of the target object in the graphical user interface and according to the touch input corresponding to the evaluated force component that satisfied the force condition, and (ii) each time the force condition is determined to not be satisfied by an evaluated force component, the touch input is provided to the application based on a location of the touch input and the application responds to the provided touch input.

Dependent claims: 2, 3, 4.
5. A method performed by a computing device comprising storage hardware, processing hardware, a display, and a touch input device that senses touch inputs and force components thereof, the method performed by the processing hardware executing instructions stored in the storage hardware, the method comprising:

executing an intelligent personal assistant (IPA) on the computing device, the executing comprising using speech recognition to recognize commands inputted through a microphone of the computing device and invoking respective operations for the recognized commands;

executing an application comprised of a graphical user interface, the executing including displaying the graphical user interface, the graphical user interface comprising graphic objects representing objects stored on the computing device, the application configured to respond to touch inputs that are directed to the graphical user interface;

receiving the touch inputs respectively comprised of locations, the touch inputs associated with the force components, respectively, each force component corresponding to a measure of force with which a corresponding touch input was inputted via the touch input device;

evaluating the force components against a force condition, wherein (i) each time the force condition is determined to be satisfied by an evaluated force component, based on such determination, a corresponding touch input is passed to the IPA which uses the touch input to identify a target object according to a location of a graphic representation of the target object in the graphical user interface and according to the touch input corresponding to the evaluated force component that satisfied the force condition, and (ii) each time the force condition is determined to not be satisfied by an evaluated force component, the touch input is provided to the application and the application responds to the provided touch input, and wherein a given touch input corresponds to two candidate objects and the method further comprises the IPA and/or the application determining which of the two candidate objects will be the target object.
6. A method performed by a computing device comprising storage hardware, processing hardware, a display, and a touch input device that senses touch inputs and force components thereof, the method performed by the processing hardware executing instructions stored in the storage hardware, the method comprising:

executing an intelligent personal assistant (IPA) on the computing device, the executing comprising using speech recognition to recognize commands inputted through a microphone of the computing device and invoking respective operations for the recognized commands;

executing an application comprised of a graphical user interface, the executing including displaying the graphical user interface, the graphical user interface comprising graphic objects representing objects stored on the computing device, the application configured to respond to touch inputs that are directed to the graphical user interface;

receiving the touch inputs respectively comprised of locations, the touch inputs associated with the force components, respectively, each force component corresponding to a measure of force with which a corresponding touch input was inputted via the touch input device;

evaluating the force components against a force condition, wherein (i) each time the force condition is determined to be satisfied by an evaluated force component, based on such determination, a corresponding touch input is passed to the IPA which uses the touch input to identify a target object according to a location of a graphic representation of the target object in the graphical user interface and according to the touch input corresponding to the evaluated force component that satisfied the force condition, and (ii) each time the force condition is determined to not be satisfied by an evaluated force component, the touch input is provided to the application and the application responds to the provided touch input, and wherein the force condition is used to differentiate, among the touch inputs, between first touch inputs that are to be used to identify objects by the IPA and second touch inputs that are not to be used by the IPA to identify objects.

Dependent claims: 7.
8. A computing device comprising:

processing hardware;

a microphone;

a touch and force sensing display that senses touch inputs and provides respective measures that correspond to forces of the touch inputs; and

storage hardware storing instructions configured to cause the processing hardware to perform a process comprising:

executing an application comprising an object displayed by the computing device;

receiving a voice command inputted through the microphone;

receiving a first touch input corresponding to a first force measure;

determining whether the force measure satisfies a force condition;

executing first logic when the determining determines that the force measure satisfies the force condition, the first logic, when executed, associating the voice command with the object, and executing the voice command based on the association between the voice command and the object; and

executing second logic when the determining does not determine that the force measure satisfies the force condition, the second logic, when executed, providing the touch input to the application based on a location of the touch input.

Dependent claims: 9, 10, 11, 12, 13, 14.
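Claim 8's two branches pair a voice command with a force-selected on-screen object. The following is a minimal sketch of that pairing, not the claimed implementation; the threshold, the object lookup, and all names are assumptions:

```python
# Sketch of claim 8's two branches (threshold and names are assumed).

FORCE_THRESHOLD = 0.6  # assumed force condition for this sketch


def handle_voice_and_touch(voice_command, touch, objects_by_location, application):
    """First logic: associate the voice command with the force-selected object
    and execute it. Second logic: hand the touch to the application by location."""
    if touch["force"] >= FORCE_THRESHOLD:
        target = objects_by_location[touch["location"]]  # object under the touch
        return ("executed", voice_command, target)       # command runs against the object
    return ("to_app", application(touch["location"]))


# A trivial stand-in application handler for the second branch.
def demo_app(location):
    return f"tap at {location}"
```

For example, saying "share" while pressing firmly on a photo binds the command to that photo, whereas a light touch at the same spot is an ordinary tap for the application.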
15. A method performed by a computing device comprising storage hardware, processing hardware, a display, and a touch input device that senses touch inputs and force components, the method performed by the processing hardware executing instructions stored in the storage hardware, the method comprising:

executing an intelligent personal assistant (IPA) on the computing device, the executing comprising using speech recognition to recognize commands inputted through a microphone of the computing device;

receiving a touch input comprised of a pressure component and a location relative to the display, the pressure component corresponding to a pressure value sensed by the touch input device while sensing input points of the touch input; and

determining whether or not to activate the IPA by determining whether the pressure component of the touch input satisfies a pressure condition, wherein the instructions are configured to:

respond to a determination that the pressure component of the touch input satisfies the pressure condition by activating the IPA and providing the touch input to the IPA, and

respond to a determination that the pressure component of the touch input does not satisfy the pressure condition by, instead of activating the IPA and providing the touch input to the IPA, providing the touch input to an application based on the application being associated with the location of the touch input.

Dependent claims: 16, 17, 18, 19, 20.
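Unlike the earlier claims, claim 15 gates activation of the IPA itself on the pressure component. A minimal sketch of that decision, assuming a normalized pressure value and hypothetical names:

```python
# Sketch of claim 15's pressure-gated IPA activation (threshold and names assumed).

PRESSURE_CONDITION = 0.6  # assumed pressure condition for this sketch


def dispatch(touch, app_for_location):
    """Activate the IPA when the pressure condition is satisfied; otherwise
    route the touch to the application associated with its location."""
    if touch["pressure"] >= PRESSURE_CONDITION:
        return ("activate_ipa", touch)            # IPA is activated and receives the touch
    app = app_for_location(touch["location"])     # app associated with the location
    return (app, touch)
```

The design point the claim captures is that the IPA need not already be running: the pressure condition doubles as the invocation gesture, while sub-threshold touches never reach the IPA at all.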
Specification