Intuitive computing methods and systems
Abstract
A smart phone senses audio, imagery, and/or other stimulus from a user's environment, and acts autonomously to fulfill inferred or anticipated user desires. In one aspect, the detailed technology concerns phone-based cognition of a scene viewed by the phone's camera. The image processing tasks applied to the scene can be selected from among various alternatives by reference to resource costs, resource constraints, other stimulus information (e.g., audio), task substitutability, etc. The phone can apply more or less resources to an image processing task depending on how successfully the task is proceeding, or based on the user's apparent interest in the task. In some arrangements, data may be referred to the cloud for analysis, or for gleaning. Cognition, and identification of appropriate device response(s), can be aided by collateral information, such as context. A great number of other features and arrangements are also detailed.
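The abstract's selection of image-processing tasks "by reference to resource costs, resource constraints, ... task substitutability" can be sketched minimally as a budget-constrained, value-per-cost greedy choice. This is an illustrative assumption, not the patent's method; the `Task` fields and `pick_tasks` helper are hypothetical names.

```python
# Hypothetical sketch of cost-aware task selection; names and the greedy
# heuristic are illustrative, not taken from the patent.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    expected_value: float   # estimated usefulness of the task's result
    cpu_cost: float         # estimated CPU consumed (arbitrary units)
    battery_cost: float     # estimated battery drain (arbitrary units)

def pick_tasks(candidates, cpu_budget, battery_budget):
    """Greedily keep tasks with the best value-to-cost ratio that fit both budgets."""
    chosen = []
    ranked = sorted(candidates,
                    key=lambda t: t.expected_value / (t.cpu_cost + t.battery_cost),
                    reverse=True)
    for t in ranked:
        if t.cpu_cost <= cpu_budget and t.battery_cost <= battery_budget:
            chosen.append(t)
            cpu_budget -= t.cpu_cost
            battery_budget -= t.battery_cost
    return chosen

tasks = [Task("barcode", 5, 2, 1), Task("face", 9, 8, 6), Task("ocr", 4, 3, 2)]
print([t.name for t in pick_tasks(tasks, cpu_budget=10, battery_budget=7)])
# → ['barcode', 'ocr']  (the costly face-recognition task exceeds the remaining CPU budget)
```

Substitutable tasks (e.g., a cheap barcode decode standing in for full object recognition) fit this shape naturally: they compete as alternative candidates under the same budgets.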
12 Claims
- 1. A processing device including a processor, memory, a touch screen, a location determination module, and at least one image sensor that produces sensor data about an object in an environment of the device, the memory storing instructions configuring the processor to present on the touch screen a user interface, a first area of the user interface presenting discovery information derived from the image sensor data, the discovery information comprising object-identifying information that identifies the object in said environment, and a second, different area of the user interface simultaneously presenting a representation of a geographic area around the device based on information from the location determination module, the second area of the user interface further displaying markings corresponding to objects previously encountered by a user of the device, and including a time-related control by which said markings can be filtered based on times of such previous encounters.
7. A non-transitory computer readable storage medium containing software instructions that, when executed by a device processor, cause the device to perform acts including:
analyzing sensed image data to yield identifying information about an object;

obtaining information identifying a location; and

displaying two parts of a user interface simultaneously, a first part of the user interface presenting the identifying information about the object, and a second part of the user interface presenting a representation of a geographic area around said location, the second part also including markings corresponding to objects previously encountered by a user, and a time-related control by which said markings can be filtered based on times of such previous encounters. - View Dependent Claims (8, 9)
10. A method comprising:
sensing image data from a user's environment;

analyzing, using a mobile device, the sensed image data to yield identifying information about an object in the user's environment;

sensing a location of the user;

displaying a user interface, a first part of the user interface presenting the identifying information about the object in the user's environment, and a second, different part of the user interface presenting a representation of a geographic area around the sensed location of the user;

wherein the method includes displaying said parts of the user interface simultaneously; and

wherein the second part of the user interface displays markings corresponding to objects previously encountered by a user, and includes a time-related control by which said markings can be filtered based on times of such previous encounters. - View Dependent Claims (11, 12)
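The time-related control recited in the claims can be sketched minimally as a cutoff filter over the stored encounter markings. The dictionary fields (`label`, `encountered_at`) and the `filter_markings` helper are illustrative assumptions, not terms from the patent.

```python
# Hypothetical sketch of the claims' time-related control: filtering map
# markings for previously encountered objects by encounter time.
from datetime import datetime, timedelta

def filter_markings(markings, newer_than):
    """Keep only markings whose encounter time falls at or after `newer_than`."""
    return [m for m in markings if m["encountered_at"] >= newer_than]

now = datetime(2024, 1, 8)
markings = [
    {"label": "statue", "encountered_at": datetime(2024, 1, 7)},
    {"label": "mural",  "encountered_at": datetime(2023, 12, 1)},
]
recent = filter_markings(markings, newer_than=now - timedelta(days=7))
# recent retains only the "statue" marking; the month-old "mural" is filtered out
```

A slider-style control would simply re-run this filter with a different `newer_than` cutoff as the user drags it, and the map area would redraw the surviving markings.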
Specification