Apparatus and methods for context determination using real time sensor data
First Claim
1. A computerized method for providing a remote control command to a computerized device based on a sequence of digital images, the method comprising:
determining a discrepancy measure based on a comparison of pixels of a current image of the sequence of digital images to a reference image;
determining a salient feature based on an analysis of the discrepancy measure, the salient feature being associated with a portion of the pixels within the current image; and
based on an existence of a previously established association between an occurrence of a user indication associated with an action by the computerized device and the salient feature, automatically transmitting a command to the computerized device, the command configured to cause the computerized device to execute the action;
wherein:
the salient feature comprises a representation of a user body portion;
the reference image comprises an image acquired prior to the current image without the representation of the user body portion; and
the reference image is based on a low pass filter operation on a plurality of images from the sequence of digital images, where the individual ones of the plurality of images precede the current image.
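The reference-image and discrepancy steps recited above can be sketched as a running-average (low-pass) background model compared pixel-by-pixel against the current frame. This is a minimal illustration, assuming grayscale frames as nested Python lists; the blending factor `alpha`, the threshold, and the toy frames are illustrative assumptions, not values from the patent.

```python
# Sketch of the claimed steps: the reference image is a low-pass (running
# average) filter over prior frames, and the discrepancy measure is a
# per-pixel comparison of the current frame against that reference.
# alpha and threshold are illustrative assumptions.

def update_reference(reference, frame, alpha=0.05):
    """Low-pass filter: blend each new frame into the reference image."""
    return [[(1 - alpha) * r + alpha * f for r, f in zip(rrow, frow)]
            for rrow, frow in zip(reference, frame)]

def discrepancy(reference, frame):
    """Per-pixel absolute difference between current frame and reference."""
    return [[abs(f - r) for r, f in zip(rrow, frow)]
            for rrow, frow in zip(reference, frame)]

def salient_pixels(diff, threshold=30.0):
    """Pixels whose discrepancy exceeds the threshold form the salient feature."""
    return [(y, x) for y, row in enumerate(diff)
            for x, d in enumerate(row) if d > threshold]

# Toy 2x2 grayscale frames: a flat background, then a bright region appears
# (standing in for the user body portion absent from the reference image).
background = [[10.0, 10.0], [10.0, 10.0]]
reference = background
for _ in range(20):                      # build reference from prior frames
    reference = update_reference(reference, background)

current = [[10.0, 200.0], [10.0, 10.0]]
diff = discrepancy(reference, current)
print(salient_pixels(diff))              # -> [(0, 1)]
```

Because only pixels that differ from the accumulated background survive the threshold, the salient feature isolates exactly the portion of the current image occupied by the newly appearing body portion.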
Abstract
Computerized appliances may be operated by users remotely. In one exemplary implementation, a learning controller apparatus may be operated to determine an association between a user indication and an action by the appliance. The user indications (e.g., gestures, posture changes, audio signals) may trigger an event associated with the controller. The event may be linked to a plurality of instructions configured to communicate a command to the appliance. The learning apparatus may receive sensory input conveying information about the robot's state and environment (context). The sensory input may be used to determine the user indications. During operation, upon determining the indication from the sensory input, the controller may cause execution of the respective instructions in order to trigger the action by the appliance. Device animation methodology may enable users to operate computerized appliances using gestures, voice commands, posture changes, and/or other customized control elements.
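The controller behavior summarized in the abstract, learning an association between a user indication and an appliance action, then replaying the command when the indication recurs, can be sketched as follows. All names here are illustrative assumptions, not identifiers from the patent.

```python
# Minimal sketch of the learning-controller idea: record an association
# between a detected user indication (context) and an appliance action,
# then transmit the associated command when the indication recurs.

class LearningController:
    def __init__(self, send_command):
        self.associations = {}            # indication -> action
        self.send_command = send_command  # callable that reaches the appliance

    def learn(self, indication, action):
        """Associate a user indication (e.g. a gesture label) with an action."""
        self.associations[indication] = action

    def observe(self, indication):
        """On a recognized indication, transmit the associated command."""
        action = self.associations.get(indication)
        if action is not None:
            self.send_command(action)
        return action

sent = []
controller = LearningController(sent.append)
controller.learn("wave_gesture", "start_cleaning")
controller.observe("wave_gesture")
print(sent)  # -> ['start_cleaning']
```

An unrecognized indication simply produces no command, matching the claim language that transmission is conditioned on a previously established association.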
25 Claims
1. A computerized method for providing a remote control command to a computerized device based on a sequence of digital images (recited in full above as the First Claim). - View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13)
14. A computerized method for providing a remote control command to a computerized device based on a sequence of digital images, the method comprising:
determining a discrepancy measure based on a comparison of pixels of a current image of the sequence of digital images to a reference image;
determining a salient feature based on an analysis of the discrepancy measure, the salient feature being associated with a portion of the pixels within the current image;
based on an existence of a previously established association between an occurrence of a user indication associated with an action by the computerized device and the salient feature, automatically transmitting a command to the computerized device, the command configured to cause the computerized device to execute the action; and
loading another set of instructions which were previously configured to cause execution of another task by another computerized device;
wherein:
the computerized device comprises a household appliance configured to perform a cleaning task of a user premises;
a state comprises information related to the user premises; and
the loading the another set of instructions is triggered automatically by the computerized device based on a characteristic of the user premises.
- View Dependent Claims (15, 16, 17, 18, 19, 20, 21)
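The automatic task-switching limitation of claim 14, loading a previously configured instruction set based on a characteristic of the user premises, can be illustrated with a simple lookup. The mapping of premises characteristics to instruction sets below is a hypothetical example, not from the patent.

```python
# Hedged sketch of claim 14's automatic instruction loading: the cleaning
# appliance selects a previously configured instruction set keyed by a
# characteristic of the premises (e.g. floor type). Entries are illustrative.

TASK_LIBRARY = {
    "carpet": ["raise_brush", "vacuum_pattern_a"],
    "hardwood": ["lower_brush", "mop_pattern_b"],
}

def load_instructions(premises_characteristic):
    """Load the instruction set previously configured for this premises state."""
    return TASK_LIBRARY.get(premises_characteristic, ["idle"])

print(load_instructions("hardwood"))  # -> ['lower_brush', 'mop_pattern_b']
```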
22. A non-transitory computer-readable storage medium having instructions embodied thereon to determine an association between a sensory context and an action indication for an appliance device, the instructions when executed by a processing apparatus cause the processing apparatus to:
determine a first sensory context version based on a first sensory modality and a high pass filter operation version using the first sensory modality, the high pass filter operation characterized by a decay time scale;
determine a second sensory context version based on a second sensory modality;
when the second sensory context version occurs within a first time window from occurrence of the first sensory context version, assign the first sensory context version and the second context version as the sensory context; and
associate the sensory context with the action indication based on occurrence of the action indication within a second time window from at least one of the first sensory context version or the second sensory context version;
wherein the association between the sensory context and the action indication for the appliance device is configured to enable automatic provision of a command to the appliance device based on an occurrence of the sensory context, the command configured to cause the appliance device to execute the action; and
wherein one or more data of the first sensory modality comprises a sequence of images, and the decay time scale is at least five times longer than the second time window.
- View Dependent Claims (23)
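The time-window logic of claim 22, fusing two sensory context versions only when they occur close together in time and then associating the fused context with a nearby action indication, can be sketched as follows. Timestamps are in seconds, and both window lengths are illustrative assumptions.

```python
# Sketch of claim 22's windowing: two context versions (e.g. a video-derived
# cue and an audio cue) are assigned as one sensory context only if they fall
# within a first time window of each other; the fused context is associated
# with an action indication arriving within a second time window.

FIRST_WINDOW = 1.0    # max gap between the two context versions (illustrative)
SECOND_WINDOW = 2.0   # max gap between context and action indication

def fuse_context(t_first, t_second):
    """Assign both versions as the sensory context if close enough in time."""
    if abs(t_second - t_first) <= FIRST_WINDOW:
        return ("fused_context", max(t_first, t_second))
    return None

def associate(context, t_action):
    """Associate the fused context with an action occurring within the window."""
    if context is None:
        return False
    _, t_context = context
    return abs(t_action - t_context) <= SECOND_WINDOW

ctx = fuse_context(t_first=10.0, t_second=10.4)   # within first window
print(associate(ctx, t_action=11.5))              # -> True
print(associate(fuse_context(10.0, 15.0), 11.5))  # -> False (no fused context)
```

The claim's further constraint that the high-pass filter's decay time scale be at least five times the second window would, in this sketch, simply bound the choice of filter constant relative to `SECOND_WINDOW`.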
24. A non-transitory computer-readable storage medium having instructions embodied thereon to determine an association between a sensory context and an action indication for an appliance device, the instructions when executed by a processing apparatus cause the processing apparatus to:
determine a first sensory context version based on a first sensory modality, the first sensory context version comprising a transformation of individual ones of a sequence of input images to transformed images, the individual ones of the transformed images characterized by a first data rate that is at least ten times lower than a second data rate of respective input images;
determine a second sensory context version based on a second sensory modality;
when the second sensory context version occurs within a first time window from occurrence of the first sensory context version, assign the first sensory context version and the second context version as the sensory context; and
associate the sensory context with the action indication based on occurrence of the action indication within a second time window from at least one of the first sensory context version or the second sensory context version;
wherein the association between the sensory context and the action indication for the appliance device is configured to enable automatic provision of a command to the appliance device based on an occurrence of the sensory context, the command configured to cause the appliance device to execute the action;
wherein one or more data of the first sensory modality comprises a sequence of input images provided by a video camera.
- View Dependent Claims (25)
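The data-rate limitation of claim 24, transformed images at least ten times smaller than the inputs, can be met by many transformations; one simple possibility is block averaging, sketched below. The 4x4 block size (a 16x reduction in pixel count) is an illustrative assumption, not a value from the patent.

```python
# Sketch of a transformation satisfying claim 24's data-rate constraint:
# each grayscale input frame (nested lists) is downsampled by averaging
# non-overlapping 4x4 blocks, cutting the pixel count by a factor of 16.

def block_average(image, block=4):
    """Downsample a 2D grayscale image by averaging non-overlapping blocks."""
    h, w = len(image), len(image[0])
    out = []
    for y in range(0, h, block):
        row = []
        for x in range(0, w, block):
            vals = [image[yy][xx]
                    for yy in range(y, min(y + block, h))
                    for xx in range(x, min(x + block, w))]
            row.append(sum(vals) / len(vals))
        out.append(row)
    return out

frame = [[float(x) for x in range(8)] for _ in range(8)]  # 8x8 input
small = block_average(frame)                              # 2x2 output
print(len(small) * len(small[0]), "pixels from", 8 * 8)   # -> 4 pixels from 64
```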
Specification