Obtaining information from an environment of a user of a wearable camera system
First Claim
1. A wearable apparatus for storing information related to objects identified in an environment of a user, the wearable apparatus comprising:
a wearable image sensor configured to capture a plurality of images from the environment of the user while being worn by the user;
an attachment mechanism configured to enable the image sensor to be worn by the user; and
at least one processor programmed to:
receive image data representative of the plurality of images;
analyze, using at least one automated image processing technique, the image data representative of one or more of the plurality of images to detect one or more of a plurality of predetermined triggers;
detect, in the image data representative of at least one of the plurality of images, a trigger indicative of an object leaving a hand of the user;
responsive to detecting the trigger indicative of the object leaving the hand of the user, detect, in the image data representative of at least one of the plurality of images, the object entering a receptacle;
determine, based on the image data representative of at least one of the plurality of images that includes the receptacle, at least a type of the receptacle, the type of receptacle including at least one of a trash receptacle or a recycling receptacle;
detect, in a portion of the image data representative of at least one of the plurality of images that includes the object, one or more characteristics of the object;
determine, based on a comparison of the one or more characteristics in the portion of the image data with one or more images of representative objects stored in a database, at least a category for a type of the object;
determine, based on a comparison of the one or more characteristics in the portion of the image data with one or more images stored in the database and associated with at least one sub-category within the category, an identity of the object;
based on the identity of the object and on the type of the receptacle, generate information related to an action to be taken related to the object; and
provide an instruction to a controlled device associated with the user based on the generated information.
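The pipeline recited in Claim 1 — trigger detection, receptacle classification, a two-stage database comparison (category, then sub-category identity), and action generation — can be sketched in code. This is a minimal illustrative sketch only: every class, function, and field name below is an assumption for exposition and does not appear in the patent, and simple flags and dictionary lookups stand in for the claimed image-processing and database steps.

```python
# Illustrative sketch of the claimed pipeline; all names are assumptions,
# not drawn from the patent. Flags stand in for image-analysis outputs.
from dataclasses import dataclass
from enum import Enum


class ReceptacleType(Enum):
    TRASH = "trash"
    RECYCLING = "recycling"


@dataclass
class Detection:
    object_left_hand: bool        # the claimed trigger
    entered_receptacle: bool      # detected responsive to the trigger
    receptacle_type: ReceptacleType
    characteristics: dict         # features extracted from the object's image portion


def identify(characteristics, category_db, subcategory_db):
    """Two-stage comparison: coarse category first, then identity
    within a sub-category of that category, as the claim recites."""
    category = next((c for c, feats in category_db.items()
                     if characteristics["shape"] in feats), None)
    if category is None:
        return None, None
    identity = subcategory_db.get(category, {}).get(characteristics["label"])
    return category, identity


def generate_action(detection, category_db, subcategory_db):
    """Generate action information from the object's identity and the
    receptacle type, then return an instruction for a controlled device."""
    if not (detection.object_left_hand and detection.entered_receptacle):
        return None
    category, identity = identify(detection.characteristics,
                                  category_db, subcategory_db)
    if identity is None:
        return None
    if detection.receptacle_type is ReceptacleType.TRASH and category == "recyclable":
        return f"remind: {identity} belongs in recycling"
    return f"log: {identity} discarded in {detection.receptacle_type.value}"
```

For example, a recyclable bottle dropped into a trash receptacle would yield a reminder instruction, while other combinations would simply be logged.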
Abstract
A wearable apparatus and method are provided for executing actions based on triggers identified in an environment of a user. In one implementation, a wearable apparatus for storing information related to objects identified in an environment of a user is provided. The wearable apparatus includes a wearable image sensor configured to capture a plurality of images from the environment of the user and at least one processing device. The processing device may be programmed to process the plurality of images to detect an object entering a receptacle, process at least one of the plurality of images that includes the object to determine at least a type of the object, and based on the type of the object, generate information related to an action to be taken related to the object.
7 Citations
12 Claims
1. A wearable apparatus for storing information related to objects identified in an environment of a user (recited in full above). - View Dependent Claims (2, 3, 4)
5. A method for storing information related to objects identified in an environment of a user of a wearable apparatus, comprising:
capturing a plurality of images from the environment of the user by a wearable image sensor while being worn by the user;
receiving, via at least one processing device, image data representative of the plurality of images;
analyzing, via the at least one processing device using at least one automated image processing technique, the image data representative of one or more of the plurality of images to detect one or more of a plurality of predetermined triggers;
detecting, in the image data representative of at least one of the plurality of images, a trigger indicative of an object leaving a hand of the user;
responsive to detecting the trigger indicative of the object leaving the hand of the user, detecting, via the at least one processing device, in the image data representative of at least one of the plurality of images, the object entering a receptacle;
determining, based on the image data representative of at least one of the plurality of images that includes the receptacle, at least a type of the receptacle, the type of receptacle including at least one of a trash receptacle or a recycling receptacle;
detecting, in a portion of the image data representative of at least one of the plurality of images that includes the object, one or more characteristics of the object;
determining, via the at least one processing device, at least a category for a type of the object based on a comparison of the one or more characteristics in the portion of the image data with one or more images of representative objects stored in a database;
determining, based on a comparison of the one or more characteristics in the portion of the image data with one or more images stored in the database and associated with at least one sub-category within the category, an identity of the object;
generating, based on the identity of the object and on the type of the receptacle, information related to an action to be taken related to the object; and
providing an instruction to a controlled device associated with the user based on the generated information. - View Dependent Claims (6, 7, 8)
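The method claim orders its first two detections: the leaving-hand trigger must be found before the object is detected entering a receptacle. That ordering can be sketched as a scan over per-frame analysis results. This is a hypothetical sketch: the frame dictionaries and their boolean flags are assumptions standing in for real image-processing outputs, not anything recited in the patent.

```python
# Hypothetical sketch of the ordered detections in the method claim.
# Per-frame boolean flags stand in for real image-analysis results.
def detect_leaving_hand(frames):
    """Return the index of the first frame where an object previously
    held in the hand is no longer held (the claimed trigger)."""
    for i in range(1, len(frames)):
        if frames[i - 1]["object_in_hand"] and not frames[i]["object_in_hand"]:
            return i
    return None


def detect_trigger_then_receptacle(frames):
    """Only after the trigger fires, scan subsequent frames for the
    object entering a receptacle; return both frame indices or None."""
    t = detect_leaving_hand(frames)
    if t is None:
        return None
    for j in range(t, len(frames)):
        if frames[j].get("object_in_receptacle"):
            return (t, j)
    return None
```

The receptacle check is deliberately skipped until the trigger is observed, mirroring the "responsive to detecting the trigger" limitation.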
9. A non-transitory computer readable medium storing instructions executable by at least one processing device, the instructions including instructions for:
capturing a plurality of images from the environment of a user by a wearable image sensor while being worn by the user;
receiving image data representative of the plurality of images;
analyzing, using at least one automated image processing technique, the image data representative of one or more of the plurality of images to detect one or more of a plurality of predetermined triggers;
detecting, in the image data representative of at least one of the plurality of images, a trigger indicative of an object leaving a hand of the user;
responsive to detecting the trigger indicative of the object leaving the hand of the user, detecting, in the image data representative of at least one of the plurality of images, the object entering a receptacle;
determining, based on the image data representative of at least one of the plurality of images that includes the receptacle, at least a type of the receptacle, the type of receptacle including at least one of a trash receptacle or a recycling receptacle;
detecting, in a portion of the image data representative of at least one of the plurality of images that includes the object, one or more characteristics of the object;
determining at least a category for a type of the object based on a comparison of the one or more characteristics in the portion of the image data with one or more images of representative objects stored in a database;
determining, based on a comparison of the one or more characteristics in the portion of the image data with one or more images stored in the database and associated with at least one sub-category within the category, an identity of the object;
generating, based on the identity of the object and on the type of the receptacle, information related to an action to be taken related to the object; and
providing an instruction to a controlled device associated with the user based on the generated information. - View Dependent Claims (10, 11, 12)
Specification