Personal audio/visual apparatus providing resource management
First Claim
1. A method for providing resource management using one or more personal audiovisual (A/V) apparatus including a near-eye, augmented reality (AR) display comprising:
automatically identifying a resource based on image data of the resource captured by at least one capture device of at least one personal A/V apparatus and object reference data;
automatically tracking a three dimensional (3D) space position of the resource in a location identified based on location data detected by the at least one personal A/V apparatus;
automatically determining a property of the resource based on the image data of the resource;
automatically tracking the property of the resource;
automatically causing display of image data related to the resource in the near-eye, augmented reality display based on a notification criterion for the property associated with the resource;
causing the near-eye, AR display to display discrepancy data of item descriptions and pricing information of a purchase list compared to store purchase data; and
performing a task, wherein the task is an eating monitoring application, such that at least one user physical action of at least one body part with respect to the resource is automatically identified within an approximation to a mouth of a wearer of the near-eye, AR display, and the identified resource is determined to be moving to and from the mouth so as to recognize at least a reduction in an amount of the resource.
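The identification, property-tracking, and notification steps of claim 1 can be sketched as a small state machine: a resource record is created from image data, a property is updated as new frames arrive, and display of related image data is triggered when a notification criterion for that property is satisfied. A minimal sketch, assuming hypothetical `Resource` and `ResourceTracker` structures and a below-20% "amount" criterion (none of these names appear in the specification):

```python
from dataclasses import dataclass, field

@dataclass
class Resource:
    """A tracked resource identified from image data (hypothetical record)."""
    name: str
    position_3d: tuple                       # (x, y, z) space position in the mapped location
    properties: dict = field(default_factory=dict)

class ResourceTracker:
    """Tracks a property of each resource and records a notification
    when the notification criterion for that property is satisfied."""
    def __init__(self):
        self.resources = {}
        self.criteria = {}                   # resource name -> (property, predicate)
        self.notifications = []

    def identify(self, resource):
        self.resources[resource.name] = resource

    def set_criterion(self, name, prop, predicate):
        self.criteria[name] = (prop, predicate)

    def update_property(self, name, prop, value):
        res = self.resources[name]
        res.properties[prop] = value
        crit = self.criteria.get(name)
        if crit and crit[0] == prop and crit[1](value):
            # In the claimed method this would cause display of image data
            # in the near-eye AR display; here we just record the event.
            self.notifications.append((name, prop, value))

tracker = ResourceTracker()
tracker.identify(Resource("milk", (1.0, 0.5, 2.0)))
# Eating/consumption monitoring: notify when the remaining amount drops below 20%
tracker.set_criterion("milk", "amount", lambda v: v < 0.2)
tracker.update_property("milk", "amount", 0.8)    # criterion not yet satisfied
tracker.update_property("milk", "amount", 0.15)   # criterion satisfied; notification recorded
```

In the claimed method the amount reduction would be recognized from the resource moving to and from the wearer's mouth; the sketch abstracts that away into direct property updates.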
Abstract
Technology is described for resource management based on data including image data of a resource captured by at least one capture device of at least one personal audiovisual (A/V) apparatus including a near-eye, augmented reality (AR) display. A resource is automatically identified from image data captured by at least one capture device of at least one personal A/V apparatus and object reference data. A location in which the resource is situated and a 3D space position or volume of the resource in the location are tracked. A property of the resource is also determined from the image data and tracked. A function of a resource may also be stored for determining whether the resource is usable for a task. Responsive to notification criteria for the resource being satisfied, image data related to the resource is displayed on the near-eye, AR display.
20 Claims
1. A method for providing resource management using one or more personal audiovisual (A/V) apparatus including a near-eye, augmented reality (AR) display comprising:
automatically identifying a resource based on image data of the resource captured by at least one capture device of at least one personal A/V apparatus and object reference data;
automatically tracking a three dimensional (3D) space position of the resource in a location identified based on location data detected by the at least one personal A/V apparatus;
automatically determining a property of the resource based on the image data of the resource;
automatically tracking the property of the resource;
automatically causing display of image data related to the resource in the near-eye, augmented reality display based on a notification criterion for the property associated with the resource;
causing the near-eye, AR display to display discrepancy data of item descriptions and pricing information of a purchase list compared to store purchase data; and
performing a task, wherein the task is an eating monitoring application, such that at least one user physical action of at least one body part with respect to the resource is automatically identified within an approximation to a mouth of a wearer of the near-eye, AR display, and the identified resource is determined to be moving to and from the mouth so as to recognize at least a reduction in an amount of the resource.
Dependent Claims: 2, 3, 4, 5, 6, 7, 8, 9
10. A personal audiovisual (A/V) apparatus including a near-eye, augmented reality display for providing resource management comprising:
the near-eye, augmented reality display having a display field of view and being supported by a near-eye support structure;
one or more processors being communicatively coupled to the near-eye, augmented reality display for controlling the display;
the one or more processors being communicatively coupled to at least one capture device on the near-eye support structure for receiving image data captured by the at least one capture device of a location in which the support structure is situated;
the one or more processors automatically identifying a real object in the image data based on object reference data stored in an accessible memory;
the one or more processors identifying the real object as a resource falling within a category of a resource profile based on a type of object determined for the real object, and associating a shared resource data record for the resource with the resource profile in data stored in a memory;
the one or more processors identifying and storing in the shared resource data record a three dimensional (3D) space position of the resource in the location based on the image data and a 3D mapping of the location, automatically tracking a change in the 3D space position of the resource and the position of a volume of the resource in the location, and updating the shared resource data record to identify the location in which the resource is located and the position of the volume occupied by the resource;
the one or more processors identifying a sought-after resource selected by a user; and
the one or more processors, responsive to any object blocking the selected sought-after resource in the display field of view, causing an automatic display of visual data of an indicator to identify a location of the selected sought-after resource blocked by the object.
Dependent Claims: 11, 12, 13, 14, 15, 16
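The final clauses of claim 10 (detecting that an object blocks a sought-after resource and displaying an indicator to locate it) reduce to a line-of-sight test against the 3D mapping of the location. A minimal sketch, assuming point positions for each object and a fixed blocking radius; the function names, the radius value, and the indicator format are all illustrative assumptions:

```python
import math

def is_blocking(viewer, target, obstacle, radius=0.3):
    """Return True if `obstacle` lies near the line of sight from `viewer`
    to `target` and is closer to the viewer than the target is.
    All points are (x, y, z) positions from the 3D mapping of the location."""
    to_target = [t - v for t, v in zip(target, viewer)]
    to_obstacle = [o - v for o, v in zip(obstacle, viewer)]
    t_len = math.sqrt(sum(c * c for c in to_target))
    if t_len == 0:
        return False
    unit = [c / t_len for c in to_target]
    proj = sum(u * o for u, o in zip(unit, to_obstacle))  # distance along the sight ray
    if proj <= 0 or proj >= t_len:
        return False                                       # behind viewer, or beyond target
    closest = [v + u * proj for v, u in zip(viewer, unit)]
    dist = math.sqrt(sum((c - o) ** 2 for c, o in zip(closest, obstacle)))
    return dist < radius

def occlusion_indicator(viewer, sought, objects, radius=0.3):
    """If any mapped object blocks the sought resource, return indicator
    data identifying where the blocked resource is located."""
    for name, pos in objects.items():
        if is_blocking(viewer, sought, pos, radius):
            return {"indicator": "resource behind " + name, "position": sought}
    return None

# A sought resource directly behind a cereal box, with a mug off to the side
objects = {"cereal box": (0.0, 0.1, 1.0), "mug": (1.0, 0.0, 1.0)}
hint = occlusion_indicator((0.0, 0.0, 0.0), (0.0, 0.0, 2.0), objects)
```

A production system would test against the volumes the claim tracks rather than point positions, but the ray test above is the core of the blocking determination.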
17. One or more processor readable storage devices comprising instructions which cause one or more processors to execute a method for providing resource management using one or more personal audiovisual (A/V) apparatus including a near-eye, augmented reality display, the method comprising:
automatically identifying a food resource based on image data of the food resource captured by at least one capture device of at least one personal A/V apparatus and object reference data;
automatically tracking a 3D space position of the food resource in a location identified based on location data detected by the at least one personal A/V apparatus;
automatically determining a property of the food resource based on the image data of the food resource;
automatically identifying a function associated with the food resource based on the object reference data;
storing data identifying the location of the food resource, the property of the food resource and the function associated with the food resource in a resource data record associated with the food resource, the location, property and function forming the basis upon which one or more tasks related to the food resource determine actions to perform and task data to output, the actions to perform based on a user of the near-eye, augmented reality display identifying a set of food resources;
identifying one or more recipes to be made using the identified set of food resources;
automatically outputting the one or more recipes to the augmented reality display;
determining actions to perform based on the user of the near-eye, augmented reality display identifying at least one of the one or more recipes;
storing the one or more selected recipes in the resource data record; and
associating the resource data record with a resource profile associated with a user of the personal A/V apparatus.
Dependent Claims: 18, 19, 20
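The recipe steps of claim 17 amount to matching the identified set of food resources against stored recipes and recording the user's selection in the resource data record. A minimal sketch, with a hypothetical recipe table and a plain dict standing in for the claimed resource data record (the `RECIPES` contents and function names are illustrative, not from the specification):

```python
# Hypothetical recipe store: recipe name -> required food resources
RECIPES = {
    "omelette": {"eggs", "butter"},
    "pancakes": {"eggs", "flour", "milk"},
    "toast": {"bread", "butter"},
}

def identify_recipes(identified_foods, recipes=RECIPES):
    """Return the recipes that can be made entirely from the set of
    food resources the wearer has identified."""
    have = set(identified_foods)
    return sorted(name for name, needed in recipes.items() if needed <= have)

def select_recipe(record, recipe):
    """Store a user-selected recipe in the resource data record and
    return the updated record."""
    record.setdefault("selected_recipes", []).append(recipe)
    return record

found = identify_recipes({"eggs", "butter", "bread", "milk"})
# -> ['omelette', 'toast']  (pancakes excluded: no flour identified)
record = select_recipe({"resource": "eggs", "location": "kitchen"}, found[0])
```

The subset test (`needed <= have`) is the simplest matching rule; a fuller system could rank partial matches or suggest the missing ingredients.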
Specification