WEARABLE EYEGLASSES FOR PROVIDING SOCIAL AND ENVIRONMENTAL AWARENESS
First Claim
1. Eyeglasses to be worn by a user, comprising:
a left lens;
a right lens;
an inertial measurement unit (IMU) sensor coupled to the eyeglasses and configured to detect inertial measurement data corresponding to a positioning, velocity, or acceleration of the eyeglasses;
a global positioning system (GPS) unit coupled to the eyeglasses and configured to detect location data corresponding to a location of the eyeglasses;
at least one camera positioned on at least one of the left lens or the right lens and coupled to the eyeglasses, the at least one camera configured to detect image data corresponding to a surrounding environment of the eyeglasses;
a memory configured to store object data regarding previously determined objects and previously determined user data associated with the user;
a processor connected to the IMU, the GPS unit, and the at least one camera and configured to:
recognize an object in the surrounding environment by analyzing the image data based on the stored object data and at least one of the inertial measurement data or the location data,
determine a desirable event or action based on the recognized object, the previously determined user data, and a current time or day,
determine a destination based on the determined desirable event or action,
determine a navigation path for navigating the eyeglasses to the destination based on the determined destination, the image data, and at least one of the inertial measurement data or the location data, and
determine output data based on the determined navigation path; and
a speaker configured to provide audio information to the user based on at least one of the recognized object, determined desirable event or action, or navigation path.
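The claimed processor pipeline (recognize an object, determine a desirable event, determine a destination, compute a navigation path, produce output data) can be illustrated with a minimal sketch. All names, data structures, and matching rules below are hypothetical illustrations for reading the claim, not the patented implementation.

```python
from dataclasses import dataclass

@dataclass
class SensorFrame:
    """One snapshot of the claimed sensor inputs."""
    image: str            # stand-in key for camera image data
    location: tuple       # (lat, lon) from the GPS unit
    acceleration: float   # sample from the IMU sensor

def recognize_object(frame, stored_objects):
    # Match image data against previously determined objects in memory.
    return stored_objects.get(frame.image)

def desirable_event(obj, user_data, hour):
    # Combine the recognized object, stored user data, and current time.
    if obj == "bus stop" and user_data.get("commutes") and 7 <= hour <= 9:
        return "catch morning bus"
    return None

def destination_for(event):
    # Hypothetical lookup from event to a destination coordinate.
    return {"catch morning bus": (40.75, -73.99)}.get(event)

def navigation_path(frame, dest):
    # A real system would fuse image, IMU, and GPS data; this sketch
    # returns a straight-line waypoint list from current location.
    return [frame.location, dest]

def run_pipeline(frame, stored_objects, user_data, hour):
    obj = recognize_object(frame, stored_objects)
    event = desirable_event(obj, user_data, hour)
    if event is None:
        return None
    dest = destination_for(event)
    return {"object": obj, "event": event,
            "path": navigation_path(frame, dest)}
```

A call such as `run_pipeline(SensorFrame("img_042", (40.74, -73.98), 0.1), {"img_042": "bus stop"}, {"commutes": True}, 8)` then yields the recognized object, the chosen event, and the path whose final waypoint is the destination, mirroring the order of the claim's "configured to" clauses.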
1 Assignment
0 Petitions
Abstract
Eyeglasses include a left lens, a right lens, an IMU sensor, and a GPS unit. A camera and a memory are coupled to the eyeglasses. A processor is connected to the IMU, the GPS unit, and the camera and is adapted to recognize objects by analyzing image data based on stored object data and inertial measurement data or location data. The processor is also adapted to determine a desirable event based on the recognized object, previously determined user data, and a time; to determine a destination based on the determined desirable event; and to determine a navigation path for navigating the eyeglasses to the destination based on the determined destination, the image data, and inertial measurement data or location data. The processor is further adapted to determine output data based on the determined navigation path. A speaker provides audio information to the user.
123 Citations
20 Claims
1. Eyeglasses to be worn by a user, comprising:
a left lens;
a right lens;
an inertial measurement unit (IMU) sensor coupled to the eyeglasses and configured to detect inertial measurement data corresponding to a positioning, velocity, or acceleration of the eyeglasses;
a global positioning system (GPS) unit coupled to the eyeglasses and configured to detect location data corresponding to a location of the eyeglasses;
at least one camera positioned on at least one of the left lens or the right lens and coupled to the eyeglasses, the at least one camera configured to detect image data corresponding to a surrounding environment of the eyeglasses;
a memory configured to store object data regarding previously determined objects and previously determined user data associated with the user;
a processor connected to the IMU, the GPS unit, and the at least one camera and configured to:
recognize an object in the surrounding environment by analyzing the image data based on the stored object data and at least one of the inertial measurement data or the location data,
determine a desirable event or action based on the recognized object, the previously determined user data, and a current time or day,
determine a destination based on the determined desirable event or action,
determine a navigation path for navigating the eyeglasses to the destination based on the determined destination, the image data, and at least one of the inertial measurement data or the location data, and
determine output data based on the determined navigation path; and
a speaker configured to provide audio information to the user based on at least one of the recognized object, determined desirable event or action, or navigation path. - View Dependent Claims (2, 3, 4, 5, 6, 7, 8)
9. A method for providing continuous social and environmental awareness by eyeglasses comprising:
detecting, via a camera, a GPS unit, or an IMU, inertial measurement data corresponding to a positioning, velocity, or acceleration of the eyeglasses, location data corresponding to a location of the eyeglasses, or image data corresponding to a surrounding environment of the eyeglasses;
storing, in a memory, object data regarding previously determined objects and previously determined user data regarding a user;
recognizing, by a processor, an object in the surrounding environment by analyzing the image data based on the stored object data and at least one of the inertial measurement data or the location data;
determining, by the processor:
a desirable event or action based on the recognized object, the previously determined user data, and a current time or day,
a destination based on the determined desirable event or action,
a navigation path for navigating the eyeglasses to the destination based on the determined destination, the image data, and at least one of the inertial measurement data or the location data, or
output data based on the determined navigation path; and
providing, via a speaker or a vibration unit, audio or haptic information to the user based on at least one of the recognized object, the determined desirable event or action, or the navigation path. - View Dependent Claims (10, 11, 12, 13, 14)
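Method claim 9 adds that the determined output may reach the user as either audio (speaker) or haptic (vibration unit) information. A minimal sketch of that final "providing" step, with hypothetical names and a made-up pulse encoding, might look like:

```python
def render_output(navigation_path, prefer_haptic=False):
    """Turn a determined navigation path into user feedback, delivered
    either as haptic pulses (vibration unit) or a spoken cue (speaker)."""
    if not navigation_path:
        return ("speaker", "No route available")
    segments = len(navigation_path) - 1  # waypoint pairs remaining
    if prefer_haptic:
        # Hypothetical encoding: one 250 ms pulse per path segment.
        return ("vibration", [250] * segments)
    return ("speaker", f"Route with {segments} segment(s) ahead")
```

The two-element return value stands in for dispatching the same output data to whichever transducer the user has selected, as the claim's "speaker or a vibration unit" language permits.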
15. Eyeglasses to be worn by a user, comprising:
a right lens;
a left lens;
an inertial measurement unit (IMU) sensor coupled to the eyeglasses and configured to detect inertial measurement data corresponding to a positioning, velocity, or acceleration of the eyeglasses;
a global positioning system (GPS) unit coupled to the eyeglasses and configured to detect location data corresponding to a location of the eyeglasses;
at least one camera positioned on at least one of the right lens or the left lens and coupled to the eyeglasses, the at least one camera configured to detect image data corresponding to a surrounding environment of the eyeglasses;
a memory configured to store object data regarding previously determined objects and previously determined user data associated with the user;
an antenna configured to transmit the image data, the inertial measurement data, the location data, and the object data to a remote processor and to receive processed data from the remote processor, the remote processor configured to:
recognize an object in the surrounding environment by analyzing the image data based on the stored object data and at least one of the inertial measurement data or the location data,
determine a desirable event or action based on the recognized object, the previously determined user data, and a current time or day,
determine a destination based on the determined desirable event or action,
determine a navigation path for navigating the eyeglasses to the destination based on the determined destination, the image data, and at least one of the inertial measurement data or the location data, and
determine output data based on the determined navigation path; and
a speaker configured to provide audio information to the user based on at least one of the recognized object, determined desirable event or action, or navigation path. - View Dependent Claims (16, 17, 18, 19, 20)
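Claim 15 differs from claim 1 in that the processing is offloaded: an antenna transmits the sensor and object data to a remote processor and receives processed data back. The uplink/downlink round trip can be sketched as a simple message exchange; the JSON framing and all function names are assumptions for illustration, not part of the claim.

```python
import json

def pack_uplink(image_data, imu_data, location, object_data):
    # The antenna transmits image, IMU, location, and stored object
    # data to the remote processor as one serialized message.
    return json.dumps({
        "image": image_data,
        "imu": imu_data,
        "location": location,
        "objects": object_data,
    })

def remote_process(message):
    # Stand-in for the remote processor: recognize the object from the
    # transmitted object data and return a trivial two-point "path".
    data = json.loads(message)
    obj = data["objects"].get(data["image"], "unknown")
    return json.dumps({"object": obj, "path": [data["location"], [0, 0]]})

def unpack_downlink(message):
    # Processed data received back over the antenna.
    return json.loads(message)
```

Keeping recognition and path planning on the remote side lets the eyeglasses carry only sensors, a radio, and a speaker, which is the architectural distinction this independent claim captures.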
Specification