Wearable eyeglasses for providing social and environmental awareness
First Claim
1. A wearable computing device having an eyeglasses form and designed to be worn by a user, comprising:
a body having a frame, a left lens, and a right lens;
an inertial measurement unit (IMU) attached to the body and configured to detect inertial measurement data corresponding to a positioning, velocity, or acceleration of the body;
a global positioning system (GPS) sensor attached to the body and configured to detect global positioning data corresponding to a global position of the body;
at least one camera attached to the body and configured to detect image data corresponding to a surrounding environment of the body and a moving object or person in the surrounding environment;
a memory attached to the body and configured to store object data regarding previously determined objects, previously determined user data associated with the user, and a preferred distance between the body and the moving object or person;
a processor attached to the body and electrically coupled to the IMU, the GPS sensor, the at least one camera, and the memory, and configured to:
determine a current location of the body based on at least one of the inertial measurement data, the global positioning data, or the image data,
recognize an object in the surrounding environment by limiting an object identification search based on the current location of the body and by analyzing the image data based on the stored object data and the limited object identification search,
determine an event or action to be performed based on the recognized object, the previously determined user data, and a current time or day,
determine a destination based on the determined event or action to be performed,
determine a navigation path from the current location of the body to the destination based on the determined destination, the image data, and at least one of the inertial measurement data or the global positioning data,
determine a current distance between the body and the moving object or person,
determine that a current speed of the body should increase when the current distance between the body and the moving object or person is greater than the preferred distance, and
determine that the current speed of the body should decrease when the current distance between the body and the moving object or person is less than the preferred distance; and
a speaker attached to the body, electrically coupled to the processor, and configured to:
provide audio information to the user based on at least one of the recognized object, the determined event or action to be performed, or the navigation path,
provide audio information to the user to increase a current walking speed when the processor determines that the current speed of the body should increase, and
provide audio information to the user to decrease the current walking speed when the processor determines that the current speed of the body should decrease.
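The speed-guidance limitation above reduces to a comparison between the measured distance and the stored preferred distance. The following is a minimal sketch of that logic; the function name, cue strings, and the choice to stay silent at exactly the preferred distance are illustrative assumptions, not language from the patent.

```python
from typing import Optional

def speed_guidance(current_distance: float, preferred_distance: float) -> Optional[str]:
    """Return an audio cue per the claimed logic: advise speeding up when the
    tracked object or person is farther away than the preferred distance,
    slowing down when closer, and (by assumption) no cue when equal."""
    if current_distance > preferred_distance:
        return "increase walking speed"
    if current_distance < preferred_distance:
        return "decrease walking speed"
    return None  # exactly at the preferred distance: no cue issued
```

In a device loop, `current_distance` would come from analyzing the camera's image data each frame, and the returned cue would be rendered through the speaker.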
1 Assignment
0 Petitions
Abstract
Eyeglasses include a left lens, a right lens, an IMU sensor, and a GPS unit. A camera and a memory are coupled to the eyeglasses. A processor is connected to the IMU, the GPS unit, and the camera, and is adapted to recognize objects by analyzing image data based on the stored object data and inertial measurement data or location data. The processor is also adapted to determine a desirable event based on the object, previously determined user data, and a time. The processor is further adapted to determine a destination based on the determined desirable event, and to determine a navigation path for navigating the eyeglasses to the destination based on the determined destination, image data, and inertial measurement data or location data. The processor is also adapted to determine output data based on the determined navigation path. A speaker is also provided.
415 Citations
20 Claims
1. A wearable computing device having an eyeglasses form and designed to be worn by a user, comprising:
a body having a frame, a left lens, and a right lens;
an inertial measurement unit (IMU) attached to the body and configured to detect inertial measurement data corresponding to a positioning, velocity, or acceleration of the body;
a global positioning system (GPS) sensor attached to the body and configured to detect global positioning data corresponding to a global position of the body;
at least one camera attached to the body and configured to detect image data corresponding to a surrounding environment of the body and a moving object or person in the surrounding environment;
a memory attached to the body and configured to store object data regarding previously determined objects, previously determined user data associated with the user, and a preferred distance between the body and the moving object or person;
a processor attached to the body and electrically coupled to the IMU, the GPS sensor, the at least one camera, and the memory, and configured to:
determine a current location of the body based on at least one of the inertial measurement data, the global positioning data, or the image data,
recognize an object in the surrounding environment by limiting an object identification search based on the current location of the body and by analyzing the image data based on the stored object data and the limited object identification search,
determine an event or action to be performed based on the recognized object, the previously determined user data, and a current time or day,
determine a destination based on the determined event or action to be performed,
determine a navigation path from the current location of the body to the destination based on the determined destination, the image data, and at least one of the inertial measurement data or the global positioning data,
determine a current distance between the body and the moving object or person,
determine that a current speed of the body should increase when the current distance between the body and the moving object or person is greater than the preferred distance, and
determine that the current speed of the body should decrease when the current distance between the body and the moving object or person is less than the preferred distance; and
a speaker attached to the body, electrically coupled to the processor, and configured to:
provide audio information to the user based on at least one of the recognized object, the determined event or action to be performed, or the navigation path,
provide audio information to the user to increase a current walking speed when the processor determines that the current speed of the body should increase, and
provide audio information to the user to decrease the current walking speed when the processor determines that the current speed of the body should decrease.
- View Dependent Claims (2, 3, 4, 5, 6, 7, 8)
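Claim 1's location-limited object identification could, under one reading, amount to filtering the memory's stored object data down to entries recorded near the current location before any image matching runs. The sketch below illustrates that reading; the flat-dictionary object schema, the planar distance metric, the 50-unit default radius, and the exact-feature match are all hypothetical stand-ins for whatever representation the device actually uses.

```python
from math import hypot

def limit_search(object_db, current_location, radius):
    """Restrict the object-identification search to stored objects whose
    recorded position lies within `radius` of the device's current location
    (planar coordinates assumed for simplicity)."""
    cx, cy = current_location
    return [o for o in object_db if hypot(o["x"] - cx, o["y"] - cy) <= radius]

def recognize(image_features, object_db, current_location, radius=50.0):
    """Match image features only against the location-limited candidate set.
    The exact-equality match stands in for a real feature-matching step."""
    for obj in limit_search(object_db, current_location, radius):
        if obj["features"] == image_features:
            return obj["label"]
    return None
```

Limiting the candidate set before matching is what makes the search tractable on a wearable: two visually identical objects in different places resolve to different labels depending on where the device currently is.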
9. A method for providing continuous social and environmental awareness by a wearable computing device having an eyeglass form, comprising:
detecting, via an IMU, inertial measurement data corresponding to a positioning, velocity, or acceleration of the wearable computing device;
detecting, via a GPS sensor, global positioning data corresponding to a global position of the wearable computing device;
detecting, via a camera, image data corresponding to a surrounding environment of the wearable computing device and a moving object or person in the surrounding environment;
storing, in a memory, object data corresponding to previously determined objects, previously determined user data regarding a user, and a preferred distance between the wearable computing device and the moving object or person;
determining, by a processor, a current location of the wearable computing device based on at least one of the inertial measurement data, the global positioning data, or the image data;
recognizing, by the processor, an object in the surrounding environment by limiting an object identification search based on the current location of the wearable computing device and by analyzing the image data based on the stored object data and the limited object identification search;
determining, by the processor:
an event or action to be performed based on the recognized object, the previously determined user data, and a current time or day,
a destination based on the determined desirable event or action,
a navigation path from the current location of the wearable computing device to the destination based on the determined destination, the image data, and at least one of the inertial measurement data or the global positioning data,
a current distance between the wearable computing device and the moving object or person,
that a current speed of the wearable computing device should increase when the current distance between the wearable computing device and the moving object or person is greater than the preferred distance, and
that the current speed of the wearable computing device should decrease when the current distance between the wearable computing device and the moving object or person is less than the preferred distance;
providing, via a speaker or a vibration unit, audio or haptic information to the user based on at least one of the recognized object, the determined event or action to be performed, or the navigation path; and
providing, via the speaker or the vibration unit, additional audio or haptic information to the user to increase a current walking speed when the processor determines that the current speed of the wearable computing device should increase, and to decrease the current walking speed when the processor determines that the current speed of the wearable computing device should decrease.
- View Dependent Claims (10, 11, 12, 13, 14)
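The "determining, by the processor" step in claim 9 ties a recognized object and a current time of day to an event and a destination drawn from the previously determined user data. One way to picture that user data is as a small schedule of entries, each naming a trigger object, a time window, and a destination; that schema, and the helper below, are illustrative assumptions only.

```python
from datetime import time

def determine_event(recognized_object, user_schedule, now):
    """Pick an event or action from the previously determined user data that
    matches both the recognized object and the current time of day.
    Hypothetical entry schema: trigger object, [start, end] window,
    event name, destination."""
    for entry in user_schedule:
        if entry["trigger"] == recognized_object and entry["start"] <= now <= entry["end"]:
            return entry["event"], entry["destination"]
    return None, None  # no scheduled event matches this object and time
```

The returned destination would then feed the navigation-path determination in the next step of the method, combining image data with the IMU or GPS readings.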
15. A wearable computing device having an eyeglass form to be worn by a user, comprising:
a body having a frame, a right lens, and a left lens;
an inertial measurement unit (IMU) attached to the body and configured to detect inertial measurement data corresponding to a positioning, velocity, or acceleration of the body;
a global positioning system (GPS) sensor attached to the body and configured to detect global positioning data corresponding to a global position of the body;
at least one camera positioned on at least one of the right lens or the left lens, attached to the body, and configured to detect image data corresponding to a surrounding environment of the body and a moving object or person in the surrounding environment;
a memory attached to the body and configured to store object data regarding previously determined objects, previously determined user data associated with the user, and a preferred distance between the body and the moving object or person;
an antenna attached to the body and configured to transmit the image data, the inertial measurement data, the global positioning data, and the object data to a remote processor and to receive processed data from the remote processor, the remote processor configured to:
determine a current location of the body based on at least one of the inertial measurement data, the global positioning data, or the image data,
recognize an object in the surrounding environment by limiting an object identification search based on the current location of the body and by analyzing the image data based on the stored object data and the limited object identification search,
determine an event or action to be performed based on the recognized object, the previously determined user data, and a current time or day,
determine a destination based on the determined desirable event or action,
determine a navigation path from the current location of the body to the destination based on the determined destination, the image data, and at least one of the inertial measurement data or the global positioning data,
determine a current distance between the body and the moving object or person,
determine that a current speed of the body should increase when the current distance between the body and the moving object or person is greater than the preferred distance, and
determine that the current speed of the body should decrease when the current distance between the body and the moving object or person is less than the preferred distance; and
a speaker configured to:
provide audio information to the user based on at least one of the recognized object, the determined desirable event or action, or the navigation path,
provide audio information to the user to increase a current walking speed when the remote processor determines that the current speed of the body should increase, and
provide audio information to the user to decrease the current walking speed when the remote processor determines that the current speed of the body should decrease.
- View Dependent Claims (16, 17, 18, 19, 20)
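Claim 15 differs from claim 1 chiefly in offloading the determinations to a remote processor over an antenna, which implies serializing the four claimed data sets for transmission and decoding the processed guidance that comes back. The sketch below shows one such round-trip framing; the JSON wire format and both function names are illustrative choices, since the claim only requires that the data be transmitted and processed data received.

```python
import json

def package_sensor_frame(imu, gps, image_data, object_data):
    """Serialize one frame of the four claimed data sets (image, inertial,
    global positioning, and object data) for transmission via the antenna.
    JSON-over-bytes is an assumed wire format, not the patent's."""
    return json.dumps({"imu": imu, "gps": gps,
                       "image": image_data, "objects": object_data}).encode()

def unpack_processed_data(payload):
    """Decode the processed data (e.g. a navigation cue or speed guidance)
    returned by the remote processor."""
    return json.loads(payload.decode())
```

Splitting the work this way keeps the eyewear's on-body electronics light: the glasses only sense, transmit, and speak, while recognition and navigation run remotely.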
Specification