WEARABLE EARPIECE FOR PROVIDING SOCIAL AND ENVIRONMENTAL AWARENESS
First Claim
1. An intelligent earpiece to be worn over an ear of a user, comprising:
- an inertial measurement unit (IMU) coupled to the intelligent earpiece and configured to detect inertial measurement data corresponding to a positioning, velocity, or acceleration of the intelligent earpiece;
- a global positioning system (GPS) unit coupled to the intelligent earpiece and configured to detect location data corresponding to a location of the intelligent earpiece;
- at least one camera coupled to the intelligent earpiece and configured to detect image data corresponding to a surrounding environment of the intelligent earpiece;
- a memory configured to store object data regarding previously determined objects and previously determined user data associated with the user;
- a processor connected to the IMU, the GPS unit, and the at least one camera and configured to:
  - recognize an object in the surrounding environment by analyzing the image data based on the stored object data and at least one of the inertial measurement data or the location data,
  - determine a desirable event or action based on the recognized object, the previously determined user data, and a current time or day,
  - determine a destination based on the determined desirable event or action,
  - determine a navigation path for navigating the intelligent earpiece to the destination based on the determined destination, the image data, and at least one of the inertial measurement data or the location data, and
  - determine output data based on the determined navigation path; and
- a speaker configured to provide audio information to the user based on at least one of the recognized object, the determined desirable event or action, or the navigation path.
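The claim describes a staged pipeline: recognize an object from image data (aided by IMU or GPS data), map it to a desirable event or action using stored user data and the current time, derive a destination, and plan a navigation path. The patent does not disclose an implementation; the sketch below is a minimal toy version in which all names, data structures, and matching rules are illustrative assumptions.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class SensorFrame:
    image_labels: list   # stand-in for image data, assumed pre-labeled
    location: tuple      # (lat, lon) reported by the GPS unit
    acceleration: float  # magnitude reported by the IMU

# Stand-ins for the claimed "stored object data" and "previously determined user data".
OBJECT_DATA = {"bus_stop", "coffee_shop", "crosswalk"}
USER_DATA = {"morning_routine": "coffee_shop"}

def recognize_object(frame):
    """Claimed step: match image data against stored object data."""
    for label in frame.image_labels:
        if label in OBJECT_DATA:
            return label
    return None

def desirable_action(obj, now):
    """Claimed step: combine recognized object, user data, and time of day."""
    if obj == USER_DATA["morning_routine"] and now.hour < 11:
        return obj
    return None

def navigation_path(destination, frame):
    """Claimed step (toy version): the real device would fuse image, IMU, and GPS data."""
    return [frame.location, destination]

frame = SensorFrame(["tree", "coffee_shop"], (35.0, -120.0), 0.1)
obj = recognize_object(frame)
action = desirable_action(obj, datetime(2024, 1, 1, 9, 0))
path = navigation_path(action, frame) if action else None
```

The point of the staging is that each determination feeds the next, exactly as the claim chains "based on" clauses: object → event/action → destination → path → output data.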
Abstract
An intelligent earpiece to be worn over an ear of a user is described. The earpiece includes a processor connected to an inertial measurement unit (IMU), a GPS unit, and at least one camera. The processor can recognize an object in the surrounding environment by analyzing the image data based on stored object data and at least one of the inertial measurement data or the location data. The processor can determine a desirable event or action based on the recognized object, previously determined user data, and a current time or day. The processor can determine a destination based on the determined desirable event or action. The processor can determine a navigation path for navigating the earpiece to the destination based on the determined destination, the image data, and at least one of the inertial measurement data or the location data. The processor can determine output data based on the determined navigation path.
151 Citations
20 Claims
1. An intelligent earpiece to be worn over an ear of a user, comprising:
- an inertial measurement unit (IMU) coupled to the intelligent earpiece and configured to detect inertial measurement data corresponding to a positioning, velocity, or acceleration of the intelligent earpiece;
- a global positioning system (GPS) unit coupled to the intelligent earpiece and configured to detect location data corresponding to a location of the intelligent earpiece;
- at least one camera coupled to the intelligent earpiece and configured to detect image data corresponding to a surrounding environment of the intelligent earpiece;
- a memory configured to store object data regarding previously determined objects and previously determined user data associated with the user;
- a processor connected to the IMU, the GPS unit, and the at least one camera and configured to:
  - recognize an object in the surrounding environment by analyzing the image data based on the stored object data and at least one of the inertial measurement data or the location data,
  - determine a desirable event or action based on the recognized object, the previously determined user data, and a current time or day,
  - determine a destination based on the determined desirable event or action,
  - determine a navigation path for navigating the intelligent earpiece to the destination based on the determined destination, the image data, and at least one of the inertial measurement data or the location data, and
  - determine output data based on the determined navigation path; and
- a speaker configured to provide audio information to the user based on at least one of the recognized object, the determined desirable event or action, or the navigation path.
- View Dependent Claims (2, 3, 4, 5, 6, 7, 8)
9. A method for providing continuous social and environmental awareness by an earpiece comprising:
- detecting, via an inertial measurement unit (IMU), a global positioning system (GPS) unit, or a camera, inertial measurement data corresponding to a positioning, velocity, or acceleration of the earpiece, location data corresponding to a location of the earpiece, or image data corresponding to a surrounding environment of the earpiece;
- storing, in a memory, object data regarding previously determined objects and previously determined user data regarding the user;
- recognizing, by a processor, an object in the surrounding environment by analyzing the image data based on the stored object data and at least one of the inertial measurement data or the location data;
- determining, by the processor:
  - a desirable event or action based on the recognized object, the previously determined user data, and a current time or day,
  - a destination based on the determined desirable event or action,
  - a navigation path for navigating the earpiece to the destination based on the determined destination, the image data, and at least one of the inertial measurement data or the location data, and
  - output data based on the determined navigation path; and
- providing, via a speaker or a vibration unit, audio or haptic information to the user based on at least one of the recognized object, the determined desirable event or action, or the navigation path.
- View Dependent Claims (10, 11, 12, 13, 14)
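The method claim's final step allows the output to be delivered as audio (speaker), haptics (vibration unit), or both. A minimal sketch of such a dispatcher is below; the function names and the encoding of a haptic pulse are illustrative assumptions, not anything the patent specifies.

```python
# Hypothetical dispatcher for the claimed "audio or haptic" output step.
# `speaker` and `vibration_unit` are stand-in callables for the hardware drivers.
def provide_output(message, speaker=None, vibration_unit=None):
    """Route output data to whichever output channels are available."""
    delivered = []
    if speaker is not None:
        delivered.append(("audio", speaker(message)))
    if vibration_unit is not None:
        # Toy encoding: pulse count derived from message length.
        delivered.append(("haptic", vibration_unit(len(message))))
    return delivered

spoken = provide_output("turn left", speaker=lambda m: f"say:{m}")
buzzed = provide_output("turn left", vibration_unit=lambda n: f"pulse:{n}")
```

Keeping the channels behind a single dispatch point mirrors the claim language, which treats the speaker and vibration unit as interchangeable outlets for the same determined output data.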
15. An intelligent earpiece to be worn over an ear of a user, comprising:
- an inertial measurement unit (IMU) coupled to the intelligent earpiece and configured to detect inertial measurement data corresponding to a positioning, velocity, or acceleration of the intelligent earpiece;
- a global positioning system (GPS) unit coupled to the intelligent earpiece and configured to detect location data corresponding to a location of the intelligent earpiece;
- at least one camera coupled to the intelligent earpiece and configured to detect image data corresponding to a surrounding environment of the intelligent earpiece;
- a memory configured to store object data regarding previously determined objects and previously determined user data associated with the user;
- an antenna configured to transmit the image data, the inertial measurement data, the location data, and the object data to a remote processor and to receive processed data from the remote processor, the remote processor configured to:
  - recognize an object in the surrounding environment by analyzing the image data based on the stored object data and at least one of the inertial measurement data or the location data,
  - determine a desirable event or action based on the recognized object, the previously determined user data, and a current time or day,
  - determine a destination based on the determined desirable event or action,
  - determine a navigation path for navigating the intelligent earpiece to the destination based on the determined destination, the image data, and at least one of the inertial measurement data or the location data, and
  - determine output data based on the determined navigation path; and
- a speaker configured to provide audio information to the user based on at least one of the recognized object, the determined desirable event or action, or the navigation path.
- View Dependent Claims (16, 17, 18, 19, 20)
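Claim 15 differs from claim 1 in architecture: the processing is offloaded, with an antenna transmitting sensor and object data to a remote processor and receiving processed data back. The round trip can be sketched as below; the JSON wire format, function names, and matching rule are illustrative assumptions standing in for an unspecified radio link and remote service.

```python
import json

def transmit(payload):
    """Stand-in for the antenna's uplink: serialize the sensor/object data."""
    return json.dumps(payload)

def remote_processor(wire):
    """Stand-in remote side: recognize objects and compute a toy path."""
    data = json.loads(wire)
    recognized = [x for x in data["image_labels"] if x in set(data["object_data"])]
    path = [data["location"], recognized[0]] if recognized else []
    return json.dumps({"recognized": recognized, "path": path})

def receive(wire):
    """Stand-in for the antenna's downlink: deserialize the processed data."""
    return json.loads(wire)

request = {"image_labels": ["tree", "bus_stop"],
           "object_data": ["bus_stop"],
           "location": "lat/lon"}
result = receive(remote_processor(transmit(request)))
```

The earpiece itself only serializes, transmits, and plays back results, which is the design trade-off this claim captures: compute-heavy recognition and path planning move off the wearable.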
Specification