SMART NECKLACE WITH STEREO VISION AND ONBOARD PROCESSING
Abstract
A wearable neck device includes an IMU coupled to the wearable neck device and adapted to detect inertial measurement data and a GPS coupled to the device and adapted to detect location data. The wearable neck device further includes a camera adapted to detect image data and a memory adapted to store data. The wearable neck device further includes a processor adapted to recognize an object in the surrounding environment by analyzing the data. The processor can determine a desirable action based on the data and a current time or day. The processor can determine a destination based on the determined desirable action. The processor can determine a navigation path based on the determined destination and the data. The processor is further adapted to determine output based on the navigation path. The wearable neck device further includes a speaker adapted to provide audio information to the user.
20 Claims
1. A wearable neck device for providing environmental awareness to a user, comprising:
    a left portion;
    a right portion;
    an inertial measurement unit (IMU) sensor coupled to the wearable neck device and configured to detect inertial measurement data corresponding to a positioning, velocity, or acceleration of the wearable neck device;
    a global positioning system (GPS) unit coupled to the wearable neck device and configured to detect location data corresponding to a location of the wearable neck device;
    at least one camera coupled to the wearable neck device, the at least one camera configured to detect image data corresponding to a surrounding environment of the wearable neck device;
    a memory configured to store object data regarding previously determined objects and previously determined user data associated with the user;
    a processor connected to the IMU, the GPS unit, and the at least one camera and configured to:
        recognize an object in the surrounding environment by analyzing the image data based on the stored object data and at least one of the inertial measurement data or the location data,
        determine a desirable event or action based on the recognized object, the previously determined user data, and a current time or day,
        determine a destination based on the determined desirable event or action,
        determine a navigation path for navigating the wearable neck device to the destination based on the determined destination, the image data, and at least one of the inertial measurement data or the location data, and
        determine output data based on the determined navigation path; and
    a speaker configured to provide audio information to the user based on at least one of the recognized object, determined desirable event or action, or navigation path.
    View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12)
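The processing chain recited in claim 1 (recognize an object, determine a desirable event, determine a destination, determine a navigation path, produce output) can be sketched as a minimal toy pipeline. Every function name, data structure, and sample value below is an illustrative assumption, not taken from the patent.

```python
from dataclasses import dataclass

# Illustrative sketch only: names and data shapes are assumptions,
# not drawn from the patent claims.

@dataclass
class SensorFrame:
    image_data: list        # detected features from the camera(s)
    location: tuple         # (lat, lon) from the GPS unit
    acceleration: tuple     # (ax, ay, az) from the IMU sensor

def recognize_object(frame, stored_object_data):
    # Match image features against previously determined objects in memory.
    for obj in stored_object_data:
        if obj["feature"] in frame.image_data:
            return obj["name"]
    return None

def determine_desirable_event(recognized, user_data, current_hour):
    # Combine the recognized object, stored user data, and time of day.
    for pref in user_data.get("routines", []):
        if pref["trigger"] == recognized and pref["hour"] == current_hour:
            return pref["event"]
    return None

def determine_navigation_path(origin, destination):
    # Placeholder path: a straight segment from origin to destination.
    return [origin, destination]

# Toy usage
frame = SensorFrame(image_data=["bus_stop_sign"],
                    location=(40.0, -75.0),
                    acceleration=(0.0, 0.0, 9.8))
memory_objects = [{"feature": "bus_stop_sign", "name": "bus stop"}]
user_data = {"routines": [{"trigger": "bus stop", "hour": 8,
                           "event": "catch the 8am bus"}]}

obj = recognize_object(frame, memory_objects)
event = determine_desirable_event(obj, user_data, current_hour=8)
path = determine_navigation_path(frame.location, (40.01, -75.0))
print(obj, event, path)
```

A real device would replace each stub with trained recognition models and a route planner; the sketch only shows how the claimed determinations feed one another.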
13. A method for providing continuous social and environmental awareness to a user by a wearable neck device, comprising:
    detecting, via a camera, a GPS unit, or an IMU, inertial measurement data corresponding to a positioning, velocity, or acceleration of the wearable neck device, location data corresponding to a location of the wearable neck device, or image data corresponding to a surrounding environment of the wearable neck device;
    storing, in a memory, object data regarding previously determined objects and previously determined user data regarding the user;
    recognizing, by a processor, an object in the surrounding environment by analyzing the image data based on the stored object data and at least one of the inertial measurement data or the location data;
    determining, by the processor:
        a desirable event or action based on the recognized object, the previously determined user data, and a current time or day,
        a destination based on the determined desirable event or action,
        a navigation path for navigating the wearable neck device to the destination based on the determined destination, the image data, and at least one of the inertial measurement data or the location data, or
        output data based on the determined navigation path; and
    providing, via a speaker or a vibration unit, audio or haptic information to the user based on at least one of the recognized object, the determined desirable event or action, or the navigation path.
    View Dependent Claims (14, 15, 16, 17, 18)
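The final "providing" step of claim 13 routes information to a speaker or a vibration unit, whichever is present. A minimal dispatch sketch is below; the device interfaces (here, plain lists standing in for hardware drivers) are assumptions for illustration.

```python
# Sketch of the claim-13 output step: deliver audio and/or haptic
# information depending on which output hardware is attached.
# The list-based "drivers" are stand-ins, not a real device API.

def provide_feedback(message, vibration_pattern, speaker=None, vibration_unit=None):
    """Deliver audio and/or haptic information via whichever units exist."""
    delivered = []
    if speaker is not None:
        speaker.append(f"say: {message}")         # stand-in for text-to-speech
        delivered.append("audio")
    if vibration_unit is not None:
        vibration_unit.extend(vibration_pattern)  # stand-in for motor pulses
        delivered.append("haptic")
    return delivered

speaker_log, motor_log = [], []
channels = provide_feedback("turn left in 10 meters", [1, 0, 1],
                            speaker=speaker_log, vibration_unit=motor_log)
print(channels)   # ['audio', 'haptic']
```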
19. A wearable neck device for providing environmental awareness to a user, comprising:
    an inertial measurement unit (IMU) sensor coupled to the wearable neck device and configured to detect inertial measurement data corresponding to a positioning, velocity, or acceleration of the wearable neck device;
    a global positioning system (GPS) unit coupled to the wearable neck device and configured to detect location data corresponding to a location of the wearable neck device;
    a wide angle camera coupled to the wearable neck device;
    a pair of stereo cameras coupled to the wearable neck device, the pair of stereo cameras configured to detect depth information regarding a surrounding environment, the wide angle camera and the pair of stereo cameras being configured to detect image data corresponding to the surrounding environment of the wearable neck device;
    a memory configured to store object data regarding previously determined objects and previously determined user data associated with the user;
    a processor connected to the IMU, the GPS unit, the wide angle camera, and the pair of stereo cameras and configured to:
        recognize an object in the surrounding environment by analyzing the image data based on the stored object data and at least one of the inertial measurement data or the location data,
        determine a desirable event or action based on the recognized object, the previously determined user data, and a current time or day,
        determine a destination based on the determined desirable event or action,
        determine a navigation path for navigating the wearable neck device to the destination based on the determined destination, the image data, and at least one of the inertial measurement data or the location data, and
        determine output data based on the determined navigation path;
    a speaker configured to provide audio information to the user based on at least one of the recognized object, determined desirable event or action, or navigation path; and
    at least one vibratory motor configured to provide haptic information to the user based on at least one of the recognized object, determined desirable event or action, or navigation path.
    View Dependent Claims (20)
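Claim 19's pair of stereo cameras is recited as detecting depth information. For a calibrated stereo pair, depth is conventionally recovered from pixel disparity as depth = f * B / d, where f is the focal length in pixels, B the baseline between the cameras, and d the disparity. The sketch below uses illustrative numbers; the patent does not specify camera parameters.

```python
# Standard stereo depth-from-disparity relation, depth = f * B / d.
# Focal length, baseline, and disparity values here are illustrative
# assumptions, not figures from the patent.

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# e.g. 700 px focal length, 6 cm baseline, 30 px measured disparity
z = depth_from_disparity(700.0, 0.06, 30.0)
print(f"{z:.2f} m")   # 1.40 m
```

Larger baselines or focal lengths improve depth resolution at range, which is one reason the claim pairs the stereo cameras with a separate wide angle camera for coverage.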
Specification