Wearable clip for providing social and environmental awareness
Abstract
A clip includes an IMU coupled to the clip and adapted to detect inertial measurement data, and a GPS unit coupled to the clip and adapted to detect location data. The clip further includes a camera adapted to detect image data and a memory adapted to store data. The clip further includes a processor adapted to recognize an object in the surrounding environment by analyzing the data. The processor can determine a desirable action based on the data and a current time or day. The processor can determine a destination based on the determined desirable action. The processor can determine a navigation path based on the determined destination and the data. The processor is further adapted to determine output data based on the navigation path. The clip further includes a speaker adapted to provide audio information to the user.
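The "desirable action" step in the abstract, which combines a recognized object, previously stored user data, and the current time or day, could be sketched as a simple lookup. The habit records and their keys below are illustrative assumptions; the patent does not specify any data format.

```python
def determine_desirable_action(recognized_object, user_habits, weekday, hour):
    """Match a recognized object against the user's stored routine for the
    current day of the week and hour (e.g. a bus stop at commute time).
    Returns (action, destination) or None if nothing in the routine matches."""
    for habit in user_habits:
        if (habit["object"] == recognized_object
                and habit["weekday"] == weekday
                and habit["hour"] == hour):
            return habit["action"], habit["destination"]
    return None  # nothing in the user's routine matches right now
```

For example, a stored habit of commuting from a bus stop on Monday mornings would only fire when both the recognized object and the time match.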
20 Claims
1. An intelligent clip to be worn by a user, comprising:
a housing having a front portion, a back portion, a channel positioned on the back portion and configured to receive a connection to a fastening device, a top portion, and a bottom portion; and
one or more components encased within the housing and including:
    an inertial measurement unit (IMU) sensor configured to detect inertial measurement data corresponding to a positioning, a speed, a direction of travel, or an acceleration of the intelligent clip,
    at least one camera including a first camera positioned on the front portion, the first camera being configured to detect image data corresponding to a surrounding environment,
    a memory configured to store object data regarding previously determined objects and previously determined user data associated with the user,
    a processor connected to the IMU sensor and the at least one camera and configured to:
        recognize an object in the surrounding environment based on the detected image data and the stored object data,
        determine a desirable event or action based on the recognized object, the previously determined user data, and a current time or day,
        determine a destination based on the determined desirable event or action,
        determine a plurality of navigation paths based on the determined destination,
        filter the plurality of navigation paths based on the inertial measurement data including the speed and the direction of travel to determine a navigation path of the plurality of navigation paths for the user to travel, and
        determine output data based on the determined navigation path; and
a speaker or a vibration unit configured to provide the output data to the user.

Dependent claims: 2, 3, 4, 5, 6, 7, 8, 9.
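The filtering clause of claim 1, selecting one navigation path from several using the IMU's speed and direction of travel, could be sketched as follows. The path records with `initial_bearing_deg` and `length_m` keys, and the 0.5 m/s and 90-degree thresholds, are illustrative assumptions rather than anything the claim specifies.

```python
def filter_paths(paths, speed_mps, heading_deg, max_turn_deg=90.0):
    """Keep candidate paths whose initial bearing lies roughly ahead of a
    moving user, then pick the shortest of what remains."""
    def turn(a, b):
        # smallest absolute angle between two compass bearings, in degrees
        d = abs(a - b) % 360.0
        return min(d, 360.0 - d)

    # A user already moving is assumed to prefer a path roughly ahead;
    # a near-stationary user (below ~0.5 m/s) can start in any direction.
    if speed_mps > 0.5:
        candidates = [p for p in paths
                      if turn(p["initial_bearing_deg"], heading_deg) <= max_turn_deg]
    else:
        candidates = list(paths)
    # Fall back to all paths rather than returning nothing, then pick shortest.
    return min(candidates or paths, key=lambda p: p["length_m"])
```

Note the wraparound handling in `turn`: a path starting at bearing 350 degrees is only 20 degrees away from a heading of 10 degrees, not 340.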
10. A method for providing continuous social and environmental awareness by an intelligent clip comprising:
detecting, via a camera or an inertial measurement unit (IMU) sensor, inertial measurement data corresponding to a positioning, a speed, a direction of travel, or an acceleration of the intelligent clip, or image data corresponding to a surrounding environment;
storing, in a memory, object data regarding previously determined objects and previously determined user data regarding a user;
recognizing, by a processor, an object in the surrounding environment based on the detected image data, the stored object data, and the inertial measurement data including the speed and the direction of travel of the intelligent clip;
determining, by the processor, a desirable event or action based on the recognized object, the previously determined user data, and a current time or day;
determining, by the processor, a destination based on the determined desirable event or action;
determining, by the processor, a plurality of navigation paths based on the determined destination;
filtering, by the processor, the plurality of navigation paths based on the inertial measurement data including the speed and the direction of travel to determine a navigation path of the plurality of navigation paths for the user to travel;
determining, by the processor, output data based on the determined navigation path; and
providing, via a speaker or a vibration unit, the output data to the user.

Dependent claims: 11, 12, 13, 14, 15.
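The method of claim 10 could be sketched end to end, with every step stubbed or simplified. The dictionary shapes, the lookup standing in for image recognition, and the speed and angle thresholds are all assumptions for illustration, not the patent's implementation.

```python
def provide_awareness(image_key, imu, object_db, user_habits, hour, path_db):
    """Walk the claimed method: recognize -> desirable action -> destination
    -> candidate paths -> filter by IMU data -> output data for the user."""
    # Recognize: a lookup stands in for matching detected image data
    # against stored object data (real recognition would use vision).
    obj = object_db.get(image_key)
    if obj is None:
        return None
    # Desirable event or action from the recognized object, the stored
    # user data, and the current time.
    habit = next((h for h in user_habits
                  if h["object"] == obj and h["hour"] == hour), None)
    if habit is None:
        return None
    destination = habit["destination"]

    # Candidate paths to the destination, filtered by IMU speed and
    # direction of travel: a moving user keeps paths roughly ahead.
    def ahead(p):
        signed = (p["bearing_deg"] - imu["heading_deg"] + 180.0) % 360.0 - 180.0
        return abs(signed) <= 90.0

    paths = path_db[destination]
    candidates = [p for p in paths if imu["speed_mps"] < 0.5 or ahead(p)]
    best = min(candidates or paths, key=lambda p: p["length_m"])
    # Output data, here a phrase for the speaker (a vibration unit would
    # instead receive an encoded cue).
    return f"{habit['action']}: {best['length_m']} m toward {destination}"
```

A single call then exercises the whole chain, which is why the claim ties the filtering step to the same IMU data used for recognition.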
16. An intelligent clip to be worn by a user, comprising:
a front;
a back;
a first side;
a second side;
an inertial measurement unit (IMU) sensor configured to detect inertial measurement data corresponding to a positioning, a speed, a direction of travel, or an acceleration of the intelligent clip;
a first camera positioned on the front of the intelligent clip and configured to detect a first image data corresponding to a surrounding environment;
a second camera positioned on the first side or the second side and configured to detect a second image data corresponding to the surrounding environment;
a memory configured to store object data regarding previously determined objects and previously determined user data associated with the user;
a processor connected to the IMU sensor, the first camera, and the second camera and configured to:
    recognize an object in the surrounding environment based on the first image data, the second image data, the stored object data, and the inertial measurement data including the speed and the direction of travel of the intelligent clip,
    determine a desirable event or action based on the recognized object, the previously determined user data, and a current time or day,
    determine a destination based on the determined desirable event or action,
    determine a plurality of navigation paths based on the determined destination,
    filter the plurality of navigation paths based on the inertial measurement data including the speed and the direction of travel to determine a navigation path of the plurality of navigation paths for the user to travel, and
    determine output data based on the determined navigation path; and
a speaker or a vibration unit configured to provide the output data to the user.

Dependent claims: 17, 18, 19, 20.