System and method for contextualized vehicle operation determination
First Claim
1. A method for determining event data comprising:
sampling a first image stream within a first time window at an interior-facing camera of an onboard vehicle system, wherein the onboard vehicle system is integrated into a mountable unit coupled to a vehicle at a single interior location;
extracting interior activity data, comprising a driver gaze direction, from the first image stream;
sampling a second image stream within a second time window at an exterior-facing camera of the onboard vehicle system, wherein the first and second time windows are coextensive;
extracting exterior activity data from the second image stream;
determining an interior event based on the interior activity data, comprising mapping the driver gaze direction relative to a first region of the second image stream based on a relative orientation between the interior-facing camera and the exterior-facing camera of the onboard vehicle system;
determining an exterior event based on the exterior activity data, comprising determining that a distance between the vehicle and an object depicted in the first region of the second image stream has fallen below a threshold distance;
correlating the interior event with the exterior event to generate combined event data, comprising determining that the driver gaze direction overlaps with the first region of the second image stream at a time point within the first time window;
automatically classifying the combined event data to generate an event label;
automatically labeling the first image stream within the first time window and the second image stream within the second time window with the event label to generate labeled training data;
transmitting the labeled training data to a remote computing system; and
aggregating the labeled training data at the remote computing system with a corpus of labeled training data, wherein the corpus of labeled training data is received from a plurality of onboard vehicle systems operating in a plurality of vehicles.
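The gaze-mapping and correlation steps recited above can be sketched in code. This is a minimal illustration only: it assumes a pinhole camera model, a yaw-only gaze direction, and a fixed horizontal offset between the two cameras of the mounted unit. All function names, region representations, threshold values, and event labels are hypothetical; the claim does not prescribe any of them.

```python
import math
from dataclasses import dataclass


@dataclass
class Region:
    """Horizontal bounds (pixels) of a region in the exterior image stream."""
    x_min: float
    x_max: float


def gaze_to_exterior_x(gaze_yaw_deg, cam_offset_deg, image_width_px, hfov_deg):
    """Map a driver gaze yaw (interior-camera frame) to a horizontal pixel
    coordinate in the exterior image, using the fixed relative orientation
    between the interior-facing and exterior-facing cameras."""
    exterior_yaw = gaze_yaw_deg - cam_offset_deg  # rotate into exterior frame
    # Pinhole-style projection: yaw angle -> horizontal pixel position.
    half_width = image_width_px / 2
    return half_width + half_width * (
        math.tan(math.radians(exterior_yaw)) / math.tan(math.radians(hfov_deg / 2))
    )


def correlate(gaze_x, object_region, object_distance_m, threshold_m):
    """Combined event: the gaze overlaps the object's region while the
    distance to the object has fallen below the threshold distance."""
    gaze_overlap = object_region.x_min <= gaze_x <= object_region.x_max
    too_close = object_distance_m < threshold_m
    if gaze_overlap and too_close:
        return "attentive-near-collision"
    if too_close:
        return "distracted-near-collision"
    return None
```

With a 1280-pixel-wide exterior image and aligned cameras, a straight-ahead gaze (yaw 0) projects to the image center (pixel 640), so the correlation step reduces to an interval-overlap test plus a distance comparison.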
Abstract
A method for determining event data including: sampling a first data stream within a first time window at a first sensor of an onboard vehicle system coupled to a vehicle; extracting interior activity data from the first data stream; determining an interior event based on the interior activity data; sampling a second data stream within a second time window at a second sensor of the onboard vehicle system; extracting exterior activity data from the second data stream; determining an exterior event based on the exterior activity data; correlating the exterior event and the interior event to generate combined event data; automatically classifying the combined event data to generate an event label; and automatically labeling the first time window of the first data stream and the second time window of the second data stream with the event label to generate labeled event data.
18 Claims
1. (Recited in full above as the First Claim.) Dependent claims: 2-7.
8. A method for determining event data comprising:
sampling a first data stream within a first time window at a first sensor of an onboard vehicle system coupled to a vehicle;
extracting interior activity data, comprising a driver gaze direction, from the first data stream;
sampling a second data stream within a second time window at a second sensor of the onboard vehicle system;
extracting exterior activity data from the second data stream;
determining an interior event based on the interior activity data, comprising mapping the driver gaze direction relative to a first region of the second data stream based on a relative orientation between the first sensor and the second sensor of the onboard vehicle system;
determining an exterior event based on the exterior activity data, comprising determining that a distance between the vehicle and an object depicted in the first region of the second data stream has fallen below a threshold distance;
correlating the exterior event and the interior event to generate combined event data, comprising determining that the driver gaze direction overlaps with the first region of the second data stream at a time point within the first time window;
automatically classifying the combined event data to generate an event label;
automatically labeling the first time window of the first data stream and the second time window of the second data stream with the event label to generate labeled event data; and
transmitting the labeled event data to a remote computing system.
Dependent claims: 9-18.
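The final steps of the claimed methods, labeling the coextensive time windows of both streams and aggregating the result at a remote computing system, can be sketched as follows. The data model here (`LabeledEvent`, `RemoteCorpus`, timestamped sample pairs) is entirely hypothetical; the claims do not prescribe a representation for the labeled training data.

```python
from dataclasses import dataclass


@dataclass
class LabeledEvent:
    """One labeled event: the event label plus the samples of both
    data streams that fall inside the shared time window."""
    label: str
    t_start: float
    t_end: float
    interior_samples: list  # (timestamp, sample) pairs from the first stream
    exterior_samples: list  # (timestamp, sample) pairs from the second stream


def label_windows(interior_stream, exterior_stream, t_start, t_end, label):
    """Apply the event label to the portions of both data streams that
    fall within the coextensive first and second time windows."""
    in_window = lambda stream: [(t, s) for t, s in stream if t_start <= t <= t_end]
    return LabeledEvent(label, t_start, t_end,
                        in_window(interior_stream), in_window(exterior_stream))


class RemoteCorpus:
    """Stand-in for the remote computing system that aggregates labeled
    training data received from a plurality of onboard vehicle systems."""

    def __init__(self):
        self.events = []  # (vehicle_id, LabeledEvent) pairs

    def ingest(self, vehicle_id, event):
        self.events.append((vehicle_id, event))
```

In use, each onboard system would call `label_windows` on its two streams and transmit the result, and the remote system would `ingest` events from many vehicles into one corpus.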
Specification