Methods for determining manufacturing waste to optimize productivity and devices thereof
First Claim
1. A method for automatically processing and analyzing video data to facilitate more effective and efficient identification of waste subtasks included therein, the method implemented by one or more productivity assessment computing devices and comprising:
obtaining video data captured by one or more video capture devices, the video data comprising a plurality of frames of a video, and automatically analyzing the video data to identify a plurality of entities present in each of the frames and a type of each of the entities based on a correlation of the video data with a plurality of stored classifiers;
plotting movement of each of the entities across at least a subset of the frames to determine a trajectory of each of the entities based on a stored digital floor plan, wherein the subset of the frames comprises consecutive ones of the frames;
generating entity adjacency data for each of the entities based on a position of the entities in each of the frames and identifying a plurality of interactions of one or more of the entities in each of the frames based on the entity adjacency data;
generating a unique sequence encoding for each of a plurality of subtasks performed by each of the entities, wherein the subtasks are identified based on the interactions and a sequence of at least a subset of the subtasks is associated with at least one task;
classifying at least one of the subset of the subtasks as a waste subtask of the task based on one or more of the interactions corresponding to the at least one of the subset of the subtasks and the determined trajectory and the type of each of the entities associated with the one or more of the interactions;
correlating the sequence encodings of the at least one of the subset of the subtasks with a number of frames per second of the video data to determine at least one waste duration value for the task; and
outputting an indication of the at least one of the subset of the subtasks classified as a waste subtask and the waste duration value.
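The adjacency element recited above can be sketched in code. The following is a minimal illustration, not the claimed implementation: the `Entity` class, its field names, and the `ADJACENCY_THRESHOLD` value are assumptions introduced here, and "interaction" is approximated as two entities falling within a distance threshold on the floor plan in a given frame.

```python
from dataclasses import dataclass
from itertools import combinations


@dataclass(frozen=True)
class Entity:
    """Hypothetical representation of one detected entity in a frame."""
    entity_id: str
    entity_type: str  # e.g. "worker", "machine", "part"
    x: float          # position on the stored digital floor plan
    y: float


# Illustrative threshold, in floor-plan units, within which two
# entities are treated as adjacent (and thus potentially interacting).
ADJACENCY_THRESHOLD = 2.0


def adjacency_pairs(entities):
    """Return ID pairs of entities whose frame positions are adjacent."""
    pairs = []
    for a, b in combinations(entities, 2):
        distance = ((a.x - b.x) ** 2 + (a.y - b.y) ** 2) ** 0.5
        if distance <= ADJACENCY_THRESHOLD:
            pairs.append((a.entity_id, b.entity_id))
    return pairs
```

Running `adjacency_pairs` per frame yields the per-frame entity adjacency data from which interactions would be identified.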
Abstract
A method, non-transitory computer readable medium, and productivity assessment computing device are disclosed that identify entities present in frames of a video. Entity movement across the frames is plotted to obtain a trajectory of the entities. Interactions of one or more of the entities in each of the frames are identified. A unique sequence encoding is generated for subtasks performed by each of the entities. One of the subtasks is classified as a waste subtask based on one or more of the interactions corresponding to the one of the subtasks and the trajectory and a type of each of the entities associated with the interactions. The sequence encodings of the one of the subtasks are correlated with a number of frames per second of the video to determine waste duration value(s) for a task, and the waste duration value(s) are output.
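The correlation of sequence encodings with the video frame rate reduces, in the simplest case, to converting a subtask's frame span into seconds. This is a minimal sketch under that assumption; the function and parameter names are hypothetical, not terms from the disclosure.

```python
def waste_duration_seconds(subtask_frames, fps):
    """Convert a waste subtask's frame span into a duration in seconds.

    subtask_frames: frame indices over which the waste subtask occurs.
    fps: frames per second of the source video.
    """
    if not subtask_frames:
        return 0.0
    return len(subtask_frames) / fps
```

For example, a waste subtask spanning 90 frames of 30 fps video corresponds to a 3-second waste duration value.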
21 Claims
1. A method for automatically processing and analyzing video data to facilitate more effective and efficient identification of waste subtasks included therein, the method implemented by one or more productivity assessment computing devices and comprising:
obtaining video data captured by one or more video capture devices, the video data comprising a plurality of frames of a video, and automatically analyzing the video data to identify a plurality of entities present in each of the frames and a type of each of the entities based on a correlation of the video data with a plurality of stored classifiers;
plotting movement of each of the entities across at least a subset of the frames to determine a trajectory of each of the entities based on a stored digital floor plan, wherein the subset of the frames comprises consecutive ones of the frames;
generating entity adjacency data for each of the entities based on a position of the entities in each of the frames and identifying a plurality of interactions of one or more of the entities in each of the frames based on the entity adjacency data;
generating a unique sequence encoding for each of a plurality of subtasks performed by each of the entities, wherein the subtasks are identified based on the interactions and a sequence of at least a subset of the subtasks is associated with at least one task;
classifying at least one of the subset of the subtasks as a waste subtask of the task based on one or more of the interactions corresponding to the at least one of the subset of the subtasks and the determined trajectory and the type of each of the entities associated with the one or more of the interactions;
correlating the sequence encodings of the at least one of the subset of the subtasks with a number of frames per second of the video data to determine at least one waste duration value for the task; and
outputting an indication of the at least one of the subset of the subtasks classified as a waste subtask and the waste duration value.
- View Dependent Claims (2, 3, 4, 5, 6, 7)
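The unique sequence encoding recited above can be illustrated by serializing each subtask's ordered interactions into a string. The tuple shape `(actor_type, action, target_type)` and the delimiter scheme are assumptions made here for illustration; the patent does not specify an encoding format.

```python
def encode_subtask(interactions):
    """Encode a subtask's ordered interactions as a single string.

    interactions: list of (actor_type, action, target_type) tuples,
    in the order they occur within the subtask.
    """
    return "|".join("{}-{}-{}".format(*step) for step in interactions)


def encode_task(subtasks):
    """Encode a task as the ordered sequence of its subtask encodings."""
    return [encode_subtask(subtask) for subtask in subtasks]
```

Distinct interaction sequences yield distinct encodings, so matching or classifying subtasks reduces to comparing strings.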
8. A productivity assessment computing device comprising memory comprising programmed instructions stored thereon and at least one processor coupled to the memory, which is configured to be capable of executing the stored programmed instructions to:
obtain video data captured by one or more video capture devices, the video data comprising a plurality of frames, and automatically analyze the video data to identify a plurality of entities present in each of the frames and a type of each of the entities based on a correlation of the video data with a plurality of stored classifiers;
plot movement of each of the entities across at least a subset of the frames to determine a trajectory of each of the entities based on a stored digital floor plan, wherein the subset of the frames comprises consecutive ones of the frames;
generate entity adjacency data for each of the entities based on a position of the entities in each of the frames and identify a plurality of interactions of one or more of the entities in each of the frames based on the entity adjacency data;
generate a unique sequence encoding for each of a plurality of subtasks performed by each of the entities, wherein the subtasks are identified based on the interactions and a sequence of at least a subset of the subtasks is associated with at least one task;
classify at least one of the subset of the subtasks as a waste subtask of the task based on one or more of the interactions corresponding to the at least one of the subset of the subtasks and the determined trajectory and the type of each of the entities associated with the one or more of the interactions;
correlate the sequence encodings of the at least one of the subset of the subtasks with a number of frames per second of the video data to determine at least one waste duration value for the task; and
output an indication of the at least one of the subset of the subtasks classified as a waste subtask and the waste duration value.
- View Dependent Claims (9, 10, 11, 12, 13, 14)
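The movement-plotting element recited above can also be sketched briefly. This illustration abstracts the claimed mapping onto a stored digital floor plan to plain `(x, y)` coordinates; the per-frame position dictionaries and both function names are assumptions, not terms from the claims.

```python
def plot_trajectory(frames, entity_id):
    """Collect an entity's (frame_index, x, y) points over consecutive frames.

    frames: list, one entry per consecutive frame, each a dict mapping
    entity_id -> (x, y) floor-plan position for entities present in it.
    """
    trajectory = []
    for i, positions in enumerate(frames):
        if entity_id in positions:
            x, y = positions[entity_id]
            trajectory.append((i, x, y))
    return trajectory


def path_length(trajectory):
    """Total floor-plan distance traveled along a trajectory."""
    total = 0.0
    for (_, x0, y0), (_, x1, y1) in zip(trajectory, trajectory[1:]):
        total += ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    return total
```

A long path length for a subtask that moves no material could, for instance, be one signal for classifying it as transport waste.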
15. A non-transitory computer readable medium having stored thereon instructions for processing and analyzing video data to facilitate more effective and efficient identification of waste subtasks included therein comprising executable code, which when executed by at least one processor, causes the processor to:
obtain video data captured by one or more video capture devices, the video data comprising a plurality of frames, and automatically analyze the video data to identify a plurality of entities present in each of the frames and a type of each of the entities based on a correlation of the video data with a plurality of stored classifiers;
plot movement of each of the entities across at least a subset of the frames to determine a trajectory of each of the entities based on a stored digital floor plan, wherein the subset of the frames comprises consecutive ones of the frames;
generate entity adjacency data for each of the entities based on a position of the entities in each of the frames and identify a plurality of interactions of one or more of the entities in each of the frames based on the entity adjacency data;
generate a unique sequence encoding for each of a plurality of subtasks performed by each of the entities, wherein the subtasks are identified based on the interactions and a sequence of at least a subset of the subtasks is associated with at least one task;
classify at least one of the subset of the subtasks as a waste subtask of the task based on one or more of the interactions corresponding to the at least one of the subset of the subtasks and the determined trajectory and the type of each of the entities associated with the one or more of the interactions;
correlate the sequence encodings of the at least one of the subset of the subtasks with a number of frames per second of the video data to determine at least one waste duration value for the task; and
output an indication of the at least one of the subset of the subtasks classified as a waste subtask and the waste duration value.
- View Dependent Claims (16, 17, 18, 19, 20, 21)
Specification