Sharing preprocessing, computations, and hardware resources between multiple neural networks
First Claim
1. A method for training a Neural-Network (NN), the method comprising:
receiving a plurality of NN training tasks, each training task comprising (i) a respective preprocessing phase that preprocesses data to be provided as input data to the NN, and (ii) a respective computation phase that trains the NN using the preprocessed data; and
executing the plurality of NN training tasks, including:
identifying a commonality between the input data required by computation phases of two or more of the training tasks; and
in response to identifying the commonality, executing one or more preprocessing phases that produce the input data jointly for the two or more training tasks.
Abstract
A method for training a Neural-Network (NN), the method includes receiving a plurality of NN training tasks, each training task including (i) a respective preprocessing phase that preprocesses data to be provided as input data to the NN, and (ii) a respective computation phase that trains the NN using the preprocessed data. The plurality of NN training tasks is executed, including: (a) a commonality is identified between the input data required by computation phases of two or more of the training tasks, and (b) in response to identifying the commonality, one or more preprocessing phases are executed that produce the input data jointly for the two or more training tasks.
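The sharing scheme in the abstract can be sketched in a few lines. The following is a minimal illustrative sketch only, not the patented implementation: each training task is assumed to be a dict with a `preprocess` callable, a hashable `preprocess_key` describing the preprocessing it requires (e.g. dataset id plus transform spec), and a `train` callable; these names and the task representation are assumptions introduced here for illustration. Tasks whose computation phases require the same input data are grouped by key, and each distinct preprocessing phase runs once, feeding all tasks in its group.

```python
from collections import defaultdict

def execute_training_tasks(tasks):
    """Execute NN training tasks, running shared preprocessing jointly.

    Each task is a dict with:
      'preprocess':     callable producing the input data,
      'preprocess_key': hashable description of the required input data,
      'train':          callable that trains the NN on that input data.
    """
    # Identify commonality: group tasks by the input data they require.
    groups = defaultdict(list)
    for task in tasks:
        groups[task['preprocess_key']].append(task)

    # For each group, execute the preprocessing phase once and feed its
    # output jointly to the computation phase of every task in the group.
    for group in groups.values():
        input_data = group[0]['preprocess']()  # runs once per distinct key
        for task in group:
            task['train'](input_data)          # reuses the shared output
```

In this sketch, two tasks that declare the same `preprocess_key` trigger a single preprocessing run, while tasks with distinct keys are preprocessed independently.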
23 Claims
1. A method for training a Neural-Network (NN), the method comprising:

receiving a plurality of NN training tasks, each training task comprising (i) a respective preprocessing phase that preprocesses data to be provided as input data to the NN, and (ii) a respective computation phase that trains the NN using the preprocessed data; and
executing the plurality of NN training tasks, including:
identifying a commonality between the input data required by computation phases of two or more of the training tasks; and
in response to identifying the commonality, executing one or more preprocessing phases that produce the input data jointly for the two or more training tasks.

Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11)
12. A system for training a Neural-Network (NN), the system comprising:

an interface, configured to receive a plurality of NN training tasks, each training task comprising (i) a respective preprocessing phase that preprocesses data to be provided as input data to the NN, and (ii) a respective computation phase that trains the NN using the preprocessed data; and
one or more processors, configured to execute the plurality of NN training tasks, including:
identifying a commonality between the input data required by computation phases of two or more of the training tasks; and
in response to identifying the commonality, executing one or more preprocessing phases that produce the input data jointly for the two or more training tasks.

Dependent Claims (13, 14, 15, 16, 17, 18, 19, 20, 21, 22)
23. A computer software product, the product comprising a tangible non-transitory computer readable medium in which program instructions are stored, which instructions, when read by one or more processors, cause the one or more processors to:

receive a plurality of NN training tasks, each training task comprising (i) a respective preprocessing phase that preprocesses data to be provided as input data to the NN, and (ii) a respective computation phase that trains the NN using the preprocessed data; and
execute the plurality of NN training tasks, including:
identifying a commonality between the input data required by computation phases of two or more of the training tasks; and
in response to identifying the commonality, executing one or more preprocessing phases that produce the input data jointly for the two or more training tasks.
Specification