Adjusting training set combination based on classification accuracy
First Claim
1. An apparatus, comprising:
- memory to store instructions; and
- processing circuitry, coupled with the memory, operable to execute the instructions that, when executed, cause the processing circuitry to:
generate at least one training batch, wherein the at least one training batch includes a plurality of samples associated with one or more classes of a classification model;
train the classification model for a number of iterations using the at least one training batch;
determine an accuracy of each class based on the training; and
adjust a number of the plurality of samples corresponding to the one or more classes in the at least one training batch that meet one or more criteria related to the adjustment based on the determined accuracy of each class.
Abstract
Various embodiments are generally directed to techniques of adjusting the combination of the samples in a training batch or training set. Embodiments include techniques to determine an accuracy for each class of a classification model, for example. Based on the determined accuracies, the combination of the samples in the training batch may be adjusted or modified to improve the training of the classification model.
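The procedure described in the abstract and claim 1 can be sketched in code: build a batch with a per-class sample count, train for a number of iterations, measure per-class accuracy, then adjust the counts for classes meeting a criterion. This is a minimal illustrative sketch only; the function names, the accuracy threshold, and the boost factor are assumptions for illustration and do not reflect the patent's actual implementation.

```python
import random

def build_batch(samples_by_class, counts):
    """Draw the requested number of samples from each class to form
    one training batch (sampling with replacement for simplicity)."""
    batch = []
    for cls, n in counts.items():
        batch.extend(random.choices(samples_by_class[cls], k=n))
    random.shuffle(batch)
    return batch

def adjust_counts(counts, per_class_accuracy, threshold=0.8, boost=1.5):
    """Increase the sample count for each class whose measured accuracy
    falls below the criterion (threshold), then rescale so the total
    batch size stays constant. Threshold and boost are assumed values."""
    total = sum(counts.values())
    raw = {
        cls: n * boost if per_class_accuracy.get(cls, 0.0) < threshold else n
        for cls, n in counts.items()
    }
    scale = total / sum(raw.values())
    return {cls: max(1, round(n * scale)) for cls, n in raw.items()}

# Example: after training, "dog" is under-performing, so its share of
# the next batch grows while the total batch size is preserved.
counts = {"cat": 50, "dog": 50}
per_class_accuracy = {"cat": 0.95, "dog": 0.60}
new_counts = adjust_counts(counts, per_class_accuracy)  # {"cat": 40, "dog": 60}
```

In this sketch the criterion is a simple accuracy threshold; the claims leave the criteria open-ended, so proportional, rank-based, or loss-based adjustment rules would fit the same loop of train, measure per-class accuracy, and rebalance.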
20 Claims
1. An apparatus, comprising: (recited in full under First Claim above; dependent claims 2, 3, 4, 5, 6, 7, 8, 9, 10)
11. A method comprising:
generating, via one or more processors, at least one training batch, wherein the at least one training batch includes a plurality of samples associated with one or more classes of a classification model;
training, via the one or more processors, the classification model for a number of iterations using the at least one training batch;
determining, via the one or more processors, an accuracy of each class based on the training; and
adjusting, via the one or more processors, a number of the plurality of samples corresponding to the one or more classes in the at least one training batch that meet one or more criteria related to the adjusting based on the determined accuracy of each class.
(Dependent claims 12, 13, 14, 15)
16. A non-transitory computer-readable storage medium storing computer-readable program code executable by a processor to:
generate at least one training batch, wherein the at least one training batch includes a plurality of samples associated with one or more classes of a classification model;
train the classification model for a number of iterations using the at least one training batch;
determine an accuracy of each class based on the training; and
adjust a number of the plurality of samples corresponding to the one or more classes in the at least one training batch that meet one or more criteria related to the adjustment based on the determined accuracy of each class.
(Dependent claims 17, 18, 19, 20)
Specification