Method and System for Evaluation Using Probabilistic Boosting Trees
Abstract
A method and system for evaluating probabilistic boosting trees is disclosed. In an embodiment, input data is received at a graphics processing unit. A weighted empirical distribution associated with each node of the probabilistic boosting tree is determined using a stack implementation. The weighted empirical distribution associated with each node is added to a total posterior distribution value.
Claims
1. A method for training a probabilistic boosting tree, comprising:
- receiving training data at a graphics processing unit (GPU);
- dividing the training data into a first dataset and a second dataset using a classifier;
- training a first sub-tree and a second sub-tree at the GPU, the first sub-tree using the first dataset and the second sub-tree using the second dataset;
- generating a posterior distribution model based on the trained first sub-tree and the trained second sub-tree. (Dependent claims: 2, 3, 4, 5, 6, 8)
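Claim 1's training procedure is recursive: a classifier divides the data into two datasets, and each dataset trains its own sub-tree. The following is a hypothetical CPU sketch of that recursion; the mean-value "classifier", the node fields, and the depth/size stopping rules are illustrative assumptions, not the patented GPU-based, boosting-classifier implementation.

```python
# Hypothetical CPU sketch of claim 1's training recursion. In the
# patent a boosted classifier is trained at each node and training
# runs on a GPU; here a simple mean-value threshold stands in.

class Node:
    def __init__(self):
        self.threshold = None   # stand-in for the node's trained classifier
        self.left = None        # sub-tree trained on the first dataset
        self.right = None       # sub-tree trained on the second dataset
        self.posterior = 0.5    # empirical P(y = 1) at this node

def train_pbt(samples, labels, depth=0, max_depth=3, min_size=2):
    node = Node()
    node.posterior = sum(labels) / len(labels)
    # Stop when the node is deep, small, or pure; the leaf's empirical
    # label distribution becomes its posterior model.
    if depth >= max_depth or len(samples) < min_size or len(set(labels)) < 2:
        return node
    # "Classifier": divide the training data into two datasets.
    node.threshold = sum(samples) / len(samples)
    first = [(x, y) for x, y in zip(samples, labels) if x < node.threshold]
    second = [(x, y) for x, y in zip(samples, labels) if x >= node.threshold]
    if not first or not second:
        node.threshold = None
        return node
    # Train a sub-tree on each of the two datasets.
    node.left = train_pbt([x for x, _ in first], [y for _, y in first],
                          depth + 1, max_depth, min_size)
    node.right = train_pbt([x for x, _ in second], [y for _, y in second],
                           depth + 1, max_depth, min_size)
    return node
```

In the PBT literature the split is typically soft, with ambiguous samples routed to both sub-trees; the hard split above keeps the sketch short.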
7. The method of claim 6, wherein the parallel computing architecture is Compute Unified Device Architecture (CUDA).
9. A method for determining the posterior distribution of a probabilistic boosting tree, comprising:
- receiving input data at a graphics processing unit (GPU);
- determining a weighted empirical distribution associated with each node of the probabilistic boosting tree using a stack implementation;
- adding the weighted empirical distribution associated with each node to a total posterior distribution value. (Dependent claims: 10, 11, 12, 13, 14, 15, 16)
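GPU kernels handle recursion poorly, which is presumably why claim 9 recites a stack implementation: the tree is traversed iteratively, with each stack entry carrying a node together with the probability mass routed to it. A minimal CPU sketch, assuming a sigmoid soft split and a simple node layout (both illustrative, not from the patent):

```python
import math
from collections import namedtuple

# Illustrative node layout: threshold is None at a leaf.
Node = namedtuple("Node", ["threshold", "left", "right", "posterior"])

def evaluate_posterior(root, x):
    total = 0.0                    # total posterior distribution value
    stack = [(root, 1.0)]          # (node, weight routed to this node)
    while stack:
        node, weight = stack.pop()
        if node.threshold is None:
            # Leaf: add this node's weighted empirical distribution
            # to the running total.
            total += weight * node.posterior
            continue
        # Soft split: the node classifier's output (here an assumed
        # sigmoid of the input) weights both sub-trees.
        p = 1.0 / (1.0 + math.exp(-(x - node.threshold)))
        stack.append((node.left, weight * (1.0 - p)))
        stack.append((node.right, weight * p))
    return total
```

Because the weights routed to the two children always sum to the parent's weight, the result is a convex combination of the leaf posteriors.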
17. A method for evaluating a forest of probabilistic boosting trees, comprising:
- receiving input data at a graphics processing unit (GPU);
- evaluating each of the probabilistic boosting trees in the forest using a stack implementation;
- generating a combined posterior distribution based on a posterior distribution of each of the probabilistic boosting trees.
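Claim 17 leaves the combination rule open; averaging the per-tree posteriors, as in a conventional random forest, is one natural choice. The sketch below assumes that rule, together with an illustrative node layout and a hard-split, stack-based per-tree evaluation (all assumptions, not the patented method):

```python
from collections import namedtuple

# Illustrative node layout: threshold is None at a leaf.
Node = namedtuple("Node", ["threshold", "left", "right", "posterior"])

def tree_posterior(root, x):
    # Stack-based evaluation of one tree (hard split for brevity):
    # descend to the child on the input's side of each threshold.
    total, stack = 0.0, [(root, 1.0)]
    while stack:
        node, weight = stack.pop()
        if node.threshold is None:
            total += weight * node.posterior
        elif x < node.threshold:
            stack.append((node.left, weight))
        else:
            stack.append((node.right, weight))
    return total

def forest_posterior(trees, x):
    # Combined posterior distribution: average of the per-tree
    # posteriors (an assumed combination rule; the claim only requires
    # that the per-tree posteriors be combined).
    return sum(tree_posterior(t, x) for t in trees) / len(trees)
```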
18. A system for training a probabilistic boosting tree, comprising:
- means for receiving training data at a graphics processing unit (GPU);
- means for dividing the training data into a first dataset and a second dataset using a classifier;
- means for training a first sub-tree and a second sub-tree at the GPU, the first sub-tree using the first dataset and the second sub-tree using the second dataset;
- means for generating a posterior distribution model based on the trained first sub-tree and the trained second sub-tree. (Dependent claims: 19, 20, 21, 22, 23, 24, 25)
26. A system for determining the posterior distribution of a probabilistic boosting tree, comprising:
- means for receiving input data at a graphics processing unit (GPU);
- means for determining a weighted empirical distribution associated with each node of the probabilistic boosting tree using a stack implementation;
- means for adding the weighted empirical distribution associated with each node to a total posterior distribution value. (Dependent claims: 27, 28, 29, 30, 31, 32, 33)
34. A system for evaluating a forest of probabilistic boosting trees, comprising:
- means for receiving input data at a graphics processing unit (GPU);
- means for evaluating each of the probabilistic boosting trees in the forest using a stack implementation;
- means for generating a combined posterior distribution based on a posterior distribution of each of the probabilistic boosting trees.
35. A non-transitory computer readable medium encoded with computer executable instructions for training a probabilistic boosting tree, the computer executable instructions defining steps comprising:
- receiving training data at a graphics processing unit (GPU);
- dividing the training data into a first dataset and a second dataset using a classifier;
- training a first sub-tree and a second sub-tree at the GPU, the first sub-tree using the first dataset and the second sub-tree using the second dataset;
- generating a posterior distribution model based on the trained first sub-tree and the trained second sub-tree. (Dependent claims: 36, 37, 38, 39, 40, 41, 42)
43. A non-transitory computer readable medium encoded with computer executable instructions for determining the posterior distribution of a probabilistic boosting tree, the computer executable instructions defining steps comprising:
- receiving input data at a graphics processing unit (GPU);
- determining a weighted empirical distribution associated with each node of the probabilistic boosting tree using a stack implementation;
- adding the weighted empirical distribution associated with each node to a total posterior distribution value. (Dependent claims: 44, 45, 46, 47, 48, 49, 50)
51. A non-transitory computer readable medium encoded with computer executable instructions for evaluating a forest of probabilistic boosting trees, the computer executable instructions defining steps comprising:
- receiving input data at a graphics processing unit (GPU);
- evaluating each of the probabilistic boosting trees in the forest using a stack implementation;
- generating a combined posterior distribution based on a posterior distribution of each of the probabilistic boosting trees.
Specification