Nonparametric control chart for the range
First Claim
1. A computer implemented method for detecting or predicting a deviation in variability of at least one parameter being monitored, comprising:
measuring the at least one parameter incrementally to create a dataset having a plurality of datapoints, wherein each datapoint represents an individual measured value for the parameter;
rank-ordering the datapoints within the dataset;
selecting at least one subset of the dataset, wherein the at least one subset is a predetermined number of datapoints including a high measured value and a low measured value defining a range of measured parameter values;
calculating all possible numbers of subsamples of the dataset having the range defined by the at least one subset in accordance with the formula;
where h is a highest measured value within the range, g is a lowest measured value within the range, and n is a number of observations within the subset;
repeating the steps of selecting at least one subset and calculating all possible numbers of subsamples of the dataset having the range defined by the at least one subset, until all possible numbers of ranges have been calculated for all possible subsets having the predetermined number of datapoints within the dataset to define a set of ranges;
rank-ordering the defined set of ranges; and
establishing a control limit for the at least one parameter, the control limit being defined by an upper limit and a lower limit wherein the upper and lower limits are a predetermined percentile of the rank-ordered set of ranges.
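The steps of claim 1 can be illustrated by brute force. The claim's closed-form counting formula is not reproduced in this listing, so the sketch below simply enumerates every subset of the predetermined size, takes each subset's range, rank-orders those ranges, and reads the control limits off chosen percentiles. The function name, the nearest-rank percentile scheme, and the default 5th/95th percentiles are illustrative assumptions, not part of the claim:

```python
import math
from itertools import combinations

def range_control_limits(data, subset_size, lower_pct=5.0, upper_pct=95.0):
    """Brute-force sketch of the claimed steps: rank-order the data,
    compute the range (max - min) of every subset of the given size,
    rank-order the resulting set of ranges, and take the control
    limits as percentiles of that empirical distribution."""
    values = sorted(data)  # rank-ordering the datapoints
    ranges = sorted(max(s) - min(s) for s in combinations(values, subset_size))

    def nearest_rank(p):
        # nearest-rank percentile of the rank-ordered set of ranges
        k = max(1, math.ceil(p / 100 * len(ranges))) - 1
        return ranges[k]

    return nearest_rank(lower_pct), nearest_rank(upper_pct)
```

For five rank-ordered values and subsets of size two there are ten subset ranges, and the lower and upper limits bracket that empirical distribution; in practice the closed-form count replaces the exhaustive enumeration.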
Abstract
A method is provided for detecting or predicting an undesired deviation in the variability of at least one monitored parameter, wherein the parameter is recorded incrementally. The method comprises establishing, for each range (the difference between two datapoints within a dataset), the number of subsets of the dataset having that range, and computing a control chart for the range from those counts. The method accurately detects changes in variability in real time, reflects the true distribution of the data, and achieves this without requiring an inordinate number of computations.
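The counting step the abstract describes — establishing how many subsets share a given range — has a simple closed form when all measured values are distinct: an n-element subset whose minimum and maximum sit at sorted positions i and j must draw its remaining n − 2 points from the j − i − 1 values strictly between them, giving C(j − i − 1, n − 2) subsets per pair. This is a hedged illustration under that distinct-values assumption, not the patent's own formula (which also accounts for repeated values via counts c and d in the dependent claims):

```python
from math import comb

def count_subsets_by_range(values, n):
    """For distinct rank-ordered values, count how many n-element
    subsets realize each achievable range: a subset with minimum at
    sorted position i and maximum at position j contributes
    C(j - i - 1, n - 2) subsets, one per choice of interior points."""
    values = sorted(values)
    counts = {}
    for i in range(len(values)):
        for j in range(i + n - 1, len(values)):
            r = values[j] - values[i]
            counts[r] = counts.get(r, 0) + comb(j - i - 1, n - 2)
    return counts
```

Summing the counts over all ranges recovers C(N, n), the total number of subsets, which avoids the "inordinate number of computations" an exhaustive enumeration would require.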
36 Citations
9 Claims
In the dependent claims, the formula is modified to account for repeated values, where n is the number of observations within the subset, c is the number of times the lowest measured value appears in the subset, and d is the number of times the highest measured value appears in the subset.
Specification