Method and Apparatus for Improving Lag Correction During In Vivo Measurement of Analyte Concentration with Analyte Concentration Variability and Range Data
First Claim
1. A computer-implemented method, comprising:
receiving a signal representative of sensor data from an analyte sensor related to an analyte level measured over time;
computing rates of change of the sensor data for a time period;
computing a rate distribution of the rates of change;
transforming the rate distribution into a linear arrangement;
determining a best-fit line for the transformed rate distribution;
computing a slope of the best-fit line; and
using the slope of the best-fit line as a representation of a variability of the analyte level to adjust an amount of lag correction applied to the sensor data.
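The claim steps above can be sketched in code. This is an illustrative reading only: the claim does not specify what the "linear arrangement" is, so this sketch assumes a normal-probability (Q-Q) transform, under which a normally distributed sample of rates plots as a straight line whose slope approximates the standard deviation. The function names and the scaling scheme in `apply_lag_correction` (including `tau_min` and `slope_ref`) are hypothetical, not taken from the patent.

```python
import statistics
import numpy as np

def variability_slope(glucose, dt_min=1.0):
    """Estimate analyte variability via the claim-1 steps (illustrative)."""
    # Step 1: rates of change of the sensor data over the time period.
    rates = np.diff(np.asarray(glucose, dtype=float)) / dt_min
    # Steps 2-3: the rate distribution, transformed into a linear
    # arrangement -- assumed here to be a normal-probability (Q-Q)
    # transform: sorted rates plotted against standard-normal quantiles.
    sorted_rates = np.sort(rates)
    n = sorted_rates.size
    probs = (np.arange(1, n + 1) - 0.5) / n
    quantiles = np.array([statistics.NormalDist().inv_cdf(p) for p in probs])
    # Steps 4-5: best-fit line through the transformed distribution;
    # for a Q-Q plot the slope approximates the std. dev. of the rates.
    slope, _intercept = np.polyfit(quantiles, sorted_rates, 1)
    return slope

def apply_lag_correction(raw, slope, dt_min=1.0, tau_min=10.0, slope_ref=2.0):
    """Step 6: scale a first-order lag correction by the variability slope.

    The scaling scheme (clamp of slope/slope_ref) is a hypothetical choice.
    """
    scale = min(1.0, slope / slope_ref)
    deriv = np.gradient(np.asarray(raw, dtype=float), dt_min)
    return raw + scale * tau_min * deriv
```

With per-minute samples, a higher slope (more variable rates) yields a larger fraction of the first-order lag correction `tau_min * d(raw)/dt` being applied.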
Abstract
Methods, devices, and systems are provided for correcting lag in measurements of analyte concentration level in interstitial fluid. The invention includes receiving a signal representative of sensor data from an analyte monitoring system related to an analyte level measured over time, computing rates of change of the sensor data for a time period, computing a rate distribution of the rates of change, transforming the rate distribution into a linear arrangement, determining a best-fit line for the transformed rate distribution, computing a slope of the best-fit line, and using the slope of the best-fit line as a representation of a variability of the analyte level to adjust an amount of lag correction applied to the sensor data. Numerous additional features are disclosed.
33 Claims
1. A computer-implemented method, comprising:
receiving a signal representative of sensor data from an analyte sensor related to an analyte level measured over time;
computing rates of change of the sensor data for a time period;
computing a rate distribution of the rates of change;
transforming the rate distribution into a linear arrangement;
determining a best-fit line for the transformed rate distribution;
computing a slope of the best-fit line; and
using the slope of the best-fit line as a representation of a variability of the analyte level to adjust an amount of lag correction applied to the sensor data.
Dependent claims: 2, 3, 4, 5, 6, 7, 8, 9.
10. A system for determining analyte concentration, the system comprising:
a processor; and
a memory coupled to the processor, the memory storing processor executable instructions to:
receive a signal representative of sensor data from an analyte sensor related to an analyte level measured over time;
compute rates of change of the sensor data for a time period;
compute a rate distribution of the rates of change;
transform the rate distribution into a linear arrangement;
determine a best-fit line for the transformed rate distribution;
compute a slope of the best-fit line; and
use the slope of the best-fit line as a representation of a variability of the analyte level to adjust an amount of lag correction applied to the sensor data.
Dependent claims: 11, 12, 13, 14, 15, 16, 17, 18.
19-27. (canceled)
28. A computer-implemented method, comprising:
defining a scaling factor for lag correction;
collecting a moving window of historical analyte sensor data;
defining a probability density function of the historical analyte sensor data within the moving window;
determining a normalized analyte variability ratio;
storing the normalized analyte variability ratio computed at regular intervals;
comparing a latest normalized analyte variability ratio to a predetermined value and a number of prior values;
setting a value of the scaling factor based on the probability density function; and
computing lag corrected values based on the scaling factor.
Dependent claims: 30, 31, 32.

29. (canceled)
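Claim 28 describes a windowed variant of the scheme. The sketch below is one possible reading under stated assumptions: the "normalized analyte variability ratio" is taken to be the rate spread in the window relative to a fixed reference threshold, and the scaling factor is derived from the tail mass of the histogram-estimated density. All parameter names (`rate_thresh`, `ratio_thresh`, `n_prior`, `tau_min`) and these specific definitions are hypothetical, not drawn from the patent.

```python
import numpy as np

def lag_correct_windowed(readings, dt_min=1.0, window=60, rate_thresh=1.5,
                         ratio_thresh=1.0, n_prior=5, tau_min=10.0):
    """Illustrative sketch of the claim-28 steps (definitions assumed)."""
    readings = np.asarray(readings, dtype=float)
    corrected = list(readings[:window])          # warm-up: no correction yet
    ratios = []
    for i in range(window, len(readings)):
        # Moving window of historical analyte sensor data.
        hist = readings[i - window:i]
        rates = np.diff(hist) / dt_min
        # Probability density function of the windowed rates (histogram).
        density, edges = np.histogram(rates, bins=10, density=True)
        # Normalized analyte variability ratio (assumed definition:
        # rate spread relative to a fixed reference threshold).
        ratio = np.std(rates) / rate_thresh
        ratios.append(ratio)                     # stored at regular intervals
        # Compare the latest ratio with a predetermined value and a
        # number of prior values.
        prior = ratios[-(n_prior + 1):-1]
        rising = ratio > ratio_thresh and (not prior or ratio >= max(prior))
        # Scaling factor from the density: probability mass in the
        # fast-rate tails, attenuated when variability is not rising.
        bin_width = edges[1] - edges[0]
        centers = 0.5 * (edges[:-1] + edges[1:])
        tail_mass = density[np.abs(centers) > rate_thresh].sum() * bin_width
        scale = tail_mass if rising else 0.5 * tail_mass
        # Lag-corrected value computed from the scaling factor.
        deriv = (readings[i] - readings[i - 1]) / dt_min
        corrected.append(readings[i] + scale * tau_min * deriv)
    return np.array(corrected)
```

Under this reading, quiet signals (no density mass at fast rates) receive no correction, while volatile windows shift more of the first-order lag term `tau_min * d(raw)/dt` into the output.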
33-48. (canceled)