Method and apparatus for improving lag correction during in vivo measurement of analyte concentration with analyte concentration variability and range data
First Claim
1. A computer-implemented method, comprising:
receiving a signal representative of sensor data from an analyte sensor related to an analyte level measured over time;
computing rates of change of the sensor data for a time period;
computing a rate distribution of the rates of change;
transforming the rate distribution into a linear arrangement;
determining a best-fit line for the transformed rate distribution;
computing a slope of the best-fit line; and
using the slope of the best-fit line as a representation of a variability of the analyte level to adjust an amount of lag correction applied to the sensor data.
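The steps recited in claim 1 can be sketched in code. The following is a hypothetical illustration, not the patent's disclosed implementation: the histogram binning, the log-count transform (which assumes a roughly Laplace-like rate distribution so that log-counts fall near a line), and the `adjust_lag_correction` gain formula are all assumptions introduced here for clarity.

```python
import numpy as np

def variability_slope(glucose, dt_min=1.0, nbins=20):
    """Sketch of the claimed steps: rates of change -> rate
    distribution -> linear arrangement -> best-fit slope."""
    # Rates of change of the sensor data over the time period
    rates = np.diff(glucose) / dt_min
    # Rate distribution: histogram of the magnitudes of the rates
    counts, edges = np.histogram(np.abs(rates), bins=nbins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    # Transform into a linear arrangement: for a Laplace-like rate
    # distribution, log(count) versus |rate| is approximately linear
    mask = counts > 0
    x, y = centers[mask], np.log(counts[mask])
    # Best-fit line; its slope serves as the variability measure
    slope, intercept = np.polyfit(x, y, 1)
    return slope

def adjust_lag_correction(base_gain, slope, k=0.5):
    # Hypothetical adjustment rule: a steeper (more negative) slope
    # indicates lower variability, so the lag-correction gain is
    # scaled down accordingly
    return base_gain / (1.0 + k * abs(slope))
```

A steadier signal concentrates the rate distribution near zero, steepening the fitted line and reducing the applied lag correction under this assumed rule.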
Abstract
Methods, devices, and systems are provided for correcting lag in measurements of analyte concentration level in interstitial fluid. The invention includes receiving a signal representative of sensor data from an analyte monitoring system related to an analyte level measured over time, computing rates of change of the sensor data for a time period of the sensor data, computing a rate distribution of the rates of change, transforming the rate distribution into a linear arrangement, determining a best-fit line for the transformed rate distribution, computing a slope of the best-fit line, and using the slope of the best-fit line as a representation of a variability of the analyte level to adjust an amount of lag correction applied to the sensor data. Numerous additional features are disclosed.
18 Claims
1. A computer-implemented method, comprising:
receiving a signal representative of sensor data from an analyte sensor related to an analyte level measured over time;
computing rates of change of the sensor data for a time period;
computing a rate distribution of the rates of change;
transforming the rate distribution into a linear arrangement;
determining a best-fit line for the transformed rate distribution;
computing a slope of the best-fit line; and
using the slope of the best-fit line as a representation of a variability of the analyte level to adjust an amount of lag correction applied to the sensor data.
Dependent claims: 2, 3, 4, 5, 6, 7, 8, 9.
10. A system for determining analyte concentration, the system comprising:
a processor; and
a memory coupled to the processor, the memory storing processor-executable instructions to:
receive a signal representative of sensor data from an analyte sensor related to an analyte level measured over time;
compute rates of change of the sensor data for a time period;
compute a rate distribution of the rates of change;
transform the rate distribution into a linear arrangement;
determine a best-fit line for the transformed rate distribution;
compute a slope of the best-fit line; and
use the slope of the best-fit line as a representation of a variability of the analyte level to adjust an amount of lag correction applied to the sensor data.
Dependent claims: 11, 12, 13, 14, 15, 16, 17, 18.
Specification