Self auto-calibration of analog circuits in a mixed signal integrated circuit device
First Claim
1. An integrated circuit having an auto-calibration circuit to minimize input offset voltage in an operational amplifier, comprising:
- an operational amplifier having differential inputs, an output and a digitally controlled input offset voltage compensation circuit; and
- an auto-calibration circuit comprising a voltage comparator having first and second inputs, a voltage reference coupled to the first input of the voltage comparator, a voltage offset trim digital-to-analog converter (DAC), voltage offset calibration switches, a successive approximation register (SAR) having an input coupled to an output of the voltage comparator and outputs coupled to the voltage offset trim DAC, and calibration logic;
wherein when an event occurs the calibration logic controls the voltage offset calibration switches to couple the differential inputs of the operational amplifier to the voltage reference and the output of the operational amplifier to the second input of the voltage comparator;
whereby the voltage comparator causes the SAR to change output values to the voltage offset trim DAC so as to minimize an input offset voltage of the operational amplifier during an auto-calibration cycle initiated by the event.
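The SAR loop recited above can be sketched in a few lines: the comparator decision steers a most-significant-bit-first binary search over the trim DAC code until the residual offset is below one trim step. This is a minimal illustrative model, assuming an ideal comparator, a unity-gain amplifier, and a linear trim DAC spanning ±10 mV; all names and values are assumptions, not taken from the patent.

```python
N_BITS = 8              # width of the SAR / trim DAC (assumed)
V_REF = 1.65            # calibration reference voltage in volts (assumed)
V_OFFSET = 0.004        # 4 mV of input offset voltage to be trimmed out
TRIM_RANGE = 0.010      # trim DAC spans +/-10 mV around zero (assumed)

def trim_voltage(code):
    """Map an N-bit trim code to a correction voltage in [-10 mV, +10 mV)."""
    return (code / (1 << N_BITS) - 0.5) * 2 * TRIM_RANGE

def amp_output(code):
    """Amplifier output with both differential inputs tied to V_REF."""
    return V_REF + V_OFFSET + trim_voltage(code)

def sar_calibrate():
    """Binary search, MSB first: keep a bit only while the output does not
    overshoot the reference, so the final code nearly nulls the offset."""
    code = 0
    for bit in range(N_BITS - 1, -1, -1):
        trial = code | (1 << bit)       # tentatively set the next bit
        if amp_output(trial) <= V_REF:  # comparator decision
            code = trial                # keep the bit
    return code
```

After N_BITS comparator decisions the stored code leaves a residual offset smaller than one trim LSB (about 78 µV in this toy model), which is the usual figure of merit for a SAR trim.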
Abstract
Auto-calibration of the analog circuits occurs when requested by a user and/or upon the occurrence of one or more events. The user may invoke an auto-calibration on demand through an auto-calibration (ACAL) input to the mixed-signal integrated circuit. An external voltage calibration (VCAL) input may be used for auto-calibration of the mixed-signal integrated circuit to a user-supplied common-mode voltage reference. Auto-calibration of the mixed-signal integrated circuit may also be initiated upon the occurrence of any one or more of the following events: 1) detection of auto-calibration data corruption, e.g., by parity checking of auto-calibration data values digitally stored in the mixed-signal integrated circuit; 2) expiration of an internal timer that raises a calibration request after a programmable timeout period; 3) a change in the internal integrated circuit die temperature as determined by a temperature sensor; and 4) a change in the power supply and/or internal supply voltage(s).
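The trigger conditions listed in the abstract can be summarized as a simple decision function: recalibrate on a user request, on a parity mismatch in the stored trim data, on timer expiry, or on sufficient temperature or supply drift since the last calibration. The sketch below is a hypothetical model of that logic; the field names, thresholds, and tick counts are assumptions, not values from the patent.

```python
from dataclasses import dataclass

@dataclass
class CalState:
    """State captured at the last completed auto-calibration (assumed layout)."""
    stored_code: int        # trim code stored in the device
    stored_parity: int      # parity bit stored alongside the trim code
    last_cal_temp_c: float  # die temperature at last calibration
    last_cal_vdd: float     # supply voltage at last calibration
    timer_ticks: int        # ticks elapsed since last calibration

def parity(word):
    """Even parity (XOR of all bits) of a non-negative integer."""
    p = 0
    while word:
        p ^= word & 1
        word >>= 1
    return p

def needs_recalibration(s, acal_pin, temp_c, vdd,
                        timeout_ticks=1000, temp_delta=5.0, vdd_delta=0.1):
    """True when any of the abstract's trigger events has occurred."""
    if acal_pin:                                   # user request on ACAL input
        return True
    if parity(s.stored_code) != s.stored_parity:   # trim data corruption
        return True
    if s.timer_ticks >= timeout_ticks:             # programmable timeout expired
        return True
    if abs(temp_c - s.last_cal_temp_c) > temp_delta:  # die temperature drift
        return True
    if abs(vdd - s.last_cal_vdd) > vdd_delta:         # supply voltage change
        return True
    return False
```

Any one condition suffices, matching the abstract's "any one or more of the following events" wording.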
24 Claims
1. An integrated circuit having an auto-calibration circuit to minimize input offset voltage in an operational amplifier, comprising:
an operational amplifier having differential inputs, an output and a digitally controlled input offset voltage compensation circuit; and
an auto-calibration circuit comprising a voltage comparator having first and second inputs, a voltage reference coupled to the first input of the voltage comparator, a voltage offset trim digital-to-analog converter (DAC), voltage offset calibration switches, a successive approximation register (SAR) having an input coupled to an output of the voltage comparator and outputs coupled to the voltage offset trim DAC, and calibration logic;
wherein when an event occurs the calibration logic controls the voltage offset calibration switches to couple the differential inputs of the operational amplifier to the voltage reference and the output of the operational amplifier to the second input of the voltage comparator;
whereby the voltage comparator causes the SAR to change output values to the voltage offset trim DAC so as to minimize an input offset voltage of the operational amplifier during an auto-calibration cycle initiated by the event.
Dependent claims: 2-16.
17. A method for minimizing an input offset voltage in an analog input device upon an occurrence of an event, said method comprising the steps of:
(a) detecting an occurrence of an event;
(b) switching an analog input device having an input offset voltage compensation circuit from a normal mode to an auto-calibration mode upon detection of the occurrence of the event;
(c) applying a reference voltage to the analog input device;
(d) minimizing an input offset voltage of the analog input device by:
(i) measuring an output voltage of the analog input device; and
(ii) applying input offset compensation values to the input offset voltage compensation circuit until the output voltage from the analog input device is at a desired value, and then storing the input offset compensation value that minimizes the input offset voltage of the analog input device; and
(e) switching the analog input device from the calibration mode to the normal mode.
Dependent claims: 18-24.
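Steps (b) through (e) of the method can be walked through against a toy device model: enter calibration mode, drive the inputs with the reference, pick the compensation code whose output best matches the reference, store it, and return to normal mode. The `Device` class, the 1 mV trim step, and the 16-code range below are illustrative assumptions, not details from the patent.

```python
class Device:
    """Toy unity-gain analog input device with a trimmable input offset."""
    def __init__(self, offset_v):
        self.offset_v = offset_v   # intrinsic input offset in volts
        self.mode = "normal"
        self.trim_code = 0
        self.v_in = 0.0

    def apply_input(self, v):
        self.v_in = v

    def output(self, code):
        # Trim DAC shifts the offset in 1 mV steps around mid-code 8 (assumed).
        return self.v_in + self.offset_v + (code - 8) * 0.001

    def store_trim(self, code):
        self.trim_code = code

def calibrate_on_event(device, v_ref, codes):
    """Steps (b)-(e): switch modes, apply the reference, search and store."""
    device.mode = "calibration"   # (b) enter auto-calibration mode
    device.apply_input(v_ref)     # (c) apply the reference voltage
    # (d) measure the output for each compensation value and keep the one
    #     that brings the output closest to the desired (reference) value
    best = min(codes, key=lambda c: abs(device.output(c) - v_ref))
    device.store_trim(best)       # store the minimizing compensation value
    device.mode = "normal"        # (e) return to normal mode
    return best
```

A simple exhaustive search is used here for clarity; claim 1's SAR is one hardware-efficient way to perform the same search in step (d).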
Specification