H03M1/1014

TIME INTERLEAVED ANALOG TO DIGITAL CONVERTER AND SIGNAL CONVERSION METHOD
20240429932 · 2024-12-26

A time-interleaved analog-to-digital converter includes a plurality of channel circuitries, an output circuit, and a calibration circuitry. The plurality of channel circuitries are configured to sample an input signal to generate a plurality of first digital codes according to the input signal. The output circuit is configured to output a second digital code according to the plurality of first digital codes. The calibration circuitry is configured to adjust a sampling sequence of the plurality of channel circuitries for the input signal during an initial period, and control the plurality of channel circuitries to sample the input signal in the adjusted sampling sequence during an analog-to-digital conversion period.
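The two-phase operation described above can be sketched in Python. This is a minimal illustration, not the patented circuit: the per-channel offsets, the mismatch `probe` metric, and the idea of ranking channels by that metric are all assumptions standing in for whatever criterion the calibration circuitry actually uses to pick the sampling sequence.

```python
def calibrate_sequence(channels, probe):
    """Initial period: rank channels by an assumed mismatch metric to
    choose the sampling order used during conversion."""
    return sorted(range(len(channels)), key=lambda i: probe(channels[i]))

def convert(channels, order, samples):
    """Conversion period: round-robin the input through the calibrated
    sequence, collecting one first digital code per sample."""
    codes = []
    for n, x in enumerate(samples):
        ch = channels[order[n % len(order)]]
        codes.append(ch(x))
    return codes

# Toy 4-channel example: each "channel" is an ideal 3-bit quantizer with
# a small per-channel offset standing in for mismatch.
offsets = [0.02, -0.01, 0.00, 0.03]
channels = [lambda x, o=o: round((x + o) * 7) for o in offsets]
order = calibrate_sequence(channels, probe=lambda ch: abs(ch(0.5) - 4))
codes = convert(channels, order, [0.0, 0.25, 0.5, 0.75, 1.0])
```

The split mirrors the abstract's "initial period" versus "analog-to-digital conversion period": the order is fixed once, then used unchanged while converting.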

Systems and methods of compensating for nonlinear capacitance in converters

Described herein are systems and methods related to a converter that includes a number of unit cells. Each unit cell includes a first transistor and a second transistor. The first transistor is coupled in series with an output of the unit cell, and the second transistor is configured to have a capacitive characteristic that reduces a non-linear capacitive characteristic of the first transistor. The converter can be a voltage-mode or current-mode digital-to-analog converter.

Gain calibration with quantizer offset settings
12191876 · 2025-01-07

Methods and apparatus for calibrating a gain for a circuit block are disclosed. An example method includes receiving a plurality of quantizer offsets, where the plurality of quantizer offsets represent calibration data for a quantizer configured to quantize an output of the circuit block, determining one or more differences based on one or more first quantizer offsets of the plurality of quantizer offsets and on one or more second quantizer offsets of the plurality of quantizer offsets, and determining an incremental change in a gain associated with the circuit block based on the one or more differences.
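The offset-difference step can be sketched as below. The mapping from the mean offset difference to a proportional gain step, and the `step` constant, are assumptions; the abstract only states that differences between two groups of quantizer offsets determine an incremental gain change.

```python
def incremental_gain_step(first_offsets, second_offsets, step=0.01):
    """Compare two groups of quantizer offsets; the sign and magnitude
    of their mean difference drive a small gain correction (assumed
    proportional mapping)."""
    diffs = [a - b for a, b in zip(first_offsets, second_offsets)]
    mean_diff = sum(diffs) / len(diffs)
    return step * mean_diff

# Two calibration snapshots of the same quantizer offsets:
gain = 1.0
gain += incremental_gain_step([0.12, 0.10], [0.08, 0.06])
```

Because the change is incremental, the step would normally be applied repeatedly as fresh offset data arrive, letting the gain settle gradually.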

Calibration system and method for SAR ADCs

In accordance with an embodiment, a method for operating a successive approximation ADC comprising a first capacitor array includes measuring a first weight of an (MSB-a)th bit of the ADC by: applying a first reference voltage to first terminals of capacitors of the first capacitor array corresponding to the (MSB-a)th bit; applying a second reference voltage to first terminals of capacitors of the first capacitor array corresponding to significant bits lower than the (MSB-a)th bit; applying the first reference voltage to first terminals of a first set of capacitors of the first capacitor array corresponding to significant bits higher than the (MSB-a)th bit; and applying the second reference voltage to first terminals of a second set of capacitors of the first capacitor array corresponding to the significant bits higher than the (MSB-a)th bit. Subsequently, a weight of a capacitance of the capacitors corresponding to the (MSB-a)th bit is successively approximated.
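The final step, successively approximating the unknown bit weight, is a binary search driven by comparator decisions. A minimal sketch, with an ideal comparison standing in for the comparator and `actual_weight` standing in for the analog quantity being measured:

```python
def measure_bit_weight(actual_weight, n_steps=12, full_scale=1.0):
    """Successive approximation of an unknown capacitor weight: at each
    step, tentatively add the trial step and keep it only if the
    estimate stays at or below the measured quantity (the comparator
    decision, idealized here)."""
    est, step = 0.0, full_scale / 2
    for _ in range(n_steps):
        if est + step <= actual_weight:   # comparator says "still low"
            est += step
        step /= 2
    return est

w = measure_bit_weight(0.26371)
```

After `n_steps` decisions the estimate is the weight truncated to `n_steps` binary places, so the residual error is below `full_scale / 2**n_steps`.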

ANALOG-TO-DIGITAL CONVERTER WITH AN OVER-RANGE STAGE

An analog-to-digital converter (ADC) includes a time-domain ADC core and calibration circuitry. The time-domain ADC core includes a first delay-to-digital stage having a terminal, a second delay-to-digital stage having a terminal, and a third delay-to-digital stage having a terminal. The calibration circuitry is coupled to the terminal of the first delay-to-digital stage, the terminal of the second delay-to-digital stage, and the terminal of the third delay-to-digital stage. The calibration circuitry is configured to calibrate the first, second, and third delay-to-digital stages based on a zero-crossing calibration and an over-range calibration. The over-range calibration sets a maximum threshold and a minimum threshold for the time-domain ADC core relative to a reference voltage.
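The over-range thresholds can be pictured with a small sketch. The symmetric ±`span` window around the reference voltage, and the clamping of a stage output to that window, are assumptions; the abstract only says the calibration fixes a maximum and minimum threshold relative to a reference voltage.

```python
def over_range_thresholds(vref, span=0.1):
    """Assumed symmetric window: max/min thresholds placed span above
    and below the reference voltage."""
    return vref + span, vref - span

def clamp_stage_output(value, vref, span=0.1):
    """Limit a stage's residue to the calibrated window, the usual role
    of an over-range stage (assumption here)."""
    hi, lo = over_range_thresholds(vref, span)
    return max(lo, min(hi, value))

clamp_stage_output(0.75, vref=0.6)
```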

Digital-to-analog converter glitch reduction techniques

A digital technique reduces or minimizes switching in a DAC by using a partial DAC data-ignore switching mode. In this mode, a control circuit compares first and second data, such as first and second digital words, and operates the corresponding switches only when the first data differ from the second data. The techniques are applicable to many types of DACs, including voltage-output DACs, current-output DACs, variable-resistance DACs, digital rheostats, and digital potentiometers (digiPOTs).
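The word comparison reduces to a bitwise XOR: only the bit positions where the two words differ need their switches operated, which is what suppresses glitch energy from needless toggling. A minimal sketch (the 8-bit width and the returned list of positions are illustrative choices):

```python
def switches_to_toggle(prev_word, next_word, width=8):
    """Partial data-ignore mode: XOR the consecutive digital words and
    return only the bit positions whose switches must actually change;
    matching bits are left untouched."""
    diff = prev_word ^ next_word
    return [b for b in range(width) if (diff >> b) & 1]

# Consecutive codes differing in one bit: only one switch operates.
toggles = switches_to_toggle(0b10110100, 0b10110110)
```

In the worst case (all bits flip) every switch still operates, so the technique helps most when consecutive codes are close, as in slowly varying or oversampled data.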

Digital-to-analog conversion apparatus and method having signal calibration mechanism

The present invention discloses a digital-to-analog conversion apparatus having a signal calibration mechanism. A DAC circuit includes conversion circuits to generate an output analog signal and an echo-canceling analog signal. An echo transmission circuit performs signal processing on an echo path to generate an echo signal. An echo calibration circuit includes odd and even calibration circuits that perform mapping according to offset tables, and processing according to response coefficients, on odd and even input parts of an input digital signal to generate odd and even calibration parts of an echo-canceling calibration signal. A calibration parameter calculation circuit generates offsets according to an error signal between the echo signal and the echo-canceling calibration signal, and to path information related to the echo calibration circuit. The echo calibration circuit makes the response coefficients converge according to the error signal and pseudo-noise transmission path information, and updates the offset tables according to the offsets.
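Driving response coefficients to convergence from an error signal and a pseudo-noise reference is the classic adaptive-filter pattern; an LMS-style update is one standard way to realize it. This sketch is an assumption about the mechanism, not the patented circuit: the LMS form, the step size `mu`, and the toy two-tap echo response are all illustrative.

```python
def lms_update(coeffs, ref, error, mu=0.01):
    """One LMS-style step: nudge each response coefficient along the
    correlation of the error with the reference (pseudo-noise) samples."""
    return [c + mu * error * x for c, x in zip(coeffs, ref)]

coeffs = [0.0, 0.0]
target = [0.5, -0.25]                # stand-in for the true echo response
for n in range(4000):
    # Alternating reference vectors play the role of a pseudo-noise
    # sequence rich enough to identify both taps.
    ref = [1.0, 1.0] if n % 2 == 0 else [1.0, -1.0]
    echo = sum(t * x for t, x in zip(target, ref))
    est = sum(c * x for c, x in zip(coeffs, ref))
    coeffs = lms_update(coeffs, ref, echo - est)
```

As the error between the echo signal and the echo-canceling estimate shrinks, the coefficients settle on the echo response, mirroring the convergence the abstract describes.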

Digital-to-analog conversion apparatus and method having signal calibration mechanism

The present invention discloses a digital-to-analog conversion apparatus having a signal calibration mechanism. A digital-to-analog conversion circuit includes conversion circuits to generate an output analog signal and echo-canceling analog signals. An echo transmission circuit processes an echo-transmitting path to generate an echo signal. An echo calibration circuit generates an output calibration signal and echo-canceling calibration signals from an input digital signal through calibration circuits corresponding to the conversion circuits. A calibration parameter calculating circuit generates a plurality of offsets according to an error signal of the echo signal relative to the calibration signals, and to path information related to the echo calibration circuit. The echo calibration circuit makes response coefficients converge according to the error signal and pseudo-noise transmission path information from the digital-to-analog conversion circuit to the echo transmission circuit, and updates a codeword offset table according to the offsets.

Voltage regulator with pulse frequency control

The present disclosure describes a system with a first counter circuit, a first converter circuit, a second counter circuit, and a second converter circuit. The first counter circuit is configured to output a first count value based on a comparison between a first reference value and a switched node value of a voltage regulator. The first converter circuit is configured to adjust an activation time of the voltage regulator based on the first count value. The second counter circuit is configured to output a second count value based on a comparison between a second reference value and the switched node value of the voltage regulator. The second converter circuit is configured to adjust an amount of current drawn away from an output of the voltage regulator based on the second count value.
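The dual counter-and-converter structure can be sketched as follows. The up/down counting rule, the reference values, and the units of the activation time and bleed current are assumptions; the abstract only specifies that each count value comes from comparing a reference against the switched-node value, and that the two converters adjust activation time and drawn current respectively.

```python
def update_counter(count, ref, node_value):
    """Counter step from the abstract: the count moves based on a
    comparison between a reference value and the switched-node value
    (assumed here to be a simple up/down step)."""
    return count + 1 if node_value > ref else count - 1

c1 = c2 = 0
on_time, bleed = 100, 0              # arbitrary starting points
for node in [1.3, 1.4, 1.3, 1.4]:    # toy switched-node samples
    c1 = update_counter(c1, 1.25, node)   # first counter / first reference
    c2 = update_counter(c2, 1.35, node)   # second counter / second reference
on_time += c1                        # first converter: activation time
bleed += max(c2, 0)                  # second converter: current drawn away
```

The two loops act independently: one regulates how long the regulator stays active, the other bleeds current from the output when the node runs high.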

Calibration of ramp digital to analog converter

A source driver includes: a current source that provides an approximately constant current; a channel coupled to a source electrode and including a digital-to-analog converter (DAC), the DAC including a voltage source that applies an output voltage to the source electrode based on the approximately constant current provided by the current source, and a control unit having circuitry that inputs a digital value and terminates, based on the digital value, charging of the voltage source by the approximately constant current; and a calibration unit having circuitry that generates a comparison between a test voltage applied by the voltage source and a target voltage, and modifies the approximately constant current based on the comparison.
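The calibration loop amounts to: ramp for a known test duration, compare the voltage reached against the target, trim the current, repeat. A minimal sketch under ideal-capacitor assumptions (the `gain` factor, capacitance, and tick counts are illustrative, not from the abstract):

```python
def ramp_voltage(current, ticks, cap=1.0):
    """Ideal ramp: V = I * t / C for a capacitor charged by an
    approximately constant current."""
    return current * ticks / cap

def calibrate_current(current, target_v, test_ticks, gain=0.5):
    """One calibration step: run the ramp for the test duration, compare
    the reached test voltage with the target, and trim the current by a
    fraction (assumed gain) of the error."""
    test_v = ramp_voltage(current, test_ticks)
    return current + gain * (target_v - test_v) / test_ticks

current = 1.1e-6                     # current source starts 10% high
for _ in range(20):
    current = calibrate_current(current, target_v=1.0,
                                test_ticks=1_000_000)
```

Each pass halves the residual error (with `gain=0.5`), so a handful of iterations drives the current to the value that makes the ramp hit the target voltage exactly at the test duration.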