Patent classifications
G01R31/3191
Method for testing a device under test
A method for testing a device under test, the device under test being a measuring instrument that measures a physical parameter of a fluid, includes: performing a plurality of valid test runs, wherein a valid test run includes: exposing the device under test and a reference measuring instrument to the fluid under a set of influences, the set of influences being defined by influence parameters; monitoring the influence parameters; obtaining a reference value for the physical parameter from the reference measuring instrument; and obtaining a test value for the physical parameter from the device under test, wherein a test run is invalidated if the influence parameters do not meet specified test requirements; and then evaluating a plurality of test values originating from the plurality of valid test runs with respect to at least one of accuracy, repeatability, and reproducibility.
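The validate-then-evaluate flow of the abstract can be sketched as follows. This is a minimal illustration, not the patented method: the dictionary layout, the `[low, high]` requirement bands, and the choice of mean error for accuracy and sample standard deviation for repeatability are all assumptions.

```python
import statistics

def run_is_valid(influences, requirements):
    """A test run is invalidated unless every monitored influence
    parameter lies within its specified [low, high] requirement band."""
    return all(lo <= influences[name] <= hi
               for name, (lo, hi) in requirements.items())

def evaluate(runs, requirements):
    """Keep only the valid runs, then evaluate the test values against
    the reference values: accuracy as mean error, repeatability as the
    spread of the errors across repeated runs."""
    valid = [r for r in runs if run_is_valid(r["influences"], requirements)]
    errors = [r["test_value"] - r["reference_value"] for r in valid]
    return {
        "n_valid": len(valid),
        "accuracy": statistics.mean(errors),
        "repeatability": statistics.stdev(errors),
    }
```

A run whose influence parameters drift out of band simply drops out of the evaluation, so a disturbed measurement never biases the accuracy or repeatability figures.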
CIRCUIT MEASURING DEVICE AND METHOD
A circuit measuring device and a method thereof are provided. A voltage source supplies a common voltage such that a calibration current having a preset current value flows from a current-voltage converter to a final test machine. The current-voltage converter converts the calibration current into a calibration voltage, and a voltage sensing component senses the voltage between the input terminal and the output terminal of the current-voltage converter to output sensed calibration data. During testing, the current-voltage converter converts a tested current output by a tested circuit into a tested voltage, and the voltage sensing component again senses the voltage between the input and output terminals to output actual sensed data. When the final test machine determines that the difference between the sensed calibration data and the actual sensed data is larger than a threshold, the tested circuit is adjusted.
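The decision at the heart of the abstract, comparing calibration-time sensed data against test-time sensed data, reduces to a threshold check. The sketch below assumes an ideal transimpedance conversion (V = I · R) with a hypothetical feedback resistance; the real converter and sensed quantities are of course analog hardware, not these names.

```python
R_FEEDBACK_OHMS = 1_000.0  # hypothetical transimpedance gain of the converter

def current_to_voltage(current_a):
    # Ideal current-to-voltage conversion: V = I * R
    return current_a * R_FEEDBACK_OHMS

def needs_adjustment(sensed_calibration, actual_sensed, threshold):
    """The tested circuit is adjusted only when the data sensed during
    the test deviates from the calibration-time data by more than the
    threshold."""
    return abs(actual_sensed - sensed_calibration) > threshold
```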
Self-calibrated system on a chip (SoC)
A self-calibrated system on a chip includes a semiconductor substrate, at least one silicon intellectual property (SIP) circuit including dynamic random access memories (DRAMs), a calibration circuit, and a function circuit, a cyclic oscillator, and a control circuit. Each DRAM has a coarsely-tuned capacitance value and a coarsely-tuned resistance value. The calibration circuit has a finely-tuned capacitance value and a finely-tuned resistance value. The cyclic oscillator transmits an oscillating clock signal to the control circuit to choose and provide the coarsely-tuned capacitance value, the coarsely-tuned resistance value, the finely-tuned capacitance value and the finely-tuned resistance value for the function circuit, thereby adjusting a function parameter.
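The coarse-plus-fine selection the control circuit performs can be illustrated as a two-stage nearest-value search. This is only a conceptual sketch of coarse/fine tuning, not the on-chip logic: the discrete value lists and the target-matching criterion are assumptions.

```python
def choose_tuning(target, coarse_values, fine_values):
    """Two-stage selection: pick the coarse value closest to the target,
    then the fine value that best cancels the remaining residual."""
    coarse = min(coarse_values, key=lambda c: abs(target - c))
    fine = min(fine_values, key=lambda f: abs((target - coarse) - f))
    return coarse, fine, target - (coarse + fine)
```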
Two-port on-wafer calibration piece circuit model and method for determining parameters
The present application provides two-port on-wafer calibration piece circuit models and a method for determining parameters. The method includes: measuring a single-port on-wafer calibration piece circuit model corresponding to a first frequency band to obtain a first S parameter; calculating, according to the first S parameter, an intrinsic capacitance value of a two-port on-wafer calibration piece circuit model corresponding to the single-port on-wafer calibration piece circuit model; measuring the two-port on-wafer calibration piece circuit model corresponding to the terahertz frequency band to obtain a second S parameter; and calculating a parasitic capacitance value and a parasitic resistance value of the two-port on-wafer calibration piece circuit model according to the second S parameter and the intrinsic capacitance value.
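The first step, deriving a capacitance from a measured one-port S parameter, can be sketched for the simplest case of a pure shunt capacitor at a 50 Ω reference impedance: convert the reflection coefficient to an admittance, Y = (1/Z0)·(1 − S11)/(1 + S11), and take C = Im(Y)/ω. The patent's parasitic extraction at terahertz frequencies is more involved; this shows only the assumed basic conversion.

```python
import math

Z0 = 50.0  # assumed reference impedance in ohms

def shunt_capacitance_from_s11(s11, freq_hz):
    """Convert a one-port reflection coefficient to an input admittance,
    then read the capacitance from its imaginary part: C = Im(Y) / omega."""
    y = (1.0 / Z0) * (1 - s11) / (1 + s11)
    return y.imag / (2 * math.pi * freq_hz)
```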
MULTI-CHANNEL TIMING CALIBRATION DEVICE AND METHOD
A multi-channel timing calibration device and a method applicable thereto are provided. The device includes a plurality of channel inputs, at least one relay switch, at least one comparator, at least one first multiplexer, and a time measurement chip. The at least one comparator is connected to the at least one relay switch and to a reference voltage or a digital-to-analog converter. The at least one first multiplexer receives different signals for different channel groups and outputs the signal of a designated channel. The time measurement chip calculates a timing difference for each of the channels of each of the channel inputs as a basis for delaying the timing signals.
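The per-channel timing difference that the time measurement chip computes can be illustrated as offsets relative to one channel. A minimal sketch, assuming edge arrival times in nanoseconds and channel 0 as the reference, neither of which the abstract specifies:

```python
def timing_offsets(edge_times_ns, reference_channel=0):
    """Timing difference of each channel relative to a chosen reference
    channel; the differences then serve as per-channel delay settings."""
    ref = edge_times_ns[reference_channel]
    return [t - ref for t in edge_times_ns]
```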
DELAY MEASUREMENT SYSTEM AND MEASUREMENT METHOD
A delay measurement system and a measurement method are provided. The delay measurement system includes a delay control device and a comparator. The delay control device is configured to generate a second signal in response to a first signal, wherein a rising edge of the second signal is delayed by a first delay time with respect to a rising edge of the first signal, and the first delay time is controlled in response to an output signal of the comparator. The comparator is configured to compare the first delay time with a second delay time and output the output signal, wherein a rising edge of a third signal is delayed by the second delay time with respect to the rising edge of the first signal, and the third signal is generated by a device under test (DUT) in response to the first signal.
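The feedback described above, where the comparator output steers the controlled first delay toward the DUT's delay, behaves like a successive-approximation loop. A minimal software sketch, with the step size and stopping rule as assumptions:

```python
def settle_first_delay(dut_delay, step=0.01, max_iter=100_000):
    """Feedback loop: the comparator output nudges the controlled first
    delay up or down one step at a time until it matches the DUT's
    second delay to within half a step."""
    first_delay = 0.0
    for _ in range(max_iter):
        if abs(first_delay - dut_delay) <= step / 2:
            break
        first_delay += step if first_delay < dut_delay else -step
    return first_delay
```

Once the loop settles, the first delay time is itself the measurement of the DUT's delay, quantized to the step size.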
Trimming analog circuits
A system may include a trim circuit configured to provide a trim signal to a circuit under test. The trim circuit may be configured to adjust a trim value of the trim signal based on a selection signal and a value signal. The trim signal may cause a key characteristic of the circuit under test to change based on the adjusted trim value. The system may include a production tester configured to determine whether the key characteristic is within a threshold range. Responsive to the key characteristic being within the threshold range, the production tester may stop performing the trim procedure on the circuit under test. Responsive to the key characteristic not being within the threshold range, the production tester may adjust the value signal based on whether the key characteristic is greater than or less than the threshold range.
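The stop/adjust decision the production tester makes can be sketched as a search over trim values. This is an illustration only: it assumes the key characteristic increases monotonically with the trim value and uses a binary search, neither of which the abstract states.

```python
def trim_to_range(measure, lo, hi, threshold_range, max_steps=32):
    """Adjust the value signal until the measured key characteristic
    falls inside the threshold range, then stop the trim procedure."""
    low, high = threshold_range
    for _ in range(max_steps):
        value = (lo + hi) / 2
        characteristic = measure(value)
        if low <= characteristic <= high:
            return value            # in range: stop trimming
        if characteristic > high:   # too high: decrease the value signal
            hi = value
        else:                       # too low: increase the value signal
            lo = value
    return None                     # no trim value found within budget
```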
Pin driver and test equipment calibration
A force-sense system can provide signals to, or receive signals from, a device under test (DUT) at a first DUT node. The system can include output buffer circuitry configured to provide a DUT signal to the DUT in response to a force control signal at a buffer control node, and controller circuitry configured to provide the force control signal at the buffer control node. The system can include bypass circuitry configured to selectively bypass the controller circuitry and provide an auxiliary control signal at the buffer control node. The auxiliary control signal can be used for system calibration. In an example, an external calibration circuit can provide the auxiliary control signal in response to information received from the DUT.
Systems and methods for calibrating a conducted electrical weapon
Systems and methods for calibrating a conducted electrical weapon (“CEW”) to provide a predetermined amount of current for each pulse of the stimulus signal. Providing the predetermined amount of current, or an amount close thereto, increases the effectiveness of the stimulus signal in impeding locomotion of a human or animal target. The calibration process enables a CEW to calibrate the amount of charge in a pulse of the stimulus signal both in the environmental conditions where the tester operates and in the field, where conditions may differ from those during calibration.
Calibrating an interface board
An example test system includes a device interface board (DIB) having one or more signal transmission paths and an interface for connecting to one or more other components of the test system. Test circuitry is configured to inject test signals into the one or more signal transmission paths and to measure transmitted versions of the test signals at the interface to obtain measurement signals. One or more processing devices are configured to generate calibration factors based on differences between the injected test signals and the measurement signals, and to store the calibration factors in computer memory. The calibration factors are for correcting for effects on the test signals of the one or more signal transmission paths.
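The calibration-factor idea above can be shown in miniature. A sketch under assumptions the abstract does not make: one scalar amplitude per signal path and a simple multiplicative factor derived from the injected-versus-measured difference.

```python
def calibration_factors(injected, measured):
    """One factor per DIB signal path, derived from comparing the
    injected test signal with its transmitted, measured version."""
    return {path: injected[path] / measured[path] for path in injected}

def correct(measurement, factor):
    # Apply the stored factor to undo the path's effect on a signal
    return measurement * factor
```

Stored factors can then be multiplied into later measurements so that losses along each transmission path drop out.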