Main Indicators of Digital Multimeters: Display Digits and Display Characteristics
The display of a digital multimeter usually has 3½ to 8½ digits. Two rules determine how the display digits of a digital instrument are counted: first, any position that can show every numeral from 0 to 9 counts as a full (integer) digit; second, the fractional digit is written as a fraction whose numerator is the largest value the most significant digit can show and whose denominator is the leading digit of the full-range count. For example, if the maximum display value is 1999 and the full-range count is 2000, the instrument has three full digits, and the fractional digit has a numerator of 1 and a denominator of 2, so it is called a 3½-digit meter (read "three and a half digits"); its most significant digit can only show 0 or 1 (a leading 0 is normally blanked). The most significant digit of a 3⅔-digit multimeter can show 0 through 2, so its maximum display value is ±2999. On the same range this gives 50% more headroom than a 3½-digit meter, which is particularly valuable when measuring 380 V AC voltage.
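As a rough illustration of this naming rule, the short sketch below derives the digit designation from a meter's maximum display count. The count values used (1999, 2999, 19999) are generic examples, not figures from any particular instrument.

```python
from fractions import Fraction

def digit_designation(max_display: int) -> str:
    """Name a display by the rule above: full digits plus a fractional digit
    whose numerator is the top value of the most significant digit and whose
    denominator is the leading digit of the full-range count."""
    full_range_count = max_display + 1              # e.g. 1999 -> 2000 counts
    full_digits = len(str(max_display)) - 1         # positions that show 0-9
    frac = Fraction(int(str(max_display)[0]),       # numerator: 1 for 1999
                    int(str(full_range_count)[0]))  # denominator: 2 for 2000
    return f"{full_digits} {frac.numerator}/{frac.denominator} digits"

print(digit_designation(1999))    # 3 1/2 digits
print(digit_designation(2999))    # 3 2/3 digits
print(digit_designation(19999))   # 4 1/2 digits
```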
General-purpose digital multimeters are mostly hand-held instruments with 3½-digit displays. Meters with 4½ to 5½ digits (below 6 digits) are made in both hand-held and bench-top forms, while meters with 6½ digits or more are bench-top instruments.
Digital multimeters use digital display technology, which gives a clear, intuitive, and accurate readout. It not only ensures objective readings but also matches people's reading habits and shortens reading and recording time. Traditional analog (pointer) multimeters offer none of these advantages.
Accuracy
The accuracy of a digital multimeter is the combined effect of the systematic and random errors in its measurement results. It indicates how closely the measured value agrees with the true value and also reflects the size of the measurement error. In general, the higher the accuracy, the smaller the measurement error, and vice versa.
The accuracy of a digital multimeter is far better than that of an analog (pointer) multimeter. Accuracy is a very important specification, reflecting the quality of the meter and the maker's technical capability. A meter with poor accuracy cannot represent the true value well and easily leads to measurement errors.
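Digital multimeter accuracy is commonly specified in the form ±(a % of reading + n counts); the sketch below shows how such a specification translates into a worst-case error. The figures used (0.5 %, 2 counts, a 200 mV range with 2000 counts) are assumed for illustration only, not taken from any particular meter.

```python
def worst_case_error(reading, pct_of_reading, extra_counts, full_scale, counts):
    """Worst-case error for a '±(a% of reading + n counts)' accuracy spec."""
    one_count = full_scale / counts          # value of one least significant digit
    return reading * pct_of_reading / 100 + extra_counts * one_count

# 100.0 mV measured on an assumed 200 mV / 2000-count range with a
# hypothetical ±(0.5% + 2 counts) spec:
error_mV = worst_case_error(100.0, 0.5, 2, 200.0, 2000)
print(f"worst-case error: ±{error_mV:.1f} mV")   # ±0.7 mV
```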
Resolution
The voltage corresponding to the last (least significant) digit of the display on the lowest voltage range is called the resolution; it reflects the sensitivity of the instrument. The resolution of a digital instrument improves as the number of display digits increases, so multimeters with different digit counts have different maximum resolutions.
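For a concrete sense of scale, the sketch below computes the resolution on an assumed 200 mV lowest range for several digit counts (a common arrangement, but not taken from any specific meter).

```python
# Resolution = value of the least significant digit on the lowest voltage range.
lowest_range_mV = 200.0                       # assumed lowest voltage range
for digits, counts in [("3 1/2", 2000), ("4 1/2", 20000), ("5 1/2", 200000)]:
    resolution_uV = lowest_range_mV / counts * 1000
    print(f"{digits}-digit meter: resolution = {resolution_uV:g} µV")
# 3 1/2-digit meter: resolution = 100 µV
# 4 1/2-digit meter: resolution = 10 µV
# 5 1/2-digit meter: resolution = 1 µV
```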
The resolution of a digital multimeter can also be expressed as a resolution ratio: the ratio, given as a percentage, of the smallest non-zero number the instrument can display to the largest number it can display. For example, a 3½-digit meter can display at most 1999, so its resolution ratio is 1/1999 ≈ 0.05%.
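The same idea expressed as a ratio for a few representative digit counts (illustrative only):

```python
# Resolution ratio = smallest non-zero count / largest displayable count.
for digits, max_count in [("3 1/2", 1999), ("4 1/2", 19999), ("5 1/2", 199999)]:
    ratio = 1 / max_count
    print(f"{digits}-digit meter: resolution ratio ≈ {ratio * 100:.3g}%")
# 3 1/2-digit meter: resolution ratio ≈ 0.05%
# 4 1/2-digit meter: resolution ratio ≈ 0.005%
# 5 1/2-digit meter: resolution ratio ≈ 0.0005%
```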
It should be pointed out that resolution and accuracy are two different concepts. Resolution characterizes the "sensitivity" of the instrument, i.e. its ability to distinguish very small voltages; accuracy characterizes how "correct" the measurement is, i.e. how well the measured result agrees with the true value. Accuracy depends on the combined error and the quantization error of the A/D converter and the function converters inside the instrument, and it has no necessary connection with resolution, so the two must not be confused, nor should the resolution (or resolution ratio) be mistaken for something akin to accuracy.

From a measurement standpoint, resolution is a "virtual" figure (it does not determine the measurement error), while accuracy is a "real" figure (it does determine the measurement error). It is therefore not advisable to arbitrarily increase the number of display digits merely to improve an instrument's resolution.
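To make the distinction concrete, the sketch below contrasts the two figures for a hypothetical meter: a 6½-digit display gives microvolt resolution, yet with an assumed 0.1 % accuracy specification the measurement error is still three orders of magnitude larger. All numbers here are assumptions for illustration.

```python
full_scale_V = 2.0            # assumed range
display_counts = 2_000_000    # hypothetical 6 1/2-digit display
accuracy_pct = 0.1            # hypothetical ±0.1% of reading spec
reading_V = 1.0

resolution_uV = full_scale_V / display_counts * 1e6
accuracy_error_uV = reading_V * accuracy_pct / 100 * 1e6

print(f"resolution:             {resolution_uV:.0f} µV")       # 1 µV
print(f"accuracy-limited error: ±{accuracy_error_uV:.0f} µV")   # ±1000 µV
```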