A multimeter is an instrument that combines several measurement functions in one unit; a typical multimeter measures voltage, current, and resistance. Analog multimeters move a pointer across a scale in proportion to the measured quantity; digital multimeters give a numerical display by use of an analog-to-digital converter.
A multimeter can be a hand-held device useful for basic fault finding and field service work or a bench instrument which can measure to a very high degree of accuracy. They can be used to troubleshoot electrical problems in a wide array of industrial and household devices such as electronic equipment, motor controls, domestic appliances, power supplies, and wiring systems.
Multimeters are available in a wide range of features and prices. Inexpensive multimeters can cost less than US$10, while top-of-the-line models can cost more than US$5,000.
The resolution of a multimeter is often specified in "digits" of resolution. For example, the term 5½ digits refers to the number of digits displayed on the readout of a multimeter.
By convention, a half digit can display either a zero or a one, while a three-quarters digit can display a numeral higher than one but not as high as nine; commonly, a three-quarters digit has a maximum value of 3 or 5. The fractional digit is always the most significant digit in the displayed value. A 5½ digit multimeter would have five full digits that display values from 0 to 9 and one half digit that could only display 0 or 1. Such a meter could show positive or negative values from 0 to 199,999. A 3¾ digit meter can display a quantity from 0 to 3,999 or 5,999, depending on the manufacturer.
While a digital display can easily be extended in precision, the extra digits are of no value if not accompanied by care in the design and calibration of the analog portions of the multimeter. Meaningful high-resolution measurements require a good understanding of the instrument specifications, good control of the measurement conditions, and traceability of the calibration of the instrument.
Specifying "display counts" is another way to specify the resolution. Display counts give the largest number, or the largest number plus one (so the count number looks nicer) the multimeter' display can show, ignoring a decimal separator. For example, a 5½ digit multimeter can also be specified as a 199999 display count or 200000 display count multimeter. Often the display count is just called the count in multimeter specifications.
Resolution of analog multimeters is limited by the width of the scale pointer, vibration of the pointer, the accuracy of printing of scales, zero calibration, number of ranges, and errors due to non-horizontal use of the mechanical display. Accuracy of readings obtained is also often compromised by miscounting division markings, errors in mental arithmetic, parallax observation errors, and less than perfect eyesight. Mirrored scales and larger meter movements are used to improve resolution; a resolution equivalent to two and a half to three digits is usual, and is generally adequate for the limited precision needed in most measurements.
Resistance measurements, in particular, are of low precision due to the typical resistance measurement circuit which compresses the scale heavily at the higher resistance values. Inexpensive analog meters may have only a single resistance scale, seriously restricting the range of precise measurements. Typically an analog meter will have a panel adjustment to set the zero-ohms calibration of the meter, to compensate for the varying voltage of the meter battery.
Digital multimeters generally take measurements with accuracy superior to their analog counterparts. Standard analog multimeters measure with typically three percent accuracy, though instruments of higher accuracy are made. Standard portable digital multimeters are specified to have an accuracy of typically 0.5% on the DC voltage ranges. Mainstream bench-top multimeters are available with specified accuracy of better than ±0.01%. Laboratory grade instruments can have accuracies of a few parts per million.
Accuracy figures need to be interpreted with care. The accuracy of an analog instrument usually refers to full-scale deflection; a measurement of 10V on the 100V scale of a 3% meter is subject to an error of 3V, 30% of the reading. Digital meters usually specify accuracy as a percentage of reading plus a percentage of full-scale value, sometimes expressed in counts rather than percentage terms.
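The two conventions above can be made concrete with a short sketch of the worst-case error under each style of specification (function and parameter names are illustrative, not from any meter's datasheet):

```python
def analog_error(full_scale, accuracy_pct):
    """Worst-case error of an analog meter: the accuracy figure
    applies to full-scale deflection, not to the reading."""
    return full_scale * accuracy_pct / 100

def digital_error(reading, pct_reading, counts=0, count_value=0.0):
    """Worst-case error of a digital meter specified as a percentage
    of reading plus a fixed number of counts, where count_value is
    the value of one count (the resolution) on the selected range."""
    return reading * pct_reading / 100 + counts * count_value

# Reading 10 V on the 100 V scale of a 3% analog meter:
err = analog_error(100, 3)
print(err)                 # 3.0 V of possible error
print(err / 10 * 100)      # 30.0 percent of the 10 V reading

# Reading 10 V on a DMM specified as +/-(0.5% of reading + 2 counts),
# assuming one count is worth 0.001 V on this range:
print(digital_error(10, 0.5, counts=2, count_value=0.001))  # 0.052 V
```

The contrast makes the point in the text explicit: the analog meter's 3 V error band dwarfs the digital meter's roughly 0.05 V band on the same 10 V reading.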
A multimeter's quoted accuracy is specified as being that of the lowest (mV) DC voltage range, and is known as the "basic DC volts accuracy" figure. Higher DC voltage ranges, current, resistance, AC and other ranges will usually have a lower accuracy than the basic DC volts figure. AC measurements only meet specified accuracy within a specified range of frequencies.
Manufacturers can provide calibration services so that new meters may be purchased with a certificate of calibration indicating the meter has been adjusted to standards traceable to, for example, the National Institute of Standards and Technology (NIST) in the United States, or another national standards laboratory.
Test equipment drifts out of calibration over time, and the specified accuracy cannot be relied upon indefinitely. For more expensive equipment, manufacturers and third parties provide calibration services so that older equipment may be re-calibrated and re-certified. The cost of such services is disproportionate for inexpensive equipment; however, extreme accuracy is not required for most routine testing. A multimeter used for critical measurements may be part of a metrology program to assure calibration.
A DIY multimeter kit provides everything needed to build a multimeter capable of measuring voltage (0-30 V DC, 0.05 V resolution), current (0-500 mA, 1 mA resolution), and resistance (0-100 kΩ). The resistance mode also includes a continuity test: the buzzer sounds when the probes are shorted together, which is one of the handier functions of any multimeter.
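The continuity test described above reduces to a simple threshold check on the measured resistance; a minimal sketch of that logic follows (the threshold value and names are assumptions for illustration, not from the kit's documentation):

```python
# Assumed threshold: real meters typically beep somewhere in the
# tens-of-ohms region, and the exact value varies by model.
CONTINUITY_THRESHOLD_OHMS = 50

def buzzer_on(resistance_ohms):
    """Return True (sound the buzzer) when the probes see a near-short."""
    return resistance_ohms < CONTINUITY_THRESHOLD_OHMS

print(buzzer_on(0.3))     # shorted probes: True, buzzer sounds
print(buzzer_on(10_000))  # open-ish circuit: False, buzzer silent
```

In a real meter this comparison runs continuously on the live reading, so the buzzer responds immediately as the probes make or break contact.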