A digital voltmeter (DVM) measures an unknown input voltage by converting the voltage to a digital value and then displaying the voltage in numeric form. DVMs are usually designed around a special type of analog-to-digital converter called an integrating converter.
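The most common integrating design is the dual-slope converter: the input charges an integrator for a fixed run-up time, then a reference of opposite polarity discharges it while a counter times the run-down, so the final count is proportional to the input and independent of the integrator's RC product. The following Python sketch illustrates that arithmetic only; the reference voltage, run-up time, and clock rate are illustrative values, not those of any particular instrument.

```python
# Minimal numerical sketch of a dual-slope integrating conversion
# (illustrative constants; a real DVM does this in analog hardware).
V_REF = 10.0          # reference voltage, volts (assumed)
T_INTEGRATE = 0.1     # fixed run-up time, seconds (assumed)
CLOCK_HZ = 1_000_000  # counter clock, hertz (assumed)

def dual_slope_convert(v_in: float) -> float:
    """Convert v_in (0..V_REF) to a voltage reading.

    Phase 1: integrate the unknown input for the fixed time T_INTEGRATE.
    Phase 2: integrate -V_REF and count clock ticks until the integrator
    output returns to zero. The tick count is proportional to v_in.
    """
    # Integrator output after phase 1 (the RC product cancels, so use RC = 1).
    peak = v_in * T_INTEGRATE
    # Phase 2: time to ramp back to zero at slope V_REF.
    t_rundown = peak / V_REF
    # Quantize the run-down time with the counter clock.
    ticks = round(t_rundown * CLOCK_HZ)
    # Reconstruct the voltage from the count.
    return V_REF * ticks / (T_INTEGRATE * CLOCK_HZ)

print(dual_slope_convert(3.14159))  # ~3.1416, resolution set by the tick count
```

With the assumed 1 MHz clock and 0.1 s run-up, one count corresponds to 100 µV on a 10 V range; a longer run-up or faster clock yields more digits, and choosing the run-up time as a whole number of mains cycles averages out 50/60 Hz interference, one reason integrating converters suit DVMs.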
DVM measurement accuracy is affected by many factors, including temperature, input impedance, and DVM power supply voltage variations. Less expensive DVMs often have an input resistance on the order of 10 MΩ. Precision DVMs can have input resistances of 1 GΩ or higher for the lower voltage ranges (e.g. less than 20 V). To ensure that a DVM's accuracy is within the manufacturer's specified tolerances, it must be periodically calibrated against a voltage standard such as the Weston cell.
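Input resistance matters because the meter forms a voltage divider with the source's own (Thévenin) resistance, so a meter with low input resistance loads the circuit and reads low. A minimal Python illustration, using an assumed 5 V source with 100 kΩ source resistance (hypothetical values, not from the text above), compares the loading error at the two input resistances mentioned:

```python
# Loading error from the meter's finite input resistance: source resistance
# and meter input resistance form a voltage divider (values illustrative).
def reading(v_source: float, r_source: float, r_meter: float) -> float:
    """Voltage actually seen at the meter's input terminals."""
    return v_source * r_meter / (r_source + r_meter)

V_SRC = 5.0    # true open-circuit voltage, volts (assumed)
R_SRC = 100e3  # source (Thevenin) resistance, ohms (assumed)

for r_in in (10e6, 1e9):  # budget DVM vs. precision DVM input resistance
    v = reading(V_SRC, R_SRC, r_in)
    err_pct = 100 * (v - V_SRC) / V_SRC
    print(f"R_in = {r_in:.0e} ohm: reads {v:.5f} V ({err_pct:+.3f}%)")
```

For this assumed source, the 10 MΩ meter reads about 1% low, while the 1 GΩ meter's loading error is near 0.01%.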
The first digital voltmeter was invented and produced by Andrew Kay of Non-Linear Systems (and later founder of Kaypro) in 1954.

Simple AC voltmeters use a rectifier connected to a DC measurement circuit, which responds to the average value of the waveform. The meter can be calibrated to display the
root mean square (RMS) value of the waveform, assuming a fixed relation between the average value of the rectified waveform and the RMS value. If the waveform departs significantly from the sine wave assumed in the calibration, the meter will be inaccurate, though for simple wave shapes the reading can be corrected by multiplying by a constant factor. Early "true RMS" circuits used a thermal converter that responded only to the RMS value of the waveform. Modern instruments calculate the RMS value electronically: the input is squared, the result averaged, and the square root taken. This allows accurate RMS measurements for a variety of waveforms.
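Both behaviors are easy to reproduce numerically. The Python sketch below models an average-responding meter by applying the sine-wave form factor π/(2√2) ≈ 1.111 to the rectified mean, and a true-RMS meter by squaring, averaging, and taking the square root; the unit amplitudes and sample count are arbitrary choices for the illustration.

```python
import numpy as np

# One period of each test waveform, unit amplitude (sample count arbitrary).
t = np.linspace(0.0, 1.0, 10_000, endpoint=False)
waveforms = {
    "sine": np.sin(2 * np.pi * t),
    "square": np.sign(np.sin(2 * np.pi * t)),
    "triangle": 2 * np.abs(2 * (t - np.floor(t + 0.5))) - 1,
}

# Form factor of a sine wave: RMS / mean-of-rectified = pi / (2*sqrt(2)).
SINE_FORM_FACTOR = np.pi / (2 * np.sqrt(2))  # ~1.1107

for name, v in waveforms.items():
    true_rms = np.sqrt(np.mean(v ** 2))        # square, average, square root
    avg_responding = np.mean(np.abs(v)) * SINE_FORM_FACTOR  # rectifier meter
    error = 100 * (avg_responding - true_rms) / true_rms
    print(f"{name:8s} true RMS = {true_rms:.4f}  "
          f"avg-responding = {avg_responding:.4f}  error = {error:+.1f}%")
```

For the sine wave the two readings agree; the average-responding reading comes out about 11% high for the square wave and about 4% low for the triangle wave, which is why fixed correction factors work for simple wave shapes.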