What is a Digital Voltmeter?


A voltmeter is an instrument used to measure the electrical potential difference between two points in an electric circuit. An analog voltmeter moves a pointer across a scale in proportion to the voltage of the circuit; a digital voltmeter gives a numerical display of the voltage by means of an analog-to-digital converter. An analog-to-digital converter (abbreviated ADC, A/D, or A-to-D) is an electronic device that converts an input analog voltage or current to a digital number proportional to its magnitude. Voltmeters are made in a wide range of styles. Instruments permanently mounted in a panel are used to monitor generators or other fixed apparatus.
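The ADC's job of producing "a digital number proportional to the magnitude of the voltage" can be sketched in a few lines. This is a minimal model of an ideal converter; the function name and parameter values (5 V reference, 10 bits) are illustrative assumptions, not a specific device.

```python
# Idealized n-bit ADC: maps an input voltage in 0..v_ref to an integer
# code proportional to that voltage. Parameters are illustrative.

def quantize(v_in: float, v_ref: float = 5.0, n_bits: int = 10) -> int:
    """Return the digital code an ideal n-bit ADC would produce for v_in."""
    levels = 2 ** n_bits                   # number of discrete output codes
    code = int(v_in / v_ref * levels)      # truncate to the level below
    return min(max(code, 0), levels - 1)   # clamp to the valid code range

print(quantize(2.5))   # mid-scale input on a 10-bit, 5 V ADC -> 512
```

A real converter adds quantization error of up to one code step, which is why more bits give a finer voltage resolution.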


Portable instruments, usually equipped to also measure current and resistance in the form of a multimeter, are standard test instruments used in electrical and electronics work. General-purpose analog voltmeters may have an accuracy of a few percent of full scale, and are used with voltages from a fraction of a volt to several thousand volts. Digital meters can be made with high accuracy, typically better than 1%. Specially calibrated test instruments have higher accuracy, with laboratory instruments capable of measuring to an accuracy of a few parts per million.
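"A few percent of full scale" means the absolute error band is fixed by the selected range rather than by the reading itself, so small readings on a large range carry a big relative error. A short sketch of that arithmetic, using illustrative numbers (a 2% meter on a 300 V range):

```python
# "Percent of full scale" accuracy: the absolute error band depends on
# the range, not the reading. The 2% / 300 V figures are illustrative.

def full_scale_error(full_scale_v: float, percent: float) -> float:
    """Absolute error band (in volts) for a percent-of-full-scale spec."""
    return full_scale_v * percent / 100.0

err = full_scale_error(300.0, 2.0)
print(err)   # +/- 6 V anywhere on the 300 V range
# At a 30 V reading, that +/-6 V band is a 20% relative error,
# which is why readings are best taken near the top of a range.
```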


What are the components of a Digital Voltmeter?

Digital voltmeters (DVMs) are usually designed around a special type of analog-to-digital converter called an integrating converter. DVMs always have input amplifiers and generally present a constant input resistance of 10 megohms regardless of the selected measurement range. Voltmeter accuracy is affected by many factors, including temperature and supply-voltage variations. To ensure that a digital voltmeter's readings stay within the manufacturer's specified tolerances, it should be periodically calibrated against a voltage standard such as a Weston cell, a wet-chemical cell that produces a highly stable voltage suitable as a laboratory standard for calibrating voltmeters.


Who invented the Digital voltmeter?

The first digital voltmeter was invented and produced in 1954 by Andrew Kay of Non-Linear Systems (who later founded Kaypro Corporation, an American home/personal computer manufacturer). Andrew Kay is President and CEO of Kay Computers, a personal computer firm.


How does the Digital voltmeter display the measured voltage?

DVMs display the measured voltage on an LCD (liquid crystal display) or LED (light-emitting diode) readout, presenting the result in a floating-point (decimal) format. DVMs usually offer measurement ranges of 0-0.3 V, 0-3 V, 0-30 V, 0-300 V, and so on. It is not clear why those particular ranges were chosen, but they are commonplace.
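The set of ranges above lends itself to a simple auto-ranging rule: pick the smallest full-scale range that can accommodate the input. A minimal sketch, assuming the conventional 0.3/3/30/300 V ranges from the text (the function name is illustrative):

```python
# Auto-range sketch: choose the smallest conventional full-scale range
# that can hold the measured voltage. Ranges taken from the text.

RANGES = [0.3, 3.0, 30.0, 300.0]  # full-scale values in volts

def select_range(v: float) -> float:
    """Return the smallest full-scale range that covers |v|."""
    for fs in RANGES:
        if abs(v) <= fs:
            return fs
    raise ValueError("input exceeds the highest range")

print(select_range(1.5))   # 1.5 V fits on the 3 V range -> 3.0
```

Choosing the smallest usable range keeps the reading near full scale, where the display uses the most digits.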


How is the circuit designed in the Digital Voltmeter?

A digital voltmeter of this kind is ideal for measuring the output voltage of a DC (direct current) power supply. It includes a 3.5-digit LED display with a negative-voltage indicator, and it measures DC voltages from 0 to 199.9 V with a resolution of 0.1 V. The voltmeter is based on a single ICL7107 chip and can be fitted on a small 3 cm × 7 cm printed circuit board. The circuit requires a 5 V supply and draws only about 25 mA (milliamperes).
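The 0-199.9 V range and 0.1 V resolution follow directly from the 3.5-digit display, which can count from 0 to 1999. A short sketch of that arithmetic and of how such a meter would format a reading (the `display` helper is illustrative, not the ICL7107's actual behavior):

```python
# A 3.5-digit display counts 0..1999. With the decimal point placed for
# a 199.9 V full scale, each count is worth 0.1 V, matching the text.

MAX_COUNT = 1999        # highest count a 3.5-digit display can show
FULL_SCALE_V = 199.9    # range stated in the text

resolution = FULL_SCALE_V / MAX_COUNT   # volts per count
print(round(resolution, 4))             # -> 0.1

def display(v: float) -> str:
    """Format a DC reading the way a 3.5-digit meter would show it."""
    counts = min(round(abs(v) / 0.1), MAX_COUNT)  # clamp at full scale
    sign = "-" if v < 0 else ""                   # negative indicator
    return f"{sign}{counts / 10:.1f}"

print(display(-12.34))  # rounds to the 0.1 V step -> -12.3
```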


What is the problem with most commercial DVM modules?

These days, digital voltmeter modules are available at almost every local electronics store. They share one big problem: the module cannot be powered from the same supply it is meant to measure. In other words, the voltmeter module requires an independent, ground-free power source, so we end up building a second transformer into our equipment. One solution is to use a small microcontroller with a built-in ADC (analog-to-digital converter) and add a display.
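The firmware side of that microcontroller approach is mostly scaling arithmetic: the built-in ADC samples a divided-down input, and the code converts the raw count back to the real voltage. A minimal sketch; the reference voltage, ADC width, and divider ratio are illustrative assumptions, not values from the text.

```python
# Sketch of the conversion math for a microcontroller-based voltmeter.
# The input passes through a resistor divider so it stays within the
# ADC's range; firmware scales the raw count back up. All constants
# below are illustrative assumptions.

V_REF = 5.0      # ADC reference voltage
ADC_MAX = 1023   # full-scale count of a 10-bit ADC
DIVIDER = 11.0   # e.g. a 100k/10k divider attenuates the input 11:1

def adc_to_volts(raw_count: int) -> float:
    """Convert a raw ADC count to the actual input voltage."""
    v_at_pin = raw_count / ADC_MAX * V_REF  # voltage seen at the ADC pin
    return v_at_pin * DIVIDER               # undo the divider attenuation

print(adc_to_volts(1023))  # full-scale count -> 55.0 V at the input
```

Because the ADC shares the microcontroller's own supply and ground, no separate isolated supply is needed for measurements referenced to that ground.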

