A voltmeter, also known as a voltage meter, is an instrument used for measuring the potential difference, or voltage, between two points in an electrical or electronic circuit. Some voltmeters are intended for use in direct current (DC) circuits; others are designed for alternating current (AC) circuits. Specialized voltmeters can measure radio frequency (RF) voltage.
A basic analog voltmeter consists of a sensitive galvanometer (current meter) in series with a high resistance. The internal resistance of a voltmeter must be high; otherwise it draws significant current and thereby disturbs the operation of the circuit under test. The sensitivity of the galvanometer and the value of the series resistance together determine the range of voltages that the meter can display.
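The relationship between the galvanometer and the series resistance can be sketched with a short calculation. The formula follows directly from Ohm's law: the series ("multiplier") resistance is chosen so that the desired full-scale voltage drives exactly the full-scale current through the movement. The galvanometer parameters below (a 50 µA movement with a 2 kΩ coil) are illustrative assumptions, not values from the text.

```python
# Sketch: sizing the series ("multiplier") resistor that turns a
# sensitive galvanometer into a voltmeter for a chosen range.
# By Ohm's law: R_series = V_full_scale / I_full_scale - R_coil.

def multiplier_resistance(v_full_scale, i_full_scale, r_coil):
    """Series resistance (ohms) so that v_full_scale drives exactly
    i_full_scale (amperes) through a galvanometer with coil
    resistance r_coil (ohms)."""
    return v_full_scale / i_full_scale - r_coil

# Illustrative example: a 50 uA movement with a 2 kilohm coil,
# extended to a 10 V full-scale range.
r_s = multiplier_resistance(10.0, 50e-6, 2000.0)
print(f"Series resistance: {r_s:.0f} ohms")  # 198000 ohms
```

Note that the total input resistance (coil plus multiplier) is what loads the circuit under test, which is why higher voltage ranges give a higher input resistance on this design.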
A digital voltmeter shows voltage directly as numerals. Some of these meters can determine voltage values to several significant figures. Practical laboratory voltmeters have maximum ranges of 1000 to 3000 volts (V). Most commercially manufactured voltmeters have several scales, increasing in powers of 10; for example, 0-1 V, 0-10 V, 0-100 V, and 0-1000 V.
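Selecting among such power-of-10 scales amounts to picking the lowest full-scale value that still contains the reading, which gives the best resolution. A minimal sketch of that logic, assuming the 0-1 V through 0-1000 V ranges mentioned above:

```python
# Sketch: choosing the lowest scale on a multi-range voltmeter
# (0-1 V, 0-10 V, 0-100 V, 0-1000 V) that can display a reading.

SCALES_V = [1, 10, 100, 1000]  # full-scale values in volts

def best_scale(voltage):
    """Return the smallest full-scale value that covers the voltage
    (for best resolution), or None if it exceeds every range."""
    for full_scale in SCALES_V:
        if abs(voltage) <= full_scale:
            return full_scale
    return None

print(best_scale(7.3))    # 0-10 V range -> 10
print(best_scale(230.0))  # 0-1000 V range -> 1000
```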
An oscilloscope can be used to measure low voltages; the vertical displacement corresponds to the instantaneous voltage. Oscilloscopes are also excellent for the measurement of peak and peak-to-peak voltages in AC and RF applications. Voltmeters for measuring high potential differences require heavy-duty probes, wiring, and insulators.
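For a pure sine wave, the peak and peak-to-peak values seen on an oscilloscope relate to the RMS value a typical AC voltmeter reports by fixed factors of √2 and 2√2. A small sketch of the conversion (the 120 V mains figure is an illustrative value, not from the text):

```python
import math

# Sketch: converting the RMS voltage an AC voltmeter reports into
# the peak and peak-to-peak values read off an oscilloscope trace.
# These factors hold only for a pure sine wave.

def peak_from_rms(v_rms):
    """Peak voltage of a sine wave: V_pk = sqrt(2) * V_rms."""
    return math.sqrt(2) * v_rms

def peak_to_peak_from_rms(v_rms):
    """Peak-to-peak voltage of a sine wave: V_pp = 2 * sqrt(2) * V_rms."""
    return 2 * math.sqrt(2) * v_rms

# Illustrative example: 120 V RMS mains.
print(f"{peak_from_rms(120.0):.1f} V peak")            # 169.7 V
print(f"{peak_to_peak_from_rms(120.0):.1f} V pk-pk")   # 339.4 V
```

For non-sinusoidal waveforms these factors do not apply, which is one reason the oscilloscope, showing the instantaneous waveform, is preferred for peak measurements in AC and RF work.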
In computer practice, standard lab voltmeters are adequate because the voltages encountered are moderate, usually between 1 V and 15 V. Cathode-ray-tube (CRT) monitors operate at several hundred volts. A typical lab voltmeter can indicate these voltages, but CRT units should be serviced only by qualified technicians because the voltages are high enough to be lethal.