What Is An ADC?
ADCs convert an infinitely variable analogue input signal into a finite number of discrete digital output codes. The number of bits in each output code is the converter’s resolution.
The accuracy of an ADC can be improved by a technique called dither. This adds a small amount of random noise to the sampled analog input before conversion.
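As a sketch of why dither helps, the toy Python model below quantizes a fixed input that falls between two codes; the step size, input value and noise amplitude are illustrative assumptions, not values from any particular converter. Without dither every conversion returns the same code, while averaging many dithered conversions recovers the input to well below one step.

```python
import random

def quantize(x, lsb):
    """Round an analog value to the nearest quantization step."""
    return round(x / lsb) * lsb

LSB = 0.1          # hypothetical step size of a coarse ADC
true_value = 0.13  # input lies between two codes (0.1 and 0.2)

# Without dither, every conversion returns the same code: 0.1.
plain = quantize(true_value, LSB)

# With dither, noise pushes some samples over the code threshold, so
# averaging many conversions recovers a value close to 0.13.
random.seed(0)
samples = [quantize(true_value + random.uniform(-LSB / 2, LSB / 2), LSB)
           for _ in range(10000)]
dithered = sum(samples) / len(samples)
```

The added noise trades a little per-sample accuracy for the ability to resolve detail smaller than one LSB through averaging.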
Analog to Digital Conversion
An analog-to-digital converter, or ADC, converts a continuous analog input signal into a set of discrete digital values. The ADC’s output, the list of digital numbers, is then processed by a digital system to reconstruct or analyze the original analog signal. Because the conversion process involves quantization, it necessarily introduces a small amount of error into the output, which can be reduced by increasing the number of bits (resolution) used to represent the signal.
ADC circuitry samples an analog signal at fixed time intervals, known as the sampling period. It is critical that the sample rate be at least twice the highest frequency present in the input signal (the Nyquist rate). Otherwise, any frequency component above half the sample rate will be incorrectly reconstructed as a lower-frequency signal, an effect known as aliasing.
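Aliasing can be demonstrated numerically; the sample rate and tone frequencies below are arbitrary choices for illustration. A 90 Hz tone sampled at only 100 Hz produces samples that are indistinguishable (up to a sign flip) from those of a 10 Hz tone:

```python
import math

fs = 100.0           # sampling rate in Hz (assumed for this example)
f_in = 90.0          # input tone above fs/2, so it will alias
f_alias = fs - f_in  # expected alias frequency: 10 Hz

# Sample both tones at the same instants.
hi = [math.sin(2 * math.pi * f_in * k / fs) for k in range(64)]
lo = [math.sin(2 * math.pi * f_alias * k / fs) for k in range(64)]

# The 90 Hz samples equal the negated 10 Hz samples, so their sum vanishes:
max_diff = max(abs(a + b) for a, b in zip(hi, lo))
```

Once sampled, nothing in the data distinguishes the real 90 Hz tone from a 10 Hz one, which is why anti-aliasing filters are placed before the ADC rather than after it.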
The ADC’s output, the list containing the digital values, is then sent to the computer for further processing. After processing, the digital values can be converted back into an analog electrical signal, so real-world electrical signals can be manipulated by the software in the computer.
Analog-to-digital converters are used in a wide range of devices, from sound recording and transmission to physiological signals such as the electroencephalogram and the electrocardiogram. They are also often found in electronic sensors and actuators that translate physical quantities into signals that can be controlled by computers.
An ADC can be built on a single chip, but more complex designs typically consist of several stages, each with its own separate circuitry. The first stage typically contains a sample-and-hold (S/H) circuit that is followed by an m-bit flash ADC and a digital low-pass filter. The signal in the S/H circuit is sampled at each time interval, and the ADC outputs a two’s complement binary code representing the amplitude of the sampled voltage.
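As a sketch of that final step, the helper below maps a held voltage to a two’s complement code; the reference voltage, bit width and rounding scheme are assumptions for illustration, not the behaviour of any specific flash ADC.

```python
def to_twos_complement(v, v_ref=1.0, bits=8):
    """Map a voltage in [-v_ref, +v_ref) to a two's complement code.

    v_ref and bits are illustrative assumptions, not fixed standards.
    """
    levels = 1 << bits                      # total number of codes (2**bits)
    code = int(round(v / v_ref * (levels // 2)))
    code = max(-(levels // 2), min(levels // 2 - 1, code))  # clip to range
    return code & (levels - 1)              # fold negatives into two's complement

# Half of positive full scale, zero, and half of negative full scale:
# to_twos_complement(0.5)  -> 64  (0b01000000)
# to_twos_complement(0.0)  -> 0
# to_twos_complement(-0.5) -> 192 (0b11000000)
```

The final bit-mask is what makes negative voltages come out as the high-half codes characteristic of two’s complement encoding.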
Digital to Analog Conversion
DACs take digital inputs and convert them into an analog electrical output. This output can, for example, drive an audio amplifier that in turn produces sound. Digital circuits, such as microprocessor-controlled systems, Arduinos and Raspberry Pis, work with discrete digital signals that have only two states, a logic “1” (HIGH) or a logic “0” (LOW). It is therefore necessary to electronically translate these signals into analogue form so they can drive devices that are inherently analogue, such as motors, speakers and analogue meters.
Essentially an ADC takes a snapshot of an analogue signal at a given point in time and creates a digital output code that represents that snapshot. The number of bits in this digital representation of the analogue signal determines its resolution.
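The link between resolution and the smallest voltage step a converter can distinguish can be sketched as follows; the full-scale range and bit count are example values only.

```python
def lsb_size(v_full_scale, bits):
    """Voltage step represented by one output code (the LSB)."""
    return v_full_scale / (2 ** bits)

# A hypothetical 10-bit converter with a 5 V range resolves steps
# of about 4.88 mV:
# lsb_size(5.0, 10) -> 0.0048828125
```

Each extra bit halves the step size, which is why resolution is usually quoted in bits rather than volts.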
For a DAC to function correctly it is critical that the op amp at its output produce an output waveform that is very close to its input signal. This requires that the op amp have very high slew rates and wide bandwidth.
It is also critical that the DAC’s internal architecture accurately translate each digital input word into its corresponding output voltage. How quickly it can do so is its conversion speed, which sets the maximum data rate the DAC can support. The accuracy of a DAC is typically measured in terms of integral and differential nonlinearity.
Integral nonlinearity (INL) is the maximum deviation of the actual output from the ideal straight-line transfer function across all input words, expressed in least significant bits (LSBs). Differential nonlinearity (DNL) is the maximum deviation of any step between two adjacent input codes from the ideal step of exactly one LSB, and indicates how much distortion the DAC may introduce. For both types of error it is important that the DAC’s settling time be much shorter than its conversion period to avoid errors in the output.
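One plausible way to compute these figures from a measured transfer curve is sketched below; the 2-bit example curve and ideal step size are invented for illustration.

```python
def nonlinearity(outputs, lsb):
    """Compute worst-case DNL and INL (in LSBs) from measured outputs.

    outputs[i] is the voltage produced by input code i; lsb is the
    ideal step size. Both inputs are illustrative assumptions here.
    """
    # DNL: how far each actual step deviates from the ideal 1-LSB step.
    dnl = [(outputs[i + 1] - outputs[i]) / lsb - 1.0
           for i in range(len(outputs) - 1)]
    # INL: how far each output deviates from the ideal straight line.
    inl = [(outputs[i] - i * lsb) / lsb for i in range(len(outputs))]
    return max(abs(d) for d in dnl), max(abs(e) for e in inl)

# A hypothetical 2-bit DAC whose second step is 0.3 LSB too large:
dnl_max, inl_max = nonlinearity([0.0, 1.0, 2.3, 3.3], lsb=1.0)
```

Real characterization procedures also fit out gain and offset error before computing INL, which this sketch omits for brevity.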
ADCs and DACs in Data Acquisition
An analog-to-digital converter, or ADC, converts an incoming analog signal into a set of digital representations. It samples the signal at regular intervals, and each sample corresponds to a unique digital value that can be read by a computer. The conversion relies on a reference voltage against which the input is compared. ADC circuits are used in all sorts of data acquisition, or DAQ, systems, including industrial control and monitoring systems.
An ADC is usually paired with a digital-to-analog converter, or DAC, to turn the digital representation back into an analog electrical signal that can drive output devices. An ADC is most often controlled by a computer, and a conversion typically takes some tens of microseconds to complete. The converter generates an interrupt to let the program know when it is finished, and the computer can then read the converted data.
Digital-to-analog converters are characterized by their resolution, which is the number of distinct levels the DAC can produce within its allowed range of output voltages. Likewise, an ADC’s resolution determines how much error, or quantization noise, its conversion process introduces.
A higher-resolution ADC uses more bits per sample and can therefore provide a more accurate digital representation of the original analog signal, but it is still subject to the same error sources, such as quantization noise and distortion, as any other ADC.
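The textbook rule of thumb linking resolution to quantization noise, SNR ≈ 6.02·N + 1.76 dB for an ideal converter digitizing a full-scale sine wave, can be expressed as a small helper:

```python
def ideal_snr_db(bits):
    """Theoretical SNR of an ideal ADC quantizing a full-scale sine wave."""
    return 6.02 * bits + 1.76

# Each extra bit buys roughly 6 dB of signal-to-noise ratio:
# ideal_snr_db(8)  -> about 49.9 dB
# ideal_snr_db(16) -> about 98.1 dB
```

Real converters fall short of this figure because of the distortion and circuit noise mentioned above, which is why datasheets quote an "effective number of bits" alongside the nominal resolution.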
The list of possible analog input signals is long, ranging from temperature changes and other continuous measurements to periodic signals such as audio or video. Some, like temperature or climatic variables, change slowly, while others, such as audio or video, have high frequency content. Some of these signals have a wide dynamic range, while others are limited to a few kilohertz of bandwidth.