Wednesday, March 4, 2009

ADC Definitions and Specifications-1


Introduction
This application note will help users of analog-to-digital converters (ADCs) understand the terminology commonly used in the electronics industry to define ADC operation and performance. Many terms and parameters are used to specify ADC performance. This document covers common definitions and numerical specifications, the differences between them, and issues with the definitions. By understanding the terminology used to specify various ADC parameters, a systems designer can better understand how to obtain the greatest overall system performance from the features of any given ADC system.

Terms and Definitions
The following terms are used in the electronics industry to define ADC operation.

Measurement Units
There are several units commonly used to measure ADC performance. Improper or inconsistent use of these units may result in confusion and/or misinterpretation of performance. Common measurement units used in the industry are described here; a short worked example in C follows the definitions. (The examples below assume a 10-bit, 5.12-V ADC for which a 2.56-V input ideally converts to code $200.)

Volts (V) — The error voltage is the difference between the input voltage that converts to a given code and the ideal input voltage for the same code. When the error is measured in volts, it is tied to the actual voltages and is not normalized to or dependent on the input range or supply voltage. This measure is useful for fixed error sources such as offset, but it does not relate well to the error observed in the conversion result.

Least Significant Bits (LSB) — A least significant bit (LSB) is a unit of voltage equal to the resolution of the ADC, that is, the full-scale input range divided by the number of codes. This unit of measure relates the error voltage approximately to the error observed in the conversion code (code error), and is useful for systematic errors such as differential non-linearity. A 2.56-V input on an ADC with ±3 LSB of error could read between $1FD and $203. This is by far the most common unit and is the preferred unit for error representation in this document.

Percent Full-Scale Value (%FSV) — Percent of full-scale value is a unit of voltage equal to 1/100th of the input range of the ADC. This unit clarifies the size of an error relative to the input range, and is useful for trimmable errors such as offset or gain errors. It is, however, difficult to translate accurately into observed code error.

Counts — A count is a unit of voltage equal to one LSB. The term is unique to the specifications of some Freescale ADCs and may confuse customers making cross-vendor comparisons.

Bits — A bit is a unit equal to the log (base 2) of the error voltage normalized to the resolution of the ADC; an error of N bits corresponds to 2^N LSB of error. This measure is easily confused with LSB and is difficult to interpret between integer values.

Decibels (dB) — A decibel is a unit equal to 20 times the log (base 10) of the error voltage normalized to the full-scale value: 20*log10(err_volts/input_range). A 2.56-V input on an ADC with an error of -50 dB will convert between $1FD and $203. This unit is often used in the communications field and is infrequently used in control or monitoring applications.
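
To make the relationships between these units concrete, the following minimal C sketch converts a single error voltage into each of the units above for the 10-bit, 5.12-V ADC assumed throughout this note. The program and its constant names are illustrative only, not part of any vendor library:

#include <stdio.h>
#include <math.h>

#define ADC_BITS     10
#define ADC_RANGE_V  5.12
#define ADC_CODES    (1L << ADC_BITS)          /* 1024 codes        */
#define LSB_V        (ADC_RANGE_V / ADC_CODES) /* 0.005 V per LSB   */

int main(void)
{
    double err_v = 3 * LSB_V;  /* a 3-LSB error, expressed in volts */

    double err_lsb  = err_v / LSB_V;                     /* LSB    */
    double err_fsv  = 100.0 * err_v / ADC_RANGE_V;       /* %FSV   */
    double err_bits = log2(err_v / LSB_V);               /* bits   */
    double err_db   = 20.0 * log10(err_v / ADC_RANGE_V); /* dB     */

    printf("%.4f V = %.1f LSB = %.3f %%FSV = %.2f bits = %.1f dB\n",
           err_v, err_lsb, err_fsv, err_bits, err_db);
    return 0;
}

For a 3-LSB error this prints roughly 0.0150 V = 3.0 LSB = 0.293 %FSV = 1.58 bits = -50.7 dB, which agrees with the ±3 LSB and -50 dB examples above.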

ADC Transfer Curves
The ADC converts an input voltage to a corresponding digital code. The curve describing this behavior is the Actual Transfer Function. The Ideal Transfer Function represents this behavior assuming the ADC is perfectly linear; that is, a given change in input voltage produces the same change in conversion code regardless of the input's initial level. The Adjusted Transfer Function represents this behavior after the errors at the endpoints are accounted for.
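
As a concrete illustration, here is a minimal sketch of an Ideal Transfer Function in C for the same assumed 10-bit, 5.12-V ADC, using one common quantization convention (round to the nearest code). Real converters differ in their quantization convention, so treat this as an assumption rather than a universal definition:

#include <stdio.h>
#include <math.h>

#define ADC_BITS     10
#define ADC_RANGE_V  5.12
#define ADC_MAX_CODE ((1 << ADC_BITS) - 1)

/* Ideal Transfer Function: every 1-LSB change in input moves the
 * output by exactly one code, regardless of where the input sits. */
unsigned ideal_code(double vin)
{
    long code = lround(vin / ADC_RANGE_V * (1 << ADC_BITS));
    if (code < 0)            code = 0;            /* clamp below range */
    if (code > ADC_MAX_CODE) code = ADC_MAX_CODE; /* clamp above range */
    return (unsigned)code;
}

int main(void)
{
    printf("2.56 V converts to $%03X\n", ideal_code(2.56)); /* $200 */
    return 0;
}

An Actual Transfer Function would deviate from this curve by the offset, gain, and linearity errors described in this note.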

