The traditional method for quantification and assessment of purity of DNA samples is spectrophotometric measurement at 260 and 280 nm (1). It is generally accepted that a sample of pure, double-stranded DNA at 50 µg/mL will have an absorbance at 260 nm of 1.0, and that the ratio of its absorbances at 260 and 280 nm will be greater than 1.8 (2). If a sample has an A260/A280 ratio of less than 1.8, it is usually considered to be contaminated with protein. In such measurements, little consideration is frequently given to the resolution or bandwidth of the spectrophotometer used.
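This convention amounts to simple arithmetic. The following Python sketch illustrates it; the function names and example readings are illustrative assumptions, not values taken from the cited references:

```python
def dna_concentration_ug_per_ml(a260: float, dilution_factor: float = 1.0) -> float:
    """Double-stranded DNA concentration, using the convention that
    an A260 of 1.0 corresponds to 50 ug/mL of pure dsDNA."""
    return a260 * 50.0 * dilution_factor

def purity_ratio(a260: float, a280: float) -> float:
    """A260/A280 ratio; values below ~1.8 suggest protein contamination."""
    return a260 / a280

# Hypothetical readings for demonstration
a260, a280 = 0.75, 0.40
print(f"Concentration: {dna_concentration_ug_per_ml(a260):.1f} ug/mL")  # 37.5 ug/mL
ratio = purity_ratio(a260, a280)
flag = " (possible protein contamination)" if ratio < 1.8 else ""
print(f"A260/A280 = {ratio:.2f}{flag}")  # 1.88, acceptable
```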
Instrument bandwidth is generally defined as the full width at half height of an absorbance band of a reference material that possesses a natural bandwidth less than or equal to the instrument bandwidth. The ideal reference material would be one with many absorbance bands, each having an infinitely small natural bandwidth. For this reason, the atomic lines of elements such as Hg are used as reference materials for determining instrument bandwidth.
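To make the full-width-at-half-height definition concrete, the sketch below estimates it numerically from a sampled band profile. The Gaussian test band, its 2 nm width, and the function name are assumptions for demonstration only, not data from this note:

```python
import numpy as np

def full_width_at_half_height(wavelengths: np.ndarray, absorbance: np.ndarray) -> float:
    """Estimate the full width at half height of a single band by linearly
    interpolating the half-maximum crossings on each side of the peak."""
    half = absorbance.max() / 2.0
    idx = np.flatnonzero(absorbance >= half)
    i0, i1 = idx[0], idx[-1]

    def crossing(i_lo: int, i_hi: int) -> float:
        # Wavelength at which the profile crosses the half-maximum level
        x0, x1 = wavelengths[i_lo], wavelengths[i_hi]
        y0, y1 = absorbance[i_lo], absorbance[i_hi]
        return x0 + (half - y0) * (x1 - x0) / (y1 - y0)

    left = crossing(i0 - 1, i0) if i0 > 0 else wavelengths[0]
    right = crossing(i1, i1 + 1) if i1 < len(wavelengths) - 1 else wavelengths[-1]
    return right - left

# Synthetic Gaussian band near the 546 nm Hg line, constructed with a
# 2 nm full width at half height; the estimate should return ~2.0.
wl = np.linspace(540.0, 552.0, 1201)
sigma = 2.0 / (2.0 * np.sqrt(2.0 * np.log(2.0)))  # convert width to Gaussian sigma
band = np.exp(-0.5 * ((wl - 546.0) / sigma) ** 2)
print(f"Estimated width: {full_width_at_half_height(wl, band):.2f} nm")
```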
Each sample will have a natural or spectral bandwidth, and thus the ability of an instrument to accurately quantify components in a mixture will depend upon several factors, including the natural bandwidths of the components and the instrument bandwidth. A widely accepted practice is to use a spectrophotometer with an instrument bandwidth of one tenth of the natural bandwidth of the analyte to be measured, which is 43 nm in the case of DNA. Thus, following this accepted practice, a spectrophotometer with an instrument bandwidth of ≤ 4.3 nm should be used to measure such samples.
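Expressed as a calculation (a minimal sketch; the function name is ours):

```python
def max_instrument_bandwidth(natural_bandwidth_nm: float) -> float:
    """One-tenth rule: the instrument bandwidth should not exceed
    one tenth of the analyte's natural bandwidth."""
    return natural_bandwidth_nm / 10.0

# For DNA, with a natural bandwidth of roughly 43 nm (as stated above):
limit = max_instrument_bandwidth(43.0)
print(f"Instrument bandwidth should be <= {limit:.1f} nm")  # 4.3 nm
# An instrument with a 5 nm bandwidth exceeds this limit.
```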
In the past few years there has been an increasing trend within life sciences toward the use of spectrophotometers with instrument bandwidths of 5 nm or greater for quantification of DNA and for estimation of sample purity.