One of the most commonly used practices to quantitate DNA or RNA is spectrophotometric analysis using a spectrophotometer. A spectrophotometer can determine the average concentration of the nucleic acids DNA or RNA present in a mixture, as well as their purity. Spectrophotometric analysis is based on the principle that nucleic acids
absorb ultraviolet light in a specific pattern. In the case of DNA and RNA, a sample is exposed to ultraviolet light at a
wavelength of 260
nanometres (nm) and a photodetector measures the light that passes through the sample. Some of the ultraviolet light will pass through and some will be absorbed by the DNA/RNA. The more light absorbed by the sample, the higher the nucleic acid concentration in the sample. The resulting effect is that less light will strike the photodetector, which produces a higher optical density (OD). Using the
Beer–Lambert law, it is possible to relate the amount of light absorbed to the concentration of the absorbing molecule. At a wavelength of 260 nm, the average
extinction coefficient for double-stranded DNA (dsDNA) is 0.020 (μg/mL)−1 cm−1, for single-stranded DNA (ssDNA) it is 0.027 (μg/mL)−1 cm−1, for single-stranded RNA (ssRNA) it is 0.025 (μg/mL)−1 cm−1 and for short single-stranded oligonucleotides it is dependent on the length and base composition. Thus, an
absorbance (A) of 1 corresponds to a concentration of 50 μg/mL for double-stranded DNA. This method of calculation is valid for an A of up to at least 2. A more accurate extinction coefficient may be needed for oligonucleotides; these can be predicted using the
nearest-neighbor model. The optical density is generated from the equation:
:Optical density = log(intensity of incident light / intensity of transmitted light)
In practical terms, a sample that contains no DNA or RNA should not absorb any of the ultraviolet light and therefore produces an OD of 0:
:Optical density = log(100/100) = 0
When using spectrophotometric analysis to determine the concentration of DNA or RNA, the
Beer–Lambert law is used to determine unknown concentrations without the need for standard curves. In essence, the Beer–Lambert law makes it possible to relate the amount of light absorbed to the concentration of the absorbing molecule. The following
absorbance units to nucleic acid concentration conversion factors are used to convert OD to concentration of unknown nucleic acid samples:
:A260 dsDNA = 50 μg/mL
:A260 ssDNA = 33 μg/mL
:A260 ssRNA = 40 μg/mL
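The OD definition and the A260 conversion factors above can be sketched in a short Python example (illustrative only; the function names are hypothetical, not from any instrument library):

```python
import math

# Standard A260 conversion factors (μg/mL per absorbance unit, 10 mm path)
CONVERSION_FACTORS = {"dsDNA": 50.0, "ssDNA": 33.0, "ssRNA": 40.0}

def optical_density(incident: float, transmitted: float) -> float:
    """OD = log10(intensity of incident light / intensity of transmitted light)."""
    return math.log10(incident / transmitted)

def concentration_ug_per_ml(a260: float, nucleic_acid: str) -> float:
    """Convert an A260 reading (10 mm path) to concentration in μg/mL."""
    return a260 * CONVERSION_FACTORS[nucleic_acid]

# A blank with no nucleic acid transmits all light: OD = log10(100/100) = 0
print(optical_density(100, 100))              # 0.0
# An A260 of 1.0 for dsDNA corresponds to 50 μg/mL
print(concentration_ug_per_ml(1.0, "dsDNA"))  # 50.0
```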
== Conversion factors ==
When using a 10 mm
path length, simply multiply the OD by the
conversion factor to determine the concentration. For example, a dsDNA sample with an OD of 2.0 corresponds to a concentration of 100 μg/mL. When using a path length shorter than 10 mm, the resultant OD will be reduced by a factor of 10/path length. Using the example above with a 3 mm path length, the OD for the 100 μg/mL sample would be reduced to 0.6. To normalize the concentration to a 10 mm equivalent, the following is done:
:0.6 OD × (10/3) × 50 μg/mL = 100 μg/mL
Most spectrophotometers allow selection of the nucleic acid type and path length such that the resultant concentration is normalized to the 10 mm path length, based on the principles of
Beer's law.
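The path-length normalization described above can be expressed as a brief Python sketch (the function name is an assumption for illustration):

```python
def concentration_from_od(od: float, path_length_mm: float,
                          conversion_factor: float = 50.0) -> float:
    """Normalize an OD reading to its 10 mm equivalent, then convert to μg/mL.

    conversion_factor defaults to 50 μg/mL per A260 unit (dsDNA).
    """
    od_10mm = od * (10.0 / path_length_mm)  # normalize to a 10 mm path
    return od_10mm * conversion_factor

# Worked example from the text: a 100 μg/mL dsDNA sample read through
# a 3 mm path gives an OD of 0.6.
print(round(concentration_from_od(0.6, 3), 6))  # 100.0
```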
== A260 as quantity measurement ==
The "A260 unit" is used as a quantity measure for nucleic acids. One A260 unit is the amount of nucleic acid contained in 1 mL that produces an OD of 1. The same conversion factors apply, and therefore, in such contexts:
:1 A260 unit dsDNA = 50 μg
:1 A260 unit ssDNA = 33 μg
:1 A260 unit ssRNA = 40 μg
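Converting A260 units to mass follows directly from these factors; a minimal sketch (hypothetical function name):

```python
# Mass in μg represented by one A260 unit (1 mL sample, 10 mm path)
A260_UNIT_MASS_UG = {"dsDNA": 50.0, "ssDNA": 33.0, "ssRNA": 40.0}

def mass_from_a260_units(units: float, nucleic_acid: str) -> float:
    """Return the mass in μg corresponding to a number of A260 units."""
    return units * A260_UNIT_MASS_UG[nucleic_acid]

print(mass_from_a260_units(2.0, "ssRNA"))  # 80.0
```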
== Sample purity (260:280 / 260:230 ratios) ==
It is common for nucleic acid samples to be contaminated with other molecules (e.g. proteins or organic compounds). A secondary benefit of using spectrophotometric analysis for nucleic acid quantitation is the ability to determine sample purity from the 260 nm:280 nm ratio. The ratio of the absorbance at 260 and 280 nm (A260/280) is used to assess the purity of nucleic acids. For pure DNA, A260/280 is widely considered to be ~1.8, although it has been argued that, due to numeric errors in the original Warburg paper, this value actually corresponds to a mix of 60% protein and 40% DNA. The ratio for pure RNA (A260/280) is ~2.0. These ratios are commonly used to assess the amount of protein contamination left over from the nucleic acid isolation process, since proteins absorb at 280 nm. The ratio of
absorbance at 260 nm vs 280 nm is commonly used to assess DNA contamination of
protein solutions, since proteins (in particular, the aromatic amino acids) absorb light at 280 nm. The reverse, however, is not true: it takes a relatively large amount of protein contamination to significantly affect the 260:280 ratio in a nucleic acid solution.
* Absorption at 330 nm and higher indicates particulates contaminating the solution, which scatter light in the visible range. The value in a pure nucleic acid sample should be zero.
* Negative values could result if an incorrect solution was used as the blank. Alternatively, these values could arise from fluorescence of a dye in the solution.

== Analysis with fluorescent dye tagging ==