9 Validation
Pages 135-151



From page 135...
... First, technology platforms must be shown to provide consistent, reliable results, which includes assessment of device stability and determination of analytical sensitivity and assay limits of detection, interference, and precision (reproducibility and repeatability).
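These performance characteristics are commonly summarized with simple statistics. As a minimal sketch, one widely used rule of thumb defines the limit of detection as the mean blank signal plus three standard deviations, and summarizes precision as a coefficient of variation; the data values and function names below are hypothetical, not from the report.

```python
import statistics

def limit_of_detection(blank_signals, k=3.0):
    """Rule-of-thumb LOD: mean(blank) + k * SD(blank), with k = 3 by convention."""
    return statistics.mean(blank_signals) + k * statistics.stdev(blank_signals)

def cv_percent(values):
    """Coefficient of variation (%), a standard summary of assay precision."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical blank (no-analyte) readings and replicate sample readings.
blanks = [0.10, 0.12, 0.09, 0.11, 0.10]
replicates = [5.1, 4.9, 5.3, 5.0, 5.2]

lod = limit_of_detection(blanks)      # signal level distinguishable from blank
precision = cv_percent(replicates)    # replicate scatter as a percentage
```

Signals below `lod` would not be distinguishable from the blank; a small `precision` value indicates tight replicate agreement.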
From page 136...
... For transcriptomic profiles with microarrays, for which we have the most data, there have been many successful applications, often with high rates of validation using an alternative technology such as Northern analysis or quantitative reverse transcriptase polymerase chain reaction (qRT-PCR); however, it should be noted that each of these techniques has its own experimental biases.
From page 137...
... Repeated measurements with the same method but conducted on different days, with different batches of reagents, or with different operators provide a measure of reproducibility. Because the latter scenario best describes the routine application of toxicogenomic technology platforms, reproducibility is the most relevant measure of system performance.
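The distinction above can be made concrete with nested replicate data: repeatability reflects scatter under identical conditions, while reproducibility reflects scatter when days, operators, or reagent batches change. The following sketch, using invented measurements of a single sample grouped by day, summarizes the two as coefficients of variation; the grouping scheme and numbers are illustrative assumptions.

```python
import statistics

# Hypothetical replicate measurements of one sample, grouped by run day.
runs_by_day = {
    "day1": [10.1, 10.3, 9.9],
    "day2": [10.8, 11.0, 10.6],
    "day3": [9.6, 9.8, 9.7],
}

def cv(values):
    """Coefficient of variation: SD relative to the mean."""
    return statistics.stdev(values) / statistics.mean(values)

# Repeatability: variation among replicates run under identical conditions,
# summarized here as the average within-day CV.
repeatability = statistics.mean(cv(v) for v in runs_by_day.values())

# Reproducibility: variation when conditions change (different days, operators,
# reagent batches), summarized here as the CV of the daily means.
reproducibility = cv([statistics.mean(v) for v in runs_by_day.values()])
```

Reproducibility is typically the larger of the two, since it folds in the extra day-to-day sources of variation on top of replicate scatter.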
From page 138...
... FIGURE 9-1 Human plasma proteome. The large range of protein concentrations in the human proteome represents a significant experimental challenge, as technologies must be sensitive across nearly 12 orders of magnitude (a 1 trillionfold range).
From page 139...
... This control measures system performance independent of the quality of the RNA sample being analyzed. Objective measures of RNA quality may provide an additional means of assessing the performance and establishing the credibility of a particular microarray assay, as poor quality RNA samples provide unreliable results.
From page 140...
... Adding a control RNA from another species without similarity to the genome of the species under study, and for which there are probes on the microarray, can yield data that can be used to assess the overall quality of the starting RNA, the quality of the hybridization, and the general quality of a particular microarray. Second, as most microarrays contain repeated probe sequences for particular genes, these replicates are quite useful for assessing the spatial properties and overall performance of a single microarray, as the repeated probes should give consistent measures of gene expression.
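The replicate-probe check described above can be sketched as a simple consistency screen: if spots carrying the same probe sequence disagree strongly, the array (or a region of it) is suspect. The gene names, intensities, and threshold below are hypothetical assumptions for illustration.

```python
import statistics

# Hypothetical log2 intensities for probes spotted multiple times on one array.
replicate_probes = {
    "GENE_A": [8.1, 8.0, 8.2, 8.1],
    "GENE_B": [11.4, 11.5, 11.3, 11.6],
    "GENE_C": [6.0, 6.1, 9.5, 6.0],  # one outlying spot, e.g. a spatial artifact
}

def flag_inconsistent(probes, max_sd=0.5):
    """Flag genes whose repeated spots disagree by more than max_sd log2 units,
    a sign of spatial artifacts or poor hybridization on this array."""
    return [gene for gene, vals in probes.items()
            if statistics.stdev(vals) > max_sd]

flagged = flag_inconsistent(replicate_probes)
```

A high proportion of flagged genes on a single array would argue for discarding or repeating that hybridization rather than normalizing the problem away.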
From page 141...
... Data Collection and Normalization Most genomic technology platforms do not perform absolute quantitative measurements. For microarray data collected on an Affymetrix GeneChip, the data from the multiple probe pairs for each gene are combined in various ways to estimate an expression level for each gene.
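As a minimal sketch of such probe-pair summarization, the toy function below averages perfect-match minus mismatch signals, loosely in the spirit of the early Affymetrix "average difference" statistic; production algorithms (e.g., MAS5, RMA) are considerably more robust, and the intensities here are invented.

```python
import statistics

# Hypothetical (PM, MM) probe-pair intensities for one gene on one array:
# PM = perfect-match probe signal, MM = mismatch (background) probe signal.
probe_pairs = [(220.0, 60.0), (305.0, 80.0), (180.0, 75.0), (260.0, 50.0)]

def expression_summary(pairs):
    """Toy summary: mean of PM - MM across a gene's probe pairs.
    Real pipelines use robust estimators and cross-array normalization."""
    return statistics.mean(pm - mm for pm, mm in pairs)

level = expression_summary(probe_pairs)
```

Because the result is a relative, platform-dependent quantity rather than an absolute abundance, such values are meaningful mainly after normalization across arrays.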
From page 142...
... The proper approach is to conduct full leave-k-out cross-validation (LKOCV), in which the sample data are divided into training and test sets before each round of gene selection, algorithm training, and testing. When iterated over multiple rounds, LKOCV can be used to estimate the accuracy of the classification system by averaging performance across the complete set of trained classifiers.
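The critical point above is that gene selection must happen inside each cross-validation fold, not once on the full data set; selecting genes first and cross-validating afterward leaks test information into the classifier. The sketch below illustrates the correct nesting with leave-one-out folds (k = 1), an invented six-sample data set, and a deliberately simple nearest-centroid classifier on one selected feature; all names and values are illustrative assumptions.

```python
import statistics

# Toy dataset: (feature vector, class label); features mimic gene measurements.
samples = [
    ([5.0, 1.0, 3.2], "treated"), ([5.2, 0.9, 3.0], "treated"),
    ([5.1, 1.1, 3.1], "treated"), ([2.0, 1.0, 3.1], "control"),
    ([2.2, 1.1, 3.0], "control"), ([2.1, 0.9, 3.2], "control"),
]

def select_top_feature(train):
    """Pick the feature with the largest between-class mean difference,
    using ONLY the training samples: selection stays inside the fold."""
    def mean_by(label, i):
        return statistics.mean(f[i] for f, lab in train if lab == label)
    n = len(train[0][0])
    return max(range(n),
               key=lambda i: abs(mean_by("treated", i) - mean_by("control", i)))

def classify(train, feat, x):
    """Assign the class whose centroid on the selected feature is nearest."""
    def centroid(label):
        return statistics.mean(f[feat] for f, lab in train if lab == label)
    return min(("treated", "control"), key=lambda lab: abs(x[feat] - centroid(lab)))

# Leave-one-out CV: selection and training are repeated fresh in every fold,
# and accuracy is the average over the complete set of held-out predictions.
correct = 0
for i, (x, label) in enumerate(samples):
    train = samples[:i] + samples[i + 1:]
    feat = select_top_feature(train)
    correct += classify(train, feat, x) == label
accuracy = correct / len(samples)
```

Running the same loop with `select_top_feature` hoisted outside the fold would be the flawed design the text warns against: the reported accuracy would then be optimistically biased.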
From page 143...
... In a toxicogenomic experiment, in which thousands of genes, proteins, or metabolites are examined in a single assay, biologic validation is important because there is a significant likelihood that some changes in genes, proteins, or metabolites are associated with a particular outcome by chance. For mechanistic studies, biologic validation also requires clearly demonstrating a causative role for any proposed mechanism.
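The chance associations described above are the classic multiple-testing problem: with thousands of simultaneous tests, some small p-values arise by chance alone. One standard guard, offered here as an illustrative sketch with invented p-values, is the Benjamini-Hochberg false discovery rate procedure.

```python
def benjamini_hochberg(pvalues, alpha=0.05):
    """Return indices of hypotheses rejected at FDR level alpha.

    Sort p-values ascending; find the largest rank r with
    p_(r) <= (r / m) * alpha, and reject the r smallest."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    r = 0
    for rank, i in enumerate(order, start=1):
        if pvalues[i] <= rank / m * alpha:
            r = rank
    return sorted(order[:r])

# Hypothetical per-gene p-values from a toxicogenomic comparison.
pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205, 0.212, 0.216]
rejected = benjamini_hochberg(pvals)
```

Note that several raw p-values fall below 0.05 yet survive as discoveries only if they clear the rank-adjusted threshold, which is exactly the protection against chance findings that the text calls for before biologic validation.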
From page 144...
... Toxicogenomic data present unique regulatory validation challenges both because such data have not previously been used in a regulatory setting and because of the rapid pace at which toxicogenomic technologies and data are developing. Therefore, regulatory agencies must balance the need to provide criteria
From page 145...
... and standardization for the submission of toxicogenomic data with the need to avoid prematurely "locking-in" transitory technologies that may soon be replaced with the next generation of products or methods. Regulatory agencies have been criticized for being too conservative in adopting new toxicologic
From page 146...
... EPA and FDA have adopted initial regulatory guidances that seek to encourage toxicogenomic data submissions (see Table 9-1 and Chapter 11)
From page 147...
... TABLE 9-1 Worldwide Regulatory Policies and Guidelines Related to Toxicogenomics and Pharmacogenomics

United States: Food and Drug Administration (http://www.fda.gov)
  Guidance (April 21, 2003): Multiplex Tests for Heritable DNA Markers, Mutations and Expression Patterns; Draft Guidance for Industry and FDA Reviewers
  Guidance (March 2005): Guidance for Industry: Pharmacogenomic Data Submissions
  Guidance (March 10, 2005): Guidance for Industry and FDA Staff: Class II Special Controls Guidance Document: Drug Metabolizing Enzyme Genotyping System
  Concept paper (April 2005): Drug-Diagnostic Co-Development Concept Paper (Preliminary Draft)
  Guidance (August 25, 2005): Guidance for Industry and FDA Staff: Class II Special Controls Guidance Document: RNA Preanalytical Systems

United States: Environmental Protection Agency (http://www.epa.gov)
  Guidance (June 2002): Interim Genomics Policy
  White paper (December 2004): Potential Implications of Genomics for Regulatory and Risk Assessment Applications at EPA

Europe: European Agency for the Evaluation of Medicinal Products (http://www.emea.eu.int)
  Position paper (November 21, 2002): CPMP Position Paper on Terminology in Pharmacogenetics (EMEA/CPMP/3070/01)
From page 148...
... EPA issued an Interim Policy on Genomics in 2002 to allow consideration of genomic data in regulatory decision making but stated that these data alone would be "insufficient as a basis for decisions" (EPA 2002, p.
From page 149...
... . The ICCVAM criteria are useful guides for regulatory validation of toxicogenomic data and methods, but toxicogenomic technologies will require unique and more flexible approaches to validation given their rapid pace of change and other unique characteristics (Corvi et al.
From page 150...
... Consequently, regulatory agencies such as the EPA and the FDA should move forward expeditiously in continuing to develop and expand their validation criteria to encourage submission and use of toxicogenomic data in regulatory contexts. In summary, the following are needed to move forward in validation:

• Objective standards for assessing quality and implementing quality control measures for the various toxicogenomic technologies;

• Guidelines for extending technologies from the laboratory to broader applications, including guidance for implementing related but more easily deployable technologies such as qRT-PCR and ELISAs;

• A clear and unified approach to regulatory validation of "-omic" technologies that aligns the potentially diverse standards being developed by various federal agencies, including the EPA, FDA, NIEHS, and OSHA, as well as at
From page 151...
... Whereas the use of toxicogenomic data will be facilitated by harmonization of data and method validation criteria among U.S. regulatory agencies and at the international level, harmonization should be a long-term goal and should not prevent individual agencies from developing their own validation procedures and criteria in the shorter term.

