
Currently Skimming:

3 Experimental Design and Data Analysis
Pages 45-58

The Chapter Skim interface presents what we've algorithmically identified as the most significant single chunk of text on each page of the chapter.


From page 45...
... The discussion focuses on transcriptome profiling using DNA microarrays. However, the approaches and issues discussed here apply to various toxicogenomic technologies and their applications.
From page 46...
... As the cost of using microarrays and other toxicogenomic technologies has declined, experiments have begun to include sampling protocols that provide better estimates of biologic and systematic variation within the data. Still, high costs remain an obstacle to large, population-based studies.
From page 47...
... The use of matched controls and randomization can minimize potential sources of systematic bias and improve the quality of inferences drawn from toxicogenomic datasets. A related question in designing toxicogenomic experiments is whether samples should be pooled to improve population sampling without increasing the number of assays (Dobbin and Simon 2005; Jolly et al.
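The pooling question raised above can be illustrated with a toy simulation: pooling RNA from several subjects before hybridization averages biological variation over more individuals while using fewer arrays, at the cost of losing individual-level information. The sketch below is purely illustrative; the variance components, number of subjects, and pool size are assumptions, not values from the report.

```python
# Toy simulation of the pooling trade-off (illustrative assumptions only):
# compare the precision of a group-mean expression estimate when each subject
# gets its own array versus when subjects are pooled before hybridization.
import numpy as np

rng = np.random.default_rng(0)
sigma_bio, sigma_tech = 1.0, 0.5    # assumed biological and technical SDs
n_subjects, pool_size = 12, 4       # 12 subjects; pools of 4 -> 3 arrays
n_sim = 20000

est_individual = np.empty(n_sim)
est_pooled = np.empty(n_sim)
for i in range(n_sim):
    bio = rng.normal(0.0, sigma_bio, n_subjects)             # true per-subject levels
    # One array per subject: each measurement adds its own technical error.
    individual = bio + rng.normal(0.0, sigma_tech, n_subjects)
    est_individual[i] = individual.mean()
    # Pool RNA before hybridization: biology is averaged within each pool,
    # then a single technical error is added per array.
    pools = bio.reshape(-1, pool_size).mean(axis=1)
    pooled = pools + rng.normal(0.0, sigma_tech, pools.size)
    est_pooled[i] = pooled.mean()

print("SE of group mean, one array per subject:", round(est_individual.std(), 3))
print("SE of group mean, pooled (fewer arrays):", round(est_pooled.std(), 3))
```

Under these assumptions the pooled design uses a quarter of the arrays at the cost of a somewhat larger standard error; how that balance works out in practice depends on the relative sizes of biological and technical variation.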
From page 48...
... 2001; analysis of variance, Long et al. 2001. Class prediction: k-nearest neighbors, Theilhaber et al. ...
From page 49...
... Evaluation of the gene expression data may indicate that the nephrotoxic compounds can be grouped based on the cell type affected, the mechanism responsible for renal failure, or other common factors.
From page 51...
... modified to correct for overestimates arising from small values in the denominator, along with permutation testing to estimate the false discovery rate in any selected set of significant genes. Other methods, such as the well-known Bonferroni correction, attempt to correct for multiple testing, but they assume independence between the measurements, a constraint that is violated in gene expression analysis because many genes and gene products operate together in pathways and networks and so are co-regulated.
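As a rough sketch of the kind of procedure described above (not the specific method cited in the report), the example below computes a t-like statistic with a small constant s0 added to the denominator so that genes with tiny variances do not produce spuriously large scores, and then permutes the sample labels to estimate the false discovery rate for a chosen cutoff. The simulated data, s0, cutoff, and permutation count are all arbitrary choices for illustration.

```python
# Sketch: t-like statistic with a denominator offset s0, plus label
# permutations to estimate the false discovery rate (FDR) at a cutoff.
import numpy as np

rng = np.random.default_rng(1)
n_genes, n_per_group = 2000, 6
expr = rng.normal(0.0, 1.0, (n_genes, 2 * n_per_group))
expr[:50, n_per_group:] += 1.5                    # 50 genes truly differ
labels = np.array([0] * n_per_group + [1] * n_per_group)

def t_stat_s0(data, labels, s0=0.1):
    a, b = data[:, labels == 0], data[:, labels == 1]
    diff = b.mean(axis=1) - a.mean(axis=1)
    se = np.sqrt(a.var(axis=1, ddof=1) / a.shape[1] +
                 b.var(axis=1, ddof=1) / b.shape[1])
    return diff / (se + s0)                       # s0 damps tiny denominators

observed = t_stat_s0(expr, labels)
cutoff = 3.0
n_called = np.sum(np.abs(observed) > cutoff)

# How many genes exceed the cutoff when the group labels are random?
n_perm = 200
false_calls = [np.sum(np.abs(t_stat_s0(expr, rng.permutation(labels))) > cutoff)
               for _ in range(n_perm)]
fdr_estimate = np.mean(false_calls) / max(n_called, 1)
print(f"genes called: {n_called}, estimated FDR: {fdr_estimate:.2f}")
```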
From page 52...
... and through a careful comparison of the expression profiles finds genes whose patterns of expression can be used to distinguish the various phenotypic groups under analysis. Class prediction approaches then attempt to use sets of informative genes (generally selected using statistical approaches in class comparison)
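The following sketch walks through that generic two-step workflow on simulated data; the gene counts, sample sizes, and choice of k are assumptions made purely for illustration. Informative genes are selected on the training samples with a simple t-statistic (class comparison), and a k-nearest-neighbors rule restricted to those genes then predicts the class of held-out samples (class prediction).

```python
# Sketch: select informative genes on training data, then classify test
# samples with k-nearest neighbors using only those genes.
import numpy as np

rng = np.random.default_rng(2)
n_genes, n_train, n_test = 500, 20, 10
y_train = np.array([0, 1] * (n_train // 2))
y_test = np.array([0, 1] * (n_test // 2))
X_train = rng.normal(0.0, 1.0, (n_train, n_genes))
X_test = rng.normal(0.0, 1.0, (n_test, n_genes))
X_train[y_train == 1, :20] += 2.0          # first 20 genes carry the class signal
X_test[y_test == 1, :20] += 2.0

# Class comparison on training data only: rank genes by absolute t-statistic.
a, b = X_train[y_train == 0], X_train[y_train == 1]
t = (b.mean(0) - a.mean(0)) / np.sqrt(a.var(0, ddof=1) / len(a) +
                                      b.var(0, ddof=1) / len(b))
informative = np.argsort(-np.abs(t))[:20]  # keep the 20 most informative genes

# Class prediction: k-nearest neighbors in the reduced gene space.
def knn_predict(train, labels, sample, k=3):
    dist = np.linalg.norm(train - sample, axis=1)
    votes = labels[np.argsort(dist)[:k]]
    return np.bincount(votes).argmax()

preds = np.array([knn_predict(X_train[:, informative], y_train, x)
                  for x in X_test[:, informative]])
print("held-out accuracy:", np.mean(preds == y_test))
```

Keeping the gene selection inside the training data is the detail that matters most here; selecting informative genes on the full dataset before splitting would make the accuracy estimate optimistically biased.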
From page 53...
... There is no universally accepted way to connect the expression of genes, proteins, or metabolites to functionally relevant pathways leading to particular phenotypic end points, so a good deal of user interaction and creativity is currently required. New approaches to predict networks of interacting genes based on gene expression profiles use several modeling techniques, including Boolean networks (Akutsu et al.
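As a toy illustration of the Boolean network idea, the sketch below wires up three hypothetical genes with hand-written logical update rules and iterates the network synchronously until a state repeats, which identifies an attractor. The genes and rules are invented; in practice such rules would be inferred from expression profiles rather than written by hand.

```python
# Toy synchronous Boolean network with three hypothetical genes (A, B, C).
def step(state):
    a, b, c = state
    return (
        not c,          # A is repressed by C
        a,              # B is activated by A
        a and b,        # C requires both A and B
    )

state = (True, False, False)       # arbitrary initial on/off pattern
trajectory = []
while state not in trajectory:     # stop once a state repeats (attractor reached)
    trajectory.append(state)
    state = step(state)

cycle_start = trajectory.index(state)
print("trajectory:", trajectory)
print("attractor :", trajectory[cycle_start:])
```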
From page 54...
... The utility of gene expression-based biomarkers was clearly illustrated by van Leeuwen and colleagues' identification of putative transcriptional biomarkers for early effects of smoking using peripheral blood cell profiling (van Leeuwen et al.
From page 55...
... Many of the most promising applications involve using gene, protein, or metabolic expression profiles as diagnostic or prognostic indicators and refer to them as biomarkers. However, use of this term has been imprecise, in part because it has acquired a broad range of interpretations and has become associated with the detection of many different measurable end points.
From page 56...
... would allow many of the unanswered questions about the applicability of genomic technologies to toxicology to be addressed. In fact, a more extensive analysis would allow scientists to more fully address questions about reproducibility, reliability, generalizability, population effects, and potential experimental biases, and it would drive the development of standards and new analytical methods.
From page 57...
... (2001)
Phase 1, Preclinical exploratory: Promising directions identified
Phase 2, Clinical assay and validation: Clinical assay detects established disease
Phase 3, Retrospective longitudinal: Biomarker detects disease before it becomes clinical and a "screen-positive" rule is defined
Phase 4, Prospective screening: Extent and characteristics of disease detected by the test and the false referral rate are identified
Phase 5, Cancer control: Impact of screening on reducing the burden of disease on the population is quantified
... investigate associations among various elements.
From page 58...
... open to the research community. Specific tools that are needed include the following: a.

