From page 27... ...
The mathematical sciences have consistently been making major advances in both fundamental theory and high-impact applications, and recent decades have seen tremendous innovation and productivity. The discipline is displaying great unity and coherence as more and more bridges are built between subfields of research.

From page 28... ...
The topics are, in order:
• The Topology of Three-Dimensional Spaces
• Uncertainty Quantification
• The Mathematical Sciences and Social Networks
• The Protein-Folding Problem and Computational Biology
• The Fundamental Lemma
• Primes in Arithmetic Progression
• Hierarchical Modeling
• Algorithms and Complexity
• Inverse Problems: Visibility and Invisibility
• The Interplay of Geometry and Theoretical Physics
• New Frontiers in Statistical Inference
• Economics and Business: Mechanism Design
• Mathematical Sciences and Medicine
• Compressed Sensing

THE TOPOLOGY OF THREE-DIMENSIONAL SPACES

The modest title of this section hides a tremendous accomplishment. The notion of space is central to the mathematical sciences, to the physical sciences, and to engineering.

From page 29... ...
consisting of "nodes" joined by "edges," turn out to be a fundamental tool used by social scientists to understand social networks. A striking feature of mathematical structures is their hierarchical nature: it is possible to use existing mathematical structures as a foundation on which to build new mathematical structures.

From page 30... ...
These different classes of random variables can be put together into structures known as probabilistic models, an incredibly flexible class of mathematical structures used to understand phenomena as diverse as the inner workings of a cell, the behavior of financial markets, and the physics of superconductors. Mathematical structures provide a unifying thread weaving through and uniting the mathematical sciences.

From page 31... ...
For the next 100 years this problem and its generalizations led to enormous theoretical advances in the understanding of three-dimensional spaces and of higher-dimensional spaces, but Poincaré's original problem remained unsolved. The problem proved so difficult, and the work it stimulated so important, that in 2000 the Clay Mathematics Institute listed it as one of its seven Millennium Prize problems in mathematics, problems felt to be among the hardest and most important in theoretical mathematics.

From page 32... ...
This mathematical breakthrough is fairly recent, and it is too early to assess its full impact accurately, both within mathematics and in the physical sciences and engineering. Nevertheless, we can make some educated guesses.

From page 33... ...
The two societies have also started a joint journal, Uncertainty Quantification. THE MATHEMATICAL SCIENCES AND SOCIAL NETWORKS The emergence of online social networks is changing behavior in many contexts, allowing decentralized interaction among larger groups and fewer geographic constraints.

From page 34... ...
However, since the rise of the Internet and social networks, the underlying assumptions in the analysis of networks have changed dramatically. The abundance of social network data, and the increasing complexity of the networks themselves, is changing the face of research in this area.

From page 35... ...
Some subproblems include how a native structure results from the interatomic forces of the sequence of amino acids and how a protein can fold so fast.4 Although the protein-folding conjecture has been shown to be incorrect for a certain class of proteins (for example, enzymes called "chaperones" are sometimes needed to assist in the folding of a protein), scientists have observed that more than 70 percent of the proteins in nature still fold spontaneously, each into its unique three-dimensional shape. In 2005, the protein-folding problem was listed by Science magazine as one of the 125 grand unsolved scientific challenges.

From page 36... ...
The overall approach is still not as precise as would be desired. Given the availability of a large number of solved protein structures, there is room to apply statistical learning strategies that combine information from empirical data with physical principles to further improve the energy function.
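The kind of statistical learning step described above can be sketched in miniature: fit the weights of a two-term energy function so that it best matches empirical energies by least squares. The energy terms, weights, and "observed" energies below are all invented for illustration; real energy functions have many more terms and noisy data.

```python
# Hypothetical sketch: calibrate E = w1 * E_elec + w2 * E_vdw against
# "empirical" energies from solved structures via 2x2 least squares.
data = [
    # (E_elec, E_vdw, observed energy) -- invented numbers
    (1.0, 2.0, 8.0),
    (2.0, 1.0, 7.0),
    (3.0, 3.0, 15.0),
    (1.0, 1.0, 5.0),
]

# Normal equations for least squares with two features.
s11 = sum(e * e for e, v, y in data)
s12 = sum(e * v for e, v, y in data)
s22 = sum(v * v for e, v, y in data)
b1 = sum(e * y for e, v, y in data)
b2 = sum(v * y for e, v, y in data)
det = s11 * s22 - s12 * s12
w1 = (s22 * b1 - s12 * b2) / det
w2 = (s11 * b2 - s12 * b1) / det

print(w1, w2)  # the data above are exactly explained by w1 = 2, w2 = 3
```

With real data the fit would not be exact, and one would add regularization and validation, but the calibration idea is the same.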

From page 37... ...
In a similar way, statistical approaches, applied in an evolutionary context, may augment our current arsenal of experimental and theoretical methods for understanding protein folding and predicting protein structure, function, and mechanisms. THE FUNDAMENTAL LEMMA The fundamental lemma is a seemingly obscure combinatorial identity introduced by Robert Langlands in 1979, as a component of what is now

From page 38... ...
6. David Nadler, 2012, "The geometric nature of the fundamental lemma," Bulletin of the American Mathematical Society 49(1).

From page 39... ...
Why have we included this example to illustrate the vitality of the mathematical sciences? Because a key feature of the advance is the forging of a surprising link between prime numbers and two apparently unrelated fields of mathematics, harmonic analysis and ergodic theory.
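To make the object in question concrete: an arithmetic progression of primes is a run of primes with a common difference, and the advance discussed here concerns the existence of arbitrarily long such runs. A brute-force search (search limits chosen arbitrarily) finds the smallest length-5 example:

```python
def is_prime(n):
    """Trial-division primality test, adequate for small n."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def primes_in_ap(length, start_limit=100, diff_limit=100):
    """First arithmetic progression of `length` primes, by smallest start."""
    for start in range(2, start_limit):
        if not is_prime(start):
            continue
        for diff in range(1, diff_limit):
            terms = [start + k * diff for k in range(length)]
            if all(is_prime(t) for t in terms):
                return terms
    return None

print(primes_in_ap(5))  # [5, 11, 17, 23, 29]
```

Exhibiting any one progression is easy; proving that progressions of every length exist among the primes is the deep theorem that required harmonic analysis and ergodic theory.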

From page 40... ...
, and the true batting averages are also modeled as arising from a "population distribution" (the unknown distribution of true batting averages); it is this second-level modeling that leads to the name "hierarchical modeling." There are various possible ways to analyze the resulting model, but all can lead to interesting and surprising conclusions, such as a possible "crossover effect," wherein a batter with a higher current average but fewer games played can be predicted to have less skill than a batter with a lower current average but more games played (because the random component of this latter batter's current average would be smaller).
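The crossover effect can be seen in a minimal empirical-Bayes sketch. The Beta(80, 220) population distribution and both batting records below are invented for illustration; a real hierarchical analysis would estimate the population distribution from the data rather than fix it.

```python
# Toy illustration of the crossover effect via shrinkage toward a
# population distribution. All numbers are invented.

def shrunk_average(hits, at_bats, a=80.0, b=220.0):
    """Posterior mean of a true batting average under a Beta(a, b) prior."""
    return (hits + a) / (at_bats + a + b)

# Batter A: higher current average (.400) over only 50 at-bats.
# Batter B: lower current average (.360) over 500 at-bats.
raw_a, raw_b = 20 / 50, 180 / 500

est_a = shrunk_average(20, 50)     # pulled strongly toward the prior mean
est_b = shrunk_average(180, 500)   # pulled only slightly

print(raw_a > raw_b)   # True: A leads on raw averages
print(est_a < est_b)   # True: the crossover effect
```

Because batter A's record is short, the model attributes more of A's high average to luck and shrinks the estimate harder, so the ranking of the two batters flips.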

From page 41... ...
For example, one can assume a latent hidden Markov structure for a protein sequence and use observations from multiple species or multiple analogous copies from one species to find common conserved parts of the protein, which often correspond to functionally critical regions and can be informative for drug design. A similar structure can also be designed for control regions (called promoters)
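A minimal sketch of the latent-structure idea: a two-state hidden Markov model (positions are either "conserved" or "variable") decoded with the Viterbi algorithm from per-position match-across-species indicators. Every state, probability, and observation below is invented; real comparative models are far richer.

```python
import math

STATES = ("conserved", "variable")
START = {"conserved": 0.5, "variable": 0.5}
TRANS = {  # states tend to persist along the sequence
    "conserved": {"conserved": 0.9, "variable": 0.1},
    "variable": {"conserved": 0.1, "variable": 0.9},
}
EMIT = {  # probability of a cross-species match (1) vs. mismatch (0)
    "conserved": {1: 0.95, 0: 0.05},
    "variable": {1: 0.4, 0: 0.6},
}

def viterbi(observations):
    """Most likely state path for a 0/1 match sequence (log-space Viterbi)."""
    v = [{s: math.log(START[s]) + math.log(EMIT[s][observations[0]])
          for s in STATES}]
    back = []
    for obs_t in observations[1:]:
        col, ptr = {}, {}
        for s in STATES:
            prev, score = max(
                ((p, v[-1][p] + math.log(TRANS[p][s])) for p in STATES),
                key=lambda t: t[1],
            )
            col[s] = score + math.log(EMIT[s][obs_t])
            ptr[s] = prev
        v.append(col)
        back.append(ptr)
    # Trace back from the best final state.
    state = max(STATES, key=lambda s: v[-1][s])
    path = [state]
    for ptr in reversed(back):
        state = ptr[state]
        path.append(state)
    return list(reversed(path))

obs = [1, 1, 1, 1, 0, 0, 1, 0, 1, 1, 1, 1]
print(viterbi(obs))
```

Runs of matches are decoded as conserved stretches; in the comparative-genomics setting those stretches are the candidates for functionally critical regions.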

From page 42... ...
ALGORITHMS AND COMPLEXITY Underlying much of engineering are algorithms that solve problems, often problems with deep and interesting mathematical structure. In recent years there have been significant improvements in our ability to solve such problems efficiently and to understand the limits of what is solvable.

From page 43... ...
The algorithms used for optimization are all based on deep mathematical ideas, although their efficiency has often been established only experimentally. Recent work initiated by Spielman and Teng on smoothed analysis8 gives a new framework for understanding the efficiency of these methods, one that estimates performance probabilistically rather than focusing on the rarely seen worst-case outcomes.
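The flavor of smoothed analysis can be shown with a toy stand-in (quicksort rather than the simplex method, which is what Spielman and Teng actually analyzed): an algorithm's worst-case input can be fragile, in that a small random perturbation of it restores near-typical running time. Input size and noise level below are arbitrary choices.

```python
import random

def quicksort_comparisons(a):
    """Count comparisons made by quicksort with a first-element pivot."""
    if len(a) <= 1:
        return 0
    pivot = a[0]
    left = [x for x in a[1:] if x < pivot]
    right = [x for x in a[1:] if x >= pivot]
    return len(a) - 1 + quicksort_comparisons(left) + quicksort_comparisons(right)

random.seed(0)
n = 200
worst = list(range(n))                            # sorted input: worst case
perturbed = [x + random.gauss(0, 50) for x in worst]  # Gaussian perturbation

print(quicksort_comparisons(worst))               # 19900 = n*(n-1)//2
print(quicksort_comparisons(perturbed) < quicksort_comparisons(worst))  # True
```

Worst-case analysis reports the quadratic count; smoothed analysis asks about the expected count after perturbation, which here is far smaller, matching the empirical reputation of the algorithm.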

From page 44... ...
Coding has also had an important impact on many seemingly unrelated fields, such as understanding the limits of what is efficiently computable. Coding can be seen as the key tool in the development of a useful new type of proof system called probabilistically checkable proofs.
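As a concrete instance of "coding" in the error-correcting sense, here is the classical Hamming(7,4) code, which encodes 4 data bits into 7 bits and corrects any single flipped bit. This textbook example is of course far simpler than the codes used in probabilistically checkable proof constructions, but it shows the core idea: redundancy makes local errors detectable and repairable.

```python
def hamming_encode(d):
    """Encode 4 data bits as a 7-bit Hamming codeword (positions 1..7)."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming_decode(c):
    """Correct up to one flipped bit, then return the 4 data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]    # parity over positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]    # parity over positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]    # parity over positions 4,5,6,7
    error_pos = s1 + 2 * s2 + 4 * s3  # 0 means no single-bit error
    if error_pos:
        c[error_pos - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]
```

Every single-bit corruption of every codeword decodes back to the original data, which is exactly the locality property that proof systems exploit.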

From page 45... ...
Research in this area draws on a diverse array of mathematics, including complex analysis, differential geometry, harmonic analysis, integral geometry, numerical analysis, optimization, partial differential equations, and probability, and it builds strong linkages between applications and deep areas of mathematics. An archetypal example of an inverse boundary problem for an elliptic equation is the by-now-classical Calderón problem, also called electrical impedance tomography (EIT).
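For reference, the Calderón problem can be stated as follows: in a body Ω with unknown positive conductivity γ(x), an applied boundary voltage f produces a potential u, and boundary current measurements are encoded by the Dirichlet-to-Neumann map.

```latex
% Conductivity equation with prescribed boundary voltage f:
\nabla \cdot \bigl( \gamma \, \nabla u \bigr) = 0 \ \text{ in } \Omega,
\qquad u\big|_{\partial\Omega} = f,
% Boundary current measurements: the Dirichlet-to-Neumann map
\qquad \Lambda_\gamma f \;=\; \gamma \,
    \frac{\partial u}{\partial \nu}\Big|_{\partial\Omega}.
```

The inverse problem asks whether, and how stably, the conductivity γ can be recovered from knowledge of the map Λ_γ alone.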

From page 46... ...
These mathematical objects had been introduced by Hilbert in the early 1900s for purely mathematical reasons having nothing to do with quantum mechanics, which had not even been conceived of then. It is interesting to note, though, that Hilbert called the decomposition of his operators the "spectral decomposition" because it reminded him of the spectra of various atoms, something that was mysterious at the time and finally explained by quantum mechanics.

From page 47... ...
Here again the physicists were struggling to develop a mathematical framework to handle the physical concepts they were developing, when in fact the mathematical framework, which in mathematics is known as connections on principal bundles and curvature, had already been introduced for mathematical reasons. Much of the recent history of quantum field theory has turned this model of interaction on its head.

From page 48... ...
One recent development, the gauge/gravity duality, or AdS/CFT, has connected general relativity with quantum field theories, the theories we use for particle physics.14 The gravity theory lives in hyperbolic space. Thus, many developments in hyperbolic geometry and in the study of black holes could be used to describe certain strongly interacting systems of particles.

From page 49... ...
The deep underlying mathematical structures are only starting to be understood. Integrability in so-called (1 + 1)

From page 50... ...
This era of scientific mass production calls for novel developments in statistical inference, and it has inspired a tremendous burst of new statistical methodology. More important, the data flood completely transforms the set of questions that need to be answered, and the field of statistics has, accordingly, changed profoundly in the last 15 years.

From page 51... ...
In the example above, it is known that only a small number of genes can potentially be associated with a trait. In medical imaging, the image we wish to form typically has a concise description in a carefully chosen representation.

From page 52... ...
One topic likely to preoccupy statisticians in the next decade is the problem of providing correct inference after selection. Conventional statistical inference requires that a

From page 53... ...
Standard statistical tests and confidence intervals applied to the selected parameters are in general completely erroneous. Statistical methodology providing correct inference after massive data snooping is urgently needed.
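The failure mode is easy to simulate: under a global null, with no real effects anywhere, selecting the most extreme of many estimates and then applying a standard 5 percent test virtually guarantees a false discovery. The simulation below is a toy illustration with invented numbers.

```python
import random

# Under the global null, every "effect estimate" is pure noise.
random.seed(1)
trials = 1000
z = [random.gauss(0, 1) for _ in range(trials)]   # null z-scores

# Data snooping: select the most extreme z-score, then test it as if it
# were the only hypothesis ever examined.
best = max(z, key=abs)
naive_significant = abs(best) > 1.96   # naive two-sided 5 percent test

print(naive_significant)  # True: selection turns noise into "signal"
```

A valid post-selection procedure must account for the fact that 1000 hypotheses were screened, for instance by adjusting the threshold to the distribution of the maximum rather than of a single z-score.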

From page 54... ...
patient images. This is achieved by solving the so-called deformable image registration problem, a problem that comes up over and over in medical imaging.
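In miniature, registration means finding the transformation that best aligns two images. The toy below aligns two invented 1-D signals over integer circular shifts by minimizing the sum of squared differences; deformable registration replaces the single shift with a smooth, spatially varying deformation and a far richer optimization.

```python
# Toy 1-D "registration": recover the shift relating two signals.
ref = [0, 0, 1, 3, 7, 3, 1, 0, 0, 0]
moving = [0, 0, 0, 0, 1, 3, 7, 3, 1, 0]   # ref shifted right by 2

def ssd(shift):
    """Sum of squared differences after circularly shifting `moving`."""
    n = len(ref)
    return sum((moving[(i + shift) % n] - ref[i]) ** 2 for i in range(n))

best_shift = min(range(len(ref)), key=ssd)
print(best_shift)  # 2: the shift that aligns the two signals exactly
```

Real deformable registration solves the same matching problem with millions of unknowns and smoothness constraints on the deformation field.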

From page 55... ...
Compressed sensing was motivated by a great question in MRI, a medical imaging technique used in radiology to visualize detailed internal structures. MRI is a wonderful tool with several advantages over other medical imaging techniques such as CT or X-rays.

From page 56... ...
This new method produces sharp images from brief scans. The potential for this method is such that both General Electric and Philips have medical imaging products in the pipeline that will incorporate compressed sensing.

From page 57... ...
Not only is compressed sensing one of the most applicable theories coming out of the mathematical sciences in the last decade, but it is also very sophisticated mathematically. Compressed sensing uses techniques of probability theory, combinatorics, geometry, harmonic analysis, and optimization to shed new light on fundamental questions in approximation theory: How many measurements are needed to recover an object of interest?
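A minimal sketch of the measurement question: a signal of length 50 with a single nonzero entry can be located from only 12 random linear measurements by picking the most correlated column, the 1-sparse special case of matching pursuit. All dimensions and values below are invented for illustration; general sparse recovery uses more measurements and genuine optimization.

```python
import random

# Toy compressed sensing: locate a 1-sparse signal from m << n measurements.
random.seed(0)
n, m = 50, 12
k = 17                      # unknown support (we pretend not to know it)
x = [0.0] * n
x[k] = 3.0

# Random Gaussian measurement matrix A (m x n) and measurements y = A x.
A = [[random.gauss(0, 1) for _ in range(n)] for _ in range(m)]
y = [sum(A[i][j] * x[j] for j in range(n)) for i in range(m)]

def correlation(j):
    """Normalized correlation of column j of A with the measurements."""
    num = sum(A[i][j] * y[i] for i in range(m))
    norm = sum(A[i][j] ** 2 for i in range(m)) ** 0.5
    return abs(num) / norm

# For a noiseless 1-sparse signal, the most correlated column is the support.
recovered = max(range(n), key=correlation)
print(recovered)  # 17
```

Twelve generic measurements distinguish all 50 candidate locations here; quantifying how many measurements suffice for k-sparse signals is exactly the approximation-theoretic question raised above.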
