Frontiers of Engineering: Reports on Leading-Edge Engineering from the 2005 Symposium (2006)

Suggested Citation: "Population Dynamics of Human Language: A Complex System." National Academy of Engineering. 2006. Frontiers of Engineering: Reports on Leading-Edge Engineering from the 2005 Symposium. Washington, DC: The National Academies Press. doi: 10.17226/11577.

Population Dynamics of Human Language: A Complex System

NATALIA L. KOMAROVA

University of California, Irvine


In the course of natural history, evolution has produced several great innovative "inventions," such as nucleic acids, proteins, cells, chromosomes, multicellular organisms, and the nervous system. The latest "invention," and the one that truly revolutionized the very rules of evolution, is language. Language gives humans an unprecedented ability to transmit information from generation to generation, not by the "traditional" means of a genetic code, but by talking. This new mode of cross-generational information transfer has given rise to so-called "cultural evolution." Through cultural evolution, language is responsible for a large part of what makes us "human." Language and cultural evolution are also shaping history and changing the rules of biology. Without exaggeration, language is one of the most fascinating traits of Homo sapiens.

The study of language and grammar dates back to classical India and Greece. In the eighteenth century, the discovery of the Indo-European language family led to the surprising realization that very different languages may be related to each other; this was the beginning of historical linguistics. Formal language theory, which emerged only in the twentieth century (Chomsky, 1956, 1957; Harrison, 1978), is an attempt to describe the rules a speaker uses to generate linguistic forms (descriptive adequacy) and to explain how language competence emerges in the human brain (explanatory adequacy). Language theory has been supported by advances in the mathematical and computational analysis of language acquisition, a field that became known as learning theory. Currently, efforts are being focused on bringing linguistic inquiry into contact with various disciplines of biology, including neurobiology (Deacon, 1997; Vargha-Khadem et al., 1998), animal behavior (Dunbar, 1996; Fitch, 2000; Hauser, 1996), evolution (Aitchison, 1996; Batali, 1994; Bickerton, 1990; Hawkins and Gell-Mann, 1992; Hurford et al., 1998; Jackendoff, 1999; Knight et al., 2000; Lieberman, 1984, 1991; Maynard Smith and Szathmary, 1995; Pinker and Bloom, 1990), and genetics (Gopnik and Crago, 1991; Lai et al., 2001). The goal of these interdisciplinary studies is to approach language as a product of evolution and as the extended phenotype of a species of primates.

In the past decade there has been an explosion of interest in computational aspects of the evolution of language (Cangelosi and Parisi, 2001; Christiansen and Kirby, 2003). A great many efforts, across a wide range of disciplines, are now focused on answering such questions as why language is the way it is and how it got that way. Various approaches to these questions have been suggested, including viewing language as a complex adaptive system (Steels, 2000). Levin (2002) identified the following defining properties of a general complex adaptive system:

  1. It consists of a number of different components.

  2. The components interact with each other with some degree of localization.

  3. An autonomous process uses the outcomes of these interactions to select a subset of components for replication and/or enhancement.

Property 3, a signature of “biology” in a complex adaptive system, includes replication (which implies a degree of variability) and Darwinian selection. A mathematical problem posed by such systems is to find the outcome (or, more generally, describe the dynamics) of a competition in which the set of players changes, depending on the current state of affairs. New players come in, and their “strategies” (or properties) are drawn from a huge set of possibilities.

The main idea in this approach to language evolution can be stated as follows. There is a population of individuals (neural networks [Oliphant, 1999; Smith, 2002], agents [Steels, 2001; Steels and Kaplan, 1998], organisms in a foraging environment [Cangelosi, 2001]) who communicate with each other. Each individual is characterized by parameters that define its phenotype. These usually include the individual’s ability to speak, vocabulary, ability to learn, and other characteristics important for communication (and sometimes other features like the life span or onset of the reproductive age [Hurford and Kirby, 1998]).

The results of communication between individuals are assessed in some way. Rounds of communication are followed by rounds of "update," which may mimic biological reproduction (an individual is replaced by its offspring) or learning (the individual's vocabulary or grammatical rules are changed/updated). Various numerical techniques, such as genetic algorithms, are used to model the dynamics of reproducing and learning individuals. The initial condition usually assumes no common communication system in the population. After a number of rounds of update/replication, the state of communication ability is evaluated again.
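The communication/update cycle just described can be sketched in a toy simulation. Everything below is an illustrative assumption, not an implementation from the literature: agents hold one of a few arbitrary signals, a matched pair earns a payoff during communication, and in each update round the worst communicator imitates the best one.

```python
import random

def simulate(n_agents=50, n_signals=5, rounds=300, seed=0):
    """Toy model: start with no common communication system, then
    alternate communication rounds with imitation-based updates."""
    rng = random.Random(seed)
    signals = [rng.randrange(n_signals) for _ in range(n_agents)]
    for _ in range(rounds):
        # Communication round: random pairs, payoff 1 per understood exchange.
        score = [0] * n_agents
        order = list(range(n_agents))
        rng.shuffle(order)
        for a, b in zip(order[::2], order[1::2]):
            if signals[a] == signals[b]:   # mutual understanding rewarded
                score[a] += 1
                score[b] += 1
        # Update round: the worst scorer adopts the best scorer's signal.
        best = max(range(n_agents), key=score.__getitem__)
        worst = min(range(n_agents), key=score.__getitem__)
        signals[worst] = signals[best]
    # Coherence: fraction of agents using the most common signal.
    majority = max(set(signals), key=signals.count)
    return signals.count(majority) / n_agents

# A common signal typically emerges from an initially incoherent population:
print(simulate())
```

The particular update rule (imitate the best) is only one of many choices; genetic-algorithm-style reproduction proportional to payoff would serve the same illustrative purpose.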

Two questions are often addressed. First, under what circumstances does a common communication system arise in a population of interacting individuals? And second, what are the conditions under which such a communication system can be maintained?

WHAT IS UNIVERSAL GRAMMAR AND WHY DO WE NEED IT?

Learning is inductive inference. The learner is presented with data and must infer the rules that generate these data. The difference between “learning” and “memorization” is the ability to generalize beyond one’s own experience to novel circumstances. In the context of language, the child learner will generalize to novel sentences never heard before. Any child can produce and understand sentences that are not part of his/her previous linguistic experience.

Children develop grammatical competence spontaneously without formal training. All they need is interaction with people and exposure to normal language use. In other words, a child hears grammatical sentences and then constructs an internal representation of the rules that generate grammatical sentences. Chomsky pointed out that the evidence available to the child does not uniquely determine the underlying grammatical rules (Chomsky, 1965, 1972), a phenomenon called the “poverty of stimulus” (Wexler and Culicover, 1980). The “paradox of language acquisition” is that children nevertheless reliably achieve correct grammatical competence (Jackendoff, 1997, 2001). How is this possible?

The proposed solution of the paradox is that children learn correct grammar by choosing from a restricted set of candidate grammars. The structure of this restricted set is “universal grammar.” Mathematical learning theory proves the “necessity” of a universal grammar. Discovering properties of universal grammar and particular human learning algorithms requires the empirical study of neurobiological and cognitive functions of the human brain involved in language acquisition. Some aspects of universal grammar, however, might be revealed by studying common features of existing human languages. This has been a major goal of linguistic research during the last several decades.

In our modeling approach, we use the concept and some properties of universal grammar to formulate the mathematical theory of language evolution. We assume that universal grammar has a rule system that generates a set (or a search space) of grammars, {G1, G2, …, Gn}. These grammars can be constructed by the language learner as potential candidates for the grammar that needs to be learned; the learner cannot end up with a grammar that is not part of this search space. In this sense, universal grammar contains the possibility of learning all human languages (and many more). Figure 1 illustrates this process of language acquisition. The learner has a mechanism to evaluate input sentences and to choose one of the candidate grammars in his search space.

FIGURE 1 Universal grammar specifies the search space of candidate grammars and the learning procedure for evaluating input sentences. The basic idea is that the child has an innate expectation of grammar (for example, a finite number of candidate grammars) and then chooses a particular candidate grammar that is compatible with the input. Source: Komarova and Nowak, 2001c. Reprinted with permission.
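The acquisition scheme of Figure 1 can be made concrete with a deliberately simplified sketch. The grammars below are modeled as plain sets of permitted sentences, which is an assumption made purely for illustration: the learner starts with the full search space supplied by universal grammar and discards every candidate that cannot generate an input sentence.

```python
# Each candidate grammar is modeled as the set of sentences it generates.
# This set-based view is a simplification used only for illustration.
candidate_grammars = {
    "G1": {"ab", "aabb", "aaabbb"},
    "G2": {"ab", "ba", "abab"},
    "G3": {"ab", "aabb", "ba"},
}

def learn(input_sentences, grammars):
    """Keep only the candidates compatible with every input sentence."""
    space = dict(grammars)
    for s in input_sentences:
        space = {name: g for name, g in space.items() if s in g}
    return sorted(space)

# Hearing "ab" alone underdetermines the grammar (poverty of stimulus):
print(learn(["ab"], candidate_grammars))           # all three survive
# More input narrows the search space to a single candidate:
print(learn(["ab", "aabb", "aaabbb"], candidate_grammars))
```

The first call illustrates the poverty of stimulus: the available evidence is compatible with several candidate grammars, so the restricted search space, not the data alone, is what makes convergence possible.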

A MATHEMATICAL FORMULATION OF LANGUAGE EVOLUTION

Our approach differs from many others in that we use mathematical, analytical tools to address questions of language origins and evolution (Komarova and Nowak, 2001a,b, 2003; Komarova et al., 2001; Nowak and Komarova, 2001; Nowak et al., 2001, 2002). We assume that each individual has universal grammar, which allows him/her to learn any language in a (finite but large) set, {G1, …, Gn}.

In classical learning theory, which usually focuses on an isolated teacher-learner pair, there is a collection of concepts (grammars), G1, …, Gn, and words (or sample sentences, for learning a grammar) that refer to these concepts, sometimes ambiguously. The teacher generates a stream of words referring to, say, concept G2. This is not known to the student, who must learn by guessing some concept, Gi, and checking it for consistency with the teacher's input. A typical question of interest is how quickly a given method converges to the truth. Stated in the terminology of learning languages, the question becomes how many samples, Nδ, a given learning algorithm typically requires to learn the correct language with probability 1 − δ. Questions of this type for specific learning mechanisms are interesting mathematical problems (e.g., the treatment of the so-called memoryless learner in Komarova and Rivin, 2003).
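The memoryless learner mentioned above can be sketched in a few lines. In this toy version (an assumption, not the formal treatment in Komarova and Rivin, 2003), a single `overlap` parameter stands in for the probability that a teacher sentence is also compatible with a wrong candidate grammar; on an incompatible sentence the learner jumps to a uniformly random grammar, forgetting which candidates it has already rejected.

```python
import random

def samples_to_learn(n=10, overlap=0.5, seed=0, cap=100_000):
    """Count teacher sentences until a memoryless learner first holds
    the correct grammar (index 0). `overlap` is an illustrative stand-in
    for the probability that a sentence of the target grammar is also
    compatible with a wrong candidate."""
    rng = random.Random(seed)
    guess = rng.randrange(1, n)            # start from some wrong grammar
    for samples in range(1, cap + 1):
        if rng.random() >= overlap:        # sentence rules out the guess
            guess = rng.randrange(n)       # memoryless random restart
            if guess == 0:                 # correct grammar is absorbing
                return samples
    return cap

# A crude N_delta-style estimate: average the sample count over many runs.
# For these parameters it comes out on the order of n / (1 - overlap).
runs = [samples_to_learn(seed=s) for s in range(500)]
print(sum(runs) / len(runs))
```

Because the learner keeps no memory, the expected number of samples grows linearly with the size n of the search space, which is one way to see why a very large universal grammar would be slow to learn.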

Next, imagine a population of learners, all equipped with a given learning algorithm. The question now becomes how many samples, Nδ, individual learners in a population need for the fraction 1–δ of the population to converge to a common language. The answer will, of course, depend on the specifics of the population dynamics.

Borrowing from population biology, we can define the fitness of speakers of different grammars. We denote by sij the probability that a speaker who uses grammar Gi formulates a sentence that is compatible with grammar Gj. The matrix S = {sij} describes the pair-wise similarity among the n grammars, 0 ≤ sij ≤ 1. We assume there is a reward for mutual understanding. The payoff for an individual using Gi communicating with an individual using Gj is given by aij = (1/2)(sij + sji). This is the average probability that Gi generates a sentence that is parsed by Gj and vice versa. We denote by xi the frequency of individuals who use grammar Gi; the vector x = {x1, …, xn} is defined on the simplex Σi xi = 1, xi ≥ 0.

The average payoff of each of these individuals is given by fi = Σj aij xj, where A = {aij} is a symmetric matrix. Payoff translates into fitness: individuals with a higher payoff produce more offspring. Note that the fitness of individuals strongly depends on the current composition of the population. Such is the nature of communication.
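These definitions are easy to check numerically. The similarity values and frequencies below are made up for illustration; the computation follows the symmetrized payoff aij = (sij + sji)/2 and the frequency-dependent fitness fi = Σj aij xj.

```python
# Illustrative similarity matrix s_ij for n = 3 grammars (made-up values):
# s_ij = probability that a speaker of G_i produces a sentence G_j parses.
S = [[1.0, 0.4, 0.1],
     [0.5, 1.0, 0.2],
     [0.1, 0.3, 1.0]]
n = len(S)

# a_ij = (s_ij + s_ji)/2: symmetric payoff for mutual understanding.
A = [[0.5 * (S[i][j] + S[j][i]) for j in range(n)] for i in range(n)]

x = [0.6, 0.3, 0.1]                     # frequencies on the simplex
assert abs(sum(x) - 1.0) < 1e-12

f = [sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]  # fitness
phi = sum(x[i] * f[i] for i in range(n))   # average fitness (coherence)
print(f, phi)
```

Note how the fitness vector f changes whenever the frequencies x change: this is the frequency dependence the text emphasizes.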

Another biological concept based on the theory of Darwinian evolution is variability. The "mutation rates" are defined as follows: denote by Qij the probability that a child learning from a parent with grammar Gi will end up speaking grammar Gj. Q = {Qij} is a stochastic matrix (its rows add up to one). Interestingly, the findings related to individual teacher-learner pairs can be incorporated in a natural way into the matrix Q.

The last component of the model is the update rule for the evolutionary dynamics. The simplest rule is a deterministic equation, in which each variable has the meaning of its ensemble average and the noise is neglected. It can be written by analogy with the well-known quasispecies equation (Eigen and Schuster, 1979), except that it has a higher degree of nonlinearity (a consequence of the population-dependent fitness). We have

    ẋi = Σj xj fj Qji − φ xi,    i = 1, …, n.    (1)

Here φ = Σi xi fi = (x, f) is the average fitness, or grammatical coherence, of the population, that is, the probability that a sentence said by one person is understood by another person. This equation describes a mutation-selection process in a population of individuals of n types.
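The dynamics can be explored numerically with a minimal integrator. This is a sketch under stated assumptions: it takes equation (1) in the replicator-mutator form ẋi = Σj xj fj Qji − φ xi, uses explicit Euler steps, and picks a small symmetric toy example (three grammars, pairwise similarity a, learning accuracy q) whose parameter values are invented for illustration.

```python
def euler_step(x, S, Q, dt=0.01):
    """One explicit Euler step of the mutation-selection dynamics:
    dx_i/dt = sum_j x_j f_j Q_ji - phi * x_i."""
    n = len(x)
    A = [[0.5 * (S[i][j] + S[j][i]) for j in range(n)] for i in range(n)]
    f = [sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
    phi = sum(x[i] * f[i] for i in range(n))
    return [x[i] + dt * (sum(x[j] * f[j] * Q[j][i] for j in range(n))
                         - phi * x[i]) for i in range(n)]

# Toy setup: three grammars, similarity a off-diagonal, accuracy q.
n, a, q = 3, 0.3, 0.95
S = [[1.0 if i == j else a for j in range(n)] for i in range(n)]
Q = [[q if i == j else (1 - q) / (n - 1) for j in range(n)] for i in range(n)]

x = [0.8, 0.1, 0.1]            # G1 starts as the majority grammar
for _ in range(5000):
    x = euler_step(x, S, Q)
print(x)  # the trajectory stays on the simplex; G1 remains dominant here
```

Because the rows of Q sum to one, the right-hand side of equation (1) sums to zero over i, so the total frequency Σi xi is conserved and the trajectory stays on the simplex.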

COHERENCE THRESHOLD IN POPULATION LEARNING

Numerical simulations (Komarova et al., 2001; Nowak et al., 2001) and analytical estimates (Komarova, 2004) of equation (1) show the following trend. If the matrix Q is close to the identity, there are many coexisting localized steady-state solutions (corresponding to stable fixed points). For each such solution, the majority of the population speaks one of the languages, and the grammatical coherence, φ, takes values close to 1. As Q deviates far from the identity matrix (which means that there is a lot of "noise" in the system, that is, mistakes in learning are very likely), this localization is lost and grammatical coherence becomes low.

A particular, highly symmetrical case of this system has been analyzed by Komarova et al. (2001) and Mitchener (2003). It was found that the low-coherence delocalized solution undergoes a transcritical bifurcation at a critical value of ΔQ, the deviation of the learning matrix Q from the identity matrix; this critical value is defined by the entries of the similarity matrix S. A very interesting fact is that the threshold value of ΔQ does not depend on the dimension of the system, n.
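The loss of coherence as Q moves away from the identity can be seen directly in the fully symmetric case. The sketch below integrates equation (1) for an invented parameter set (n = 10 grammars, off-diagonal similarity a = 0.3, learning accuracy q on the diagonal of Q); the specific numbers are illustrative assumptions, not values from the cited papers.

```python
def coherence_at(q, n=10, a=0.3, steps=4000, dt=0.01):
    """Long-run grammatical coherence phi for the fully symmetric case:
    s_ij = a for i != j, and Q has learning accuracy q on the diagonal."""
    Q = [[q if i == j else (1 - q) / (n - 1) for j in range(n)]
         for i in range(n)]
    x = [0.9] + [0.1 / (n - 1)] * (n - 1)    # seed one majority grammar
    phi = 0.0
    for _ in range(steps):
        # Symmetric payoffs give f_i = x_i + a*(1 - x_i), since sum(x) = 1.
        f = [x[i] + a * (1.0 - x[i]) for i in range(n)]
        phi = sum(x[i] * f[i] for i in range(n))
        x = [x[i] + dt * (sum(x[j] * f[j] * Q[j][i] for j in range(n))
                          - phi * x[i]) for i in range(n)]
    return phi

# Accurate learning sustains a coherent majority; noisy learning loses it:
print(coherence_at(0.99), coherence_at(0.60))
```

With q = 0.99 the seeded majority grammar persists and coherence stays high; with q = 0.60 the population relaxes toward the delocalized state, where coherence drops to roughly a + (1 − a)/n. The sharp contrast between the two regimes is the coherence threshold in miniature.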

A natural question is then to describe the phenomenon of the loss/gain of coherence for general matrices S and Q. For instance, we can assume that the entries of the matrix S are taken from a distribution and that the matrix Q is a function of S. Our results (Komarova, 2004) suggest that the threshold value of ΔQ typically tends to a constant as n → ∞, where n is the size of the system. This finding can be called a universality property of universal grammars. Thus, for a (reasonable) class of learning algorithms (matrix Q) and for any size of universal grammar, n, there is a finite coherence threshold in the system defined by the similarity of the grammars (matrix S).

What is the significance of the coherence threshold for our understanding of language evolution? Our analyses can help us obtain possible bounds on the complexity of universal grammar that are compatible with Darwinian evolution. Indeed, if the space of all possible grammars is too large, learning takes too long (humans have only a limited time for learning before they become adults). At this point, linguistics meets evolutionary biology: there is a selection pressure to make universal grammar smaller and easier to learn. However, a larger pool of grammars also has advantages, such as increased flexibility and a greater capacity for innovation.


DISCUSSION

There are two common misconceptions about the evolution of language. The first represents the human capacity for language as an indivisible unit and implies that its gradual evolution is impossible, because no component of this unit would have any function in the absence of the other components. For example, syntax could not have evolved without phonology or semantics, and vice versa. The other misconception is that the evolution of language started from scratch some 5 million years ago, when humans and chimpanzees diverged. There are virtually no data to support this view.

Both of these views are fundamentally flawed. First, all complex biological systems consist of specific components, although it is often hard to imagine the usefulness of individual components in the absence of other components. The usual task of evolutionary biology is to understand how complex systems can arise from simpler ones gradually by mutation and natural selection. In this sense, human language is no different from other complex traits.

Second, it is clear that the human language faculty did not evolve de novo in the last few million years; it continues a process that had been unfolding in other animals over a much longer time. Many animal species have sophisticated cognitive abilities in terms of understanding the world and interacting with one another. Furthermore, it is a well-known "trick" of evolution to use existing structures for new, sometimes surprising purposes. Monkeys, for example, appear to have brain areas similar to our language centers, but they use them to control facial muscles and to analyze auditory input. It may have been an easy evolutionary task to reconnect these centers for human language. Hence, the human language instinct is most likely not the result of a sudden moment of inspiration of evolution's blind watchmaker, but rather the consequence of several hundred million years of "experimenting" with animal cognition.

The goal of this paper is to show how methods of formal language theory, learning theory, and evolutionary biology can be combined to improve our understanding of the origins and properties of human language. We have formulated a mathematical theory for the population dynamics of grammar acquisition. The key result here is a “coherence threshold” that relates the maximum complexity of the search space to the amount of linguistic input available to the learner and the performance of the learning procedure. The coherence threshold represents an evolutionary stability condition for the language acquisition device. Only a universal grammar that operates above the coherence threshold can induce and maintain coherent communication in a population.

ACKNOWLEDGMENTS

Support from the Alfred P. Sloan foundation is gratefully acknowledged.


REFERENCES

Aitchison, J. 1996. The Seeds of Speech. New York: Cambridge University Press.


Batali, J. 1994. Innate Biases and Critical Periods. Pp. 160–171 in Artificial Life IV, edited by R. Brooks and P. Maes. Cambridge, Mass.: MIT Press.

Bickerton, D. 1990. Language and Species. Chicago, Ill.: University of Chicago Press.


Cangelosi, A. 2001. Evolution of communication and language using signals, symbols and words. IEEE Transactions on Evolutionary Computation 5(2): 93–101.

Cangelosi, A., and D. Parisi, eds. 2001. Simulating the Evolution of Language. New York: Springer Verlag.

Chomsky, N.A. 1956. Three models for the description of language. IRE Transactions on Information Theory 2(3): 113–124.

Chomsky, N.A. 1957. Syntactic Structures. New York: Mouton.

Chomsky, N.A. 1965. Aspects of the Theory of Syntax. Cambridge, Mass.: MIT Press.

Chomsky, N.A. 1972. Language and Mind. New York: Harcourt Brace Jovanovich.

Christiansen, M.H., and S. Kirby, eds. 2003. Language Evolution: The States of the Art. New York: Oxford University Press.


Deacon, T. 1997. The Symbolic Species. London, U.K.: Penguin Books.

Dunbar, R. 1996. Grooming, Gossip, and the Evolution of Language. Cambridge, U.K.: Cambridge University Press.


Eigen, M., and P. Schuster. 1979. The Hypercycle: A Principle of Natural Self-Organisation. Berlin: Springer Verlag.


Fitch, W.T. 2000. The evolution of speech: a comparative review. Trends in Cognitive Science 4(7): 258–267.


Gopnik, M., and M. Crago. 1991. Familial aggregation of a developmental language disorder. Cognition 39(1): 1–50.


Harrison, M.A. 1978. Introduction to Formal Language Theory. Reading, Mass.: Addison-Wesley.

Hauser, M.D. 1996. The Evolution of Communication. Cambridge, Mass.: Harvard University Press.

Hawkins, J.A., and M. Gell-Mann. 1992. The Evolution of Human Languages. Reading, Mass.: Addison-Wesley.

Hurford, J.R., and S. Kirby. 1998. Co-evolution of Language-Size and the Critical Period. Pp. 39–63 in New Perspectives on the Critical Period Hypothesis and Second Language Acquisition, edited by D. Birdsong. Mahwah, N.J.: Lawrence Erlbaum Associates.

Hurford, J.R., M. Studdert-Kennedy, and C. Knight, eds. 1998. Approaches to the Evolution of Language. Cambridge, U.K.: Cambridge University Press.


Jackendoff, R. 1997. The Architecture of the Language Faculty. Cambridge, Mass.: MIT Press.

Jackendoff, R. 1999. Parallel constraint-based generative theories of language. Trends in Cognitive Science 3(10): 393–400.

Jackendoff, R. 2001. Foundations of Language. New York: Oxford University Press.


Knight, C., M. Studdert-Kennedy, and J. Hurford. 2000. The Evolutionary Emergence of Language: Social Function and the Origins of Linguistic Form. New York: Cambridge University Press.

Komarova, N.L. 2004. Replicator-mutator equation, universality property and population dynamics of learning. Journal of Theoretical Biology 230(2): 227–239.

Komarova, N.L., and M.A. Nowak. 2001a. Natural selection of the critical period for language acquisition. Proceedings: Biological Sciences 268(1472): 1189–1196.

Komarova, N.L., and M.A. Nowak. 2001b. The evolutionary dynamics of the lexical matrix. Bulletin of Mathematical Biology 63(3): 451–484.

Komarova, N.L. and M.A. Nowak. 2001c. Population Dynamics of Grammar Acquisition. Pp. 149–164 in Simulating the Evolution of Language, edited by A. Cangelosi and D. Parisi. London: Springer Verlag.

Komarova, N.L., and I. Rivin. 2003. Harmonic mean, random polynomials and stochastic matrices. Advances in Applied Mathematics 31(2): 501–526.


Komarova, N.L., P. Niyogi, and M.A. Nowak. 2001. The evolutionary dynamics of grammar acquisition. Journal of Theoretical Biology 209(1): 43–59.


Lai, C.S., S.E. Fisher, J.A. Hurst, F. Vargha-Khadem, and A.P. Monaco. 2001. A forkhead-domain gene is mutated in a severe speech and language disorder. Nature 413(6855): 519–523.

Levin, S. 2002. Complex adaptive systems: exploring the known, the unknown and the unknowable. Bulletin of the American Mathematical Society 40(1): 3–19.

Lieberman, P. 1984. The Biology and Evolution of Language. Cambridge, Mass.: Harvard University Press.

Lieberman, P. 1991. On the Evolutionary Biology of Speech and Syntax. Pp. 409–429 in Language Origin: A Multidisciplinary Approach, edited by B. Bichakjian, A. Nocentini, and B. Chiareli. Dordrecht, The Netherlands: Kluwer.


Maynard Smith, J., and E. Szathmary. 1995. The Major Transitions in Evolution. New York: Oxford University Press.

Mitchener, W.G. 2003. Bifurcation analysis of the fully symmetric language dynamical equation. Journal of Mathematical Biology 46(3): 265–285.


Nowak, M.A., and N.L. Komarova. 2001. Towards an evolutionary theory of language. Trends in Cognitive Sciences 5(7): 288–295.

Nowak, M.A., N.L. Komarova, and P. Niyogi. 2001. Evolution of universal grammar. Science 291(5501): 114–118.

Nowak, M.A., N.L. Komarova, and P. Niyogi. 2002. Computational and evolutionary aspects of language. Nature 417(6889): 611–617.


Oliphant, M. 1999. The learning barrier: moving from innate to learned systems of communication. Adaptive Behavior 7(3/4): 371–384.


Pinker, S., and P. Bloom. 1990. Natural language and natural selection. Behavioral and Brain Sciences 13(4): 707–784.


Smith, K. 2002. The cultural evolution of communication in a population of neural networks. Connection Science 14(1): 65–84.

Steels, L. 2000. Language as a Complex Adaptive System. Pp. 17–26 in Parallel Problem Solving from Nature. PPSN-VI, edited by M. Schoenauer, K. Deb, G. Rudolph, X. Yao, E. Lutton, J.J. Merelo, and H.-P. Schwefel. Lecture Notes in Computer Science 2000. New York: Springer Verlag.

Steels, L. 2001. Language Games for Autonomous Robots. Pp. 16–22 in IEEE Intelligent Systems, Vol. 16, edited by N. Shadbolt. New York: IEEE Press.

Steels, L., and F. Kaplan. 1998. Spontaneous Lexicon Change. Pp. 1243–1249 in Proceedings of COLING-ACL. Montreal: Association for Computational Linguistics.


Vargha-Khadem, F., K.E. Watkins, C.J. Price, J. Ashburner, K.J. Alcock, A. Connelly, R.S.J. Frackowiak, K.J. Friston, M.E. Pembrey, M. Mishkin, D.G. Gadian, and R.E. Passingham. 1998. Neural basis of an inherited speech and language disorder. Proceedings of the National Academy of Sciences 95(21): 12695–12700.


Wexler, K., and P. Culicover. 1980. Formal Principles of Language Acquisition. Cambridge, Mass.: MIT Press.
