
13

Implementing Behavioral Economics Approaches

Despite strong evidence from across domains and contexts that strategies based on behavioral economics can contribute significantly to important policy objectives, there is also evidence of how challenging it is to apply this academic evidence beyond the scale and setting of the research studies (e.g., Haines & Donald, 1998; Bogenschneider & Corbett, 2010; Kajermo et al., 2010).1 These challenges are in no way unique to the application of ideas from behavioral economics to policy—generalizable insights that work in the real world are elusive in nearly all social science fields (National Research Council, 2012).

A study of the adoption of the results of randomized controlled trials in behavioral economics research, funded by the U.S. Agency for International Development, estimated that less than one-third of the ideas tested were implemented at scales that could yield significant policy results (Kremer, Rao, & Schilbach, 2019). Others have estimated that the so-called voltage drop—the decline in effect sizes that can be expected when an evidence-based program is implemented at a broad scale—may range between 50 and 90 percent (List, 2022). It is reasonable to expect that a significant percentage of ideas tested will not bear fruit, and it is likely that the percentage will vary across domains and types of research. Nevertheless, trying to understand why promising ideas sometimes do not succeed when broadly implemented is clearly important.
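To see what a voltage drop of this magnitude means in practice, consider a purely hypothetical illustration (the numbers below are illustrative and are not drawn from the studies cited above): if a pilot program improves an outcome by 10 percentage points and the voltage drop at scale is 70 percent, the effect to expect from broad implementation is only

\[ 10 \times (1 - 0.70) = 3 \ \text{percentage points}. \]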

___________________

1 This section draws on the paper by Elizabeth Linos commissioned for this study, available at https://nap.nationalacademies.org/resource/26874/NASEM_Commissioned_Report_Linos.pdf


This chapter provides an overview of the challenges that confront policy makers and practitioners when they attempt to use behavioral economics strategies to address policy issues, as well as ways of meeting those challenges. The process of translating research findings into effective, broad-scale, real-world applications is complex and, ideally, involves an interactive feedback loop that links theory, experimentation, design, evaluation, and implementation. The chapter examines the circumstances that affect whether research intended for policy applications succeeds, focusing on the goals of researchers and policy makers, administrative issues, the accessibility of research, and the challenges of translating relevant findings into a new context and putting them to work at the necessary scale.

CONNECTING RESEARCH AND PRACTICE

Much of what is known about behavioral economics has been discovered by academic researchers using randomized controlled trials (e.g., Tversky & Kahneman, 1985). In this work, researchers often control the setting, which helps ensure that the experimental conditions necessary to arrive at causal estimates are met. Such studies often involve the researchers’ own students and are carried out in research labs: that is, they often rely on highly structured study conditions and narrow populations.2 However, testing behavioral economics ideas in the public sector is also essential, despite the sometimes competing objectives of researchers and policy makers.

Researchers and Policy Makers: Different Goals

Researchers are primarily interested in testing hypotheses and expanding understanding of generalizable scientific ideas; thus, they design and test interventions under carefully controlled conditions. Policy makers and others in the public sphere are equally interested in evidence about interventions, but they are focused primarily on finding solutions to social problems and achieving their policy objectives. These agencies and actors value evidence of effectiveness, but their primary goal is not to generate knowledge for its own sake, and they may have little use for findings that do not contribute to their policy objectives. They are also attentive to the populations they serve, their political preferences, and other factors in their immediate contexts.

___________________

2 This was the case for the seminal work of Tversky & Kahneman (1985) on human heuristics and cognitive biases; for the work of Madrian & Shea (2001) on the importance of opt-out default retirement policies, conducted in a large U.S. corporation where the researchers were given extensive control of the experimental setting; and for the Volpp et al. (2009) study of a smoking cessation program at General Electric.

Outside the public sphere, behavioral economics research is also conducted by private, for-profit entities, which use behavioral economics ideas to manage personnel policies at their firms, as well as for marketing and other purposes. They generate findings in response to the questions that arise in their businesses and are not necessarily focused on sharing those results or actively engaging with the research community.

Behavioral Principles in Policy Practice

Research in the settings of public-sector agencies has focused attention on the choices that the public administrator makes, in addition to the decisions made by the people whom the intervention is designed to influence (e.g., Thaler & Sunstein, 2009; Sanders, Snijders, & Hallsworth, 2018). This work has pointed to notable differences in how behavioral economics principles work in the administration of public policy—as opposed to the context of individual decision making.

One key idea from this work is that the cost-benefit model of traditional economics is inadequate to explain the choices that people make when interacting with government administrators and agencies (Moynihan, Herd, & Harvey, 2015; Herd & Moynihan, 2019). People face administrative burden (see Chapter 3) when they participate in a program: that is, the combination of requirements, procedures, and policies developed by policy makers and administrators to regulate the program. But those regulations may reduce the likelihood that people will actually use services and programs. Although this issue is acknowledged by traditional economists, behavioral economics research shows that people’s responses to the presence or removal of administrative burdens significantly exceed what would be predicted by traditional economics.

Complicated paperwork, confusing government websites, and difficulties in communicating with government agencies are all examples of administrative burden. Individuals experience learning costs when they interact with government agencies and have to become familiar with the benefits and entry requirements of different social programs. Individuals may also experience psychological costs, such as stigma associated with program participation or dissonance between their self-image and their stereotype of who needs assistance. Program recipients may also face compliance costs: the burdens of maintaining their eligibility to receive benefits, such as recertification and follow-up interviews.

It has been suggested that public administrators may use these administrative burdens to adjust the flow of cases and the disbursement of expensive benefits, and that the origins of the burdens in the policy and political processes deserve further study (Herd & Moynihan, 2019). The example of the Free Application for Federal Student Aid (FAFSA; see Chapter 9) illustrates the importance of this issue: simplifying FAFSA forms is not only a matter of efficiency in connecting families to financial aid; certain groups may be disproportionately affected by the status quo and would be better able to participate if the system were redesigned.

Another claim is that perceived manipulation through choice architecture in the public sector can lead to reactance (De Jonge, Zeelenberg, & Verlegh, 2018): people respond negatively when they perceive that others, particularly government agencies, are attempting to limit their freedom of choice (Brehm, 1966). Reactance often causes people to make choices that are not consistent with their preferences but that allow them to feel in control of the situation (Jachimowicz et al., 2017). The result is that people make suboptimal choices, choices they would not have made in the absence of the intervention (Jung & Mellers, 2016).

An example from the Netherlands illustrates reactance (Krijnen, Tannenbaum, & Fox, 2017). In 2016, the Netherlands passed a law that presumed a person’s consent for organ donation rather than using opt-in consent. The goal of the legislation was to increase organ donation; it was based on overwhelming evidence showing that opt-out defaults increased participation. In this case, however, the policy choice created reactance, and the number of people opting out rose dramatically, including among those who had previously elected to donate. While the number opting out eventually declined and the policy ultimately increased organ donation, the negative reactance had long-term effects. Thus, it has been argued that, to avoid reactance, nudges should preserve freedom of choice and provide alternatives (Sunstein, 2017).

The conclusion from this and similar work is that the conditions that are possible when research is carried out with a public-sector partner in the real world are quite different from those of an ideal research design (Fels, 2022). In public administration settings, the public partner frequently controls access to the data; the research environment; and, ultimately, the authority to permit the research. This necessary control leads to conflicting goals for the intervention. Academic researchers seek generalizable knowledge, and therefore they place the highest priority on a strong causal research design. In contrast, a public administrator’s primary goal is to understand the success of their particular intervention in a specific context (Sanders, Snijders, & Hallsworth, 2018). Following a study design closely is not generally a top priority for a public administrator, who may be uncomfortable with the need to withhold an intervention from people they perceive to be in need, even if it is done through randomized selection. Additionally, administrators may be willing to act on claims that are less firmly supported than researchers would require. Moreover, many public administrators have decided that the intervention being studied is effective before agreeing to use it, so some of the study requirements may conflict with their public service motivation (Glennerster & Takavarasha, 2013).

Another way in which research in a public-sector context may differ from academic studies has to do with novelty. Findings about applications that produce novel results, such as the importance of social norms regarding utility use, appeal to academic audiences: they are easier to publish and attract attention from academic peers. A study of the social norming principle illustrates this phenomenon, as well as the differing motivations of researchers and administrators. The study showed that people who received notices suggesting they used more energy than their neighbors reduced their own consumption by two percent (Allcott, 2011). The public administrator was eager to apply this finding on a wide scale. However, the researchers suspected that the novelty of the intervention, which made it salient and attention-grabbing, was driving the results. If that were the case, repeated exposure would likely reduce the efficacy of the treatment over time and ultimately make it less valuable for the administrator’s goals than it had seemed (Sanders, Snijders, & Hallsworth, 2018).

ACCESSIBILITY OF RESEARCH

To benefit from evidence, policy makers and others in the public sector need to know it exists, understand what it means, and value its potential to help with the problem they want to solve (Linos, 2022).3 Few policy makers have ample time to follow academic research in depth, and most want to be made aware of new research that is relevant to their work as efficiently as possible.4 Several factors influence the ease with which policy makers can identify and gain access to research findings that are reliable and relevant to their work.

Persistent problems with publication bias (see Chapter 12) can distort impressions of the state of a field even for trained experts. If the evidence most likely to be shared widely reports surprising success stories (that may or may not ultimately be replicated), a policy maker’s ability to understand what evidence to implement is severely limited. Policy makers vary in the training and experience they have had in understanding and applying academic research and in the value and importance they believe it has for their own policy work. Moreover, the same behavioral biases that may affect how any person interprets new information (e.g., present bias, limited attention; see Chapter 3) also apply to policy makers and practitioners (e.g., Moynihan & Lavertu, 2012; Bellé, Cantarelli, & Belardinelli, 2017; Battaglio et al., 2019).

___________________

3 This and the following section draw in part on ideas in Linos (2022), commissioned for the project.

4 For a detailed discussion of this problem, see National Academies of Sciences, Engineering, and Medicine (2022).

Intermediary institutions, such as think tanks and clearinghouses, explicitly attempt to bridge the gaps between academic researchers and policy makers. Such institutions support policy makers in several ways: they translate research findings into everyday language, weigh more rigorous studies more heavily, and synthesize results, making it easier to digest the main findings of large bodies of work. For example, the What Works Network, funded by the government of the United Kingdom, is designed to help public-sector organizations “create, share, and use high-quality evidence in decision-making” (What Works Network, 2013, Introduction). Another example from the United Kingdom is the independent nonprofit Education Endowment Foundation (EEF), which not only funds education research but also collects and disseminates policy-relevant evidence on the relationship between family income and educational achievement in ways that are easy to digest.5 The EEF categorizes different types of educational interventions on the basis of their cost, their likely effect size, and the rigor of their evidence base.

With respect to publication bias, improving research transparency would be a substantial benefit (see Chapter 12). The work of intermediary organizations and efforts to improve research transparency are valuable, but research is needed to improve understanding of the challenges policy makers and practitioners face in learning about and understanding new evidence. For example, it would be helpful to know who in a government agency needs to know and understand the evidence for it to be adopted. Some theories of change emphasize the role of political leaders as central to evidence adoption (e.g., Damanpour & Schneider, 2009), while others focus on individual knowledge brokers in mid-level roles (e.g., Ward, House, & Hamer, 2009; Smits et al., 2018; Meza et al., 2021). Still others argue that innovation depends on distributed innovators across teams and more comprehensive interventions across multiple policy positions (e.g., Grol & Grimshaw, 2003; Meijer, 2014). Research on how training, peer networks, and communities of practice help to spread knowledge would be valuable, as would research on how the materials and dissemination efforts of intermediary organizations affect policy makers.

___________________

5 See https://educationendowmentfoundation.org.uk/


TRANSLATING EVIDENCE FOR USE

Once a policy maker identifies an evidence-based intervention that targets a specified policy objective, the next challenge is to translate the intervention for a particular context: there will never be an “off-the-shelf” intervention that has been tested and can be applied in a new context without modification. Researchers themselves face challenges in replicating studies in different environments, and the challenges for policy makers are greater. Policy makers need to identify the components of an intervention that are essential to its effectiveness, which requires understanding the mechanism through which it brings about change and how that mechanism will function in the policy context in which it is to be used. Ideally, policy makers would rely on evidence produced in a wide set of contexts, including settings that are not primarily WEIRD (western, educated, industrialized, rich, and democratic), when evaluating the relevance of research findings, but this is not always possible.

Because the needed work requires skills, authority, and resources, there has been a growing recognition of the need for full-time employees who focus on data, research, and evaluation in government agencies. Increasingly, agencies have created evidence teams and fellowship programs to help develop the infrastructure needed to translate research into a policy context. Local governments may supplement their resources with hired experts. At the federal level, the Office of Management and Budget has focused on ensuring that all agencies have the capacity to use evidence in decision making.

A separate challenge is adopting an approach at scale. As noted above, effect sizes are generally lower when interventions are taken to scale; this reduction in effects is likely due in part to site selection bias. That is, the characteristics of the sites selected to first test a new behavioral intervention may be correlated with the likelihood of impact (Allcott, 2015). Sites are often selected for study precisely because their populations are expected to be especially amenable to the approach, and study samples may not be representative of the population at large.

There is no substitute for testing new behavioral insights that are candidates for translation from research to practice. The Office of Evaluation Sciences (OES), a part of the federal General Services Administration, supports and conducts randomized controlled trials of research-based ideas that are brought to the public sector. It is worth noting that, as providers of technical assistance to federal agencies, the OES staff focus on making progress with policy objectives rather than on seeking opportunities to test theories or behavioral insights. When designing experiments, OES draws on both the academic literature and previous experiments by the government. By testing similar behavioral concepts in many settings, OES is not only creating empirical evidence of which behavioral insights translate but also actively measuring effect sizes at scale, providing estimates of average effects that internalize any voltage drop.6

The OES example is very useful, but more work on the details of translating intervention mechanisms to new contexts will be valuable. Replicating findings in more diverse contexts is a logical goal, but other approaches could include involving representatives of the population the intervention is intended to help, as well as the frontline workers who will deliver it, in designing adaptations.

The OES example is an illustration of a successful unit at the federal level in the United States. Many other successful examples, such as the Behavioral Insights Team (a nudge unit) in the United Kingdom, are also at the central government level. Creating design, evaluation, and implementation units at the state and local levels is much more challenging because public agencies at those levels have many fewer resources and expert personnel. Some cities have successfully established such units, generally in the form of nudge units. An example is The Lab @ DC, operated by the office of the mayor of the District of Columbia, but city-level units are not common.7 State and local governments are more likely to need the help of external organizations, either intermediary institutions like those discussed above or consulting organizations that can replace the need for in-house government staff.

ADOPTING INTERVENTIONS AT A BROAD SCALE

Even when a solid body of evidence exists and needed changes for the context have been carefully addressed, there is no guarantee that an intervention will perform as expected when implemented on a broad scale, whether in the public or private sector (Athey & Luca, 2019; DellaVigna, Kim, & Linos, 2022; List, 2022). This is true for virtually all interventions, not just behavioral ones. One study of 73 randomized controlled trials conducted in U.S. cities showed that only one-third of the tested behavioral treatments were ultimately adopted (DellaVigna, Kim, & Linos, 2022).

Researchers have investigated why some organizations are better able to implement interventions based on solid evidence. They suggest that larger entities that have greater resources and larger staffs are in a better position to set up routines and practices for transferring knowledge and for learning as an organization, and are therefore better equipped to act effectively on new evidence (e.g., Besley & Persson, 2009; Moynihan & Landuyt, 2009; Argote & Miron-Spektor, 2011; Bekkers, Tummers, & de Vries, 2015). Two other factors are important in policy settings. A change in political leadership may create obstacles to knowledge transfer and may also introduce political calculations about efforts associated with a previous administration. Such political transitions also often entail employee turnover, particularly of career civil servants who may have played a critical role in innovation. The review of 73 randomized controlled trials noted above also suggests that incremental improvements to existing infrastructure are much more likely to be adopted than completely new programs (DellaVigna, Kim, & Linos, 2022).

___________________

6 Recently, OES has also committed to publishing all preanalysis plans (Linos, 2022).

7 See https://thelabprojects.dc.gov

CONCLUSION AND RECOMMENDATIONS

What would make it easier for policy makers and practitioners to implement the evidence-based approaches that they know about, value, and find readily applicable to their contexts? We see two avenues for strengthening this capacity: (1) increased attention to collaboration between those trained in behavioral economics and those trained in implementation science or public management, and (2) improved training in behavioral economics to help prepare policy makers and staff to collaborate in translating research ideas for real-world policy development and design. A new subfield of public administration scholarship dedicated to using evidence from behavioral economics has also started to gain momentum, but in the committee’s view, behavioral economics or another field that directly addresses behavior should be a core element of the curriculum for students preparing for careers in public administration (e.g., Grimmelikhuijsen et al., 2017).

Conclusion 13-1: Collaboration among researchers and policy makers is invaluable both for the continued development of knowledge about the application of behavioral economics to policy and for the development of effective policies. The development of strong intermediary institutions that can bring the two groups together and assist in translation between the languages of research and practice could contribute to such collaborations.

Recommendation 13-1: Government units should consider adopting the example of the Office of Evaluation Sciences, in the General Services Administration, to support and fund in-house capabilities for integrating behavioral specialists into policy development, such as through institutional structures that facilitate learning and collaboration among policy makers and researchers in the design, implementation, and evaluation of behavioral economics–based policies in all relevant domains. The use of temporary research appointments and consulting organizations could bring expertise and assistance to state and local government entities that cannot afford permanent in-house staff.

Recommendation 13-2: University leaders should ensure that training in the principles of behavioral economics and critical thinking about their translation and application to policy making is a core component of training for students pursuing degrees in public administration.

REFERENCES

Allcott, H. (2011). Social norms and energy conservation. Journal of Public Economics, 95(9–10), 1082–1095. https://doi.org/10.1016/j.jpubeco.2011.03.003

———. (2015). Site selection bias in program evaluation. The Quarterly Journal of Economics, 130(3), 1117–1165. https://doi.org/10.1093/qje/qjv015

Argote, L., & Miron-Spektor, E. (2011). Organizational learning: From experience to knowledge. Organization Science, 22(5), 1123–1137. https://doi.org/10.1287/orsc.1100.0621

Athey, S., & Luca, M. (2019). Economists (and economics) in tech companies. Journal of Economic Perspectives, 33(1), 209–230. https://doi.org/10.1257/jep.33.1.209

Battaglio Jr., R. P., Belardinelli, P., Bellé, N., & Cantarelli, P. (2019). Behavioral public administration ad fontes: A synthesis of research on bounded rationality, cognitive biases, and nudging in public organizations. Public Administration Review, 79(3), 304–320. https://doi.org/10.1111/puar.12994

Bekkers, V. J. J. M., Tummers, L., & de Vries, H. (2015). Innovation in the public sector: A systematic review and future research agenda. Public Administration, 94(1), 146–166. https://doi.org/10.1111/padm.12209

Bellé, N., Cantarelli, P., & Belardinelli, P. (2017). Cognitive biases in performance appraisal: Experimental evidence on anchoring and halo effects with public sector managers and employees. Review of Public Personnel Administration, 37(3), 275–294. https://doi.org/10.1177/0734371X17704891

Besley, T., & Persson, T. (2009). The origins of state capacity: Property rights, taxation, and politics. American Economic Review, 99(4), 1218–1244. https://doi.org/10.1257/aer.99.4.1218

Bogenschneider, K., & Corbett, T. I. (2010). Evidence-based policymaking: Insights from policy-minded researchers and research-minded policymakers. Routledge.

Brehm, J. W. (1966). A theory of psychological reactance. Academic Press.

Damanpour, F., & Schneider, M. (2009). Characteristics of innovation and innovation adoption in public organizations: Assessing the role of managers. Journal of Public Administration Research and Theory, 19(3), 495–522. https://doi.org/10.1093/jopart/mun021

De Jonge, P., Zeelenberg, M., & Verlegh, P. W. (2018). Putting the public back in behavioral public policy. Behavioural Public Policy, 2(2), 218–226. https://doi.org/10.1017/bpp.2018.23

DellaVigna, S., Kim, W., & Linos, E. (2022). Bottlenecks for evidence adoption. NBER Working Paper 30144. National Bureau of Economic Research. https://doi.org/10.3386/w30144

Fels, K. M. (2022). Who nudges whom? Expert opinions on behavioural field experiments with public partners. Behavioural Public Policy, 1–37. https://doi.org/10.1017/bpp.2022.14

Glennerster, R., & Takavarasha, K. (2013). Running randomized evaluations: A practical guide. Princeton University Press. https://doi.org/10.1515/9781400848447

Grimmelikhuijsen, S., Jilke, S., Olsen, A. L., & Tummers, L. (2017). Behavioral public administration: Combining insights from public administration and psychology. Public Administration Review, 77(1), 45–56. https://doi.org/10.1111/puar.12609


Grol, R., & Grimshaw, J. (2003). From best evidence to best practice: Effective implementation of change in patients’ care. The Lancet, 362(9391), 1225–1230. https://doi.org/10.1016/S0140-6736(03)14546-1

Haines, A., & Donald, A. (1998). Making better use of research findings. British Medical Journal, 317(7150), 72–75. https://doi.org/10.1136/bmj.317.7150.72

Herd, P., & Moynihan, D. P. (2019). Administrative burden: Policymaking by other means. Russell Sage Foundation.

Jachimowicz, J. M., Chafik, S., Munrat, S., Prabhu, J. C., & Weber, E. U. (2017). Community trust reduces myopic decisions of low-income individuals. Proceedings of the National Academy of Sciences, 114(21), 5401–5406. https://doi.org/10.1073/pnas.1617395114

Jung, J. Y., & Mellers, B. A. (2016). American attitudes toward nudges. Judgment & Decision Making, 11(1). https://journal.sjdm.org/15/15824a/jdm15824a.pdf

Kajermo, K. N., Boström, A. M., Thompson, D. S., Hutchinson, A. M., Estabrooks, C. A., & Wallin, L. (2010). The BARRIERS scale—The barriers to research utilization scale: A systematic review. Implementation Science, 5(1), 1–22.

Kremer, M., Rao, G., & Schilbach, F. (2019). Behavioral development economics. Handbook of behavioral economics: Applications and foundations 1, 2, 345–458. North-Holland. https://doi.org/10.1016/bs.hesbe.2018.12.002

Krijnen, J. M., Tannenbaum, D., & Fox, C. R. (2017). Choice architecture 2.0: Behavioral policy as an implicit social interaction. Behavioral Science & Policy, 3(2), i–18. https://doi.org/10.1353/bsp.2017.0010

Linos, E. (2022). Translating behavioral economics evidence into policy and practice. Commissioned paper prepared for the Committee on Future Directions for Applying Behavioral Economics to Policy, National Academies of Sciences, Engineering, and Medicine. https://nap.nationalacademies.org/resource/26874/NASEM_Commissioned_Report_Linos.pdf

List, J. A. (2022). The voltage effect: How to make good ideas great and great ideas scale. Currency.

Madrian, B. C., & Shea, D. F. (2001). The power of suggestion: Inertia in 401(k) participation and savings behavior. The Quarterly Journal of Economics, 116(4), 1149–1187. https://doi.org/10.1162/003355301753265543

Meijer, A. J. (2014). From hero-innovators to distributed heroism: An in-depth analysis of the role of individuals in public sector innovation. Public Management Review, 16(2), 199–216. https://doi.org/10.1080/14719037.2013.806575

Meza, R. D., Triplett, N. S., Woodard, G. S., Martin, P., Khairuzzaman, A. N., Jamora, G., & Dorsey, S. (2021). The relationship between first-level leadership and inner-context and implementation outcomes in behavioral health: A scoping review. Implementation Science, 16(1), 69. https://doi.org/10.1186/s13012-021-01104-4

Moynihan, D. P., & Landuyt, N. (2009). How do public organizations learn? Bridging cultural and structural perspectives. Public Administration Review, 69(6), 1097–1105. https://doi.org/10.1111/j.1540-6210.2009.02067.x

Moynihan, D. P., & Lavertu, S. (2012). Does involvement in performance management routines encourage performance information use? Evaluating GPRA and PART. Public Administration Review, 72(4), 592–602. https://doi.org/10.1111/j.1540-6210.2011.02539.x

Moynihan, D., Herd, P., & Harvey, H. (2015). Administrative burden: Learning, psychological, and compliance costs in citizen-state interactions. Journal of Public Administration Research and Theory, 25(1), 43–69. https://doi.org/10.1093/jopart/muu009

National Academies of Sciences, Engineering, and Medicine. (2022). Ontologies in the behavioral sciences: Accelerating research and the spread of knowledge. The National Academies Press. https://nap.nationalacademies.org/login.php?record_id=26464

National Research Council. (2012). Using science as evidence in public policy. The National Academies Press. https://nap.nationalacademies.org/login.php?record_id=13460


Sanders, M., Snijders, V., & Hallsworth, M. (2018). Behavioural science and policy: Where are we now and where are we going? Behavioural Public Policy, 2(2), 144–167. https://doi.org/10.1017/bpp.2018.17

Smits, P., Denis, J.-L., Préval, J., Lindquist, E., & Aguirre, M. (2018). Getting evidence to travel inside public systems: What organisational brokering capacities exist for evidence-based policy? Health Research Policy and Systems, 16(1), 122. https://doi.org/10.1186/s12961-018-0393-y

Sunstein, C. R. (2017). Human agency and behavioral economics: Nudging fast and slow. Springer.

Thaler, R. H., & Sunstein, C. R. (2009). Nudge: Improving decisions about health, wealth, and happiness. Penguin Books.

Tversky, A., & Kahneman, D. (1985). The framing of decisions and the psychology of choice. Behavioral decision making, 25–41. Springer. https://doi.org/10.1007/978-1-4613-2391-4_2

Volpp, K. G., Troxel, A. B., Pauly, M. V., Glick, H. A., Puig, A., Asch, D. A., Galvin, R., Zhu, J., Wan, F., DeGuzman, J., & Corbett, E. (2009). A randomized, controlled trial of financial incentives for smoking cessation. New England Journal of Medicine, 360, 699–709. https://doi.org/10.1056/NEJMsa0806819

Ward, V., House, A., & Hamer, S. (2009). Knowledge brokering: The missing link in the evidence to action chain? Evidence & Policy, 5(3), 267–279. https://doi.org/10.1332/174426409X463811

What Works Network. (2013). What Works Network. Last updated January 17, 2023. https://www.gov.uk/guidance/what-works-network
