Privacy Research and Best Practices: Summary of a Workshop for the Intelligence Community (2016)

5

Social Science and Behavioral Economics of Privacy—Panel Summary

REMARKS FROM PANELISTS

Frederick Chang, director of the Darwin Deason Institute for Cyber Security, the Bobby B. Lyle Endowed Centennial Distinguished Chair in Cyber Security, and professor in the Department of Computer Science and Engineering in the Lyle School of Engineering at Southern Methodist University, introduced the next panel on the topics of attitudes, preferences, and behaviors as they relate to privacy. He noted that the panel would touch on attitudinal surveys, privacy behaviors, how much people pay for privacy, and the societal impacts of privacy, informed by the panelists’ backgrounds in economics, behavioral economics, computer science, psychology, media and communications, law, and information systems. Chang, as moderator, introduced the following panelists and gave each of them 5 minutes for opening comments:

  • Idris Adjerid, assistant professor of management, University of Notre Dame;
  • Jessica Staddon, associate professor of computer science, North Carolina State University;
  • Joseph Turow, Robert Lewis Shayon Professor of Communication at the Annenberg School for Communication, University of Pennsylvania; and
  • Katherine Strandburg, Alfred B. Engelberg Professor of Law, New York University.

Idris Adjerid discussed his research on the economics of privacy with a focus on behavioral economics. Through his work, he has found that privacy decision making is particularly susceptible to deviations from rational choice. This runs counter to the common assumption that people have consistent privacy preferences, make rational decisions, and act in their own best interest. He noted that there are many examples that illustrate such irrational privacy behavior.

He highlighted the specific example of the control paradox, the phenomenon whereby the illusion of control can be comforting. Experimental work has shown that giving users more controls puts them at ease and makes them more willing to disclose information—whether or not the controls actually enhance benefits or reduce risk.1 This work suggests that systematic changes can be induced in people’s behavior by manipulating subtle or even insubstantial factors—counter to rational models of behavior.

One study demonstrating this effect presented the same set of substantive privacy controls to participants under different labels. Participants presented with options labeled “privacy settings” were 56 percent more likely to actually use the protective options than a group given the same options labeled “survey settings.” A similar experiment suggested that perceived changes in the risk of data disclosure can have a more profound effect on behavior than objective differences in risk.
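
To make the comparison concrete, the following is a minimal sketch of how such a two-arm labeling experiment is typically analyzed. The counts are invented, not the study’s data (they merely happen to reproduce a 56 percent relative lift), and the two-proportion z-test is one standard choice, not necessarily the analysis the authors used.

```python
# Hypothetical two-arm labeling experiment: how many of 200 participants
# per arm used the protective options under each label. Invented counts.
from statsmodels.stats.proportion import proportions_ztest

used = [78, 50]     # "privacy settings" arm vs. "survey settings" arm
n = [200, 200]

stat, pvalue = proportions_ztest(used, n)
relative_lift = (used[0] / n[0]) / (used[1] / n[1]) - 1

print(f"uptake: {used[0]/n[0]:.0%} vs. {used[1]/n[1]:.0%}")
print(f"relative lift: {relative_lift:.0%}")          # 56% with these counts
print(f"two-proportion z-test p-value: {pvalue:.4f}")
```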

Adjerid noted the large amount of evidence suggesting that people are bad at making privacy decisions, but also that this is not always the case. For example, a recent study out of MIT by Catherine Tucker and Alex Marthews2 found a measurable change in people’s search behavior after the recent disclosures about intelligence collection practices. In particular, people were less likely to search on Google for terms that had previously been identified as personally sensitive, suggesting a deliberate attempt to mask search interests.
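
A study of this kind rests on difference-in-differences logic: compare search volume for sensitive terms against neutral terms, before versus after the disclosures. The sketch below illustrates that logic on synthetic data (not Marthews and Tucker’s dataset); the interaction coefficient estimates the differential drop for sensitive terms.

```python
# Difference-in-differences sketch on synthetic search-volume data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for sensitive, post_shift in [(1, -8.0), (0, 0.0)]:  # sensitive terms drop by ~8
    for post in (0, 1):
        for _ in range(50):                          # 50 term-week observations per cell
            rows.append({
                "volume": 100 + post * post_shift + rng.normal(0, 5),
                "sensitive": sensitive,
                "post": post,
            })
df = pd.DataFrame(rows)

# The coefficient on sensitive:post is the differential change in search
# volume for sensitive terms after the disclosures.
model = smf.ols("volume ~ sensitive * post", data=df).fit()
print(model.params["sensitive:post"])   # close to -8 with this synthetic setup
```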

__________________

1 L. Brandimarte, A. Acquisti, and G. Loewenstein, 2013, Misplaced confidences: Privacy and the control paradox, Social Psychological and Personality Science 4(3):340-347.


Adjerid has examined how behavioral economics translates to consumers’ decisions about themselves, with an eye to the limits of rationality. Another important topic—on which little research has been conducted—is how individuals make decisions about other people’s privacy, which was touched on in the previous panel’s discussion of tagging images on Facebook. It is unclear whether individuals will be good at managing other people’s privacy and which factors affect that ability. He suggested that this question could be relevant to how individuals in the IC approach decisions involving sensitive data about other people, and how appropriate mindsets might be systematically encouraged.

Jessica Staddon described her background in industry research, most recently at Google. She noted that she was about to begin an academic position at North Carolina State University and was not representing Google at the workshop. She focused her remarks on three main points: (1) the value of transparency, (2) privacy measurement, and (3) industry’s role in the privacy ecosystem.

Like many previous panelists, Staddon emphasized the value of transparency, drawing on insights from her time in industry. She described Google Dashboard, which enables users to see their own search history as tracked by Google. In the face of concerns that the information revealed through such tools might cause users to recoil and use Google less (or even close their accounts), she and her colleagues searched for any evidence that such transparency tools had a negative effect. They found none. To the contrary, researchers found many positive associations between the use of these tools and feelings of trust and control, based on direct user feedback. She noted that, in general, data owners should be thinking about how to return some value or utility to the users from whom the data were derived, and that transparency is one way to provide such utility.

Staddon then discussed challenges surrounding privacy measurement. She noted that many in the privacy research community have been pushing for the same kinds of standards that are found in the hard sciences—more of a culture of repeatability, with more consistency of measurement techniques and more objective measurements. Staddon has found that many research findings are inconsistent—for example, she knows of some very solid papers that find no difference in privacy concerns between genders, and some very solid papers that do. There is also a wide variation in reported rates of concern around certain privacy topics.

She noted that we have a poor knowledge of historical trends. Many questions will be hard to answer without some consensus and some means for documenting privacy incidents. For example:

  • Are privacy incidents becoming more common?
  • What is the most common cause of such incidents?
  • How do these trends vary by geography, or by demographics?

She suggested one source for such data could be a crowd-sourced but moderated (along the lines of Wikipedia) privacy data repository where people could report and document events.
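
As a rough illustration of what such a repository might store, here is a minimal, hypothetical incident schema; every field name and example entry is invented. Once records like these exist, the questions above reduce to simple aggregations.

```python
# Hypothetical schema for a moderated, crowd-sourced privacy-incident
# repository. All fields and entries are invented for illustration.
from collections import Counter
from dataclasses import dataclass
from datetime import date

@dataclass
class PrivacyIncident:
    occurred: date
    organization: str
    cause: str             # e.g., "breach", "re-identification", "policy change"
    records_affected: int
    region: str
    source_url: str        # moderated citation backing the report

incidents = [
    PrivacyIncident(date(2014, 9, 2), "ExampleCo", "breach",
                    10_000, "US", "https://example.org/report-1"),
    PrivacyIncident(date(2015, 7, 15), "SampleCorp", "re-identification",
                    500, "EU", "https://example.org/report-2"),
]

# Trend questions become simple aggregations over the records:
print(Counter(i.occurred.year for i in incidents))          # incidents per year
print(Counter(i.cause for i in incidents).most_common(1))   # most common cause
```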

Finally, Staddon expressed concern about the privacy ecosystem—especially the role of industry, a major player. She noted that industry holds a vast amount of data about privacy preferences and behaviors, and suggested that it needs to be part of the conversation and to work on privacy innovations. She has heard anecdotes suggesting that many companies are becoming increasingly risk averse about research and development that could lead to privacy innovations, likely because of regulatory barriers to conducting such research and the potential that findings about the privacy implications of their own practices could invite more scrutiny from oversight bodies.

__________________

2 A. Marthews and C. Tucker, 2014, “Government Surveillance and Internet Search Behavior,” March 23, available at SSRN 2412564, https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2412564.


Joseph Turow described the two major thrusts of his research. He has long studied how companies deal with digital relationships and issues of surveillance, particularly in the marketing and retailing sectors. Since 1999, he has also worked on approximately ten national surveys conducted through major research companies. He went on to summarize some of his basic findings.

First, people generally know that their online activities are being tracked—this is apparent from a range of surveys, as well as from anecdotal reports.

Second, there is nonetheless widespread ignorance about the details of what goes on in the digital arena. For example, people think the government is protecting them more than it actually does. This has been observed around phenomena such as price discrimination: at least two surveys have suggested that people think it is illegal for companies to charge different people different prices for the same goods. Turow suggested that such ignorance arises not because people lack intelligence but because they are busy and simply do not have the time to examine and interpret the complicated information that exists about such protections.

Third, there is evidence that people philosophically do not like trade-offs. In one of his recent surveys,3 he found the following:

  • 91 percent of respondents disagreed with the statement that receiving a discount is a fair exchange for companies collecting their data without their knowledge;
  • 71 percent disagreed with the statement that it is fair for a store to monitor their online activity while shopping there in exchange for free access to the store’s Wi-Fi; and
  • 51 percent disagreed with the statement that it is okay for a store to use the information it has about them to create a picture of them that would help provide them with better service.

Nonetheless, many firms say that people accept trade-offs.

In their report on this survey,4 Turow and his coauthors argue that it is not the case that people consent to giving away their personal information in exchange for some benefit; on the contrary, the evidence suggests that Americans are simply resigned to having their data taken.

He noted some specific findings from the study (a quick consistency check of these figures appears after the list):

  • 84 percent of respondents agreed that “I want to have control over what marketers know about me.”
  • 65 percent of respondents agreed that “I have come to accept that I have little control about what marketers know about me.”
      — 58 percent agreed with both of these statements.
  • 72 percent of respondents disagreed with the statement that “What companies know about me online cannot hurt me.”
      — 41 percent of respondents fell into all three of these categories.
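
These overlaps can be sanity-checked against the marginal percentages: for any two events A and B, the share falling in both must lie between max(0, P(A) + P(B) − 1) and min(P(A), P(B)), the Fréchet bounds. A minimal sketch of that check:

```python
# Check that the reported overlaps are arithmetically consistent with
# the reported marginals, using the Frechet bounds.
def frechet_bounds(p_a: float, p_b: float) -> tuple[float, float]:
    return max(0.0, p_a + p_b - 1.0), min(p_a, p_b)

p_want_control, p_accept_little, p_can_be_hurt = 0.84, 0.65, 0.72

lo, hi = frechet_bounds(p_want_control, p_accept_little)
assert lo <= 0.58 <= hi   # agreed with both: 0.49 <= 0.58 <= 0.65

lo, hi = frechet_bounds(0.58, p_can_be_hurt)
assert lo <= 0.41 <= hi   # all three: 0.30 <= 0.41 <= 0.58
print("reported overlaps are consistent with the reported marginals")
```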

Turow said these findings imply that something serious is going on. He suggested that it is not just individuals who are suspicious, but companies as well. He recalled a conversation with a contact at a major retailer who was concerned that large Internet companies could share information about customer activities with the retailer’s competitors.

He concluded by noting that these findings and anecdotes raise many interesting questions about harm and public perceptions. He suggested that these findings could relate directly to how people view the IC.

Katherine Strandburg focused her remarks around two themes: (1) intrinsic failures of any online privacy market and (2) the social value of privacy. She began by arguing that the idea of an online privacy market fails because such a market does not accurately reflect consumer preferences. In general, markets enhance social welfare because transactions reflect consumer preferences—people are willing to pay what the product is worth—and this is a good thing. However, in the case of privacy, consumers do not know what price they are paying when their personal data are being collected.

__________________

3 J. Turow, M. Hennessy, and N. Draper, 2015, “The Tradeoff Fallacy: How Marketers Are Misrepresenting American Consumers and Opening Them Up to Exploitation,” Annenberg School for Communication, University of Pennsylvania, June, https://www.asc.upenn.edu/sites/default/files/TradeoffFallacy_1.pdf.

4 Ibid.


She reiterated Turow’s point that the fact that consumers acquiesce to the collection of their personal data does not accurately signal their preferences. She said that there are many reasons consumers do not understand the price of giving up their own data, including (1) information asymmetry and (2) behavioral economics factors. She emphasized that the most fundamental problem with the idea of price in this context is that the incremental cost to a consumer of giving up data depends on what data about them are already out there and what data about them will be disclosed in the future by anyone. The price or cost of giving up any single piece of data in any single transaction is thus effectively unknowable, so the idea of a privacy price is meaningful only in aggregate. The same is true when considering the value to an individual consumer of a given privacy-protecting alternative—the consumer does not know whether the information could be obtained anyway through some other means.
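
The point that the marginal cost of one more disclosure depends on what is already known can be made concrete with a toy anonymity-set calculation, in the spirit of k-anonymity. The records below are synthetic and not drawn from Strandburg’s remarks; the same attribute release is harmless or uniquely identifying depending on the prior releases.

```python
# Toy anonymity-set calculation on synthetic records: how many people
# share "my" combination of released attributes?
import pandas as pd

people = pd.DataFrame({
    "zip":        ["10001"] * 4 + ["10002"] * 4,
    "birth_year": [1970, 1970, 1985, 1990, 1970, 1985, 1985, 1990],
    "gender":     ["F", "M", "F", "M", "F", "M", "F", "M"],
})
me = {"zip": "10001", "birth_year": 1990, "gender": "M"}

def anonymity_set(released: list[str]) -> int:
    """Count records matching mine on every released attribute."""
    mask = pd.Series(True, index=people.index)
    for col in released:
        mask &= people[col] == me[col]
    return int(mask.sum())

print(anonymity_set(["zip"]))                 # 4: still lost in the crowd
print(anonymity_set(["zip", "birth_year"]))   # 1: already uniquely identified
# The marginal "price" of then disclosing gender is zero here, but it
# would not have been zero under a different release order.
```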

Strandburg introduced the possibility, in line with the results of Turow’s study, that consumers treat online activity as an all-or-nothing proposition: they choose to be online and risk compromising their privacy, or they choose not to be online and lose the value of the Internet. She suggested that this could explain the so-called paradox that people expose their data online even though they say they care about privacy. This would mean that we are in a bad equilibrium, in which it is hard for an individual to switch out of a market whose most useful services are accessible only by giving up one’s data. Such a change would require collective action; it does no good to make a partial switch to a privacy-protective service. Given the barriers to change, data collection persists even when consumers would prefer an alternative system.

Strandburg then pointed out that privacy is not only an individual value, and that online privacy markets, consent-based systems, and even democratic voting systems do not necessarily account for the social value of privacy. She explained that privacy may have positive externalities, such as the social benefits of individuals exploring non-majoritarian views. Another example is the network benefits of emerging technologies such as social media, which are not taken into account by individuals; surveillance can have chilling and conforming effects on people and their decision making that would inhibit these network benefits. She also pointed out that technology can change the locus of social life, and the implications of a particular surveillance practice can change significantly as technology changes.

Strandburg also noted that short-term benefits may be overestimated in comparison to long-term social costs, and that surveillance impacts might be undercounted because they are often concentrated in underrepresented or economically disadvantaged groups. She suggested that we need more empirical studies of the social costs and benefits of privacy, rather than just the costs and benefits for individuals.

PANEL DISCUSSION

Chang then led the panel in discussion, which centered around several topics.

Recent Examples of Tipping Points

Turow suggested that companies that have experienced data breaches, such as Target, may have lost some business in the short term, and incurred the cost of credit protection services for those who were affected, but it is not clear that they will suffer long-term consequences. He said he used to think that people would rise up and reject online tracking for marketing purposes, but the changes are incremental, and people may be so wedded to the status quo that it could take a huge “disaster” for real changes to occur. He suggested that we have not really seen such a tipping point yet in the marketing sphere.

Turow went on to agree with Strandburg’s remarks that the real question is societal. He proposed that we should be thinking about the broader trajectory of society. He questioned what would happen in the absence of visible tipping points. Chang and Turow suggested that it is not so much about tipping points as it is about the changing sense of what is normal, and how that impacts society.


Adjerid agreed that there is probably not a single identifiable event that marks a tipping point for privacy concerns at the societal level. He suggested instead that there may be a series of mini-events that, depending upon how they are presented, could result in individuals experiencing tipping points. He provided an example of a recent study in which privacy-relevant information was presented in a salient way to mobile users. In one instance, being told that a particular app leaked location information thousands of times in a single day was a significant point for individual users. He suggested that, rather than experiencing some exogenous societal shock, individual members of society experience their own personal tipping points. However, over time a critical mass may change their perspective, resulting in a societal push to respond to a certain practice.

Staddon added that the flood of re-identifications of purportedly anonymous databases (e.g., publicly released AOL search logs and Netflix customer rental data) has changed attitudes—but so far seems to have produced resignation rather than calls to action. Turow suggested that the Ashley Madison breach could be a tipping point for a subset of the population.

Strandburg suggested that the Snowden disclosures might have been a tipping point with respect to the IC—though it may be too soon to say—akin to the revelations that led to the Church Committee investigation and subsequent reforms. Invoking her background in physics, Strandburg suggested that we are arguably in a metastable situation, from which it can be difficult to return to an optimal state, even through tipping points.

Anticipating Future Tipping Points

Turow recommended caution when thinking about how attitudes and behaviors will change across time, in particular while considering generational differences. Many have suggested that millennials think differently about privacy than members of older generations, but some of Turow’s research shows that the thinking is not as different as one might imagine. He also suggested that an individual’s attitudes might change over time.

Adjerid voiced concern about the risks of relegating privacy protection to a notice-and-consent paradigm (where users are provided a sometimes overwhelmingly complex privacy policy to which they must consent in order to use a service), as it puts the burden on users while avoiding some of the hard questions around what uses are appropriate or inappropriate. He also noted that the IC does not have this luxury. He suggested it will not be easy to tell how quickly we might approach a tipping point around this model.

Strandburg suggested that we might start seeing tipping points as awareness and concern about equality issues increase. People may feel less resigned or have different expectations around equality. Turow agreed, and noted that he is finding more and more concern about equality in the retail space. But he also noted that Americans have come to accept many hierarchies and inconveniences, such as airline boarding protocols and luggage restrictions, which he referred to as the “gerbilization” of life. He pointed out that, in general, no one stands up and complains, but suggested that inequality in the application of certain inconveniences could lead to a tipping point.

Turow also expressed concern that some parties might attempt to make people believe they can have control over their information without actually providing significant controls, a practice he referred to as “pseudo-transparency.” For example, an app may ask for permission to use your location data without telling you that it plans to sell the data to other companies. Publicity about pseudo-transparency could make people very angry.

Improving Trust

Adjerid suggested that concerns about data misuse—or perceived misuse—could be assuaged by demonstrating how individuals within an organization can make the right decisions in a context where there is not a clear right and wrong. He then discussed some research on how individuals make decisions about other people’s privacy, describing some early but interesting findings related to the roles of reciprocity and social norms.


He has found that most individuals say they are respectful and cognizant of others’ privacy, but fewer say that other people are respectful and cognizant of their privacy. This could be due to a disconnect between how considerate we perceive ourselves to be and how considerate we perceive others to be. Another possible explanation is that any visible bad actor skews an individual’s perceptions of others.

Adjerid is also examining whether “niceness” predicts cognizance of another’s privacy, by evaluating results in terms of where respondents fall on the psychopathy spectrum. His early results suggest no significant difference; psychopathy did not have a moderating effect. Most people, including “nice” people, are more willing to disclose sensitive information about other people than about themselves, which suggests that the role of individual judgment in gray areas is not straightforward.

Adjerid suggested that similar arguments can be made about individual decision-making around norms in an institutional setting. Individuals may impose—even subconsciously—their own perceptions about the proper trade-offs between security and privacy for the people whose data they are working with. This could potentially happen in the IC, and does happen in private organizations.

Staddon added that she sees similarities between large Internet companies, such as Google, and the IC. She recalled comments from previous panels that it is difficult for an outsider to really know what the IC does. She suggested that simply being more open might help—for example, about the utility being returned to the public, the types of things that are being studied, or even the fact that thought is going into these decisions. This would come with risks. People might try to poke holes in the chosen practices, or ask questions that the IC cannot answer—but more openness could still have significant benefits.

Strandburg suggested that trust could be improved by being more transparent about the efficacy of the IC’s privacy protection practices. She also noted that her somewhat limited interactions with people in the IC suggest a very high degree of professionalism in the community. Instilling certain ethical attitudes as part of the professional identity of a community can be important, particularly for the IC.

Turow added that he thinks the public wants to see evidence that their information is being handled with respect. He suggested that, before the recent disclosures about intelligence practices, Americans gave more credit to the government than to marketers when it came to privacy but that this attitude has probably changed. He also noted that there is no simple recipe for building trust. It will be an incremental process, during which the community must demonstrate a genuine respect for the larger society. It will take a lot of time.

Alexander Joel, civil liberties protection officer, ODNI, responded to some of the panel’s remarks. He noted that the ideas for building trust were consistent with the IC’s new efforts. In particular, the IC has principles of intelligence transparency that they are working hard to implement. He also recognized the need to be more transparent about the utility of the IC, because people’s attitudes about a particular service or trust relationship depend in part on the value that it provides them. It is much easier to see such value with online services (for example, a Web service remembering a customer’s preferences) than with the IC.

He also reiterated the idea of professionalism and ethics, and noted the published principles of professional ethics for the intelligence community,5 including sections on mission, truth, lawfulness, integrity, stewardship, excellence, and diversity. The section on lawfulness reads as follows:

We support and defend the Constitution, and comply with the laws of the United States, ensuring that we carry out our mission in a manner that respects privacy, civil liberties, and human rights obligations.

Joel pointed out that the research on how people feel about their own privacy vs. that of other people, addressed by Adjerid, has obvious relevance to the community. He noted that, without a personal stake in the matter, data managers can be very rules- and compliance-focused.

__________________

5 Office of the Director of National Intelligence, “Principles of Professional Ethics for the Intelligence Community,” http://www.dni.gov/index.php/intelligence-community/principles-of-professional-ethics, accessed September 8, 2015.


OPEN DISCUSSION

All participants were then invited by Chang to ask questions and discuss the points and themes presented by the panel. Chang began by asking whether there is a privacy equivalent of the critical security controls in the IC. Several participants pointed out that the IC has both security controls and recently published privacy controls: the privacy overlay can be found as Attachment 6 in Appendix F of Committee on National Security Systems Instruction (CNSSI) 1253.6 The privacy controls aim to operationalize the Fair Information Practice Principles (FIPPs).

Public Perceptions and Trust

A participant noted that many in the public do not necessarily appreciate the distinction between different parts of the government, and thus do not believe that oversight of one government group by another is truly independent. The participant also pointed out that some information, including that which reflects the highest impacts of intelligence activities, is and must stay secret. The participant asked whether this undermines the IC’s ability to build trust.

Adjerid suggested that demonstration of value and generation of trust are two distinct concepts. An example of demonstrating value would be telling the American public that the activities undertaken will benefit them. He suggested that a more direct strategy for building trust, one that does not necessarily require the release of secrets, is to carefully articulate what is being done to protect data.

Another participant noted that being transparent does not necessarily mean revealing everything. Nonetheless, it is not clear what kind of feasible transparency would make the public feel comfortable. By analogy, most people do not really understand the details of how a car works, and do not need to. To develop trust, it may be important for citizens to believe that the IC’s goals and interests are in line with their own.

Participants discussed public perceptions of and concerns about the government’s ability to access private-sector data. One participant suggested that the firewall between the data held by commercial entities and that held by the government has disintegrated and wondered what effect this had on public trust. Turow noted that his surveys had not considered this topic, but said that, in addition to the government having legal mechanisms for directly accessing company data, there is potential concern about the ability of the government to legally purchase information from an independent data broker that they would not otherwise have been authorized to access.

Another participant suggested that the challenge might actually be a lack of transparency about the meaning and interpretation of the rules and laws that govern this access. The participant noted that, while some details must remain secret, the ambiguity of oversight and governance of activities funded by taxpayers is a huge problem. One example raised was the fact that many terms, such as “targeted,” used in laws such as FISA have no statutory definition. The participant found it unclear how explaining such terms could legitimately compromise the IC’s mission.

Strandburg suggested that one way to understand the trust dynamic is to question whether citizens would approve of secret practices if they knew about them. One might, for instance, think about this by considering whether such practices would cause embarrassment if published in the New York Times. She also suggested that disclosing general ideas about what kind of data are being collected and how they are being used would help demonstrate efficacy and enhance transparency, and that there are probably things that could be disclosed without negatively impacting the efficacy or usefulness of tools.

Joel responded to several of these points to provide some perspective from the IC. First, he acknowledged the general surprise at the disclosures of how laws were being interpreted and applied, noting that the IC aims to move forward with more transparency. He also noted that the internal process for accessing privately held data was not seen within the IC as easy; it was an elaborate process involving the Foreign Intelligence Surveillance Act (FISA) court and congressional oversight, as well as many people overseeing the process within the IC. He pointed out that there were also substantial restrictions on how any data obtained could be queried, used, and shared. Recent statistics have shown a very narrow set of uses, though an admittedly enormous collection of data. The IC supported the USA FREEDOM Act, which has put in place a new model with statutory transparency requirements that the community is working to determine how to implement.

__________________

6 Committee on National Security Systems, “Security Categorization and Control Selection for National Security Systems,” CNSSI No. 1253, release date March 27, 2014, https://www.cnss.gov/CNSS/issuances/Instructions.cfm.


A participant suggested that the question of whether secrecy undermines trust could be a good research question. Another participant noted the difference between an action being legally permitted and being acceptable to the public, and asked whether the fact that something is perceived to have value (either to the nation, to an organization, or to the public) makes it more acceptable, even if it is outside the normal boundaries of acceptability. Turow said that, in the marketing context, the public has not seen value as an acceptable justification for unwanted data practices. He had seen some evidence in the past that people would be more forgiving if the value accrued to the nation rather than to individual marketers, but was unsure of how people feel about that now. He suggested that understanding the difference between attitudes toward the government and attitudes toward corporations with respect to value and fairness is quite important, and noted that the nation might have a greater capacity for disappointing the public than a corporation might.

Adjerid described some of his work on organizations. Early findings from qualitative discussions and interviews suggest dissonance between the perspectives of higher-level and lower-level employees: higher-level employees tended to offer the “party line” that the organization does not engage in data practices that are potentially invasive or discriminatory, while lower-level employees, such as those actually doing the data analysis, tended to find creative ways of collecting and using information that might push boundaries but could be lucrative to the organization.

Equality, Discrimination, and Consumer Profiling

A participant observed that people with different backgrounds can have very different perspectives on data collection. Strandburg reiterated that underrepresented groups can be affected disproportionately, and suggested that the government has a responsibility to those groups in a way that a company does not. She also highlighted the idea that lower-income people could be the canary in the coal mine when it comes to privacy; what gets done to those who are not very powerful could eventually become normal, and happen to us all. Turow noted that very little research has been done on the privacy preferences of certain demographic groups, such as lower-income populations and minority groups, and that this is very important to understand.

The group then discussed issues related to consumer pricing strategies. For example, loyalty programs often offer lower prices to members; this generally reflects the fact that companies gain value from tracking their members’ practices. It is not clear to consumers how much companies benefit from data collection, even if the consumers see the discount. Strandburg reiterated that costs and benefits are somewhat unknowable, as they depend upon what is already known about a consumer, which will greatly influence the value of newly collected data. It was suggested that perceived injury on the part of a customer as a result of data collection is accounted for in the formula for the pricing differential.
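
One way to read that suggestion is as a simple expected-value calculation. The sketch below is purely illustrative: the workshop did not spell out such a formula, and every number is invented.

```python
# Toy expected-value model of a loyalty-discount pricing differential.
value_of_data_per_year = 40.00   # hypothetical gain to the firm from tracking
p_perceived_injury = 0.05        # hypothetical chance a member feels harmed
cost_if_injured = 200.00         # hypothetical churn/goodwill cost per incident

# The discount a firm can sustainably offer is bounded by the data's value
# net of the expected cost of perceived injury.
max_discount = value_of_data_per_year - p_perceived_injury * cost_if_injured
print(f"largest sustainable discount: ${max_discount:.2f}/year")   # $30.00
```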

Turow noted that loyalty programs have been increasingly used as a lure to get consumers to share data; such programs can be used in many ways. He described the current debate within the retail community about more elaborate price-tailoring practices, where prices vary from individual to individual based on the profiles the company has built about each, a strategy enabled by the Internet and mobile shopping. He provided several examples to illustrate the complexity of the individual-profiling practices we are entering into. At least one retailer actually gives higher prices to more loyal customers, because it is less concerned about losing their business. Some brick-and-mortar retailers have started using electronic ink, enabling them to change prices by time of day, or even tailor prices to individual shoppers. He suggested that current practices are teaching people that giving up their data is just a part of 21st-century life. In 20 years, people may think that it has always been this way—indeed, no one today remembers what retailing was like in 1895. He again suggested a need to think about the consequences of these shifts for the larger society.

The group then discussed the role of experience in shaping user attitudes, motives, and incentives. According to Turow’s data, fewer than 5 percent of Americans report having been directly harmed as a result of their data having been used. Turow pointed out that we are still at the beginning of the digital age. He recalled testifying at a 1985 hearing on program-length commercials, in which companies created shows aimed at marketing toys to children, and warning that action needed to be taken at the time, or else people would come to take such practices for granted. He pointed out that today there is a whole channel run by Hasbro.

Turow suggested that data collectors, especially the IC, are stewards of much of American society. He posed a question: “How do we want our grandchildren to think about the way the world is?” He suggested that we are really only at the beginning—that the pace of data collection and use is accelerating, and things could change rapidly.

Privacy Research

The group also discussed challenges to conducting innovative privacy research.

Staddon pointed out that the heavy public scrutiny of Internet companies creates a disincentive for companies to carry out research or make data available to researchers. In particular, if results can be interpreted as showing that a company’s priorities are at odds with consumer privacy, things will become more difficult for the company. Another participant pointed out that this is a big loss, because Internet companies often have access to massive data sets that could help advance our understanding, whereas academics tend to work with small data sets.

Another participant pointed to difficulties around the study of behavior and noted that it is easy, even for social scientists, to come to the wrong conclusions when interpreting observed behavior. For example, a lack of action on the part of an individual could indicate resignation rather than acceptance. Similarly, survey results can be misleading if questions are not asked in the right way. Finally, people do not necessarily behave logically, which might reflect self-contradiction or some other factor such as resignation. The participant asked whether we are getting better at using this type of research to understand what people really care about.

Staddon noted that there are many survey practices that can provide confidence in results, such as phrasing survey questions in ways that will help to avoid bias. She noted that the community as a whole is doing more work on behavioral data. She referred to a recent study7, 8 that examined what information users were willing to share with an anonymizing filter and without one; this approach provides a better understanding of the context in which users are comfortable disclosing information. Nonetheless, we still do not know how to identify privacy-concerned users simply based upon their online behavior.

A participant asked if we really know whether individuals are being honest in their responses to surveys about privacy preferences. Turow described some of the steps his team took to address this issue, including hiring experienced public polling firms to conduct the surveys and asking some of the same questions longitudinally. He noted that responses have been consistent and stable over time. Other interpretations of such consistency are possible—including the possibility that individuals provide the answers they think the questioners hope to hear—but he finds this unlikely. He has compared past results to related questions in other surveys, including some conducted by Pew,9 and to anecdotes. For example, the findings about resignation are consistent with what is often heard anecdotally.

__________________

7 S.T. Peddinti et al. 2015, Understanding sensitivity by analyzing anonymity [guest editor’s introduction], 36th Symposium on Security and Privacy 13(2):14-21.

8 S.T. Peddinti, A. Korolova, E. Bursztein, and G. Sampemane, 2014, Cloak and swagger: Understanding data sensitivity through the lens of user anonymity, Proceedings of the IEEE Symposium on Security and Privacy, pp. 493-508, doi:10.1109/SP.2014.38.


Another participant pointed out that different approaches can be illustrated using the example of attitudes and behavior around nutrition. Consider an individual who is aware of having bad eating habits. One could ask about the individual’s habits, what the individual would like those habits to be, or whether the individual finds another person’s habits admirable. In general, people have principles but do not always follow them, owing to factors such as weakness or time constraints. Learning what people think is the right thing could be useful, though these ideals may be distinct from what they personally want or enjoy.

Adjerid agreed that there are ways to experimentally tease out such details. One strategy is to ask participants to anticipate how they would behave in a hypothetical scenario, and then actually implement the scenario. He has found that people do not act as they had predicted they would. Research suggests that individuals overestimate their ability to behave rationally in the future while underestimating their susceptibility to influence by non-rational factors. However, he suggested that the mechanism behind such dissonance has not yet been adequately elucidated.
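
Predicted-versus-actual designs of this kind yield paired binary data for each participant, and a standard way to test for a systematic gap is McNemar’s test on the discordant pairs. A minimal sketch with invented counts (not Adjerid’s data):

```python
# Paired comparison of predicted vs. actual disclosure for the same
# participants. Counts are invented for illustration.
from statsmodels.stats.contingency_tables import mcnemar

#                     actually disclosed   actually withheld
# predicted disclose         30                    5
# predicted withhold         25                   40
table = [[30, 5],
         [25, 40]]

# McNemar's test asks whether the discordant cells (5 vs. 25) are balanced;
# an imbalance means predictions systematically miss behavior.
result = mcnemar(table, exact=True)
print(result.pvalue)   # small p-value: more disclosure than participants predicted
```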

Adjerid identified the tendency of individuals who care about privacy to act against these interests, which he termed the “privacy paradox.” He also suggested that consumers seem to have privacy fatigue—a huge breach of a company’s data may not affect its sales or stock value. He wondered whether results from social science research, which might provide evidence of this fatigue, could paradoxically lead organizations to avoid taking action to enhance privacy. On the other hand, he also asked whether inaction on the part of companies could lead to a tipping point such that people stop disclosing information to retail, Internet, or telecommunications companies, which would presumably also impact the IC’s ability to do its job.

He recalled the contextual nature of privacy and suggested that it could be helpful to learn more about the contextual nature of individual rationality in making privacy decisions. This could serve as a basis for tailoring privacy policy in ways that align individuals’ behavior with their best interests and for providing more predictability, thus making it easier for organizations to plan policies and react to concerns.

__________________

9 Note a recent Pew study addressing Americans’ attitudes about privacy in the context of government: see Mary Madden and Lee Rainie, 2015, “Americans’ Attitudes About Privacy, Security and Surveillance,” May 20, http://www.pewinternet.org/2015/05/20/americans-attitudes-about-privacy-security-and-surveillance/.
