Social Media and Adolescent Health (2024)

5

Design Features

As Chapter 2 explained, social media algorithms influence a user’s experience of social media in ways that are both complicated and highly variable. Young people’s experiences with social media can also be influenced by exposure to violent or otherwise toxic content, by harassment, or through introduction to bad actors, such as adults seeking to groom them for sexual exploitation or to incite political radicalization. The committee recognizes that perfect control over what users see is not a realistic or necessarily desirable expectation for social media companies. But there are provisions that can be incorporated into the design of apps, games, and websites that limit the personal information companies collect, the types of content available, and the prompts to extend time on a platform.

There will always be a place for a knowledgeable consumer to make informed decisions about risks faced online. In the same way health literacy can allow patients to have more knowledgeable, better prepared conversations with their health providers, media literacy, a topic discussed in detail in the next chapter, can allow for a more informed understanding of the decisions young people make online. At the same time, the complexity and pace of the online environment far exceed what adolescents, or any layperson, could reasonably be expected to understand. This chapter recommends steps at the level of platform design that would help tip the balance of transparency toward the users who support the platforms and the government agencies that monitor the fairness of their operations.


AGE-APPROPRIATE DESIGN CODE

The central mission of age-appropriate design is to make technology safer for young people through “a set of processes for digital services when end users are children [that] aids in the tailoring of the services that are provided” (IEEE, 2021, p. 11). Age-appropriate design can extend to the way vendors collect and use information about minors and how schools promote educational technology to students. It includes enhanced privacy protections, whether in products specifically designed for minors (e.g., children’s programming on YouTube) or products they are likely to access (e.g., search engines). The rights and developmental needs of children are central to the determination of the age appropriateness of a product or platform.

Creating or modifying a product to meet child-friendly design standards starts with a full review of how the product’s features may affect children, along with a plan to mitigate those risks and test new changes. The next steps, undertaken simultaneously, involve auditing the features, identifying risks, and mitigating them. Steps to minimize targeted advertising to children, for example, begin with an overview of corporate policies on data privacy and the shareholders’ views on the matter (IEEE, 2021).

A growing interest in designing digital technology for children led the Institute of Electrical and Electronics Engineers (IEEE), an international association for electronic and electrical engineering, and the 5Rights Foundation to release a standard for age-appropriate design in 2021 (IEEE, 2021). The standard puts the burden of establishing users’ age on the producer of the technology. Knowledge of users’ age in turn allows companies to present terms of use that reflect adolescents’ progressively growing capacity for understanding and independent decision making (IEEE, 2021).

Age-appropriate design emphasizes the protection of young people’s online privacy. The code requires platforms to collect only the necessary information and to use it in the way that the child (or, in some cases, the responsible adult) has agreed to, and not for commercial purposes. It also discourages persuasive design features (i.e., features intended to extend the time spent on a platform, such as push notifications and alert tones when new content is posted), especially at night, and promotes a high standard for content moderation. The standard also stipulates that it is the technology developer’s responsibility to reduce the automated recommendation of violent, offensive, or harmful content and misinformation (IEEE, 2021).

For product developers and vendors, age-appropriate design may seem to impose burdensome restrictions, especially if they work in different jurisdictions with varying levels of relevant legal or regulatory controls. In such a situation, the IEEE guidance encourages a full review of all applicable laws and regulations and, when in doubt as to the standard required, proceeding with the service that more conservatively reflects the best interests of the child (IEEE, 2021). For example, if a user declines to enter a birthdate or if age cannot be verified, the technology developer should not offer nudges to stay longer on the platform or push notifications at night.
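
A minimal sketch of what this conservative default might look like in code; the feature names, the 18-year threshold, and the quiet-hours window are illustrative assumptions, not requirements of the IEEE standard.

from dataclasses import dataclass
from datetime import time
from typing import Optional, Tuple

@dataclass
class EngagementFeatures:
    push_notifications: bool
    quiet_hours: Optional[Tuple[time, time]]  # window when prompts are suppressed
    stay_longer_nudges: bool

def features_for(age_verified: bool, age: Optional[int]) -> EngagementFeatures:
    """Default to the most protective settings whenever age is unknown,
    following the principle of resolving doubt in the child's best interest."""
    if not age_verified or age is None or age < 18:
        return EngagementFeatures(
            push_notifications=False,
            quiet_hours=(time(21, 0), time(7, 0)),  # no prompts overnight
            stay_longer_nudges=False,
        )
    # Verified adults may opt in to re-engagement features.
    return EngagementFeatures(
        push_notifications=True,
        quiet_hours=None,
        stay_longer_nudges=True,
    )

# An unverified user gets the child defaults.
print(features_for(age_verified=False, age=None))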

The California Age-Appropriate Design Code Act,1 like similar legislation in the UK, is primarily concerned with young people’s data privacy (Information Commissioner’s Office, 2023). The IEEE standard goes slightly further, adding guidance on limiting features that encourage extended use of platforms. But even the IEEE standard offers few specifics on content moderation: it encourages companies to invest in moderation in proportion to risk, to make the terms of content moderation clear, and to give parents and children a means of redress (IEEE, 2021).

Age-appropriate design guidance can be technically vague and hard to enforce, and assessment of compliance has been described as subjective (Farthing et al., 2021; Franqueira et al., 2022; Goldman, 2022). Whether other states follow the age-appropriate design code lead set by California remains to be seen. Even if they do, critics have seen age-appropriate design as infantilizing children, especially older adolescents, because the emphasis on acting in the best interest of the child presupposes that the child is incapable of discerning what those interests may be (Collinson and Persson, 2022). What is more, some of the most serious risks to the mental and physical health of young people come from overuse and from algorithms that present unhealthy content, problems that age-appropriate design codes do not necessarily aim to solve.

The age-appropriate design movement put concrete parameters on what had been an abstract discussion about children’s privacy. Its emphasis on both the inputs to and outputs of a functional privacy system gives researchers and companies a guideline against which to measure the data collection risks that children encounter online. Yet threats to the mental and physical health of young people are often traced to failures of content moderation, algorithms that promote toxic content, and overuse. Social media platforms would benefit from a similar standard to guide assessment of how their algorithms influence well-being.

___________________

1 California Civil Code §§ 1798.99.28–1798.99.40.


GREATER TRANSPARENCY AND ACCOUNTABILITY

Social media is an important source of entertainment and connection for many people, especially adolescents. Given the importance of these platforms in people’s lives, there is growing momentum for more openness and oversight of their operation. Much of the public outrage elicited by Frances Haugen’s revelations stemmed from the perception of secrecy, the idea that harms known to executives inside the company were kept from the public (Allyn, 2021). Allowing researchers and civil society watchdogs access to social media data and review of their algorithms would enable a better understanding of how social media platforms influence young people, for better or worse.

It is difficult to determine what effect social media has on well-being, or the extent to which companies are doing due diligence to protect young people from the more habit-forming affordances of their platforms, because companies retain extremely tight control over their data and algorithms (Krass, 2022). Publicly available data can support some research. The University of Michigan’s Iffy Quotient, for example, aims to monitor the extent to which Facebook and Twitter amplify disinformation (Center for Social Media Responsibility, 2022). But even this is vulnerable. In 2021 Facebook sued researchers who were studying political advertising using publicly available information, because the data scraping tools they used violated the platform’s terms of service, a topic discussed further in Chapter 8 (Knight First Amendment Institute, 2021; Panditharatne, 2022). The tools Facebook authorizes for researchers, including a searchable advertisement library called Ad Library API and a network data analysis tool called CrowdTangle, provide “tightly circumscribed and spotty information” (Panditharatne, 2022). Civil society groups requesting access to social media data have reported an arbitrary, lottery-like process, highly dependent on personal relationships and subject to favoritism (Bradshaw and Barrett, 2022).

In the same way, there can be a seemingly arbitrary approach to the enforcement of content moderation guidelines. Participation in social media has become an important part of modern life. When a platform’s decisions seem unfair, aggrieved users may take the position that they are victims of corporate overreach, denied access to a public venue in a manner similar to being turned away from a movie theater or store (MacCarthy, 2021). While such concerns are reasonable, there is also deep public ambivalence regarding outside interference, especially from the government, in determining which public statements should be amplified and which ones should be silenced. In the balancing of trade-offs, a system for content moderation has emerged that relies on oversight boards, groups of experts that are neither fully independent of the platform nor fully open about their process (Douek, 2019).

A general lack of transparency regarding social media operations has bred public distrust of the platforms and the companies that run them. Figure 5-1 shows results of a 2021 survey conducted by researchers at George Mason University and The Washington Post indicating widespread mistrust of the platforms. Transparency is a remedy for distrust, as it provides some assurance that the platform is conforming to public values and expectations (MacCarthy, 2021).

Some of the companies’ reluctance to share information is well founded. Platform algorithms are proprietary, which can make obliging companies to share them seem unfair and anticompetitive. Social media platforms also hold a great deal of information about ordinary people that could, in the wrong hands, be used for surveillance or blackmail (Bradshaw and Barrett, 2022). Therefore, questions of data access and sharing can be especially fraught. Not all researchers can trust that their work will be free from government interference, nor can civil society organizations always assume that their independence will be respected.

FIGURE 5-1 Response to the question, “How much do you trust each of the following companies or services to responsibly handle your personal information and data on your internet activity?”
SOURCE: Kelly and Guskin, 2021.

The need for more accountability and openness in social media has attracted the attention of Congress. Dozens of pieces of legislation in the last two sessions alone have taken aim at advertising transparency, data privacy, protections for minors, oversight of mobile apps, targeted marketing, and other aspects of platform operations.2 There is clearly momentum and political will for a better system of oversight. In response to this momentum, the social media companies, including Meta and Alphabet, have come to accept data privacy legislation (Kang, 2018). A prompt, coordinated effort to develop technical standards for platform operations, transparency, and data use would be a meaningful step toward a better, global system of platform accountability.

___________________

2 A Congress.gov search of the 117th and 118th Congresses for legislation about online advertising, social networks, social media, online privacy, or online data returned more than 40 pieces of immediately relevant legislation, with many hundreds more tangentially relevant.

Recommendation 5-1: The International Organization for Standardization should convene an ongoing technical working group including industry representatives, civil society, and academic stakeholders to develop standards for social media platform design, transparency, and data use.

Social media encompasses a widely diverse set of tools, used differently by different people. The extent to which particular platforms are committed to maximizing the beneficial uses and curtailing the harmful ones is not clear to anyone. The development of uniform standards is an essential precursor to any transparent reporting, benchmarking of progress, or regulation. Without such standards, outside auditors cannot judge the effectiveness of content moderation or the role of a platform’s advertising or recommendation algorithms in promoting harmful content. Harmonized standards are also the basis of comprehensible public disclosures, such as those governing terms and conditions of using an online service, or measures taken to counter harassment or hate speech. A standard format for data at the application programming interface (API) also greatly eases the study of confidential algorithms (MacCarthy, 2022).
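
As a rough illustration of what a standard, machine-readable disclosure format could look like, the sketch below defines one hypothetical record type; the field names and figures are invented for the example, not drawn from any existing ISO schema.

from dataclasses import dataclass, asdict
import json

@dataclass
class ModerationDisclosure:
    # Illustrative fields only; an actual ISO standard would define the schema.
    platform: str
    reporting_period: str        # e.g., "2024-Q1"
    items_removed: int           # content taken down under the moderation policy
    removals_appealed: int       # how many takedowns users contested
    appeals_overturned: int      # how many appeals succeeded
    median_review_hours: float   # time from report to moderation decision

record = ModerationDisclosure(
    platform="example-platform",
    reporting_period="2024-Q1",
    items_removed=120_000,
    removals_appealed=8_400,
    appeals_overturned=1_100,
    median_review_hours=6.5,
)

# A shared, machine-readable format lets auditors compare platforms directly.
print(json.dumps(asdict(record), indent=2))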

The International Organization for Standardization (known as ISO) is an international, nongovernmental organization with a long history of successfully setting and maintaining international standards. Its members are the national standards bodies of 168 countries, and it uses a process described as “voluntary, consensus-based and market relevant” (ISO, n.d.). Given the worldwide reach of social media platforms and the companies’ need to operate across borders and cultures, such international support and buy-in are crucial. ISO also has long experience and well-defined processes for updating standards to ensure their continued relevance, something that will be necessary given the pace of change in this field (ISO, 2021). It has tackled similarly thorny and technical topics before. The ISO/IEC 27000 family of standards, for example, provides a model for information security management and data protection (ISO, 2022). The committee envisions a similarly inclusive process guiding the development of platform standards for social media.

The recommended standard-setting process would be iterative and dynamic, given the rapid pace of change in social media technology and in society’s perception of threats. The ISO process is also designed to include the full range of stakeholders needed to comment on the management of technical processes, including “people such as manufacturers, sellers, buyers, customers, trade associations, users or regulators” (ISO, 2023).

The Types of Standards Needed

The standards for social media operations and platform design would, like the IEEE standard for age-appropriate design, articulate both the inputs and the outputs of a functional system. Inputs refer to actions taken by the platform, while outputs are partially driven by the platform but also shaped by the behavior of users. Inputs can include processes for content moderation or data use, the content of privacy agreements, and mandatory disclosures to users, all reflecting decisions largely within the platform’s control. Outputs could include platform health measures, such as the amount of toxicity on a platform. A platform’s content moderation and take-down policies will influence measures of toxicity, but the platform cannot fully control something driven by the behavior of its users. The distinction is key: platforms can be expected to adhere to input standards more or less immediately, while output measures allow some margin for reaction time.

At the platform level, measures should move beyond simple aggregates to informative percentile summaries. The reported percentiles would aim to capture the harm experienced by the most vulnerable, such as the amount of cyberbullying experienced by the most bullied decile of adolescents. Since the association between social media use and health outcomes varies across groups, standards should also allow quantification at the group or community level. Finally, in line with algorithmic transparency standards, platforms should provide summaries at the user level on request.
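
A minimal sketch of how such tail-focused summaries might be computed; the incident counts and percentile cutoffs are illustrative assumptions, not a reporting format any platform actually uses.

import statistics

def percentile(values, p):
    """Nearest-rank percentile: the value below which p percent of users fall."""
    ordered = sorted(values)
    k = max(0, min(len(ordered) - 1, round(p / 100 * len(ordered)) - 1))
    return ordered[k]

def harm_summary(bullying_counts):
    """Summarize exposure among the most affected users, not just the mean.

    bullying_counts: per-user counts of cyberbullying incidents in a period.
    Returns exposure at the 75th, 90th, 95th, and 99th percentiles, i.e., the
    25%, 10%, 5%, and 1% most bullied users in the report's terms.
    """
    return {
        "mean": statistics.mean(bullying_counts),
        "p75": percentile(bullying_counts, 75),
        "p90": percentile(bullying_counts, 90),
        "p95": percentile(bullying_counts, 95),
        "p99": percentile(bullying_counts, 99),
    }

# Hypothetical per-user incident counts: most users see nothing,
# while a small minority absorbs most of the harm, which the mean hides.
counts = [0] * 90 + [1, 1, 2, 2, 3, 4, 6, 9, 15, 40]
print(harm_summary(counts))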

To better illustrate how this recommendation would work, Table 5-1 gives examples of key measures for which the ISO working group would develop standards, together with examples of input and output measures that could be tracked. As discussed in Chapter 2, platform algorithms cover ranking, ad targeting, and content moderation, and Table 5-1 gives examples of standards for each algorithmic type. As noted earlier, standards should be measured comprehensively, so Table 5-1 includes examples of concrete platform-, community-, and user-level outcomes. When applicable, past work with significant overlap is also presented, although the many gaps in this category illustrate the importance of building these standards and encouraging their adoption.

One important measure that companies should report is their efforts to remediate youth mental health problems. This information, like certain audit and systemic risk reports, should be available on request to the Federal Trade Commission (FTC). Better definition and tracking of the standardized indicators would eventually allow comparisons across platforms and over time, giving both the public and the FTC a clearer picture of the risks these platforms pose.

It is important to note that the examples provided in Table 5-1 are offered for illustration, not as a definitive list of needs. Part of the value of a recurring convening via ISO is that the standards could develop in line with a growing body of scientific consensus on the ways social media influences adolescents. Consider, for example, the influence of image-sharing platforms on body image, discussed in Chapter 4. Given the strength of this association, the recommended standards might do well to include measures of the amount of such content seen on a given platform.

TABLE 5-1 Operationalizing Standards for Social Media Operations, Transparency, and Data Use

Content-based health measures. Aim: to analyze the nature of the content with implications for users’ health and well-being.
Input example: the amount and type of resources dedicated to ensuring harmful content is identified and demoted. Output examples: reports on the amount of cyberbullying experienced by the 25%, 10%, 5%, and 1% most bullied users, tracked over time (platform-level measure); reports on the amount of cyberbullying found in a specific subcommunity, e.g., a Facebook group (community-level measure); reports on the number of cyberbullying attacks experienced by a user, reported to that user on request (user-level measure).
Input example: the public content moderation policy and the number of content moderators. Output example: reports on material taken down and the proportion of moderation decisions appealed.
Input example: the amount of resources dedicated to ensuring advertising algorithms do not expose adolescents to harmful content. Output example: reports on the amount of harmful content served to adolescents through ads.

Network-based health measures. Aim: to track the extent to which social connection on the platform is positive and the extent to which it is negative.
Input example: the amount and type of resources dedicated to discerning network quality. Output example: reports on the fraction of user connections that promote social connectedness.

Privacy and security. Aim: to better align privacy and security settings with user preferences.
Input example: privacy setting portability, allowing users to state privacy preferences once and deploy them across apps and platforms. Output example: reports on privacy and security that measure how users’ understanding of the privacy and security policy evolves over time. Transferable work: the UK open banking initiative, wherein nine major banks developed an industry standard for customers to transfer their financial data (Brown, 2022).
Input example: privacy policies written in a standard machine-readable format that a web browser can read automatically, reducing the burden on the user. Output example: reports on the number of users that benefit from machine-readable privacy policies (users that port their privacy settings for use elsewhere). Transferable work: the Platform for Privacy Preferences (P3P) tool, intended to enable users to limit their exposure to websites with privacy policies that do not match their preferences (Cranor, 2003).

Data use. Aim: to clarify what types of data algorithms can use.
Input example: predictive models to identify young people in mental health crisis. Output example: proxy indicators, such as the proportion of young people in suspected mental health crisis seeing ads about support services.
Input example: a public database of advertising targeting criteria. Output example: advertising algorithms audited and audit reports shared with the FTC.

Operational transparency. Aim: to improve understanding of how the platforms work.
Input example: reports on the actions taken to make the platform’s operations more transparent. Output example: recommendation algorithms audited and audit reports shared with the FTC.
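
To make the machine-readable privacy policy row of Table 5-1 concrete, the toy sketch below shows a user agent checking a site’s declared practices against a user’s stated preferences, in the spirit of P3P; the vocabulary is invented for illustration and is not P3P’s actual syntax.

# A site declares its practices in a structured form, and the user agent
# flags any conflicts with the user's stated privacy preferences.
SITE_POLICY = {
    "collects": {"email", "browsing_history", "location"},
    "shares_with_third_parties": True,
    "retention_days": 365,
}

USER_PREFERENCES = {
    "never_share": {"location"},
    "allow_third_party_sharing": False,
    "max_retention_days": 90,
}

def policy_conflicts(policy, prefs):
    """Return human-readable reasons the policy violates the preferences."""
    conflicts = []
    blocked = policy["collects"] & prefs["never_share"]
    if blocked:
        conflicts.append(f"collects disallowed data: {sorted(blocked)}")
    if policy["shares_with_third_parties"] and not prefs["allow_third_party_sharing"]:
        conflicts.append("shares data with third parties")
    if policy["retention_days"] > prefs["max_retention_days"]:
        conflicts.append(f"retains data {policy['retention_days']} days "
                         f"(limit {prefs['max_retention_days']})")
    return conflicts

for reason in policy_conflicts(SITE_POLICY, USER_PREFERENCES):
    print("warning:", reason)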

ADOPTING THE STANDARDS

Critics of the previous recommendation may maintain that such steps are unnecessary because the social media industry already has relevant rules in place. Self-regulation has long been relied on in media: television, movies, video games, and music all make use of industry standards for content rating (Napoli, 2018). Recent years have seen greater effort at industry self-regulation and third-party regulation of social media, exemplified by Facebook’s Oversight Board (Klonick, 2020). Such an oversight board could help protect users from unfair treatment (Maroni, 2019). At the same time, there will always be a suspicion that the real goal of such a board, or of any effort at self-regulation, is to bolster the platform’s market position or authority (Maroni, 2019). Social media platforms’ success depends on engaging as many users as possible, something controversy and emotion can do (Brown, 2021). Asking companies to moderate the more sensational voices on their platforms could be asking them to act against their business interests (Brown, 2021).

Skepticism of self-regulation aside, enacting a regulatory framework across jurisdictions on global companies is not always a legally or logistically viable option (Henderson et al., 2016). An acknowledgment that industry stakeholders are often in the best position to set out operational policies underlies the prior recommendation’s specification that industry should be part of the ISO technical working group. There is also reason to believe that companies will have an interest in monitoring one another against the standards the ISO group develops. For this reason, the social media companies should formally adopt these standards and reference them in their public documents.

The companies would do well to adopt such standards to forestall more sweeping regulatory action (Cusumano et al., 2021). The UK’s proposed Online Safety Bill, for example, places significant demands on platforms, even specifying the type of content moderation technology they must use (Church and Pehlivan, 2023). Such restrictions can be impractical and can divert time and resources that platforms could otherwise devote to product improvement or to developing better tools for content moderation.

Recommendation 5-2: Social media providers should adopt the standards referenced in the previous recommendation as a matter of policy and as specific provisions in their terms of service.

A public statement that a platform will comply with all the measures included in the standard, together with a commitment to the standard in its terms of service, would be a meaningful step toward an enforceable legal structure for social media. Section 5 of the Federal Trade Commission Act gives the FTC the authority to penalize firms that engage in unfair or deceptive business practices, although an exception enacted in 1980 prohibits the FTC from using its unfairness authority to promulgate rules governing children’s advertising.3 Using this authority, the agency has brought enforcement actions against companies that failed to honor commitments made in their privacy policies and other similar documents (FTC, 2023).

Failure to honor basic cybersecurity standards may also represent an unfair business practice (FTC, 2021). Unlike deception, which is judged against a firm’s affirmative statement, unfairness can be seen as a more general failure to meet society’s expectations, including standards of industry practice (Pertschuk et al., 1980). Though applied more sparingly, unfairness can be the basis for enforcement actions even against egregious conduct by companies that have not actively incorporated those standards into their terms of use (FTC, 2003).

The FTC’s ability to characterize business practices as unfair depends on the agency giving firms sufficient notice of what is necessary to meet their legal obligations.4 The agency’s proposed new rule on commercial surveillance and data security has identified the “extent [to which] commercial surveillance practices or lax data security measures harm children, including teenagers” as an area of particular concern.5 An industry standard on data security and advertising could facilitate the agency’s oversight of these practices.

___________________

3 15 U.S. Code § 57a.

4 Federal Trade Commission vs. Wyndham Worldwide Corp., 799 F.3d 236 (3d Cir. 2015).

5 Federal Trade Commission, “16 CFR Chapter I: Trade Regulation Rule on Commercial Surveillance and Data Security,” Federal Register 87, No. 202 (October 20, 2022) 63738.

The creation of a standard would also support the FTC’s use of consent decrees as a regulatory tool. The agency will negotiate consent decrees with companies that fail to meet expected standards, as it has done for data protection (Daily Business Review, 2015). Once a company agrees to a consent decree, the terms of the decree determine its obligations to remediate, regardless of whether those terms are strictly within the FTC’s authority (Rosch, 2011).

The creation of industry standards for social media would inform the FTC’s governance by consent decree, even for social media providers that do not explicitly adopt the standards into their terms of service. Nevertheless, it is the committee’s hope that the standards development process described in Recommendation 5-1 would trigger a virtuous cycle of compliance. International standards can be a marker of good business practice and even a badge of pride, a dynamic analogous to companies seeking green building certification in the absence of any legal obligation to do so. The normative pressure of industry standards could serve as a signal to the public of a company’s sincere and meaningful steps to mitigate the harms associated with its product.

USING THE STANDARDS

A similar process is underway in artificial intelligence (AI) and machine learning. Ethical AI tool kits are designed to enable more open communication among technology developers, researchers, policy makers, and civil society (Wong et al., 2023). Tools such as Model Cards, which provide short explanations of how a machine learning model is meant to be used and against which data it was benchmarked, are a step toward transparency in AI (Mitchell et al., 2019). Similarly, public documentation of the provenance of the datasets used to calibrate machine learning models is gaining traction as a way to mitigate the harms a biased model can cause (Gebru et al., 2021).
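
A condensed sketch of what a model card for a platform’s ranking algorithm could contain, loosely following the section headings proposed by Mitchell and colleagues (2019); every entry here is hypothetical rather than a disclosure from any real platform.

# Hypothetical model card for an imagined feed-ranking model.
model_card = {
    "model_details": "Gradient-boosted ranker scoring candidate feed items",
    "intended_use": "Ordering posts from accounts a user already follows",
    "out_of_scope": "Recommending content to users under the platform's age minimum",
    "evaluation_data": "Held-out engagement logs, balanced across age bands",
    "metrics": {
        "offline_ndcg": 0.71,
        "harmful_content_rate_per_1k_impressions": 0.4,
    },
    "ethical_considerations": "Engagement-optimized ranking can amplify "
                              "sensational content; monitored via output audits",
}

# Publishing cards in a shared format would let auditors compare models.
for section, body in model_card.items():
    print(f"{section}: {body}")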

As part of the ethical AI movement, IEEE has set out standards and guidelines to ensure that the AI systems prioritize human well-being in design (Shahriari and Shahriari, 2017). The standards developed from the

___________________

4 Federal Trade Commission vs. Wyndham Worldwide Corp., 799 F.3d 236 (3d Cir. 2015).

5 Federal Trade Commission, “16 CFR Chapter I: Trade Regulation Rule on Commercial Surveillance and Data Security,” Federal Register 87, No. 202 (October 20, 2022) 63738.

Suggested Citation:"5 Design Features." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×

implementation of Recommendation 5-1 could draw on these principles, evaluating the platform’s transparency about its policies and practices and its accountability for data breaches or violations of user privacy. The standards could evaluate whether the platform has age-verification processes, data encryption, and robust privacy policies in place, along with efforts to educate parents and other stakeholders on cyberbullying and reporting and blocking mechanisms. The standards could shine a light on the extent to which platforms are performing due diligence to enforce their age minimums. In 2021, Common Sense Media found that 38 percent of children between ages 8 and 12 have used social media, for example (Rideout et al., 2021). Standards could also clarify whether a social media platform’s content is suitable for children and teens based on age-appropriate criteria and whether the design of the platforms’ features and affordances for young people are developmentally informed or evidence based.

Practically speaking, such standards could form the basis of a rating system or a checklist assessment of items that enumerate responsible design. Such a checklist could be used to create a library of ranked social media platforms or apps, wherein included apps carry some level of endorsement for children or teens. The library could even provide clear-language information to parents and guardians about the specific purposes and affordances of each app, something particularly valuable given the dynamic and changing landscape of new social media platforms.
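
A toy scorer for the kind of checklist assessment described above; the items and weights are invented for illustration and would in practice come out of the ISO standards process.

# Weighted responsible-design checklist; True means the platform satisfies the item.
CHECKLIST = {
    "age_verification_in_place": 2,
    "default_privacy_settings_strict_for_minors": 2,
    "no_targeted_advertising_to_minors": 2,
    "nighttime_notifications_off_by_default": 1,
    "clear_reporting_and_blocking_tools": 1,
    "independent_audit_published": 2,
}

def rate_platform(answers):
    """answers: dict mapping checklist item -> bool. Returns (score, maximum)."""
    earned = sum(w for item, w in CHECKLIST.items() if answers.get(item, False))
    return earned, sum(CHECKLIST.values())

score, best = rate_platform({
    "age_verification_in_place": True,
    "clear_reporting_and_blocking_tools": True,
})
print(f"responsible-design score: {score}/{best}")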

The committee recognizes that greater transparency and accountability in the design of social media do not necessarily prevent young people from accessing inappropriate content or taking risks online. Many young people are tech-savvy and can find ways to bypass age restrictions or privacy settings. Nevertheless, an objective quality benchmark could be invaluable to parents trying to determine which platforms would provide the most positive experience for their children. Compliance with certain standards would be an important indicator to anyone in a position to authorize a platform or app for personal or institutional use, as in a school system. Some form of benchmarking could also help school districts better interpret the market for educational technology.

Social media operations are remarkably poorly understood, especially for products so influential and widely used. Accessible and comparable standards would aid consumers who want a valid indicator of various platforms’ commitment to data privacy, content moderation, and other important aspects of the user experience. This important first step toward product benchmarking could introduce greater transparency, and ultimately fairer competition, into an opaque market.


REFERENCES

Allyn, B. 2021. Here are 4 key points from the Facebook whistleblower’s testimony on Capitol Hill. https://www.npr.org/2021/10/05/1043377310/facebook-whistleblower-frances-haugen-congress (accessed September 18, 2023).

Bradshaw, S., and B. Barrett. 2022. Civil society organizations’ data, access, and tooling needs for social media research. https://informationenvironment.org/wp-content/uploads/2022/09/RP5-Civil-Society-Organizations-Data-Access-and-Tooling-Needs-for-Social-Media-Research.pdf (accessed September 18, 2023).

Brown, I. 2022. The UK’s Midata and Open Banking Programmes: A case study in data portability and interoperability requirements. Technology and Regulation 2022:113-123.

Brown, N. 2021. Regulatory Goldilocks: Finding the just and right fit for content moderation on social platforms. Texas A&M Law Review 8(3). https://scholarship.law.tamu.edu/cgi/viewcontent.cgi?article=1219&context=lawreview (accessed September 18, 2023).

Center for Social Media Responsibility. 2022. Iffy Quotient. University of Michigan, School of Information. https://csmr.umich.edu/projects/iffy-quotient (accessed February 23, 2024).

Church, P., and C. N. Pehlivan. 2023. The Digital Services Act (DSA): A new era for online harms and intermediary liability. Global Privacy Law Review 4(1):53-59.

Collinson, J., and J. Persson. 2022. What does the ‘best interests of the child’ mean for protecting children’s digital rights? A narrative literature review in the context of the ICO’s age appropriate design code. Communications Law 27(3):132-148.

Cranor, L. F. 2003. P3P: Making privacy policies more useful. IEEE Security & Privacy (Nov/Dec):50-55. https://users.ece.cmu.edu/~adrian/630-f05/readings/cranor-p2p.pdf (accessed September 18, 2023).

Cusumano, M. A., A. Gawer, and D. B. Yoffie. 2021. Social media companies should self-regulate. Now. Harvard Business Review. https://hbr.org/2021/01/social-media-companies-should-self-regulate-now (accessed December 31, 2023).

Daily Business Review. 2015. FTC consent decrees are best guide to cybersecurity policies. https://www.bsfllp.com/news-events/ftc-consent-decrees-are-best-guide-to-cybersecurity-policies.html (accessed September 18, 2023).

Douek, E. 2019. Verified accountability: Self-regulation of content moderation as an answer to the special problems of speech regulation. Hoover Working Group on National Security, Technology, and Law. https://www.hoover.org/sites/default/files/research/docs/douek_verified_accountability_aegisnstl1903_webreadypdf.pdf (accessed September 18, 2023).

Farthing, R., R. Abbas, K. Michael, and G. Smith-Nunes. 2021. Age appropriate digital services for young people: Major reforms. IEEE Consumer Electronics Magazine (July/August):40-48.

Franqueira, V., J. Annor, and O. Kafali. 2022. Age appropriate design: Assessment of TikTok, Twitch, and YouTube Kids. https://doi.org/10.48550/arXiv.2208.0263.

FTC (Federal Trade Commission). 2003. The FTC’s use of unfairness authority: Its rise, fall, and resurrection. https://www.ftc.gov/news-events/news/speeches/ftcs-use-unfairness-authority-its-rise-fall-resurrection (accessed September 18, 2023).

FTC. 2021. Federal Trade Commission 2020 privacy and data security update. https://www.ftc.gov/system/files/documents/reports/federal-trade-commission-2020-privacy-data-security-update/20210524_privacy_and_data_security_annual_update.pdf (accessed July 12, 2023).

FTC. 2023. Privacy and security enforcement. https://www.ftc.gov/news-events/topics/protecting-consumer-privacy-security/privacy-security-enforcement (accessed July 12, 2023).

Gebru, T., J. Morgenstern, B. Vecchione, J. W. Vaughan, H. Wallach, H. Daumé III, and K. Crawford. 2021. Datasheets for datasets. Communications of the ACM 64(12):86–92.

Suggested Citation:"5 Design Features." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×

Goldman, E. 2022. California legislators seek to burn down the internet—for the children. https://www.techdirt.com/2022/06/29/california-legislators-seek-to-burn-down-the-internet-for-the-children (accessed September 18, 2023).

Henderson, R., A. Migdal, and T. He. 2016. Note: Industry self-regulation sustaining the commons in the 21st century? Harvard Business School Background Note 315-074, March 2015. (Revised March 2016.)

IEEE (Institute of Electrical and Electronics Engineers). 2021. IEEE standard for an age appropriate digital services framework based on the 5Rights principles for children. https://doi.org/10.1109/IEEESTD.2021.9627644.

Information Commissioner’s Office. 2023. Introduction to the children’s code. https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/childrens-information/childrens-code-guidance-and-resources (accessed September 18, 2023).

ISO (International Organization for Standardization). 2021. ISO strategy 2030. Geneva: International Organization for Standardization.

ISO. 2022. ISO/IEC 27000 family: Information security management. https://www.iso.org/standard/iso-iec-27000-family (accessed June 1, 2023).

ISO. 2023. Standards. https://www.iso.org/standards.html (accessed October 17, 2023).

ISO. n.d. About us. https://www.iso.org/about-us.html (accessed July 12, 2023).

Kang, C. 2018. Tech industry pursues a federal privacy law, on its own terms. The New York Times. https://www.nytimes.com/2018/08/26/technology/tech-industry-federal-privacy-law.html (accessed September 18, 2023).

Kelly, H., and E. Guskin. 2021. Americans widely distrust Facebook, TikTok and Instagram with their data, poll finds. The Washington Post. https://www.washingtonpost.com/technology/2021/12/22/tech-trust-survey (accessed September 18, 2023).

Klonick, K. 2020. The Facebook Oversight Board: Creating an independent institution to adjudicate online free expression. Yale Law Journal 129:2418-2499.

Knight First Amendment Institute. 2021. Researchers, NYU, Knight Institute condemn Facebook’s effort to squelch independent research about misinformation. https://knightcolumbia.org/content/researchers-nyu-knight-institute-condemn-facebooks-effort-to-squelch-independent-research-about-misinformation (accessed September 18, 2023).

Krass, P. 2022. Transparency: The first step to fixing social media. https://ide.mit.edu/insights/transparency-the-first-step-to-fixing-social-media (accessed May 31, 2023).

MacCarthy, M. 2021. How online platform transparency can improve content moderation and algorithmic performance. The Brookings Institution. https://www.brookings.edu/blog/techtank/2021/02/17/how-online-platform-transparency-can-improve-content-moderation-and-algorithmic-performance (accessed September 18, 2023).

MacCarthy, M. 2022. Transparency recommendations for regulatory regimes of digital platforms. Centre for International Governance Innovation. https://www.cigionline.org/publications/transparency-recommendations-for-regulatory-regimes-of-digital-platforms (accessed September 18, 2023).

Maroni, M. 2019. Some reflections on the announced Facebook Oversight Board. https://cmpf.eui.eu/some-reflections-on-the-announced-facebook-oversight-board (accessed September 18, 2023).

Mitchell, M., S. Wu, A. Zaldivar, P. Barnes, L. Vasserman, B. Hutchinson, E. Spitzer, I. D. Raji, and T. Gebru. 2019. Model Cards for model reporting. Paper presented at the Conference on Fairness, Accountability, and Transparency, Atlanta, GA.

Napoli, P. M. 2018. What social media platforms can learn from audience measurement: Lessons in the self-regulation of “black boxes.” Paper presented at 2018 Telecommunications Policy Research Conference, Washington, DC.

Panditharatne, M. 2022. Law requiring social media transparency would break new ground. https://www.brennancenter.org/our-work/research-reports/law-requiring-social-media-transparency-would-break-new-ground (accessed September 18, 2023).

Suggested Citation:"5 Design Features." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×

Pertschuk, M., P. R. Dixon, D. A. Clanton, R. Pitofsky, and P. P. Bailey. 1980. FTC policy statement on unfairness. December 17. https://www.ftc.gov/legal-library/browse/ftc-policy-statement-unfairness (accessed July 12, 2023).

Rideout, V., A. Peebles, S. Mann, and M. B. Robb. 2021. The common sense census: Media use by tweens and teens. Common Sense Media. https://www.commonsensemedia.org/sites/default/files/research/report/8-18-census-integrated-report-final-web_0.pdf (accessed September 18, 2023).

Rosch, J. T. 2011. Consent decrees: Is the public getting its money’s worth? https://www.ftc.gov/sites/default/files/documents/public_statements/consent-decrees-public-getting-its-moneys-worth/110407roschconsentdecrees.pdf (accessed September 18, 2023).

Shahriari, K., and M. Shahriari. 2017. IEEE standard review - ethically aligned design: A vision for prioritizing human wellbeing with artificial intelligence and autonomous systems. 2017 IEEE Canada International Humanitarian Technology Conference (IHTC):197–201.

Wong, R. Y., M. A. Madaio, and N. Merrill. 2023. Seeing like a toolkit: How toolkits envision the work of AI ethics. Proceedings of the ACM on Human–Computer Interaction 7(CSCW1):Article 145.
