2

How Social Media Work

As the previous chapter discussed, social media encompass a range of activities and features that facilitate social interaction online. Understanding the potential effect of social media on health therefore requires moving beyond a reductive view of social media as a unitary exposure and recognizing that a broad range of interactions with social media may have different consequences for different people at different developmental stages. The creation and consumption of information, including the creation of a personal profile and the ability to comment on and react to others' content and to share and associate online, are important features of social media, ones that can be meaningful to adolescents in ways that vary across developmental stages. Social media platforms differ widely in their target audiences, designs, and purposes, complicating any broad discussion of social media. For this reason, an understanding of platform affordances and how they interact with different developmental ages and capacities is central to this discussion.

To narrow how we think about the range of social media functions and better understand the relation between social media and health, this report uses the concept of affordances. Affordances refer broadly to the possibilities for action arising from the relation between a person's goals and a technology's features; affordances can enable or constrain certain behaviors or actions (Evans et al., 2017). Through affordances, technology influences but does not determine the possible actions available to a user. Online communication technologies have a range of affordances that shape how people
interact and how they construct their identities and relationships (Treem and Leonardi, 2012). Persistence, for example, is an affordance that refers to the durability of online content. Some social media platforms allow users to create and share content that can be stored and accessed at any time, meaning that information shared on social media can have lasting repercussions.

An affordance approach also recognizes that while the goals and motivations of adolescents may not change over the years, the features of social media technology do, as do the economic and regulatory environments in which social media companies operate. For example, sensitivity to peers’ opinions, self-expression, and identity development are well-established needs during adolescence that are unlikely to change 10 or 20 years from now (Steinberg and Morris, 2001). The features of social media technology that adolescents may use for self-expression and identity development are, in contrast, likely to be very different from current iterations in 10 or 20 years. In the same way, the algorithms that operate social media technology can be expected to be in almost constant flux, as will the economic and legal conditions within which social media platforms operate.

This chapter discusses how social media function, with particular attention to affordances, aiming to lay the groundwork for an understanding of how these affordances interact with development to influence adolescents. It covers the features, services, and functions that a person encounters in a technology's interface (e.g., a like button, a sharing function, a messaging service); the ways the technologies operate and the algorithms that drive them; the ways these operations interact with adolescents' developmental stages and needs; and the business models and legal constraints that influence how social media platforms operate. These four aspects of social media functioning are tightly entwined, but it is useful to consider them separately in describing the overall operation.

SOCIAL MEDIA AFFORDANCES

The discussion of social media affordances sometimes highlights persistence, referring to the automatic archiving of online statements; replicability, the ability of information to be copied and shared; scalability, the potential for wide visibility of information on public online platforms; and searchability, the ability of such information to be searched for and located (boyd, 2010). When considering how adolescents use social media, however, there are other relevant features to consider.

Social media platforms have features that influence how users communicate, share information, and interact online, as well as the information they see. Most adolescents who use social media make use of affordances
for creation and consumption of content. Many platforms enable youth to create and share text, images, videos, or other creations such as games or code. Common ways to share content include posting to a profile on Instagram or Facebook, broadcasting to a group on Twitter, and uploading a fanfiction story to a website, to cite a few examples. Some spaces allow users to view media without adding content; in others, such as BeReal, creating content is mandatory. Some platforms, such as YouTube and TikTok, do not even require users to have an account. The individual or collaborative nature of the content created is a related affordance. Some platforms are designed for one user to create content and share it, while other platforms, including the games Minecraft and Roblox, are geared to collaborative creation. Still other platforms have aspects of both individual and collective production.

Platforms enable messaging in varying time frames, described as synchronous or asynchronous communication. Synchronous communication refers to real-time back-and-forth interaction, as in chat rooms or massively multiplayer online role-playing games. Asynchronous communication, a feature of the vast majority of platforms, enables users to post and respond to posts at any time, whether shortly after the initial message or much later, as in message threads or comments. Some platforms (e.g., Discord) offer both synchronous and asynchronous communication. Communication can also have an aspect of anonymity. On platforms such as Reddit, users need not display their names or any identifying information. On other platforms (e.g., Facebook), messages are linked to accounts, with the user's account name and photo displayed. Still other platforms (e.g., Instagram or World of Warcraft) do not necessarily display a name, but users are often identifiable through searches or platform recordkeeping. Platforms can also have some identity-displaying requirements but allow the use of supplemental apps that are completely anonymous. The gaming, blockchain, and financial technology platform YOLO is an example of the latter hybrid-anonymous communication.

On a related note, public or private communication is an important feature of social media. Some platforms are designed for private messages, either one-on-one or in groups (e.g., WhatsApp or Marco Polo). Other platforms may offer direct messages or support private messages through a supplemental app, as with Facebook Messenger. Public communication, on the other hand, can involve discussion forums, as on Quora or Reddit; public feedback threads, as on the fanfiction app Wattpad; and public comments.

The ability to display connections to other users or causes is a feature of many types of social media described as affiliation. Affiliation is commonly indicated through tags added to content (e.g., @, #). Some platforms, such as Twitter, also allow the forwarding or reposting of other
users’ content on one’s own account. Through affiliation and reposting, users can indicate their like or dislike of other users’ content, something that is also made visible through like tallies and comments.

The extent to which these comment tallies or other records of the social media experience remain accessible, known as recordability of content, is variable. Some user-created content disappears after being viewed, as on Snapchat, or after a short time, as in an Instagram story. Some platforms allow users to modify or delete posted content. Other platforms do not allow any editing of posted content. Nevertheless, because of screen capture, sharing, and reposting, most content is at least theoretically recordable.

All social media platforms enable some way to tailor a profile to a user’s personal requirements. Individualization can include aspects of the user’s display, such as the design of the appearance of the profile (e.g., Instagram) or play space (e.g., Minecraft), or the types of curated content (e.g., Pinterest). Some platforms use questions to create the profile structure (e.g., Tinder or Yubo). On some apps, the user personalizes a username and a description of self, or an avatar, sometimes including the appearance of the avatar, as in the video game League of Legends.

Recommendations are social media affordances that can connect users with interesting content or with people who may share similar interests. Recommendations can be presented in ways that encourage extending the time spent on the platform. For example, most platforms have a default user setting that enables notifications, meaning users are notified every time they have a new message or new content in their feeds. Such settings encourage users to return to platforms they were not otherwise using, to open messages, to review who has engaged with their content, and to spend more time on the platform. Some platforms embed recommendations in user feeds. Others have recommended content play automatically. Recommended content can appear endlessly; some platforms present new material using an infinite scroll. Social media companies can also recommend advertisements or add-on purchases in a banner at the top of the screen or embedded in feeds, and they may enable automatically played videos. All these features (recommended content, autoplay, infinite scroll, banners, and push notifications) are part of persuasive design, tools to capture users’ attention and time to the financial benefit of the companies.

PLATFORM OPERATIONS

Many of the affordances described in the previous section are powered by computational algorithms (others, such as the ability to leave comments on a video or chat with a fellow gamer, are the result of platform design decisions). An algorithm is a set of instructions or a series of
steps that a computer or program follows in order to solve a problem or perform a specific task. On social media platforms, algorithms are evaluated based on their efficiency, accuracy, and complexity. The efficiency of an algorithm refers to how quickly it can perform the desired task given the resources used. Accuracy, in contrast, refers to how well an algorithm can achieve the desired outcome. Both accuracy and efficiency are evaluated relative to a particular platform. An algorithm’s complexity, referring to how difficult the algorithm is to understand and implement, is another important feature. Most algorithms employed by today’s platforms are extraordinarily complex, with a simple video recommendation depending on a million lines of code (Computerphile, 2014; Murthy, 2021).

Algorithms rely on the information platforms collect about their users’ actions and behaviors. Algorithms can reveal patterns in the copious amounts of data that internet use generates, even in the passively collected “data exhaust” of no obvious value to the platform (George et al., 2014). Information about when and how people search the internet, what they shop for, and whom they interact with can be valuable to platforms when taken in aggregate and combined with other information (George et al., 2014).

Given concern about potential negative consequences, researchers have started to evaluate algorithms’ bias and transparency. Definitions of these concepts evolve over time and vary across disciplines (Jacobs and Wallach, 2021; Mulligan et al., 2019). Bias can refer to many different concepts; in this report, the term refers primarily to algorithmic fairness, or the lack thereof. Even with that narrowed scope, there are various interpretations, which consider the effects of algorithmic decisions on disparities of outcomes across groups (e.g., similar exposure to health-promoting content across races) or limit the use of group characteristics in decision making (e.g., a platform’s decision to serve a piece of content being independent of race). Recent scholarship also highlights the importance of accounting for justice (Arneson, 2018; Jacobs and Wallach, 2021). Accountability becomes relevant when someone harmed by algorithmic decision making seeks redress. Finally, transparency, while still a contested concept, generally refers to the factors underlying algorithmic decisions being open to the relevant stakeholders (Mulligan et al., 2019).

The algorithms’ technical sophistication points to the explicit purposes that the creators of social media tools have in mind; these purposes vary across platforms. Today, algorithms are used to manage, curate, and moderate the content that people see online. This section describes how these algorithms work and how they power the affordances and features described in the previous sections.

Platform Algorithms

Social media platforms use a variety of algorithms to manage content that users see. These algorithms are often proprietary and use a variety of machine learning and artificial intelligence (AI) techniques to determine which content to show to users and to develop a platform’s unique niche. Their goal is to maximize engagement and, for many platforms, keep users on them for as long as possible. Platforms’ efforts to curate the content users see are advanced by four main types of tools: algorithms for making recommendations, algorithms for ranking, algorithms for targeting advertising, and algorithms that moderate what content a user sees in their feed.

Recommendation Algorithms

Social media platforms collect a vast amount of data on their users, including their browsing history, search queries, interests, and social connections. They use these data to create user profiles and to make personalized recommendations. Companies can also infer characteristics of a user based on information about that person’s social connections or people to whom they are otherwise similar (Davison et al., 2010; Mislove et al., 2010). A collaborative filtering algorithm looks at the user’s behavior, such as which posts were liked or shared, and then recommends similar content.
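To make the idea concrete, the following is a minimal sketch of user-based collaborative filtering. The users, posts, similarity measure, and weighting are illustrative assumptions, not any platform's actual implementation.

```python
# A minimal sketch of user-based collaborative filtering over a like
# matrix. All data and names here are invented for illustration.
import math

# rows: users, columns: posts; 1 = user liked the post
likes = {
    "ana":   {"p1": 1, "p2": 1, "p3": 0, "p4": 0},
    "ben":   {"p1": 1, "p2": 1, "p3": 1, "p4": 0},
    "carla": {"p1": 0, "p2": 0, "p3": 1, "p4": 1},
}

def cosine(u, v):
    """Cosine similarity between two users' like vectors."""
    dot = sum(u[p] * v[p] for p in u)
    norm = (math.sqrt(sum(x * x for x in u.values()))
            * math.sqrt(sum(x * x for x in v.values())))
    return dot / norm if norm else 0.0

def recommend(user, k=2):
    """Score unseen posts by similarity-weighted likes of other users."""
    scores = {}
    for other, vec in likes.items():
        if other == user:
            continue
        sim = cosine(likes[user], vec)
        for post, liked in vec.items():
            if liked and not likes[user][post]:
                scores[post] = scores.get(post, 0.0) + sim
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("ana"))  # ben is most similar to ana, so p3 ranks first
```

Real systems operate at vastly larger scale and blend many more signals, but the core logic of "users similar to you liked this" is the same.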

Broadly speaking, recommendation algorithms are geared to help users discover new content on and off the platform, including people to follow and things to buy. Platforms also harness the information in a user’s images, posts, and videos, with algorithms analyzing the material to determine its meaning and then making recommendations based on the user’s interests and preferences. In addition, many platforms provide summaries of what is currently being discussed, which serve as measures of the pulse of the crowd. This material is provided by algorithms that detect trending topics: they look for spikes in emergent topics in users’ posts and recommend content related to those topics. Most recommendation algorithms also account for engagement with a post, as indicated by its number of likes, shares, and comments; posts with high engagement are more likely to be recommended. Finally, these algorithms consider timeliness, such as how recent the post is and how long it has been since the user last logged in. Recent posts and posts from accounts with which the user engages frequently are more likely to be recommended to others.
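Trending-topic detection can likewise be sketched as spike detection against a historical baseline. The counts and the threshold below are invented for illustration; production detectors are far more elaborate.

```python
# Hedged sketch: flag topics whose current mention volume spikes
# well above their historical baseline. All numbers are invented.
baseline = {"music": 100, "news": 80, "gardening": 20}   # avg mentions/hour
current  = {"music": 120, "news": 400, "gardening": 25}  # mentions this hour

def trending(current, baseline, ratio=3.0):
    """Flag topics whose current volume exceeds ratio x the baseline."""
    return [t for t, n in current.items()
            if n >= ratio * baseline.get(t, 1)]

print(trending(current, baseline))  # ['news']
```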

Ranking Algorithms

Content ranking and sorting algorithms are used to determine the order in which content is displayed to users. The idea is to provide users with a personalized and engaging experience by showing them the most relevant and high-quality content first. These algorithms aim to balance the interests of the user with the interests of the platform while also promoting engagement and user satisfaction. A common strategy considers measures of how relevant the content is to the user’s interests and behavior. For example, if a user frequently engages with posts about cats—or, more relevant to this discussion, about eating disorders—the algorithm may prioritize posts about those topics in their feed. Measures of engagement can be subtle; even the amount of time a user lingers over content can influence ranking algorithms.

Ranking algorithms may prioritize content based on relevance; relevance can in turn be inferred from user activities. Social connections influence estimates of relevance; for example, algorithms generally give more weight to content from friends and family members. Measures of engagement with a post (e.g., the number of likes and shares) can also influence its rank, with high-engagement posts more likely to be ranked higher and displayed to more users. Ranking algorithms also consider the recency of the items to be sorted, giving more weight to newer posts, as users are more likely to engage with fresh content. The quality of the content is also considered in deriving ranking positions, with a post’s originality, credibility, and tone playing a role. Some platforms (e.g., Facebook) have also experimented with showing users a diverse range of content by offering an all-encompassing, reverse-chronological view of content (called Feeds on Facebook). The idea behind such affordances is to help users diversify beyond topics that align with and are reinforced by their preferences, avoiding overreliance on a single topic or source (Mahapatra, 2020; TikTok, 2021).
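A minimal sketch of such a ranking score, blending relevance, social closeness, engagement, recency, and quality, appears below. The weights and the recency half-life are invented assumptions; production rankers typically learn such weights from data.

```python
# Hedged sketch of a feed-ranking score that blends the signals
# described above. Weights and half-life are invented for illustration.
import time

def rank_score(post, now=None, half_life=6 * 3600):
    now = now or time.time()
    age = now - post["created_at"]
    recency = 0.5 ** (age / half_life)   # decays by half every 6 hours
    return (2.0 * post["relevance"]      # match to inferred interests, 0..1
            + 1.5 * post["closeness"]    # tie strength to the author, 0..1
            + 1.0 * post["engagement"]   # normalized likes/shares/comments
            + 1.0 * recency
            + 0.5 * post["quality"])     # originality/credibility estimate

posts = [
    {"id": "a", "relevance": 0.9, "closeness": 0.1,
     "engagement": 0.8, "quality": 0.5, "created_at": time.time() - 3600},
    {"id": "b", "relevance": 0.4, "closeness": 0.9,
     "engagement": 0.2, "quality": 0.7, "created_at": time.time() - 600},
]
feed = sorted(posts, key=rank_score, reverse=True)
print([p["id"] for p in feed])  # highest combined score first
```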

Recommendation and ranking algorithms may work hand in hand to determine how content is displayed in a user’s feed. Platforms may first identify what types or pieces of content are the most suitable for a particular user’s feed at a point in time, and then use some objective measures of relevance or engagement to rank them. At times, ranked content may be interwoven with advertised content if the advertisements are consistent with the user’s interests and past activities. This interweaving is driven by a different set of algorithms: ad-targeting algorithms.

Ad-Targeting Algorithms

In addition to targeting particular content, social media algorithms also use information about users to target advertising. Most social media platforms are free to use, although some, such as Roblox, LinkedIn, Hinge, and Twitter, offer premium versions for a subscription fee. In general, social media companies rely on advertising revenue and therefore have highly sophisticated ad-targeting algorithms. These algorithms can use contextual or behavioral cues to create targeted, personalized advertising campaigns, determining which users should see which ads. The goal of these algorithms is to show ads to users who are most likely to be interested in them to increase the effectiveness of their advertisers’ investment.

Social media platforms have rich information about their users: browsing history, search queries, interests, and social connections, as well as basic demographic information such as age, gender, and occupation. Ad-targeting algorithms use all this information to create personalized profiles behind the scenes, often opaque to the end user. Contextual information on where the ad is displayed is also considered to ensure the relevance of posted advertisements.

Ad-targeting can also incorporate an element of collaborative filtering. Some algorithms target users who are demographically or behaviorally similar to existing customers or followers (an approach known as lookalike targeting). Some platforms also allow advertisers to upload their customer lists or create custom audiences based on specific criteria, such as people who have visited their website or interacted with their social media pages.
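The following is a hedged sketch of lookalike targeting: candidate users are scored by interest overlap with a seed list of existing customers. The interest sets, the Jaccard measure, and the cutoff are illustrative assumptions, not any ad platform's actual method.

```python
# Hedged sketch of "lookalike" targeting: keep candidates whose
# interests resemble a seed list of existing customers. All data
# and the 0.5 threshold are invented for illustration.
seed_users = [
    {"cats", "fitness", "cooking"},
    {"cats", "fitness", "travel"},
]
candidates = {
    "u1": {"cats", "fitness", "music"},
    "u2": {"finance", "cars"},
}

def jaccard(a, b):
    """Jaccard similarity between two interest sets."""
    return len(a & b) / len(a | b)

def lookalikes(candidates, seeds, threshold=0.5):
    """Keep candidates whose best seed similarity clears the threshold."""
    return [uid for uid, feats in candidates.items()
            if max(jaccard(feats, s) for s in seeds) >= threshold]

print(lookalikes(candidates, seed_users))  # ['u1']
```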

Content Moderation Algorithms

Most social media platforms have content moderation policies that aim to protect their users from harmful or offensive content. Determining what constitutes harmful content depends on the company’s policies, outlined in its terms of service. Typical examples include hate speech, nudity, violence, and spam. While content moderation is a feature of all media, and always has been, content moderation for social media is substantially more complex given the speed and volume of information available to social media users. As such, approaches to moderation have grown and evolved rapidly with the companies that implement them. Today’s approaches may bear little resemblance even to previous policies of the same company. Online bulletin boards, for example, were once carefully managed by dedicated administrators who were an integral part of the community, a model still central to platforms such as Reddit. Nevertheless, the volume of content moderation required on large platforms means they rely
increasingly on contract workers or firms to remove disturbing and illegal content from the site (Roberts, 2016).

Discerning what content is offensive or vulgar is necessarily a judgment call. Judgments about what level of nudity is vulgar, for example, depend heavily on local cultural norms, posing challenges for multinational companies. This dilemma led TikTok to issue separate content moderation guidance for more and less socially conservative countries, although journalistic research has found that the platform’s decisions often far overreach accepted norms, particularly in the censoring of depictions of homosexuality (Hern, 2019). The same researchers found the platform to be strangely permissive in other decisions, taking “the unusual approach of erring on the side of risk when it came to sexualized content featuring children” (Hern, 2019).

TikTok’s inconsistent approach is one of many episodes that have drawn public attention to content moderation. The 2016 U.S. presidential election, in which social media played an influential part, was another (Caplan et al., 2018; Edelman, 2020; McSherry, 2020). The importance of effective moderation policies is now a prominent part of companies’ internal and external policy debates. Platforms may selectively reduce the prominence of content from users who, while not explicitly violating the platforms’ terms, are sufficiently offensive or misleading as to border on a violation (Gillespie, 2022).

Academic researchers are increasingly aware of the shortcomings in the way platforms are managed (Gorwa, 2019). Platforms generally use a combination of artificial intelligence and human staff to enforce content moderation policies. Many platforms encourage users to report offensive or harmful content in an effort to improve the efficiency of their moderation efforts. Some violations can be detected with automated methods; the presence of certain keywords, for example, can indicate likely hate speech, and automated tools can analyze the content of images, videos, or text. The analysis of metadata, information such as where, how, and by whom a post was created, can also inform an algorithm’s assessment of whether content violates a platform’s terms of service. Moderation algorithms often combine techniques including machine learning, natural language processing, and computer vision, but specific moderation strategies vary. Box 2-1 lists examples of how some well-known social media platforms moderate content.
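A simplified sketch of such a hybrid pipeline appears below: a keyword pass, a stubbed-out classifier score, and a human review queue for ambiguous cases. The terms, scores, and thresholds are invented for illustration and stand in for the proprietary systems platforms actually run.

```python
# Hedged sketch of a hybrid moderation pipeline: keyword matching,
# a stand-in ML classifier, and deferral to human review. All terms,
# scores, and thresholds are invented for illustration.
BLOCKED_TERMS = {"exampleslur"}  # placeholder keyword list

def classifier_score(text: str) -> float:
    """Stand-in for an ML model returning a probability of violation."""
    return 0.9 if "buy followers now" in text.lower() else 0.1

def moderate(post: str) -> str:
    words = set(post.lower().split())
    if words & BLOCKED_TERMS:          # stage 1: exact keyword match
        return "remove"
    score = classifier_score(post)     # stage 2: model score
    if score >= 0.8:
        return "remove"
    if score >= 0.5:                   # ambiguous: defer to a human
        return "human_review"
    return "allow"

print(moderate("Great game last night!"))  # allow
print(moderate("BUY FOLLOWERS NOW!!!"))    # remove
```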

While each platform has its own proprietary and often unique moderation strategy, there have been some collective efforts to standardize the process. The Global Internet Forum to Counter Terrorism (GIFCT) is one prominent example of such standardization. The forum started as a coalition of technology companies working together to deprive terrorists of the
amplifying value of social media. Industry commitment to collaboration was at the root of its involvement in the GIFCT (Huszti-Orban, 2018).

GIFCT working groups have explored strategies for optimal content moderation, advocating the use of artificial intelligence and algorithms that use machine learning to identify and remove terrorist content before it is widely viewed (GIFCT, 2021b). Sharing of perceptual hashes, “representation[s] of original content that cannot be reverse-engineered to recreate the content,” is one strategy GIFCT promotes (GIFCT, 2023). Through hash sharing, member companies can quickly inform colleagues at other companies in the forum of potential terrorist content without sharing user data. GIFCT also gives some attention to support for smaller companies for whom the cost of compliance with content moderation, especially policies that rely on human moderators, might be prohibitive (GIFCT, 2021a).
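The following sketch conveys only the shape of hash sharing: a tiny average hash over an 8x8 grayscale grid, compared by Hamming distance against a shared list of known-bad hashes. Real systems use far more robust perceptual hashes (e.g., PhotoDNA or PDQ); everything below is an illustrative assumption.

```python
# Hedged sketch of hash sharing: compute a tiny "average hash" of an
# image (here an 8x8 grayscale grid) and compare it to a shared list
# of known-bad hashes by Hamming distance. Toy stand-in only.
def average_hash(pixels):
    """64-bit hash: each bit marks whether a pixel is above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return sum(1 << i for i, p in enumerate(flat) if p > mean)

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Hashes shared across member companies; the image itself never moves.
shared_hash_list = {average_hash([[10] * 8] * 4 + [[200] * 8] * 4)}

def matches_known_content(pixels, max_distance=5):
    h = average_hash(pixels)
    return any(hamming(h, known) <= max_distance
               for known in shared_hash_list)

# A near-duplicate (one pixel altered) still matches.
near_dup = [[10] * 8] * 4 + [[200] * 8] * 3 + [[200] * 7 + [180]]
print(matches_known_content(near_dup))  # True
```

The key property, as the GIFCT description notes, is that the shared hash cannot be reverse-engineered to recreate the original content.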

Response to GIFCT’s work, from member companies and press, has been generally positive (Criddle, 2023; Hadavas, 2020; Microsoft, 2017). After public concern that Facebook had been used to incite violence against the Rohingya minority population in Myanmar, the company improved its detection algorithm for hate speech in the local language (Stecklow, 2018; Stevenson, 2018). Unsurprisingly, training algorithms to recognize objectionable terms has led to a dramatic increase in the removal of terrorist content on major platforms (Gorwa et al., 2020). Facebook reports that 99 percent of content from ISIS and Al Qaeda is detected before anyone flags it; YouTube reports removing 98 percent of violent extremist content (GIFCT, 2019). Yet meaningful interpretation of such claims is difficult. The content found to be in violation of policy and removed can only be a subset of all the violating content that exists. Regardless of how close to perfect a moderation system is, its success depends largely on the accurate identification of the most harmful content, with room for error only in the margins. Such qualitative evaluation of not just the content removed but its relative likelihood to cause harm depends on both objective measurement of harm and more complete data on content moderation and takedowns.

Distortions Associated with Algorithms

At the same time, no amount of computational sophistication can produce algorithms that cater perfectly to end users’ needs, demands, and goals. As a 2022 White House briefing noted:

Although tech platforms can help keep us connected, create a vibrant marketplace of ideas, and open up new opportunities for bringing products and services to market, they can also divide us and wreak serious real-world harms. (White House, 2022)

Of particular concern to this report are the potential harms to young people. This section discusses some of the challenges that have emerged from the persistent and increasing reliance of social media platforms on algorithmic computation.

The Black Box Nature of Platform Algorithms

Opening a social media account includes an invitation to authorize the platform to track data; users must accept these terms to make an account. But terms of service are widely, if not universally, ignored, prompting a recent article to describe the idea that users endorse them as “the biggest lie on the internet” (Obar and Oeldorf-Hirsch, 2020).

Research indicates that while 79 percent of adults express concern about how companies use their personal data, only about 60 percent of adults report reading privacy policies even some of the time (Auxier et al., 2019). Endorsing terms of service, even if only officially, can absolve companies of responsibility for misuse of personal data. Terms of service sometimes stipulate account security measures, and users who do not take required measures may be more vulnerable to hacking or data breaches. What is more, platforms can terminate accounts over violations of terms of service (Instagram, 2023c; Meta, 2022; TikTok, 2023b). Given the complexity of most terms of service and the real challenges of information overload, there is concern that users today simply cannot make meaningful use of notice-and-consent terms, but it is difficult to say what a better system would look like (Turow et al., 2023).

Despite being central to the end-user experience, social media platform algorithms are opaque to users. The information used as algorithmic input, how that information is processed, and how the user’s experience is generated are rarely clear. The inherent complexity of the process drives much of this uncertainty.

The use of black box algorithms by social media platforms to define and influence the user experience has been criticized for its biases and lack of transparency (Waddell, 2021). Technology companies, for their part, have been notoriously reluctant to share what they consider trade secret information with the public (Foss-Solbrekk, 2021; Waddell, 2021). Content moderation decisions, already subject to somewhat subjective interpretation of rules, are even less open when made by an algorithm (Gorwa et al., 2020). When certain viewpoints appear to be penalized or given special treatment, backlash will follow, as the process that underlies these important social and political decisions is somewhat opaque.

In her public testimony, Facebook whistleblower Frances Haugen described the company’s interest in growth at the expense of safety, arguing that the company made subjective, secretive decisions about what content its users should see because its harmful algorithms were in fact profitable (Haugen, 2021). Facebook denied this allegation, citing a $13 billion investment and 40,000 safety staff (Ryan and White, 2021). While neither Haugen’s allegations nor the usefulness of the company’s investment in safety can be objectively verified, the industry’s reticence to authorize independent analysis of its algorithms contributes to a perception of malfeasance (Darcy, 2021; Mostert and Urbelis, 2021; Thune, 2021).

The proprietary and black box nature of platform algorithms can cause subtle harms to users. Even highly educated users are not fully aware of how social media algorithms work and how they affect their social connections. For example, if a user’s close friend or family member’s content is not showing up in their feed, they may assume that the person is intentionally ignoring them, when, in reality, the social media algorithm is responsible for the lack of visibility. Research indicates that many users do not realize how social media platforms are designed to show them content that is most likely to engage them and keep them on the platform, rather than providing a comprehensive view of the content their friends and family members are posting (Eslami et al., 2015). Feeds also tend to mix network content with advertisements, further confusing people’s understanding of why something is in their feed.

Algorithmic audits by external parties have been suggested as a way to measure the extent of these harms, but such audits are difficult to implement partly because platform algorithms are constantly changing in ways that affect the curation and distribution of content. Some of these changes are merely annoying, as when a post disappears from a user’s feed. Others are discriminatory: Facebook has allowed its advertisers to exclude whole racial, ethnic, and age groups from seeing ads (Angwin and Parris, 2016; Angwin et al., 2017). In some cases, the algorithmic influence on how ads are shown to users can conflict with the advertisers’ goals (Ali et al., 2021; Sapiezynski, 2023). There may be bias in how ads are pushed to users even when the advertiser explicitly requested certain audiences (e.g., disproportionate views by male users when advertisers wanted an equal split of male and female viewers) (Ali et al., 2019).

It may not be possible or advisable to fully reverse engineer the algorithmic systems powering today’s social media sites, but there is reason to believe that some of an algorithm’s harmful consequences could be identified through so-called glass-box analyses that identify the data and features driving the user’s experience (Dobbrick et al., 2022). Algorithmic audits have proved helpful in recent years for identifying gender and racial biases in online systems and for understanding misinformation online. Glass-box approaches, wherein platform designers create features and scaffolds for third-party researchers and practitioners to study the intricacies of the underlying platform algorithms without having to know all possible proprietary details, can supplement these efforts.

While some scholars have advocated for glass-box analysis of social media platforms’ algorithms, others have recognized that this would have several implications. First, a public-interest infrastructure to facilitate scrutiny of platforms and offer appropriate guidelines on how to redress problems would be needed. Such infrastructure would likely need to involve academia, government agencies, civil society groups, and user collectives, and be capable of conducting thorough examinations and addressing problems using means such as competitive pressures, the media, or legal action.

The Effect of Algorithmic Practices

Platform algorithms arguably play an important role in regulating what users see, seeking to keep them safe online. Algorithms do well at identifying certain types of clear and egregious violations. For example, child sexual abuse material, a universally reviled and illegal category of content, is relatively straightforward for companies to articulate and for automated moderation to detect (U.S. Congress, 2019). Other objectionable content, however, has more ambiguous boundaries, and algorithms have difficulty making precise determinations in the face of ambiguity (Duarte et al., 2017; Fishman, 2023). Self-harm content and hate speech, for instance, can be identified only through contextual interpretation. Developing tools that can accurately detect and remove this content is a significant challenge (Neumann, 2013).

Further, content moderation practices themselves have been met with skepticism among some scholars and internet activists. Chandrasekharan and colleagues found that Reddit’s ban of certain racist and demeaning threads caused the people spreading them to either quit the platform or move on to other topics, vastly reducing their influence (Chandrasekharan et al., 2017). Similarly, investigations found that misinformation on election fraud and hate speech fell sharply after the January 2021 deplatforming of former president Donald Trump (Dwoskin and Timberg, 2021). But it is not always clear if such drops are the result of the most radical users simply taking their business to more obscure and poorly moderated forums. While such migration makes the work of content moderators on mainstream platforms easier, it poses a separate problem for society in that it removes the moderating influence of mainstream discourse from the lives of the people already on the margins, thereby contributing to their increased radicalization (Lanteri, 2022). The same pattern has
been documented among users leaving Twitter for explicitly right-wing-friendly platforms such as Gab and Parler; Telegram serves a similar role for Islamic State militants (Ribeiro et al., 2021; Warrick, 2016). Playing out over many online platforms, such migration has led the public to question whether the commitment to removing offensive content would “come at the expense of [creating] a more toxic and radical community” (Ribeiro et al., 2021).

There is ample reason to suspect that harmful information may be more likely to flourish on online platforms with fewer moderation measures in place. Obscure platforms can foster echo chambers as they attract users who share similar perspectives. Users who seek out these platforms through recommendations may be more receptive to the messages circulated on them. Research on online groups devoted to eating disorders indicates that such users will adapt their behavior to allow their objectionable content to escape detection, using strategies such as intentional misspelling and coding their language to avoid notice (Chancellor et al., 2017). In such circles, the content moderation algorithm can inadvertently perpetuate disordered norms and beliefs about diet. Ultimately, many of the key challenges to protecting young people’s well-being online come back to trade-offs in dealing with subtle and complex content on topics such as self-harm.

The limited efficacy of platform algorithms and their potential to create distortions can give rise to recursive feedback loops for users. Although the algorithms’ goal may be relatively innocuous, the manner in which the content is presented can be a source of harm (Kirdemir et al., 2021; Lee et al., 2022; Logrieco et al., 2021). An emphasis on maximizing user engagement, discussed later in this chapter, may be at the root of the problem, as algorithms sort content based on users’ history, favoring the material to which users have responded in the past. The most sensational and provocative posts are often given the highest priority for this reason, exposing users to a narrow range of content that reinforces their existing beliefs and interests, encouraging recursive feedback loops.
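A toy simulation can illustrate the loop: a recommender that reinforces whatever a user has already engaged with tends to narrow what that user sees. The topics, weights, and reinforcement amount below are invented for illustration.

```python
# Hedged toy simulation of a recursive feedback loop: recommending
# in proportion to past engagement, then reinforcing whatever was
# shown. All parameters are invented.
import random

random.seed(7)
topics = ["sports", "music", "conspiracy"]
interest = {t: 1.0 for t in topics}  # start with equal weights

for _ in range(200):
    # Recommend in proportion to accumulated interest weights.
    shown = random.choices(topics, weights=list(interest.values()))[0]
    # Engagement reinforces the weight for the shown topic.
    interest[shown] += 0.5

total = sum(interest.values())
share = {t: round(w / total, 2) for t, w in interest.items()}
print(share)  # shares drift away from uniform as reinforcement compounds
```

This rich-get-richer dynamic, not any malicious intent in the code, is what produces the narrowing described above.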

Recursive feedback can, in turn, exacerbate problems with harmful content and misinformation. If a user shows interest in conspiracy theories, for example, then the algorithm may recommend more of the same content, creating the impression that such theories are more prominent than they actually are. This misperception lends a veneer of credibility to misinformation. Vaccine hesitancy is a prime example of the fallout of recursive feedback, but many quack health treatments have been promoted through the same path (Brugnoli et al., 2019; Dow et al., 2021; Robins-Early, 2022; Swire-Thompson and Lazer, 2020).

The Implementation of Persuasive Algorithmic Design

As noted above, social media platforms are often designed to keep users engaged and coming back for more. While often beneficial, some of the design elements can shape users’ perceptions of the world around them by limiting their exposure to diverse perspectives.

One of the key ways that social media platforms use design to influence user behavior is through notifications. Notifications are designed to grab attention and encourage users to take specific actions, such as opening an app or liking or commenting on a post. However, notifications can affect user behavior by prioritizing certain types of content or interactions over others (Altuwairiqi et al., 2019). For example, social media platforms may prioritize notifications for posts with high engagement, such as likes or comments, which can direct users’ attention toward those posts and away from others that are equally or more important. At the same time, inadequate notification features may create apprehension or anxiety over missing an opportunity (i.e., fear of missing out) (Alutaybi et al., 2019).

Another design element that can shape behavior is infinite scrolling (Cara, 2019). Infinite scrolling allows users to move through their feeds without having to click a “next page” button, a feature that encourages users to spend more time on the platform. Infinite scroll also tends to prioritize the most popular content rather than providing a comprehensive view of network posts. Infinite scroll has the potential to keep people on a platform; it can also have the effect of overwhelming people who may miss out on important content that is buried deeper in their feed (Bhargava, 2023; Karagoel and Nathan-Roberts, 2021).
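A hedged sketch of the mechanism: a cursor-based feed endpoint that always supplies a next page, so the client loop has no natural stopping point. The data, page size, and cap are illustrative assumptions.

```python
# Hedged sketch of the server side of infinite scroll: cursor-based
# pagination with no natural end. Data and page size are invented.
FEED = [f"post-{i}" for i in range(1000)]  # pretend ranked feed

def fetch_page(cursor=0, page_size=10):
    """Return one screenful of posts plus the cursor for the next."""
    page = FEED[cursor:cursor + page_size]
    next_cursor = cursor + page_size if cursor + page_size < len(FEED) else None
    return page, next_cursor

# Client loop: each time the user nears the bottom, fetch more.
cursor, screens = 0, 0
while cursor is not None and screens < 3:  # a real client never caps this
    posts, cursor = fetch_page(cursor)
    screens += 1
    print(posts[0], "...")
```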

The algorithms that tailor the content users see in their feeds to their interests and preferences are also an obvious source of distortions. While personalization can create a sense of relevance and engagement, it can also create filter bubbles, thereby reinforcing echo chambers. Filter bubbles occur when users only see content that aligns with their preexisting beliefs and preferences, limiting their exposure to diverse perspectives and leading to a distorted sense of reality (Arguedas et al., 2022; Rhodes, 2022). Additionally, personalization can lead to privacy concerns, as social media platforms may collect and use personal data to create targeted content (Eg et al., 2023).

Gamification is another persuasive design element used in social media platforms. The ability to win badges or points or to appear on leaderboards encourages more use of the platform (Bitrián et al., 2021). It can also encourage users to value certain types of interactions or behaviors over others. For example, social media platforms may reward users for engaging with popular or controversial content even if it is misleading or harmful (Alizadeh et al., 2023).

Persuasive design can also manipulate social influence (Stibe and Oinas-Kukkonen, 2014; Wiafe et al., 2020). Social proof, for example, refers to the influence of others on a person’s attitudes or behaviors (Das et al., 2014). Design features that display the number of likes, shares, or followers that a user has can create a sense of pressure or conformity around certain types of content or interactions (Sanak-Kosmowska, 2021). A high count of likes or re-posts can create a social pressure to continue the re-sharing of content (Hartmann et al., 2021; Richler, 2023).

Most people’s lack of awareness of the extent to which their personal data drive operational algorithms may result from a long-held emphasis in computing on seamless design, that is, systems whose operations are invisible to the user (Inman and Ribes, 2019). An emerging emphasis on the opposite, platform design that makes the complexity and ambiguity of the technology clearer, is gaining traction as a way to make users aware of their influence on a platform’s operations (Inman and Ribes, 2019). This so-called seamful design aims to convey the logic behind the mix of human and algorithmic involvement in the content shown to users.

ADOLESCENTS AND SOCIAL MEDIA

Teenagers are often early adopters of social media platforms; spikes and drops in platform popularity among young people are crucially important to social media companies (Frenkel et al., 2021). They are also active users of the technology: A recent nationally representative survey found that 95 percent of teenagers have a smartphone, a 22 percentage point increase over the last 8 years (Vogels et al., 2022). Smartphone use tends to start early in adolescence: over 40 percent of children ages 8 to 12 have a smartphone, and 18 percent of children in this age group report using social media every day (Rideout et al., 2021).

The relative popularity of different social media platforms among teenagers can shift rapidly. The most popular platform today, the video-sharing site YouTube, is used by 95 percent of teens (Vogels et al., 2022). More than a third of teenagers say they use a social media platform “almost constantly,” with YouTube again being most often named (see Figure 2-1) (Vogels et al., 2022).

Pew researchers found Black and Hispanic teens may use online media more than their White peers; more than half of Black and Hispanic teens and 37 percent of White teens report being online “almost constantly” (Vogels et al., 2022). Similarly, girls appear to use modestly more online media than boys, and teens over age 15 more often report constant use than younger teens (see Figure 2-2). On the whole, 55 percent of teens said the time they spend on social media is about right; only 8 percent thought they were on social media too little; the rest thought their time spent was too much (Vogels et al., 2022).

FIGURE 2-1 Percentage of U.S. teens who say they “ever use this app or site” or “almost constantly use this app or site.”
SOURCE: Vogels et al., 2022.
NOTE: Teens refer to those ages 13 to 17. Those who did not give an answer or gave other responses are not shown.

FIGURE 2-2 Percentage of U.S. teens who say they use the internet almost constantly either on a computer or a cellphone.
SOURCE: Vogels et al., 2022.
NOTE: Teens refer to those ages 13 to 17. White and Black teens include those who report being only one race and are not Hispanic. Hispanic teens are of any race. Those who did not give an answer or gave other responses are not shown.

The Adolescent Experience Interacts with Platform Affordances

Adolescence is a time of tremendous cognitive, social, emotional, and physical change. These changes involve both opportunity for maturation and vulnerability to environmental stressors (NASEM, 2019). There is a mismatch in the timing of maturation in different brain systems and in the ability of these systems to communicate (Dahl, 2004; Giedd et al., 2015). Early in adolescence, the start of puberty initiates dramatic changes in the brain’s limbic system, which regulates emotions, moods, feelings of reward, and social needs. In contrast, the brain’s prefrontal cortex, which is critical for carrying out functions such as decision making and regulating emotions, takes longer to develop and fully mature. This developmental mismatch can leave adolescents with less mature controls for making good judgments and regulating emotions and impulses, especially when emotions are high and peers are around. Thus, the underlying maturational processes that operate during this stage of development are the very ones that can render adolescents more vulnerable, relative to younger children or adults, to social media affordances.

The brain undergoes profound development from adolescence into early adulthood (Guyer et al., 2023). Three key features of adolescent brain development are especially relevant to a discussion of social media affordances. One is a heightened sensitivity to rewards as well as dynamic changes in the function of the dopaminergic system (Silverman et al., 2015; Spear, 2011; Wahlstrom et al., 2010). The second is protracted maturation of brain networks that support cognitive control (Giedd et al., 2015). The third is neural sensitivity to specific types of social information (Nelson et al., 2016).

Beyond physical changes, in the second decade of life, youth refine their identity, assert their autonomy, build intimate relationships with peers and romantic partners, and begin to take on adult roles and responsibilities. These maturational processes specific to adolescence can also interact with social media in ways that are both similar to, and different from, younger children and adults.

Platforms are designed to keep users engaged, a quality sometimes described as stickiness. Stickiness is typically realized through the persuasive design features described earlier in this chapter. For adolescents, ongoing brain development and a heightened sensitivity to rewards, coupled with still-developing cognitive control, make it difficult to disengage despite their best intentions and even knowledge of the consequences. When faced with content that is increasingly interesting and emotionally exciting, as pushed content may be, the task of getting offline is more difficult; the adolescent brain is particularly susceptible to highly emotional or arousing contexts (Guyer et al., 2016). A strong desire for social connectedness may lead adolescents to relax privacy settings, to be willing to connect with strangers, and to feel a strong need to check accounts for feedback from peers, such as likes and comments, given the reinforcing potential of a social reward that engages dopamine-producing brain regions (Nelson et al., 2016; Sherman et al., 2018). For the same reason, notifications about new messages or comments are challenging for youth to ignore when doing homework or trying to sleep (Kidron et al., 2018).

Neurobiological Maturation

As children approach adolescence, their brains reach adult size, but neurobiological maturation continues through the teen years into the mid-to-late twenties. Much of this maturation comes from the refinement and coordination of the brain’s prefrontal cortex, which leads to improved capacity for logic, planning, memory, and abstract thinking (van der Molen, 1994). In tandem, executive function—the higher-order skills and cognitive processes that help youth regulate their time, attention, emotions, and impulses—continues to develop in adolescence but does not fully mature until the mid-twenties (Friedman and Miyake, 2017; Zelazo and Carlson, 2012). In general, cognitive skills and impulse control improve tremendously during this second decade, but the process is uneven and largely shaped by both pubertal changes and personal experience. For this reason, the problem-solving skills of a prepubescent 13-year-old are likely to differ from those of a 13-year-old who started puberty at age 9. Both age and the timing of developmental milestones, such as puberty, can influence the experience of social media.

Changes in the prefrontal cortex during adolescence and into young adulthood involve reduced gray matter volume caused by the pruning of unused connections, which improves the transmission of neural signals, while myelination increases white matter volume, further strengthening connections between brain regions (Giedd et al., 2015; Mills et al., 2016). Constant engagement with social media in early adolescence may alter neural sensitivity to rewards and punishments (Maza et al., 2023).

Reflecting a developmental mismatch between drives and controls, brain systems for processing socioemotional incentives and exerting cognitive control both grow rapidly in adolescence but at different rates. Sensitivity to rewards increases from childhood through adolescence, peaking in the late teens and then declining (Braams et al., 2015; Silverman et al., 2015). Regulation of impulsive behavior, a function of cognitive control, develops more slowly. For this reason, adolescence is characterized by both increased enjoyment of risk-taking and a disinclination to limit it, making teens more likely to take greater risks, online and offline, than adults or children (Albert et al., 2013; Shulman et al., 2016; Steinberg, 2010).

Studies using functional magnetic resonance imaging (fMRI) have linked the use of social networking sites to brain adaptations in samples of adolescents and young adults (Wadsley and Ihssen, 2023). The brain’s so-called reward circuit is involved in social media use, and especially in excessive or problematic use, as it is in other behavioral addictions, though it is hard to say if these changes are a cause or an effect of social
media use (Wadsley and Ihssen, 2023). Recent fMRI studies have linked the use of social media in early adolescence with changes in the neural underpinnings for sensitivity to social feedback. Similar research suggests that multiple neural networks can be reconfigured in response to personalized video feeds in young adults (Su et al., 2021). While none of these studies can establish that social media use causes changes in the brain, the emerging literature suggests a potential interaction of social media stimuli and neurological development.

Social and Emotional Development

Increased neural maturity supports young people’s ability to think not only about their own mental states but about those of others as well (Hall et al., 2021). An increased ability to consider other perspectives drives empathetic and prosocial behaviors as well as increased social comparison (Hollarek and Lee, 2022). Adolescents may give undue weight to other people’s opinions, real or imagined, delivered explicitly or implicitly (Elkind and Bowen, 1979; Guyer et al., 2014; Pfeifer et al., 2009; Venticinque et al., 2021). This internalized, relational thinking can interact with social media affordances such as affiliation, content creation, and recordability, and might make teenagers more willing to violate platforms’ terms of service or circumvent moderation policies (e.g., to maintain a clandestine identity).

Along with a better understanding of one’s own and others’ mental states, the developing adolescent becomes increasingly adept at recognizing their own emotions and regulating emotional intensity (Silvers, 2022). While stabilizing with age, emotional intensity is influenced by hormonal changes and puberty (Guyer et al., 2016). The considerable variability in hormonal cycling can influence young people’s moods and emotions, including their emotional response to stimuli such as social media (Buchanan et al., 1992). There are also important cultural differences in how emotions are expressed and the range of emotion considered acceptable to display (Kiel and Kalomiris, 2015). Emotional regulation plays a role in the expression of emotion online and the propensity to have negative emotional reactions; it is also central to basic coping and self-regulation (Blumberg et al., 2016; Gioia et al., 2021; Giordano et al., 2023; Greenwood and Long, 2009; Marino et al., 2020).

Developmental Needs of Adolescence

Milestones of adolescence, such as learning to drive, getting a job, helping to mind younger children, and having romantic relationships, are markers of increasing freedom and responsibility. A desire for independence is intertwined with teens’ physical, mental, and sexual maturation. It also makes digital spaces especially appealing, as young people can interact with others there without the parental oversight that their in-person interactions might draw. On social media, adolescents can select content, connections, and activities, allowing platforms’ curation algorithms to tailor content further for them.

Development of identity is an important part of adolescence. Teens spend time exploring their thoughts, beliefs, and feelings, and thinking about how they signal these beliefs to others (Meeus et al., 1999). Social media features provide myriad ways to display oneself publicly and receive feedback on those displays, across a variety of configurations of social connections. Even on the same platform, youth might maintain multiple accounts with different connections and different presentations of self. Social media can also offer a window onto different identities, potentially valuable for young people who lack in-person role models of identity, such as members of sexual and gender minorities.

The importance of peers for teens also relates to their identity development. Young people choose friends with shared interests, experiences, or traits and may become more like their peers over time (Brechwald and Prinstein, 2011). Teens typically spend more waking hours with their peers, both online and offline, than with their parents and often give more attention to markers of their peer affiliations (e.g., clothing style, walking close together). The selection and socialization aspects of peer relationships can be seen in both face-to-face and digital interactions (Brechwald and Prinstein, 2011; McPherson et al., 2001).

Sexual maturation is a significant milestone of adolescence, and this can involve opportunities for emotional and physical intimacy with others (Suleiman et al., 2017). Many aspects of romantic relations are similar to friendship, with feelings of closeness, support, and biochemical rewards (Furman and Shomaker, 2008; Yau and Reich, 2018). Adolescents seek information about sex and sexuality from various sources, including media (DeLamater and Friedrich, 2002). With the advent of social media, youth can now find sexual content, information about sexuality, and romantic and sexual partners online.

THE SOCIAL MEDIA BUSINESS MODEL

The affordances that various platforms offer influence users’ experiences with social media. Choices as to how to deploy different affordances are influenced in turn by the platforms’ business models, which in the case of social media are typically classified as organic, earned, or paid.


Organic media, sometimes described as owned media, emphasize the importance of customers engaging with the brand. Users do not pay for organic media; the company bears the cost of the staff and salaries needed to support them. Similarly, earned media refers to the free publicity a brand or company may receive from uncompensated customers who are motivated to create word-of-mouth promotion or buzz, sometimes captured and amplified on social media. Paid media, by far the most prevalent form, refers to traditional advertising where businesses pay the platform to position advertisements (Lovett and Staelin, 2016). Meta, for example, the parent company of Facebook, Instagram, and WhatsApp, makes an estimated 98 percent of its revenue from advertising (Allyn, 2022). The value of advertising is determined by traffic to the website and users’ responsiveness to the ads, typically quantified by measures of engagement (e.g., number of clicks, time spent on the site).
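To make the engagement arithmetic concrete, the sketch below shows one simplified way such revenue might be estimated. It is a minimal illustration, not any platform’s actual pricing model: the cost per 1,000 impressions (CPM), cost per click, ad load, and click-through rate are all hypothetical.

```python
# Minimal sketch of engagement-based ad revenue (hypothetical rates,
# not any platform's actual formula).

def estimated_ad_revenue(impressions: int, clicks: int,
                         cpm_usd: float = 2.50,    # hypothetical price per 1,000 impressions
                         cpc_usd: float = 0.40) -> float:  # hypothetical price per click
    """Estimate revenue from impression-priced plus click-priced ads."""
    return (impressions / 1000) * cpm_usd + clicks * cpc_usd

# A user who stays longer sees more ads, so revenue tracks time on site.
minutes_on_site = 45
impressions = minutes_on_site * 4       # hypothetical ad load: 4 ads per minute
clicks = round(impressions * 0.01)      # hypothetical 1 percent click-through rate
print(f"Estimated revenue from one session: ${estimated_ad_revenue(impressions, clicks):.2f}")
```

Under these invented numbers, a 45-minute session yields about $1.25, and doubling time on site roughly doubles the estimate, which is why engagement is the metric platforms optimize.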

Sometimes paid advertising is obvious, as when a video streaming service shows commercials. Other examples are less clear, and it is not always easy to distinguish paid content from content that users produce. Sponsored content refers to material developed by social media users or influencers under contract with a company, such as unboxing videos or product testimonials (Radesky et al., 2020). Sponsored content can be difficult to identify. While children older than age seven gradually develop an understanding of the persuasive intent of advertising, including commercials, sponsored content and influencer marketing (e.g., toy reviews, toy play videos) are much harder for them to resist (Radesky et al., 2020). What is more, social media influencers are often entertaining, blurring the line between commercial and noncommercial content (Balaban et al., 2022). Among adolescents, understanding the persuasive intent of sponsored content does not necessarily trigger skepticism, especially among teens younger than 14 (van Reijmersdal and van Dam, 2020). Even after age 16, when intellectual sophistication regarding advertising is similar to adult levels, adolescents remain highly susceptible to influencers’ sponsored content (Balaban et al., 2022).

Although paid advertising drives much of the business of social media, there are other revenue sources. Some platforms have subscription or hybrid-subscription models, wherein the basic service is provided at no cost but additional features can be accessed for a fee (Kumar, 2014). Some apps and platforms also earn income through user purchases, such as in-game purchases in video games. Especially when games are free to download, game features and updates that encourage players to buy content can represent an important part of a company’s business model (Hamari et al., 2017).
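As a rough illustration of the freemium logic just described, the following sketch gates premium features behind a paid tier while the basic service stays free. The tiers, feature names, and account structure are invented for illustration and do not describe any real platform.

```python
# Hypothetical sketch of a freemium (hybrid-subscription) service:
# basic features are free; extras require a paid upgrade.
from dataclasses import dataclass

FREE_FEATURES = {"post", "comment", "follow"}
PREMIUM_FEATURES = FREE_FEATURES | {"no_ads", "custom_badges", "analytics"}

@dataclass
class Account:
    username: str
    is_premium: bool = False  # flipped to True when the user pays a monthly fee

    def can_use(self, feature: str) -> bool:
        tier = PREMIUM_FEATURES if self.is_premium else FREE_FEATURES
        return feature in tier

free_user = Account("casual_teen")
subscriber = Account("power_user", is_premium=True)
print(free_user.can_use("no_ads"))   # False: feature sits behind the paywall
print(subscriber.can_use("no_ads"))  # True: unlocked by the subscription
```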


Data Monetization

Underlying most advertising models online is the use of information about consumers (e.g., their preferences, behavior, and characteristics) for commercial purposes. Some companies buy and sell data about their customers, but the more common practice is to use the information generated by internet traffic in product development and management (Ofulue and Benyoucef, 2022). An iterative process of collecting information about customers, sometimes called business intelligence, is not a new practice, but the extent to which the internet facilitates it raises ethical questions about the balance of power between buyer and seller.

If social media users were obliged to pay directly to use platforms, the correlation between a firm’s revenue and consumers’ valuation of its product would be stronger than when revenue depends on advertising. There is evidence of a gulf between social media users’ willingness to pay and a complementary indicator of valuation called willingness to accept, that is, the payment users would require to stop using the platform (Sunstein, 2020). A nationally representative survey found the median willingness to pay for Instagram to be $5 a month, but the same participants would require, at the median, $100 a month to stop using the platform (Sunstein, 2020). The same pattern was observed across social media platforms (Facebook, Snapchat, Reddit, Twitter, WhatsApp, YouTube). Such gaps indicate that people value social media but are unwilling to transfer cash for it. Similar studies have shown that only 5 percent of app users make any in-app purchases, partly because users consider the apps to be properly free of charge and view in-app sales as a form of unfair monetization (Salehudin and Alpert, 2022).
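The size of this gap can be expressed as a simple ratio. The short calculation below uses only the median figures quoted above from Sunstein (2020); it adds no new data.

```python
# Willingness-to-pay vs. willingness-to-accept for Instagram,
# using the medians reported by Sunstein (2020).
willingness_to_pay = 5       # median dollars per month users would pay to keep access
willingness_to_accept = 100  # median dollars per month demanded to give up the platform

gap_ratio = willingness_to_accept / willingness_to_pay
print(f"Median WTA is {gap_ratio:.0f} times the median WTP.")  # -> 20 times
```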

Data monetization allows consumers to pay for the apps and platforms they enjoy without transferring cash. The advertising revenue social media companies earn tends to track the time users spend on their platforms (Grover, 2022). Some research indicates that this business model would be difficult to change: even adult consumers have expressed reluctance to pay for social media (Holt and Morgeson, 2023). For adolescents, who could be expected to be more sensitive to fees, it is safe to assume willingness to pay would be even lower.

Data monetization drives targeted advertising, a practice that can be both efficient and intrusive. Better information about customers allows advertisers to spend more efficiently, companies to develop better products, and consumers to be exposed to more relevant messages. Television advertising, for example, targets commercials based on relatively crude assumptions about the overlap between market and audience demographics. The promise of social media advertising is that it obviates this imprecision and better delivers to advertisers their desired customer base.
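To illustrate the contrast with broadcast targeting, the sketch below filters a toy audience on the fine-grained attributes a platform might infer. The user records, attribute names, and criteria are all invented for illustration; real ad delivery systems are far more complex.

```python
# Hypothetical sketch of interest-based ad targeting (invented data):
# platforms can match ads to individual inferred attributes, whereas TV
# relies on coarse assumptions about a program's audience demographics.
users = [
    {"id": 1, "age": 16, "region": "OH", "interests": {"sneakers", "gaming"}},
    {"id": 2, "age": 34, "region": "CA", "interests": {"cooking"}},
    {"id": 3, "age": 17, "region": "OH", "interests": {"sneakers", "music"}},
]

def match_audience(users, min_age, max_age, required_interest):
    """Return the users whose inferred profiles fit an advertiser's criteria."""
    return [u for u in users
            if min_age <= u["age"] <= max_age
            and required_interest in u["interests"]]

# A sneaker ad aimed at teens reaches exactly the two matching profiles.
for user in match_audience(users, min_age=13, max_age=19, required_interest="sneakers"):
    print(user["id"])  # prints 1, then 3
```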


The trade-off for users of social media platforms is that companies collect information in ways that remain opaque to most people, partly because the terms of service described earlier in this chapter are widely ignored or poorly understood. When third parties benefit from information about people’s behavior and preferences—information collected and sold in circuitous or highly technical ways—there can be concerns about the ethics of the monetization practice (Fred, 2017).

Legal Constraints on Data and Privacy

Protection of users’ privacy online is the subject of growing public interest (Klosowski, n.d.; Rahnama and Pentland, 2022; World Bank, 2019). Social media companies in particular have access to wellsprings of personal data about their users, information that they could use to discriminate against customers or manipulate them in sophisticated ways (Acemoglu, 2021). The use of data to target advertising or to sell to data brokers raises questions as to how much information about one’s health, personal relationships, or work people would want to share and to what ends, to say nothing of related concerns about how safe such data may be from hackers. Because companies that amass these data do so to their financial advantage, there are concerns that the largest companies have an unfair advantage over their smaller competitors based only on their access to data (Berner, 2021; Bose, 2019).

The Federal Trade Commission and COPPA

Authority over unfair and deceptive practices in commerce is the purview of the Federal Trade Commission (FTC, 2021). The agency has a heightened consumer protection responsibility to children, codified by the Children’s Online Privacy Protection Act (COPPA), which seriously limits the amount of information online service providers can collect about children younger than 13 years old.1

A concern with the erosion of consumer privacy and companies’ obligations to be open about their data collection practices motivated the FTC to provide a series of guidance documents on internet advertising and marketing (FTC, 2000, 2008, 2013). The agency has also issued an advance notice of proposed rulemaking on commercial surveillance and data security (FTC, 2022b).

A more recent process has turned attention to stealth marketing to children, misleading advertising through influencers, and microtargeting of consumers (FTC, 2022a). The agency has recently taken advantage of its authority to require companies to provide written responses to questions about their practices and management, with inquiries on deceptive marketing and data collection on social media (FTC, 2020, 2021, 2023). When the agency determines that companies have acted unlawfully, it can issue fines, as it did in 2019 when it required Google and its subsidiary YouTube to pay $170 million for data collection in violation of COPPA (FTC, 2019).

___________________

1 Children’s Online Privacy Protection Act, 15 U.S.C. §§ 6501-6505 (2021).

COPPA recognizes that young children lack the capacity to consent to terms of service that allow for online data collection. This problem led to statutory provisions that prohibit the tracking and collection of personal information from children younger than 13 years without a parent’s explicit consent; under COPPA, parents can review and prevent further use of information that may already have been collected.2 The act specifically disallows the enticing of personal disclosures for prizes or as part of a game and sharply restricts advertising directed to children.

But COPPA’s legal protections do not extend to minors aged 13 and older. Proposed legislation aims to change this by extending COPPA protections through age 16.3 Other legislative proposals aim to extend protections to all minors and would require platforms to report on “foreseeable risks of harm” their products pose.4 State legislatures have also shown an increased interest in this topic. As Figure 2-3 shows, as of mid-2023 several states across the country had passed comprehensive privacy laws, some of which include specific privacy protections for children (Husch Blackwell, 2023).

FIGURE 2-3 State privacy legislation as of mid-2023.
SOURCE: IAPP, 2023. This “US State Privacy Legislation Tracker,” produced by the International Association of Privacy Professionals (IAPP), originally appeared in the IAPP Resource Center and is reprinted with permission. It is updated regularly; this version is from May 12, 2023. Any modifications made for accessibility are not those of the IAPP.

Navigating Disparate Legal Provisions

The myriad pending state and federal privacy laws share a common concern with giving people more control over the ways companies use their data (GDPR, 2023; Rahnama and Pentland, 2022). Questions about privacy can quickly become intertwined with larger concerns about social media affordances that encourage children and adolescents to spend more time on the platform—to companies’ financial benefit and the possible detriment of the young person (Yu et al., 2018).

An interest in age-appropriate design, a topic discussed more in Chapter 5, has developed in response to the growing apprehension that companies’ operations harm children and adolescents. For this reason, the California Age-Appropriate Design Code Act requires companies that provide online services for, or advertisements targeted to, anyone under age 18 to meet a more conservative standard of operations.5 Age-appropriate design generally shifts the onus of managing personal data from the young user to the tech company (Yu et al., 2018).

___________________

2 Children’s Online Privacy Protection Rule, 15 U.S.C. § 6502(b)(1)(B); 16 C.F.R. § 312.6(a) (2013).

3 Children and Teens’ Online Privacy Protection Act, S. 1628, 117th Cong., 2nd sess. (2022).

4 Kids Online Safety Act, S. 3663, 117th Cong., 2nd sess. (2022).

5 California Age-Appropriate Design Code Act, California Civil Code §§ 1798.99.28-1798.99.40.

But for social media companies, which operate in different states and countries, complying with disparate laws presents logistical challenges. What is more, differences in regulations among states or countries reflect widely varying social norms (Pfefferkorn, 2023). A recent Utah law that limits minors from using social media between 10:30 PM and 6:30 AM is seen by some as suitably protective of young people’s physical and mental health and by others as an intrusive step to isolate them (The Associated Press, 2023; Wen, 2023). Content moderation decisions regarding what is indecent, incendiary, or uncivil, aside from being famously difficult to discern, are also necessarily value judgments that will strike some people as overly restrictive, biased, or unfair (Fagan, 2020). Recent court decisions have recognized constitutional limits on the government’s ability to interfere with social media companies’ editorial choices.6

The unique power of technology companies in general, and social media companies in particular, to shape people’s experience of news, entertainment, and their social relationships has prompted questions about the influence of these companies’ products on their users, especially young users. Social media affordances and the way the platforms operate can have a unique influence on adolescents, an influence described in more detail in the next chapter.

REFERENCES

Acemoglu, D. 2021. Harms of AI. Department of Economics, Massachusetts Institute of Technology. https://economics.mit.edu/sites/default/files/publications/Harms%20of%20AI.pdf (accessed September 21, 2023).

Albert, D., J. Chein, and L. Steinberg. 2013. The teenage brain: Peer influences on adolescent decision making. Current Directions in Psychological Science 22:114-120.

Ali, M., P. Sapiezynski, M. Bogen, A. Korolova, A. Mislove, and A. Rieke. 2019. Discrimination through optimization: How Facebook’s ad delivery can lead to biased outcomes. Proceedings of the ACM Human–Computer Interaction. 3(CSCW):Article 199.

Ali, M., P. Sapiezynski, A. Korolova, A. Mislove, and A. Rieke. 2021. Ad delivery algorithms: The hidden arbiters of political messaging. Paper presented at Proceedings of the 14th ACM International Conference on Web Search and Data Mining, Virtual Event, Israel.

Alizadeh, M., E. Hoes, and F. Gilardi. 2023. Tokenization of social media engagements increases the sharing of false (and other) news but penalization moderates it. Scientific Reports 13(1):13703.

Allyn, B. 2022. Meta announces another drop in revenue. https://www.npr.org/2022/10/27/1132042031/meta-announces-another-drop-in-revenue (accessed September 21, 2023).

Altuwairiqi, M., E. Arden-Close, E. Bolat, L. Renshaw-Vuillier, and R. Ali. 2019. When people are problematically attached to social media: How would the design matter? Paper presented at IEEE International Conference on Systems, Man and Cybernetics, Bari, Italy.

___________________

6 Murthy v. Missouri, No. 23-411 (U.S. Oct. 20, 2023) (mem.); Moody v. NetChoice, LLC, No. 22-277 (U.S. Sept. 29, 2023) (mem.); NetChoice, LLC v. Paxton, No. 22-555 (U.S. Sept. 29, 2023) (mem.).


Alutaybi, A., E. Arden-Close, J. McAlaney, A. Stefanidis, K. Phalp, and R. Ali. 2019. How can social networks design trigger fear of missing out? Paper presented at IEEE International Conference on Systems, Man and Cybernetics, Bari, Italy.

Angwin, J., and T. Parris. 2016. Facebook lets advertisers exclude users by race. Propublica. https://www.propublica.org/article/facebook-lets-advertisers-exclude-users-by-race (accessed September 21, 2023).

Angwin, J., A. Tobin, and M. Varner. 2017. Facebook (still) letting housing advertisers exclude users by race. Propublica. https://www.propublica.org/article/facebook-advertising-discrimination-housing-race-sex-national-origin (accessed September 21, 2023).

Arguedas, A., C. T. Robertson, R. Fletcher, and R. K. Neilsen. 2022. Echo chambers, filter bubbles, and polarisation: A literature review. Reuters Institute for the Study of Journalism. Reuters Institute. https://reutersinstitute.politics.ox.ac.uk/echo-chambers-filter-bubbles-and-polarisation-literature-review (accessed September 21, 2023).

Arneson, R. 2018. Four conceptions of equal opportunity. The Economic Journal 128(612): F152-F173.

Auxier, B., L. Rainie, M. Anderson, A. Perrin, M. Kumar, and E. Turner. 2019. Americans and privacy: Concerned, confused and feeling lack of control over their personal information. https://www.pewresearch.org/internet/2019/11/15/americans-and-privacy-concerned-confused-and-feeling-lack-of-control-over-their-personal-information (accessed September 21, 2023).

Balaban, D. C., M. Mucundorfeanu, and L. I. Mureșan. 2022. Adolescents’ understanding of the model of sponsored content of social media influencer Instagram stories. 2022 10(1):12.

Berner, M. 2021. Research backed consumer services could solve our two biggest problems with big tech. Forbes. https://www.forbes.com/sites/forbestechcouncil/2021/09/13/research-backed-consumer-services-could-solve-our-two-biggest-problems-with-big-tech/?sh=6c99e499323e (accessed December 26, 2023).

Bhargava, H. 2023. Infinite scroll: Addiction by design in information platforms. Paper presented at Theory in Economics of Information Systems, March 31-April 2, San Francisco.

Bitrián, P., I. Buil, and S. Catalán. 2021. Enhancing user engagement: The role of gamification in mobile apps. Journal of Business Research 132:170-185.

Blumberg, F. C., J. L. Rice, and A. Dickmeis. 2016. Social media as a venue for emotion regulation among adolescents. In Emotions, technology, and social media, edited by S. Y. Tettegah. San Diego, CA: Elsevier Academic Press. Pp. 105-116.

Bose, N. 2019. House antitrust probe report likely by ‘first part’ of 2020. Reuters. https://www.reuters.com/article/us-tech-antitrust-idUSKBN1WX1WA (accessed December 26, 2023).

boyd, d. 2010. Social network sites as networked publics: Affordances, dynamics, and implications. In A networked self: Identity, community, and culture on social network sites, edited by Z. Papacharissi. New York: Routledge. Pp. 39-58.

Braams, B. R., A. C. K. van Duijvenoorde, J. S. Peper, and E. A. Crone. 2015. Longitudinal changes in adolescent risk-taking: A comprehensive study of neural responses to rewards, pubertal development, and risk-taking behavior. Journal of Neuroscience 35(18):7226-7238.

Brechwald, W. A., and M. J. Prinstein. 2011. Beyond homophily: A decade of advances in understanding peer influence processes. Journal of Research on Adolescence 21(1):166-179.

Brugnoli, E., M. Cinelli, W. Quattrociocchi, and A. Scala. 2019. Recursive patterns in online echochambers. Scientific Reports 9(1):20118.

Buchanan, C. M., J. S. Eccles, and J. B. Becker. 1992. Are adolescents the victims of raging hormones? Evidence for activational effects of hormones on moods and behavior at adolescence. Psychological Bulletin 111(1):62-107.

Caplan, R., L. Hanson, and J. Donovan. 2018. Dead reckoning: Navigating content moderation after “fake news”. Data & Society Research Institute.


Cara, C. 2019. Dark patterns in the media: A systematic review. Network Intelligence Studies 7(14).

Chancellor, S., Y. Kalantidis, J. A. Pater, M. D. Choudhury, and D. A. Shamma. 2017. Multimodal classification of moderated online pro-eating disorder content. Paper presented at Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, Denver, Colorado, USA.

Chandrasekharan, E., U. Pavalanathan, A. Srinivasan, A. Glynn, J. Eisenstein, and E. Gilbert. 2017. You can’t stay here: The efficacy of Reddit’s 2015 ban examined through hate speech. Proceedings of the ACM on Human-Computer Interaction 1(CSCW):1-22.

Computerphile. 2014. YouTube’s secret algorithm. https://www.youtube.com/watch?v=BsCeNCVb-d8 (accessed September 15, 2023).

Criddle, C. 2023. Google develops free terrorism moderation tool for smaller websites. Financial Times. https://www.ft.com/content/c2da6eb1-ba81-40c5-a411-dfc94ea280db (accessed January 12, 2023).

Dahl, R. E. 2004. Adolescent brain development: A period of vulnerabilities and opportunities. Keynote address. Annals of the New York Academy of Sciences. 1021:1-22.

Darcy, O. 2021. Social media algorithms to face scrutiny as lawmakers look to curb misinformation. CNN Business. https://www.cnn.com/2021/04/27/media/social-media-algorithms-reliable-sources/index.html (accessed September 21, 2023).

Das, S., A. D. I. Kramer, L. A. Dabbish, and J. I. Hong. 2014. Increasing security sensitivity with social proof: A large-scale experimental confirmation. Paper presented at Proceedings of the 2014 ACM SIGSAC Conference on Computer and Communications Security, Scottsdale, Arizona, USA.

Davison, B., T. Suel, N. Craswell, and B. Liu. 2010. Proceedings of the Third ACM International Conference on Web Search and Data Mining, WSDM 2010. New York, NY.

DeLamater, J., and W. N. Friedrich. 2002. Human sexual development. Journal of Sex Research 39:10-14.

Dobbrick, T., J. Jakob, C.-H. Chan, and H. Wessler. 2022. Enhancing theory-informed dictionary approaches with “glass-box” machine learning: The case of integrative complexity in social media comments. Communication Methods and Measures 16(4):303-320.

Dow, B. J., A. L. Johnson, C. S. Wang, J. Whitson, and T. Menon. 2021. The COVID-19 pandemic and the search for structure: Social media and conspiracy theories. Social and Personality Psychology Compass 15(9):e12636.

Duarte, N., E. Llanso, and A. Loup. 2017. Mixed messages?: The limits of automated social media content analysis. Center for Democracy & Technology. https://perma.cc/NC9B-HYKX (accessed September 21, 2023).

Dwoskin, E., and C. Timberg. 2021. Misinformation dropped dramatically the week after Twitter banned Trump and some allies. The Washington Post. https://www.washingtonpost.com/technology/2021/01/16/misinformation-trump-twitter (accessed September 21, 2023).

Edelman, G. 2020. Better than nothing: A look at content moderation in 2020. https://www.wired.com/story/content-moderation-2020-better-than-nothing (accessed September 21, 2023).

Eg, R., Ö. Demirkol Tønnesen, and M. K. Tennfjord. 2023. A scoping review of personalized user experiences on social media: The interplay between algorithms and human factors. Computers in Human Behavior Reports 9:100253.

Elkind, D., and R. Bowen. 1979. Imaginary audience behavior in children and adolescents. Developmental Psychology 15:38-44.

Eslami, M., A. Rickman, K. Vaccaro, A. Aleyasen, A. Vuong, K. Karahalios, K. Hamilton, and C. Sandvig. 2015. “I always assumed that I wasn’t really that close to [her]”: Reasoning about invisible algorithms in news feeds. Paper presented at Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, Seoul, Republic of Korea.


Espinoza, J., C. Criddle, and H. Murphy. 2023. EU tells Elon Musk to hire more staff to moderate Twitter. Ars Technica. https://arstechnica.com/tech-policy/2023/03/eu-tells-elon-musk-to-hire-more-staff-to-moderate-twitter/ (accessed September 21, 2023).

Evans, S. K., K. E. Pearce, J. Vitak, and J. W. Treem. 2017. Explicating affordances: A conceptual framework for understanding affordances in communication research. Journal of Computer-Mediated Communication 22(1):35-52.

Fagan, F. 2020. Optimal social media content moderation and platform immunities. European Journal of Law and Economics 50(3):437-449.

Fishman, B. 2023. Dual-use regulation: Managing hate and terrorism online before and after Section 230 reform. https://www.brookings.edu/research/dual-use-regulation-managing-hate-and-terrorism-online-before-and-after-section-230-reform (accessed September 21, 2023).

Foss-Solbrekk, K. 2021. Three routes to protecting AI systems and their algorithms under IP law: The good, the bad and the ugly. Journal of Intellectual Property Law & Practice 16(3):247-258.

Fred, J. 2017. Data monetization – how an organization can generate revenue with data? Tampere University of Technology. https://trepo.tuni.fi/bitstream/handle/123456789/24694/fred.pdf?sequence=4&isAllowed=y (accessed September 21, 2023).

Frenkel, S., R. Mac, and M. Isaac. 2021. Instagram struggles with fears of losing its ‘pipeline’: Young users. The New York Times. https://www.nytimes.com/2021/10/16/technology/instagram-teens.html (accessed December 27, 2023).

Friedman, N., and A. Miyake. 2017. Unity and diversity of executive functions: Individual differences as a window on cognitive structure. Cortex 86:186-204.

FTC (Federal Trade Commission). 2000. Advertising and marketing on the internet: Rules of the road. Bureau of Consumer Protection. https://www.ftc.gov/system/files/ftc_gov/pdf/bus28-rulesroad-2023_508.pdf (accessed December 27, 2023).

FTC. 2008. FTC approved new rule provision under the CAN-SPAM Act. https://www.ftc.gov/system/files/ftc_gov/pdf/bus28-rulesroad-2023_508.pdf (accessed December 27, 2023).

FTC. 2013. .com disclosures: How to make effective disclosures in digital advertising. https://www.ftc.gov/system/files/documents/plain-language/bus41-dot-com-disclosures-information-about-online-advertising.pdf (accessed December 27, 2023).

FTC. 2019. Google and YouTube will pay record $170 million for alleged violations of children’s privacy law. https://www.ftc.gov/news-events/news/press-releases/2019/09/google-youtube-will-pay-record-170-million-alleged-violations-childrens-privacy-law (accessed September 21, 2023).

FTC. 2020. FTC issues orders to nine social media and video streaming services seeking data about how they collect, use, and present information. https://www.ftc.gov/news-events/news/press-releases/2020/12/ftc-issues-orders-nine-social-media-video-streaming-services-seeking-data-about-how-they-collect-use (accessed September 21, 2023).

FTC. 2021. A brief overview of the Federal Trade Commission’s investigative, law enforcement, and rulemaking authority. https://www.ftc.gov/about-ftc/mission/enforcement-authority#top (accessed September 21, 2023).

FTC. 2022a. FTC proposes to strengthen advertising guidelines against fake and manipulated reviews. https://www.ftc.gov/news-events/news/press-releases/2023/06/federal-trade-commission-announces-proposed-rule-banning-fake-reviews-testimonials (accessed December 27, 2023).


FTC. 2022b. Trade regulation rule on commercial surveillance and data security. Federal Register 87(August 22):51273-51299.

FTC. 2023. FTC issues orders to social media and video streaming platforms regarding efforts to address surge in advertising for fraudulent products and scams. https://www.ftc.gov/news-events/news/press-releases/2023/03/ftc-issues-orders-social-media-video-streaming-platforms-regarding-efforts-address-surge-advertising (accessed September 21, 2023).

Furman, W., and L. B. Shomaker. 2008. Patterns of interaction in adolescent romantic relationships: Distinct features and links to other close relationships. Journal of Adolescence 31(6):771-788.

GDPR (General Data Protection Regulation). 2023. What is GDPR, the EU’s new data protection law? https://gdpr.eu/what-is-gdpr (accessed September 21, 2023).

George, G., M. Haas, and A. Pentland. 2014. Big data and management. Academy of Management Journal 57:321-326.

Giedd, J. N., A. Raznahan, A. Alexander-Bloch, E. Schmitt, N. Gogtay, and J. L. Rapoport. 2015. Child psychiatry branch of the National Institute of Mental Health longitudinal structural magnetic resonance imaging study of human brain development. Neuropsychopharmacology 40(1):43-49.

GIFCT (Global Internet Forum to Counter Terrorism). 2019. About our mission. https://perma.cc/44V5-554U (accessed September 21, 2023).

GIFCT. 2021a. GIFCT Technical Approaches Working Group: Executive summary. https://gifct.org/wp-content/uploads/2021/07/GIFCT-TAWG21-ExecSummary.pdf (accessed December 26, 2023).

GIFCT. 2021b. Technical approaches working group: Gap analysis and recommendations for deploying technical solutions to tackle the terrorist use of the internet. https://gifct.org/wp-content/uploads/2021/07/GIFCT-TAWG-2021.pdf (accessed December 26, 2023).

GIFCT. 2023. GIFCT’s hash-sharing database. https://gifct.org/hsdb (accessed September 21, 2023).

Gillespie, T. 2022. Do not recommend? Reduction as a form of content moderation. Social Media + Society 8(3):20563051221117552.

Gioia, F., V. Rega, and V. Boursier. 2021. Problematic internet use and emotional dysregulation among young people: A literature review. Clinical Neuropsychiatry: Journal of Treatment Evaluation 18(1):41-54.

Giordano, A. L., M. K. Schmit, and J. McCall. 2023. Exploring adolescent social media and internet gaming addiction: The role of emotion regulation. Journal of Addictions & Offender Counseling 44(1):69-80.

Gorwa, R. 2019. The platform governance triangle: Conceptualising the informal regulation of online content. Internet Policy Review 8(2).

Gorwa, R., R. Binns, and C. Katzenbach. 2020. Algorithmic content moderation: Technical and political challenges in the automation of platform governance. Big Data & Society 7(1):2053951719897945.

Greenwood, D., and C. Long. 2009. Mood specific media use and emotion regulation: Patterns and individual differences. Personality and Individual Differences 46:616-621.

Grover, S. 2022. Which ad revenue metrics should publishers track? September 2. https://www.adpushup.com/blog/which-ad-revenue-metrics-should-publishers-track/ (accessed October 10, 2023).

Guardian Staff and Agencies. 2022. Elon Musk declares Twitter ‘moderation council’—as some push the platform’s limits. The Guardian. https://www.theguardian.com/technology/2022/oct/28/elon-musk-twitter-moderation-council-free-speech (accessed December 24, 2023).


Guyer, A. E., J. D. Caouette, C. C. Lee, and S. K. Ruiz. 2014. Will they like me? Adolescents’ emotional responses to peer evaluation. International Journal of Behavioral Development 38(2):155-163.

Guyer, A. E., J. S. Silk, and E. E. Nelson. 2016. The neurobiology of the emotional adolescent: From the inside out. Neuroscience & Biobehavioral Reviews 70:74-85.

Guyer, A. E., S. J. Beard, and J. S. Venticinque. 2023. Brain development during adolescence and early adulthood. In APA handbook of adolescent and young adult development, APA Handbooks in Psychology®. Washington, DC, US: American Psychological Association. Pp. 21-37.

Hadavas, C. 2020. The future of free speech online may depend on this database. Slate. https://slate.com/technology/2020/08/gifct-content-moderation-free-speech-online.html (accessed December 24, 2023).

Hall, H. K., P. M. R. Millear, M. J. Summers, and B. Isbel. 2021. Longitudinal research on perspective taking in adolescence: A systematic review. Adolescent Research Review 6(2):125-150.

Hamari, J., K. Alha, S. Järvelä, J. M. Kivikangas, J. Koivisto, and J. Paavilainen. 2017. Why do players buy in-game content? An empirical study on concrete purchase motivations. Computers in Human Behavior 68:538-546.

Hartmann, P., P. Fernández, V. Apaolaza, M. Eisend, and C. D’Souza. 2021. Explaining viral CSR message propagation in social media: The role of normative influences. Journal of Business Ethics 173(2):365-385.

Haugen, F. 2021. Written testimony of Frances Haugen before the United States House of Representatives Committee on Energy and Commerce Subcommittee on Communications and Technology. https://docs.house.gov/meetings/IF/IF16/20211201/114268/HHRG-117-IF16-Wstate-HaugenF-20211201-U1.pdf (accessed September 21, 2023).

Hern, A. 2019. TikTok’s local moderation guidelines ban pro-LGBT content. https://www.theguardian.com/technology/2019/sep/26/tiktoks-local-moderation-guidelines-ban-pro-lgbt-content (accessed September 21, 2023).

Hollarek, M., and N. Lee. 2022. Current understanding of developmental changes in adolescent perspective taking. Current Opinion in Psychology 45:101308.

Holt, G. T. M., and V. Morgeson. 2023. Research: How people feel about paying for social media. Harvard Business Review. https://hbr.org/2023/04/research-how-people-feel-about-paying-for-social-media (accessed October 3, 2023).

Husch Blackwell. 2023. A comprehensive resource for tracking U.S. state children’s data privacy legislation. https://www.huschblackwell.com/2023-state-childrens-privacy-law-tracker (accessed September 21, 2023).

Huszti-Orban, K. 2018. Internet intermediaries and counter-terrorism: Between self-regulation and outsourcing law enforcement. NATO CCD COE Publications. https://ccdcoe.org/uploads/2018/10/Art-12-Internet-Intermediaries-and-Counter-Terrorism.-Between-Self-Regulation-and-Outsourcing-Law-Enforcement.pdf (accessed September 21, 2023).

IAPP (International Association of Privacy Professionals). 2023. Statute/bill in legislative process. Portsmouth, NH: IAPP.

Inman, S., and D. Ribes. 2019. “Beautiful seams”: Strategic revelations and concealments. Paper presented at the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, Scotland, UK.

Instagram. 2023a. Community guidelines. https://help.instagram.com/477434105621119?helpref=faq_content (accessed September 21, 2023).

Instagram. 2023b. How instagram uses artificial intelligence to moderate content. https://help.instagram.com/423837189385631 (accessed September 21, 2023).

Instagram. 2023c. Terms of use. https://help.instagram.com/581066165581870/?helpref=uf_share (accessed September 21, 2023).


Jacobs, A. Z., and H. Wallach. 2021. Measurement and fairness. Paper presented at Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency, Virtual Event, Canada.

Karagoel, I., and D. Nathan-Roberts. 2021. Dark patterns: Social media, gaming, and e-commerce. Proceedings of the Human Factors and Ergonomics Society Annual Meeting 65:752-756.

Kidron, B., A. Evans, J. Afia, J. Adler, H. Bowden-Jones, L. Hackett, A. Juj, A. Przybylski, R. Angharad, and Y. Scot. 2018. Disrupted childhood: The cost of persuasive design. https://5rightsfoundation.com/static/5Rights-Disrupted-Childhood.pdf (accessed September 15, 2023).

Kiel, E., and A. Kalomiris. 2015. Current themes in understanding children’s emotion regulation as developing from within the parent-child relationship. Current Opinion in Psychology 3:11-16.

Kirdemir, B., J. Kready, E. Mead, M. N. Hussain, and N. Agarwal. 2021. Examining video recommendation bias on YouTube. https://dl.acm.org/doi/abs/10.1007/978-3-030-80387-2_7 (accessed September 15, 2023).

Klosowski, T. n.d. How to protect your digital privacy. The Privacy Project. The New York Times. https://www.nytimes.com/guides/privacy-project/how-to-protect-your-digital-privacy (accessed September 21, 2023).

Kumar, V. 2014. Making “freemium” work. Harvard Business Review (May). https://hbr.org/2014/05/making-freemium-work (accessed September 21, 2023).

Lanteri, C. 2022. Improving social media content moderation: An examination of the flow of online conspiracy theories across varying platform models and policies. University of North Carolina. https://cdr.lib.unc.edu/concern/honors_theses/6h4413239 (accessed September 21, 2023).

Lee, A. Y., H. Mieczkowski, N. B. Ellison, and J. T. Hancock. 2022. The algorithmic crystal: Conceptualizing the self through algorithmic personalization on TikTok. Proceedings of the ACM Human–Computer Interaction. 6(CSCW2):Article 543.

Logrieco, G., M. R. Marchili, M. Roversi, and A. Villani. 2021. The paradox of TikTok anti-pro-anorexia videos: How social media can promote non-suicidal self-injury and anorexia. International Journal of Environmental Research and Public Health 18(3):1041.

Lovett, M. J., and R. Staelin. 2016. The role of paid, earned, and owned media in building entertainment brands: Reminding, informing, and enhancing enjoyment. Marketing Science 35(1):142-157.

Mahapatra, A. 2020. On the value of diversified recommendations. Meta. https://about.instagram.com/blog/engineering/on-the-value-of-diversified-recommendations (accessed September 21, 2023).

Marino, C., G. Gini, F. Angelini, A. Vieno, and M. M. Spada. 2020. Social norms and e-motions in problematic social media use among adolescents. Addictive Behaviors Reports 11.

Maza, M. T., K. A. Fox, S. J. Kwon, J. E. Flannery, K. A. Lindquist, M. J. Prinstein, and E. H. Telzer. 2023. Association of habitual checking behaviors on social media with longitudinal functional brain development. JAMA Pediatrics 177(2):160-167.

McPherson, M., L. Smith-Lovin, and J. M. Cook. 2001. Birds of a feather: Homophily in social networks. Annual Review of Sociology 27:415-444.

McSherry, C. 2020. Content moderation and the U.S. election: What to ask, what to demand. Electronic Frontier Foundation. https://www.eff.org/deeplinks/2020/10/content-moderation-and-us-election-what-ask-what-demand (accessed September 21, 2023).

Meeus, W., J. Iedema, M. Helsen, and W. Vollebergh. 1999. Patterns of adolescent identity development: Review of literature and longitudinal analysis. Developmental Review 19:419-461.

Meta. 2022. Terms of service. https://m.facebook.com/legal/terms (accessed September 21, 2023).


Meta. 2023a. Detecting violations. https://transparency.fb.com/enforcement/detecting-violations (accessed September 29, 2023).

Meta. 2023b. How meta enforces its policies. Meta. https://transparency.fb.com/enforcement (accessed September 21, 2023).

Microsoft. 2017. Facebook, Microsoft, Twitter and YouTube provide update on Global Internet Forum to Counter Terrorism. https://blogs.microsoft.com/on-the-issues/2017/12/04/facebook-microsoft-twitter-and-youtube-provide-update-on-global-internet-forum-to-counter-terrorism (accessed September 21, 2023).

Mills, K. L., A.-L. Goddings, M. M. Herting, R. Meuwese, S.-J. Blakemore, E. A. Crone, R. E. Dahl, B. Güroğlu, A. Raznahan, E. R. Sowell, and C. K. Tamnes. 2016. Structural brain development between childhood and adulthood: Convergence across four longitudinal samples. NeuroImage 141:273-281.

Mislove, A., B. Viswanath, K. P. Gummadi, and P. Druschel. 2010. You are who you know: Inferring user profiles in online social networks. Paper presented at the Third International Conference on Web Search and Data Mining. WSDM, February 4-6.

Mostert, F., and A. Urbelis. 2021. Social media platforms must abandon algorithmic secrecy. Financial Times. https://www.ft.com/content/39d69f80-5266-4e22-965f-efbc19d2e776 (accessed September 21, 2023).

Mulligan, D. K., J. A. Kroll, N. Kohli, and R. Y. Wong. 2019. This thing called fairness: Disciplinary confusion realizing a value in technology. Proceedings of the ACM on Human-Computer Interaction 3(CSCW):Article 119.

Murthy, D. 2021. Evaluating platform accountability: Terrorist content on YouTube. American Behavioral Scientist 65(6):800-824.

NASEM (National Academies of Sciences, Engineering, and Medicine). 2019. The promise of adolescence: Realizing opportunity for all youth. Washington, DC: The National Academies Press. https://doi.org/10.17226/25388.

Nelson, E. E., J. M. Jarcho, and A. E. Guyer. 2016. Social re-orientation and brain development: An expanded and updated view. Developmental Cognitive Neuroscience 17:118-127.

Neumann, P. R. 2013. Options and strategies for countering online radicalization in the United States. Studies in Conflict & Terrorism 36(6):431-459.

Obar, J. A., and A. Oeldorf-Hirsch. 2020. The biggest lie on the internet: Ignoring the privacy policies and terms of service policies of social networking services. Information, Communication & Society 23(1):128-147.

Ofulue, J., and M. Benyoucef. 2022. Data monetization: Insights from a technology-enabled literature review and research agenda. Management Review Quarterly. https://doi.org/10.1007/s11301-022-00309-1.

Pfefferkorn, R. 2023. Prepared remarks on U.S. legal considerations for children’s online safety policy. https://cyberlaw.stanford.edu/blog/2023/03/prepared-remarks-us-legal-considerations-childrens-online-safety-policy (accessed September 21, 2023).

Pfeifer, J. H., C. L. Masten, L. A. Borofsky, M. Dapretto, A. J. Fuligni, and M. D. Lieberman. 2009. Neural correlates of direct and reflected self-appraisals in adolescents and adults: When social perspective-taking informs self-perception. Child Development 80(4):1016-1038.

Radesky, J., Y. R. Chassiakos, N. Ameenuddin, D. Navsaria, and Council on Communications and Media. 2020. Digital advertising to children. Pediatrics 146(1).

Rahnama, H., and A. Pentland. 2022. The new rules of data privacy. Harvard Business Review, https://hbr.org/2022/02/the-new-rules-of-data-privacy (accessed September 21, 2023).

Rhodes, S. C. 2022. Filter bubbles, echo chambers, and fake news: How social media conditions individuals to be less critical of political misinformation. Political Communication 39(1):1-22.


Ribeiro, M. H., S. Jhaver, S. Zannettou, J. Blackburn, G. Stringhini, E. D. Cristofaro, and R. West. 2021. Do platform migrations compromise content moderation? Evidence from r/The_Donald and r/Incels. Proceedings of the ACM Human–Computer Interaction. 5(CSCW2):Article 316.

Richler, J. 2023. Social pressure to share fake news. Nature Reviews Psychology 2(5):265.

Rideout, V., A. Peebles, S. Mann, and M. B. Robb. 2021. The common sense census: Media use by tweens and teens. San Francisco, CA: Common Sense.

Roberts, S. T. 2016. Commercial content moderation: Digital laborers’ dirty work. In The intersectional internet: Race, sex, class, and culture online, edited by S. U. Noble and B. Tynes. Peter Lang Publishing.

Robins-Early, N. 2022. There’s a reason thousands of people take quack cures for Covid. The Guardian. https://www.theguardian.com/commentisfree/2022/feb/17/quacks-cashed-in-world-quick-fix-covid-ivermectin-social-media-conspiracy-theories (accessed December 24, 2023).

Ryan, J., and A. White. 2021. Whistle-blower decries Facebook’s ‘free pass’ for bad behavior. Bloomberg. https://www.bloomberg.com/news/articles/2021-10-05/facebook-insider-shows-morally-bankrupt-company-senators-say#xj4y7vzkg (accessed September 21, 2023).

Salehudin, I., and F. Alpert. 2022. To pay or not to pay: Understanding mobile game app users’ unwillingness to pay for in-app purchases. Journal of Research in Interactive Marketing 16:633-647.

Sanak-Kosmowska, K. 2021. Evaluating social media marketing: Social proof and online buyer behaviour. London: Routledge. https://doi.org/10.4324/9781003128052.

Sapiezynski, P. 2023. Individual and societal effects of ad delivery algorithms. Paper presented at Assessment of the Impact of Social Media on the Health and Wellbeing of Children and Adolescents Meeting 2, Washington DC. February 6. https://www.nationalacademies.org/event/02-06-2023/assessment-of-the-impact-of-social-media-on-the-health-and-wellbeing-of-adolescents-and-children-meeting-2 (accessed January 12, 2024).

Sherman, L. E., L. M. Hernandez, P. M. Greenfield, and M. Dapretto. 2018. What the brain ‘likes’: Neural correlates of providing feedback on social media. Social Cognitive and Affective Neuroscience 13(7):699-707.

Shulman, E. P., A. R. Smith, K. Silva, G. Icenogle, N. Duell, J. Chein, and L. Steinberg. 2016. The dual systems model: Review, reappraisal, and reaffirmation. Developmental Cognitive Neuroscience 17:103-117.

Silverman, M. H., K. Jedd, and M. Luciana. 2015. Neural networks involved in adolescent reward processing: An activation likelihood estimation meta-analysis of functional neuroimaging studies. NeuroImage 122:427-439.

Silvers, J. A. 2022. Adolescence as a pivotal period for emotion regulation development. Current Opinion in Psychology 44:258-263.

Spear, L. P. 2011. Rewards, aversions and affect in adolescence: Emerging convergences across laboratory animal and human data. Developmental Cognitive Neuroscience 1(4):390-403.

Stecklow, S. 2018. Why Facebook is losing the war on hate speech in Myanmar. In Inside Facebook’s Myanmar Operation: Hatebook, edited by P. Hirschberg. Reuters. https://www.reuters.com/investigates/special-report/myanmar-facebook-hate (accessed September 21, 2023).

Steinberg, L. 2010. A social neuroscience perspective on adolescent risk-taking. In Biosocial theories of crime, edited by K. M. Beaver and A. Walsh. London: Routledge.

Steinberg, L., and A. S. Morris. 2001. Adolescent development. Annual Review of Psychology 52:82-110.


Stevenson, A. 2018. Facebook admits it was used to incite violence in Myanmar. The New York Times. https://www.nytimes.com/2018/11/06/technology/myanmar-facebook.html (accessed September 21, 2023).

Stibe, A., and H. Oinas-Kukkonen. 2014. Using social influence for motivating customers to generate and share feedback. Lecture Notes in Computer Science 8462:224-235. https://doi.org/10.1007/978-3-319-07127-5_19.

Su, C., H. Zhou, C. Wang, F. Geng, and Y. Hu. 2021. Individualized video recommendation modulates functional connectivity between large scale networks. Human Brain Mapping 42(16):5288-5299.

Suleiman, A. B., A. Galván, K. P. Harden, and R. E. Dahl. 2017. Becoming a sexual being: The ‘elephant in the room’ of adolescent brain development. Developmental Cognitive Neuroscience 25:209-220.

Sunstein, C. R. 2020. Valuing Facebook. Behavioural Public Policy 4(3):370-381.

Swire-Thompson, B., and D. Lazer. 2020. Public health and online misinformation: Challenges and recommendations. Annual Review of Public Health 41(1):433-451.

The Associated Press. 2023. Utah’s new social media law means children will need approval from parents. https://www.npr.org/2023/03/24/1165764450/utahs-new-social-media-law-means-children-will-need-approval-from-parents (accessed September 21, 2023).

Thune, J. 2021. It’s time Congress pulled back the curtain on social media algorithms. CNN. https://www.cnn.com/2021/11/11/opinions/congress-algorithms-filter-bubble-transparency-act-thune/index.html (accessed September 21, 2023).

TikTok. 2021. An update on our work to safeguard and diversify recommendations. https://newsroom.tiktok.com/en-us/an-update-on-our-work-to-safeguard-and-diversify-recommendations (accessed September 21, 2023).

TikTok. 2023a. Our approach to content moderation. https://www.tiktok.com/transparency/en-us/content-moderation/ (accessed September 21, 2023).

TikTok. 2023b. Terms of service. https://www.tiktok.com/legal/page/us/terms-of-service/en (accessed December 27, 2023).

Treem, J. W., and P. M. Leonardi. 2012. Social media use in organizations: Exploring the affordances of visibility, editability, persistence, and association. Annals of the International Communication Association 36(1):143-189.

Turow, J., Y. Lelkes, N. A. Draper, and A. E. Waldman. 2023. Americans can’t consent to companies’ use of their data: They admit they don’t understand it, say they’re helpless to control it, and believe they’re harmed when firms use their data—making what companies do illegitimate. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4391134 (accessed September 21, 2023).

Twitter. 2023. About offensive content. https://help.twitter.com/en/safety-and-security/offensive-tweets-and-content (accessed September 21, 2023).

U.S. Congress, House of Representatives, and Committee on Homeland Security. 2019. Examining social media companies’ efforts to counter online terror content and misinformation. 116th Cong., 1st Sess. June 26.

van der Molen, P. C. M. 1994. Cognitive psychophysiology: A window to cognitive development and brain maturation. In Human Behavior and the Developing Brain. New York, NY, US: The Guilford Press. Pp. 456-490.

van Reijmersdal, E. A., and S. van Dam. 2020. How age and disclosures of sponsored influencer videos affect adolescents’ knowledge of persuasion and persuasion. Journal of Youth and Adolescence 49(7):1531-1544.

Venticinque, J. S., R. Chahal, S. J. Beard, R. A. Schriber, P. D. Hastings, and A. E. Guyer. 2021. Neural responses to implicit forms of peer influence in young adults. Social Neuroscience 16(3):327-340.


Vogels, E., R. Gelles-Watnick, and N. Massarat. 2022. Teens, social media and technology 2022. Pew Research Center. https://www.pewresearch.org/internet/2022/08/10/teens-social-media-and-technology-2022 (accessed December 26, 2023).

Waddell, K. 2021. Tech companies too secretive about algorithms that curate feeds, study says. Consumer Reports. https://www.consumerreports.org/consumer-protection/tech-companies-too-secretive-about-algorithms-that-curate-feeds-a8134259964 (accessed September 21, 2023).

Wadsley, M., and N. Ihssen. 2023. A systematic review of structural and functional MRI studies investigating social networking site use. Brain Sciences 13(5):787.

Wahlstrom, D., P. Collins, T. White, and M. Luciana. 2010. Developmental changes in dopamine neurotransmission in adolescence: Behavioral implications and issues in assessment. Brain and Cognition 72(1):146-159.

Warrick, J. 2016. The ‘app of choice’ for jihadists: ISIS seizes on internet tool to promote terror. The Washington Post. https://www.washingtonpost.com/world/national-security/theapp-of-choice-for-jihadists-isis-seizes-on-internet-tool-to-promote-terror/2016/12/23/a8c348c0-c861-11e6-85b5-76616a33048d_story.html (accessed September 21, 2023).

Wen, L. S. 2023. Utah is limiting kids’ social media access. Other states should follow. The Washington Post, April 11. https://www.washingtonpost.com/opinions/2023/04/11/utah-social-media-law-other-states (accessed February 24, 2024).

White House. 2022. Readout of White House listening session on tech platform accountability. https://www.whitehouse.gov/briefing-room/statements-releases/2022/09/08/readout-of-white-house-listening-session-on-tech-platform-accountability (accessed December 24, 2023).

Wiafe, I., F. N. Koranteng, E. Owusu, A. O. Ekpezu, and S. A. Gyamfi. 2020. Persuasive social features that promote knowledge sharing among tertiary students on social networking sites: An empirical study. Journal of Computer Assisted Learning 36(5):636-645.

World Bank. 2019. Practitioner’s guide: Data protection and privacy laws. https://id4d.worldbank.org/guide/data-protection-and-privacy-laws (accessed October 3, 2023).

Yau, J. C., and S. M. Reich. 2018. Are the qualities of adolescents’ offline friendships present in digital interactions? Adolescent Research Review 3(3):339-355.

YouTube. 2023a. How content ID works. https://support.google.com/youtube/answer/2797370?hl=en (accessed September 21, 2023).

YouTube. 2023b. How does YouTube manage harmful content? https://www.youtube.com/howyoutubeworks/our-commitments/managing-harmful-content (accessed September 21, 2023).

Yu, J., M. Stoilova, and S. Livingstone. 2018. Regulating children’s data and privacy online: The implications of the evidence for age-appropriate design. https://blogs.lse.ac.uk/medialse/2018/11/01/regulating-childrens-data-and-privacy-online-the-implications-of-the-evidence-for-age-appropriate-design (accessed September 21, 2023).

Zelazo, P. D., and S. M. Carlson. 2012. Hot and cool executive function in childhood and adolescence: Development and plasticity. Child Development Perspectives 6(4):354-360.

Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×

This page intentionally left blank.

Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 31
Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 32
Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 33
Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 34
Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 35
Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 36
Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 37
Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 38
Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 39
Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 40
Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 41
Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 42
Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 43
Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 44
Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 45
Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 46
Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 47
Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 48
Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 49
Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 50
Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 51
Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 52
Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 53
Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 54
Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 55
Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 56
Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 57
Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 58
Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 59
Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 60
Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 61
Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 62
Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 63
Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 64
Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 65
Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 66
Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 67
Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 68
Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 69
Suggested Citation:"2 How Social Media Work." National Academies of Sciences, Engineering, and Medicine. 2024. Social Media and Adolescent Health. Washington, DC: The National Academies Press. doi: 10.17226/27396.
×
Page 70