
7

Online Harassment

Previous chapters in this report have described how digital media and their affordances can influence adolescents’ response to stressors and how the affordances themselves can become stressors. The use of digital technology can also facilitate a novel type of problem, often by providing an anonymity that emboldens perpetrators. Behavior that would be unacceptable in person can flourish in digital worlds.

Part of the harm associated with social media lies in the harassment some young people experience. These forms of harassment run from the relatively common, as in cyberbullying, to the rare but serious, as in child sexual exploitation. This chapter sets out some steps that social media companies and the federal government could take against digital harassment, thereby reducing the harms associated with exposure to social media.

CYBERBULLYING

The public and connected affordances of social media, much like a school yard or school bus, enable bullying to occur before an audience of bystanders or even encouragers, though the number of bystanders online has almost no limit. The size of the social network and the anonymity of the participants can both influence the likelihood of people intervening to stop the harassment, though not in ways that are easy to predict (Machackova, 2020; You and Lee, 2019). The significance of peer feedback and social norms for adolescent development, along with a tendency during this life stage to conceive of oneself as the center of others’ attention, sometimes called an imaginary audience (Elkind and Bowen, 1979), makes adolescents particularly vulnerable to cyberbullying.

Monitoring young people’s safety online is, at its heart, a matter of trade-offs between protection and autonomy. For example, the efforts of teachers and the school system to monitor online harassment among students have been criticized as an invasion of privacy and as more often used for disciplining students than for helping them (Laird et al., 2022). Furthermore, bullying, whether online or in person, is fundamentally a matter of group dynamics (DeSmet et al., 2018). There is good evidence that digital learning tools can be harnessed to alter these dynamics, changing the behavior of bystanders and even improving coping skills, though such interventions are more effective for face-to-face interactions than for cyberbullying (Chen et al., 2023). Such tools may nevertheless have promise for reducing the overall burden of victimization. Research from England found in-person bullying to be vastly more common than cyberbullying (Przybylski and Bowes, 2017).

There is also some suggestion that social media companies can take policy measures to limit bullying, although it is not always clear if or how the companies could do this in a way that does not amount to an overly simplified blaming strategy (Milosevic, 2017). It is also not clear what types of interventions from the companies might best advance the end goal of helping young people build resilience or learn to navigate social processes. Social media platforms may have an incentive to overcorrect in the interest of protecting their reputation (Milosevic, 2017). Additionally, while teens tend to view their parents’ efforts to control online harassment favorably, they have a far less positive impression of the efforts of their teachers, law enforcement, social media companies, or elected officials (see Figure 7-1) (Vogels, 2022).

Digital harassment may be difficult for schools or social media companies to police alone given larger cultural forces at play. A recent survey of American adults found over 40 percent have experienced offensive name calling, stalking, physical threats and other forms of harassment online, with a majority saying they find such harassment to be a major problem (Vogels, 2021). A similar survey among teens found that almost half have experienced some form of cyberbullying (see Figure 7-2) (Vogels, 2022). A recent systematic review of the experience of Black Americans on social media found some suggestion that Black teens experience considerably more bullying on social media, though other studies in the review suggested the problem was worse among White teens. There was some concern that social media can impede Black people’s sense of well-being through exposure to disturbing images of racial discrimination (Park et al., 2023).

FIGURE 7-1 Percentage of respondents ages 13 to 17 who have favorable or unfavorable impressions of the efforts various authority figures are taking to stop online harassment and bullying.
SOURCE: Vogels, 2022.
NOTE: Teens are those aged 13 to 17. Responses of “excellent or good job” and of “only fair or poor job” are combined. Those who did not give an answer are not shown.
FIGURE 7-2 Percentage of respondents ages 13 to 17 who say they have ever experienced cyberbullying when online or on their cell phone.
SOURCE: Vogels, 2022.
NOTE: Teens are those aged 13 to 17. Those who did not give an answer are not shown.

Among adults, political name calling on social media is cited as a starting point for digital harassment (Vogels, 2021). Research has raised the concern that the normalization of a “mob vigilante” mindset may be a more widespread phenomenon of which teen bullying is just one piece (Milosevic, 2017).

Among gamers, reports of harassment are even more common (see Figure 7-3). This is not surprising; the experience and enjoyment of playing video games involve, by definition, winners and losers, and with them some potential for arguments and hurt feelings. These platforms can also harbor serious forms of toxic content; a recent survey found that 10 percent of teenage gamers have encountered white supremacist ideology while gaming and 7 percent of adults on the platforms have been exposed to Holocaust denial while playing (ADL, 2021).

Qualitative research among female gamers suggests that sexual harassment is an unfortunately common feature of their experience, in part because of the continued stigma against women participating in a stereotypically male activity (Kuss et al., 2022; McLean and Griffiths, 2019). Such harassment is cause for particular concern given gaming’s association with hypermasculine aggression (Lorenz and Browning, 2020; Mortensen, 2016). Games may portray female characters with highly sexualized avatars, a feature that has been shown to influence adolescents to accept sexual harassment and stereotyped, hostile narratives about sexual assault, and this perpetuates the exclusion of girls and women as participants (Driesmans et al., 2014; Lynch et al., 2016). Although the number of female primary characters in games has increased in recent decades, female characters are still more often secondary and more sexualized than primary characters (Lynch et al., 2016).

FIGURE 7-3 Percentage of respondents ages 13 to 17 who have experienced the listed form of disruptive behavior in online multiplayer games in the last 6 months.
SOURCE: ADL, 2021. Reprinted with permission from “Hate is no game: Harassment and positive social experiences in online games 2021” (New York: Anti-Defamation League, 2021), www.adl.org. All rights reserved.

As with harassment on social networking sites, it is hard to say how gaming platforms should handle this problem. Industry research indicates that only 1 percent of users are consistently toxic and that these players account for 5 percent of harassment incidents (Pappas, 2023; Robinson, 2018). While no one would object to banning these players, doing so could not be expected to dramatically improve a game’s overall environment. Context and situation are important determinants of trolling behavior online, which makes banning players a flawed strategy. As a 2017 study concluded, “Not only are some banned users likely to be ordinary users just having a bad day, but [banning] does little to curb…situational trolling” (Cheng et al., 2017).

Asking players to flag and report offensive content is one strategy to combat harassment, but industry estimates suggest that only between 10 and 15 percent of these reports reflect genuine harassment (Pappas, 2023). What is more, the most insidious forms of harassment are the sort that would not be noticed or reported: Young gamers who are being groomed for sexual predation or incited to radicalization do not, almost by definition, realize that there is something wrong with the behavior to which they are being exposed (Pappas, 2023).

This puts the companies in a difficult position. If neither deplatforming users nor relying on firsthand reports is effective, their best strategies to forestall harassment involve some invasion of privacy, most likely through using machine learning to identify suspect interactions and some combination of human and automated tools to follow up. But collecting information about children potentially subjects companies to liability under the Children’s Online Privacy Protection Act (COPPA). Monitoring children’s messaging or accounts, even if only rarely and with the best intentions, could be even riskier and something that most companies might be disinclined to do.

Any remedies a company takes to surface information about bullying, harassment, or incitement to radicalization could expose it to privacy liability. But sometimes the severity of the risk to society, especially a risk to children, warrants an invasion of privacy. Mandatory reporting laws, whether requiring disclosure to public health or law enforcement authorities, recognize that service of the public good sometimes overrides important concerns such as patient confidentiality (Geiderman and Marco, 2020). The delicate balancing of individual risks against societal benefits is not something that any company should have to manage for itself and is a topic revisited later in this chapter.

SEXUAL OFFENSES

Social media can be a means to perpetrate sexual offenses against minors (Bergman, 2023). The people who commit these crimes benefit from the anonymity and easy access to potential victims the platforms provide (Bleakley et al., 2023). These crimes cover a range of severity and are occurring at a pace that has so far exceeded society’s effort to protect children and support victims and law enforcement’s capacity to respond (Bursztein et al., 2019).

Cyberflashing

Cyberflashing, the electronic transmission of sexually explicit photos without the recipient’s request, is one troubling manifestation of online harassment (Miller, 2021). The practice can occur through direct messaging features of apps or on social media. Bypassing social media altogether, short-range wireless transmission such as AirDrop can be used to send images anonymously in crowded places (Freeman, 2020). The anonymity of such transmission complicates the government’s ability to take action against it. In Maryland, for example, flashing is punishable by up to 3 years in prison but only if it happens in person (Gaskill, 2023). In California, victims can sue for civil damages,1 while Texas passed a law criminalizing cyberflashing.2 These policies are indicative of wide support for combating the problem in Congress and in state legislatures, but prosecution is challenging partly because the offense is rarely reported (Lima and Schaffer, 2022; Miller, 2021; Wang, 2023).

Estimates of the frequency of cyberflashing, even among adults, are relatively scarce (Freeman, 2020; Miller, 2021; Salerno-Ferraro et al., 2022). A survey in the UK found that nearly half of women aged 18 to 24 years have received unsolicited lewd photographs (Smith, 2018). A recent, smaller study of Canadian college women, most of them 18 or 19 years old, found that almost a quarter had received unsolicited nude pictures online or on their phones; more than half had experienced some form of online sexual harassment. Among those who had experienced sexual harassment, propositioning, or flashing, more than 6 percent reported the first incident happening between the ages of 12 and 14 (Salerno-Ferraro et al., 2022).

___________________

1 California Civil Code § 1708.88.

2 Texas Penal Code Annotated § 21.19.

Less has been published about the online sexual harassment of minors. In 2020, a British survey of adolescents aged 12 to 18 years found that 37 percent of girls and 20 percent of boys had received sexual photos or videos online, often from adult strangers (Ringrose et al., 2021a). Focus group participants reported feelings of shame and disgust elicited by such harassment but also reported doing nothing in response, either from embarrassment, not knowing what action to take, or worrying that reporting might aggravate the perpetrator (Ringrose et al., 2021a). Focus group data from adolescents aged 12 to 19 suggests that young people are disinclined to tell their parents about their experiences out of concern that it would be upsetting to the parents or that they would be advised to quit the platforms the images were sent through (Mishna et al., 2023).

Reporting unwanted images is also logistically complicated. The sender’s identity is often not obvious from the images themselves (Ringrose et al., 2021b). On platforms such as Snapchat, saving an image to report it to a trusted adult triggers a screenshot notification to the sender, an action that could be interpreted as encouragement or, in the case of a report, make it obvious who the aggrieved reporter was (Ringrose et al., 2021b).

The receipt of unsolicited sexual images or videos is distressing in itself; it is also associated with sexual coercion. The receipt of unwanted sexual images is often accompanied by requests for intimate photos in return (Mishna et al., 2023). A survey of cisgender teenage girls who had sexted in the previous year found that over 70 percent had sent a sexual image in response to their partners’ coercion (Bragard and Fisher, 2022).

Qualitative research indicates that blocking users can be met with manipulative or victim-blaming responses, especially if the request comes from an offline acquaintance or friend (Ringrose et al., 2021b).

Cyberflashing is rarely untraceable. Regardless of how long the content is visible to the victim or the username from which it was sent, the platform would typically have a record of the shared illicit content (Meta, 2023; TikTok, 2023). Given the infrequency with which this crime is reported, however, perpetrators may continue to have online access to youth without consequences.

Grooming and the Sexual Extortion of Minors

The capacity of social media to connect minors with strangers and in spaces designed for adults introduces vulnerability to other forms of sexual abuse or unwanted contact. A nationally representative 2018 survey of young adults aged 18 to 28 found that 22.5 percent (standard error 1.2 percent) had experienced some form of online sexual solicitation in childhood, 10.3 percent (standard error 0.8 percent) had been threatened or coerced for sexual images, and 3.1 percent (standard error 0.5 percent) had experienced revenge pornography, the nonconsensual sharing of images intended to hurt the target (Finkelhor et al., 2022). The same data indicate that adolescents of high school age make up the majority of targets, with female and transgender young people being more vulnerable than males (Finkelhor et al., 2022). Perpetrators were often known to their victims offline and, though the age of these offenders is often unknown, many were themselves minors (Finkelhor et al., 2022).

Puberty and sexual maturation are central experiences of adolescence and carry with them an increasing interest in sex. Coupled with immature cognitive control processes (e.g., heightened pleasure from risk taking and limited ability to think through consequences or delay gratification) and heightened identity experimentation, this makes adolescence an especially vulnerable time for interactions with people who appear to be possible romantic partners.

Presenting as a potential friend or romantic partner can be a strategy used in grooming, “a process to gain, persuade, and engage a child in sexual activity where the internet is used as a medium for access” (Borj et al., 2023). Grooming often involves deception, such as misrepresenting one’s age or claiming mutual friends. It can also involve the adult providing emotional support, sympathy, and even gifts (Calvete et al., 2022). Such behaviors prey on teens’ natural curiosity about sex as well as their impulsivity and need for acceptance (Whittle et al., 2013).

It is difficult to comment on the frequency of grooming for obvious reasons; grooming is, by design, covert. A recent survey of undergraduates found that over 20 percent recalled interactions as minors that met criteria for online grooming; 38 percent of these young people (about 8 percent of the total sample) eventually met the adult in person, and a sizable majority of those who made in-person contact (68 percent, or roughly 5 percent of the total sample) eventually had sex with the adult (Greene-Colozzi et al., 2020). This study is consistent with other research indicating that most youth are resistant to grooming and that those who initially engage find ways to separate (Whittle et al., 2013). But for those who do not, often socially isolated young people with low self-esteem, the consequences can be devastating (Whittle et al., 2013).

One consequence is sextortion, a form of blackmail in which the threat of exposing intimate images is used to extract money, additional images, or sex (National Center for Missing & Exploited Children, 2023). A nationally representative survey of middle and high school students in the United States estimated that 5 percent had been victims and 3 percent perpetrators of sextortion (Patchin and Hinduja, 2020). Since then, the National Center for Missing & Exploited Children has reported an alarming spike in reports of sextortion, which more than doubled between 2019 and 2021 (National Center for Missing & Exploited Children, 2023). The Federal Bureau of Investigation’s most recent records contain almost 15,000 reports of sextortion (FBI, 2021). Perpetrators of sextortion usually have multiple victims; the Department of Justice estimates that there are more victims per offender in sextortion cases than in any other type of child sexual exploitation (Jurecic et al., 2016; Wittes et al., 2016). Some research from Europe indicates that teenagers who are transgender and gender nonconforming may be at increased risk of being pressured into sending sexual images (Van Ouytsel et al., 2020).

The Sexual Abuse of Minors

Social media can also facilitate egregious forms of sexual exploitation (O’Brien and Li, 2020). Research among teenagers who were sexually abused found that almost 6 percent met their abuser online; this digital component was associated with recurrent, violent abuse by multiple assailants (Say et al., 2015). While there is little hard evidence on the prevalence of child sex trafficking dependent on the internet, there is reason for concern (Gezinski and Gonzalez-Pons, 2022). The National Center for Missing & Exploited Children has received an exponentially growing number of reports of child sexual abuse online since it started collecting them, many from internet service providers, reaching 9.6 million reports in the last year alone (Bursztein et al., 2019) (see Figure 7-4). Whether this increase is the result of a true change in the perpetration of the crime or improved reporting is impossible to say, however.

Recent public attention to this problem has highlighted the role social media companies can play in child sexual abuse. In June 2023, a Wall Street Journal investigation found that Instagram’s recommendation algorithms promoted child sexual abuse material and were being used to connect pedophiles (Horowitz and Blunt, 2023). The investigation made clear that the social media platforms were aware of the child sexual abuse material they were hosting. A series of unambiguously graphic hashtags was accompanied by warnings that “these results may contain images of child sexual abuse,” material that causes “extreme harm to children,” but users were then given the option to “see results anyway” (Horowitz and Blunt, 2023).

Social media companies are aware of the harms their platforms can facilitate. Meta, the parent company of Facebook, Instagram, and WhatsApp, accounts for 85 percent of child pornography reports filed with the National Center for Missing & Exploited Children (Horowitz and Blunt, 2023). The end-to-end encryption that the companies value as a means of assuring privacy can also give cover to people involved with the worst forms of exploitation online (Salter and Hanson, 2021).

FIGURE 7-4 Monthly volume of child sexual abuse images, log scale, received by the National Center for Missing & Exploited Children since the creation of its cyber tipline in 1998.
SOURCE: Bursztein et al., 2019.

The committee recognizes that social media platforms do not cause harassment or sexual offenses against children, nor are they to blame for the existence of egregious forms of human behavior. At the same time, any company that makes a product central to unconscionable crimes has a role to play in stopping them. In the same way that hotel and airline companies have made prevention of human trafficking an industry-wide corporate social responsibility, so should the technology industry take steps to ensure their users can easily report online abuse and to follow up on those reports (Mohn, 2012; Winters, 2017). Social media companies’ responsibility would extend to sharing information with law enforcement and working with them to build evidence against criminals.

Recommendation 7-1: Social media companies should develop systems for reporting, follow-up, and adjudication for cases of online harassment and abuse. These systems should be easy to use, universal, accountable, and transparent.

It is not clear how best to respond to the proliferation of child sexual abuse material online, nor is it clear how to manage more commonplace forms of harassment. There is likely a role for legislative and systemwide change (Bleakley et al., 2023). Social media sites have also considered responses to online harassment that emphasize support for targets and practices such as mediation that may be restorative and help mend the conflict (Schoenebeck et al., 2021). This committee was, however, neither charged nor suitably constituted to recommend changes to the legal framework governing the abuse of online privacy protections. At the same time, the abuses described in this chapter represent an important harm of social media to children’s and adolescents’ well-being and one that social media companies have a responsibility to mitigate.

The committee recognizes that social media platforms already rely on flagging and reporting systems to handle online harassment and abuse. No system is perfect, however. Flagging systems, for example, offer users a way to communicate problems to a platform, but in a relatively broad-brush way that has been described as “remarkable more for what [it] cannot express than what [it] can” (Crawford and Gillespie, 2016). Some platforms allow users to simply flag content, while others may accompany the action with an optional multiple-choice list of reasons for flagging (Crawford and Gillespie, 2016). What is more, flagging can be deployed insincerely: posts can be flagged jokingly among friends or in retaliation, and some flags are deployed to express disapproval of a person or their ideas or as an act of bullying (Allen et al., 2022; Crawford and Gillespie, 2016). This tendency to misuse flagging systems, deliberately or by accident, makes flags an unrepresentative measure of the harm experienced on a platform (Coscia and Rossi, 2020). User flagging also generates a massive number of actions requiring evaluation by the platform, a significant burden for the companies. Comparative analysis of the reporting systems across 20 different social networking, messaging, and virtual reality platforms shows wide variability in reporting procedure and follow-up3 (Leavitt and Lo, 2023). Only 30 percent of platforms notify reporters that their report was received (Leavitt and Lo, 2023). The internal processes the platforms use to act on these reports are largely unknown to outside researchers (Henry and Witt, 2021).

In cases of severe online abuse, mainstream social media companies face reputational risk for failure to act. The traditional authorities of the offline world, law enforcement and prosecutors, have less ability to act online, partly because they lack awareness of what crimes are happening (Bleakley et al., 2023). By making the reporting of crimes easier and guaranteeing prompt follow-up, the companies could take a meaningful step toward reducing the harms associated with their product. By facilitating easy reporting, they would also be in a better position to work with law enforcement in assembling evidence toward a prosecution.

___________________

3 Including the design and language used to solicit reports, the categories and details provided to guide the reporter, the information about the platform’s adjudication policy available to the reporter, and the information provided after submission of a report.

The committee recognizes that the intervention suggested in Recommendation 7-1 rests in uneasy tension with companies’ obligations under COPPA. While major social media platforms often require users to be at least 13, survey research suggests that almost one-fifth of children as young as 8 to 12 years use social media every day (Rideout et al., 2021). This discrepancy in enforcement makes it difficult to design child protective features into platforms, but attention from the Federal Trade Commission (FTC) could help remove the incentive companies have had to ignore the presence of children under 13 on their platforms.

In 2013, the FTC amended its regulations implementing COPPA to expand the definition of personal information to include persistent identifiers “that can be used to recognize a user over time and across different Web sites or online services.”4 Persistent identifiers can include information held in cookies, IP addresses,5 or any unique device identifier (Miller, 2023). Collecting information about which young people are experiencing harassment or exploitation would almost always involve the collection of prohibited personal identifiers.

What is more, pending legislation in Congress, such as the Kids Online Safety Act, emphasizes platforms’ “duty of care” to mitigate mental health problems, addictive behaviors, online bullying and harassment, and exploitation.6 The same interest is reflected in Chapter 5 of this report, which encourages companies to document their efforts to remediate young people’s mental health problems and report these measures to the FTC. The committee shares the admirable goal reflected in this legislation of protecting young people from sexual exploitation and abuse online but acknowledges that these societal protections may come at the cost of ceding individual privacy protections, including those guaranteed to minors under COPPA.

Social media companies must strike a delicate balance between their obligation to protect children’s privacy and their responsibility to protect minors from online harassment and abuse. Lack of clarity regarding how to satisfy these competing duties simultaneously may have the unintended effect of encouraging social media companies and gaming studios to turn a blind eye to patterns of interaction online that suggest abuse. Both the companies and society would thus benefit from clear guidance regarding how to manage the trade-offs between child protection and data privacy.

___________________

4 16 C.F.R. § 312.2.

5 Officially, Internet Protocol addresses.

6 Kids Online Safety Act, S. 3663, 117th Congress, 2d Session (2022).


Recommendation 7-2: The Federal Trade Commission should revise its regulations to clarify how to make systems for reporting cases of online harassment and abuse comply with the Children’s Online Privacy Protection Act.

COPPA and its implementing regulations permit the collection of a child’s personal information under certain circumstances. For example, operators may collect “the name of the child and online contact information (to the extent reasonably necessary to protect the safety of a child participant on the site)” so long as the purpose is to protect the safety of a child, the information is not used or disclosed for any purpose unrelated to the child’s safety and is not disclosed on the site, and the operators make reasonable efforts to notify the child’s parent as specified by the regulation.7 The rule implementing this provision also authorizes the collection of a parent or guardian’s name and contact information, suggesting some flexibility to expand this exception beyond the strict letter of the law.8 Whether this latitude is broad enough to include data about incidents of harassment and abuse remains uncertain.

COPPA also permits industry groups to create self-regulatory “safe harbor programs” so long as they provide substantially the same or greater protections as the requirements of COPPA.9 None of the safe harbors that the FTC has approved to date authorizes the collection of information beyond the child’s and parent’s name and online contact information in order to protect the child’s safety.

A subsequent provision authorizing disclosure of information to law enforcement agencies applies only “[t]o the extent permitted under other provisions of law.”10 This latter phrase suggests that this provision permits the disclosure of lawfully gathered information to law enforcement rather than serving as an independent basis for collecting data.

___________________

7 15 U.S.C. § 6502(b)(2)(D).

8 16 C.F.R. § 312.5(c)(5).

9 15 U.S.C. § 6503.

10 16 C.F.R. § 312.5(c)(6)(iv).

These ambiguities suggest that both the social media companies and society would benefit from greater clarity about the circumstances under which COPPA permits the collection of information necessary to address the problem of online harassment and exploitation. Any such guidance should reflect the fact that the appropriateness of actions that override legal protections on privacy necessarily depends on the severity of the problem those actions aim to solve. The Health Insurance Portability and Accountability Act of 1996 (HIPAA), for example, sets an extremely high bar for how health providers, insurers, and other covered entities disclose personal health information, including how they share information with their “business associates” (HHS, 2003). During the COVID-19 pandemic, however, the Department of Health and Human Services announced that it would exercise “enforcement discretion” and would not penalize the disclosure of protected information for public health oversight (HHS, 2020). Given that the pandemic created an unprecedented health emergency, the scope of the associated privacy protection was proportionately adjusted. In the same way, the privacy protections on social media may need to take into account the magnitude of the problems faced even under less extraordinary conditions. The FTC may want to consider whether similar discretion is necessary for COPPA violations in light of growing concerns about the mental health and potential exploitation of young people online.

In setting out its guidance, the FTC should be mindful of the compliance burden that its rules can place on small and medium-sized firms. The European Union’s General Data Protection Regulation (GDPR), for example, holds data collection, storage, and processing to a standard described as “the toughest…in the world” (GDPR, 2023). This standard poses particularly onerous burdens for smaller firms (Klinger et al., n.d.; Official Journal of the European Union, 2016). As the FTC confronts the trade-offs between individual privacy and social welfare, the agency should also take compliance costs into account and recognize that privacy’s importance as a value does not necessarily entail the adoption and enforcement of every conceivable privacy protection.

SUPPORT FOR VICTIMS AND MOMENTUM FOR PROSECUTION

Some professions, by virtue of their privileged glimpses into people’s personal lives, have a legal obligation to report the abuse of children to the authorities; in some states this obligation extends to all adults (NAMR, 2023). What is more, most people would want to support young victims and bring the perpetrators to justice. The best pathway to do this is not always clear.

The crime of cyberflashing, for example, is so little discussed that few adults would be able to advise on exactly what should be reported or to whom. Police would not necessarily know how to collect evidence on the crime. When the victims of cyberflashing are adolescents and children, society’s first concern must be protecting them from the fear, shame, and intimidation such images elicit. Support for these young people, while not primarily intended to curb the abuse, could have a positive downstream effect of bringing attention to the problem and consequences of online sexual harassment, serving an end goal of greater accountability and prosecution of the perpetrators.

Similar challenges persist in supporting victims of sextortion and online grooming. Children may be reluctant to identify inappropriate actions of an adult online for many reasons, the most obvious being that they do not realize there is something wrong with the interaction until it is too late. For this reason, the educational programs described in the previous chapter should include information on how to identify and report inappropriate sexual advances made online. For children under 13, who are likely using social media in violation of the platform’s terms, there could be an added element of fear of being found out and kicked off the platform.

When young people are bullied, harassed, or preyed on by sexual predators, and when that abuse is inextricably tied to the reach and anonymity of the internet, society has an obligation to help them. The U.S. Substance Abuse and Mental Health Services Administration (SAMHSA) has the mandate “to lead public health and service delivery efforts that promote mental health … and provide treatments and supports to foster recovery” (SAMHSA, 2023c). The agency also has the expertise to provide support and intervention services for children and adolescents who are harmed by their experiences on social media.

Recommendation 7-3: The U.S. Substance Abuse and Mental Health Services Administration should develop support programs for children and adolescents who experience digital abuse and evaluate the effectiveness of such programs.

The goal of this recommendation is to support young people who have been harassed online. Providing such support is consistent with SAMHSA’s mission and with its programming in school and campus health, including the StopBullying.gov program (SAMHSA, 2023b; Vecchio, 2018). In identifying the specific services that victims of online abuse might need, the committee would encourage SAMHSA to consider transferable lessons from its experience with both bullying prevention and crisis helplines.

The 988 Suicide and Crisis Lifeline has operated in the United States since July of 2022, building on the National Suicide Prevention Lifeline and network of crisis centers established in 2004, providing confidential support to people in mental health crisis (Ackerman and Horowitz, 2022; Canady, 2022b). The FCC has been involved with the program since its start, smoothing the technical challenges for phone and text service providers and routing calls to the correct regional centers (FCC, 2022). SAMHSA has also developed promotional materials explaining how the 988 line works and what happens during a call or text, as well as awareness-generating videos and advertisements (SAMHSA, 2023a). The service has received extensive, generally positive press coverage, despite concerns about the stability of its funding (Bauman, 2023; Blum, 2022; Budds, 2022; Chatterjee, 2023).

During the national rollout of the 988 line, SAMHSA’s assistant secretary for mental health and substance abuse described the program as an entry point on a continuum of crisis services, calling attention to a wealth of supportive materials for mental health on the SAMHSA website (Canady, 2022a). A strategy that builds on these resources may be the most efficient use of an existing community network for mental health support. The 988 program recognizes the importance of partnership with a network of crisis centers and local and state health authorities around the country (SAMHSA, 2022). These organizations form a network of partners to help promote public service announcements and social marketing materials about the service. The same channels may be useful starting points to share information about digital abuse and make young people aware of resources they can turn to for support.

Providing support for young people who are victims of digital abuse is the main goal of this recommendation. It is also possible that by providing support, experts at SAMHSA and partner organizations will come to have a clearer picture of who is perpetrating some of the more egregious forms of harassment or abuse. Such information would be useful to law enforcement. For this reason, it would be helpful to include police and district attorneys in the development of the recommended interventions.

Including law enforcement in this discussion of protecting children is important given concerns that laws have not kept pace with online harassment and sexual crimes or with the ever-changing social media platforms. States are increasingly moving to punish cyberflashing, which in turn increases the pressure on Congress for heightened federal action (Freeman, 2020; Lima and Schaffer, 2022). Greater attention to supporting victims of digital harassment would be a valuable complement to the growing momentum for clearer and more consistent punishment of its perpetrators, including perpetrators of crimes against adults.

REFERENCES

Ackerman, J., and L. Horowitz. 2022. Youth suicide prevention and intervention: Best practices and policy implications. SpringerBriefs in Psychology: Advances in Child and Family Policy and Practice. https://library.oapen.org/bitstream/handle/20.500.12657/58358/978-3-031-06127-1.pdf (accessed September 19, 2023).

ADL (Anti-Defamation League). 2021. Hate is no game: Harassment and positive social experiences in online games 2021. https://www.adl.org/resources/report/hate-no-game-harassment-and-positive-social-experiences-online-games-2021 (accessed September 19, 2023).


Allen, J., C. Martel, and D. G. Rand. 2022. Birds of a feather don’t fact-check each other: Partisanship and the evaluation of news in Twitter’s Birdwatch crowdsourced fact-checking program. Paper presented at the 2022 CHI Conference on Human Factors in Computing Systems, New Orleans, LA, USA.

Bauman, A. 2023. Houston suicide call center working to meet surge in calls to new 988 hotline. February 3. https://www.houstonchronicle.com/news/houston-texas/health/article/988-suicide-and-crisis-lifeline-houston-centers-17760082.php (accessed September 19, 2023).

Bergman, M. 2023. Social media and sexual violence. Social Media Victims Law Center. https://socialmediavictims.org/sexual-violence (accessed September 19, 2023).

Bleakley, P., E. Martellozzo, R. Spence, and J. DeMarco. 2023. Moderating online child sexual abuse material (CSAM): Does self-regulation work, or is greater state regulation needed? European Journal of Criminology. https://doi.org/10.1177/14773708231181361.

Blum, D. 2022. What to know about 988: The new mental health crisis hotline. New York Times. https://www.nytimes.com/2022/07/12/well/988-suicide-prevention-hotline.html (accessed September 19, 2023).

Borj, P. R., K. Raja, and P. Bours. 2023. Online grooming detection: A comprehensive survey of child exploitation in chat logs. Knowledge-Based Systems 259:110039.

Bragard, E., and C. B. Fisher. 2022. Associations between sexting motivations and consequences among adolescent girls. Journal of Adolescence 94(1):5-18.

Budds, B. 2022. There’s nothing weak about seeking help: SC to launch new 988 mental health hotline. News19 WLTX. https://www.wltx.com/article/news/local/sc-implementing-988-mental-health-hotline/101-dee6ff90-2d7d-4913-9bfd-2d23659ccdd0 (accessed September 19, 2023).

Bursztein, E., E. Clarke, M. DeLaune, D. M. Elifff, N. Hsu, L. Olson, J. Shehan, M. Thakur, K. Thomas, and T. Bright. 2019. Rethinking the detection of child sexual abuse imagery on the internet. Paper presented at The World Wide Web Conference, San Francisco, CA, USA.

Calvete, E., I. Orue, and M. Gamez-Guadix. 2022. A preventive intervention to reduce risk of online grooming among adolescents. Psychosocial Intervention 31(3):177-184.

Canady, V. A. 2022a. Field elated over upcoming 988 MH crisis system, just days away from launch. Mental Health Weekly 32(28):1-3.

Canady, V. A. 2022b. Tennessee stakeholders tout success of 988 crisis line following live launch. Mental Health Weekly 32(29):1-7.

Chatterjee, R. 2023. 988 lifeline sees boost in use and funding in first months. NPR. https://www.npr.org/sections/health-shots/2023/01/16/1149202586/988-lifeline-sees-boost-in-use-and-funding-in-first-months (accessed September 19, 2023).

Chen, Q., K. L. Chan, S. Guo, M. Chen, C. K.-M. Lo, and P. Ip. 2023. Effectiveness of digital health interventions in reducing bullying and cyberbullying: A meta-analysis. Trauma, Violence, & Abuse 24(3):1986-2002.

Cheng, J., M. Bernstein, C. Danescu-Niculescu-Mizil, and J. Leskovec. 2017. Anyone can become a troll: Causes of trolling behavior in online discussions. Paper presented at the 2017 ACM Conference on Computer Supported Cooperative Work and Social Computing, Portland, Oregon, USA.

Coscia, M., and L. Rossi. 2020. Distortions of political bias in crowdsourced misinformation flagging. Journal of The Royal Society Interface 17(167):20200020.

Crawford, K., and T. Gillespie. 2016. What is a flag for? Social media reporting tools and the vocabulary of complaint. New Media & Society 18(3):410-428.

DeSmet, A., S. Bastiaensens, K. Van Cleemput, K. Poels, H. Vandebosch, G. Deboutte, L. Herrewijn, S. Malliet, S. Pabian, F. Van Broeckhoven, O. De Troyer, G. Deglorie, S. Van Hoecke, K. Samyn, and I. De Bourdeaudhuij. 2018. The efficacy of the Friendly Attac serious digital game to promote prosocial bystander behavior in cyberbullying among young adolescents: A cluster-randomized controlled trial. Computers in Human Behavior 78:336-347.


Driesmans, K., L. Vandenbosch, and S. Eggermont. 2014. Playing a videogame with a sexualized female character increases adolescents’ rape myth acceptance and tolerance toward sexual harassment. Games for Health Journal 4(2):91-94.

Elkind, D., and R. Bowen. 1979. Imaginary audience behavior in children and adolescents. Developmental Psychology 15:38-44.

FBI (Federal Bureau of Investigation). 2021. Internet crime report 2021. https://www.ic3.gov/Media/PDF/AnnualReport/2021_IC3Report.pdf (accessed September 19, 2023).

FCC (Federal Communications Commission). 2022. 988 Suicide and Crisis Lifeline. https://www.fcc.gov/988-suicide-and-crisis-lifeline (accessed September 19, 2023).

Finkelhor, D., H. Turner, and D. Colburn. 2022. Prevalence of online sexual offenses against children in the US. JAMA Network Open 5(10):e2234471.

Freeman, V. 2020. Cyber flashing: Unwanted and non-consensual lewd photographs as technology enhanced sexual harassment. Seton Hall Law. https://scholarship.shu.edu/cgi/viewcontent.cgi?article=2105&context=student_scholarship (accessed December 31, 2023).

Gaskill, H. 2023. Maryland legislation seeks to address penalties for ‘cyber flashing.’ Baltimore Sun. https://www.baltimoresun.com/politics/bs-md-pol-cyber-flashing-taskforce-20230221-chvn2pczs5gd7jkq3wkr36spye-story.html (accessed September 19, 2023).

GDPR (General Data Protection Regulation). 2023. What is GDPR, the EU’s new data protection law? https://gdpr.eu/what-is-gdpr (accessed September 19, 2023).

Geiderman, J. M., and C. A. Marco. 2020. Mandatory and permissive reporting laws: Obligations, challenges, moral dilemmas, and opportunities. Journal of the American College of Emergency Physicians Open 1(1):38-45.

Gezinski, L. B., and K. M. Gonzalez-Pons. 2022. Sex trafficking and technology: A systematic review of recruitment and exploitation. Journal of Human Trafficking 8:1-15.

Greene-Colozzi, E. A., G. M. Winters, B. Blasko, and E. L. Jeglic. 2020. Experiences and perceptions of online sexual solicitation and grooming of minors: A retrospective report. Journal of Child Sexual Abuse 29(7):836-854.

Henry, N., and A. Witt. 2021. Governing image-based sexual abuse: Digital platform policies, tools, and practices. In The Emerald International Handbook of Technology-Facilitated Violence and Abuse, edited by J. Bailey, A. Flynn, and N. Henry. Emerald Publishing Limited. Pp. 749-768.

HHS (U.S. Department of Health and Human Services). 2003. Summary of the HIPAA Privacy Rule. https://www.hhs.gov/hipaa/for-professionals/privacy/laws-regulations/index.html (accessed September 19, 2023).

HHS. 2020. Enforcement discretion under HIPAA to allow uses and disclosures of protected health information by business associates for public health and health oversight activities in response to COVID-19. https://www.federalregister.gov/documents/2020/04/07/2020-07268/enforcement-discretion-under-hipaa-to-allow-uses-and-disclosures-of-protected-health-information-by (accessed September 19, 2023).

Horowitz, J., and K. Blunt. 2023. Instagram connects vast pedophile network. The Wall Street Journal. https://www.wsj.com/articles/instagram-vast-pedophile-network-4ab7189 (accessed September 19, 2023).

Jurecic, Q., C. Spera, B. Wittes, and C. Poplin. 2016. Sextortion: The problem and solutions. https://www.brookings.edu/blog/techtank/2016/05/11/sextortion-the-problem-and-solutions (accessed September 19, 2023).

Klinger, E., A. Wiesmaier, and A. Heinemann. n.d. A review of existing GDPR solutions for citizens and SMEs. Darmstadt, Germany: Hochschule Darmstadt University of Applied Sciences.

Kuss, D. J., A. M. Kristensen, A. J. Williams, and O. Lopez-Fernandez. 2022. To be or not to be a female gamer: A qualitative exploration of female gamer identity. International Journal of Environmental Research and Public Health 19(3):1169.


Laird, E., H. Grant-Chapman, C. Venzke, and H. Quay-de la Vallee. 2022. Hidden harms: The misleading promise of monitoring students online. https://cdt.org/insights/report-hidden-harms-the-misleading-promise-of-monitoring-students-online (accessed September 19, 2023).

Leavitt, A., and K. Lo. 2023. A comparative analysis of platform reporting flows. Paper read at Trust & Safety Research Conference 2023, September 28, 29, Palo Alto, CA. https://tinyurl.com/tsrc23reporting (accessed November 17, 2023).

Lima, C., and A. Schaffer. 2022. States are moving to penalize ‘cyber-flashing.’ The Washington Post. https://www.washingtonpost.com/politics/2022/09/27/states-are-moving-penalize-cyber-flashing (accessed September 19, 2023).

Lorenz, T., and K. Browning. 2020. Dozens of women in gaming speak out about sexism and harassment. https://openlab.citytech.cuny.edu/emergingmedia/files/2020/09/Dozens-of-Women-in-Gaming-Speak-Out-About-Sexism-and-Harassment-The-New-York-Times.pdf (accessed September 19, 2023).

Lynch, T., J. E. Tompkins, I. I. van Driel, and N. Fritz. 2016. Sexy, strong, and secondary: A content analysis of female characters in video games across 31 years. Journal of Communication 66(4):564-584.

Machackova, H. 2020. Bystander reactions to cyberbullying and cyberaggression: Individual, contextual, and social factors. Current Opinion in Psychology 36:130-134.

McLean, L., and M. D. Griffiths. 2019. Female gamers’ experience of online harassment and social support in online gaming: A qualitative study. International Journal of Mental Health and Addiction 17(4):970-994.

Meta. 2023. Privacy policy: What is the privacy policy and what does it cover? https://www.facebook.com/privacy/policy (accessed September 19, 2023).

Miller, B. 2021. Fact or phallus? Considering the constitutionality of Texas’s cyber-flashing law under the true threat doctrine. Texas A&M Law Review 8(2):424-449.

Miller, T. C. 2023. Protecting children online: Evaluating possible reforms in the law and the application of COPPA. https://www.mercatus.org/media/161586/download?attachment (accessed November 12, 2023).

Milosevic, T. 2017. Protecting children online?: Cyberbullying policies of social media companies. The MIT Press. https://library.oapen.org/viewer/web/viewer.html?file=/bitstream/handle/20.500.12657/30535/645372.pdf (accessed September 19, 2023).

Mishna, F., E. Milne, C. Cook, A. Slane, and J. Ringrose. 2023. Unsolicited sexts and unwanted requests for sexts: Reflecting on the online sexual harassment of youth. Youth & Society 55(4):630-651.

Mohn, T. 2012. The travel industry takes on human trafficking. The New York Times. https://www.nytimes.com/2012/11/09/giving/the-travel-industry-takes-on-human-trafficking.html (accessed September 19, 2023).

Mortensen, T. E. 2016. Anger, fear, and games: The long event of #gamergate. Games and Culture 13(8):787-806.

NAMR (National Association of Mandated Reporters). 2023. About NAMR. https://namr.org/about (accessed September 19, 2023).

National Center for Missing & Exploited Children. 2023. Sextortion. https://www.missingkids.org/theissues/sextortion (accessed September 19, 2023).

O’Brien, J. E., and W. Li. 2020. The role of the internet in the grooming, exploitation, and exit of United States domestic minor sex trafficking victims. Journal of Children and Media 14(2):187-203.

Official Journal of the European Union. 2016. Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation). https://eur-lex.europa.eu/eli/reg/2016/679/oj (accessed September 19, 2023).


Pappas, M. 2023. Challenges inhibiting platforms from effective moderation. Paper presented at Committee on Social Media Impacts on the Health and Wellbeing of Children and Adolescents: Meeting 4, Washington, DC.

Park, J., J. Hallman, X. S. Liu, and J. Hancock. 2023. Black representation in social media well-being research: A scoping review of social media experience and psychological well-being among Black users in the United States. New Media & Society. https://doi.org/10.1177/14614448231191542.

Patchin, J. W., and S. Hinduja. 2020. Sextortion among adolescents: Results from a national survey of U.S. Youth. Sexual Abuse 32(1):30-54.

Przybylski, A. K., and L. Bowes. 2017. Cyberbullying and adolescent well-being in England: A population-based cross-sectional study. The Lancet Child & Adolescent Health 1(1):19-26.

Rideout, V., A. Peebles, S. Mann, and M. B. Robb. 2021. The common sense census: Media use by tweens and teens. San Francisco, CA: Common Sense.

Ringrose, J., K. Regehr, and B. Mikne. 2021a. Understanding and combatting youth experiences of image-based sexual harassment and abuse. ASCL. https://www.ascl.org.uk/ASCL/media/ASCL/Our%20view/Campaigns/Understanding-and-combatting-youth-experiences-of-image-based-sexual-harassment-and-abuse-full-report.pdf (accessed September 19, 2023).

Ringrose, J., K. Regehr, and S. Whitehead. 2021b. Teen girls’ experiences negotiating the ubiquitous dick pic: Sexual double standards and the normalization of image based sexual harassment. Sex Roles 85(9):558-576.

Robinson, M. 2018. Toxic players and the power of positive engagement. https://www.gamesindustry.biz/toxic-players-and-the-power-of-positive-engagement (accessed September 19, 2023).

Salerno-Ferraro, A. C., C. Erentzen, and R. A. Schuller. 2022. Young women’s experiences with technology-facilitated sexual violence from male strangers. Journal of Interpersonal Violence 37(19-20):NP17860-NP17885.

Salter, M., and E. Hanson. 2021. “I need you all to understand how pervasive this issue is:” User efforts to regulate child sexual offending on social media. In The Emerald International Handbook of Technology-Facilitated Violence and Abuse, edited by J. Bailey, A. Flynn, and N. Henry. Emerald Publishing Limited. Pp. 729-748.

SAMHSA (Substance Abuse and Mental Health Services Administration). 2022. 988 partner community. https://www.samhsa.gov/find-help/988/partners (accessed September 19, 2023).

SAMHSA. 2023a. 988 social media shareables. https://www.samhsa.gov/find-help/988/partner-toolkit/social-media-shareables (accessed September 19, 2023).

SAMHSA. 2023b. School and campus health. https://www.samhsa.gov/school-campus-health (accessed October 19, 2023).

SAMHSA. 2023c. Who we are. https://www.samhsa.gov/about-us/who-we-are#:~:text=SAMHSA’s%20mission%20is%20to%20lead,equitable%20access%20and%20better%20outcomes (accessed October 19, 2023).

Say, G. N., Z. Babadagi, K. Karabekiroglu, and S. Akbas. 2015. Abuse characteristics and psychiatric consequences associated with online sexual abuse. Cyberpsychology, Behavior, and Social Networking 18(6):333-336.

Schoenebeck, S., O. L. Haimson, and L. Nakamura. 2021. Drawing from justice theories to support targets of online harassment. New Media & Society 23(5):1278-1300.

Smith, M. 2018. Four in ten female millennials have been sent an unsolicited penis photo. https://yougov.co.uk/topics/politics/articles-reports/2018/02/16/four-ten-female-millennials-been-sent-dick-pic (accessed September 19, 2023).

TikTok. 2023. Privacy policy. https://www.tiktok.com/legal/page/us/privacy-policy/en (accessed September 19, 2023).

Van Ouytsel, J., M. Walrave, L. De Marez, B. Vanhaelewyn, and K. Ponnet. 2020. A first investigation into gender minority adolescents’ sexting experiences. Journal of Adolescence 84(1):213-218.

Vecchio, P. D. 2018. This school year, let’s erase bullying. In SAMHSA Blog. https://www.samhsa.gov/blog/school-year-lets-erase-bullying (accessed November 28, 2023).

Vogels, E. 2021. The state of online harassment. Pew Research Center. https://www.pewresearch.org/internet/2021/01/13/the-state-of-online-harassment (accessed September 19, 2023).

Vogels, E. 2022. Teens and cyberbullying 2022. Pew Research Center. https://www.pewresearch.org/internet/2022/12/15/teens-and-cyberbullying-2022/ (accessed September 19, 2023).

Wang, B. 2023. A critical analysis of the Law Commission’s proposed cyberflashing offence. The Journal of Criminal Law 87(1):39-52.

Whittle, H., C. Hamilton-Giachritsis, A. Beech, and G. Collings. 2013. A review of young people’s vulnerabilities to online grooming. Aggression and Violent Behavior 18(1):135-146.

Winters, B. R. 2017. The hotel industry’s role in combatting sex trafficking. Monterey, CA: Naval Postgraduate School.

Wittes, B., C. Poplin, Q. Jurecic, and C. Spera. 2016. Closing the sextortion sentencing gap: A legislative proposal. Center for Technology Innovation. https://www.brookings.edu/wp-content/uploads/2016/05/sextortion2.pdf (accessed September 19, 2023).

You, L., and Y.-H. Lee. 2019. The bystander effect in cyberbullying on social network sites: Anonymity, group size, and intervention intentions. Telematics and Informatics 45 (December):101284.
