
Currently Skimming:

12. Technology-Based Tools for Users
Pages 267-326

The Chapter Skim interface presents what we've algorithmically identified as the most significant single chunk of text within every page in the chapter.


From page 267...
... Table 12.1 provides a preview of this chapter. 12.1 FILTERING AND CONTENT-LIMITED ACCESS Filters are at the center of the debate over protecting children and youth from inappropriate sexually explicit material on the Internet.
From page 268...
... (This is true in all cases except for instant help, in which the user is the child seeking help.)
From page 269...
... Mostly relevant to inadvertent exposure
From page 271...
... As a feature of their offerings, a number of ISPs provide Internet access only to a certain subset of Internet content. Content-limited ISPs are most likely to be used by organizations and families in which the information needs of the children involved are fairly predictable.
From page 272...
... Some content-limited ISPs, intended for use by children, make available only a very narrow range of content that has been explicitly vetted for appropriateness and safety. Thus, all Web pages accessible have been viewed and assessed for content that is developmentally appropriate, educational, and entertaining.
From page 273...
... When activated by the user, these filters do not return links to inappropriate content found in a search, but they also do not block access to specifically named Web sites (so that a user knowing the URL of a Web site containing inappropriate sexually explicit material could access it).5 Other search engines are explicitly designed for use by children.
From page 274...
... The reason is that interactive sources, almost by definition, can support a variety of different types of interaction, the best example of which is an online friend with whom one may exchange sports trivia, conversation about school homework, and inappropriate sexually explicit material. Only real-time content recognition has a chance of filtering such content.
From page 275...
... However, as discussed in Section 2.3.1, all filters are subject to overblocking (false positives, in which filters block some appropriate material).9 By contrast, around 57 percent of public libraries do not filter Internet access on any workstation, while about 21 percent filter access on some workstations. About 21 percent filter all workstations.
From page 277...
... Underblocking results from several factors: · New material appears on the Internet constantly, and the contents of given Web pages sometimes change. When content changes, the judging parties must revisit the sources responsible for the content they provide frequently enough to ensure that inappropriate information does not suddenly appear on a previously trusted source or that the inappropriate material remains on the Web pages in question.
From page 278...
... He continued, "With so many people using the Net as the initial means to look at colleges, that's a serious disadvantage." In addition, he claimed that filters sometimes block e-mail from Beaver College staffers to prospective students. (See Craig Bicknell, 2000, "Beaver College Not a Filter Fave," Wired, March 22, available online at ; and CNN story, 2000, "Beaver College Changes Oft-derided Name to Arcadia University," November 20, available online at .
From page 279...
... Thus, the amount of sexually explicit material hosted overall by such services is likely to be small. (But, if such a service does host even one site containing inappropriate sexually explicit material and that fact is picked up by a filtering vendor that uses IP-based filtering, it will exclude all of the acceptable sites on that host.
From page 280...
... In a specific venue, filters will block some material that some parties deem inappropriate and there is a reasonable argument to be had over whether the blocking that occurs is worth the cost of overblocking. But it is impossible for a filter deployed in a school to block material sought in a cyber-cafe or at home, and filtering limited to schools and libraries will not prevent the access of children to inappropriate sexually explicit material if they are determined to search for it and have other venues of access.
From page 281...
... These comments apply particularly to sexually explicit material, especially material containing images. An individual seeking explicit images for the purpose of sexual arousal is not particularly sensitive to which of hundreds or thousands of images on as many Web pages can be retrieved.
From page 282...
... may be used in a site's metatags to circumvent filters that search for keywords. As a general rule, though, commercial vendors of sexually explicit material argue that they are not economically motivated to expend a lot of effort to get through these filters, because children are unable to pay for such material.
From page 283...
... For example, most schools and libraries have acceptable use policies (AUPs, as discussed in Chapter 10) that forbid use of school or library computer resources for certain purposes, such as viewing sexually explicit sites.
From page 284...
... Finally, many teachers and librarians are themselves uncomfortable viewing certain types of inappropriate material,26 and in the committee's informal discussions during its site visits this was especially true for many sexually explicit images. Even apart from the claimed benefits of preventing exposure to inappropriate material, filters can offer children other benefits.
From page 286...
... (For example, a list of inappropriate URLs or keywords flagging inappropriate content might be grouped into such categories.) These filters then provide the user with the option of blocking or accepting content by category, so that a user can, for example, block only pornography and hate speech while accepting all other content categories.
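As a rough illustration only (the category names, lists, and blocking logic below are hypothetical and do not reflect any vendor's actual product), category-based blocking of this kind might be sketched as follows:

# Illustrative sketch of category-based blocking; all data are invented.
from urllib.parse import urlparse

# Hypothetical category lists of the kind a filter vendor might maintain.
CATEGORY_LISTS = {
    "pornography": {"example-adult-site.com"},
    "hate_speech": {"example-hate-site.org"},
    "gambling": {"example-casino.net"},
}

# Policy chosen by the parent, school, or library: block only some categories.
BLOCKED_CATEGORIES = {"pornography", "hate_speech"}

def is_blocked(url: str) -> bool:
    """Return True if the URL's host falls in a category the policy blocks."""
    host = urlparse(url).netloc.lower()
    for category in BLOCKED_CATEGORIES:
        if host in CATEGORY_LISTS.get(category, set()):
            return True
    return False

print(is_blocked("http://example-adult-site.com/page"))   # True
print(is_blocked("http://example-casino.net/"))           # False (gambling allowed by this policy)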
From page 287...
... non-traditional religions and sexual orientation in the same category as material that no responsible adult would consider appropriate for young people." She also notes that "because filtering software companies protect the actual list of blocked sites, searching and blocking key words, blocking criteria, and blocking processes as confidential, proprietary trade secret information it is not possible to prove or disprove the hypothesis that the companies may be blocking access to material based on religious bias." At the same time, Willard finds that while "information about the religious connections can be found through diligent search, such information is not clearly evident on the corporate web site or in materials that would provide the source of information for local school officials," though she acknowledges openly that "it is entirely appropriate for conservative religious parents or schools to decide to use the services of an ISP that is blocking sites based on conservative religious values. It is equally appropriate for parents to want their children to use the Internet in school in a manner that is in accord with their personal family values." See Nancy Willard, 2002, Filtering Software: The Religious Connection, Center for Advanced Technology in Education, College of Education, University of Oregon, available online at .
From page 288...
... The committee heard stories of a number of complaints regarding filtering when filters were first installed, but in most such instances, the complaints ceased after a few months. Students without home Internet access seemed to accept a school's filtering policy as a given, and simply adapted to it, even if they were prevented from accessing valuable information.
From page 289...
... The statutory provisions of DMCA prohibit the circumvention of technical measures that prevent the unauthorized copying, transmission, or accessing of copyrighted works, subject to this rulemaking of the Copyright Office. The final rule establishes two exceptions to the anti-circumvention provisions, one of which allows users of Internet content filtering programs to view lists of Web sites blocked by such software.
From page 290...
... (Such differences would reflect the parent's belief that older children with more maturity and a broader scope of information needs might also require broader and less restricted Internet access.) To support different filtering policies, a filter would require the child to log into the system so that the child's individual filtering profile could be used.
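A filter supporting different policies for different children might, very loosely, associate each login name with its own set of blocked categories. The sketch below is a minimal, hypothetical illustration; the login names, ages, and categories are invented, and a real product would tie the mapping to an authenticated login step:

# Hypothetical sketch of per-child filtering profiles keyed to login names.
PROFILES = {
    "younger_child": {"pornography", "hate_speech", "violence", "chat"},
    "older_child": {"pornography"},
}

# Invented mapping from login name to profile.
USER_PROFILES = {"child_age_8": "younger_child", "teen_age_15": "older_child"}

def blocked_categories_for(login_name: str) -> set:
    """Return the set of categories blocked for the logged-in child."""
    return PROFILES[USER_PROFILES[login_name]]

print(blocked_categories_for("teen_age_15"))  # {'pornography'}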
From page 291...
... Such blocking can be used to promote the safety of children and to enforce prohibitions against giving out such information. In addition, some filters can block certain types of Internet access entirely: instant messages, e-mail, chat rooms, file transfers, and so on.
From page 292...
... In an institutional environment, there are costs of teaching the responsible adults what the filter can and cannot do, and providing training that familiarizes them with operating in a filtered environment.
From page 293...
... . Thus, for inappropriate sexually explicit material that might loosely be classified as "for adults only," some material that should not be placed into this category will be and will therefore be improperly blocked.
From page 294...
... In other cases, when they prepared a lesson plan at home (with unfiltered Internet access), they were unable to present it at school because a site they found at home was inaccessible using school computers.
From page 295...
... However, label-based filters require content providers or third parties to cooperate in labeling content. To develop such an infrastructure, providers and third parties must have incentives to label content.
From page 296...
... Recognizing that the primary impediment to the success of rating schemes is the extent to which Internet content is currently not labeled, the Internet Content Rating Association (ICRA) has undertaken a global effort to promote a voluntary self-labeling system through which content providers identify and label their content using predefined, cross-cultural categories (Box 12.4).
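To illustrate the general idea of label-based filtering (this is not ICRA's actual vocabulary or label syntax, nor any deployed scheme; the meta tag name and category names are assumptions made for illustration), a client-side check might look for a self-applied label in a page and compare it against the categories a parent has chosen to block:

# Hedged sketch of label-based filtering with an invented label format.
from html.parser import HTMLParser

class LabelExtractor(HTMLParser):
    """Collect hypothetical <meta name="content-label" content="..."> values."""
    def __init__(self):
        super().__init__()
        self.labels = set()

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name") == "content-label":
            self.labels.update(attrs.get("content", "").split(","))

BLOCKED_LABELS = {"nudity", "explicit-sex"}   # categories the parent chose to block

def page_allowed(html: str) -> bool:
    parser = LabelExtractor()
    parser.feed(html)
    return not (parser.labels & BLOCKED_LABELS)

sample = '<html><head><meta name="content-label" content="nudity,violence"></head></html>'
print(page_allowed(sample))  # False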
From page 298...
... . Such a safe harbor might be particularly applicable in labeling of sexually explicit material (as discussed in Section 9.3).
From page 299...
... If that future comes to pass, the media containing such particularly objectionable content might also be selectively blocked (e.g., by blocking all sound files on sexually explicit Web pages)
From page 300...
... Bundling Filters with Other Functionality Filters are a special-purpose tool. Parents and others who purchase and install filters or filtering services thus can be assumed to feel that the problems raised by unfiltered Internet access are worrisome enough to warrant such efforts.
From page 301...
... Today's filters cannot be the sole element of any approach to protecting children from inappropriate sexually explicit material on the Internet.37 The first widespread instance of such blocking occurred in 1995 when a major online service provider blocked all sites containing the word "breast," including those dealing with breast cancer. In the wake of widespread complaints, the service provider in question quickly restored access to breast cancer sites.
From page 302...
... However, as a child's Internet information needs outgrow what a kid-friendly service can provide, he or she will have to turn to other sources. Other sources, by definition, will provide information that is less thoroughly vetted, and will likely involve exposure of the now-older child to some inappropriate information; however, an older child may well be better able to cope with inadvertent exposures to such material.
From page 303...
... (For example, these individuals may be more inclined to take such a stance if the children in question are young.) Such consequences may include the blocking of some material that would be mistakenly classified as inappropriate sexually explicit material, and/or the blocking of entire categories of material that are protected by the First Amendment (a consequence of more concern to publicly funded institutions such as public libraries than to individual families)
From page 304...
... 11. Filters are a complement to but not a substitute for responsible adult supervision.
From page 305...
... 12.2.1 What Is Monitoring? Monitoring, as a way of protecting youth from inappropriate content, relies on deterrence rather than prevention per se.
From page 306...
... E-mail is generally not encrypted in storage, and thus may be readable by an adult who is responsible for a child. Monitoring tools can provide a variety of functions, enabling a responsible adult to review incoming and outgoing e-mail, instant message and chat room dialogs, Web pages accessed, and so on.
From page 307...
... For monitoring to be effective, usage must be tied to specific individuals. Thus, in an institutional setting, individual logins, which tend to be more common in higher grades than in lower ones, are necessary if monitoring information is to be acted on after the fact of inappropriate usage.39 (If immediate action is taken, individual login information is not needed, since an adult can simply walk over to the Internet access point and talk to the child in question.)
From page 308...
... , in practice monitors access to inappropriate imagery. Text also can be monitored remotely, but in this case, the adult supervisor cannot tell at a glance if the text contains inappropriate material, and thus must spend more time in reading that text to make a judgment.
From page 309...
... Moreover, undertaking monitoring covertly leaves the question of what the responsible adult should do, if anything, in the event that monitoring reveals that the child is behaving inappropriately. If the adult does nothing except watch, learning that is directly coupled to inappropriate access or behavior cannot occur, and the inappropriate behavior may well continue.
From page 310...
... , each image appears in a smaller "thumbnail" version that makes reading most text on those screens difficult or impossible while at the same time usually enabling the supervisor to determine if an image is being displayed. Moreover, many inappropriate sexually explicit images are easy for humans to recognize even at low resolution and/or smaller size.
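A rough sketch of the thumbnail idea follows; it assumes the third-party Pillow imaging library and a screenshot file captured by some other means (the capture step itself is omitted, and the file names are placeholders):

# Sketch of why thumbnail monitoring works: shrinking a captured screen keeps
# an image recognizable while making body text unreadable.
from PIL import Image

def make_supervisor_thumbnail(screenshot_path: str, out_path: str) -> None:
    """Reduce a full screenshot to a small thumbnail for the supervisor's view."""
    img = Image.open(screenshot_path)
    img.thumbnail((160, 120))   # shrink in place, preserving aspect ratio
    img.save(out_path)

# make_supervisor_thumbnail("workstation_07.png", "workstation_07_thumb.png")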
From page 311...
... (Recording screen images frequently for a large number of users also consumes hard disk space at a rapid rate.) A second cost in the institutional setting is that the effort needed to manage individual logins is significant.
From page 312...
... In general, a child's need for personal freedom increases as he or she grows older.43 According to a survey by the Kaiser Family Foundation in 2001, teenagers place a high value on privacy with respect to their Internet usage: 76 percent of online youth agreed that "looking up information online is good because I can look things up without anybody knowing about it.
From page 313...
... " In a second scenario, the screen on the Internet access terminal is displayed on the school librarian's terminal (in her office) for several seconds at random intervals ranging from once every 5 minutes to once every 20 minutes.
From page 314...
... Infrastructure While filters can be installed on the client side without the cooperation of any other party, real-time monitoring requires a mechanism for displaying the contents of one monitor on another. When tools for monitoring are used on a large scale, a sufficient number of responsible adults is necessary to provide in-person intervention.
From page 315...
... . Alternatively, a request for access to a potentially inappropriate site can be transmitted to a responsible adult for approval within a certain period of time (e.g., 1 minute)
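One loose way to picture such a deferred-approval mechanism (entirely illustrative; no particular product is claimed to work this way) is a request that waits on an approval queue and is denied if no response arrives before a timeout:

# Illustrative sketch of deferred approval with a timeout; deny by default.
import queue
import threading

approval_queue = queue.Queue()

def request_access(url: str, timeout_seconds: int = 60) -> bool:
    """Forward a request for a flagged site and wait for an adult's decision."""
    print(f"Approval requested for {url}; waiting up to {timeout_seconds}s")
    try:
        return approval_queue.get(timeout=timeout_seconds)
    except queue.Empty:
        return False  # no response in time: deny by default

def adult_approves():
    # In practice this would be triggered from the responsible adult's console.
    approval_queue.put(True)

threading.Timer(1.0, adult_approves).start()   # simulate an approval after 1 second
print("Access granted:", request_access("http://example.org/flagged-page", 5))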
From page 316...
... 2. Overt monitoring in concert with explicit discussion and education may help children develop their own sense of what is or is not appropriate behavior.
From page 317...
... (Furthermore, because people habituate to warnings, children may respond to overt monitoring as though it were covert, i.e., more negatively.) 12.3 TOOLS FOR CONTROLLING OR LIMITING "SPAM" "Spam," e-mail similar to the "junk mail" that an individual receives through the post office in the brick-and-mortar world, is sent unsolicited and indiscriminately to anyone with a known e-mail address.
From page 318...
... refers to any form of unsolicited e-mail a person might receive, some of which might be sent by publishers of adult-content Web sites. A typical spam message with sexual content would contain some "come-on" words and a link to an adult-oriented Web site, but would in general arrive without images.
From page 319...
... One important issue is that spam often contains links to inappropriate sexually explicit material rather than the actual material itself, and no content-screening spam-controlling tool known to the committee scans the content for links that may be embedded in an e-mail. That said, some spam-controlling technologies are highly effective against spammers.
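A simple sketch of what link-aware screening could look like (the blocklist, pattern, and sample message below are illustrative, not drawn from any real spam-control product) is to extract the URLs embedded in a message and compare their hosts against a list of known adult-content sites:

# Hedged sketch of scanning an e-mail body for embedded links against a blocklist.
import re

BLOCKLISTED_HOSTS = {"example-adult-site.com"}   # illustrative, not a real list
URL_PATTERN = re.compile(r"https?://([^/\s]+)", re.IGNORECASE)

def message_links_blocked(body: str) -> bool:
    """Return True if the message links to any blocklisted host."""
    hosts = {m.group(1).lower() for m in URL_PATTERN.finditer(body)}
    return bool(hosts & BLOCKLISTED_HOSTS)

spam = "Hot pics waiting for you! Click http://example-adult-site.com/join now."
print(message_links_blocked(spam))  # True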
From page 320...
... Therefore, approaches to this problem are likely to be developed, regardless of the concerns about youth and sexually explicit material. However, this can easily turn into another race: as better spam-discriminating technologies are invented, alternative ways of wrapping the unsolicited e-mail are invented, and the cycle continues.
From page 321...
... It is easier for the school district to add another item to the spam filter than to have its lawyer sue the sender of the e-mails. As in the case of age verification technologies, expanded use of "mail deflection" beyond issues of sexually inappropriate material may warrant the trouble of installing spam-controlling systems.
From page 322...
... 12.4.1 What Is Instant Help? The philosophy underlying instant help is that from time to time children will inevitably encounter upsetting things online: inappropriate material, spam mail containing links to inappropriate sexually explicit material, sexual solicitations, and so on.
From page 323...
... . For Internet access on a LAN, instant help could be configured to summon assistance from a responsible adult within the LAN, such as a teacher or a librarian.
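As a purely illustrative sketch (the address, port, and message format are assumptions, not an existing protocol or product), an instant help button on a LAN might do little more than send a short alert to a listener running on the librarian's or teacher's machine:

# Minimal sketch of LAN-based instant help using a plain TCP alert.
import socket

HELP_DESK_ADDRESS = ("192.0.2.10", 5140)   # assumed address of the adult's console

def send_instant_help(student_workstation: str, reason: str) -> None:
    """Send a one-line alert over the LAN; a listener on the adult's machine displays it."""
    message = f"HELP {student_workstation}: {reason}\n".encode("utf-8")
    with socket.create_connection(HELP_DESK_ADDRESS, timeout=2) as conn:
        conn.sendall(message)

# send_instant_help("lab-pc-03", "upsetting material encountered")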
From page 324...
... In many urban areas, crisis intervention hotlines (focused on helping people who are subject to domestic abuse, feeling suicidal, struggling with substance abuse addictions, and so on) exist, but none known to the committee provide training to their volunteer staffs concerning children's exposure to sexually explicit material on the Internet.
From page 325...
... To the committee's knowledge, instant help functionality has not been implemented anywhere, and it remains to be seen if children would actually use it if and when they are confronted with inappropriate material or... 48 It is true that in schools or libraries a child should be able to request help from these individuals without instant help features. The primary advantage of clicking an instant help icon is that it can be done privately and without drawing attention from other users.
From page 326...
... In cases where a new type of offensive material or communication appears on the Internet for the first time, the first instant help response center to identify this new material could share that information with schools and parents, other instant help response centers, youth (as warnings), or even filtering vendors.

