This extraordinary growth of the Internet illustrates the extent to which the Internet promotes First Amendment values in the same way that the historical use of traditional public fora for speaking, handbilling, and protesting testifies to their effectiveness as vehicles for free speech. Cf. Martin, 319 U.S. at 145 ("The widespread use of this method of communication [door-to-door distribution of leaflets] by many groups espousing various causes attests its major importance."); Schneider v. State, 308 U.S. 147, 164 (1939) ("[P]amphlets have proved most effective instruments in the dissemination of opinion.").

The provision of Internet access in public libraries, in addition to sharing the speech-enhancing qualities of fora such as streets, sidewalks, and parks, also supplies many of the speech-enhancing properties of the postal service, which is open to the public at large as both speakers and recipients of information, and provides a relatively low-cost means of disseminating information to a geographically dispersed audience.

See Lamont v. Postmaster Gen., 381 U.S. 301 (1965) (invalidating a content-based prior restraint on the use of the mails); see also Blount v. Rizzi, 400 U.S. 410 (1971) (same). Indeed, the Supreme Court's description of the postal system in Lamont seems equally apt as a description of the Internet today: "the postal system . . . is now the main artery through which the business, social, and personal affairs of the people are conducted . . . ." 381 U.S. at 305 n.3.

In short, public libraries, by providing their patrons with access to the Internet, have created a public forum that provides any member of the public free access to information from millions of speakers around the world. The unique speech-enhancing character of Internet use in public libraries derives from the openness of the public library to any member of the public seeking to receive information, and the openness of the Internet to any member of the public who wishes to speak. In particular, speakers on the Internet enjoy low barriers to entry and the ability to reach a mass audience, unhindered by the constraints of geography. Moreover, just as the development of new media "presents unique problems, which inform our assessment of the interests at stake, and which may justify restrictions that would be unacceptable in other contexts," United States v. Playboy Entm't Group, Inc., 529 U.S. 803, 813 (2000), the development of new media, such as the Internet, also presents unique possibilities for promoting First Amendment values, which also inform our assessment of the interests at stake, and which we believe, in the context of the provision of Internet access in public libraries, justify the application of heightened scrutiny to content-based restrictions that might be subject to only rational review in other contexts, such as the development of the library's print collection. Cf. id. at 818 ("Technology expands the capacity to choose; and it denies the potential of this revolution if we assume the Government is best positioned to make these choices for us.").

A faithful translation of First Amendment values from the context of traditional public fora such as sidewalks and parks to the distinctly non-traditional public forum of Internet access in public libraries requires, in our view, that content-based restrictions on Internet access in public libraries be subject to the same exacting standards of First Amendment scrutiny as content-based restrictions on speech in traditional public fora such as sidewalks, town squares, and parks: The architecture of the Internet, as it is right now, is perhaps the most important model of free speech since the founding. . . . Two hundred years after the framers ratified the Constitution, the Net has taught us what the First Amendment means. . . . The model for speech that the framers embraced was the model of the Internet: distributed, noncentralized, fully free and diverse.

Lessig, Code, at 167, 185. Indeed, "[m]inds are not changed in streets and parks as they once were. To an increasing degree, the more significant interchanges of ideas and shaping of public consciousness occur in mass and electronic media." Denver Area Educ. Telecomms. Consortium, Inc. v. FCC, 518 U.S. 727, 802-03 (1996) (Kennedy, J., concurring in the judgment).

In providing patrons with even filtered Internet access, a public library is not exercising editorial discretion in selecting only speech of particular quality for inclusion in its collection, as it may do when it decides to acquire print materials. By providing its patrons with Internet access, public libraries create a forum in which any member of the public may receive speech from anyone around the world who wishes to disseminate information over the Internet. Within this "vast democratic forum[]," Reno, 521 U.S. at 868, which facilitates speech that is "as diverse as human thought," id. at 870, software filters single out for exclusion particular speech on the basis of its disfavored content. We hold that these content-based restrictions on patrons' access to speech are subject to strict scrutiny.

4. Application of Strict Scrutiny

Having concluded that strict scrutiny applies to public libraries' content-based restrictions on patrons' access to speech on the Internet, we must next determine whether a public library's use of Internet software filters can survive strict scrutiny. To survive strict scrutiny, a restriction on speech "must be narrowly tailored to promote a compelling Government interest. If a less restrictive alternative would serve the Government's purpose, the legislature must use that alternative."

United States v. Playboy Entm't Group, Inc., 529 U.S. 803, 813 (2000) (citation omitted); see also Fabulous Assocs., Inc. v. Pa. Pub. Util. Comm'n, 896 F.2d 780, 787 (3d Cir. 1990) (holding that a content-based burden on speech is permissible "only if [the government] shows that the restriction serves a compelling interest and that there are no less restrictive alternatives").

The application of strict scrutiny to a public library's use of filtering products thus requires three distinct inquiries.

First, we must identify those compelling government interests that the use of filtering software promotes. It is then necessary to analyze whether the use of software filters is narrowly tailored to further those interests. Finally, we must determine whether less restrictive alternatives exist that would promote the state interest.

1. State Interests

We begin by identifying those legitimate state interests that a public library's use of software filters promotes.

1. Preventing the Dissemination of Obscenity, Child Pornography, and Material Harmful to Minors

On its face, CIPA is clearly intended to prevent public libraries' Internet terminals from being used to disseminate to library patrons visual depictions that are obscene, child pornography, or in the case of minors, harmful to minors. See CIPA Sec. 1712 (codified at 20 U.S.C. Sec. 9134(f)(1)(A) & (B)), Sec. 1721(b) (codified at 47 U.S.C. Sec. 254(h)(6)(B) & (C)) (requiring any library that receives E-rate discounts to certify that it is enforcing "a policy of Internet safety that includes the operation of a technology protection measure with respect to any of its computers with Internet access that protects against access through such computers to visual depictions" that are "obscene" or "child pornography," and, when the computers are in use by minors, also protects against access to visual depictions that are "harmful to minors").

The government's interest in preventing the dissemination of obscenity, child pornography, or, in the case of minors, material harmful to minors, is well-established. Speech that is obscene, under the legal definition of obscenity set forth in the margin, is unprotected under the First Amendment, and accordingly the state has a compelling interest in preventing its distribution.

See Miller v. California, 413 U.S. 15, 18 (1973) ("This Court has recognized that the States have a legitimate interest in prohibiting dissemination or exhibition of obscene material."); Stanley v. Georgia, 394 U.S. 557, 563 (1969) ("[T]he First and Fourteenth Amendments recognize a valid governmental interest in dealing with the problem of obscenity."); Roth v. United States, 354 U.S. 476, 485 (1957) ("We hold that obscenity is not within the area of constitutionally protected speech or press.").

The First Amendment also permits the state to prohibit the distribution to minors of material that, while not obscene with respect to adults, is obscene with respect to minors. See Ginsberg v. New York, 390 U.S. 629, 637 (1968) (holding that it is constitutionally permissible "to accord minors under 17 a more restricted right than that assured to adults to judge and determine for themselves what sex material they may read or see"). Proscribing the distribution of such material to minors is constitutionally justified by the government's well-recognized interest in safeguarding minors' well-being. See Reno v. ACLU, 521 U.S. 844, 869-70 (1997) ("[T]here is a compelling interest in protecting the physical and psychological well-being of minors which extend[s] to shielding them from indecent messages that are not obscene by adult standards . . . .") (internal quotation marks and citation omitted); New York v. Ferber, 458 U.S. 747, 756-57 (1982) ("It is evident beyond the need for elaboration that a State's interest in safeguarding the physical and psychological well-being of a minor is compelling.") (internal quotation marks and citation omitted); Ginsberg, 390 U.S. at 640 ("The State . . . has an independent interest in the well-being of its youth.").

The government's compelling interest in protecting the well-being of its youth justifies laws that criminalize not only the distribution to minors of material that is harmful to minors, but also the possession and distribution of child pornography.

See Osborne v. Ohio, 495 U.S. 103, 111 (1990) (holding that a state "may constitutionally proscribe the possession and viewing of child pornography"); Ferber, 458 U.S. at 757, 763 (noting that "[t]he prevention of sexual exploitation and abuse of children constitutes a government objective of surpassing importance," and holding that "child pornography [is] a category of material outside the protection of the First Amendment").

Thus, a public library's use of software filters survives strict scrutiny if it is narrowly tailored to further the state's well-recognized interest in preventing the dissemination of obscenity and child pornography, and in preventing minors from being exposed to material harmful to their well-being.

2. Protecting the Unwilling Viewer

Several of the libraries that use filters assert that filters serve the libraries' interest in preventing patrons from being unwillingly exposed to sexually explicit speech that the patrons find offensive. Nearly every library proffered by either the government or the plaintiffs received complaints, in varying degrees of frequency, from library patrons who saw other patrons accessing sexually explicit material on the library's Internet terminals.

In general, First Amendment jurisprudence is reluctant to recognize a legitimate state interest in protecting the unwilling viewer from speech that is constitutionally protected. "Where the designed benefit of a content-based speech restriction is to shield the sensibilities of listeners, the general rule is that the right of expression prevails, even where no less restrictive alternative exists. We are expected to protect our own sensibilities simply by averting our eyes." Playboy, 529 U.S. at 813 (internal quotation marks and citation omitted); see also Erznoznik v. City of Jacksonville, 422 U.S. 205, 209 (1975) ("[W]hen the government, acting as censor, undertakes selectively to shield the public from some kinds of speech on the ground that they are more offensive than others, the First Amendment strictly limits its power.").

For example, in Cohen v. California, 403 U.S. 15 (1971), the Supreme Court reversed defendant's conviction for wearing, in a municipal courthouse, a jacket bearing the inscription "Fuck the Draft." The Court noted that "much has been made of the claim that Cohen's distasteful mode of expression was thrust upon unwilling or unsuspecting viewers, and that the State might therefore legitimately act as it did in order to protect the sensitive from otherwise unavoidable exposure to appellant's crude form of protest." Id. at 21. This justification for suppressing speech failed, however, because it "would effectively empower a majority to silence dissidents simply as a matter of personal predilections." Id. The Court concluded that "[t]hose in the Los Angeles courthouse could effectively avoid further bombardment of their sensibilities simply by averting their eyes." Id.

Similarly, in Erznoznik, the Court invalidated on its face a municipal ordinance prohibiting drive-in movie theaters from showing films containing nudity if they were visible from a public street or place. The city's "primary argument [was] that it may protect its citizens against unwilling exposure to materials that may be offensive." 422 U.S. at 208. The Court soundly rejected this interest in shielding the unwilling viewer: The plain, if at times disquieting, truth is that in our pluralistic society, constantly proliferating new and ingenious forms of expression, we are inescapably captive audiences for many purposes. Much that we encounter offends our esthetic, if not our political and moral, sensibilities. Nevertheless, the Constitution does not permit government to decide which types of otherwise protected speech are sufficiently offensive to require protection for the unwilling listener or viewer. Rather, absent . . . narrow circumstances . . . the burden normally falls upon the viewer to avoid further bombardment of his sensibilities simply by averting his eyes.

422 U.S. at 210-11 (internal quotation marks and citation omitted).

The state's interest in protecting unwilling viewers from exposure to patently offensive material is accounted for, to some degree, by obscenity doctrine, which originated in part to permit the state to shield the unwilling viewer. "The Miller standard, like its predecessors, was an accommodation between the State's interests in protecting the sensibilities of unwilling recipients from exposure to pornographic material and the dangers of censorship inherent in unabashedly content-based laws." Ferber, 458 U.S. at 756 (internal quotation marks and citation omitted); see also Miller, 413 U.S. at 18-19 ("This Court has recognized that the States have a legitimate interest in prohibiting dissemination or exhibition of obscene material when the mode of dissemination carries with it a significant danger of offending the sensibilities of unwilling recipients or of exposure to juveniles.") (citation omitted). To the extent that speech has serious literary, artistic, political, or scientific value, and therefore is not obscene under the Miller test of obscenity, the state's interest in shielding unwilling viewers from such speech is tenuous.

Nonetheless, the Court has recognized that in certain limited circumstances, the state has a legitimate interest in protecting the public from unwilling exposure to speech that is not obscene. This interest has justified restrictions on speech "when the speaker intrudes on the privacy of the home, or the degree of captivity makes it impractical for the unwilling viewer or auditor to avoid exposure." Erznoznik, 422 U.S. at 209 (citations omitted). Thus, in FCC v. Pacifica Foundation, 438 U.S. 726 (1978), the Court relied on the state's interest in shielding viewers' sensibilities to uphold a prohibition against profanity in radio broadcasts: Patently offensive, indecent material presented over the airwaves confronts the citizen, not only in public, but also in the privacy of the home, where the individual's right to be left alone plainly outweighs the First Amendment rights of an intruder. Because the broadcast audience is constantly tuning in and out, prior warnings cannot completely protect the listener or viewer from unexpected program content.

Id. at 748 (citation omitted); accord Frisby v. Schultz, 487 U.S. 474, 485 (1988) ("Although in many locations, we expect individuals simply to avoid speech they do not want to hear, the home is different."); see also Lehman v. City of Shaker Heights, 418 U.S. 298, 302 (1974) (plurality opinion) (upholding a content-based restriction on the sale of advertising space in public transit vehicles and noting that "[t]he streetcar audience is a captive audience").

Although neither the Supreme Court nor the Third Circuit has recognized a compelling state interest in shielding the sensibilities of unwilling viewers, beyond laws intended to preserve the privacy of individuals' homes or to protect captive audiences, we do not read the case law as categorically foreclosing recognition, in the public library setting, of the state's interest in protecting unwilling viewers. See Pacifica, 438 U.S. at 749 n.27 ("Outside the home, the balance between the offensive speaker and the unwilling audience may sometimes tip in favor of the speaker, requiring the offended listener to turn away.") (emphasis added). Under certain circumstances, therefore, a public library might have a compelling interest in protecting library patrons and staff from unwilling exposure to sexually explicit speech that, although not obscene, is patently offensive.

3. Preventing Unlawful or Inappropriate Conduct

Several of the librarians proffered by the government testified that unfiltered Internet access had led to occurrences of criminal or otherwise inappropriate conduct by library patrons, such as public masturbation, and harassment of library staff and patrons, sometimes rising to the level of physical assault. As with patron complaints, however, the government adduced no quantitative data comparing the frequency of criminal or otherwise inappropriate patron conduct before the library's use of filters and after the library's use of filters.

The sporadic anecdotal accounts of the government's library witnesses were countered by anecdotal accounts by the plaintiffs' library witnesses that incidents of offensive patron behavior in public libraries have long predated the advent of Internet access.

Aside from a public library's interest in preventing patrons from using the library's Internet terminals to receive obscenity or child pornography, which constitutes criminal conduct, we are constrained to reject any compelling state interest in regulating patrons' conduct as a justification for content-based restrictions on patrons' Internet access. "[T]he Court's First Amendment cases draw vital distinctions between words and deeds, between ideas and conduct." Ashcroft, 122 S. Ct. at 1403. First Amendment jurisprudence makes clear that speech may not be restricted on the ground that restricting speech will reduce crime or other undesirable behavior that the speech is thought to cause, subject to only a narrow exception for speech that "is directed to inciting or producing imminent lawless action and is likely to incite or produce such action." Brandenburg v. Ohio, 395 U.S. 444, 447 (1969) (per curiam). "The mere tendency of speech to encourage unlawful acts is insufficient reason for banning it." Ashcroft, 122 S. Ct. at 1403.

Outside of the narrow "incitement" exception, the appropriate method of deterring unlawful or otherwise undesirable behavior is not to suppress the speech that induces such behavior, but to attach sanctions to the behavior itself. "Among free men, the deterrents ordinarily to be applied to prevent crime are education and punishment for violations of the law, not abridgement of the rights of free speech." Kingsley Int'l Pictures Corp. v. Regents of the Univ. of the State of New York, 360 U.S. 684, 689 (1959) (quoting Whitney v. Cal., 274 U.S. 357, 378 (1927) (Brandeis, J., concurring)); see also Bartnicki v. Vopper, 532 U.S. 514, 529 (2001) ("The normal method of deterring unlawful conduct is to impose an appropriate punishment on the person who engages in it.").

4. Summary

In sum, we reject a public library's interest in preventing unlawful or otherwise inappropriate patron conduct as a basis for restricting patrons' access to speech on the Internet. The proper method for a library to deter unlawful or inappropriate patron conduct, such as harassment or assault of other patrons, is to impose sanctions on such conduct, such as removing the patron from the library, revoking the patron's library privileges, or, in the appropriate case, calling the police. We believe, however, that the state interests in preventing the dissemination of obscenity, child pornography, or in the case of minors, material harmful to minors, and in protecting library patrons from being unwillingly exposed to offensive, sexually explicit material, could all justify, for First Amendment purposes, a public library's use of Internet filters, provided that use of such filters is narrowly tailored to further those interests, and that no less restrictive means of promoting those interests exist. Accordingly, we turn to the narrow tailoring question.

2. Narrow Tailoring

Having identified the relevant state interests that could justify content-based restrictions on public libraries' provision of Internet access, we must determine whether a public library's use of software filters is narrowly tailored to further those interests. "It is not enough to show that the Government's ends are compelling; the means must be carefully tailored to achieve those ends." Sable Communications of Cal., Inc. v. FCC, 492 U.S. 115, 126 (1989). "[M]anifest imprecision of [a] ban . . . reveals that its proscription is not sufficiently tailored to the harms it seeks to prevent to justify . . . substantial interference with . . . speech." FCC v. League of Women Voters of Cal., 468 U.S. 364, 392 (1984).

The commercially available filters on which evidence was presented at trial all block many thousands of Web pages that are clearly not harmful to minors, and many thousands more pages that, while possibly harmful to minors, are neither obscene nor child pornography. See supra, Subsection II.E.7. Even the defendants' own expert, after analyzing filtering products' performance in public libraries, concluded that of the blocked Web pages to which library patrons sought access, between 6% and 15% contained no content that meets even the filtering products' own definitions of sexually explicit content, let alone the legal definitions of obscenity or child pornography, which none of the filtering companies that were studied use as the basis for their blocking decisions. Moreover, in light of the flaws in these studies, discussed in detail in our findings of fact above, these percentages significantly underestimate the amount of speech that filters erroneously block, and at best provide a rough lower bound on the filters' rates of overblocking. Given the substantial amount of constitutionally protected speech blocked by the filters studied, we conclude that use of such filters is not narrowly tailored with respect to the government's interest in preventing the dissemination of obscenity, child pornography, and material harmful to minors.

To be sure, the quantitative estimates of the rates of overblocking apply only to those four commercially available filters analyzed by plaintiffs' and defendants' expert witnesses. Nonetheless, given the inherent limitations in the current state of the art of automated classification systems, and the limits of human review in relation to the size, rate of growth, and rate of change of the Web, there is a tradeoff between underblocking and overblocking that is inherent in any filtering technology, as our findings of fact have demonstrated. We credit the testimony of plaintiffs' expert witness, Dr. Geoffrey Nunberg, that no software exists that can automatically distinguish visual depictions that are obscene, child pornography, or harmful to minors, from those that are not. Nor can software, through keyword analysis or more sophisticated techniques, consistently distinguish web pages that contain such content from web pages that do not.

In light of the absence of any automated method of classifying Web pages, filtering companies are left with the Sisyphean task of using human review to identify, from among the approximately two billion web pages that exist, the 1.5 million new pages that are created daily, and the many thousands of pages whose content changes from day to day, those particular web pages to be blocked. To cope with the Web's extraordinary size, rate of growth, and rate of change, filtering companies that rely solely on human review to block access to material falling within their category definitions must use a variety of techniques that will necessarily introduce substantial amounts of overblocking.

These techniques include blocking every page of a Web site that contains only some content falling within the filtering companies' category definitions, blocking every Web site that shares an IP-address with a Web site whose content falls within the category definitions, blocking "loophole sites," such as anonymizers, cache sites, and translation sites, and allocating staff resources to reviewing content of uncategorized pages rather than re-reviewing pages, domain names, or IP-addresses that have already been categorized to determine whether their content has changed. While a filtering company could choose not to use these techniques, due to the overblocking errors they introduce, if a filtering company does not use such techniques, its filter will be ineffective at blocking access to speech that falls within its category definitions.

Thus, while it would be easy to design, for example, a filter that blocks only ten Web sites, all of which are either obscene, child pornography, or harmful to minors, and therefore completely avoids overblocking, such a filter clearly would not comply with CIPA, since it would fail to offer any meaningful protection against the hundreds of thousands of Web sites containing speech in these categories. As detailed in our findings of fact, any filter that blocks enough speech to protect against access to visual depictions that are obscene, child pornography, and harmful to minors, will necessarily overblock substantial amounts of speech that does not fall within these categories.

This finding is supported by the government's failure to produce evidence of any filtering technology that avoids overblocking a substantial amount of protected speech. Where, as here, strict scrutiny applies to a content-based restriction on speech, the burden rests with the government to show that the restriction is narrowly tailored to serve a compelling government interest. See Playboy, 529 U.S. at 816 ("When the Government restricts speech, the Government bears the burden of proving the constitutionality of its actions."); see also R.A.V. v. City of St. Paul, 505 U.S. 377, 382 (1992) ("Content-based regulations are presumptively invalid."). Thus, it is the government's burden, in this case, to show the existence of a filtering technology that both blocks enough speech to qualify as a technology protection measure, for purposes of CIPA, and avoids overblocking a substantial amount of constitutionally protected speech.

Here, the government has failed to meet its burden. Indeed, as discussed in our findings of fact, every technology protection measure used by the government's library witnesses or analyzed by the government's expert witnesses blocks access to a substantial amount of speech that is constitutionally protected with respect to both adults and minors. In light of the credited testimony of Dr. Nunberg, and the inherent tradeoff between overblocking and underblocking, together with the government's failure to offer evidence of any technology protection measure that avoids overblocking, we conclude that any technology protection measure that blocks a sufficient amount of speech to comply with CIPA's requirement that it "protect[] against access through such computers to visual depictions that are (I) obscene; (II) child pornography; or (III) harmful to minors" will necessarily block substantial amounts of speech that does not fall within these categories. CIPA Sec. 1712 (codified at 20 U.S.C. Sec. 9134(f)(1)(A)).

Hence, any public library's use of a software filter required by CIPA will fail to be narrowly tailored to the government's compelling interest in preventing the dissemination, through Internet terminals in public libraries, of visual depictions that are obscene, child pornography, or harmful to minors.

Where, as here, strict scrutiny applies, the government may not justify restrictions on constitutionally protected speech on the ground that such restrictions are necessary in order for the government effectively to suppress the dissemination of constitutionally unprotected speech, such as obscenity and child pornography. "The argument . . . that protected speech may be banned as a means to ban unprotected speech . . . . turns the First Amendment upside down. The Government may not suppress lawful speech as the means to suppress unlawful speech." Ashcroft, 122 S. Ct. at 1404. This rule reflects the judgment that "[t]he possible harm to society in permitting some unprotected speech to go unpunished is outweighed by the possibility that protected speech of others may be muted . . . ." Broadrick v. Oklahoma, 413 U.S. at 612.

Thus, in Ashcroft, the Supreme Court rejected the government's argument that a statute criminalizing the distribution of constitutionally protected "virtual" child pornography, produced through computer imaging technology without the use of real children, was necessary to further the state's interest in prosecuting the dissemination of constitutionally unprotected child pornography produced using real children, since "the possibility of producing images by using computer imaging makes it very difficult for [the government] to prosecute those who produce pornography using real children." Ashcroft, 122 S.
