To: Jean Rhein, Library & Leisure Services Director
Suzy Goldman, Technical Support Services Manager
From: Harlan Wright, Assistant County Attorney
Date: June 18, 1996
RE: Ability of a Library Offering Internet Access
to Patrons to Use Filtering Software to Limit Sites Accessed;
Whether Viewing Offensive Images and Text Constitutes "Disruptive"
Behavior Requiring a Patron to Leave the Library.
The Seminole County Library System (Library) is
planning to offer Internet access to Library patrons by placing
terminals in its several branches. The Library has already developed
a web page. This web page will become the graphical interface
from which patrons will access various Internet services including
the World Wide Web (WWW).[1]
The Library is concerned about the impact on other
patrons near terminals if an Internet user accesses images and sites
with profane language which other patrons find offensive. You
ask if the Library may legally use filtering programs,[2] e.g., "NET
NANNY", to limit those sites or the specific materials which
the patron[3] could access in order to avoid offensive images or
profane language. The Library would use the software not only
to block sites and materials of a sexual nature, but also sites
and materials objectionable on grounds that they were hate-related
or violent. If the Library may not utilize filtering software,
you ask, in the alternative, whether a patron who insists on accessing
sites which cause a reaction by other patrons could be characterized
as engaging in disruptive behavior[4] and thus be asked to leave
the Library.[5]
As you stated in your memorandum, the area of the
Internet and the First Amendment[6] is new and unsettled law. The
recently issued landmark opinion in ACLU v. Reno, -- F.3d
- (3rd Cir. 1996) is strongly suggestive, but does
not directly control the issues you have raised. The Communications
Decency Act, which the court just held unconstitutional, dealt
with criminal sanctions against the dissemination of patently
offensive and indecent materials. The issues you have raised deal
with a government agency's ability to restrict Internet access
to specific information, once that agency decides to offer general
access. To answer your questions, analogies to traditional First
Amendment cases involving library books must be used. The issues
you have raised not only involve the First Amendment[7], but
also have ramifications under Florida tort and criminal law.
Since the area of Internet law is unsettled,
a cynical attorney might advise you to do as you please relative
to filtering software until a court ruled otherwise. Since the
law is still evolving in this area, it is doubtful that the County
would be found to be liable for damages under 42 USC 1983 for
violations of constitutional rights.[8] However, a policy of adopting
filtering software at Library Internet sites is not legally viable[9]
in the long term. It would most likely involve the County in litigation.
Although not all expression is protected by the
First Amendment in theory, the range of expression not so protected
in practice is very narrow.[10] Much of what commonly is criticized
as being pornographic or obscene is legally "only" patently
offensive or indecent and, thus, protected. The recent decision
in ACLU v. Reno is the latest illustration of this legal
reality. The court found that graphic plays about
homosexual life, graphic movies about the transmission of AIDS,
graphic textual descriptions of prison rape, etc., were, at most,
indecent, not obscene. Similarly, the United States Supreme Court
has held that "depictions of nudity, without more, constitute
protected expression". See Osborne v. Ohio, 109 L.Ed.2d
98 (1990). Consequently, Internet messages with such content enjoy
protection under the First Amendment even when an (older) minor
is making or receiving the communication.[11] See ACLU v. Reno.
The court in ACLU v. Reno also held that the Internet
was a public forum.[12] This holding means any proposed regulation
of communication on the Internet is subject to the highest level
of relevant constitutional scrutiny. For direct government regulation
of communication based on content to survive court scrutiny, such
regulation must be a narrowly tailored measure and closely tied
to a critical government interest which is independent of the
content of the communication being regulated. For so-called "content-neutral"
time, place and manner regulation of the non-communicative impacts
of expression to survive, such regulation must serve an important
governmental interest unrelated to the content of the communications
being impacted, be narrowly tailored and permit alternative channels
of communication.
Under the scenario your memorandum proposes, the
Library will clearly be acting as governmental regulator. The
use of filtering software would not simply deal with the timing,
manner or place in which expressive communication occurs. The
use of screening software to block access to certain Internet
sites and materials from the Internet terminals it plans to provide
would involve the Library in direct regulation of communications
based on their content.[13] Moreover, filtering software blocks communication
even before it is completed. This attribute makes the policy even
more aggressive than the Communications Decency Act, which would
only have prosecuted persons after they had communicated certain
messages to minors. Freedom of expression includes both the right
to make and the right to receive a communication. See Virginia
State Bd. of Pharmacy v. Virginia Citizens Consumer Council, Inc.,
425 U.S. 748 (1976).
First Amendment protection of expression applies
with especial force in cases where prior restraint is exercised
by a governmental body. A court may very well determine that filtering
software used by the Library under the instant facts is a form
of prior restraint like prohibiting the publication and circulation
of a newspaper or the broadcast of a television or radio program
until cleared by a censor. Even when targeted material is arguably
unprotected obscenity or pornography, informal prior restraint
measures like filtering software almost never pass court review.
For the screening software policy to be likely to survive constitutional
muster, it must provide for court determination of whether the
proposed sites and materials to be blocked are truly obscene or
pornographic. See Kingsley Books, Inc. v. Brown, 354 U.S.
436 (1957). Of course, such an arrangement is not feasible for
the Library. Moreover, even if it were, much of what screening software
blocks would not be found to be legally obscene. Hence, the filtering
software policy would be gutted.[14]
Since the Internet is a new phenomenon, no appellate
case involving a public library and screening software exists
to specifically illustrate the foregoing legal analysis. A reasonable
analogy may be found in a United States Supreme Court opinion
involving a school library's removal of books as the result of
partisan, religious, social, lifestyle or other "controversial
political" preferences.
In Board of Education v. Pico, 73 L.Ed.2d
435 (1982), the United States Supreme Court ruled in a decision
with no one majority opinion that the removal of nine books from
a school library violated students' First Amendment rights. The
grounds for removal were a conservative[15] school board's determination
that the books were "anti-American, anti-Christian, anti-Semitic
and just plain filthy". The majority reached this result,
although the court wrote that it strongly defers to local community
control of education. An individual does not have a constitutional
right to school or public library services. Nevertheless, once
government decides to provide a library (or Internet access to
information), First Amendment protections take effect for the
patron's benefit. The court reasoned that school library use,
as opposed to school curricula, entails individual student decision-making
and voluntary learning. Accordingly, First Amendment considerations
had to be given greater weight than community control.
The court stressed that its ruling in Pico
did not limit school boards' discretion as to the selection of
books for school libraries since the facts of the case did not
raise this issue. The tenor of the opinion suggests that the Supreme
Court would afford a local school board greater discretion in
the selection of books. However, the court probably would not
uphold a selection policy which was overly restrictive due to
decision-makers' dislike of ideas contained in works sought by
responsible faculty and students and commonly available in other
districts' school libraries.
Pico lends support to this memorandum's conclusion
that a court would not uphold use of prescreening software by
the Library. The use of screening software to prevent access to
various Internet sites and materials, after the Library decides
to offer Internet access, may be analogized to the furnishing
of a library with books only to have some subsequently removed.
Although the Library may have many minors as its patrons, the
Library caters mainly to adults. A public library is also not
so centrally involved in the inculcation of community values as
is a public school library. Accordingly, a court need not defer
to community control as in the case of a school library in determining
the scope of individual patrons' First Amendment freedom to receive
messages. Conversely, voluntary exposure to information and individual
choice must be accorded greater weight in a public library setting
than in one involving a school library.[16] Even the use of a public
library by minors is further removed from curriculum decisions.
The one possible supportive aspect of Pico for a policy
of screening Internet sites is that a reviewing court may take
note of the Supreme Court's forbearance regarding original material
selection. The court may hold that deciding which Internet sites
or materials to block was the equivalent of deciding which new
book to purchase and that a local decision-making body may set
guidelines. However, this reasoning presumes that a public library
is the equivalent of a school library as to the balancing of the
interests of First Amendment values and community control values.
Assuming for the sake of analysis that a court found
filtering software did not constitute prior restraint, the probability
of a policy requiring the utilization of filtering software passing
constitutional muster would still be low. The policy and the software
are measures suffering from overbreadth and possibly "underbreadth".
The doctrine of overbreadth is applied by courts to nullify statutes
and regulations which are so broadly drawn that they not only
prohibit constitutionally unprotected expression, but also protected
expression. Under contemporary constitutional and standing law,
the doctrine of overbreadth permits a patron to successfully challenge
the filtering software policy, even though the site(s) he or she
sought to access contained patently obscene or other constitutionally
unprotected expression.
Courts do not apply the overbreadth doctrine lightly.
A statute or regulation must be substantially overly broad as
to the expression it prohibits before a court will declare the
measure to be unconstitutional. See Broadrick v. Oklahoma,
413 U.S. 601 (1973). Nevertheless, a reviewing court would likely
find the proposed filtering software policy overly broad. The
software targets Internet sites and information which, although
possibly offensive to many patrons, clearly lie within the ambit
of First Amendment protection for adults and older minors. The
fact that filtering software works in mysterious ways only worsens
the problem. For example, a possible "false positive" blocked
by filtering software could be a Roman Catholic website addressing
the veneration of the Blessed Virgin, i.a., her immaculate
conception. Conversely, a truly obscene text-based[17] site
which strategically utilized the words "buttofucco"
or "funk" to describe certain activities could escape
detection, at least if its operators frequently changed its address.
This "underbreadth" reduces the policy's merit and the
potential harm of nullifying it.
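To illustrate these twin defects in concrete terms, the following
is a minimal sketch, in Python, of the naive literal-matching approach
described in footnote two. The word list and sample pages are my own
invented illustrations; they do not reproduce the actual logic or
blocklist of any commercial product.

    # Illustrative sketch only: a naive substring filter of the kind
    # described in footnote 2. The blocklist is hypothetical.
    BLOCKED_WORDS = ["sex", "nude", "xxx"]

    def is_blocked(page_text):
        """Block a page if any listed word appears as a literal substring."""
        text = page_text.lower()
        return any(word in text for word in BLOCKED_WORDS)

    # Overbreadth (false positive): an innocent page is blocked because
    # a listed word occurs inside an ordinary word.
    print(is_blocked("A history of Middlesex County parishes"))  # True

    # Underbreadth (false negative): a deliberate misspelling escapes
    # because the filter matches strings, not meanings.
    print(is_blocked("Hot pix! Totally seks and nood inside!"))  # False

As the sketch shows, the same mechanical matching that blocks protected
expression also fails to block a determined purveyor of unprotected
expression.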
In conclusion to this subsection, Florida law clearly
states that shielding minors from obscene communications is a
compelling governmental interest. See Sections 847.001 (Definitions
of "Harmful to Minors", "Obscene", i.a.),
847.011 (Prohibition of Certain Acts in Connection with Obscene,
Lewd, etc., Materials), 847.012 (Prohibition of Sale or Other
Distribution of Harmful Materials to Persons Under 18 Years of
Age), 847.0133 (Protection of Minors), 847.0135 (Computer Pornography).
The courts have upheld this goal as an important legitimate governmental
interest which may be realized by narrowly tailored restrictions
on the manner of communications distribution. Unfortunately, the
Library's proposal to use screening software is not such a narrowly
tailored measure. Filtering software is too broad and too preemptive
and prevents often constitutionally protected communications from
reaching not only minors, but also adults. Contrast the proposed
filtering software policy with presumably constitutional Section
847.0135, Florida Statutes. The statute narrowly focuses upon
computer transmissions knowingly made for the purposes of obtaining
information (including visual depictions) regarding or facilitating
or soliciting sexual conduct with a minor. For the proposed filtering
software policy to have any chance of legal viability, the Library
must find the funds for separate terminals for use by separate
age groups. Filtering software should probably not be used at
all at terminals reserved for adults and only sparingly at terminals
reserved for minors over fifteen years of age.
In the contingency that a court upheld the use of
filtering software at Library Internet sites, the use of such
software still entails legal risk for the County. Promising patrons
that they or their children will not be exposed to certain Internet
sites or material, and then failing to deliver on such promises
could subject the Library to tort liability for negligent infliction
of emotional distress. Undertaking to select which messages reach
its patrons through the Internet could also subject the Library
to liability for the intentional torts of defamation and invasion
of privacy.[18] This memorandum does not address potential tort liability
for damage to software due to the importation of computer viruses,
first encountered by a patron at a Library Internet terminal,
to other networks or computers. This memorandum also does not
discuss tort liability for illegal or unauthorized copying of
software or other intellectual property. These two assumptions
rest, in turn, upon the assumption that the Library will not offer
patrons Internet terminals with disk drives.
Liability for the negligent infliction of emotional
or mental distress will lie under Florida law even if the only
physical harm (a necessary element for this tort) present is the
byproduct or result of emotional trauma. See Gonzalez v. Metro.
Dade County, 651 So.2d 673 (Fla. 1995). The likelihood of
County liability for negligent infliction of emotional distress
is not great. However, the risk of liability, combined with the
costs of even unsuccessful litigation by plaintiffs, is sufficiently
great to be a factor in Library decision-making about whether
to adopt filtering software.
Emotional distress is not the only tort liability
which could result from the Library's adoption of filtering software.
Under a second scenario, liability could emerge from the very
fact that the Library was undertaking to screen the sites and
materials its patrons could access. Libraries normally are classified
as "distributors" of information. As a result, they
may not be held liable for the obscene or defamatory contents
of books and other publications they select for their collections.
See Smith v. California, 361 U.S. 147 (1959). Pre-selecting
sites and materials could turn the Library into a "publisher"
for the purposes of defamation and invasion of privacy laws. Unlike
distributors, publishers may be held liable for obscene or defamatory
content of communications they disseminate or for invasion of
privacy. In essence, the Library would be falling into the same
trap as Prodigy did in Stratton Oakmont, Inc. v. Prodigy Services
Co., -- NY --, -- A.2d -- (NY App. Ct. 1995), where the court
held the computer service liable for defamatory comments posted
on it.
If it uses screening software, reinforced by staff
guidelines, the Library would be exercising editorial control
over the specific content of information it was disseminating
as does the board of editors of a newspaper or encyclopedia. By
holding itself out to its patrons as being able to control the
content of sites accessible through its Internet terminals, the
library would only exacerbate the situation. These factors were
the ones that led a New York court to declare Prodigy a publisher
and liable for defamatory statements.[19] The same analysis applies
to the intentional tort of invasion of privacy.
Admittedly, the County and its agencies usually
may not be held liable for intentional torts. Section 768.28,
Florida Statutes, does not waive sovereign immunity for acts
by employees which are outside the scope of employment, or committed
in bad faith or in a wanton and willful manner. See Williams
v. City of Minneola II, 619 So.2d 983 (Fla. 5th
DCA 1993). Local governments are, however, not completely immunized
from liability for intentional torts since some acts could be
committed within the scope of employment and without bad faith
or malicious purpose or willful disregard of human safety
or rights. See Richardson v. City of Pompano Beach, 511
So. 2d 1121 (Fla. 4th DCA 1987) and Hennegan v.
Dept. of Highway Safety, 467 So. 2d 748 (Fla. 1st
DCA 1985). One must recall that the word "intentional"
in the phrase "intentional tort" may refer to a mere
half-conscious decision to act, and not just the common sense
meaning of consciously acting to effect a malicious purpose. Accordingly,
the Library's provision of Internet terminals and its selection
of which messages reach which users are intentional acts.
Using filtering software might also cause the Library
to commit a criminal violation. Many Internet filtering applications
keep records of the sites and materials accessed by a user, unbeknownst
to that user. Failing to avoid or, at least, disable software with
this monitoring function could violate Section 257.261, Florida
Statutes. This section prohibits the release of and the permanent
maintenance of patron borrowing records. Even if criminal liability
did not lie under these facts, Section 257.261 may be relied upon
as persuasive authority in tort litigation by a patron seeking
relief for invasion of reasonable privacy expectations. At the
very least, the Library should inform patrons of the monitoring
function's existence. If the Library were to offer its patrons
Internet telephone capability in the future, such monitoring could
also lead to violations of Section 934.03, Florida Statutes. This
section prohibits monitoring or disclosure of aural telephone
communications outside the normal course of business of a telephone
company.
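If the Library nonetheless deploys software with such a monitoring
function, the prudent configuration is to retain no record of a
patron's session once it ends. The following is a minimal sketch, in
Python, of that non-retention approach; the file name and functions
are hypothetical illustrations, not the interface of any actual
product.

    # Illustrative sketch only: a per-session access log that is
    # destroyed when the patron's session ends, so that no permanent
    # borrowing-style record accumulates (cf. Section 257.261, Florida
    # Statutes). The log name and format are assumptions.
    import os

    SESSION_LOG = "session_access.log"

    def record_visit(url):
        """Append a visited address to the current session's log."""
        with open(SESSION_LOG, "a") as log:
            log.write(url + "\n")

    def end_session():
        """Delete the log so nothing about the patron's use survives."""
        if os.path.exists(SESSION_LOG):
            os.remove(SESSION_LOG)

    record_visit("http://www.example.org/")
    end_session()  # no record of the sites visited is retained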
A policy of asking patrons to leave the Library
for being disruptive simply because another patron did not like
the constitutionally protected graphic or text message the first
patron was receiving through the proper use of County-provided
Internet terminals would be no more legally viable than the screening
software proposal. A court would likely rule that a patron's simple
viewing of an image or text message at a terminal provided by
the Library was part of the constitutionally protected Internet
communication process. To reiterate, the third circuit held in
ACLU v. Reno that the Internet is a pure public forum subject
to the highest level of protection under the First Amendment.
The disruption policy fails like the filtering software policy
because it prevents protected communication through the Internet
for reason of that communication's content.
Even if a court ruled that viewing a message received
over the Internet at a Library terminal constituted a different
stage of communication, public libraries are themselves "limited
public fori". See Kreimer. This classification ensures
that the making and receipt of communication in a manner normally
compatible with the operation of a Library receives significant
protection from government regulation. Past pretextual use by
libraries of local ordinances prohibiting disorderly conduct,
loitering and trespass to discriminate against minorities and
holders of unpopular viewpoints guarantees that the Library's
use of a policy against disruptive behavior will receive very
strict scrutiny as a matter of practice, if not of law. See Georgia
v. Rachel, 16 L.Ed.2d 925 (1966) and Achtenberg v. Mississippi,
303 F.2d 468 (5th Cir. 1968).
Preventing disruptive behavior, as your memorandum
defines it in content-related terms, would be held not to be a legitimate
governmental goal. The right to receive expressive communication,
as well as its making, is protected by the First Amendment. Once
again, see Virginia State Bd. of Pharmacy v. Virginia Citizens
Consumer Council, Inc. The United States Supreme Court has held that
unhappiness on the part of observers or bystanders with the content
of a speaker's message is not a legitimate ground for police to
interfere with the delivery of that message by invoking a breach
of the peace statute, even when the bystanders threaten violence.
See Terminiello v. Chicago, 93 L.Ed. 1131 (1949), and Edwards
v. South Carolina, 9 L.Ed.2d 697 (1963). These holdings surely
apply with even greater force to the orderly receipt of information
in a library setting where the chance of violence is remote. Admittedly,
Kreimer held that Library enforcement of policies to prevent
a patron from annoying or harassing other patrons did not violate
the First Amendment. However, the annoyance and harassment present
in the Kreimer fact pattern involved physical and verbal
behavior by a patron directed at other patrons. Nothing in Kreimer
dealt with unhappiness or indignation felt by one patron in response
to the content of communication voluntarily being received by
another in the ordinary course of library operations.
Even if a policy against disruption were treated
by a court as being a time, manner and place regulation, as opposed
to a measure directly affecting expression, the policy still would
very probably not pass constitutional muster. To reiterate, for
so-called "content-neutral" time, place and manner regulation
of the non-communicative impacts of expression to survive, such
regulation must serve an important governmental interest unrelated
to the content of the communications being impacted, be narrowly
tailored and permit alternative channels of communication. The
purpose of the proposed policy against disruption, preventing
other patrons' unhappiness over controversial images or text being
displayed, is not a legitimate governmental interest unless such
images and text do not come under the protection of the First
Amendment. In addition to not being a legitimate governmental
interest, such a broad application of a policy against disruption
would fail constitutional muster for not being content neutral.
The proposed policy would also not be narrowly tailored. For example,
an Internet computer terminal could be screened from general view
as a means to lessen public offense at controversial sights, sounds
and texts.
Footnotes:
1. Your memorandum states that certain Internet services will not be offered. I urge you to be cautious in stating what can be done. While email and Internet relay chat may be foregone by not including the appropriate software protocols, I believe that the integrated nature of WWW navigator, browser and search software will make it difficult to eliminate usenet access or the capacity to send email to some addresses. Using a 386 computer with Internet text-only access and browser and search engine software without forms capability, i.e., backward technology, I was able to quickly reach usenet groups after starting from the Library's current web page. I was also able to send email from certain web sites which I reached using options provided for by the Library's web page. Furthermore, I was able to reach "spicy" and "naughty" web, usenet and gopher sites in nine to eleven steps.
2. My understanding of filtering software is that such applications still use primitive, non-relational compare-and-contrast search algorithms. This attribute means, first, that they must be frequently updated in order to cover new prohibited site names. Second, the "spontaneous" decisions they make can lead to strange results. For example, one popular application will not permit the user to visit one congressional web site because it uses the word "coupling" in a non-sexual context. Third, many filtering applications use directories that any self-respecting eleven-year-old computer whiz can break into and modify or disengage. (Admittedly, even a whiz will probably not have sufficient time to accomplish this task working within a limited half-hour session at the library. However, the whiz could have access to computers at school or home.) Fourth, the rote nature of filtering software means that such applications must be reset before they can handle different levels of concern about sex or profanity or violence. In other words, the "decency" delivered by a filtering program at level "A" may suit individual one, but not individual two. The ramifications of these limitations for the Library in the context of potential tort liability are discussed below.
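To make the first limitation concrete, the following is a minimal sketch, in Python, of a literal address blocklist; the site names are invented. Because the list matches exact addresses, a blocked site that simply moves to a new address escapes the filter until the vendor distributes an updated list.

    # Illustrative sketch only: a literal address blocklist of the kind
    # described above. The addresses are hypothetical examples.
    BLOCKED_SITES = {"www.badsite.example", "spicy.example"}

    def site_allowed(host):
        """Allow any address not literally present on the blocklist."""
        return host.lower() not in BLOCKED_SITES

    print(site_allowed("www.badsite.example"))   # False: blocked
    print(site_allowed("www.badsite2.example"))  # True: the renamed
                                                 # site slips through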
3. This memorandum assumes that the Library will not have sufficient equipment and staff budget to permit the segregation of terminal use by age group: under thirteen years of age, thirteen to eighteen years of age and adults over eighteen. Consequently, this memorandum does not deal with "failures to supervise and segregate" issues which could lead to tort liability. It also does not deal with the problems of near-adult First Amendment rights for high school seniors and juniors. The United States Supreme Court has held that minors are entitled to a significant measure of First Amendment protection. Only in relatively narrow and well-defined circumstances may government bar public dissemination of protected materials to them. Erznoznik v. City of Jacksonville, 422 U.S. 205, 212-213 (1975).
4. This memorandum assumes that the patron receiving the message is not engaging in any loud or otherwise attention-grabbing behavior which would obviously be inappropriate for a library setting. If such were the circumstances, the focus would shift from the images or text being displayed on the computer screen to the individual's behavior. The issue of interference with the making or receipt of expression would disappear. Libraries may exclude patrons for antisocial behavior such as loud, boisterous conduct, physically harassing or annoying other patrons, or poor personal hygiene. See Kreimer v. Bureau of Police, 958 F.2d 1242 (3rd Cir. 1992).
5. This memorandum assumes that the Library will formulate strict guidelines concerning the criteria used to determine which sites and materials are to be blocked by filtering programs and which types of materials displayed on terminal screens will constitute disruptive behavior. If the Library decides to pursue these proposed policies, these guidelines should be submitted to the Board of County Commissioners for approval. If staff are given leeway in the enforcement of these proposed policies, such leeway would give rise to additional constitutional objections.
6. This memorandum assumes that any legal challenge to Library policies regarding the use of filtering software or "disruptive" use of the Internet would not be a facial challenge. Rather, the legal challenge would be brought against the policies and regulations as enforced. This memorandum makes this assumption, first, because the Library has not yet developed any specific policies and regulations which this memorandum can evaluate and, second, because a facial First Amendment challenge against government policies and rules is less likely to succeed than one against policies and rules as applied.
7. Originally, I had planned to analyze the issues raised by the use of filtering software and the proposed broad definition of disruption not only under the First Amendment, but also under the Fourteenth Amendment's Equal Protection Clause. I decided to forego this latter analysis out of considerations of time and this memorandum's length. However, I want to briefly advise you of two pitfalls in the event the Library starts using filtering software. In determining which sites or materials are blocked, contemporary preferences of the left and right must not dictate what gets cleared or what gets blocked.
For example, Ku Klux Klan sites must not be blocked, not only due to the First Amendment (see footnote ten), but also due to the Equal Protection Clause. Certain minority-oriented sites could contain rap lyrics which are as racially conflict-oriented as material at a Klan site. Nevertheless, such rap lyrics are not as frequently criticized as are Klan pronouncements. In a similar vein, the Library must be evenhanded in dealing with heterosexual and homosexual oriented sites and materials. A strong, if not uniform, trend in recent judicial decision-making is to afford homosexual (gay/lesbian/bisexual) interests protection under the First Amendment and Equal Protection Clause. If the Library were to block sites and materials of interest to these latter communities or to ask a patron viewing such sites or materials to leave the Library for disruption, resulting litigation based on the First Amendment and the Equal Protection Clause could succeed. See Romer v. Evans, -- US -- (1996), which held that gays and lesbians enjoyed protection under the Equal Protection Clause, and Shahar v. Bowers, 70 F.3d 1218 (11th Cir. 1996), which held that, under the freedom of expression and association guarantees of the First Amendment, the Georgia attorney general could not deny employment to a lesbian attorney upon learning of her lesbian marriage unless he could show a compelling governmental interest. Fears that employing an attorney involved in a lesbian marriage would be construed as tacit support for the concept and disrupt the functioning of the attorney general's office were deemed by an eleventh circuit panel not to be compelling governmental interests. A motion for rehearing was granted by the entire eleventh circuit in March, 1996. However, the motion was granted before Romer v. Evans was handed down. As this memorandum was being finalized, the holding of Romer v. Evans was augmented by the United States Supreme Court on June 17, 1996, when it vacated a sixth circuit opinion that had found a Cincinnati ordinance disadvantaging gays and lesbians constitutional.
8. Nevertheless, in light of ACLU v. Reno and case law involving library books, I cannot totally rule out liability for damages. Moreover, you must determine whether Library use of filtering software would alienate members of the community who are normally supportive of the Library.
9. This is with the exception of terminals reserved for children under thirteen, which this memorandum once again assumes the Library will not be able to provide for reasons of budget and administrative practicality.
10. This discussion concerning constitutional problems in the use of filtering software focuses on sexually related sites and materials. If the Library may not constitutionally block such sites, it will not be able to block most sites advocating racial, ethnic, lifestyle or gender-based conflict or ones discussing or depicting violent behavior. Conflict and violence sites are more likely than sexually-related sites to directly involve political speech, which enjoys the highest degree of constitutional protection. Courts have also narrowed the doctrines under which such conflict and violence-related sites may be excluded from protection under the First Amendment. The doctrines of "clear and present danger" and "fighting words" require expression to have the characteristics of directly advocating specific lawless or violent behavior, of directly targeting an object or recipient and of having the likely effect of immediately provoking lawless or violent behavior before such expression may be banned. To illustrate, the United States Supreme Court recently held that the placing by the Ku Klux Klan of a cross at a plaza of the Ohio Capitol was protected speech. See Capitol Square Review and Advisory Board, et al. v. Pinette, -- U.S. -- (1995).
11. It must be remembered that filtering software has been designed to give parents discretion regarding what sites and materials their children may access. What access a parent may prohibit a child from having and what access government may prohibit are radically different matters. Most of the subject categories which screening software permits a parent to choose in order to block messages contain constitutionally protected expression even for older minors: partial and full nudity, ethnic impropriety, satanism, depictions of the drug culture, radical groups, gambling (discussions), depiction and discussion of alcoholic beverages, etc. See ACLU v. Reno.
12. The refusal of the court to equate the Internet with broadcasting destroyed the usefulness of an eleventh circuit opinion involving a public broadcasting television station as an analogy for the defense of screening software. In Muir v. Alabama Ed. Television Com'n, 688 F.2d 1033 (11th Cir. 1982), the court held that the withdrawal of a previously advertised program ("Death of a Princess") did not constitute censorship under the First Amendment. In reaching this result, the eleventh circuit held that public television stations were not public fori in which individual viewers had the right to compel the broadcast of specific programming.
13. When a rule is directed at the origin of expression or the ultimate right of a person to present or procure expression, that rule is directed at the regulation of content. See American Booksellers v. Webb, 919 F.2d 1493 (11th Cir. 1990).
14. Indirect burdens placed upon protected speech for adults and for minors in order to regulate obscenity for minors or adults can survive if supported by important governmental interests. Protecting minors from truly obscene materials is such an interest. However, even assuming the sites and materials being blocked were obscene for minors, a governmental entity like the Library may not prohibit adult access to material that is not obscene for adults. See American Booksellers at 1500-1502. A recent illustration of this rule is a case, Playboy Entertainment Group, Inc. v. United States, -- F.Supp. -- (D. Del. 1996), involving cable television, a medium which enjoys less First Amendment protection than the Internet. In granting a temporary restraining order, the court held that serious questions existed as to the constitutionality of a provision of the 1996 Telecommunications Act which required the blocking of adult video programming from minors by means of scrambling. The court seriously questioned whether this arrangement was the least restrictive means of achieving the government's interest in regulating the accessibility of programming to minors.
15 "Conservatives" are the traditional villains in censorship litigation. However, the roles played by "liberals" and "conservatives" may be changing in censorship cases with the establishment of "political correctness" on college campuses and in other public institutions.
16. Interestingly, the Georgia statute upheld by the eleventh circuit in American Booksellers only prohibited the commercial display and distribution of materials "harmful to minors". (The definition of this term closely resembles that of materials, the sale or distribution of which is prohibited to minors under Chapter 847, Florida Statutes.) Out of First Amendment concerns, the Georgia legislature exempted public, school and college libraries from the regulations imposed by the law. Id. at 1509.
17. And possibly one utilizing visual images as well.
18. A court would likely find that the decision to adopt filtering software or to select among messages was a policy decision enjoying immunity under state law. See Dept. of HRS v. B.J.M., 656 So.2d 906 (Fla. 1995). Conversely, this memorandum assumes that any failure of blocking software to prevent access to prurient, violent or hateful Internet sites and materials by reason of defective technology or deficient utilization would constitute an operational function (act or decision), not a policy-level function which enjoys sovereign immunity. The same assumption applies to the failure to screen out defamatory statements or statements which invade privacy, once the Library undertook screening messages. This memorandum further assumes that a policy provision to provide blocking software to protect the sensibilities of library patrons gives rise to a sufficiently defined group of beneficiaries that the County's raising of the defense of "no special duty owed to a plaintiff" would fail to prevent liability from arising, ceteris paribus. See Rupp v. Bryant, 417 So. 2d 658 (Fla. 1982). This latter issue would not arise in the case of defamation or invasion of privacy since the party to whom a "publisher" (see the discussion in the main text) owes a duty of care is readily identifiable.
19. Note the very different result from Stratton Oakmont in Cubby v. CompuServe, 776 F. Supp. 135 (S.D.N.Y. 1991), in which the court held CompuServe was not liable for content because it did not review the content of communications posted on its service. The CompuServe court reasoned that by not reviewing messages for content, CompuServe was a distributor, not a publisher, just like a typical library.