Title: US CA: OPED: Should We Tame The Internet?
Published On: 1998-07-08
Source: San Jose Mercury News (CA)
Fetched On: 2008-09-07 06:31:20
SHOULD WE TAME THE INTERNET?

`Filters' are already prevalent in everyday life

LET'S BE realistic: The Internet has an image problem. If it's to approach
its potential as a global democratic-commercial medium, the perception of
the Internet as a latter-day Wild West -- a haven for child pornographers
and other unsavory characters -- must change.

In some ways, the Supreme Court's ruling last summer in Reno v. ACLU, that
the Internet was to receive the highest level of First Amendment protection,
only intensified the problem, by highlighting the difficulties of regulating
-- or even monitoring -- the medium. It would be a bittersweet victory if
the court's speech-protection mandate were to slow the Internet's growth.

Since the Reno decision, the debate has shifted to technology. Perhaps
Internet filters -- software allowing users to ``screen out'' adult
materials -- will provide the answer without government regulation.

But many in the Internet community have responded with puzzling vitriol:
crying ``censorship'' and bemoaning the death of diversity on the Internet.

To be sure, early implementations of these technological filters -- like
SurfWatch and NetNanny -- provided easy targets. Compiling the massive
databases at their core is a daunting task; despite the efforts of their
creators, they both lagged behind the growth of the Internet (thus blocking
too little) and swept too broadly (thus blocking too much). But the
technology is changing, and so should the debate.
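
To make the contrast with what follows concrete, here is a minimal sketch, in
Python, of the database-driven approach those early filters relied on. The
site names and the matching rule are hypothetical and purely illustrative;
real products shipped far larger, proprietary lists.

```python
# Illustrative sketch only: a database-driven filter in miniature.
# The entries below are hypothetical placeholders, not real sites.
blocked_sites = {"example-adult-site.example", "another-blocked-site.example"}

def is_blocked(url):
    """Block a URL only if its host appears in the compiled database."""
    host = url.split("//")[-1].split("/")[0]
    return host in blocked_sites

# A brand-new site is not yet in the database, so it slips through --
# the "blocking too little" problem described above.
print(is_blocked("http://brand-new-site.example/page.html"))  # False
```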

A promising new system, the Platform for Internet Content Selection (PICS),
is emerging. Unlike earlier filtering schemes, PICS does not rely on a
database. Instead, it uses ``tags,'' usually inserted into the invisible
parts of Web pages. These tags, placed by the producer of the material,
contain simple codes that allow PICS-compatible software to determine the
``rating'' of a particular Web page or e-mail message. If the material's
rating exceeds the limits that the user sets in the program's preferences --
by depicting explicit sex, for example -- then the content is blocked.
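
As an illustration of the logic just described, here is a minimal sketch, in
Python, of how PICS-compatible software might compare a page's rating against
the limits a user has set. The category names and numeric values are
hypothetical; the actual PICS specification defines a formal label syntax
carried in a page's tags rather than this simplified form.

```python
# Illustrative sketch only: comparing a page's decoded rating against
# user-set limits. Categories and values are hypothetical examples.

# A page's rating, as it might be decoded from its embedded tags.
page_rating = {"sex": 3, "violence": 1, "language": 0}

# Limits the user sets in the filtering software's preferences.
user_limits = {"sex": 1, "violence": 2, "language": 2}

def exceeds_limits(rating, limits):
    """Block the page if any rated category exceeds the user's limit."""
    return any(rating.get(category, 0) > limit
               for category, limit in limits.items())

if exceeds_limits(page_rating, user_limits):
    print("Blocked: the page's rating exceeds the user's preferences.")
else:
    print("Allowed.")
```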

PICS solves a massive information problem: By knowing the rating before the
material is viewed, the user is shielded from unwanted materials. It's as if
you could know what junk mail would arrive in your mailbox today, in time to
have someone throw it away for you.

Disappointingly, many denizens of the Internet are no more pleased with PICS
than with the earlier filters. In some cases, their concerns are valid. The
potential for some entity to impose PICS filtering ``upstream'' --
pre-empting the end user's control -- is troubling. But it's also not likely
to happen. If done by the government, it would be unconstitutional; if done
by an Internet service provider, an individual could simply switch to
another service.

Further, the argument that PICS rating and blocking would target content
(and thus limit speech) fails to consider that PICS will operate in a
marketplace for ideas. Certainly someone could create a PICS ratings system
categorizing political affiliation, sexual orientation, or even whether the
creator used too much of the color blue. But in order to block content on
the basis of ``blueness,'' Web surfers would have to install and select the
``blueness'' PICS filter.

Internet users will ``vote'' for categories with their choice of PICS
filters. The more broadly based rating systems -- those unlikely to
categorize far beyond adult content -- will prove more acceptable. Over
time, universal PICS ratings systems will reflect the diversity of the
Internet's users.

Some commentators argue that Internet filters are unconstitutional. But what
gets lost in the rhetoric is that the Constitution distinguishes the public
realm from the private realm. The First Amendment says ``Congress shall make
no law . . . abridging the freedom of speech''; it says nothing about an
individual's efforts to be free of unwanted material. So while governmental
filtering would be impermissible, government support for private filtering
would not.

Every day we use the real-world content ``filters'' that the market and
modern life have provided: TV guides and channel changers, television and
movie reviews, to name a few. Rather than clinging to an idealized version
of the Internet's past, our focus should be on building cyberspace for the
future -- and recognizing the power of millions of individual private choices.

Polk Wagner is a 1998 graduate of Stanford Law School who has written
extensively about the intersection of law, technology, and the Internet. His
e-mail address is polk.wagner@pobox.com.
