Porno for rugrats?

Afraid of Web smut? Try parenting, not censoring.

Published December 10, 1997 2:29PM (EST)

To keep kids away from potentially dangerous situations, parents install safety devices -- they put tabs on cupboards and gates at the tops of staircases. Now, the Internet industry would like parents to apply this kiddie-proofing principle to their home computers too. Bowing to growing pressure from the Clinton administration to make the Internet more "family friendly," computer industry leaders, legislators and members of the media met last week at The Internet Online Summit: Focus on Children, held in Washington, D.C., and proposed filtering software as the best way to keep children from encountering smut online.

"These tools promote a family-friendly Internet by giving parents the tools they need to provide their children with a safe educational and entertaining experience online," Vice President Al Gore said during the three-day conference.

Other initiatives discussed at the summit included a ratings system for the Internet and a series of public awareness campaigns to teach parents more about the risks children face online. Internet service providers and companies such as America Online and the Walt Disney Co. also promised to work with law enforcement officials to track down online pedophiles.

The software solution sounds simple in principle, but the conference offered no sure-fire plan on how to implement such a system effectively. In the wake of last summer's Supreme Court decision to strike down the Communications Decency Act -- which would have made it illegal to transmit "indecent" material to children online -- conservative groups such as the Family Research Council and the Christian Coalition continue to push for laws, not technology, to restrict children online. And civil libertarians still oppose infringements on free expression -- and point out that software filters often screen out a great deal of non-pornographic material in their quest to "clean up" the Net.

Salon spoke with conference speaker Lawrence Magid, a syndicated columnist for the Los Angeles Times, about the problems with filtering software. Magid, the author of two guides on children's online safety, is a proponent of shifting responsibility back to parents -- and away from technology and government.

Does making the Internet more "family friendly" infringe on First
Amendment rights?

The Internet is already family friendly. I heard that only 2.5 percent of Web sites are actually pornographic; the rest is basically positive. To homogenize the Internet so that it is suitable only for children would be a terrible tragedy. It would be like every TV channel being the Disney Channel or Nickelodeon. The Internet ought to reflect the diversity of our society -- a wide range of opinions, beliefs and aesthetics. To try to boil it down so it is only suitable for 7-year-olds would destroy it.

Will the use of filtering software keep kids safe online?

If your primary worry is to keep pornography out of the hands of kids,
filtering can do that, but I don't think it can protect you from
predators. The only way a filter can do that is to lock kids out of the
Internet completely, or lock them out of chat or e-mail. By
the way, the National Center for Missing and Exploited Children has
documented only 60 cases of children who have been abducted or molested or
abused as a result of an encounter online. That is 60 too many, but when
put in the context of millions and millions of kids who go online, you
really have to realize that your chances of being physically hurt
as a result of what you do online are relatively slim.

Who is to blame for those 60 cases?

The Internet is just a network. It is just a bunch of servers and
connections. You can't blame it for anything. You can blame the way
people use the Internet. There are parents who exercise poor judgment, which sometimes results in their child being victimized, but ultimately the criminals are the ones who are responsible for the criminal acts.

Was the focus of the conference really on the safety of children, or was it simply a debate between talking heads?

There was genuine dialogue. The debate was mainly between people for whom civil liberties are the No. 1 priority and those who are trying to balance civil liberties with protecting children.

Certainly there were some participants at the summit who would censor the Internet. But the issue is, nobody, including the ACLU, thinks there is anything
inherently wrong with giving
parents the choice to filter. Everybody says that if a parent chooses to
use a product, that's fine. Where the policy debate comes in is regarding
schools and libraries. I think that anybody who thinks that pornography is
likely to be a major problem in public schools and in public areas of
schools and libraries doesn't understand pornography -- it is primarily used in private. That is, what people do with
pornography, they are not likely to do in a room full of people. That is
not to say that there aren't teenage boys who might snicker at a screen
and go, "Look, augh, gee, goo."

Vice President Gore said at the conference that selective blocking isn't censoring, it's parenting. Is "parenting" using software that you can install, or is it talking to your kids directly and taking your own steps to make sure they avoid
inappropriate material?

Parenting is ultimately about developing a relationship with your child, where you're helping your child make good decisions. Parenting to me is not throwing a piece of software on the computer, walking out the door and saying, "OK, Johnny's safe. Now I can
go to my bridge club." Parenting is also not standing in front of the
computer 24 hours a day, watching Johnny, making sure he doesn't do the
wrong thing. Parenting is having the right and responsibility to monitor your children's behavior.

I have to be sympathetic to these people who say, "We can't watch our kid
24 hours a day. We need some help." That is the argument for filtering and
for blocking. If we are not going to do it through legislation, then we
have to do it through technology. Give parents the ability to use software
that does the equivalent of watching their kids online. I am not opposed to
that, but if that is how you are doing it, by blocking through Surfwatch, or
NetNanny, how is that helping your child deal with problems throughout
life? We don't have cops following us around all day, telling us not to do
this, not to do that. Hopefully, most of us are decent, law-abiding
citizens because we have a set of values, instilled in us by our parents.

- - - - - - - - - - - - - - - - - - - - -

Do you have any children? And if so, do they go online?

I have an 11-year-old son and a 13-year-old daughter. As far as I can
tell, they don't seem interested at the moment in looking at
erotic material. My son is a webmaster of a popular video game site.

If the worst thing that happens is that my children
fantasize about normal sexual activities -- and by normal I include both
gay and straight -- it's not the end of the world. I did that when I was a
teenager; I looked in books and magazines. I fantasized about how great it
would be to make love to a woman. I think you need to put it in
perspective. I have to agree that there is material on the Internet that is
far more disturbing, but parents need to
understand what they were like when they were kids -- not to condone
this but to try not to freak out.

You suggest that parents talk with their children before they go online. What do you advise?

These are my rules for online safety for children: Never give your name or your school name out; never give out any information that could possibly
identify you so that somebody who you encounter online could come and hurt
you. Be aware that when you are online, you are in public and you don't
know who is out there. I have told my children that there are places online that could make them feel uncomfortable, and that I don't want them to go to those sites -- and if they go to a site that is obviously for adults, they should leave that site right away and then let me know about it.

In fact, my son gets a lot of porno spam, which is basically e-mail inviting him to visit adult-oriented Web sites. He forwards them to my e-mail account, and I do what I can. There is not much you can do about it, but at least I am aware of it.

Critics of filtering and the talk-to-your-kid approach say that if the children really want to see a Web site, they will.

I don't have an answer to that. It doesn't matter what the government does or what the industry does. They are fighting hormonal drives and human nature, and I cannot think of a time in history when people have not gone out of their way, children and adults, to look at pornography.

In the attempt to block out "inappropriate" material, some educational content has been lost in the shuffle.

Sex education sites that teach AIDS prevention and birth control could
be endangered, possibly even sites that teach women to do breast
self-examinations. Any form of arbitrary rating that
filters out material on a mechanistic basis is silly. We had cases on AOL where the word "breast" was banned from public forums, and breast cancer awareness forums were suddenly blocked. On Prodigy, they used to have a filter for their public forums, and in their canine forum the word "bitch" -- a perfectly acceptable way to refer to a female dog -- was banned. It is all a matter of context.

Do you think that we are paying too much attention to this issue,
thereby ignoring more pressing issues that affect children?

What we have here is a real sound-bite issue. Pornography is a problem, I don't deny that. The number of predators that might sexually abuse or kidnap children is clearly a threat that we need to be aware of, but there are so many other things. One problem is companies encouraging children to reveal their identities on Web sites so they can market to them. There are
also people who commit fraud; people who overcharge you; people who try to convince you that a particular product is something it is not, like toy companies that promote toys that look a lot better on TV or on the Internet than they do in real life.

Could the initiatives proposed at the conference be a strategic maneuver by software makers to make money, to sell their products under the guise of "helping the children"?

I think companies like Surfwatch and NetNanny and others are up front
that they are in the business of making money. I don't see it as anything
unusual that companies capitalize on trying to make life better or
healthier. That is what the system is all about. These companies
tend to be fairly small but large enough to be
profitable, like Surfwatch, which is owned by Spyglass, and Cyberpatrol,
which is now owned by the Learning Company. We are not talking about
Microsoft or AT&T or big corporations here.

Is it such a good idea to regulate a medium so early in its life? The industry is trying to stave off government regulation, but don't you think it risks killing the whole medium?

That has been one of the major arguments against the CDA, that this
medium needs time to evolve, and the market has to decide. The other issue
is the question of what a community standard is -- what values do we regulate by? The government of Singapore has a set of standards, people in California have a standard -- nobody can agree on a set of standards. Some societies tend to be very open and permissive, others extremely oppressive and restrictive. On a global medium, it is impossible to come to a community standard. That is an argument in favor of these rating and filtering programs, the idea that parents can make their own decisions, but I think there is some naiveté in that. The ACLU points out that in reality these systems are all based on a very similar set of values. Parents aren't really going to have a lot of choice as to how to employ them.


By Dawn MacKeen

Dawn MacKeen is a former senior writer for Salon, and author of a forthcoming book about her grandfather’s survival of the Armenian Genocide, "The Hundred-Year Walk: An Armenian Odyssey" (Houghton Mifflin Harcourt, January 2016).
