Fear of child abuse as a fig leaf for censorship.

One thing that really struck me at the Internet Governance Forum in Athens (see Human Rights at the Internet Governance Forum) was the way that many government and corporate representatives cited child pornography as a reason for filtering the internet. Perhaps it was invoked so often because opposition to child porn is seen as a shared global value, but I didn't like the way it was constantly held up to symbolize the threat of the net. The Universal Declaration of Human Rights allows limited exceptions to freedom of expression so that things like child pornography can be dealt with. But these exceptions must go through proper legal process and be applied in a specific, proportionate and concrete way, not waved around as a general excuse for censorship.

At the IGF, Rikke Frank Jorgensen from the Danish Institute for Human Rights gave a good example of how this can become a slippery slope. Apparently the Danish police order web sites to be taken down on the basis of a phone call from the local branch of Save the Children, and these sites are added to a secret blacklist. Now, however contemptible the sites concerned, this isn't a good way to go about things - once websites are blacklisted simply on the basis of 'common sense', without being tested by legal process, it is easy to widen the net to include any content that officials find objectionable.

I think this kind of cavalier approach to rights can become viral, especially in the online environment, as illustrated by the Personal Democracy Forum blog post Who's Molesting Who on MySpace? by Micah Sifry:

"Apparently, the warrantless tactics that some prosecutors are now using to pull information off of MySpace pages to track sex offenders are now spreading to much lower level crimes. Henson discovered a thread on the Texas District and County Attorney's public user forum where a bunch of prosecutors are discussing whether it's OK to go online and create a fake profile on MySpace in order to get a kid to "friend" them and thus give them access to their private MySpace page, where they might find evidence of someone bragging about a petty crime like vandalism."

But, I hear you ask, how do we deal with all the bad stuff on the net? Firstly, human rights law makes adequate provision for dealing with truly illegal content. As for content that is merely objectionable, I would back the OSCE delegate at the IGF who pointed out that the presence of objectionable (as opposed to illegal) content is exactly what demonstrates the freedom of the media. There was plenty of constructive debate at the IGF in workshops like Content regulations from gender and development perspective, organised by the APC Women's Networking Support Programme, where the question "should we define for children what content they can access, or rather let them decide what they want to access?" brought support for educating children about harassment on the internet and bringing their attention to some risks they need to manage, instead of censoring their version of the net. (As I remember, Danah Boyd also has some sensible things to say about this in relation to the MySpace scare in the USA.)

I think some of the strongest challenges to trigger-happy content regulation are laid out in the intro to the APC workshop, especially the last point: "There are several problems which intersect to make content regulation in relation to 'harmful content' one of the most controversial areas for regulation and governance:

  • the definition of harmful content is contestable, subjective and open to a range of interpretations by multiple stakeholders;
  • the degree of 'success' of such controversial initiatives as 'clean-feed' and other filtering-based systems is primarily determined by the extent to which all affected stakeholders have been engaged in policy development and deployment;
  • the key groups which are deemed to benefit from such systems - women and children - are largely absent from such discussions."