No! But yes.

I have received another letter from the DIA (PDF) in response to further requests.

As well as a copy of the whitepaper that I’ve already written about, I asked the DIA to clarify “whether the filter will only be used for images of child sexual abuse or will it also be used for text files as described by Trevor Henry, Senior Communications Advisor on 17/7/2009”.

The response from Steve O’Brien, manager of the Censorship Compliance Unit, was as follows:

Unfortunately, Trevor Henry’s statement has been taken by some commentators as “proof” that the scope of the Digital Child Exploitation Filtering System will expand. As stated above, the purpose of the filtering system is to block access to known websites that contain images of child sexual abuse.

Well, I’m glad we cleared that up: the filter is only going to be used for pictures. But wait, what’s this? He hasn’t finished yet:

These websites sometimes also contain text files that exploit children for sexual purposes, and where this occurs those text files will also be blocked.

The concept of a text file exploiting a child seems odd to me.

Even ignoring the slightly ludicrous phrasing, part of the rhetoric around the implementation of the filter has been that they are trying to stop photos of actual abuse – “…will focus solely on websites offering clearly objectionable images of child sexual abuse.” There is no child being abused in a text file.

The examples given by Trevor Henry clearly demonstrate that written material can be just as abusive as pictures.

I don’t believe that he did clearly demonstrate that (read what he wrote). While what he describes is creepy, to my mind there is a clear distinction between writing about how to abuse a child and actually abusing one.

So, has this response cleared anything up? The answer has to be no. The DIA is still claiming that it’s trying to ban only images of child sexual abuse (something I might be inclined to support if they did it openly and if it had any chance of working) while at the same time admitting that they’ll ban other things that aren’t images as well.

2 Responses to “No! But yes.”

  1. Toejam on Oct 6, 2009 at 9:50 pm:

    “As stated above, the purpose of the filtering system is to block access to known websites that contain images of child sexual abuse.”

    If they know of websites containing images of child sexual abuse, why have they not reported them to Interpol? Or the Virtual Global Taskforce?

    At the very least, send an email to the company that hosts the website. Someone did exactly that to a number of websites earlier this year, as an experiment (sadly the link escapes me); many of the sites in question went offline within hours, and within 24 hours I think 90% of them had either been deactivated, or the hosts had responded with an email confirming that the client possessed all necessary documentation to verify the ages of the participants.

    Seriously, who does this O’Brien feller think he’s kidding??

  2. Mark Harris on Oct 6, 2009 at 9:59 pm:

    Not under NZ law. Text files about grooming, or how to abuse, would fall under

    “132A Aggravating factor to be taken into account in sentencing, etc, for certain publications offences

    (2) In sentencing or otherwise dealing with an offender for the offence, the court must take into account as an aggravating factor the extent to which any publication that was the subject of the offence is objectionable because it does any or all of the following:
    (a) promotes or supports, or tends to promote or support, the exploitation of children, or young persons, or both, for sexual purposes:
    (b) describes, depicts, or otherwise deals with sexual conduct with or by children, or young persons, or both:
    (c) exploits the nudity of children, or young persons, or both.”

    The “promotes or supports” is a key criterion. While most people’s thoughts would go immediately to images, I can see why Steve O’Brien has worded his response to you that way. They do need to get their story straight.