Senator Conroy has responded to an article published on the front page of The Sydney Morning Herald, which revealed that the Government has been sitting on a report since February that labelled mandatory ISP filtering fundamentally flawed. Senator Conroy has also announced that the live trial has been delayed until mid-January.
Here is the entirety of his press release:
The Howard Government, at the instigation of the Internet Industry Association (IIA), commissioned a report to be conducted by Mr Peter Coroneos, IIA’s CEO. The previous government provided funding for the research and it was based on terms of reference agreed to by the IIA and the previous government. The report was to inform the previous government of the IIA’s and other stakeholders’ views, and international experience.
The report methodology was a literature review of existing studies as well as interviews and surveys. It involved no empirical testing of filtering technology.
The report highlighted a number of concerns the industry had previously raised with the current and previous governments, such as the potential for dynamic filtering to degrade network performance and to over-block and under-block content. It was not an analysis of the ALP’s policy.
“The Government is aware of technical concerns raised in the report, and that is why we are conducting a pilot to put these claims to the test,” Senator Conroy, Minister for Broadband, Communications and the Digital Economy, said today.
“On 10 November I released an Expression of Interest seeking participation of ISPs and mobile telephone providers in a live pilot. A number of applications have been received from ISPs expressing interest in participating in the field pilot of ISP content filtering.”
The live pilot trial will provide evidence on the real world impacts of ISP content filtering, including for providers and internet users. It will provide an invaluable opportunity for ISPs to inform the Government’s approach.
The pilot trial will not begin until mid-January and an announcement regarding participants will be made at that time.
The Howard Government Report is on the Department website at www.dbcde.gov.au
Here’s a direct link to the page where you can download the report, so you don’t have to scour the DBCDE website looking for it.
Associate Professor Bjorn Landfeldt, who helped author the report, has also responded on his blog. He writes:
First, I don’t think the study was very secret. In fact, the study involved wide consultation with the Australian ISP industry, content providers and other organisations/stakeholders. There has been widespread knowledge of this study, even though, as far as I understand after reading the article, the findings have not yet been widely released. It is not my place to comment on when the government releases its reports, even though I see no real reason not to release this specific report.
The issues raised in the report have largely been covered in preceding reports, at least in the sections I provided input to, and even though they are very important issues to consider, I don’t think they are damning, since they are well known.
So, what is the big issue as I see it? A blacklist requires manual effort to determine what should be included. The Internet is a network of networked computers that carries information in many forms and realms, one of which is the World Wide Web. If we restrict ourselves to the WWW alone, we have a global network with billions of pages’ worth of information. That information spans all the languages of the world and is incredibly diverse. Some of it has a very high profile and some has very limited visibility.
Since a blacklist would rely on user reporting, it is questionable how effective it would be at locating unwanted content in the first place. Second, every case would have to be assessed to see whether it breaches Australian law and falls within the categories specified for the filtering list. Doing this for content in the grey zone, across all the different languages, will be very difficult. If the point is to stop child pornography, determining whether a model is 19 or 25 in content from a different country, under a different jurisdiction, is not an easy task and would be quite labour intensive. The next question is: who is responsible when legal material is blocked because the wrong judgement is made?
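To make the asymmetry concrete, here is a minimal sketch (hypothetical, in Python, with made-up example hosts) of what a blacklist check amounts to. The lookup itself is trivial; the labour-intensive part Landfeldt describes is curating the set it consults, since every entry implies a manual judgement about the content behind it.

```python
# Hypothetical sketch of a blacklist-based filter. The lookup is cheap;
# building and maintaining the list is the manual, error-prone work.
from urllib.parse import urlsplit

# Each entry here represents a human judgement that the content breaches
# the law and falls within the specified categories. (Example hosts only.)
BLACKLIST = {
    "example-banned-site.com",           # whole domain blocked
    "example-host.com/banned/page.html"  # single page blocked
}

def is_blocked(url: str) -> bool:
    """Return True if the URL's host, or host+path, is on the blacklist."""
    parts = urlsplit(url)
    host = parts.netloc.lower()
    page = host + parts.path
    return host in BLACKLIST or page in BLACKLIST

# Anything not yet reported and reviewed sails through (under-blocking),
# while a whole-domain entry takes every page on that host down with it
# (over-blocking).
print(is_blocked("http://example-banned-site.com/any/page"))  # True
print(is_blocked("http://example-host.com/harmless.html"))    # False
```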
The only way to identify such material quickly and significantly limit the risk of accidental access is some form of dynamic content filtering. However, the state of the art in such technologies is very limited in accuracy, and using them would either hurt system response times or at least increase costs for the service provider. Current filters are rather good at detecting certain patterns of information, such as the combination of many images and certain keywords that usually indicates a porn site.
However, there are at least three additional dimensions to consider. First, current filters only look at such patterns; they do not try to analyse the actual content in any meaningful way. It is therefore difficult to distinguish between different types of content that share similarities, for example a website about sex education and one carrying erotic content. Second, more and more content is moving to other forms of multimedia, and filtering and detecting the nature of content is much harder in that case. For example, analysing a video and detecting that it contains adult content is not a lightweight computational task; separating sex education from porn is even harder. Third, if there were widespread filtering of content, providers would see a need to obfuscate content to fool the filters. When we step into this realm it becomes very difficult for any filter to keep up.
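The over-blocking problem Landfeldt points to can be sketched in a few lines. The following is a hypothetical toy scorer (the keyword list, threshold, and scoring rule are all made up for illustration) in the spirit of the pattern-matching filters he describes: it counts trigger words and image tags, and never considers what the page actually means.

```python
# Hypothetical toy version of a pattern-based filter: count trigger
# keywords and <img> tags, block above a threshold. It inspects surface
# patterns only, never meaning -- which is why it over-blocks.
import re

TRIGGER_WORDS = {"sex", "adult", "xxx"}  # illustrative keyword list

def looks_like_porn(html: str, threshold: int = 3) -> bool:
    """Crude pattern score: keyword hits plus image-tag count."""
    words = re.findall(r"[a-z]+", html.lower())
    keyword_hits = sum(1 for w in words if w in TRIGGER_WORDS)
    image_count = html.lower().count("<img")
    return keyword_hits + image_count >= threshold

# A sex-education page trips exactly the same patterns as an adult site:
education_page = "<h1>Sex education</h1>" + "<p>safe sex advice</p>" * 3
print(looks_like_porn(education_page))           # True (over-blocked)
print(looks_like_porn("<p>gardening tips</p>"))  # False
```

Everything that would distinguish the two kinds of page lives in the semantics, which this style of filter never touches; tightening the threshold merely trades over-blocking for under-blocking.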
This discussion can then be extended to addressing realms other than the WWW, such as P2P networks and social networking applications, which shows that the level of difficulty is very high indeed. There is also a strong movement to anonymise users on the Internet to counteract information logging. Using simple tools such as VPNs to cross the Australian border would also enable circumvention of any centralised filtering scheme.
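The VPN circumvention point can be illustrated with a toy model (hypothetical hosts, not a network implementation): a centralised filter can only act on the destination it can see on each connection, and once traffic is tunnelled, that visible destination is the VPN endpoint rather than the real one.

```python
# Toy illustration of why tunnelling defeats a centralised filter:
# the filter decides on the outer, visible destination only.
BLOCKED_HOSTS = {"banned.example.com"}  # illustrative blacklist entry

def filter_allows(visible_destination: str) -> bool:
    """The filter's entire view of a connection is its visible endpoint."""
    return visible_destination not in BLOCKED_HOSTS

# Direct request: the filter sees the real destination and blocks it.
print(filter_allows("banned.example.com"))        # False

# Tunnelled request: the filter sees only the (hypothetical) overseas VPN
# endpoint; the inner destination is encrypted and invisible to it.
print(filter_allows("vpn.overseas.example.net"))  # True
```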