The eSafety Commissioner has the power to issue takedown notices on various types of material, with exploitation material being the most common. One Nation supports these powers being used for this purpose. A small portion of their work involves removing material that is deemed “violent or distressing.” This was the power used in the case of the Bishop Mar Mari Emmanuel video. One Nation is concerned that these powers could be misused, as they are subject to political interpretation regarding what is and is not “violent or distressing.”
I asked the eSafety Commissioner if her department had a transparency portal where Senators and the public could see the material being taken down. The Commissioner responded by including exploitation material in her count, to show why such a portal was not feasible, yet I did not ask about exploitation material; my question specifically concerned material categorised as “violent or distressing.”
It is my belief that social media platforms primarily use AI to remove most of this material and that the department has only had to issue a small number of notices. I want to know what those notices were issued for and I will continue this inquiry during the next estimates session.
Transcript
Senator ROBERTS: Thank you for attending. My first question is about your newsroom statement from 4 October about the social media platform X and a transparency notice on the measures it’s taking to combat child sexual exploitation material. Is this the only transparency notice that has not been complied with?
Ms Inman Grant: Thus far, yes. Where we issued an infringement notice, we issued something called a service provider notification to Google for the same set of child sexual abuse material.
Senator ROBERTS: The only other platform is Google, and that hasn’t been issued with a transparency notice. Are there any others like Telegram or Facebook? Telegram does a lot of work in that area.
Ms Inman Grant: We are in the midst of a process around a series of very complex transparency notices in relation to terrorist and violent extremist material. Telegram is amongst them, and we’re engaging with them.
Senator ROBERTS: Thank you. This question concerns a subset of your work—material that is violent or distressing. Do you have a transparency portal where your instructions to social media platforms to take down such material are registered in as close to real time as possible so we can see what you’re censoring?
Ms Inman Grant: We weren’t set up as a censor, Senator. We have frameworks provided through complaint schemes. Members of the public report content to us, particularly when the social media platform or messaging platform hasn’t responded. With respect to illegal and harmful online content, we also have very well legally defined requirements. We have both notice powers under the Criminal Code and then removal notices under the Online Safety Act and formal removal notices, which we exercised against both X and Meta during the Wakeley terrorist incident.
Mr Dagg: Can I just explain how we achieve the objective of transparency in terms of our actions. You may know that the Online Safety Act requires us to publish, under section 183, actions that we’ve taken in relation to a variety of harms. Our annual report has been published. You can find all of the information—
Senator ROBERTS: Your report has been published?
Mr Dagg: The annual report has been published, and we are required to report all of that information in the annual report. You can find that from page 223 in the appendices that relate to the eSafety Commissioner. That will show you all of the actions that we took for the financial year 2023-24.
Senator ROBERTS: Can you give us a bit of background on each one?
Mr Dagg: No—these are aggregated figures, so there’s no specific breakdown of each individual matter.
Senator ROBERTS: So there’s no breakdown and no opportunity for people to see how you’re doing it?
Mr Dagg: It would not be operationally feasible for us to report in real time the actions that we’re taking. Parliament expected us to report on an aggregated basis about the actions that we’ve taken, including requests, but we haven’t broken them down—
Senator ROBERTS: It’s just the aggregate numbers—
Mr Dagg: The aggregate numbers for a range of operational purposes, including security and operational feasibility.
Senator ROBERTS: So the platforms have to be transparent, and you don’t?
Mr Dagg: Well, the platforms report on things in an aggregated way, too, Senator. They’re not reporting on each individual specific matter that they deal with. They deal with millions of matters on a yearly basis. So, again, that just wouldn’t be feasible for them to do.
Senator ROBERTS: But the platforms have to be transparent to you.
Mr Dagg: Through the exercise of our compulsory transparency powers under the basic online safety expectations. But it’s important to note, Senator, that those transparency powers are around how the platforms are meeting the expectations. We’re not extracting from them specific information about how they’re dealing with this matter or that matter that might be reported to them. We’re interested in understanding how they take user reports, for example—if they’ve got reporting schemes in place, how their terms, services and policies are developed to meet the objects of the basic online safety expectations. The most recent determination includes some measures in relation to generative AI and how the companies are ensuring that these technologies aren’t being used, for example, to produce child sexual abuse material on a synthetic basis. That’s the kind of information that we’re drawing from the companies. We’re not drawing information about how they’re dealing with individual complaints.
Senator ROBERTS: The police force has long had transparency to the public through the court system. Whether you agree that the court system is perfect or not, that’s not the point. Who do you go through to provide transparency? How can we assess what you’re doing, rather than just in the aggregate?
Mr Dagg: When it comes to the principles of open justice, as a former police officer myself, the matters that make their way to court represent a tiny fraction of all matters that are reported to police. The matters that are reported to police are not reported on an individual basis. There are strict privacy concerns, for example, that ensure the protection of complainants’ identities and the specific matters that are reported to police forces. The Wakeley matter—the section 109 notice that we issued to Twitter X—is a good example of how that principle of transparency plays out in the Federal Court. The online file, for example, includes all of the evidence that the eSafety Commissioner relied on to make the case that the interlocutory measures ought to be accepted by the court.
Senator ROBERTS: The Senate is the house of review. What facility exists for the Senate to review your take-down notices of material? Where’s the supervision of your activity? Who oversees you?
Ms Inman Grant: There are a few different ways. One is through FOI, which you’ve exercised yourself, Senator. We’ve had a 2,288 per cent increase in FOIs over the past year. We are held accountable. We have reporting requirements that include any informal actions we take. Of course, we can be challenged in the Federal Court. We can be challenged at the AAT, or now the ART. We can be challenged by the Ombudsman, and a complainant can ask for an internal review to be done. So there are a number of different ways that we can provide transparency when it is asked for or required.
But, as Mr Dagg said, with 41,000 reports this year—and I think Mr Downey, who is now running the investigations branch, is expecting at least 60,000 reports next year—it would operationally be infeasible, and it would violate the privacy of the complainants. As I said before, that confidentiality is important. Even young people understand that one of the reasons children don’t report cyberbullying is they don’t want to be the dobber or the snitch, and they fear retribution. If we were to not treat some of these complaints as personal information—and the Information Commissioner agrees with us—I think it would undermine trust in us as an organisation.
Senator ROBERTS: I get that. Did you say that there was a 2,000 per cent increase in FOIs?
Ms Inman Grant: Yes, 2,288 per cent.
Senator ROBERTS: That’s a huge increase. It tells me that people are hungry to learn more.
Ms Inman Grant: Yes, and there have been some campaigns that have also encouraged people to put in FOIs, which we respond to.
Senator ROBERTS: You’ve used the defence of having so many infringements to take care of. That’s a big workload. What I’m interested in is not so much that but how you’re being held accountable. How can we see transparently what you’re doing?
Senator McAllister: Here we all are, Senator. What is the question that you seek to ask?
CHAIR: We call it estimates.
Senator McAllister: We are at estimates. The commissioner is here to answer your questions. If there are particular things that you’re interested in, you really should ask her.
Senator ROBERTS: What about the public? They need to know.
Senator McAllister: You are their representative, as you so often remind us.
CHAIR: You can send them the video of this.
Senator McAllister: You are a humble servant of the people of Queensland.
Senator ROBERTS: I want to go to freedom of information 24118, which asked for any guidelines you have with regard to the implied right to political communication to make sure you aren’t infringing on it as you issue take-down notices. I note that your freedom of information decision says: ‘There are no dedicated guides or policies with respect to the interaction of the implied right of political communication in use by the eSafety Commissioner or personnel who implement the various schemes under the OSA.’ There are no dedicated guides or policies?
Mr Dagg: We would need to assess each and every action we take through the lens of whether or not the implied constitutional right to political communication is infringed. That’s just operationally infeasible.
Senator ROBERTS: So are you saying, ‘To hell with the Constitution’?
Mr Dagg: No, not at all. The concern that a particular person’s interests may have been infringed in such a way as to raise a claim that the operation of the Online Safety Act is invalid is absolutely a matter that can be pursued through merits review or judicial review. But, to the commissioner’s point, we are going to be dealing with 60,000 complained URLs this year, which produces a significant percentage of actions we take. I’m sure you can understand that rigorously assessing whether or not they raise any specific issues in relation to the implied constitutional right makes it very difficult for us to make rapid decisions in line with the threshold set by the act. I think it’s important to note that the act contains very clear thresholds and very clear parameters for us to apply in terms of operational decision-making. The act itself, as you would have seen, is supported by a bill which was subject to exhaustive human rights review in its construction. We believe that, by properly administering the act on behalf of the commissioner, we’re taking actions which are in line with parliament’s expectations. If a person believes that their constitutional right—the implied right—has been infringed, there are avenues for review of that decision.
Senator ROBERTS: I can’t see how bypassing the Constitution or not including it as a consideration is in any way okay. The eSafety Commissioner and the delegates ordinarily—this is the quote: ‘The eSafety Commissioner and the delegates ordinarily proceed on the basis that the powers given to them under the OSA by the Australian Parliament are reasonably appropriate and adapted’. So you don’t turn your mind to whether you’re acting constitutionally at all; you just assume you are. How can this Senate be convinced that you are able to act within the Constitution when you don’t even have a document outlining the fundamental right of Australians to communicate in political matters? If you infringe on someone’s constitutional rights, then they complain? That’s it?
Senator McAllister: As you know, the constitutionality of any piece of legislation that comes before the parliament—
Senator ROBERTS: Not the legislation—
Senator McAllister: is quite frequently a matter of some discussion. Unless you seek to challenge it, we can assume that the legislative framework within which the commissioner and her staff operates is constitutional.
Senator ROBERTS: That’s a misrepresentation of what I said, Minister. I’m not saying that the act is unconstitutional; I’m saying that the consideration to take someone down needs to maintain constitutional rights—particularly political.
Senator McAllister: I think the two things are interconnected, Senator, because the powers that are exercised by the commissioner and the staff that work with her are enabled by the parliament and by the legislation.
Senator ROBERTS: I get that.
Senator McAllister: As I have indicated to you already, that is quite often subject to a discussion among senators about constitutional arrangements.
Senator ROBERTS: That still doesn’t answer the question—the right to political communication.
CHAIR: Senator Roberts, I am going to move on.
Senator ROBERTS: Thank you.