Posts

I had a fantastic time chatting with Brodie Buchal on The Right Side Show! We dove into a range of topics, from Australian politics to the heated debate over the under-16s social media ban bill. We also tackled the lack of accountability in government processes and so much more.

The Labor-Liberal Uniparty has been advancing this bill based on a case where bullying on social media led to a tragic suicide. In submissions on this bill, it became apparent that banning children from social media would cause as much harm as good. The best response to these tragic cases would be to empower parents to better manage their children’s use of social media. This can be achieved by enhancing parental lock technology, making it more powerful, easier to use, and free (the best apps available are commercial). The Government ignored concerns raised by experts in their submissions and testimony, and pushed ahead with a bill that introduces a blanket ban for under-16s.

Let’s be clear – this is a ‘world-first’ because the rest of the world knows such a ban is counterproductive.

Tech-savvy kids will get around the ban, and that’s where the real harm begins. The ban does not cover chat rooms in video games, which lack the supervision present on social media platforms. Peer-to-peer chat apps are making a comeback, and some children may even turn to TOR, which is not supervised at all and, by its design, is almost impossible to supervise. This bill will end up exposing kids to even worse forms of bullying.

One Nation and the Greens united to stop Labor’s guillotine. We forced the government to remove the bill banning under-16s from social media and extend scrutiny until February. Then, incredibly, the Liberal Senate leader, Simon Birmingham, moved to get the bill back into the guillotine process. Barely hours later, Simon Birmingham informed the Senate that he was leaving. It’s clear he knew he was leaving and this was his parting gift.

I want to thank Senators Alex Antic and Matt Canavan for crossing the floor to vote against the Liberal-Nationals-Labor guillotine.  

One Nation will continue to fight against the social media ban, returning power to parents and families.  

Included are comments around Digital ID, which—despite claims to the contrary—will inevitably become part of this outrageous power grab.

Transcript

My remarks are directed to the minister but also to people at home listening to the Senate and to researchers and historians who will look back at this vote today in an attempt to understand what the hell the Senate was thinking. The amendment the government circulated, no doubt with the approval of the Liberal Party, answers that question. The Online Safety Amendment (Social Media Minimum Age) Bill 2024 can act to force every Australian to be the subject of a digital ID in the name of keeping children safe—and that’s what my question is about.

The government accepted widespread public concern that the bill was designed to force everyone to get a digital ID and promised to include an amendment to specifically rule that out. In this government amendment that you’ve moved, SY115, new provision 63DB(1) excludes use of government-issued identification or use of digital ID. That is great, except 63DB(2) provides that, if social media platforms can come up with an alternative means of assessing age that does not involve digital ID or government documents, they can—wait for it—accept a digital ID identification. In effect, this amendment specifies that a social media platform cannot use digital ID by itself but it can use digital ID as part of a more comprehensive verification. There’s no need to guess what that could be; this bill contains the answer: age-assurance software. The company which has been awarded the tender for the age-assurance trial is a British company called Age Check Certification Scheme, whose main business is provision of digital IDs backed by age-assurance software.

TikTok has used age-assurance software to remove one million underage accounts in Australia. This software can tell if a person is, for instance, under 12. That’s useful. The smaller the gap between the user and the target age—16 in this case—the less accurate it is. This software can’t tell age within six months, and there’s no way of knowing a person turned 16 on the day of their application. You just can’t tell that from a face scan. Accessing social media on your 16th birthday and, most likely, for months afterwards will require a second identifier containing the child’s facial scan and their date of birth, which is a digital ID, which this company specialises in. You’re setting them up.

I have criticised this bill as an opportunistic attempt to capitalise on the public desire for better regulation of social media to force all Australians to get a digital ID. I’ll say that again. I have criticised this bill repeatedly, as have others, as an opportunistic attempt to capitalise on the public desire for better regulation of social media to force all Australians to get a digital ID. This amendment requires a change in my language, which is now that this bill is an opportunistic attempt to require every child, once they turn 16, to get a digital ID if they want to access social media. What age does the government’s digital ID start from? Sixteen. What a coincidence! This wasn’t the intention all along? That’s misinformation. 

Your amendment exposes the original intention of the bill, which was hidden in what looked like a poorly drafted bill. It wasn’t poorly drafted; it was deliberately dishonest, and the short committee referral, which the government fought against, has exposed the deceit. The truth is now out there, and the decision before the Senate is a simple one. A vote for this bill is a vote to require every child to get a digital ID on their 16th birthday.

Compulsory digital IDs aside, there are many other reasons not to pass this bill. I will now share with the Senate and with posterity the words of the Australian Human Rights Commission on the bill. One Nation fully supports the commission’s position, which deserves to be included in the Hansard record of the debate:

Social media is a vital platform for young people to share their ideas and opinions, engage in dialogue, and participate in social and cultural activities. It can be a valuable educational tool by providing access to diverse perspectives, news and learning opportunities, as well as vital information about health, well-being and safety. A blanket ban risks unjustly curtailing these freedoms. 

Social media is integral to modern communication and socialisation. Excluding young people from these platforms may isolate them from their peers and limit their ability to access much-needed information and support. This is particularly important for young people from marginalised, vulnerable or remote communities.

These are the words of the Human Rights Commission. 

The social media ban will rely on effective age assurance processes being adopted, which means that all Australians may be required to prove their identity in order to access social media. This may potentially require all Australians to provide social media companies with sensitive identity information, which poses a risk to our privacy rights in light of recent examples of data breaches and personal information being stolen. 

Technological workarounds – such as VPNs and false age declarations – may undermine the effectiveness of the ban. Additionally, a ban will not address the root causes of online risks or make the platforms safer for everyone. 

The workarounds to this measure have not received enough debate. The bill carves out gaming sites, many of which have a chat feature. Children will move over to chatrooms and gaming sites which are not supervised. Tor—or, more accurately, onion routing—will provide another avenue for communication which is designed to make supervision exponentially harder than on mainstream social media platforms. I have advice from a leading internet security company that peer-to-peer social media, which again is harder for parents to supervise than current social media platforms, is making a comeback. As a result of this legislation, children will be exposed to more harm, not less. I had a call from a constituent— 

Senator Hanson-Young: You are right. 

Senator ROBERTS: It’s not often Senator Hanson-Young tells me I’m right. A moment ago, I had a call from a constituent who had called their local Liberal member of parliament about this bill and was told, ‘Oh, it’s okay; you can just sign up for your children.’ With age-assurance software, that will not work. With digital ID connected to age-assurance software, the social media platform will know what you’re doing. Don’t be telling people: ‘It’s nothing. You can defeat it. You can still talk to Grandad on Facebook.’ You won’t be able to. Children may be able to use VPNs, virtual private networks, and the new PPNs, personal private networks, to appear to be in another country. That really won’t work either. The keystroke logging that accompanies the age-assurance software will assume someone pretending to be in Canada but interacting with Australian accounts is probably using a VPN.

Minister, why did you say that this won’t lead to Digital ID when your amendment says exactly that? 

Today, the Senate held a Committee Hearing on the Online Safety Amendment (Social Media Minimum Age) Bill 2024. This expedited inquiry was scheduled with just one day’s notice, as the Liberal and Labor parties want to rush this legislation through. The first witness, Ms. Lucy Thomas OAM, CEO of Project Rockit, delivered six minutes of the most relevant, heartfelt, and inspirational testimony on the issue of censoring social media for those under 16. Her insights demonstrated the benefit of lived experience.

Before taking a position on this bill, take the time to listen to her testimony.

Transcript

Senator ROBERTS: Thank you all for being here. Ms Thomas, there are harms and benefits at school, and there are harms and benefits in life generally. Claude Mellins, professor of medical psychology in the Departments of Psychiatry and Sociomedical Sciences at Columbia University, stated: ‘For young people, social media provides a platform to help them figure out who they are. For very shy or introverted young people, it can be a way to meet others with similar interests.’ She added: ‘Social support and socializing are critical influences on coping and resilience.’ They provide an important point of connection. She then said in relation to Covid: ‘On the other hand, fewer opportunities for in-person interactions with friends and family meant less of a real-world check on some of the negative influences of social media.’ Isn’t the professor making an important point? It’s not about stopping real-world interactions; it’s about balancing social media with real-world interactions. Isn’t it about a balance, not about prohibition? Isn’t it also the fact that parents and not governments are best placed to decide how their children develop?

Ms Thomas: Thank you for the question. I think you’re speaking to that idea of balance that a lot of us have been trying to refer to. We are acutely aware of the harms, and I think they’re beautifully captured in that quote, and acutely aware of the risk that we may create new harms by cutting young people off. I think this is a really important point, and I’d like to give you one example, a quote from a young person, Rhys from Tamworth, who commented: ‘Social media has helped me figure out and become comfortable with my sense of self, as there is a large community that is able to connect me with people all over the world. Living in a regional area, it’s difficult to find people dealing with the same personal developments, and social media really helped.’ This is beyond just direct mental health intervention; this is about finding other people like you. This is about finding spaces where we can affirm ourselves, use our voices and mobilise around actions that we care about, just like we’re doing here today. I’d love to point out that the Office of the eSafety Commissioner has done some fantastic research into the experiences of specific groups—those who are First Nations, LGBTQIA+ Australians, and disabled and neurodivergent young people. All of these groups face greater hate speech online. Actually belonging to one of those communities, I can say that we also face greater hate speech offline. What was really important is they also found that young people in these communities that already face marginalisation are more likely to seek emotional support—not just mental health support, but connection, news and information, including information about themselves and the world around them. So I take your point.

Senator ROBERTS: Thank you. I have another quote from Deborah Glasofer, Associate Professor of Clinical Medical Psychology at Stanford University:

Whether it’s social media or in person, a good peer group makes the difference. A group of friends that connects over shared interests like art or music, and is balanced in their outlook on eating and appearance, is a positive. In fact, a good peer group online may be protective against negative or in-person influences.

Is this bill throwing out the good with the bad, instead of trying to improve support in digital media skills to allow children and parents to handle these trials better?

Ms Thomas: I think there is a risk of that, yes. I think we really need to, in a much longer and more thorough timeframe, interrogate and weigh up all of these risks and unintended possible impacts. I’d like to draw another quote from Lamisa from Western Sydney University. You spoke about influencers; we tend to imagine those being solely negative. Lamisa says: ‘Social media has given me creators who are people of colour, and I think it has really allowed me to learn that I don’t have to justify my existence, that I am allowed to have an opinion and that I am allowed to have a voice about who I am.’ So I absolutely think that there is a risk that we’ll throw out these experiences; in our desire to protect people, we create unintended harms that they have to live with.

Senator ROBERTS: I just received a text message from someone in this building, a fairly intelligent person, and he said: ‘I was born with a rare disorder. I spent more than four decades feeling isolated until I discovered people with the same disorder on social media. This legislation would prevent people under 16 from linking with the communities online that can provide them with shared lived experience.’ What do you say?

Ms Thomas: I’m going to give you one more quote. I’m aware that young people aren’t in the room, so I’m sorry I’m citing these references. Hannah from Sydney says: ‘Where I struggled in the physical world thanks to a lack of physically accessible design and foresight by those responsible for building our society, I have thrived online.’ The digital world has created so much opportunity for young people to participate and fully realise their opportunities. We just need to be very careful.

I know in talking about all these benefits, I’m probably going to receive an immediate response about some of the harms. I’m not here to say that harms don’t exist. They do. If anyone is aware of them, it’s me. I’ve been working in this space for 20 years. I started Project Rockit because I wanted to tackle these issues as a young person fresh out of school. We know they’re there, but we have to be very careful not to impact these positive benefits young people face.

Senator ROBERTS: Ms Thomas, isn’t there very important access to parents and grandparents on social media for their support and experiential interaction? A lot of children interact with their parents and grandparents through social media.

Ms Thomas: Am I allowed to answer this one?

CHAIR: Yes.

Ms Thomas: I think one of the big, grave concerns around implementation and enforcement is that it won’t just be young people who need to verify their ages online; it will be every Australian. The methods available, every Australian sharing their biometric data or presenting a government issued ID, are going to pose challenges for those Australians that you are talking about—older Australians who are already facing higher rates of digital exclusion and those from marginalised communities. Absolutely, this is a vital tool for grandparents and kids, for intergenerational play and learning, and we risk cutting young people off but also cutting older people off.

This is the third and final session on the Online Safety Amendment (Social Media Minimum Age) Bill 2024, aka the U16s Social Media Ban, an important piece of legislation being waved through by the Liberal and Labor parties with minimal debate. The Department was called to explain the bill, which of course it defended with responses that would not hold up under closer scrutiny. If only senators had the time to apply that scrutiny.

Several serious revelations emerged during the Department’s testimony, including this little pearl: it’s better for foreign-owned multinational tech platforms to control children’s internet use than for parents to supervise or manage their children’s social media and online interactions. One Nation strongly disagrees.  

I also raised concerns about the YouTube exemption, which is worded in such a way that it could apply to any video streaming site, including pornographic sites. The Department’s response was to point to other regulations and codes that “supposedly” protect children from accessing porn.   What utter nonsense! Any child in this country without a parental lock can access Pornhub by simply clicking the “Are you over 18?” box. Teachers nationwide report that even primary school students are being exposed to and influenced by pornography. If this bill accomplishes anything good, it should be to prevent children from accessing pornography, which it deliberately avoids doing.  

This bill claims to be about many things – keeping children safe is not one of them.

Transcript

Senator ROBERTS: Thank you for appearing today. Could you please explain the provisions around exemptions for sites that do not require a person to have an account, meaning they can simply arrive and watch? An example would be children watching cartoons on YouTube. What’s the definition here of a site that can be viewed without an account?

Mr Irwin: I guess it goes to the obligation around holding an account, or having an account, which relates to the creation or the holding of an account. So if there is any process—

Senator ROBERTS: Is it the creator’s responsibility?

Mr Irwin: Sorry?

Senator ROBERTS: Is it the creator’s responsibility? Is the account the creator’s responsibility?

Mr Irwin: No, all responsibility is on the platform. If a platform under this definition has the facility to create an account and/or has under 16s who have an account on there already, then they will have to take reasonable steps.

Senator ROBERTS: What’s the functional difference in your definition between YouTube and Porn Hub?

Mr Chisholm: One contains restricted content that children are prohibited from accessing under law. Pornhub is a pornographic website.

Senator ROBERTS: I understand that.

Mr Chisholm: YouTube has a whole range of information, including educational content and a range of information that doesn’t really match up with a site like Porn Hub.

Mr Irwin: That was the second limb of the age-assurance trial: looking at technologies for 18 or over, looking at pornographic material for age assurance. That also goes to the matter of the codes that DIGI were talking about before. Those codes relate to access to particular types of content including pornographic content.

Senator ROBERTS: Let me try and understand—

Mr Chisholm: The design of Pornhub is to provide pornographic material to people who are permitted to watch it. That’s the difference.

Senator ROBERTS: I guessed that, but I asked for the functional difference. Pornhub is 18-plus, but apparently you don’t have to prove it. Could you show me where in the legislation, in this child protection bill, you’re actually including porn sites?

Mr Chisholm: There are separate laws in relation to pornographic material, which we can step you through. This bill is more about age limits for digital platforms, imposing a 16-year age limit for digital platforms. There are other laws that prohibit access to pornographic material online including the codes process and classification system.

Mr Irwin: That’s correct.

Senator ROBERTS: What’s required for someone aged 16 or 17 to get access to Pornhub?

Mr Irwin: That’s subject to the codes that industry is developing right now, which DIGI talked about, in terms of what specifically is required. There is also a whole system of classification laws that are designed to prevent access to adult content by children. On top of that, there’s the eSafety Commissioner’s administration of things like basic online safety expectations and the phase 2 codes that are under development.

Senator ROBERTS: I’m glad you raised that because I was going to raise it. You exempt gaming sites because they already carry age recommendations. In fact, some video game sites are MA 15+; they’re not 16-plus. What will have to change? Will it be your bill or the MA 15+ rating?

Mr Chisholm: The bill doesn’t require them to change—

Ms Vandenbroek: Nothing will change.

Mr Chisholm: because gaming isn’t caught by the new definition. There’s nothing that requires gaming systems to change.

Senator ROBERTS: So social media is 16-plus, but video games are 15-plus.

Mr Chisholm: The policy here is to treat games as different to social media. For some of the reasons we talked about before, they are seen as a different form of content consumption and engagement to social media.

Senator ROBERTS: Doesn’t this indicate to people that this bill’s intent is not about what the government says?

Mr Chisholm: No, the bill is definitely about what the government says. It imposes a firm age limit of 16 on account creation for social media for all of the concerns and reasons outlined about the damage that’s being done to under-16s through exposure to social media. Games are also subject to classification rules, so they have their own regime they have to comply with now.

Mr Irwin: They’re subject to the broader Online Safety Act as well.

CHAIR: Senator Roberts, I’ll get you to wrap up.

Senator ROBERTS: I have a last question. I understand that there are parental controls that parents can buy—they’re sometimes free—in the form of apps that watch over what children are watching. What alternatives are already available for parents to control children’s social media and control their exposure? Did you evaluate them, and why don’t you just hand the authority back to where it belongs—to parents—because they can do a better job of parenting their child than government can?

Mr Chisholm: The very strong feedback that we received from parents during this consultation is that they do not want to bear the burden or responsibility of making decisions that should be better reflected in the law. At the moment, parents often refer to the 13-year age limit that’s part of the US terms of service—

Mr Irwin: For privacy reasons.

Mr Chisholm: for privacy reasons, that apply in Australia. That’s often used for parents to say to their children, ‘You can’t have a social media account until you’re 13.’ It’s really important for parents to point to a standard law, an age limit, that will apply to everybody. It’s also feedback we’ve received from a lot of children. They would rather have a universal law that applies to all children under the age of 16 instead of a situation where some children have it and some children don’t, and where all of the harms that we’re aware of from exposure to social media continue to magnify. We also don’t want a situation where there is any question the parents have some legal responsibility in relation to an age limit. The very strong view of the government is that that responsibility should be borne by the platforms, not parents.

Senator ROBERTS: We’re not going to have—

Mr Chisholm: The platforms are in a much better position to control their services than parents are.

Senator ROBERTS: So we want to put parenting in the hands of social media platforms?

Mr Chisholm: The parents have said to us that they have a very strong view that they want a 16-year age limit, and that the platforms are better placed to enforce that because it is their platforms.

Senator ROBERTS: How much notice did the parents get to give their comments? Because we got 24 hours’ notice of the closing of submissions.

Mr Irwin: We’ve been consulting, and I will add we do have evidence that 58 per cent of parents were not aware of social media parental monitoring, and only 36 per cent actually searched for online safety information.

Senator ROBERTS: So wouldn’t it be better to educate the parents?

Mr Chisholm: We are educating parents, too. That’s part of the digital literacy and other measures we are undertaking. Education is important, but it’s not enough.

Senator ROBERTS: I meant educating parents about the controls already available to keep the control over their children in parents’ hands, not usurping it and putting it in the government’s hands.

Mr Chisholm: I think it comes back to the point that we’ve made that the very strong view here is that platforms should bear the responsibility for imposing or following an age limit, not parents, who don’t have as much information about how these platforms operate as the platforms themselves.

The Inquiry into the Online Safety Amendment (Social Media Minimum Age) Bill 2024, aka the U16s Social Media Ban, heard testimony from the Digital Industry Group (DIGI), the industry body for social media companies such as Google, Meta, and X (formerly Twitter). During the session, the witness was given a torrid time by some Senators who were not receiving the answers they wanted. I commend the witness for her patience.

My questions focus on the bill’s wording, which fails to clearly define core concepts. This lack of clarity makes it impossible for social media companies to implement the legislation. Instead, what it will do is grant massive power to the eSafety Commissioner. The bill is so broadly written that the eSafety Commissioner can do just about anything she wants. This is not how legislation should be drafted.

One Nation agrees with DIGI’s testimony and supports the bill being withdrawn and redrafted with proper checks and balances, clear definitions, and then subjected to proper debate.

Transcript

Senator ROBERTS: Thank you for appearing today. I’m trying to understand if YouTube will or will not be included in this bill. Section 63C defines age-restricted social media platforms as ones where the service allows users to interact, which YouTube does in the comments, or allows users to post material, which YouTube does, or a significant part of the purpose is to allow interaction, which YouTube does in some channels. Do you consider that YouTube is included in this bill?

Ms Bose: This underscores the broader challenge of this broad definition that encompasses a range of services and also the discretion it affords the minister in relation to making those determinations. I might hand to my colleague, Dr Duxbury, who may have more to add around some of the questions we have around that discretionary determination of what is in scope.  

Dr Duxbury: Senator, you are absolutely right that the bill doesn’t make clear who is in or out of scope. To us, that is a really serious flaw in the bill. It is absolutely unclear who is in or out, and we don’t know what criteria will be used to determine these exemptions. The explanatory memorandum suggests that some services will be out of scope, but that will not occur until a future date, and that date is unknown.

Senator ROBERTS: Speaking of the explanatory memorandum, page 21 says that children can visit sites that do not require an account. Is that your understanding?  

Dr Duxbury: That is my understanding.  

Senator ROBERTS: You said in point 3 of your submission that parliament is being asked to pass a bill without knowing how it will work. No regulator worldwide has done age assurance successfully yet—nowhere. We’ve got almost no time to discuss this in public, so I don’t know how you are even here. Thank you for being here. You say, though the government’s trial exploration of age assurance in the bill is not yet complete, only a year ago the government concluded that these technologies were ‘immature’. Could you expand on that, please. 

Dr Duxbury: The conclusion was not only that the technologies were immature but also that there were risks about the reliability of the technology and their impact on digital inclusion. We heard earlier the fact that, because these requirements will apply to all Australians, the impact will be felt not only by young people but also by other Australians, who will be required to age-verify before they get access to a very broad range of services.  

Senator ROBERTS: This is quoting from your tabled opening statement: ‘If we are proceeding on this fast-tracked timetable, what is most important is that the bill contains structures for future consultation.’ You go on to say: ‘As drafted, the bill only requires that the minister seek advice from the eSafety Commissioner before making legislative rules. However, given these expert warnings of youth harm from a social media ban, the unknown technology and the privacy implications, further consultation with the community and technical experts is vital. DIGI suggests amending section 63C of the bill to include an additional requirement for a minimum 30 days of industry and public consultation before making legislative rules.’ Could you expand on that, please.

ACTING CHAIR: As quickly as you can. 

Dr Duxbury: Sunita, did you want to take that? 

Ms Bose: Jenny, I will hand over to you, but there is an additional reflection we had over the weekend that we didn’t have a chance to include in our submission issued on Friday evening that we might touch upon here in addition to what you’ve read there, Senator Roberts, around the need for reasons for a decision. Let me hand over to you, Jenny, to elaborate. 

Dr Duxbury: We have recommended additional consultation because we think that, in the current context, it’s quite likely that the bill will proceed and proceed quickly. We understand that this committee will only have one day to basically ponder that question. If the bill is going to proceed on its current timetable then, frankly, adding in a consultation requirement seemed to be the only thing that was likely to improve it, given the complete absence of detail as to how it will be implemented. However, another possible improvement to the bill would be to require additional transparency regarding the making of these decisions. I believe the minister has the power both to include particular services within the scope of the bill and also to exclude them. To the extent that legislative instruments are going to be made to flesh out the detail of the bill, I think additional transparency could be very helpful. 

Senator ROBERTS: ‘A complete absence of detail’—thank you.