Senate debates
Tuesday, 29 July 2025
Matters of Urgency
Cybersafety
5:46 pm
Helen Polley (Tasmania, Australian Labor Party)
I inform the Senate that the President has received the following letter from Senator Babet:
Pursuant to standing order 75, I give notice that today I propose to move "That, in the opinion of the Senate, the following is a matter of urgency:
"The need to recognise that the Internet Search Engine Services Online Safety Code which requires age assurance measures for account holders of search engines must be amended as it represents another layer of digital surveillance, dressed up as child protection and raises many privacy issues."
Is consideration of the proposal supported?
More than the number of senators required by the standing orders having risen in their places—
With the concurrence of the Senate, the clerk will set the clock in line with the informal arrangements made by the whips.
5:47 pm
Ralph Babet (Victoria, United Australia Party)
I move:
That, in the opinion of the Senate, the following is a matter of urgency:
The need to recognise that the Internet Search Engine Services Online Safety Code which requires age assurance measures for account holders of search engines must be amended as it represents another layer of digital surveillance, dressed up as child protection and raises many privacy issues.
This urgency motion seeks to defend a fundamental right of the Australian people: the right to privacy. The privacy implications of the Internet Search Engine Services Online Safety Code are nothing short of staggering. It is alarming, but the government remains silent, leaving it to me to stand here against the steady advance of the surveillance state. First it was a social media ban for under-16s, followed by a YouTube ban, both of which require mandatory IDs for all users of all ages. Now the focus has shifted to search engines. What comes next? This government has consistently failed to defend Australians' civil liberties. Week after week, we see new efforts in this place to erode our right to privacy and our personal freedom online. Let me say from the outset that protecting children online is a moral imperative. No-one in this chamber is going to question that. Measures like safe-search filters for minors, better parental controls and the restriction of harmful content are of course welcome, but let's not kid ourselves—this is not about protecting children; it is about building a surveillance infrastructure under the cover of safety.
Under this new code, Australians who are logged into search engines like Google, Microsoft and others will be required to undergo age assurance. That's not a polite, 'How old are you?' at the cinema; that's government ID checks. That's biometric scanning. That's data mining. We're rapidly marching towards a society where privacy online is not just frowned upon but perhaps going to become illegal. That's what's going on. Imagine this: your face, your ID and your personal browsing history all linked, logged and stored in the name of keeping kids safe. But I ask you this: who is keeping citizens safe from this creeping authoritarianism disguised as policy?
Let's be clear: most Australians are already deeply embedded in these platforms: Gmail, YouTube, Outlook et cetera. This is not a niche issue. This affects the vast majority of Australians, and their right to explore the internet freely without facial recognition or ID uploads hangs in the balance. Even more alarming is that this framework wasn't designed by us here in this place; it was co-developed by the tech giants themselves and registered by our eSafety Commissioner, Julie Inman Grant. For now this code only applies to logged-in users, but we all know it's just the beginning. The slippery slope of Canberra and its bureaucracy is very, very real.
I cannot stress enough that we are not, nor do we want to become, China or North Korea. We're Australians. That's what we are. Australians have a right to privacy, to autonomy and to live free from constant digital scrutiny. The solution to unsafe content online is simply empowering parents, not expanding government backed surveillance. If this code continues in its current form, it's going to set a dangerous precedent that everyone's search history, browsing behaviour and identity can be monitored so long as it's done under the label of safety. It is a slippery slope, like I said before. No good will come of this.
Yes, we've got to protect kids online, but not by sacrificing the freedoms of every Australian adult in the process. The code has to be amended. We have to remove the age assurance requirement and restore some common sense before the only thing that's going to be safe online is big tech's grip on all of our lives. To the Greens I say thank you for supporting my motion in defence of privacy and against creeping surveillance. I never thought I would say that, but thank you. Hell must have frozen over! To the Liberal Party, your 'we believe' statement says that you support the inalienable rights and freedoms of all peoples. Here is your chance to show it. Stand on the right side of history and join me in defending the Australian people. All senators, support my motion. Let's see what the Libs do.
5:52 pm
Corinne Mulholland (Queensland, Australian Labor Party)
I rise to speak against the MPI moved by Senator Babet. We all know that the digital age we are living in is complicated, fast paced and potentially dangerous, especially online and especially for the most vulnerable, our kids. I am proud to be part of a government that has not only stepped up to the challenge of better protecting Australians in these unprecedented times but that is acting quickly to put those protections in place.
The Albanese government is a world leader in online protections for its citizens, and we are taking our responsibility to reduce online harm and exposure for young people very seriously. That's why last year we passed historic legislation to delay access to social media until the age of 16, a campaign that was echoed by parents in the pages of our national newspapers, the Let Them Be Kids campaign. This decisive action by our government was celebrated by parents across the country, so much so that there is now growing international pressure on governments around the world to follow Australia's lead.
We make no apology for prioritising the safety of Australian kids online. In fact, it is incumbent on governments to accept some responsibility for helping Australians mitigate those dangers. It is our responsibility to do whatever we can to protect them. That's why we have quadrupled the eSafety Commissioner's base funding to ensure that they can enforce the law. We have equipped the commission to better help Australians who face serious abuse online and to also better educate Australians about those online risks. We must particularly do that for our children, and I am not sure why you wouldn't want to protect our kids online.
That's exactly what the Internet Search Engine Services Online Safety Code does. The code was developed under the Online Safety Act 2021 and plays a vital role in protecting citizens, especially our kids, from harmful online content. The code requires search engines to implement measures to detect and remove harmful content like child sexual abuse, pornography, domestic violence and terrorism material. I'm not sure why you would be against that. It specifically requires the industry to implement age-appropriate assurances for account holders by no later than six months after the code comes into effect. The code mandates that search engines take steps to reduce the risk of children being exposed to harmful content by applying a 'safe search' functionality as the default for underage account holders. Again, I don't know why you would be against that. The code emphasises user empowerment by requiring search engines to help users manage their online exposure to potentially harmful content. Again, I'm not sure why you would be against that. The code helps to prevent illegal content from spreading by restricting access to harmful content and by restricting the distribution of harmful content, so it's harder to access or share illegal material online. Again, I don't know why you would be against that.
Best of all, the code has real power. It's backed by the Online Safety Act, which provides for civil penalties and injunctions to ensure compliance, and the eSafety Commissioner can investigate potential breaches. To me, it all just makes common sense, so I'm not sure why it's so controversial on the other side of the chamber.
I will give the senator the benefit of the doubt in assuming he also wants to protect kids from sexual abuse and terrorism material online, but we need to be clear about what the senator is arguing for here. He's squibbing about an obscure technicality. That, frankly, seems a little bit paranoid. He is arguing that, for the benefit of adults, there should be no protections against children accessing online pornography that depicts specific fetish practices or fantasies. He is arguing that, for the benefit of adults, there should be no protections against children accessing pornography that depicts sexual activity between adults. He is arguing that, for the benefit of adults, we shouldn't be protecting children from high-impact material, which includes violence, drug use, suicide, death, alcohol dependency and racism, all of which I personally find repugnant beyond description as an Australian and especially as a new mum. It's not something I want my child to be accessing online—I'm not sure about those across the chamber. The most ludicrous thing is wasting our time in the Senate when we could be getting on with the real business of debating legislation.
5:57 pm
David Shoebridge (NSW, Australian Greens)
I want to start by rejecting the binary argument that you have either the Internet Search Engine Services Online Safety Code or the wild west. That is not the way to frame this quite complex debate. I also want to reject, on behalf of the Greens, the wild conspiracy theories that often circle in this space: that any kind of move towards online regulation is driven by some mysterious international cabal that wants to take your freedom.
But, that being said, come December, millions of Australians will likely need face scans or ID checks just to use Google. That's the latest online safety code for search engines, and it's really been happening without any public discussion and without any real input from this place at all, because the code didn't go through parliament. Tech companies basically wrote it themselves, and the eSafety Commissioner approved it. We couldn't vote on it, and politicians never had a chance to vote on it in this place either. They even finished these rules before the actual age verification trial was complete, and when the results came in, well, the age verification technology looked a bit wobbly. It doesn't seem to work reliably, and companies seem to want to have access to troves of people's personal data.
Search engines aren't optional in 2025. They are basic infrastructure. They're like electricity, water or our phones, and making people have to scan their faces to access basic infrastructure and basic information verges on digital surveillance. These codes will not work. Who will get hurt most? It will be vulnerable people who can't or won't verify their identity. What about them—young people seeking sexual health information, people researching harm reduction, and communities already locked out of digital services? If we're not careful, we're building digital walls around essential information.
This law, this online safety code, will only apply to logged-in users, which means anyone who understands the internet even a little bit will be asking themselves, 'Can't users just not log in?' Yes, that will still be possible, and what it means is that children can still use Google on a device that they're not logged in to or on an account that's owned by an adult. They'll be able to access pornography, adult content and other harmful content, and the government's own research proves this—people support protecting kids in theory but these online codes don't deliver that in practice. When the public learns that the abstract concept of providing child safety means face scans, handing over credit cards and handing over personal details, they're often horrified and there is no social licence for this.
We can and must do better. We must protect kids online. We must do what we can to prevent this harmful material from finding its way into young and vulnerable minds. But, instead of playing Whac-A-Mole with content, why don't we target the billionaire business models that profit from harm, including exploitative algorithms? Instead of surveilling everyone, why don't we work to regulate platforms properly? This code does entail significant digital surveillance dressed up as child protection, and it won't achieve the child protection goals. It bypassed parliament, it has ignored the evidence—some of it gathered by the government's own trial—and it will harm the people that it claims to protect. We support pausing this code and going back to the drawing board. Protecting children shouldn't mean surveilling everyone. We can have both safety and privacy but not through this broken process. So what would work? I'll tell you what would work: providing a statutory duty of care to the platforms, requiring that duty of care and then putting in place clear, enforceable obligations for that duty of care. We won't just stand by while Australia builds a surveillance state one safety measure at a time.
I want to just finish on this last point: this isn't about digital ID conspiracy theories either. Those narratives aren't helpful, and they distract people from the real concerns, about empowering tech companies to grab ever more of your personal data and information. What we need in this debate is evidence. We need a transparent process, and we need a broad commitment to child safety and privacy. (Time expired)
6:02 pm
Fatima Payman (WA, Australia's Voice)
I commend Senator Babet for putting this urgency motion forward. When the government announces changes like the Internet Search Engine Services Online Safety Code or the social media ban for under-16s, they often call them 'world-first changes'. Why is it that we always seem to have the dubious privilege of leading the world in these changes? Why is it that our government does things without thinking them through? It's because no other country on earth thinks these are good ideas. The government's noble goal of protecting children from offensive, harmful content is vital. No-one here is disputing that. But the challenge and responsibility lie in balancing safety with privacy, security and some common sense.
I've previously raised concerns about the inherent flaws in age assurance and verification technologies. Whether it's the danger of uploading your driver's licence to the internet for an unspecified amount of time or using your face to prove your age, these systems are, at best, clunky and unreliable and, at worst, a goldmine for hackers. Even the government's own age assurance trial found that these tools were not guaranteed to be effective. We think of search engines as giant, airtight vaults where our information is kept secure. I'd like you to think again. Google suffered data breaches in 2018, 2014 and 2009, Microsoft Bing in 2020 and Yahoo in 2013 and 2014. This proposed code also raises technical and legal questions. How does the government plan to address VPNs? Will the checks, which are currently applied to logged-in users, apply to logged-out users? How long will age assurance information be stored for? Where will Australians need to identify themselves online next? Under this code, we've got way too many questions and few answers. (Time expired)
6:04 pm
Maria Kovacic (NSW, Liberal Party, Shadow Assistant Minister to the Leader of the Opposition)
The eSafety Commissioner, Julie Inman Grant, registered the Internet Search Engine Services Online Safety Code in June this year. The changes are due to take effect from December 2025. The intent of the code, developed under the Online Safety Act 2021, is to protect children from harmful online content like pornography, child abuse material, high-impact violence and self-harm material—for example, material which promotes eating disorders and suicide—by preventing it from being returned in search results. Its intent is also to restrict AI functionality and associated algorithms integrated with search engines from being used to generate synthetic versions of this type of material.
The code will require search engine providers to implement age verification safety settings, parental controls and crisis prevention measures so that, if a user is not identified as being over 18 years of age, any such images will be filtered out of search results. Once these changes take effect in December, if a user is not logged into their online account, their search results will be filtered. These filters are generally guided by AI, and we know that they can get it wrong. Just look at the current Meta debacle, with people's business accounts being blocked and deleted.
Protecting children from harm is of paramount importance. That is not up for debate here. But since the code was registered we have received feedback from Australians who feel that this code, which was registered by the eSafety Commissioner without any legislative scrutiny or parliamentary oversight, will impinge on their privacy and personal freedoms. Those voices cannot be ignored. And, whilst the intent is to protect young people from harm, it is essential that this be balanced with an individual's right to privacy and protection of personal freedoms.
The coalition has consistently pushed back, and will continue to push back, on legislation that restricts or risks suppressing a right to freedom of speech. However, it is important to note that it is incorrect to suggest that all measures designed to protect children online are merely a Trojan horse for government surveillance of ordinary people's use of the internet. Government can both protect children from the very real risks of online harm and uphold the rights and freedoms of all Australians. It is not a matter of either/or. It can and must do both.
We all remember Labor's disastrous attempts to impose its misinformation bill. We were forced to dump it after a massive public outcry. Not a single senator in this place, other than those from the other side, would support it. It was the coalition which forced the government to make changes to the online safety laws last year to prevent the risk of digital ID being imposed. We make no apologies for holding this government to account to ensure that they do not overreach or go too far.
The eSafety Commissioner's dual role in developing, regulating and enforcing her own policies is unique, but the concentration of this authority is now raising concerns which may require greater scrutiny—for example, through this parliament. We must ensure—and this must always be the case with legislation and delegated legislation—that the operations of such bodies do not stray from the intent of the parliament when they were first established.
Let's be clear. The eSafety Commissioner is not an elected position. Its decisions do not come before the parliament for us to scrutinise. The remit of the eSafety Commissioner, without adequate safeguards, is getting out of hand, and we must pause to consider this. Ensuring every adult logs into an account to browse the internet is taking the eSafety Commissioner's power to a new level which must be debated and scrutinised further. But we can't lose sight of the fact that we must protect our children from online harm.
6:09 pm
Malcolm Roberts (Queensland, Pauline Hanson's One Nation Party)
I thank Senator Babet for this motion, which One Nation supports. In the recent election, One Nation campaigned against government overreach. The Internet Search Engine Services Online Safety Code for class 1A and 1B material is a textbook case of government overreach. The eSafety commissar rejected the original code from the search industry, which was based on their knowledge of what was technically achievable. Instead, her office produced their own mandatory code with stronger provisions which are seriously flawed.
The code is designed to regulate the search industry to stop children accessing harmful material, including online pornography and material showing high-impact violence or self-harm. That's a worthy cause, yet with dangerous implications. To achieve this, every search engine must include robust age verification. In practice, this brings search engines and similar sites under the purview of the Identity Verification Services Act and the Online Safety Amendment (Social Media Minimum Age) Bill—legislation that was passed for social media and is now being extended to internet searches without further reference to parliament or with any justification for doing so.
Google or Bing search engines, as well as sites like TinEye, stock photo sites, YouTube, Vimeo, Rumble and, bizarrely, Google Maps and Apple Maps, will now require proof of age for every user. A trial of age verification software last year actually failed; it wasn't possible to tell age accurately. It can detect that someone is a child, yet not whether they are 15 or 16, making the technology pointless.
The only way to establish age is with biometric data—a face scan with photo ID. As I've already spoken about, age verification for holding an account does not work without continuous facial scanning to ensure that the person who's signed in to the browser is still the person using it. If the search engine detects a face it does not recognise—again, through continuous scanning—search engines must show only child-friendly material. This ensures that every person will need to provide biometric data to access one of these services or only see the child-friendly results. I'll say that again. This ensures that every person will need to provide biometric data to access one of these services.
These extreme measures have a point to them. Every user of any platform that disseminates accurate information that the government doesn't like can now be regulated, and material can be taken down in the name of keeping us safe. It's censorship—any link anywhere in the world; what a power to have. How long before any posts critical of the government get taken down as dangerous and the person gets a police visit? This is happening right now in the United Kingdom under very similar laws.
The code includes specific provisions which have no relevance to the core mission of the eSafety officer, like item 12, which states that a search engine provider must take reasonable steps to identify a reputable organisation to provide information on eating disorders, to become an eating disorders crisis intervention partner, including the provision of a hotline which the search engine must promote in search results. Remember, search results include maps and AI. If I said, 'Hey, Siri, show me a map of Brisbane,' Siri would reply, 'Sure, Malcolm. Here's the map, and I've put our eating disorder crisis prevention partner on there for you.' That's how this thing is written. If that's not the intention, well, that's what happens when the government walks away from industry consultation and writes the code itself.
The eSafety commissar is there to protect children from violent material and exploitation, not from eating disorders. The framing of that section begs the question: who will be the next crisis prevention partner? We do have a hint. Item 22 states a search provider must 'engage annually with safety and community organisations, such as civil society groups, public interest groups and representatives of marginalised communities, and academics to inform their measures'. A search engine provider knows what child exploitation and violent material look like. They don't need academics to tell them. This provision provides a clear power for the eSafety commissar to declare a social issue as harmful to children. If not, what's it doing here?
This code is designed to formalise the process of taking down links to any messaging which the eSafety commissar deems 'wrongthink' under the guise of that information being harmful, as they did during COVID, often for material that was later shown to be factual. When is the Liberal Party going to wake up to this agenda and stand up for everyday Australians instead of walking in lockstep with the Albanese government's communist agenda?
One Nation would repeal this code, sack the eSafety commissar, repeal the Digital ID Bill and return the office of eSafety to its core job of taking down material that's child abuse, exploitation and revenge porn and that is overly violent—a job that it has done well before this eSafety commissar— (Time expired)
6:14 pm
Alex Antic (SA, Liberal Party)
The Internet Search Engine Services Online Safety Code, as we've heard today, is set to come into effect in December this year, and what it is going to effectively do is require internet search engine providers to confirm the age of their account holders, amongst a range of other things that we've heard about this evening. But the crux of this is that what this will mean is that every Australian will be locked out of their Google account, their Yahoo account or their Microsoft account until they verify their age, which I think is quite incredible. I think it's unbelievable, actually. How does a big tech company actually achieve this in a practical sense? How does a big tech company actually identify you and confirm your age? Well, it does that by simply confirming your identity, and it means this is not just targeted at children; it's going to be targeted at everyone. You can't identify who is 16 and over unless you identify everybody.
That's what this is about. Let's be very clear about it. That's what this has always been about. It was the same with the original legislation banning social media for under-16s, which was a trojan horse for this very position. This was rushed through in parliament last year. Now we're seeing an industry code which, as many of the contributors tonight have said, has been brought out without industry consultation and with a bureaucratic stroke of a pen. The selling point has always been about protecting kids online, which nobody in this room disagrees with; it's trite to suggest otherwise. Yet the measures that are used to achieve this always seem to come back to the issue of requiring more government intervention and more government surveillance of Australians.
Why is it happening? What is the real story here? Well, the corporate sector and the administrative state have lost control of the narrative. They've lost control of the media cycle, and they're driven by the fear that the internet and social media platforms are now empowering populist and alternative views which they regard simply as unhelpful. It just sounds like free speech to me. The consequences of this new code will be yet further erosion of our privacy and another step towards this digital dystopia that we've been talking about.
For us, as a country that purports to identify itself as free and as a country that believes, I think, at its core that unnecessary surveillance is unacceptable, this is extraordinary. We're not communist China. But, if you want to know what Australia's future looks like on its current trajectory, go and have a look at Shenzhen in China—cameras everywhere, facial recognition, a digital currency and a social credit system. That is not an Australia that I want. It's not an Australia that I think anyone really wants. But it's an Australia that the bureaucracy and the political class are slow-marching us towards. It's that serious. This is not just about protecting kids online. Once this system is in place—once the digital snare trap is in place—it is going to be impossible to wind back.
Sue Lines (President)
The question is that the matter of urgency moved by Senator Babet be agreed to.