Senate debates
Tuesday, 29 July 2025
Matters of Urgency
Cybersafety
6:09 pm
Malcolm Roberts (Queensland, Pauline Hanson's One Nation Party)
I thank Senator Babet for this motion, which One Nation supports. In the recent election, One Nation campaigned against government overreach. The Internet Search Engine Services Online Safety Code for class 1A and 1B material is a textbook case of government overreach. The eSafety commissar rejected the original code from the search industry, which was based on the industry's knowledge of what was technically achievable. Instead, her office produced its own mandatory code with stronger provisions that are seriously flawed.
The code is designed to regulate the search industry to stop children accessing harmful material, including online pornography and material showing high-impact violence or self-harm. That's a worthy cause, yet it has dangerous implications. To achieve this, every search engine must include robust age verification. In practice, this brings search engines and similar sites under the purview of the Identity Verification Services Act and the Online Safety Amendment (Social Media Minimum Age) Bill, legislation that was passed for social media and is now being extended to internet searches without further reference to parliament and without any justification for doing so.
Search engines such as Google and Bing, as well as sites like TinEye, stock photo sites, YouTube, Vimeo, Rumble and, bizarrely, Google Maps and Apple Maps, will now require proof of age for every user. A trial of age verification software last year failed: it was not possible to determine age accurately. The software could detect that someone was a child, yet not whether they were 15 or 16, making the technology pointless.
The only way to establish age is with biometric data—a face scan with photo ID. As I've already spoken about, age verification for holding an account does not work without continuous facial scanning to ensure that the person who signed in to the browser is still the person using it. If the search engine detects a face it does not recognise—again, through continuous scanning—it must show only child-friendly material. This ensures that every person will need to provide biometric data to access one of these services or only see the child-friendly results. I'll say that again. This ensures that every person will need to provide biometric data to access one of these services.
These extreme measures have a point to them. Every user of any platform that disseminates accurate information that the government doesn't like can now be regulated, and material can be taken down in the name of keeping us safe. It's censorship—any link anywhere in the world; what a power to have. How long before any posts critical of the government get taken down as dangerous and the person gets a police visit? This is happening right now in the United Kingdom under very similar laws.
The code includes specific provisions which have no relevance to the core mission of the eSafety officer, like item 12, which states that a search engine provider must take reasonable steps to identify a reputable organisation to provide information on eating disorders and to become an eating disorders crisis intervention partner, including the provision of a hotline which the search engine must promote in search results. Remember, search results include maps and AI. If I said, 'Hey, Siri, show me a map of Brisbane,' Siri would reply, 'Sure, Malcolm. Here's the map, and I've put our eating disorder crisis intervention partner on there for you.' That's how this thing is written. If that's not the intention, well, that's what happens when the government walks away from industry consultation and writes the code itself.
The eSafety commissar is there to protect children from violent material and exploitation, not from eating disorders. The framing of that section raises the question: who will be the next crisis intervention partner? We do have a hint. Item 22 states a search provider must 'engage annually with safety and community organisations, such as civil society groups, public interest groups and representatives of marginalised communities, and academics to inform their measures'. A search engine provider knows what child exploitation and violent material look like. They don't need academics to tell them. This provision gives the eSafety commissar a clear power to declare a social issue harmful to children. If not, what's it doing here?
This code is designed to formalise the process of taking down links to any messaging which the eSafety commissar deems 'wrongthink' under the guise of that information being harmful, as they did during COVID, often for material that was later shown to be factual. When is the Liberal Party going to wake up to this agenda and stand up for everyday Australians instead of walking in lockstep with the Albanese government's communist agenda?
One Nation would repeal this code, sack the eSafety commissar, repeal the Digital ID Bill and return the office of eSafety to its core job of taking down material that is child abuse, exploitation, revenge porn or overly violent—a job that it did well before this eSafety commissar— (Time expired)