Senate debates
Tuesday, 29 July 2025
Matters of Urgency
Cybersafety
5:57 pm
David Shoebridge (NSW, Australian Greens)
I want to start by rejecting the binary argument that you have either the Internet Search Engine Services Online Safety Code or the wild west. That is not the right way to frame this quite complex debate. I also want to reject, on behalf of the Greens, the wild conspiracy theories that often circulate in this space: that any kind of move towards online regulation is driven by some mysterious international cabal that wants to take your freedom.
But, that being said, come December, millions of Australians will likely need face scans or ID checks just to use Google. That's the latest online safety code for search engines, and it's really been happening without any public discussion and without any real input from this place at all, because the code didn't go through parliament. Tech companies basically wrote it themselves, and the eSafety Commissioner approved it. Politicians never had a chance to vote on it in this place. They even finished these rules before the actual age-verification trial was complete, and, when the results came in, well, the age-verification technology looked a bit wobbly. It doesn't seem to work reliably, and companies seem to want access to troves of people's personal data.
Search engines aren't optional in 2025. They are basic infrastructure. They're like electricity, water or our phones, and making people have to scan their faces to access basic infrastructure and basic information verges on digital surveillance. These codes will not work. Who will get hurt most? It will be vulnerable people who can't or won't verify their identity. What about them—young people seeking sexual health information, people researching harm reduction, and communities already locked out of digital services? If we're not careful, we're building digital walls around essential information.
This law, this online safety code, will only apply to logged-in users, which means anyone who understands the internet even a little bit will be asking themselves, 'Can't users just not log in?' Yes, that will still be possible, and what it means is that children can still use Google on a device that they're not logged in to or on an account that's owned by an adult. They'll be able to access pornography, adult content and other harmful content, and the government's own research proves this: people support protecting kids in theory, but these online codes don't deliver that in practice. When the public learns that the abstract concept of providing child safety means face scans, handing over credit cards and handing over personal details, they're often horrified, and there is no social licence for this.
We can and must do better. We must protect kids online. We must do what we can to prevent this harmful material from finding its way into young and vulnerable minds. But, instead of playing Whac-A-Mole with content, why don't we target the billionaire business models that profit from harm, including exploitative algorithms? Instead of surveilling everyone, why don't we regulate platforms properly? This code entails significant digital surveillance dressed up as child protection, and it won't achieve its child-protection goals. It bypassed parliament, it ignored the evidence—some of it gathered by the government's own trial—and it will harm the people it claims to protect. We support pausing this code and going back to the drawing board. Protecting children shouldn't mean surveilling everyone. We can have both safety and privacy, but not through this broken process. So what would work? I'll tell you what would work: imposing a statutory duty of care on the platforms and then putting in place clear, enforceable obligations to give that duty teeth. We won't just stand by while Australia builds a surveillance state one safety measure at a time.
I want to finish on this last point: this isn't about digital ID conspiracy theories either. Those narratives aren't helpful, and they distract people from the real concern: empowering tech companies to grab ever more of your personal data and information. What we need in this debate is evidence. We need a transparent process, and we need a broad commitment to both child safety and privacy. (Time expired)