House debates
Monday, 30 March 2026
Adjournment
Cybersafety, Child Abuse
7:40 pm
Andrew Wallace (Fisher, Liberal National Party)
Earlier today, I attended a briefing that every member of this place should take very seriously. It dealt with one of the most urgent issues before us: the safety of children in an online world that has too often been built without proper safeguards. What we saw and heard today was confronting, but it was also clarifying, because the excuse that nothing can be done no longer exists. The technology exists. The capability exists. What matters now is whether this government has the will to act.
We heard from the eSafety Commissioner, international experts and frontline advocates, and the message was simple: child sexual abuse occurring in real time can be prevented. Livestreamed abuse can be stopped. Harmful content can be blocked before it is even seen, stored or shared—not after the harm is done, but before it happens.
That is the shift we need, because, for too long, the system has been reactive: content taken down later; investigations undertaken after the fact; support offered after the trauma. When it comes to children, that is simply not good enough. We should not be mopping up harm; we should be preventing it—turning off the tap, not cleaning up the mess.
What stood out today is that this can be done without breaking encryption—without compromising privacy. And that matters, because Australians should not be forced to choose between safety and privacy. We can have both: safety by design, built into the device, built into the platform, built into the system, from the very beginning, not bolted on as an afterthought.
And yet we heard clearly that many of the world's largest technology companies already have these tools, but they are not using them where it matters most. That is not a technical failure; that is a failure of responsibility.
The under-16 social media reforms are important. Age restrictions matter. But they do not deal with what is happening in real time. They don't deal with adult accounts. They don't deal with livestreamed abuse. They don't deal with encrypted messaging services being used as crime scenes.
That is why the next step must be a strong digital duty of care—a duty that shifts responsibility back onto platforms. If you build it, if you profit from it, you must make it safe. We do not accept unsafe toys. We do not accept unsafe cars. So we should not accept unsafe digital platforms.
We warned about these risks years ago, through parliamentary inquiries and reports that I helped lead. We are now finally seeing courts overseas holding platforms accountable for failing to protect children. The direction is clear. The time for delay is over.
There is one principle that should guide everything we do in this place. Protecting children must come first—not second, not after consultation, not after harm occurs; it must come first. That principle must apply not just online but in our communities as well.
This brings me to an issue that continues to concern families on the Sunshine Coast: the presence of a New Zealand national accused of serious sexual offences involving children. Authorities were reportedly warned about this individual as early as 2024. And yet action appears to have followed only after public exposure by A Current Affair.
The Minister for Home Affairs has clear powers under section 501 of the Migration Act—powers that exist for exactly this situation. So the questions are simple: Why has this not already been done? Why is this individual still in my community, when he should have been deported? No community should ever become a refuge for someone seeking to avoid facing serious allegations involving children.
Today's briefing shows us something very important. This problem is not unsolvable. Prevention is possible. Technology is ready. The law can be strengthened. The power already exists. What we need is action from this government.