Senate debates

Monday, 24 November 2025

Bills

Online Safety and Other Legislation Amendment (My Face, My Rights) Bill 2025; Second Reading

3:41 pm

David Pocock (ACT, Independent)

I move:

That this bill be now read a second time.

I seek leave to table an explanatory memorandum relating to the bill.

Leave granted.

I table an explanatory memorandum and seek leave to have the second reading speech incorporated in Hansard.

Leave granted.

The speech read as follows—

We are moving into an unprecedented moment in human history. For generations, the tools we created changed the world around us: how we farm, how we travel, how we communicate. But they didn't touch the fundamental boundary of our own identity. Today, that boundary is dissolving. We are now on the precipice of a technological shift so profound that it challenges not only how we live, work, and communicate, but who we are.

Artificial Intelligence (AI) is beginning to touch almost every aspect of our society. From the way children play and learn, to the security of our elections, to how businesses operate, there is almost no part of modern life that will remain untouched. But in the midst of this sweeping transformation, we must pause and recognise something simple but essential: in a world where anything can be fabricated, authenticity becomes even more precious. And protecting the individual identity of Australians is more important than ever.

Every wrinkle we have, every scar we carry, every sound we make, every facial expression, every gesture—these are not digital assets or interchangeable features. They are the biography of a human life. Our faces, our voices, our likeness: these belong to each of us. We cannot allow them to be claimed, commercialised against our will, or exploited by whoever has the most powerful software or the least regard for consent.

Until very recently, this wasn't something we had to worry about. No matter how skilled the impersonator or how advanced the visual effects, humans could still tell the difference between imitation and reality. AI has changed that rapidly and dramatically. We are now confronted with images, audio, and video so lifelike that even experts can struggle to tell what is real and what is artificial: deepfakes that our subconscious remembers and responds to even when the deception is pointed out to our conscious mind. And as the technology improves and the cost of accessing it comes down, the barrier to misuse gets lower and the potential for serious harm becomes exponentially higher.

In 2023, technologist and human rights advocate Sam Gregory said, "It doesn't take many seconds of your voice, many images of your face, to fake you, and the realism keeps increasing." We are now living in a world where this capacity is being used to deceive, to humiliate, to exploit, and to profit.

This bill—the Online Safety and Other Legislation Amendment (My Face, My Rights) Bill 2025—is a response to that reality. It is grounded in a simple principle: your face is yours; your voice is yours; your likeness is yours. And when these are taken without your consent, used to mislead or harm you, or turned into a product for someone else's gain, there must be consequences.

The bill provides additional protections and avenues for recourse that are lacking in our current legislative and regulatory framework.

It strengthens the Online Safety Act 2021 by creating a dedicated complaints system for deepfake material. It gives the eSafety Commissioner new powers to require platforms, hosting services, and individual users to remove non-consensual deepfake content. The bill also introduces civil penalties for posting deepfake material without consent. Where a platform refuses to remove harmful fabricated content, the Commissioner can direct it to act quickly. These measures build on the successful systems already in place for cyberbullying, cyber-abuse, and image-based abuse, but recognise that deepfakes present a new kind of threat that requires a stronger, more tailored response.

The bill also amends the Privacy Act 1988 to establish a new cause of action for the wrongful use or disclosure of deepfake material. This gives individuals a direct right to take action in court when their likeness has been abused. They can seek injunctions, damages, or other appropriate remedies. Crucially, the action does not require proof of financial loss, because the harm caused by deepfakes is often emotional, reputational, or psychological. For someone whose identity has been misused, the violation itself is the injury.

Together, these reforms recognise that AI-generated deepfakes can be used to inflict extraordinary harm: impersonation scams that trick family members out of life savings; fabricated videos designed to ruin reputations; fake political footage intended to mislead voters; or, as we are seeing all too often, sexualised deepfakes, predominantly of women and girls, used for humiliation, coercion, or abuse.

These harms are not hypothetical. They are happening right now in Australia to people who often have no recourse, no avenue for removal, and no meaningful legal protection. This bill changes that.

It creates a uniform definition of deepfake material. It establishes clear standards for consent, ensuring it must be voluntary, express, and informed. It sets out responsibilities for service providers and introduces enforcement mechanisms that reflect the speed at which content spreads. And it recognises that the dignity, privacy, and autonomy of individuals must be protected not just in the physical world, but in the digital world where so much of our life now exists.

The bill helps to align our domestic law with Australia's international human rights obligations, including the International Covenant on Civil and Political Rights and the Convention on the Rights of the Child.

Importantly, the bill also contains safeguards. It preserves legitimate uses of manipulated or artificial content: by journalists acting under professional standards, by law enforcement and intelligence agencies acting in good faith, and by those acting for genuine medical or scientific purposes. It ensures that while we protect individuals from harm, we also uphold freedom of expression and the public interest in a balanced and proportionate way.

This is not about banning satire, art, innovation, or political commentary. It is about stopping harm, stopping exploitation, and stopping the misuse of a person's identity against their will. It is about drawing a clear ethical line in a rapidly changing technological landscape.

The singular nature of each of us, the absolute uniqueness of our voice, our face, our presence, is part of what it means to be human. No one should be able to take that from us. No one should be able to hijack who we are, distort it, and deploy it for harm or profit. And no Australian should be forced to navigate that harm alone, without protection or means of redress.

This bill acknowledges a simple but profound truth: in the age of AI, our likeness is part of our identity, and our identity deserves protection.

With this bill, we take an important step toward ensuring that Australians have the rights, the remedies, and the protections they need in a world where the line between real and artificial grows thinner every day.

I commend the bill to the Senate.

I seek leave to continue my remarks later.

Leave granted; debate adjourned.
