Senate debates
Wednesday, 4 March 2026
Motions
Online Safety Act 2021
9:31 am
Sarah Hanson-Young (SA, Australian Greens)
I rise to contribute to this debate this morning put forward by Senator Payman. I want to thank Senator Payman for bringing this issue forward, because it is important. Almost 18 months after this government introduced the social media ban—a very blunt instrument—they have failed to respond to the 67 recommendations of the Online Safety Act review. These are 67 recommendations which go to the very heart of what needs to be done to make our online world safer for everybody—safer for children, safer for adults, safer for young people, safer for old people, safer for women, safer for men, safer for everybody. But the government has been dragging the chain on implementing these recommendations precisely for some of the reasons that Senator Henderson has suggested. They don't really want to take on those who profit from the online space and who use it to peddle their fear, to peddle their hate, to peddle their misogyny, all under the false claim of freedom of speech.
If you're not allowed to abuse people and threaten people and hurt people in the supermarket, in the street, at your local sporting club, in your schoolyard or in your workplace, why on earth should you be allowed to do it in the online space, which now, of course, is the public square? There are big, big forces, vested interests, pushing back against holding those who profit off this hate online accountable. The big tech companies—those that are making massive profits of billions and billions and billions of dollars a year—don't want these types of laws enacted. They don't want the online safety commissioner to be able to do their job effectively, because they are making billions from seeing hate, misogyny and abuse circulate and go viral online. The entire business model of social media companies and the digital tycoons is fuelled by individuals spreading hate, abuse and nastiness. That is why we need not only to ensure the online safety commissioner has more powers to do their job—to crack down, hold individuals to account and take down damaging, abusive, dangerous content—but also to ensure that, as individual users online, we can control what comes into our feeds.
The algorithms that power our social media feeds and the online space are all designed to profit from fear, misogyny, racism, hatred and abuse. The tech companies—Elon Musk, Mark Zuckerberg and all of those big tech billionaires—have monetised fear and hatred. They do it through their algorithms. They control how viral it goes and how far that abuse spreads. They turn it up or turn it down. They don't care about the end user—they don't care about us as individuals online, as people, as humans. All they care about is how much money each post, each eyeball or each click is making for them. They have become merchants of hate and fear, and they use their algorithms to profiteer.
That is why we need to implement not just these recommendations—and the government needs to stop dragging its feet—but laws that allow each individual user to control their own algorithms. We should be able to make the choice as to what we want to see in our feeds or not. We shouldn't have advertising rammed into our feeds even when it's dangerous and harmful. We shouldn't have young people and kids bombarded with gambling advertising online, or alcohol ads, or ads that promote pornography sites, or ads that feed into young women's unhelpful and unhealthy body issues—or ads for quack medicine, phony drugs that help you get skinny or phony drugs that make you happy. We shouldn't have these companies being able to dictate what type of hurtful information is pushed into our feeds. If we don't want to see it, we should be able to turn it off. We should be in control of our own feeds. We can only do that if we regulate and give each of us—as customers, as users, as individuals, as people—the legal right to control what we see and what we don't see. We should be able to turn off misogynistic posts. We should be able to turn off racist rubbish. We should be able to tune out of content that is unwanted and unwarranted. The tech bros don't want us to have control of our own feeds, because they want to be able to push whatever nasty and viral content is making them maximum profits at the time. They want to be able to control what goes into every single one of our phones, our tablets and our smart televisions.
This is about choice. Yes, we need to crack down on harmful and hurtful and abusive information, comments and content. People who are posting that abuse need to be held accountable—but those who are making money from this hatred are those that really need to be taken on. We have to hit them where it hurts, and that is their business model. They should not be able to ram advertising down the throats of minors online. They should not be able to control everything that comes into your feed or my feed or our children's feeds online. It is about choice.
When I hear the coalition and the coalition spokesperson stand here and make contributions about freedom of speech—what about the freedom to choose? What about the freedom to have control over our own algorithms and not just have it outsourced to billionaires who are making billions and billions and billions of dollars in profit at the expense of the safety of women and of people in vulnerable situations—those who are struggling with eating disorders, those who are struggling with racist comments and abuse, or young men who just want to be young men growing up in the world and figuring out how they fit? They shouldn't be bombarded with misogynistic rubbish that tells them the only way to be a man is to be a bully and the only way to be a man is to be abusive to women. They should be able to opt out of that. They should not be bombarded with misogyny just because it makes people like Mark Zuckerberg millions and billions of dollars.
I urge the government to get on with the important steps of reform in this space. They promised to move on a duty of care bill. We still haven't seen it. Where is it? Eighteen months later, we've still got nothing. It is as if they introduced the social media ban for under-16s and thought: 'Oh well, job done. Move along. Nothing to see here.' Meanwhile, abuse, hatred, misogyny and racism thrive online. But they think they have done something.
There's a reason that the social media companies didn't want the government to touch their algorithms, the transparency of how they work and what data is being used. As individual users online, we should have control of our own data. We should be able to decide whether it gets sold to advertisers. We should be able to choose who gets to see it and who doesn't. We should be able to choose what's in our feeds and what isn't, what we want to see and what we don't. That is freedom. Spare me the tears from the coalition about freedom of speech. How about the freedom to choose what is on our phones and what isn't, and what Facebook, Meta, Instagram, Twitter and TikTok will make money off our content, our children's content, our young people's content and our mums' and dads' content? Australians should have the right to choose whether our data is used for advertising and what our algorithms show in our social media feeds. That would instantly make a big difference to the abuse that we see towards women, towards people of colour and towards minorities. It would see a huge decrease in the amount of abuse online, because there wouldn't be people making money off it. You want to make the online world safer? You have got to hit them where it hurts, and you have got to stop the merchants of hate and fear from being able to make money off our social media feeds.