House debates

Monday, 15 February 2021

Private Members' Business

Social Media Platforms

10:38 am

Anne Webster (Mallee, National Party)

I move:

That this House:

(1) is alarmed by the undue pain and distress experienced by Australians who are defamed, bullied or otherwise harassed on social media platforms;

(2) notes that:

(a) global technology companies which provide social media platforms inadequately monitor platforms for defamation, bullying or harassment of individuals; and

(b) global technology companies are slow to respond when complaints are made, increasing the damage to individuals;

(3) recognises that social media is a global sphere of communication in which vulnerable citizens can be unfairly targeted by individuals, with little consequence to the platform that hosts damaging content;

(4) expresses concern that current regulations do not adequately address global technology companies which control social media platforms; and

(5) calls on the Government to develop:

(a) a public regulatory framework within which decisions about removing content are made by social media platforms, to ensure community expectations around safety and free speech on social media platforms can be met; and

(b) legislation which holds social media platforms to account as publishers of the content hosted on their sites, impressing the legal responsibilities that designation entails on those platforms.

Big tech companies such as Twitter, Facebook and Google have amassed extraordinary power in the global corporate and political landscape. There is no doubt that 6 January 2021 will be remembered as a dark day in American history. The events of the day and the resulting fallout will be referenced, studied and analysed for years to come. One moment in particular—the permanent banning of Donald Trump's Twitter account—will be a watershed moment in the debate surrounding free speech and censorship on social media platforms and the question of regulation of big tech companies. Thanks to social media platforms, we have arrived at a new reality of 'glocalisation', in which the local is merged with the global through online portals. We have become increasingly reliant on big tech companies and their services, and governments around the world have not kept pace with these transformations and their consequences. We are now working to catch up.

Freedom of speech is an inherent right that we must protect at all costs, but it is not a right to lie or incite violence. Free speech is vital to our democracy but must be limited to prevent harm. Limitations on free speech, however necessary they may be, will always be contestable. It remains a significant challenge to get these limitations right. However, the problem we face now is that big tech companies are themselves responsible for determining their own limitations. They are acting as the moral arbiters of our society, which, I argue, is the role of representative government, not a technology company. Big tech firms write their own rules and are accountable to themselves alone. This is causing serious issues for you and me as consumers.

I have personal experience of how these issues can affect people's lives. For several months in 2020, my husband and I, as well as the charity we founded to help single mothers access education, were the targets of baseless and defamatory accusations made by a conspiracy theorist on Facebook. It was unrelenting for months. Despite originating in New Zealand, the accusations were widely distributed and even reached local networks in my electorate of Mallee. My first thought was for the reputation of an essential charity that young, disadvantaged mothers in my community rely on. I was concerned that the mothers would be driven away from the service, because of these lies, and left even more vulnerable. On top of that, my husband and I were concerned for our safety, and we even installed security cameras at our home, for our peace of mind. It was an incredibly distressing time for my family.

Despite repeated requests to have the damaging content removed, the posts remained public—in some cases for several months—until court proceedings got the attention of Facebook. These legal proceedings have cost us over $150,000 so far, despite a successful court case. Even with damages awarded, the prospect of recouping these losses is remote. Expensive civil proceedings are one of the few means of recourse currently available to people who have been defamed, bullied or harassed online. What concerns me is that many thousands of people who endure bullying and defamation online will lack the means to clear their name or protect their family. Social media enables the tarnishing of reputations and the destruction of lives, with very few avenues for justice. This is untenable, and it must change.

There are multiple ways these issues need to be addressed, and our government is working to keep Australians safe online. I welcome the strengthening of the role of the eSafety Commissioner and the provision of $39.4 million over the next three years. We're also introducing a new online safety bill. The bill includes a new adult cyber abuse scheme, which would capture abusive behaviour that has reached a criminal threshold. It would give the eSafety Commissioner the power to direct platforms to remove abusive material within 24 hours. While these are positive measures, I believe further steps need to be taken. Social media platforms need to be held to account as publishers of the content that is hosted on their sites. If a newspaper, radio station or TV channel defamed an individual or incited violence through their publications, they could be sued or prosecuted to the full extent of Australian law. At this point, the same is not true for Facebook, Google, Twitter, Instagram and other social media platforms.

The business models of social media giants are very similar to those of traditional news media, yet the rules governing print media, radio and television are vastly more prescriptive than those that apply to digital platforms. Traditional news media are held to a much higher standard under the law, which puts them at a commercial disadvantage to digital platforms. This isn't fair, and it doesn't provide for a competitive media industry. Social media giants hide behind the excuse that they are nothing more than a virtual town square and therefore cannot be held responsible for anything that is shouted out. But the fact is that the technology and algorithms that underpin these platforms are incredibly sophisticated. The platforms show you what they think you want to see. They are designed to keep you engaged for as long as possible. These facts alone demonstrate that big tech companies are making editorial decisions about the content you see on their platforms.

If big tech companies want to preserve their power to moderate and promote content on their sites, they need to be treated under the same legislative framework as traditional news media and held to account for the consequences of hosting damaging content. In addition, the government should pursue the creation of a public regulatory framework to guide the moderation of content on social media platforms to ensure community expectations around safety and free speech are met. I understand that work is under way to develop a voluntary code along these lines. At the government's direction, DIGI, an association representing the digital platforms industry in Australia, has developed a draft code. The Australian Communications and Media Authority is overseeing the development of the code, and I hope it's found to be sufficient to address the challenges we face regarding disinformation and defamation. Both measures—treating big tech companies as publishers and introducing a code of conduct—are essential. Holding big tech companies to account as publishers would provide an incentive for these companies to follow the code of conduct, thereby ensuring the decisions they take to moderate and promote content are in line with community expectations. I know the Minister for Communications will consider further measures should the voluntary code prove inadequate to address the problems we face.

Recently the member for Newcastle, Sharon Claydon, and I formed the Parliamentary Friends of Making Social Media Safe group to continue the important discussion around online safety. We are thrilled by the reception of the group so far. Seventy-five members and senators from both sides have already joined, which I think demonstrates the interest and concern that so many have regarding social media. One step I've taken in my electorate is to inform young families in particular about the dangers online. The internet is not just a harmless space for kids to watch YouTube. Consequently, I've drawn up a handout for the residents of Mallee about the importance of online safety. It's hot off the press and will be sent out this week.

The progress made in global communication and interconnectedness, thanks to social media platforms, has been remarkable. With this progress comes a responsibility to ensure that people are safe when using these platforms. I am focused on fighting for change to ensure our kids and grandkids are safe online and that our society has a healthy relationship with social media, going forward.
