House debates

Wednesday, 6 November 2024

Bills

Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024; Second Reading

11:45 am

Brian Mitchell (Lyons, Australian Labor Party)

There was so much wrong with that speech from the member for Longman, it would take all of my 15 minutes to address all the points. The irony is that we're standing here talking about a bill about misinformation and disinformation and we had to listen to that contribution. Nobody is banning opinions. Nobody is banning the right to express an opinion or have a say. The Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024 is about addressing the spread of misinformation and disinformation. It's about the prevalence of fiction being portrayed as fact.

This is not simply an issue about copy-and-paste Facebook circulars that claim your data will be protected under the Rome Statute if you post certain words on your timeline. It's a serious issue that has direct implications for public safety, democracy, societal cohesion and our national economy. The fact is all sorts of very serious people in our society, from national security agencies downwards, are saying this is a serious issue that requires addressing.

This misinformation and disinformation bill is an essential piece of legislation to protect Australians from the harms caused by false and misleading information. We've always had these sorts of protections in the past, with broadcast media and print media. But social media and the internet are an entirely new beast, in the pace of it and the way algorithms are used. We are dealing with a different ecosystem for the spread of information.

Our society has always functioned in a way where broadcast media and print media have been required by society to be truthful and factual, and there are consequences when they're not. So what's at risk? The fact is, online platforms have become essential parts of our lives, connecting us, at their best, with family, friends, news and the world at large. That's a good thing. Whether it's Facebook, Instagram, Threads, X or any other platform, we use these sites to entertain, stay connected and even find out information about issues affecting us locally and across the globe. It's an information-sharing thing. This connectivity has many benefits, but it also presents real risks. Digital platforms, when unregulated, can and do spread information at speeds and scales never seen before, amplifying misleading narratives that threaten our society's wellbeing and, indeed, targeting users with even more misinformation. If you seek information about a certain subject, the algorithm says, 'Oh, this guy is really interested in that; we'll pump in more of that,' and before you know it you're down a really deep well of misinformation and conspiracy theories. Suddenly, your nice, normal next-door neighbour is a raving lunatic.

The fact is, we live in a world where misinformation can impact public health, such as during COVID-19, when false information about vaccines and the benefits of vaccines circulated so rapidly. This is not a trivial issue. Misinformation led many to question or outright refuse life-saving medical treatment. We see it even now, with young children being refused life-saving medical treatment because their parents believe the nonsense on social media saying that somehow it's not good for their kids. It has led to a distrust in government, and I'm not talking about Labor governments or any other sort of government; I'm talking about government in general. It has led to a growing distrust in government and an overarching thought that there is no social cohesion and that the only person you can trust and stand behind is yourself. It's a dangerous narrative for any society that seeks to bring people together.

The spread of misinformation also affects our democratic processes, our societal cohesion and even our physical safety. The US election that's underway right now has been rife with battles of misinformation and disinformation. We do separate the two. The fact is, people are allowed to express their opinions, no matter how out there they are. Nobody is denying Sky After Dark's right to express its opinion on the Right of the political scale. You've got Fox News in America doing the same, expressing their opinion. You can do that in a democratic society. There's a difference between expressing an opinion and spreading misinformation. If nothing else, the nature of the 2024 presidential race is a case in point for why this bill is so important. There's also the last presidential election in the US and the claims that it was stolen. The misinformation and disinformation emanating from that shocking assault on the Capitol show that fact itself is under attack.

On a local level, recent misinformation spread like wildfire following attacks in Southport and at Bondi Junction. Misleading narratives and false reports spread rapidly on social media, escalating public fear and, in some cases, inciting hatred against specific minority groups who it later turned out had nothing to do with those heinous crimes. Such events make it clear that when digital platforms do not adequately respond to harmful content on their platforms, the consequences are real and deeply felt by communities across Australia.

Let's remember that broadcast platforms are responsible for the content on their platforms. Print platforms—newspapers and magazines—are responsible for content on their platforms. Digital platforms should be held to the same standard. It is vital that we address the spread of misinformation and disinformation as a priority for our democracy. The concerns of everyday Australians reflect this urgency. Research from the Australian Media Literacy Alliance's Adult media literacy in Australia report published this year indicates that 80 per cent of Australians—eight in 10!—are worried about misinformation. It is amongst the highest levels of concern reported globally and highlights a serious erosion of trust in the digital information landscape right here at home. Australians are looking to their government to take meaningful action to ensure the information they encounter online is reliable, accurate and safe. They don't want the government to buy into misinformation pushes and propaganda.

I don't deny that the digital platform industry and some specific social media platforms have taken some steps to seek to tackle misinformation and disinformation, but it has not been enough. The industry as a whole has a variety of voluntary codes. Not enough is being done. They make a lot of money in this country; they can spend more to deal with this. One example of the industry's attempts is the Australian Code of Practice on Disinformation and Misinformation. It's a code designed to tackle misinformation and disinformation. It's an initiative I welcome, but this code has some clear and undeniable limitations. The transparency reports produced under the code have been inconsistent, lacking the Australia-specific data necessary for effective monitoring and assessment. The voluntary code, while a step in the right direction, has not been enough to meet the growing threat of misinformation and disinformation online in Australia—and I don't buy the arguments about why it can't be done. The fact is if you're talking to your wife about some shoes that you like, somehow it ends up in your feed. Somehow there are ads popping up for those shoes. If they know that, they can deal with this. They're clever enough to have very sophisticated data-tracking and management systems. They can put some of that knowledge to work and deal with misinformation and disinformation. That's why this legislation is necessary.

The Australian Communications and Media Authority, ACMA, has repeatedly urged the government to introduce regulatory powers to hold digital platforms accountable. Now ACMA's not some big tsar of information and truth, which is what the member for Longman suggested. It doesn't want to sit there with a god-like status telling people what they can and can't believe. It's nothing like that. It's a trusted regulatory body. Those opposite, who are the alternative government of this country, should not be spreading distrust in government agencies.

ACMA's insights highlight the critical need for stronger enforceable standards that ensure digital platforms are transparent about how they handle misinformation and that they take concrete steps to protect users from harmful content. The Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024 answers that call. It amends the Broadcasting Services Act 1992 and makes consequential changes to related legislation, including the Australian Communications and Media Authority Act 2005, the Telecommunications Act 1997 and the Online Safety Act 2021. Through these amendments, the bill establishes a regulatory framework that empowers ACMA to oversee digital platforms and ensure that they responsibly manage misinformation and disinformation on their services.

The bill has three primary objectives, each carefully crafted to address community concerns. No. 1 is to require digital platforms to manage the risks of misinformation and disinformation. Under the bill, digital platforms will now be legally obligated to assess, mitigate and take responsibility for the potential harms caused by misinformation and disinformation in Australia. I repeat that these sorts of measures already exist in broadcast and print media. We're just bringing digital media into the fold. No. 2 is to increase transparency. Digital platforms will now be required to disclose the actions they are taking to combat harmful content, ensuring the public can see and understand those efforts. No. 3 is to empower users. Australians who use digital media should be equipped with the tools to critically engage with information online. By providing transparency, platforms can empower users to make informed decisions and identify false information themselves.

We know that tackling misinformation and disinformation online is everybody's job and can only be achieved by all users critically engaging in the content they receive online. I say that at a time when we see things like deepfakes emerging. Technology has become so incredibly advanced that videos can emerge online of people, just random people, purporting to be national leaders or celebrities, whoever they are, and the average user has no idea that what they are seeing and hearing is fake. There's no way to discern it, because the technology is just so good. And that spreads like wildfire. 'Look at what this political leader has said!' People believe it, and then they spread the chain. The digital platforms have to be responsible for tackling that. They have to be responsible for that.

This bill will also provide ACMA with a series of new powers to hold digital platforms accountable in fulfilling these three primary objectives. No. 1 is transparency and accountability, because transparency is essential to this legislation. Under the new framework, digital platforms are required to disclose their efforts to combat harmful misinformation and disinformation. This includes publishing a clear media literacy plan, detailing the steps they are taking to empower users in identifying and responding to false information. This requirement empowers Australians with the knowledge they need to navigate online content more confidently and critically. In addition, platforms will be required to publish policy documents on how they handle misinformation and disinformation and the results of any risk assessments they have conducted. This level of transparency is fundamental to public trust. Australians need to know that digital platforms are taking their safety and wellbeing seriously.

No. 2 is information gathering and record keeping. Another critical component of the bill is information and record-keeping powers. ACMA will have the authority to obtain information from digital platforms on their efforts to address harmful content. Platforms will be required to maintain records of their actions and may be called upon to provide these records to ACMA periodically. This creates a clear accountability trail, allowing regulators and the public to track a platform's progress over time and understand how they are evolving their strategies to combat misinformation. The bill includes protections for end users in this process. The information-gathering powers are designed to protect individuals' privacy, as they do not apply to content posted by regular users unless those individuals are employees, content moderators, fact checkers or otherwise providing services to the platform provider. The example used by the member for Longman about people being able to express an opinion about the health impact of eggs—totally safe. This approach balances the need for transparency with the protection of personal privacy.

No. 3 is the code and standard making powers. Under the bill, ACMA will have the power to register industry codes and make enforceable standards if voluntary efforts are inadequate. This ability to set standards is essential because it provides a regulatory backstop. Should voluntary measures prove ineffective, ACMA will be able to require platforms to implement specific practices. We're also safeguarding freedom of expression. I'm sure other speakers will go into detail on that.

The fact is that this bill is needed. Agencies say that it's needed. We live in a time where misinformation and disinformation are absolutely rife, where technology is providing things like deepfakes and a hostile and deliberate way for misinformation and disinformation to be spread, not as an accident but as a deliberate way to affect our democracy. This bill will help deal with that and will hold digital media platforms to the same sort of account as broadcast and print media.
