House debates
Wednesday, 6 November 2024
Bills
Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024; Second Reading
6:59 pm
Kylea Tink (North Sydney, Independent)
The prevalence of misinformation and disinformation, both on digital platforms and in the media more broadly, is a cause of great concern not only for me and my community of North Sydney but for communities around the world. To combat this issue effectively, democratic governments worldwide are trying to tread a fine line, balancing the need to objectively deconstruct untrue and harmful content with the need to protect freedom of expression. After all, online platforms can be used for purposes that both strengthen and undermine democracy and its values. They can increase access to information, facilitate the free exchange of ideas, increase the diversity of voices contributing to public discussion and allow for broader public participation in the democratic conversation. But they can also be used in ways that pose a threat to democratic processes, through disinformation campaigns that undermine trust in public institutions and exacerbate divisions within society. As the Australian Human Rights Commission stated in its submission to the Senate Select Committee on Foreign Interference through Social Media:
The challenge lies in ensuring that any policy responses mitigate the risks posed by the latter, without disproportionately impacting upon the former.
That is the challenge before the parliament now—to assess whether this bill strikes an equilibrium between safeguarding against disinformation and protecting freedom of speech.
This bill has drawn a lot of criticism, and I'm sure I'm not alone in receiving substantial amounts of correspondence from my community with concerns about the potential implications of the bill before us and about getting the balance wrong. On the one hand, veering too far towards combating misinformation risks shutting down free speech, delegitimising alternative opinions and justifying censorship. It risks imposing a single view of truth, as some have written, or a chilling effect on free speech that stymies innovation and alternative thinking. On the other hand, veering too far towards protecting free expression risks ignoring the spread of disinformation, leading to political interference in elections, inciting violence and, amongst other risks, spreading severely harmful health-related information.
Disinformation has already influenced democratic processes around the world, from the 2016 US presidential election to the UK's Brexit referendum. A World Health Organization review of studies into misinformation and health found that social media propagated poor-quality health-related information at an increasing rate during the pandemic, humanitarian crises and health emergencies. The report notes that such spreading of unreliable evidence on health topics amplifies and promotes unproven treatments. The COVID-19 pandemic was a prime example. A review of 14 studies conducted across different countries found that serious harm had occurred because of COVID-19 related misinformation. Cases included hydroxychloroquine overdoses, drug shortages, changed treatment of patients with rheumatic and autoimmune diseases, and panic over supplies and fuel. In just one example, a popular myth that consumption of pure alcohol could eliminate the COVID-19 virus led to 800 deaths and more than 5,000 hospitalisations.
Yet, while the reasons to legislate are becoming increasingly evident, striking the right balance is going to be tricky. It was clear that the government's first attempt, in the 2023 iteration of this bill, did not hit the mark. To its credit, the government has appeared to listen to some of that criticism, but the reality is that there is much in this bill that still needs to be addressed. The growth of digital platforms over the past two decades has fundamentally changed the way we all access information, with social media platforms now the equivalent of the modern-day public square. Indeed, according to Genroe's Australian social media statistics, in 2023 almost four in five Australians had active social media accounts and more than half of them used social media as a source of news. Yet, while these digital platforms have developed rapidly, the regulatory environment has not. There is growing consensus, then, that the new digital environment, while increasing pluralism and engagement with politics, also poses significant threats both to our society as a whole and to individual safety, with a number of recent reports highlighting the harmful effects on civil society and vulnerable minority groups of misinformation and disinformation disseminated on digital platforms.
From the Australian Competition and Consumer Commission's digital platforms inquiry to the Senate Select Committee on Foreign Interference through Social Media and the Joint Standing Committee on Electoral Matters report on the conduct of the 2022 federal election, legislators have been discussing this challenge for some time. To be clear, personally I am not a fan of government legislation as a way to regulate industry. I would much rather see sectors self-regulate effectively, ensuring they put the needs of their consumers ahead of their own profits where necessary. But Australia's voluntary Code of Practice on Disinformation and Misinformation, implemented in response to the digital platforms inquiry, has not been sufficient to stop the problem, because a number of the major platforms, including Reddit, Snapchat, WeChat and, as of November last year, X, are not signatories. Voluntary commitments on the part of companies have not always been implemented effectively, and a lack of transparency and consistent data means that implementation has been difficult to measure. Ultimately, the truth is that these platforms are commercially driven and are designed to keep users on them as long as possible to attract advertisers. In this way, the environment is absolutely primed for manipulative techniques.
That brings me to the bill before us, which aims to drive these commercial entities to prioritise their end users' welfare. Just as we wouldn't let a dodgy car on the road with bad brakes, we cannot allow these businesses to continue to operate with the same poor standards. The bill proposes to do that by regulating the platforms so that they take reasonable steps to ensure that their products are not being used to spread mis- and disinformation. The bill aims to do that by imposing obligations on the digital platforms to assess the risk of mis- and disinformation on their platforms and publish a risk report, to publish their policies for managing mis- and disinformation and to publish media literacy plans to detail the measures they will take to help users better identify mis- and disinformation.
It also provides the regulator, the Australian Communications and Media Authority, ACMA, with powers to obtain information about mis- and disinformation from digital platforms; to make rules requiring digital platforms to keep records relating to mis- and disinformation and to implement a process for handling complaints and resolving disputes; to register enforceable misinformation codes that set out the measures the industry will take to reduce the risk of mis- and disinformation; and to make misinformation standards for sections of the digital platform industry if the codes do not adequately protect against misinformation.
Importantly—this is really important for people to understand—these codes and standards will be disallowable by this parliament, and the digital platforms will continue to be responsible for the content they host and promote to users. In this way, these new powers are designed to promote transparency and hold digital platforms to account for the actions they take to counter the spread of mis- and disinformation on their services, but they do not enable ACMA to engage directly with the content producer nor demand that content be removed. As made evident through the impact analysis of the bill, significant effort has been put into drafting the legislation to ensure that these powers are appropriately balanced so that any limitation on freedom of expression would be proportionate to the ultimate objective of protecting the community from serious online harm.
That being said, many still have concerns about the legislation, and they deserve to be acknowledged and answered. The most common is that this bill will unduly restrict the implied right to freedom of expression and freedom of political communication. Many are concerned that, rather than protecting our community, these powers as exercised through ACMA will ultimately mean that it will be the government of the day that decides what is and is not acceptable to say, capturing valid expression under the definition of misinformation. As one person wrote, if the last few years have demonstrated anything, it's that what is often labelled 'misinformation' may actually later be proven to be correct.
Some are also worried about the potential for the silencing of legitimate concerns or for over-censorship, and these concerns are shared by some very prominent human rights organisations, with the Human Rights Commission highlighting a number of deficiencies in this bill. Again, I want to reiterate that I understand these concerns. It isn't surprising that the Australian community would be worried about their right to freedom of expression, given that this right is not explicitly protected in our federal Constitution or our statutory law. Additionally, across the globe we are increasingly seeing authoritarian governments using misinformation to their advantage, and at home we are witnessing the increasing suppression of the right to protest and the prosecution of whistleblowers. Again, while I acknowledge and understand these concerns, I do believe this bill includes appropriate guardrails to ensure that, to the extent that there is a restriction on freedom of communication, it is justified and proportionate.
Importantly, the fact that these new ACMA powers are directed to digital communication platform providers and not individual end users provides me with some comfort that, while certainly not perfect, this bill is at least headed in the right direction. By focusing on incentivising digital communication platforms to have robust systems and measures in place to address mis- and disinformation, I believe this legislation turns the responsibility back onto the service providers themselves. Importantly, if the legislation had enabled ACMA or any minister or politician to have direct take-down power over individual content or accounts, I would not even be considering supporting it. But, as it's currently drafted, this reform actually protects certain implied rights, including the right to participate in public affairs, the right to vote and be elected, the right to security of person and the rights that protect against vilification and discrimination.
Ultimately, weighing up these competing interests and rights requires an ethical framework, and this is where we are at a disadvantage as a nation. Australia is the only liberal democracy in the world without either an overarching federal bill of human rights or a human rights act. If we had that in place, arguably this conversation and debate would be far easier.
On that point, I just want to draw particular attention to what I see as a completely inconsistent stance taken by the opposition on this legislation. Only a few months ago, as the Parliamentary Joint Committee on Human Rights recommended Australia introduce a human rights act, the opposition dissented vociferously on the basis that Australians' rights are already adequately protected. Give me a break. You can't have it both ways. Either rights are adequately protected or they're not.
I've also heard concerns from my community about excessive powers granted to ACMA. As one person put it to me:
The notion of ACMA acting with such authority, effectively becoming a gatekeeper for what is allowed on our digital platforms—from public health debates to political discussions—is deeply concerning.
These concerns are valid, but I do believe they are being addressed through the increased safeguards added to this bill since that first draft back in 2023.
In particular, I want to draw people's attention to section 70 of the bill, which provides a review mechanism that includes an assessment of the impact of the bill on freedom of expression three years after commencement and 'must be conducted in a manner that provides for public consultation'. I would have liked to have seen that review undertaken within 12 months, but, having been here for 2½ years now, I get it—things don't move as quickly here as they do in the real world.
Further to this, however, I note the bill does not stipulate that this review be conducted independently of the government. In that context, I have been working closely with the minister and her team to try to ensure this is amended in the final legislation. In addition to this, I would draw the concerns of the First Nations peoples' Aboriginal corporation to the minister's attention, as I believe they raise important issues about how cultural knowledge systems like Indigenous health practices may be wrongly classified as misinformation. Censoring community based health perspectives could risk exacerbating a lack of trust in health institutions, reducing First Nations engagement with necessary services and limiting the freedom of First Nations communities to manage their own health and wellbeing. To ensure these concerns are appropriately addressed, Indigenous leaders must be properly consulted in the review of the bill's impact, and the government should give consideration to embedding a culturally aware framework into the legislation, as per the corporation's recommendations.
In addition to this, Australia's broader plan to combat online misinformation must be improved. Digital literacy campaigns are a useful shield against inappropriate online influence, with numerous studies and reports highlighting the importance of media literacy to combat mis- and disinformation without threatening the right to freedom of expression. In the Nordic countries, for example, which consistently top the Media Literacy Index, media literacy education is a core component of the school curriculum, as are critical thinking skills and comprehensive civics education. In comparison, a recent study by the Australian News and Media Research Centre found that 68 per cent of Australians had a low or very low level of news literacy. Clearly, there's an opportunity to increase Australia's digital literacy.
Ultimately, I believe combining education with effective regulation to hold digital platforms to account would potentially drive the best outcomes for all, but just because I think that doesn't make it true. Therefore, I commend the member for Goldstein for also moving amendments that she believes would strengthen researcher access to the information within the bill so that we can actually have independent research on how misinformation spreads and how best to combat it.
In closing, the risk that mis- and disinformation poses to this country is so great that I do believe we need to start somewhere. Perfect should not be the enemy of good. So, while there is certainly room for improvement in this legislation, particularly with regard to an independent review and academic access to information, with some amendments I will be inclined to support this legislation as a first step to reduce the spread of harmful mis- and disinformation.