Senate debates

Wednesday, 16 June 2021


Online Safety Bill 2021, Online Safety (Transitional Provisions and Consequential Amendments) Bill 2021; Second Reading

11:14 am

Louise Pratt (WA, Australian Labor Party, Shadow Assistant Minister for Manufacturing)

I rise today to speak to the Online Safety Bill 2021 and the Online Safety (Transitional Provisions and Consequential Amendments) Bill 2021, which, as we know, seek to create a new online safety framework for Australians, an updated regulatory framework that consolidates and builds on the existing legislative scheme in our nation for online safety. In the Labor Party, we have a strong track record of supporting online safety for Australians and we support these bills. We support measures to consolidate, update and enhance online safety laws for Australians. For many years, Australians have been protected by laws to support online safety, and it is important that these laws be kept up to date.

The Online Safety Bill responds to the independent review of online safety laws conducted by Lynelle Briggs, which reported to government in October 2018. It's clear that Australia's online safety laws should be brought together in one modernised act and that industry and government should be less reactive and more responsive when it comes to online safety in our nation. So we support the principle of consolidating existing online safety laws into this new framework. For example, the Online Safety Bill retains and replicates provisions in the Enhancing Online Safety Act 2015, which protects Australians from online harms such as the non-consensual sharing of intimate images, a scheme that Labor is proud to have led calls for. It also reflects a modernised online content scheme to replace the schemes in schedules 5 and 7 of the Broadcasting Services Act 1992, in order to address harmful online content such as refused-classification material, and it updates various elements, including by setting new industry benchmarks. We support new elements of the Online Safety Bill, including the creation of a novel complaints-based removal notice scheme for cyberabuse perpetrated against an adult and the articulation of a core set of basic online safety expectations to improve and promote online safety for Australians.

However, that said, we very much acknowledge that there are some significant concerns with the Online Safety Bill as drafted, and we indeed share some of these concerns. There are concerns around the process and the government's handling of the development of this bill. There was a long delay—years, in fact—in releasing the exposure draft of the legislation, only to be followed by the rushed introduction of this bill into the parliament just eight business days after consultation on the exposure draft concluded. This has significantly undermined confidence in the consultation process. A number of stakeholders were concerned that submissions had not been considered properly and are unsure as to the operation of this bill.

There are also significant concerns as to the substance of this legislation. These are points that have been well made by stakeholders, including concerns about consultation, transparency and review mechanisms among other things. We note there are some concerns with provisions in the Online Safety Bill which are, in fact, already the law of the land here in our nation, and it's disappointing that the government was unable to foster a clear, shared understanding of the elements of this bill that consolidate existing longstanding law.

We in Labor have sought to have constructive good-faith negotiations in addressing concerns with the Online Safety Bill. We did not oppose these bills in the House, on the basis that government amendments would be forthcoming. Since then, we have engaged with the government in a constructive good-faith way in order to gain an understanding of, and address concerns with, these bills. Overall, we report that this engagement with government has been productive. We have appreciated the attention of the minister and his staff, as well as officials of the department and the commissioner and her staff, to Labor's concerns and suggestions. Some of the concerns have been addressed with proposed government amendments to the bill, as well as the supplementary EM. We welcome these amendments and understand a further addendum to the EM will be forthcoming, which we also welcome.

In conjunction with government amendments, some of Labor's concerns have been addressed with clarification from the government as to the operation of the bill. This has been useful. Hopefully it has served to clarify the government's understanding of the regime as well. However, some of our concerns have not been taken up and addressed by government and, therefore, we will be moving amendments in this place to strengthen transparency, review and oversight of the commissioner in administering the online safety framework. In the spirit of bipartisanship in which online safety has historically been approached in this place, we encourage the government to support or at least not oppose these amendments.

We acknowledge the various bill scrutiny processes that have run and note the report of the Senate Scrutiny of Bills Committee, as well as the report of the Parliamentary Joint Committee on Human Rights, both of which made constructive suggestions. We accept the need for the commissioner to have flexibility in administering the framework. In return, however, it is important the government accept the commensurate need for greater transparency, oversight and review. There is an important balance to be found here between free speech and the protections against certain kinds of speech, and this can be complex. We are concerned that this bill represents a significant increase in the eSafety Commissioner's discretion to remove material without commensurate checks and balances.

The government is correct to expect digital platforms to offer more in terms of transparency but, indeed, so must the government be prepared to provide transparency around decision-making, particularly on matters that engage with human rights in our country. So while supportive of a scheme for adult cyberabuse, Labor finds it curious that a government that has made repeated attempts to repeal section 18C of the Racial Discrimination Act on the grounds it unduly restricts free speech, despite the availability of defences in section 18D, is now seeking to pass a bill that empowers the eSafety Commissioner with discretion to determine matters of speech in relation to adult cyberabuse without greater checks or balances or operational clarity. Labor is concerned that the adult cyberabuse scheme could, in the wrong hands, be used to stifle legitimate debate and freedom of expression, given the test for adult cyberabuse is material that is menacing, harassing or offensive.

I draw attention now to Dr Anne Aly's—the member for Cowan—remarks during the debate in the House. She said:

Imagine a scenario where somebody is trolling with racist remarks and gets called out for it, gets called a racist, and the person, the troll, takes offence to that and reports it and instead of the racist remark being removed, the remarks that are calling out racism get removed instead … It's a very likely scenario that somebody who is trolling another individual with racist commentary and gets called out for that racist commentary can claim that they are being bullied and harassed and take action against the person who has called them out.

Also consider the case of John Barilaro, the New South Wales Deputy Premier, who is reported to have pursued YouTube comedian Friendlyjordies with charges of stalking and intimidation. According to reports, detectives from the New South Wales Police Fixated Persons Unit, acting on a complaint by Mr Barilaro, arrested Kristo Langker at his family home in Dulwich Hill on 4 June and charged him with two counts of stalking and intimidation with intent to cause fear of physical or mental harm. Here we see that, given the provisions in the adult cyberabuse scheme go to the related concepts of menace, harass or offend, it is simply not beyond the realm of contemplation to imagine a politician asserting that a journalist, a satirist or a comedian has fallen foul of these provisions, even with clause 233 on the implied freedom of political communication. So we believe there can and must be greater transparency, review and oversight to ensure that this scheme gets the balance between human rights and freedom of expression right.

We put these concerns to the government during the Senate inquiry and during the debate in the House as part of good-faith negotiations, and we welcome the government amendments that have been circulated which strengthen transparency and review, with greater detail in annual reporting as well as an internal review process. But we believe these need to go further, and that's why we are moving amendments, including for an ACMA review. We'll move an amendment to provide a pathway for independent review by ACMA where, with full discretion, ACMA may review decisions of the commissioner and report to the minister, with the report tabled in parliament. The commissioner is an officer of ACMA, and ACMA provides such a review pathway for decisions of the ABC and SBS, so this would be an important oversight mechanism. We'll move an amendment to provide greater detail in the commissioner's annual reporting, to include informal referrals—including to industry and end users, which account for much of the commissioner's approach—as well as the categories of harm dealt with in complaints, formal notices and informal referrals, akin to the categories supplied by the Human Rights Commission in its reporting of complaints and investigations.

We'll move amendments to formalise the advisory committee arrangements for the commissioner. Currently, the commissioner has an advisory committee constituted on an informal basis, with representatives from academia, industry and civil society. As with ACMA, we believe these arrangements should be formalised in law to provide multistakeholder engagement and oversight. We appreciate that the make-up of the current committee could be improved with expertise to inform matters that engage with human rights and freedom of expression, such as the Human Rights Commission, ethnic and religious groups, disability groups, consumer groups, and public-interest media and communications law experts. We think there needs to be independent review of the novel adult cyberabuse scheme in and after its first year of operation, with the whole bill subject to review in three years. We will also seek to move amendments to formalise consultation requirements around restricted access systems under the Online Content Scheme, as well as consumer representation in the development of industry codes and standards. This aligns with other provisions of the bill.

We note concerns about the regulation of private messaging services. We canvassed the concerns with government and interrogated their operation as well as the review process for industry around these provisions. We understand that the government will clarify the operation of this provision in an addendum to the EM, and this is an important clarification. We note that the current law regulates private messaging services and that a significant proportion of cyberbullying occurs in these services. On balance, given the flexibility that platforms have in warning end users or suspending accounts of end users who fail to accord with the platform's terms of use, and given the clarification from government around the intended operation of this element, we're satisfied that there are appropriate checks and balances on this power. I will go into these amendments in more detail during the committee stage.

In conclusion, we support steps to improve the online safety of Australians. To that end, I move the second reading amendment as circulated in my name:

At the end of the motion, add: ", but the Senate:

(a) notes that:

(i) it has been almost three years since the October 2018 Report of the Statutory Review of the Enhancing Online Safety Act 2015 and the Review of Schedules 5 and 7 to the Broadcasting Services Act 1992 (Online Content Scheme) by Lynelle Briggs AO recommended a new Online Safety Act,

(ii) since then, the Minister for Communications, Urban Infrastructure, Cities and the Arts has repeatedly spruiked the non-existent Online Safety Act in response to concerns about online harms, including online hate speech and racism in Australia following the Christchurch terrorist atrocity and graphic online content in the wake of a self-harm video circulating on social media,

(iii) the Minister was slow in releasing the exposure draft of legislation for consultation then rushed the introduction of the bill into Parliament, eight business days after consultation on the exposure draft concluded, which undermined stakeholder confidence in the consultation process,

(iv) the Review of Australian classification regulation is delayed and has fallen out of step with the bill, and

(v) the Government still has not released the report of an expert working group, convened by the eSafety Commissioner and participated in by industry; and

(b) calls on the Government to release the report of the expert working group convened by the eSafety Commissioner so that the broad range of stakeholders supportive of online safety may have the benefit of the work".

11:29 am

Nick McKim (Tasmania, Australian Greens)

The bills before us today, the Online Safety Bill 2021 and the Online Safety (Transitional Provisions and Consequential Amendments) Bill 2021, establish a framework to regulate harmful online content in Australia. This is an extremely important issue that the Australian Greens acknowledge needs to be addressed. But it should be addressed carefully and in a considered fashion, and our submission is that the government is not addressing it in a careful enough way or in a considered enough fashion.

I want to be clear that the Australian Greens do support the establishment of a framework that provides for the quick take-down of inappropriate online content in Australia. I also want to be clear that the Australian Greens condemn the online bullying and the online abuse of Australians, including child abuse and the non-consensual sharing of intimate images. These are important issues. It is necessary that we reform the law in these areas but it is equally necessary that we get it right.

Before I speak about the details of the bills themselves, I want to speak about the process the bills have gone through to get to where they are today. Unfortunately, I have to say that the government is ramming these bills through this parliament without adequate consideration and adequate scrutiny. For example, the government flagged its intention to table the bills before consultation on the exposure draft had closed. The bills were then introduced into the House with just a few technical amendments to the exposure draft, before the around 400 submissions on the exposure draft were made public. The bills were referred to the Environment and Communications Legislation Committee the next day, to report just two short weeks later. Then the government sought to have the bills listed as non-controversial so that they could be quickly and quietly waved through the Senate. Then the government moved to exempt the bills from the usual requirements that regulate how quickly bills can be brought on for debate in the Senate. As an example of the indecent haste with which the government has operated, these bills were so rushed that the government needs to use amendments to fix typos in the original bill. These bills, which are intended to protect people from cyberbullies, from cyberabuse, from the non-consensual sharing of intimate images and from violent and extremist materials—commendable objectives—are being rushed through this place. We want the parliament to get these bills right, and we believe that in this we represent most Australians, who want this parliament to get these bills right.

The Australian Greens do not support the bills in the current form in which they are presented to the Senate. We are aware of the amendments that have been circulated by the government that, if passed, would address some of the concerns the Greens have raised in regard to this legislation. We still have concerns about the huge regulatory and discretionary powers these bills confer on the eSafety Commissioner, a single person who is not elected by the Australian people to that position. We are concerned about the lack of oversight, in some cases, of the eSafety Commissioner.

These bills, I think everyone would agree, provide significant powers to the eSafety Commissioner, but, as drafted, provide for limited appeals against decisions of the commissioner. No internal review process was established in the bills as originally drafted and presented, and the risk there was that people or businesses who were acting lawfully and were adversely affected by decisions of the commissioner could have been left with no business or no income while potentially costly appeals slowly worked their way through the AAT and, ultimately, the court system.

Again, we acknowledge that the government has cobbled together an amendment to provide for an internal review process, but we note that this amendment doesn't actually create, of itself, an internal review process. What it does is require the commissioner to create such a process. So, in a way, the parliament is being asked to sign a blank cheque in regard to the creation of that process, because we have no possibility of knowing what kind of process the eSafety Commissioner will establish as we stand here and debate this bill today.

The problem for the government, of course, is that, because of its self-imposed deadlines and the fact that those were so short, it hasn't actually had the time to come up with such a review mechanism to include the provisions of that mechanism in this legislation. Effectively, it has handballed it off into the never-never. As well as lacking a quick and practical review and appeal process with appropriate remedies, the bills, as tabled, also lack robust transparency reporting. Again, the Greens acknowledge that this has been addressed to some extent by the government's amendments. As I said, there's the use of amendments to correct typos in the original legislation—and, for folks playing along at home, the amendment that the government has tabled changes 'cyber-bulling' to 'cyber-bullying'. Of course that needs to be fixed up, but the fact that we needed an amendment to fix up a typo is symptomatic of the rush to legislate in an extremely important but also extremely complex policy area.

These bills test the intention of a person posting online material by inquiring whether an ordinary, reasonable person would think it likely that the material was intended to cause harm. But this test considers no evidence other than the material itself. This is potentially problematic, because the context around violent imagery and content is crucial to understanding the purpose of disseminating that content and the level of any harm caused.

The Australian Greens also have significant concerns about the bills' powers being used, despite some limited protections, to block and take down public-interest news or campaigns that involve violent imagery—for example, campaigns against police brutality. The eSafety Commissioner has made public comment with regard to some of these issues, and we acknowledge those comments and thank her for them. But the person who is currently the eSafety Commissioner will not be the eSafety Commissioner forever, and it should be incumbent on parliaments to make sure that we legislate not just with one particular person in one particular position in mind but with a clear-eyed focus on the need to make sure that protections will exist past the incumbency of any one person in any particular position.

Under this legislation, the commissioner will be guided by the National Classification Code. That code is currently being reviewed and it doesn't provide appropriate classifications for online media. Therefore, in the view of the Greens, it is not fit for the purposes of this legislation. Classifications based on the code may capture non-violent sexual activity, including nudity and implied or simulated sexual activity, as well as materials considered unsuitable for a minor to see. The concern that the Greens have in this area is that the bills fail to differentiate between actual harm and subjective, moralistic constructions of harm. This would allow the commissioner to act as a moral censor and the commissioner's powers to be weaponised by people and organisations with moral or political agendas. Again I acknowledge comments made by the commissioner with regard to these issues, and again I point out that the person who is currently the eSafety Commissioner will not be the eSafety Commissioner forever.

The bills will also, inevitably, lead to online platforms resorting to automated processes based on algorithms and artificial intelligence to identify and remove content that could attract penalties. The use of AI and algorithms in similar circumstances in places like the US has been extremely controversial, to say the least, and we are concerned that the use of those technologies could lead to disproportionate outcomes, like blanket bans, even if that is not the intent of the commissioner. The use of algorithms and AI will also risk importing racial bias into the regulation of Australia's online content ecosystem. We know that that is a risk because that is exactly what has happened in the US under similar controversial laws, such as the Fight Online Sex Trafficking Act, the FOSTA, and the Stop Enabling Sex Traffickers Act, the SESTA.

While a complaint based framework for the non-consensual sharing of intimate images is very important and absolutely supported in principle by the Australian Greens, we want to make sure, again, that this complex area is legislated with full care and full consideration. The definition of an intimate image provided by these bills does not clearly state whether it applies at the moment an image is taken, which could have serious implications for the utility of the scheme for transgender folk in Australia.

Of particular concern to the Greens and many submitters to the Senate inquiry is the potentially devastating effect the bills will have on sex workers and adult content creators operating lawful businesses that provide lawful products and services, many of whom have migrated online as a result of the COVID-19 pandemic. We are worried about the potential for this framework to be used to drive people from the internet back into the streets or ultimately into insolvency. We are concerned about the unintended consequences that could be harmful to sex workers and adult businesses and to the broader community. Under the bills, as argued by Scarlet Alliance, sex workers will become more vulnerable as they potentially lose access to income, safety tools and strategies and vital peer connections. We are also concerned that the bills fail to promote the maximum safety and privacy protections that they could.

I say again: the Greens absolutely commend the stated objectives of these bills, to keep women, children and the broader Australian community safe in online environments. We absolutely support women's and children's rights, and we have staunchly opposed extremism and radicalisation, particularly right-wing extremism and radicalisation. But we need to make sure that we don't protect one set of rights by trampling over other rights. Bills this significant, targeted at problems this complex, should receive full and proper scrutiny in this place. That is what the government, unfortunately, is seeking to deny. That is why the Greens will be moving a second reading amendment, calling for the bills to be withdrawn and redrafted to take account of the many and significant concerns raised by submitters. I now move the second reading amendment standing in my name:

Omit all words after "That", substitute "the bills be withdrawn and the Senate:

(a) notes the Government's rush to legislate; and

(b) calls on the Government to redraft the bills to take account of concerns raised by submitters to the Environment and Communications Legislation Committee inquiry into the bill, including:

  (i) use of the National Classification Code, which is currently under review,

  (ii) the potential for elements of the bill to be used against lawful online content and content creators,

  (iii) inadequate rights of review for businesses and individuals whose content is wrongly blocked or removed, either by the Commissioner or online platforms,

  (iv) inadequate transparency and accountability regarding discretionary decisions made by a single, unelected officer,

  (v) powers covering restricted access/encryption services, and

  (vi) potential significant and detrimental effects on sex workers".

We will also move other substantive amendments in committee, including for a statutory review of the bill's powers in two years. I will speak more to those amendments when the time comes.

In conclusion, I say again: these bills are incredibly important and incredibly significant, but they deal with an extremely complex area of policy. This chamber should have taken the time to make sure that we get it right and that we avoid, to the greatest extent possible, any unintended consequences flowing from this legislation. The Australian Greens are disappointed that this chamber has not been given that opportunity.

11:44 am

Andrew Bragg (NSW, Liberal Party)

In rising to make some comments about the Online Safety Bill 2021 I think it is important to note that the trend in this area of regulation and big tech, as it's widely become known, is only going to increase. Big tech has changed our lives for the better and it has also brought new risks which need to be managed by policymakers. That is what this bill is an attempt to do. It is an attempt to intervene into the market and into the way that these schemes operate to protect people. That is an important starting point because the philosophy that you bring to these debates is important.

Our view has always been that it is important to inject policy and regulation wherever there is consumer detriment. This is not a capital and labour thing; this is a consumer protection thing. The government has built up a decent record here of being prepared to intervene where there is consumer detriment or where there is broader community detriment in relation to technology companies. It's been widely said that tech companies are the railroads or oil companies of the 21st century. These companies have so much power. They have done much good, but they have the potential to do much bad.

Over the last few months we've seen world-leading media bargaining code legislation. We have led the world in trying to ensure that publishers and public interest journalists are paid for their work. We have also been prepared to intervene to ensure that consumers are protected. Social media really is the wild west. I am not in favour of regulation for regulation's sake, but there is so much content on social media which already contravenes our laws—many of them state laws, I should say. But I don't think we should be reluctant about moving into this territory of ensuring that big tech organisations have an appropriate level of regulation.

The conduct of the big tech organisations during the media bargaining code legislation debate was probably the worst lobbying I have ever seen in my life. People would be aware that large companies threatened to leave Australia and threatened to do all sorts of things. I think once you have a global company threatening a democracy, threatening a country, the country has to win, because we can't get into a situation where companies are so large that they are effectively able to boss a democracy around. We have been here before. People can go back and look at what Theodore Roosevelt said about all these things. Australia borrowed many of those antitrust principles through our own competition law.

So this is a very welcome initiative. What it really does is bring to bear a simple, single framework for online safety. I think setting out the basic online safety expectations and arming the eSafety Commissioner with the power to effectively ensure that people are protected will be broadly welcomed. The concerns that I have here would be widely shared across this parliament—that there is bullying and abuse that goes on online. It is rampant at times. It is leading to people doing all sorts of dreadful things. I think that the commitments we made back in the election campaign to increase penalties for the use of a carriage service to menace, harass or cause offence from three to five years are really important. I think it's one of the most important commitments that we took to the last election. People are being bullied. People are being abused. Often it occurs under the cover of anonymity. There's nowhere else in our world, in our society, where you can, under the cover of darkness, pretend to be someone else and basically attack people—you can use all manner of things to attack people, and to try to destroy their lives—and it's just not good enough. You can't do it in broad daylight. You can't do it in any other theatre in life. The policy here, to effectively rein in carriage services, is a very important one.

I won't bore you with the talking points, but the point really is that this will be a clear framework. It is important that we respect the system of parliamentary oversight, and it is very welcome that the basic online safety expectations will be set by the minister and be disallowable by either chamber of the parliament. In this place, in this debate, just as you see in any other similar debate, we don't want to pass down to regulators rule-making capacity. These are important rules. We're balancing civil liberties against the desire to protect people, and these are judgements that should be exercised by a minister and they should be disallowable, and that is the intention. The eSafety Commissioner will do a great job here—I have a lot of confidence in the incumbent—but the framework of having the minister setting the regulation is an important one.

Ultimately, we want to have a system where Australia is not a backwater. And while we want to see technology used—technology is good—we also want to make sure people are protected. What we don't want to see is people being bullied and harassed online. We don't want to see people attacking people under the cover of anonymity, because they're too gutless to say who they really are. That's not the sort of debate we want to have. That's not the sort of country we want to have. I, personally, want to see that sort of behaviour reined in. Social media is the Wild West. Anything goes, and it is not good enough for people to use social media platforms to break the laws of Australia.

We have laws in New South Wales against incitement and defamation, and social media should not provide a back door to breaking the law. It's very important that this scheme ultimately protects people from cyberbullying and from image abuse, that it does so in a way which balances the privacy concerns that will legitimately be held, and that the rules are made by the minister in that way. This is a very important piece of legislation. It is utterly consistent with our Liberal philosophy to intervene where it's in the public interest, and it builds on the media bargaining code, which was a very important win for Australia. It was so important that we prevailed once the big tech companies started to threaten our country. We cannot have a situation where large tech companies, which have more power than almost any other non-state actor in the world today, can bully and defeat a country. Australia has led the world again. This is a very important bill, and I commend the bill to the Senate.

11:53 am

Photo of Helen PolleyHelen Polley (Tasmania, Australian Labor Party) Share this | | Hansard source

I rise to speak on the Online Safety Bill 2021 and the Online Safety (Transitional Provisions and Consequential Amendments) Bill 2021. These bills are important if we are to tackle the sharing of illegal images and to address head-on the prevalence of revenge porn within our communities. Governments are almost always starting from behind when we're talking about technology. Technological advances travel so quickly, and governments are left trying to catch their breath and keep up with the curve. Ideally, these bills continue to build on existing legislative schemes for online safety.

Labor welcomes funding to support online safety, with a particular focus on women and children. However, it is disappointing that this government cites the introduction of the Online Safety Bill 2021 that is now before parliament as evidence of its commitment to women's safety while it also permits the member for Bowman, Andrew Laming MP, to remain as a member of the LNP. He sits in the Liberal Party room, he sits on the government benches and he remains the chair of a parliamentary committee with the support of the Liberal and National parties.

As stated in a number of media reports recently, Mr Laming has a long history of trolling and abusing his own constituents on Facebook, which has undermined the safety and mental health of at least one woman. Mr Laming's conduct online is precisely what the proposed new adult cyberabuse scheme contained in this bill is designed to address—menacing, harassing or offensive material online. I think it's important that this is on the record and that those opposite are silent as to the actions of Mr Laming.

I recently read an article about a study by anthropologists about the use of smartphones and technology and how our phones are changing the human experience at a personal level and a group level. The anthropologists argue that our devices have become an extension of ourselves. Smartphones are now a transportable home. We use them to organise our schedules, find entertainment and communicate with family and friends. They are where we anchor our sense of identity and self. 'We have become human snails carrying our homes in our pockets,' they write, and:

The smartphone is perhaps the first object to challenge the house itself (and possibly also the workplace) in terms of the amount of time we dwell in it while awake …

It's an interesting argument. Smartphones have become such a pervasive part of our lives. And let's be honest: who can compete with a smartphone? The interactive capabilities of a smartphone are, on an intellectual level, superior to those of any human, in terms of the sheer amount and variety of information that is accessible from this device. The article goes on to argue that there needs to be a new etiquette to manage the digital age, because balancing a physical reality with a digital life causes frustration, disappointment or even offence for those left staring at someone sitting hunched over their device.

We know that Australians of all ages, though young Australians especially, are using their smartphones to transmit inappropriate images—images that can ruin people's lives. The Enhancing Online Safety Act 2015 operates to protect Australians from online harms such as non-consensual sharing of intimate images, but the laws must also improve so that the eSafety Commissioner has the ability to have images removed as quickly as possible—an advancement that all should applaud.

Labor supports measures to consolidate, update and enhance online safety laws for Australians. Online safety is an area of bipartisanship, and Labor is looking for bipartisanship in this policy area. It is too important to get wrong.

There are some concerns around due process, appeals, oversight and transparency requirements in relation to the novel adult cyberabuse scheme, given the important free-speech implications, and as to whether the powers given to the eSafety Commissioner could subvert the framework of safeguards put in place under the telecommunications assistance and access regime, including its warrant processes and the prohibitions it includes on actions that would introduce systemic weaknesses in the communications scheme.

Labor notes that it has been almost 2½ years since the Briggs review of October 2018 recommended a single, up-to-date online safety act. Given the significant passage of time, it is disappointing that the Morrison government has proved incapable of conducting a process that satisfies stakeholders in terms of process and substance.

The government has been spruiking this new online safety act for almost two years. In the lead-up to the May 2019 federal election, the Morrison government promised to introduce a new online safety act. In September 2019, the minister for communications spruiked the new online safety act in answer to questions about what the government was doing to keep Australians safe online, including in relation to the rise of right-wing extremists, online hate speech and racism in Australia following the Christchurch terrorist atrocity.

A year later, in September 2020, the minister again spruiked the non-existent online safety act in response to questions about what the government was doing to curb graphic content on social media platforms in the wake of a self-harm video on Facebook and TikTok. The minister's October 2020 op-ed kept the promise of a new online safety act alive, while his department at Senate estimates put the delay down to 'pressures on drafting resources'.

A number of stakeholders are concerned that the Morrison government introduced the bill into parliament on 24 February 2021, only eight days after consultation on the exposure draft of legislation concluded on 14 February 2021. The short time frame at the end of this drawn-out process has undermined confidence in the government's exposure draft consultation process, with a number of stakeholders concerned that submissions have not been considered properly. At the time the bill was referred to this inquiry on 25 February 2021, Labor senators had no visibility of either the number of submissions that had been made on the exposure draft or the range or nature of concerns raised in those submissions. In evidence to the inquiry, the department confirmed that 376 submissions on the exposure draft were received, uploaded and made available publicly only the day prior to the inquiry hearings. The department further advised that it had assessed the submissions and identified 56 issues that warranted further consideration by the minister, and that seven amendments of a technical nature were made to the bill as a result of that consideration. We note that the review of Australia's classification regulations—for which public consultation closed over a year ago, on 19 February 2020—is delayed and has fallen out of step with this reform process as a result.

On the major issue of free speech, Labor understands that the balance between free speech and protection against certain kinds of speech is a complex endeavour. We are concerned that this bill represents a significant increase in the eSafety Commissioner's discretion to remove material, without commensurate requirements for due process, appeals or transparency over and above Senate estimates, annual reporting and AAT appeals. While supportive of a scheme relating to adult cyberabuse, we on this side find it curious that a government that has made repeated attempts to repeal section 18C of the Racial Discrimination Act on the grounds that it unduly restricts free speech, despite the availability of defences in section 18D, is now seeking to rush through a bill that empowers the eSafety Commissioner with discretion to determine matters of speech in relation to adult cyberbullying without greater checks and balances.

The Morrison government talks a big game about its expectations of social media platforms, yet, to date, it has failed to do its job by updating Australia's online safety laws. While the government is right to expect digital platforms to offer more in terms of transparency, so too must the government be prepared to provide transparency around decision-making, particularly on matters that engage with human rights. For this reason, we on this side recommended that the government consider further amendments to clarify the bill in terms of its scope and to strengthen due process, appeals, oversight and transparency requirements, given the important free-speech and digital-rights considerations it engages.

I would encourage those opposite to stop slouching over their smartphones and have a good, attentive, face-to-face conversation with those in this chamber, so that we can protect more Australians from the misuse of smartphones. If they do that, we can make this a truly bipartisan improvement to the lives of all Australians and ensure technology enriches our lives instead of determining them. We have all read on social media, or heard through other forms of media, about the devastating effect of these faceless people and the anonymity they use to bully, harass and embarrass people through social media. As with all new interventions—and we know, as I said, how quickly technology moves—we have to always ensure that people are protected from bullies, from people who want to intimidate them, from people who want to embarrass them, and from people who want to use this technology for illegal activities and for abusing people.

I urge people in this chamber, and in particular the government, to support our amendment. I therefore support the bill.

12:05 pm

Photo of Larissa WatersLarissa Waters (Queensland, Australian Greens) Share this | | Hansard source

I rise to speak on the Online Safety Bill 2021 and the Online Safety (Transitional Provisions and Consequential Amendments) Bill 2021. The Australian Greens agree that online safety is a significant issue and an important concern for Australian internet users. You don't have to be a parent to share concerns about children having access to graphic online content, the prevalence of sharing of intimate photos, exposure to cyberbullying—all without the respectful relationships and consent education that children need to navigate it.

I'm also keenly aware of the misogynistic abuse that people experience online. As the Greens spokesperson for women, I understand the national crisis of violence against women and their children and the growing scourge of online and technology facilitated abuse. I hear story after story of coercive and violent ex-partners continuing their abuse online: sending violent images, sharing intimate images without consent, bombarding social media with threatening messages and harassing their victims via email. Any abuse of women and children is completely unacceptable. The reported trebling of cyberabuse and image based abuse during COVID is a salutary warning that abusers will use all tools at their disposal to perpetuate their control over others.

The Australian Greens believe that we must protect vulnerable internet users and stamp out abuse and violence, online and offline. I also acknowledge the work being done by the eSafety Commissioner on these issues and the recognition in the budget of the need to tackle the rise in technology facilitated abuse. However, this bill, as it is currently drafted, is not the right solution to this very real problem. This bill has been rushed, and it gives far too much unfettered discretion to the unelected eSafety Commissioner. The bill as it stands has largely ignored the concerns raised in the over 350 submissions received in response to rushed consultation that happened over the Christmas break. Many of those submissions highlight that what is proposed goes beyond what is needed to address core concerns, while still failing to adequately address the more insidious forms of online and technology facilitated abuse that are emerging. It represents a missed opportunity to find a path through this complex area that will achieve appropriate protections.

For example, the submission from the Women's Services Network, also known as WESNET, notes that the bill will go some way towards improving online safety:

… but in our view underestimates the ways in which perpetrators of domestic and family violence can misuse technology to harm and abuse their victims using online mechanisms.

WESNET is concerned that the Online Safety Bill may be presented as a solution to technology-facilitated abuse experienced by survivors of domestic and family violence. In reality, the misuse of technology is far broader than the coverage of this bill. The dynamics of domestic and family violence are often also more complex and multi-faceted and require a much larger and coordinated response.

The amendments that my colleague Senator McKim will move aim to correct that balance, preserving the parts of the bill that provide additional protections for vulnerable internet users, strengthening the protection for digital rights, and removing the provisions likely to be weaponised against women and undermine the overall objectives of the bill.

Submissions from WESNET and Domestic Violence Victoria raise concerns that the bill as drafted will be easily circumvented by perpetrators and could be used against survivors by making false complaints, coercing children to make unsubstantiated complaints, reporting dating profile information from a former partner or by impersonating the survivor online. This is not the stuff of fantasy. The eSafety Commissioner has reported that, in more than a quarter of family violence cases, perpetrators pretend to be the adult victim-survivor online. The bill introduces wide powers for the eSafety Commissioner but does not balance these with adequate rights of appeal or ways to prevent vexatious abuse of the complaints process.

WESNET and DV Victoria also caution that 'ordinary person' and 'serious harm' tests set up in the bill fail to recognise the unseen harm done by perpetrators who are controlling and are skilled at using victims-survivors' personal experiences and fears against them. In a recent national survey undertaken by WESNET and Curtin University, many frontline experts working with victims-survivors observed that threats are often covert. They're targeted and they're harmful and they have meaning for the victim that doesn't seem abusive to another person. What might seem to an outsider like a benign message can in fact be a targeted threat. A request to pack the kids' Medicare card for their weekend with their dad 'in case they get hurt' might be seen as a responsible reminder if you ignore the family violence context in which this is a coded threat to harm the children. It is these complexities that are not adequately addressed by the bill as it's currently drafted. This bill may be an attempt to improve online safety for victims-survivors of abuse, but, without amendment and redrafting, it does not create the measures needed to stamp out this abuse, and it has harmful unintended consequences on digital rights and online work.

12:11 pm

Photo of Sarah HendersonSarah Henderson (Victoria, Liberal Party) Share this | | Hansard source

It's my great pleasure to rise and speak on the Online Safety Bill 2021 and the Online Safety (Transitional Provisions and Consequential Amendments) Bill 2021. As of this year there are 22.82 million internet users in Australia, which is 89 per cent of the population. More than 20 million of those people are active social media users. On average, Australians spend six hours and 13 minutes per day on the internet. That's almost 40 per cent of their waking hours. A full third of that time is spent on social media. The online world is an incredible place. It is an ineradicable part of the everyday lives of millions of Australians. It is a world where we can find everything from obscure mathematical theorems to the latest in fashion. It is a world in which we can engage in conversations with others thousands of kilometres away, in forgotten parts of the world and in forgotten cities. It is a world of over 30 trillion unique webpages.

In such a dizzying labyrinth of texts, images and videos, it is easy to be overwhelmed by the sheer volume of information and opinion thrown at us every time we enter it. For an increasing number of Australians, the experience is harrowing and destructive. Over 20 per cent of young people experience abuse online, and the statistics for the population more generally are no better. Let me just say that again: over 20 per cent of all young people going online experience some form of abuse. Alarmingly, some 87 per cent of young people have witnessed cyberbullying online. Again I say that number: 87 per cent. Unlike abuse on the street, cyberabuse and harassment can happen at any time and be broadcast to thousands, if not millions, of strangers. Cyberbullying also shows human beings at their most petty and their most shallow and, at times, their most destructive. Seventy-two per cent of victims of cyberbullying are targeted because of the way they look. Given these statistics, it has never been more important to ensure that Australians stay safe online. We should enjoy the same standards online as we do in the town square, and the online safety bill guarantees exactly that.

I note with some concern the criticisms from senators opposite, from the Greens and the Labor Party. While they are supporting this bill, I note their criticisms. But I say very strongly today that I'm incredibly proud that the Morrison government is the government fixing this issue. We did not see this sort of action from Labor when it was in power. We did not see the Labor Party combating these issues. Along with all the other work we've done to protect the safety of women, in particular, and children and to combat domestic violence, these bills are a very important part of the suite of measures that our government has taken to protect people in our community.

The bills establish a set of basic online safety expectations for industry, and include mandatory transparency reporting requirements which allow the eSafety Commissioner to require online services to provide specific information about online harms. These include responses to material depicting abhorrent violent conduct and volumetric attacks in which organised digital lynch mobs can overwhelm a victim with abuse. The bills also include a strengthened cyberbullying scheme for Australian children, building on the government's existing scheme for protecting children online. The bills set out a new cyberabuse scheme to remove serious forms of online abuse from the internet and, very importantly, this is backed up by strong civil penalties. Similarly, internet content hosts face new requirements to take down image based abuse within 24 hours, on pain of penalty.

The eSafety Commissioner's powers—and let me say that the eSafety Commissioner is doing an extraordinary job in our community in protecting people, particularly children, online—have been expanded. The commissioner can now use a rapid website-blocking power to block material depicting abhorrent violent conduct during an online crisis, such as what occurred during the Christchurch massacre when, disgracefully and disgustingly, Facebook failed to remove that abhorrent content in any reasonable time frame. The commissioner's information-gathering powers have also been expanded so that the commissioner can unmask the identities of anonymous online accounts being used to bully, abuse and humiliate innocent people.

Statistics tell one kind of story, but it is a remote and abstract one. The concrete reality is that most of us know someone who has been affected by online cyberbullying, cyberabuse or humiliation. This is an issue particularly close to my heart as well, and I have stood in this chamber previously and spoken about the trolling to which I have been subjected by my political opponents. In my particular experience, which happened over a period of some four years, I was subjected to shocking abuse, humiliation and false claims by people running a number of anonymous Twitter accounts. They made some really distressing claims, including attaching my face—my head shot—to the photo of a woman from Texas who shares my name, Sarah Henderson, and who had been charged with and has now been convicted of killing her two children. I can say that my local political opponents on these anonymous Twitter accounts thought it was okay to compare me with a woman of the same name who had killed her two children. How revolting and disgusting is that?

I'm very pleased that Geelong police took this on. They investigated this conduct and sought a warrant from Twitter. Equally disgustingly, Twitter refused to comply with that warrant to provide Geelong police with any information about the cowardly people behind those anonymous accounts. I will not give up on this issue; I will not give up on continuing to hold the people behind these anonymous Twitter accounts to account for what they thought was okay to do. As I say, it was absolutely disgraceful that Twitter refused to comply with a police warrant, despite the head of government relations telling me that Twitter would have no issue in doing so once they had a police warrant or a court order.

So I am very, very pleased with the government's work in relation to these bills. We must do everything we can to arrest this abhorrent and corrosive phenomenon. I am a very, very proud supporter of this legislation. It is necessary. It is effective. It provides immediate response powers. It holds many people to account for their conduct online, and it obviously supports the government's determination to protect Australians from the tsunami of horrific material which can occur online. I commend these bills to the Senate.

12:20 pm

Photo of Nita GreenNita Green (Queensland, Australian Labor Party) Share this | | Hansard source

I am pleased to have the opportunity to speak on the Online Safety Bill 2021 and the transitional provisions that go with it. I was one of the members of the Senate committee that performed an inquiry into this bill and I have taken a keen interest in the development of this legislation. The bill will establish a complaints system for cyberbullying material targeted at Australian children, the non-consensual sharing of intimate images and cyberabuse material targeted at Australian adults, and it will establish the Online Content Scheme. It will also provide for the minister to determine basic online safety expectations for social media services, relevant electronic services and designated internet services.

It also creates a new complaints based system which allows for a removal notice scheme for cyberabuse targeted at an adult. It creates a specific and targeted power for the eSafety Commissioner to request or require an internet service provider to block access to materials that promote, incite, instruct in or depict abhorrent violent conduct. These notices would be for time-limited periods.

Labor strongly support the objective of this bill. We can't be clearer than that. But we do have genuine concerns about how long it took for this bill to be drafted, with the minister out there championing it for so long without it actually being drafted or introduced into this parliament. The consultation period, as we've heard, was lacking. The fact that the government didn't take into account many of the recommendations from the department or stakeholders continues to be a concern. Many stakeholders have raised multiple concerns about this bill, including: the functions and powers of, and oversight of, the eSafety Commissioner, who is an unelected official; the clarity and breadth of the basic online safety expectations; the services in the scope of the online content regulation schemes; the clarity and proportionality of the blocking scheme; and the appropriateness of the Online Content Scheme. There have also been concerns about the time required to respond to removal notices where there is an objection.

The safety of Australians online is of paramount importance. I don't think any senator in this chamber would disagree with that. For many years Australians have been protected by laws to support online safety, and it is crucial that these laws are kept up to date and improved to keep up with technological changes. Labor support the consolidation of online safety laws into a new framework and we also support the new elements of the bill, which are the elements that attracted so much attention through the inquiry phase.

We live in an era that is increasingly spent online, which demands measures to mitigate online harm be kept up to date. Fundamentally, I understand the importance of balancing the right to protection from harmful speech against the right to free speech. That's not always a black-and-white thing for legislators to do. So I want to thank all of the people who contributed to the Senate inquiry and the senators who came along to that inquiry and asked a lot of detailed questions of the commissioner and of stakeholders about how this bill would impact on people, how it would impact on free speech and how it would be delivered. As senators, we seek to protect Australians from any harm, but we must also be aware of the impact that these provisions could have on freedom of expression. That is why you will see senators around the chamber today raising these questions in good faith whilst also supporting the objectives of this bill.

One of the issues raised with me during the Senate inquiry into this bill was the lack of transparency around the use of unprecedented powers by the commissioner. I understand that the government has now circulated, or will be circulating, amendments requiring the commissioner to report in their annual report the frequency of the use of these powers. This is a real win for the Senate inquiry process and for the submitters that contributed to the Senate inquiry, because that was a gaping hole in this legislation in relation to transparency and accountability to this parliament. Labor supports a holistic, multifaceted, layered approach, including safety by design, adult supervision, technological measures and education of both adults and children. While Labor will be supporting this bill, we're certainly not happy with how it has been delivered.

As I said, we acknowledge the concerns raised by stakeholders. As usual, the government has been big on announcement and slow on delivery. You won't be surprised to hear that it was almost 2½ years ago that the Briggs review recommended a single up-to-date online safety act, and here we are. The government was spruiking an online safety act the entire time, but we're only now getting to the point where we're able to debate this legislation in the parliament. In the lead-up to the May 2019 federal election, the Morrison government promised to introduce a bill, but we've had to wait two years before they could achieve that. It's interesting because, in an extraordinary move, the minister was actually taking credit for the act without having actually passed it through parliament first. In 2019, when faced with the rise of right-wing extremism, online hate speech and racism in Australia, an online safety act was the answer, but it hadn't actually been passed through parliament yet. The minister also used the bill in response to questions about what the government was going to do to curb graphic online social media content. In October 2020, when the minister published an op-ed about the new online safety act, the department was in budget estimates, admitting further delays to the bill because of pressures on drafting resources. So, while the government are congratulating themselves today, it has taken a long time to get here—too long, Labor would say.

During the consultation period, the department confirmed that there were 376 submissions on the exposure draft, but there were 56 issues with the bill identified by the department itself. Only seven amendments, of a technical nature, were made as a result. I note that one of those amendments presented to this chamber is actually to correct a spelling mistake in the word 'bullying'. When asked about the key operational aspects of the bill, the eSafety Commissioner called it 'a sausage still being made'. There's still a lot we don't know about how this bill will be implemented. Some 2½ years later, the government has proved incapable of conducting a process that satisfies stakeholders in terms of both process and substance.

What is even more disappointing is how the government cannot seem to hold its own party room to the same standards set out by the legislation we are voting on today. Those opposite have pointed out the government's so-called commitment to women's safety online. Even the minister himself was quoted as saying:

"The Morrison Government wants Australians to engage online confidently—to work, communicate and be entertained, without fear of being viciously trolled or exposed to harmful content," …

But, when it comes to turning those words into action, to taking real action, well, the government is completely silent. Just recently, in my home state of Queensland, the very proud member for Redlands, Kim Richards, stood in the state parliament and tabled evidence of harassment of her and other women online by the government's very own member for Bowman, Mr Laming. She called on the LNP to do the right thing with regard to Mr Laming—to turn their words into action. Mr Laming has a long and proven track record of trolling and abusing his constituents online—

Photo of Paul ScarrPaul Scarr (Queensland, Liberal Party) Share this | | Hansard source

A point of order, Mr Acting Deputy President: my friend, the senator from Queensland, is making personal reflections upon a member in the other place, the honourable member for Bowman, and I would ask her to withdraw those reflections, which I note are the subject of action being taken by the member for Bowman in order to protect his reputation.

Photo of James McGrathJames McGrath (Queensland, Liberal National Party) Share this | | Hansard source

Senator Green, if you could withdraw—or are you on the point of order?

Photo of Nita GreenNita Green (Queensland, Australian Labor Party) Share this | | Hansard source

On the point of order. I'm actually referencing comments that Mr Laming himself made on Facebook. He said on Facebook, 'I control them.' So I'm actually referring to comments that he made and which are publicly available. I am being very careful about what I'm saying. If Mr Laming has called himself a 'Facebook troll', my submission to you, Mr Acting Deputy President, is that I should be able to refer to those comments.

Photo of James McGrathJames McGrath (Queensland, Liberal National Party) Share this | | Hansard source

On that basis, there is no point of order.

Photo of Nita GreenNita Green (Queensland, Australian Labor Party) Share this | | Hansard source

Mr Laming himself has stated that he is a Facebook troll. He freely admits to running over 37 Facebook pages. Mr Laming has made social media posts inciting stalking behaviour, offering cash prizes to get the member for Redlands to answer questions, and in one instance actually followed the member for Redlands into a public park and took pictures of her. On Valentine's Day, Mr Laming incited more stalking behaviour via a Facebook post, once again offering a cash prize for details of the location of the member for Redlands, a female politician, and the people in her company. The member for Redlands even sought advice about Mr Laming's invasive and personal behaviour—this is what Kim Richards, the member for Redlands, was forced to do—and, as a result, had to install CCTV cameras and an electronic security gate. She was also advised, because of the harm she was suffering as a result of Mr Laming's behaviour, that she should speak with a psychologist. That is how severe Mr Laming's harassment of this woman has been.

Mr Laming's inappropriate online content is exactly what the government's cyberabuse scheme is aiming to stamp out. In fact, I put this question to the department in estimates. The department secretary told me that the scheme would provide a pathway for people to make complaints about exactly this kind of behaviour, saying:

… it is fair to say that absolutely the intention behind this new bill, when it's passed, is to provide an avenue for people experiencing that kind of activity—to have a pathway to make complaints and have someone able to take some action.

So we have this curious situation where the government are here talking about how important online safety is and particularly how important it is for women's safety online for this bill to be passed, yet they will not take any action against Mr Laming, despite the evidence, despite the public comments by Mr Laming himself that he is a Facebook troll. The hypocrisy of this government on the one hand championing this bill but also, on the other hand, championing Mr Laming is absolutely galling. It should not be happening and the government know it. But they're too afraid to step up and do something to make sure that Mr Laming isn't able to harass any more women online. He continues to be a committee chair. He continues to get the support of the government members in the House of Representatives. Every single day when something is put to them, where they have the choice to support Mr Laming or to support a motion calling out his behaviour, they want to support Mr Laming.

I can tell you that there are many more women like Kim Richards. I am very proud of the member for Redlands for stepping up and talking about this in the Queensland parliament. I will talk about this every single time the government seek to talk about online safety, because, until they take action against Mr Laming and acknowledge the behaviour to which he has subjected Kim Richards—the member for Redlands, a female politician, who deserves so much better from this government—all of the things that they say about this bill and about this issue are completely hollow.

Labor supports this bill and its various elements designed to strengthen protections for Australians online. However, the bills have been delayed, and here we are again asking how this government can believe that such important bills for Australian women should be passed today when they haven't taken action against the member for Bowman. He's still on the government backbench. He's still a chair of a parliamentary committee. He's still earning an extra $20,000 a year for that role. Senators opposite should hang their heads in shame, because they have failed to step up and do the right thing. The standard that you walk past is the standard that you accept, and you are telling the many women in Redlands who have suffered online abuse from this member that you accept that abuse.

Labor support these bills, but we do not support the behaviour of Mr Laming and we in this place are not afraid to say so. I also want to foreshadow that, because of this very curious hypocrisy, I will be moving a second reading amendment to the bills requiring the government to lead by example when it comes to keeping women safe online and to ensure that the member for Bowman is discharged from the Standing Committee on Employment, Education and Training immediately.

12:36 pm

Photo of Jordon Steele-JohnJordon Steele-John (WA, Australian Greens) Share this | | Hansard source

The Greens oppose the legislation, the Online Safety Bill 2021 and the Online Safety (Transitional Provisions and Consequential Amendments) Bill 2021, as it has been put before the Senate today, and I would like to outline for the Senate and for those following at home exactly why. In doing so, I would first like to thank my colleague Senator McKim, for his outstanding and detailed work on this legislation, and also his office, led by Andrew Perry in the policy area. I would also like to thank Noelle Martin of Western Australia, who has, for a very long time, been a great source of information and advice to me in relation to these often complex and deeply important issues.

Let's start from a place where I think everybody should be able to agree—that every person should be able to live their lives free of violence, exploitation, abuse and neglect. When we are talking about the need to safeguard this right and this expectation in relation to abuse that somebody might experience in the digital space, we often use the broad terms 'cyberbullying' or 'online abuse'. What is often lost in that terminology is some of the deep complexities and the different forms that abuse takes when an online environment is involved. I'm somebody who has been aware of these issues in a personal capacity, and I have experienced some of them myself.

As a newly elected senator, one of the first inquiries I was part of explored some of these very complicated issues. I think it's useful to break them down broadly into three categories. We have the non-consensual sharing of intimate images; then we have what we might call bullying that takes place in a cyberrealm as well as in personal interaction between individuals; and then, finally, we have online facilitated abuse. These are very distinct forms of abuse that have distinct characteristics, all of which require, funnily enough, a bespoke policy response from legislators at both the state and federal level.

The non-consensual sharing of intimate images is a disgusting and disgraceful phenomenon in our society today. Its roots are deeply set in the disrespectful and dehumanising ways in which women, people of colour and people who are queer are treated, and what standards of behaviour are accepted and expected. To deal with those issues, we need policy responses that deal with both the outcome and the root cause.

Cyberbullying is often denoted by the different social contexts and relationships between those who are the subject of the abuse or bullying and those who are the perpetrators. It was put very clearly to me during one of these inquiry processes that what is critical to understand when analysing cyberbullying is that it is the social phenomenon of bullying moving into a digital realm. Those who are victims and perpetrators of it often know each other, and there's a high likelihood that the victim of a cyberbullying incident may well also be the perpetrator of a subsequent incident, and vice versa. Often, cyberbullying takes place in a close-knit social environment, like a school, and will involve somebody below the age of 18. So, again, the policy response required is specific and bespoke, balancing the reality that we may well be legislating in an area where people below the age of 18 are involved and where there's a need, ultimately, to solve these problems both at the end point and at the root cause—often through a whole-school, whole-community approach.

And all the while we must recognise that the vast majority of children at school do not engage in cyberbullying behaviour. The best estimate is that around 40 per cent of kids experience or perpetrate these types of abuse. While that's a large percentage, we shouldn't build a picture of an entire generation behaving in this way. And we should also ground it in the reality that, before there was a digital space for this to occur, it simply happened interpersonally. It has been a factor in school life and adolescence for a very long time.

The third and final category, online facilitated abuse, is marked out by the absence of a relationship between the perpetrator and the victim—the perpetrator is not known to the victim, and vice versa. The victim is selected at random and is often part of a broader cohort which the perpetrator is targeting, and victims are often above the age of 18. Perpetrators utilise multiple platforms and multiple identities, and this therefore requires—you guessed it!—bespoke, nuanced and well-thought-through legislative responses that hold people to account for their actions while also dealing with the deep-rooted causes of why people behave in this way, particularly towards groups such as women, queer folk and people of colour. These are the groups which overwhelmingly experience this online facilitated abuse.

People who work in this area understand these social complexities and the bespoke policy responses needed. If you speak to them, they're very willing to share this information, and I have learnt so much from working in this area over this period of time. There are folk who are absolutely committed and who deeply and genuinely want to see action in all three of these areas to make sure that people are not subjected to violence, abuse or exploitation in any context or setting. Those people and their genuinely held beliefs are often driven by lived experience of this type of phenomenon, and they should be honoured, respected and appropriately engaged with. We—well, I say 'we', but the major parties as the parties of power in this place—should always do them the honour and respect of engaging with them thoroughly and in detail.

This is not what has occurred in the creation of these bills. The government cite a response to an inquiry report from 2018, and many years have intervened since then. The reality of the legislation before this chamber right now is that it is totally and utterly undercooked; it has been rammed through in the smallest possible time. First, the government published an exposure draft of the legislation and allowed merely a few weeks for submitters to give their evidence on what the final piece of legislation should look like. Then the government introduced the legislation and attempted to pass it during a section of this chamber's time when it would be open to neither amendment nor a vote of opposition. When they figured out they couldn't do that—

Debate interrupted.