House debates

Thursday, 4 July 2024

Adjournment

Cybersafety

4:29 pm

Zoe Daniel (Goldstein, Independent)

Last week, Victorians broke for the school holidays, and, with that, kids around our state, often including mine, will be glued to their phones for the next few weeks. While the internet and social media hold many benefits for people of all ages, the online environment inherently presents a range of risks, and Australians intuitively understand what these risks are.

One of the most urgent risks to the health of our young people online is the spread of eating disorder and heavy dieting content, which can damage mental health, self-esteem and wellbeing. Eating disorders among young people aged 10 to 19 have risen by 85 per cent since 2012, roughly coinciding with the rise of social media. We can all agree that a 15-year-old comparing themselves to an influencer promoting extreme dieting or an excessive bodybuilding lifestyle is neither normal nor the kind of exposure that encourages healthy emotional development. In many cases, social media algorithms will recommend exactly this kind of content because it drives likes, clicks and engagement.

Another risk relates to addictive design features, which are programmed by teams of experts in Silicon Valley to keep our kids scrolling and sharing. Red-dot notifications at the top right-hand corner of apps are designed to create a sense of alert and urgency. These notifications often create a strange urge to open the app just to clear the backlog. Slot-machine-style features on Instagram and TikTok are designed to create a dopamine spike and, like real-world pokies, keep you playing.

Many new regulatory approaches to combat addiction are being tested around the world. The EU, a world leader in this space, has recently opened proceedings against TikTok over a new feature which rewards kids with a virtual currency for watching videos, liking posts and bringing friends to the platform. Under this 'task and reward' program, users can exchange the currency for rewards like Amazon vouchers, PayPal gift cards or tips for TikTok influencers.

The feature was originally released in France and Spain, and the EU's new regulation allows it to intervene against any further rollout, based on criteria like the protection of minors, advertising transparency and the risk of social media addiction. The investigation and the proceedings are ongoing. No such law exists in Australia. Here, digital platforms are largely free to test and roll out features of this kind, and the government currently lacks any power to meaningfully challenge this.

Much of the debate on social media regulation in recent months has centred on the question of whether access should be restricted by age. Do we really think that clicking 'I'm over 18' or implementing an age verification or assurance system is a long-term structural solution to counter the social harms of social media? As far as I know, the technology to make this work properly doesn't yet exist, and, arguably, our kids are too smart not to find ways around it. If they can find ways to get into a pub, they can find ways to get onto social media.

I'll await the results of the government's age assurance trial with interest. However, I would argue that a newer, systemic approach to regulating digital platforms in Australia is needed: one that puts the onus of responsibility onto the platforms themselves, like Meta, TikTok, Snapchat and various others, to become healthier spaces for our young people to inhabit.

What is colloquially known as the algorithm is actually an array of systems and elements which work together to keep us scrolling. To fully protect Australian adults and children from online harm, we must look under the hood and properly regulate these systems, much like the EU and the UK have done.

What exactly does this look like? First, it means applying an overarching legal duty of care, just as applies in the hospitality, medical and professional services industries. There is no reason in my mind that this shouldn't apply to social media companies too. Second, it means requiring platforms to assess all of their systems and elements which could conceivably cause risk to children and adults, with specified risks to report against and mitigate, such as mental health harm, addictive design features and gendered violence. These measures should be supported by healthy design features, such as the ability to turn off content recommendations and to reset the algorithm whenever the user chooses. Online safety can't be achieved unless it is systemic.