Social media ‘duty of care’ laws would force online giants to take preventative action on mental health harms


Social media companies would be required to take proactive steps to keep Australians safe online under a federal government plan to legislate a “Digital Duty of Care”.

It marks the latest move by the Albanese government to put responsibility on the shoulders of the social giants to ensure users, particularly children, are safe on their platforms.

The Digital Duty of Care was recommended in the yet-to-be released independent review of the Online Safety Act, handed to government last month.

It follows similar moves by the United Kingdom and European Union and would require platforms to shift from reacting to harm towards taking reasonable steps to prevent foreseeable harms.

Communications Minister Michelle Rowland said the obligations would build on existing complaint and removal schemes under the act.

“What’s required is a shift away from reacting to harms by relying on content regulation alone, and moving towards systems-based prevention, accompanied by a broadening of our perspective of what online harms are,” she said.

Michelle Rowland says social media platforms must shift from reacting to harm towards preventing harm. (ABC News: Ian Cutmore)

Strong penalties for companies that fail to take preventative action

The new duty would be in addition to the government’s move to ban children and teens younger than 16 from using social media, announced last week.

The Albanese government has increasingly been focusing on growing online harms, including explicit and hateful content.

Ms Rowland said changes were required to put more focus on how content can harm mental health.

“The Albanese government is clear about where it stands — on the side of millions of concerned parents, children and citizens at large,” she said.

“This, as part of a growing global effort, will deliver a more systemic and preventative approach to making online services safer and healthier.

“Where platforms seriously and systemically breach their duty of care we will ensure the regulator can draw on strong penalty arrangements.”

The government wants to apply strong penalties to platforms that fail to take preventative action to keep people safe. (Reuters: Gonzalo Fuentes)

The minister said the new duty would mean platforms have to continually assess and take preventative action to mitigate potential risks.

Ms Rowland is making the announcement as the government suffers further blows to another bill that aims to curb the spread of misinformation and disinformation online.

The government’s chances of passing its mis- and disinformation bill took another hit on Wednesday with independent senator David Pocock saying he would oppose the bill, leaving the government with a narrow pathway to make it law.

Digital duty follows international efforts

That global effort includes the EU's Digital Services Act, which requires platforms to consider user safety in the design and operation of their services so that people aren't harmed when using them.

Companies found to breach that act can be fined up to 6 per cent of their annual turnover, which could total hundreds of millions of dollars in the case of platforms like Meta.

The UK has also introduced laws that place the onus on platforms to prevent access to content promoting suicide or eating disorders, and to restrict access to pornography.

The Albanese government said its proposed changes would make Australia a world leader in online safety.
