Meta to begin booting teens off Facebook and Instagram ahead of social media ban for under-16s


Tech giant Meta will begrudgingly start booting teens from its hugely popular social media platforms in preparation for Australia’s impending minimum age laws taking effect.

Children under 16 will be blocked from using Facebook, Instagram and Threads — as well as Snapchat, TikTok, X (formerly Twitter), YouTube, Kick and Reddit — when the world-leading ban takes effect on December 10.

It will be a big transition, with a reported 1.5 million profiles needing to be deactivated.


Snapchat reported it has about 440,000 monthly users aged 13-15, Instagram about 350,000 in the same cohort, YouTube about 325,000 and TikTok about 200,000.

These figures are likely an “underestimation of the true numbers”, eSafety Commissioner Julie Inman Grant said earlier this year.

From December 4, Meta will begin blocking access to established Instagram, Threads and Facebook accounts for teens aged under 16, and stop the creation of new ones.

“We’ll begin sending notifications today to give those affected a heads-up that they will soon lose access, and will have 14 days to download their memories or delete their account,” Meta said on Thursday.

“They can also update their contact information so we can get in touch and help them regain access once they turn 16.”

Young Facebook and Instagram users will be notified soon that their accounts will be closed down. Credit: Joel Carrett (AAP) and Meta

Companies face fines of up to $49.5 million if they fail to take “reasonable steps” to stop under-16s using their platforms.

Meta said it has systems in place to ensure ongoing compliance, including requiring age verification if a user tries to change their birth date to one indicating they are over 16.

Online gaming and standalone messaging apps are among services that are excluded under the legislation, with teens able to retain access to Facebook Messenger.

Discord, Steam, YouTube Kids, Lego Play and WhatsApp are not subject to age restrictions.

Gaming platform Roblox was excluded too but announced on Wednesday it will ask users to verify their age following concerns it was “becoming a playground for paedophiles”.

Users will only be allowed to chat on the platform with their own and adjacent age groups, to reduce the risk of inappropriate conversations.

Parents’ approval

Meta stands in the government’s corner when it comes to creating safe online experiences but argued that cutting teens off from their friends and communities “isn’t the answer”.

It has called for legislation requiring parental approval whenever a child under 16 downloads an app.

“Teens are resourceful, and may attempt to circumvent age assurance measures to access restricted services,” Meta said.

“Realistically, we can only do so much to determine age without requiring everyone to provide a government ID — which isn’t safe, poses significant privacy risks, and could lead to identity theft.

“Parents could approve the download and verify their teen’s age when setting up their child’s phone, eliminating the need for teens to verify their age multiple times across different apps.

“We’re committed to meeting our compliance obligations and are taking the necessary steps to comply with the law.

“That said, we hope to continue engaging constructively with the Australian government to find a better way forward.”

Not everyone is on board with the changes, which are set to impact more than 1.5 million Australian accounts. Credit: AAP

The government has argued the minimum age laws will protect young Australians from “pressures and risks that users can be exposed to while logged in”.

“There’s a time and place for social media in Australia, but there’s not a place for predatory algorithms, harmful content and toxic popularity meters manipulating Australian children,” Communications Minister Anika Wells said.

“Online platforms can target children with chilling control. We are mandating they use that sophisticated technology to protect them.

“I have met with major social media platforms … so they understand there is no excuse for failure in implementing this law.”

PM Anthony Albanese said parents “who have gone through tragic circumstances” had also lobbied for change.

Not everyone is on board, however. An influencer family with millions of followers is leaving Australia, citing the upcoming restrictions as a key factor in their decision.

The chief executive of AI startup Omniscient said fines of close to $50 million for systemic breaches were not large enough to push billion-dollar companies toward compliance.

The bigger concern for platforms was whether the ban would spread globally, which would add compliance complexity and hit revenue, according to Stephen Scheeler, Omniscient’s chief executive, who led Facebook’s Australian office from 2013 to 2017.

“If Australia is the only country that does this, it will make no difference,” Scheeler said.

“If you’re impeding the ability of people between 13 to 16 from being on a platform, my guess is probably that 15 per cent of users would be in that category … so they’re losing that amount of logged-in users.

“If that happened globally, that’s a substantial hit on revenue.”

While only a handful of platforms will initially be required to abide by age-restriction laws, the eSafety Commissioner said assessments will be ongoing and the list will be dynamic, meaning platforms could later be added or removed.

– With AAP


