
Social media ban for under 16s passes Parliament


A ban on children under 16 using social media is now a step closer after the Federal Government’s ‘Online Safety Amendment’ passed the upper house of Parliament, 34 votes to 19.

The move, announced in November, is in response to growing concerns over social media’s impact on youth mental health and academic outcomes. 

A growing body of research has found that adolescents who spend more than three hours per day on social media face double the risk of experiencing poor mental health outcomes, including symptoms of depression and anxiety. 

The new laws place the onus on social media platforms – not young people or their parents – to take reasonable steps to prevent Australians under 16 years of age from having accounts, and ensure that systemic breaches will see platforms face fines of up to $49.5m. 

The minimum age will apply to ‘age-restricted social media platforms’ as defined in the Bill, which includes Snapchat, TikTok, Facebook, Instagram, X and others. 

In a statement today, Prime Minister Anthony Albanese said the bill ensures the law is “responsive to the ever-evolving nature of technology, while enabling continued access to websites and apps that are primarily for the purposes of education and health support”. 

“Social media is doing social harm to our kids. We’ve called time on it,” Albanese said today. “We want our kids to have a childhood and parents to know we have their backs.” 

Minister for Communications Michelle Rowland said the passage of the legislation reflects the Federal Government’s “resolute commitment” to keeping children safe online. 

“We’ve listened to young people, parents and carers, experts and industry in developing these landmark laws to ensure they are centred on protecting young people – not isolating them,” Rowland said. 

“Good government is about facing up to difficult reform – we know these laws are novel, but to do nothing is simply not an option.” 

Questions hang over government’s reforms 

The Digital Industry Group Inc (DIGI), the industry association for companies that invest in online safety, privacy, cyber security, and the digital economy, said the quick introduction and passage of the legislation means no one can confidently explain how it will work in practice. 

“The community and platforms are in the dark about what exactly is required of them,” DIGI Managing Director Sunita Bose said. 

“This law has passed despite advice from Australia’s Human Rights Commissioner, the Children’s Commissioner, the Privacy Commissioner, 100 youth experts in an open letter to the Prime Minister and a coalition of mental health organisations. The consultation process must be robust in addressing their concerns.” 

A spokesperson for Meta, the company that runs the popular social media apps Facebook/Messenger, Instagram, WhatsApp, and Threads, also expressed reservations about the government’s legislation. 

“Naturally, we respect the laws decided by the Australian Parliament. However, we are concerned about the process which rushed the legislation through while failing to properly consider the evidence, what industry already does to ensure age-appropriate experiences, and the voices of young people,” a Meta spokesperson said. 

“Last week, the Parliament’s own committee said the ‘causal link with social media appears unclear’ with respect to the mental health of young Australians, whereas this week the rushed Senate Committee report pronounced that social media caused harm.” 

The spokesperson said this “demonstrates the lack of evidence underpinning the legislation and suggests this was a predetermined process.” 

“The task now turns to ensuring there is productive consultation on all rules associated with the Bill to ensure a technically feasible outcome that does not place an onerous burden on parents and teens and a commitment that rules will be consistently applied across all social apps used by teens,” the spokesperson said. 

“One simple option is age verification at the operating system and app store level, which reduces the burden and minimises the amount of sensitive information shared.” 
