Albanese Government to enact social media age limits, legislation pencilled in for late 2024
The Albanese Government will introduce new legislation to impose age restrictions on social media access, the Prime Minister has confirmed. The national crackdown takes inspiration from South Australian Premier Peter Malinauskas’ push to ban kids under 14 from setting up social media accounts, though it’s not yet clear what ages would be captured under the federal legislation.
Discussing the landmark reforms on the Today show on Tuesday morning, Prime Minister Anthony Albanese made clear that the Government plans to work with states and territories to establish a national system, “because we know this is a national issue”.
“We want to get kids off their devices and onto the footy fields, onto the netball courts, into the swimming pools. We want them to have real experiences with real people. And we know that social media is causing social harm, which is why we put funding in the Budget to have a trial to make sure that we get it right,” he told host Karl Stefanovic.
Albanese said federal legislation would be introduced in parliament “by the end of the year” to provide Australians with a “clear direction that it’s going in”.
It follows the publication over the weekend of a 276-page report by former High Court chief justice Robert French, which outlined a legislative vehicle to enact such a ban and require social media companies to obtain parental consent before allowing teenagers aged 14 and 15 to use their platforms.
The report, commissioned by Malinauskas in May, responds to concerns raised by parents, teachers, and experts about the negative impacts that social media platforms such as Facebook, Instagram and TikTok have on young people’s mental health, wellbeing and development.
Victorian Premier Jacinta Allan threw her support behind the age limits in a brief statement posted to Facebook on Monday, backing the Federal Government’s plans.
“One of the biggest things I hear from parents is they are worried about their kids on social media,” she said. “Social media can be a great thing but it’s just not a place for kids before they are ready. It harms their development and hurts their focus. It’s not just parents telling me that – it’s kids and young people too.
“It’s like a social media tsunami they feel they can’t stop. So it’s time to give parents the power to push back, not against kids but against the tech giants. That’s why we’re going to put age limits on social media. These rules won’t target parents or kids, but the tech giants, because they are the ones we need to hold to account.
“Age limits will help parents at home and protect kids from harm. It won’t solve everything, but it’s the right place to start.”
Recent research by Australia’s online safety regulator, eSafety, found almost two-thirds of 14 to 17-year-olds had viewed extremely harmful content online, including material relating to drug abuse, suicide or self-harm, as well as gory or violent material, while a quarter had been exposed to content promoting unhealthy eating habits.
It also revealed that around a quarter of 8 to 10-year-olds use social media at least weekly, while almost half of 11 to 13-year-olds log on at the same rate.
In a statement on the Government’s approach to children’s online safety, Communications Minister Michelle Rowland said it was “the role of Government to put the necessary guardrails in place to help keep all Australians, including young people, safe online”.
“This week, eSafety put social media platforms on notice to reveal how many children are on their sites and how they detect and block underage users. I gave eSafety the powers to demand this transparency by strengthening the Basic Online Safety Expectations (BOSE) in May,” said Rowland.
“Knowing how many children are on their sites was only one part of these important changes. I also made clear Australia’s expectations that platforms must ensure the best interests of the child is at the heart of their services. This means platforms must be thinking about our children when they are making decisions about the services they offer – not just after harm occurs.”
Meanwhile, a statement from the eSafety office acknowledged the French report contained “some useful policy ideas for future regulatory reform” but said the challenges and complexities of implementation underlined the need for a national approach to be adopted, “if one can be achieved”. It urged that any separate reforms undertaken at the state level “support rather than conflict” with the existing national regulatory framework under the Online Safety Act 2021 (OSA).
The regulatory body said age assurance mechanisms were already being explored by the Department of Infrastructure, Transport, Regional Development, Communications and the Arts, based on recommendations in eSafety’s 2023 Age Verification Roadmap.
Under the OSA, eSafety said it had already delivered a number of strong outcomes, including a requirement for digital industry associations to submit enforceable codes to eSafety by the end of this year limiting children’s access to a range of inappropriate content, including pornography and self-harm material, or else face potential mandatory standards.
“Compulsory codes addressing the worst harms online, including child sexual abuse material and pro-terror content, are already in place, with stringent standards focusing on a wide range of services, including messaging and file-hosting services, set to commence in December,” eSafety outlined.
The OSA is currently under independent review by Delia Rickard PSM, with recommendations expected to be delivered to Government by the end of 2024.