Ireland’s Children’s Rights Alliance believes platforms shouldn’t be trusted to decide and design their own safety rules for children.
Months after introducing ‘teen accounts’ to Instagram, Meta has expanded the feature to Facebook and Messenger, automatically enrolling teenage users into accounts with added safety measures.
Once enrolled, users under 16 will need parental consent to change settings that loosen the default restrictions.
The new teen accounts will be rolling out to under-18s in the US, UK, Australia and Canada before expanding to other regions “soon”.
The social media giant is also introducing additional restrictions on its Instagram teen accounts, blocking users under 16 from going live on the app or turning off protections against unwanted images in direct messages without parental permission. These updates are expected to be implemented in the next few months.
Teen accounts are private by default – users must approve new followers, and accounts that don’t follow them can’t see their content or interact with them. Teen accounts can also only be messaged by people they follow or are connected to.
These accounts also include features designed to curb excessive use, such as notifications prompting teens to leave the app after 60 minutes of use each day.
Meta says there are at least 54m active teen accounts on its platforms globally, and that nearly all teenagers aged between 13 and 15 have kept the default settings on Instagram. According to the company, 94pc of parents in the US say that teen accounts would be helpful.
NGO launches online safety monitor for children
Meta’s updates come as an Irish organisation has today (9 April) introduced the country’s “first” Online Safety Monitor for children.
The monitor, which is being launched at a conference hosted by the Children’s Rights Alliance, also sets out the organisation’s recommendations to the State.
The non-governmental organisation is calling on the Government to establish an individual complaints mechanism in which complaints raised by children are prioritised, to strengthen oversight of platforms under the EU Digital Services Act, and to introduce targeted initiatives to raise awareness of online harms to children – including child sexual abuse, grooming and sexual exploitation.
In addition, the monitor wants the Government to push for reforms to EU laws that combat the production, hosting, access and use of child sexual abuse material (CSAM), ensuring that these laws properly address the various aspects of online CSAM, including grooming, encryption, detection and the secure storage of the illegal material.
“We cannot trust platforms to decide and design their own safety rules for children,” said Noeline Blackwell, the online safety coordinator with the Children’s Rights Alliance and a human rights lawyer.
“These platforms are inherently risky in their set-up, favouring profit over protection. While there have been significant strides in recent years to end this era of self-regulation, there are gaps that ultimately put children at risk,” she added.
Ireland introduced the Online Safety and Media Regulation Act in 2022 and adopted the Online Safety Code late last year – albeit after much delay and a fine.
The Code requires platforms to prevent children from encountering pornography or violent content, as well as to have appropriate age-verification measures in place.
However, the Children’s Rights Alliance says that minors are still being subjected to an unnecessary level of online harm.
“Last year, it was estimated that over 300m children globally were victims of online sexual exploitation. That is 10 cases every second. We have reached a grim milestone, and yet, EU regulation to address this has stalled,” Blackwell said.
She explained that the Code gives platforms “too much scope” to determine their own safety standards, adding that there are no restrictions around recommender algorithms that platforms have designed to “feed children harmful content”.
“If laws and regulations fail to keep pace with the digital world, it is children and young people who pay the price,” she added.
Last month, the UK launched investigations into how the social media platforms TikTok and Reddit, as well as the image-sharing and hosting platform Imgur, protect children’s privacy in the country.