Facebook has said it will be introducing several features, including prompting teenagers to take a break from using its photo-sharing app Instagram, and “nudging” teens if they are repeatedly looking at the same content that is not conducive to their well-being.
News of the controls came in the aftermath of recent damning testimony that the company’s platforms harm children.
California-based Facebook is also planning to introduce new optional controls so that parents or guardians of teenagers can supervise what their youngsters are doing online.
These initiatives come after Facebook announced late last month that it was pausing work on its Instagram For Kids project.
But critics say the plan lacks details and they are sceptical that the new features will be effective.
The new controls were outlined on Sunday by Nick Clegg, Facebook’s vice president for global affairs, who made the rounds on various Sunday news programmes in America where he was grilled about the firm’s use of algorithms as well as its role in spreading harmful misinformation ahead of the January 6 Capitol riots.
“We are constantly iterating in order to improve our products,” Mr Clegg told Dana Bash on the State Of The Union show.
“We cannot, with a wave of the wand, make everyone’s life perfect. What we can do is improve our products, so that our products are as safe and as enjoyable to use as possible.”
Mr Clegg said that Facebook had invested 13 billion US dollars (£9.5 billion) over the past few years to keep the platform safe and that the company had 40,000 people working on these issues.
The flurry of interviews came after whistle-blower Frances Haugen, a former data scientist with Facebook, appeared before Congress last week to accuse the social media platform of failing to make changes to Instagram after internal research showed apparent harm to some teenagers and of being dishonest in its public fight against hate and misinformation.
Ms Haugen’s accusations were supported by tens of thousands of pages of internal research documents she secretly copied before leaving her job in the company’s civic integrity unit.
Josh Golin, executive director of Fairplay, a watchdog for the children and media marketing industry, said that he did not think introducing controls to help parents supervise teenagers would be effective, since many youngsters set up secret accounts anyway.
He was also dubious about how effective nudging teenagers to take a break or move away from harmful content would be.
He noted that Facebook needed to show exactly how it would implement these features and offer research demonstrating that the tools were effective.
“There is tremendous reason to be sceptical,” he said.
He added that regulators needed to restrict what Facebook did with its algorithms.
Mr Golin said he also believed that Facebook should cancel its Instagram project for kids.
When Mr Clegg was grilled in interviews about the role of algorithms in amplifying misinformation ahead of the January 6 riots, he said that if Facebook removed the algorithms, people would see more, not less, hate speech, and more, not less, misinformation.
Mr Clegg said the algorithms served as “giant spam filters”.
Speaking on a news show on Sunday, Democratic senator Amy Klobuchar said it was time to update children’s privacy laws and offer more transparency in the use of algorithms.
“I appreciate that he is willing to talk about things, but I believe the time for conversation is done,” said Ms Klobuchar, referring to Mr Clegg’s plan. “The time for action is now.”