Parenting: Social media offers more controls. But do they help?

As concerns about the harmful effects of social media on teens continue to grow, platforms from Snapchat to TikTok to Instagram are launching new features they say will make their services safer and more age-appropriate. But the changes rarely address the elephant in the room — the algorithms pushing endless content that can send anyone, not just teenagers, down harmful rabbit holes.

The tools offer some help, such as blocking strangers from messaging children. But they also share deeper flaws, starting with the fact that teenagers can circumvent boundaries if they lie about their age. Platforms also place the burden of enforcement on parents. And they do little or nothing to filter out inappropriate and harmful content served by algorithms that can affect teens’ mental and physical well-being.

“These platforms know that their algorithms can sometimes amplify harmful content, and they take no action to stop it,” said Irene Ly, privacy counsel at the nonprofit Common Sense Media. The more teens keep scrolling, the more engaged they are — and the more engaged they are, the more profitable they are for the platforms, she said. “I don’t think they have too much incentive to change that.”

Take, for example, Snapchat, which on Tuesday introduced new parental controls in what it calls the “Family Center” — a tool that lets parents see who their teens are messaging, but not the content of the messages themselves. One catch: both parents and their children must opt in to the service.

Nona Farahnik Yadegar, director of platform policy and social impact at Snap, likens it to parents wanting to know who their kids are dating.

If the kids are heading to a friend’s house or meeting up at the mall, she says, parents usually ask, “Hey, who are you going to meet? How do you know them?” The new tool, she said, aims to give parents “the insight they really want to have to have those conversations with their teen while preserving teen privacy and autonomy.”

These conversations, experts agree, are important. In an ideal world, parents would regularly sit down with their children and have honest discussions about social media and the dangers and pitfalls of the online world.

But many kids use a bewildering variety of platforms, all of which are constantly evolving — and that stacks the odds against parents, who are expected to master and monitor controls across multiple platforms, said Josh Golin, executive director of the children’s digital advocacy group Fairplay.

“It is far better to require platforms to make their platforms more secure by design and by default than to increase the workload of already overburdened parents,” he said.

The new controls, Golin said, also fail to fix a myriad of existing issues with Snapchat. These range from children misrepresenting their age, to “compulsive use” encouraged by the app’s Snapstreak feature, to cyberbullying made easier by the disappearing messages Snapchat is still famous for.

Farahnik Yadegar said Snapchat has “strong measures” to deter children from falsely claiming to be over 13. Those caught lying about their age have their accounts immediately deleted, she said. Teenagers over 13 who pretend to be even older get a chance to correct their age.

Detecting such lies is not foolproof, but platforms have several ways to uncover the truth. If a user’s friends are mostly in their early teens, for instance, the user is probably also a teenager, even if they claimed a 1968 birth year when they signed up. Companies also use artificial intelligence to flag age mismatches, and a person’s interests can reveal their real age. And, Farahnik Yadegar pointed out, parents might discover their children lied about their birthdates if they try to turn on the parental controls and find their teens ineligible.

Child safety and teen mental health are at the center of Democratic and Republican criticism of tech companies. States, which have been much more aggressive in regulating tech companies than the federal government, are also turning their attention to the issue. In March, several state attorneys general launched a nationwide investigation into TikTok and its possible harmful effects on the mental health of young users.

TikTok is the most popular social app among American teens, according to a new report released Wednesday by the Pew Research Center, which found that 67% say they use the Chinese-owned video-sharing platform. The company said it focuses on age-appropriate experiences, noting that some features, such as direct messaging, are not available to younger users. It says features such as a screen-time management tool help young people and their parents moderate how long kids spend on the app and what they see. But critics note that these controls are leaky at best.

“It’s really easy for kids to try to get around these features and figure it out on their own,” said Common Sense Media’s Ly.

Instagram, which is owned by Facebook parent Meta, is the second most popular app among teens, according to Pew, with 62% saying they use it, followed by Snapchat at 59%. Perhaps not surprisingly, only 32% of teens said they had ever used Facebook, down from 71% in 2014 and 2015, according to the report.

Last fall, Frances Haugen, a former Facebook employee turned whistleblower, disclosed internal company research concluding that the social network’s attention-seeking algorithms contributed to mental and emotional health problems among teens using Instagram, especially girls. The revelation led to some changes; Meta, for example, shelved plans for an Instagram version aimed at children under 13. The company has also introduced new parental controls and teen well-being features, such as prompting teens to take a break if they scroll for too long.

Such solutions, Ly said, are “kind of getting at the problem, but essentially working around it and not getting to the root cause.”