Snapchat Launches Parental Controls To Help Manage Teens’ Social Media Usage: NPR


Snapchat is rolling out new parental controls that allow parents to view their teen’s contacts and confidentially report any accounts of concern to the social media company. A child lies in bed lit by the glow of a cell phone.

Elva Etienne/Getty Images



Snapchat is rolling out parental controls that allow parents to view their teen’s contacts and report to the social media company, without their kids’ knowledge, any accounts that might concern them.

The goal, executives say, is to allow parents to control their children’s connections without compromising teen autonomy. Called the Family Center, the new toolkit launched Tuesday requires both the caregiver and the teen to sign up.

“It allows parents to see who is in their teen’s universe,” said Nona Farahnik, director of platform policy for Snap, the company that makes Snapchat. “It gives parents the ability to ask who someone might be, how they might know a contact, which leads to those kinds of real-time conversations about who teens are talking to.”

Farahnik says that Family Center is based on parenting in real life.

“If your teen is heading to the mall, you can ask who they’re going with. ‘How do you know them? Are they on a sports team together? Do they go to school together?'” Farahnik said. “But you won’t be sitting in the mall with them listening to their conversations.”

Similarly, parents cannot see the content their teen sends or receives on Snapchat. They can only see who their child has communicated with in the last seven days. Snapchat is popular with young people, in part because messages on the platform disappear within 24 hours.

The company says it consulted with security experts and academics and held focus groups with parents to develop Family Center and plans to roll out more features in the coming months. The tool is only for parents of children under the age of 18.

With Family Center, Snap follows other social media platforms, including Instagram, which recently improved its parental controls. According to at least one poll, Snapchat is the second most popular social network among teenagers. The most popular, TikTok, offers "Family Pairing," which gives parents a few ways to limit the videos shown to their children.


A promotional screenshot of Snapchat’s new Family Center, shared by the company ahead of launch.

Snapchat



"I think these platforms want to show that they can take action to protect children, that they can self-regulate and that they are capable of doing it themselves without involving the government," said Irene Ly, policy adviser at Common Sense Media, which reviews apps, games and media for families.

Bipartisan legislation in Congress would require more sweeping changes aimed at protecting children on social media, but lawmakers have yet to vote on the measures.

Advocate: Social media should be ‘safer by design’ for kids

Parental controls can be helpful for some families, says Josh Golin, executive director of Fairplay, an advocacy group focused on improving online safety for children. But they require parents to have the time, energy, and commitment to learn each platform's tools and use them regularly.

“Are you going to spend 20 minutes a day figuring out what’s going on on Snap and another 20 minutes on TikTok and another 20 on Instagram?” he said. “I don’t think parents particularly want to spend their time this way. What they’d rather see is these platforms take real steps to be safer by design.”

For example, Golin says, it should be easier for kids to put down their phones and take a break from social media.

“As a 12-year-old, you might feel like, ‘Oh my God, my life is going to be over if I don’t contact my friend today on Snapchat,'” Golin said. “I don’t think we should give kids rewards, badges and other things for using online platforms more. That doesn’t encourage intentional and thoughtful use. I think it encourages compulsion and only benefits the company.”

Snap’s terms of use require a child to represent that they are 13 years of age or older before signing up for the service. Snap says it screens underage users for compliance with the Children’s Online Privacy Protection Act.

“We have millions of young people who are already on Snap, including millions who are under 13 and shouldn’t even be there in the first place,” Golin said.

He says companies could do a better job of verifying the ages of their users, rather than taking them at their word.

Common Sense's Ly says companies could also examine how their algorithms amplify content that could be harmful to children.

For example, Ly said, a child might interact with a post promoting healthy eating or a workout routine. But algorithms built to show users more of what they engage with could quickly lead that child down a rabbit hole of misinformation about eating disorders or other harmful eating habits.
