TikTok today becomes the latest tech company to roll out increased protections for minors on its platform amid growing regulatory scrutiny. The company says it will introduce a series of product changes for teen users aged 13 to 17, aimed at making their TikTok experience more private, safer and less addictive. TikTok’s news follows similar moves recently announced by other tech companies catering to teens, including Google and YouTube, as well as Instagram.
The changes TikTok plans to roll out over the coming months will address in-app messaging, the public nature of users’ videos, the default download settings for videos and TikTok’s use of push notifications.
This expands on the changes to privacy settings and defaults for users under the age of 18 that TikTok introduced in January. At the time, TikTok debuted stricter rules for teens aged 13 to 15 and slightly more permissive settings for users 16 to 17 focused on default account types, commenting and the use of TikTok’s interactive features, like Stitch and Duet.
Now, TikTok says new users aged 16 to 17 will have their Direct Message setting set to “No One” by default, and existing users will be prompted to review and confirm their settings the next time they use the messaging feature.
The company won’t prevent teens from using Direct Messages, but they will have to make a more explicit choice to do so.
The app will also now display a pop-up message when a teen under the age of 16 publishes their first video, asking them to choose who can watch their content: followers, friends only or only themselves. (The “Everyone” option is disabled.) Previously, TikTok’s default settings limited the visibility of content from teens under 16 to followers they had approved. Now, it’s more directly pushing teens to make a choice about how public they want their content to be, and they have to decide before the video can be published, TikTok notes.
TikTok also said it will disable Duet and Stitch for users under 16, but this is not new — it was a part of the privacy changes that rolled out in January.
Separately, teens 16 to 17 will now be asked to make a decision about whether or not their videos can be downloaded by others. While TikTok won’t prevent the teens from making their content downloadable, it will pop up a box that asks them to reconfirm their choice, while reminding them that this means the videos could be shared to other platforms. (Downloads remain disabled for users 13 to 15, meanwhile.)
The final change is perhaps the most interesting because it’s something neither YouTube nor Instagram introduced: TikTok will limit push notifications.
Younger teens ages 13 to 15 won’t receive any push notifications after 9 PM, while those aged 16 to 17 won’t receive any notifications after 10 PM.
This part of the update is reflective of TikTok’s global mindset and its parent company’s Chinese roots. Today, China is in the midst of a tech crackdown, encompassing antitrust regulations, data security practices, tech business models and even social mores — like the addictive nature of video games, which state media equated to a drug like “opium.”
TikTok, too, has been called out as one of the most addictive social apps on the market, thanks to its advanced personalization technology, interactive design, simple interface and psychological tricks that activate the pleasure centers of users’ brains. The company already inserts “take a break” videos inside its main feed, because users have been losing hours to scrolling the app. Its decision to limit notifications is yet another acknowledgment of the app’s ability to lead users, and particularly younger users, to develop negative digital media habits. By preventing notifications during certain hours, TikTok can point potential regulators to a feature that demonstrates it’s doing something to address that problem.
The changes come at a time when there’s a broader shift in the industry in terms of how tech companies cater to their younger users, as concerns about screen time, addictiveness, online abuse, data collection, privacy and more have been brought to light.
In the U.S., Congress has been pressuring companies to do more to protect younger users from the more harmful and negative impacts of technology.
One key piece of legislation in the works is an update to the decades-old children’s privacy law, COPPA (Children’s Online Privacy Protection Act). A new bill, the Protecting the Information of our Vulnerable Children and Youth Act, would expand COPPA to include teens under the age of 18 and prevent tech companies from using targeted advertising, among other things.
As a result, tech companies have been revamping their products to make teens’ experiences more private, and they’ve increased protections, including over how teens’ data is collected and used by advertisers.
TikTok was early to take action on protections for teens, following a multimillion-dollar fine from the U.S. Federal Trade Commission for earlier violations of children’s privacy laws, part of a crackdown by the agency that later extended to YouTube. Beyond the privacy changes from earlier this year and its “take a break” reminders, TikTok also led the market by bundling parental controls inside its app with the Family Pairing feature. The company also offers other resources for parents, including educational safety videos and parental guides. And it brought in outside experts to advise on policy creation with the introduction of the TikTok Content Advisory Council.
“Our priority is to ensure teens on TikTok have a safe and age-appropriate experience as they create and share on our platform,” said Tracy Elizabeth, TikTok’s Global Minor Safety Policy Lead, in a statement about today’s changes. “This announcement builds on our industry-leading efforts to make all accounts under 16 private by default, age-restrict features like direct messaging, and empower parents with Family Pairing,” she said.