The popular app TikTok is beginning to test ways to age-restrict some types of content amid a push to beef up safety features for teens.
Brooke Erin Duffy, associate professor of communication, is an expert on social media platforms who studies the intersection of media, culture and technology. She says the key concern is how the company will define and regulate mature content.
“For years, TikTok was able to evade the scrutiny of U.S. policymakers and the public alike. Before the pandemic, parents seemed much more concerned about the risks of social networking sites like Instagram and Snapchat. However, TikTok has undergone a mainstreaming in recent years: its top creators have become household names, and many of the platform’s viral challenges – including the moral-panic-inducing ‘school shooting challenge’ – have attracted widespread media attention. The announcement of its plans to increase youth safety measures indicates that the company is acting on the defensive.
“One key concern is how the company will define and regulate ‘mature’ content. Creators tell me the platform’s existing content guidelines are fuzzily defined and not always evenly applied. As such, TikTok’s enhanced efforts to provide a safer environment may adversely impact its creator community – particularly those associated with stigmatized content genres.”