September 21, 2024

TikTok fails to protect children’s privacy, creates unsafe digital environments

 

Alana Frank

Photo Editor

The social media platform TikTok is facing a $29 million fine after the U.K.’s Information Commissioner’s Office (ICO) discovered that the company had violated child data privacy laws. In light of the two-year investigation, TikTok needs to reevaluate its policies for minors, disable certain features for younger users, and prevent harmful challenges from becoming trends.

The ICO discovered that TikTok was tracking the online browsing behavior of children 13 years of age and younger without parental consent. TikTok was also accused of unlawfully collecting personal data such as religion, ethnicity, and sexual orientation.

 

Since 2019, TikTok’s Chinese parent company, ByteDance, has been sued multiple times for violating the Children’s Online Privacy Protection Act. In July 2022, two underage girls lost their lives attempting a trending TikTok challenge called “the blackout challenge,” which dared people to choke themselves into unconsciousness. When asked about the challenge, junior Brianna Santoro said that she had never heard of it but was shocked by the story. The fact that “the blackout challenge” was allowed to spread this far shows TikTok’s failure to regulate its content.

 

TikTok’s popularity has grown exponentially over the past few years. According to Brian Dean, a search engine optimization (SEO) specialist, TikTok has over a billion active users, the majority of them minors. TikTok grew 1,157.76% globally between 2018 and 2020 and 180% among 15- to 25-year-olds during the pandemic. TikTok needs to make minors’ safety a priority instead of focusing on growing its user base.

 

With the majority of its users being minors, TikTok needs to make data privacy policies its utmost priority. In 2019, the platform took steps to protect minors by implementing features for users under 18, such as disabling direct messages, adding screen-time reminders, and offering a family safety mode. Going forward, TikTok should limit the amount of information that minors can share on the app. For example, the ability to link other social media accounts to a TikTok profile should be disabled for underage users.

 

The ICO’s investigation into the violations began in 2019 to uncover how TikTok collects sensitive data and whether those practices breached the General Data Protection Regulation (GDPR), which requires companies to implement extensive safety measures for minors. TikTok has been targeting children with addictive trends, and the app is designed to evade parental responsibility and authority.

 

TikTok is failing to make the app a safe place for underage users. An anonymous Costa student shared that TikTok is their most-used app, with over 20 hours spent on it in a single week. The more time users spend on TikTok, the more personal data the app collects. Information Commissioner John Edwards said that children should be able to learn and experience the digital world without having to worry about data privacy, a standard TikTok has failed to meet.

 

Although TikTok’s terms and conditions require users to be at least 13 years of age, many users lie about their age when signing up. TikTok maintains that it has not violated privacy laws because its terms and conditions disclose the dangers the platform poses to children. However, many, like attorney Matthew Bergman, who represented the families of the two girls who died attempting “the blackout challenge,” argue that TikTok is not doing enough to protect minors.

 

As only one of many lawsuits and fines TikTok has faced over privacy violations, cyberbullying, and challenges that have led to users’ deaths, this case makes clear that TikTok is not a safe social media platform to use.

 
