Children might not be the demographic people expect tech companies to target, but when kids make an app go viral, that big bump for business can come with a few caveats.
As TikTok, maker of the popular app (formerly known as Musical.ly), learned the hard way, failing to put certain safeguards in place for minor users can be costly. In short, the FTC hit the company with a $5.7 million fine for failing to protect children's data in several ways.
Parental Consent to Data Collection
TikTok is a social media video platform, essentially like Vine but with more features. If that description doesn't make sense, you can watch an ABC reporter explain the app on YouTube.
One of TikTok's primary problems was its failure to obtain parental consent before collecting names, email addresses, and other personal information from children under 13 years old. Even more troubling, adults were messaging young children through the app, and the app provided no way to block that activity.
After the FTC fine, TikTok banned children under 13 from the main platform and launched a separate, more kid-friendly experience that prohibits the sharing of personally identifiable information, limits other features, and focuses on age-appropriate content.
However, given TikTok's massive popularity and growth over the past year, and the fact that the app was acquired by Chinese company ByteDance for roughly a billion dollars, the FTC fine may be just a drop in the bucket.