On February 27, 2019, the U.S. Federal Trade Commission announced a record-setting $5.7 million civil penalty against popular short-form video sharing platform TikTok, formerly known as Musical.ly, as part of a consent order over allegations that the company violated the Children’s Online Privacy Protection Act (COPPA). The settlement is the largest COPPA penalty the FTC has ever obtained, surpassing the previous record of $3 million set in 2011. Beyond reinforcing the FTC’s commitment to enforcement on children’s privacy issues, the FTC here took an expansive view of which services qualify as “directed to children” under COPPA. Although that expansive view is expressed in a complaint rather than a judicial opinion, it nonetheless changes the risk landscape for online service providers, particularly in the social media and videogame industries. Many companies have previously felt comfortable not addressing COPPA in their sites and services on the theory that they are not “directed to children.” Following TikTok, however, such companies should re-evaluate their exposure under COPPA and the steps needed to comply with it.
COPPA, passed in 1998 and revised in 2012, is one of the United States’ older consumer-facing privacy laws. It applies to the operator of any website or online service “directed to children” that collects personal information from children, and to any website or online service whose operator has actual knowledge that it is collecting personal information from children. Unless an exception applies, an operator subject to COPPA must obtain verifiable parental consent before collecting any personal information from a child in those circumstances. For an online service that is not directed to children but may nonetheless appeal to them, or where a substantial number of users are under 13, the FTC permits companies to implement an “age-screen” or “age-gate,” where users are asked to enter their age and their experience and data are handled accordingly.
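To make the age-gate mechanism concrete, the following is a minimal sketch in TypeScript, assuming a hypothetical web signup flow; the names (SignupRequest, routeSignup, and the “parental-consent-flow” branch) are illustrative and not drawn from any real platform’s API. FTC guidance favors asking for age in a neutral way (e.g., a plain date-of-birth field) rather than signaling the answer that unlocks the service.

```typescript
// Minimal age-gate sketch (illustrative only; names are hypothetical).
// The key design point: the date of birth is collected *before* any other
// personal information, and under-13 users are routed away from standard
// data collection until verifiable parental consent is obtained.

const COPPA_AGE_THRESHOLD = 13;

interface SignupRequest {
  birthDate: Date; // collected via a neutral date-of-birth prompt
}

function ageInYears(birthDate: Date, now: Date = new Date()): number {
  let age = now.getFullYear() - birthDate.getFullYear();
  const hadBirthdayThisYear =
    now.getMonth() > birthDate.getMonth() ||
    (now.getMonth() === birthDate.getMonth() &&
      now.getDate() >= birthDate.getDate());
  return hadBirthdayThisYear ? age : age - 1;
}

function routeSignup(req: SignupRequest): "standard" | "parental-consent-flow" {
  if (ageInYears(req.birthDate) < COPPA_AGE_THRESHOLD) {
    // Under 13: defer collection of personal information until verifiable
    // parental consent is obtained, or offer a limited, no-PI experience.
    return "parental-consent-flow";
  }
  return "standard";
}
```

A common companion design choice, consistent with FTC staff guidance, is to avoid letting a rejected user immediately re-enter a different age (for example, by setting a session flag after a failed screen), so the gate cannot be trivially bypassed.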
Although COPPA is a U.S.-specific law, it applies extraterritorially to companies outside the U.S. that collect personal information from children in the U.S. It has also served as a model for children’s privacy provisions in laws abroad, including the EU’s GDPR. The FTC has repeatedly enforced COPPA in the online/games space; for example, against TinyCo and Yelp in 2014, and LAI Systems and Retro Dreamer in 2015. The FTC has increased its focus in this area on the heels of the child-protection aspects of the GDPR and the California Consumer Privacy Act.
The TikTok/Musical.ly App (App) allows users to create and share short videos with other users. The videos typically involve users lip-syncing to popular music. To create an account, the App requires an email address, phone number, username, first and last name, a short biography, and a profile picture. In addition to creating and sharing videos, the App allows users to interact by commenting on each other’s videos and sending direct messages. User accounts are public by default.
Initially, the App did not request age information from its users. That changed in July 2017, but the App still did not request age information from existing users who had created their accounts before the change. While Musical.ly’s 2018 Privacy Policy stated that the platform “is not directed at children under the age of 13,” according to the FTC complaint, Musical.ly was aware of the popularity of its platform with children, as shown in a number of ways:
This settlement is notable because of the FTC’s broad interpretation of what makes an online service “directed to children” under COPPA. For context, the Amended COPPA Rule explains that the FTC will evaluate whether a site or service is “directed to children” based on the “subject matter of the site or service, its visual content, the use of animated characters or child-oriented activities and incentives, music or other audio content, age of models, presence of child celebrities or celebrities who appeal to children, language or other characteristics of the website or online service,” or the content of advertising on the service. In addition to these factors, the FTC will also rely on other “competent and reliable empirical evidence regarding audience composition.”
Weighing these factors, the FTC complaint, filed in February, concluded that Musical.ly is a child-directed service due to the activity involved (creating lip-syncing videos) and the presence of emojis like “cute animals and smiley faces,” “simple tools” for sharing content, songs related to “Disney” and “school,” and kid-friendly celebrities like Katy Perry, Selena Gomez, Ariana Grande, Meghan Trainor and others. However, this type of content (lip-syncing, approachable design, bright colors, emojis, the presence of pop music, etc.) can arguably be found on many sites and games that are not directed to children. (Take, for example, RuPaul’s Drag Race and its associated app.)
Either the FTC is now adopting a more expansive definition than previously understood, or it is relying less on the subject-matter factors and more heavily on “other competent and reliable empirical evidence” of audience composition: specifically, complaints from parents, press articles, Musical.ly’s own admissions in its guidance to parents, and Musical.ly executives’ actual knowledge of popular underage users on their platform.
Absent further guidance from the FTC, service providers should view this settlement with caution and take steps now to protect their site or service, especially if there is a chance it may be seen as child-directed.