On December 19, 2022, the Federal Trade Commission (FTC) announced a settlement with Epic Games Inc. (Epic) over its wildly popular game “Fortnite.” The settlement requires Epic to pay $275 million in penalties to resolve alleged children’s privacy violations and $245 million to refund consumers for allegedly unfair billing practices that duped players into making in-game purchases. These actions are the latest chapter in the FTC’s continued aggressive enforcement of the Children’s Online Privacy Protection Act (COPPA) and represent its most significant move to date against so-called dark patterns.
“Fortnite” is a free-to-play multiplayer online game that launched in July 2017. Players glide onto an island and compete against other online players in a battle royale format. Today, “Fortnite” has more than 400 million players worldwide, many of whom, according to the FTC’s COPPA Complaint, are under the age of 13. The Complaint alleges that “Fortnite” is directed to children under 13 and that Epic collected kids’ personal information and shared their voices and screen names without verifiable parental consent.
There is no bright-line rule defining when a site or app is “directed” to children under 13 and therefore subject to COPPA’s verifiable parental consent requirements. Instead, the COPPA Rule—the regulations issued by the FTC that implement the statute—provides that the FTC will consider numerous factors, including the site’s subject matter, content, use of animated characters or child-oriented activities, child-directed advertising, the presence of child celebrities, evidence regarding the intended audience and empirical evidence of the ages of users.
The Complaint points to the game’s cartoonish characters, its absence of blood and gore, and the fact that players are not “killed” but eliminated. While it would likely be a stretch to conclude that these factors alone make the game “child directed,” the Complaint also points to numerous others: evidence that Epic oversaw the licensing of kids’ merchandise such as toys and costumes, survey data showing that 53% of U.S. kids between the ages of 10 and 12 played “Fortnite” weekly, and employee statements reflecting both an intent to appeal to young players and knowledge that many users were under 13.
The FTC alleged that because “Fortnite” was child-directed from the start, Epic was required to notify parents of its privacy practices and obtain verifiable parental consent before kids could create accounts or play. Although Epic added an age gate two years after launch that asked players to self-identify their age, the FTC alleged that the gate was ineffective because it did not apply to players who played without creating an account and did not reach all previously registered users.
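For developers weighing a similar control, FTC guidance contemplates a “neutral” age gate: one that asks for a date of birth without hinting at the cutoff and without letting users retry after an under-13 answer. The TypeScript sketch below is a minimal illustration of that logic; the function and field names are assumptions for illustration, not anything drawn from the Epic order.

```typescript
// Hypothetical sketch of a "neutral" COPPA age gate; the names
// (checkAgeGate, sessionStore) are illustrative, not from the order.

const COPPA_AGE_THRESHOLD = 13;

interface AgeGateResult {
  requiresParentalConsent: boolean;
}

function checkAgeGate(
  birthYear: number,
  birthMonth: number, // 1-12, collected via a neutral date picker
  sessionStore: Map<string, string>,
): AgeGateResult {
  // Neutrality: the UI asks for a full date of birth, shows no
  // "must be 13+" hint, and does not pre-select an adult birth year.
  const now = new Date();
  let age = now.getFullYear() - birthYear;
  if (now.getMonth() + 1 < birthMonth) age -= 1;

  // Remember the first answer so a user cannot back up and enter an
  // older age after an under-13 response.
  const prior = sessionStore.get("ageGateAge");
  if (prior === undefined) {
    sessionStore.set("ageGateAge", String(age));
  } else {
    age = Math.min(age, Number(prior));
  }

  return { requiresParentalConsent: age < COPPA_AGE_THRESHOLD };
}
```

As the Complaint’s allegations suggest, a gate like this only helps if it applies to every player, including those who play without creating an account and those registered before the gate existed.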
The Complaint alleges that child users were harmed by the game’s failure to comply with COPPA. Much attention is directed at the game’s matchmaking, which paired young players with older players, and at an on-by-default voice chat feature that exposed children to harassing and abusive voice and text communications from older players. The FTC also alleged that enabling chat by default for both teen and child users was an unfair practice that violated Section 5(a) of the FTC Act.
To resolve these allegations, Epic agreed to a consent order that prohibits it from enabling the voice and text chat functions for both child and teen users without obtaining their affirmative consent through a privacy setting. Notably, this is the first enforcement action in which the FTC has required affirmative consent for chat functions for users who are 13 or older. The order also requires Epic to delete all personal information previously collected from “Fortnite” users unless it obtains verifiable parental consent, or those users identify as 13 or older through a neutral age gate. The $275 million penalty is the largest penalty ever obtained for violating an FTC rule, eclipsing the next-largest COPPA penalty (against YouTube) by more than $100 million. The settlement was unanimously approved by the FTC.
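To illustrate what this kind of default looks like in practice, here is a short, hypothetical TypeScript sketch of default-off chat settings with an explicit opt-in step; the type and function names are assumptions, not terms of the consent order.

```typescript
// Hypothetical sketch of default-off chat with affirmative opt-in;
// the AgeBand and ChatSettings shapes are illustrative assumptions.

type AgeBand = "under13" | "teen" | "adult";

interface ChatSettings {
  voiceChatEnabled: boolean;
  textChatEnabled: boolean;
}

function defaultChatSettings(ageBand: AgeBand): ChatSettings {
  // Voice and text chat start OFF for child and teen users; nothing
  // short of an explicit privacy-setting change turns them on.
  const off = ageBand === "under13" || ageBand === "teen";
  return { voiceChatEnabled: !off, textChatEnabled: !off };
}

function optInToChat(
  current: ChatSettings,
  userConfirmed: boolean,
): ChatSettings {
  // Affirmative consent: require a deliberate user action, never a
  // pre-checked box or consent bundled into account creation.
  if (!userConfirmed) {
    return current;
  }
  return { voiceChatEnabled: true, textChatEnabled: true };
}
```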
In a separate administrative Complaint, the FTC alleges that Epic employed an array of design tactics known as “dark patterns” to trick consumers into making in-game purchases without their express, informed consent. The term “dark patterns” describes design practices that trick or manipulate users into making choices they would not otherwise have made and that may cause them harm. Once a largely academic concept, dark patterns are now regulated by major privacy laws across the U.S. (e.g., the California Privacy Rights Act) and have been the subject of numerous recent FTC enforcement actions. In September 2022, the FTC released a report on dark patterns, “Bringing Dark Patterns to Light,” based on findings from a 2021 public workshop of the same name. The report identifies the types of misleading and manipulative practices the agency believes can harm consumers.
Many of the practices highlighted in the report are present in the administrative Complaint. Specifically, the FTC alleged that Epic used dark patterns to:

- Lead players to incur unwanted charges through counterintuitive, inconsistent, and confusing button configurations;
- Allow children to rack up charges without any parental or cardholder consent; and
- Lock the accounts of customers who disputed unauthorized charges with their credit card companies.
In settling these allegations, Epic has committed to implementing a “hold-to-purchase” button on its store page that reconfirms a player’s intent to buy, an additional safeguard against unintended purchases. The company is also expanding parental controls, including the ability to authorize real-money purchases before they are made. Finally, Epic announced instant cancellation for certain purchases and an expanded refund system for the purchase of digital goods.
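As a rough illustration of the hold-to-purchase mechanic, the sketch below wires a buy button so the charge confirms only after a sustained press; the 1,000 ms threshold and function names are assumptions for illustration, not Epic’s implementation.

```typescript
// Hypothetical sketch of a hold-to-purchase control: the charge
// confirms only after a sustained press, so a single accidental tap
// cannot complete a purchase. The 1,000 ms threshold is an assumption.

const HOLD_DURATION_MS = 1_000;

function attachHoldToPurchase(
  button: HTMLButtonElement,
  confirmPurchase: () => void,
): void {
  let holdTimer: number | undefined;

  button.addEventListener("pointerdown", () => {
    // Arm the timer; the purchase fires only if the press is held.
    holdTimer = window.setTimeout(confirmPurchase, HOLD_DURATION_MS);
  });

  const cancel = () => {
    // Releasing early, or sliding off the button, aborts the purchase.
    if (holdTimer !== undefined) {
      window.clearTimeout(holdTimer);
      holdTimer = undefined;
    }
  };

  button.addEventListener("pointerup", cancel);
  button.addEventListener("pointerleave", cancel);
}
```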
This enforcement action serves as a cautionary tale to other game developers and online platforms used by children. As Epic pointed out in its blog post announcing the settlement, the video game industry likes to move fast and innovate; in Epic’s words, “statutes written decades ago don’t specify how these ecosystems should operate.” But recent enforcement actions have clarified how the FTC will apply the law. For example, although “Fortnite” was rated “Teen” and Epic took the position that the game was directed at an older teen and college-aged audience, the FTC alleged that the combination of empirical evidence about its users, the target audience of its licensed products, and the appearance of the game’s characters demonstrated that it was directed to children under 13. This enforcement makes clear that developers of a teen-rated or mature-rated game cannot assume the FTC will agree the game is not directed to children.
Another example is in-app or in-game purchases by children. Although certain games or apps are free to download, the FTC will look closely at the ability of children to charge in-app purchases to parents or other accountholders without consent. The FTC has also made clear that companies designing user interfaces should look not just at the effect their design choices have on sales or other profit-based metrics, but also at how those choices affect consumers’ understanding of the material terms of the transaction.
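One way to operationalize that consent check is to route any purchase initiated from a child account to the accountholder before charging; the hypothetical TypeScript sketch below shows the idea, with the PurchaseRequest shape and approval callback as illustrative assumptions.

```typescript
// Hypothetical sketch of routing a child's in-game purchase to the
// accountholder for approval before any charge; the PurchaseRequest
// shape and approval callback are illustrative assumptions.

interface PurchaseRequest {
  itemId: string;
  priceUsd: number;
  fromChildAccount: boolean;
}

async function authorizePurchase(
  req: PurchaseRequest,
  requestCardholderApproval: (req: PurchaseRequest) => Promise<boolean>,
): Promise<boolean> {
  // A card on file is not consent: purchases initiated from a child
  // account wait for the accountholder's explicit approval.
  if (req.fromChildAccount) {
    return requestCardholderApproval(req);
  }
  return true;
}
```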
With robust additional guidance now available from the FTC, game developers and other online platforms would be wise to carefully review their products.
Companies can follow 10 simple steps to reduce risk: