Federal Trade Commission Bans Fake Reviews and Testimonials

Written by
Jen McKeeman
Aug 15, 2024

On August 14, 2024, the Federal Trade Commission (FTC) announced a final rule to crack down on fake reviews and prohibit deceptive review practices such as AI-generated reviews, suppressing genuine negative reviews, and paying for positive reviews. The rule strengthens the agency's enforcement powers and enables it to seek civil penalties against violators.

A long time in the making, the rule follows an advance notice of proposed rulemaking issued in November 2022 and a notice of proposed rulemaking in June 2023, which then led to an informal hearing in February 2024.

FTC Chair Lina M. Khan said:
“Fake reviews not only waste people’s time and money, but also pollute the marketplace and divert business away from honest competitors. By strengthening the FTC’s toolkit to fight deceptive advertising, the final rule will protect Americans from getting cheated, put businesses that unlawfully game the system on notice, and promote markets that are fair, honest, and competitive.”

The full text of the final rule is available from the FTC; a summary of its key provisions follows:

  • Fake consumer reviews, consumer testimonials and celebrity testimonials are now illegal under the rule. 
  • The rule also covers AI-generated fake reviews and reviews written by someone with no actual experience of the business or its products or services. 
  • Businesses are prohibited from both selling and buying fake reviews. This applies equally to negative and positive reviews.
  • Insider reviews, such as those written by company employees, that do not clearly and conspicuously indicate the reviewer’s connection to the company, are prohibited.
  • Businesses are prohibited from providing incentives or compensation to elicit reviews with a particular sentiment - either positive or negative - from consumers.
  • Review suppression is not permitted. A company’s website must be a genuine reflection of ALL the reviews received - both positive and negative - in order to fairly represent the company.
  • Businesses are barred from using unfounded legal threats, physical threats or intimidation to prevent or remove a negative review.
  • Review websites that claim to be independent, but are actually company-controlled, aren’t allowed.
  • Buying or selling social media engagement such as likes, follows or views through hacked accounts or bots is prohibited.

The maximum civil penalty is set to $51,744 per violation, but courts may impose lower per-violation penalties as they deem appropriate.

The rule takes effect 60 days after publication in the Federal Register. The Commission believes it will enhance deterrence and strengthen FTC enforcement actions, helping to level the playing field for honest companies and protect consumers from the scourge of fake reviews. 

Market for Fake Reviews

In the final rule, the FTC outlines its calculation of the revenue generated from products and services for which consumers consider reviews as part of their decision-making process: an estimated $1.146 trillion. Using conservative assumptions, the Commission then estimates that revenue generated from review-manipulated products amounts to $21.6 billion. That puts a monetary value on the power and worth of reviews, and it’s no wonder bad actors manipulate ratings to capture more revenue.
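As a rough sense-check (our own back-of-the-envelope arithmetic, not part of the FTC's methodology), those two figures imply that review-manipulated products account for roughly 1.9% of review-influenced revenue:

```python
# Back-of-the-envelope arithmetic using the two figures cited from the FTC's final rule.
review_influenced_revenue = 1.146e12  # USD: revenue where consumers consider reviews when buying
manipulated_revenue = 21.6e9          # USD: the FTC's conservative estimate for review-manipulated products

implied_share = manipulated_revenue / review_influenced_revenue
print(f"Implied share of review-influenced revenue: {implied_share:.1%}")  # ~1.9%
```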

Section 230 and Platforms' Responsibilities

Section 230 of the Communications Decency Act of 1996 protects platforms hosting user-generated content from liability for that content. However, if reputation, trust, and user experience matter to platforms, they need to be proactive in protecting themselves and their users from harmful fake reviews. Fake reviews are often one part of a bigger problem for platforms: fake accounts, fake listings, scam activity and counterfeit goods. Fabricated reviews erode consumer trust and can even put consumers at risk of real harm, as with non-genuine positive reviews for healthcare professionals, childcare services, or beauty products. Once a platform has a reputation for fake reviews, repairing that broken trust can be incredibly challenging.

Review Fraud Clampdown: UK’s DMCC Act Bans Fake Reviews

And it’s not just the US taking action to clamp down on review fraud. The UK’s Digital Markets, Competition and Consumers (DMCC) Act, passed in May 2024, also makes fake reviews illegal. Like the FTC’s final rule, it has been more than a year in the making and is anticipated to take effect before the end of 2024.

Under the Act, the Competition and Markets Authority (CMA) has been given direct powers to enforce against fake and misleading reviews that could unfairly affect consumer decision-making. It can gather evidence, issue infringement decisions, and impose fines on infringing companies of up to 10% of worldwide turnover. Individuals can also face personal fines. 

Keen to use its new enforcement powers, the CMA’s CEO stated in a podcast that consumer protection law enforcement under the DMCC Act will be “an incredibly important focus” and this will “drive real improvements in outcomes for consumers.”

Pasabi’s Fake Review Detection Expertise

At Pasabi, we feel strongly about the need for authentic, trusted reviews. In September 2023, we commented on the FTC’s notice of proposed rulemaking because we felt it was important to highlight, from our experience detecting fake reviews for online platforms, our concerns about the proliferation of review abuse. Pasabi is proud to have contributed, and our comment is quoted and credited in the FTC’s final rule.

Pasabi’s fake review detection technology uses AI, behavioral analytics and clustering to identify suspicious patterns of review behavior at the account level. In addition to analyzing review content, we look at key data points that enable us to detect businesses suspected of buying reviews from review brokers, posting insider testimonials, or incentivizing reviews. Our proven behavioral approach allows us to accurately identify fake reviews and the fake accounts behind them, removing reliance on content analysis at a time when gen AI tools give fraudsters fast, free and credible-sounding review writing. Our technology gives enforcement teams the ability to automate at scale, providing a more trusted and valuable experience for users. And now, with the announcement of the FTC’s final rule and the UK’s DMCC Act, Pasabi’s technology helps platforms comply with regulation on both sides of the Atlantic.
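To make the behavioral idea concrete, here is a minimal, hypothetical sketch of account-level clustering using invented features and scikit-learn's DBSCAN. It illustrates the general technique only; it is not Pasabi's implementation, and the features, data and parameters are assumptions made for the example.

```python
# Minimal illustration of clustering accounts by behavioral signals rather than review text.
# NOT Pasabi's implementation: the features, data and parameters below are invented for this example.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import DBSCAN

# Hypothetical per-account features:
# [reviews per day, share of 5-star ratings, average seconds between consecutive reviews]
accounts = np.array([
    [0.2,  0.60, 86400.0],  # occasional reviewer, mixed ratings
    [0.3,  0.50, 72000.0],
    [12.0, 1.00,    45.0],  # high volume, all 5-star, rapid-fire
    [11.5, 1.00,    50.0],
    [12.4, 0.98,    40.0],
])

X = StandardScaler().fit_transform(accounts)
labels = DBSCAN(eps=0.9, min_samples=2).fit_predict(X)
print(labels)  # accounts with similar behavior share a cluster label; outliers would be labeled -1
```

In practice, a behavioral approach draws on far more signals and combines automated clustering with human review; the point of the sketch is simply that account behavior remains detectable even when gen AI makes the review text itself look convincing.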

Get in touch today if you’re interested in learning more about what we can do to help you tackle the fake reviews challenge effectively.
