Guest article

Beware of Bots and AI: Minimizing the risk of fraudulent claims in class action settlements

© Just_Super / Getty Images

In this guest article, Elizabeth Chiarello and Julie Becker of Sidley’s Products Liability & Mass Torts Litigation group provide actionable guidance to help functional food and supplement companies minimize the risks of a new form of fraud, in which scammers use bots to file thousands of claims during the settlement process, and explain how to navigate this new class action litigation environment.

Advances in technology have made it easier for plaintiffs’ lawyers and claims administrators to locate and notify more class members than ever before. Instead of mailing postcards or running ads in local newspapers, parties can now use the internet and social media to alert large numbers of potential class members with a single click.

But this technology comes with its own set of problems. In recent years, a growing number of class action settlements have been targeted by a new form of fraud in which scammers use bots to file thousands of claims in hopes of receiving large or multiple settlement payouts.

Bots create significant problems for both parties seeking final approval of a class action settlement. Bot attacks artificially drive up the claims rate, potentially increase the cost of settlement, and threaten to undermine the legitimacy of the class action settlement process.

A claims rate that is unrealistically high may suggest there are more aggrieved class members than there actually are, and may lead a judge to decline to approve the settlement because the amount paid per claim is too low.

A settlement that is not given final approval because of a misunderstanding of the true number of claims wastes the time of everyone involved: plaintiffs, defendants, the Court, and unnamed class members. While the issue has so far arisen through “bot attacks,” the use of AI has the potential to affect claims made in class action settlements as well.

AI “robot lawyers” are already being developed to perform legal services such as drafting contracts and demand letters, so it is safe to assume that AI will play an increasing role in litigation broadly, including in the class action claims process. Settlement administrators and the parties must adapt to this changing landscape in order to protect parties’ class action settlements from being derailed.

How to Spot a Bot Attack or Use of AI

Elizabeth Chiarello, partner in Sidley’s Products Liability & Mass Torts Litigation group

Sure signs of a hostile bot takeover, AI-driven filing, or other fraudulent activity include receiving tens of thousands of claims in the first several days of claims administration when the parties expected fewer claims than that over the full claims period.

Other indicia include receiving thousands of claims from the same IP address or receiving claims with no email address or signature (patterns a claims administrator can screen for automatically, as sketched at the end of this section). For example, in Opperman v. Kong Techs., Inc. et al., 13-cv-00453-JST (N.D. Cal. Jul. 6, 2017), several major app developers agreed to pay a consolidated $5.3 million to resolve claims related to invasion of privacy. See ECF 910 (Mot. for Final Approval of Class Action Settlement). But the claims administration process was hijacked by a bot: more than 5,400 claims were submitted from the same IP address, with about 1,000 of them submitted by someone living in a single-family residence in Toledo, Ohio. See ECF 911 (Mot. for Direction Regarding Potentially Fraudulent Claims).

Anticipating the Court’s frustration, the plaintiffs filed a motion requesting an order from the Court either 1) denying the fraudulent claims altogether or 2) requiring verification of identity from the claimants to resolve the issue (a process not originally included in the settlement agreement). Id.

The Court approved the second proposal, requiring the plaintiffs to notify all potentially fraudulent claimants, within one week of the order, that they must submit proof of identity. See ECF 918 (Order Granting Unopposed Mot. for Direction Regarding Fraudulent Claims). Ultimately, once the fraudulent claims were identified and eliminated, the Court approved the settlement. See ECF 925 (Order Granting Final Approval).
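
For illustration only, the short Python sketch below shows how a claims administrator might screen for the indicia described above: a surge of claims in the first few days, large numbers of filings from a single IP address, and claims missing an email address or signature. The field names and thresholds are hypothetical assumptions rather than features of any real administration platform or of the Opperman record, and claims flagged this way would still warrant individual review (or, as in Opperman, an opportunity to verify identity) before being rejected.

    # A minimal, hypothetical sketch of automated claim screening.
    # Field names (claim_id, ip_address, email, signature, submitted_at)
    # and thresholds are illustrative assumptions, not any administrator's
    # actual system.
    from collections import Counter
    from datetime import timedelta

    def flag_suspicious_claims(claims, ip_threshold=1000,
                               early_days=3, early_volume=10_000):
        """Return the set of claim IDs matching the indicia above."""
        flagged = set()

        # Indicium 1: thousands of claims submitted from a single IP address.
        ip_counts = Counter(c["ip_address"] for c in claims)
        hot_ips = {ip for ip, n in ip_counts.items() if n >= ip_threshold}
        flagged.update(c["claim_id"] for c in claims if c["ip_address"] in hot_ips)

        # Indicium 2: claims with no email address or signature.
        flagged.update(c["claim_id"] for c in claims
                       if not c.get("email") or not c.get("signature"))

        # Indicium 3: an early surge far beyond what was expected for the
        # entire claims period.
        start = min(c["submitted_at"] for c in claims)
        early = [c for c in claims
                 if c["submitted_at"] - start <= timedelta(days=early_days)]
        if len(early) >= early_volume:
            flagged.update(c["claim_id"] for c in early)

        return flagged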

How to Prevent a Bot Attack or Use of AI

Julie Becker, managing associate in Sidley’s Products Liability & Mass Torts Litigation group

1. Preventing Fraud Via Careful Drafting of the Settlement Agreement: Parties should draft the settlement agreement with fraud prevention in mind. For example, it is wise to design the claim application to be more challenging for bots or AI to manipulate, including:

  • Use unique claim identification numbers. Authentic members of a class might be provided with a unique claim identification number to list on their claim form, or one that automatically populates on an electronic claim form. This way, a settlement administrator can easily confirm that a claimant is entitled to a settlement benefit.
  • Add specific fraud provisions. Provide explicit provisions governing potentially fraudulent claims in the settlement agreement. This may include provisions giving those suspected of fraud some period of time to correct or support their claim before the claim is denied.
  • Require an affirmative selection: Claimants could be required to make an affirmative selection (identifying a specific product, the date of their purchase, or their preference for a certain settlement benefit) to make it more difficult for bots to submit false claims.
  • Require a declaration: Another option is to require the claimant to declare under penalty of perjury that their answers on the claim application are accurate, creating a legal remedy with which to hold bot fraudsters accountable. Such a requirement would give courts a means of investigating and prosecuting individuals who use bots to take advantage of the class action settlement process.
  • Caps on settlement funds: To avoid oversubscription, consider putting a cap on the settlement fund, with a pro-rata reduction of payments to class members if the settlement becomes oversubscribed (see the brief sketch following this list). Parties can also cap the number of claims per household, or the number of products a class member can claim without proof of purchase, with those limits defined in the settlement agreement.
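
To make the pro-rata mechanic concrete, here is a brief, hypothetical Python sketch of how payments might scale down when a capped fund is oversubscribed. The function name, fund size, and per-claim amount are illustrative assumptions; any real calculation would follow the formula spelled out in the settlement agreement itself.

    # Hypothetical illustration of a pro-rata reduction under a capped fund.
    def pro_rata_payments(approved_claim_ids, fund_cap, per_claim_amount):
        """Scale each approved claim's payment down proportionally if the
        total requested would exceed the capped settlement fund."""
        total_requested = per_claim_amount * len(approved_claim_ids)
        if total_requested <= fund_cap:
            return {cid: per_claim_amount for cid in approved_claim_ids}
        scale = fund_cap / total_requested
        return {cid: round(per_claim_amount * scale, 2)
                for cid in approved_claim_ids}

    # Example: a $5,000,000 cap, a $25 base payment, and 300,000 approved
    # claims request $7,500,000 in total, so each payment scales to about $16.67.
    payments = pro_rata_payments([f"claim-{i}" for i in range(300_000)],
                                 fund_cap=5_000_000, per_claim_amount=25.0)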

2. Preventing Bot Fraud in the Claims Administration Process: Parties should also consider preventing bot fraud through the claims administration process once the settlement agreement is drafted.

  • Retain a savvy settlement administrator: It is important to retain a settlement administrator who has experience identifying fraudulent claims and who uses sophisticated fraud-detection software and analysis.
  • Utilize effective fraud technology: The settlement administrator’s fraud technology should use, among other things, claim identification numbers and IP address tracking to make claims more traceable, as well as CAPTCHA or other technology that requires claimants to verify that they are not a robot.
  • Check proofs of purchase: Make sure the claims administrator has a cost-efficient process to vet the proofs of purchase. They should, for example:
    • Identify and reject claims with identical or duplicate receipts; it is easy for someone to submit the same photo of a receipt over and over again (a check that can be automated, as sketched after this list).
    • Watch for credit card statements that are submitted as proof of purchase, looking out for indicators of fraud such as purchase dates in the future.
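
As one illustration of the duplicate-receipt check above, the hypothetical Python sketch below hashes each uploaded receipt file so that the same photo submitted with multiple claims is easy to spot. The field names are assumptions, and a real administrator would likely layer fuzzier image matching on top of an exact-hash comparison.

    # Hypothetical sketch: flag claims that reuse the exact same receipt image.
    import hashlib
    from collections import defaultdict

    def find_duplicate_receipts(claims):
        """Group claim IDs by the SHA-256 hash of their receipt file; any hash
        shared by more than one claim points to a reused receipt."""
        by_hash = defaultdict(list)
        for claim in claims:
            with open(claim["receipt_path"], "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            by_hash[digest].append(claim["claim_id"])
        return {digest: ids for digest, ids in by_hash.items() if len(ids) > 1}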

Although bots have started to infiltrate the class action claims process as technology advances, thoughtful lawyering can prevent them from affecting settlements that are fair, adequate, and reasonable. By incorporating defensive strategies into the settlement agreement itself and into the claims process, and by remaining vigilant against such attacks, parties can stop bots and AI before they derail hard-won class action settlements.

* Summer associate Gillian Friedman also contributed to this article
