The Role of AI in False Claims Act Detection & Deception

June 17, 2024
AI is a new frontier in the technological landscape, changing the way many people work, receive information, and complete everyday tasks. It offers tools that simplify data collection and the assimilation and interpretation of information; on the flip side, it offers fraudsters a means of evasion, since it can learn what regulators are looking for. AI technologies are used for a wide range of applications, including speech recognition, language translation, and image recognition. (1) Although AI is still maturing, it may have far-reaching implications in all walks of life, including the legal realm. It can be used both to identify fraud and to perpetrate and conceal it, so we must take the good with the bad and learn how to harness its powers properly and ethically.

Use of AI in FCA Detection

AI has many potential uses for identifying fraud under the False Claims Act (FCA). The government is already using AI to flag coding outliers in order to detect Medicare fraud and Medicaid fraud. That is, AI can determine which medical practices are billing the most for a certain code and then scrutinize those submissions to see whether they are compliant and consistent with the regulations. In the race to build a better mousetrap, fraudsters can use AI to figure out how to bill using alternative codes that historically have gone undetected and unprosecuted. This dance between AI used to detect and AI used to conceal will go on for quite some time.
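As a purely hypothetical illustration of the outlier-screening idea described above (the provider names, claim counts, and threshold below are invented for this sketch, not drawn from any actual enforcement tool), a simple statistical screen over per-provider billing volumes might look like:

```python
# Sketch: flag providers whose claim volume for a single billing code is a
# statistical outlier relative to peers. All data here is hypothetical; real
# systems use far richer features than a one-dimensional z-score.
from statistics import mean, stdev

def flag_outliers(claims_per_provider, z_threshold=2.0):
    """Return providers whose counts sit more than z_threshold
    standard deviations above the peer mean."""
    counts = list(claims_per_provider.values())
    mu, sigma = mean(counts), stdev(counts)
    if sigma == 0:  # all providers bill identically; nothing stands out
        return []
    return [provider for provider, count in claims_per_provider.items()
            if (count - mu) / sigma > z_threshold]

# Hypothetical yearly claim counts for one billing code
claims = {
    "Provider A": 120, "Provider B": 135, "Provider C": 110,
    "Provider D": 125, "Provider E": 130, "Provider F": 115,
    "Provider G": 140, "Provider H": 900,
}
print(flag_outliers(claims))  # Provider H's volume invites scrutiny
```

A flagged provider is not proof of fraud, of course; it is only a signal that the underlying submissions deserve the kind of compliance review the article describes.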

Data mining is a term used with or without AI, but it will only be enhanced by AI’s learning capabilities. PPP loan fraud cases have been assembled using data mining. Similarly, one can spot potential defense contractor fraud by examining who is winning the most bids and may be engaging in procurement fraud, but unless you’re on the inside, it’s hard to cobble together an FCA lawsuit using AI from the outside.

AI can be used to train document-review databases – cataloguing documents and teaching the system to identify certain patterns, or the relevance of certain kinds of related documents, for an ongoing case. AI could conceivably follow digital trails to home in on fraud by analyzing various sources of information and identifying patterns that one might not notice on one’s own. The technology can also be used to identify targets for enforcement, doing the preliminary investigative work for authorities such as the Department of Justice (DOJ). The following are some examples of how AI specifically can be used for fraud detection:

  • Evaluating a mountain of case law for legal research, notably a long and monotonous process. (2)
  • Helping lawyers make informed, data-driven decisions, thereby improving their efficiency. (2)
  • Streamlining searches of company records across multiple databases, increasing efficiency. (3)
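To make the document-review idea above concrete, here is a minimal, hypothetical sketch of ranking records by relevance to an investigator’s query using bag-of-words cosine similarity. The documents and query are invented for illustration, and real review platforms rely on trained models rather than raw word counts:

```python
# Sketch: score documents for relevance to an investigation query with
# bag-of-words cosine similarity. Documents and query are hypothetical.
import math
from collections import Counter

def cosine_similarity(a: Counter, b: Counter) -> float:
    """Cosine of the angle between two term-count vectors."""
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def rank_documents(query, documents):
    """Return (doc_id, score) pairs, most relevant first."""
    q = Counter(query.lower().split())
    scored = [(doc_id, cosine_similarity(q, Counter(text.lower().split())))
              for doc_id, text in documents.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

docs = {
    "memo-1": "invoice submitted for services never rendered to medicare",
    "memo-2": "quarterly staffing schedule and holiday calendar",
    "memo-3": "upcoded claims billed to medicare at the highest rate",
}
print(rank_documents("medicare claims upcoding invoice", docs))
```

Even this toy version pushes the irrelevant staffing memo to the bottom of the pile, which is the basic triage function such tools serve in a large production of records.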



Use of AI in FCA Deception

Fraudsters have an array of reasons to utilize AI in conducting their crimes. Some of them include: (1)

  • Speed and efficiency
  • Anonymity
  • Misleading investigators to evade detection
  • Financial gain
  • Creating fake or misleading information to deceive people
  • Automating scams
  • Spoofing email addresses and phone numbers
  • Generating fake documents
  • Making cyber attacks more sophisticated, so they are harder to track and/or identify
  • Impersonation
    • Mimicking style, tone and speech patterns of an individual
    • Generating fake activity on social media
    • Generating images or videos that look like a particular person, such as a deepfake

AI Training Act

On October 17, 2022, President Biden signed the AI Training Act into law. The law mandates that the federal government’s workforce have knowledge of how AI works, as well as its benefits and risks. (4) The bill requires the Office of Management and Budget (OMB) to “establish the AI training program for executive agencies,” to update the program every two years, to ensure there is a way to understand and measure the workforce’s participation, and to receive and consider feedback from program participants. (5)

The bill, which passed with near unanimity before being signed into law, demonstrates the understanding within the federal government, its regulators, and its law enforcement bodies that AI is a part of the future. It signals that mastering the use of AI and understanding all its applications, both positive and negative, is not something to overlook. Having federal government workers in agencies like the DOJ understand how AI works can only help whistleblowers seek justice against fraud perpetrated with AI.

Although AI has great utility, the true extent of its abilities is yet to be determined, as it still requires training by humans to do much of the work it’s intended for. While common applications of AI today pose only minor threats of widespread fraud, that does not mean AI is something to write off. Many technologies are still being developed that will change the nature of FCA claims, for better or for worse. There will be new methods fraudsters use to cover their tracks, or to have AI do the dirty work for them. There will also be new tools that streamline the investigative process for detecting fraud and reduce the number or length of the steps involved in bringing suits on behalf of the U.S. government with the assistance of FCA whistleblower attorneys and law firms.

It is pivotal to work with a False Claims Act whistleblower law firm, considering the rise of AI and a legal landscape shifting at breakneck speed. There are new tools at the disposal of both whistleblowers and investigators to combat fraud under the FCA, and with that opportunity for justice also comes an opportunity for perpetrators to evade detection and deceive others for their own greed. A False Claims Act whistleblower lawyer can help you understand how to maximize your odds of a successful case, file things properly so nothing comes back to bite you, protect your rights, and work to obtain a False Claims Act whistleblower reward for your cooperation. After all, how many chances do you have to do the right thing and potentially receive 15-30% of the recovered funds under the False Claims Act, with many of these cases ending in multimillion-dollar settlements? You don’t need AI to know that adds up!

How Artificial Intelligence is Used in Legal Practice | Bloomberg Law

New AI Training Requirement for Certain Federal Government Employees | Littler Mendelson P.C.

S. 2551 – 117th Congress (2021-2022): AI Training Act | Library of Congress