How AI Kills – Cigna’s Use of AI in Denying Insurance Claims
Cigna, a prominent insurance company in the healthcare space, is facing legal scrutiny over allegations of systematically denying medically necessary claims without human review, based solely on an AI's callous determination made without real human judgment. If you had a Cigna claim denied because of AI, the whistleblower law firm of Brown, LLC wants to hear about your experience. Please call our qui tam law firm at (877) 561-0000 so we can determine if we can assist. The consultation is free, and we’re only paid if we win your case.
Attached is a copy of a sample lawsuit against Cigna for using AI to deny healthcare claims. When lives are at stake, computers can’t be making the decisions. These decisions have monumental impacts on people’s lives and impose significant financial burdens on people who need healthcare and may have to pay for essential services out of pocket. Healthcare companies already have a reputation for denying claims to save money, but to do so without even a whisper of humanity when viewing the file is egregious. This practice of using AI in coverage decisions has raised questions about whether patient care is being compromised for financial gain. We’ll explore the allegations made in a legal case against Cigna and examine how the company’s use of an automated system, known as PxDx, is impacting insured individuals.
Understanding the Allegations
Cigna is accused of using artificial intelligence and technology to systematically deny insurance claims, some of which are for necessary medical treatments. This practice allegedly occurs without proper medical reviews, which may violate consumer protection laws at both the state and federal levels. As a result, these actions cause financial hardship for individuals, forcing them to pay for medical services that should have been covered by their insurance. Concurrently, Cigna is accused of benefiting financially from this process.
Cigna is said to have implemented an automated system called “procedure-to-diagnosis” or “PxDx” to further this alleged scheme. PxDx allows Cigna’s medical directors to automatically reject claims on supposed medical grounds, even without thoroughly examining the patient’s medical records. Consequently, patients receive unexpected bills for treatments they believed were covered, or that should have been covered.
The impact on those insured by Cigna is substantial. Within just over two months, Cigna’s medical directors reportedly gave automatic denials to over 300,000 payment requests. Shockingly, they spent an average of only 1.2 seconds reviewing each case, according to the allegations.
These practices are believed to be motivated by Cigna’s financial interests. By automatically denying more claims, even without strong justification, Cigna stands accused of unethically cutting corners to save money and increase its profits. This is particularly concerning because Cigna has a responsibility to act in the best interests of both the plaintiff and others in the same class: it should carefully interpret the terms of each insurance plan and review claims properly. Based on recent allegations, Cigna conceivably has not lived up to these responsibilities, nor provided the coverage required under its health plans.
The Nefarious Role of Technology with PxDx
PxDx is an automated system that adjudicates claims, frequently denying them without proper medical justification. The system flags claims that don’t match pre-approved conditions and denies them almost instantly, all in an effort to cut costs. The program’s originator allegedly aimed to save Cigna substantial sums of money through these automatic denials.
There were many instances where Cigna doctors using the PxDx system rejected claims in seconds without taking patient records or professional medical judgment into account. This raises concerns about the quality and thoroughness of the review process.
One striking aspect is the low rate of patient appeals. The complaint estimates that only about 5% of denied claims are appealed. This suggests that many insured individuals are not fully aware of their right to challenge these automatic denials, or face significant barriers to doing so.
Based on cost-benefit analysis, Cigna may consider adding more procedures to the automatic denial list. This emphasizes the potential conflict between financial interests and patient care.
The legal case against Cigna is multifaceted and includes claims of violation of the Connecticut Unfair Trade Practices Act (CUTPA) and the Connecticut Corrupt Organizations and Racketeering Activity Act (CORA). Additionally, the plaintiff alleges breach of contract, unjust enrichment, and breach of the implied covenant of good faith and fair dealing. This case is sure to implicate other state consumer protection statutes nationwide.
The legal case against Cigna sheds light on concerns related to the denial of insurance claims driven by AI systems, which ultimately affects patient care and financial well-being.
It’s essential to follow developments in this case, as it could have implications for how AI is used in the insurance industry and its impact on insured individuals. Insurance consumers need to be aware of their rights and explore options for recourse if they encounter unjust claim denials. You can read more about it here (link to the complaint).
If you’re aware of the denial of a claim based on AI with Cigna or elsewhere, call our whistleblower attorneys at (877) 561-0000 to learn your rights. The call is free and confidential, and the firm is only paid if we win your case. Read the full complaint here.