Is there an arms race in the use of technology for submitting claims and for auditing them? In one example, an electronic health record (EHR) system was deployed in a way that increased reimbursements. It did so by inserting default templates into patient charts, which allowed physicians and others to copy information forward from existing patient records, assuming, of course, that the information was more or less the same.
This would save a great deal of time, particularly for specialists who routinely handle the same type of case day after day.
Research found that this significantly increased the reimbursements that healthcare providers were able to obtain.
The Centers for Medicare & Medicaid Services (CMS), however, soon caught on and modified its Recovery Audit Program to use information systems to identify default templates, copied-and-pasted data, and cloned records. In this way, more than $1 billion in Medicare reimbursements was recovered.
It is difficult to know whether the claims themselves were faulty, or whether the recoveries were based simply on the fact that some data had been replicated or cloned, even though that data was accurate for the patient.
The problem, of course, is that even cloned data can be perfectly accurate, particularly when one patient's problems are more or less identical to another's.[1]
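To make the detection side of this concrete, the following is a minimal, purely illustrative sketch of how an audit system might flag copied-and-pasted or cloned documentation by measuring text similarity between chart notes. The data shapes, the similarity measure, and the 0.95 threshold are assumptions for illustration, not a description of the Recovery Audit Program's actual tooling.

```python
# Illustrative sketch only: flag "cloned" documentation by measuring how
# similar a new chart note is to prior notes. Threshold and data shapes
# are assumptions, not CMS's actual method.
from difflib import SequenceMatcher

CLONE_THRESHOLD = 0.95  # assumed cutoff for "near-identical" notes


def similarity(a: str, b: str) -> float:
    """Return a 0..1 similarity ratio between two chart notes."""
    return SequenceMatcher(None, a, b).ratio()


def flag_possible_clones(new_note: str, prior_notes: list[str]) -> list[int]:
    """Return indices of prior notes that appear copied into the new note."""
    return [
        i for i, prior in enumerate(prior_notes)
        if similarity(new_note, prior) >= CLONE_THRESHOLD
    ]


if __name__ == "__main__":
    prior = [
        "Patient reports stable knee pain. Exam unchanged. Continue PT.",
        "Acute sinusitis. Prescribed amoxicillin 500 mg TID for 10 days.",
    ]
    new = "Patient reports stable knee pain. Exam unchanged. Continue PT."
    print(flag_possible_clones(new, prior))  # -> [0]: near-identical note flagged
```

As the surrounding discussion points out, a flag of this kind says nothing about whether the repeated text is clinically accurate; it only detects repetition.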
Double Standard in Algorithms
It is too early to say for certain, but this case appears to indicate a double standard emerging in the use of machine learning and artificial intelligence. The lesson is that if the technology is used for the benefit of the provider, in a way that may increase the reimbursement they are able to obtain, it is assumed to be invalid.
On the other hand, if machine learning and artificial intelligence are used to audit or crack down on the healthcare provider and take away their revenue, they are presumed to be valid. There is, in general, no way to correct such an implicit bias in the overall auditing framework.
The AI Arms Race in Auditing
There is certainly an arms race between competing technologies. One army is operated by the healthcare providers, the other by the auditors. The two compete against each other constantly, each employing the most advanced technologies available to optimize its position.
For the healthcare providers, the goal is to be paid equitably for all of the services they provide. For the payers, usually the government or insurance companies, the goal is to pay as little as possible and to ensure that nothing considered fraud, waste, or abuse is paid for at all.
Each innovation on one side produces a counter-innovation on the other. As in the arms race in strategic thermonuclear weapons, we can expect that some type of arms control treaty will eventually become necessary.
What would this look like? In the case of Medicare fraud, the only acceptable outcome is one in which no false claim is accepted and paid.
The only way to reach such an outcome is to direct research and development (R&D) toward real-time screening systems that operate as the claim is being entered. If extra information is needed, the provider is told immediately and given the chance to supply it. If the information is not clear enough, the same happens. The system might “hold” a claim for a minimum period, such as 45 days, giving the provider time to enter the required supplemental information. Over time, the number of improper claims should fall to practically zero.
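As a rough illustration of this real-time screening idea, here is a minimal sketch under assumed rules: a claim missing required fields is held for up to 45 days while the provider supplies the missing information, and complete claims pass straight through. The field names, the required-field list, and the statuses are hypothetical.

```python
# Minimal sketch of entry-time claim screening with a 45-day hold window.
# Required fields and statuses are hypothetical, for illustration only.
from dataclasses import dataclass, field
from datetime import date, timedelta
from enum import Enum
from typing import Optional

HOLD_DAYS = 45  # minimum hold period suggested in the text
REQUIRED_FIELDS = ("patient_id", "procedure_code", "diagnosis_code", "notes")  # assumed


class Status(Enum):
    ACCEPTED = "accepted"
    HELD = "held"          # waiting for supplemental information
    REJECTED = "rejected"  # hold period expired without the needed information


@dataclass
class Claim:
    data: dict
    status: Status = Status.HELD
    hold_until: Optional[date] = None
    missing: list = field(default_factory=list)


def screen(claim: Claim, today: date) -> Claim:
    """Screen a claim as it is entered, telling the provider immediately what is missing."""
    claim.missing = [f for f in REQUIRED_FIELDS if not claim.data.get(f)]
    if not claim.missing:
        claim.status = Status.ACCEPTED
    elif claim.hold_until is None:
        claim.status = Status.HELD
        claim.hold_until = today + timedelta(days=HOLD_DAYS)
    elif today > claim.hold_until:
        claim.status = Status.REJECTED
    return claim


if __name__ == "__main__":
    c = Claim(data={"patient_id": "A123", "procedure_code": "99213", "diagnosis_code": ""})
    c = screen(c, date(2024, 1, 2))
    print(c.status, c.missing, c.hold_until)  # held; missing diagnosis_code and notes
    c.data.update(diagnosis_code="J01.90", notes="Acute sinusitis, documented exam.")
    print(screen(c, date(2024, 1, 10)).status)  # accepted once supplemented
```

The design point worth noting is that the screening happens at entry time rather than after payment, which is what would allow improper claims to be corrected before any money changes hands or any audit is needed.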
In this way, artificial intelligence and machine learning could eliminate the auditing profession altogether. That, however, may be far in the future.
Summary
Artificial intelligence is now commonplace in the auditing of Medicare healthcare claims. It is part of a technology war between providers and auditors to increase or decrease revenues, depending on which side one is on. Despite the warning signs, investment in this area continues to accelerate, without intervention or regulation.
In an industry so concerned with standards and detailed criteria for decisions, it is astonishing that there are no standards for the quality of AI and no way of verifying whether the software is good or horrible.
The healthcare provider is left having to mount legal arguments against something it cannot understand, and against decisions for which there is no record of how they were made.
[1] Ganju, Kartik K., Hilal Atasoy, and Paul A. Pavlou. “Do Electronic Health Record Systems Increase Medicare Reimbursements? The Moderating Effect of the Recovery Audit Program.” Management Science 68, no. 4 (2022): 2889–2913. https://dl.acm.org/doi/abs/10.1287/mnsc.2021.4002