Can Algorithms Really Be Biased?

One should avoid passing judgment on those who create the algorithms.

I recently read a study in the journal Science that focused on racial bias found in popular algorithms used in the area of population health. I would encourage you to read the entire article to get a detailed understanding of the data, methods, analytics, findings, and conclusions, as this is not the forum for such a detailed discussion.

In general, the study authors found that these population health algorithms, which are used to allocate healthcare services to individuals within certain populations, are biased against black people. Their thesis is that these biases arise from health disparities that are conditional on the risk score calculated by the algorithm. The authors also opined that much of this occurred because the algorithm uses cost as the basis for predicting health needs. For example, they found that across every level of algorithm-predicted risk, “blacks and whites have (roughly) the same costs the following year.” The issue, then, is that the data indicate that, at any given level of predicted risk, the black patients were, in essence, sicker than the corresponding white patients for whom data were available.

Imagine, then, that you had two populations (population A and population B). Let’s say that population A contains patients who are significantly sicker than the patients in population B. One would expect the costs for the patients in population A also to be significantly higher than those in population B, but in this study, such was not the case. One can argue whether the algorithm predicts outcomes based on the correct variable, but in my experience, besides avoiding catastrophic health crises (which, by the way, can be very costly), looking to lower the cost of healthcare is not an unreasonable goal. In fact, it is likely pursued by every payer, including Medicare and Medicaid, and hence the growing interest in population health programs.

In this study, however, the problem was an inequity between the two populations (blacks and whites), wherein the costs were the same, but the severity of illness was not. The authors suggest that a more suitable approach might be to predict some measure of health. For example, the number of active chronic health conditions could be used, or a comorbidity score, which is commonly used in medical research. In this way, the algorithm would predict the need for healthcare services based on the severity of illness rather than on costs, which, according to the authors, “necessarily means being racially biased on health.” Why is that? Again, according to the researchers, poor patients face more barriers to healthcare access, and in this case, they are stating that socioeconomic status and race are closely correlated. In other words, the authors opine that black populations tend to be poorer, and therefore fall into the group with greater access challenges. These challenges include geography, transportation issues, competing demands from jobs and/or childcare, and even just the basic knowledge that a certain health condition may require medical care.
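The mechanical point here can be illustrated with a small sketch. The numbers and patient records below are entirely invented for illustration (they are not from the study): two groups have identical annual costs, but one carries more active chronic conditions. A risk score that ranks patients by cost treats the groups identically, while one that ranks by chronic-condition count flags the sicker group for extra services.

```python
# Hypothetical illustration with synthetic data: equal costs across
# groups, but group B patients carry more chronic conditions.
# Tuples are (patient_id, group, annual_cost, chronic_conditions).
patients = [
    ("a1", "A", 5000, 1), ("a2", "A", 8000, 2), ("a3", "A", 12000, 2),
    ("b1", "B", 5000, 3), ("b2", "B", 8000, 4), ("b3", "B", 12000, 5),
]

def top_risk(patients, key, n=3):
    """Return the group labels of the n highest-risk patients under `key`."""
    ranked = sorted(patients, key=key, reverse=True)
    return [p[1] for p in ranked[:n]]

# Cost-based proxy: with identical cost distributions, the flagged
# patients are drawn from both groups.
by_cost = top_risk(patients, key=lambda p: p[2])

# Health-based label: the flagged patients come from the sicker group.
by_health = top_risk(patients, key=lambda p: p[3])

print("cost proxy flags groups:", by_cost)
print("health label flags groups:", by_health)
```

The only change between the two rankings is the label the "algorithm" optimizes for, which is the crux of the authors' argument about cost as a proxy for health.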

The authors also opined that some of these determinants depend on the racial concordance between patients and their primary care physicians. They referenced another study that found black providers were more aggressive about preventive care when the patient was black than white providers were. In essence, the authors are laying a foundation for bias among people, and then associating that bias with the human intelligence used to create these algorithms.

This type of algorithmic bias, as the authors duly point out, is not isolated to healthcare. Several years ago, I worked with a law enforcement agency to develop an algorithm that would predict when and where crimes (violent crimes, in particular) might occur. The results almost always identified neighborhoods with a lower socioeconomic status, and because that was correlated with race, the algorithm would predict these events in neighborhoods with a higher proportion of blacks and Hispanics. Now, there are many reasons that this type of algorithm might be biased, but they are not due to the algorithm, per se, but rather to the programming and the data that go into it.

For example, if, historically, the rate of violent crime was significantly higher in specific neighborhoods, then it would make sense that the algorithm would predict future events in those neighborhoods. One reason suggested was that there was a more anemic police presence in those neighborhoods, and as such, there was less of a disincentive to commit violent crimes. And this may very well be true, but it is an indictment of society and the data, not necessarily of those who create and develop those algorithms (or of the algorithms themselves).
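One way historical data perpetuates itself is through a feedback loop, which can be sketched with a toy simulation. Everything below is invented for illustration and has nothing to do with the actual model I worked on: two neighborhoods have the same underlying crime rate, but one starts with more recorded incidents, so it draws more patrols, which record more incidents, and so on.

```python
# Hypothetical feedback-loop sketch with invented numbers: predicted
# risk drives patrol allocation, and patrols in turn drive how much
# crime gets *recorded*, reinforcing the original prediction.

# The underlying (unobserved) crime rate is equal in both neighborhoods...
true_rate = {"north": 10.0, "south": 10.0}

# ...but the historical record starts out uneven.
recorded = {"north": 30.0, "south": 10.0}

for _ in range(5):
    total = sum(recorded.values())
    # Allocate patrols in proportion to recorded history.
    patrols = {n: recorded[n] / total for n in recorded}
    # More patrols mean a larger share of true crime is observed and logged.
    for n in recorded:
        recorded[n] += true_rate[n] * patrols[n]

# The initially over-recorded neighborhood remains the "hot spot,"
# even though the true rates never differed.
print(recorded)
```

The bias here lives in the recorded history and the loop that consumes it, not in the ranking logic itself, which is the distinction the paragraph above is drawing.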

The authors give credit-scoring, hiring, and retail algorithms as examples, all of which they claim are influenced by racial and gender biases. I can very well see where these findings should raise the discussion as to how these types of biases could be checked and even eliminated, but I would caution against passing judgment on those who create the algorithms. In this case, for example, a reasonable approach to controlling costs may have resulted in a biased outcome, but not because the developers are biased, nor because the data are biased, but rather because the system that produces the data driving these algorithms is biased.

And that’s the world according to Frank.


Frank Cohen

Frank Cohen is Senior Director of Analytics and Business Intelligence for VMG Health, LLC. He is a computational statistician with a focus on building risk-based audit models using predictive analytics and machine learning algorithms. He has participated in numerous studies and authored several books, including his latest, titled “Don’t Do Something, Just Stand There: A Primer for Evidence-based Practice.”
