Artificial intelligence (AI) is used in more and more applications, including medicine. Hospitals around the country, including in Florida, are using AI in many diagnostic tools. But while the expectation is that AI can speed up diagnosis and the delivery of care, a number of studies have recently demonstrated that AI can in fact worsen bias, miss serious illness, and deny care to those in need. If your doctor or medical system used AI to make diagnostic decisions and you were hurt, do you have a medical malpractice case? We can help determine your options.
What we call AI is really a complex set of mathematical instructions, called an algorithm, that tells a computer what to do. This allows high-speed automation of tasks that would take a human much longer to complete. AI is said to “learn” by being fed large amounts of data from which it recognizes patterns for completing its task. In the case of medical diagnostic tools, the AI is fed massive amounts of patient data and draws its conclusions from that.
However, as the old saying goes, “garbage in, garbage out.” If the AI is fed poor data, it will draw poor conclusions. If the algorithm itself is faulty, it will draw incorrect conclusions.
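For readers curious how “garbage in, garbage out” actually plays out, here is a deliberately simplified sketch (not any real medical system, and the records are invented): a tiny “algorithm” that learns from historical examples. If one group was historically under-flagged for follow-up care, the program faithfully learns that skew and repeats it.

```python
# Toy illustration of "garbage in, garbage out" (all data is hypothetical).
# The "algorithm" simply learns how often each group was flagged for
# follow-up care in the past -- so historical under-flagging of Group B
# becomes a learned rule going forward.

# Hypothetical training records: (group, was_flagged_for_follow_up)
records = [
    ("A", True), ("A", True), ("A", True), ("A", False),
    ("B", True), ("B", False), ("B", False), ("B", False),
]

def learn_flag_rates(records):
    """'Learn' the historical flagging rate for each group."""
    rates = {}
    for group in {g for g, _ in records}:
        outcomes = [flagged for g, flagged in records if g == group]
        rates[group] = sum(outcomes) / len(outcomes)
    return rates

model = learn_flag_rates(records)
print(model)  # Group B inherits the historical skew: {'A': 0.75, 'B': 0.25}
```

The program did exactly what it was told; the flaw was in the data it was given. At scale, with thousands of variables, that inherited skew is far harder to notice.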
Unfortunately, such errors are often discovered only after patients have been harmed, because humans cannot easily spot the mistakes as they happen.
Examples of AI failures that hurt patients
A 2022 study found that AI systems examining medical imaging were misdiagnosing patients based on socioeconomic status, with the errors compounding for under-served subpopulations at the intersection of those categories, such as Hispanic female patients. Rather than lessening human bias, the AI was actually amplifying it, due either to skewed data or poor programming.
A 2019 study of one widely used algorithm employed in healthcare decisions found a bias against Black patients relative to white patients because the algorithm used healthcare expenditures as a stand-in for health. Since Black patients often have less money to spend relative to their needs, the algorithm concluded they were healthier than equally sick white patients. When the bias was corrected, the share of Black patients flagged to receive additional care rose from 17.7% to 46.5%.
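The mechanism of that proxy error can be shown with a toy sketch (the patients and numbers below are invented, not from the study): when an algorithm ranks patients by past spending instead of actual need, a patient with equal need but less access to care spends less, looks “healthier,” and is screened out of extra care.

```python
# Toy sketch of proxy bias (all values hypothetical): ranking patients by
# past healthcare SPENDING as a stand-in for health NEED. A patient with
# the same need but less access to care spends less, so the algorithm
# wrongly treats them as healthier.

patients = [
    {"name": "P1", "need": 9, "spending": 9000},  # high need, more access
    {"name": "P2", "need": 9, "spending": 4000},  # same need, less access
    {"name": "P3", "need": 3, "spending": 5000},  # lower need
]

def select_for_extra_care(patients, slots):
    """Pick the top `slots` patients ranked by the spending proxy."""
    ranked = sorted(patients, key=lambda p: p["spending"], reverse=True)
    return [p["name"] for p in ranked[:slots]]

print(select_for_extra_care(patients, 2))  # ['P1', 'P3'] -- P2's equal need is missed
```

Ranking on need directly would have selected P1 and P2; the proxy quietly substitutes access to money for sickness, which is exactly the flaw the 2019 study identified.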
In 2016, Arkansas instituted an AI program to allocate in-home care for its Medicaid patients, intended to eliminate arbitrary decisions and favoritism in the existing human assessment system. The result, however, was a sudden gutting of care for severely disabled people, driven by the system’s cold and inaccurate calculations.
In a rather alarming discovery, a computer model used to analyze medical imaging was able to predict the self-reported race of patients with greater than 90% accuracy, even though no race information was fed to it. Doctors and computer programmers have not yet been able to determine what factors are fueling this ability. The concern is that, as in the other cases, a bias could be “learned” by the program without doctors ever recognizing it.
The danger of misdiagnosis
While computers can help doctors care for their patients, the tools must not take the place of the expertise of a trained physician. Doctors must strengthen their own diagnostic abilities and hone their skills of observation, deduction, and intuition. A good doctor will ask the right questions, note any peculiarities, and look deeper than the cold calculation of a computer program.
Many people have been harmed when medical professionals depend too heavily on imperfect technological tools. If you believe you have been hurt by the negligence of a medical practitioner or medical institution, call me. As a Florida personal injury attorney, I believe in fighting for the rights of the little guy against big business, insurance companies, and medical institutions that have the financial resources to squash your efforts to recover damages due to their negligence or faulty products. I have a proven track record of winning substantial compensation for my clients. Contact me at (954) 448-7288, 24/7 from anywhere in Florida for a free consultation to see how I can help you.