AI-powered surveillance in China and the U.S.

L. Joseph Parker, MD
Tech
June 4, 2024

Today in China, if you cross the street where you are not supposed to, expect a ticket to arrive in the mail. Somehow, out of all the faces in China, an AI-monitored camera will pick out yours and report your infraction to the authorities. Lately, these fines have been arriving almost instantly by text message. This is not just about pedestrian safety, though that will be the excuse; it is about control. Nothing says Big Brother is watching better than a demonstration like this. And these tickets aren’t benign.

Your face will be plastered onto large LED screens for all to see and ridicule. Onlookers smugly tell themselves they are smarter than that, until the day they are running late or simply aren’t thinking, and their own ticket arrives. These tickets also feed China’s new social credit system. Announced in 2014, the system was designed to build trust in society by rewarding good behavior and punishing bad, combining financial credit with data from government agencies to promote state-sanctioned “moral behavior.”

A government’s definition of “moral behavior” may not be the same as yours. Just look at any totalitarian nation, past or present. Most followed flawed collectivist economics and collapsed, but China is far more practical. Recognizing the stupidity of centralized economies, it has embraced nearly free-market capitalism while keeping an iron grip on social and political control through mass purges and prosecutions, protecting its citizens from the very liberties that make life worthwhile.

Now, the American federal government seems convinced that it, too, knows what is right for everyone and is doing its best to catch up with China’s “progress” in this new arena. China’s AI and social credit databases and algorithms are developed by private companies contracted by the government. These systems track everything from income and purchases to travel and health care, feeding it all into a vast network that takes special notice of any political statement or activity not approved by the Chinese Communist Party.

This was less of a problem when human beings had to evaluate the data. Mountains of information do the state no good unless someone analyzes them intelligently, and neither the American nor the Chinese government could hire that many people without going broke. But then came AI, and everything changed. AI can process billions of data points in seconds, finding trends and patterns no human mind could see and solving problems that had stumped researchers for over fifty years, like protein folding.

DeepMind’s AlphaFold was able to correctly predict protein structures in a matter of days. But that was just a start; a newer approach developed by the researcher Mohammed AlQuraishi did the same in seconds. This is the power of the new neural network systems coming online, and it is how China can recognize a citizen crossing the street against the light. The problem is that these systems don’t produce an equation that can be checked for accuracy. Neural nets, like humans, can tell you what data went in but not exactly how they reached their determination. So, no one knows.
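To make the black-box point concrete, here is a minimal sketch, entirely my own illustration using synthetic data and the scikit-learn library, not anything from the systems discussed in this article: a simple linear model exposes coefficients a reviewer can audit, while even a small neural network offers only layers of weights that cannot be read back as a checkable equation.

```python
# Minimal sketch (illustrative only): interpretable model vs. opaque neural net.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))                  # four made-up input features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # the true rule uses only two of them

linear = LogisticRegression().fit(X, y)
print("linear coefficients:", linear.coef_)    # readable: shows which features drive the answer

net = MLPClassifier(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0).fit(X, y)
print("hidden-layer weight shapes:", [w.shape for w in net.coefs_])
# The network predicts just as well, but its hundreds of weights cannot be read
# back as an equation anyone can audit.
```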

All data made available to the system will be clustered, associated, transformed, weighted, averaged, and summed to create the neural network patterns that produce an answer. The problem is that poor quality or insufficient data can lead to incorrect generalization through underfitting or overfitting, causing AIs to perform well on training data but poorly on real-world information. Overfitting is when the model captures noise and outliers as if they were true patterns; underfitting is when the model fails to capture the true underlying patterns at all.
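A small illustration of the difference, again a synthetic example of my own rather than anything from the systems discussed here: the same noisy data fit with a model that is too simple, one that is about right, and one that is far too flexible.

```python
# Synthetic underfitting vs. overfitting demo (illustrative only).
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(80, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.3, size=80)  # true pattern plus noise

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

for degree in (1, 4, 25):  # underfit, about right, overfit
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression()).fit(X_tr, y_tr)
    print(f"degree {degree:2d}  "
          f"train MSE: {mean_squared_error(y_tr, model.predict(X_tr)):.3f}  "
          f"test MSE: {mean_squared_error(y_te, model.predict(X_te)):.3f}")
# Degree 1 is poor on both sets (underfitting); degree 25 looks excellent on the
# training set but degrades on the held-out set (overfitting).
```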

That is why AI facial recognition systems trained mostly on white faces have a much higher failure rate on black ones. Just like people, they overgeneralize and miss important details. For these and many other reasons, it is necessary to use cross-validation and bias-variance tradeoff measures to ensure accuracy. Fabricated AI output is called a hallucination, as several attorneys discovered when they filed briefs drafted by AI. AI is an amazing tool that will revolutionize medicine and life in general. But it threatens both when we fail to validate.
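The facial recognition failure has a simple statistical shape. The following is a hedged sketch of my own construction on synthetic data, not figures from any real system: a classifier trained almost entirely on one group can look accurate overall while failing far more often on the group it rarely saw.

```python
# Synthetic sketch: imbalanced training data produces unequal error rates.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)

def make_group(n, shift):
    """Simulated group whose true decision boundary sits at a different place."""
    X = rng.normal(loc=shift, size=(n, 5))
    y = (X[:, 0] + X[:, 1] > 2 * shift).astype(int)
    return X, y

Xa, ya = make_group(950, shift=0.0)   # well-represented group
Xb, yb = make_group(50, shift=1.5)    # under-represented group
model = LogisticRegression(max_iter=1000).fit(np.vstack([Xa, Xb]), np.concatenate([ya, yb]))

for name, shift in [("group A", 0.0), ("group B", 1.5)]:
    X_test, y_test = make_group(2000, shift)
    print(name, "accuracy:", round(model.score(X_test, y_test), 3))
# Overall accuracy looks fine because group A dominates; group B is misclassified
# far more often, because the model never really learned its pattern.
```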

The FDA recognizes that software used for medical purposes is a “device” that must pass the rigorous validation and accuracy testing of any machine. Software that is used only for administrative support, like medical billing or claims processing, and does not influence diagnosis or treatment is called Class I, or Low Risk, and can be exempt from 510(k) premarket notification, skipping FDA review and requiring no real testing. Class II software, on the other hand, is considered Moderate Risk but still cannot be involved in life-determining decisions.

Patient management and tracking software falls into this category, like electronic health records and image-analysis software that can assist in clinical decision-making, but the ultimate decision is always up to the provider. Class II must still undergo performance testing and post-market surveillance. Class III software is involved in critical medical decision-making and must meet the stringent requirements of premarket testing, validation, and approval because it directly influences treatment decisions.
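To keep the three tiers straight, here is a rough organizational sketch of the scheme as described above. It is only a summary aid of my own; actual FDA classification turns on detailed regulatory criteria, not a lookup table.

```python
# Rough summary of the risk tiers described above (not the FDA's actual decision logic).
from enum import Enum

class FdaSoftwareClass(Enum):
    CLASS_I = "Low Risk: administrative support only (billing, claims processing)"
    CLASS_II = "Moderate Risk: informs care, but the clinician makes the final call"
    CLASS_III = "High Risk: directly drives critical treatment decisions"

def oversight_described(tier: FdaSoftwareClass) -> str:
    """Return the level of review described above for each tier."""
    if tier is FdaSoftwareClass.CLASS_I:
        return "may be exempt from 510(k) premarket notification"
    if tier is FdaSoftwareClass.CLASS_II:
        return "performance testing plus post-market surveillance"
    return "premarket testing, validation, and approval before use"

for tier in FdaSoftwareClass:
    print(f"{tier.name}: {tier.value} -> {oversight_described(tier)}")
```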

This ensures everyone’s health and safety, as it would be madness to let an untested algorithm loose on patients.  Who wants their life to be in the hands of a hallucinating AI? Life and death decisions are influenced by these systems, and they are supposed to undergo open algorithm testing and validation through clinical trials before use, with post-market surveillance and analysis. And everyone MUST be able to see the testing data used.  That’s because if you give an AI data, it WILL use that data in its decisions, no matter what.

If you tell an AI someone’s race, that becomes a factor it considers. If it’s told that liquid assets equal criminal behavior, it believes it. And unlike humans, who are imperfectly biased, it will be perfectly biased. Because of biases endemic in American policing, almost all black doctors have a relative with a felony conviction; this will be tabulated against them. The fact that persecuted minorities, whether black Americans, Muslims in India, or Jews from anywhere, don’t always trust banks and often keep cash on hand is tabulated against them too.
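The mechanism is mundane. In this synthetic sketch of my own, not taken from any real system, a protected attribute is included as a feature and the historical labels are biased against one group; the fitted model dutifully puts a large weight on that attribute and will reproduce the bias on every new case it scores.

```python
# Synthetic sketch: a model trained on biased labels learns to use the group attribute.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
n = 5000
behavior = rng.normal(size=n)                    # the thing we actually care about
group = rng.integers(0, 2, size=n)               # protected attribute (0 or 1)
# Biased historical labels: group 1 was flagged far more often for the same behavior.
flagged = (behavior + 1.2 * group + rng.normal(scale=0.5, size=n) > 1.0).astype(int)

X = np.column_stack([behavior, group])
model = LogisticRegression().fit(X, flagged)
print("weight on behavior:", round(model.coef_[0][0], 2))
print("weight on group:   ", round(model.coef_[0][1], 2))
# The weight on the group attribute is large and positive: the model has absorbed
# the bias in its training labels and will apply it to every new case it scores.
```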

When the U.S. Justice Department decided to develop an AI system to comb through health care data in the United States, it did not properly validate the results and did not go through the proper FDA process, yet the system’s determinations are being used to target, prosecute, and destroy any doctor who dares to ignore its dictates. It does this in a biased and racist way because, like a child, it wasn’t taught not to. These black-box AIs are destroying American medicine by targeting the few physicians willing to treat pain and addiction.

Right now, doctors and pharmacists are typing patients’ names into a computer system and deciding whether or not to prescribe or dispense medications based on the scores these proprietary black-box systems generate. These algorithms are used by the DEA and in many states to reduce a human being’s life to a simple metric. Are you worthy of effective medical treatment? Or should we put you in the “drug-seeking” category, denying you treatment because the box says it’s not good for you? How does this work?

Let’s have an imaginary discussion with one of these black box programs, shall we? A few simple questions will make the point.

Are you poor? Statistically, poor people are more likely to sell some of their pain medications to get by than wealthy people.  You look poor, so for your own good, we better not give you controlled medications.  Try some yoga or acupuncture.

Are you a woman who has suffered a trauma?  Studies show that you have been sensitized to pain and, therefore, feel it at a higher intensity than non-traumatized people.  Some might think that means you need MORE effective treatment, but not the “expert“ who fed me the data.  He thought women couldn’t be trusted. Try some counseling for your trigeminal neuralgia.

What about men? Oh, I wasn’t fed that data.  The men who did that study didn’t seem to draw the same conclusion when it came to males. I’m not sure why. I don’t have access to that information.

By the way, are you black? Statistically, black people are only 12.8% of the population, but they make up one-third of all drug arrests, so I need to know. Please stand next to that Sherwin-Williams color swatch. Looking a bit dark there today. So, that’s a big ‘no.’

It’s for the best, really. Or so I’m trained.

This fictionalized scenario plays out every day now in clinics and pharmacies all over America.

Now, you can say that no one “forces” doctors or pharmacies to prescribe or dispense controlled medications based on these algorithms, that they are just used to inform health care professionals so they can make better decisions. I would agree with you if state and federal law enforcement were not forcing doctors and pharmacists to follow this “guidance” or face prosecution. As long as these algorithms are used against doctors in court, health care in America is being dictated by the federal government.

This leaves health care providers with a simple choice: do what you think is best for the patient and risk prosecution and prison, or simply say no. There’s no downside for the provider who says no; no one has ever been prosecuted for a patient’s death because effective pain or addiction treatment was NOT prescribed or dispensed. As long as these false metrics are used in court against health care providers, we must choose between being healers dedicated to our patients or agents of the state, enforcing its “morality.”

They do pay some medical shill to agree with their arguments in court, but this proves nothing. Our black-box AI says you’re bad, and this guy we paid $100,000 says you’re bad, so off to prison. There was no premarket notification, no demonstration that the software is accurate in the real world, no post-market surveillance, no validation of safety and effectiveness through clinical trials, and none of the comprehensive documentation required by federal law. In short, the use of this AI is itself a criminal act. But what can we do about it?

There are few governments on this Earth less willing to admit fault than the Americans.  The closest recent parallel would be the Soviet Union, but that nation’s admission that innocence did not matter at least put citizens on notice.  “You have no rights, and your freedom exists at the will of the state“ was common knowledge.  Most Americans still believe the lie that “this can’t happen here” or “they can’t do that to someone.”  The truth is that while they shouldn’t, they clearly do, and so far, no one has been able to stop them.

Alan Bates was a subpostmaster in the United Kingdom who spent two decades fighting the wrongful prosecutions of his colleagues over shortfalls invented by the Post Office’s faulty Horizon computer system, and he is my hero. I think he should be the model that all pain patients, addiction patients, health care providers, and pharmacists follow as we seek justice in the federal government’s war to gain control over the practice of medicine in America. We must not let soulless, unvetted algorithms or bureaucrats bending to every political wind take away our right to treat every patient as an individual and a valued member of our society. The odds are long, but we must at least try.

We can file amicus briefs with appellate courts when a doctor is wrongfully prosecuted and convicted. We can reach out to legislators to get protection from political interference in medical practice enshrined into law. We can use the Federal Food, Drug, and Cosmetic Act, the Americans with Disabilities Act, and 42 U.S.C. § 1395 as weapons to fight back. We can, under the Constitution, petition the government for a redress of grievances. And we must, if we ever hope to regain our rights.

L. Joseph Parker is a distinguished professional with a diverse and accomplished career spanning the fields of science, military service, and medical practice. He currently serves as the chief science officer and operations officer of Advanced Research Concepts LLC, a pioneering company dedicated to propelling humanity into the realms of space exploration. At Advanced Research Concepts LLC, Dr. Parker leads a team of experts committed to developing innovative solutions for the complex challenges of space travel, including space transportation, energy storage, radiation shielding, artificial gravity, and space-related medical issues.

He can be reached on LinkedIn and YouTube.
