There is a conversation happening in the medical-legal space right now, and most clinicians doing expert witness work are not part of it. It is about artificial intelligence (AI). Specifically about whether AI belongs anywhere near an expert witness report. I want to weigh in. Not as a technology enthusiast and not as a skeptic. As someone who has spent a decade as a practicing physician assistant, earned a law degree, and then spent another decade running a medical-legal firm, training clinical experts, and reviewing the reports attorneys trusted most. Here is what I actually think.
The case against AI in expert witness opinions
Let me start here because this is where the conversation needs to begin. An expert witness report is a legal document. It carries your name, your license, your professional credibility, and your sworn opinion. It can be subpoenaed. It can be used against you in cross-examination. It can define your reputation in the legal community for years.
No AI tool can carry that responsibility. No AI tool has clinical judgment. No AI tool has professional licensure. No AI tool can be held accountable when an opinion is challenged under Daubert or torn apart on the stand. If you are using AI to generate your expert opinions, you are not doing expert witness work. You are doing something else entirely, and it will catch up with you. The legal community is watching this space closely, and the consequences of AI-generated opinions being discovered in submitted reports are significant for your credibility, your relationship with retaining counsel, and potentially your license. This is not a gray area. AI does not belong in the opinion itself.
Where AI actually has a role
Here is where the conversation gets more nuanced and where I think the medical-legal space is missing an opportunity. AI in expert witness work is not inherently problematic. The problem is misapplication. There is a meaningful difference between using AI to generate an opinion and using AI to practice the skills required to write one. The first is dangerous. The second is exactly what simulation-based learning has always done in medicine.
Think about how clinicians learn procedural skills. You do not perform your first intubation on a real patient. You practice on a mannequin. You run simulations, get feedback, and build the muscle memory and mental framework before real stakes are involved. Expert report writing is no different. It is a skill. A learnable, trainable, practicable skill. And the gap between clinical writing and legal writing is wider than most clinicians expect. Clinicians are trained to write subjective, objective, assessment, and plan (SOAP) notes: conclusion-first and outcome-focused. Legal writing follows frameworks like issue, rule, analysis, and conclusion (IRAC) or conclusion, rule, explanation, application, and conclusion (CREAC): reasoning-first, with the conclusion as the destination. That gap is significant enough that most clinicians struggle with it for years before they figure it out on their own, if they ever do.
If you are lucky, you might have an attorney who provides some guidance and feedback, but that is a rarity. Attorneys are much too busy to train you on how to write. They hired you to help save time, not spend it teaching you how to do your job. AI-powered practice environments, when built responsibly and used appropriately, can close that gap faster than any other method available. Not by writing the opinion. By helping the clinician practice writing it themselves, with feedback calibrated to legal standards rather than clinical ones.
The strengths of AI in training contexts
When used as a practice tool rather than a production tool, AI brings genuine advantages to expert witness training:
- It is available on demand. A clinician working a night shift cannot call their mentor at 2 a.m. for feedback on a practice draft. An AI training tool can respond immediately, consistently, and without scheduling constraints.
- It is infinitely patient. Learning legal reasoning takes repetition. Writing the same type of opinion paragraph 20 times, with feedback each time, is how the brain flip happens: the moment your clinical brain finally understands what legal writing actually requires. AI makes that repetition accessible in a way human instruction alone cannot.
- It is consistent. Human feedback varies. A well-built AI training tool applies the same standard every time, which matters when you are trying to internalize a framework rather than just satisfy a reviewer. It can also scan a practice draft for errors and flag issues within seconds, far faster than any human reviewer.
The weaknesses and risks
None of this means AI in this space is without risk. The weaknesses are real and clinicians need to understand them:
- AI makes errors. Legal standards are jurisdiction-specific, case-specific, and evolving. An AI tool trained on general legal principles may miss nuances that matter in a specific context. Everything an AI training tool produces should be verified independently before it influences any real work product. As with any assistance, human or AI, check with your retaining attorney on the jurisdiction-specific requirements that apply. That is best practice across the board.
- AI cannot assess scope. Whether a particular opinion falls within your clinical expertise is a judgment call that requires understanding your credentials, your experience, and the specific demands of the case. AI cannot make that determination for you.
- AI is not a substitute for real instruction. A training tool is exactly that, a tool. The framework, the standards, the attorney-informed perspective that makes expert writing effective has to come from somewhere credible. AI amplifies instruction; it does not replace it.
- AI creates a false confidence risk. The danger of any simulation is that it can make you feel more prepared than you are. Clinicians using AI training tools need to maintain humility about the gap between practice performance and real-world performance under cross-examination.
What this means for clinicians entering expert witness work
At least we know we are in a line of work that will not be replaced any time soon by AI. The clinicians who will thrive in this space over the next decade are not the ones who refuse to engage with AI out of principle. Nor are they the ones who outsource their thinking to it. We must not forget how to think for ourselves.
They are the ones who understand the distinction between AI as a production tool and AI as a practice tool, who use technology to accelerate their learning without compromising their integrity, and who approach every report with their own clinical judgment, their own reasoning, and their own name on the line.
The standard in expert witness work has always been and will always be yours to own. What changes is how efficiently you can reach it. AI, used correctly, can help you get there faster. Used incorrectly, it can end a career before it starts. Know the difference.
Tracy Liberatore is the founder of the National Expert Academy and a pioneer of the C.L.E.A.R. Method, an expert report-writing framework developed within a working medical-legal firm. A former physician assistant with ten years of clinical practice, she later earned her law degree with valedictorian honors and spent the next decade founding and operating Med Legal Pro, where she trained and placed clinical experts for attorneys handling medical malpractice, personal injury, and nursing home negligence cases nationwide.
She is the author of From Medicine to Law: Creating Winning Legal Teams with Medical Expertise and the host of the Statutes & Stethoscopes podcast. Her writing includes work on why accomplished clinicians may struggle in expert witness roles, the future of medical legal consulting, medical errors, and attorney concerns about medical records. Her work has been featured in USA Today, MSN, and CEO Feature.
Tracy now trains licensed clinicians to write expert reports attorneys trust, request, and refer. She shares insights through her personal LinkedIn, as well as the platforms for National Expert Academy and Med Legal Pro.