Introduction: a double-edged disruptor
Artificial intelligence (AI) has quickly insinuated itself into nearly every corner of modern life, and health care is no exception. With the rise of advanced chatbots, symptom checkers, and health-focused algorithms, patients now have 24/7 access to vast medical knowledge at their fingertips. The excitement is understandable: AI can demystify medical jargon or "doctor-speak," suggest possible diagnoses and treatment options, and empower patients to become more engaged, or "activated," in their care: participating with confidence, managing their own health, and taking more personal responsibility for following the treatment plan prescribed by their health care professionals.
As with any disruptive tool, there are both opportunities and pitfalls. Used wisely, AI can be a valuable assistant on your health journey. Used incorrectly or unwisely, it can be misleading or, worse, dangerous. The key is balance: taking advantage of the benefits without ignoring the downside risks.
The upside: What AI can do for patients
- Easy access to medical information 24/7, without waiting for appointments.
- Health literacy: translates medical jargon into understandable language.
- Decision support: helps patients generate better questions and identify red flags.
- Convenience and potential cost savings by triaging minor concerns at home (similar in many respects to existing nurse advice lines).
- Personalization through reminders, lifestyle suggestions, and chronic disease support.
- Record organization, including summaries of test results and visit notes (note: most of these features are already available in EMRs, where used).
- Educational role in prevention, offering evidence-based lifestyle strategies.
The downside: What AI cannot (and should not) do
- Risk of misdiagnosis if symptoms are oversimplified or context is missing.
- Delays in professional care when patients over-rely on AI.
- False confidence in AI outputs, which cannot replace exams or lab testing.
- Data privacy and security concerns, with sensitive health data potentially stored or shared.
- Bias in algorithms, which may perform poorly for underrepresented populations.
- No accountability if something goes wrong.
- Anxiety from long differential lists, sometimes highlighting rare diseases.
- Inadequacy in complex cases requiring nuanced judgment or physical exam.
Best practices: How to use AI safely in your health care
Given all of this information, how is a patient to know how to use AI safely for personal well-being and, more importantly, health care? The concept of "best practices" must be applied here as it is in many other industries: finance, cybersecurity, manufacturing, engineering, aviation, transportation, construction, and architecture. A best practices approach is the only way to use AI in health care that protects patient safety, provides consistent, reliable, and useful support, and keeps AI as an adjunct to, not a replacement for, professional care. Articulated below is an AI best practices outline for consideration:
- Use AI for education, NOT diagnosis: It is excellent for preparing questions, looking up/clarifying medical terms, and exploring general guidelines — but it is not your doctor.
- Know the red flags that always require care: chest pain, shortness of breath, neurological deficits, severe pain, heavy and/or continuous bleeding, allergic reactions, rapidly worsening symptoms, and emergencies in infants or pregnancy. All of these mean: skip AI and call 911 or the appropriate resource for help.
- Provide full context in your queries: age, sex, relevant medical history (especially conditions involving your heart, lungs, kidneys, or liver), medications, allergies, exposures, and objective numbers (like blood pressure or glucose readings). Complete context makes outputs more useful.
- Demand probabilities, not certainties: Ask for several possible explanations, what could make each more or less likely, and which are dangerous to miss.
- Verify with authoritative sources: Always cross-check AI answers with NIH, CDC, specialty societies, or your own clinician.
- Never change prescriptions or a treatment plan based on AI: Do not start, stop, or adjust medications without medical supervision.
- Protect your privacy: Limit personal identifiers when using AI tools, and be mindful of how your data may be stored or shared. Be aware that consumer AI tools are generally not HIPAA compliant.
- Recognize bias and blind spots: AI may miss conditions in women, children, older adults, and underrepresented groups.
- Use AI as a prep tool for your doctor visit: Summarize your concern, timeline, signs and symptoms, what makes them better/worse, treatments tried, and top questions.
- Trust your instincts: You know your body better than anyone. If something feels wrong, seek real care irrespective of what AI suggests.
In conclusion: an assistant or partner, but not a replacement
AI has enormous potential to empower patients, increase understanding, and streamline care, but it must be treated as an adjunct or partner tool, not a replacement for clinicians. Keep AI's limits in mind, verify its output against trusted sources, and follow best practices. AI is most useful for helping patients become more informed and engaged, while accountability remains with licensed professionals.
The future of health care is increasingly digital with EMRs, Google, and now AI — but safety, individual context, and human judgment will always remain the essential ingredients for optimal clinical outcomes.
AI safety and best practices: one-page guide for patients
When to use AI
- Education on conditions, terminology, and guidelines
- Preparing questions for your doctor
- Lifestyle and wellness information (diet, sleep, exercise)
When NOT to use AI
- To make or confirm a diagnosis
- To decide on urgent/emergency care
- To start, stop, or change prescription meds/doses
Red flag situations: Seek immediate care
- Chest pain, trouble breathing, new weakness/numbness/confusion
- Severe headache ("worst ever"), high fever with stiff neck
- Severe abdominal pain, heavy/continuous bleeding, allergic reaction
- Rapidly worsening symptoms, pregnancy concerns, infants with illness
- Bleeding that continues despite pressure, dressings, or ice
Best practices for safe use
- Give complete context (age, history, meds, allergies, exposures)
- Ask AI for probabilities, uncertainty, and danger signs
- Cross-check AI with NIH, CDC, Mayo Clinic, Cleveland Clinic, or specialty societies
- Protect privacy: Avoid sharing identifying information
- Do not self-medicate or change prescriptions based on AI advice
Practical tips
- Use AI as a prep tool for your doctor visit
- Keep a concise note with concern, timeline, questions, meds/allergies, key numbers
- Document symptoms and test results with dates
- Trust your instincts
Quick prompt template
"I am using this for education only. Give possible causes, danger signs, and questions to ask my clinician. I am a [age/sex] with [conditions/meds/allergies]. My symptoms started [when] and are [better/worse with]. What would make this urgent?"
Alan P. Feren is an otolaryngologist.