A veteran came into my clinic with quiet frustration in his eyes. He was 100 percent service-connected and medically retired, struggling to survive in San Diego. He handed me a DMV disability form—something that could save him hundreds of dollars in car registration fees. But another VA doctor had already refused to sign it.
The form asked whether his disability caused “an impaired gait.” The phrasing was vague. He had chronic back pain and limited mobility from service-related injuries, but he didn’t use a cane or walker. He also had post-traumatic stress disorder. His last doctor decided he didn’t qualify.
Now I had to decide.
I could see what the form meant, but also what it did. Bureaucratic ambiguity like this often alienates the very veterans we’re supposed to serve. When we split hairs, they walk away feeling dismissed and disrespected.
And there’s more to gait than muscle and bone. PTSD affects balance, coordination, and motivation to move. Anxiety can change posture. Hypervigilance can make walking in public physically exhausting. This veteran didn’t need a walker to be mobility-impaired—he was already living with the daily friction of chronic pain and mental illness.
I didn’t have time to comb through policy manuals or call another colleague. I needed help—fast. So I turned to ChatGPT.
I typed in the form’s language and the patient’s diagnoses. In seconds, the AI returned a clear, structured explanation of how “impaired gait” might be interpreted, including clinical examples and reference points. It didn’t tell me what to do, but it gave me language, clarity, and confidence. It fit seamlessly into my workflow: no disruption, no delays, just a quick tool to help me think clearly in a tight spot.
I documented my reasoning and signed the form.
A week later, the veteran sent me a secure message:
“Thank you for taking the time to understand. You didn’t just save me money—you made me feel like I mattered.”
This wasn’t about AI replacing my clinical judgment. It was about using AI—quietly, unobtrusively—to help me cut through red tape and advocate for someone who needed support.
We talk a lot about the promise of AI in health care. Here’s a case where it delivered. No algorithm made the decision. But in a jam, with a frustrated veteran in front of me and a vague government form in my hand, AI helped me do the right thing quickly, confidently, and compassionately.
David Bittleman is an internal medicine physician.