Much of the news we hear about primary care is disheartening at best, frightening at worst. Most of us have heard that there is a shortage of primary care doctors, and with over seven million new patients enrolled (and counting) through Obamacare health exchanges, that shortage is going to get much worse. And while medical schools are increasing their enrollment, there continues to be a dearth of students willing to enter the field of primary care. Who can blame them, when primary care doctors continue to earn half of what the average dermatologist or cardiologist does?
However, there is hopeful news. Last year, I went to the American Academy of Family Physicians’ (AAFP) annual conference in Kansas City, tasked with recruiting medical students to apply to Columbia’s family medicine residency program. I spoke with dozens of students who were incredibly passionate about primary care. At an AAFP conference, that’s no surprise. What was a surprise was that nearly all of them described surging interest in primary care at their medical schools. Indeed, AAFP student membership has soared to an all-time high of 26,000. And this year, Columbia’s family medicine residency program, along with most other primary care programs in the country, continued the recent trend of rising quantity and quality among its applicants.
But why the sudden interest in primary care? Are medical students suddenly becoming more selfless? Possible, but unlikely: medical students today remain as competitive as ever. More likely, the underlying dynamics in medicine are shifting, and more medical students are starting to see it. In fact, when I asked the students at AAFP why they were choosing primary care, the two most common reasons were not at all surprising: prestige and money. (Though they never used those exact words, at least not during interview season.)
Before discussing this recent rise in primary care, it’s important to understand its fall. It wasn’t all that long ago that one in two physicians in America was a primary care doctor. There were fewer specialties back then: emergency medicine was only recognized as a specialty in 1979, and the term “hospitalist” wasn’t coined until 1996. That meant there was more variety in primary care practice. One could see patients in clinic, pick up an ER shift, round in the hospital, and co-manage patients in the ICU. With fewer specialists, it was more common for generalists to manage interesting and complicated cases. Charting mainly involved quickly scribbling an illegible note on a clipboard and moving on to the next patient.
However, as medicine has become more complex, we have managed that complexity by creating more specialties and sub-specialties. My last attending in the ICU was a nephrocardiologist, a kidney doctor specializing in heart failure; I didn’t even know the field existed until last month. While this has meant better care for patients in many respects, it has also meant a serious decline in the breadth and vitality of practice for many primary care physicians.
Along with this narrowed scope of practice came cuts in pay. In 1982, the gap between primary care and specialist pay was only about $30,000 per year. At the time, the cost of health care was rising quickly, and there was no set method for how Medicare and insurance companies should pay doctors and hospitals. In general, a doctor generated a bill, a “usual and customary fee,” and Medicare paid it, whatever it was. That all changed in 1985 when, in an effort to rationalize the way we pay doctors, William Hsiao, a Harvard economist, began developing the resource-based relative value scale, which assigns each physician service a number of relative value units (RVUs). The scale attempted to determine how much work was involved in each doctor’s tasks, relative to other doctors’ tasks.
For example, Hsiao’s team deemed that a hysterectomy required 3.8 times as much mental effort and 4.47 times as much technical skill as a psychotherapy session. They used their algorithm to create RVUs for the several thousand activities that physicians perform daily. Whatever one’s feelings about the scale (Atul Gawande compared the effort to “being asked to measure the exact amount of anger in the world”), it became the basis for how doctors are paid. Thus began the devaluation of primary care. By 1990, the gap between primary care and specialist pay had increased by 66 percent, to $50,000. By 2003, it had increased by 120 percent, to $74,000 per year, adjusted for inflation.
Meanwhile, along with reduced vitality of practice and reduced pay came rapidly increasing administrative work from insurance companies, each with new and different billing codes, all combining to make primary care depressingly less attractive. It became quite common to hear of physicians seeing patients all day and then spending most of the evening charting on computers, for no additional pay. It is not surprising that from the late 1980s to the late 2000s, this country saw a significant decline in the proportion of doctors becoming primary care physicians.
Nowadays, so-called “lifestyle specialties” like dermatology, where physicians perform frequent procedures and earn the highest pay per hour, have become the most desirable and competitive residencies in medicine. Medical students applying to dermatology had the highest median board scores and the highest percentage of members in a medical honor society. Only 61 percent of dermatology residency applicants matched into dermatology this year, while 98 percent of internal medicine applicants and 99 percent of family medicine applicants did.
Salary discrepancies and a lack of variety in day-to-day practice have played a large part in the fall of primary care as a sought-after specialty. Yet despite these trends, interest in primary care is resurgent.
Anoop Raman is a family medicine resident who blogs at Primary Care Progress.