The feminization of health care is here. And that’s a good thing.
The feminization of health care is fundamentally changing care delivery in the United States, and it is doing so in ways that will accelerate the pursuit of improved quality and affordability.
Historically, health care providers and health care leaders have been selected for, and have nurtured, traits that are traditionally seen as "masculine" — traits such as heroism, independence, and competition. Yet it …