I just realized something. With the massive movement from Western medicine to holistic health, women are taking the lead.
Holistic health practitioners are about 70% women.
Their clients are about 70% women, although more and more men are becoming clients every day.
In a way, this is the feminization of medicine. Everything about holistic health and the Eastern medical traditions (traditional Chinese medicine, ayurveda, etc.) is much more balanced between feminine and masculine styles. Western medicine is very much a cowboy, masculine style: "Don't worry, I'll pull out my knife and cut you open and pull out that tumor! Boo-yah!"
This is an interesting phenomenon to me. What other parts of society are being feminized?