Women’s Health Initiative

Meaning

The Women’s Health Initiative (WHI) is a landmark, large-scale, long-term national health study sponsored by the U.S. National Institutes of Health. Launched in 1991, it focused on the most common causes of death, disability, and poor quality of life in postmenopausal women. The study combined rigorous randomized controlled trials with an extensive observational component, investigating the effects of postmenopausal hormone therapy, dietary modification, and calcium/vitamin D supplementation. Its published findings lastingly reshaped clinical practice regarding hormone replacement therapy worldwide.