Women’s Health Initiative (WHI)

Meaning

The Women’s Health Initiative (WHI) is a comprehensive, long-term national health study focused on the major causes of morbidity and mortality in postmenopausal women. It has investigated strategies for preventing cardiovascular disease, cancer, and osteoporotic fractures, providing critical data on the health of postmenopausal women.