health care
noun
Definition of health care
: efforts made to maintain or restore physical, mental, or emotional well-being especially by trained and licensed professionals
—usually hyphenated when used attributively: health-care providers