Nurse

/nɜːrs/ noun; verb

Definition

Noun: a trained person who cares for sick or injured people, often in a hospital or clinic. Verb: to care for someone who is ill or injured, or to feed a baby with milk from the breast.

Etymology

From Old French “norrice” or “nourice,” meaning “wet nurse,” from Latin “nutrire,” to nourish. The meaning expanded from feeding babies to caring for the sick in general.

Kelly Says

The word began with feeding infants and grew into modern professional health care, which shows how central care and nourishment are to medicine. When we say someone is "nursing a grudge," we are metaphorically "feeding" a negative feeling so it grows.

Ethical Language Guidance

Gender History

Nursing has a strongly gendered history, with the role constructed as women’s work tied to stereotypes of natural female caregiving, even as it became a highly skilled profession. Male nurses have faced stigma, and women nurses have often been subordinated to male physicians despite extensive expertise.

Inclusive Usage

Use "nurse" without assuming gender; say "nurse" rather than "male nurse" or "female nurse" unless gender is directly relevant and the person has consented to its mention.

Inclusive Alternatives

"nurse," "registered nurse," "nursing professional"

Empowerment Note

Women nurses have been central to advances in public health, wartime medicine, and hospital care, frequently without proportional recognition or leadership opportunities.
