Dentist

/ˈdentɪst/ noun

Definition

A medical professional trained to care for people’s teeth and gums, including cleaning, repairing, and extracting teeth.

Etymology

From French *dentiste*, from *dent* meaning 'tooth', from Latin *dens* ('tooth'). The term entered English in the 18th century, as dentistry emerged as a distinct profession.

Kelly Says

The word is literally 'tooth-ist'—a specialist in teeth. Dentistry is one of the few medical areas almost everyone visits regularly, which makes the dentist a kind of 'maintenance engineer' for your mouth.

Ethical Language Guidance

Gender History

Dentistry, like many medical professions, was historically male-dominated, and language and imagery often assumed male dentists and female assistants or patients. Women dentists faced barriers to training and professional recognition.

Inclusive Usage

Use 'dentist' as a gender-neutral professional term and avoid assuming a dentist's gender; specify pronouns only when known or relevant.

Empowerment Note

Women dentists and dental researchers have significantly expanded access to oral healthcare and diversified the profession.
